As I sit here typing on my keyboard and clicking my mouse, I find myself desperately hoping we may finally be moving beyond our narrow view of what form interfaces between humans and technology can take to one where we can be more creative and innovative...and effective!
Seems to me that today, almost every interface is limited to little more than a variation on button pushing! Computers, for example, come with keyboards and mice, and we even carry this button-pushing metaphor onto the screen with “virtual buttons” that we have to “push”. And even smaller devices, such as mobile phones, are designed with button-driven interfaces.
Recently we’ve transferred button-pushing over to “touch” screens on phones, and we think that this is just “oh so cool” and innovative. Give me a break! Most examples don’t even work as well as a real button! Ever tried to type more than a word or two on a touch screen keyboard?
Enough with the buttons already!!
There is a glimmer of hope, however, in the world of interfaces! Some of the new phones from Samsung (currently only available in China) use Immersion’s VibeTonz technology, tactile feedback that simulates the feeling of pushing a mechanical button even though the surface is completely flat and does not move. It is the latest example of a new breed of haptic technologies that do for the sense of touch what high definition displays and surround sound do for our eyes and ears. (see links at end for more on haptic technologies)
But I believe that the lack of feedback in touch screen devices will severely limit their success and adoption, and that someday we will shake our heads when we remember how we lived “out of touch” for so long without the use of touch in interfaces.
Hope on the Horizon?
But surely we can be more creative and innovative with our interactions and interfaces than limiting every solution to simply a variation on more button pushing! Currently we are seeing innovative uses of touch screens that provide a better interface and interaction with our digital world. Applications on the iPhone and other smartphones, for example, let you flick or flip things around, such as a set of digital photos or stacks of music CD artwork. This use of “gestures” as a way of interacting holds great promise, and recent examples, such as the Wii controller and some of the multi-person, multi-touch interfaces on digital surfaces, are already delivering on it.
Since touch works in both directions—as feedback to us and feedback from us—I'm delighted to see the use of haptic interfaces with the Wii controller, which generates physical feedback to your hands as you strike a tennis or golf ball, for example, and simultaneously uses your physical input, the velocity and angle of your swing, to determine the force applied to the ball. Airplane and car simulation game controllers (joysticks, steering wheels, brake and gas pedals) provide similar haptic feedback. Flight simulators for training pilots and amusement park “virtual reality rides” show us how truly real, natural, and effective touch interfaces can be.
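To make those two directions concrete, here is a toy sketch in Python. Every name and constant in it is a made-up illustration, not any real console's API: one function turns the player's motion into force on the ball (touch as input), and the other turns the impact into rumble back to the hands (touch as output).

```python
# Illustrative sketch of bidirectional touch in a motion controller.
# All function names and constants here are hypothetical, for explanation only.

def swing_to_ball_force(swing_speed_m_s: float, club_mass_kg: float = 0.3) -> float:
    """Input direction: the player's motion becomes force on the ball.
    A crude impulse estimate: momentum carried by the club head."""
    return club_mass_kg * swing_speed_m_s  # kg*m/s, a stand-in for impact strength

def impact_to_rumble(impulse: float, max_impulse: float = 15.0) -> float:
    """Output direction: the impact becomes vibration back to the hands.
    Returns a rumble intensity clamped to the 0.0-1.0 range."""
    return max(0.0, min(1.0, impulse / max_impulse))

# A fast swing produces both a strong hit and a strong rumble.
impulse = swing_to_ball_force(swing_speed_m_s=40.0)
rumble = impact_to_rumble(impulse)
```

The point of the sketch is simply that the same event drives both channels: the swing data flows into the simulation, and the resulting impulse flows back out to the hands.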
Let’s Stop Flapping and Start Flying!
But these incremental improvements, like touch screen buttons and limited haptic feedback, also serve to continue our "flapping" behavior of mimicking physical button pushing. Our tendency to focus on copying what worked well in the past, such as designing planes that mimic the flapping wings of birds or pushing buttons on machines, is in fact holding us back from truly "flying"! If you are not familiar with my use of this metaphor of "flapping vs. flying", take a few minutes to read or listen to my story on “Confusing Flapping with Flying”.
To start truly "flying", that is, developing truly innovative solutions, I think we need to focus on the use of touch and tactile feedback as essential elements of successful interactions and interfaces. Consider, for example, the ability to feel and touch an object even though it is not “really there”, and, conversely, technology that acquires its own sense of feel and touch.
Less Virtual, More Real
Early applications of virtual reality used to seem pretty cool but now they seem, as my daughter liked to say to me, “Oh so yesterday, Dad!”. These early efforts failed because they were too virtual and too far from reality. Today, we can see clearly how much of a difference it makes when, for example, motion, sound, and color are added to movies. More recently, the addition of some 3D, high definition, and high fidelity has narrowed this "reality gap" for our eyes and ears. And virtual reality applications required all that extraneous equipment, such as headsets, goggles, helmets, gloves, and even full body suits, to provide the feedback to or from us, which severely separated or distanced us from the reality of the situation.
So, what if we moved towards being more real and more natural? What if we had digital models with haptic interfaces that let us feel, handle, turn, assemble, and disassemble them? How about applying full haptic feedback to holographic images? Imagine if holographs provided not just stunning visual 3D reality but also haptic feedback to your hands, so that you could feel their edges, curves, and shapes?
Digital Clay: Digital Prototyping Gets Real
Moving along the reality scale, we already have very functional and bidirectional digital prototyping via 3D scanners and 3D printers. Together these can both create digital models from existing objects (input) as well as create new, fully functional physical objects (output). Such digital prototyping can all be done BEFORE we commit large amounts of materials and energy to producing the “real things”.** This capability may well turn out to be one of the single greatest ways of conserving resources and energy and bringing some much needed practical reality to the “green” movement.
Leaping (and lurching) further forward, wouldn’t it be great to have true virtual digital prototyping that gave us an experience equal to that of having the real thing in our hands? How about if we could move beyond digital clay to “real clay”, where haptic sensors and generators are built in at the nano level, or to something more like the holographic-plus-haptic models I mentioned previously? With this digital clay in your hands, you could create any shapes and objects you like "by hand", or you could add some "clay" to your 3D printed models, or you could use 3D printing to create a whole model out of digital clay and use it as the starting point of your design work or testing.
AND, now how about if we also hook up some of that digital clay “backwards”, so that rather than limiting the feedback to one direction—from our hands to the clay—it could also move from the clay to our hands?!
Say what, Wayne?
Well, imagine you are on a team whose members reside in many different locations. Your team is designing some physical object, such as a new smart phone. Rather than “just” having a 3D printer create the latest design iteration of the phone on your desk as a “solid”, what if every team member had one made out of digital clay, and whenever you changed the phone's shape or moved some buttons, for example, the phones at all the other locations changed accordingly? You each would have a working version.
THAT sounds a lot more real and a lot less virtual to me, and it's all possible by adding the use of our sense of touch to our interactions with technology.
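The shared-phone scenario above boils down to one shared model broadcasting every edit to many physical replicas. Here is a minimal Python sketch of that idea; `SharedModel` and `ClayReplica` are hypothetical names I've invented for illustration, not any real product's API, and real hardware would drive actuators where this code merely stores the new shape.

```python
# Toy sketch of "digital clay" replicas kept in sync with a shared design.
# All class names and fields are illustrative assumptions.

class ClayReplica:
    """Stands in for one desk's lump of digital clay."""
    def __init__(self, location: str):
        self.location = location
        self.shape = {}

    def apply(self, shape: dict) -> None:
        # Real hardware would re-shape itself here; we just store the state.
        self.shape = dict(shape)

class SharedModel:
    """Holds the current design and broadcasts every change to all replicas."""
    def __init__(self):
        self.shape = {"width_mm": 70, "height_mm": 140, "buttons": 3}
        self.replicas = []

    def register(self, replica: ClayReplica) -> None:
        self.replicas.append(replica)
        replica.apply(self.shape)  # new desks start with the current design

    def edit(self, **changes) -> None:
        self.shape.update(changes)
        for replica in self.replicas:  # every other desk changes accordingly
            replica.apply(self.shape)

model = SharedModel()
tokyo, berlin = ClayReplica("Tokyo"), ClayReplica("Berlin")
model.register(tokyo)
model.register(berlin)
model.edit(buttons=4, width_mm=68)  # one designer reshapes the phone;
# afterwards tokyo.shape and berlin.shape match the edited model.
```

This is just the familiar publish-and-subscribe pattern; the leap the post imagines is that the "subscribers" are physical objects you can hold.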
** BTW, this example starts to show how increasingly strained the concept of “real” is becoming and what I see as another trend towards the closing and ultimate elimination of the “reality gap”. But I’ll expand more upon this in a future posting.
The integration of sensory feedback and control within technology provides another look at touch working in reverse (so to speak), and it is equally important and fascinating. One recent example is the progress being made with “e-Skin” for robots. This new material is the first to combine two key characteristics in a single substance:
- The ability to stretch like rubber
- The ability to conduct electricity like metal
They’ve accomplished this with carbon nanotubes and used it to allow robots or machines to “feel” heat, pressure, and texture. I was equally intrigued by e-skin's potential to stretch into larger sizes and onto curved, non-flat shapes. I can imagine how this could lead to displays in your pocket that you can pull out, unroll, and stretch into a large-scale screen for viewing, or applications as "skin" grafts for burn victims.
Hiding Everything but the Benefits
Some have suggested that Jules Verne completely missed the computer and all its technology in his futuristic books such as Twenty Thousand Leagues Under the Sea and Around the World in Eighty Days. But I think Verne got it completely right by NOT focusing on or imagining computers and technology. He simply assumed that the computing power was just “there” behind the scenes. I loved the way he kept the focus on the fantastic capabilities and the grand adventures we would have if such capabilities existed. Who cares how or what enables them?
So it is my belief and my hope that we are finally seeing technology dissolve into the background and into ourselves so that the focus can be on people, challenges, solutions, and great adventures in life. I know I'm signed up for the pursuit of joy and as such I welcome and champion any and all movement towards more natural interfaces and the use of all of our human senses to communicate and work more effectively with each other and with our supporting cast of machines and technology.