Since the first computer was released for mass consumption, our society has been driven to improve hardware capability in order to keep up with the demand for bigger, faster, more dynamic applications. The focus was on making the computer capable of more, without considering the other part of the equation: how the user interacts with that computer. While we have become efficient and productive users of the mouse and keyboard, these devices are merely intermediaries between the user and the computer. They are the “middle men.”
The Natural User Interface (NUI) represents a shift in thinking about how we should be able to communicate with computers. It enables us to behave the same way we do with nearly every other object in our everyday lives. We touch objects, make gestures to elaborate on the spoken word, and we think constantly.
I believe the popularization of the smartphone over the last decade has played a substantial role in leading us to this shift. These devices allow us to stay connected to family, friends, and work 24/7. The demand for connectivity and real-time information was so great that people would go to any lengths to stay in constant communication. Information was being consumed at an astounding rate, but using these devices was still relatively non-intuitive. The smaller the devices became, the more difficult they were to operate. We were quickly reaching the limits of what could be done with the status quo. Something needed to change in order for us to continue our technological evolution. This change occurred in January 2007, when Apple introduced the iPhone.
In four short years, the iPhone has single-handedly revolutionized the mobile phone industry. The ability to use a single finger to navigate, read a book, or scroll through photos, emails, documents, and contacts fascinated us and gave us an extraordinarily simple way to interact with computers. “App” developers have provided us with hundreds of thousands of applications that satiate our appetite for information, keep us connected, and let us live our lives on the fly. The breakneck speed at which iPhones flew off the shelves led every other mobile phone manufacturer to develop its own touch-screen devices and has opened up a world of possibilities for how we will continue to interface with computers going forward.
The ability to control a mobile phone using a touch-screen has naturally led to an increased desire for the same capability on larger devices, such as tablet computers, laptops, and PCs. Windows 7 offers users the ability to use touch-screen applications created by third-party developers. Notable applications developed for the touch-screen include Corel Digital Studio 2010, Kindle for PC, and Microsoft Blackboard. Even more exciting is the announcement that the next release of Windows will be a fully functioning touch-screen OS. I can only imagine that the release of Windows 8, scheduled for 2012, will provide a more natural, user-friendly experience, and that the types of applications we’ve seen developed for the iPhone, Android, BlackBerry, and Windows phones will be easily consumed on these devices.
Another exciting advancement in the NUI space is the release of Microsoft Surface, a multi-touch, collaborative platform that has enabled developers to create some absolutely amazing applications. Two such applications are VitruView, developed by Interknowlogy, and Real Estate Agent, developed by Seven Steps. VitruView enables the annotation and manipulation of a 3D heart image; doctors can load X-rays over the 3D model and diagram the surgical requirements for their patients. Real Estate Agent allows realtors and their clients to browse real estate listings in specific areas by price range or other requirements. Both of these applications provide a glimpse into the real-world applications that are about to redefine the way we work every day.
While the focus of human-computer interaction has shifted to touch-screen interfaces over the past few years, other forms of NUI are quickly gaining momentum. I believe they will further improve the intuitive nature of these interactions while allowing us to create technologies that, until now, were possible only in the movies.
With the public release of the Xbox Kinect SDK in June 2011, Microsoft has pushed the boundaries even further, enabling us to use our bodies as the interface. While the major development focus in this space has been recreational, one can easily envision some very ground-breaking applications. One that comes to mind is the ability to control a physical object remotely, perhaps a robot or other mechanical device. Imagine the potential. What if an application provided the precision to let a surgeon perform emergency surgery on a patient halfway across the globe? Or, in the event of a pilot emergency on a commercial flight, imagine giving air traffic controllers the ability to override the plane’s systems and guide the aircraft to a safe landing.
The most astonishing example of the potential for NUI I have seen was part of Tim Huckaby’s keynote address at DevConnections this past March. During his address, Tim, who is the CEO of Interknowlogy, played a video of a product developed by Emotiv, a neuro-engineering company responsible for the creation of the Epoc Headset. This amazing product allows the user to control a computer directly with their thoughts! You can see a short video of the device at http://www.youtube.com/watch?v=7utG3NqhBoU. Consider the potential. How about an application that maps your brain activity, or one that allows a blind person to use a virtual computer keyboard? What about an application that lets a hearing-impaired individual create their own music, just with the power of thought? The possibilities are limited only by our imagination.