At about the time when the “coolest” clamshell mobile phones were little bigger than clams and certain candybar models were nearly thin enough to swipe, the industry remembered that devices have to be big enough for adult fingers. Industry watchers suggest that the reminder touched off the touchscreen revolution, led by the iPhone. Midway through the revolution, designers are looking to lively phone features to fix persistent user-interface deficiencies. Next-generation phones will be anything but inert.
The texting trend and data-enabled phones probably hastened the revolution. A simple phone call requires 11 or fewer taps, not an overwhelming task for most hands and nimble fingers. Text messages and phone emails, on the other hand, multiply the number of taps and exacerbate any problems with hand size or dexterity. And they are often keyed with the thumbs, which are much thicker than the fingers.
A study published in the Journal of Usability Studies in May 2008 found a direct correlation between thumb size and customer satisfaction when texting a message. If a participant’s thumb circumference was large, he or she tended to be dissatisfied with the key size and space between keys.
A touchscreen gives big-thumbed users a break, but it introduces a limitation that isn’t present with traditional keypad-plus-display phones. Apple Inc. described the limitation in one of its applications to the United States Patent and Trademark Office this year: “One of a touchscreen’s biggest advantages (i.e., the ability to utilize the same physical space for different functions) is also one of a touchscreen’s biggest disadvantages. When the user is unable to view the display (because the user is occupied with other tasks), the user can only feel the smooth, hard surface of the touchscreen, regardless of the shape, size and location of the virtual buttons and/or other display elements. This makes it difficult for users to find icons, hyperlinks, textboxes, or other user-selectable input elements that are being displayed, if any are even being displayed, without looking at the display.”
The computing science department at Glasgow University in Scotland pointed out in an article on the university website that touchscreens do not resolve a common problem with cell phones: “The majority of interaction on a mobile device is visual, thus placing a huge demand on the user’s visual attention, which can be dangerous in certain mobile situations or socially inappropriate in meetings. . . . Furthermore, when interacting with a button, for example, with the fingertip – the image of the button is covered by the fingertip and therefore any visual feedback can go unnoticed.”
One of the key features lost in touchscreen interaction is the ability to feel the buttons, the article added. “Although touchscreen keyboards are based on physical keyboard designs, they do not produce the natural haptic response which occurs when a button is touched.”
This is the preamble to the department’s argument for incorporating haptics, now a hot specialty, into touchscreen interfaces.
The Search for a Tactile Solution
A Google search on “haptics” brings up almost a million URLs, many of which link to various conferences around the world for specialists in the field.
The noun grew from “haptic,” a standard dictionary adjective meaning “relating to the sense of touch.” Haptic systems are defined as those that display force and touch information to human operators. The Glasgow article explains that haptic feedback enables “users to ‘feel’ their interface and any feedback provided, thus reducing the visual demand and allowing for more socially appropriate and subtle interaction.”
Cross-fertilization from other industries is contributing to haptics advances for cell phones. Silicon Valley’s Immersion Corp. has licensed its haptics innovations to the makers of many medical and consumer products. Its medical-industry partners make robots and equipment for minimally invasive surgery, and the haptics provide surgeons with tactile feedback. The company’s innovations also show up in gaming controllers, car dashboards, GPS gadgets, media players and, by the company’s reckoning, some 70 million cell phones.
South Korea’s Samsung, LG Electronics and Pantech, Finland’s Nokia and Canada’s Research In Motion are among the mobile computing giants already using haptic feedback. It is still primitive – in the Samsung Haptic 2 model and the iPhone it amounts to simple vibrating buzzes – but the makers are eyeing next-generation haptics to overcome the technology’s present limitations.
Professor William Provancher, head of the Haptics and Embedded Mechatronics Laboratory at the University of Utah, describes the limitations as “quite challenging.” He writes on his university web page that recreating in virtual simulations all of the sensations we experience during object manipulation is especially difficult “if the goal is to permit a person to move and interact with their hands as they would naturally.”
Professor Provancher sees other limitations that must be overcome. “In addition to the challenges of packaging a device in a compact space, these devices should not emit extraneous or unintended vibrations, to which our hands are quite sensitive.” He recommends representing “some of the cues and sensations experienced when touching real objects.” As part of his dissertation research in Stanford’s Biomimetics and Dexterous Manipulation Lab, he suggested simply rendering the location of contact between the virtual object and the finger. This type of feedback device is referred to as a contact location display (CLD).
Two years ago, Apple filed a patent application that addressed one of the limitations noted by the professor. A July 2009 Computerworld article, which follows Apple’s journey through touchscreen technology, explains that the application describes physical bumps and depressions in a screen that can come and go based on what is shown onscreen. When the keyboard is present, for example, the keys become physical bumps on the screen surface.
A Wall Street Journal (WSJ) article in September 2009 noted that Apple took an early lead in exploring both the possibilities of touchscreens and their haptics-related limitations. When the iPhone was introduced in 2007, its multitouch capability was among its most eye-catching features. Apple CEO Steve Jobs demonstrated how photos could be resized simply with a pinching or expanding gesture using a thumb and a finger on the screen.
High-Fidelity Haptics for Next-Generation Cell Phones
According to the WSJ, Immersion expects the next generation of the technology to serve up thousands of different sensations that will be immediately recognizable to people, convincingly creating sensations associated with sound and with the shape and texture of onscreen objects.
Immersion CTO Christophe Ramstein told Computerworld that next-generation haptics will provide cues about what’s happening on screen. One application of this is simulating the feel of a real keyboard on a virtual, onscreen keyboard, he explained. Haptics can be employed to simulate the feeling of moving your finger from one key to the next, even before a key is pressed.
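Neither article spells out how such key-to-key cues would be implemented, but the basic logic is easy to sketch. The short example below is an illustration only: the key dimensions, pulse length and function names are assumptions, not Immersion’s or any handset maker’s actual API. It fires a brief haptic pulse each time a sliding fingertip crosses into a new virtual key, before any key is pressed.

```python
# Illustrative sketch only: fire a short haptic pulse when a sliding finger
# crosses from one virtual key into the next.
# Key dimensions, the pulse length and trigger_pulse() are assumptions,
# not part of any real handset API.

KEY_WIDTH, KEY_HEIGHT = 60, 80   # virtual key size in pixels (assumed)

def key_at(x, y):
    """Map a touch coordinate to a (row, column) key cell."""
    return (int(y // KEY_HEIGHT), int(x // KEY_WIDTH))

def trigger_pulse(key, duration_ms=10):
    """Stand-in for the handset's vibration call."""
    print(f"haptic pulse {duration_ms} ms entering key {key}")

def trace_finger(touch_points):
    """Emit a brief pulse each time the finger enters a new key region."""
    last_key = None
    for x, y in touch_points:
        current = key_at(x, y)
        if current != last_key:
            trigger_pulse(current)
            last_key = current

# Example: a finger sliding horizontally across three keys.
trace_finger([(10, 40), (40, 40), (70, 40), (100, 40), (130, 40)])
```

Because the pulse is tied to crossing a key boundary rather than to a press, the user gets the edge-of-a-key sensation Ramstein describes without accidentally triggering input.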
According to the WSJ, Immersion says three mobile carriers soon will launch applications it created that allow users to communicate emotions nonverbally. For example, frustration can be communicated by shaking the phone, which will create a vibration that will be felt by the other party. That person might then choose to respond with what the developers call a “love tap”—a rhythmic tapping on the phone that will produce a heartbeat-like series of vibrations on the other party’s phone.
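As a rough illustration of how such nonverbal cues could be rendered on the receiving handset, the sketch below maps a received gesture name to an alternating vibrate/pause pattern. The gesture names and timings are invented for illustration; the carriers’ actual applications are not described at that level of detail in either article.

```python
# Illustrative sketch only: map the nonverbal gestures described above to
# vibration patterns on the receiving phone. Gesture names and timings are
# assumptions; the carriers' actual applications are not specified.

def pattern_for(gesture):
    """Return an alternating vibrate/pause pattern in milliseconds."""
    if gesture == "shake":             # sender shook the phone in frustration
        return [400, 100, 400]         # two long, insistent buzzes
    if gesture == "love_tap":          # rhythmic tapping on the sender's phone
        return [80, 120, 80, 600] * 3  # "lub-dub" pairs, like a heartbeat
    return []

def play(pattern):
    """Stand-in for driving the vibration motor."""
    for i, duration_ms in enumerate(pattern):
        state = "vibrate" if i % 2 == 0 else "pause"
        print(f"{state} {duration_ms} ms")

play(pattern_for("love_tap"))
```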
Immersion’s general manager of touch business, Craig Vachon, told the WSJ that “The technology is such that we could blindfold you and you would be able to feel the demarcation between the keys of a keypad, on a completely flat touch screen.” He said the technology is being tested in handsets now and should be made available to consumers sometime in the next 12 months.
A Touchy Keyboard
For now, at least, many mobile users prefer the accuracy of a physical keyboard to that of a touchscreen. A new technology is being designed to offer the best of both worlds: a keyboard that can also respond to touch commands.
According to the WSJ, the keyboard, under development by Berlin-based Tech21 Sensor GmbH, responds normally when a letter or number is pressed. But a user could, say, scroll down a Web page by running a finger lightly across the keyboard; pressing a little harder might be the command to scroll faster or zoom in on a page. “We want to provide any object with keys or a keypad with smarter keys,” Tech21 Chairman Christian Lindholm told the WSJ.
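The WSJ piece does not detail how such a keyboard would distinguish typing from scrolling, but one plausible reading of Lindholm’s description is a simple pressure-and-motion rule like the one sketched below. The thresholds and action names are assumptions for illustration only, not Tech21 Sensor’s actual design.

```python
# Illustrative sketch only: route the same physical keys to typing, scrolling,
# or zooming based on pressure and finger motion. Thresholds and action names
# are hypothetical.

LIGHT, FIRM, FULL_PRESS = 0.2, 0.6, 0.9   # normalized pressure thresholds (assumed)

def interpret(key, pressure, moving):
    """Decide what a touch on a keyboard key should do."""
    if pressure >= FULL_PRESS and not moving:
        return f"type '{key}'"            # ordinary key press
    if moving and pressure < FIRM:
        return "scroll page"              # light finger glide across the keys
    if moving and pressure >= FIRM:
        return "scroll faster / zoom"     # firmer glide
    return "ignore"

print(interpret("g", 0.95, moving=False))  # -> type 'g'
print(interpret("g", 0.30, moving=True))   # -> scroll page
print(interpret("g", 0.70, moving=True))   # -> scroll faster / zoom
```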
Communicating Emotions
In different ways, both articles point out that high-fidelity haptics can bring an otherwise cold, flat screen to life. They say the technology is valuable because it can foster an emotional connection both between people communicating electronically and between humans and machines.
Tara Mullaney, a graduate student at the School of the Art Institute of Chicago, wrote recently in Design & Emotion that she sees haptic design as a way to “bridge technology and feelings, allowing us to create deeper connections and more complex sensory interactions between objects and their users.”
Sources: Journal of Usability Studies; Glasgow University; University of Utah; Computerworld; Wall Street Journal; Design & Emotion; Immersion Corp.; Apple Inc.
This article originally appeared in The Ergonomics Report™ on 2009-09-16.