
A future full of touchscreens? It’s all in the software

Editor’s note: This story is part of our Microsoft-sponsored series on cutting-edge innovation. Dr. Andrew Hsu is Technology Strategist for human interface company Synaptics.

Until recently, a designer wanting to add a touchscreen to a device would have faced a lot of challenges: integration issues, increased cost, and usability problems all stood in the way. But now that mobile handsets have made touchscreens ubiquitous, that resistance has all but evaporated. A number of suppliers (including my own company, Synaptics) have now proven touchscreen technology in the high-volume handset market, and the supply chain is rapidly lowering costs and solving usability issues.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":184891,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,","session":"B"}']

Resistive touchscreen technology is still the lowest-cost alternative; however, newer capacitive-based products are increasingly cost competitive. And where manufacturers would formerly have nixed the idea of adding touch-enabled displays purely on the basis of cost, they now see the technology as essential to increasing their sales and helping them stay competitive. Indeed, a common rallying cry heard in many design meetings continues to be, “We have to make our device work like the iPhone.”

Touchscreens are more than just a cool way to interface with a device. They allow designers to create highly dynamic, configurable, personalizable controls, which means device controls can be optimized in software to suit the needs of any particular application or use scenario. Yes, designers now have a virtually open canvas for creating customized user interfaces. As an end user, I certainly appreciate the design economy and usability benefits of dynamic user interfaces. For example, when my touchscreen phone is running its music player application, a customized set of control elements appears on the screen. The custom UI is optimized to provide information about each control and to show only the controls that can be used at any given time. This is in stark contrast to hardware-based controls, where users are forever trying to figure out what each button does in a given application (or whether that button does anything at all).
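To make the idea concrete, here is a minimal sketch of software-defined controls in Kotlin. Every name in it (the states, the Control type, the actions) is hypothetical, invented for illustration rather than taken from any real toolkit; the point is simply that the visible control set is computed from application state instead of being fixed in hardware.

```kotlin
// A sketch of software-defined controls. All names here are hypothetical,
// invented for illustration; nothing is taken from a real UI toolkit.
enum class PlayerState { STOPPED, PLAYING, PAUSED }

data class Control(val label: String, val action: () -> Unit)

// The visible control set is computed from application state, so only the
// controls that make sense right now appear on screen -- unlike hardware
// buttons, which are always present whether or not they do anything.
fun controlsFor(state: PlayerState): List<Control> = when (state) {
    PlayerState.STOPPED -> listOf(Control("Play") { println("start playback") })
    PlayerState.PLAYING -> listOf(
        Control("Pause") { println("pause playback") },
        Control("Next") { println("skip track") },
    )
    PlayerState.PAUSED -> listOf(
        Control("Play") { println("resume playback") },
        Control("Stop") { println("stop playback") },
    )
}

fun main() {
    // A real UI would redraw the screen on each state change; here we just
    // print which controls each state exposes.
    for (state in PlayerState.values()) {
        println("$state -> ${controlsFor(state).map { it.label }}")
    }
}
```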


But with opportunity and flexibility comes much responsibility. Unlike mechanical controls, which designers have to account for during the hardware design phase, a touchscreen shifts the planning of device controls into the software design phase. This shift can certainly improve the user experience, since designers can implement the layout with fewer hardware constraints. However, it could also lead to real usability disasters. The obvious example is inconsistent behavior of controls across applications: imagine how confusing it would be if a touchscreen phone required users to single-tap the virtual keypad buttons to dial phone numbers but to double-tap the virtual buttons in the music player application.
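One way to head off that kind of inconsistency is to let a shared platform layer, rather than each application, define what a tap means. The sketch below uses hypothetical names throughout (this is not a real API): every tap is routed through one dispatcher, so the dialer and the music player respond to the same gesture.

```kotlin
// A sketch of platform-enforced tap semantics. These names are illustrative,
// not a real API: the point is that the platform, not each application,
// decides that a single tap activates a control.
interface Tappable {
    fun onActivate()
}

object TapDispatcher {
    // Every application routes taps through this one entry point, so a tap
    // means the same thing in the dialer as it does in the music player.
    fun dispatchTap(target: Tappable) = target.onActivate()
}

class DialKey(private val digit: Char) : Tappable {
    override fun onActivate() = println("dial $digit")
}

class PlayButton : Tappable {
    override fun onActivate() = println("play track")
}

fun main() {
    TapDispatcher.dispatchTap(DialKey('5'))   // single tap dials a digit
    TapDispatcher.dispatchTap(PlayButton())   // the same gesture plays a track
}
```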

And then there are two serious discoverability challenges involved. The first is functions included for power users of a particular application that standard users won’t be aware of. These are routinely exploited in games; remember those secret “fierce” maneuvers in Mortal Kombat, activated by pressing buttons in a specific sequence? Another example might be a virtual button whose function changes if you press it for an extended period of time.
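That long-press case boils down to a simple duration threshold. The sketch below assumes an arbitrary 500 ms cutoff, chosen purely for illustration: the same virtual button quietly gains a second function when held, which is exactly what makes it powerful for experts and invisible to everyone else.

```kotlin
// A sketch of a press-duration threshold. The 500 ms cutoff is an arbitrary
// assumption for illustration, not a value from any platform.
const val LONG_PRESS_MS = 500L

fun interpretPress(durationMs: Long): String =
    if (durationMs >= LONG_PRESS_MS) "secondary (hidden) action" else "primary action"

fun main() {
    println(interpretPress(120))  // quick tap: the function everyone finds
    println(interpretPress(800))  // extended press: the one nobody discovers
}
```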

The second discoverability challenge is much more serious, and is perhaps the most daunting issue in my anecdotal survey of what scares people about touchscreen-based interfaces: how to clearly and intuitively distinguish user controls from everything else drawn on the screen. Although touching a display element that is not designed to be touched does not cause any particular interaction crisis or damage to a device, it can cause significant frustration as users attempt to figure out which on-screen elements are touchable (who reads user manuals these days?). The issue is further exacerbated by the fact that most devices still don’t offer any feedback that a touch input has occurred. So users can’t tell whether they touched the element correctly but the device didn’t respond, whether they didn’t touch it correctly, or whether the element they thought was touchable in fact wasn’t.
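A design that acknowledges every touch, even on non-interactive elements, would let users separate those three cases. Here is a rough sketch of that idea; the element model and the “ack” responses are invented for illustration, and a real device might flash the element or fire a haptic click instead of printing.

```kotlin
// A sketch of explicit touch acknowledgment. The element model and the
// "ack" responses are hypothetical placeholders.
class Element(val name: String, val touchable: Boolean, val activate: () -> Unit)

sealed class TouchResult {
    object Activated : TouchResult()     // a control was hit and it responded
    object NotTouchable : TouchResult()  // something was hit, but it isn't a control
    object Missed : TouchResult()        // the touch landed on empty screen
}

fun handleTouch(hit: Element?): TouchResult = when {
    hit == null -> {
        println("ack: touch registered, nothing there")
        TouchResult.Missed
    }
    !hit.touchable -> {
        // Acknowledge the touch even though nothing happens, so the user can
        // tell "the device didn't hear me" apart from "that isn't touchable".
        println("ack: ${hit.name} is not a control")
        TouchResult.NotTouchable
    }
    else -> {
        println("ack: ${hit.name}")
        hit.activate()
        TouchResult.Activated
    }
}

fun main() {
    handleTouch(Element("signal bars", touchable = false) { })
    handleTouch(Element("Play button", touchable = true) { println("playing") })
    handleTouch(null)
}
```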

Not surprisingly, many people, especially older users who grew up on well-defined mechanical input controls, shy away from touchscreen devices, claiming the interfaces are too hard to use.

So what should be done to address this usability challenge? Quite possibly, nothing. Over time, people will simply learn how to cope with this issue and automatically understand the difference between touchable and non-touchable graphics elements. For example, in the Android user interface, users already know that touching the signal strength bars at the top of the screen doesn’t do anything (although it probably should) but that dragging the entire top bar of graphics elements downward will expose the “curtain” of recent status updates. And if the market consolidates into a handful of de facto standards, there’ll be fewer quirky interfaces people will need to learn. This is effectively how QWERTY keyboards persist despite their horrible usability.

But even with a de facto standard interface evolving, there continues to be a huge opportunity for someone, or some organization, to “get things right” with a well-designed touchscreen interface. Donald Norman passionately makes the case for good design in his landmark book The Design of Everyday Things; it’s time to recast his principles for software-based user interfaces as they make the shift to touchscreens. It could be this recognition and execution of good software user interface design that finally convinces many, including my mother, that touchscreens can make devices better, not just more confusing.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":184891,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,","session":"B"}']

What do you think about the usability challenge?

Will interface design standards make touchscreens more usable?

To read more recent stories in this series, visit the Conversations on Innovation site, or click below:
Cleantech’s next generation: smaller, nimbler, smarter
How JavaScript will lead the way to open video
TV 2.0: Hulu’s flatlining, and the networks are ready to innovate
What will it take to make mobile payments mainstream in the US?

Dr. Andrew Hsu has been the primary technical contact for Synaptics’ worldwide customers in the handheld space since 1999. He joined Synaptics in 1996 and led the company’s efforts to establish a presence in the mobile handset market. Hsu developed Synaptics’ ClearPad technology, a transparent sensor that can be mounted under curved plastic and glass. He led the marketing for ClearPad, resulting in the first production phone with a capacitive touchscreen, the LG Prada. Prior to his current role, Hsu was a scientist and design engineer who helped develop the first single-chip TouchPad solution for Synaptics.

[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":184891,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,","session":"B"}']
