Touch interaction is on everyone’s lips. Nevertheless, many companies shy away from redesigning their applications accordingly, because touch-screen interaction requires different concepts and can entail major restructuring efforts and high costs. Such restructuring is also not without risk, since touch interaction is not the most suitable paradigm for every application scenario. The following article uses the combo box as an example to illustrate how mouse and touch interaction concepts can be integrated to get the best of both worlds. It is aimed especially at those of you who appreciate mouse and touch operation alike.
First of all, it is doubtful whether interacting via finger is more pleasant and more suitable for every application scenario and industry. By now, plenty of consumer products and smartphones on the market perform compellingly via touch interaction (and these devices, such as the iPhone, most probably owe their success to this fact). Touch screens are also successfully deployed in industrial environments, as one of our case studies shows. Business applications such as ERP or CRM systems used at classical desktop workplaces, however, are still preferably operated with the mouse.
In times of operating systems such as Windows 7, which support touch interaction natively, it is tempting to make one’s own application touch-enabled. Mostly, these attempts are reduced to simply increasing the size of standard GUI elements. Obviously, this is insufficient for serious use. Especially at classical desktop workplaces, the hardware ergonomics should be considered before thinking about the software. Otherwise you quickly end up with the famous “gorilla arm”.
If such an ergonomic hardware setup cannot be provided, users should be given the chance to switch between touch and mouse operation. This again places demands on the software: GUI elements have to be comfortably accessible with both finger and mouse. What this means in particular shall be demonstrated with a short example.
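The requirement that controls be comfortable for both finger and mouse can already be reflected in element sizing. As a rough sketch, assuming a web front end where the Pointer Events API (`pointerType` distinguishing "mouse", "touch" and "pen") reports the active modality; the function name and the concrete pixel values are illustrative assumptions, not part of the article:

```typescript
// A minimal sketch of modality-dependent sizing. In a browser you would
// read PointerEvent.pointerType ("mouse", "touch", "pen") on a pointerdown
// event and apply the matching sizing; only the pure sizing rule is shown
// here. The pixel values are illustrative assumptions, not normative.

type PointerKind = "mouse" | "touch" | "pen";

// Minimum hit-target edge length in CSS pixels per input modality:
// fingers need noticeably larger targets than a precise mouse cursor.
function minHitTarget(kind: PointerKind): number {
  return kind === "mouse" ? 24 : 44;
}
```

A system built this way can keep one widget implementation and merely scale its hit targets when the user switches input modality.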
The following two scenarios demonstrate, step by step at the micro-interaction level, how a combo box usually reacts during the selection of an item, once optimized for mouse operation (left) and once optimized for operation via single touch (right). We assume a combo box with only a few items, so that scrolling is not necessary (as is common, for example, in pie menus).
Mouse- and Touch-operation of a Combo Box – A Comparison
| Step | Mouse operation | Touch operation |
|------|-----------------|-----------------|
| 1) Initial state | The combo box is closed and no item is selected. | The combo box is closed and no item is selected. |
| 2) MouseOver state | The combo box changes its color and, after a short interval, shows a tooltip that describes the purpose of the combo box. | Tooltips cannot be realized for touch interaction because there is no MouseOver state. Touch systems mostly do without tooltips in general. |
| 3) Preparation “open popup” | After the mouse button is pressed, the combo box is visualized in its “pressed” state (MouseDown event). | – |
| 4) Opening the popup with the shortlist | Releasing the mouse button opens the popup together with the element list (MouseUp event). | Already the first contact with the touch screen opens the popup with the element list (TouchDown event). |
| 5) Navigating through the shortlist | The popup stays open, even if the mouse leaves the combo box; it does not close until the user clicks outside of it. The selection can be made without holding the mouse button (MouseMove event). The latter may seem like a trivial detail, but it can be a huge advantage, since moving the mouse while pressing the mouse button causes motor problems for some users. | A selection is made by wiping the finger onto the list element without losing contact with the screen (Wipe event). Compared to moving the mouse with the button held down, this is a more comfortable and more natural way of interacting. |
| 6) Selection of an element | Pressing the left mouse button selects an element. The popup remains open (MouseDown event). | This intermediate step does not apply in a touch context. |
| 7) Adopting an element | As the mouse button is released, the clicked element appears in the combo box as being selected (MouseUp event). | As the user lifts the finger, the currently highlighted element becomes the selected item in the combo box (TouchUp event). |
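The two sequences in the table can be condensed into a tiny state-machine sketch. The event names mirror the table; the TypeScript model below is purely illustrative (all function and type names are hypothetical) and ignores visual states such as “pressed” or MouseOver:

```typescript
// A compact model of the two classic sequences. Illustrative only.

interface ComboState {
  open: boolean;
  highlighted: string | null;
  selected: string | null;
}

interface UiEvent {
  type: string;
  item?: string;
}

const initialState: ComboState = { open: false, highlighted: null, selected: null };

// Classic mouse behaviour: the first MouseUp opens the popup (step 4),
// MouseMove highlights items (step 5), a second click adopts (steps 6/7).
function mouseStep(s: ComboState, ev: UiEvent): ComboState {
  switch (ev.type) {
    case "MouseUp":
      return s.open
        ? { open: false, highlighted: null, selected: s.highlighted } // step 7
        : { ...s, open: true };                                       // step 4
    case "MouseMove":
      return s.open ? { ...s, highlighted: ev.item ?? s.highlighted } : s; // step 5
    default:
      return s; // MouseDown only changes the visual "pressed" state
  }
}

// Touch behaviour: open, wipe and select happen within one single gesture.
function touchStep(s: ComboState, ev: UiEvent): ComboState {
  switch (ev.type) {
    case "TouchDown":
      return { ...s, open: true };                                        // step 4
    case "Wipe":
      return { ...s, highlighted: ev.item ?? s.highlighted };             // step 5
    case "TouchUp":
      return { open: false, highlighted: null, selected: s.highlighted }; // step 7
    default:
      return s;
  }
}
```

Running the mouse sequence takes two full press/release cycles before `selected` is set, while the touch sequence reaches the same result within a single gesture, which is exactly the asymmetry the table shows.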
Using a mouse, selecting an element in a combo box requires one additional click. The primary reason for this is the awkwardness of moving the mouse while holding the mouse button pressed. Touch interaction, in turn, has its own drawbacks, since it does not support tooltips in this case.
In the following section we show how the combo box can be optimized at the micro-interaction design level so that it becomes both mouse-friendly and touch-friendly:
The New Behaviour of the Combo Box
| Step | New behaviour |
|------|---------------|
| 1.1) Opening the shortlist and displaying the tooltip | The popup should already open on the MouseDown/TouchDown event. Ideally, the popup opens shifted to the left, so that the finger has enough room to wipe over the list without covering its items. Letting the popup open above the combo box could be an alternative, though with the disadvantage that the finger movement feels less comfortable, since the fingernail (depending on the touch screen’s surface) could lift a bit from the nail bed or “stutter” across the surface. Preferably, the tooltip should appear shortly after the MouseDown/TouchDown event, this time above the combo box, since the finger could otherwise cover it. The items of the popup list could be rendered somewhat larger, so that an item can be comfortably selected with the finger. |
| 1.2) Closing the shortlist | The shortlist closes after a MouseUp/TouchUp event without triggering an action. This way, the user can safely explore the interface. |
| 2) Highlighting an element | While holding the mouse button pressed or keeping the finger on the surface, the user can highlight an element (Drag/Wipe event). As mentioned above, pressing the mouse button while moving the mouse feels inconvenient to some people. There is a solution for this audience, too: if the popup list stays open in step 1.1 even after the mouse button is released, users can conveniently move the mouse to the desired item (without holding the button) and then perform the selection with an additional click. |
| 3) Selecting an element | By releasing the mouse button or lifting the finger, the currently highlighted item becomes the selected item (MouseUp/TouchUp event). The shortlist closes again. |
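The unified behaviour can be condensed into a single reducer driven by generic Down/Drag/Up events, regardless of whether they originate from a mouse or a finger. This is an illustrative sketch with hypothetical names, not a complete widget:

```typescript
// One reducer for both modalities; the steps reference the table above.

interface UnifiedState {
  open: boolean;
  highlighted: string | null;
  selected: string | null;
}

type PointerEvt =
  | { kind: "Down" }                       // MouseDown / TouchDown (step 1.1)
  | { kind: "Drag"; item: string | null }  // MouseMove with button held / Wipe (step 2)
  | { kind: "Up" };                        // MouseUp / TouchUp (steps 1.2 and 3)

const closedState: UnifiedState = { open: false, highlighted: null, selected: null };

function unifiedStep(s: UnifiedState, ev: PointerEvt): UnifiedState {
  switch (ev.kind) {
    case "Down":
      // Step 1.1: the popup opens immediately on Down.
      return s.open ? s : { ...s, open: true, highlighted: null };
    case "Drag":
      // Step 2: dragging/wiping highlights an item.
      return s.open ? { ...s, highlighted: ev.item } : s;
    case "Up":
      if (!s.open) return s;
      return s.highlighted !== null
        ? { open: false, highlighted: null, selected: s.highlighted } // step 3
        : { ...s, open: false };                       // step 1.2: close, no action
  }
}
```

Note how releasing without a highlighted item simply closes the popup without triggering anything, which is what makes safe exploration of the interface possible.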
Both mouse and touch operation come with certain advantages and disadvantages. For interface designers, a direct comparison at the micro-interaction level is worthwhile. Users benefit in any case: in a system designed for both worlds, they can switch instantly from touch to mouse or vice versa without having to reconfigure or restart the application.