

Thinking Out of the Box

Touching the desktop – Modern micro-interaction and burdens of the past

Maren Wolff

They are considered intuitive and easy to learn – touchscreens. To humans it feels far more natural to touch an object of interest directly on the screen than to point at it with a mouse. Apart from the noticeably easier hand-eye coordination, touchscreens create an elegant and user-friendly experience by merging input and output into a single device.

But despite all these advantages, they can create a lot of frustration, as probably every one of us has experienced at some point: you accidentally call someone when you only wanted to scroll through the address list, you have to type a word five times because you keep hitting the wrong letter, or "OK" and "Cancel" sit so close together that you are afraid of tapping the wrong one. It would be too good to be true if touchscreens did not raise new usability problems of their own. Especially the use of desktop operating systems such as Windows 7 or OS X on touch devices creates a whole bunch of problems.

Button size – Influence on accuracy and speed

One of the difficulties mentioned above is the right size of buttons. Fitts's law (1954) can give a first hint for a compromise between usability and limited screen space. It describes a reliable and frequently replicated effect: large, nearby targets are easier to hit than small, distant ones. With it, exact predictions can be made about the effect of shrinking buttons: the accuracy needed to hit them and the lowered speed that results from demanding this accuracy (the "speed-accuracy trade-off"). Of course, buttons cannot be enlarged without limit to increase interaction speed (think of a smartphone with a small screen). To work around this, so-called "compensation" methods are used.
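The speed-accuracy trade-off that Fitts's law predicts can be sketched in a few lines. The formula itself is standard; the intercept and slope constants below are purely illustrative placeholders, since the real values are device-specific and have to be fitted empirically:

```typescript
// Fitts's law: predicted movement time grows with the "index of
// difficulty" ID = log2(2 * distance / width).

function indexOfDifficulty(distance: number, width: number): number {
  return Math.log2((2 * distance) / width);
}

function predictedMovementTimeMs(
  distance: number,
  width: number,
  a = 50,   // hypothetical intercept (ms) - must be fitted per device
  b = 150,  // hypothetical slope (ms per bit) - must be fitted per device
): number {
  return a + b * indexOfDifficulty(distance, width);
}

// Halving the button width adds exactly one bit of difficulty, i.e. a
// constant b milliseconds to the predicted movement time.
const wide = predictedMovementTimeMs(200, 20);
const narrow = predictedMovementTimeMs(200, 10);
console.log(Math.round(narrow - wide)); // → 150 (the slope b)
```

The useful intuition: shrinking a button does not merely make it "a bit harder" to hit – it adds a predictable, measurable time penalty per halving.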

For example, an application can observe the semantic context of text input and replace wrong letters or restore their correct order (e.g. taht → that). Patrick Baudisch and Christian Holz went even further: their model (the Generalized Perceived Input Point Model, 2010) explains the user's hit accuracy through the angle and orientation of the finger, as well as the mental model of the interaction that the user holds. Based on this model, they developed a prototype that compensates for these factors. As a result, users reached a hit accuracy of 95% on a 4.3 mm button – an impressive result, considering that previously determined minimum sizes for comparable hit accuracy ranged from about 10.5 mm to 26 mm. Unfortunately, the relatively complex hardware and the delay after every input will likely postpone a public release of this technology for quite some time.

Micro-Interaction – The dilemma of old interaction methods and new technology

To get to the bottom of the problem, we need to step into so-called micro-interaction design. Touchscreens offer only a small set of input possibilities: a limited number of fingers (with single-touch, only one) and gestures. On the other hand, the wish to transfer desktop mouse functionality to touch devices is very common (see the blog article Mouse vs. Touch). These functions include opening interactive context menus via right-click and showing tooltips with specific information about the element directly under the mouse cursor (the so-called "hover" state).

Both functionalities (context menus and tooltips) create a problem, at least for desktop developers. On the one hand, a right-click or hover interaction simply cannot be converted directly to a touch device; on the other hand, users who have been accustomed to these functionalities for a long time would be frustrated if they suddenly disappeared.
The importance of context menus and tooltips is indisputable. Especially inexperienced users profit from contextual information, even when screen space is scarce. But how can we implement these functionalities on a touchscreen in the best way?

iOS and OS X

As the thin, touch-optimized sibling of OS X, iOS does not have the discussed functionality problems, since Apple bravely broke with old OS X traditions. On touchscreens, OS X is ahead of Windows 7 because it never focused on context menus in the first place: Apple optimized the whole interaction around the left-click. The missing right-click has always been replaced by separate buttons or the "long-click", which creates an additional micro-interaction state for the user, but at least does so consistently, so it can be transferred directly to a one-finger touch operation. Tooltips via touch, however, remain a problem – even for OS X.

Windows 7

Similar to OS X (but not quite the same), Windows 7 tries to replace the opening of context menus via right-click with the so-called "long-tap", and thereby introduces a couple of usability problems (perhaps because of the limited possibilities of older single-touch screens). For example, the long-tap slows down the workflow. What is even worse: the feedback for the "pressed" state is simply missing, obviously because the system has to decide whether the user intends a single tap or a long-tap, which it can only do after the long-tap timer has expired. This is especially bad for a device that already lacks haptic feedback, and it leaves the user feeling insecure. Besides that, the long-tap bears far more similarity to the hover, which would have made it the better interaction for the tooltip (in this respect, Windows 7 has the same tooltip issues as OS X). A two-finger tap could be a more intuitive interaction for the context menu, because touching with a second finger is comparable to (right-)clicking with a second finger. Of course, this solution would require a multi-touch device.
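Why the "pressed" feedback has to be missing becomes obvious in a minimal sketch of the disambiguation logic. All names and the 500 ms threshold here are illustrative assumptions, not any operating system's real API:

```typescript
// A touch cannot be classified as "tap" or "long-tap" until either the
// finger lifts or the hold timer expires - so correct "pressed" feedback
// for a plain tap could at best be shown at the moment of finger-up.

type TapKind = "tap" | "long-tap";

const LONG_TAP_MS = 500; // hypothetical disambiguation threshold

function classifyTouch(downAtMs: number, upAtMs: number): TapKind {
  return upAtMs - downAtMs >= LONG_TAP_MS ? "long-tap" : "tap";
}

console.log(classifyTouch(0, 120)); // → "tap"      (finger lifted early)
console.log(classifyTouch(0, 640)); // → "long-tap" (held past threshold)
```

The design cost is baked into the state machine itself: for the entire duration of the threshold, the system literally does not know yet what the user is doing, so any feedback it showed earlier could turn out wrong.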

Windows 8

Windows 8 resolves the context menu issue with a gesture ("Swipe Down", Windows 8 User Experience Guidelines, chapter: Touch, Commanding and Controls). It remains to be seen whether it triggers too easily or with too much difficulty, and whether that in turn creates a usability problem.

Micro-interaction in different operating systems and its results. Windows 7 and OS X stand for the conservative approach, Windows 8 and iOS for the brave step into the future ("—" marks interactions that do not apply or were not specified).

| Interaction | Windows 7 | OS X | Windows 8 | iOS |
| --- | --- | --- | --- | --- |
| Left-click | Mostly object selection or triggering buttons | Mostly object selection or triggering buttons | Direct triggering of elements | — |
| One-finger tap | Mostly object selection or triggering buttons | Mostly object selection or triggering buttons | Direct triggering of elements | Mostly object selection or triggering buttons |
| Right-click | Mostly opening context menus | Often dismissed; if not: opening context menus | Context menu / action bar | — |
| Two-finger tap | e.g. alternative zoom | Not used (except on the touchpad) | Not used; replaced by the "Swipe Down" gesture for object selection in Windows 8 apps | — |
| Hover | Tooltip | Tooltip | Tooltip | — |
| Long-tap ("press and hold") | Opening context menus | Often dismissed; if not: context menus | Tooltip or opening context menus ("learn") | Context menu (also reachable through indirection) |

Alternative approach

When switching to touchscreens, it is worth looking in detail at the possibilities of different micro-interactions to avoid a loss of functionality. A rather nice method that has not been exploited to its full potential could be the two-finger tap. Touching with a second finger (with no need to hit a specific point) should allow a comfortable workflow. At the same time, it resembles no existing gesture or interaction, so it cannot accidentally trigger other behavior (as long as the two fingers do not move away from their initial positions). As said before, the two-finger tap is similar to the right-click, so it could be perfect for opening context menus (without slowing down the workflow).

Technical restrictions

For a smoothly running technical implementation, however, a few things need to be in place first. For example, a timer is needed to prevent the undesired triggering of elements caused by the minimal delay between the touch of the two fingers ("sequence of fingers"). Also, the two-finger tap only works on multi-touch devices, since at least two finger contacts have to be recognized. The touch device must therefore be able to identify two neighboring fingers as individual contact points and must not confuse them with one very large finger, which would be wrongly interpreted as a left-click ("finger positions").
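The "sequence of fingers" tolerance described above can be sketched as a simple time-window check: two contacts only count as one two-finger tap if the second touch-down arrives shortly after the first. The names and the 80 ms window are illustrative assumptions:

```typescript
// Two touch-downs are treated as a single two-finger tap only when they
// land within a short tolerance window of each other; otherwise the
// first contact is handled as an ordinary one-finger tap.

const SEQUENCE_WINDOW_MS = 80; // hypothetical tolerance

interface TouchDown {
  id: number;   // contact identifier reported by the touch hardware
  atMs: number; // timestamp of the touch-down event
  x: number;
  y: number;
}

function isTwoFingerTap(first: TouchDown, second: TouchDown): boolean {
  return Math.abs(second.atMs - first.atMs) <= SEQUENCE_WINDOW_MS;
}

const a = { id: 1, atMs: 1000, x: 40, y: 40 };
const b = { id: 2, atMs: 1050, x: 90, y: 42 };
const c = { id: 2, atMs: 1400, x: 90, y: 42 };

console.log(isTwoFingerTap(a, b)); // → true  (50 ms apart)
console.log(isTwoFingerTap(a, c)); // → false (400 ms apart)
```

Note the familiar trade-off: the wider the window, the more forgiving the gesture, but also the longer the system must wait before committing the first contact to a one-finger interpretation.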

However, the most obvious problem is the already discussed button size. Buttons are often so small that they are difficult to hit with one finger – how difficult will it be with two?

Strategies for “hit” area calculation

To solve this problem, the "hit" area could be calculated and interpreted differently for a two-finger tap than for a one-finger tap. Usually a UI element is recognized as "being hit" if the registered touch point (or mouse pointer) lies within the borders of the element during the click or touch event. But if two touch points are recognized, a connecting line can be drawn between them to identify the element that has been hit: the element containing the largest part of that line segment is the one selected as "being hit".

The following graphics show schematically the possible finger positions/sequences and which UI element is recognized as "being hit" in each case. Four different finger positions and three "hit interpretation" strategies need to be distinguished.

Strategy 1:
The first finger alone determines the element that is triggered on contact.
As the graphic shows, with this strategy two of the four possible finger positions/sequences accidentally trigger a wrong element.
Strategy 2:
An element is triggered only if both fingers lie within its borders.
Only one of the four possible finger positions/sequences triggers the desired element.
Strategy 3:
The largest intersection of the connecting line with an element's bounds determines which element is triggered. With this strategy, every possible finger position/sequence triggers the desired UI element.
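Strategy 3 boils down to a classic computational-geometry step: clip the line segment between the two touch points against each element's bounds and pick the element containing the longest clipped piece. The following sketch uses the Liang-Barsky clipping algorithm; the types and element data are illustrative assumptions, not part of the original proposal:

```typescript
interface Rect { x: number; y: number; w: number; h: number; }
interface Pt { x: number; y: number; }

// Length of the part of the segment p0->p1 that lies inside rect r,
// computed via Liang-Barsky parametric clipping.
function overlapLength(p0: Pt, p1: Pt, r: Rect): number {
  const dx = p1.x - p0.x, dy = p1.y - p0.y;
  let t0 = 0, t1 = 1; // parametric range of the segment still inside
  const clip = (p: number, q: number): boolean => {
    if (p === 0) return q >= 0;            // parallel edge: inside iff q >= 0
    const t = q / p;
    if (p < 0) { if (t > t1) return false; if (t > t0) t0 = t; }
    else       { if (t < t0) return false; if (t < t1) t1 = t; }
    return true;
  };
  if (clip(-dx, p0.x - r.x) && clip(dx, r.x + r.w - p0.x) &&
      clip(-dy, p0.y - r.y) && clip(dy, r.y + r.h - p0.y)) {
    return (t1 - t0) * Math.hypot(dx, dy);
  }
  return 0; // segment misses the rectangle entirely
}

// Strategy 3: the element containing the longest clipped piece wins.
function hitElement(p0: Pt, p1: Pt, elements: Rect[]): number {
  let best = -1, bestLen = 0;
  elements.forEach((r, i) => {
    const len = overlapLength(p0, p1, r);
    if (len > bestLen) { bestLen = len; best = i; }
  });
  return best; // index of the element "being hit", or -1 for none
}
```

With two buttons side by side, the button containing the longer piece of the connecting line wins even if neither finger landed inside it – which is exactly why all four finger positions/sequences resolve correctly under this strategy.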

The hypothesis is that this kind of calculation has a positive influence on the user experience. Since it is enough to hit the UI element only roughly instead of a specific, exact point – and since even mistakenly touching neighboring elements is tolerated – a significant increase in interaction speed (see Fitts's law) should follow. This effect strengthens the user's confidence in controlling the device, which in turn should influence the entire user experience positively.


Mouse-controlled functionalities known from the desktop cannot be transferred to touch devices without modification. Technical and physical constraints must be considered to achieve satisfying results in terms of usability and user experience. One possible micro-interaction for transferring the opening of context menus to touch devices could be the two-finger tap – which, however, requires a smarter calculation and interpretation of which element the user is actually hitting.
Would it really be possible to implement this interaction design proposal, for instance to control the triggering of context menus and tooltips on a desktop operating system? Could it become the new right-click? Or even trigger an entirely new function? To answer these questions, the method would of course have to be tested intensively first. The rather common long-tap, which serves as the right-click on today's touchscreens and to which people have grown accustomed despite its shortcomings, already speaks against this new approach.
Finally, however, the past should have taught us that clinging to old traditions almost always leads to suboptimal UIs (see Windows 7 and OS X on touch devices), whereas bravely questioning or cutting old traditions leads to far better results most of the time (see Windows 8 and iOS). Windows 8 still faces the problem that desktop applications need to remain fully functional in the touch environment; the break with old traditions is happening right now, but only for the touch-optimized Windows 8 apps. Time will show whether the imminent official release of Windows 8 will spark enthusiasm for touch on desktop applications and tablets, or not.

Literature resource:

  • Holz, C. and Baudisch, P. (2010). The Generalized Perceived Input Point Model and How to Double Touch Accuracy by Extracting Fingerprints. In Proc. CHI 2010, April 10–15, 2010.
  • Fitts, P.M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47, 381-391.
  • Hall, A.D., Cunningham, J.B., Roache, R.P., and Cox, J.W. (1988). Factors affecting performance using touch-entry systems: Tactual recognition fields and system accuracy. Journal of Applied Psychology, 4, 711–720.
  • Vogel, D. and Baudisch, P. (2007). Shift: A Technique for Operating Pen-Based Interfaces Using Touch. In Proc. CHI’07, 657–666.
  • Windows 8 User Experience Guidelines
  • iOS Human Interface Guidelines
  • Windows User Experience Interaction Guidelines for Windows 7 and Windows Vista

All trademarks or registered trademarks are properties of their respective owners.
