Deictic Communication across Distances: Visualising Remote Pointing Gestures on Mobile Devices

Navas Medrano S, Pfeiffer M, Kray C

Research article in edited volume (conference) | Peer reviewed

Abstract

Deictic expressions, such as ‘what is that’ while pointing at an object, play an important role in face-to-face communication, for example when describing locations and orientation or when identifying objects. If two parties are not collocated, e.g. when communicating via mobile phones, such deictic expressions cannot easily be exchanged between the remote parties. In this paper, we propose three ways to visualise deictic pointing gestures to a remote communication partner: 1) fingerprint overlay, 2) natural hand overlay and 3) map-with-viewshed (see Fig. 1). We evaluated these visualisations in a lab-based user study, where participants had to identify various realistic targets on a mobile phone. Overall, participants preferred and were most successful with fingerprint overlay. We also identified properties of the target objects that affected how well a pointing gesture could be transmitted. Our results can inform the design of future interfaces to transmit pointing gestures across distances.

Publication details

Status: Published
Year of publication: 2018
Language of publication: English
Conference: Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI 2018), Belfast, UK
DOI: 10.14236/ewic/HCI2018.11
Full-text link: http://dx.doi.org/10.14236/ewic/HCI2018.11
Keywords: Remote; pointing gesture; mobile phone; visualisation

Authors at the University of Münster

Kray, Christian
Professur für Geoinformatik (Prof. Kray)
Navas Medrano, Samuel
Professur für Geoinformatik (Prof. Kray)
Pfeiffer, Max
Institut für Geoinformatik (ifgi)