TY - CONF
T1 - Using mid-air haptics to guide mid-air interactions
AU - Neate, Timothy
AU - Maffra, Sergio Alvares
AU - Frier, William
AU - You, Zihao
AU - Wilson, Stephanie
N1 - Funding Information:
We would like to thank the participants for their involvement in the study and the anonymous reviewers of this work for their detailed feedback. We acknowledge funding from City, University of London’s Research Development Fund’s Pump Priming Scheme.
Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023/8/25
Y1 - 2023/8/25
AB - When users interact with mid-air gesture-based interfaces, it is not always clear what interactions are available, or how they might be executed. Mid-air interfaces offer no tactile affordances, pushing systems to rely on other modalities (e.g. visual) to guide users in how to interact with the interface. However, these alternative modalities are not always appropriate or feasible (e.g. in eyes-free interactions), meaning that such interfaces are not easy to learn from touch alone. Although contactless haptic information can be conveyed in mid-air through ultrasound phased arrays, this technology has so far been limited to providing feedback on user interactions. In this paper, we explore the feasibility of using mid-air haptics to guide gestures in mid-air. Specifically, we present approaches to guide the user’s hand in cardinal directions, to execute a hand gesture, and to navigate a 2D mid-air plane, which we tested with 27 participants. After reporting encouraging results, which suggest good accuracy and relatively low workload, we reflect on the feasibility and challenges of using haptic guidance mechanisms in mid-air.
KW - Mid-air haptics
KW - Guidance
KW - Gesture
KW - Mid-air interfaces
UR - http://www.scopus.com/inward/record.url?scp=85171473294&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-42280-5_3
DO - 10.1007/978-3-031-42280-5_3
M3 - Conference contribution
SN - 9783031422799
SN - 9783031422805
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 43
EP - 64
BT - Human-Computer Interaction – INTERACT 2023
A2 - Abdelnour Nocera, José
A2 - Lárusdóttir, Marta Kristín
A2 - Petrie, Helen
A2 - Piccinno, Antonio
A2 - Winckler, Marco
PB - Springer
CY - Cham
ER -