In this research activity, we study potential factors that influence touch-based interaction, drawing on results from related embodied cognition research.
In particular, we studied the influence of semantic weight in a touch-based drag-and-drop task. Simply put, we intended to answer the question: “Do people drag the representation of a smaller, lighter real-world object (e.g., the image of an apple) on a touch screen differently from the representation of a larger, heavier real-world object (e.g., the image of a car)?” In a user study, we reproduced effects predicted by neuroscience (i.e., that the meanings of words and images activate action tendencies related to the represented objects) and demonstrated their manifestation in finger movement during a typical touch-based HCI task on a state-of-the-art touch-enabled device. In a related study, we embedded the Stroop effect into a drag-and-drop task on a touch screen and explored the influence of workload on measures characterizing fingertip movement.