Group Members: Karthik, Rayan, Ahmed
Objective
Our objective for the project is to experiment with haptic methods that display data.
In this iteration, we mainly wanted to experiment with simple data visualization and to build the software and hardware infrastructure for operating the Arduino boards. We focused on a single type of graph and tried various kinds of haptic feedback.
Motivation
The paper by Paneels and Roberts (2010) provides a great introduction to prior work in the field of haptic data visualization. The field has been explored for several reasons, such as presenting information to blind or partially sighted users, relieving overloaded visual displays, and reinforcing other modalities.
The motivation behind this iteration is to follow up on this work, specifically focusing on ways that visually impaired users could haptically experience what a bar graph “feels” like.
In some ways, our group had the “advantage” of not all having the same hardware, so we got to experiment with more types of haptic feedback for representing bar graphs. The other members of our team focused on vibrotactile feedback while I focused on using the Haply.
Bar graphs have previously been represented haptically as enclosure effects by Yu and Brewster (2002). Using a type of enclosure effect supported by the Immersion TouchSense SDK and the WingMan force-feedback mouse, they rendered each bar as a rectangular enclosed area: once the mouse cursor enters a bar, it is held inside until the user applies enough force to overcome the resistance of the bar's edges. They also used the PHANToM with polygons supported by the GHOST SDK. Similar to the mouse, this representation uses a V-shaped groove: once the PHANToM pointer enters the groove, it is trapped until the user applies a greater force. They also added audio as a reinforcing stimulus: when the user presses a button, the program speaks the bar's data value, and a tone's pitch encodes the bar's height. For example, a tall bar produces a high pitch while the pointer is on it.
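To make the mechanism concrete, the escape logic of such an enclosure wall could be sketched in one dimension as follows. This is plain Java, not code from the paper; the `escapeForce` threshold and the sign conventions are hypothetical.

```java
// 1-D sketch of an enclosure wall in the style of Yu and Brewster (2002):
// the cursor is held inside a bar's edges until the user's applied force
// exceeds an escape threshold. All values here are illustrative.
public class Enclosure {
    // Reaction force the wall applies to the cursor (0 means it yields).
    public static double wallForce(double pos, double left, double right,
                                   double userForce, double escapeForce) {
        if (pos < left || pos > right) return 0.0;           // outside the bar
        if (Math.abs(userForce) >= escapeForce) return 0.0;  // wall yields
        return -userForce;  // below threshold: cancel the push, cursor stays
    }
}
```

The key property is the discontinuity at the threshold: below it the cursor cannot leave, above it the wall offers no resistance at all, which is what makes the bar feel like a container.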
Taking inspiration from this paper, I decided to create an almost opposite effect to the enclosure by experimenting with the Fisica package as well as a PID loop.
Approach
To simulate being a visually impaired user, I kept my eyes closed while using the graph throughout all the experiments.
First, I used the Fisica package and its FBox class to create bars of the same heights as our target bar graph. The FBoxes are hidden underneath the drawn graph because I found it difficult to generate forces from the drawn graph itself. This worked well, but I realized the graph has a problem: if the bars have no gaps between them and similar heights, the graph can just feel like a single straight line (Fig. 1).
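For reference, the layout math for positioning one hidden static FBox under each drawn bar can be sketched like this. This is plain Java, not the actual sketch's code; names and pixel units are illustrative, and it assumes (as Fisica does) that boxes are positioned by their centre.

```java
// Layout math for hiding one static physics box under each drawn bar.
// Bar i sits to the right of i earlier bars and i+1 gaps; its centre is
// half the bar's height above the graph's baseline. Values are in pixels.
public class BarLayout {
    // Returns {centerX, centerY, width, height} for bar i.
    public static double[] barBox(int i, double value, double barWidth,
                                  double gap, double baseline) {
        double cx = gap + i * (barWidth + gap) + barWidth / 2.0;
        double cy = baseline - value / 2.0;
        return new double[] { cx, cy, barWidth, value };
    }
}
```

Each returned rectangle would then be created as a static, invisible FBox so the end effector collides with the bar while the user only sees the rendered graph.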
To mitigate this, I created a graph with bigger gaps using the same type of FBoxes (Fig. 2). I found it difficult to navigate without getting lost between the bars; it almost feels like a maze. I decided that the “world edges” and the actual bar edges needed to be distinguished more clearly, so I implemented auditory feedback: when the user touches a “world edge”, a sound plays. However, I was unable to stop the sound from playing, or from degrading into noise, when the pointer was not touching an edge, because the simulation thread runs at 1 kHz. I also experimented with different friction values to differentiate the edges; however, I could not feel a significant difference in sensation.
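One way to tame the retriggering at a 1 kHz simulation rate would be a small hysteresis gate that only starts or stops the sound after the contact flag has been stable for a number of consecutive samples. This is a sketch of that idea in plain Java, with a hypothetical hold length, not the implementation I actually ran.

```java
// Debounced edge-contact sound gate. At 1 kHz the raw contact flag
// flickers, which retriggers the sound into noise; this gate changes
// its playing state only after `hold` consecutive identical samples.
public class SoundGate {
    private final int hold;      // samples required to change state
    private int count = 0;       // consecutive samples disagreeing with state
    private boolean playing = false;

    public SoundGate(int holdSamples) { this.hold = holdSamples; }

    // Feed one contact sample; returns whether the sound should play.
    public boolean update(boolean touchingEdge) {
        if (touchingEdge == playing) {
            count = 0;                  // sample agrees with current state
        } else if (++count >= hold) {
            playing = touchingEdge;     // change has been stable long enough
            count = 0;
        }
        return playing;
    }
}
```

With, say, a 20-sample hold, the sound would only switch state after 20 ms of consistent contact or non-contact, which is short enough to feel immediate but long enough to absorb single-millisecond flicker.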
Overall, I think the representation by Yu and Brewster (2002) where the bars are enclosures is more appropriate and prevents the user from “getting lost” by trapping the pointer.
I also tried to represent the bars using a PID mechanism, which I thought might work better since the user would be guided by the Haply. Because a bar graph has a complex outline, I could not get a stable response when tracking all the edges, and had to simplify the graph by tracking only one point at the top of each bar (Fig. 3), essentially turning the bar graph into a line chart. I used similar values for the PID parameters as in Lab 4.
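The core of the tracking loop can be sketched as a standard PID update driving a point toward the current target. This is plain Java with a toy plant standing in for the Haply; the gains, time step, and plant are illustrative, not the actual Lab 4 values.

```java
// Minimal PID loop guiding a point toward a target, in the spirit of
// the Lab 4 controller. The "plant" here is a toy: the point's velocity
// is simply proportional to the controller output.
public class PidDemo {
    public static double simulate(double start, double target,
                                  double kp, double ki, double kd,
                                  double dt, int steps) {
        double pos = start;
        double integral = 0.0;
        double prevError = target - start;
        for (int i = 0; i < steps; i++) {
            double error = target - pos;
            integral += error * dt;                     // accumulated error
            double derivative = (error - prevError) / dt;
            prevError = error;
            double u = kp * error + ki * integral + kd * derivative;
            pos += u * dt;                              // toy plant update
        }
        return pos;
    }
}
```

In the real sketch, the setpoint would jump to the top of the next bar as the end effector sweeps along the x-axis, and `u` would become the force command sent to the Haply rather than a direct position update.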
Overall, I think the PID loop provides a better experience because the user is guided; however, it can be argued that simplifying a bar graph into a line chart takes away the “experience” or “feel” of a bar graph.
Although I have not experienced the vibrotactile feedback myself, I think it could provide a better representation for bar graphs, because vibration is a rich channel with many customizable parameters. I also think the issue of “getting lost” between the bars would be less prominent with vibrotactile feedback.
Reflection
I realized that it is difficult to remember the relative heights of the bars once the pointer/device has moved on to the next bar. I wonder if this is because I am not “trained” to rely only on my hands and haptic sensations and to visualize the graph with my eyes closed. Also, my implementation does not convey a bar's category on the x-axis or its exact height. This could be addressed in a similar manner to Rayan and Ahmed's auditory feedback implementation and the representation by Yu and Brewster (2002).
In this iteration, we achieved our goal of experimenting with simple haptic data visualization. It would be interesting to extend this to other types of graphs; however, other graphs may not provide any new or interesting findings. I believe each type of haptic representation is more suitable for certain graphs, and there will not be a one-size-fits-all solution. For example, a line graph could easily be represented using the Haply, but representing it with vibrotactile feedback could be a strange and confusing experience. A potentially more useful device would be a Haply that can also produce vibrotactile feedback to enrich the experience.
Next Iteration
In the next iteration, we plan to try visualizing and analyzing data with more dimensions. In terms of analysis, we will try to inform the user of important information in their data, since haptic feedback is usually associated with a sense of urgency. Due to software limitations, we also have an idea to use the vibrotactile feedback of Android smartphones to communicate our haptic experiences.
References
S. Paneels and J. C. Roberts, “Review of Designs for Haptic Data Visualization,” in IEEE Transactions on Haptics, vol. 3, no. 2, pp. 119–137, April-June 2010, doi: 10.1109/TOH.2009.44.
W. Yu and S. Brewster, “Comparing two haptic interfaces for multimodal graph rendering,” Proceedings 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. HAPTICS 2002, Orlando, FL, USA, 2002, pp. 3–9, doi: 10.1109/HAPTIC.2002.998934.