Wednesday, August 29, 2012

Paper Reading #1: TapSense - Enhancing finger interaction on touch surfaces

Introduction:
"TapSense: Enhancing finger interaction on touch surfaces," is a research paper from the CHI conference, written by Chris Harrison, Julia Schwarz, and Scott E. Hudson of the CHI Institute at Carnegie Mellon University. Julia Schwarz is a PhD student at Carnegie Mellon University, studying in the CHI Institute. She is also a very good skier. In fact, she is so good that she was a ski instructor for a while before going to graduate school. Here is a very impressive video of her doing a ton of awesome ski tricks while juggling. Chris Harrison is also a PhD student at Carnegie Mellon University, whose research focuses on mobile interaction techniques and input technologies.   Before Carnegie Mellon, Chris earned his bachelors and masters of science in Computer Science from NYU. Scott Hudson is a CHI professor at Carnegie Mellon, and sponsor of Chris and Julia's research project. In fact, he is the director and founder of the CHI (HCII) PhD program at Carnegie Mellon University.

Summary:
In "TapSense", the team modifies two touch-screen devices in order to provide more types of interaction with touch displays. They are able to differentiate between a finger tip, finger pad, nail, and knuckle tap on a touch screen. This remarkable feat is accomplished by attaching a medical stethoscope and an electret condenser microphone to a touch-screen device. An electret condenser microphone is a type of microphone which does not require polarizing power supply, but instead uses a permanently charged material. They use the sound recorded by this setup to differentiate between the acoustic sound produced by tapping different parts of a human hand on a touch screen device. Below in figure 1, you can see example hand touches possible by using TapSense.


The idea behind differentiating between hand motions on a touch screen is to provide a richer set of input possibilities. These input variations could be used for "right-click" or "scrolling" features. Current "right-click" features on touch-screen devices involve holding or double-tapping motions, which can sometimes be counterintuitive. The research team hopes to use TapSense to help improve user experiences with touch devices.
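As a toy illustration (my own, not from the paper) of how the classified tap types could stand in for hold or double-tap gestures, consider a simple dispatch table:

```python
# Hypothetical mapping from TapSense-style tap classes to UI actions.
# The action names are invented for illustration.
ACTIONS = {
    "pad": "primary_select",    # an ordinary tap selects
    "knuckle": "context_menu",  # a knuckle tap plays the role of right-click
    "nail": "scroll_mode",      # a nail tap toggles scrolling
}

def handle_tap(tap_type):
    """Dispatch a classified tap; unknown classes fall back to a normal tap."""
    return ACTIONS.get(tap_type, "primary_select")
```

The point is that the extra input channel replaces a timing-based gesture (hold, double-tap) with a single, immediate tap of a different hand part.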

Related Work:
All three of the contributing authors of this paper are active in the HCI field, with many other research projects. The other research paper by Chris Harrison and Julia Schwarz that caught my eye was "Phone as a Pixel: Enabling Ad-Hoc, Large Scale Displays using Mobile Devices," in which they write a software program to create a large-screen display using individual mobile phones as pixels. As for related work not mentioned in this paper, a number of papers have been published on enhancing touch surface interaction in one way or another. "TeslaTouch: electrovibration for touch surfaces" [1] describes how one team of HCI researchers uses electrovibration to produce controllable tactile feedback on touch surfaces, enabling a range of sensations in response to the user's touch. In "Exploring physical information cloth on a multitouch table" [2], the researchers take a step away from the traditional touch surface and create a drapable sensor cloth that can be used for a variety of different touch capabilities. Another different, but related, research paper, "BubbleWrap: a textile-based electromagnetic haptic display" [3], explores the possibility of creating touch surfaces with different resistances or firmnesses, which would, for example, allow requiring users to alter the pressure they use to press a button.

Other related works:
"Evaluating tactile feedback and direct vs. indirect stylus input in pointing and crossing selection tasks" [4]
"Low-cost multi-touch sensing through frustrated total internal reflection" [5]
"Transformed up-down methods in psychoacoustics" [6]
"ToolGlass and magic lenses: The see-through interface" [7]
"ShapeTouch: leveraging contact shape on interactive surfaces" [8]
"HoloWall: designing a finger, hand, body, and object sensitive wall" [9]
"Rubbing and tapping for precise and rapid selection on touch-screen displays" [10]

Evaluation:
In order to evaluate the acoustic signatures of different parts of the hand on a touch screen, the team uses quantitative, unbiased sensor readings of acoustic levels. When assessing the overall functionality of the TapSense system, they likewise rely mostly on quantitative, objective evaluation, mainly because there is a straightforward way to test the application's classification accuracy. Their comparison to a single-finger touch input method, however, appears to be a relative comparison that includes personal opinion. An example of this can be seen when the team compares their tool to other tools that provide similar functionality: one alternative uses a wristband sensor that measures impact force and movement range, which the team discounts because it requires users to drastically change the way they interact with a touch screen by wearing a wristband.
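The "easy way to test the accuracy" amounts to comparing predicted tap labels against known ground truth over many trials. A minimal sketch of that kind of objective check (labels and trial data invented for illustration):

```python
# Tally overall accuracy and a (true, predicted) confusion count
# from labelled tap trials. Data below is made up.
from collections import Counter

def accuracy_and_confusion(truth, predicted):
    """Return overall accuracy plus a (true, predicted) -> count tally."""
    confusion = Counter(zip(truth, predicted))
    correct = sum(n for (t, p), n in confusion.items() if t == p)
    return correct / len(truth), confusion

# Four hypothetical taps, one misclassified (a nail tap read as a pad tap).
truth     = ["pad", "knuckle", "nail", "pad"]
predicted = ["pad", "knuckle", "pad",  "pad"]
```

A confusion tally like this also shows *which* tap classes get mixed up, which matters more than a single accuracy number when deciding which inputs are reliable enough to ship.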

Discussion:
The TapSense team came up with a novel concept and a great proof of concept with this touch-device modification system. While the TapSense system used in this research project requires an external stethoscope and microphone, new phones could integrate smaller, cheaper, and more efficient hardware embedded in the device. Another strength of TapSense is that computer and mobile-device enthusiasts will actually use its features; some inventions demand too much change in everyday behavior for society to adopt them fully. Because TapSense requires such a small change in user input, I believe users are much more likely to take advantage of its features. The main area for improvement that I noticed is multi-touch support. At the time of publication, the TapSense system could not differentiate between double finger taps and double knuckle taps, because simultaneous taps dissipate the acoustic response in different, unpredictable ways.

Sources:

  1. Bau, O., Poupyrev, I., Israr, A., and Harrison, C. TeslaTouch: electrovibration for touch surfaces. In Proc. UIST '10. ACM, pp. 283-292.
  2. Mikulecky, K., Hancock, M., Brosz, J., and Carpendale, S. Exploring physical information cloth on a multitouch table. In Proc. ITS '11. ACM, pp. 140-149.
  3. Bau, O., Petrevski, U., and Mackay, W. BubbleWrap: a textile-based electromagnetic haptic display. In CHI '09 Extended Abstracts. ACM, pp. 3607-3612.
  4. Forlines, C. and Balakrishnan, R. Evaluating tactile feedback and direct vs. indirect stylus input in pointing and crossing selection tasks. In Proc. CHI '08. ACM, pp. 1563-1572.
  5. Han, J. Low-cost multi-touch sensing through frustrated total internal reflection. In Proc. UIST '05. ACM, pp. 115-119.
  6. Levitt, H. Transformed up-down methods in psychoacoustics. The Journal of the Acoustical Society of America, 49(2), 1971, pp. 467-477.
  7. Bier, E. A., Stone, M. C., Pier, K., Buxton, W., and DeRose, T. D. Toolglass and magic lenses: the see-through interface. In Proc. SIGGRAPH '93. ACM, pp. 73-80.
  8. Cao, X., Wilson, A. D., Balakrishnan, R., Hinckley, K., and Hudson, S. E. ShapeTouch: leveraging contact shape on interactive surfaces. In Proc. Tabletop '08. IEEE, pp. 129-136.
  9. Matsushita, N. and Rekimoto, J. HoloWall: designing a finger, hand, body and object sensitive wall. In Proc. UIST '97. ACM, pp. 209-218.
  10. Olwal, A., Feiner, S., and Heyman, S. Rubbing and tapping for precise and rapid selection on touch-screen displays. In Proc. CHI '08. ACM, pp. 295-304.

CHI 436 - Assignment 0


Hi Everyone,

My name is Mike Watts and I am a fifth-year senior computer engineering student (electrical track). The best way to contact me is by email at mike.watts27@gmail.com. I am taking this course to learn more about how humans interact with technology. As far as experience and what I bring to this class, well... I am a human, so that is a good start. Like most of the students in this class, I have interacted with computers all of my life, and I often think about why we program computers to act a certain way. After college I plan on working in industry, probably in IT or software development. I would also like to live somewhere that I can enjoy the outdoors and row, which is one of my passions. Ten years from now, I hope that I am building a career in whichever discipline I decide to pursue, staying in shape by rowing competitively, and growing a family of my own. I believe that the next big technological advancement in computer science will be human avatars that will interact with the world while their owners sit at home or are kept alive by medical machines. Sounds crazy, right? Not according to Russia's 2045 Initiative, which hopes to provide robotic avatars by 2020 and holographic avatars by 2045. If I could travel back in time, I would go back and meet Neil Armstrong. What, too soon? Out of all of my shoes, my favorites would be regular ol' running shoes, which is what you will see me in around campus most of the time. If I learned another language, it would probably be German, because most of my family is from Heidelberg, Germany. As for an interesting fact about myself... I have been dating the same girl since freshman year of high school. It has been over nine years and we have never broken up.

I hope that this introductory blog gives you an insight into who I am.