Multi-touch

Touchscreen interactions using multiple fingers

In computing, multi-touch is technology that enables a surface (a touchpad or touchscreen) to recognize the presence of more than one point of contact at the same time. The origins of multi-touch lie at CERN, which began using multi-touch screens as early as 1976 for the controls of the Super Proton Synchrotron. Capacitive multi-touch displays were popularized by Apple's iPhone in 2007. Multi-touch may be used to implement additional functionality, such as pinch to zoom, or to activate subroutines attached to predefined gestures using gesture recognition.

Several uses of the term multi-touch resulted from the quick developments in this field, with many companies using the term to market older technology, which other companies and researchers call gesture-enhanced single-touch, among other terms. Several related terms attempt to distinguish whether a device can exactly determine or only approximate the location of different points of contact, but they are often used as synonyms in marketing.

Multi-touch is commonly implemented using capacitive sensing technology in mobile devices and smart devices. A capacitive touchscreen typically consists of a capacitive touch sensor, application-specific integrated circuit (ASIC) controller and digital signal processor (DSP) fabricated from CMOS (complementary metal–oxide–semiconductor) technology. A more recent alternative approach is optical touch technology, based on image sensor technology.
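The row-and-column scanning performed by such a controller can be sketched in a few lines. This is an illustrative simplification, not a real controller API: a finger coupling to the grid reduces the mutual capacitance at a row-column intersection, so the controller compares each reading against a calibrated baseline. All names and values here are hypothetical.

```python
# Sketch of how a controller might scan a mutual-capacitance grid:
# drive each row, sense each column, and flag cells whose measured
# capacitance drops below a calibrated baseline (a finger drains
# charge from the row-column coupling). Values are illustrative.

def scan_grid(baseline, measured, drop_threshold=20):
    """Return (row, col) cells where the reading dropped enough
    to register a touch. Inputs are 2D lists of sensor readings."""
    touches = []
    for r, (brow, mrow) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(brow, mrow)):
            if b - m >= drop_threshold:
                touches.append((r, c))
    return touches

baseline = [[100, 100, 100], [100, 100, 100]]
measured = [[100,  70, 100], [100, 100,  75]]   # two fingers down
touches = scan_grid(baseline, measured)          # [(0, 1), (1, 2)]
```

Because every cell is sensed independently, a grid like this reports two simultaneous fingers unambiguously, which is what distinguishes true multi-touch from gesture-enhanced single-touch hardware.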

Definition

In computing, multi-touch is technology which enables a touchpad or touchscreen to recognize more than one point of contact with the surface. Apple popularized the term "multi-touch" in 2007, using it to describe additional functionality such as pinch to zoom and the activation of subroutines attached to predefined gestures.


History

1960–2000

The use of touchscreen technology predates both multi-touch technology and the personal computer. Early synthesizer and electronic instrument builders like Hugh Le Caine and Robert Moog experimented with using touch-sensitive capacitance sensors to control the sounds made by their instruments. IBM began building the first touch screens in the late 1960s. In 1972, Control Data released the PLATO IV computer, an infrared terminal used for educational purposes, which employed single-touch points in a 16×16 array user interface. These early touchscreens only registered one point of touch at a time. On-screen keyboards (a well-known feature today) were thus awkward to use, because key-rollover and holding down a shift key while typing another were not possible.

Exceptions to these were a "cross-wire" multi-touch reconfigurable touchscreen keyboard/display developed at the Massachusetts Institute of Technology in the early 1970s and the 16-button capacitive multi-touch screen developed at CERN in 1972 for the controls of the Super Proton Synchrotron, which was then under construction.

In 1976, a new x-y capacitive screen, based on the capacitance touch screens developed in 1972 by Danish electronics engineer Bent Stumpe, was developed at CERN.

In the early 1980s, the University of Toronto's Input Research Group was among the earliest to explore the software side of multi-touch input systems. A 1982 system at the University of Toronto used a frosted-glass panel with a camera placed behind the glass. When one or more fingers pressed on the glass, the camera would detect the action as one or more black spots on an otherwise white background, allowing it to be registered as an input. Since the size of a spot depended on how hard the person was pressing on the glass, the system was somewhat pressure-sensitive as well. Of note, this system was input only and not able to display graphics.

In 1983, Bell Labs at Murray Hill published a comprehensive discussion of touch-screen-based interfaces, though it made no mention of multiple fingers. In the same year, Myron Krueger's video-based Video Place/Video Desk system was influential in the development of multi-touch gestures such as pinch to zoom, though the system itself had no touch interaction.

By 1984, both Bell Labs and Carnegie Mellon University had working multi-touch-screen prototypes, with both input and graphics, that could respond interactively to multiple finger inputs. The Bell Labs system was based on capacitive coupling of fingers, whereas the CMU system was optical. In 1985, the canonical multi-touch pinch-to-zoom gesture was demonstrated, with coordinated graphics, on CMU's system. In October 1985, Steve Jobs signed a non-disclosure agreement to tour CMU's Sensor Frame multi-touch lab. In 1990, Sears et al. published a review of academic research on single- and multi-touch human–computer interaction of the time. It described single-touch gestures such as rotating knobs, swiping the screen to activate a switch (or a U-shaped gesture for a toggle switch), and touchscreen keyboards, including a study showing that users could type at 25 words per minute on a touchscreen keyboard compared with 58 words per minute on a standard keyboard, with multi-touch hypothesized to improve the data-entry rate. It also described multi-touch gestures such as selecting a range of a line, connecting objects, and a "tap-click" gesture to select while maintaining location with another finger.

In 1991, Pierre Wellner advanced the topic by publishing work on his multi-touch "Digital Desk", which supported multi-finger and pinching motions. Various companies expanded upon these inventions at the beginning of the twenty-first century.

2000–present

Between 1999 and 2005, the company FingerWorks developed various multi-touch technologies, including TouchStream keyboards and the iGesture Pad. In the early 2000s, Alan Hedge, professor of human factors and ergonomics at Cornell University, published several studies of this technology. In 2005, Apple acquired FingerWorks and its multi-touch technology.

In 2004, French start-up JazzMutant developed the Lemur Input Device, a music controller that in 2005 became the first commercial product to feature a proprietary transparent multi-touch screen, allowing direct, ten-finger manipulation on the display.

In January 2007, multi-touch technology became mainstream with the iPhone; in its announcement Apple even stated that it had "invented multi-touch". Both the function and the term, however, predate the announcement and Apple's patent filings, except in the area of capacitive mobile screens, which did not exist before FingerWorks/Apple's technology (FingerWorks filed patents from 2001 to 2005; subsequent multi-touch refinements were patented by Apple).

However, the U.S. Patent and Trademark Office declared that the "pinch-to-zoom" functionality was anticipated by U.S. Patent 7,844,915, relating to gestures on touch screens, filed by Bran Ferren and Daniel Hillis in 2005, as was inertial scrolling, thus invalidating a key claim of Apple's patent.

In 2001, Microsoft began development of its table-top touch platform, Microsoft PixelSense (formerly Surface), which interacts with both the user's touch and their electronic devices; it became commercially available on May 29, 2007. Also in 2001, Mitsubishi Electric Research Laboratories (MERL) began development of a multi-touch, multi-user system called DiamondTouch.

In 2008, DiamondTouch became a commercial product. It too is based on capacitance, but it can differentiate between multiple simultaneous users, or rather between the chairs in which the users are seated or the floor pads on which they are standing. In 2007, NORTD Labs released CUBIT, an open-source multi-touch system.

Small-scale touch devices rapidly became commonplace in 2008. The number of touch screen telephones was expected to increase from 200,000 shipped in 2006 to 21 million in 2012.

In May 2015, Apple was granted a patent for a "fusion keyboard", which turns individual physical keys into multi-touch buttons.

Applications

Apple has retailed and distributed numerous products using multi-touch technology, most prominently its iPhone smartphone and iPad tablet. Apple also holds several patents related to the implementation of multi-touch in user interfaces, although the legitimacy of some of these patents has been disputed. Apple additionally attempted to register "Multi-touch" as a trademark in the United States, but the request was denied by the United States Patent and Trademark Office, which considered the term generic.

Multi-touch sensing and processing occur via an ASIC sensor attached to the touch surface. Usually, separate companies make the ASIC and the screen that combine into a touch screen; by contrast, a touchpad's surface and ASIC are usually manufactured by the same company. In recent years, large companies have expanded into the growing multi-touch industry, with systems designed for everything from the casual user to multinational organizations.

It is now common for laptop manufacturers to include multi-touch touchpads on their laptops; tablet computers respond to touch rather than traditional stylus input, and multi-touch is supported by many recent operating systems.

A few companies are focusing on large-scale surface computing rather than personal electronics, either large multi-touch tables or wall surfaces. These systems are generally used by government organizations, museums, and companies as a means of information or exhibit display.

Implementations

Multi-touch has been implemented in several different ways, depending on the size and type of interface. The most common forms are mobile devices, tablets, touchtables, and touch walls. Both touchtables and touch walls project an image through acrylic or glass and then back-light the image with LEDs.

Touch surfaces can also be made pressure-sensitive by the addition of a pressure-sensitive coating that flexes differently depending on how firmly it is pressed, altering the reflection.

Handheld technologies use a panel that carries an electrical charge. When a finger touches the screen, the touch disrupts the panel's electrical field. The disruption is registered as a computer event (gesture) and may be sent to the software, which may then initiate a response to the gesture event.
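The contact-to-event pipeline described above can be sketched as follows. This is an illustrative simplification rather than a real driver or operating-system API; every function and handler name here is hypothetical. The controller reports the set of active contact points each frame, and a dispatcher classifies them and hands a gesture event to the application.

```python
# Simplified sketch of the contact-to-event pipeline described above.
# All names are illustrative, not a real driver or OS API.

def classify_contacts(contacts):
    """Map a frame of contact points to a coarse gesture event."""
    if not contacts:
        return ("idle", None)
    if len(contacts) == 1:
        return ("touch", contacts[0])           # single-finger touch/drag
    if len(contacts) == 2:
        return ("two_finger", tuple(contacts))  # candidate pinch/rotate
    return ("multi", tuple(contacts))           # 3+ finger gesture

def dispatch(contacts, handlers):
    """Send the classified event to the application's handler, if any."""
    kind, data = classify_contacts(contacts)
    handler = handlers.get(kind)
    return handler(data) if handler else None

# The application registers a handler only for the gestures it cares about.
handlers = {"two_finger": lambda pts: f"pinch candidate at {pts}"}
result = dispatch([(10, 20), (30, 40)], handlers)
```

The key point of the design is that the hardware layer only reports raw contact points; interpreting them as gestures is left to software, which is why the same panel can support different gesture vocabularies in different applications.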

In the past few years, several companies have released products that use multi-touch. In an attempt to make the expensive technology more accessible, hobbyists have also published methods of constructing DIY touchscreens.

Capacitive

Capacitive technologies include:

  • Surface Capacitive Technology or Near Field Imaging (NFI)
  • Projected Capacitive Touch (PCT)
    • Mutual capacitance
    • Self-capacitance
  • In-cell Capacitive

Resistive

Resistive technologies include:

  • Analog Resistive
  • Digital Resistive or In-Cell Resistive

Optical

Optical touch technology is based on image sensor technology. When a finger or an object touches the surface, light scatters; sensors or cameras capture the reflection and send the data to software, which determines the response to the touch depending on the type of reflection measured.
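The image-processing step common to these optical approaches, locating bright "blobs" where fingers scatter light into the camera, can be sketched with simple thresholding and connected-component labeling. This is illustrative only: real controllers do this in hardware or a DSP, with calibration and noise filtering, and the frame here is just a 2D list of 0-255 brightness values.

```python
from collections import deque

def find_touch_blobs(frame, threshold=128):
    """Locate bright regions (candidate touch points) in a grayscale
    frame given as a 2D list of 0-255 values; return each region's
    centroid as (x, y). Illustrative sketch, not production code."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected bright region.
                queue, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    py, px = queue.popleft()
                    pixels.append((py, px))
                    for ny, nx in ((py-1, px), (py+1, px),
                                   (py, px-1), (py, px+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cx, cy))
    return centroids

# Two fingers touching an 8x8 sensor: two separate bright regions.
frame = [[0] * 8 for _ in range(8)]
frame[1][1] = frame[1][2] = 255
frame[5][6] = 255
points = find_touch_blobs(frame)   # two centroids, one per finger
```

Because each blob is labeled independently, the number of simultaneous touches falls out of the algorithm for free, which is why camera-based systems such as FTIR and diffused illumination are naturally multi-touch.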

Optical technologies include:

  • Optical Imaging or Infrared technology
  • Rear Diffused Illumination (DI)
  • Infrared Grid Technology (opto-matrix) or Digital Waveguide Touch (DWT) or Infrared Optical Waveguide
  • Frustrated Total Internal Reflection (FTIR)
  • Diffused Surface Illumination (DSI)
  • Laser Light Plane (LLP)
  • In-Cell Optical

Wave

Acoustic and radio-frequency wave-based technologies include:

  • Surface Acoustic Wave (SAW)
  • Bending Wave Touch (BWT)
    • Dispersive Signal Touch (DST)
    • Acoustic Pulse Recognition (APR)
  • Force-Sensing Touch Technology

Multi-touch gestures

Main article: Pointing device gesture

Multi-touch touchscreen gestures enable predefined motions to interact with the device and software. An increasing number of devices like smartphones, tablet computers, laptops or desktop computers have functions that are triggered by multi-touch gestures.
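As a concrete example of how such a gesture is computed, pinch to zoom and two-finger rotation both reduce to comparing the positions of two contact points over time: the ratio of finger separations gives the zoom factor, and the change in the angle of the line between the fingers gives the rotation. The function below is a minimal sketch under those assumptions, not any platform's actual gesture recognizer.

```python
import math

def pinch_scale_and_rotation(start, end):
    """Given two touch points at gesture start and end, return the
    zoom factor and rotation (radians) implied by the finger motion.
    `start` and `end` are pairs of (x, y) tuples."""
    (ax0, ay0), (bx0, by0) = start
    (ax1, ay1), (bx1, by1) = end
    d0 = math.hypot(bx0 - ax0, by0 - ay0)    # initial finger separation
    d1 = math.hypot(bx1 - ax1, by1 - ay1)    # final finger separation
    scale = d1 / d0                          # > 1 means zoom in
    a0 = math.atan2(by0 - ay0, bx0 - ax0)    # initial angle of the pair
    a1 = math.atan2(by1 - ay1, bx1 - ax1)    # final angle of the pair
    return scale, a1 - a0

# Fingers spread from 10 units apart to 20 units apart: 2x zoom, no rotation.
scale, rot = pinch_scale_and_rotation([(0, 0), (10, 0)],
                                      [(0, 0), (20, 0)])
```

A real recognizer would additionally apply hysteresis thresholds and distinguish pinch from scroll, but the geometry above is the core of the gesture.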

10/GUI

10/GUI is a proposed new user interface paradigm. Created in 2009 by R. Clayton Miller, it combines multi-touch input with a new windowing manager.

It splits the touch surface away from the screen, so that user fatigue is reduced and the user's hands do not obstruct the display.

An open-source community preview of the Con10uum window manager was made available in November 2009.

Info: Wikipedia Source

This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.
