Wisconsin Alumni Research Foundation

Information Technology
Virtual Touch Screens: New Input for Smaller Devices
WARF: P160021US01

Inventors: Xinyu Zhang, Chi Zhang, Joshua Tabor, Jialiang Zhang

The Wisconsin Alumni Research Foundation (WARF) is seeking commercial partners interested in developing a new virtual input technology for mobile devices.
As mobile devices become smaller, touch screen inputs have remained largely unchanged. But using the display as an input surface becomes problematic as the user’s finger obstructs an increasingly larger portion of the screen on devices like smartphones and smartwatches.

While alternative mobile interaction technologies such as motion-sensing software and vision-based input detection systems have been developed, these approaches rely on external sensors attached to the hands or on bulky cameras and projectors.
The Invention
UW–Madison researchers have developed a new virtual touch screen technology utilizing unused space to the side of a device display. The technology is a low-cost passive finger localization system based on visible light sensing. It provides a simple and convenient interface that does not require additional external equipment such as a sensor attached to the finger.

A sensor system on the edge of a mobile device uses photodetectors and a light source to track finger motion based on reflected light signals within the narrow light-sensing plane of the virtual touch screen. The signals are converted to orthogonal coordinates and subsequently output to the graphics display screen.
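The conversion from a reflected-light reading to screen coordinates can be sketched as below. This is an illustrative model only, not the actual system: it assumes the detection angle is known from the sensor array and that reflected intensity falls off with the inverse square of distance, so distance can be recovered from a calibrated reference intensity. The function name and calibration parameters are hypothetical.

```python
import math

def polar_to_screen(angle_deg, intensity, i_ref=1.0, scale=1.0):
    """Estimate a fingertip's (x, y) position in the sensing plane.

    angle_deg: detection angle reported by the photodetector array (degrees)
    intensity: measured reflected-light intensity
    i_ref:     calibrated intensity at unit distance (assumed constant)
    scale:     calibration factor mapping model units to screen units
    """
    # Inverse-square model: intensity ~ i_ref / r^2, so r ~ sqrt(i_ref / intensity)
    r = scale * math.sqrt(i_ref / intensity)
    theta = math.radians(angle_deg)
    # Convert the polar estimate to orthogonal (Cartesian) coordinates
    return r * math.cos(theta), r * math.sin(theta)
```

For example, a reading at 60° with a quarter of the reference intensity places the finger at twice the unit distance along that bearing. A real implementation would add filtering and per-device calibration on top of a geometric model like this.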
Applications
  • Mobile devices, including smartphones, tablets and smartwatches
  • Gaming devices
  • Alternative user interfaces
Key Benefits
  • Simple, convenient and intuitive interface
  • Eliminates problems with a finger blocking the display during normal touch screen operation
  • Optimizes screen surface area using near-field space
  • Does not substantially increase the size of the device
  • Eliminates need for bulky external projection equipment
  • Does not require the user to wear a device or sensor on their finger
  • Minimizes costs associated with alternative keyboard and motion tracking hardware
Related Publications
  • Zhang C., Tabor J., Zhang J. and Zhang X. 2015. Extending Mobile Interaction Through Near-Field Visible Light Sensing. In ACM International Conference on Mobile Computing and Networking (MobiCom ’15).
  • Wang J., Zhao K., Zhang X. and Peng C. 2014. Ubiquitous Keyboard for Small Mobile Devices: Harnessing Multipath Fading for Fine-Grained Keystroke Localization. In MobiSys ’14, 14–27.
For current licensing status, please contact Michael Carey at [javascript protected email address] or 608-960-9867.