
Two researchers from the Swiss Federal Institute of Technology in Zurich (ETH) have developed an artificial intelligence (AI) that enables devices to identify more precisely where a finger touches the screen. Computer science professor Christian Holz and doctoral student Paul Streli from the Sensing, Interaction & Perception Lab are presenting their AI solution for more precise touch screens at CHI 2021, the conference on Human Factors in Computing Systems, according to a press release.
In their presentation, they show that a third of the touch-input errors on current devices stem from the low resolution of input sensing. While the displays of mobile devices have become ever higher-resolution, the sensors that detect a finger's input on the touch screen have barely evolved since their introduction in the mid-2000s. The display of the most recent iPhone has a resolution of 2532 x 1170 pixels, but its touch sensor can only detect inputs at a resolution of around 32 x 15 points, almost 80 times lower than the screen resolution. The new solution from Holz and Streli, CapContact, addresses these errors with a form of machine learning known as deep learning.
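The roughly 80-fold gap quoted above can be checked directly from the two figures given in the article; this minimal sketch simply divides the display resolution by the touch-sensing resolution along each axis:

```python
# Arithmetic check of the article's figures: display resolution of the
# latest iPhone (2532 x 1170 pixels) vs. its touch-sensing resolution
# (around 32 x 15 points).

display = (2532, 1170)  # display pixels, per the article
sensor = (32, 15)       # touch-sensing points, per the article

ratio_x = display[0] / sensor[0]  # horizontal ratio
ratio_y = display[1] / sensor[1]  # vertical ratio

print(f"sensing is ~{ratio_x:.1f}x coarser horizontally, "
      f"~{ratio_y:.1f}x coarser vertically")
```

Both ratios come out just under 80, consistent with the article's claim that the touch sensor's resolution is almost 80 times lower than the display's.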
According to the press release, the AI solution has a good chance of being used for touch recognition in future cell phones and tablets. The system can even distinguish between contact areas when fingers move very close together on a screen, for instance when pinching to enlarge text or images. Current devices can barely tell such touches apart.