Researchers at UC Santa Barbara have developed a new display technology that enables users to both see and feel on-screen graphics. The innovation uses screens patterned with small pixels that expand into bumps when illuminated, allowing for dynamic graphical animations that can be viewed visually and sensed through touch.
Max Linnander, a PhD candidate in mechanical engineering professor Yon Visell's RE Touch Lab, led the research. The team's findings were published in Science Robotics this month.
The idea began as a challenge from Visell to Linnander in September 2021: “The question was simple enough: Could the light that forms an image be converted into something that can be felt?” Linnander said.
“We didn’t know if it was feasible,” Visell added. “The possibility that it might be impossible — and the very idea of enabling people to ‘feel light’ — made the question irresistible.”
After nearly a year of theoretical work and computer simulations, the team moved to building prototypes. Progress was slow until December 2022, when Linnander demonstrated a working prototype for Visell using just a single pixel activated by brief flashes from a diode laser.
“I put my finger on the pixel and felt a clear tactile pulse whenever the light flashed,” Visell recalled. “That was a special moment — the moment we knew the core idea could work.”
The technology relies on thin display surfaces containing arrays of millimeter-sized optotactile pixels. Each pixel is powered by projected light from a low-power laser, which also serves as its control mechanism. Inside each pixel is an air-filled cavity with a suspended graphite film; when hit by light, the film heats up quickly, causing the air to expand and push out the surface above it by up to one millimeter.
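The heating-and-expansion mechanism can be illustrated with a back-of-envelope calculation. The sketch below treats the heated cavity air as an ideal gas expanding at constant pressure, so the surface rises like a piston. The millimeter pixel size comes from the article; the cavity height, ambient temperature, and temperature rise are illustrative assumptions, not figures from the paper.

```python
import math

# Assumed parameters (illustrative only, not from the study)
T0 = 293.0    # ambient temperature, K
dT = 60.0     # temperature rise of the cavity air, K
d  = 1.0e-3   # pixel/cavity diameter, m ("millimeter-sized", per the article)
h  = 1.0e-3   # cavity height, m

area = math.pi * (d / 2) ** 2   # cavity cross-section
V0   = area * h                 # initial air volume
dV   = V0 * dT / T0             # Charles's law: V grows in proportion to T at fixed P
lift = dV / area                # piston-like displacement of the surface

print(f"surface lift ≈ {lift * 1e3:.2f} mm")  # → surface lift ≈ 0.20 mm
```

Under these assumed numbers the lift is a few tenths of a millimeter, the same order as the up-to-one-millimeter displacement the researchers report; the real device's response also depends on membrane stiffness and heat transfer, which this toy model ignores.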
This process happens rapidly enough that scanning a beam across multiple pixels produces dynamic shapes and animations that can be both seen and touched. Because all power and control are delivered via light, there is no need for embedded wiring or electronics within the display surface; instead, a scanning laser illuminates each pixel in quick succession.
Linnander noted that their devices have more than 1,500 independently addressable pixels—significantly more than previous tactile displays—and larger formats may be possible using modern laser video projectors.
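Driving every pixel with a single scanned beam imposes a timing budget: the more pixels and the higher the refresh rate, the less time the beam can dwell on each one. The 1,500-pixel count is from the article; the refresh rate below is an illustrative assumption.

```python
# Timing budget for a single scanned laser addressing all pixels in turn.
n_pixels   = 1500    # independently addressable pixels (from the article)
refresh_hz = 30.0    # full-display refresh rate, frames/s (assumed)

# Time available per pixel per frame if the beam visits each pixel once.
dwell_s = 1.0 / (n_pixels * refresh_hz)
print(f"dwell time per pixel ≈ {dwell_s * 1e6:.1f} µs")  # → ≈ 22.2 µs
```

This is why the pixels' fast photothermal response matters: each must absorb enough energy in tens of microseconds, which is also what makes scaling to larger formats with laser video projectors plausible.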
User studies showed participants could accurately locate individual illuminated pixels by touch alone with millimeter precision, perceive moving graphics, and distinguish different spatial and temporal patterns. According to researchers, these results indicate that their system can produce diverse tactile content.
Visell acknowledged historical precedents for turning light into mechanical action: “In the 19th century, Alexander Graham Bell and others used focused sunlight, modulated by the blades of a rotating fan, to excite sound in air-filled test tubes.” He explained that similar physical principles now underpin their digital display technology.
Potential applications include automotive touchscreens simulating physical controls, electronic books with tangible illustrations, and architectural surfaces supporting mixed reality experiences. Whatever the future may hold, Visell concluded, the technology his team has invented embodies a simple, intriguing idea: "anything you see, you can also feel."