Reimagining the web browser for eye-gaze control – the Cactus project

Eye trackers allow users to control a computer using only their gaze

April 20, 2025 | Chris Porter and Marie Buhagiar | 4 min read
Cactus set-up with an eye tracker and an adaptive switch. Photo: Marie Buhagiar/Chris Porter

The web has evolved into an essential space where we connect socially and professionally. Standard browsers like Chrome, Edge and Firefox provide access through interfaces such as keyboards, mice and touchscreens. For individuals with low vision or blindness, screen readers such as NVDA and JAWS open up access. But what happens if you have little to no motor control?

At the Human-Computer Interaction Laboratory within the Faculty of ICT, University of Malta, we are working on Cactus, a cutting-edge web browser that reimagines how people with limited mobility engage with the web. For these individuals, traditional input methods are often inaccessible. This is where eye-tracking technology comes into play.

Eye trackers allow users to control a computer using only their gaze. In theory, this opens up an entirely hands-free method of interaction. However, eye trackers face a fundamental challenge in daily use.

Unlike the precise control you get with a mouse or keyboard, eye-tracking technology introduces a small but significant degree of imprecision, making it especially challenging to select narrow links, buttons or fly-out menus found on most websites. Precision targeting – something mouse and keyboard users take for granted – becomes frustratingly unreliable.

Cactus was designed to overcome this barrier. The browser studies a website’s underlying structure and content, and then creates an alternative representation using a Quadtree data structure that organises interactive page elements by location for efficient searching and processing.

When it detects the user’s gaze in a particular area, the browser identifies and highlights nearby interactive elements, displaying them as eye-tracker-friendly targets in an interaction sidebar. 
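
As a rough illustration of how such a spatial index can serve a gaze query, the TypeScript sketch below builds a simple quadtree of interactive elements and returns every element within a given radius of a gaze sample. The field names, node capacity and 80-pixel search radius are our own assumptions for the example, not details of Cactus itself.

```typescript
interface InteractiveElement {
  id: string;     // e.g. a reference to the underlying DOM node
  x: number;      // centre of the element's bounding box, in page pixels
  y: number;
  label: string;  // text shown on the large, eye-tracker-friendly sidebar target
}

interface Bounds { x: number; y: number; width: number; height: number; }

class Quadtree {
  private elements: InteractiveElement[] = [];
  private children: Quadtree[] = [];

  constructor(private bounds: Bounds, private capacity = 8) {}

  // Store an element in this node, or push it down into a child quadrant
  // once the node is full.
  insert(el: InteractiveElement): void {
    if (!this.contains(el.x, el.y)) return;
    if (this.elements.length < this.capacity) {
      this.elements.push(el);
      return;
    }
    if (this.children.length === 0) this.subdivide();
    this.children.forEach(child => child.insert(el));
  }

  // Collect every indexed element within `radius` pixels of a gaze sample,
  // compensating for the tracker's imprecision.
  queryNear(gazeX: number, gazeY: number, radius: number): InteractiveElement[] {
    if (!this.intersectsCircle(gazeX, gazeY, radius)) return [];
    const hits = this.elements.filter(
      el => Math.hypot(el.x - gazeX, el.y - gazeY) <= radius,
    );
    return hits.concat(
      ...this.children.map(c => c.queryNear(gazeX, gazeY, radius)),
    );
  }

  private contains(x: number, y: number): boolean {
    const b = this.bounds;
    return x >= b.x && x < b.x + b.width && y >= b.y && y < b.y + b.height;
  }

  private intersectsCircle(cx: number, cy: number, r: number): boolean {
    const b = this.bounds;
    const nearestX = Math.max(b.x, Math.min(cx, b.x + b.width));
    const nearestY = Math.max(b.y, Math.min(cy, b.y + b.height));
    return Math.hypot(cx - nearestX, cy - nearestY) <= r;
  }

  private subdivide(): void {
    const { x, y, width, height } = this.bounds;
    const w = width / 2, h = height / 2;
    this.children = [
      new Quadtree({ x, y, width: w, height: h }, this.capacity),
      new Quadtree({ x: x + w, y, width: w, height: h }, this.capacity),
      new Quadtree({ x, y: y + h, width: w, height: h }, this.capacity),
      new Quadtree({ x: x + w, y: y + h, width: w, height: h }, this.capacity),
    ];
  }
}

// Hypothetical usage: index the page's links and buttons once, then query
// around each gaze sample to decide which targets to show in the sidebar.
const tree = new Quadtree({ x: 0, y: 0, width: 1920, height: 1080 });
tree.insert({ id: 'nav-home', x: 120, y: 40, label: 'Home' });
const candidates = tree.queryNear(140, 60, 80); // elements within ~80 px of the gaze
```

The appeal of a spatial index like this is that each gaze sample only needs to visit the quadrants overlapping the search radius, so lookups stay fast even on element-dense pages.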

Users can interact with these targets by focusing on them briefly (dwelling). Optionally, for people who have some mobility, Cactus also lets them select targets using an adaptive switch – such as a blink switch, head switch, foot switch or any other switch tailored to the user’s physical abilities. Adding a secondary input modality significantly improves interaction speed by eliminating the need to dwell on targets. Cactus was designed to integrate seamlessly with adaptive switches, allowing users to map multiple switches to different browser functions.
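
The dwell-versus-switch trade-off can be sketched in a few lines. The snippet below is a hypothetical illustration, not Cactus’s actual code: a dwell timer fires a selection after a fixed delay, while an adaptive switch press selects the currently focused target immediately; the method names and the 800 ms dwell time are assumptions.

```typescript
type SelectHandler = (targetId: string) => void;

class TargetSelector {
  private dwellTimer: ReturnType<typeof setTimeout> | null = null;
  private focusedTarget: string | null = null;

  constructor(
    private onSelect: SelectHandler,
    private dwellTimeMs = 800, // assumed dwell threshold; typically user-tunable
  ) {}

  // Called when the gaze enters a sidebar target: start the dwell countdown.
  gazeEnter(targetId: string): void {
    this.focusedTarget = targetId;
    this.clearTimer();
    this.dwellTimer = setTimeout(() => this.onSelect(targetId), this.dwellTimeMs);
  }

  // Called when the gaze leaves the target before the dwell time elapses.
  gazeLeave(): void {
    this.focusedTarget = null;
    this.clearTimer();
  }

  // Called when an adaptive switch fires: select the focused target at once,
  // without waiting for the dwell timer.
  switchPressed(): void {
    if (this.focusedTarget !== null) {
      this.clearTimer();
      this.onSelect(this.focusedTarget);
    }
  }

  private clearTimer(): void {
    if (this.dwellTimer !== null) {
      clearTimeout(this.dwellTimer);
      this.dwellTimer = null;
    }
  }
}
```

From there, supporting several switches is largely a matter of routing each physical switch to a different handler – for instance, one bound to selection and another to a back or scroll action.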

One of our biggest challenges was ensuring Cactus could adapt to the wide variety of websites that ignore or only partially follow accessibility standards like the Web Content Accessibility Guidelines (WCAG).

Cactus meets this challenge with intelligent algorithms that reshape web content in real time – bridging the gap between design intention and real-world usability.

Cactus is also fully cross-platform, running on Windows, macOS and Linux. This project represents a commitment to digital inclusivity at the HCI Lab, Faculty of ICT and the University of Malta, working to support digitally underserved members of our society. 

Cactus is funded by Xjenza Malta under the FUSION R&I: Research Excellence Programme.

Photo of the week

Photo: NASA/ESA/C.R. O’Dell (Vanderbilt University)

An image of the Helix nebula, captured by the Hubble telescope. Also sometimes referred to as the ‘Eye of God’ or the ‘Eye of Sauron’ nebula, it is a planetary nebula in the constellation Aquarius, with an estimated diameter of just under three light years at a distance of around 655 light years from Earth. 

Sound Bites

• Researchers have unravelled the mapping behind a previously unknown control system for vision. Vision depends on a seamless balance between several brain structures, and scientists have now mapped the smallest of these control systems: an inhibitory pathway in the retina involving 44 different types of cells, each with its own specific function. This research could be a cornerstone for future studies into why certain eye disorders develop.

For more soundbites, listen to Radio Mocha every Saturday at 7.30pm on Radju Malta and the following Monday at 9pm on Radju Malta 2: https://www.fb.com/RadioMochaMalta/.

DID YOU KNOW?

• Peripheral blindness: Because of how our vision works, people often miss elements placed in the corners or at the edges of software interfaces. Even people with perfect vision lose significant detail outside their central focus area. Effective designs account for this by placing critical elements where users naturally look, or by making peripheral elements larger and higher in contrast to compensate for our biological limitations.

• Change blindness: Your brain’s efficiency at filtering information creates another surprising effect: people often fail to notice significant visual changes when they occur on the screen during eye movements (saccades), blinks or brief interruptions. This is why users may miss important notifications or updates that appear while they’re focusing elsewhere on your interface – and not because they’re not paying attention. Good software interface design compensates for this by using motion, contrast, sound or persistent visual indicators to draw attention to important changes, especially those that are outside a person’s current focus.

For more trivia, see: www.um.edu.mt/think.

 
