Friday, 10 January 2014

Lancaster University student develops way to transfer computer files with your eyes

Image via Lancaster University
A new way of dragging and dropping files has been developed by Lancaster University student Jayson Turner.

The system, called EyeDrop, enables the user to move items on screen by gazing at the object through an eye tracker. This automatically selects the object, which can then be transferred to a tablet or smartphone via its touchscreen.

The system relies on the two devices being connected wirelessly and was developed by PhD student Jayson Turner at the School of Computing and Communications, along with Andreas Bulling and Hans Gellersen.
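To illustrate the general gaze-then-touch flow described above, here is a minimal conceptual sketch. It is not the authors' implementation: the class names, the payload format and the use of a plain socket connection are all assumptions made purely for illustration. The idea is that a gaze tracker reports which on-screen object is being looked at, and a touch on the paired mobile device completes the transfer over the wireless link.

```python
import json
import socket

# Hypothetical sketch of the gaze-select / touch-drop interaction described
# above. None of these classes or the payload format come from the real
# EyeDrop system; they only illustrate the pattern: gaze selects an on-screen
# object, a touch on the paired mobile device completes the transfer over a
# wireless connection.

class GazeTracker:
    """Stands in for a head-mounted eye tracker that maps gaze to screen objects."""

    def object_under_gaze(self, screen_objects):
        # A real tracker would map a gaze point to screen coordinates and
        # hit-test the objects; here we simply pretend the first object is
        # the one being looked at.
        return screen_objects[0] if screen_objects else None


def transfer_on_touch(selected_object, device_address):
    """Send the gaze-selected object to the paired device once its screen is touched."""
    payload = json.dumps({
        "type": selected_object["type"],                 # e.g. "photo"
        "uri": selected_object["uri"],                   # where the device can fetch the content
        "metadata": selected_object.get("metadata", {}),
    }).encode("utf-8")
    with socket.create_connection(device_address, timeout=5) as conn:
        conn.sendall(payload)


if __name__ == "__main__":
    screen_objects = [{
        "type": "photo",
        "uri": "http://example.local/img/1.jpg",
        "metadata": {"title": "Holiday photo"},
    }]
    tracker = GazeTracker()
    selected = tracker.object_under_gaze(screen_objects)   # step 1: gaze selects
    if selected is not None:
        # Step 2: a touch on the phone or tablet would trigger the transfer.
        transfer_on_touch(selected, ("192.168.0.42", 9000))
```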

“It’s very swift and smooth because I wanted to eliminate the unnecessary steps in the interaction and let people move things quickly and fluently,” he explains.

"Selected content can contain metadata allowing it to be used for varying purposes. It’s useful if, for example, you want to drag objects to an interactive map and plot a route. It will allow you to manipulate the object – like a photo – as you transfer it and share it.”

He said there are still many issues to be overcome, including the need to wear eye-tracker glasses in order to use EyeDrop.

“But this could be overcome if, instead, the gaze-tracking technology was included within, say, a display, so it lets you select and cut and paste an image being displayed. But that has privacy issues, since not everyone would want this, so all this needs to be resolved at the same time the technology is being developed.”

Jayson presented his research at the twelfth Conference on Mobile and Ubiquitous Multimedia in Sweden in December.

• The proceedings of the conference are now available in the ACM Digital Library. They are not indexed yet, but soon will be. Using the direct link and then the table of contents will show all publications, including the BibTeX and PDF files: http://dl.acm.org/citation.cfm?id=25418

Here's a direct link to a page where you can buy a copy of Jayson's original paper, “Extending the visual field of a head-mounted eye tracker for pervasive eye-based interaction”.
