Well, I’ve been after a pair of eyetracking glasses for about five years, ever since I found out that such a thing existed. Now, I finally have a pair in my office. Yes, I am as excited as a six-year-old at Xmas.
We’ve bought a really nice piece of kit from Pupil Labs, a Berlin-based company who have acted as disruptors in the eyetracking market, which has hitherto been dominated by Sweden’s Tobii. As promised in their literature, the glasses are lightweight and comfortable to wear. For my taste they’re a bit ‘nerd chic’, but they don’t look especially weird. Pupil Labs have a new piece of kit coming out later this year which allows much more customisation, including more stylish frames, but even so, the glasses don’t look like anything particularly remarkable. Which is, of course, perfect.
Some of the work I read about mobile eyetracking (MET) written in the early/mid-2010s had participants talking about how self-conscious they felt wearing the rather alien-looking glasses in public spaces. These new designs definitely mitigate this and after a little while I didn’t really notice that I was wearing them when having a walk around.
My colleague Sang-Hoon in Sports Science has promised to sit down with me at some point and show me how to use the machine learning software he’s built for automatic object recognition. In the meantime, I’ve had a bit of a play with the standard functions that Pupil Labs provide and I’ll admit to being incredibly impressed – not least that you can produce a useful analysis with little to no skill.
The glasses plug into a fairly ordinary smartphone, which provides power and computational resources. You simply put on the glasses, press ‘record’ on the app and you’re ready to go – no complex calibration or other setup needed. A side-mounted camera records what is in your field of vision while infrared sensors built into the frames monitor your eye movements. You then send your participants off to walk around the environment that you’re investigating. At the end of the data collection, you press ‘stop’ and the app automatically uploads the recording to Pupil Labs’ (GDPR-compliant) cloud server for processing. You then get back a video showing the wearer’s field of view with an overlay marking where their eyes were pointing in the scene.
MET can be used for clinical work, making very precise measurements to examine cognitive responses to different stimuli. Most users, however, seem to use it in a more descriptive way – looking at the amount of time spent looking at different objects within the study environment. One of the cleverest things you can do is take reference photos of an object of interest (an information board, a display case, even something as large as a building) and map the eye movement record from the head-mounted camera onto the still photograph. There’s a little bit of photogrammetry operating in the background to make this work, but the upshot is something like this...
Here we have a side-by-side comparison showing me looking around the lobby of our building next to a reference image of an information board that is located there. The red dot shows where I’m looking, and you can see that the red dot on the right-hand reference image appears in the same places as it does (viewed from different angles) in the eyetracking video.
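For the curious, the core of that mapping is simpler than it sounds: once the software has estimated a homography (a 3x3 matrix relating the scene-camera view to the flat reference image), projecting a gaze point across is a single matrix operation. The sketch below is purely illustrative and not part of the Pupil Labs toolchain – the example matrix is an invented one, and in practice the homography would be estimated from matched feature points:

```python
import numpy as np

def map_gaze_to_reference(gaze_xy, H):
    """Project a gaze point from the scene-camera frame into the
    reference image via a 3x3 homography H (with perspective division)."""
    x, y = gaze_xy
    p = H @ np.array([x, y, 1.0])  # homogeneous coordinates
    return (p[0] / p[2], p[1] / p[2])

# Hypothetical homography: a pure scale-by-2, just for illustration.
H = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

print(map_gaze_to_reference((100.0, 50.0), H))  # (200.0, 100.0)
```

A real homography would also encode the rotation and perspective distortion of the glasses-wearer’s viewpoint, which is why the red dot lands in the right place on the reference image no matter what angle I look at the board from.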
Based on this you can then create a heatmap showing the areas where I had the most fixations, as an indicator of the parts of the reference image I paid most attention to. Although Pupil Labs’ inbuilt software currently only allows you to create heatmaps for one user at a time, it’s a fairly simple matter to extract the raw data and create overlays of multiple participants. That allows you to examine whether, say, female participants look at different things in an environment compared to male participants.
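To give a flavour of what that multi-participant overlay involves – and assuming you’ve already exported each participant’s fixation coordinates from the raw data (the coordinate lists below are invented for illustration) – you can simply pool the fixations and bin them into one grid:

```python
import numpy as np

def combined_heatmap(fixations_by_participant, width, height, bins=50):
    """Pool fixation coordinates (pixels on the reference image) from
    several participants and bin them into a single 2D histogram."""
    xs, ys = [], []
    for fixations in fixations_by_participant:
        for x, y in fixations:
            xs.append(x)
            ys.append(y)
    heat, _, _ = np.histogram2d(ys, xs, bins=bins,
                                range=[[0, height], [0, width]])
    return heat  # rows correspond to image y, columns to image x

# Two hypothetical participants' fixations on a 1000x600 reference image.
participant_a = [(120, 80), (130, 85), (500, 300)]
participant_b = [(125, 90), (510, 310)]
heat = combined_heatmap([participant_a, participant_b],
                        width=1000, height=600)
print(heat.sum())  # 5.0 - one count per pooled fixation
```

Splitting the participant lists by whatever grouping interests you (gender, age, first-time versus repeat visitors) and generating one heatmap per group gives you the comparison described above.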
It's cute and there’s much to play with here. There are a handful of people working in planning and urban design who have started to use this technology since it became available but it’s still early days. I’m currently exploring the possibilities of a collaborative pilot project with a colleague in the UAE looking at urban design and sustainability in Ras Al-Khaimah. Similarly, there’s a possibility for doing some work looking at visitor engagement in the Lapworth, the geological museum that is part of my department. If anything comes of either of these, I’ll post an update presently.
Phil Jones is a cultural geographer based at the University of Birmingham.