Just as we are heads down in the final run to the end of term and Xmas I've finally managed to regain access to my website after several months of password-based shenanigans. What better way to celebrate than to upload the piece that I intended to post at the end of the summer about eye tracking...
One of the things I promised myself while I was in Australia was that I would hack together an inexpensive eyetracker this summer that I could give my students to play with. What I’ve done is buy a Tobii 4C – basically a toy that is designed to be used with certain video games to pan the screen around via eye movement. This cost me £121 on Amazon. It’s a small bar that attaches to the bottom of your screen and plugs into your computer via USB. All the clever stuff is done onboard the eyetracker itself (which is essentially a camera that can see your eye movements) so it will run very happily on a low-powered laptop.
Tobii explicitly prevent you from using their API to capture the raw datastream from this device, so you can’t record the underlying eyetracking data for research purposes. This is fair enough: a lot of people who don’t need clinical-grade eyetracking would probably find this device ‘good enough’ and thus not pay the hefty subscriptions for Tobii’s more accurate equipment and software.
Tobii do, however, allow you to create a representation of your eye movements across the screen using their ‘Ghost’ software, from which you can undertake what we might call descriptive eye tracking. You can then connect this to streaming software (they recommend OBS Studio), which can either record or live-stream what you see on your screen to services like YouTube and Twitch. Thus you can record a video with an eyetracking overlay, showing the researcher what participants were looking at when viewing the screen. This works with movies, games, websites, or simply a collection of images in PowerPoint. In the image below you can see me playing around with a heatmap-style representation of where I’m looking on an image of a fantasy city.
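For anyone wanting to streamline the recording step, OBS Studio supports a handful of launch parameters, so once you have built a scene that layers the Ghost overlay window on top of a display capture, you can start a recording session from a single command. The scene and collection names below are hypothetical – substitute whatever you called yours in OBS:

```shell
# Launch OBS Studio with a pre-built scene collection ("EyeTracking")
# and scene ("GhostOverlay") that stack the Tobii Ghost overlay window
# above a display-capture source, then begin recording straight away.
obs --collection "EyeTracking" --scene "GhostOverlay" --startrecording --minimize-to-tray
```

Swapping `--startrecording` for `--startstreaming` sends the same overlaid view to whichever streaming service is configured in the active OBS profile.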
In no way is this good enough for doing advanced psychological work. Indeed, such a setup would be rightly dismissed as ‘descriptive’ by anyone who works in this field. It is, however, quite cute and a cheap way of demonstrating the principles and possibilities of work using this kind of technology. And sometimes ‘descriptive’ work is good enough to highlight potentially interesting research questions that you might want to investigate through other means.
I used this setup at the RGS ‘Digital Landscapes’ event co-organised by my excellent PhD student Tess Osborne in August. Again, it’s more about starting a conversation than doing anything approaching ‘science’ with this kind of tool. I've since given it to the third-year students taking my Geographies of the Body module to see what projects they'd come up with. They've done some quite cool stuff: a look at representations of London and New York in cinema, a project on Asian beauty standards, and a comparison of green versus white (snowy) space. It’s very exciting seeing what the students come up with when you give them the opportunity to play around with different methods – a point I made at our last open day when I was getting applicants (and their parents!) to play with this eyetracking setup, as well as a VR driving experience.
Phil Jones is a cultural geographer based at the University of Birmingham.