The last six months have been unusually quiet. After a very busy period during the various Covid-related changes to our working practices, I’ve been given a reduced teaching load this year in order to write some grants. This has been a very nice payback for the lack of research time over the last two years. Just prior to the Xmas break I was finally able to submit a large grant to the ESRC that I’d first thought about in January. The success rate on open-call grants is very low, but it’s nice to have a punt at something that I’d really like to work on (it’s about groups of people walking together in VR).
In the last few months I’ve also applied to a number of smaller funding schemes, both internally and externally. Just before Xmas I heard I’d been successful in a bid to a UoB equipment fund, which means that in the new year I’ll be able to order a pair of eyetracking glasses and a powerful laptop set up for machine learning. I’ve wanted a pair of eyetracking glasses for years – indeed, I wrote about them on this blog back in April 2018. Pupil Labs are a relatively new entrant to the eyetracking sector but have established themselves as a rival to market leader Tobii – not least by significantly undercutting their prices and making their analysis tools available on an open source basis. This has meant we’ve been able to ask for a pair of their glasses without needing to seek significant outside funding.

For those disinclined to read the previous blog post, the basic idea of mobile eyetracking is to record video of what you’re looking at via a front-facing camera attached to a pair of glasses. Simultaneously, infrared cameras monitor where the wearer’s pupils are pointing at any given moment. Combining these measures means that one can see what people are paying attention to within their field of view. Psychologists use this technology to do some genuinely fascinating work, looking at minute variations in speed of movement, fixation and so on to examine questions around individual cognition. People like me, on the other hand, tend to think about the datasets in a more descriptive way – essentially asking what catches people’s attention and how much time they spend looking at different things in front of them.

Sang-Hoon Yeo, a colleague from sports and exercise sciences, has done some brilliant work using machine learning analysis of video footage to automatically classify elements within a person’s field of vision (people, cars, green space etc.). This automated process allows a more efficient calculation of the time spent looking at different elements in a scene (there’s a rough sketch of the kind of calculation involved at the end of this post). I was introduced to him as part of setting up the new Birmingham XR group and got very excited about the potential of the technique he’s developed.

Hence with the new glasses there’s a bunch of interesting projects that we can collaborate on, including revisiting a couple of project ideas that fell down for lack of funding to source the equipment. I’m most excited about exploring how we can use the glasses to look at visitor engagement, and have had a chat with a colleague from the Lapworth Museum who is happy to let us try a few things out. There’s also potential to go back to the idea of looking at eyetracking and cycling. So while the end to this year has been unusually calm, I suspect that there will be much to do in 2023. I’m excited to get started.
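For anyone curious what that calculation looks like in practice, here’s a minimal sketch in Python. It assumes you already have a gaze coordinate for each video frame and a semantic segmentation mask for each frame (from whatever model you like); the data structures, class labels and the dwell_time_per_class function are my own hypothetical stand-ins, not Sang-Hoon’s pipeline or the Pupil Labs export format. The idea is simply to look up which class of object sits under the gaze point in each frame and add up the time.

```python
# Illustrative sketch only: tallying how long a wearer looks at each class of
# object, given per-frame gaze coordinates and per-frame segmentation masks.
# These are hypothetical stand-ins, not any real eyetracking export format.

import numpy as np

CLASS_NAMES = {0: "background", 1: "person", 2: "car", 3: "green space"}

def dwell_time_per_class(gaze_points, seg_masks, frame_duration_s):
    """Sum the time spent looking at each object class.

    gaze_points: array of (x, y) pixel coordinates, one row per video frame
                 (NaN rows where the gaze estimate was lost).
    seg_masks:   array of shape (n_frames, height, width) of integer class IDs,
                 e.g. produced by an off-the-shelf segmentation model.
    frame_duration_s: duration of one frame, e.g. 1/30 for 30 fps video.
    """
    totals = {name: 0.0 for name in CLASS_NAMES.values()}
    for (x, y), mask in zip(gaze_points, seg_masks):
        if np.isnan(x) or np.isnan(y):
            continue  # skip frames where the eyetracker lost the pupil
        class_id = int(mask[int(y), int(x)])  # class under the gaze point
        totals[CLASS_NAMES.get(class_id, "background")] += frame_duration_s
    return totals

# Toy example: 3 frames of a 4x4 scene at 30 fps, right half is green space.
masks = np.zeros((3, 4, 4), dtype=int)
masks[:, :, 2:] = 3
gaze = np.array([[3, 1], [3, 2], [0, 0]], dtype=float)
print(dwell_time_per_class(gaze, masks, 1 / 30))
```

In reality the clever part is generating those segmentation masks automatically and matching them to the eyetracker’s timestamps, but the end result is the same kind of descriptive summary: how much time was spent looking at people, cars, green space and so on.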