The 4th was strong in this one!

16 December 2015 / Tim Holmes

Wow, literally only a few days to go till the excitement and festivities begin! But after the release of Star Wars we still have the anticipation of card shops full of Valentines and supermarket shelves full of Easter Eggs to tell us it’s nearly Christmas and time for the obligatory plethora of reviews of the year. So, not wanting to be left out, I thought I’d use this opportunity to pick out a few of my favourite highlights from Acuity’s 2015 achievements, including a couple of things which might have escaped your attention if you only know us from this blog. AI turned 4 this year, which was a milestone in itself since it meant we found ourselves in uncharted territory, far beyond the extent of our original business plan! For me the year was dominated by the rapidly growing interest in head-mounted eye-tracking, which in turn has meant spending lots of time working with people collecting data in “the real world” and helping them to deal with the challenges of such uncontrolled environments. This aspect of 2015 was partly responsible for two of my highlights, so without further procrastination, and in no specific order, here’s my Top 5 for the year…

Tipping point

Going into 2015 I had set myself a goal of being more active on our blog which, due to other work demands, had gone a bit quiet in the second half of last year. So this year I’ve hit you with more than 20 blog posts and I don’t know how many tweets. One of the dominant topics on the blog this year has been the Eye-Tracking Tips of the Day, which now total 40; taken together they constitute a free and practical guide to eye-tracking, whether you are coming from an academic or commercial background. I recently posted an index to the tips to make them more usable, which you can find here.

As I already mentioned, technology and software in eye-tracking are moving on at a pace, and whilst I toyed with the idea of pulling the tips together into a PDF “book”, I think the blog format makes it easier for me to roll out amendments and new tips. Hopefully this will be seen as an ongoing, valuable resource for anyone who doesn’t want that sinking feeling you get when you return from a day’s data collection only to find you have nothing useful to analyse! In 2016, therefore, I expect you’ll see another round of tips appearing and also, as mentioned in my last tip-post, I will be writing in more detail about some of the meatier topics. I actually got fan mail from the blog this year, which was super nice, and have had some great chats on Twitter about the tips, so please keep in contact and let me know what you think, or if you have any suggestions for other eye-tracking issues you think I’ve overlooked!

Sacher taught! (see what I did there?)

If you read the blog posts from ECEM (European Conference on Eye Movements) in the summer you will not be surprised to see this as one of my highlights. Since embarking on my PhD back in 2006, I’ve been to, and presented at, many conferences and it’s always interesting to reflect on what makes a conference successful. ECEM got the mix just right this year with great organisation, excellent keynotes and, yes, a beautiful university and city as its backdrop. The research I saw being presented kept me invigorated till the very last poster session (which was a good job, because that’s where mine was) and really highlighted the diversity of cognitive and behavioural research questions that can be addressed with eye-tracking. It was also great to see the Open Source movement well represented, with a range of tools and algorithms for research being shown. OK, so the lunch bags sucked, although even being given a lunch bag sets it above several conferences I’ve been to, and yes, some of the rooms were incredibly hot and overcrowded, but there’s something quite heart-warming about being in a QUEUE to hear talks about visual search. It confirms what every truly good conference should: that as a researcher you are part of a community, and it is only through the free movement of ideas that the field as a whole will progress.

“Snakes. Why’d it have to be snakes?”

And now, as they say, for something completely different. In the past few years I’ve done my bit for the promotion of eye-tracking through appearances on TV and radio, and if you didn’t blink and miss it, you might have caught me helping with the eye-tracking for Jamie Oliver’s Sugar Rush on Channel 4 this year. But eye-tracking isn’t all we do at AI, and so when I was offered the chance to get involved with something based on a different biometric signature, electrodermal activity, I wasn’t going to say no – especially when it meant a couple of days in the desert outside LA!

Acting as scientific advisor for what is essentially a commercial is an interesting experience, because you are constantly juggling entertainment and scientific rigor. In the Toyota Tacoma Sal a Jugar video, we presented a competition comprising a range of different activities, such as kayaking, off-road driving and camping, for three teams of two. The twist was that the winning team would not be decided based on their skills or time to completion for the tasks, but would instead be determined by their emotional response to the competition. In other words, the team that got most excited by what they were doing were the winners. The resulting 2:17 film doesn’t have time to detail the technology or science behind the data collection and analysis (performed by a very stressed Tim hunting for mobile data coverage in the desert at 4am) and instead replaces it with me doing my best impression of Anne Robinson on The Weakest Link. But for those of you who missed it, it is a great example of how we at AI obtain some of the experience that enables us to talk about data collection in challenging, real-world scenarios, because I suspect “beware of the rattlesnakes” is not on most people’s checklist! The punishing heat, the waterproofing of technology, and the access to power and internet for data streaming were all issues we had to overcome and, whilst the environment may have been a little more extreme than for shopper research on Oxford Street or a UX study in Shoreditch, they represent the real world that increasingly forms the backdrop for commercial and, finally, for academic research. The whole thing was hard work but also a lot of fun, and with 1.3M views on Facebook for the film it turned out to be not such a bad way of promoting the use of such methods!

Eye-school reunion

As a result of that LA film shoot, it was a very jet-lagged Tim, who had only flown back the night before, that stood up in front of the attendees of the first Royal Holloway Real-World Eye-Tracking Workshop on July 20th to talk about designing eye-tracking research, selecting technology and analysing eye-movement data, but the enthusiasm of those attending combined with mine for the subject at hand to make this truly memorable. When I was a Teaching Fellow at Royal Holloway, I always enjoyed working with smaller groups more than lecturing to large theatres because of the immediacy of the learning they provide. Since starting AI, I have developed a number of training days and workshops for companies like The Guardian, BSkyB, Ralph Lauren, GSK and Omega Pharmaceuticals and have always found this approach to work best, especially given the practical nature of behavioural research. So it was super exciting to have the opportunity to work with my friends and collaborators at RHUL (Szonya Durant, Johannes Zanker, Robin Walker and Anat Bardi) to develop a 2-day eye-tracking workshop focussed specifically on the issues of researching in the real world. If you follow the blog, you’ll know it sold out within hours. We are currently updating some of the content ready for a second running of the course in April next year, and it is my genuine hope that this will become a highlight of each and every year. For those who have been trained by eye-tracking manufacturers or resellers, this course offers a unique opportunity to update knowledge, compare platforms in action and also to understand not just HOW you do stuff, but more importantly WHY. I only wish I’d had something like this available to me when I started out in this field, although then I might not have come up with all those eye-tracking tips of the day! 😉

Oxford not-at-all-blues

Finally, I couldn’t complete my list of high points without mentioning a quiet moment I had in Oxford this year, reflecting on the success and uniqueness of Acuity Intelligence which make it such a special place to work. I had been invited to speak at the 11th Oxford Symposium: Trends in Retail Competition: Choice and Innovation in Grocery – not the catchiest name for a symposium and certainly not an event I would have predicted speaking at when I started my PhD! Of course, for those who follow the blog, this came out of our research into copycat packaging, which identifies the inevitability of unconscious errors in shopper decision making at the point of sale that result directly from packaging similarity. The research continues to attract a lot of interest, and 2016 should see our full peer-reviewed paper on the subject being published, but in the meantime we have a few copies of the proceedings from this symposium if you’re interested.

Apart from the thrill of speaking at Oxford University (my mum was very proud), this research, like previous work, is an example of scientific rigor being applied to real-world questions in the commercial world, and is one of the aspects of AI I’m most proud of. I know there are plenty of academics who turn their noses up at research performed with commercial funding, citing potential conflicts of interest and confirmation biases, but I’ve found that a frank discussion of these issues at the start has enabled us to work with some companies in the same way I have with more traditional research funders, i.e. propose the research, obtain the funding, perform the research and analysis independently of the funder, and then report back to them, always with the commitment to release the results into the public domain. Now clearly this doesn’t work with all types of research, and we do plenty of other commercial research which never sees the light of day because our customers don’t want it to, but I would encourage a few more academics to consider this model if there’s a good alignment between your research questions and a non-traditional funder of scientific research – try pitching it, you might just be surprised by the answer!

So that’s it – 2015 in a nutshell. All that remains is for me to thank all our customers and everyone who has worked with me this year to make the 4th year of AI so exciting. I especially want to acknowledge the contribution of Alice, who left AI a couple of months ago to spread the word about eye-tracking and UX research; she played a part in many of the initiatives I have mentioned. I have some big ideas for AI in 2016, but first I need to complete my preparations for the main event – now where did I put my lightsaber…?
