There are numerous reasons why I wanted to do this - mainly because it was on every count better than the only other option I had - but I won't go into them here. All I'll say is that, after the Leeds Multimodality Conference, during which I attended an excellent Tobii workshop with Dr Tim Holmes of Acuity Intelligence and found out about PsychoPy, I was convinced that this would be the solution to the many problems I had been facing. I could say that it was also the start of a new set of problems, and I wouldn't be lying if I did, but the slant that I prefer is that it was the start of a (very) steep learning curve, one which I am very happy to be nearing the top of, and one which I can confidently say has changed my technophobic ways for the better.
Armed with some helpful user guides, words of encouragement from Tim and a few very kind and highly intelligent neighbours and colleagues, I finally got to the stage where I was able to run my full experiment through PsychoPy as I had hoped I would. Part of my problem was a lack of information online for people who really do know nothing about coding - I am (was?) one of these people, and while I see now that I got myself quite far using only information sourced from the excellent PsychoPy User Group, life would have been much easier, and a large amount of time would have been saved, if I'd had access to a simple step-by-step guide to setting up a simple experiment in PsychoPy that was able to talk to Tobii.
So, here is that step-by-step guide, and if one other computing novice out there benefits from it then it will all have been worthwhile! I may have missed out some obvious issues, and I may be using non-technical language that is insulting to anyone who knows anything about this stuff: for this I apologise in advance, and welcome any comments, suggestions or advice on this topic. There are still some technical issues, and if these get resolved I will most happily post further information as I acquire it. People much smarter than me have helped and still are helping with this at every stage, and I should say that none of them have found it easy or obvious - one small fact that I have consoled myself with on a number of occasions!
Please note that this guide is tailored to Windows computers only. If you are on Mac or Linux then let me know and I can send some more specific information :)
1. First thing to note: you need to find out what bitness your computer is (32-bit or 64-bit), as this is important for everything you do from here on in. You can find that out here, if you don't already know.
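If you already have Python installed, a quick way to check is to run a couple of lines in a Python shell - note that this tells you the bitness of the Python installation itself, which is what your downloads will need to match:

```python
# Check the bitness of the running Python installation.
import platform
import struct

print(platform.architecture()[0])   # e.g. '32bit' or '64bit'
print(struct.calcsize('P') * 8)     # pointer size in bits: 32 or 64
```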
2. Add a code component into your experiment. This will enable communication between PsychoPy and the eye-tracker. Some simple code can be copied from the Stroop for eye-tracking demo, which can be found in the materials from an ECEM workshop, located here (under 'Previous Events'). On pages 17-19 of the Py4ET PDF you can find the code for the Stroop demo. This can be copied and then amended to suit the purposes of your own experiment. Don't forget to make sure that the code is aligned exactly as it is in the Stroop demo: if you miss out an indentation it will not work!
For the purposes of my experiment I only needed PsychoPy to talk to the eye-tracker; no response was required from the participants in terms of mouse-clicks or key-presses, so I deleted these sections from the Stroop demo code. If you leave them in it will still work, but you will find entries in your data recording when you clicked the mouse to start, or pressed Escape to finish.
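For reference, here is a minimal sketch of the kind of thing my stripped-down code component ends up doing. This is not the Stroop demo code itself (do copy the real thing from the ECEM materials), and it assumes a reasonably recent ioHub API and a yaml config that names the eye-tracker device 'tracker':

```python
# Minimal sketch of an ioHub connection to a Tobii eye tracker, with the
# mouse/keyboard handling stripped out. Not the Stroop demo itself.
from psychopy.iohub import launchHubServer

# Start the ioHub process using the Tobii config file (see step 5).
io = launchHubServer(iohub_config_name='tobii_std.yaml')
tracker = io.devices.tracker   # assumes the yaml names the device 'tracker'

tracker.runSetupProcedure()       # run the tracker's calibration routine
tracker.setRecordingState(True)   # start streaming gaze samples

# ... present your stimuli here; if your experiment needs gaze positions,
# they can be polled with tracker.getLastGazePosition() ...

tracker.setRecordingState(False)  # stop recording
io.quit()                         # shut the ioHub process down cleanly
```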
3. Now you need to set up ioHub. This took me a while, as it wouldn't work and I couldn't work out why. ioHub is a Python package that enables the use of external devices and the monitoring and coding of events from those devices.
First, you need to make sure that you have Python installed on your computer. ioHub is now merged with PsychoPy, so if you have an up-to-date version of PsychoPy (1.74 or higher) you will already have ioHub installed by default. However, there are a few more packages that need to be installed before PsychoPy will talk to ioHub. These can be found here.
So, before you start, it's worth getting a few things in order on your C: drive. Make sure that Python has a folder on the C: drive – C:\Python27. Then make sure that your experiment is saved in this drive, under …\Lib\site-packages. Now you want to download all of the ioHub dependencies for the version of Python that matches the one on your C: drive, and save them all to the \site-packages folder. This makes sure that everything is in the right place for your experiment – if it isn't in the right place, PsychoPy won't know where to retrieve the files from.
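Once the dependencies are saved there, a rough way to check that everything can at least be found is to try the imports yourself; depending on your PsychoPy version the ioHub module may live in a slightly different place, so treat this as a sketch:

```python
# Rough sanity check that PsychoPy and ioHub can be imported. If an import
# fails, the error message usually names the missing dependency.
import psychopy
print(psychopy.__version__)

import psychopy.iohub  # in recent versions ioHub ships inside PsychoPy
print('ioHub found')
```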
4. Once ioHub is installed you need to install the SDK. This is a language binding, which means it enables communication between various coding software and Tobii. It works with Python, and so can be used with PsychoPy, but is also compatible with E-Prime and Matlab. SDK simply stands for software development kit, and Tobii has created it to integrate fully with its eye trackers, so it's very easy to run.
Before installing the SDK you need to make sure you have Bonjour downloaded on your system. This is a piece of software (Apple's zero-configuration networking service) which locates any eye trackers that are connected to your computer, either through USB or over a network.
You'll also need Microsoft Visual C++ 2008 SP1 Redistributable Package, which can be downloaded here.
Now it's time to install the SDK, which comes as a zip file and should be unpacked to the C: drive. As well as the SDK files you'll find some information on how to build experiments through the SDK (as opposed to using ioHub), and some demos, too.
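As a rough check that Python can see the SDK once it's unpacked (and once the path in step 6 is set), you can try importing the bindings. The module name below is my understanding of what the 3.0-series Analytics SDK ships, so adjust it if your version differs:

```python
# Rough check that the Tobii Analytics SDK Python bindings are reachable.
# The module name is from the 3.0-series SDK; adjust if yours differs.
try:
    import tobii.eye_tracking_io.mainloop
    print('Tobii SDK bindings found')
except ImportError as err:
    print('SDK not found - check the PYTHONPATH set in step 6:', err)
```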
5. Next you need to add the appropriate eye-tracker to the code and make sure that all of the relevant information is provided in PsychoPy. In 'Experiment Settings' at the top of the PsychoPy screen you will find some blank boxes for 'Experiment info'. Create a field for the eye tracker called Eye Tracker (or similar) and, under default, type in tobii_std.yaml. When you run the experiment this should come up in the dialogue box before you start.
Now we need to return to the ECEM materials folder to locate the .yaml file, which will tell PsychoPy all of the necessary information about the eye tracker. Go to the Stroop demos folder (with eye tracking) and you will find four .yaml files, each labelled for a different eye-tracker. Copy the tobii_std file and paste it into the site-packages folder.
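The point of the Experiment info field is that your code component can read the file name from the dialogue box rather than hard-coding it. A sketch, assuming the field is called 'Eye Tracker' as above:

```python
# Sketch: read the yaml file name from the startup dialogue box instead of
# hard-coding it. In a Builder code component, expInfo is already defined
# by the generated script; the field name 'Eye Tracker' matches step 5.
from psychopy.iohub import launchHubServer

config_name = expInfo['Eye Tracker']   # e.g. 'tobii_std.yaml'
io = launchHubServer(iohub_config_name=config_name)
```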
6. Now it's time to set a path for the experiment so that all of the packages can talk to one another. This is most easily done in the Environment Variables settings of your computer. In Windows 8 this can be found under Control Panel > System and Security > System > Advanced System Settings > Advanced. Click on the 'Environment Variables' button and then click on 'New…'. Under Variable Name type PYTHONPATH, and set the variable value to the Modules folder in the unzipped SDK folder. So, if the Modules folder can be found under C:\...\tobii-analytics-sdk-3.0.83-win-x32\tobii-analytics-sdk-3.0.83-win-Win32\Python27\Modules, you need to set the variable up as %PYTHONPATH%;C:\Users\Catherine\Documents\tobii-analytics-sdk-3.0.83-win-x32\tobii-analytics-sdk-3.0.83-win-Win32\Python27\Modules.
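If you'd rather not edit the Environment Variables, the same thing can be done from inside the experiment by appending the Modules folder to Python's search path at the top of your script. The path below is just the example from this guide, so adjust it to wherever you unzipped the SDK:

```python
# Alternative to setting PYTHONPATH system-wide: append the SDK's Modules
# folder to Python's search path at the start of the experiment script.
# The path is the example from this guide - adjust to your own location.
import sys

sys.path.append(r'C:\Users\Catherine\Documents\tobii-analytics-sdk-3.0.83-win-x32'
                r'\tobii-analytics-sdk-3.0.83-win-Win32\Python27\Modules')
```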
Now if you try to run your experiment in PsychoPy, it should work!
7. Finally, we have to get Tobii ready to start tracking eye movements while PsychoPy is running the actual experiment. This bit is easy! Just set up a new experiment and add the 'Screen Record' icon from the media toolbar to the timeline. Now you are ready to run the experiment!
To run your experiment, begin recording in Tobii and calibrate the participant as you would normally. When the calibration is finished, start the experiment in Tobii as normal – this will enable the eye-tracker to collect data on the participant's eye movements – and then run your experiment in PsychoPy. When the experiment is over, just press 'Esc' (or the equivalent key to finish running the experiment in Tobii) and screen recording will stop on the eye tracker.
Good luck! And please provide any feedback: both positive and negative comments are welcome!