Resources

Datasets

Over the years, we have collected eye-tracking data from thousands of participants. We publish these alongside our papers and hope they will be useful for the research community. Please check out the publications section or browse all projects on Ben's OSF. A good starting point may be Max's 2024 PNAS paper, which pools data from several hundred observers, each free-viewing 700 complex scenes: https://osf.io/y4dje/

We often structure our data as it is done here. See below for a brief overview, or check our MATLAB exercises for eyetracking analyses for a more in-depth explanation.

Returning to the example of Max's dataset: in the DataEye folder you will find three datasets. Let's look at the Gi2023 one, with 251 observers. The MATLAB file IndGazeAllFixData.mat contains a struct called LabeledFix. LabeledFix.Data is a cell array with one entry per participant. Each of these entries is a 2-D matrix containing all fixations of that participant. For instance, LabeledFix.Data{1} contains a matrix with 8531 fixations (rows) and 64 features (columns). Fixation features are listed in LabeledFix.Hdr and start with the image number*, the fixation number (on this image by this participant), the object number** the fixation fell on, the x and y coordinates of the fixated pixel in image space, and the duration and onset time of the fixation in ms. These are followed by semantic and other features of the fixation target, many of which may be irrelevant to you.
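If you want to explore the raw files yourself, here is a minimal MATLAB sketch of how the struct could be loaded and indexed. The exact column positions used below (image number in column 1, x/y in columns 4 and 5, duration and onset in columns 6 and 7) are assumptions based on the description above; please check LabeledFix.Hdr for the actual layout.

% Minimal sketch: load the fixation data and pull out a few features.
S  = load('IndGazeAllFixData.mat');   % creates a struct S with the field LabeledFix
LF = S.LabeledFix;

disp(LF.Hdr)                          % names of the 64 feature columns

nSubs = numel(LF.Data);               % one cell per participant (251 in Gi2023)
subj1 = LF.Data{1};                   % 8531 fixations x 64 features

% Assumed column positions (check LF.Hdr!): 1 = image number,
% 2 = fixation number, 3 = object number, 4/5 = x/y, 6 = duration, 7 = onset
imgNo = subj1(:, 1);
xy    = subj1(:, 4:5);
durMs = subj1(:, 6);

onImg12 = subj1(imgNo == 12, :);      % e.g. all of participant 1's fixations on image 12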

In this case, we also provide a preprocessed version of the LabeledFix struct in IndGazeAllPreprocessed.mat. In this version, fixations with a duration < 100 ms and any fixations beginning with a latency < 100 ms after image onset are excluded, as explained in the paper (a minimal filtering sketch follows the footnotes below).

*image numbers correspond to the order of images in the stimuli folder, which you can find at the parent level
**object numbers correspond to those provided in the attributes structs, which contain object masks etc. for each image. Please check out the MATLAB exercises for eyetracking analyses for more details
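For reference, the exclusions used for the preprocessed version could be reproduced along these lines. This is only a sketch: it assumes the data are loaded into LF as above and that duration and onset latency (both in ms) sit in columns 6 and 7, and it may not capture every detail of the preprocessing described in the paper.

% Sketch of the exclusions: drop fixations shorter than 100 ms and fixations
% starting less than 100 ms after image onset (column positions assumed as above).
minDur   = 100;   % ms
minOnset = 100;   % ms

cleanData = cell(size(LF.Data));
for s = 1:numel(LF.Data)
    F = LF.Data{s};
    keep = F(:, 6) >= minDur & F(:, 7) >= minOnset;
    cleanData{s} = F(keep, :);
end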

Software

We spend a lot of our time developing code for experiments and data analyses (mostly in MATLAB). Here are some of the toolboxes and apps we find helpful:

Custom code: we're committed to publishing our code and data wherever possible. For individual projects, check out the publications page and OSF.

Textbooks & tutorials

Some of the links above include learning resources for the respective packages. For neuroimaging analyses we recommend the excellent Handbook of Functional MRI Data Analysis by Poldrack et al. and Statistical Analysis of fMRI Data by Ashby. To get into MATLAB, we recommend MATLAB for Neuroscientists by Wallisch and colleagues.

We also developed two tutorials for lab members beginning their coding journey and made them public: one with MATLAB exercises for MRI analyses, and one with MATLAB exercises for eyetracking analyses.

Neuroimaging courses

In April 2021 we started PNiP: Perceptual Neuroimaging in Practice, a weekly methods seminar for researchers, which is loosely based on SPM's MfD. The second generation of PNiPsters has just finished the course, and we may start over again if and when we accumulate a critical mass of motivated junior researchers. Click here to find out more.

In September 2022 Mareike Grotheer and Ben held a one-day methods course on neuroimaging for members of the Collaborative Research Center 135. Here you can find Ben's slides on "What is fMRI and what can I do with it?"