Some of you already know what the above images are about, but I'm so happy about getting this to work that I thought I'd post a blog about it. It happens to coincide with the end of this year's REU (Research Experience for Undergraduates) program, during which I had five undergrads in the lab working on several eye tracking projects. This year it was all about video, which prompted me to develop the program responsible for drawing the above images. Collecting eye movement data (x,y,t) over video is what I worked on in Barcelona. It took me most of those six weeks to get enough C/C++ code together to display video while recording gaze data. Once that was done, I handed the program over to the REUs, who ran four studies and extended the program to do various other things. Meanwhile, the whole effort motivated me to figure out how to draw the captured data atop the video frames as a way to visualize the recorded gaze.

The algorithm for generating the above heatmaps is straightforward and well-known. Step 1 drops a Gaussian point-spread function at each gaze location, growing the resultant heightfield with as many gaze points as were collected for each video frame. Step 2 finds the maximum value in the heightfield. Sounds easy, but for an NxN image it takes O(N^2) operations. Step 3 normalizes the heightfield (division by the max value). Step 4 recolors the height (luminance) by mapping it to the rainbow color palette. The last two steps, which can be combined into one, take another O(N^2) operations. The image above at left was created this way on the CPU for a data set of 24 scanpaths (sequences of gaze points). It looks good but it's slow (it took about a minute). The image at right took only a fraction of a second and looks almost identical.
The trick here is to use the GPU to reduce the number of operations from O(N^2) to O(log N) for locating the max value, and to a single parallel pass (effectively O(1)) for the recoloring. On one particular workstation with a decent graphics card I observed a 700-fold speedup due to these reductions. That just blew me away, which is why I'm so excited about this development. I recently moved that bit of GPU code into my video-playing code, and sure enough, even for a fairly large data set (about 8 people or so), it appears to play the video at real-time (30 Hz) rates. I should take timings just to confirm how long it takes...this could make a nice little paper someplace. Other eye tracking types might like to know how the whole thing is put together...
