Pupil capture - jackspaceBerkeley/pupil GitHub Wiki

About

Pupil Capture is the software used with the Pupil Headset. The software reads the video streams coming in from the world camera and the eye camera. Pupil Capture uses the video streams to detect your pupil, track your gaze, detect and track markers in your environment, record video and events, and stream data in realtime.

Calibration

Pupil uses two cameras. One camera records a subject's eye movements -- we call this the eye camera. Another camera records the subject's field of vision -- we call this the world camera. In order to know what someone is looking at, we must find the parameters to a function that correlates these two streams of information.

Before every calibration

Make sure that the user's pupil is properly tracked. Make sure that the world camera is in focus for the distance at which you want to calibrate, and that you can see the entire area you want to calibrate within the world camera's extents (FOV).

Calibration Methods

First select the calibration method you would like to use:

Screen Marker Calibration

This is the default method, and a quick method to get started. It is best suited for close range eye-tracking in a narrow field of view.

Screen marker calibration controls

  1. Select Screen Marker Calibration
  2. Select your Monitor (if more than 1 monitor)
  3. Toggle Use fullscreen to use the entire extents of your monitor (recommended). You can adjust the scale of the pattern for a larger or smaller calibration target.
  4. Press c on your keyboard or click the blue circular C button in the left hand side of the world window to start calibration.
  5. Follow the marker on the screen with your eyes. Try to keep your head still during calibration.
  6. The calibration window will close when calibration is complete.
  7. Load the Show Calibration plugin from the General sub-menu to evaluate calibration quality.

In the Advanced sub-menu you can set the sample duration -- the number of frames to sample the eye and marker position. You can also set parameters that are used to debug and detect the circular marker on the screen.

Manual Marker Calibration

This method is done with an operator and a subject. It is suited for midrange distances and can accommodate a wide field of view. You need markers made of concentric circles, like the two shown below.

Manual Calibration Marker Manual Calibration Stop Marker

Download markers to print or display on smartphone/tablet screen.

Manual Marker Calibration GUI

  1. Select Manual Marker Calibration
  2. Press c on your keyboard or click the blue circular C button in the left hand side of the world window to start calibration.
  3. Stand in front of the subject (the person wearing the Pupil headset) at the distance you would like to calibrate (1.5-2 m).
  4. Ask the subject to follow the marker with their eyes and hold their head still.
  5. Show the marker to the subject and hold the marker still. You will hear a "click" sound when data sampling starts, and one second later a "tick" sound when data sampling stops.
  6. Move the marker to the next location and hold the marker still.
  7. Repeat until you have covered the subject's field of view (generally about 9 points should suffice).
  8. Show the 'stop marker' or press c on your keyboard or click the blue circular C button in the left hand side of the world window to stop calibration.
  9. Load the Show Calibration plugin from the General sub-menu to evaluate calibration quality.

You will notice that there are no standard controls, only an Advanced sub-menu to control detection parameters of the marker and to debug by showing edges of the detected marker in the world view.

Natural Features Calibration

This method is for special situations and far distances. Usually not required.

Natural Features Calibration GUI

  1. Select Natural Features Calibration
  2. Press c on your keyboard or click the blue circular C button in the left hand side of the world window to start calibration.
  3. Ask the subject (the person wearing the Pupil headset) to look at a point within their field of vision. Note -- pick a salient feature in the environment.
  4. Click on that point in the world window.
  5. Data will be sampled.
  6. Repeat until you have covered the subject's field of view (generally about 9 points should suffice).
  7. Press c on your keyboard or click the blue circular C button in the left hand side of the world window to stop calibration.
  8. Load the Show Calibration plugin from the General sub-menu to evaluate calibration quality.
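Across all three methods, "about 9 points" maps naturally to a 3x3 grid covering the field of view. A minimal sketch of such a layout -- the normalized coordinates and the margin are illustrative assumptions, not Pupil's actual target positions:

```python
# Generate nine calibration target positions on a 3x3 grid in
# normalized world-camera coordinates (0..1 on both axes).
# The 0.1 margin keeps targets away from the image border; both
# the margin and the ordering are illustrative assumptions.

def nine_point_grid(margin=0.1):
    coords = [margin, 0.5, 1.0 - margin]
    return [(x, y) for y in coords for x in coords]

points = nine_point_grid()
print(points)
```

Whatever the exact layout, the point of the rule of thumb is coverage: the targets should span the full area of the world image in which you later want accurate gaze data.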

Calibration Results

Loading the Show Calibration plugin from the General sub-menu will show an evaluation of the calibration quality. In a "good" calibration, the Number of used samples should be more than 180 and the fraction of used data points should be more than 0.75.
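These two thresholds can be expressed as a tiny check -- the function name and the idea of scripting this are ours, not part of Pupil:

```python
# Rough quality check using the two thresholds quoted above:
# more than 180 used samples, and a used fraction above 0.75.
def calibration_looks_good(used_samples, total_samples):
    if total_samples == 0:
        return False
    fraction_used = used_samples / total_samples
    return used_samples > 180 and fraction_used > 0.75

print(calibration_looks_good(200, 240))  # 200 used of 240, fraction ~0.83 -> True
print(calibration_looks_good(120, 240))  # too few samples -> False
```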

Show Calibration Results

The green outline shows the calibrated area. Orange shows the sampled data points. Red shows outliers. Large outliers are usually blinks; other large outliers can often be attributed to subject error (not looking at the marker). Open the image in another tab to see it at full resolution.

Notes on calibration accuracy

Using the screen-based 9-point calibration method, you should easily be able to achieve tracking accuracy within the physiological limits (1-2 visual degrees).

  • Any calibration is accurate only at its depth level relative to the eye (parallax error).
  • Any calibration is only accurate to the field of view (in the world video) that you calibrated. For example: If during your calibration you only looked at markers or natural features (depending on your calibration method) that are in the left half, you will not have good accuracy in the right half.
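The parallax point can be made concrete with a first-order geometric model: because the gaze mapping is 2D, a calibration at one depth mis-maps targets at other depths by an angle that grows with the offset between the eye and the world camera. A rough sketch -- the 30 mm baseline is an illustrative assumption, not a Pupil headset spec:

```python
import math

# First-order parallax model: the eye and the world camera are
# separated by a small baseline b. A 2D calibration at depth d_cal
# is only exact at that depth; at another depth d, the mapped gaze
# is off by roughly the difference between the two viewing angles.
def parallax_error_deg(baseline_m, d_cal_m, d_m):
    err = math.atan(baseline_m / d_m) - math.atan(baseline_m / d_cal_m)
    return abs(math.degrees(err))

# Calibrate at 1.5 m, then look at something 0.5 m away,
# with an assumed 30 mm eye-to-camera offset:
print(round(parallax_error_deg(0.03, 1.5, 0.5), 2))
```

Even with a small baseline, the error reaches a couple of degrees -- larger than the physiological accuracy limit -- which is why you should calibrate at the depth you intend to measure.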

Recording

Press r on your keyboard or press the blue circular R button in the left hand side of the world window to start recording. You will see red text with the elapsed time of recording next to the R button. To stop recording, press r on your keyboard or press the R button on screen.

Recorder

You can set the folder or Path to recordings and the Recording session name in the Recorder sub-menu within the GUI. Note -- you must specify an existing folder, otherwise the Path to recordings will revert to the default path.
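The revert-to-default behavior amounts to a directory-existence check. A sketch of the idea -- the function name and default path are illustrative, not Pupil's actual code or defaults:

```python
import os

# Mimic the documented behavior: an existing directory is accepted,
# anything else falls back to the default recordings path.
# DEFAULT_REC_PATH here is an illustrative placeholder.
DEFAULT_REC_PATH = os.path.expanduser("~/recordings")

def resolve_recording_path(requested):
    if requested and os.path.isdir(requested):
        return requested
    return DEFAULT_REC_PATH

print(resolve_recording_path("/nonexistent/folder"))
```

In practice this just means: create the folder before pointing Pupil Capture at it.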

What will be in the session folder?

If you open up a session folder you will see a collection of video(s) and data files. Take a look at Data format to see exactly what you get.

Pupil Broadcast Server

Pupil Server is a plugin that broadcasts data over the network using the excellent ZeroMQ library.

Using Pupil Server

Starting Pupil Server is easy.

  • Load the Pupil Server plugin from the General sub-menu in the GUI.

  • It will automatically begin broadcasting at the default address.

  • To change the address or port, type the new address after the tcp:// prefix.

    Pupil Server settings

Look at Streaming Data

If you want to see the data streaming from Pupil Server, you can just open up your browser (e.g. Chrome, Safari, Firefox) and paste the address into the navigation bar. In Chrome this will automatically start a download of the stream. Once you stop the stream, you can open up the download in a text editor to see the data.

Receiving Data with Your Own App

ZeroMQ has bindings for many languages. Reading the stream in Python goes like so:

import zmq

# network setup
port = "5000"
context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect("tcp://127.0.0.1:" + port)
# filter messages by topic prefix; '' subscribes to all messages
socket.setsockopt_string(zmq.SUBSCRIBE, '')

while True:
    msg = socket.recv_string()
    print("raw msg:\n", msg)

    # first line is the message type, the rest are key:value pairs
    items = msg.split("\n")
    msg_type = items.pop(0)
    items = dict(i.split(':', 1) for i in items if ':' in i)

    if msg_type == 'Pupil':
        try:
            print("norm_gaze: ", items['norm_gaze'])
        except KeyError:
            pass
    else:
        # process non gaze position events from plugins here
        pass

We have written some simple Python scripts that you can try with Pupil Server, for example to have your gaze control a mouse, or just to print the data streaming from Pupil Server. Check out Pupil Tools at this gist.

Message Format for Pupil Server

Messages from Pupil Server are broadcast as simple strings with newline delimiters. This formatting happens in pupil_server. See these lines of code for reference. A raw message sent from Pupil Server looks like this:

Pupil
confidence:0.695435635564
norm_gaze:(-0.23400470399355955, 0.23847916737192332)
apparent_pupil_size:41.609413147
norm_pupil:(0.76884605884552, 0.35504735310872393)
timestamp:1389761135.56

In this example, timestamp, norm_pupil, confidence, and apparent_pupil_size were not on the exclude list, so they appear in the message.
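The splitting logic from the ZeroMQ example above can be packaged as a small parser that turns a raw message into a message type plus a dictionary (the function name is ours):

```python
# Parse a Pupil Server message: the first line is the message type,
# the remaining non-empty lines are key:value pairs. Values are kept
# as strings, exactly as they arrive on the wire.
def parse_pupil_message(raw):
    lines = raw.strip().split("\n")
    msg_type = lines.pop(0)
    data = dict(line.split(":", 1) for line in lines if ":" in line)
    return msg_type, data

raw = ("Pupil\n"
       "confidence:0.695435635564\n"
       "norm_gaze:(-0.23400470399355955, 0.23847916737192332)\n"
       "timestamp:1389761135.56\n")

msg_type, data = parse_pupil_message(raw)
print(msg_type)            # Pupil
print(data["confidence"])  # 0.695435635564
```

Splitting on the first colon only (`split(":", 1)`) keeps values intact even if they were ever to contain a colon themselves.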

Filtering messages broadcast by Pupil Server

By default, pupil_server streams every event and every message -- a lot of data. This may not be desirable, which is why there is an exclude list: any key you add to exclude_list in the Pupil Server plugin will not be broadcast. For example, if you want only the normalized gaze coordinates to be broadcast, add everything else to the list, as in the snippet below.

# only 'norm_pos', and nothing more
self.exclude_list = ['diameter', 'id', 'angle', 'apparent_pupil_size', 'axes', 'center', 'confidence', 'ellipse', 'major', 'minor', 'norm_pupil', 'pos_in_roi', 'timestamp']
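The effect of the exclude list can be sketched as a dictionary filter applied before the message is serialized -- this illustrates the idea, it is not the plugin's actual code:

```python
# Drop excluded keys from an event dict, then format the rest as a
# newline-delimited Pupil Server style message.
def format_message(msg_type, event, exclude_list):
    kept = {k: v for k, v in event.items() if k not in exclude_list}
    lines = [msg_type] + ["%s:%s" % (k, v) for k, v in kept.items()]
    return "\n".join(lines) + "\n"

event = {"norm_gaze": "(0.5, 0.5)", "timestamp": "1389761135.56"}
print(format_message("Pupil", event, exclude_list=["timestamp"]))
```

Keys on the exclude list simply never make it onto the wire, which keeps the broadcast small for downstream consumers.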

Marker Tracking

The Marker Tracking plugin allows you to define surfaces within your environment and track surfaces in realtime using a 5x5 square marker. We were greatly inspired by the ArUco marker tracking library.

  • Markers - We use a 5x5 square marker. This is not the same marker that is used by ArUco (they use 7x7).
  • Using a 5x5 marker gives us 64 unique markers.
  • Why the 5x5 grid? The 5x5 grid allows us to make smaller markers that can still be detected. Markers can be printed on paper, stickers, or displayed on the screen.
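The count of 64 corresponds to 6 usable data bits (2^6 = 64), with the remaining cells available for orientation and error checking. The layout below is a hypothetical illustration of that arithmetic only -- it is not Pupil's actual marker encoding:

```python
# Hypothetical 5x5 marker sketch: a black border, with the 6 data
# bits of the marker id placed in fixed interior cells. Real Pupil
# markers use their own layout; this only illustrates why a small
# grid still yields 2**6 = 64 distinct ids.
def marker_grid(marker_id):
    assert 0 <= marker_id < 64
    grid = [[0] * 5 for _ in range(5)]          # 0 = black cell
    data_cells = [(1, 1), (1, 2), (1, 3),
                  (2, 1), (2, 2), (2, 3)]       # 6 interior cells
    for bit, (r, c) in enumerate(data_cells):
        grid[r][c] = (marker_id >> bit) & 1     # 1 = white cell
    return grid

print(marker_grid(5))
```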

See the video linked below for an introduction and workflow.

Marker Tracking Intro Video

Defining Surfaces with Markers

A surface can be defined by one or more markers. Surfaces can be defined with Pupil Capture in real-time, or offline with Pupil Player. Below we provide an outline of steps.

  • Generate markers with this script, or download the image below.

All 64 Markers

Note: when printing markers, ensure that white space remains around the square marker. You can scale the markers to different sizes, but keep a white border at least 1.2 x the marker grid size wide, unless the marker is affixed to a white (or light-colored) background.

  • Define surfaces within your environment using one or more fiducial markers. Surfaces can be defined with a minimum of one marker. The maximum number of markers per surface is limited by the number of markers we can produce with a 5x5 grid.
  • Use Pupil Capture or Pupil Player to register surfaces, name them, and edit them.
  • Registered surfaces are saved automatically, so that the next time you run Pupil Capture or Pupil Player, your surfaces (if they can be seen) will appear when you start the marker tracking plugin.
  • Surfaces defined with more than 2 markers are detected even if some markers go outside the field of vision or are obscured.
  • We have created a window that shows registered surfaces within the world view and the gaze positions that occur within those surfaces in realtime.
  • Streaming Surfaces with Pupil Capture - Detected surfaces as well as gaze positions relative to the surface can be streamed locally or over the network with pupil server. Check out this video for a demonstration.
  • Surface Metrics with Pupil Player - if you have defined surfaces, you can generate surface visibility reports or gaze count per surface. See our blog post for more information.