Visualizing And Evaluating Binocular Pupil Matching

This notebook evaluates existing and future implementations of the binocular pupil matching algorithm.

Existing implementations

We compare four existing implementations:

  • v1_18_Binocular_Gaze_Mapper
  • Monotonic_Binocular_Gaze_Mapper
  • No_Upsampling_Binocular_Gaze_Mapper
  • Dynamic_Cutoff_Binocular_Gaze_Mapper

Setup

Load Gaze Mapper implementations

In [1]:
from gaze_mappers import GAZE_MAPPERS

Utility functions

These functions generate artificial pupil data at a fixed frame rate.

In [2]:
from utils import (
    generate_monocular_pupil_data,
    generate_uniform_numbers,
    generate_timestamps,
    take,
)
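The utility implementations are not shown here; the following is a minimal, hypothetical sketch of what they might look like (the names and behavior are assumptions based on how they are used below, not the actual `utils` code):

```python
import itertools
import random


def generate_timestamps(fps, start=0.0):
    """Yield timestamps spaced 1/fps apart, modeling a fixed frame rate."""
    for i in itertools.count():
        yield start + i / fps


def generate_uniform_numbers(low=0.0, high=1.0):
    """Yield uniformly distributed random values, e.g. pupil confidences."""
    while True:
        yield random.uniform(low, high)


def generate_monocular_pupil_data(eye_id, confidences, timestamps):
    """Combine confidence and timestamp streams into pupil datum dicts
    for a single eye. The datum layout here is an assumption."""
    for confidence, timestamp in zip(confidences, timestamps):
        yield {"id": eye_id, "confidence": confidence, "timestamp": timestamp}


def take(n, iterable):
    """Return the first n items of an iterable as a list."""
    return list(itertools.islice(iterable, n))
```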

Mixing pupil data

We model the Pupil Capture pipeline by calculating an arrival time for each pupil datum in the gaze mapper and mapping the pupil data in order of arrival.

The arrival_time is calculated from the following values:

  • timestamp: Pupil creation time, fixed fps
  • camera jitter: Uniform random value, newly generated for each pupil datum, models slight variations in exposure timings
  • transport delay: Fixed value added to each pupil datum, models delay from transferring the pupil data from the eye to the world process
  • transport jitter: Uniform random value, newly generated for each pupil datum, models slight variations in transport timings

See utils.custom_binocular_mixer() for details.

In [15]:
from utils import custom_binocular_mixer
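The arrival-time model described above can be sketched as a single formula. This is a hypothetical illustration of the four components, not the actual `custom_binocular_mixer()` implementation:

```python
import random


def arrival_time(timestamp, transport_delay,
                 camera_jitter_range, transport_jitter_range):
    """Sketch of the arrival-time model: creation timestamp plus a fixed
    transport delay plus two uniformly sampled jitter terms.
    (Hypothetical; see utils.custom_binocular_mixer() for the real code.)"""
    camera_jitter = random.uniform(*camera_jitter_range)
    transport_jitter = random.uniform(*transport_jitter_range)
    return timestamp + camera_jitter + transport_delay + transport_jitter
```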

Plotting functions

Functions for visualizing matched pupil and gaze data.

In [4]:
from utils import plot_mapped_gaze_data

Example usage

  1. Define pupil streams
  2. Setup mixer, simulating transfer to world process
  3. Initialise mapper
  4. Map pupil data
  5. Visualization
In [5]:
sample_count = 100
both_fps = 200


eye0 = generate_monocular_pupil_data(
    eye_id=0,
    confidences=generate_uniform_numbers(),
    timestamps=generate_timestamps(fps=both_fps,),
)


eye1 = generate_monocular_pupil_data(
    eye_id=1,
    confidences=generate_uniform_numbers(),
    timestamps=generate_timestamps(fps=both_fps,),
)

binocular_mixer = custom_binocular_mixer(
    sample_count=sample_count,
    both_camera_jitter_range=(0, 1 / (2 * both_fps)),
    both_transport_jitter_range=(0, 1 / (2 * both_fps)),
    eye0_transport_delay=0.01,
)

pupils = binocular_mixer(eye0, eye1)
pupil_list = take(sample_count, pupils)

test_mapper = GAZE_MAPPERS[-1]
mapped_data = test_mapper.map_batch(pupil_list)

plot_mapped_gaze_data(mapped_data, title="Test")

del pupil_list
del mapped_data
del test_mapper

Binocular matches by time

This graph visualizes which pupil data are matched into binocular gaze (red lines). Small dots represent low-confidence data, which should not be considered for binocular mapping and should instead be mapped monocularly as soon as possible. The dot color encodes the arrival time. The green line visualizes how the temporal cutoff changes over time.
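The matching rule described above can be sketched as a simple predicate: a pair of pupil data qualifies for binocular mapping only if both are high confidence and their timestamps fall within the temporal cutoff. The threshold and cutoff values below are illustrative assumptions, not the mappers' actual parameters:

```python
def classify_pair(datum0, datum1,
                  confidence_threshold=0.6, temporal_cutoff=0.005):
    """Hypothetical matching rule: return "binocular" if both pupil data
    are high confidence and close in time, otherwise "monocular"."""
    both_high_confidence = (
        datum0["confidence"] >= confidence_threshold
        and datum1["confidence"] >= confidence_threshold
    )
    close_in_time = abs(datum0["timestamp"] - datum1["timestamp"]) <= temporal_cutoff
    return "binocular" if both_high_confidence and close_in_time else "monocular"
```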

Creation time vs arrival time

This graph visualizes how creation time relates to arrival time. Blue lines show binocular matches. Dot colors encode eye id.

Timestamps difference by stream

Binocular, monocular eye0, and monocular eye1 gaze data are considered separate streams. This graph visualizes that the gaze for each stream is yielded in a monotonic fashion.
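The per-stream monotonicity property can also be checked programmatically. The sketch below assumes each gaze datum carries a `timestamp` and a `base_data` list of pupil data with `id` fields, which is an assumption about the datum layout rather than a documented interface:

```python
from collections import defaultdict


def streams_are_monotonic(gaze_data):
    """Check that timestamps never decrease within each stream
    (binocular, monocular eye0, monocular eye1)."""
    last_timestamp = defaultdict(lambda: float("-inf"))
    for gaze in gaze_data:
        eye_ids = tuple(sorted(p["id"] for p in gaze["base_data"]))
        # Two base pupil data -> binocular stream, one -> monocular stream.
        stream = "binocular" if len(eye_ids) == 2 else f"monocular eye{eye_ids[0]}"
        if gaze["timestamp"] < last_timestamp[stream]:
            return False
        last_timestamp[stream] = gaze["timestamp"]
    return True
```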

Putting it all together

Functions for conveniently generating pupil streams, mixing, and mapping them.

In [6]:
from utils import evaluate_gaze_mappers

Case 1: Ideal case

Same frame rate, no jitter

In [7]:
evaluate_gaze_mappers(
    both_timestamp_fps=200,
    binocular_data_mixer=custom_binocular_mixer(
        sample_count=sample_count,
        both_camera_jitter_range=(0, 0),
        both_transport_jitter_range=(0, 0),
        eye0_transport_delay=0.0,
    ),
    eye0_confidence_stream=generate_uniform_numbers(),
    eye1_confidence_stream=generate_uniform_numbers(),
)
v1_18_Binocular_Gaze_Mapper
Monotonic_Binocular_Gaze_Mapper
No_Upsampling_Binocular_Gaze_Mapper
Dynamic_Cutoff_Binocular_Gaze_Mapper