Table of contents
1 Introduction
1.1 Research Questions
1.2 Scope
1.3 Outline
2 Background
2.1 Appearance-Based Unconstrained Gaze Estimation
2.1.1 Problem Description
2.1.2 Accuracy Metrics
2.1.3 The GazeCapture Dataset
2.2 Artificial Neural Networks
2.2.1 Artificial Neuron
2.2.2 Neural Networks and Connected Layers
2.2.3 Learning and Backpropagation
2.2.4 Convolutional Layers
2.2.5 Depthwise Separable Convolutions
2.2.6 Pooling Layers
2.3 Transfer Learning
2.4 Similarity Learning
2.4.1 Siamese Neural Network
2.5 Calibration
3 Related Work
3.1 Eye Tracking for Everyone
3.1.1 iTracker: a Deep Neural Network for Eye Tracking
3.1.2 Calibration with SVR
3.2 MobileNets
3.3 A Differential Approach for Gaze Estimation with Calibration
3.4 It’s Written All Over Your Face
4 Siamese Regression for GazeCapture
4.1 The Siamese Neural Network for Regression
4.1.1 Neural Network Architecture
4.1.2 Training the Siamese Neural Network
4.1.3 Gaze Inference Using the Siamese Neural Network
4.2 Intermediate Experiments
5 Results
5.1 Siamese Neural Network and Calibration Points for Gaze Estimation
5.1.1 Inference Time
5.2 Miniature Models
6 Discussion
6.1 Mobile Phones vs Tablets
6.2 Effect of Increasing Calibration Points
6.3 Even Spread or Random
6.4 The Efficacy of Siamese Neural Networks for Gaze Estimation with Calibration
6.5 Inference Time
6.6 Transfer Learning from ImageNet to GazeCapture
6.7 Fine-tuning for Specific Device and Orientation
6.8 Increased Data Quantity for Gaze Difference
6.9 iTracker vs MobileNet
6.10 Depthwise Separable Convolutions
7 Conclusions
7.1 Transfer Learning
7.2 Calibration Points with Siamese Neural Networks
7.3 Future Work
Bibliography