Postdoc position for classification of cognitive and affective states using eye tracking in the context of a training task.
Faubert Lab is looking for a Postdoc to work on the detection and classification of cognitive and affective states from eye tracking data during pilot training tasks. The Postdoc will explore both online and offline solutions, as well as traditional machine learning and novel deep learning approaches. The Postdoc will also collaborate with other researchers exploring different modalities (e.g. EEG, facial recognition) for cognitive and affective state classification. Multimodal models will have to be developed to take advantage of the synergy between different sensors and measurements.
Faubert Lab's expertise is in vision and perception, and we are looking for someone with programming skills from a technical background such as engineering or computer science, preferably with experience in machine learning and/or deep learning.
This project is part of an NSERC industrial collaborative research and development grant with CAE, a world leader in flight simulation and training. This is a one-year position with a salary of $40k.
Please send a curriculum vitae and a letter outlining research experience and career goals to faubertlab [at] opto.umontreal.ca
For more information on Faubert Lab, please visit http://faubertlab.com/
We thank all applicants for their interest in this position, but only those considered for an interview will be contacted.
The U.S. Air Force is short 2,000 pilots and needs to train new aviators faster but with the same quality. To do this, the USAF has committed to revolutionizing pilot training with the launch of the “Pilot Training Next” (PTN) initiative.
The AFWERX Innovation Challenge for PTN received almost 150 proposals from leading companies around the world. Sixteen were selected as finalists to compete in the solution showcase shootout in Austin, TX, with awards going to five companies. NAI and its flagship product, NeuroTracker, were selected based on the platform's scientific validation for cognitive assessment and performance.
NeuroTracker is a world-leading cognitive assessment, training and rehabilitation platform used by Special Operations Forces (SOF) around the world to improve focus, situational awareness, predictive agility and decision-making under pressure.
Former Air Force T-38 Instructor Pilot and NAI Advisory Board Member, Bob O’Malley commented on the initiative explaining:
“I am impressed by the approach taken by the US Air Force and SAIC with its Pilot Training Next initiative. Involving commercially available technologies and techniques to innovate its pilot training program is aggressive, but timely. Employing tools like NeuroTracker will improve pilot cognitive skills allowing the Air Force to produce pilots faster and better”
It’s no secret that the Washington Capitals won the 2017-2018 Stanley Cup.
Alexander Ovechkin made sure that the whole world would know they won, after all those years being eliminated early in the playoffs despite having one of the best hockey players of this generation (and all time!).
However, Ovechkin didn't do it all by himself. Support players also had to step up to accomplish the impossible – winning the Stanley Cup. And step up, Tom Wilson certainly did!
My dad always said, ‘If you’re going to lift a weight with your arm in the gym, why wouldn’t you go work your brain out with your eyes?’
Tom Wilson has been using NeuroTracker on a regular basis to train his cognitive and visual abilities. Looking at his results this year, it seems to have helped quite a bit! Obviously, many other factors could explain this progression, but we're happy to see that our research is being used by the winners of the Stanley Cup!
Tom Wilson amusingly describes NeuroTracker as "a bunch of balls bouncing around your monitor like an old Microsoft screensaver." Indeed, NeuroTracker is a very simple task on the outside, yet backed by a lot of science and literature on the inside. Having a "simple task" allows us (the laboratory and its collaborators) to maintain better scientific control over the effects we're studying.
NeuroTracker is a technology developed by Prof. Jocelyn Faubert’s lab.
During the Montréal Grand Prix, Jocelyn Faubert, holder of the NSERC – ESSILOR industrial chair on visual functions, and his students offered Grand Prix attendees a driving simulation to raise awareness about the key role of vision in safe driving behaviors. In partnership with Essilor Canada, this activity is part of a larger initiative from Essilor and the Fédération Internationale de l'Automobile to promote the importance of good vision for road safety.
More than 375 visitors experienced the simulation and talked with the students about key visual components for drivers and the importance of high-quality vision.
You can read the original news on the Optometry School website. (French only)
Jesse Michaels won third prize in the Student poster category for his presentation entitled "Importance de la charge mentale et des mesures perceptivo-cognitives pour évaluer les différences liées à l'âge lors de la conduite automobile en simulateur" (The importance of mental workload and perceptual-cognitive measures for assessing age-related differences in simulated driving) at the 7th annual RRSR Symposium, May 24, 2018.
The Influence of the External Signal Modulation Waveform and Frequency on the Performance of a Photonic Forced Oscillator – Noemi Sánchez-Castro, Martha Alicia Palomino-Ovando, Denise Estrada-Wiese, Nydia Xcaret Valladares, Jesus Antonio del Río, Maria Beatriz de la Mora, Rafael Doti, Jocelyn Faubert and Jesus Eduardo Lugo
Three-dimensional multiple object tracking in the pediatric population: the NeuroTracker and its promising role in the management of mild traumatic brain injury – Laurie-Ann Corbin-Berrigan, Kristina Kowalski, Jocelyn Faubert, Brian Christie, Isabelle Gagnon
Training with a three-dimensional multiple object-tracking (3D-MOT) paradigm improves attention in students with a neurodevelopmental condition: a randomized controlled trial – D. Tullo, J. Guy, J. Faubert, A. Bertone
Vanessa Harrar received the Award of Excellence in the Poster Presentation category for her presentation entitled "Audio-visual multiple object tracking" at the 23rd Vision Health Research Network Annual Meeting, November 7, 2017.
Audio-visual multiple object tracking
Vanessa Harrar¹·², Eugenie Roudaia¹·² and Jocelyn Faubert¹·²
¹ Visual Psychophysics and Perception Laboratory, School of Optometry, University of Montreal, Montreal, QC, Canada
² NSERC-Essilor Industrial Research Chair, Montreal, QC, Canada
The ability to track objects as they move is critical for successful interaction with objects in the world. The multiple object tracking (MOT) paradigm has demonstrated that, within limits, our visual attention capacity allows us to track multiple moving objects among distractors. Very little is known about dynamic auditory attention and the role of multisensory binding in attentional tracking. Here, we examined whether dynamic sounds congruent with visual targets could facilitate tracking in a 3D-MOT task. Participants tracked one or multiple target-spheres among identical distractor-spheres during 8 seconds of movement in a virtual cube. In the visual condition, targets were identified with a brief colour change, but were then indistinguishable from the distractors during the movement. In the audio-visual condition, the target-spheres were accompanied by a sound, which moved congruently with the change in the target's position. Sound amplitude varied with distance from the observer and inter-aural amplitude difference varied with azimuth. Preliminary results with one target showed that performance was better in the audiovisual condition, which suggests that congruent sounds can facilitate attentional visual tracking. However, with multiple targets, the sounds did not facilitate tracking. This suggests that audiovisual binding may not be possible when attention is divided between multiple targets.
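The spatialized sound cue described in the abstract (overall amplitude falling with distance, left/right balance shifting with azimuth) can be sketched as follows. This is a minimal illustration only: the function name, the inverse-distance attenuation, and the sinusoidal panning constant are assumptions for clarity, not the parameters actually used in the experiment.

```python
import math

def stereo_gains(distance, azimuth_deg, ref_distance=1.0, max_ild=0.6):
    """Illustrative stereo gain model for a moving target sound.

    Overall level falls off with distance from the observer (simple
    inverse-distance law relative to ref_distance), and the left/right
    gains diverge with azimuth to create an inter-aural level
    difference (ILD). All constants are hypothetical.
    """
    # Overall amplitude: inverse-distance attenuation, capped at 1.0.
    level = min(1.0, ref_distance / max(distance, 1e-6))
    # Pan position from azimuth: -1 (fully left) to +1 (fully right).
    pan = math.sin(math.radians(azimuth_deg))
    # Split the level across channels according to the ILD.
    left = level * (1.0 - max_ild * pan)
    right = level * (1.0 + max_ild * pan)
    return left, right
```

For a target straight ahead at the reference distance, both channels receive equal gain; as the target moves rightward the right-channel gain rises and the left falls, and as it recedes both gains drop together.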