Algorithm ‘sees’ when people’s eyes meet
A new machine-learning tool detects eye contact during recorded face-to-face interactions as accurately as expert observers can.
Looking at eyes, noses and mouths may prompt slower recognition in the brains of autistic people than in those of non-autistic people.
Unlike typical toddlers, those with autism tend not to share experiences involving sound — dancing to music with their parents, for example, or calling attention to the source of a sound.
A pair of tools that gauge social abilities in rhesus macaques may help researchers study autism-like behaviors in monkeys.
Autistic people tend to have trouble shifting their gaze to take in all the details of a scene.
Autistic infants as young as 6 months display subtle signs of the condition, according to a study of visual attention.
Understanding how gaze differs in autistic people may help improve their lives.
Researchers have created an animated monkey avatar that makes realistic facial expressions — and that may yield insight into how autistic people interpret facial expressions.
The social brain has a sweet spot that activates when people look each other in the eyes but not when they look at eyes in a video.
The relatives of autistic people often have mild traits of the condition. Studying these family members could broaden our understanding of autism.