Ever wish your VCR could sense your boredom and fast-forward through the dull parts of a movie? How about a compact disc player that could play music based on your mood?
While such ideas sound like something out of a futuristic film, researchers at the Massachusetts Institute of Technology's Media Lab are working on computers that can "read" a person's mind by monitoring body movements.

"From a human standpoint, machines are very rude," said Alex "Sandy" Pentland, the academic head of the Media Lab. "It's about making machines aware of people."
The technology is still experimental and years away from mass production. But under the direction of MIT professor Rosalind Picard, the Media Lab's Affective Computing group has developed computer systems that use biorhythmic sensors attached to a user's body and tiny cameras that record facial gestures to develop individual emotional profiles.
When a user becomes interested, frustrated or bored, the software can adjust, Pentland said.
"Technology has run us long enough and now we're finding ways for us to run technology," he said. "We want to make technology respond to us in a way that is helpful."
Among the projects under development is an "affective tutor," a computer education program able to sense a student's level of interest or frustration. The software can sense states like boredom, anxiety, confusion and interest - as well as facial cues like smiles and frowns - and adjust its instruction accordingly.
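The decision logic such a tutor might apply can be pictured as a simple mapping from a sensed affective state to a teaching adjustment. The state names and responses below are illustrative assumptions for the sake of a sketch, not the lab's actual design:

```python
# Hypothetical sketch of an "affective tutor" decision step.
# The states and adjustments are assumed examples, not MIT's implementation.

def tutor_action(state: str) -> str:
    """Pick a teaching adjustment for a sensed affective state."""
    actions = {
        "bored": "increase difficulty or switch topics",
        "frustrated": "offer a hint and slow the pace",
        "confused": "re-explain the current concept",
        "interested": "continue the current lesson",
    }
    # Default to staying the course when the state is unrecognized.
    return actions.get(state, "continue the current lesson")

print(tutor_action("frustrated"))  # offer a hint and slow the pace
```

The hard part, of course, is the sensing itself: producing a reliable `state` label from biorhythmic sensors and facial-gesture cameras is the research problem the group is tackling.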
An "affective VCR" could monitor a viewer's face for signs of boredom and skip to the most exciting scenes of a movie, as well as shut itself off when a viewer falls asleep.