It may sound like a page out of a science fiction book, but that's where Intel is heading in its research to make computers smarter. The idea is that computers and devices must become more aware of their users' surroundings in order to deliver the next level of user experience. Imagine a phone that knows you've just entered your car and that you're about to drive to work. It could look up traffic information to prepare you for your journey while sorting your favorite music tracks based on your mood, just so you'll have a more pleasant drive to work.
According to Intel's Chief Technology Officer and Senior Fellow Justin Rattner, all that intelligence can be built into the devices we carry with us today thanks to their increased processing power and better all-day connectivity. But for devices to reach a heightened level of awareness, innovative sensing capabilities need to be developed, so that the user's surroundings can form rich context data the device can use to better understand the user's needs. A context-aware device could anticipate our needs and advise and guide us through our day like a personal assistant.
Devices today already carry a good array of sensors they can draw on, such as GPS for location and accelerometers for movement, plus rich soft data sources like your calendar, social networks and preferences. But for devices to reach the next level of awareness, sensors need to become smarter, and devices need to be able to combine these various data sources to make accurate inferences about the user's state of mind and activity.
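To make the idea of combining hard sensor data with soft data concrete, here is a minimal sketch of rule-based context inference. The sensor readings, field names, and rules are all hypothetical illustrations, not anything Intel described:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    speed_kmh: float          # hard data: speed derived from GPS fixes
    motion: str               # hard data: accelerometer class ("still", "walking", "vehicle")
    next_event: Optional[str] # soft data: upcoming calendar entry, if any

def infer_activity(ctx: Context) -> str:
    """Fuse hard sensor data with soft calendar data to guess the user's activity."""
    if ctx.motion == "vehicle" and ctx.speed_kmh > 20:
        # Moving fast in a vehicle; the calendar hints at where the user is headed
        return ("commuting to " + ctx.next_event) if ctx.next_event else "driving"
    if ctx.motion == "walking":
        return "walking"
    return "idle"

# A phone in a moving car with "work" next on the calendar
print(infer_activity(Context(speed_kmh=55.0, motion="vehicle", next_event="work")))
# → commuting to work
```

A real implementation would use statistical models rather than hand-written rules, but the shape of the problem is the same: several noisy inputs fused into one inference about the user.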
In a demonstration during the keynote, Rattner showed a context-aware remote control that can immediately sense who is holding it and then decide which TV channels that person is likely to want, based on smart TV preferences set for them. The TV could learn these preferences from data it collects about the user's surfing behavior and past viewing habits, and even from what the person discusses with friends on social media platforms like Twitter or Facebook.
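Once the remote has identified who is holding it, the preference step could be as simple as ranking channels by how often that person has watched them. This is a hypothetical sketch of that idea; the users, channel names, and counts are invented for illustration:

```python
from collections import Counter

# Hypothetical per-person viewing histories (channel -> times watched)
viewing_history = {
    "alice": Counter({"news": 12, "drama": 7, "sports": 3}),
    "bob":   Counter({"cartoons": 20, "sports": 9}),
}

def recommend_channels(user: str, top_n: int = 2) -> list:
    """Return this user's most-watched channels, best match first."""
    history = viewing_history.get(user, Counter())
    return [channel for channel, _ in history.most_common(top_n)]

# The remote senses "alice" is holding it and surfaces her favorites
print(recommend_channels("alice"))  # → ['news', 'drama']
```

The demo's real system presumably learns weights from richer signals than raw counts, but the lookup-then-rank pattern is the core of any preference-based recommendation.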
In the final demonstration, Rattner presented the Human Brain project, jointly developed by Intel, Carnegie Mellon University and the University of Pittsburgh. The aim of this project is to enable humans to communicate with computers and mobile devices through their thoughts. It sounds impossible, but research has shown that neural activity patterns can be interpreted and may one day become a new form of input.
While the future may look exciting, privacy will be a major concern with so much personal data being analyzed and made available across applications and devices. It is probably one reason why Intel invested in McAfee: so that security can be built into the silicon itself.
"Our vision is to enable devices to generate and use contextual information for a greatly enhanced user experience while ensuring the safety and privacy of an individual's personal information. Underlying this new level of security are several forthcoming Intel hardware-enabled techniques that dramatically improve the ability of all computing devices to defend against possible attacks," said Rattner.