Intel, the hardware company whose processors and chips are almost ubiquitous, sees software in its future. It’s a future that involves users interacting directly with computers, using gestures, voice and even eye tracking.
It’s a vision that Intel Corporate Vice President Mooly Eden laid out in a presentation in Israel last month. According to Eden, Intel’s ambitious “Perceptual Computing” (Per-C) project is blazing along, and it won’t be more than a few years till we are able to communicate with computers and devices as they manage our schedule, tell us where our lost keys are, figure out what we want to watch on TV, and much more.
The goal of Per-C, Eden said, is to develop computers and devices “that are natural, intuitive, and immersive.”
The Control-Alt-Delete keyboard combination used to restart a computer, he said, “is not natural. Neither is communicating with a keyboard or a mouse. Even touch technology isn’t intuitive” in that you don’t often communicate with other people by touching them. “Gestures, voice, and facial expressions — the natural things that God gave you — those are intuitive,” said Eden, and it’s Intel’s aim to integrate those capabilities into computing experiences.
How close that Per-C future is — or isn’t — was on display at an event at Intel’s Haifa campus last month, as more than a dozen teams of entrepreneurs showed off their Per-C programming skills at Intel Israel’s first Perceptual Computing Hackathon. Participants had 48 hours to come up with a Per-C application built on the Intel Per-C toolkit, incorporating the Creative 3D camera distributed to each team along with voice commands captured via microphone.
The idea, said an Intel spokesperson, was to show off Per-C’s potential, and winners received tickets and accommodations for a trip to San Francisco in September to participate in Intel’s Perceptual Computing Challenge, a worldwide contest which will feature over 700 projects.
And, as hoped, some of the applications showed off not only what Per-C will be capable of but what it can do right now. “People may not realize how close Intel is to this revolution, but some of the applications that were on display at the Hackathon are good examples of what we will soon see as everyday uses for this technology,” the spokesperson said.
For example, one team showed off a “dating game” it developed, using the camera to determine how a partner on a date felt about how things were going. The app is based on the theory that people who connect with each other mirror each other’s behavior — smiling and gazing at each other, closing the distance between them, nodding and gesturing at each other, and so on. The more closely and consistently the two mirror each other, the better the date is going, according to this theory. The app uses a 3D camera to detect the other party’s actions and then compares them to those of the app user. The demo ran on a computer, but the game could easily be rewritten for smartphones equipped with 3D cameras.
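The scoring idea behind such a game can be sketched in a few lines. This is an illustrative example only, not the team’s actual code: the feature names, the 0-to-1 reading scale, and the averaging formula are all assumptions made for the sketch, standing in for whatever the 3D camera tracker actually reports.

```python
def mirroring_score(user_frames, partner_frames):
    """Score behavioral mirroring between two people, in [0, 1].

    Each frame is a dict of normalized readings (0.0-1.0) for features
    a camera tracker might report, e.g. smile intensity or nod amount.
    A score near 1.0 means the two are mirroring each other closely.
    """
    if not user_frames or len(user_frames) != len(partner_frames):
        raise ValueError("need two equal-length frame sequences")
    total = 0.0
    count = 0
    for u, p in zip(user_frames, partner_frames):
        # Compare only the features detected for both people this frame.
        for feature in u.keys() & p.keys():
            total += 1.0 - abs(u[feature] - p[feature])  # 1.0 = perfect mirror
            count += 1
    return total / count if count else 0.0

user = [{"smile": 0.8, "nod": 0.2}, {"smile": 0.9, "nod": 0.5}]
partner = [{"smile": 0.7, "nod": 0.2}, {"smile": 0.5, "nod": 0.5}]
score = mirroring_score(user, partner)
```

Averaged over enough frames, a running score like this could drive the app’s “how is the date going” readout.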
Another app, called SignIt, uses gestures to let people log in to their devices. Passwords composed of letters and numbers are easy to forget, but gestures are easy to remember. The app records a combination of gestures and sets it up as a password. While the demo worked properly, the team said more work is needed to ensure the app rejects the correct gesture sequence when it is performed by an unauthorized individual.
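The basic enroll-and-match logic can be sketched as follows. This is an assumed illustration, not the SignIt team’s code, and the gesture label names are invented; notably, a naive sequence match like this one exhibits exactly the weakness the team described, since it accepts the right gestures from anyone who performs them.

```python
import hmac  # constant-time comparison, as for any password check


def enroll(gestures):
    """Join the recognized gesture labels into a stored secret string."""
    return "|".join(gestures)


def unlock(stored, attempt):
    """Accept the attempt only if the gesture sequence matches exactly."""
    return hmac.compare_digest(stored, "|".join(attempt))


secret = enroll(["swipe_left", "circle", "thumbs_up"])
ok = unlock(secret, ["swipe_left", "circle", "thumbs_up"])
wrong_order = unlock(secret, ["circle", "swipe_left", "thumbs_up"])
```

Rejecting an impostor who knows the sequence would require tying the match to *who* is gesturing — for example by also checking face recognition or the biometric style of the hand motion — which is the open problem the team pointed to.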
A third Per-C application, Creative Rating, will make it easier for users to like or dislike what they see on screen. According to the app’s developers, only 1% of users on sites like Facebook offer “likes” for movies, restaurants, and businesses. Using the 3D camera, Creative Rating lets users show their like or dislike by giving a thumbs up or thumbs down; an API could record that like or dislike on Facebook or other sites where users’ opinions are solicited. A more advanced application of the technology could save users the work of even lifting their thumb, for example by recognizing a happy or sad face or an approving or disapproving look.
The winner of the Hackathon, as decided by the audience and a panel of judges, was an app called Hand Your Music, which used hand movements to create a sound and light show. Hooked up to a commercial lighting system and a big-beat sound system, the app uses an algorithm that increases the volume and intensity of computer-generated music in response to the intensity, direction, and type of gestures detected by the 3D camera. The more intense the gesture, like “hitting” the air, the louder and faster the music and the more varied and elaborate the light show.
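The core of the winning app’s algorithm, as described, is a mapping from gesture intensity to output parameters for the sound and light rig. A minimal sketch of that idea, with ranges and formulas that are assumptions for illustration rather than the team’s actual tuning:

```python
def map_gesture(intensity, speed):
    """Map a detected gesture's intensity and hand speed (both 0.0-1.0)
    to volume, tempo, and number of active light channels.

    Harder, faster gestures -> louder, faster music and busier lights.
    """
    intensity = max(0.0, min(1.0, intensity))
    speed = max(0.0, min(1.0, speed))
    volume = 0.3 + 0.7 * intensity            # never fully silent
    tempo_bpm = 90 + 90 * speed               # 90-180 beats per minute
    light_channels = 1 + int(7 * intensity)   # 1-8 channels lit
    return volume, tempo_bpm, light_channels


# A hard "hit" in the air at medium hand speed
volume, tempo_bpm, lights = map_gesture(1.0, 0.5)
```

In the real app these outputs would be fed each frame to the music generator and the commercial lighting controller; the sketch only shows the shape of the gesture-to-output mapping.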
While much of the technology for Perceptual Computing as a major force may already be here, Intel is quite aware that commercial applications may still be some time off. To judge from online postings and articles on tech sites, gestures are still seen by many as gimmicky, a cute idea for a game or a lifestyle app but not useful for serious work.
But Intel has time, said Igal Iancu, Intel’s director of perceptual computing strategic planning. The company is supplying the technology, and it will be up to the developer community to make creative use of it.
“While Intel is supplying hardware, middleware, software samples, and support, we don’t claim to know or own all the opportunities,” he said. “Events such as the Hackathon are aimed at opening up opportunities for new ideas in the Perceptual Computing domain, whether in usage, technology, or even the business aspects. Only with close interaction with the community, we believe, will this new domain flourish.”