
Futurist Forum

Computers That Feel And Think: An IBM Scientist Fact Checks His Predictions

Last year, IBM's Bernard Meyerson forecast that, within five years, computers would develop the ability to see, smell, and even hear. How are we doing on that?

[Image: Abstract via Shutterstock]

Predictions about the future are a dime a dozen. But how often do we look back and see whether a forecast was on the mark?

Today, we published IBM's 5 in 5 report, a look at the technology trends that will change our world in the next five years. We thought it was a good time to take a look at how one of the predictions made in last year's report turned out.

In the 2012 report, Dr. Bernard Meyerson, head of IBM Innovation, discussed our "ability to give machines some of the capabilities of the right side of the human brain"—the part that controls creativity, intuition, facial recognition, systems thinking, music, emotion, and some sensory perception.

He said that within five years we will have the ability to touch and actually feel things through our phones. Computers will not only be able to see images and understand them; they will also understand taste and know what you like to eat, have a sense of smell, and be able to hear, filtering out the sounds that matter.

What progress has been made toward that? We talked with Meyerson to get an answer.


"These are the 5 in 5, which means some of these things are not just hypothetical. They're actually already in development," says Meyerson.

As an example, he points to a high school science experiment. A student (with permission) tapped into the microphones of his friends' cell phones to study noise pollution around their town. "He was able to literally put together a noise map, time dependent, which is actually a quite valuable bit of insight. And nobody's actually doing anything manually; the system performs it autonomically," Meyerson says.
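The core of such a noise map is simple aggregation: readings from many phones, each tagged with a place and a time, averaged into buckets. A minimal sketch, in which the `(location, hour, decibels)` tuples are hypothetical stand-ins for readings collected from phone microphones:

```python
from collections import defaultdict

def build_noise_map(samples):
    """Average decibel readings per (location, hour) bucket."""
    buckets = defaultdict(list)
    for location, hour, db in samples:
        buckets[(location, hour)].append(db)
    # Collapse each bucket to its mean level.
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

samples = [
    ("main_st", 8, 72.0),   # rush hour
    ("main_st", 8, 68.0),
    ("main_st", 22, 45.0),  # late night
    ("park", 8, 50.0),
]
noise_map = build_noise_map(samples)
```

The time dependence Meyerson mentions falls out of the bucketing key: the same street shows up as separate entries for morning and night.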


Meyerson explains that a sense of smell is really the ability to detect chemical compounds in the air—something a phone could do with biosensors attached or built in. Biosensors today can, for example, detect and attack specific bacteria and viruses, and researchers are now putting the same technology on the surface of a chip to detect biohazards.

Says Meyerson: "There's a limited story now, but one of the examples we use, which I do believe is actually getting there, is you can literally have a sensor in your phone where it would warn you that you're about to go into insulin shock or the other way, into diabetic shock, because your blood level...blood sugar levels are simply way out of bounds, either way too high or way too low."
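The warning Meyerson describes is, at bottom, a bounds check on a sensor reading. A minimal sketch, with illustrative thresholds that are not medical guidance:

```python
def glucose_alert(mg_dl, low=70, high=250):
    """Flag a blood-sugar reading that is way out of bounds.

    The low/high thresholds here are hypothetical round numbers
    chosen for illustration, not clinical values.
    """
    if mg_dl < low:
        return "warning: blood sugar too low"
    if mg_dl > high:
        return "warning: blood sugar too high"
    return "ok"
```

A real phone-based sensor would add trend detection (is the level falling fast?) rather than reacting to a single reading, but the alert logic starts this simply.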


Meyerson says that, in a way, the early form of visual recognition in computers is the fingerprint sensor. The next level is security systems designed to do not just facial recognition but object recognition within an image. You could figure out, for example, whether someone walked into a train station or dropped a bag in a suspicious location.

"Using heavier duty computing capacity than sitting in your cell phone—we have the ability to identify [objects in images]," he says. "And it's only a question of time until that capability comes down into your phone."

As mobile bandwidth improves, this trend will also accelerate because data doesn't need to be processed on the device. "Even if you don't have the full compute-and-recognition capability on your device, you can instantly send back the image that you want analyzed to a remote location where it will be analyzed and give you a heads up as to what you're looking at," says Meyerson.
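The offloading decision Meyerson describes can be sketched as a simple routing rule: analyze on the device when it has the capability and the image is small enough, otherwise ship the bytes to a server. Everything here (the function name, the size threshold) is a hypothetical illustration, not an IBM API:

```python
def route_analysis(image_bytes, device_can_analyze, max_local_bytes=1_000_000):
    """Decide where an image should be analyzed.

    Small images on a capable device stay local; everything else
    would be sent over the network to a remote recognition service.
    """
    if device_can_analyze and len(image_bytes) <= max_local_bytes:
        return "local"
    return "remote"

# A 10-byte stub image on a capable device stays local;
# the same image on a device without recognition goes remote.
small_image = b"\x00" * 10
```

As bandwidth grows, the `max_local_bytes` cutoff matters less and the "remote" branch becomes the default path.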


Today, there are tactile surfaces being developed that are just a series of pins that vibrate as a person touches them, mimicking different kinds of textures.

"You can actually send a texture to a remote location, which is step one," says Meyerson. "What will be kind of interesting is, if nothing else, with the advent of more and more 3-D printers, you could actually print up a local version of the texture of the surface for you to play with. Same thing. It's just right now a separate device that hasn't been incorporated, to my knowledge, at least, in a handheld. But that will eventually get there."

Meyerson stands by his prediction that all of these technologies are coming. Whether they will arrive within the five-year window remains an open question. Nothing is quite at a consumer level yet, but the early signs are good.
