Summary: Computers, like humans, can learn — but only in a narrow mimicry of what the human brain is capable of.
When Google tries to fill in your search box based on only a few keystrokes, or your iPhone predicts words as you type a text message, it's only a narrow mimicry of the human brain.

The challenge in training a computer to behave like a human brain is technological and physiological, testing the limits of computer and brain science. But researchers from IBM Corp. say they've made a key step toward combining the two worlds.

The company announced Thursday that it has built two prototype chips that it says process data more like how humans digest information than the chips that now power PCs and supercomputers.

The chips represent a significant milestone in a six-year-long project that has involved 100 researchers and some $41 million in funding from the government's Defense Advanced Research Projects Agency, or DARPA. That's the Pentagon arm that focuses on long-term research and previously brought the world the Internet. IBM has also committed an undisclosed amount of money.

The prototypes offer further evidence of the growing importance of parallel processing, or computers doing multiple tasks simultaneously. That is important for rendering graphics and crunching large amounts of data.

The uses of the IBM chips so far are prosaic, such as steering a simulated car through a maze, or playing Pong. It may be a decade or longer before the chips make their way out of the lab and into actual products. But what's important is not what the chips are doing, but how they're doing it, says Giulio Tononi, a professor of psychiatry at the University of Wisconsin at Madison who worked with IBM on the project. The chips' ability to adapt to types of information they weren't specifically programmed to expect is a key feature.

"There's a lot of work to do still, but the most important thing is usually the first step," Tononi said in an interview. "And this is not one step, it's a few steps."

Technologists have long imagined computers that learn like humans.
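To illustrate the idea of parallel processing mentioned above — computers doing multiple tasks simultaneously — here is a minimal Python sketch. It has nothing to do with IBM's chip design; it simply shows several independent pieces of work being dispatched to run at the same time rather than one after another.

```python
# Illustrative sketch only: run several independent tasks concurrently
# using a thread pool, instead of computing them one after another.
from concurrent.futures import ThreadPoolExecutor

def crunch(n):
    """A stand-in for one independent chunk of work."""
    return sum(i * i for i in range(n))

# Dispatch four tasks across four workers at once.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(crunch, [10_000, 20_000, 30_000, 40_000]))

print(len(results))  # four results, one per task
```

The point is the structure, not the arithmetic: graphics rendering and large-scale data crunching gain speed by splitting work into many such independent tasks.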
Your iPhone or Google's servers can be programmed to predict certain behavior based on past events. But the techniques being explored by IBM and other companies and university research labs around cognitive computing could lead to chips that are better able to adapt to unexpected information.

IBM's interest in the chips lies in their potential to help process real-world signals — such as temperature, sound or motion — and make sense of them for computers.
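The kind of "prediction based on past events" mentioned above can be sketched very simply. This is a hypothetical toy, not how Google or Apple actually implement it: it counts which word most often followed each word in past text, then predicts accordingly.

```python
# Hypothetical sketch: predict the next word from past events by
# counting which word most often followed each word seen so far.
from collections import Counter, defaultdict

history = "the cat sat on the mat the cat ran".split()

# follows[w] counts the words observed immediately after w.
follows = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    follows[prev][nxt] += 1

def predict(word):
    """Return the word most often seen after `word` in past text."""
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # "cat" — seen twice after "the", vs. "mat" once
```

A model like this can only replay patterns it has already seen; the cognitive-computing chips described in the article aim to adapt to inputs they were never programmed to expect.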
