
Thursday, August 18, 2011

Supercomputing + Neuroscience + Nanotechnology

This thing is looking to be pretty badass.
VentureBeat: IBM produces first working chips modeled on the human brain: so-called cognitive computing chips could one day simulate and emulate the brain’s ability to sense, perceive, interact and recognize ...... Dharmendra Modha ...... is the principal investigator of the DARPA project called SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics). He is also a researcher at the IBM Almaden Research Center in San Jose, Calif.

“This is the seed for a new generation of computers, using a combination of supercomputing, neuroscience, and nanotechnology,” Modha said in an interview with VentureBeat. “The computers we have today are more like calculators. We want to make something like the brain. It is a sharp departure from the past.”

...... the project could turn computing on its head, overturning the conventional style of computing that has ruled since the dawn of the information age and replacing it with something that is much more like a thinking artificial brain. The eventual applications could have a huge impact on business, science and government. The idea is to create computers that are better at handling real-world sensory problems than today’s computers can.

...... It has “neurons,” or digital processors that compute information. It has “synapses,” which are the foundation of learning and memory. And it has “axons,” or data pathways that connect the tissue of the computer. ...... In von Neumann machines, memory and processor are separated and linked via a data pathway known as a bus. ...... With the human brain, the memory is located with the processor.

...... The brain-like processors with integrated memory don’t operate fast at all, sending data at a mere 10 hertz, far slower than the 5 gigahertz computer processors of today. But the human brain does an awful lot of work in parallel, sending signals out in all directions and getting the brain’s neurons to work simultaneously. Because the brain has more than 10 billion neurons and 10 trillion connections (synapses) between those neurons, that amounts to an enormous amount of computing power.

...... “We are now doing a new architecture,” Modha said. “It departs from von Neumann in a variety of ways.” ...... Modha said that this new kind of computing will likely complement, rather than replace, von Neumann machines, which have become good at solving problems involving math, serial processing, and business computations. The disadvantage is that those machines aren’t scaling up to handle big problems well anymore. They are using too much power and are harder to program.

...... These new chips won’t be programmed in the traditional way. Cognitive computers are expected to learn through experiences, find correlations, create hypotheses, remember, and learn from the outcomes. They mimic the brain’s “structural and synaptic plasticity.” The processing is distributed and parallel, not centralized and serial. ...... They can mimic the event-driven brain, which wakes up to perform a task. ...... The goal is to create a computer that not only analyzes complex information from multiple senses at once, but also dynamically rewires itself as it interacts with the environment, learning from what happens around it. ...... neurobiology ...... IBM wants to build a computer with 10 billion neurons and 100 trillion synapses ...... that will consume one kilowatt of power and will occupy less than two liters of volume ......
a cognitive computer could monitor the world’s water supply via a network of sensors and tiny motors that constantly record and report data such as temperature, pressure, wave height, acoustics, and ocean tide. It could then issue tsunami warnings in case of an earthquake. Or, a grocer stocking shelves could use an instrumented glove that monitors sights, smells, texture and temperature to flag contaminated produce. Or a computer could absorb data and flag unsafe intersections that are prone to traffic accidents. Those tasks are too hard for traditional computers.
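
To make the architecture idea a little more concrete, here is a toy Python sketch of the event-driven, memory-lives-with-the-processor style of computing the excerpt describes. This is just an illustration of the general spiking-neuron concept, not IBM’s actual SyNAPSE design, and every name in it (Neuron, receive, run) is made up for the example. Each neuron keeps its own synaptic weights locally, and work only happens when a spike actually arrives, instead of a central processor shuttling data back and forth over a bus every cycle.

```python
import random
from collections import deque

class Neuron:
    """Toy leaky integrate-and-fire neuron. Its synaptic weights (its "memory")
    sit right next to its state (its "processing"), unlike a von Neumann
    machine, where data travels over a bus between separate memory and CPU."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # membrane potential
        self.threshold = threshold  # firing threshold
        self.leak = leak            # decay applied on each incoming spike
        self.synapses = {}          # target neuron id -> weight (local memory)

    def receive(self, weight):
        """Integrate an incoming spike; return True if this neuron fires."""
        self.potential = self.potential * self.leak + weight
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False


def run(neurons, initial_spikes, steps=20):
    """Event-driven simulation: only neurons that actually receive spikes do
    any work, mimicking a brain that "wakes up" to perform a task."""
    events = deque(initial_spikes)  # (target neuron id, weight) pairs
    total_spikes = 0
    for _ in range(steps):
        next_events = deque()
        while events:
            target, weight = events.popleft()
            if neurons[target].receive(weight):
                total_spikes += 1
                # A spike travels down this neuron's "axons" to its targets.
                for dst, w in neurons[target].synapses.items():
                    next_events.append((dst, w))
        events = next_events
    return total_spikes


# Wire up a tiny random network and poke it with a few input spikes.
random.seed(0)
net = [Neuron() for _ in range(100)]
for neuron in net:
    for dst in random.sample(range(100), 5):
        neuron.synapses[dst] = random.uniform(0.2, 0.8)

print(run(net, [(i, 1.0) for i in range(5)]), "spikes fired")
```

No single neuron here is fast, and that is the point the article makes about parallelism: at the quoted scale of 10 trillion synapses each seeing activity on the order of 10 hertz, that works out to something like 100 trillion synaptic events per second, even though every individual component is glacially slow by CPU standards.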
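
And for the grocer’s glove scenario, here is a rough sense of what “learn from the outcomes” could mean in code, again as a toy sketch and not anything from IBM’s project: a single online-learning rule that nudges its weights after each outcome. The sensor features (off-smell, texture, temperature) and the produce-flagging setup are hypothetical stand-ins.

```python
def predict(weights, bias, features):
    """Flag a sensor reading as 'contaminated' if the weighted score is positive."""
    return 1 if sum(w * x for w, x in zip(weights, features)) + bias > 0 else 0


def train_step(weights, bias, features, label, lr=0.1):
    """One perceptron-style update: predict, compare with the observed outcome,
    and nudge the weights only when the prediction was wrong."""
    if predict(weights, bias, features) != label:
        sign = 1 if label == 1 else -1
        weights = [w + sign * lr * x for w, x in zip(weights, features)]
        bias += sign * lr
    return weights, bias


# Hypothetical glove readings: [off-smell intensity, texture softness, temperature],
# with label 1 meaning the produce turned out to be contaminated.
observations = [
    ([0.9, 0.8, 0.7], 1),
    ([0.1, 0.2, 0.3], 0),
    ([0.8, 0.9, 0.6], 1),
    ([0.2, 0.1, 0.4], 0),
]

weights, bias = [0.0, 0.0, 0.0], 0.0
for features, label in observations * 10:   # replay the handful of experiences
    weights, bias = train_step(weights, bias, features, label)

print(predict(weights, bias, [0.85, 0.7, 0.65]))   # a suspicious new reading -> 1
```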