Beyond traditional computers
Neuromorphic computing takes its inspiration from a system that uses a fraction of the power of a traditional computer: the brain. In neuromorphic computing, electronic devices mimic neurons and synapses, and are interconnected in a way that resembles the brain's electrical networks.

The idea is not new: researchers have been working on the technology since the 1980s. But the energy requirements of the AI revolution have increased the pressure to bring the nascent technology into the real world. Current systems and platforms exist mainly as research tools, but supporters say they could offer huge gains in energy efficiency. Those with commercial ambitions include hardware giants such as Intel and IBM, and some systems "have reached commercialisation".

This is an important development, says Tony Kenyon, professor of nanoelectronics and nanophotonic materials at University College London, who works in this field. "While there is still no killer app … there are many areas where neuromorphic computing will provide significant benefits in energy efficiency and performance, and I'm sure we'll start to see widespread adoption of this technology as it matures," he says.
Neuromorphic computing encompasses a variety of approaches, ranging from designs that are merely brain-inspired to near-perfect simulations of the human brain (which remain far out of reach). But some basic design properties set it apart from traditional computing. First, unlike traditional computers, neuromorphic computers do not have separate memory and processing units; instead, both tasks are performed together in the same location on a chip. Professor Kenyon notes that removing the need to transfer data between the two reduces the energy used and speeds up processing times. Event-driven approaches, in which a unit computes only when a spike, or event, arrives rather than on every tick of a global clock, may also become common; a minimal sketch follows.
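To make the event-driven idea concrete, here is a minimal sketch of a leaky integrate-and-fire neuron in Python. It is illustrative only: the names and parameters (LIFNeuron, TAU, THRESHOLD, WEIGHT) are invented for this example and do not correspond to any particular neuromorphic platform. The neuron's state is updated only when an input spike arrives; between events, no work is done.

```python
import math

# Illustrative event-driven leaky integrate-and-fire (LIF) neuron.
# All names and constants here are hypothetical, not taken from any
# real neuromorphic platform or library.

TAU = 20.0        # membrane time constant (ms)
THRESHOLD = 1.0   # firing threshold (arbitrary units)
WEIGHT = 0.4      # potential added per incoming spike

class LIFNeuron:
    def __init__(self):
        self.v = 0.0       # membrane potential
        self.last_t = 0.0  # time of the last event (ms)

    def receive_spike(self, t):
        """Process one input spike at time t; return True if the neuron fires."""
        # Decay the potential for the interval since the last event,
        # then add the contribution of the incoming spike. No computation
        # happens between events -- this is the event-driven property.
        self.v = self.v * math.exp(-(t - self.last_t) / TAU) + WEIGHT
        self.last_t = t
        if self.v >= THRESHOLD:
            self.v = 0.0   # reset after firing
            return True
        return False

# Spikes arriving close together push the neuron over threshold;
# widely spaced spikes decay away without triggering output.
neuron = LIFNeuron()
for t in [1.0, 2.0, 3.0, 50.0, 90.0]:
    if neuron.receive_spike(t):
        print(f"output spike at t={t} ms")
```

In this toy run, state is touched only five times, once per spike; a clock-driven simulation of the same 90 ms at 1 ms resolution would perform 90 updates. That gap, scaled up to millions of neurons, is the efficiency argument for event-driven hardware in miniature.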