A team of scientists at IBM Research in Zurich has created artificial neurons that use phase-change materials to store and process data. These phase-change-based artificial neurons can detect patterns and discover correlations in big data and unsupervised machine learning applications. The results of the decade-long research into phase-change materials for memory applications were recently published in the journal Nature Nanotechnology. The research team is led by Evangelos Eleftheriou.

Energy-efficient, ultra-dense neuromorphic technologies for cognitive computing are attracting considerable attention. This technology, a foundation for event-based computation, could lead to extremely dense neuromorphic computing systems (computers inspired by the efficiency of the human brain) with co-located memory and processing units that speed up cognitive computing and the analysis of IoT big data. Deep learning, the rapidly developing branch of artificial intelligence, is likewise inspired by how biological brains are put together.

The team applied a series of electrical pulses to the artificial neurons, causing progressive crystallization of the phase-change material until the neuron fired, mimicking the "integrate-and-fire" property of biological neurons. Hundreds of artificial neurons were organized into populations and used to represent fast and complex signals. Moreover, the artificial neurons have been shown to sustain billions of switching cycles, which would correspond to multiple years of operation at an update frequency of 100 Hz. Each neuron update required less than five picojoules of energy at an average power of less than 120 microwatts; for comparison, a 60-watt lightbulb consumes 60 million microwatts.
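The integrate-and-fire behavior described above can be sketched in a few lines of Python. This is a minimal illustrative model, not IBM's device physics: the threshold and per-pulse increment are assumed values, and the internal state variable simply stands in for the progressive crystallization of the phase-change cell.

```python
class IntegrateAndFireNeuron:
    """Toy integrate-and-fire neuron (illustrative sketch, not the IBM device model)."""

    def __init__(self, threshold=1.0, increment=0.15):
        self.threshold = threshold  # firing threshold (arbitrary units, assumed)
        self.increment = increment  # state change per input pulse (assumed)
        self.state = 0.0            # stands in for the crystallization level

    def apply_pulse(self):
        """Apply one electrical pulse; return True if the neuron fires."""
        self.state += self.increment  # pulse progressively "crystallizes" the cell
        if self.state >= self.threshold:
            self.state = 0.0          # firing resets the cell (re-amorphization)
            return True
        return False


neuron = IntegrateAndFireNeuron()
spikes = [neuron.apply_pulse() for _ in range(20)]
print(spikes.count(True))  # with these parameters, fires on the 7th and 14th pulses -> prints 2
```

The key point the sketch captures is that no single pulse triggers a spike; the neuron accumulates inputs over time and fires only once the integrated state crosses a threshold, which is what makes the scheme event-based.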
In this interview, research team member Manuel Le Gallo, an IBM Research scientist, talks about what makes neuromorphic computing more efficient than conventional computing.