Tweaking AI Software to Function Like a Human Brain Improves Computer's Learning Ability
https://gumc.georgetown.edu/news-release/tweaking-ai-software-to-function-like-a-human-brain-improves-computers-learning-ability/
Computer-based artificial intelligence can function more like human intelligence when programmed to use a much faster technique for learning new objects, say two neuroscientists who designed a model to mirror human visual learning. They reported their results in the journal Frontiers in Computational Neuroscience.
"Our model provides a biologically plausible way for artificial neural networks to learn new visual concepts from a small number of examples," says Riesenhuber. "We can get computers to learn much better from few examples by leveraging prior learning in a way that we think mirrors what the brain is doing."
Humans can quickly and accurately learn new visual concepts from sparse data—sometimes just a single example. Even three- to four-month-old babies can easily learn to recognize zebras and distinguish them from cats, horses, and giraffes. But computers typically need to "see" many examples of the same object to know what it is, Riesenhuber explains.
The big change needed was in designing software to identify relationships between entire visual categories, instead of trying the more standard approach of identifying an object using only low-level and intermediate information, such as shape and color, Riesenhuber says.
"The computational power of the brain's hierarchy lies in the potential to simplify learning by leveraging previously learned representations from a databank, as it were, full of concepts about objects," he says.
Riesenhuber and Rule found that artificial neural networks that represent objects in terms of previously learned concepts learn new visual concepts significantly faster.
Rule explains, "Rather than learn high-level concepts in terms of low-level visual features, our approach explains them in terms of other high-level concepts. It is like saying that a platypus looks a bit like a duck, a beaver, and a sea otter."
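The idea of re-describing a new concept in terms of familiar ones can be sketched in a few lines. This is a minimal illustration, not the authors' actual network; the prototype vectors, the cosine-similarity measure, and the nearest-neighbor readout are all assumptions made here for clarity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prototypes for previously learned high-level concepts,
# each a point in a shared feature space (names and dimensions are
# illustrative, not taken from the paper).
prototypes = {
    "duck": rng.normal(size=64),
    "beaver": rng.normal(size=64),
    "otter": rng.normal(size=64),
}

def concept_code(x):
    """Re-describe a sample by its cosine similarity to each known concept."""
    return np.array([
        np.dot(x, p) / (np.linalg.norm(x) * np.linalg.norm(p))
        for p in prototypes.values()
    ])

# One-shot learning: a single "platypus" example is stored as a short
# vector of similarities to known concepts, not as raw features.
platypus = (0.5 * prototypes["duck"] + 0.3 * prototypes["beaver"]
            + 0.2 * prototypes["otter"] + 0.1 * rng.normal(size=64))
memory = {name: concept_code(p) for name, p in prototypes.items()}
memory["platypus"] = concept_code(platypus)

def classify(x):
    """Nearest stored concept code wins."""
    code = concept_code(x)
    return min(memory, key=lambda name: np.linalg.norm(code - memory[name]))
```

Because the new concept lives in the low-dimensional space of similarities to things already known, one example suffices to place it, which is the leverage over prior learning the quote describes.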
----------------------------------------
Research Team Demonstrates World's Fastest Optical Neuromorphic Processor
https://techxplore.com/news/2021-01-team-world-fastest-optical-neuromorphic.html
An international team of researchers led by Swinburne University of Technology has demonstrated the world's fastest and most powerful optical neuromorphic processor for artificial intelligence (AI), which operates faster than 10 trillion operations per second (TeraOPs/s) and is capable of processing ultra-large-scale data. Published in the journal Nature, this breakthrough represents an enormous leap forward for neural networks and neuromorphic processing in general.
The team demonstrated an optical neuromorphic processor operating more than 1000 times faster than any previous processor, with the system also processing record-sized ultra-large scale images—enough to achieve full facial image recognition, something that other optical processors have been unable to accomplish.
"This breakthrough was achieved with 'optical micro-combs', as was our world-record internet data speed reported in May 2020," says Professor Moss, Director of Swinburne's Optical Sciences Centre.
While state-of-the-art electronic processors such as the Google TPU can operate beyond 100 TeraOPs/s, they do so with tens of thousands of parallel processors. In contrast, the optical system demonstrated by the team uses a single processor, achieving its speed with a new technique that simultaneously interleaves the data in time, wavelength and spatial dimensions through an integrated micro-comb source.
"This processor can serve as a universal ultrahigh-bandwidth front end for any neuromorphic hardware—optical or electronic based—bringing massive-data machine learning for real-time ultrahigh-bandwidth data within reach," says co-lead author of the study, Dr. Xu, Swinburne alum and postdoctoral fellow with the Electrical and Computer Systems Engineering Department at Monash University.
"We're currently getting a sneak peek of how the processors of the future will look. It's really showing us how dramatically we can scale the power of our processors through the innovative use of microcombs," Dr. Xu explains.
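The time-wavelength interleaving idea can be emulated numerically: encode one kernel weight per comb wavelength, let dispersion delay each wavelength by one time slot, and let the photodetector sum all wavelengths at each instant. A rough sketch of that principle (the tap count, delays and signal values are illustrative, not the paper's parameters):

```python
import numpy as np

# Hypothetical kernel weights, one per micro-comb wavelength line.
weights = np.array([0.2, -0.5, 1.0, -0.5, 0.2])

# Input data streamed in time (e.g. a flattened image row).
signal = np.array([0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 0.0])

def photonic_conv(signal, weights):
    """Emulate the interleaving scheme: each wavelength carries a copy of
    the signal scaled by one weight, dispersion delays wavelength k by k
    time slots, and the photodetector sums all wavelengths per slot."""
    out = np.zeros(len(signal) + len(weights) - 1)
    for k, w in enumerate(weights):
        out[k:k + len(signal)] += w * signal  # delayed, weighted copy
    return out

# The summed photodetector output is a standard convolution of the
# signal with the weights, computed as the data streams through.
result = photonic_conv(signal, weights)
```

The appeal of the scheme is that every multiply-accumulate happens passively as light propagates, so throughput is set by the data rate rather than by a clocked multiplier array.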
Xingyuan Xu et al. 11 TOPS photonic convolutional accelerator for optical neural networks, Nature (2021). https://www.nature.com/articles/s41586-020-03063-0
----------------------------------------
Machine Learning at the Speed of Light: New Paper Demonstrates Use of Photonic Structures for AI
https://techxplore.com/news/2021-01-machine-paper-photonic-ai.html
Light-based processors, called photonic processors, enable computers to complete complex calculations at incredible speeds. New research published this week in the journal Nature examines the potential of photonic processors for artificial intelligence applications.
The results demonstrate for the first time that these devices can process information rapidly and in parallel, something that today's electronic chips cannot do. The researchers combined phase-change materials—the storage material used, for example, on DVDs—and photonic structures to store data in a nonvolatile manner without requiring a continual energy supply. This study is also the first to combine these optical memory cells with a chip-based frequency comb as a light source, which is what allowed them to calculate on 16 different wavelengths simultaneously.
In the paper, the researchers used the technology to create a convolutional neural network that recognizes handwritten numbers. They found that the method yielded unprecedented data rates and computing densities.
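A convolution is just a batch of dot products, which is why a tensor core that evaluates matrix multiplications in one pass, here optically across many wavelengths, accelerates convolutional networks directly. A plain-numpy sketch of that reshaping (the image and kernel sizes are illustrative, not the paper's):

```python
import numpy as np

def im2col(image, k):
    """Unroll every k x k patch of a 2-D image into one row of a matrix."""
    h, w = image.shape
    return np.array([
        image[i:i + k, j:j + k].ravel()
        for i in range(h - k + 1)
        for j in range(w - k + 1)
    ])

rng = np.random.default_rng(1)
image = rng.random((6, 6))
kernel = rng.random((3, 3))

# The whole convolution collapses to one matrix-vector product -- the
# operation a photonic tensor core performs in a single pass, with
# different wavelengths carrying different inputs in parallel.
patches = im2col(image, 3)                              # shape (16, 9)
feature_map = (patches @ kernel.ravel()).reshape(4, 4)  # all outputs at once
```

Wavelength multiplexing lets many such products proceed simultaneously, which is the "many matrix multiplications in just one timestep" described in the quote below.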
"Exploiting light for signal transference enables the processor to perform parallel data processing through wavelength multiplexing, which leads to a higher computing density and many matrix multiplications being carried out in just one timestep. In contrast to traditional electronics, which usually work in the low GHz range, optical modulation can be achieved at speeds up to the 50 to 100 GHz range."
J. Feldmann et al. Parallel convolutional processing using an integrated photonic tensor core, Nature (2021). https://www.nature.com/articles/s41586-020-03070-1
----------------------------------------
Accelerating AI Computing to the Speed of Light
https://techxplore.com/news/2021-01-ai.html
A University of Washington-led team has come up with an optical computing core prototype that uses phase-change material. This system is fast, energy efficient and capable of accelerating the neural networks used in AI and machine learning. The technology is also scalable and directly applicable to cloud computing.
The team published these findings Jan. 4 in Nature Communications.
Changming Wu et al. Programmable phase-change metasurfaces on waveguides for multimode photonic convolutional neural network, Nature Communications (2021). https://www.nature.com/articles/s41467-020-20365-z