Global Technology Briefing

Training neural networks to perform tasks such as recognizing images or navigating self-driving cars currently requires a lot of computing hardware and electricity. But thanks to a new artificial neuron device developed by researchers at the University of California San Diego, such neural network computations may eventually require 100 to 1,000 times less energy and chip area than existing CMOS-based hardware.

Neural networks are a series of connected layers of artificial neurons, where the output of one layer provides the input to the next. Generating that input is done by applying a mathematical calculation called a non-linear activation function. This is a critical aspect of running a neural network, but applying this function requires a lot of computing power and circuitry...
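The layered computation described above can be sketched in a few lines of plain Python. This is only an illustration of the general principle, not the researchers' device: the ReLU activation, the layer sizes, and the weight values are all assumptions chosen for the example.

```python
def relu(x):
    # A common non-linear activation function: passes positive
    # values through unchanged and zeroes out negative ones.
    return max(0.0, x)

def layer(inputs, weights, biases):
    # One fully connected layer: each neuron computes a weighted
    # sum of the inputs plus a bias, then applies the activation.
    return [relu(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Two tiny layers chained together: the activated output of the
# first layer becomes the input to the second, as the article
# describes. The weights and biases here are arbitrary examples.
hidden = layer([1.0, -2.0], [[0.5, -0.5], [1.0, 1.0]], [0.0, 0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
```

Even in this toy version, the activation function is applied once per neuron per layer, which is why performing it cheaply in hardware matters at scale.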
