Global Technology Briefing

Training neural networks to perform tasks such as recognizing images or navigating self-driving cars currently requires large amounts of computing hardware and electricity. But thanks to a new artificial neuron device developed by researchers at the University of California San Diego, such neural network computations may eventually require 100 to 1,000 times less energy and chip area than existing CMOS-based hardware.

Neural networks are a series of connected layers of artificial neurons, where the output of one layer provides the input to the next. Generating that input is done by…
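For readers who want to see the mechanics, here is a minimal sketch in Python (using NumPy; not part of the original briefing) of the layered structure described above: each layer takes the previous layer's outputs, forms a weighted sum, and passes it through a non-linear activation. The ReLU activation, layer sizes, and random weights are illustrative assumptions, not details from the UC San Diego device.

```python
import numpy as np

def relu(x):
    # Illustrative non-linear activation applied by each artificial neuron
    # to its weighted inputs (ReLU chosen here as a common example).
    return np.maximum(0.0, x)

def layer_forward(inputs, weights, biases):
    # One layer: weighted sum of the previous layer's outputs,
    # followed by the non-linear activation.
    return relu(inputs @ weights + biases)

# Toy two-layer network: 4 inputs -> 8 hidden units -> 3 outputs.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

hidden = layer_forward(x, w1, b1)       # output of the first layer becomes...
output = layer_forward(hidden, w2, b2)  # ...the input to the second layer
print(output)
```

In conventional hardware, every one of these multiply-accumulate and activation steps is carried out digitally, which is part of why training and running such networks is so energy-intensive.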

