Computing scheme accelerates machine learning while improving energy efficiency of traditional data operations

Artificial intelligence (AI) models such as ChatGPT run on algorithms and have voracious appetites for data, which they process through machine learning. But what limits their data-processing abilities? Researchers led by Professor Sun Zhong of Peking University's School of Integrated Circuits and Institute for Artificial Intelligence set out to overcome the von Neumann bottleneck that constrains data processing.