Published: February 15, 2022
A recent article in Quanta Magazine explored why neural networks must be much larger than conventionally expected. The findings come from a paper by Sébastien Bubeck of Microsoft Research and Mark Sellke of Stanford University, which proves that fitting data robustly, rather than merely memorizing it, requires far more parameters: roughly the number of data points multiplied by the data's dimension. The result gives some insight into a question that has come up repeatedly over several decades: why heavily overparameterized networks perform so well in practice.
WNCG professor Alex Dimakis was quoted in the article, providing background on overparameterization in neural networks.
Read the full article at quantamagazine.org.