NeurIPS 2019 Paper Review
Reviewed by Andrew S – Quantitative Researcher

Random deep neural networks are biased towards simple functions
Giacomo De Palma, Bobak Toussi Kiani, Seth Lloyd

Deep neural networks often have more parameters than data points in their training set, so one might naively expect them to be severely overfit and to have poor generalisation properties. They are fully capable of learning randomly-labelled […]