Representation power of neural networks.
This talk will present a series of mathematical vignettes on the representation power of neural networks. Among classical results, the universal approximation theorem will be presented, along with Kolmogorov's superposition theorem. Recent results will include depth hierarchies (for any choice of depth, there exist functions which slightly less deep networks can approximate only when they have exponential size), connections to rational functions (namely, rational functions and neural networks approximate each other well), and the power of recurrent networks. Open problems will be sprinkled throughout.
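As a flavor of the depth-hierarchy phenomenon, one standard construction composes a small "triangle" (hat) map with itself: each composition doubles the number of oscillations, so a depth-k network built from a constant-size ReLU block produces exponentially many oscillations, which a much shallower network can match only with exponential width. The sketch below (using NumPy; the specific function and peak-counting scheme are illustrative choices, not taken from the talk) demonstrates the oscillation count.

```python
import numpy as np

def triangle(x):
    # Hat map: t(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1],
    # written as a two-unit ReLU network: t(x) = 2*relu(x) - 4*relu(x - 1/2).
    relu = lambda z: np.maximum(z, 0.0)
    return 2 * relu(x) - 4 * relu(x - 0.5)

def deep_net(x, k):
    # Compose the constant-size ReLU block k times: a depth-O(k) network.
    for _ in range(k):
        x = triangle(x)
    return x

# Sample on a dyadic grid so the peaks land exactly on grid points.
xs = np.linspace(0.0, 1.0, 4097)
ys = deep_net(xs, 5)

# Count strict local maxima; composing k times yields 2^(k-1) peaks.
peaks = int(np.sum((ys[1:-1] > ys[:-2]) & (ys[1:-1] > ys[2:])))
print(peaks)  # 2^(5-1) = 16 oscillation peaks from only 5 layers
```

A network of depth 5 thus realizes 16 peaks with a handful of units per layer; matching this oscillation count at a fixed smaller depth forces the width to grow exponentially in k, which is the heart of the depth-separation argument.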
Contact: Sheila Shull at 626.395.4560 firstname.lastname@example.org