Caltech applied mathematicians develop methodologies for modeling and analyzing data using tools from probability and statistics, signal processing and information theory, as well as linear algebra and optimization. Venkat Chandrasekaran has introduced widely used methodologies for statistical inference and signal processing. Babak Hassibi and Victoria Kostina develop algorithms for signal processing, communications, and learning. Franca Hoffmann has designed new techniques for inference on graphs and ensemble filtering. Eric Mazumdar studies the theory of learning algorithms in uncertain, dynamic environments.
Houman Owhadi has established probabilistic approaches to scientific computing (solving partial differential equations as learning/inference problems with kernels and Gaussian processes). He is known for his work on uncertainty quantification, data assimilation, the robustness/brittleness of inference, and adversarial methods. Oscar Bruno has introduced machine learning techniques for identifying and treating flow singularities and for solving inverse problems. Tom Hou has introduced new data-driven methods to extract hidden patterns and instantaneous frequencies from nonlinear time series, and has developed multiscale invertible deep generative neural networks to study high-dimensional Bayesian inference problems.
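To make the kernel/Gaussian-process viewpoint concrete, the following is a minimal, self-contained sketch of Gaussian-process regression, the inference primitive that underlies kernel-based approaches to scientific computing. The kernel choice, length scale, and toy data here are illustrative assumptions, not a reproduction of any specific method mentioned above.

```python
import numpy as np

def rbf(X, Y, ell=0.2):
    # Squared-exponential (RBF) kernel between two 1-D point sets.
    d = X[:, None] - Y[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Toy data: noisy observations of a hidden function (hypothetical example).
rng = np.random.default_rng(0)
X_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * X_train) + 0.05 * rng.standard_normal(10)

# GP posterior mean at test points: K(X*, X) (K(X, X) + sigma^2 I)^{-1} y.
sigma2 = 0.05 ** 2
K = rbf(X_train, X_train) + sigma2 * np.eye(len(X_train))
X_test = np.linspace(0.0, 1.0, 50)
mean = rbf(X_test, X_train) @ np.linalg.solve(K, y_train)
```

In kernel-based PDE solvers the same machinery is applied with observations given by the PDE residual at collocation points rather than direct function values; this sketch shows only the regression step.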
Andrew Stuart works on scientific machine learning, data assimilation and inverse problems, and more generally is interested in developing methodology and theory for the integration of model- and data-centric predictive science and engineering. Joel Tropp has developed now-standard algorithms for sparse optimization and randomized matrix computations, and he formulated the theory of matrix concentration.
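As an illustration of randomized matrix computations, here is a short sketch of a randomized SVD in the range-finder style: sketch the column space of a matrix with a Gaussian test matrix, then take an exact SVD of the resulting small matrix. The rank, oversampling parameter, and toy low-rank matrix are illustrative assumptions.

```python
import numpy as np

def randomized_svd(A, k, p=5, seed=None):
    # Randomized SVD via a Gaussian sketch of the range of A,
    # followed by an exact SVD of a small (k+p) x n matrix.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis for the sketch
    B = Q.T @ A                               # project A onto that basis
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]

# Toy example: an exactly rank-10 matrix is recovered to high accuracy.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 100))
U, s, Vt = randomized_svd(A, k=10, seed=2)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

Because the test matrix has more columns than the target rank (oversampling by `p`), the sketch captures the range of an exactly low-rank matrix with overwhelming probability, so the relative error here is near machine precision.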
There are strong connections between the mathematics of data science and other CMS strengths. Contributors include machine learning researchers Anima Anandkumar, Frederick Eberhardt, and Yisong Yue.