## Fall Semester 1997
## College of Natural Science

Refreshments served at 11:15

Thursday, October 2, 1997
Exploring the Energy Landscapes of Folding Proteins

Peter Wolynes, School of Chemical Sciences, University of Illinois*
In analogy with other complex systems such as glasses and spin glasses, a protein can exist in many states. To understand its dynamical processes, it is necessary to characterize the energies of a large ensemble of structures statistically. The energy landscape theory of folding provides a practical tool for developing new algorithms to predict a protein's structure from its amino acid sequence. *Member, National Academy of Sciences.

Thursday, November 6, 1997
Asymptotics of Randomly Perturbed Dynamical Systems

A. V. Skorokhod, Department of Statistics and Probability, Michigan State University*
The main results, obtained in recent years by the speaker and colleagues, relate to averaging theorems, deviations of a perturbed system from its average, and the construction of diffusion approximations of randomly perturbed systems. As applications we consider conservative and gradient perturbed systems, as well as systems arising in genetics, ecology, and demographics. *Academician, Ukrainian Academy of Sciences.

Thursday, December 4, 1997
Maximum Likelihood in Positive Linear Inverse Problems

Yehuda Vardi, Department of Statistics, Rutgers University
Positive linear inverse problems are ubiquitous in scientific applications. The talk will present maximum-likelihood-based methods for approximate solutions of such problems. Examples and applications will be described, including a new algorithm for approximate inversion of the discrete Radon transform (recovering a set on the lattice from a few of its projections in "main directions").
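As background to the abstract above, a minimal sketch of the classical EM (Richardson–Lucy-type) iteration for maximum-likelihood estimation in positive linear inverse problems with Poisson-distributed counts y ≈ Ax. This is a generic illustration, not necessarily the new algorithm of the talk; the matrix and data are invented.

```python
# Hypothetical illustration: the multiplicative EM iteration for a nonnegative
# solution x of y ~ Poisson(Ax).  The iterate stays positive by construction.

def em_step(A, x, y):
    """One EM update: x_j <- x_j * sum_i A_ij * y_i / (Ax)_i / sum_i A_ij."""
    m, n = len(A), len(x)
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
    col_sum = [sum(A[i][j] for i in range(m)) for j in range(n)]
    return [
        x[j] * sum(A[i][j] * y[i] / Ax[i] for i in range(m)) / col_sum[j]
        for j in range(n)
    ]

# Toy 2x2 system with exact nonnegative solution x* = (1, 2) and noiseless data.
A = [[1.0, 0.5],
     [0.2, 1.0]]
x_true = [1.0, 2.0]
y = [sum(A[i][j] * x_true[j] for j in range(2)) for i in range(2)]

x = [1.0, 1.0]  # any positive starting point works
for _ in range(500):
    x = em_step(A, x, y)
```

With noiseless data the likelihood is maximized at the exact solution, so the iterates converge to (1, 2); with real Poisson counts the same iteration converges to the maximum-likelihood estimate instead.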

Thursday, January 15, 1998
Sculpting of Fractal River Basins

Jayanth R. Banavar, Department of Physics, Penn State University
Transportation networks for the aggregation of precipitated water in the drainage basins of rivers offer striking examples of scale-free structures, with nearly universal exponents characterizing the power-law behavior of various physical quantities. I will describe attempts to understand the selection principle employed by nature in the choice of these fractal networks.

Thursday, February 5, 1998
On-Line Learning Algorithms and Related Topics

Robert Schapire, AT&T Laboratories
In on-line learning, learning takes place over a sequence of trials. On each trial, the learner makes a prediction and then receives feedback, so that training and testing take place at the same time. There are a number of simple but highly robust learning algorithms for this general setup, including some that can be analyzed and shown to work quite well even when no statistical assumptions of any kind are made about the trials. In this talk, I will introduce these algorithms and draw connections with game theory, portfolio selection, multi-armed bandit problems, and boosting (combining the advice of several experts).
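One standard algorithm in this trial-by-trial setting is multiplicative-weights prediction from expert advice (weighted majority): predict by a weighted vote, then shrink the weights of the experts that were wrong. The sketch below is a generic illustration with an invented trial sequence, not material from the talk itself.

```python
import math

def predict(weights, advice):
    """Weighted-majority vote over binary expert predictions."""
    score = sum(w for w, a in zip(weights, advice) if a == 1)
    return 1 if score >= sum(weights) / 2 else 0

def update(weights, advice, outcome, eta=0.5):
    """Multiply each wrong expert's weight by exp(-eta)."""
    return [w * math.exp(-eta * (a != outcome)) for w, a in zip(weights, advice)]

# Three experts; on this invented sequence expert 0 is always correct.
trials = [([1, 0, 1], 1), ([0, 1, 1], 0), ([1, 1, 0], 1), ([0, 0, 1], 0)]
weights = [1.0, 1.0, 1.0]
mistakes = 0
for advice, outcome in trials:
    mistakes += predict(weights, advice) != outcome
    weights = update(weights, advice, outcome)
```

The analysis alluded to in the abstract bounds the learner's mistakes in terms of the best expert's mistakes, with no statistical assumptions on how the trials are generated.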

Thursday, March 5, 1998
Curve Estimation

Ildar Ibragimov, Steklov Mathematical Institute, St. Petersburg*
The problem goes back to the classic works of Gauss, Legendre, and Laplace connected with the determination of the paths of planets and comets (the method of least squares). In this classical approach a function f is supposed known up to a few unknown parameters which have to be estimated. In more recent work one considers the problem of estimating f without a priori knowledge of its form. The purpose of this lecture is to give a survey of these old and new problems of curve estimation and to describe the mathematical methods used to investigate them. *Academician, Russian Academy of Sciences.
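A toy version of the classical parametric setting the abstract mentions: a curve known up to a few parameters, estimated by least squares. Here a line f(t) = a + b·t is fitted by solving the 2×2 normal equations; the data points are invented for illustration.

```python
def fit_line(ts, ys):
    """Least-squares estimates of (a, b) for the model y = a + b*t."""
    n = len(ts)
    st, sy = sum(ts), sum(ys)
    stt = sum(t * t for t in ts)
    sty = sum(t * y for t, y in zip(ts, ys))
    det = n * stt - st * st          # determinant of the normal equations
    a = (stt * sy - st * sty) / det
    b = (n * sty - st * sy) / det
    return a, b

# Noiseless data from y = 2 + 3t, so least squares recovers (2, 3) exactly.
ts = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 + 3.0 * t for t in ts]
a, b = fit_line(ts, ys)
```

The nonparametric problems surveyed in the talk drop the assumption that f has such a fixed finite-parameter form.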

Thursday, April 2, 1998
Wavelet-Domain Estimation Methods for Photon Imaging

Robert Nowak, Department of Electrical Engineering, Michigan State University
We present a wavelet-domain filtering procedure for photon imaging in the presence of Poisson noise and apply it to imaging in nuclear medicine.
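For readers unfamiliar with wavelet-domain filtering, a generic one-level Haar sketch (not the specific procedure of the talk): transform, threshold the detail coefficients, and invert. The signal and threshold are invented for illustration.

```python
def haar_forward(s):
    """One level of the orthonormal Haar transform of an even-length signal."""
    r = 2 ** 0.5
    approx = [(s[2 * i] + s[2 * i + 1]) / r for i in range(len(s) // 2)]
    detail = [(s[2 * i] - s[2 * i + 1]) / r for i in range(len(s) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    r = 2 ** 0.5
    s = []
    for a, d in zip(approx, detail):
        s.extend([(a + d) / r, (a - d) / r])
    return s

def denoise(s, thresh):
    approx, detail = haar_forward(s)
    detail = [d if abs(d) > thresh else 0.0 for d in detail]  # hard threshold
    return haar_inverse(approx, detail)

# Piecewise-constant counts plus small jitter; thresholding removes the jitter.
noisy = [10.2, 9.8, 10.1, 9.9, 30.3, 29.7, 30.2, 29.8]
clean = denoise(noisy, thresh=0.5)
```

Methods tailored to Poisson noise, as in the talk, adapt the shrinkage to the intensity-dependent variance of photon counts rather than using one fixed threshold.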

Thursday, April 23, 1998
Multiplicative Update Algorithms: New Learning Algorithms

Manfred Warmuth, Computer and Information Sciences, U. of California, Santa Cruz
A new family of algorithms has recently been developed within the Computational Learning Theory community. These algorithms update their parameters using gradients of the loss function in their exponent, and they behave radically differently from algorithms based on heuristics such as the direct use of gradient descent. We show how the new family can be applied to linear regression, density estimation, and Hidden Markov Models.
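The best-known member of this family is the exponentiated-gradient (EG) update of Kivinen and Warmuth for on-line linear regression: the weights stay on the probability simplex and are multiplied by the exponential of a scaled negative gradient of the squared loss. A minimal sketch with an invented data stream:

```python
import math
import random

def eg_step(w, x, y, eta=0.2):
    """One EG update for squared loss (y_hat - y)^2 with prediction y_hat = w.x."""
    y_hat = sum(wi * xi for wi, xi in zip(w, x))
    grad = 2.0 * (y_hat - y)          # derivative of the squared loss in y_hat
    w_new = [wi * math.exp(-eta * grad * xi) for wi, xi in zip(w, x)]
    z = sum(w_new)                    # renormalize back onto the simplex
    return [wi / z for wi in w_new]

# Target is the second coordinate, y = x[1]; EG shifts weight onto it.
random.seed(0)
w = [1 / 3, 1 / 3, 1 / 3]
for _ in range(1000):
    x = [random.random() for _ in range(3)]
    w = eg_step(w, x, x[1])
```

Note the contrast with gradient descent: the gradient appears in the exponent of a multiplicative factor rather than as an additive step, which is exactly the distinction the abstract draws.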

## Raoul LePage
## entropy@msu.edu