Machine learning and Information Theory are two broad research areas with strong connections. In this presentation, we will cover two topics that explore the use of information-theoretic measures in learning. For the first topic, we will present results showing how well mutual information predicts the operational quality of a transformation (or encoder) in classification. These results offer new insights into the adoption of information measures, such as mutual information and cross-entropy, in machine learning. For the second topic, we will discuss the idea of information sufficiency, which represents a model's latent structure, and explore a non-parametric, data-driven method to detect this type of structure from data. We will focus on the learning-decision task of testing independence using a non-parametric mutual information estimator. We present non-asymptotic and asymptotic results that support the advantages of this approach and elaborate on applications to learning data structure and model-change detection.
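To make the independence-testing idea concrete, here is a minimal illustrative sketch, not the estimator from the talk: a plug-in (histogram) estimate of mutual information combined with a permutation test that rejects independence when the observed estimate is large relative to estimates under shuffled pairings. All function names, bin counts, and thresholds below are our own illustrative choices.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in (histogram) estimate of I(X;Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                    # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y
    mask = pxy > 0
    # KL divergence between the joint and the product of marginals
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def independence_test(x, y, bins=8, n_perm=200, alpha=0.05, seed=0):
    """Permutation test of independence based on the plug-in MI estimate.

    Shuffling y breaks any dependence, so the shuffled estimates
    approximate the null distribution of the statistic.
    """
    rng = np.random.default_rng(seed)
    observed = mutual_information(x, y, bins)
    null = [mutual_information(x, rng.permutation(y), bins)
            for _ in range(n_perm)]
    p_value = (1 + sum(m >= observed for m in null)) / (1 + n_perm)
    return observed, p_value, p_value < alpha
```

For strongly dependent pairs (e.g., `y = x`), the observed estimate exceeds every shuffled estimate and the test rejects; for independent pairs, the observed estimate falls inside the null distribution and the test does not reject.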
Jorge F. Silva (Senior Member, IEEE) is an Associate Professor in the Electrical Engineering (EE) Department at Universidad de Chile and a Principal Investigator with the Advanced Center of Electrical and Electronic Engineering, Valparaíso, Chile. He received the M.Sc. and Ph.D. degrees in Electrical Engineering from the University of Southern California (USC), Los Angeles, CA, USA, in 2005 and 2008, respectively. He was a Research Assistant with the Signal Analysis and Interpretation Laboratory (SAIL) at USC from 2003 to 2008 and a Research Intern with the Speech Research Group, Microsoft Corporation, Redmond, WA, in 2005. He received the Viterbi School of Engineering Outstanding Thesis Award for Theoretical Research (2009), the Viterbi Doctoral Fellowship (2007-2008), and the Simon Ramo Scholarship (2007-2008) at USC. He was an Associate Editor for the IEEE Transactions on Signal Processing from 2006 to 2008.