Seminar by Dr. Denis Pankratov (University of Toronto)

January 29, 2018


Speaker: Dr. Denis Pankratov (University of Toronto)

Title: Information-Theoretic Bottlenecks in Computer Science

Date: Monday, January 29, 2018

Time: 10:30 a.m. - 12:00 p.m.

Room: EV 11.119

ABSTRACT

General computation is captured by Turing machines or circuits. These formalisms allow the study of both upper and lower bounds, and theoretical computer science has made tremendous progress on both fronts. The classes P, NP, and P/poly, the notion of completeness, and various algorithmic paradigms are among the most useful guiding principles for the design of new algorithms. Unfortunately, impossibility results are often based on unproven, although widely believed, assumptions. Proving an unconditional superpolynomial lower bound for a problem in NP seems completely out of reach of current techniques (currently, the best such circuit lower bound is 3.011n - o(n)). Many other computational models, weaker than or incomparable to general computation, do admit unconditional lower bounds and also provide useful guiding principles for designing new algorithms. These models abstract away general computation but introduce an information-theoretic bottleneck. In this talk, I will survey some of my results in such models. Time permitting, we will discuss communication complexity, online algorithms, proof complexity, monotone circuit complexity, and interactive information theory.

BIO

Denis Pankratov received his PhD from the University of Chicago. He is now a postdoctoral fellow at the University of Toronto. His main research interests are in theoretical computer science. More specifically, Denis is interested in conceptually simple algorithms and in understanding computation in terms of information flow and communication complexity.



