
Logic Seminar - Greg Igusa

December 1, 2015
1:45PM - 2:45PM
Math Tower 154

Title: Reducibilities utilizing incomplete or imperfect information

Speaker: Greg Igusa (Notre Dame)

Abstract: In complexity theory there is an empirically observed phenomenon: a problem can be easy to solve in practice despite having provably high worst-case complexity. This discrepancy has frequently been dismissed as an unfortunate situation in which theory fails to predict practice, but there has been recent work in complexity theory to study the phenomenon rigorously.

In 1986, Levin introduced "average-case complexity." Then, in 2003, Kapovich, Miasnikov, Schupp and Shpilrain introduced "generic-case complexity." Generic-case complexity looks at the majority of instances of a problem while completely ignoring the most difficult ones. In particular, an unsolvable problem can have a well-defined generic-case complexity as long as there is an algorithm that can solve most instances of the problem.
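
For readers less familiar with the area, here is a rough sketch of the definition behind generic-case complexity (standard in the literature; the notation below is ours, not taken from the talk):

% Asymptotic density of a set S of instances, measured over instances of size at most n:
\[
  \rho(S) \;=\; \lim_{n \to \infty} \frac{|\{\,x \in S : |x| \le n\,\}|}{|\{\,x : |x| \le n\,\}|}.
\]
% An algorithm decides a problem generically within a time bound T if there is a set S of
% instances with \rho(S) = 1 such that on every x in S the algorithm halts within T(|x|)
% steps and answers correctly; outside S it may run forever or give no answer at all.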

In 2012, Jockusch and Schupp introduced "generic computability" to study precisely the recursion-theoretic content of these sorts of algorithms. A generic computation of a real is a computation that usually halts: a partial recursive function whose domain has density 1 (in the sense of asymptotic density on the natural numbers) and which correctly computes the real on its domain.
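
For concreteness, the standard notation (a sketch; here a real is identified with a set A of natural numbers via its characteristic function):

% Asymptotic density of A as a subset of the natural numbers (when the limit exists):
\[
  \rho(A) \;=\; \lim_{n \to \infty} \frac{|A \cap \{0, 1, \dots, n-1\}|}{n}.
\]
% A partial computable function \varphi is a generic computation of A when
% \rho(\mathrm{dom}\,\varphi) = 1 and \varphi(n) = A(n) for every n in \mathrm{dom}\,\varphi:
% the computation never lies, but may diverge on a density-0 set of inputs ("incomplete information").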

Jockusch and Schupp also introduced a related notion, "coarse computability." A real is coarsely computable if there is a total computable function that is correct about that real on a set of density 1 (an algorithm that always halts and usually gives correct answers). To study the degree structure of generic computability, they also defined "generic reducibility," a notion of reducibility in which oracles, like computations, are only required to halt on a set of density 1.
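
In the same notation (again a sketch, not necessarily the exact phrasing used in the talk):

\[
  A \ \text{is coarsely computable} \iff \exists\, f \ \text{total computable with}\ \rho(\{\,n : f(n) = A(n)\,\}) = 1.
\]
% Here the algorithm always halts but may err on a density-0 set ("imperfect information"),
% in contrast with a generic computation, which never errs but may fail to halt on a density-0 set.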

We define coarse reducibility, as well as several other reducibility notions, and we use these to illustrate the differences between incomplete and imperfect computations, and the ways in which replacing "density 1" with other notions of largeness affects the resulting reducibility. We explore these reducibilities, the relationships between them, and their links to hyperarithmeticity, reverse mathematics, and randomness.
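
One common formulation of coarse reducibility from this line of work (a sketch only; the talk may use uniform or otherwise refined variants): call a total function a coarse description of B if it agrees with B on a set of density 1. Then

\[
  A \le_{c} B \iff \text{every coarse description of } B \text{ computes some coarse description of } A.
\]
% This is the nonuniform version; a uniform variant requires a single Turing functional
% translating coarse descriptions of B into coarse descriptions of A.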
 
