Abstract
Randomness has long been described in many fashions. From a predictability standpoint, a sequence is random if no scheme (under various definitions of computability) can predict its next bit any better than a coin flip. From an observational standpoint, a sequence is random if it is not compressible, or in other words, if it has high Kolmogorov complexity. Under the prediction paradigm, we combine learning algorithms from Computational Learning Theory with Resource-Bounded Measure and Dimension to show that various interesting classes are learnable and thus have p-dimension 0. We extend some of these techniques to the set of Kolmogorov-random strings.
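For reference, the p-dimension notion used throughout is standardly formulated (following Lutz) in terms of s-gales; the notation below is the usual one from that literature rather than anything specific to this dissertation:

\[
d(w) = 2^{-s}\bigl(d(w0) + d(w1)\bigr) \quad \text{($s$-gale condition; $s = 1$ gives a martingale),}
\]
\[
\dim_p(X) = \inf\{\, s \ge 0 : \text{some polynomial-time computable $s$-gale succeeds on every } S \in X \,\},
\]

where an s-gale d succeeds on a sequence S if limsup_n d(S[0..n-1]) = ∞. In particular, dim_p(X) = 0 is the strongest conclusion of this form: for every s > 0, some feasible s-gale succeeds on all of X.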
It has long been known that if NP has a sparse complete set under various types of polynomial-time reductions, and in particular if NP ⊆ P_btt(P_ctt(SPARSE)), then P = NP. Under the assumption that NP does not have p-measure 0, a proof that P_btt(P_ctt(SPARSE)) does have measure 0 would imply that NP has no sparse complete sets under the composed bounded-truth-table-conjunctive reduction. We show the stronger result that dim_p(P_{(α log n)-tt}(P_ctt(DENSE^c))) = 0, where DENSE^c is the class of non-dense sets, by combining Maass and Turán's algorithm for learning halfspaces with Hitchcock's result that the class of all sequences 2^cn-time reducible to concepts learnable in 2^cn time with a subexponential (2^o(n)) mistake bound has p-dimension 0.
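The argument only needs an online learner with a suitable mistake bound for halfspaces. As a loose illustration of the online mistake-bound model (not Maass and Turán's actual algorithm, which achieves its polynomial mistake bound by different means), here is the classical perceptron learner in Python; all names here are ours:

import numpy as np

def perceptron_online(examples):
    """Online mistake-bound learning of a halfspace w·x > 0.

    `examples` is an iterable of (x, y) pairs with y in {-1, +1}.
    Returns the final weight vector and the number of mistakes made.
    For linearly separable data with margin gamma and radius R, the
    classical bound guarantees at most (R/gamma)^2 mistakes.
    """
    w = None
    mistakes = 0
    for x, y in examples:
        x = np.asarray(x, dtype=float)
        if w is None:
            w = np.zeros_like(x)
        # Predict with the current hypothesis; treat 0 as a mistake.
        prediction = 1 if w @ x > 0 else -1
        if prediction != y:
            mistakes += 1
            w = w + y * x  # standard perceptron update
    return w, mistakes

The point of a mistake bound in this setting is that each mistake forces measurable progress on the hypothesis, which is exactly the kind of quantitative guarantee that converts into a successful gale in the dimension arguments above.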
In a similar fashion, we show how the notion of reliable reductions of Arvind et al. yields a "one-sided" learning algorithm, i.e., an algorithm that learns a subset of the target concept. Using this, we can show that P_hd(P_ctt(DENSE^c)) has p-dimension 0, where hd denotes the Hausdorff reduction. We can then extend this to show that P_hd(P_ctt(RC)) has p-dimension 0, where RC is the set of random strings as defined by Kolmogorov complexity. Thus we can conclude that if μ_p(NP) ≠ 0, then NP does not have any non-dense complete sets under the composed bounded-truth-table-conjunctive reduction or the composed Hausdorff-conjunctive reduction. Furthermore, we can conclude that NP cannot be characterized by reductions to random strings using only the composed Hausdorff-conjunctive reduction.
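To illustrate what "learning a subset of the target concept" means, here is the textbook elimination algorithm for monotone conjunctions, whose hypothesis accepts only strings the target accepts (it never makes a false-positive error). This is an illustrative stand-in of our own, not the one-sided learner derived from reliable reductions in the thesis:

def learn_monotone_conjunction(n, examples):
    """One-sided online learner for a monotone conjunction over n variables.

    The hypothesis starts as the conjunction of all n variables, so the
    set it accepts only grows toward the target from below: at every step
    it accepts a subset of the target concept (no false positives),
    mirroring the "one-sided" learners discussed above.

    `examples` yields (x, label) with x a tuple of n bits, label in {0, 1}.
    """
    relevant = set(range(n))  # variables currently in the hypothesis
    for x, label in examples:
        predict = all(x[i] == 1 for i in relevant)
        if predict and label == 0:
            raise ValueError("target is not a monotone conjunction")
        if label == 1:
            # Positive example: drop variables it sets to 0. Variables of
            # the true target are never dropped, so the hypothesis accepts
            # a subset of the target concept throughout the run.
            relevant = {i for i in relevant if x[i] == 1}
    return relevant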
The best unconditional circuit lower bound known is that the class MA_EXP requires super-polynomial-size circuits, but a conditional lower bound for a much lower class was given by Fortnow and Klivans, who showed that if a class of circuits is learnable in the Exact Learning model of Angluin, then EXP^NP is not contained in that class. We combine Exact Learning (with membership queries) with the betting games of Buhrman et al. to improve this bound, showing, under a weaker hypothesis, that EXP cannot be contained in the class. However, it seems unlikely that we can improve our results to use Valiant's PAC learning model, which would have been advantageous given the result of Bshouty et al. that circuits are learnable in ZPP^NP. The reason is that randomized betting games yield practically no greater advantage than the randomized martingales of Moser and of Regan and Sivakumar.
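To make the betting-game mechanism concrete: a betting game is a capital-driven process that, unlike a martingale, may choose adaptively which bit of a language's characteristic sequence to bet on next. The sketch below is our own minimal abstraction in Python (the `predictor` argument stands in for a hypothesis produced by an exact learner with membership queries); it is not the construction of Buhrman et al.:

def run_betting_game(oracle, choose_next, predictor, rounds, capital=1.0):
    """Minimal sketch of a betting game.

    oracle(x)         -> actual membership bit of string x (0 or 1)
    choose_next(seen) -> next string to bet on, chosen adaptively from
                         the history of (string, bit) pairs seen so far
    predictor(x)      -> (guess, stake) with guess in {0, 1} and
                         0 <= stake <= 1, e.g. read off a learner's
                         current hypothesis

    Odds are fair: a correct guess doubles the staked amount, an
    incorrect one loses it, so expected capital is unchanged against a
    truly random oracle. Unbounded growth of `capital` therefore
    witnesses that the oracle's characteristic sequence is not random.
    """
    seen = []
    for _ in range(rounds):
        x = choose_next(seen)
        guess, stake = predictor(x)
        bet = stake * capital
        bit = oracle(x)
        capital += bet if guess == bit else -bet
        seen.append((x, bit))
    return capital

The adaptive choice of `x` is what lets the bettor exploit a membership-query learner: it can query exactly the strings on which the current hypothesis is informative, which is the advantage over an ordinary martingale.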