Search powered by Funnelback
21–30 of 84 search results for "katalk:za31 24 / / / / / |u:www.dpmms.cam.ac.uk", where 0 results match all words and 84 match some words.
  1. Results that match 1 of 2 words

  2. vt06final.dvi

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/vt.pdf
    5 Jun 2020: Γn(F) will be to obtain a bound on the deviation probability (24) with an appropriately chosen shadow function D. ... Probab., 24(2):916–931, 1996. [17] P.W. Glynn and D. Ormoneit. Hoeffding’s inequality for uniformly ergodic Markov chains.
  3. paper.dvi

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/icassp.pdf
    5 Jun 2020: [snippet contains only plot axis values from the PDF]
  4. 2000 Conference on Information Sciences and Systems, Princeton…

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/ciss00.pdf
    5 Jun 2020: References. [1] H. Akaike. Prediction and entropy. In A celebration of statistics, pages 1–24.
  5. 466 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 51, NO. ...

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/poisson3J.pdf
    5 Jun 2020: 24], [6], many large-deviations results [12], [16], [13], the martingale convergence theorem [5], [6], and the Hewitt–Savage 0–1 law [29]. See also the powerful comments in [18, pp. ... Statist. Plann. Inference, vol. 92, no. 1–2, pp. 7–12, 2001.
  6. PubTeX output 1999.09.27:1044

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/lossyJ.pdf
    5 Jun 2020: for all. (24) Then, by the duality relationship (15) and the fact that. ... Inform. Theory, vol. 43, pp. 1439–1451, Sept. 1997. [24] H. Morita and K.
  7. Fisher Information, Compound Poisson Approximation, and the Poisson…

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/cpa-isit07.pdf
    5 Jun 2020: with mean λ; see also [10], [24]. ... 5, no. 6, pp. 1021–1034, 1999. [24] F. Topsøe, “Maximum entropy versus minimum risk and applications to some classical discrete distributions,” IEEE Trans.
  8. Suppressing Covid-19: Public Health Policy and Effective Mass-Testing…

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/Covid_talk_slides.pdf
    11 Sep 2020: [slide plot: log10 scale, days since exposure] For 24 hours in the beginning the tests give different results.
  9. Approximating a Diffusion by a Finite-State Hidden Markov Model ...

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/HMM.pdf
    5 Jun 2020: The proof is similar to the proof of Proposition 6.1 of [24]. ... The final equation follows from (24) and the definition of Dκ in (17).
  10. 4228 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. ...

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/UBthin.pdf
    5 Jun 2020: [garbled display equations (23)–(24)] Observe that, since the Poisson-Charlier polynomials form an. ... [24] O. Johnson, “Log-concavity and the maximum entropy property of the Poisson distribution,” Stochastic Process.
  11. Entropy and the Law of Small Numbers I. Kontoyiannis∗ ...

    https://www.dpmms.cam.ac.uk/~ik355/PAPERS/poisson3.pdf
    5 Jun 2020: 3. the convergence of Markov chains [31][24][6], many large deviations results [12][16][13], the martingale convergence theorem [5][6], and the Hewitt-Savage 0-1 law [29]. ... [24] D.G. Kendall. Information theory and the limit-theorem for Markov chains
