Randomness Through Computation: Some Answers, More Questions


Contents

- Indeterminism and Randomness Through Physics (K. Svozil)
- ... (Delahaye)
- The Road to Intrinsic Randomness (S. Wolfram)
- Part III: ... (Hutter; Schmidhuber)
- Part IV: Randomness, Information and Computability (Calude)
- Randomness, Computability and Information (J. Miller)
- Studying Randomness Through Computation (A. Nies)
- Computability, Algorithmic Randomness and Complexity (R. Downey)
- Is Randomness Native to Computer Science? Ten Years After (M. ...)
- Computational Complexity, Randomized Algorithms and Applications (Allender)
- Connecting Randomness to Computation (M. Li)
- ... (Staiger)
- Randomness in Algorithms (O. Watanabe)
- Part VI: Panel Discussion Transcriptions
- Is the Universe Random? (Calude, J. Casti, G. Chaitin, P. Davies, K. ..., Wolfram)
- What is Computation? How Does Nature Compute? (Calude, G. Chaitin, E. Fredkin, A. Leggett, R. ...)

An example from my own experience is the following. A fellow researcher once came upon a curious sequence in connection with his work on a certain sorting algorithm. This gives us a new sequence 1, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, ... The question was whether anything sensible could be said about S, e.g. ... As a consequence, we know that the sequence is not periodic, for example.

This is a deterministic rule he devised for certain cellular automata operating on a linear tape, and it appears to generate sequences with no discernible structure. In fact, it is used in Mathematica for generating random numbers, and appears to work quite satisfactorily (a sketch is given below). However, it may be that no one ...

With this technique (the probabilistic method), one can prove the existence of many remarkable mathematical objects by proving that the probability that they exist is positive. In fact, many of the sharpest results on the sizes of such objects are only obtained by using the probabilistic method.
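The cellular-automaton rule mentioned above matches the published description of Wolfram's rule 30, whose center column Mathematica has used as a source of random bits. A minimal Python sketch (the step count and circular tape width are arbitrary choices of mine, not part of the original text):

    # Rule 30 on a circular tape, started from a single 1.
    # The center column is the apparently structureless bit sequence.
    def rule30_center_column(steps):
        width = 2 * steps + 1          # wide enough that edges never reach the center
        cells = [0] * width
        center = width // 2
        cells[center] = 1
        bits = []
        for _ in range(steps):
            bits.append(cells[center])
            # rule 30: new cell = left XOR (center OR right)
            cells = [cells[i - 1] ^ (cells[i] | cells[(i + 1) % width])
                     for i in range(width)]
        return bits

    print(rule30_center_column(30))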

However, this method gives absolutely no clue as to how such objects might actually be constructed. It would certainly be wonderful if someone could make progress in this direction. It turns out to be relatively easy to give explicit constructions of such quasirandom graphs, which makes them quite useful in many situations in which an explicit construction of a random-like object is desired.

One mathematical area where this is especially apparent is an area of combinatorics called Ramsey theory. Basically: is randomness necessary? For example, it can be shown that for any choice of a positive number N there is a least number W(N) so that no matter how the numbers from 1 to W(N) are colored red or blue, at least one of the colors must contain an arithmetic progression of N equally spaced numbers. It is of great interest to estimate the size of W(N); a small experimental sketch follows below.

However, it provided an empirical basis that must have helped keep my feet solidly on the ground whenever I felt the pull of theory.
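As a quick illustration of the van der Waerden numbers W(N) defined above (the helper names below are mine), one can color 1..n red or blue at random and search for a monochromatic arithmetic progression. Since W(3) = 9 is a known value, any 2-coloring of 35 numbers must contain a monochromatic progression of length 3:

    import random

    def has_mono_ap(coloring, k):
        """True if some arithmetic progression of length k is single-colored."""
        n = len(coloring)
        for start in range(n):
            for step in range(1, (n - start - 1) // (k - 1) + 1):
                if len({coloring[start + i * step] for i in range(k)}) == 1:
                    return True
        return False

    coloring = [random.randint(0, 1) for _ in range(35)]
    print(has_mono_ap(coloring, 3))   # always True, since 35 >= W(3) = 9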

Aldo grew into a distinguished particle physicist; I partly morphed into a computer scientist. Fast forward a few years — I was now working on my thesis in computer sciences. This time it was a case of one doctoral student with two advisors, John Holland — he of genetic algorithms — and Arthur Burks — a collaborator of the late John von Neumann in the self-replicating automata project. For my formation this was my most productive period, especially because I had been given carte blanche to do whatever I wanted; so I spent practically the whole time furiously teaching myself so many things I needed to know.

At this stage of the game, two things happened that played a determining role in the evolution of my interests and my strategies. We never did! The whole thing was about measure theory — on its own a perfectly legitimate business, but with no new probability content whatsoever. "No prior knowledge of probability is assumed [italics mine], but browsing through an elementary book such as the one by Feller,5 with its diverse and vivid examples, gives an excellent feeling for the subject." Then I would explore another rule, and so on, systematically.

Thus, any trajectory must eventually enter a cycle. In a computer simulation, to know when the system has entered a cycle one ought to check whether the current state matches any of the previously encountered ones. This was of course rather expensive; so I made the computer memorize, along the way, a sparse set of breakpoint states, and compare each new state only with the breakpoints. If I found a match, I would know that I was on a cycle; but to know exactly when the cycle had been entered I had to recompute part of the trajectory, from a breakpoint forward or from the current state backwards.
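A rough sketch of that breakpoint trick (the function names and the fixed spacing policy are my assumptions, not the author's actual code):

    def find_cycle_point(step, start, spacing=1000):
        """Detect that a deterministic trajectory has entered a cycle by
        comparing each new state only against sparsely saved breakpoints.
        Returns (time the matched breakpoint was saved, current time)."""
        breakpoints = {}                  # saved state -> time it was saved
        state, t = start, 0
        while True:
            if state in breakpoints:
                # we are on the cycle; the exact entry point still requires
                # replaying part of the trajectory, as described above
                return breakpoints[state], t
            if t % spacing == 0:
                breakpoints[state] = t
            state = step(state)
            t += 1

    # toy dynamics on a finite state space, so a cycle is guaranteed
    print(find_cycle_point(lambda x: (x * x + 1) % 255, 2, spacing=5))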

Might it have more than one predecessor? And this opened a Pandora's box of challenges and at the same time a treasure chest of rewards. Our own physical world appears to follow, at the microscopic level, strictly invertible laws (this applies to both classical and quantum mechanics). Part of the ensuing story is told in a quite readable early review. Worse than that, even when feasible solutions are found on paper, their material implementation turns out to depend on large and complex infrastructure.

Thus, beyond a certain point (and actually rather soon), the amortization costs of the infrastructure will wipe out any savings in operating expenses. Whether and to what extent it is also good for you, only you can be the judge! "The abandonment of superstitious beliefs about the existence of Phlogiston, of Cosmic Ether, Absolute Space and Time, ... Probability too, if regarded as something endowed with some kind of objective existence [italics mine], is no less a misleading misconception, an illusory attempt to exteriorize or materialize our true probabilistic beliefs."

Enter Jaynes (see Refs. ...). Of course not! Why, then? This is real progress towards resolving the objective-vs-subjective dilemma. According to Jaynes, a probability distribution is created by me on the spot on the basis of the information I currently have; but there should be nothing subjective about the creation process itself. So far, our fraction has come out of the ratio of two integers — two counts.

Let me give you a hint. My reply — we are talking now about the interpretation of probability — is this. We envisage two possible universes, one with three piglets and one with four, and we take them both as two distinct cases. For each of these two cases all other counts remain the same, and so will appear two times: two farmers, two daughters, etc.

The number 3... For de Finetti and Jaynes, it captures aspects of a subjective state of knowledge, of information one has. Scenario 2 has the same day, the same ... To stay within the law, his trading decisions should be based on a probability distribution that deliberately ignores important information he may have, and thus violates maxent. What is true in a syllogism is not the conclusion, but a certain relationship with its premises. What a probability expresses is certain numerical aspects of a situation as described, i.e. ... All the same, as Jaynes stresses, the conceptual analysis of what we do when we introduce a probability distribution stops here; passage to the continuum or to more sophisticated measure-theoretic accounting need not — and, I add, should not — entail any conceptual or interpretational novelties.

This is what makes a probability theory possible — indeed, it is the essence of any such theory. Note that substantially the same approach to interpretation had already been entertained by Laplace, as early as ..., as a matter of common sense. Those numbers, i.e. ... Remark that this is precisely what evolution, which has successfully been in the inferring business for four billion years, gets away with, as a strategy to perform its wonders by!

Here is where it makes its entrance. What worth is a probability — a count or a number — that may come out of our theory, if this number changes all the time? The problem is that our descriptions typically refer to objects from the real world, and the world and the objects therein evolve according to a nontrivial dynamics — so that the descriptions themselves keep changing! Are there any properties of these evolving descriptions that nonetheless remain constant? In Figure 2..., we could model the individual collisions of some molecules of hydrogen and oxygen and explicitly count their number.

Alternatively, using schema 2..., by multiplying these two pressures together we get a good idea of the number of collisions between the two species. Weights move along step by step, together with the respective microstates, and are added together at every merger of trajectories. The occupation octuple is nothing but an ordinary probability distribution, if we normalize each entry by dividing it by the number of initial cases (3, here).
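A toy version of this bookkeeping (the 8-state successor map below is my own example, not the one in the chapter's figure): weights are pushed along a deterministic map and added wherever trajectories merge, and the normalized occupation tuple is an ordinary probability distribution whose entropy can only go down under merging.

    from math import log2

    def push_weights(weights, succ):
        """weights[i] = occupation of state i; succ[i] = successor of state i."""
        out = [0.0] * len(weights)
        for i, w in enumerate(weights):
            out[succ[i]] += w          # merging trajectories add their weights
        return out

    def entropy(weights):
        total = sum(weights)
        ps = [w / total for w in weights if w > 0]
        return -sum(p * log2(p) for p in ps)

    succ = [1, 2, 2, 4, 5, 6, 7, 0]    # states 1 and 2 merge: not invertible
    w = [1.0] * 8                      # uniform occupation over 8 microstates
    for t in range(4):
        print(t, [round(x, 2) for x in w], round(entropy(w), 3))
        w = push_weights(w, succ)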


For a deterministic system, microscopic entropy is in general monotonically decreasing. A special case is the dynamics of Figure 2...: here trajectories never merge; occupation numbers hop from one state to its successor together with the state itself but never merge — they are merely permuted by the dynamics. With this coordinatization, the occupation tuples used above become genuine indicator sets, with only 0 and 1 values, whether the system is invertible or not. I will just mention, without discussion, ...

As simple as that. Among the latter was Jaynes, whose lifelong detour into probability theory had started as just a minor side step for reconnoitering the quantum-mechanics fortress before attempting to breach it. "I had intended originally to specialize in Quantum Electrodynamics, but this proved to be impossible [because] whenever I look at any quantum-mechanical calculations, the basic craziness of what we are doing rises in my gorge [!]. Gradually, I came to see that the foundation of probability theory and the role of human information have to be brought in, and I have spent many years trying to understand them in the greatest generality."

By now we all know about Darwinian evolution, mindless bureaucracies, and the loose cannon of the free market. If we now turn gravity on, the energy budget will come to include potential energy. That state is preferred not because it is most orderly — though it so appears if one only sees the chain and not the gas molecules — but because indeed it is most random. That is the typical shape of a maxent argument — to just bring to the surface an underlying combinatorial tautology. We are comparing apples with oranges, and there is no hope that turning an orange into an apple will be as simple a job as painting it red.

Jaynes himself must have been pursuing this rainbow, spending an entire lifetime developing more mature and sophisticated tools, and yet ever seeing it recede towards the horizon. The conservation principles of ... The Lagrange-Euler-Hamilton principle is one of stationary, not least, action. I intend to pick up that trail again with better preparation and equipment. ... the science of thermodynamics — what Mach called, in fact, Pure Thermodynamics — mostly expressed in terms of partial derivatives like those that still decorate thermodynamics textbooks.

It was easy for them to believe that those equations, which look so commanding and coherent, directly captured the essential reality of physics. Those lofty equations could be derived, as convenient large-number approximations, by applying purely logical and combinatorial arguments — not new principles of physics — to ordinary particulate mechanics: the phenomena they described were but accounting epiphenomena.

The way I have depicted the conceptualization and the use of probability, it is clear that most of the actual work goes into generating a variety of working hypotheses — even while accepting that most of them will be found wanting. Since, in this game, we can submit as many tickets as we wish, fantasy is more important than technique. Here is an edifying story. Much more recently, taking into account the quantum-mechanical electrical structure of orbitals, Lennard-Jones established, for simple nonpolar molecules, an inverse seventh-power attractive law.

The moral of this story is that, if one is looking for a microscopic model for the laws of ideal gases, there is no dearth of plausible candidates. On the contrary, the same generic macroscopic behavior embodied by these laws will emerge from almost any microscopic model — using attractive or repulsive forces, power-law or exponential or whatever — that displays certain basic symmetries. Far from there being no plausible candidates, there were too many to choose from! What are the prospects for progress? Edward Fredkin and Norman Margolus have thrown into the ring the suggestion that, in a combinatorial model, it may be the conserved quantities — say, a quantity specifying a number of indestructible tokens — that generate symmetries, rather than the other way around.

If we get interesting combinatorial models that way, we may get inspiration for novel models of quantum mechanics. When will enough be enough? Operations research swears by the branch-and-bound approach, and it appears that humans, and even animals, intuitively follow a similar strategy. Say, you want to know whether it is the case that A. In the long run, you may decide to invest part of your budget into trying to prove that A is undecidable. In other words, you hedge your bets and invent new bet options. In 1994, Peter Shor invented an algorithm that, on a quantum computer, would factor integers in polynomial time — while the best current factoring algorithms for a classical machine take exponential time.

This goal has so far proven elusive. Some possible reasons are given for the paucity of quantum algorithms so far discovered. Here, the issue of interpretation — of what irreducible mechanisms might lie at the base of quantum mechanics — comes to the forefront. After all, evolution does perform design miracles even with no brains, but the conceptual simplicity of its design loop is bought by an exponential amount of waste, pain, and senseless cruelty.

We are caught in a Faustian dilemma. The more we wish for quantumness to support miraculously fast computation (and who can rule that out yet?), the more ...

References
Breiman, Probability. Addison-Wesley.
... Milano ...
Lev B. Levitin and T. ...
Rombauer and M. ...
Ruelle, Chance and Chaos. Princeton.
... Scientist 92, ...
... J. Unconventional Computing 1, 3-...
Mentrasti and S. ...
Capobianco and P. ... Z 5, 52-...

Statistical Testing of Randomness: New and Old Procedures (A. Rukhin)

The evaluation of the random nature of outputs produced by various generators has become vital for the communications and banking industries, where digital signatures and key management are crucial for information processing and computer security. A number of classical empirical tests of randomness are reviewed in Knuth. The popular battery of tests for randomness, Diehard,16 demands fairly long strings (... bits). The goal was to develop a novel battery of stringent procedures. The resulting suite22 was successfully applied to pseudo-random binary sequences produced by current generators.

This collection of tests was not designed to identify the best possible generator, but rather to provide a user with a characteristic that allows one to make an informed decision about the source. The key selection criteria for inclusion in the suite were ... While an attempt was made to employ only procedures which are optimal from the point of view of statistical theory, this concern was secondary to practical considerations.

In the next sections we review some of the tests designed for this purpose. Most of them are based on known results of probability theory and information theory; a few of these procedures are new. These ciphers are widely used in cryptographic applications. One of the requirements was that the output sequence should look like a random string even when the input is not random.

To measure their performance, a numerical characteristic of the degree of randomness was required. Although larger P-values do not imply validity of the hypothesis at hand, when a string is being tested by a number of tests they are necessary to continue the study of multifaceted aspects of non-randomness. A very important property of P-values is that they are uniformly distributed over the unit interval when the tested null hypothesis is correct.

Thus, about 10 P-values should be expected in the interval (0, 0.0...). The P-values obtained were tested for uniformity. Thus, each algorithm under a particular test of randomness generated three hundred decisions with regard to agreement of the output with the randomness hypothesis. If none of the P-values fell below 0.0..., ... This procedure has been criticized19 from several perspectives.
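Before turning to those criticisms, here is a hedged sketch of the second-level uniformity check itself; the ten equal bins and the chi-square comparison are my assumptions about one concrete recipe, not a quotation of the suite:

    import random
    from scipy.stats import chisquare

    def uniformity_check(p_values, bins=10):
        """Bin per-sequence P-values into equal cells and chi-square test
        the counts against the uniform expectation."""
        counts = [0] * bins
        for p in p_values:
            counts[min(int(p * bins), bins - 1)] += 1
        stat, p_uniform = chisquare(counts)    # equal expected counts per bin
        return p_uniform

    # 300 P-values from a (simulated) correct null hypothesis
    print(uniformity_check([random.random() for _ in range(300)]))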

According to principles of statistical inference it is preferable to have one summary statistic on the basis of a long sequence rather than a number of such statistics obtained from shorter subsequences. But testing of encryption algorithms is not a purely statistical exercise.

The validation of uniform P-values does not enter into the calculation of the power of a particular test, yet it can be seriously recommended from a practical point of view. The whole problem of testing randomness is not as unambiguous as the parametric hypothesis-testing problems of classical mathematical statistics. Murphy19 compares this process to interleaving or decimating an underlying message, so that either some signal is removed or some noise is added. Exploration and validation of various transmission regimes may be a more apt comparison. Besides, from the cryptographic point of view, the entropy of the concatenated text is larger than that of the original sequence.

The concept of randomization in statistical design theory presents a similar idea. Randomized designs do not lead to better performance of statistical procedures if the postulated model is correct. However, they provide a commonly recommended safeguard against violations of the model assumptions.

Equally misguided seem to be arguments in favor of tests which are invariant to data transformations. In the statistical-inference context this principle is violated by virtually all proper Bayes procedures. In the AES testing context, symmetrization over all data permutations is impractical if not impossible. This data category was abandoned at later stages. The concept of admissibility of tests (cf. ...) ... Indeed, practical considerations led to inclusion in the suite not only of the frequency (monobit, or balance) test, but also of the frequency test within a block.



Indeed, this is certainly the case in the previous example of block ciphers, and more generally for all pseudorandom number generators which are based on recursive formulas. In view of this fact, one may expect only a measure of randomness to be attested to by a given string. This set indexes all possible probability distributions of the observed data. Think of the lifetime distribution of a device, or of the distribution of defective items in a lot.

If the distributions of two test statistics under the randomness hypothesis coincide, we consider them to be equivalent, as the P-values are the same for both of these statistics. Let n be the length of the string under testing. If a one-sided alternative corresponds to distributions of T which are stochastically larger than the distribution of T under the null hypothesis, then the P-value is 1 - G(t), where G denotes the distribution function of T under the randomness hypothesis and t the observed value of the statistic. Its large values are indicative of the fact that the null hypothesis is false, i.e. ...

As discussed later, the P-value can be obtained from the incomplete gamma function, and its small values lead one to believe in the falsity of the null hypothesis. This type of statistic is common in the suite.
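For the common chi-square-type statistics, the incomplete-gamma computation the text refers to looks like this (a sketch; the example statistic and degrees of freedom are arbitrary values of mine):

    from scipy.special import gammaincc

    def chi2_p_value(t_obs, df):
        """P-value of a statistic that is approximately chi-square with df
        degrees of freedom under the randomness hypothesis: the regularized
        upper incomplete gamma function Q(df/2, t/2)."""
        return gammaincc(df / 2.0, t_obs / 2.0)

    print(chi2_p_value(7.5, 4))   # e.g. observed T = 7.5 on 4 degrees of freedom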


For some tests the alternative to our randomness hypothesis may not necessarily be restricted to distributions of T which are stochastically larger or smaller than the distribution of T evaluated under this hypothesis (see Section 3...). Under the randomness hypothesis, these P-values have an approximately uniform distribution on the interval (0, 1) (exactly uniform in the continuous case). This can be tested, for example, by the classical Kolmogorov-Smirnov test. The reported P-value can be written as the incomplete gamma function.

By replacing Gn by this distribution one obtains approximate P-values. However, not all tests based on the probabilistic properties of the random walk are equally suitable for randomness testing. For example, the limiting distribution of the proportion of time Un for which the sums Sk are non-negative leads to a fairly weak test.

Discrete Fourier Transform (Spectral) Test. The spectral test which appeared in the suite turned out to be troublesome.

As it happened, it was not properly investigated, which resulted in a wrong statistic and a faulty constant. This fact was duly noticed by the cryptographic community. These facts led to the following procedure, whose P-value is obtained from the incomplete gamma function Q(m, .), i.e., from a chi-square distribution with 2m degrees of freedom. Because of this patterning, it is natural to investigate statistical tests based on the occurrences of words (patterns or templates).

We start here with the tests which utilize the observed numbers of words, or the frequency of a given word, in a sequence of length M. Such words cannot be written as ... Each of these facts can be used for randomness testing. A test of randomness can be based on the number of possibly overlapping occurrences of templates in the string; a table due to Hamano is given in Table 3... Pattern counting is also important in molecular biology, in DNA analysis, and for gene recognition.
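The raw ingredient of such template tests is easy to state in code: a count of possibly overlapping occurrences of a fixed word (this helper is illustrative, not the suite's implementation):

    def overlapping_count(bits, template):
        """Number of (possibly overlapping) occurrences of template in bits."""
        m = len(template)
        return sum(bits[i:i + m] == template
                   for i in range(len(bits) - m + 1))

    print(overlapping_count("1101101101", "101"))   # -> 3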

This extension opens the possibility of choosing q in an optimal way. Thus, the distribution of the number of words with given r can be expected to be approximately equal to that of the sum of Bernoulli random variables whose success probability is this Poisson probability. We consider the case of arbitrary m in the next section. These formulas lead to very accurate answers for the expected value and the variance (see Refs. ...).

Testing Randomness via Words with a Given Frequency. More powerful tests can be derived by using the observed numbers of words which appear in a random text a prescribed number of times, i.e. ...

In practice these statistics are easier to evaluate than the empirical distribution of occurrences of all m-words. This test is asymptotically optimal not only within the class of linear statistics, but in the class of all functions of X_0, X_1, ... The numbers X_r of m-letter patterns (the original non-overlapping consecutive substrings of length 2m) which occurred r times, with the weights from the table, lead to the asymptotically optimal test.

A powerful heuristic idea is that random sequences are those that cannot be compressed, or those that are most complex. However, its practical implementation is limited by the scarcity of relevant compression-code-based statistics whose approximate distributions can be evaluated. Such a register (a linear feedback shift register) consists of L delay elements, each having one input and one output. Here c_1, ...

Both of these distributions can be conjoined in a discrete distribution obtained via a mixture of two geometric random variables (one of them taking only negative values). The sequence Tn converges in distribution to a random variable T whose distribution is skewed to the right. In view of the discrete nature of this distribution, one can use the strategy described in Section 3.
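The linear complexity L of a bit string, on which such a statistic Tn is built, is standardly computed with the Berlekamp-Massey algorithm; a compact GF(2) sketch (my implementation, not the suite's):

    def berlekamp_massey(bits):
        """Length L of the shortest LFSR generating the 0/1 sequence bits."""
        n = len(bits)
        if n == 0:
            return 0
        c = [0] * n                    # current connection polynomial
        b = [0] * n                    # previous connection polynomial
        c[0] = b[0] = 1
        L, m = 0, -1
        for i in range(n):
            # discrepancy between the sequence and the LFSR's prediction
            d = bits[i]
            for j in range(1, L + 1):
                d ^= c[j] & bits[i - j]
            if d:
                t = c[:]
                for j in range(n - i + m):
                    c[i - m + j] ^= b[j]
                if 2 * L <= i:
                    L, m, b = i + 1 - L, i, t
        return L

    print(berlekamp_massey([1, 1, 1, 1]))   # -> 1 (the all-ones recurrence)
    print(berlekamp_massey([0, 0, 1]))      # -> 3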

Tests Based on Data Compression. The original suite attempted to develop a randomness test based on the Lempel-Ziv algorithm30 of data compression via parsing of the text. Let Wn represent the number of words in the parsing of a binary random sequence of length n according to this algorithm. Unfortunately, this test failed because the normal approximation was too poor, i.e. ... Another test in this family (Maurer's "universal" test) looks back through the entire sequence while inspecting the test segment of L-bit blocks, checking for the nearest previous exact match and recording the distance, in number of blocks, to that previous match.
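A hedged sketch of that look-back statistic (the block length L and initialization length Q below are generic placeholders of mine, not the calibrated parameters of the actual test):

    from math import log2

    def universal_statistic(bits, L=4, Q=64):
        """Average log2 distance (in blocks) to the previous occurrence of
        each L-bit block, after an initialization segment of Q blocks."""
        blocks = [tuple(bits[i:i + L]) for i in range(0, len(bits) - L + 1, L)]
        last_seen = {}
        for t, blk in enumerate(blocks[:Q]):        # initialization segment
            last_seen[blk] = t
        total, k = 0.0, 0
        for t, blk in enumerate(blocks[Q:], start=Q):
            if blk in last_seen:                    # skip never-seen blocks
                total += log2(t - last_seen[blk])
                k += 1
            last_seen[blk] = t
        return total / k if k else float("nan")

    import random
    print(universal_statistic([random.randint(0, 1) for _ in range(4000)]))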

In view of this fact, it may be advisable to test the randomness hypothesis by verifying normality of the observed values Fn, assuming that the variance is unknown. This can be done via a classical statistical technique, namely the t-test. Unfortunately, this complexity characteristic is not computable,4 and there is no hope for a test which is directly based on it. This means that a message of any length n can be both compressed and decoded.

This entropy equals log q if the randomness hypothesis is true, and it is smaller than log q under any alternative that can be modeled by an ergodic stationary process. For universal codes the power of the corresponding test tends to one as n increases.
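A small sketch of the entropy comparison just described (the alphabet size and sample are arbitrary choices): estimate the empirical entropy of a q-ary sequence and compare it with log q, its value under the randomness hypothesis.

    import random
    from collections import Counter
    from math import log2

    def empirical_entropy(symbols):
        """Plug-in estimate of the per-symbol entropy of a sequence."""
        counts = Counter(symbols)
        n = len(symbols)
        return -sum(c / n * log2(c / n) for c in counts.values())

    seq = [random.randrange(4) for _ in range(10000)]   # q = 4
    print(empirical_entropy(seq), "vs log q =", log2(4))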

Since this area is so important, one can expect more stringent methods based on new ideas. In particular, a study of overlapping spatial patterns is of great interest, as it may lead to such procedures.

References
Aldous, D. ... Probability Theory and Related Fields 79, ...
Barbour, A. Poisson Approximation. Clarendon Press, Oxford.
Coron, J.-S. ...
Cover, T. Elements of Information Theory. J. Wiley, New York, NY.
Gibbons, J. P-values: interpretations and methodology. American Statistician 29, 20-...
Guibas, L. String overlaps, pattern matching and nontransitive games. J. Combin. Theory A 30, ...
Gustafson, H. A computer package for measuring the strength of encryption algorithms. Computers and Security 13, ...
Hamano, K. The distribution of the spectrum for the discrete Fourier transform test included in SP800-22. IEICE Trans. Fundamentals E88, ...; Fundamentals E92, ...
Killmann, W. T-Systems Integration, Technical Report.
Kim, S. ...
Kirschenhofer, P. ...
Knuth, D. The Art of Computer Programming, Vol. 2. Addison-Wesley Inc.
Kolchin, V. Random Allocations. Winston & Sons, Washington, DC.
Marsaglia, G. Diehard: A battery of tests for randomness. Monkey tests for random number generators.
Maurer, U. A universal statistical test for random bit generators. Journal of Cryptology 5, ...
Murphy, S. ...
Rueppel, R. Analysis and Design of Stream Ciphers. Springer, Berlin.
Rukhin, A. Admissibility: survey of a concept in progress. International Statistical Review 63, 95-...
Rukhin, A. Testing randomness: a suite of statistical procedures. Theory of Probability and its Applications 45, ...
Rukhin, A. Pattern correlation matrices and their properties. Linear Algebra and its Applications, ...
Rukhin, A. Distribution of the number of words with a prescribed frequency and tests of randomness. Advances in Applied Probability 34, ...
Rukhin, A. Pattern correlation matrices for Markov sequences and tests of randomness. Theory of Probability and its Applications 51, ...
Rukhin, A. A statistical test suite for the validation of cryptographic random number generators.
Ryabko, B. Using information theory approach to randomness testing. Journal of Statistical Planning and Inference, 95-...
Soto, J. ...
Szpankowski, W. Average Case Analysis of Algorithms on Sequences.


Wiley-Interscience, New York.
Ziv, J. A universal algorithm for sequential data compression.

Many experimental data sets turn out to follow an approximate version of it [Benford's law], and so do many mathematical series and continuous random variables. This phenomenon has received some interest, and several explanations have been put forward.

Some authors hinted, implicitly, that the two most important characteristics of a random variable when it comes to Benford's law are regularity and scatter. The proofs need only simple mathematical tools, making the analysis easy. Previous explanations thus become corollaries of a more general and simpler one. Lolbert17 recently proved that no r.v. ... For instance, Scott and Fasli24 reported that only ... In his seminal paper, Benford1 tested 20 data sets including lake areas, lengths of rivers, populations, etc. The same is true of mathematical sequences or continuous r.v. Some authors focus on particular random variables,7 sequences,14 real data,3 or orbits of dynamical systems.
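A quick empirical illustration (the sequence 2^n is my choice of a standard Benford example, not one of the paper's data sets): compare the observed leading-digit frequencies with Benford's probabilities log10(1 + 1/d).

    from math import log10

    N = 1000
    counts = [0] * 9
    for n in range(1, N + 1):
        d = int(str(2 ** n)[0])        # leading decimal digit of 2^n
        counts[d - 1] += 1
    for d in range(1, 10):
        print(d, counts[d - 1] / N, round(log10(1 + 1 / d), 4))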

The sequence 0... However, a non-expert reader would hardly notice the smooth-and-scattered implications of these developments, and no theorem is given that would formalise this idea. This corroborates a widespread intuition. The proof of this theorem is straightforward and requires only basic mathematical tools.



Furthermore, as we shall see, several of the existing explanations can be understood as corollaries of ours. Scatter and regularity do not presuppose any log-related properties such as log-normality, scale-invariance, or multiplicative properties. We say that a r.v. X is u-Benford for a function u if u(X) is uniform mod 1. First, we hypothesize that a continuous r.v. X with density f is almost uniform mod 1 as soon as it is scattered and regular.

[Figure 4: illustration of the idea that the slices of a regular p.d.f. of X are stacked to form the p.d.f. of X mod 1, approaching uniformity.]

The slopes partly compensate, so that the resulting p.d.f. ... If the initial p.d.f. ... These two ideas are formalized and proved in Theorem 1. Theorem 4... Its maximum is sup Id ... Lognormal distributions have been related to Benford's law. This may be modelled by a r.v. ..., the Yi being a sequence of random variables. Indeed, our basic idea is that X being scattered and regular enough implies that log X is scattered and regular as well, so that log X should be almost uniform mod 1.

The same should be true of any u(X), u being a function preserving scatter and regularity. First, let us set out a generalized version of Theorem 1, the proof of which is closely similar to that of Theorem 1.

Sequences. Although our two theorems only apply to continuous r.v., ... In this section, we experimentally test u-Benfordness for a few sequences vn and four functions u. We will use six mathematical sequences. The result of the experiment is given in Table 4, where each cell displays the Kolmogorov-Smirnov z and the corresponding p-value.

Only the last three are log-Benford.
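The kind of check behind Table 4 can be sketched as follows (u = log10 and vn = 2^n are my choices of function and sequence): take the fractional parts of u(vn) and test them against the uniform distribution on [0, 1) with a Kolmogorov-Smirnov test.

    from math import log10
    from scipy.stats import kstest

    # fractional parts of log10(2^n) = n*log10(2) mod 1, for n = 1..1000
    frac = [(n * log10(2)) % 1 for n in range(1, 1001)]
    print(kstest(frac, "uniform"))   # no significant deviation: 2^n is log-Benford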

Putting aside the case of n^n, what Table 4 ... Of course, this rule of thumb is not to be taken as a theorem. The last column shows that our previously conjectured rule has exceptions: divergence speed is not an absolute criterion by itself. However, allowing for exceptions, it is still a good rule of thumb.

Continuous Random Variables. Our theorems apply to continuous r.v. We now focus on three examples of such r.v. The p.d.f. ...

The c.d.f. ... Let X be such a r.v. ... The proof is complete. It appears, as shown in Table 4... The last line shows the results and p-values of the Kolmogorov-Smirnov tests applied to a sample. Indeed, many explanations have been proposed for this approximate law to hold so often. These explanations involve complex characteristics, sometimes directly related to logarithms, sometimes through multiplicative properties.

Our idea, formalized in Theorem 1, is simpler and more general. The fact that real data are often regular and scattered is intuitive. Other explanations, of course, are acceptable as well. But it may be argued that some of the most popular explanations are in fact corollaries of our theorem. As we have seen when studying the Pareto type II density, mixtures of distributions may lead to a regular and scattered density, to which Theorem 1 applies. Thus, we may argue that a mixture of densities is nearly Benford because it is necessarily scattered and regular.

Apart from the fact that our explanation is simpler and arguably more general, a good argument in its favor is that Benfordness may be generalized, unlike log-related explanations. Scale invariance and multiplicative properties are log-related. But as we have seen, Benfordness does not depend on log and can easily be generalized. Actually, it seems that the square root is a better candidate than log.

The historical importance of log-Benfordness is of course due to its implications in terms of leading digits, which have no equivalent for the square root.

References
Benford, F. The law of anomalous numbers. Proceedings of the American Philosophical Society, 78, 1938.
Berger, A. ...
Burke, J. ...
Cho, W. ... The American Statistician, 61, ...
Diaconis, P. The distribution of leading digits and uniform distribution mod 1. The Annals of Probability, 5(1), 1977.
Drake, P. ... Journal of Accounting Education, 18, ...
Engel, H. ... Statistics and Probability Letters, 63, ...
Fewster, R. ... The American Statistician, 63(1), ...
Janvresse, E. ... Volume 41, Number 4, ...
Hales, D.


Testing the accuracy of employee-reported data: An inexpensive alternative approach to traditional methods. European Journal of Operational Research, ...
Hill, T. ...
Hill, T. ...
Jolissaint, P. ...


Kontorovich, A. ...
Kossovsky, A. Towards a better understanding of the leading digits phenomena.
Lolbert, T. ... Mathematical Social Sciences 55, ...
Mardia, K. Directional Statistics. Chichester: Wiley.
Newcomb, S. ... American Journal of Mathematics, 4, 1881.
Paolella, M. Fundamental Probability: A Computational Approach.

Chichester: John Wiley and Sons.
Pietronero, L. Explaining the uneven distribution of numbers in nature: The laws of Benford and Zipf. Physica A, ...
Pinkham, R.

... Statistics 32, ...
Raimi, R. ... The American Mathematical Monthly, 83, ...
Scott, P. ...
Sehity, T. ... International Journal of Research in Marketing, 22, ...
Smith, S. ...

Randomness got a new status with the birth of quantum mechanics: access to information on a given system passes through a nondeterministic process, measurement. In computer science, randomness is at the core of algorithmic information theory, while nondeterministic algorithms and networks present crucial random aspects. Finally, extensive use of randomness is also made in biology. Let us analyse in more detail the kind of randomness that emerges in the various disciplines.

Physical Randomness: Quantum. Randomness in quantum mechanics has a special status, as it is of intrinsic origin. Therefore easy checking of the validity of a result must be part of the handled problem. In this perspective, quantum randomness cannot be viewed as a form of hidden or incomplete determination. Some recent work went beyond this limit.

Computer Science Randomness. Probabilistic and nondeterministic models of computation have been extensively investigated in the computer science literature, as well as their combination. Still, there is no agreement about the precise nature of nondeterminism and its relation with probability.

One community which is particularly sensitive to the problem is that of Computer Security. Although it was discovered only recently, the issue has rapidly become known and recognized as crucial, to the point that the organizers of a recent edition of the main forum in Computer Security, the IEEE FCS, set up a panel to discuss nondeterminism.

In sequential computation, the term nondeterminism refers to models in which the transition relation goes from one state to a set of states, like nondeterministic Turing machines. A characteristic of this kind of model is that its intended meaning is a may-semantics, in the sense that the computation is considered successful if at least one of the alternative branches is successful. We argue that this has nothing to do with randomness: re-execution of the system always gives the same result(s), and a deterministic implementation is always possible.
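A tiny illustration of that last point (the automaton below is an arbitrary example of mine): the may-semantics of a nondeterministic automaton can be implemented deterministically by tracking the set of reachable states, as in the classical subset construction.

    def nfa_accepts(delta, start, accepting, word):
        """delta maps (state, symbol) -> set of successor states."""
        current = {start}
        for sym in word:
            current = set().union(*(delta.get((q, sym), set()) for q in current))
        return bool(current & accepting)   # success if SOME branch succeeds

    # accepts binary strings containing "11"
    delta = {("a", "0"): {"a"}, ("a", "1"): {"a", "b"}, ("b", "1"): {"c"},
             ("c", "0"): {"c"}, ("c", "1"): {"c"}}
    print(nfa_accepts(delta, "a", {"c"}, "0110"))   # True
    print(nfa_accepts(delta, "a", {"c"}, "0101"))   # False

Re-running this computation always yields the same answer, which is exactly why this flavor of nondeterminism is not a source of randomness.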

In general we want to abstract from these, and we use the notion of scheduler to represent how the choices are resolved. The classical methods transfer successfully to molecular analysis in biology, where only physical processes are observed, even though they are meant to happen within cells. In systems biology, however, phase or reference spaces (that is, the spaces of possible evolutions) are far from being predetermined. Typically, the proper biological observables of Darwinian evolution, namely phenotypes and species,26,33 are not pre-given: there is no way to give them in advance, within a space of all possible evolutions, in a sound theory.

And, of course, there is no way to pre-give the possible molecular interactions (internal and systemic) as well as the feedbacks from the forthcoming ecosystems onto molecular cascades. It appears everywhere.

Physical Randomness: Classical. In classical dynamics, it is possible to give a notion of individual random element. Such points are called typical for T.

Physical Randomness: Quantum. The appearance of non-determinism in quantum mechanics was a shock. It took a lot of time to accept this fact, although it now appears totally natural to us that a description of the world could be fully statistical.

In the meantime, recent works (see Ref. ...) ... The role of algorithmic randomness in dynamical systems, especially in ergodic ones, has already been the subject of previous research (M. Hoyrup and C. Rojas; see Refs. ...). In other words, it is pseudorandom in the strongest way, and it is impossible to see in it any regularity whatsoever. A closer analysis of this issue is part of our project, in view of our experience on the relation between the physical/dynamical and the algorithmic notions of randomness.

Concurrency, on the contrary, seems to give rise to a true notion of randomness, due to the unpredictable and unbacktrackable nature of interaction between independent and asynchronous agents. We mention in particular the works by Nestmann and Pierce,41 and by Palamidessi. Another line of investigation has been pursued by Wegner and his collaborators. However, that approach is based on the conventional interpretation of nondeterminism in automata theory.

An interesting result has been found recently by Busi and her colleagues: they investigate a process calculus (CCS with replication) and show that in this calculus the existence of a divergent path is decidable. Still, in a subsequent paper14 they show that this formalism can encode Turing machines; ... otherwise, decidability of the existence of divergence would imply decidability of termination.

In a more practical fashion, nondeterminism has been used in concurrency theory as a convenient abstraction from run-time information. Essentially, a concurrent program is nondeterministic because when we write it we do not know yet what will determine at run-time the choice of a particular path, so all the possibilities must be considered. A common misconception of nondeterminism is to consider it a probabilistic mechanism with uniform distribution.

The confusion probably originates from epistemic considerations: the total lack of knowledge about which of the available alternatives will be chosen, in a nondeterministic process, evokes the concept of maximum entropy. But maximum entropy represents the maximum degree of uncertainty within the probabilistic setting. Nondeterminism is outside the realm of probability, and it represents an even higher degree of uncertainty.

In any case, confusing nondeterminism with uniform probability has induced wrong approaches. We argue that this is due to several aspects: not only can a nondeterministic property not express the quantitative aspects of a probabilistic one, but this transformation also requires angelic nondeterminism (nondeterminism working in favour of the property), which is a strong assumption, usually not guaranteed in a concurrent setting where nondeterminism is typically demonic.

By their linear causality, these assumptions do not yet seem to have integrated the views on the interplay of interactions in twentieth-century physics, that is, the richness of the approaches to physical determination and randomness. As a matter of fact, the analysis of randomness is part of the proposal for a structure of determination of physical processes, in particular in classical dynamics, where randomness is deterministic unpredictability.

By this, our approach, by its relation to physically meaningful dynamical systems complements the relevant existing work on algorithmic randomness. Further applications or correlations between the recent ideas in algorithmic randomness, like the ones developed by the authors above, and our understanding of classical and quantum dynamics is one of the paths to be explored. We will hint below on how a relation to our approach can be developed. Singapore: World Scientific.
