Zipf–Mandelbrot law

Zipf–Mandelbrot
Parameters: $N \in \{1, 2, 3, \ldots\}$ (integer)
            $q \in [0; \infty)$ (real)
            $s > 0$ (real)
Support:    $k \in \{1, 2, \ldots, N\}$
pmf:        $\dfrac{1/(k+q)^s}{H_{N,q,s}}$
CDF:        $\dfrac{H_{k,q,s}}{H_{N,q,s}}$
Mean:       $\dfrac{H_{N,q,s-1}}{H_{N,q,s}} - q$
Mode:       $1$
Entropy:    $\dfrac{s}{H_{N,q,s}} \sum_{k=1}^{N} \dfrac{\ln(k+q)}{(k+q)^s} + \ln\!\left(H_{N,q,s}\right)$

In probability theory and statistics, the Zipf–Mandelbrot law is a discrete probability distribution. Also known as the Pareto–Zipf law, it is a power-law distribution on ranked data, named after the linguist George Kingsley Zipf who suggested a simpler distribution called Zipf's law, and the mathematician Benoit Mandelbrot, who subsequently generalized it.

The probability mass function is given by:

$$f(k; N, q, s) = \frac{1/(k+q)^s}{H_{N,q,s}}$$

where HN,q,s is given by:

$$H_{N,q,s} = \sum_{i=1}^{N} \frac{1}{(i+q)^s}$$

which may be thought of as a generalization of a harmonic number. In the formula, k is the rank of the data, and q and s are parameters of the distribution. In the limit as N approaches infinity, this becomes the Hurwitz zeta function ζ(s,q). For finite N and q=0 the Zipf–Mandelbrot law becomes Zipf's law. For infinite N and q=0 it becomes a Zeta distribution.
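For readers who want to experiment with the distribution numerically, the following is a minimal Python/NumPy sketch (not part of the original article; the function and variable names, and the example parameter values, are illustrative). It computes the pmf as defined above, checks the normalization, verifies the closed-form mean from the table, and draws rank samples:

import numpy as np

def H(N, q, s):
    # Generalized harmonic number H_{N,q,s} = sum_{i=1}^{N} 1/(i+q)^s
    i = np.arange(1, N + 1)
    return np.sum(1.0 / (i + q) ** s)

def zipf_mandelbrot_pmf(k, N, q, s):
    # f(k; N, q, s) = (1/(k+q)^s) / H_{N,q,s} for ranks k = 1, ..., N
    return (1.0 / (k + q) ** s) / H(N, q, s)

N, q, s = 1000, 2.7, 1.2        # illustrative parameter values, chosen arbitrarily
k = np.arange(1, N + 1)
p = zipf_mandelbrot_pmf(k, N, q, s)

assert np.isclose(p.sum(), 1.0)   # pmf sums to one over the support 1..N

# Closed-form mean from the table above: H_{N,q,s-1}/H_{N,q,s} - q
mean_direct = np.sum(k * p)
mean_closed = H(N, q, s - 1) / H(N, q, s) - q
assert np.isclose(mean_direct, mean_closed)

# Draw ranks from the distribution, e.g. to simulate word-rank data
rng = np.random.default_rng(seed=0)
samples = rng.choice(k, size=10_000, p=p)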

Applications

The distribution of words ranked by their frequency in a random text corpus is approximated by a power-law distribution, known as Zipf's law.

If one plots the frequency rank of the words in a moderately sized corpus of text against their number of occurrences (or actual frequencies), one obtains a power-law distribution with exponent close to one (but see Powers, 1998 and Gelbukh & Sidorov, 2001). Zipf's law implicitly assumes a fixed vocabulary size, but the harmonic series with s = 1 does not converge, whereas the Zipf–Mandelbrot generalization with s > 1 does. Furthermore, there is evidence that the closed class of functional words that define a language obeys a Zipf–Mandelbrot distribution with different parameters from the open classes of contentive words that vary by topic, field and register.[1]
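To make the convergence claim concrete, the following worked comparison (not in the original article, but following directly from the definitions above) contrasts the two cases for an unbounded vocabulary:

\[
\sum_{k=1}^{N} \frac{1}{k} = \ln N + \gamma + o(1) \xrightarrow[N \to \infty]{} \infty
\qquad \text{(harmonic series, } s = 1, \; q = 0\text{)},
\]
\[
\sum_{k=1}^{\infty} \frac{1}{(k+q)^{s}} < \infty \qquad \text{for any } s > 1, \; q \ge 0,
\]

where $\gamma$ is the Euler–Mascheroni constant. Only the case s > 1 therefore yields a normalizable distribution when the vocabulary is not bounded in advance.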

In ecological field studies, the relative abundance distribution (i.e. the graph of the number of species observed as a function of their abundance) is often found to conform to a Zipf–Mandelbrot law.[2]

Within music, many metrics of measuring "pleasing" music conform to Zipf–Mandelbrot distributions.[3]

Notes

  1. Powers, David M W (1998). "Applications and explanations of Zipf's law". Joint conference on new methods in language processing and computational natural language learning. Association for Computational Linguistics. pp. 151–160. 
  2. Mouillot, D; Lepretre, A (2000). "Introduction of relative abundance distribution (RAD) indices, estimated from the rank-frequency diagrams (RFD), to assess changes in community diversity". Environmental Monitoring and Assessment (Springer) 63 (2): 279–295. doi:10.1023/A:1006297211561. http://cat.inist.fr/?aModele=afficheN&cpsidt=1411186. Retrieved 24 Dec 2008. 
  3. Manaris, B; Vaughan, D; Wagner, CS; Romero, J; Davis, RB. "Evolutionary Music and the Zipf–Mandelbrot Law: Developing Fitness Functions for Pleasant Music". Proceedings of 1st European Workshop on Evolutionary Music and Art (EvoMUSART2003) 611. https://archive.today/wQYN. 

References

  • B.B. Wolman and E. Nagel, eds. (1965). "Information Theory and Psycholinguistics". Scientific psychology. Basic Books.  Reprinted as
    • R.C. Oldfield and J.C. Marshall, eds. (1968). "Information Theory and Psycholinguistics". Language. Penguin Books. 
  • Powers, David M W (1998). "Applications and explanations of Zipf's law". Joint conference on new methods in language processing and computational natural language learning. Association for Computational Linguistics. pp. 151–160. 
  • Zipf, George Kingsley (1932). Selected Studies of the Principle of Relative Frequency in Language. Cambridge, MA: Harvard University Press. 
  • Van Droogenbroeck, F.J. (2019). "An essential rephrasing of the Zipf–Mandelbrot law to solve authorship attribution applications by Gaussian statistics".