Discrete Mathematics & Theoretical Computer Science
In this paper, we show that data streams can sometimes usefully be studied as random permutations. This simple observation allows a wealth of classical and recent results from combinatorics to be recycled, with minimal effort, as estimators for various statistics over data streams. We illustrate this by introducing RECORDINALITY, an algorithm which estimates the number of distinct elements in a stream by counting the number of $k$-records occurring in it. The algorithm has a number of interesting properties, such as providing a random sample of the set underlying the stream. To the best of our knowledge, a modified version of RECORDINALITY is the first cardinality estimation algorithm which, in the random-order model, uses neither sampling nor hashing.
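To make the idea behind RECORDINALITY concrete, the following is a minimal Python sketch of the hashed variant described above: it retains the $k$ largest hash values seen so far, counts how many elements ever enter that set (the $k$-records), and turns that count $R$ into a cardinality estimate of the form $k(1+1/k)^{R-k+1}-1$, the estimate usually quoted for RECORDINALITY. The function name, parameter defaults, and the use of Python's built-in `hash` with a random salt as a stand-in for a random hash function are illustrative assumptions, not the authors' reference implementation.

```python
import heapq
import random


def recordinality(stream, k=32, seed=0):
    """Sketch of RECORDINALITY: estimate the number of distinct elements
    in `stream` by counting k-records of (salted) hash values.

    Illustrative sketch only; names and the hash choice are assumptions.
    """
    rng = random.Random(seed)
    salt = rng.getrandbits(64)   # stand-in for choosing a random hash function

    top = []          # min-heap holding the k largest (hash, element) pairs seen
    members = set()   # elements currently stored in the top-k sample
    records = 0       # R: total number of k-records observed so far

    for x in stream:
        if x in members:
            continue                  # element already in the sample: not a new record
        h = hash((salt, x))           # salted hash, identical for repeated elements
        if len(top) < k:
            heapq.heappush(top, (h, x))
            members.add(x)
            records += 1              # every insertion into the top-k is a k-record
        elif h > top[0][0]:
            _, evicted = heapq.heapreplace(top, (h, x))
            members.discard(evicted)  # evicted elements can never re-enter: the
            members.add(x)            # heap minimum only increases over time
            records += 1

    if records < k:
        # Fewer than k distinct elements were seen; the count is exact.
        return float(records)
    # Estimate of the form k * (1 + 1/k)^(R - k + 1) - 1.
    return k * (1.0 + 1.0 / k) ** (records - k + 1) - 1.0


if __name__ == "__main__":
    words = "to be or not to be that is the question".split()
    # Prints a (noisy) estimate of the 8 distinct words in the toy stream.
    print(recordinality(words, k=4))
```

As a side effect, the elements kept in `members` at the end form a size-$k$ sample of the set underlying the stream, which is the sampling property mentioned in the abstract.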