Parallel Randomness


My latest paper, Evaluation of Splittable Pseudo-Random Generators, appeared online last week in the Journal of Functional Programming.

What is the big deal about randomness? Randomness is key to several common applications of computers, including games and secure communication, to name but two of the most obvious ones. Games, such as lotteries and poker, are obviously supposed to be random. Imperfect randomness amounts to loading the dice. Secure communications may be less obvious to the uninitiated, but the cryptography used to keep secrets depends on random keys for its security. Loaded dice give the attacker or intruder an edge, and we obviously want our banking and credit card transactions to be secure.

Programming randomness in computers was recognised as an important challenge even in the early days of computing (around 1950), and it has been studied ever since. One would think the topic would be exhausted by now. In fact, I believed enough had been written on computer randomness when I started this work in 2013.

Randomness is indeed well understood for sequential computer programs, i.e. programs which use only one of the CPU cores in the computer. A typical consumer-end computer these days has four. If you want to use all of them in one application, the software must be written for parallel execution.

True randomness is hard to achieve in a computer. It is possible to a certain degree, but generating a large number of truly random values takes time. Instead, the common solution is a so-called pseudo-random number generator (PRNG). Many exist. Most of them are flawed, but there are enough PRNG constructions which are considered trustworthy. However, the well-known solutions are all sequential: the random numbers are generated one by one, in sequence. Distributing randomness across parallel threads (or multiple CPU cores, to take a concrete view) is non-trivial.
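To make the idea concrete, a splittable PRNG offers a `split` operation that turns one generator into two generators intended to produce independent streams, so each parallel thread can get its own. The sketch below is a toy illustration only, not the construction analysed in the paper: it records the path of split and next decisions and derives outputs by hashing that path with SHA-256. The class and method names are my own invention for illustration.

```python
import hashlib


class ToySplittable:
    """Toy splittable PRNG (illustrative only, not cryptographically vetted).

    The state is a seed plus the path of split ('L'/'R') and next ('N')
    decisions taken so far; outputs are SHA-256 hashes of that path.
    """

    def __init__(self, seed: bytes, path: bytes = b""):
        self.seed = seed
        self.path = path

    def split(self):
        # Two child generators, one per branch; their paths differ,
        # so their output streams differ.
        return (ToySplittable(self.seed, self.path + b"L"),
                ToySplittable(self.seed, self.path + b"R"))

    def next_word(self) -> int:
        # Hash seed + path to produce a 64-bit output, then extend the
        # path so the next call yields a fresh value.
        digest = hashlib.sha256(self.seed + self.path + b"N").digest()
        self.path += b"N"
        return int.from_bytes(digest[:8], "big")
```

With this interface, each worker thread receives its own generator: split once per fork and hand one half to each branch of the computation, with no shared mutable state to synchronise. The hard part, which the toy above does not address, is showing that the resulting streams are statistically independent.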

Parallel randomness was recognised as an important problem already in the first half of the 1980s. Yet the literature is sparse. Some constructions have appeared, but few inspire any confidence. In fact, my paper demonstrates serious flaws in almost all of the known constructions. The first solution which inspires any confidence is that of Claessen and Pałka from 2013.

A preprint is available on my web page.

Seminar: The New Visualisation Lab – The What and the How

SoftICE member Arne Styve will tomorrow, Wednesday 17 June, give a brief presentation of the background for the new state-of-the-art visualisation lab at Aalesund University College, the technologies used, and the plans and vision. The lab will be at the core of research and education across all faculties at AAUC, and central to our newly started master programme in Simulation and Visualisation.

We invite a broad discussion on possible applications after the talk.

We expect the giant canvas to arrive within a week or two, ready to be installed over the summer. Thus there is little to see in the actual lab at the moment, and the tour and demo of the lab are postponed until the autumn.

The seminar is open to all and will take place in room Borgundfjorden in the AAUC main building at 12.30 on Wednesday 17 June 2015.