New Hutter Prize Winner Achieves Milestone for Lossless Compression of Human Knowledge
Since 2006, Baldrson (Slashdot reader #78,598) has been part of the team verifying “The Hutter Prize for Lossless Compression of Human Knowledge,” an ongoing challenge to compress a 1-GB excerpt of Wikipedia (approximately the amount a human can read in a lifetime).

“The intention of this prize is to encourage development of intelligent compressors/programs as a path to Artificial General Intelligence,” explains the project’s web site.

Fifteen years ago, Baldrson explained the logic in a Slashdot post titled “Compress Wikipedia and Win AI Prize”:

The basic theory, for which Hutter provides a proof, is that after any set of observations the optimal move by an AI is to find the smallest program that predicts those observations and then assume its environment is controlled by that program. Think of it as Ockham’s Razor on steroids.
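The link between prediction and program size can be illustrated with an off-the-shelf compressor: data generated by a small, simple program (a repeating pattern) compresses far better than data with no short description. This is a loose sketch of the idea, not the prize's actual methodology; the pattern and pseudo-random stream here are made up for illustration.

```python
import hashlib
import zlib

def compressed_size(data: bytes) -> int:
    # zlib at maximum effort gives a (very rough) upper bound on the
    # length of a short description of the data
    return len(zlib.compress(data, 9))

# Highly predictable data: generated by a tiny "program" (a repeated phrase)
predictable = b"the cat sat on the mat. " * 400

# Hard-to-predict data: a deterministic pseudo-random byte stream of equal length
chunks, seed = [], b"seed"
while sum(len(c) for c in chunks) < len(predictable):
    seed = hashlib.sha256(seed).digest()
    chunks.append(seed)
unpredictable = b"".join(chunks)[: len(predictable)]

print(compressed_size(predictable))    # tiny: the data has a short description
print(compressed_size(unpredictable))  # near the input size: no short description
```

The same intuition underlies the prize: a better compressor of Wikipedia is, in effect, a better predictor of human knowledge.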
The amount of the prize also increases based on how much compression is achieved. (So if you compress the 1GB file x% better than the current record, you’ll receive x% of the prize…) The first prize was awarded in 2006. And now Baldrson writes with the official news that this spring another prize was claimed after reaching a new milestone:
Artemiy Margaritov’s STARLIT algorithm achieved a 1.13% improvement, clearing the 1% hurdle needed to beat the previous benchmark, set by Alexander Rhatushnyak. He receives a bonus proportional to the time since that benchmark was set, raising his award by 60% to €9000 [$10,632 USD]. Congratulations to Artemiy Margaritov for his winning submission!
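The payout arithmetic described above can be sketched in a few lines. The €500,000 fund size is an assumption on my part (the story quotes only the improvement, the bonus, and the final award); the 1.13% improvement and 60% time bonus come from the story.

```python
# Hedged sketch of the prize arithmetic, not an official calculation.
PRIZE_FUND_EUR = 500_000   # assumed total prize fund (not stated in the story)
improvement = 0.0113       # STARLIT's improvement over the previous record
time_bonus = 0.60          # bonus for time elapsed since the last record

base_award = PRIZE_FUND_EUR * improvement    # x% improvement -> x% of the fund
final_award = base_award * (1 + time_bonus)  # time bonus raises the award by 60%
print(round(final_award))                    # consistent with the roughly €9000 reported
```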

Read more of this story at Slashdot.