But back to some true randomness at http://phys.org/news/2016-06-algorithm-random.html - I find it interesting that your randomness might lose its randomness because the other systems it relies on for entropy aren't themselves truly random. I wonder if, at the heart of the matter, that means all systems are crackable if you can learn the state of the components that went into building them. That is, it's all obfuscation until you obfuscate to the point that the number of variables you'd have to account for approaches infinity.
"How do you know for sure that the measurement devices used to measure the physical system don't have some underlying predictability due to the way they were constructed? To overcome this problem, scientists have developed strict requirements on the devices, but these "device-independent" protocols are so strict that they are very slow at generating large amounts of random numbers."
I also think it's interesting that there's a distinction between randomization during generation (dynamic?) and after the fact (static?), with the latter being computationally heavy and more akin to certifying randomness than producing it. You have to wonder if someday someone with enough horsepower will discover that all the GUIDs generated by current systems aren't random after all and are discoverable, and we'll be forced into a Year-2000-like situation where we have to recompute system IDs and update them (and therefore have to update all the relationships - nasty).
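A minimal Python sketch of that worry (the function name `weak_guid` and the seed are my own illustration, not anything from the article): if a GUID were built from a seeded deterministic PRNG, anyone who recovered the seed could regenerate it exactly, whereas `uuid.uuid4()` draws from the operating system's entropy source.

```python
import random
import uuid

def weak_guid(seed):
    """Hypothetical 'GUID' built from a seeded PRNG - fully predictable."""
    rng = random.Random(seed)  # deterministic Mersenne Twister
    return uuid.UUID(int=rng.getrandbits(128), version=4)

# Same seed -> same "random" GUID, every time.
assert weak_guid(42) == weak_guid(42)

# By contrast, uuid4() pulls from the OS entropy pool (os.urandom),
# so repeated calls are independent and collisions are astronomically unlikely.
assert uuid.uuid4() != uuid.uuid4()
```

The point isn't that real systems seed GUIDs this naively, just that the "randomness" of an ID is only as good as the state that went into generating it.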