Comments
lucaspar: @andros705 When speaking of experience, you can't take shortcuts and expect the same results as having lived it. In other words, watching others do things (and fail or not) is a very different and incomplete lesson compared to doing it yourself. Thus, "no compression algorithm": you can't reduce an experience to anything else without losing information.
Mitiko: @lucaspar Compression is all about repetition. Maybe he means everything changes all the time and unpredictably.
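As an aside, a minimal sketch of that "repetition" idea is run-length encoding (my own illustration, not from the thread; the function name is made up):

def rle_encode(data: str) -> list[tuple[str, int]]:
    # Collapse runs of repeated characters into (char, count) pairs.
    runs: list[tuple[str, int]] = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

print(rle_encode("aaaabbbcca"))  # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]

The more repetitive the input, the fewer pairs come out; on data with no runs at all, this scheme actually expands the input.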
lucaspar: @Mitiko idk, random data can still be compressed, but that's another interpretation I guess :)
Mitiko: @lucaspar Random data can be compressed if there is at least some repeatability or an obvious pattern, e.g. 123456789 is easy to compress when delta encoded.
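To make the delta-encoding example concrete (a hypothetical sketch, function name my own): store the first value, then only the differences between neighbors, and a rising sequence collapses into one long run.

def delta_encode(values: list[int]) -> list[int]:
    # Keep the first value, then store each successive difference.
    return values[:1] + [b - a for a, b in zip(values, values[1:])]

print(delta_encode([1, 2, 3, 4, 5, 6, 7, 8, 9]))
# [1, 1, 1, 1, 1, 1, 1, 1, 1] -> a single run, trivial for RLE or any entropy coder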
lucaspar: Indeed, but patterns do appear in random samples. Randomness is about probability and, therefore, predictions. Compression algorithms do not depend on any prediction or non-determinism: the data is already defined.
Mitiko: @lucaspar Well actually, the ones with neural networks, like paq8, use the current context (or part of it) to make predictions about the next char/byte/bit, then apply some kind of statistical encoding or RLE. Furthermore, you can track entropy and the compression ratio to restart the context if needed. The Markov chains used in LZMA are basically just predictions based on the context.
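A toy order-1 context model, loosely in the spirit of those predictors (my own sketch; real paq8 mixes many models and LZMA's state machine is far more involved):

from collections import Counter, defaultdict

def train_order1(data: bytes) -> dict[int, Counter]:
    # For each byte, count which byte tends to follow it (order-1 context).
    model: defaultdict[int, Counter] = defaultdict(Counter)
    for prev, nxt in zip(data, data[1:]):
        model[prev][nxt] += 1
    return model

def predict(model: dict[int, Counter], context: int) -> dict[int, float]:
    # Probability distribution over the next byte, given the previous one.
    counts = model.get(context, Counter())
    total = sum(counts.values())
    return {b: c / total for b, c in counts.items()}

model = train_order1(b"abababac")
print(predict(model, ord("a")))  # {98: 0.75, 99: 0.25} -> 'b' is likely after 'a'

A statistical coder (e.g. arithmetic coding) then spends fewer bits on the symbols the model rates as likely, which is exactly where the compression comes from.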