Re: data compression program

In article <1145994249.797643.184230@xxxxxxxxxxxxxxxxxxxxxxxxxxxx>,
"Einstein" <michaelhh@xxxxxxxxx> wrote:

> 1) 50% is probably on text-based items... and so it's pretty much
> junk.

> 2) I have a theory that random binary lossless compression's maximum
> per-cycle limit is 80%, and this would not be achievable without a
> LOT of processing time loss.

Shannon has a theorem (with well-accepted proofs) that "random binary
lossless compression" is not possible. You can only losslessly
compress out the REDUNDANCY in data, and truly random data has no
redundancy. The counting argument is simple: there are 2^n possible
n-bit files but only 2^n - 1 files shorter than n bits, so no lossless
scheme can shrink every input, let alone shrink every input by 80%.
Oops.
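
For what it's worth, here is a minimal Python sketch (standard library
only; the sample data is made up for illustration, and it says nothing
about your program in particular) showing the difference between
redundant and random input:

    import os
    import zlib

    # Highly redundant input: the same sentence repeated many times.
    text = b"the quick brown fox jumps over the lazy dog " * 200
    # Truly random input of the same length.
    rand = os.urandom(len(text))

    for label, data in (("text", text), ("random", rand)):
        packed = zlib.compress(data, 9)  # maximum compression level
        saved = 100 * (1 - len(packed) / len(data))
        print(f"{label:6s} {len(data)} -> {len(packed)} bytes "
              f"({saved:+.1f}% saved)")

On a typical run the repeated text shrinks by well over 90%, while the
"compressed" random bytes come out slightly LARGER than the input,
because the container format adds a few bytes of overhead.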

> 3) My software is 75% completed, and is going to prove this already.

"Ninety-eight percent complete / But it's been that way for weeks"

> 4) I already have a patent pending on my invention :)

See another post in this thread re the value of some patents.

5) Wanna post a white paper on this invention? I would love to see
your math methods.