Not understanding Gödel...

Mark Nelson...
Posted: Sat Jul 31, 2010 6:42 pm
 
I might be mistaken, but I believe Jacko invoked Gödel in an attempt
to show somehow that the Pigeonhole Principle (counting argument) is
invalid:

Quote:
(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)

It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".

As long as we believe that:

a) The counting argument properly models the act of compression and
decompression using set theory.

and

b) The counting argument has been proven true (and by inference its
negation is known to be false, I suppose.)

We can rely on its ability to make predictions about compression. This
is true even if set theory is incomplete (which it is) and
inconsistent (I don't know.)

Waving Gödel and throwing out the word "inconsistent" might just show
that you don't understand the Incompleteness Theorems. They go a long
way towards telling you what you can and can't do with formal systems,
and they definitely don't throw out the value of proven theorems
within those systems.

So Jacko, if you want to violate the counting argument, you will have
to do what Jules does, and make sly hints that you have somehow
stepped outside the box - your compression and decompression systems
are therefore not properly modeled by the Pigeonhole Principle. I
haven't seen you do that.

IANAM, so I will gladly take my lumps if I have misspoken here, and it
is certainly very possible.

- Mark
 
biject...
Posted: Sat Jul 31, 2010 7:51 pm
 
On Jul 31, 12:42 pm, Mark Nelson <snorkel... at (no spam) gmail.com> wrote:
Quote:
I might be mistaken, but I believe Jacko invoked Gödel in an attempt
to show somehow that the Pigeonhole Principle (counting argument) is
invalid:

(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)

It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".

As long as we believe that:

a) The counting argument properly models the act of compression and
decompression using set theory.

and

b) The counting argument has been proven true (and by inference its
negation is known to be false, I suppose.)

We can rely on its ability to make predictions about compression. This
is true even if set theory is incomplete (which it is) and
inconsistent (I don't know.)

Waving Gödel and throwing out the word "inconsistent" might just show
that you don't understand the Incompleteness Theorems. They go a long
way towards telling you what you can and can't do with formal systems,
and they definitely don't throw out the value of proven theorems
within those systems.

So Jacko, if you want to violate the counting argument, you will have
to do what Jules does, and make sly hints that you have somehow
stepped outside the box - your compression and decompression systems
are therefore not properly modeled by the Pigeonhole Principle. I
haven't seen you do that.

IANAM, so I will gladly take my lumps if I have misspoken here, and it
is certainly very possible.

- Mark

Mark, it's been years since I was in school but I always loved Gödel's
theorems. Basically, what they meant to me is that there are statements
in mathematics that cannot be proven one way or the other from a given
finite set of axioms. Changing a few axioms in geometry is what led
to other geometries besides Euclid's.
I like Jacko but I think the counting theorem, or whatever you call it,
is not one of those questionable statements. It's obviously true.
The best you can do with lossless compression is to map one set
of files to another.
People get confused since any individual file is compressible, especially
big files. It's easy for someone to take a small set of random files and
write a compressor to compress that set smaller. Because the set they
play with is so small, and it was random, and they compressed those
files, they quickly jump to the conclusion that all big files are
compressible to smaller files.
And the fact is they are all compressible, no matter how random.
The problem is that you have to use different mappings with
different compressors. No one compressor can do it all. As
soon as you add bits to select among the various compressors you
used, you increase the length; it's a zero-sum game at best.
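
A minimal Python sketch of this zero-sum point (an editor's illustration
only, not David's code): model the "many compressors plus selector bits"
scheme as one combined mapping and count how many outputs shorter than
the input are even available.

from math import ceil, log2

n = 16                      # toy input size in bits
k = 4                       # number of candidate compressors
s = ceil(log2(k))           # selector bits the decoder needs

inputs = 2 ** n
# Distinct outputs shorter than n bits: k selector values, each followed
# by a body of at most n - s - 1 bits, i.e. 2**(n - s) - 1 possible bodies.
shorter_outputs = k * (2 ** (n - s) - 1)

print(inputs, "possible inputs;", shorter_outputs, "shorter outputs available")
assert shorter_outputs < inputs   # pigeonhole: some inputs cannot shrink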

David A. Scott
--
My Crypto code
http://bijective.dogma.net/crypto/scott19u.zip
http://www.jim.com/jamesd/Kong/scott19u.zip old version
My Compression code http://bijective.dogma.net/
**TO EMAIL ME drop the roman "five" **
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged.
As a famous person once said "any cryptographic
system is only as strong as its weakest link"
 
James Dow Allen...
Posted: Sun Aug 01, 2010 7:49 am
 
On Aug 1, 1:42 am, Mark Nelson <snorkel... at (no spam) gmail.com> wrote:
Quote:
We can rely on [pigeonhole principle's] ability to
make predictions about compression. This
is true even if set theory is incomplete (which it is) and
inconsistent (I don't know.)
...
IANAM, so I will gladly take my lumps if I have misspoken here, and it
is certainly very possible.

IANAM either, but I don't think you've misspoken.

Infinite sets can get interesting. For example, the set of
integers can "get compressed" into the set of even integers
using the method of Hilbert's Hotel. And even for tinyish
infinities the Axiom of Choice is controversial. That Axiom
leads to paradoxical results; maybe it even leads to a
perpetual compression algorithm for some infinite case!
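
A tiny illustration of the pairing alluded to here (an editor's sketch,
shown on a finite window since code cannot hold an infinite set): the map
n <-> 2n matches the integers with the even integers one for one, which is
what makes infinite sets behave so strangely.

window = range(-5, 6)
pairing = {n: 2 * n for n in window}

# every integer in the window gets its own even partner ...
assert len(set(pairing.values())) == len(pairing)
# ... and every even number in the doubled window is hit exactly once
assert set(pairing.values()) == {m for m in range(-10, 11) if m % 2 == 0}
print(pairing)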

But finite numbers are a different ball of wax. Pigeonhole
should be considered perfectly satisfactory for any file
small enough to be processed in 100 trillion centuries
even assuming that every one of the universe's particles,
nearly a googol in number, could be exploited to yield the
power of today's Pentium chip. Mark, is the demo that
you are (or are not?) being invited to see operating on
input bigger than that?

Hmmm. This post, meant to affirm the Principle of Dirichlet's
Pigeons, will probably be misused as an argument in *favor of*
perpetual compression! Sorry about that. :-(

James
 
biject...
Posted: Sun Aug 01, 2010 2:44 pm
 
On Aug 1, 1:49 am, James Dow Allen <jdallen2... at (no spam) yahoo.com> wrote:


Quote:

Infinite sets can get interesting.  For example, the set of
integers can "get compressed" into the set of even integers
using the method of Hilbert's Hotel.  And even for tinyish
infinities the Axiom of Choice is controversial.  That Axiom
leads to paradoxical results; maybe it even leads to a
perpetual compression algorithm for some infinite case!


I would not call that compression. Many people get confused
about infinite sets. You can map the set of all integers to the
set of all even integers; so what. You still have the same
number of integers. What confuses some is that the total
number of integers is the same as the total number of odd
or even integers. There is more than one infinity.
But when writing a compression program, if you let the bits
mean either all integers or just even integers, so what.

Where this paradox does come into compression is with
bijective compression. I have seen many proofs over the years
by intelligent people who believed that they had proofs that
bijective file compression was impossible. They would usually
wrongly quote other people, but they never looked at my bijective
Huffman or arithmetic compressors, since they knew that they
were correct. It's interesting how the human mind works.
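
For readers unfamiliar with the term: a bijective compressor rests on a
one-to-one, onto map between the set of all finite files and itself. One
way to see that such pairings exist at all is the textbook bijection
between bit strings and natural numbers sketched below (an editor's
illustration, not David's Huffman or arithmetic code).

def string_to_nat(s: str) -> int:
    # a string of length L comes after the 2**L - 1 strictly shorter strings
    return (2 ** len(s) - 1) + (int(s, 2) if s else 0)

def nat_to_string(k: int) -> str:
    length = 0
    while k >= 2 ** length:      # walk past each block of equal-length strings
        k -= 2 ** length
        length += 1
    return format(k, "b").zfill(length) if length else ""

# round trip over the first few thousand values: "" <-> 0, "0" <-> 1,
# "1" <-> 2, "00" <-> 3, and so on, with nothing skipped and nothing reused
for i in range(5000):
    assert string_to_nat(nat_to_string(i)) == i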

When I worked for Uncle, people showed me published papers
on what the limits were for some optimal theorems for use
in inertial navigation algorithms and such. The papers, though
they claimed to be optimal, were wrong; the authors really missed
the simple approach. And sadly, as we get more and more less
educated engineers in this country, we are in trouble, because peer
reviewed papers in real hard science are becoming crap. Just
look at the peer reviewed papers that pass for science in the global
warming religion that is using its power to destroy the West.
I am assuming the Chinese, who are better educated, can see
what is happening and maybe they will not fall into the same
trap that is destroying the West.

David A. Scott
--
My Crypto code
http://bijective.dogma.net/crypto/scott19u.zip
http://www.jim.com/jamesd/Kong/scott19u.zip old version
My Compression code http://bijective.dogma.net/
**TO EMAIL ME drop the roman "five" **
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged.
As a famous person once said "any cryptographic
system is only as strong as its weakest link"
 
jacko...
Posted: Mon Aug 02, 2010 11:38 pm
 
On 31 July, 19:42, Mark Nelson <snorkel... at (no spam) gmail.com> wrote:
Quote:
I might be mistaken, but I believe Jacko invoked Gödel in an attempt
to show somehow that the Pigeonhole Principle (counting argument) is
invalid:

(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)

It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".

Sure do. The "no system that can prove it's own consistancy is
consistant, and this is provable" argument is where I'm going with it.

Quote:
As long as we believe that:

a) The counting argument properly models the act of compression and
decompression using set theory.

and

b) The counting argument has been proven true (and by inference its
negation is known to be false, I suppose.)

I would say that 'assumed true' would be more like the process involved
in the negation of an assumption which was shown to be false. The
opposite of an assumption involves 'proving' there is only one
opposite, not assuming there is only one.

Quote:
We can rely on its ability to make predictions about compression. This
is true even if set theory is incomplete (which it is) and
inconsistent (I don't know.)

It will provide results for all Shannon coding compression.

Quote:
Waving Gödel and throwing out the word "inconsistent" might just show
that you don't understand the Incompleteness Theorems. They go a long
way towards telling you what you can and can't do with formal systems,
and they definitely don't throw out the value of proven theorems
within those systems.

See above in reference to the phrase 'half-proven'.

Quote:
So Jacko, if you want to violate the counting argument, you will have
to do what Jules does, and make sly hints that you have somehow
stepped outside the box - your compression and decompression systems
are therefore not properly modeled by the Pigeonhole Principle. I
haven't seen you do that.

And I thought any JK 'time' sequence all generated from a constant
sized bit content 'by applying a time function' had a fixed size. And
so any advantageous manipulation of such a sequence to give it
'information content' is, well, within the bounds of being useful.
 
jacko...
Posted: Mon Aug 02, 2010 11:48 pm
 
The counting argument applies as per the fact that the bits
I -> T(I) -> T(T(I)) -> ... I'' are constant size and not in violation.

And change of the I to prime anything by another of the elements of
set S (as someone called it) is not in violation.

Some useful replacement properties among the set S can be used to
modulate information onto the stream represented by the I time
sequence. All symbols have the same bit length, so any substitution (or
modulation if you will) does not change the number of bits, but may
represent information.

Simple enough?
 
Thomas Richter...
Posted: Tue Aug 03, 2010 5:15 am
 
jacko wrote:

Quote:
(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)
It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".

Sure do. The "no system that can prove it's own consistancy is
consistant, and this is provable" argument is where I'm going with it.

Not quite. Gödel does not say that. He rather says that a system that is
sufficiently complex to include the axioms of natural numbers isn't able
to be both consistent and complete. Weaker systems might very well be
complete and consistent. In fact, the axiom system that contains no
axioms is surely consistent and complete: It cannot formulate any
theorem, and thus there is not a single theorem that cannot be proven.

Quote:
b) The counting argument has been proven true (and by inference its
negation is known to be false, I suppose.)

I would say that 'assumed true' would be more like the process involved
in the negation of an assumption which was shown to be false. The
opposite of an assumption involves 'proving' there is only one
opposite, not assuming there is only one.

No. Logic is binary, which means that if the opposite has been shown to be
false, the theorem is true. The possibility of indirect proofs - this is
just one - is a direct consequence of the axioms of logic and has
nothing to do with Gödel. There is nothing to be "assumed". Actually, it
is the same type of logic computer gates are designed from. You may deny
the axioms of logic (and get run over at the next pedestrian crossing),
but your computer surely doesn't. (-:

Quote:
We can rely on its ability to make predictions about compression. This
is true even if set theory is incomplete (which it is) and
inconsistent (I don't know.)

It will provide results for all Shannon coding compression.

No, it doesn't. You probably don't know Shannon well enough - or at all -
but the counting argument has no relation to channel capacity or
Shannon's theorem. You don't need information theory or entropy or
anything like that. You don't need a model, nor a source, nor
probabilities. So don't trash-talk - all you need is: The definition of
what a function is, the definition of what a finite set is, the
definition of one-to-one, and the insight that a computer program
transforming deterministically input into output can be modeled by a
function, and the axioms of logic. That is all. There *is* no Shannon here.
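
A minimal sketch of exactly that point, assuming nothing beyond finite
sets and functions (the toy_compressor below is a made-up example for
illustration, not anyone's real coder): model a "compressor" as a plain
function on bit strings and check injectivity by brute force.

from itertools import product

def all_strings(max_len):
    """All bit strings of length 0..max_len."""
    return [''.join(bits) for n in range(max_len + 1)
            for bits in product('01', repeat=n)]

def toy_compressor(s):
    """A deterministic map that always outputs something shorter."""
    return s[:-1]              # drop the last bit

inputs = [s for s in all_strings(4) if s]          # 30 non-empty strings
outputs = [toy_compressor(s) for s in inputs]      # all of length <= 3

# only 15 strings of length <= 3 exist, so collisions are forced,
# and a collision means a decompressor cannot recover both inputs
print(len(inputs), "inputs ->", len(set(outputs)), "distinct outputs")
assert len(set(outputs)) < len(inputs)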

Do you actually understand what you're talking about? Apparently, not.

Quote:
Waving Gödel and throwing out the word "inconsistent" might just show
that you don't understand the Incompleteness Theorems. They go a long
way towards telling you what you can and can't do with formal systems,
and they definitely don't throw out the value of proven theorems
within those systems.

See above in reference to the phrase 'half-proven'.

And that reference is simply incorrect. If a theorem is proven within an
axiom set, it remains proven, regardless of anything Gödel says. Gödel's
theorem rather limits the set of provable theorems within a formal
system, it does not negate the value of theorems that already have been
proven true.

Quote:
So Jacko, if you want to violate the counting argument, you will have
to do what Jules does, and make sly hints that you have somehow
stepped outside the box - your compression and decompression systems
are therefore not properly modeled by the Pigeonhole Principle. I
haven't seen you do that.

And I thought any JK 'time' sequence all generated from a constant
sized bit content 'by applying a time function' had a fixed size. And
so any advantageous manipulation of such a sequence to give it
'information content' is, well, within the bounds of being useful.

And if it is within the bounds of being a deterministic function turning
input sequences into output sequences, the counting theorem applies. It
applies to all functions from finite sets to finite sets, so it applies to
yours. And Gödel has absolutely nothing to do with that.

So long,
Thomas
 
jacko...
Posted: Tue Aug 03, 2010 9:55 am
 
On 3 Aug, 02:54, Thomas Richter <t... at (no spam) math.tu-berlin.de> wrote:
Quote:
jacko wrote:
(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)
It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".

Sure do. The "no system that can prove it's own consistancy is
consistant, and this is provable" argument is where I'm going with it.

Not quite. Gödel does not say that. He rather says that a system that is
sufficiently complex to include the axioms of natural numbers isn't able
to be both consistent and complete. Weaker systems might very well be
complete and consistent. In fact, the axiom system that contains no
axioms is surely consistent and complete: It cannot formulate any
theorem, and thus there is not a single theorem that cannot be proven.

Try "forever undecided" - a puzzle guide to godel.

Quote:
b) The counting argument has been proven true (and by inference its
negation is known to be false, I suppose.)

I would say that 'assumed true' would be more like the process involved
in the negation of an assumption which was shown to be false. The
opposite of an assumption involves 'proving' there is only one
opposite, not assuming there is only one.

No. Logic is binary, which means that if the opposite has been shown to be
false, the theorem is true. The possibility of indirect proofs - this is
just one - is a direct consequence of the axioms of logic and has
nothing to do with Gödel. There is nothing to be "assumed". Actually, it
is the same type of logic computer gates are designed from. You may deny
the axioms of logic (and get run over at the next pedestrian crossing),
but your computer surely doesn't. (-:

Logic, or more specifically "boolean logic/predicate logic", is
binary. English is not.

Quote:
We can rely on its ability to make predictions about compression. This
is true even if set theory is incomplete (which it is) and
inconsistent (I don't know.)

It will provide results for all Shannon coding compression.

No, it doesn't. You probably don't know Shannon well enough - or at all -
but the counting argument has no relation to channel capacity or
Shannon's theorem. You don't need information theory or entropy or
anything like that. You don't need a model, nor a source, nor
probabilities. So don't trash-talk - all you need is: The definition of
what a function is, the definition of what a finite set is, the
definition of one-to-one, and the insight that a computer program
transforming deterministically input into output can be modeled by a
function, and the axioms of logic. That is all. There *is* no Shannon here.

It may bear little relation to channel capacity, and the definition of
"how" compression "uses" the finite set one-to-one function.

Quote:
Do you actually understand what you're talking about? Apparently, not.

Waving Gödel and throwing out the word "inconsistent" might just show
that you don't understand the Incompleteness Theorems. They go a long
way towards telling you what you can and can't do with formal systems,
and they definitely don't throw out the value of proven theorems
within those systems.

See above in reference to the phrase 'half-proven'.

And that reference is simply incorrect. If a theorem is proven within an
axiom set, it remains proven, regardless of anything Gödel says. Gödel's
theorem rather limits the set of provable theorems within a formal
system, it does not negate the value of theorems that already have been
proven true.

The 'only one opposite' axiom is relevant to boolean logic, not to
assumptions over the English language.

Quote:
So Jacko, if you want to violate the counting argument, you will have
to do what Jules does, and make sly hints that you have somehow
stepped outside the box - your compression and decompression systems
are therefore not properly modeled by the Pigeonhole Principle. I
haven't seen you do that.

And I thought any JK 'time' sequence all generated from a constant
sized bit content 'by applying a time function' had a fixed size. And
so any advantageous manipulation of such a sequence to give it
'information content' is, well, within the bounds of being useful.

And if it is within the bounds of being a deterministic function turning
input sequences into output sequences, the counting theorem applies. It
applies to all functions from finite sets to finite sets, so it applies to
yours. And Gödel has absolutely nothing to do with that.

And it says nothing of binary decisions to choose if a substitution
of something with the same bit length content can or cannot be used
to represent information.
 
Thomas Richter...
Posted: Tue Aug 03, 2010 5:27 pm
 
jacko wrote:
Quote:
On 3 Aug, 02:54, Thomas Richter <t... at (no spam) math.tu-berlin.de> wrote:
jacko wrote:
(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)
It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".
Sure do. The "no system that can prove it's own consistancy is
consistant, and this is provable" argument is where I'm going with it.
Not quite. Gödel does not say that. He rather says that a system that is
sufficiently complex to include the axioms of natural numbers isn't able
to be both consistent and complete. Weaker systems might very well be
complete and consistent. In fact, the axiom system that contains no
axioms is surely consistent and complete: It cannot formulate any
theorem, and thus there is not a single theorem that cannot be proven.

Try "forever undecided" - a puzzle guide to godel.

Care for a less trivial example? Here is one: Propositional calculus.
That is, the simple part of logic where you have propositions
(A,B,C,...) that can be true or false, and where you have theorems like:

not (A and B) = (not A) or (not B)

Within propositional calculus, every well-formed formula is either true
(thus a theorem) and then has a proof, or is false. Thus, it is
complete. And it is consistent. Actually, the proof of this is rather
easy. Can you see it?
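
A sketch of one way to see it: in propositional calculus a truth table is
a decision procedure, so a formula such as the De Morgan identity above
can simply be checked against every assignment of its variables.

from itertools import product

def de_morgan(a, b):
    return (not (a and b)) == ((not a) or (not b))

# exhaustive truth table: 4 assignments, all of them satisfy the identity
assert all(de_morgan(a, b) for a, b in product([False, True], repeat=2))
print("not (A and B) == (not A) or (not B) holds for every assignment")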

The game changes as soon as you add "quantifiers", like "for all" or "there
exists". For first order logic - one quantifier allowed - the system is
still complete and consistent, and there is then - obviously - a proof
mechanism. (Harder, though, but not too hard). For second order logic, I
forgot - I believe it is currently unknown.

Quote:
Logic, or more specifically "boolean logic/predicate logic", is
binary. English is not.

Luckily, math works on logic, not on English.

So long,
Thomas
 
jacko...
Posted: Wed Aug 04, 2010 2:43 pm
 
On 3 Aug, 14:27, Thomas Richter <t... at (no spam) math.tu-berlin.de> wrote:
Quote:
jacko wrote:
On 3 Aug, 02:54, Thomas Richter <t... at (no spam) math.tu-berlin.de> wrote:
jacko wrote:
(Ref: counting argument and Kurt Gödel incompleteness theorem w.r.t. proof of consistency)
It is true that Gödel proved that the set theory we use as the
foundation for arithmetic (and I assume for the proof of Dirichlet's
drawer principle) cannot be both consistent and complete. But I think
he doesn't understand what is meant by "consistent" and "complete".
Sure do. The "no system that can prove it's own consistancy is
consistant, and this is provable" argument is where I'm going with it..
Not quite. Gödel does not say that. He rather says that a system that is
sufficiently complex to include the axioms of natural numbers isn't able
to be both consistent and complete. Weaker systems might very well be
complete and consistent. In fact, the axiom system that contains no
axioms is surely consistent and complete: It cannot formulate any
theorem, and thus there is not a single theorem that cannot be proven.

Try "forever undecided" - a puzzle guide to godel.

Care for a less trivial example? Here is one: Propositional calculus.
That is, the simple part of logic where you have propositions
(A,B,C,...) that can be true or false, and where you have theorems like:

not (A and B) = (not A) or (not B)

Within propositional calculus, every well-formed formula is either true
(thus a theorem) and then has a proof, or is false. Thus, it is
complete. And it is consistent. Actually, the proof of this is rather
easy. Can you see it?

That's the point of 'incompleteness' and its relation to 'proof of
consistency'. Consistency is unprovable given sufficient complexity,
as any such system which can provide such a proof can also prove something
true which is false; the property of consistency is one such thing.

Quote:
The game changes as soon as you add "quantifiers", like "for all" or "there
exists". For first order logic - one quantifier allowed - the system is
still complete and consistent, and there is then - obviously - a proof
mechanism. (Harder, though, but not too hard). For second order logic, I
forgot - I believe it is currently unknown.

Logic, or more specifically "boolean logic/predicate logic", is
binary. English is not.

Luckily, math works on logic, not on English.

The initial statement of assumption in the counting argument is
treated as a single bit predicate. (Giggle giggle) This is where the
relation to the coding used comes in 'describing the compression
program used'. It is not a single bit concept, as proven in one way by
the theorem result. Hence the assumption is dependent on its result
in the currently 'used' form in the FAQ.

Try 'all predicates are single bit predicates' - are you getting there?
 
jacko...
Posted: Wed Aug 04, 2010 2:52 pm
 
More specifically, I(S) = -log2 P(S) involves a logarithmic singularity
in the logarithm. The multiple roots of the complex logarithm and the
indeterminate form at 0 will provide you with some idea of why it is not
a single bit predicate.
 
Thomas Richter...
Posted: Thu Aug 05, 2010 5:15 am
 
jacko wrote:
Quote:
More specifically, I(S) = -log2 P(S) involves a logarithmic singularity
in the logarithm. The multiple roots of the complex logarithm and the
indeterminate form at 0 will provide you with some idea of why it is not
a single bit predicate.

What? Are you drunk? P(x) is a non-negative function, and thus log is
well-defined wherever P(x) != 0. And there, P(x) log P(x) is well
defined even for P(x) = 0 by continuous extension, as one can see
easily. And then, just again, *entropy* has absolutely *nothing* to do
with the counting argument. I do not need a probability space to see
that a map from a larger, finite set into a smaller finite set cannot be
one-to-one. I only need the definition of a function.
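
A quick numeric sketch of that continuous extension (an editor's
illustration): p * log2(p) tends to 0 as p approaches 0 from above, so
setting 0 * log2(0) = 0 in the entropy sum is harmless.

from math import log2

for p in (0.1, 0.01, 1e-4, 1e-8, 1e-16):
    print(f"p = {p:.0e}   p*log2(p) = {p * log2(p):.3e}")
# the products shrink toward 0 (about -5.3e-15 at p = 1e-16)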
 
Thomas Richter...
Posted: Thu Aug 05, 2010 9:16 am
 
jacko wrote:

Quote:
The initial statement of assumption in the counting argument is
treated as a single bit predicate.

No, it's not. The assumption is that of two finite sets N and M with
|N| > |M| and f: N->M. That is not a bit, nor does it establish a predicate.

Quote:
(Giggle giggle) This is where the
relation to the coding used comes in 'describing the compression
program used'. It is not a single bit concept, as proven in one way by
the theorem result. Hence the assumption is dependent on its result
in the currently 'used' form in the FAQ.

Try 'all predicates are single bit predicates' - are you getting there?

Mild hits on the back of your head might help?
 
jacko...
Posted: Fri Aug 06, 2010 6:11 pm
 
On 5 Aug, 06:13, Thomas Richter <t... at (no spam) math.tu-berlin.de> wrote:
Quote:
jacko wrote:
More specifically, I(S) = -log2 P(S) involves a logarithmic singularity
in the logarithm. The multiple roots of the complex logarithm and the
indeterminate form at 0 will provide you with some idea of why it is not
a single bit predicate.

What? Are you drunk? P(x) is a non-negative function, and thus log is
well-defined wherever P(x) != 0. And there, P(x) log P(x) is well
defined even for P(x) = 0 by continuous extension, as one can see
easily. And then, just again, *entropy* has absolutely *nothing* to do
with the counting argument. I do not need a probability space to see
that a map from a larger, finite set into a smaller finite set cannot be
one-to-one. I only need the definition of a function.

And your "absolute" certainty of nothing to do with is based on a
proof I suppose, or just religious fever a.k.a. sunday psychosis.
 
jacko...
Posted: Fri Aug 06, 2010 6:12 pm
 
On 5 Aug, 06:16, Thomas Richter <t... at (no spam) math.tu-berlin.de> wrote:
Quote:
jacko wrote:
The initial statement of assumption in the counting argument is
treated as a single bit predicate.

No, it's not. The assumption is that of two finite sets N and M with
|N| > |M| and f: N->M. That is not a bit, nor does it establish a predicate.

That is the refutation, not the assumption.

Quote:
(Giggle giggle) This is where the
relation to the coding used comes in 'describing the compression
program used'. It is not a single bit concept, as proven in one way by
the theorem result. Hence the assumption is dependent on its result
in the currently 'used' form in the FAQ.

Try 'all predicates are single bit predicates' - are you getting there?

Mild hits on the back of your head might help?

A real solution of the penance kind.
 
 