18th March 2007 - 04:53 PM
Well, if we're working with logic, which appears to include most everything describable in mathematics or science, then ultimately you can break anything down (even uncertainties) into yes-or-no descriptions, and this can be seen as binary information (usually 1 is equated with yes or true, and 0 with no or false).
You can write numbers in binary, just like in decimal as well. In decimal, the digits grow by powers of 10, so:
2,571 = 2*1000 + 5 * 100 + 7 * 10 + 1 * 1
Or you could also write this as:
2,571 = 2*10*10*10 + 5 * 10*10 + 7 * 10 + 1 * 1
With binary numbers the powers are of 2, so we could determine the value of 11010 by computing:
1*2*2*2*2 + 1*2*2*2 + 0*2*2 + 1*2 + 0 = 16 + 8 + 2 = 26
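The powers-of-2 sum above is easy to turn into code. Here's a small sketch (the function name is just mine) that walks a binary digit string left to right, doubling the running total for each new digit, which works out to the same sum of powers of 2:

```python
# Convert a binary digit string to its decimal value.
# Each step multiplies by 2 (shifting the digits up one power) and adds
# the current digit, which is equivalent to summing digit * 2^position.
def binary_to_decimal(bits):
    value = 0
    for digit in bits:
        value = value * 2 + int(digit)
    return value

print(binary_to_decimal("11010"))  # 16 + 8 + 2 = 26
```

Python's built-in `int("11010", 2)` does the same thing, if you'd rather not write it yourself.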
There are some ways of figuratively knowing less than a single binary digit of information, by only knowing a probability (an uncertainty with a bias), so you might know that something is "true" 75% of the time, for example. But this is in many ways more than a single bit of information, because the amount of uncertainty could have a binary description as well. (You could, for example, do a 20-questions version: "Are you confident in this value, yes or no?" "Would that be very (non)confident or somewhat (non)confident, yes or no?" etc.
Kind of like a lawyer asking you questions :lol)
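There's actually a standard way to put a number on "less than a single bit": Shannon's entropy formula (not named in the post, but it matches the idea). A fair 50/50 answer carries exactly 1 bit of uncertainty; a 75%-biased answer carries less:

```python
import math

# Shannon entropy of a yes/no answer that is "true" with probability p:
# H(p) = -(p * log2(p) + (1-p) * log2(1-p)), measured in bits.
def entropy_bits(p):
    if p in (0.0, 1.0):
        return 0.0  # a certain answer carries no uncertainty
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(entropy_bits(0.5))   # 1.0 bit (a fair coin flip)
print(entropy_bits(0.75))  # about 0.811 bits (the 75% example)
```

So knowing something is true 75% of the time leaves you only about 0.81 bits short of certainty, even though describing the probability itself takes more bits.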
There might be some things that couldn't be described as bits of information, an example I've given in the past is that you can't communicate the conscious sensation of smelling something to a rock, no matter how many "bits" of information or yes or no answers you gave, though it should be theoretically possible for a specific smell to be described or stored as bits of information and recalled, assuming there exists some pathway for this communication to occur.
Anyway, a good analogy is that anything that could be described or written could be translated into a binary language doing the same thing, once the symbols the language represents have been conveyed or agreed upon ahead of time. So something still needs to exist to decode binary information, but in terms of storage, memory or communication, it should be possible to do it all in binary.
For example, words could be written in binary as well. We could create the table:
00 = Jane
01 = runs
10 = fast
11 = slow
And write the "sentence" 00 01 10 for "Jane runs fast", or 00 01 11 for "Jane runs slow". Though binary information relies on other structures to interpret it, it's the least restrictive and most general form of communication for information.
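The "Jane runs fast" example can be sketched as a lookup table. The code assignments here are hypothetical, just matching the example sentence (00 = Jane, 01 = runs, 10 = fast, 11 = slow):

```python
# A hypothetical 2-bit code table agreed upon "ahead of time".
code_table = {"00": "Jane", "01": "runs", "10": "fast", "11": "slow"}

def decode(message):
    # Split the bit string into 2-bit codes and look each one up.
    codes = (message[i:i + 2] for i in range(0, len(message), 2))
    return " ".join(code_table[code] for code in codes)

print(decode("000110"))  # Jane runs fast
print(decode("000111"))  # Jane runs slow
```

Note that the bits alone are meaningless: `decode` only works because both sides share `code_table`, which is exactly the "something still needs to exist to decode binary information" point above.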
You can also see binary relationships in most concepts, in that they have contrasts or antonyms - day exists as a complement to night and vice versa, or up/down, add/subtract, male/female, etc. A concept has to relate to something else or it has no meaning with regard to anything else - "yutgrix" is a meaningless word because it has no relationship to anything else, except maybe an "anti-yutgrix" or "non-yutgrix", so you have to define what it means, and that creates a pairing that could be seen as a binary relationship. A trinary relationship is one where three things are involved, though you could still break it into three binary pairings of relationships, so the lowest common denominator for describing just about anything is binary (though binary descriptions can be very long, because each bit says very little).