[Image: George Boole. See page for author, Public Domain, via Wikimedia Commons]
George Boole was the first to express the mathematics behind the logical operations we use today: XOR (which he saw as binary addition modulo 2) and AND (which was binary multiplication). The OR and NOT operations were added later. And although he was something of a contemporary of Charles Babbage, and was very interested in automation, the two never met to discuss their similar ideas. His mathematical logic also foreshadowed ideas we now associate with AI, as he sought to use his theory to explain how people think and reason.
The XOR operation takes two binary digits, adds them together, and keeps only the rightmost digit of the result. It is an "exclusive OR": the result is 1 if, and only if, exactly one of the values being added is 1. We can make a simple table:
0 + 0 = 0
0 + 1 = 1
1 + 0 = 1
1 + 1 = 0
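As a quick check, here is a minimal sketch in C (an arbitrary language choice for the examples here) that reproduces the table by computing addition modulo 2 - which in C is also the bitwise ^ operator:

    #include <stdio.h>

    int main(void) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                /* addition modulo 2; identical to the bitwise XOR a ^ b */
                printf("%d + %d = %d\n", a, b, (a + b) % 2);
        return 0;
    }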
Even simpler, the AND operation also takes two binary digits and multiplies them. The result is 1 if, and only if, both of the values are 1. Another simple table:
0 x 0 = 0
1 x 0 = 0
0 x 1 = 0
1 x 1 = 1
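The same kind of sketch reproduces this table with plain multiplication - which, for single bits, matches C's bitwise & operator:

    #include <stdio.h>

    int main(void) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                /* multiplication of single bits; same as bitwise AND a & b */
                printf("%d x %d = %d\n", a, b, a * b);
        return 0;
    }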
Boolean logic in coding
Decades after Boole's work, Claude Shannon realized that Boole's math could be applied to digital circuits, and thus to the computing machinery that was evolving in the 1930s. We're most familiar with these operators today in slightly different forms. Instead of Boole's XOR, we usually reach for the inclusive OR, which is more similar to how we use the word "or" in English.
0 v 0 = 0
0 v 1 = 1
1 v 0 = 1
1 v 1 = 1
The alert among you might figure out that a OR b == (a XOR b) XOR (a AND b).
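You can verify that identity by brute force over all four input pairs; a sketch using C's bitwise operators:

    #include <stdio.h>

    int main(void) {
        /* check a OR b == (a XOR b) XOR (a AND b) for every input pair */
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("a=%d b=%d: a|b=%d, (a^b)^(a&b)=%d\n",
                       a, b, a | b, (a ^ b) ^ (a & b));
        return 0;
    }

The two values printed agree for every pair, matching the OR table above.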
Combined with NOT, which turns a 0 into a 1 and a 1 into a 0, these give us the fundamental operations behind the "if" statements and other conditionals found in most programming languages today.
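For instance, a condition like this one combines all three (the flag names are made up purely for illustration):

    #include <stdio.h>
    #include <stdbool.h>

    int main(void) {
        /* hypothetical flags, just to show AND (&&), OR (||), and NOT (!) */
        bool logged_in = true, is_admin = false, read_only = false;

        if (logged_in && (is_admin || !read_only))
            printf("edit allowed\n");
        else
            printf("edit denied\n");
        return 0;
    }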
Many languages are willing to play fast and loose with what counts as false and true. Most agree that zero is false - but anything non-zero ends up being true. Some languages are stricter: they have a boolean type, and only expressions that evaluate to a boolean can be used with AND, OR, and NOT. Languages that aren't as strict are easier to work with - but also easier to make mistakes with. (Many is the time a programmer meant to compare two values, accidentally assigned one to the other, and got a condition that tested the wrong thing.)
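C is the classic example of a language that isn't strict; a sketch of that very mistake:

    #include <stdio.h>

    int main(void) {
        int x = 0, y = 5;

        /* meant 'x == y': instead this assigns y to x, and the condition
           tests the assigned value - 5 is non-zero, so it counts as true */
        if (x = y)
            printf("branch taken; x is now %d\n", x);
        return 0;
    }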
Combining boolean logic with multiple bits at a time lets computers do masking and quick comparisons, techniques at the heart of most graphics drivers and programming language implementations.
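As an example of masking - assuming a hypothetical 32-bit pixel laid out as 0x00RRGGBB - the individual color channels can be pulled out with shifts and AND masks:

    #include <stdio.h>

    int main(void) {
        /* hypothetical 32-bit pixel laid out as 0x00RRGGBB */
        unsigned pixel = 0x00FF7F3Fu;

        unsigned red   = (pixel >> 16) & 0xFFu; /* shift, then mask off the rest */
        unsigned green = (pixel >> 8)  & 0xFFu;
        unsigned blue  =  pixel        & 0xFFu;

        printf("r=%02X g=%02X b=%02X\n", red, green, blue);
        return 0;
    }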
Most, but not all
All of this seems pretty straightforward, but it was revolutionary when Boole came up with it in the 19th century. And it isn't guaranteed to hold even in all programming languages today. For example, SQL appears to implement boolean logic, but actually has a third possible value, which throws the usual logic out the window. SQL values can be NULL, and a NULL compares as neither equal nor unequal to anything, so a test and its negation can both come out false: if a column x is NULL, then x = 5 and NOT (x = 5) both fail to match the row. But that is a complication for another time.