Leibniz


Gottfried Wilhelm Leibniz (1646 - 1716)

Gottfried Leibniz laid the modern foundation of the movement from decimal to binary as far back as 1666 with his 'On the Art of Combinations' (De Arte Combinatoria), laying out a method for reducing all logic to exact statements.

Leibniz believed logic, or 'the laws of thought', could be moved from a verbal state - which was subject to the ambiguities of language, tone and circumstance - into an absolute mathematical condition:

"A sort of universal language or script, but infinitely different from all those projected hitherto, for the symbols and even words in it would direct the reason, and errors, except for those of fact, would be mere mistakes in calculation. It would be very difficult to form or invent this language or characteristic, but very easy to understand it without any dictionaries."

The concept was a bit high-flown for his time, and Leibniz' idea was ignored by the scientific community of his day. He let his proposition drop - until about ten years later when the Chinese 'Book of Changes', or 'I Ching', came his way.

Leibniz found some sort of confirmation for his theories in the I Ching's depiction of the universe as a progression of contradicting dualities, a series of on-off, yes-no possibilities, such as dark-light and male-female, which formed the complex interaction of life and consciousness. He reasoned that, if life itself could be reduced to a series of straightforward propositions, so could thought, or logic.

Heartened by his new insights, Leibniz set out to refine his rudimentary binary system, studiously transposing numerals into seemingly infinite rows of ones and zeros - even though he couldn't really find a use for them.

Leibniz' stepped wheel calculator was built for decimal numbers. Although he apparently gave some thought over the years to another machine which would incorporate his beloved binary system, the long strings of binary numbers that replaced single decimal digits must have seemed daunting.

Actually, they must have seemed overwhelming, because Leibniz seemed to lose the plot towards the end of his life, endowing his binary system with a kind of quasi-religious mysticism. Binary numbers, he came to believe, represented Creation: the number one portrayed God, and zero the Void.

Leibniz died without achieving his dream of a universal mathematical/logical language, but leaving the fundamental idea of the binary yes-no/on-off principle for others to play with, including Ploucquet, Lambert and Castillon. George Boole picked up their combined efforts roughly 125 years later for another buff and polish.




Binary - So Simple a Computer Can Do It

While every modern computer exchanges and processes information in the ones and zeros of binary, rather than the more cumbersome ten-digit decimal system, the idea isn't a new one.

Australia's aboriginal peoples counted by two, and many tribes of the African bush sent complex messages using drum signals at high and low pitches. Morse code, as well, uses two symbols (dots and dashes) to represent the alphabet.

Gottfried Leibniz laid the modern foundation of the movement from decimal to binary as far back as 1666, while John Atanasoff, a physics professor at Iowa State College, had built a prototype binary computer by 1939.

In the meantime, Claude Shannon, Konrad Zuse and George Stibitz had been pondering away in their own corners of the world, musing on the benefits of combining binary numbers with boolean logic.

. . . . . . . . . . . . . . . . . . . .

Today, of course, and in almost every computer built since the 1950s, the binary system has replaced the decimal system (which really only came about because it was handy to be able to count on your fingers) and has advanced digital computer capabilities to an incredible degree.

Basically, binary simplifies information processing. Because a processing system needs at least two symbols to distinguish significance or purpose, binary is the smallest numbering system that can be used.

The computer's CPU need only recognise two states, on or off, but (with just a touch of Leibniz' mysticism) from this on-off, yes-no state all things flow: just as a switch must always be open or closed, and an electrical flow on or off, a binary digit must always be one or zero.

If switches are then arranged along boolean guidelines, these two simple digits can create circuits capable of performing both logical and mathematical operations.
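As a sketch of that idea (an illustration added here, not part of the original article), the classic half-adder and full-adder circuits can be modelled with nothing but boolean operations, showing how "switches arranged along boolean guidelines" yield arithmetic:

```python
# Hypothetical illustration: arithmetic built purely from logic gates.

def half_adder(a, b):
    """Add two bits; return (sum, carry) using only logical operations."""
    s = a ^ b        # XOR: sum bit is 1 when exactly one input is 1
    carry = a & b    # AND: carry is 1 only when both inputs are 1
    return s, carry

def full_adder(a, b, cin):
    """Add two bits plus a carry-in, chaining two half-adders."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2   # OR merges the two possible carries

def add_bits(x, y, width=8):
    """Add two integers bit by bit, the way a chain of adders would."""
    result, carry = [], 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result.append(bit)
    return sum(b << i for i, b in enumerate(result))

print(add_bits(5, 9))  # 14
```

The same handful of gates, wired together in enough quantity, is all a CPU's arithmetic unit amounts to.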

The reduction of decimal to binary does increase the length of the number considerably, but this is more than made up for by the gains in speed, memory and utilisation.
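A quick sketch (added for illustration) makes the length increase concrete: each decimal digit costs roughly 3.3 binary digits, since log2(10) is about 3.32.

```python
# Illustration: how much longer a number gets in binary than in decimal.
for n in [9, 99, 999, 9999]:
    b = bin(n)[2:]   # strip Python's '0b' prefix
    print(f"{n} -> {b} ({len(str(n))} decimal digits, {len(b)} bits)")
# 999, for example, becomes 1111100111: three digits swell to ten bits.
```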

Especially utilisation. Remember, computers aren't always dealing with pure numbers or logic. Pictures and sound must first be reduced to numerical equivalents that, in turn, have to be decoded again for the end result.