I first became familiar with Bertrand Russell (https://en.wikipedia.org/wiki/Bertrand_Russell) through his massive philosophical writings. My introduction to his logical ideas came from his book about language, An Inquiry into Meaning and Truth, a 300-page book that concluded a single word has no meaning. (I reread it two words at a time.) Naturally, most of the book dealt with definitions and logic. One of his Cambridge University students went on to define what it takes for a machine to be a computer.
“A Turing machine is a mathematical model of computation describing an abstract machine that manipulates symbols on a strip of tape according to a table of rules. Despite the model’s simplicity, it is capable of implementing any computer algorithm.” (Wikipedia, Turing machine)
The Greek philosophers Socrates and Plato, among other great accomplishments, taught early forms of propositional logic. This method of determining an action can be duplicated in simple hardware logic gates, conceivably even before biology existed. The industrial revolution created production lines requiring control mechanisms: first mechanical devices such as the Jacquard machine, then computer-controlled machines handled progressively more complicated functions.
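To make that connection concrete, here is a small sketch of my own (not from the original text) showing propositional logic expressed as hardware-style gates in Python. Each gate is a pure function on booleans, exactly what a transistor circuit computes:

```python
# A minimal sketch (illustrative example): propositional logic
# realized as hardware-style logic gates.

def NOT(a: bool) -> bool:
    return not a

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def IMPLIES(a: bool, b: bool) -> bool:
    # "if a then b" is equivalent to (NOT a) OR b
    return OR(NOT(a), b)

# Truth table for modus ponens: ((p -> q) AND p) -> q.
# A valid rule of inference is true on every row.
for p in (False, True):
    for q in (False, True):
        result = IMPLIES(AND(IMPLIES(p, q), p), q)
        print(f"p={p!s:5} q={q!s:5} -> {result}")
```

Running it prints True on all four rows: the classical inference rule holds no matter what the inputs are, which is why it can be "taught" to wires and switches.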
The fundamental mathematics and logic required were formally studied by a group of University of Cambridge professors and students. Bertrand Russell co-authored Principia Mathematica, a gigantic work defining symbolic logic. One of his students, Ludwig Wittgenstein, worked extensively on logic and language.
Another Cambridge University math student, Alan Turing, created a hypothetical computer, now called the Turing machine, that became the model for stored-program computers. The machine has locations for an instruction type and two data sources. A symbol is read from memory and interpreted as a program instruction. The machine then manipulates the data in memory using the rules associated with that instruction. For example, if the symbol is ADD, the machine adds the two inputs, creating a sum, and then moves to the next instruction.
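To show what a "table of rules" looks like in practice, here is a minimal Turing machine simulator of my own devising (an illustrative sketch, not from the original text). Its rule table performs addition in unary notation, e.g. "111+11" becomes "11111", which is closer to how a real Turing machine manipulates tape symbols than a single ADD instruction:

```python
# A minimal Turing machine sketch (assumed example). The tape is a dict
# from position to symbol; '_' is the blank symbol. The rule table
# computes unary addition: "111+11" -> "11111".

rules = {
    # (state, read): (write, move, next_state)
    ("scan",  "1"): ("1", +1, "scan"),   # skip over the first number
    ("scan",  "+"): ("1", +1, "join"),   # turn '+' into a 1, joining the numbers
    ("join",  "1"): ("1", +1, "join"),   # skip over the second number
    ("join",  "_"): ("_", -1, "erase"),  # reached the end; step back one cell
    ("erase", "1"): ("_", -1, "halt"),   # erase the one extra 1, then halt
}

def run(tape_str: str) -> str:
    tape = dict(enumerate(tape_str))
    pos, state = 0, "scan"
    while state != "halt":
        symbol = tape.get(pos, "_")          # read the symbol under the head
        write, move, state = rules[(state, symbol)]
        tape[pos] = write                    # write, then move the head
        pos += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

print(run("111+11"))  # prints "11111" (3 + 2 = 5 in unary)
```

Five short rules are enough to add two numbers, which is the point of the model: any algorithm, however complex, reduces to reading a symbol, consulting a table, writing, and moving.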
Modern mainframe and personal computers still use this basic architecture with one remarkable improvement: the ability to respond to an interrupt. This might be the user pressing a key, a signal from the internet, a printer reporting a problem, or a low-power warning. Because processor and memory speeds are orders of magnitude faster than input devices, a processor can service other programs while waiting for data. On machines with multiple processors, interrupts can assign tasks to idle processors.
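The following toy model (my own assumed example, not from the original text) shows the idea in Python: while a slow "device" prepares data, the processor keeps doing useful work instead of waiting, and services the device only when it signals:

```python
# A toy model of interrupt-driven scheduling (illustrative sketch).
# The interrupt line is modeled as a thread-safe queue; a slow device
# raises an "interrupt" by putting a message on it.

import queue
import threading
import time

interrupt_line = queue.Queue()

def slow_device():
    time.sleep(0.5)                      # I/O is vastly slower than the CPU
    interrupt_line.put("KEY_PRESSED")    # raise an interrupt when data is ready

threading.Thread(target=slow_device, daemon=True).start()

done = 0
while True:
    try:
        irq = interrupt_line.get_nowait()   # poll the interrupt line
    except queue.Empty:
        done += 1                           # no interrupt: do other useful work
        continue
    print(f"Serviced interrupt {irq!r} after {done} units of other work")
    break
```

Real hardware delivers the interrupt asynchronously rather than by polling, but the payoff is the same: the counter shows how much other work got done during the half second the device took to respond.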
Nature has had over two billion years to form logic gates from deposited silicon compounds. The Grand Canyon's base, the Vishnu Basement Rocks, contains silicon dioxide (quartz), the basic building block of the computer.
With a “machine” capable of detecting and analyzing data, the next step would be to allow it to supervise what it is doing.
Next Chapter: Consciousness