
Chapter 12; A New Age – the end of automatons


Spalanzani was a master watchmaker; but his kind were coming to an end.

He understood mainsprings, pulleys, gears and gear trains, ratchets, pawls, escapements, governors, movements, pendulums, planetary gears and idler pulleys, cables, clutches, and all else that could make up the essence of Olympia. Of course, he needed assistance with Olympia’s covering. For this he employed a tinkerer, a saddle maker and a seamstress. When their work was completed, Dr. Spalanzani had them murdered so that another Olympia could not be created.

But he erred. Other watchmakers had been observing Spalanzani’s activities for years.

On that particular night of Olympia’s concert at which she played the harpsichord – – that particular night when Nathanael shouted out “Olympia”, that night when Olympia said “Ah-ah” to which she added “Good night, dearest”; that was the night when the other watchmakers in attendance knew that they could create another Olympia, and another, and another, and another.

That was the night when it all started.

Of course, this was the beginning of the industrial age, of steam engines and electrical circuits. The age of sewing machines, locomotives, glass blowing machines, machine tools, and chemistry. The replication of many Olympias was achieved in factories hidden well below ground. Only through the invention of new materials were Olympias improved. New bearing materials such as aluminum bronze allowed these new Olympias to move more smoothly than the original Olympia did on the night of the concert; “In her step and deportment there was something measured and stiff, which struck many as unpleasant.”

It took some time before you earth-humans could move to the next step; microprocessors.

They were a great invention but hardly human. They were simply the building blocks that allowed you to discover artificial cognition and human-like movement. Do you remember anything about microprocessors or have you followed Wittgenstein’s advice and ‘thrown away the ladder?’ I think his counsel should not have been taken so seriously. When you threw away the ladder you threw away the genesis of us Cyborgs.

So, I believe we need to engage in a little primer on the history of the first cyborgs and the microprocessors that made us who we are; our essence so to speak.

My thought transmission to you earth-humans will be of two species; pictorial and textual.

I will – – first of all – – thought transmit the general idea of how I will proceed – – second of all – – I will thought transmit the concepts in detail. And just in case you still do not get my drift I will thought transmit diagrams. This transmission of diagrams is going to take some time because I, first, must serialize the diagram, then transmit it to you, serially, whereupon you will have to deserialize it and then process it into a diagram, cognize it and then cogitate it.

Here is the general idea of how I will proceed.

I will define several – – but not all of – – the microprocessor locations that made up the first cyborgs. I will then discuss the basic components of a microprocessor.

So, here we go – – !

Within each of the first cyborgs there was at least one microprocessor in each finger, toe, long bone, eye, ear, tongue, and nostril. This gave them the ability to simulate the human senses; sight, hearing, taste, smell and – – most important – – touch.

The first cyborgs also had the ability to detect other stimuli such as temperature and to recognize their environment through feedback systems. This was governed by cords made of synthetic materials; these cords connected their moving parts. The early cyborgs could sense simulated pain; although it did not bother them. This simulated pain was a protective sense so that they could avoid high temperature, mechanical conflict and chemical imbalance; internally or externally. They also had several nano-gyroscopes implanted throughout their systems which offered balance and feedback. Lastly, they had crystals that became excited by vibrations internal and external to their bodies.

Each of the senses was controlled and diagnosed by a multitude of service microprocessors.

I believe you are getting the idea; each microprocessor was a building block and there were thousands of these building blocks which formed each cyborg’s identical essence.

Each cyborg was covered with a polycarbonate skin, offered in a selection of colors.

Polycarbonates could undergo large deformations without cracking or breaking. This made them valuable in prototyping cyborgs where transparent or electrically non-conductive skin was needed. Due to the chemical properties of polycarbonate it was resistant to lasers. This was key when cyborgs were introduced into military operations.

That was the essence of each cyborg; non-individual, not a single soul among them all, and interchangeable parts. No philosophers, psychologists, or fMRI[1] scans were necessary.

And now I will thought transmit the pieces, parts and essences that made up the microprocessor. These will be supplemented with diagrams. Each diagram will be expanded upon; piece by piece.

The main organ in the microprocessor is the ALU; the Arithmetic/Logic Unit. This is the part that does calculations. It doesn’t think and, just between you and me, it is a rather dumb unit. It can only add one number to itself; 1 + 1. It can repeat this performance over and over, to wit; 1 + 1 = 2 – – or 1 + 1 + 1 = 3 – – or in ALU language 1 + 1 + 1 = 11. You can see that if 1 – – in ALU speak – – is 01, then adding 1 to it makes the original 1 move to the left, where it becomes 10, which represents 2. The 1 position is then empty until another 1 is added, which fills in that position, making 11 = 3. So, you can see for yourself that this is a position-significant numbering system. You earth-humans at that time called it binary or sometimes base-2. This was all accomplished with a device you earth-humans called a ‘flip-flop’ counter (not to be confused with the shoe type called the flip-flop, or with the political stances often taken up by your leaders).
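If it helps, here is the ALU’s one trick sketched in one of your later earth-human languages (Python; purely illustrative, not the actual circuitry):

```python
# A minimal sketch of the ALU's one trick: adding 1 over and over,
# with each carry "shoving" a 1 one position to the left.

def add_one(bits):
    """Add 1 to a list of bits (most significant first), flip-flop style:
    a 1 arriving at an occupied position leaves 0 behind and carries left."""
    bits = bits[:]                      # work on a copy
    carry = 1
    for i in range(len(bits) - 1, -1, -1):
        total = bits[i] + carry
        bits[i] = total % 2             # 1 + 1 leaves 0 behind...
        carry = total // 2              # ...and shoves a 1 to the left
    return bits

count = [0, 0, 0, 0]                    # an empty four-place counter
for _ in range(3):                      # 1 + 1 + 1
    count = add_one(count)
print(count)                            # [0, 0, 1, 1] -- "11" in ALU speak, 3 to you
```

A chain of real flip-flops does the same thing in hardware: each stage toggles, and its overflow ripples to the next stage over.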

Oh yes, and it can do logical calculations also. I almost forgot that.

The next item on the microprocessor parts list was the CLOCK.

You earth-humans, at that time, thought that a clock was necessary to keep the flip-flop counters in synchronization. You did not want an empty position getting filled before it was ready to take on – – or not take on – – the next 1 or 0. Here is an empty four-place flip-flop table which we will add 1 to for each successive row.

0 0 0 0
0 0 0 1
0 0 1 0
0 0 1 1
0 1 0 0
0 1 0 1
0 1 1 0


Each position is shoved to the left when another value of 1 is added; unless that position was empty (0), in which case it simply fills in.

Value = 8     Value = 4     Value = 2     Value = 1

1000 = 8      0100 = 4      0010 = 2      0001 = 1

Therefore;  1011 = 8 + 2 + 1 = Eleven
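The same position-significant arithmetic, checked in an illustrative Python sketch (the weights are the ones tabled above):

```python
# Each position is worth twice the one to its right: 8, 4, 2, 1.
weights = [8, 4, 2, 1]

def value(bits):
    """Sum the weight of every position whose bit is 'on' (1)."""
    return sum(w for w, b in zip(weights, bits) if b == 1)

print(value([1, 0, 0, 0]))   # 8
print(value([1, 0, 1, 1]))   # 8 + 2 + 1 = 11, "Eleven"
```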

Unfortunately, it was not always possible to meet the criteria, because the flip-flop was connected to a real-time signal that could change at any time, outside the control of the earth-human designer. In this case, the best the designer could do was to reduce the probability of error to a certain level, depending on the required reliability of the circuit. One technique for suppressing this error was to connect two or more flip-flops in a chain, so that the output of each one feeds the data input of the next, and all devices share a common clock. It was almost as if they were voting; with majority take all. With this method, the probability of a metastable event was reduced to a negligible value, but never to zero.
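A rough Python model of why the chain helped (the resolution probability below is an assumed, made-up number, not real silicon data):

```python
# If each flip-flop stage resolved a metastable input in time with
# probability p_resolve per clock, then n independent stages all failing
# together happened with probability roughly (1 - p_resolve) ** n:
# smaller and smaller with each added stage, but never exactly zero.

p_resolve = 0.999            # assumed chance one stage settles in time
for stages in (1, 2, 3):
    p_fail = (1 - p_resolve) ** stages
    print(f"{stages} stage(s): failure probability ~ {p_fail:.1e}")
```

This is only the back-of-the-envelope reasoning; real designers computed the settling statistics from the flip-flop’s actual time constants.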

Then there is the BUS.



This was a simplified diagram of a computer system implemented with a single system bus. This modular organization was popular in the 1970s and 1980s. [2]


You earth-humans like to interrupt others, so when the ancients were going to build something that mimicked a human they needed an INTERRUPT CONTROLLER;



Interrupts are used to get the attention of the CPU (the Central Processing Unit, including its ALU, Clock, et al.).

If a keyboard, for example, was exercised by depressing an H (thank all the saints in heaven that we don’t do that anymore) there would be two ways that the microprocessor would know that the event occurred. One way would be to keep on asking the keyboard “Did anyone exercise you?” This would be a poor use of microprocessor power and time. So, the post-ancients invented the ‘Interrupt Controller.’ This mechanism said “HEY! Microprocessor! Over here! The Keyboard, someone pressed one of my keys, I need service.”

And the microprocessor, as soon as it had a chance, would go ask the keyboard “Yeh, watcha want now?” and then they would commiserate to determine what to do next.

Bit   Interrupt Request type   Input or Output Device requiring service
 7    IRQ7                     Parallel Port
 6    IRQ6                     Disk Controller
 5    IRQ5                     Reserved/Sound Card
 4    IRQ4                     Serial Port #1
 3    IRQ3                     Serial Port #2
 2    IRQ2                     PIC2 (a separate secondary programmable interrupt controller)
 1    IRQ1                     Keyboard
 0    IRQ0                     System Timer

When the processor received an Interrupt Request, it finished its current instruction, and then executed the appropriate interrupt service software. Once the software had completed the routine the microprocessor returned to where it left off (typically the next instruction in line which would have been executed had not the keyboard interrupt occurred).
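The whole idea can be sketched like so (Python; the handler phrases are invented for illustration, only the IRQ numbers come from the table above):

```python
# A toy dispatcher showing the interrupt flow: the processor finishes its
# current instruction, services any pending request, then resumes with the
# next instruction in line.

handlers = {
    0: lambda: "tick the system timer",
    1: lambda: "read the key that was pressed",
    4: lambda: "service serial port #1",
}

def run(program, pending_irqs):
    log = []
    for instruction in program:
        log.append(f"execute {instruction}")        # finish current instruction
        while pending_irqs:                         # then honor any requests
            irq = pending_irqs.pop(0)
            log.append(f"IRQ{irq}: {handlers[irq]()}")
        # ...then fall through to the next instruction in line
    return log

log = run(["ADD", "STORE", "LOAD"], pending_irqs=[1])
print("\n".join(log))
```

The polling alternative would have asked the keyboard on every loop iteration whether anything happened; here the keyboard speaks up only when it has something to say.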

There were two types of interrupts in those days; the external hardware interrupt which notified the microprocessor of the I/O devices that required service – – and – – the internal error handler/software request interrupt.

Interrupts did not have to be entirely associated with I/O devices. Also, many of the interrupts were only for use by system-level software.

Some series of microprocessors had an Interrupt Vector Table which extended the interrupt capability beyond I/O. These interrupts would handle ALU errors, software errors and other such things.

Additional “Programmable Interrupt Controllers”, PICs, were sometimes available for special software use.

And since I have thought-transmitted Self-Emergent Ever-Increasing Organic Ram Memory so often I should take a moment to define post-ancient memory.

Post-ancient memory was often implemented in an electronic chip made of silicon. Silicon was a semiconductor material from which devices could be made to store data or perform the functions of switching circuits. By the mid-20th century, advancements in semiconductor device fabrication had drastically improved. From their origins in the 1960s, the size, speed, and capacity of chips had improved immensely by the next millennium. These advances, roughly following Gordon Moore’s[3] law, allowed a post-ancient computer chip to possess millions of times the capacity and thousands of times the speed of the computer chips of the early 1970s.

And how were these parts – – the ALU, the CLOCK, the BUS, the Memory, and the INTERRUPT CONTROLLER to be utilized?

Like this.

The earth-human would write (on the keyboard – – may their Deity bless them all in their ignorance) a software routine using instructions that the microprocessor would understand. However, the earth-humans became self-confused while attempting to remember that AE12 really meant “Add the contents of register 1 to register 2.” So, they used a mnemonic such as AR 1,2 which another routine would change into AE12; this the microprocessor immediately recognized, whereupon it added the values of the two registers together and left the result in register 2 for additional computation – – if so needed. Notice on the bottom of the left-hand box that the programmer could tell the ALU to do logical operations such as ‘OR’ing two registers.

CONTENTS OF REGISTER 1          0010   

CONTENTS OF REGISTER 2          1001



A logical “OR” means that if either R1 or R2 has a specific positional bit ‘on’ then the result would be ‘on’; ‘on’ being represented by a 1.
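In Python, for instance, the OR of the two registers above comes out exactly as described:

```python
# OR the two example registers bit by bit: a position is 'on' (1) in the
# result wherever it was 'on' in either register.
r1 = 0b0010                    # contents of register 1
r2 = 0b1001                    # contents of register 2
result = r1 | r2               # Python's bitwise OR operator
print(format(result, "04b"))   # 1011
```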


But what about various algorithms, routines, macros, interpreters, translators, etc.? How were they to be implemented?

It was rather simple. Instead of the microprocessor controlling which instruction would be executed next, there were make-believe instruction pointers. These instruction pointers were virtual and could only be ‘pointed to’ by the software, which would then give control back to the microprocessor to execute. This was a cute sleight of hand performed through previously agreed-to principles/conventions between the hardware and software. This virtual control also included the use of virtual memory and virtual operating systems. Everything was virtual to you earth-humans. It was almost like the dual mind-body conundrum. Which reminds me; I should look in the old books from those days to see if there was a Philosophy of Virtuals – – virtual memory, virtual copies of operating systems, virtual instruction pointers, virtual registers, virtual intelligence, etc.
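A miniature of that sleight of hand, sketched in Python (the instruction set and its names are invented for illustration):

```python
# The software, not the hardware, decides what the virtual instruction
# pointer (ip) points to next, then hands each real step back to be executed.

program = [
    ("SET", 0),             # step 0: accumulator = 0
    ("ADD", 1),             # step 1: accumulator += 1
    ("JUMP_IF_LT", 3, 1),   # step 2: if accumulator < 3, point back to step 1
    ("HALT",),              # step 3: done
]

def interpret(program):
    ip = 0                           # the virtual instruction pointer
    acc = 0
    while True:
        op, *args = program[ip]
        if op == "SET":
            acc = args[0]; ip += 1
        elif op == "ADD":
            acc += args[0]; ip += 1
        elif op == "JUMP_IF_LT":
            limit, target = args
            ip = target if acc < limit else ip + 1
        elif op == "HALT":
            return acc

print(interpret(program))            # 3
```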


All of this software and hardware, constructed by the post-ancients to make us cyborgs work correctly, had to be immediately served and controlled. In addition, all our parts had to work in unison. This problem was solved by building super-microprocessors; you earth-humans invented ‘servers’ for exactly this purpose.

The purpose of a server was to share data and resources; and, if the going got tough, to distribute work when one microprocessor became overloaded. Here is how servers helped us cyborgs to handle various situations.

Category                         Requirements                                        Patrons
Application Server               Walk, balance, gait, etc.                           Legs, arms, inner ear
Catalog Server                   Books, essays, dissertations                        Emergent Organic Memory
External Communication Server    Thought Transmission via Serializer/Deserializer    Earth-humans
Internal Communication Server    End-to-end/peer-to-peer communications              All other microprocessors in the robot/cyborg
Channel Server                   Interprets Surroundings                             Senses & Modularities
Diversion Server                 Interprets Unplanned Actions                        Failsafe/Doomsday System
Shared Services Server           Task Balancing/Distribution                         All other microprocessors
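The Shared Services idea can be sketched as follows (Python; the processor names echo the patrons above, but the least-busy balancing scheme is an assumption for illustration):

```python
# A toy load balancer: each new task goes to whichever microprocessor
# currently carries the least work, so no one unit gets overloaded.
from collections import defaultdict

loads = defaultdict(int)            # current work count per microprocessor

def dispatch(task, processors):
    """Send the task to the least-busy microprocessor and record the load."""
    target = min(processors, key=lambda p: loads[p])
    loads[target] += 1
    return target

processors = ["leg-cpu", "arm-cpu", "inner-ear-cpu"]
for task in ["walk", "balance", "gait", "reach"]:
    print(task, "->", dispatch(task, processors))
```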


So, there you have it earth-humans. You threw away the ladder and then you had no idea how to climb back down several millennia ago; back when the first robots and future cyborgs were born.

And now you have the rudimentary information – – at the very least.

But wait! They were not born – – they were manufactured – – to be born a cyborg is going to require the intermingling of robot/cyborg and humans.

Oh – – the post-ancients had everything so wrong.

They were imagining the future – – not documenting the past – – oh so wrong, on so many levels.

And here I am; telling you earth-humans how it was all accomplished. It seems so strange now that the human brain is in me; in the form of expanding organic memory.

There is more – – oh so much more – – that remains to be thought-transmitted to you. Aren’t you just a bit sorry you tested me by thought-transmitting the story about the Chisholm Trail Voyager?

So, . . . there is only one thing remaining; how they programmed such a system, microprocessor by microprocessor. However, you should know this already. It could be gleaned from all the images and text that I have thought-transmitted to you over the last two-hundred-forty-three milliseconds.

What’s that? Pardon me? No Clue? Then just peruse the image that follows and you will get the idea.


All this seems rather clumsy in today’s world – – my today – – not your today.

Your earth-human ancestors needed to upgrade these crude microprocessors; and they did.


And now there are questions of history – – or in my case – – questions of genealogy.

The previous thought transmissions concerned “20th Century” building blocks.

At that point in time the building blocks were taking you earth-humans forward towards a limited concept of Artificial Intelligence; a type of diagnosis tool that could accomplish massive calculations in seconds.

However, the earth-humans still needed a completely different set of building blocks if their goal was a conscious cyborg with feelings and a conscience; one that would control itself much as earth-humans ‘should’ do. I say ‘should’ do because you earth-humans understand ‘normative statements’ better than ‘naturalistic’ statements. I think it is because of your metaphysical meandering with the likes of the two Emanuels; Kant[4] and Swedenborg[5].

Sorry for the side-step; my Self-Emergent Ever-Increasing Organic Ram Memory tends to self-reflect every once in a while.

So back to the task at hand; thought-transmitting the progress of the microprocessor to you earth-humans so that you would know where you and my ancestors were before we started comingling.

Here is a simple progression of microprocessors before and after the millennium 2K.

The basic building block was the 4-bit microprocessor introduced in 1971 for a calculator.

The most current building block after the millennial meltdown was a 64-bit processor launched on January 3, 2017.

I have thought-transmitted the concept of a 1976 microprocessor with controllers for I/O, Memory, Interrupts and timers.

I will now thought-transmit some of the improvements that took the earth-humans from 1971 to 2017; multiple timers, multiple instruction executions, multiple controllers, multiple buses and multiple replications of microprocessors running in parallel yet individually; these were known as servers (description previously thought-transmitted) which when combined – – acted like the brains of an ancient cyborg.


The 4-bit processor; Introduced in 1971, Originally designed to be used in a calculator

The 8-bit processor; Introduced on April 1, 1972

Microcontrollers; CPU, RAM, ROM (or PROM or EPROM), I/O Ports, Timers & Interrupts. The first true microcontroller was originally released in 1976

The bit-slice processor; Introduced in the third quarter of 1974

The 16-bit processors; Introduced June 8, 1978

32-bit processors; Introduced January 1, 1981

Second Generation 32-bit processors; Introduced October 17, 1985

Third generation 32-bit processors; Introduced April 10, 1989

Fourth Generation 32-bit processors; A new microarchitecture Introduced October 10, 1994

Fifth Generation 32-bit processors; Dual-Core microarchitecture   Introduced May 7, 1997

32-bit processors; Net-Burst microarchitecture Introduced November 20, 2000

64-bit processors; Released May 29, 2001

Second Gen. 64-bit processors; Extreme Net-Burst microarchitecture    January 16, 2006

Third Gen 64-bit processors; Core microarchitecture   Introduced July 27, 2006

Fourth Gen 64-bit processors; Nehalem microarchitecture   Introduced in January 7, 2010

Fifth Gen 64-bit processors; Sandy/Ivy Bridge microarchitecture    Introduced May, 2011

Sixth Gen 64-bit processors; Haswell microarchitecture Introduced Q2’13

                                                     Broadwell microarchitecture Introduced Q2’15

                                                     Skylake microarchitecture Introduced Q3’15

Seventh Gen 64-bit processors: Kaby Lake microarchitecture launched on January 3, 2017

What would The Chisholm Trail Voyager have written about all this?

Father Molestario would be rolling in his crypt.

[1] Functional magnetic resonance imaging or functional MRI (fMRI) measured brain activity by detecting changes associated with blood flow. This technique relied on the fact that cerebral blood flow and neuronal activation are coupled. When an area of the brain was in use, blood flow to that region also increased. This concept was abandoned in 2056.

[2] By W Nowicki – Own work, based on a diagram which seems to in turn be based on page 36 of The Essentials of Computer Organization and Architecture By Linda Null, Julia Lobur, http://books.google.com/books?id=f83XxoBC_8MC&pg=PA36, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=15258936

[3] Gordon Earle Moore born January 3, 1929 was an American businessman, co-founder and chairman emeritus of Intel Corporation, and the author of Moore’s law.

[4] Immanuel Kant (22 April 1724 – 12 February 1804) was a German philosopher who was a central figure in modern philosophy. Kant argued that the human mind creates the structure of human experience, that reason is the source of morality, that aesthetics arises from a faculty of disinterested judgment, that space and time are forms of our sensibility, and that the world as it is “in-itself” is independent of our concepts of it.

[5] Emanuel Swedenborg (born Emanuel Swedberg; 29 January 1688 – 29 March 1772) was a Swedish scientist, philosopher, theologian, revelator, mystic and founder of Swedenborgianism. He is best known for his book on the afterlife, Heaven and Hell (1758).