Tue, Feb 5, 2019
John von Neumann had incredible breadth; he was one of the great thinkers of the 20th century, and very gregarious and articulate. He worked at the Institute for Advanced Study in Princeton. One place to store memory. Von Neumann knew about Turing: Turing's paper influenced the way von Neumann was thinking about how to design a computer.
He stayed at the Institute. He died pretty young, at 54 or so, of cancer.
John von Neumann, Herman Goldstine, Arthur Burks paper
Burks describes computers in a manner very similar to the way doctors describe patients.
Burks's use of the term "organ" for the components that make up a machine makes it sound very human. Is his language appropriate for describing a computer?
Burks describes a compromise between the desire for speed of operation and the desire for simplicity or cheapness of the machine. Is one more desirable than the other? When should you choose one over the other?
Transfers into memory are of two sorts: substitutions, where the previous quantity is cleared and replaced, and partial substitutions, where a memory location-number is replaced by a new memory location-number. Why should this distinction be made?
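One way to see why the distinction matters: a partial substitution lets a stored instruction have just its address field rewritten, which is how a program can modify its own addresses. A minimal sketch (mine, not from the paper; the field widths are hypothetical):

```python
# Model a word whose low 4 bits are a memory location-number (address field)
# and whose high bits are the rest of the instruction. Widths are assumptions.
ADDR_MASK = 0xF  # hypothetical 4-bit address field

def substitute(memory, loc, word):
    """Full substitution: the previous quantity at loc is cleared and replaced."""
    memory[loc] = word

def partial_substitute(memory, loc, new_addr):
    """Partial substitution: only the location-number field is replaced,
    leaving the rest of the stored word intact."""
    memory[loc] = (memory[loc] & ~ADDR_MASK) | (new_addr & ADDR_MASK)

mem = {0: 0xA3}              # "opcode" nibble 0xA, address nibble 0x3
partial_substitute(mem, 0, 0x7)
print(hex(mem[0]))           # 0xa7: opcode preserved, address updated
```

The point of the sketch is that the two transfers need different hardware: one clears the whole word, the other must mask and merge.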
The Selectron is a storage tube planned to have a non-amplitude-sensitive switching system in which the electron beam is directed onto a plate within a small fraction of a millisecond. Light- or electron-sensitive film, wires, or tapes fulfill other storage needs; dead storage is an extension of the secondary storage medium.
In a parallel machine, the corresponding pairs of digits are added simultaneously, while in a serial machine the pairs of digits are added serially in time.
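A serial adder handles one digit pair per digit-time, carrying a single flip-flop's worth of state between steps. A rough software sketch of that behavior (my illustration, not from either paper):

```python
def serial_add(a, b, width=8):
    """Add two non-negative integers one bit pair at a time, as a serial
    machine would: a single carry bit is passed from digit-time to digit-time."""
    result, carry = 0, 0
    for i in range(width):
        ai = (a >> i) & 1
        bi = (b >> i) & 1
        s = ai ^ bi ^ carry                      # full-adder sum bit
        carry = (ai & bi) | (carry & (ai ^ bi))  # full-adder carry out
        result |= s << i
    return result

# A parallel machine does all digit pairs at once; in software that is just a + b.
assert serial_add(45, 19) == 45 + 19
```

The tradeoff is visible even here: the serial version needs only one adder circuit but takes `width` digit-times; the parallel one needs `width` adders but finishes in one.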
An example of achieving both speed and simplicity is the use of binary over decimal. As Burks states, the main virtue of the binary system as against the decimal is the greater simplicity and speed with which the elementary operations can be performed.
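A tiny illustration of that simplicity claim (my example, not Burks's): the elementary addition table a machine must realize has base² entries, so binary circuitry has far fewer cases to handle than decimal.

```python
from itertools import product

# Number of single-digit addition cases the hardware must realize in each base.
binary_table = {(a, b): a + b for a, b in product(range(2), repeat=2)}
decimal_table = {(a, b): a + b for a, b in product(range(10), repeat=2)}
print(len(binary_table), len(decimal_table))  # 4 100
```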
Burks also writes that another example is that building two machines may appear to be expensive, but since most of the cost of a scientific computer lies in development rather than production, this consideration is not so important as it might seem. Experience may show that for most problems the two machines need not be operated in parallel; in this case, we sacrifice speed for simplicity.
Wilkes’s invention, the Electronic Delay Storage Automatic Calculator (EDSAC), exhibits many of the same tradeoffs mentioned in Burks’s paper, one of which is whether to build a parallel machine. The arithmetical unit mentioned by Wilkes is a piece of equipment consisting of identical units repeated many times, much larger than that in a serial machine. According to Wilkes, control in a parallel machine is simpler than in a serial machine.
The same operations mentioned in Burks’s paper also appear in Wilkes’s paper:
- add, subtract, multiply (two orders, one for the multiplier, one for the multiplicand),
- right and left shift (any number of places),
- transfer from the accumulator to the store,
- conditional operation depending on the sign of the number in the accumulator,
- conditional operation depending on the sign of the number in the B register
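The orders above can be sketched as a toy accumulator interpreter (my sketch of the general style, not Wilkes's actual order code; the mnemonics are made up):

```python
def run(program, store):
    """Interpret a list of (op, arg) orders against a store, keeping a single
    accumulator. Orders mirror the list above: add, subtract, shifts,
    transfer to the store, and a conditional jump on the accumulator's sign."""
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "add":
            acc += store[arg]
        elif op == "sub":
            acc -= store[arg]
        elif op == "store":          # transfer from the accumulator to the store
            store[arg] = acc
        elif op == "lshift":         # shift any number of places
            acc <<= arg
        elif op == "rshift":
            acc >>= arg
        elif op == "jneg":           # conditional on the sign of the accumulator
            if acc < 0:
                pc = arg
    return acc, store

store = {0: 5, 1: 12, 2: 0}
acc, store = run([("add", 0), ("sub", 1), ("store", 2)], store)
print(store[2])  # -7
```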
Are there any enhancements made by Wilkes that were not made by Burks? What are some different design choices made by these two authors in their respective machines?
dis vs. div. The pv1 had a 4-bit field in an 18-bit word: one bit of indirection, and the rest were address bits. The machine did not have a divide instruction, but rather a divide-step instruction. Von Neumann talks about why a full divide needs enormous circuitry, whereas a one-bit divide step is much simpler: iterate it 18 times in software. At some point after the machine had been delivered, a hardware divide instruction could be obtained; for backwards compatibility, you could flip the switch down. This is a tradeoff between hardware and software.
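The iterate-18-times idea can be sketched as restoring division built from a one-bit divide step (my illustration of the hardware/software tradeoff, not any particular machine's instruction):

```python
def divide_by_steps(dividend, divisor, width=18):
    """Unsigned restoring division from an iterated one-bit 'divide step':
    each step shifts in one dividend bit and does a one-bit trial
    subtraction. A hardware divide instruction collapses this whole loop."""
    remainder, quotient = 0, 0
    for i in range(width - 1, -1, -1):
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        quotient <<= 1
        if remainder >= divisor:      # trial subtraction succeeds: keep it
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

assert divide_by_steps(100, 7) == (14, 2)
```

Each iteration is cheap circuitry (a shift and a conditional subtract); the cost is 18 passes through it instead of one instruction.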
Microcode is the building block of software: each assembly instruction translates roughly to a sequence of microcode operations. It is like iterating the divide-step instruction, except the microcode opens and closes certain logic elements. It is easier to change the microcode than to re-solder hardware; microcode is firmware. You cannot implement a rich instruction set directly in integrated circuits without too much complexity, and you don’t want compilers to be dealing with trivial instruction sets.
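A minimal sketch of that idea (entirely hypothetical names, not any real machine's microcode): each instruction is just a name for a sequence of micro-operations that open and close logic paths, so changing the table changes the machine's instruction set.

```python
# Hypothetical microcode store: instruction -> sequence of micro-operations.
# Note how DIV is literally the divide step iterated 18 times in firmware.
MICROCODE = {
    "ADD": ["open_bus_to_alu", "alu_add", "latch_accumulator"],
    "DIV": ["open_bus_to_alu"] + ["divide_step"] * 18 + ["latch_accumulator"],
}

def execute(instruction, do_micro_op):
    """Run one machine instruction by stepping through its micro-operations."""
    for micro_op in MICROCODE[instruction]:
        do_micro_op(micro_op)

trace = []
execute("ADD", trace.append)
print(trace)  # ['open_bus_to_alu', 'alu_add', 'latch_accumulator']
```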
Side Channel Attacks
These attacks read bits of memory from an unprivileged program. Patching the microcode effectively yields a new chip.