https://archive.computerhistory.org/resources/text/Ferranti/Ferranti.Sirius.1961.102646236.pdf

https://archive.computerhistory.org/resources/text/Ferranti/Ferranti.Sirius2.1961.102646235.pdf

https://en.wikipedia.org/wiki/Ferranti_Sirius

The Lost Cause

In 1962, when I was about to leave Ferranti Ltd. in Manchester, England, to take my family to the U.S.A., the company held an in-house conference to discuss the implications of the coming of integrated circuits.

Previously, after graduating in Engineering from Cambridge in 1959, I had been working on the logic design of Ferranti's first transistorised computer, the Sirius. It had 2,000 logic gates and 40,000 bits of memory (= 5,000 bytes), and sold for £25,000. I also attended a course in programming the earlier valve computer, Pegasus.

https://collection.sciencemuseumgroup.org.uk/objects/co62559/ferranti-pegasus-computer-1956-mainframes-computers

Ken Johnson (KCJ) pointed out that if one component was added to each (static R.A.M.) memory bit, which comprised about ten components, then a column of words in memory could be searched in parallel. I was dumbstruck. This meant that the whole world of digital computers was about to change. Content-addressable memory (also called associative memory) was upon us. We could ask a memory to deliver the words with a particular characteristic, without reading them out of memory one word at a time as we had had to do with previous memory technologies, for instance the then-fashionable magnetic core memory. This would massively speed up the digital computer. Obviously, since the new technology for memory was the same as the new technology for processing, we would later be able to instruct all words in memory with a certain characteristic to be modified in a prescribed way, in parallel, without even reading them out of memory (parallel processing). This would speed up the digital computer further.
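
To illustrate the idea, here is a minimal sketch in C of an associative search followed by an in-place parallel update. The loops over words stand in for what the hardware would do simultaneously; the word size, the data, and all names are illustrative assumptions of mine, not drawn from any Ferranti design.

    #include <stdio.h>
    #include <stdint.h>

    #define WORDS 8   /* a toy memory of eight 32-bit words */

    /* Associative search: compare the key against every word "at once".
       The loop models the parallel hardware compare; it returns a bit
       mask with bit i set where word i matches the key under the mask. */
    static uint32_t cam_search(const uint32_t mem[WORDS],
                               uint32_t key, uint32_t mask)
    {
        uint32_t hits = 0;
        for (int i = 0; i < WORDS; i++)     /* in hardware: all i in parallel */
            if ((mem[i] & mask) == (key & mask))
                hits |= 1u << i;
        return hits;
    }

    /* Parallel in-place modification: every matching word is updated in a
       prescribed way (here, a new value written into its low byte) without
       ever being read out through the processor. */
    static void cam_update(uint32_t mem[WORDS], uint32_t hits, uint32_t value)
    {
        for (int i = 0; i < WORDS; i++)     /* in hardware: all i in parallel */
            if (hits & (1u << i))
                mem[i] = (mem[i] & ~0xFFu) | (value & 0xFFu);
    }

    int main(void)
    {
        uint32_t mem[WORDS] = {0x10, 0x25, 0x17, 0x10, 0x99, 0x10, 0x42, 0x17};
        uint32_t hits = cam_search(mem, 0x10, 0xFFu); /* find all words == 0x10 */
        cam_update(mem, hits, 0x77);                  /* rewrite them "in parallel" */
        for (int i = 0; i < WORDS; i++)
            printf("word %d = 0x%02X%s\n", i, (unsigned)mem[i],
                   (hits >> i) & 1u ? "  (matched)" : "");
        return 0;
    }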

I departed for Los Angeles and my new job at Ampex with high expectations. At the time, I predicted that digital electronics was set to take 10% of G.D.P.

In the event, the world stuck to Von Neumann machines, with only one processor, and processing within memory was taboo for the next 40 years. Deviation from Von Neumann will probably remain banned for another 40 years, until at least the year 2040. The implications for digital electronics are disastrous. It is limited to much less than 10% of G.D.P., and numerous applications remain impossible today, for instance the simulation of global warming. That feat would be easy to accomplish with the Kernel Machine (see this website) and its one million processors working in parallel, or better still with a special, larger Kernel machine built for that particular application, with 10 to 100 million processors. Such a machine would work mostly as S.I.M.D., with one processor, with its own memory, dedicated to each square mile of the earth's surface. (Successful delivery into the marketplace of the "Catt Spiral" memory machine proved the viability of the approach.)
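
As a toy illustration of that S.I.M.D. scheme, the sketch below gives each grid cell (standing in for a square mile of the earth's surface) its own local state and has every "processor" execute the same instruction in lockstep. The grid size, the diffusion rule, and all names are my own illustrative assumptions, not a description of the Kernel Machine itself.

    #include <stdio.h>

    #define N 8   /* toy 8 x 8 grid; one "processor" per cell */

    /* One S.I.M.D. step: every cell executes the same instruction on its
       own local memory, here averaging its value with its neighbours
       (a crude diffusion rule chosen purely for illustration). */
    static void simd_step(double in[N][N], double out[N][N])
    {
        for (int r = 0; r < N; r++)         /* in hardware: all cells at once */
            for (int c = 0; c < N; c++) {
                double sum = in[r][c];
                int n = 1;
                if (r > 0)     { sum += in[r - 1][c]; n++; }
                if (r < N - 1) { sum += in[r + 1][c]; n++; }
                if (c > 0)     { sum += in[r][c - 1]; n++; }
                if (c < N - 1) { sum += in[r][c + 1]; n++; }
                out[r][c] = sum / n;
            }
    }

    int main(void)
    {
        static double a[N][N], b[N][N];     /* zero-initialised buffers */
        a[N / 2][N / 2] = 100.0;            /* a single hot cell */
        for (int step = 0; step < 10; step++) {
            simd_step(a, b);                /* ping-pong between buffers */
            simd_step(b, a);
        }
        printf("centre cell after 20 steps: %.3f\n", a[N / 2][N / 2]);
        return 0;
    }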

For a list of applications which are frustrated by this worldwide commitment to only one processor, see E. Galea, "Supercomputers and the need for speed", New Scientist, 12 Nov 1988, p. 50, or I. Catt, "The Kernel Logic Machine", Electronics and Wireless World, Mar 1989, p. 154.

Ivor Catt, 5 Jan 2001
