I - Computers were humans

The term “computer” comes directly from the long practice of human computation. Well before computers became personal, an entire profession of arithmetic laborers, mostly women, was contracted for scientific and military projects. These computers did the work of summation and logic that now feels embedded into our devices.

Computers at the Jet Propulsion Laboratory in 1936. Image copyright NASA/JPL-Caltech.
It wasn’t until the middle of the 20th century that mechanical devices could compute faster or more efficiently than people. For a long time the bottleneck was mechanical: these computers function on a series of true/false switches that must be flipped back and forth to compute and hold values. This still happens now, albeit in solid-state silicon parts, but back then the switches actually moved. Like all mechanical parts, they wore out, broke, or behaved in atypical ways. Famously, literal bugs would get into the warm machines, jamming switches, blocking paths, and causing all sorts of errors.
John von Neumann in front of his computer at the Institute for Advanced Study, Princeton University, 1945. This machine calculated numbers for the Manhattan Project. It wasn’t until WWII and the development of the atom bomb that (suddenly, extremely well-funded) computers began to seriously surpass the abilities of people. Image copyright by the Institute for Advanced Study, photographed by Alan Richards.
To understand computation at this level, it is worth spending a moment thinking about binary code and binary computing. Computers can be seen as layers of technology, with increasing complexity and abstraction as they go up. Programming can happen at different layers. Merely doing AND/OR operations inside of a logic gate is technically coding. But it is not the kind of coding that most people do when they ‘write code’. Most people now deal with ‘high level’ code, which has a layer of instructions that sit between the human and the computer.

In the 1950s Grace Hopper developed the idea of a programming language similar to English. Such a language can be independent of the circuitry of the machine, and is automatically translated to lower-level instructions by a compiler, which most programmers don’t interact with directly. However, to interact with humans in this way, computers need to represent text that humans can understand, not simply a series of true/false values. To do this, computers assign sequences of binary codes to characters. To a computer, textual data is also numerical data. We will get into the specifics of this later, but first we will look back in time, in section II - A long way to here.
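The two ideas above can be made concrete with a minimal sketch in Python (the historical machines of course did not run Python; this is just a modern illustration): bitwise AND/OR operations act directly on true/false binary digits, and every character of text is stored as a numeric binary code.

```python
# 1. At the lowest layer, computation is AND/OR on true/false values.
#    Python's bitwise operators work on the binary digits of integers.
a, b = 0b1100, 0b1010           # two 4-bit values
print(format(a & b, "04b"))     # AND -> 1000 (true only where both are true)
print(format(a | b, "04b"))     # OR  -> 1110 (true where either is true)

# 2. Text is numbers: each character is assigned a binary code.
#    Here we look up the code for 'A' and show it in binary.
print(ord("A"))                 # -> 65
print(format(ord("A"), "08b"))  # -> 01000001
```

The same pattern extends to whole strings: a sentence, to the machine, is just a sequence of these numeric codes laid end to end.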