I was reading an article somewhere (can’t find it now) about Steve Wozniak and his work on the Apple II’s original disk drive. The basic gist was that, up until then, disk drives were hugely expensive because everything had to be done in hardware. It breaks down something like this:
- “Computer Disks” were a new technology and still being figured out
- Figuring anything out takes multiple iterations
- Each iteration would require a full manufacturing cycle since everything was controlled in hardware.
- Manufacturing cycles are expensive.
- They’re also slow, meaning fewer iterations, which kept disks as something that was still being figured out.
What Wozniak did was take a bunch of the rules for how a disk starts up, reads/writes data, etc. off the actual boards of the drive and put them into the operating system’s software.
Previously, whenever a disk would spin up, seek to a certain point on the disk, read data, etc., it knew what to do because of instructions etched into its circuit board. Wozniak said f-that, and wrote code in the operating system that told the disk what to do. All that was needed from a manufacturing standpoint was a simple, generic set of instructions, and each iteration could happen in software.
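To make the idea concrete, here’s a minimal sketch (in Python, purely for illustration; Wozniak’s real work was 6502 assembly and clever controller design, and the names below are made up). The hardware exposes only a couple of dumb primitives, and all the sequencing smarts live in software, where they can be changed without another manufacturing run:

```python
class GenericDriveHardware:
    """Stands in for a stripped-down controller that only knows a few
    dumb operations -- no built-in notion of seeking or formatting."""

    def __init__(self):
        self.track = 0

    def pulse_stepper(self, direction):
        # One primitive: nudge the head one track inward or outward.
        self.track = max(0, self.track + (1 if direction == "in" else -1))

    def read_nibble(self):
        # Another primitive: return whatever byte is under the head.
        return 0xFF  # placeholder


def seek(drive, target_track):
    """The 'smart' part -- deciding how far and which way to step --
    lives here, in code, instead of being etched into the board."""
    while drive.track != target_track:
        drive.pulse_stepper("in" if drive.track < target_track else "out")


drive = GenericDriveHardware()
seek(drive, 17)        # e.g. head over to some target track
print(drive.track)     # 17 -- the sequencing happened entirely in software
```

The point of the sketch: if the seek logic turns out to be wrong or slow, you edit `seek()` and ship again, instead of spinning up a new board revision.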
It also required a huge amount of detail management by the programmer (Wozniak) and insane amounts of time to code.
Which brings me to the point.
This is the model that most of the software industry runs on. Software was adopted by big business to remove the huge up-front expense of a manufacturing cycle. This is why there’s such resistance to a lot of the best practices espoused by books like Peopleware (short version: trust your developers, and either let software take as long as it needs to or accept that it will take whatever form the schedule dictates) … businessmen think software was invented to solve the problem of “obstinate” engineers demanding enough time to do something right before millions of dollars were invested in an assembly line.