An inter-generational (mother-daughter) conversation about the history of repeating trends in the tech and software industry and how they have surfaced once more in our modern AI climate.
“AI will mean no one needs software engineers anymore!”
“We can just write the spec and the LLM will do all the coding!”
“What happens if we hire software engineers who turn out not to be able to code?”
We’ve all heard versions of these statements (and others) as AI trends seize the industry. What many of us don’t realize is that none of these are new claims or new fears. As a child, I was privileged to hear about trends, blind alleys, failures, and successes in the software industry at large from my mother, who worked as an engineer for 50 years (from the mid-1960s to the mid-2010s).
I’d like to invite the audience to join me for an interview with my mother about our industry’s history. We’ll discuss past trends, including COBOL and “4GL” languages, both of which were supposed to eliminate the need for skilled engineers; the industry’s earlier forays into offshoring software implementation; and some historical struggles to hire good engineers. Learning from our history can give us all significant insight into the likely outcomes of current trends, and point us to the most productive ways to use new technology without getting lost in hype. Even if it really is different this time, our history can show us the traps and pitfalls to avoid in a brave new world.
Key takeaways
- Understanding the history of the industry is incredibly valuable for predicting its future
- The software industry has insisted, over and over, that the latest new thing will mean we “don’t need programmers/SWEs”
- This promise has never materialized, because “writing code” is not the hard part of software engineering