We’ve mistaken complexity for progress — and forgotten how things work. A 41-year-old computer still boots instantly, while today’s “smart” tech buckles.
As with your original comment, I like your argument. :) Additionally, I dig the wall of text. A WoT, written well, leaves little ambiguity and helps focus the conversation.
I don’t disagree on any particular point. I agree that it’s a net positive for programming to be approachable to more people, and that it can’t be approachable to many while requiring Apollo-era genius and a deep understanding of technology. It would be a very different world if only PhDs could program computers.
That said, I believe the article’s author is overstating a subtle concern that is nonetheless theoretically relevant and important to explore.
If, over the fullness of decades, programming becomes so approachable (i.e., you tell an AI in plain language what you want and it makes it flawlessly), people will have less incentive to learn the foundational concepts required to make the same program “from scratch”. Extending that train of thought, we could reach a point where a fundamental “middle technology” fails and there simply isn’t anyone who understands how to fix the problem. I suspect there will always be hobbyists and engineers who maintain esoteric knowledge for a variety of reasons. But with all the levels of abstraction and failure points inadvertently built into code over so much time, it’s possible to imagine a situation where essentially no one understands the library, or the language it was written in, that a core dependency depends on decades later. Not only would it be a challenge to fix, it could be hard to find in the first place.
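To make the “hard to find” part concrete, here’s a toy sketch in Python (every package name is hypothetical): an app only declares its direct dependencies, so tracing a failure in a decades-old leaf library means walking the whole transitive graph, and real graphs have thousands of nodes rather than six.

```python
# Hypothetical dependency graph -- every name here is made up.
DEPENDS_ON = {
    "cocktail_app":       ["web_framework", "payments_sdk"],
    "web_framework":      ["http_lib"],
    "payments_sdk":       ["settlement_core"],
    "http_lib":           [],
    "settlement_core":    ["legacy_decimal_lib"],  # written decades ago
    "legacy_decimal_lib": [],
}

def paths_to(root, target, path=()):
    """Yield every dependency chain leading from root down to target."""
    path = path + (root,)
    if root == target:
        yield path
        return
    for dep in DEPENDS_ON.get(root, ()):
        yield from paths_to(dep, target, path)

# Suppose the decades-old leaf library is the thing that breaks:
for chain in paths_to("cocktail_app", "legacy_decimal_lib"):
    print(" -> ".join(chain))
# cocktail_app -> payments_sdk -> settlement_core -> legacy_decimal_lib
```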
If the break happens in your favorite cocktail recipe app, it’s inconvenient. If the break happens in a necessary system that fintech relies on to move people’s money from purchaser to vendor to bank and back again, the scale and importance of the break is devastating to the world. Even if you can seek out and find the few who have knowledge enough to solve the problem, the time spent with such a necessary function of modern life unavailable would be catastrophic.
If a corporation, in an effort to save money, opts to hire a cheap ‘vibe coder’ in the ’20s, and something they ‘vibe’ winds up in important stacks, it could build fault lines into code that may be relied on for who-knows-what decades from now.
There are a lot of ifs in my examples. It may never happen, and we’ll simply get the advantage of all the ideas that accessibility makes it possible to realize. However, it’s better to think about it now than to contend with it all at once when a catastrophe occurs.
You’re right that doom and gloom isn’t helpful, but I don’t think the broader idea is without merit.
There are some actual real-life examples that match your theoretical scenarios, but the missing piece is the scale of the consequences. What has generally occurred is that the fallout from the old thing failing wasn’t that big a deal, or that a modern solution could be designed and built to completely replace the legacy one, even without a full understanding of it.
A really, really small example of this is from my old 1980s Commodore 64 computer. At the time, it used a very revolutionary sound chip to make music and sound effects, called the SID chip. Here’s one of them, constructed in 1987.
It combined digital technologies (which are still used today) with analog technologies (which nobody makes in the same way anymore). Sadly, these chips also have a habit of dying over time because of how they were originally manufactured. With the supply continuously shrinking, there were efforts to come up with a modern replacement. Keep in mind these are hobbyists. What they came up with was this:
This is essentially a whole Raspberry Pi computer that fits in the same socket of the 1980s Commodore 64; it accepts the incoming music instructions from the computer and runs custom-written software to produce the same output as the legacy digital/analog SID chip designed in 1982. The computing power in this modern replacement is more than 30x that of the entire Commodore 64! It could be considered overkill to use so much computing power where the original didn’t, but again, compute is dirt cheap today. This new part isn’t expensive either: it’s about $35 to buy.
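For a sense of what that replacement does, here’s a minimal single-voice sketch of the idea in Python: latch the register writes arriving through the chip socket, then synthesize matching audio in software. This is my illustration, not the actual firmware; a real replacement emulates all three voices, the envelopes, and the analog filter. The pitch formula (f_out = Fn * clock / 2^24) is the one from the SID datasheet.

```python
CLOCK_HZ = 985_248       # PAL C64 system clock
SAMPLE_RATE = 44_100     # output audio rate

class Voice:
    """Toy model of one SID voice: frequency registers plus a gate bit."""

    def __init__(self):
        self.freq = 0        # 16-bit frequency register ($00 lo / $01 hi)
        self.gate = False    # gate bit of the control register ($04)
        self.phase = 0.0

    def write(self, reg, value):
        # Mimic the CPU poking SID registers through the chip socket.
        if reg == 0x00:
            self.freq = (self.freq & 0xFF00) | value
        elif reg == 0x01:
            self.freq = (self.freq & 0x00FF) | (value << 8)
        elif reg == 0x04:
            self.gate = bool(value & 0x01)

    def render(self, n_samples):
        # Datasheet pitch: f_out = Fn * clock / 2**24.
        f_out = self.freq * CLOCK_HZ / 2**24
        out = []
        for _ in range(n_samples):
            # Naive square wave while the gate is open, silence otherwise.
            out.append((1.0 if self.phase < 0.5 else -1.0) if self.gate else 0.0)
            self.phase = (self.phase + f_out / SAMPLE_RATE) % 1.0
        return out

voice = Voice()
voice.write(0x00, 0x45)  # Fn = 0x1D45 = 7493 -> ~440 Hz (concert A)
voice.write(0x01, 0x1D)
voice.write(0x04, 0x41)  # pulse waveform + gate on
samples = voice.render(SAMPLE_RATE)  # one second of audio
```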
This is what I think will happen when our legacy systems finally die without the knowledge left to service or maintain them: modern engineers using modern technologies will replace them, providing the same function.
I certainly hope so! Human ingenuity has gotten us here. I’m interacting with you across who knows how much distance, using a handheld device that folds up.
…but just because we’ve gotten ahead of trouble and found solutions thus far doesn’t mean that an unintended bit of code, a hardware fault, or a lack of imagination can’t cause consequences further down the road.
I appreciate your optimism and pragmatic understanding. You seem to be a solution-driven person who believes in our ability to reason and fix things. We’ll definitely need that type of attitude and approach if and when something goes sideways.
…but just because we’ve gotten ahead of trouble and found solutions thus far doesn’t mean that an unintended bit of code, a hardware fault, or a lack of imagination can’t cause consequences further down the road.
Absolutely true.
I guess my thought is that the benefits of our rapid growth outweigh the consequences of forgotten technology. I’ll admit, though, that I’m not unbiased; I have a vested interest, since I do very well professionally bridging some older technologies to modern ones myself.