I have learned that when coding something complex, once you get bogged down in issues and start losing track of what's done, the urge to just wipe it all and start over is often overwhelming.
And often, a bad idea.
Well, perhaps not with a specific program, but in general the existing code has already had a lot of bugs worked out. If you wipe that all away and start fresh, you will often face the same sorts of issues and have to spend time figuring out the solutions - again.
That's part of why tech seems to just keep adding layers, instead of reworking a program in its entirety. There are times when a complete redo is a good idea, but most of the time nobody wants to spend the resources redoing something that is already getting the job done.
And so you get layer, on top of layer, on top of layer, until eventually it looks like that xkcd comic.
What's interesting to me is that over and over and over again I see this urge to simplify things, to reduce the complexity, to somehow make it easier to manage.
Except what tends to happen is that yet another layer gets placed on top, one that looks prettier and perhaps consolidates all the things you need to monitor... but it doesn't actually change the underlying structure, and in fact can even obfuscate it a bit more.
Which is all fine and dandy when things work, but someone still has to understand that structure - especially when something breaks.
There are a ton of tools for automating what we do, and for the most part I love them. If you have some complex task that involves doing hundreds of different things in the exact same order, the exact same way, every time - automate it! People are prone to error, they're one bad day or one interruption from skipping a step or repeating a step or adding the wrong input, and if you can build a pipeline or a script that does it all correctly, it's a great time saver.
But someone has to know the process well enough to create that pipeline or script. And if something needs to be updated, upgraded, or modified, then someone has to know it well enough to make those changes too.
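That value of scripting a fixed ordering can be sketched in a few lines. This is a minimal, hypothetical example (the step names are made up for illustration): the machine runs every step in the exact same order, every time, and stops at the first failure instead of skipping ahead.

```python
# A minimal pipeline runner: executes steps in a fixed order and
# halts at the first failure, so no step is ever skipped or repeated.
# Step names and steps here are hypothetical placeholders.

def run_pipeline(steps):
    """Run (name, callable) pairs in order.

    Returns (completed_names, failed_name); failed_name is None on success.
    """
    completed = []
    for name, step in steps:
        try:
            step()
        except Exception:
            return completed, name  # stop immediately; don't plow ahead
        completed.append(name)
    return completed, None

steps = [
    ("check prerequisites", lambda: None),
    ("back up config",      lambda: None),
    ("apply update",        lambda: None),
]
done, failed = run_pipeline(steps)
```

Of course, writing those three lambdas correctly is exactly the part that still requires someone who knows the process.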
In other words - automation can't really replace the need for human expertise. Not completely.
Looking back, I feel like something similar happens with government programs and privatization.
The main belief I heard, growing up, was that 'governments have no incentive to improve, whereas competing businesses do. Therefore private businesses are more efficient, and it's better to privatize where you can.'
But... that's only true in certain circumstances. There are certain public goods that benefit society as a whole, but don't have great incentives for private companies.
I was thinking about that because I picked up Careless People (in a great example of the Streisand effect), and was reading about Facebook's reasoning for internet.org. Basically they realized that if they wanted to expand their userbase, more people needed to have access to the internet. So they created an organization that was supposed to encourage basic internet access.
Kind of like how in 1954 the NTCA worked to encourage telephone access in rural areas. Except it's important to note that the government helped a lot with funding these efforts.
In the private sector, the cost of connecting far-flung communities isn't really worth the amount of money you can make from adding those customers to your business. Not unless you get the government to help tip the scale a bit.
And yet, although the wiki indicates that they did indeed cooperate with some governments for this, in the book it seems the initial focus was on building drones, satellites, and lasers to deliver the internet.
No talk at all about creating a program like the NTCA.
Perhaps they thought those drones, satellites, and lasers would help create a profitable business? It does sound a bit like Starlink now that I think about it. I don't think Starlink has yet reached the point where the profits outweigh the costs involved in building the infrastructure...
But I'll leave that to the corporate folks. The part I wanted to point out was that, at least initially, there didn't seem to be any realization that this is actually where you would want government involvement. That the public good of internet access is not something that private companies are likely to find profitable. Not once you get past dense populations where building the infrastructure easily brings in enough customers to pay for itself.
There is no universal truth here: private businesses aren't always better, and neither are government-funded organizations.
They are different tools in the toolbox.
This also, in some ways, reminds me of my experiences in Iraq. Because in the interests of keeping the military small, quite a few services were contracted out to private companies. Supposedly that saves money in the long run, but when you spend a good decade or more in another country... you generally end up paying more for those privatized services than you would if you'd kept them in house.
That's not even getting into the security concerns, especially when all the dining facilities are run by private companies who hire foreign nationals because their labor is cheap.
But that's a whole other conversation. What I really wanted to get at was this - even with privatization, the government still had oversight duties.
You don't get to just say 'here's a contract, have at it.'
Sure, they may be 'competing' with other private businesses, but ultimately the government is the funder and the one who decides which one gets the contract. And if they're not ensuring the company is doing the work as expected, even those private companies are prone to waste, fraud, and abuse.
This might not seem related to how I started this post, but I think at the core is a similar urge - simplify and abstract away the complexity.
Except it doesn't ever go away, not really.
It's just hidden, and now you've got a bunch of people running around acting knowledgeable when all they have is a surface level understanding.