I have thought about this article for a while. I would start it in my mind, scratch it, and then restart it. I would write it, delete it, and rewrite it. It has been a never-ending circle of agonizing over minutiae and never actually releasing anything. The long gaps between articles on this site come down to this exact process. Fuck it. Here goes.
There’s a really big problem in tech that no one really wants to admit. Left, right, or other, the entire tech community refuses to face it: hardware folks, software folks, and everyone in between. The lone voices have been the grey beards, who are largely ignored with “okay boomer” or similar epithets, whether or not they are boomers or match any epithet applied to them. Ageism, elitism, and many other “isms” are used to dismiss these folks, who actually do have valid arguments regardless of prevailing sentiments. For my own part, while I have been aware of this problem, I have vacillated and never done anything about it. What is this problem? Waste.
Software developers will make excuses for using Python and including 9001 packages that they haven’t actually reviewed or audited, and they will say “well! everyone knows that C is just unsafe! I had to use Python! It was faster, and I can just include all this code, and it is safe!” So, rather than be diligent, debug, and go through a QA process like an adult, the programmer took an easy route to memory safety and possibly opened 9001 vulnerabilities by pulling in 9001 software packages that hadn’t been vetted by the programmer or by the programming team in question. The reality is that the programmer, like most programmers, is handed a landscape of powerful computers with ample resources and simply doesn’t want to do the work to make something lean and functional. Also, the argument that C is unsafe is usually easily destroyed, not just by the constant use of unvetted code (from npm, gem, composer, cargo, pip, cpan, or from frameworks) but also by the fact that most languages are themselves written in C (shut up, rusters, I know Rust was initially written in OCaml and then rewritten in Rust). What one fool can do, so can another, and blaming your tools for being bad does not explain why C is good enough for your language but not for your project.
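To appreciate the scale of the problem, it helps to actually count what you are vouching for. Here is a minimal shell sketch; the lock file below is a made-up stand-in so the example is self-contained (in a real project you would feed it the output of `pip freeze`, your package manager’s lock file, or similar):

```shell
#!/bin/sh
# Count the packages a "simple" project actually pulls in.
# The lock file here is a fabricated stand-in for illustration;
# point this at your real requirements.txt or lock file output.
lock=$(mktemp)
cat > "$lock" <<'EOF'
flask==2.0.1
jinja2==3.0.1
werkzeug==2.0.1
click==8.0.1
itsdangerous==2.0.1
markupsafe==2.0.1
EOF

deps=$(grep -c . "$lock")   # one package per non-empty line
echo "packages you are vouching for: ${deps}"
rm -f "$lock"
```

Every line in that count is code someone else wrote that ships under your name. If you have not read it, you have not audited it.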
This sense of “ease”... yeah, it’s easier not to really write your own software and to just string together other people’s work. That is called scripting. I do it every day in Bash as part of my job. I also do it to automate workflows. A sufficiently advanced script will eventually approach a program in its own right. This isn’t a problem if the script is performant, uses trusted code, and isn’t constantly running. It won’t add to waste.
What waste am I talking about? Two kinds: electrical power and physical material. All these frameworks, all this untrusted code inclusion, all this use of less performant languages... it contributes to code bloat, which in turn uses more electricity. That means more resources wasted and more pollution created, and the cycle continues because the power infrastructure feeding that code has to be maintained. We don’t often think about this for an individual computer, but when you consider the number of computing devices in use around the globe, it becomes rather staggering. Code is driving this: hardware investment has increased because bad software required the increase in hardware. Millions and millions of people did not upgrade their computers because the computers stopped working. They upgraded because the computers became “slow” due to software bloat. Sadly, millions of computers were thrown in trash heaps at each generation of computing, merely because software demanded more resources.
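To put a rough number on the electricity argument, here is a back-of-the-envelope sketch in shell. Both inputs are assumptions for illustration, not measurements: an assumed 10 watts of extra draw per machine from bloated software, and an assumed two billion PCs in use worldwide.

```shell
#!/bin/sh
# Back-of-the-envelope only: both figures below are assumptions,
# not measured data.
extra_watts=10          # assumed extra draw per machine from bloat
machines=2000000000     # assumed ~2 billion PCs in use worldwide
hours_per_year=8760

# Divide first so the arithmetic stays well within integer range.
extra_gw=$(( extra_watts * (machines / 1000000000) ))
extra_gwh=$(( extra_gw * hours_per_year ))
echo "${extra_gw} GW continuous, ${extra_gwh} GWh per year"
```

Under those assumed numbers, a mere 10 watts of per-machine waste adds up to 20 gigawatts of continuous draw, on the order of a couple dozen large power plants running for nothing.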
The computers were not at fault. The software companies were. I can run AutoCAD, spreadsheet software, word processors, games, email, web browsing, and more on an 8088 with an 8087 coprocessor, a 64 megabyte hard disk, and about 960K of RAM. I may not be able to do all of those things at once, but really... I am a human. I can do only one thing at a time. Having more things open is a convenience, not truly a necessity. Of course, I am not advocating a return to the IBM XT; I am simply making a point. Everything we now do was possible in the past on far more modest computing machinery.
The point of this example is that people are getting more hardware than their needs require, and that powering that hardware is wasteful. A machine that uses less power is likely adequate to their tasks. Currently, most people could get by on a Raspberry Pi with 4GB of RAM. It’s a modest machine, but it can do everything the vast majority of people require. It is less powerful than many leading smartphones, which tells us a bit about smartphone waste as well.
The hardware manufacturers are not much better than the programmers, of course. Rather than innovate in a meaningful way (which is expensive), they would rather make small iterative advancements, push them out as "magical", and sell you another iTrinket that doesn't do anything your old one didn't do. The old one will then be thrown in the garbage, the precious metals and energy wasted in creating it forgotten, and more land will be strip-mined to make your next new iTrinket.
This physical trash problem is no joke: 53.6 million metric tons of e-waste were generated in 2019 alone. That’s enough to fill 350 cruise ships of decent size. This is what is happening. Less performant code is actually bad for the planet, bad for people, and bad for economies. All of that labor and all of that material could be used to enrich lives in meaningful ways. Instead, it is being used to make social media platforms that divide us, games that are addictive in ways similar to gambling, a constant barrage of propaganda from streaming services and glowing screens, and porn that minimizes relationships and over-sexualizes society. Is this really what we wish to be feeding? Is this really what we want out of humanity? The escapes that people seek from the crushing realities of life make those crushing realities far worse than they would otherwise be. Our addiction to escape creates more need for that escape. It’s heroin on a global scale.
Lefties are willing to use paper straws that wilt and make their drinks gross, but talk about the plastic in their smartphones and cameras (not to mention the wild land put under cultivation for more paper)? Oh hell no! The right wing will talk about the need for energy independence, but you want to talk about reducing the energy required by all those data centers? Oh hell no! Both are serious issues, and both should be talked about.
The latest framework may help you produce a deliverable, but it is also going to contribute to e-waste. The problem starts with the software and systems engineers. Software developers need to write lean code, and systems engineers need to take the time to configure, performance-tune, and capacity-plan, and then provision hardware accordingly. Programmers need to take the time to make things more performant. Hardware engineers need to make things modular and upgradable. Companies need to find new revenue streams rather than convincing people to toss out perfectly good hardware.
The constant fanboyism over new orchestration systems like Puppet, Chef, Ansible, and Salt is no better. The fanboyism over Docker and Kubernetes is no better. The fanboyism over the cloud and virtual machines is no better. These are all overhead. If you are using an orchestration framework, you’re adding machines to the pool that will be constantly on, waiting for your command. You are also not tuning per machine; you’re applying configurations in bulk, which drives up power consumption (there are people who take that into account and plan accordingly, but I am generalizing because I am talking about waste in the aggregate). Containers and virtualization are largely used because so-called “safer” languages and frameworks produce code that cannot be trusted and therefore must be isolated. Those containers and virtualization schemes add to the power consumption problem, especially as the machines are over-provisioned to allow for future spin-up of more virtual machines or containers! If you are using Docker because you’re too lazy to learn how to install and configure something... you really need a new job.
Likewise, the fact that you are pushing datasets that saturate your connection should be considered. The fact that your code is bad and requires legions of servers should be considered. Did you really need all of those machines, or did your languages, packages, frameworks, and datasets decide that you needed them while you were unwilling to make the investment required to fight them? Through the systemic laziness of this industry, we have added to a series of global problems, and at some point we will all be forced to pay the price.
© MMIX - MMXX, absurd.wtf