One area that is often overlooked in the software industry is the “social cost” of building and improving software. While most of us who work as software developers spend our days figuring out the coolest new features to add and how best to delight our customers, we rarely sit down and think about what all this extra code means to the world around us. In particular, is what we’re adding actually useful? Are the CPU cycles taken to execute that code repeatedly (and, if you’re truly successful, on millions upon millions of computers worldwide) really necessary?
We’re all familiar with one form or another of performance work in software: setting goals for what counts as “reasonable” latency and throughput for a particular set of operations. But what if we took this optimization process a step further and applied a second filter, one of ecological prudence, to the mix?
Say, for example, you have a program that distributes some large collection of data to all of its clients. Rather than sort the data on the server side, you leave the sorting to the clients. In theory this isn’t a huge deal, but let’s also say this process is repeated hundreds of times a day across tens of thousands of clients. What starts out as a minor (and probably trivial-seeming) design decision can begin to add up to a lot of extraneous CPU cycles spread across tens of thousands of machines, meaning lots and lots of watt-hours of electricity being chewed up needlessly every day.
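To make the two options concrete, here’s a minimal Python sketch of that design decision; the function names and data below are hypothetical stand-ins, not drawn from any real system:

```python
# A minimal, hypothetical sketch of the two designs. fetch_records()
# and send_to_client() are illustrative stand-ins, not a real API.
import random

def fetch_records():
    """Stand-in for loading the large collection of data."""
    return [random.random() for _ in range(1_000_000)]

def send_to_client(payload):
    """Stand-in for serializing and shipping data to one client."""
    pass

NUM_CLIENTS = 10_000

# Option A: sort once on the server; every client receives sorted data.
records = sorted(fetch_records())
for _ in range(NUM_CLIENTS):
    send_to_client(records)

# Option B: ship the data unsorted; each of the 10,000 clients then
# performs the same O(n log n) sort on its own copy, repeating
# identical work tens of thousands of times.
```

From the user’s point of view the two designs are functionally identical; the difference only shows up when you count the total work done across the whole fleet.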
Chances are your perf tests won’t hiccup over such a design choice, but the difference in aggregate power consumption between the two approaches (one where the sort is performed a single time on the server versus once on each of n clients) is hard to overstate.
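To put rough numbers on that difference, here’s a back-of-envelope estimate. Every figure in it (one CPU-second per sort, 50 watts of extra draw, 100 runs a day) is an assumption picked purely for illustration:

```python
# Back-of-envelope estimate; all numbers are illustrative assumptions.
SORT_SECONDS = 1.0    # CPU time for one sort of the collection
EXTRA_WATTS  = 50.0   # additional power draw while the CPU is busy
CLIENTS      = 10_000
RUNS_PER_DAY = 100    # "hundreds of times a day", conservatively

# Every client sorts for itself:
client_side_joules = SORT_SECONDS * EXTRA_WATTS * CLIENTS * RUNS_PER_DAY
# The server sorts once per run:
server_side_joules = SORT_SECONDS * EXTRA_WATTS * RUNS_PER_DAY

def kwh(joules):
    return joules / 3_600_000  # 1 kWh = 3.6 million joules

print(f"client-side: {kwh(client_side_joules):.2f} kWh/day")   # ~13.89
print(f"server-side: {kwh(server_side_joules):.4f} kWh/day")   # ~0.0014
```

Under these made-up numbers, the client-side design burns roughly ten thousand times the energy, every day, for the exact same result.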
Now, this isn’t to say we all need to go re-think every piece of software ever written; that would be absurd and probably impossible. But something we should accustom ourselves to doing, as designers and implementers of software systems, is thinking about how our designs consume power at scale. Are there places where large computations are performed repeatedly and redundantly? The example of pre-sorting data before sending it to clients is an obvious oversimplification, but similar “waste patterns” can emerge in many systems.
After spending a number of years building software that ran on millions of computers, I became acutely aware of just how many of these waste patterns exist out there. You could do what I ended up doing and take a job at Verdiem, where I build software that actively reduces energy consumption; a less drastic approach is simply to embrace waste-conscious design practices (many of us are already doing this, often unknowingly).
As you find these waste patterns in your designs, take note of them: share them with your colleagues, write about them on your blog, and tell the world what you did to reduce waste and why it can really matter (and, more often than not, why getting rid of the pattern didn’t end up tanking performance or killing your software’s functionality). As more and more developers become aware of and embrace waste-conscious design practices, the aggregate energy savings will multiply: less pollution, fewer new power plants, lower energy costs. Once again, that helps reduce the hidden “social cost” of building software.