The Software Crisis, 2012 edition

This morning’s breakfast experiment involves rough estimation of the changes in cost and capacity of computing power over the last few years, and imagining where that will leave us in ten years.

In summary: hardware is already superpowered. The gap between hardware and software is already huge. It’s only going to get worse. What is to be done about the parlous state of software?

SD Cards

Rough googling suggests a 512MB card cost about $60 in 2005. That’s around $68 in 2012 dollars. A commodity 32GB card costs about $25 today. Assuming exponential decline in price, or equivalently exponential increase in capacity for constant cost, we see that we can expect capacity per dollar to roughly double year on year.
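As a sanity check, the implied annual factor can be computed from those two data points (a quick sketch using the rough prices quoted above):

```python
# Annual growth factor in SD-card capacity per dollar,
# from the two rough data points above (2005 vs 2012).
mb_per_dollar_2005 = 512 / 68         # 512 MB for ~$68 (2012 dollars)
mb_per_dollar_2012 = 32 * 1024 / 25   # 32 GB for ~$25
ratio = mb_per_dollar_2012 / mb_per_dollar_2005
annual = ratio ** (1 / 7)             # 7 years between the data points
print(round(annual, 2))               # ~2.09, i.e. roughly doubling each year
```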


CPUs

More rough googling suggests CPU capacity (measured in GFLOPS) is increasing at roughly a factor of 1.6 per year. GPUs are improving more quickly, approximately doubling in speed each year.


RAM

Wikipedia informs me that in 1971, DRAM cost 5c per bit, and in 1999 it cost 20µc per bit. Again assuming exponential scaling, that gives approximately a 1.56× year-on-year increase in capacity per dollar.
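The same back-of-envelope calculation works here (a sketch using the two Wikipedia figures above):

```python
# Annual improvement factor for DRAM cost per bit, 1971 -> 1999.
cost_1971 = 0.05            # dollars per bit (5 cents)
cost_1999 = 20e-6 * 0.01    # 20 microcents per bit, in dollars
ratio = cost_1971 / cost_1999             # 250,000x cheaper per bit
annual = ratio ** (1 / (1999 - 1971))     # compounded over 28 years
print(round(annual, 2))                   # ~1.56
```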



In five years' time, expect to be working with machines that are ten times as fast, that have ten times as much RAM, and that have thirty-two times as much secondary storage as today's machines.

In ten years' time, expect machines one hundred times as fast, with one hundred times the RAM and one thousand times the storage.
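Those predictions are just the annual factors above compounded over five and ten years (a sketch; the figures are the rough estimates from the previous sections):

```python
# Compound the estimated annual improvement factors over 5 and 10 years.
factors = {"CPU": 1.6, "RAM": 1.56, "storage": 2.0}
for name, f in factors.items():
    print(name, round(f ** 5), round(f ** 10))
# CPU:     ~10x in 5 years, ~110x  in 10
# RAM:     ~9x  in 5 years, ~85x   in 10
# storage: 32x  in 5 years, 1024x  in 10
```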

I’m not even sure how to measure progress in software. But my impression is that it isn’t keeping up its end of the bargain. Perhaps we’re seeing linear improvement, at best.

I think a big part of the problem is that our ambition hasn't increased to match our capacity. We haven't kept our expectations of software in line with the ability of our new hardware.

Comments (closed)
Daniel Yokomizo 13:59, 30 Jul 2012

By some measures, software development has improved by two orders of magnitude since the seventies. Parnas used KWIC as a programming example in "On the Criteria To Be Used in Decomposing Systems into Modules" and remarked "such a system could be produced by a good programmer within a week or two", but nowadays a good programmer would do it in under an hour, due to better languages, better libraries, and improvements in programming techniques (e.g. using proper abstractions, functional composition, etc.).

Some progress is impossible: certain algorithms developed earlier cannot possibly be improved. But we have had many improvements and discoveries of new algorithms and data structures (e.g. Okasaki's work) that sped up some programs by orders of magnitude. Another example is the contrast between ZFS or BTRFS and FAT.

Our software isn't much faster either because we couldn't improve it due to algorithmic limitations (e.g. it's impossible to sort better than O(n log n) in the general case), or because we do so much more than we previously did (e.g. FAT vs ZFS), so speed is lost to additional features.

Adrian Kuhn 02:01, 29 Nov 2012

I dare to disagree that there's no progress in software. Compared to ten years ago, programming today has become so much more painless. Programmers get their questions answered within seconds on Stackoverflow, rather than waiting for days in newsgroups that would rather lecture you that "you're" is not "your". Programmers share their code on GitHub with the whole world and fork and merge with a simple click, rather than struggling with CVS or not using version control at all. Languages come with a plethora of APIs and libraries that hadn't been around back then.

New languages have done away with compile and deploy cycles: all it takes is save and reload and my changes are on the client. Et cetera. Take for example D3.js, it's so painless to use. I can deploy my code to the cloud, where it runs on virtualized machines, software enabling write once, run everywhere. App stores have solved the problem of shareware, bringing customers and software makers together. Agile has brought us methodologies that actually work and scale.

I could go on forever. To me all that feels like a factor of 10x, or even 100x if you'd been doing Java back then. There is no software crisis.