Posted by Pete McBreen 07 Feb 2010 at 17:26
Sometimes it seems that while we were not looking, things changed.
Not too many years ago -
- Hardware was the largest part of any software project budget. Now, unless you are working at a massive scale, the cost of the computing hardware is a rounding error on the bottom line.
- Scripting languages were said to be too slow for use on real projects, but the web has well and truly demonstrated that this is false.
Not too sure how this happens, but it seems that when we first learn about something, those ideas stick, and it is hard to update what we know to match current reality. When I started commercial software development, it was common to build systems on a PDP-11 with under 512KB of RAM. These days a laptop comes with at least 2GB of RAM, an increase in main memory by a factor of roughly 4,000, but sometimes I still catch myself trying to save a few bytes when designing some aspect of a system.
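A quick back-of-the-envelope check of that factor of 4,000 (a sketch, using the 512KB and 2GB figures from the text):

```python
# Memory growth: PDP-11 era machine vs a circa-2010 laptop.
pdp11_ram_kb = 512               # under 512 KB of RAM
laptop_ram_kb = 2 * 1024 ** 2    # 2 GB expressed in KB

factor = laptop_ram_kb / pdp11_ram_kb
print(factor)  # 4096.0, which rounds to the "factor of 4,000" above
```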
The open question for now is how to detect this type of slow change (even if the pace of technological change is not all that slow compared to other changes). This is an important question because many societies and groups have been hit by surprises that in hindsight are obvious, with catastrophic consequences:
- When trees are being cut down in an area, at what point does the population realize that deforestation is a serious problem?
- When does a drought become a climate shift that means the area is no longer amenable to the current mode of agriculture?
- When does the exploitation of fish in a fishery result in the collapse of the stocks in that fishery?
On the technology side, when do desktop application developers get overtaken by web applications running in a browser? Functionality-wise, we can deliver nearly equivalent functionality over the web provided we have the bandwidth, so maybe it is time to recreate departmental applications as web applications?