Back in 1863, a load of men with spades, horses and rope dug up the road between Paddington Station and Farringdon in London and built the world's first underground railway. There were no diggers, no laser levels, no computer simulations and no trains to take all the dirt away. Fast forward to the present day and Crossrail is nearing completion in London, covering much of the same alignment, with additional branches to Docklands and Liverpool Street, in a very large and complex east-west train line.

So what? Well, in inflation-adjusted terms the per-mile cost of Crossrail is twice that of the original line dug with spades! Twice as much, despite more than 150 years of supposed progress in technology. Sure, it's not quite the same thing, and Crossrail has more safety hoops to jump through, but how does that happen? If we burned all the technology, could we have built Crossrail for half as much money?

I read a similarly depressing article about modernising the signalling on the New York subway. Sums like $200 million were being talked about for systems that were incomplete, only semi-functional and didn't cover the whole network. $200 million? I'm pretty sure a company of 50 people could have manually installed all the equipment for a tenth of that money. We're not talking about some futuristic technology, just transponders and software to coordinate them. And this on top of the fact that train networks worldwide are already controlled by signalling systems that have long since been developed; New York isn't doing anything different.

I wonder whether the reality is that technology always promises so much but, in practice, doesn't make most things any more efficient. An email system is just a way of wasting 20 minutes writing down what could have been said in 30 seconds on the phone (remember those?).

Take a step back, though. Why is it like that? Because we're people, and we like to pretend that we know what's going on. Try telling a customer that it would be far more cost-effective to change their business processes to match the way some off-the-shelf software already works and they'll tell you to clear off. They would rather change the functionality of the software, or write something completely bespoke, with all the costs that entails, both up front and in maintenance, even though the chances are it will all need to change in five years anyway, at which point we do it all again.

I worked for a company that built a mortgage system for a large bank, and it was pretty complex: lots of external services to talk to, lots of users to support. It took a few years and still cost less than £10 million. How do New York and a thousand other organisations spend such heinous amounts of money, very often getting very little in return (cancelled or curtailed projects, technology that is out of date as soon as it's released)?

The most important question, though, is: "What on earth keeps going wrong?" I read about Obamacare and how the original healthcare.gov system cost something like $50B and failed miserably. They brought in a lot of clever people who sorted it out and ended up with a working system that cost $2B. Where did the other $48B go?

Are we lacking tools? Expertise? Management? Experience? Maybe we're still applying 200-year-old project-management principles to software, where people think it's OK to change the requirements halfway through, and maybe we need a completely new approach?

Or maybe the benefits of technology are mostly a fallacy: they open up additional possibilities without necessarily improving what we already have?