Learning from our mistakes? Not a chance!
There is a very famous saying that we "learn from our mistakes". What a cute, pithy saying, and of course completely untrue. How many times have we got drunk after promising we would not do it again? How many times have we stopped exercising even though we have already learned that exercise is good? How many times have I copied and pasted code, including the bits I should have changed in the process but didn't? Loads of times!
Every few weeks, we read another story in the paper about some kind of abuse scandal or corruption probe, or another case where a government official, council worker or teacher has made some monumental cock-up costing time, money and embarrassment. One thing that is clear amid the claims of "we are investigating how we can avoid this happening again" is that human beings are fairly useless at learning from mistakes - certainly from others' mistakes, but even from our own.
Why is this? Well, there are lots of reasons. Often the right choice is outweighed by some more powerful force like laziness, tiredness or greed. Often, in my opinion, there are simply a lot of mediocre people in jobs who either don't really care, or who lack the ability to spot where improvements can be made (before something bad happens) or to do their job well. But there is another reason I want to mention, without which this would sound like a fairly downbeat post about a mostly un-winnable situation: we are also terribly bad at sharing information.
Sharing information sounds fairly straightforward, but it isn't. One reason is that the truth is not always known objectively - there might be two or more opinions on the correct way to do something. Another big issue is that it is hard to present knowledge in any format other than pure information, which is the least effective at inspiring people to follow and learn from it.
Let's take software development. Two of the most effective ways of ensuring quality in software development are code reviews and release checklists. Have you spell-checked the messages? Have you added the translations to the translations database? Have you signed off x and y and informed the test team about the release? Guess what? Most developers can't stand this stuff. We enjoy coding, we don't enjoy "paperwork", so this process-oriented approach is great in theory but doesn't work in practice.
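Just to pin down the kind of checklist I mean, here is a minimal sketch of one expressed as a simple list of checks rather than a document. All of the item names and the stubbed check functions are invented for illustration:

```csharp
using System;
using System.Collections.Generic;

class ReleaseChecklist
{
    static void Main()
    {
        // Each item pairs a description with a (stubbed) check; all names here are made up.
        var items = new List<(string Description, Func<bool> Check)>
        {
            ("Messages spell-checked",             () => true),   // e.g. run a spell-check tool
            ("Translations added to the database", () => true),   // e.g. query for missing keys
            ("Release signed off",                 () => false),  // e.g. look for an approval record
            ("Test team informed",                 () => false),  // e.g. check a notification was sent
        };

        var outstanding = 0;
        foreach (var (description, check) in items)
        {
            var ok = check();
            Console.WriteLine($"{(ok ? "DONE" : "TODO")}  {description}");
            if (!ok) outstanding++;
        }

        Console.WriteLine(outstanding == 0
            ? "Checklist complete."
            : $"{outstanding} item(s) still outstanding.");
    }
}
```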
Another problem is how you distill the vast and abstract world of knowledge, even a specific subset like computer science, into a form that is searchable and useful. Currently, most programming knowledge (I would suggest) is obtained by searching for specific phrases on Google, such as "how to access MySql from C#", and then copying and pasting the answer from somewhere like Stack Overflow without necessarily understanding all the metadata. Is it up to date? Is it the best (or one of the best) ways to carry out the task? Are there settings in the code that I need to change for my implementation? Are there security considerations? Is there any way to verify the expertise of the author? In fact, the scoring mechanism on Stack Overflow is extremely poor: you can ask one question that a lot of people up-vote (for whatever reason) and achieve a multi-thousand reputation without having any knowledge, while you could answer 1000 questions and not even get 1000 reputation from that.
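To make that concrete, here is roughly the kind of snippet such a search turns up, written against the MySQL Connector/NET client (the MySql.Data package). The server, database, credentials and query are placeholders, and the comments mark exactly the questions a copy-and-paster ought to stop and ask:

```csharp
using System;
using MySql.Data.MySqlClient;   // MySQL Connector/NET (the MySql.Data NuGet package)

class MySqlExample
{
    static void Main()
    {
        // Settings you almost certainly need to change for your own system:
        // server, database, user - and ideally a password kept out of source control.
        var connectionString = "Server=localhost;Database=mydb;Uid=appuser;Pwd=changeme;";

        using (var conn = new MySqlConnection(connectionString))
        {
            conn.Open();

            // Security consideration: parameterise the query instead of
            // concatenating user input into the SQL string.
            using (var cmd = new MySqlCommand("SELECT name FROM users WHERE id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", 42);

                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine(reader.GetString(0));
                }
            }
        }
    }
}
```

Whether the snippet is current, whether it suits your platform, and whether the author knew what they were doing are exactly the pieces of metadata the copy-and-paste workflow throws away.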
There is also an awkwardness about the idea of "community-driven" - the idea that by adding lots of opinions into the mix, the right answer will pop out. This is, of course, not true, in the same way that the most popular vote in a General Election does not necessarily equate to the best choice of government. It also does not distinguish between the quiet but clever person and the loud but stupid person who thinks that the way to do something is whatever "worked for them". It is good that people are able to review and question information and 'facts', but it is not a good idea, in my opinion, to air all of this in public in comments that just confuse the uninitiated into not knowing what is what.
So what would be the solution? I would like to see a site where the information is reasonably structured into smallish chunks, each of which could be split into separate headings such as "legal requirements", "security considerations", "platform considerations" etc., and which is perhaps produced and published by a single individual who then becomes responsible for curating and updating the content based on people's feedback. Useful feedback could be rewarded with reputation, trolls could be down-voted or banned, and curators could themselves be reported if a reviewer thinks their work is unfair or misleading. Perhaps people could apply to curate sections they are better qualified for, based on job, qualifications etc.; there could then be either a contest, or the incumbent could plead no contest and hand over to the more qualified person. Rate limiting would prevent people from just spouting nonsense like you see on Yahoo Answers all the time - perhaps each person only gets one comment per day, and could earn another if their feedback was considered useful.
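Purely as an illustration of the shape such a site's data might take - every type and property name below is invented for the purpose, since nothing like this exists yet - the proposal boils down to something like this:

```csharp
using System;
using System.Collections.Generic;

// A smallish chunk of knowledge, split into standard headings and owned by one curator.
class KnowledgeChunk
{
    public string Title = "";
    public string Curator = "";                   // the single responsible individual
    public Dictionary<string, string> Sections =  // e.g. "Security considerations" -> text
        new Dictionary<string, string>();
    public List<Feedback> Feedback = new List<Feedback>();

    // One comment per person per day, as suggested above.
    public bool CanComment(string author, DateTime now)
    {
        foreach (var f in Feedback)
            if (f.Author == author && f.PostedOn.Date == now.Date)
                return false;
        return true;
    }
}

// Feedback on a chunk; useful feedback earns reputation, trolling gets down-voted.
class Feedback
{
    public string Author = "";
    public string Comment = "";
    public DateTime PostedOn;
    public int Votes;
}

class Demo
{
    static void Main()
    {
        var chunk = new KnowledgeChunk { Title = "Accessing MySQL from C#", Curator = "someone-qualified" };
        chunk.Sections["Security considerations"] = "Always parameterise queries...";
        chunk.Feedback.Add(new Feedback { Author = "alice", Comment = "Worth a note on connection pooling?", PostedOn = DateTime.Now, Votes = 3 });

        Console.WriteLine(chunk.CanComment("alice", DateTime.Now));  // False - alice already commented today
        Console.WriteLine(chunk.CanComment("bob", DateTime.Now));    // True
    }
}
```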
It seems that we should be able to spare thousands of developers around the world the problems associated with making all the same mistakes that have probably already been made by someone else. Even topics as high-level as defense-in-depth or "keep it short and simple" could be published, debated and updated. Old information could be kept on record so that people can easily see why something they saw on a web forum might no longer be relevant (for example, it was only required before a library or framework was updated).
I'm not sure if this is something that is doable or whether, as I have learned from software, the first 90% is easy and the last 10% is impossible! Who knows!?