Friday Opinion: Why I don't like "learn to code" blitz courses
These have become popular recently: courses offering tidy, canned, straightforward code teaching so that within one, two or three months you can write your own websites. There are various free online sites, and others you pay for, which promise a lot and which, in the simplest terms, probably do what they say - except that, in my opinion, software is rarely, if ever, a matter of "simple terms".
The difficult parts of designing/writing/deploying and maintaining software are not found in for loops or "what is a class". They are found in the subtle art/magic of requirements capture; in determining how the whole system should work before you start coding so you don't spend your entire time re-writing the site's functionality; in the tools that you use - sometimes being forced to use a tool you don't normally use and learning all its keyboard shortcuts, its bugs and even how to do the most simple things; in wrestling with annoying file permissions and deployment problems; and in those bugs that people annoy you with that you can't re-create because you didn't add enough logging, or you don't have command-line access to your hosting server, or the knowledge of how to tail or grep error logs.
There are many things which make software development a pain and which represent the accumulated learning of many years of experience - things which someone new to the business simply cannot know or be taught up front. Why? Because there are so many potential problems/errors/software applications that it would be impossible to cover everything.
What I am saying is that Software Engineering really needs to be considered a professional craft and not some hobby that people with fast typing skills can use to quickly hack the Pentagon or work out which type of badger left an imprint at a crime scene in an episode of CSI. I think it should be treated more like medicine!
You would not dream of offering courses to become a doctor in 3 months (and would probably not be allowed to) - and why not? For EXACTLY the same reasons that you can't learn a decent amount of software in 3 months. There are so many variations/conditions/drugs/habits/budget constraints etc. that even knowing a good amount about the human body simply cannot prepare you for them all. Imagine that you knew exactly what all the parts of the heart were called, what they did and how they join together - would that be enough to perform a heart bypass? Think about it and there are so many things that anatomy itself could not tell you. What staff do I need for the operation? What tools do I need and what is available? What kinds of things can go wrong, and which of them do I need to mitigate? Do I decide on the anesthetic or do I have to agree it with the anesthetist?
I think this is a good analogy because the two really are similar. The basic mechanics of the job are reasonably straightforward and potentially very quick to learn, but the peripheral skills can take years to learn and then master. During this period, you need to experience many scenarios, have the support of many differently skilled professionals, and learn when to ask for help and when you should already know what to do. Sure, the outcome of failing a medical procedure is potentially much worse than poor software, but poor software has a cost too. One example was a trading system where a poorly designed deployment cost the company around $70 million inside the space of a few minutes!
So, what does this mean for how we should train people in software? Well, we can again, in my opinion, use the example of medicine. We should have high-school level basic computer skills, equivalent to learning biology or physics. This gives a nice groundwork and, most importantly, allows people who might be interested to find out about the subject at school, where they are attending anyway (i.e. they don't have to use their own time just to find out whether they are interested, without the help and experience of a teacher and their class). You could then have an advanced course perhaps, or a vocational course covering, let's say, entry-level computing - suitable for getting a basic job in the industry but not something that suggests you should be allowed to single-handedly produce anything other than a framework-based site. This could be a bit like the pre-med studies carried out for medicine: they wouldn't allow you to become a doctor, but they would give access to medical-related jobs or serve as a basis for a similar and related career. If someone was still keen and wanted to be the sort of person who signs off a complete site - like someone who could be in charge of a medical operation - they should need a degree-level qualification (however that would be defined/judged). There is no reason why these couldn't be treated in the same way as other professional qualifications - law, architecture, medicine - where relevant institutions have to certify that a course meets certain criteria.
I know many people don't like the idea of using academia to prove competence because, if we're honest, we know that a degree does not prove competence, and we can all name unqualified people who can carry out a certain job better than a qualified person. However, the important thing is to consider the alternative: no qualification requirement at all, where we just have to hope that people know what they are doing. That sounds much worse to me - it's like saying that just because some qualified doctors make mistakes, the whole value of their training and qualification is undermined - which is nonsense.
This all, of course, does not preclude people without qualifications from doing anything. In the same way, trainee doctors can practice under supervision, trainee architects can design buildings but cannot sign them off, and trainee lawyers can produce certain documentation that must then be checked by a qualified lawyer.
If we continue to be too blasé about software, we will just keep making mistakes and some of these will be very costly (for someone or other!) - and even if the problem seems small, it might be you who has their identity stolen or their life's collection of photos deleted by an attacker. Mistakes are easy to make in all professions, but mistakes made for reasons that could easily be addressed with structured training are pretty poor in my opinion. Why can't we accept that software is not a hacker's craft and that it needs to be taken seriously?