Continuous Professional Development for Developers
Probably because programming is relatively new and has changed so much over the past 60 years, I still feel that programming is treated more like being a car mechanic than a profession such as law or medicine.
For example, it is common for the lay person to think that because "you work with computers", you must know everything about hardware and software across every year, version and manufacturer, a bit like people expect car mechanics to know about every make of car. However, most of those same people would know not to expect expert advice about cancer from an ophthalmologist.
Likewise, programming, like car mechanics, is a completely unregulated profession. Sure, there are qualifications you could get or courses you could take, but these are not mandatory; you can start programming straight from school and be deploying to a mission-critical site from day one with far less supervision than a doctor would get.
What this mainly implies is that ability and value as a programmer are extremely subjective, whereas in medicine and law they can much more easily (although not perfectly) be summed up by qualifications, experience, specialisms and oversight.
As an employer, I often recruit people, and I have seen many with only six years of programming experience asking for £65-70K per year, way more than the national average and, in many cases, much more than they deserve. Why? Because in the absence of any kind of regulation or industry-agreed career progression, the candidate is left to decide how good they think they are. Maybe you have always been able to solve the problems you have faced as a developer. That might be because you are amazing, but it might also be because you have faced a narrow and relatively easy set of problems, or because you solved them in sub-standard ways and either nobody noticed or nobody said anything. In both cases you could be incredibly sub-standard, but how would you know?
I think that as software has become, and is likely to remain, such a large part of our lives, we cannot treat it with the contempt we have shown it. We cannot employ people largely arbitrarily and, with the addition of a few designs or reviews, expect what is produced to be of the same quality we would expect from a doctor or lawyer. Data breaches make me angry, since almost every one I have heard about was due to sloppy quality control. Rubbish UX on websites makes me angry as I try to interact with things like HMRC's tax site or even my weekly shop. People with official qualifications at least stand a chance of doing things properly; people without don't really stand any chance.
Now, very few of us have the luxury of only employing really good people; most people are average (by definition), some are below and some are above. That doesn't mean we couldn't all produce decent work with some training, but we have to assume that the default position is that most of our product will be "average", some will be great and some poor, and the same for architecture, design and any other software discipline. However, we can improve the overall quality if we have some formal way of training and recognising those who can achieve what we want.
Imagine wanting a senior UX designer and being able to specify "must be Level 3 qualified", and likewise for developers, DBAs, designers and so on. That would take away a lot of the judgement we currently have to make about a job candidate's ability.
This comes down to a number of options, and although I wouldn't want to decide which mix of these is right, I really think we need something: things you can learn in an academic environment like school, college or university, and other things that might require job-based training. We could even require a certain amount up front, as doctors do, and since people seem happy to be paid doctor-level wages for programming, perhaps they can invest in up-front training.
The first option is simply different levels of computer science, design or database training. In the UK we have GCSEs, A Levels, Diplomas, Degrees and so on, and these all have value if they are accredited by an institution. Of course, they are not perfect, and plenty of us would quote the "clever graduate with a first-class degree and no clue how to actually do anything", but the truth is we need to be both clever and pragmatic, and saying that some clever people are not practical does not diminish the value of good education. Yes, it costs money, but most people now expect more salary in their first year than their entire degree would cost, so why not?
With the levels, of course, you don't just have pass/fail. Someone could be a level 1 programmer with a basic GCSE and be permitted to carry out certain work, while other changes might need to be made or reviewed by a level 3, degree-qualified dev. You can work your way up the levels in the same way as in most other professions: by carrying out on-the-job academic training. Your employer should give you some time towards this, although the employer should also require commitment from the employee, since this equates to potentially thousands of pounds of investment.
A second option is professional accreditation via an institution. These can cause much controversy, since it is easy to be a member and be neither clever nor pragmatic - at least once you have done whatever you needed to do to gain membership. This is sad, because these organisations are uniquely placed to be bastions of expertise, purveyors of best practice and champions of improvement. I think some of them definitely need a boost of energy, but they would be helped enormously if government(s) would endorse a credential as a necessary qualification to produce software professionally - at least for one named person in an organisation.
A third option relates to ongoing testing. Of course, being good at anything is not just about knowledge, but that doesn't mean knowledge is not important in its own right. A programmer might argue that they don't need to understand the space/time/memory trade-offs in a choice of collections, but actually they should know, without needing to google it. They should know because it's a fundamental, because they should have learned it at some point, and because if it is at the front of their mind, they get their choices right first time with minimal effort, rather than after a failed code review or, worse, after the code reaches production. It might not sound bad, but it's like fitting damaged wheels to a car: it might only be a small problem, but it is still a problem and can easily affect other parts of the system through its slowness or memory usage. There are a few good sites for testing yourself with mathematical or programming puzzles, and while I don't know how you would create a central pool of these, they could easily become part of an annual check-up or test (like plumbers and electricians have to do).
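To make the collections point concrete, here is a minimal Python sketch (the collection size and repetition count are just illustrative assumptions) of the kind of trade-off I mean: a membership test on a list scans every element, while a set pays extra memory for a hash table and answers the same question in roughly constant time.

```python
# Illustrative sketch of the space/time trade-off behind choosing a collection.
# Membership tests on a list are O(n); on a set they are O(1) on average,
# at the cost of extra memory for the hash table.
import sys
import timeit

n = 100_000                      # arbitrary example size
items_list = list(range(n))      # compact, but linear-time lookup
items_set = set(items_list)      # heavier in memory, near-constant-time lookup

lookup = n - 1                   # worst case for the list: the last element

list_time = timeit.timeit(lambda: lookup in items_list, number=1_000)
set_time = timeit.timeit(lambda: lookup in items_set, number=1_000)

print(f"list membership: {list_time:.4f}s for 1,000 checks")
print(f"set membership:  {set_time:.4f}s for 1,000 checks")
print(f"list size: {sys.getsizeof(items_list):,} bytes")
print(f"set size:  {sys.getsizeof(items_set):,} bytes")
```

A developer who carries this kind of thing in their head picks the right structure first time, rather than after a failed review or a slow page in production.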
Fourthly, real-world projects are a great way to ensure your skills are not learned for their own sake but are actually useful. I don't like employing someone who has never worked somewhere for more than three years, because they lose the experience of making decisions, seeing them into production and then living with those decisions later. Writing up these projects for someone to read and check is a great way to improve your communication skills and also makes you actually think about what you did and why you did it - it is common for me to do something "just because", and it's only when someone challenges it that I really wonder why I chose the approach I did.
Lastly, I think a lot can be done with so-called soft skills. We might think first of "team work", but processes, personality types, getting counselling for our insecurities and fears, learning how to manage well, how to evaluate ideas and so on are such a critical part of what makes someone good that we should somehow encode these into CPD work. I would love to require that all "level 3" developers give a public talk once per year to make them think about slides, style, content and so on.
I am unsure whether the government is really interested in this or has even considered it - perhaps they are worried that we don't have enough qualified people to make it all happen - but I don't see any reason to think it shouldn't happen to some degree, so why not start with some baby steps in the right direction? It could be voluntary, I suppose, but mandated for certain industries or companies, and then if you can't be bothered to do it, you will only ever work for some little noddy company building websites and you won't be able to call yourself "professionally accredited". I don't know. I would rather we all did it, but we can wait and see what happens!