The web is a very difficult place to improve because every attempt to make something better has to avoid being a breaking change: like it or not, in most cases you have millions of existing browsers, websites, web servers and pieces of internet hardware to consider.
The vendors have control over browsers, at least the automatically updating ones like Chrome, FF, Opera, Safari and Edge, but we still have millions of older browsers in play, and not all vendors will support all features.
Anyway, Content Security Policy (CSP) is an attempt at improvement which I think is, in most cases, a marginal improvement at best and completely pointless at worst. It is the kind of thing that could only have been invented by academics, or by people who run very simple websites and assume that anyone who complains about these things is ignorant or unintelligent.
What is CSP? It is a response header that tells the browser which origins it is allowed to load which kinds of resources from, and it also includes functionality to allow (or block) inline CSS and JS.
This sounds OK so far, and it makes sense in theory. Cross-site scripting is a very serious security risk, and if you Google it, you will find many examples where XSS has caused some kind of data leakage, leveraging the reputation of a legitimate organisation by running malicious scripts in the context of a trusted page. How can we prevent this when we are likely to have hundreds or thousands of potential ways in which an attacker could add content that might accidentally run as a script in our pages?
So with CSP, the two most important parts are 1) you can say "only run scripts from these origins", and 2) you can disable inline scripts entirely and only allow scripts from external JS files.
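Put concretely, a policy covering both points might look like this (the CDN origin is a made-up example):

```
Content-Security-Policy: script-src 'self' https://cdn.example.com
```

Because the script-src directive is present and does not include 'unsafe-inline', inline script blocks are refused, so this one line delivers both of the features above.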
So then, they obviously decided, without much foundation as far as I can tell, that CSP should also allow you to lock down Ajax, frames, fonts, styles, objects, images and other groups as well. Now, there are some known attack vectors for these other types, but nowadays they are extremely infrequent. Attack by font? Attack by image?
So what are the problems? Who cares if we lock down things that are not likely to be a problem?
The first problem is that this is a header, and it is big. On one of our sites (not the worst offender) the header comes back at 1200 bytes per request. This is significant, bearing in mind that in most cases we are trying to keep payloads in the single-digit KB range. There are both theoretical and practical limits on the size of these headers, so what happens if you simply have too many origins to include at a reasonable size? CSP per page? Simplify your sources? All great in theory, but for most people not worth the risk. Get it wrong and you literally break your site.
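To put that 1200 bytes in perspective, a quick back-of-envelope calculation (the 8 KB payload figure is an assumption for illustration, not a measurement):

```python
# Rough cost of a 1200-byte CSP header repeated on every response,
# compared against a "single digit KB" payload (8 KB assumed here).
csp_bytes = 1200
payload_bytes = 8 * 1024

overhead = csp_bytes / payload_bytes
print(f"CSP header is {overhead:.0%} of an 8 KB payload")  # ~15%
```

And unlike the payload, the header is typically not compressed as effectively and gets resent on every single request.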
Another significant challenge is trying to find all of these origins in the first place. Sure, you can enable report-only mode and set up an account, but the reports can generate noise that is sometimes hard to see through. We have a site that often gets reports about a font that we don't use, that doesn't exist in the requested directory, and which we can't reproduce, but which still fills up the report inbox with errors. Even if you are confident and don't have the noise, how long do you leave it before you know you have captured everything and can enable CSP fully? Again, it is easy to postulate that people who find this hard have badly built web applications, but legacy is a bitch sometimes.
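For reference, report-only mode is its own header, with a directive pointing at a reporting endpoint (the URL here is hypothetical):

```
Content-Security-Policy-Report-Only: default-src 'self'; report-uri https://example.com/csp-reports
```

The browser enforces nothing in this mode; it just POSTs a JSON violation report to the given URI for every would-be violation, which is exactly where the noise described above arrives.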
Google Tag Manager? Good luck: that adds about 8 sources for the various URLs it downloads its tat from. If you inject scripts via GTM, those origins would also need to be added, which in most cases probably requires a code deployment, so that is now more difficult too.
One of the two most important features, by far, is the ability to disable inline scripts, but good luck with that! The frameworks you use, legacy code, etc. can all be littered with inline scripts. Want to use Microsoft's automatic CDN-fallback mechanism to cope with an unavailable CDN? Nope: inline script. Want to use WebForms with validators? Nope: inline scripts. Move all the code to external files? Possible, but also potentially a massive job for a marginal gain.
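To be fair, CSP Level 2 onwards does offer an escape hatch for inline scripts via nonces, but it assumes you can generate a fresh random nonce per response and plumb it into every inline block, which is exactly the kind of change that legacy frameworks make painful. A sketch (nonce value invented):

```
Content-Security-Policy: script-src 'self' 'nonce-R4nd0m123'

<script nonce="R4nd0m123">/* runs: nonce matches the policy */</script>
<script>/* blocked: no matching nonce */</script>
```

If a framework emits its own inline scripts (as WebForms validators do), you have no way to inject the nonce attribute into them, so you are back to 'unsafe-inline' or a rewrite.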
On top of that, a load of browsers only support a subset of CSP. What happens if you define script-src-elem in Firefox? Console warning! Nice!
There are improvements afoot in the latest CSP version 3 to allow the policy to be fetched and cached from an external source, avoiding the header size, but of course not all browsers support it, so you end up having to send the older format as well. So what's the point?
All of this paints a picture of another feature that could have been proved out much earlier, rather than something that, in its relatively short lifetime, is already at version 3 and already deprecating features!
I think most people would be better served by a more rigorous pull-request review process and some pen-test tooling to ensure the site is secure. Since you need those anyway, they are likely to be far more effective than spending all this time slowing down every single request and getting randomly broken pages.