A security policy is what it is implemented as, not what its goals were.
In practice, FISMA-secure means:
– Old technology is more secure than new technology.
Lengthy approval processes for new technologies mean that obsolete, insecure technologies stay in production longer than they would under a streamlined approval process.
– Known bugs are more secure than unknown bugs.
Lengthy approval processes for patching mean that known flaws are kept in production rather than risking the introduction of unknown ones.
– In-house encryption is better than third-party code.
Lengthy or non-existent approval processes for third-party components encourage teams to use less secure hand-written code instead of vetted third-party libraries.
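As an illustrative sketch (hypothetical code, not drawn from any actual FISMA system), consider the kind of hand-rolled "encryption" a team might write when third-party crypto libraries are off-limits: a repeating-key XOR cipher. It looks like encryption, but reusing the key lets an attacker cancel it out entirely.

```python
def xor_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Naive repeating-key XOR 'cipher' -- insecure by design, shown for illustration."""
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

# Two messages encrypted under the same key (a common real-world mistake):
c1 = xor_encrypt(b"secret", b"ATTACK AT DAWN")
c2 = xor_encrypt(b"secret", b"DEFEND AT DUSK")

# XORing the two ciphertexts cancels the key completely, leaving
# plaintext1 XOR plaintext2 -- enough structure to recover both messages
# with classic frequency analysis, no key required.
leaked = bytes(a ^ b for a, b in zip(c1, c2))
```

A vetted third-party library would use an authenticated cipher with unique nonces precisely to prevent this class of failure; that is the expertise a team forfeits when the approval process pushes it toward code like the above.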
– Closed source is better than owning the source code.
Open source barely existed when FISMA was conceived, so the government has never really decided what to do about it, and as is often the case with hard questions, the default answer is "No, you can't." (Can I use closed source code? No! Can I use open source code? No! Can I write code myself? Well, I guess we can only slow you down, since that is what we hired you for; but as soon as we figure out how to say "no," we will.)
– Non-developers can “document” security into a system.
FISMA seems to have forgotten that security holes are created by developers and must be discovered and closed by developers. A strictly process-oriented person can describe the system, but without reading the code, they can't really find any flaws.