Fifty years ago, high atop Launch Complex 34 at Cape Canaveral, a spark caused by faulty wiring ignited flammable materials in the pure oxygen environment inside the Apollo 1 capsule during a "plugs out" test. Astronauts Grissom, Chaffee and White died in the fire because of failures in threat and security management that are still common within many organizations:
Unrealistic Milestone Management
Engineers were given a tall order to complete in a timeframe that was simply impractical. The engineering team's mistake was not pushing back and asking for the additional budget and infrastructure necessary to succeed. As a result, the lead contractor, North American, cut corners on safety and testing and produced work that did not meet the stringent quality requirements of the mission. Nevertheless, the project was allowed to proceed because management feared the consequences of forcing a delay to address the issues.
Lesson Learned: Consequences can be dire when engineers and project managers lack the resolve to push for security. Pushback doesn't always win the additional resources, funding, or development time needed to succeed, but development teams must still strive to create applications that do not leave their users victimized by a vulnerability in the software.
Convenience Trumped Safety
The preferred method of attaching objects to the interior of the spacecraft was a new Velcro system. In the pure oxygen atmosphere of the capsule, the Velcro ignited easily, burned quickly at high temperature, and released toxic gases that blinded and choked the crew.
Lesson Learned: Critical security requirements must be enforced, even over the objections of developers, project managers, or users who disagree with the requirement. Enforce the security requirement first, then debate the requirement later in a safe environment with the necessary experts and stakeholders. Writing secure software takes diligence; cutting corners for speed or convenience is dangerous because, in software, shortcuts become exploits.
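To make "shortcuts become exploits" concrete, here is a minimal sketch (using Python's built-in sqlite3 module and a hypothetical users table) of a classic convenience shortcut: building a SQL query by string concatenation instead of using a parameterized query. The shortcut saves a few keystrokes and works fine in testing, yet a crafted input turns it into an injection.

```python
import sqlite3

# In-memory database with a small illustrative users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('gus', 'admin'), ('ed', 'user')")

def find_user_shortcut(name):
    # The "convenient" shortcut: SQL built by string concatenation.
    # An input like "' OR '1'='1" rewrites the WHERE clause entirely.
    return conn.execute(
        "SELECT name FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # The parameterized query treats the input strictly as data,
    # never as SQL syntax.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_shortcut(payload))  # leaks every row in the table
print(find_user_safe(payload))      # returns no rows
```

Both functions behave identically on ordinary names; only hostile input exposes the difference, which is exactly why such shortcuts survive testing and ship.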
Risks Not Realistically Evaluated
The engineers did not consider what would happen if a fire broke out inside the capsule. If they had, the door to the capsule would never have been designed to open inward. If cabin pressure were significantly higher than the outside pressure (a condition likely during a fire, since hot gas expands), the crew would be unable to open the door. Once the fire started, the crew was doomed. The Apollo management team focused on highly visible risks like a fire on the launch pad or exposure to the vacuum of space, yet they didn't consider the risk of something as basic as electrical sparks. It was a failure of imagination: mundane risks plagued a team tasked with imagining futuristic risks.
Lesson Learned: A thorough threat modeling process would have helped reduce the risk of a tragedy. Development teams need to think through the logical implications of potential threats to an application and create security controls or processes to overcome them. Don't ignore seemingly unlikely threats: prioritize threats by potential severity, and actively mitigate those that could cause catastrophic effects.
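One lightweight way to make that prioritization concrete is to record each threat with a likelihood and an impact estimate and rank by impact first, so that a low-likelihood but catastrophic threat (the Apollo 1 spark) never sinks to the bottom of the list. The threat entries and 1-to-5 scales below are hypothetical illustrations, not a prescribed methodology.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    # Hypothetical threat-model entry; both scales run 1 (low) to 5 (high).
    name: str
    likelihood: int  # how plausible the threat is
    impact: int      # how severe the consequences would be

threats = [
    Threat("SQL injection in login form", likelihood=4, impact=5),
    Threat("Verbose error pages leak stack traces", likelihood=5, impact=2),
    Threat("Signing key theft from build server", likelihood=1, impact=5),
]

# Rank by impact first, likelihood second: an unlikely catastrophic
# threat outranks a common nuisance, matching the lesson above.
ranked = sorted(threats, key=lambda t: (t.impact, t.likelihood), reverse=True)

for t in ranked:
    print(f"impact={t.impact} likelihood={t.likelihood}  {t.name}")
```

With this ordering, the rarely exploited signing-key theft still ranks above the everyday stack-trace leak, because its consequences are catastrophic; a naive likelihood-times-impact score would have buried it.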