When you buy third-party software or outsource application development, you inherit all the vulnerabilities that the vendor fails to eradicate. To mitigate financial and operational risk, security and technology professionals should write clearly defined security requirements into contracts. While every organization has contextual details that will influence the particulars, below are some general considerations:

Before You Buy

How has your supplier thought about security?

A contract does not replace diligence; however, given the frequency of security breaches today, what language should you (the customer) minimally have with your supplier when sourcing a software application? Consider adding language that ensures the software is built according to a Product Requirements Document and in a workmanlike manner that meets or exceeds all industry security standards. Often a vendor's documentation is absent or intentionally ambiguous regarding security controls. You may also want to specify that the software comply with applicable requirements reflecting elements of your internal security and privacy policies, which might include standards such as PCI-DSS or HIPAA. Make sure to include acceptance testing criteria specific to security to protect yourself against software not written to your requirements.
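As a rough illustration of what a security acceptance criterion can look like in executable form, the sketch below encodes a simple gate: delivered software fails acceptance if the vendor's dependency scan reports any finding above an agreed severity ceiling. The report shape and the threshold are hypothetical examples, not a real scanner's output or a recommended policy; the actual criteria come from your requirements document.

```python
# Hypothetical acceptance gate: fail delivery if the vendor's dependency
# scan report contains findings above the contracted severity ceiling.
SEVERITY_RANK = {"low": 0, "medium": 1, "high": 2, "critical": 3}

def passes_acceptance(findings, max_allowed="medium"):
    """Return True if no finding exceeds the contracted severity ceiling.

    `findings` is a list of dicts like {"id": "CVE-...", "severity": "high"};
    the shape is illustrative, not any real scanner's output format.
    """
    ceiling = SEVERITY_RANK[max_allowed]
    return all(SEVERITY_RANK[f["severity"]] <= ceiling for f in findings)

report = [
    {"id": "CVE-2021-0001", "severity": "low"},
    {"id": "CVE-2021-0002", "severity": "high"},
]
print(passes_acceptance(report))  # -> False: a "high" finding breaches the gate
```

Writing the gate down this concretely, even informally, makes the acceptance criteria testable rather than aspirational.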

Have you thought enough about security?

There are different diligence processes that might apply, depending on the lifecycle stage of the software application and other circumstances. If the application is off-the-shelf, for example, you might survey the vendor's security practices up front and, based on the results, fashion appropriate contractual language. You can also audit outsourced development companies on their security practices during your software development project; make sure to include adjustment clauses in your contract ahead of time!

After You've Bought

Security doesn't end upon delivery! 

Consider security across your entire software development lifecycle (SDLC). For example, even after acceptance of the software, will there be production scanning or penetration testing? Your contract may require cybersecurity insurance with the customer as a named beneficiary, notice of any known breach, and/or the customer's right to perform penetration testing. Make note of these in advance.

If the application will run on your systems rather than the supplier's, consider making a penetration test of the application part of the acceptance criteria. If the application is SaaS, carefully consider whether the supplier's security and privacy policies are adequate in light of recent security events. Consider adding a provision that spells out the risk you want the supplier to manage, the specific steps to avoid that risk, and the steps to take if that risk is realized. Finally, consider a Vulnerability Service Level Agreement (VSLA). Cooperation after a suspected or known security breach may not be adequate on its own. A VSLA can be structured into vulnerability severity tiers (e.g. critical, high, medium, low) and include the following:

  • Classification/example of vulnerability severity level – for example a critical vulnerability might be defined as “Attacker gains access to admin or root privileges allowing remote read and write access to the system and remote commands” 
  • Response and resolution time – may depend on whether you have provided adequate details, or on the amount of risk mitigated by the fix (e.g. taking the system offline if necessary)
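The tiers and timelines above can be captured in a simple structure like the following sketch. The severity definitions and deadlines are hypothetical placeholders, not recommended values; the real numbers belong in the negotiated contract.

```python
from dataclasses import dataclass

@dataclass
class VslaTier:
    severity: str          # tier name agreed in the contract
    example: str           # illustrative classification for the tier
    response_hours: int    # time to acknowledge and triage
    resolution_days: int   # time to deliver a fix

# Hypothetical tiers -- real deadlines come from the negotiated VSLA.
VSLA_TIERS = [
    VslaTier("critical", "attacker gains admin/root with remote read-write access", 4, 7),
    VslaTier("high", "sensitive data exposure without privilege escalation", 24, 30),
    VslaTier("medium", "exploit requires authenticated local access", 72, 60),
    VslaTier("low", "informational finding with no direct exploit path", 168, 90),
]

def deadline_for(severity: str) -> VslaTier:
    """Look up the contracted response/resolution window for a severity level."""
    for tier in VSLA_TIERS:
        if tier.severity == severity:
            return tier
    raise KeyError(f"no VSLA tier defined for severity {severity!r}")

print(deadline_for("critical").response_hours)  # -> 4
```

Encoding the tiers this way makes it easy to track, and later audit, whether the supplier met the contracted windows for each reported vulnerability.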
