Security Definition of Done Examples
This document provides practical examples of Definition of Done (DoD) criteria for security, organized by maturity level and domain. These examples can be customized to fit specific team needs and technology stacks.
Introduction
A Definition of Done (DoD) for security establishes clear, measurable criteria that all code and systems must satisfy before being considered complete from a security perspective. Implementing a robust security DoD:
- Embeds security throughout the development lifecycle
- Reduces the risk of vulnerabilities in production systems
- Decreases the cost of remediation through early detection
- Promotes a consistent security posture across teams
- Enhances compliance with security standards and regulations
- Creates a "security by design" culture
How to Use These Examples
- Assess Your Current State: Determine your team's current security maturity level
- Select Appropriate Criteria: Choose DoD items that align with your maturity level
- Customize for Your Context: Modify criteria to match your technology stack and risk profile
- Implement Incrementally: Start with essential practices and gradually add more
- Review and Refine: Periodically evaluate and update your DoD as your team matures
Maturity Levels Overview
Each DoD example includes a maturity indicator:
- Level 1 - Essential (⭐): Fundamental security practices that all teams should implement
- Level 2 - Intermediate (⭐⭐): Enhanced security practices for teams with established basics
- Level 3 - Advanced (⭐⭐⭐): Sophisticated security practices for high-performing teams
- Anti-pattern (👎): Common security pitfalls to avoid
Security DoD by Domain
1. Authentication and Authorization
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Authentication implemented | ⭐ | Authentication mechanism in place for all protected resources |
| Authorization validated | ⭐ | Role-based access controls implemented and verified |
| Password security | ⭐ | Password policies enforced (complexity, expiration, etc.) |
| Sensitive operations protected | ⭐⭐ | Multi-factor authentication for sensitive operations |
| Session management | ⭐⭐ | Secure session handling (timeout, revocation, etc.) |
| Token security | ⭐⭐ | JWT or other tokens properly secured and validated |
| Privilege escalation testing | ⭐⭐⭐ | Tests verify protection against privilege escalation |
| Auth bypass testing | ⭐⭐⭐ | Tests verify protection against authentication bypasses |
| Default credentials removed | ⭐ | No default or development credentials in production code |
| Hard-coded credentials | 👎 | Including hard-coded credentials or API keys in code |
Example Implementation Checklist
## Authentication and Authorization Checklist
- [ ] Authentication mechanism implemented and tested
- [ ] Authorization checks in place for all protected resources
- [ ] Password policies enforced per organizational standards
- [ ] No default, test, or hardcoded credentials
- [ ] Session management implemented securely
- [ ] Token validation implemented for all token types
- [ ] Privilege boundaries tested to prevent escalation
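To make the checklist items concrete, the snippet below is a minimal, framework-agnostic sketch of two of them: password hashing with a salted key-derivation function and a simple role-based authorization check. It uses only the Python standard library; the `ROLE_PERMISSIONS` mapping and the iteration count are illustrative values, not organizational standards.

```python
import hashlib
import hmac
import os

PBKDF2_ITERATIONS = 600_000  # illustrative value; follow your organization's standard


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived key); never store the plain-text password."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, PBKDF2_ITERATIONS)
    return salt, key


def verify_password(password: str, salt: bytes, expected_key: bytes) -> bool:
    """Constant-time comparison avoids leaking information via timing."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, PBKDF2_ITERATIONS)
    return hmac.compare_digest(key, expected_key)


# Illustrative role-based access check for protected operations.
ROLE_PERMISSIONS = {"admin": {"delete_user", "view_report"}, "analyst": {"view_report"}}


def authorize(role: str, operation: str) -> bool:
    return operation in ROLE_PERMISSIONS.get(role, set())


if __name__ == "__main__":
    salt, key = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, key)
    assert not verify_password("wrong password", salt, key)
    assert authorize("admin", "delete_user") and not authorize("analyst", "delete_user")
```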
2. Data Protection
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Sensitive data identified | ⭐ | All sensitive data fields identified and documented |
| Data encryption at rest | ⭐ | Sensitive data encrypted when stored |
| Data encryption in transit | ⭐ | TLS/SSL implemented for all data transmission |
| Data classification | ⭐⭐ | Data properly classified according to sensitivity levels |
| Key management | ⭐⭐ | Proper key management practices implemented |
| Data minimization | ⭐⭐ | Only necessary data collected and stored |
| Data anonymization | ⭐⭐⭐ | Anonymization techniques applied to sensitive data |
| Secure data deletion | ⭐⭐⭐ | Verified procedures for secure data deletion |
| Unencrypted sensitive data | 👎 | Storing passwords or other sensitive data in plain text |
| Data leakage in logs | 👎 | Sensitive data appears in logs or error messages |
Example Implementation Checklist
## Data Protection Checklist
- [ ] Sensitive data inventory completed
- [ ] Encryption implemented for sensitive data at rest
- [ ] TLS/SSL implemented for all data in transit
- [ ] No sensitive data in logs, error messages, or debug output
- [ ] Key management processes documented and implemented
- [ ] Data retention policies implemented
- [ ] Data minimization principles applied
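As a sketch of the encryption-at-rest item above, the example below uses Fernet from the third-party `cryptography` package (an assumed dependency). In practice the key would be loaded from a key management service rather than generated next to the data it protects.

```python
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet


def encrypt_field(key: bytes, plaintext: str) -> bytes:
    """Encrypt a sensitive field before it is written to storage."""
    return Fernet(key).encrypt(plaintext.encode())


def decrypt_field(key: bytes, token: bytes) -> str:
    """Decrypt a stored field; raises InvalidToken if the ciphertext was tampered with."""
    return Fernet(key).decrypt(token).decode()


if __name__ == "__main__":
    # In production, load the key from a KMS or secrets manager; never hard-code it.
    key = Fernet.generate_key()
    stored = encrypt_field(key, "4111 1111 1111 1111")
    assert decrypt_field(key, stored) == "4111 1111 1111 1111"
```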
3. Input Validation and Output Encoding
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Input validation | ⭐ | All user inputs validated for type, length, format, and range |
| Output encoding | ⭐ | Output encoded to prevent XSS and injection attacks |
| Server-side validation | ⭐ | All validation performed server-side (not just client-side) |
| Parameterized queries | ⭐⭐ | Parameterized queries used for all database operations |
| Content Security Policy | ⭐⭐ | CSP headers configured and tested |
| API input validation | ⭐⭐ | API endpoints validate all input parameters |
| File upload validation | ⭐⭐⭐ | Comprehensive validation for file uploads (type, size, content) |
| Context-specific encoding | ⭐⭐⭐ | Context-specific output encoding (HTML, JS, CSS, etc.) |
| Client-side validation only | 👎 | Relying solely on client-side validation |
| String concatenation in queries | 👎 | Building SQL or other queries through string concatenation |
Example Implementation Checklist
## Input Validation and Output Encoding Checklist
- [ ] All user inputs validated on the server-side
- [ ] Output encoded appropriately for the context
- [ ] Parameterized queries used for all database operations
- [ ] Content Security Policy headers configured
- [ ] File uploads validated and sanitized
- [ ] No string concatenation for building queries
- [ ] API input parameters validated and sanitized
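The standard-library sketch below illustrates the parameterized-query and output-encoding items from this checklist using `sqlite3` and `html.escape`; the table and column names are made up for the example.

```python
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES (?)", ("alice@example.com",))


def find_user(email: str):
    # Parameterized query: the driver handles quoting, so user input
    # cannot change the structure of the SQL statement.
    return conn.execute(
        "SELECT id, email FROM users WHERE email = ?", (email,)
    ).fetchone()


def render_greeting(name: str) -> str:
    # Context-appropriate output encoding for HTML prevents reflected/stored XSS.
    return f"<p>Hello, {html.escape(name)}</p>"


if __name__ == "__main__":
    print(find_user("alice@example.com"))
    print(render_greeting('<script>alert("xss")</script>'))
```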
4. Secure Coding Practices
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| No known vulnerabilities | ⭐ | Code free of known vulnerability patterns |
| Error handling security | ⭐ | Errors handled securely without exposing sensitive information |
| Memory safety | ⭐⭐ | Memory handled safely to prevent buffer overflows, etc. |
| Race conditions | ⭐⭐ | Code protected against race conditions |
| Secure defaults | ⭐⭐ | Security features enabled by default |
| Least privilege principle | ⭐⭐ | Code runs with minimal necessary privileges |
| Time-of-check to time-of-use | ⭐⭐⭐ | Protection against TOCTOU vulnerabilities |
| Side-channel protection | ⭐⭐⭐ | Protection against timing and other side-channel attacks |
| Security comments removed | ⭐ | No commented-out security controls or "TODO: Fix security" |
| Disabling security for convenience | 👎 | Disabling security features for testing without re-enabling them |
Example Implementation Checklist
## Secure Coding Checklist
- [ ] Code reviewed for common vulnerability patterns
- [ ] Error handling implemented without exposing sensitive information
- [ ] Memory safety considerations addressed
- [ ] Race conditions identified and mitigated
- [ ] Secure defaults implemented
- [ ] Least privilege principle applied
- [ ] No security bypasses or disabled controls
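A minimal sketch of the secure error-handling item: detailed diagnostics go to the server-side log, while the caller receives only a generic message and a correlation ID. The function name and message wording are illustrative, not a prescribed format.

```python
import logging
import uuid

logger = logging.getLogger("payments")


def charge_card(card_token: str, amount_cents: int) -> dict:
    try:
        if amount_cents <= 0:
            raise ValueError(f"invalid amount: {amount_cents}")
        # ... call the payment provider here ...
        return {"status": "ok"}
    except Exception:
        # Log full details (including the stack trace) server-side only.
        correlation_id = uuid.uuid4().hex
        logger.exception("charge failed, correlation_id=%s", correlation_id)
        # Return a generic message so internals are not exposed to the caller.
        return {
            "status": "error",
            "message": "Payment could not be processed.",
            "correlation_id": correlation_id,
        }


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    print(charge_card("tok_test", -50))
```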
5. Security Testing
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Security unit tests | ⭐ | Unit tests for security controls |
| SAST performed | ⭐ | Static Application Security Testing completed |
| Security test cases | ⭐⭐ | Test cases for security requirements |
| DAST performed | ⭐⭐ | Dynamic Application Security Testing completed |
| Security regression tests | ⭐⭐ | Regression tests for security features |
| Penetration testing | ⭐⭐⭐ | Penetration testing completed for critical features |
| Fuzz testing | ⭐⭐⭐ | Fuzz testing for input handling |
| Security test coverage | ⭐⭐⭐ | Security test coverage metrics met |
| Skipping security testing | 👎 | Bypassing security testing due to time constraints |
| False positives ignored | 👎 | Dismissing security scan findings as false positives without proper review |
Example Implementation Checklist
## Security Testing Checklist
- [ ] Security unit tests implemented and passing
- [ ] SAST completed with no high/critical issues
- [ ] Security-focused test cases implemented
- [ ] DAST completed for web applications
- [ ] Security regression tests added to test suite
- [ ] All security scan results triaged
- [ ] Penetration testing completed for critical features
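Security unit tests are ordinary unit tests that target security behavior. The sketch below uses the standard-library `unittest`; the functions under test are illustrative stand-ins defined inline so the example is self-contained.

```python
import html
import unittest


# Functions under test (defined inline so the example runs on its own).
def authorize(role: str, operation: str) -> bool:
    return operation in {"admin": {"delete_user"}}.get(role, set())


def render_comment(text: str) -> str:
    return f"<div>{html.escape(text)}</div>"


class SecurityRegressionTests(unittest.TestCase):
    def test_unknown_role_is_denied(self):
        self.assertFalse(authorize("guest", "delete_user"))

    def test_privilege_is_not_granted_by_default(self):
        self.assertFalse(authorize("admin", "drop_database"))

    def test_output_is_html_encoded(self):
        rendered = render_comment('<script>alert(1)</script>')
        self.assertNotIn("<script>", rendered)


if __name__ == "__main__":
    unittest.main()
```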
6. Dependency Management
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| No known vulnerable dependencies | ⭐ | Dependencies scanned and free of known vulnerabilities |
| Dependencies documented | ⭐ | All dependencies documented with versions |
| Minimized dependencies | ⭐⭐ | Only necessary dependencies included |
| Dependencies up-to-date | ⭐⭐ | Dependencies updated to secure versions |
| SCA performed | ⭐⭐ | Software Composition Analysis completed |
| Transitive dependencies checked | ⭐⭐⭐ | Transitive dependencies also screened for vulnerabilities |
| Dependency pinning | ⭐⭐⭐ | Dependencies pinned to specific secure versions |
| Using wildcard versions | 👎 | Using wildcard or latest versions without security review |
| Outdated dependencies | 👎 | Using outdated dependencies with known vulnerabilities |
Example Implementation Checklist
## Dependency Management Checklist
- [ ] Dependencies scanned for known vulnerabilities
- [ ] No dependencies with known high/critical vulnerabilities
- [ ] All dependencies documented with versions
- [ ] Unnecessary dependencies removed
- [ ] Dependencies updated to secure versions
- [ ] Transitive dependencies analyzed
- [ ] Dependencies pinned to specific secure versions
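Pinning can be checked mechanically. The script below is an illustrative sketch that flags unpinned or wildcard lines in a pip-style requirements list; real projects would more likely rely on lock files and an SCA tool, so treat this purely as an example of the dependency-pinning criterion.

```python
import re
import sys

# A line counts as pinned only if it uses an exact "==" version specifier.
PINNED = re.compile(r"^[A-Za-z0-9._-]+\s*==\s*[\w.]+$")


def unpinned_requirements(lines: list[str]) -> list[str]:
    problems = []
    for line in lines:
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        if not PINNED.match(line):
            problems.append(line)
    return problems


if __name__ == "__main__":
    example = ["requests==2.32.3", "flask>=2.0", "pyyaml", "# dev tools", "urllib3==2.2.2"]
    bad = unpinned_requirements(example)
    print("Unpinned dependencies:", bad)  # ['flask>=2.0', 'pyyaml']
    sys.exit(1 if bad else 0)
```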
7. Secure Configuration
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| No default configurations | ⭐ | Default configurations and credentials changed |
| Security headers | ⭐ | Security headers implemented (HSTS, X-Content-Type-Options, etc.) |
| Environment separation | ⭐⭐ | Clear separation between environments (dev/test/prod) |
| Secure cookies | ⭐⭐ | Cookies configured with secure attributes |
| Attack surface minimized | ⭐⭐ | Unnecessary features, documentation, and sample content disabled or removed |
| Security-focused configuration | ⭐⭐⭐ | Configuration hardened using CIS Benchmarks or similar |
| Configuration validation | ⭐⭐⭐ | Automated validation of security configurations |
| Development settings in production | 👎 | Using development settings in the production environment |
| Excessive permissions | 👎 | Running with excessive permissions or as root/admin |
Example Implementation Checklist
## Secure Configuration Checklist
- [ ] Default configurations and credentials changed
- [ ] Security headers implemented and verified
- [ ] Environments properly separated
- [ ] Cookies configured with secure attributes
- [ ] Attack surface minimized
- [ ] Configuration hardened according to security standards
- [ ] Production configuration validated
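The sketch below illustrates the security-headers and secure-cookies items using only the standard library; the specific header values are common starting points, not a complete hardening baseline.

```python
from http.cookies import SimpleCookie

# Common security response headers (values shown are typical starting points).
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Content-Security-Policy": "default-src 'self'",
    "Referrer-Policy": "no-referrer",
}


def session_cookie(value: str) -> str:
    """Build a session cookie with secure attributes set."""
    cookie = SimpleCookie()
    cookie["session"] = value
    cookie["session"]["secure"] = True      # only sent over HTTPS
    cookie["session"]["httponly"] = True    # not readable from JavaScript
    cookie["session"]["samesite"] = "Strict"
    return cookie["session"].OutputString()


if __name__ == "__main__":
    for name, val in SECURITY_HEADERS.items():
        print(f"{name}: {val}")
    print("Set-Cookie:", session_cookie("opaque-session-id"))
```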
8. Threat Modeling
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Basic threat identification | ⭐ | Basic security threats identified |
| Mitigations documented | ⭐ | Mitigations for identified threats documented |
| Data flow diagrams | ⭐⭐ | Data flow diagrams created and reviewed |
| STRIDE analysis | ⭐⭐ | STRIDE or similar threat modeling approach used |
| Attack surface analysis | ⭐⭐⭐ | Attack surface thoroughly analyzed |
| Abuse cases documented | ⭐⭐⭐ | Abuse cases and scenarios documented and tested |
| Formal threat model updated | ⭐⭐⭐ | Formal threat model updated with new findings |
| No threat consideration | 👎 | Implementing features without considering security threats |
| Assuming threats are covered | 👎 | Assuming threats are already addressed without verification |
Example Implementation Checklist
## Threat Modeling Checklist
- [ ] Security threats identified and documented
- [ ] Mitigations implemented for identified threats
- [ ] Data flow diagrams updated
- [ ] STRIDE or similar analysis performed
- [ ] Attack surface analyzed
- [ ] Abuse cases considered and addressed
- [ ] Formal threat model updated if applicable
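Threat modeling is primarily an analysis activity, but keeping the results in a structured, reviewable form supports the "mitigations documented" and "formal threat model updated" items. Below is a minimal, illustrative sketch of such a record in Python; the fields and the example threats are assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

# STRIDE categories: Spoofing, Tampering, Repudiation, Information disclosure,
# Denial of service, Elevation of privilege.


@dataclass
class Threat:
    component: str
    category: str                # one of the STRIDE categories
    description: str
    mitigations: list[str] = field(default_factory=list)
    status: str = "open"         # open / mitigated / accepted


THREAT_MODEL = [
    Threat("login endpoint", "Spoofing",
           "Credential stuffing against the login form",
           mitigations=["rate limiting", "MFA for sensitive operations"],
           status="mitigated"),
    Threat("report export", "Information disclosure",
           "Exported reports may contain unmasked PII",
           mitigations=["apply data masking before export"]),
]


def open_threats(model: list[Threat]) -> list[Threat]:
    """Threats that still need work before the story can be considered Done."""
    return [t for t in model if t.status == "open"]


if __name__ == "__main__":
    for threat in open_threats(THREAT_MODEL):
        print(f"[{threat.category}] {threat.component}: {threat.description}")
```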
9. Security Monitoring and Logging
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Security logging implemented | ⭐ | Security events logged (auth attempts, access control, etc.) |
| Log data protected | ⭐ | Log data protected from unauthorized access |
| No sensitive data in logs | ⭐⭐ | Logs verified to exclude sensitive information |
| Log integrity | ⭐⭐ | Logs protected against tampering |
| Centralized logging | ⭐⭐ | Logs sent to a centralized logging system |
| Alerting configured | ⭐⭐⭐ | Alerts configured for suspicious activity |
| Log correlation | ⭐⭐⭐ | Log correlation implemented for security events |
| Insufficient logging | 👎 | Not logging security-relevant events |
| Logging sensitive data | 👎 | Including passwords, tokens, or PII in log files |
Example Implementation Checklist
## Security Monitoring and Logging Checklist
- [ ] Security-relevant events logged
- [ ] Logs protected from unauthorized access and tampering
- [ ] No sensitive data included in logs
- [ ] Log format includes necessary context (who, what, when, where)
- [ ] Logs integrated with centralized logging system
- [ ] Alerting configured for security events
- [ ] Log retention policy implemented
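The sketch below shows one way to keep sensitive data out of logs: a standard-library `logging.Filter` that masks the values of a configurable set of field names. The field list and masking format are illustrative assumptions.

```python
import logging
import re

# Field names whose values should never reach log output (illustrative list).
SENSITIVE_FIELDS = ("password", "token", "ssn")
PATTERN = re.compile(rf"({'|'.join(SENSITIVE_FIELDS)})=\S+", re.IGNORECASE)


class RedactingFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = PATTERN.sub(r"\1=***", str(record.msg))
        return True  # keep the record, just with sensitive values masked


logger = logging.getLogger("audit")
handler = logging.StreamHandler()
handler.addFilter(RedactingFilter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

if __name__ == "__main__":
    # The security event is still logged, but the credential value is masked.
    logger.info("login failed user=alice password=hunter2 source_ip=203.0.113.7")
```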
10. Incident Response Readiness
Story/Task Level DoD
| Criteria | Maturity | Description |
| --- | --- | --- |
| Contacts documented | ⭐ | Security contacts and escalation paths documented |
| Basic incident handling | ⭐ | Basic procedures for handling security incidents |
| Error messages don't help attackers | ⭐⭐ | Error messages don't provide information useful to attackers |
| System recovery tested | ⭐⭐ | Recovery procedures tested |
| Incident response playbooks | ⭐⭐⭐ | Detailed playbooks for different incident types |
| Automated responses | ⭐⭐⭐ | Automated responses for common security events |
| No planning for failure | 👎 | No consideration of how to handle security incidents |
| Informative error messages to users | 👎 | Showing stack traces or detailed error information to users |
Example Implementation Checklist
## Incident Response Readiness Checklist
- [ ] Security contacts and escalation procedures documented
- [ ] Basic incident handling procedures in place
- [ ] Error messages reviewed to not help attackers
- [ ] Recovery procedures documented and tested
- [ ] Incident response playbooks created if applicable
- [ ] Automated responses configured for known attack patterns
Complete DoD Examples by Maturity Level
Level 1 (Essential) Security DoD Example
# Definition of Done - Security (Level 1)
Code/systems are considered "Done" when:
## Authentication and Authorization
- [ ] Authentication implemented for all protected resources
- [ ] Authorization checks implemented for all protected operations
- [ ] Password security policies enforced
- [ ] No default or hardcoded credentials
## Data Protection
- [ ] Sensitive data identified and documented
- [ ] Encryption implemented for sensitive data at rest
- [ ] TLS/SSL implemented for data in transit
- [ ] No sensitive data in logs or error messages
## Input Validation and Secure Coding
- [ ] All user inputs validated
- [ ] Output encoded to prevent injection attacks
- [ ] Parameterized queries used for database operations
- [ ] Error handling implemented securely
## Security Testing
- [ ] Basic security tests implemented
- [ ] SAST completed with no high/critical issues
- [ ] Dependencies scanned for vulnerabilities
## Configuration and Deployment
- [ ] Default configurations and credentials changed
- [ ] Security headers implemented
- [ ] No development settings in production
Level 2 (Intermediate) Security DoD Example
# Definition of Done - Security (Level 2)
All Level 1 requirements, plus:
## Authentication and Authorization
- [ ] Multi-factor authentication for sensitive operations
- [ ] Secure session management implemented
- [ ] Token security measures implemented
- [ ] Authorization matrix documented and verified
## Data Protection
- [ ] Data classified according to sensitivity levels
- [ ] Key management procedures implemented
- [ ] Data minimization principles applied
- [ ] Data masking for non-production environments
## Input Validation and Secure Coding
- [ ] Content Security Policy configured
- [ ] Complete API input validation
- [ ] Protection against race conditions
- [ ] Secure defaults implemented
## Security Testing
- [ ] Security-focused test cases implemented
- [ ] DAST completed for web applications
- [ ] SCA performed for all dependencies
- [ ] Security regression tests added
## Configuration and Deployment
- [ ] Clear separation between environments
- [ ] Cookies configured with secure attributes
- [ ] Attack surface minimized
- [ ] Centralized logging implemented
## Threat Modeling
- [ ] Data flow diagrams created
- [ ] STRIDE analysis performed
- [ ] Security requirements traced to implementation
Level 3 (Advanced) Security DoD Example
# Definition of Done - Security (Level 3)
All Level 1 and 2 requirements, plus:
## Authentication and Authorization
- [ ] Privilege escalation testing completed
- [ ] Authentication bypass testing completed
- [ ] Advanced session management (concurrent session control, etc.)
- [ ] Context-aware authorization implemented
## Data Protection
- [ ] Data anonymization techniques applied
- [ ] Secure data deletion procedures verified
- [ ] Advanced key management with key rotation
- [ ] Data loss prevention measures implemented
## Input Validation and Secure Coding
- [ ] Context-specific output encoding
- [ ] Protection against side-channel attacks
- [ ] Time-of-check to time-of-use protections
- [ ] Comprehensive file upload validation
## Security Testing
- [ ] Penetration testing completed for critical features
- [ ] Fuzz testing for input handling
- [ ] Security test coverage metrics met
- [ ] Red team exercises for critical systems
## Configuration and Deployment
- [ ] Configuration hardened using industry benchmarks
- [ ] Automated security configuration validation
- [ ] Infrastructure as code security verified
- [ ] Continuous security monitoring implemented
## Threat Modeling
- [ ] Attack surface thoroughly analyzed
- [ ] Abuse cases documented and tested
- [ ] Formal threat model updated
- [ ] Security architecture review completed
Integrating Security DoD with Development Workflow
To effectively implement these DoD criteria:
- Shift Left: Integrate security requirements and checks early in the development process
- Automate Security Testing: Incorporate security testing into CI/CD pipelines (a minimal gate script is sketched after this list)
- Security Champions: Designate security champions within development teams
- Continuous Education: Provide ongoing security training for all team members
- Feedback Loop: Establish a feedback loop for security findings
- Risk-based Approach: Prioritize security activities based on risk
- Regular Reviews: Conduct regular security reviews of DoD criteria
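As one way to automate security testing in a pipeline, the sketch below wraps two commonly used open-source tools, `bandit` (SAST for Python code) and `pip-audit` (dependency scanning), and fails the build if either reports problems. The tool selection and invocation are assumptions for illustration; substitute whatever scanners fit your stack.

```python
import subprocess
import sys

# Assumed tooling: bandit (Python SAST) and pip-audit (dependency scanning),
# both installable via pip. Swap in the scanners appropriate for your stack.
CHECKS = [
    ["bandit", "-r", "src"],   # static analysis of the source tree
    ["pip-audit"],             # scan installed dependencies for known CVEs
]


def run_security_gate() -> int:
    failures = 0
    for command in CHECKS:
        print("Running:", " ".join(command))
        result = subprocess.run(command)
        if result.returncode != 0:
            failures += 1
    return failures


if __name__ == "__main__":
    # A non-zero exit code fails the CI job, enforcing the security DoD automatically.
    sys.exit(1 if run_security_gate() else 0)
```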
Security DoD Templates
Pull Request Security Review Template
## Security Review Checklist
### Authentication and Authorization
- [ ] Authentication mechanisms properly implemented
- [ ] Authorization checks in place for all operations
- [ ] No hardcoded credentials or secrets
### Data Protection
- [ ] Sensitive data properly encrypted
- [ ] No sensitive data in logs or error messages
- [ ] Data handling complies with privacy requirements
### Input Validation and Output Encoding
- [ ] All user inputs validated server-side
- [ ] Output properly encoded for the context
- [ ] Parameterized queries used for database operations
### Secure Coding
- [ ] No common vulnerability patterns
- [ ] Error handling implemented securely
- [ ] Security controls properly implemented
### Security Testing
- [ ] Security tests pass
- [ ] SAST/DAST issues addressed
- [ ] Dependency vulnerabilities addressed
### Security Reviewer Notes
[Any specific security concerns or recommendations]
Security Story Template Example
h2. Security Requirements
* [List specific security requirements for the story]
h2. Definition of Done - Security
h3. Essential (Must Have)
* Authentication and authorization implemented
* Input validation and output encoding in place
* Sensitive data protected appropriately
* SAST completed with no high/critical issues
* Basic security tests implemented
h3. Expected (Should Have)
* Security-focused test cases added
* Threat model updated if needed
* Security logging implemented
* DAST completed for web components
* Dependencies checked for vulnerabilities
h3. Advanced (If Applicable)
* Penetration testing for critical features
* Security architecture review
* Fuzz testing for complex inputs
* Side-channel attack protection
Related Resources