Agile methodology has become the dominant approach in commercial software development for good reasons: it reduces delivery risk through frequent integration, surfaces misunderstandings early through working software demonstrations, and allows scope to be adjusted as understanding of requirements evolves. These benefits are real and applicable to defense software programs. The challenge is that defense programs introduce constraints that standard Agile practices are not designed to handle — and ignoring those constraints does not make them disappear.
This article examines where Agile as commonly practiced conflicts with defense requirements, what adaptations have proven effective, and where the conventional wisdom about Agile-in-defense needs to be updated based on what actually works in practice.
Where Agile Conflicts with Defense Requirements
Security classification and access control. Standard Agile assumes that all team members can work on all parts of the codebase and attend all ceremonies. Defense programs with classified components may restrict who can work on which subsystems based on security clearance level. This creates team structures that Agile's cross-functional team model does not anticipate: a sprint team may need to be partitioned into cleared and uncleared segments, with limited ability to pair program or conduct shared code reviews across clearance boundaries. Sprint planning and retrospectives that would normally be conducted with the full team may need to be segregated.
ITAR and export control constraints. For programs involving US-origin technology subject to International Traffic in Arms Regulations (ITAR), team composition is constrained by citizenship requirements. Standard Agile hiring practices — assembling the most capable available team — may conflict with ITAR requirements that restrict foreign nationals from accessing certain technical data. This constrains team size, composition flexibility, and the ability to augment teams rapidly, all of which standard Agile staffing practices take for granted.
Formal accreditation and Authority to Operate (ATO). Defense software systems typically require security accreditation before they can be deployed to operational environments. Accreditation processes — whether under Risk Management Framework (RMF) in the US context, or equivalent frameworks in other nations — are not sprint-friendly. They involve substantial documentation, evidence collection, third-party assessment, and formal decision-making by an Authorizing Official. This creates a structural mismatch: Agile produces working software continuously, but deployment is gated by accreditation timelines that may lag development by months.
Air-gapped development environments. Defense programs handling classified data often require development to occur in air-gapped environments — systems physically disconnected from public networks. Standard CI/CD pipelines depend on internet-accessible package repositories, container registries, and cloud-based build infrastructure. An air-gapped environment requires all of these to be replicated internally, which is achievable but requires significant infrastructure investment and creates ongoing synchronization challenges when component versions need to be updated.
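As a concrete illustration of the synchronization problem, the sketch below shows the kind of pre-sync check an air-gapped pipeline tends to need: before a build is attempted, every pinned dependency is verified against the internal package index. The mirror URL, the pip-style simple-index layout, and the requirements-file convention are illustrative assumptions, not any particular program's tooling.

```python
"""Pre-sync check for an air-gapped mirror: verify that every pinned
dependency in a requirements file is already present on the internal
package index before a build is attempted.

The mirror URL and file layout are illustrative placeholders; a real
program would substitute its own internal repository services."""
import sys
import urllib.error
import urllib.request

# Hypothetical internal simple-index endpoint inside the air gap.
INTERNAL_INDEX = "https://mirror.internal.example/simple"


def parse_pins(path: str) -> list[str]:
    """Return package names from a pip-style requirements file (name==version)."""
    names = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.split("#", 1)[0].strip()
            if line and "==" in line:
                names.append(line.split("==", 1)[0].lower())
    return names


def is_mirrored(package: str) -> bool:
    """A package counts as mirrored if its index page exists on the internal index."""
    url = f"{INTERNAL_INDEX}/{package}/"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        return False


if __name__ == "__main__":
    missing = [p for p in parse_pins(sys.argv[1]) if not is_mirrored(p)]
    if missing:
        print("Not yet mirrored inside the air gap:", ", ".join(missing))
        sys.exit(1)
    print("All pinned dependencies are available on the internal index.")
```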
Adaptations: Sprint-Gated Security Reviews and Accreditation-Aware CI/CD
Effective defense Agile programs do not eliminate the constraints above — they accommodate them explicitly in the development process.
Sprint-gated security reviews integrate security assessment activities into the sprint cycle rather than treating them as a one-time gate. Each sprint includes security review activities proportional to the security-relevant changes made: a sprint that modifies authentication logic includes a focused security review of those changes; a sprint that adds non-sensitive reporting functionality may require only standard code review. This spreads the security review load across the program and prevents the accumulation of security debt that produces an unmanageable review backlog before accreditation.
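One lightweight way to make "proportional" operational is to let the pipeline flag which sprints touch security-relevant code. The sketch below is a minimal example assuming a git repository, a hypothetical sprint-start tag, and an illustrative list of security-relevant paths; it compares the files changed during the sprint against those paths and reports whether a focused review is needed.

```python
"""Decide whether a sprint's changes trigger a focused security review.

The security-relevant path patterns and the sprint-start tag are
illustrative; a real program would maintain its own list, agreed with
the security lead and the accreditor."""
import fnmatch
import subprocess

# Hypothetical patterns marking security-relevant parts of the codebase.
SECURITY_RELEVANT = [
    "src/auth/*",
    "src/crypto/*",
    "src/session/*",
    "deploy/firewall/*",
]


def changed_files(base: str, head: str) -> list[str]:
    """List files changed between the sprint-start tag and the branch head."""
    out = subprocess.run(
        ["git", "diff", "--name-only", f"{base}...{head}"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]


def needs_focused_review(files: list[str]) -> list[str]:
    """Return the changed files that match a security-relevant pattern."""
    return [
        f for f in files
        if any(fnmatch.fnmatch(f, pat) for pat in SECURITY_RELEVANT)
    ]


if __name__ == "__main__":
    hits = needs_focused_review(changed_files("sprint-12-start", "HEAD"))
    if hits:
        print("Focused security review required for:")
        print("\n".join(f"  {f}" for f in hits))
    else:
        print("No security-relevant changes; standard code review applies.")
```

A check of this kind can run as a pipeline stage at sprint close, so the decision about review scope is recorded alongside the change set rather than reconstructed from memory during planning.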
Sprint-gated reviews also create a continuous evidence base for accreditation. Rather than producing accreditation documentation retrospectively at program end — which is both inefficient and of questionable accuracy — security review records generated per-sprint constitute contemporaneous evidence that security activities were performed as development proceeded. Accreditors increasingly accept this sprint-by-sprint evidence model as more rigorous than point-in-time documentation.
Accreditation-aware CI/CD addresses the air-gap and accreditation timeline challenges. The pipeline is designed with the eventual accreditation environment in mind: it uses only components that can be mirrored in the air-gapped environment, it generates the artifact types that accreditors will require (SBOMs, vulnerability scan reports, static analysis results), and it maintains build reproducibility so that the exact artifact submitted for accreditation can be regenerated if needed.
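The following sketch illustrates the evidence-bundling step of such a pipeline under stated assumptions: the SBOM, vulnerability scan report, and static-analysis output have already been produced by earlier stages (the file names here are placeholders, not the output of any specific tool), and the script records a SHA-256 hash of each so the bundle submitted for accreditation can later be matched against a regenerated build.

```python
"""Assemble per-build accreditation evidence into a hashed manifest.

Artifact names and types are placeholders; the pipeline stages that
produce them (SBOM generation, vulnerability scanning, static analysis)
are whatever tools the program has approved for the air-gapped environment."""
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical artifacts produced by earlier pipeline stages.
EVIDENCE_FILES = [
    "artifacts/sbom.json",
    "artifacts/vuln-scan-report.json",
    "artifacts/static-analysis-results.sarif",
    "artifacts/release-package.tar.gz",
]


def sha256_of(path: Path) -> str:
    """Hash a file so the exact artifact can be matched to a rebuilt one."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(build_id: str, commit: str) -> dict:
    """Record what was built, from which commit, and the hash of each evidence file."""
    return {
        "build_id": build_id,
        "commit": commit,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "evidence": [
            {"file": f, "sha256": sha256_of(Path(f))} for f in EVIDENCE_FILES
        ],
    }


if __name__ == "__main__":
    manifest = build_manifest(build_id="2024.18.3", commit="abc1234")
    Path("artifacts/evidence-manifest.json").write_text(
        json.dumps(manifest, indent=2), encoding="utf-8"
    )
```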
Accreditation-aware CI/CD also sequences work to maximize the portion of accreditation evidence that can be generated automatically. Manual documentation that accreditors require — system security plans, security assessment reports, plans of action and milestones — is templated and updated continuously rather than written from scratch at program end.
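A minimal sketch of that templating approach follows; it regenerates one paragraph of a system security plan from a per-sprint evidence summary maintained by the pipeline. The template wording and field names are invented for illustration, not drawn from any accreditor's required format.

```python
"""Regenerate one section of a system security plan from current evidence.

The template text and field names are illustrative; a real program would
follow the document structure its accreditor expects."""
import json
from pathlib import Path
from string import Template

SSP_SECTION = Template(
    "As of sprint $sprint, $reviewed of $total security-relevant changes have "
    "completed focused review; $open_findings findings remain open and are "
    "tracked in the plan of action and milestones."
)


def render_section(evidence_path: str) -> str:
    """Fill the template from the per-sprint evidence summary kept by the pipeline."""
    evidence = json.loads(Path(evidence_path).read_text(encoding="utf-8"))
    return SSP_SECTION.substitute(
        sprint=evidence["sprint"],
        reviewed=evidence["reviews_completed"],
        total=evidence["security_relevant_changes"],
        open_findings=evidence["open_findings"],
    )


if __name__ == "__main__":
    print(render_section("artifacts/sprint-evidence-summary.json"))
```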
Observed pattern: Programs that run accreditation as a continuous workstream in parallel with Agile development from the outset consistently outperform programs that defer accreditation activities to the end. The overhead of maintaining accreditation awareness throughout the program is lower than the overhead of attempting to reconstruct evidence and remediate findings after development is nominally complete.
Documentation Requirements vs. the Agile Manifesto
The Agile Manifesto states a preference for "working software over comprehensive documentation." This is a reasonable principle for commercial software where the primary measure of progress is delivered value. In defense programs, documentation serves functions beyond capturing what the software does: it is the legal record of what was contracted, what was delivered, and how it was verified; it is the evidence base for regulatory compliance; and it is the artifact that allows long-lived defense systems to be maintained and upgraded by future teams who have no continuity with the original developers.
The practical resolution is not to choose between working software and documentation, but to define explicitly which documentation artifacts are required, when they must be produced, and what level of detail is necessary. A Software Requirements Specification for a defense program should not be confused with a comprehensive Agile backlog; it must be a formal, versioned document with traceability to design and test artifacts. A Software Design Document is not the same as architecture decision records, though both can coexist.
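Traceability of that kind can itself be checked continuously rather than assembled by hand before a milestone review. The sketch below assumes a hypothetical "SRS-nnn" identifier scheme and directory layout, and simply verifies that every requirement identifier appearing in the SRS is referenced by at least one test.

```python
"""Check that every requirement ID in the SRS is referenced by at least one test.

The "SRS-123" identifier scheme and the directory layout are illustrative;
the point is that traceability can be verified on every build rather than
assembled by hand before a milestone review."""
import re
from pathlib import Path

REQ_ID = re.compile(r"\bSRS-\d{3,}\b")


def ids_in(paths: list[Path]) -> set[str]:
    """Collect every requirement ID mentioned in the given files."""
    found = set()
    for path in paths:
        found.update(REQ_ID.findall(path.read_text(encoding="utf-8", errors="ignore")))
    return found


if __name__ == "__main__":
    requirements = ids_in(list(Path("docs/srs").glob("*.md")))
    referenced_by_tests = ids_in(list(Path("tests").rglob("*.py")))
    untraced = sorted(requirements - referenced_by_tests)
    if untraced:
        print("Requirements with no test reference:", ", ".join(untraced))
    else:
        print("Every SRS requirement is referenced by at least one test.")
```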
Defense Agile programs that work effectively produce documentation continuously alongside software, treating documentation artifacts as deliverables in their own right rather than an afterthought. This requires explicit allocation of capacity — if documentation is not represented in sprint planning, it will not be produced at the pace required.
What Actually Works in Practice
The defense programs that successfully apply Agile principles without compromising compliance share several characteristics.
They use a scaled framework — SAFe (Scaled Agile Framework) or a program-specific adaptation — that provides a structure above the sprint level for handling program-level concerns: release planning, dependency management between teams, and interaction with the contracting authority. Raw Scrum without a program-level structure rarely scales to defense program complexity.
They invest in a secure, compliant development infrastructure before the program begins, not during it. The air-gapped environment, the accreditation-aware pipeline, and the document management system are program prerequisites, not products of the first few sprints. Programs that start development before this infrastructure is in place consistently encounter costly and schedule-impacting retrofits.
They distinguish between compliance activities that can be agile (security reviews per sprint, continuously updated documentation, automated evidence generation) and those that cannot (formal accreditation decision, customer acceptance testing, contractual milestone reviews). The former can be integrated into the sprint cadence; the latter must be planned as program-level events with their own schedules and entry criteria.
They train the entire team — not just process owners — on the defense-specific requirements that affect daily work: what constitutes a classified development environment, how to handle export-controlled data, what triggers a change request versus a sprint backlog update. Compliance failures in defense Agile programs most frequently result from team members applying commercial Agile instincts in contexts where those instincts are wrong, not from deliberate non-compliance.