Core Concepts
SPEC-First Development Philosophy:
- EARS format ensures unambiguous requirements
- Requirement clarification prevents scope creep
- Systematic validation through test scenarios
- Integration with TDD workflow for implementation
- Quality gates enforce completion criteria
- Constitution reference ensures project-wide consistency
Constitution Reference (SDD 2025 Standard)
The Constitution defines the project DNA that all SPECs must respect. Before creating any SPEC, verify alignment with the project Constitution defined in .moai/project/tech.md.
Constitution Components:
- Technology Stack: Required versions and frameworks
- Naming Conventions: Variable, function, and file naming standards
- Forbidden Libraries: Libraries explicitly prohibited with alternatives
- Architectural Patterns: Layering rules and dependency directions
- Security Standards: Authentication patterns and encryption requirements
- Logging Standards: Log format and structured logging requirements
Constitution Verification:
- All SPEC technology choices must align with Constitution stack versions
- No SPEC may introduce forbidden libraries or patterns
- SPEC must follow naming conventions defined in Constitution
- SPEC must respect architectural boundaries and layering
WHY: Constitution prevents architectural drift and ensures maintainability
IMPACT: SPECs aligned with Constitution reduce integration conflicts significantly
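The exact layout of .moai/project/tech.md is project-specific; the fragment below is only a sketch of how the Constitution components listed above might be recorded, with every version, library, and rule chosen as an illustrative assumption.

```markdown
## Technology Stack
- Python 3.12, FastAPI 0.115, PostgreSQL 16

## Naming Conventions
- snake_case for functions and variables, PascalCase for classes, kebab-case for files

## Forbidden Libraries
- requests (use httpx instead for async-first HTTP)

## Architectural Patterns
- api -> service -> repository layering; lower layers never import upper layers

## Security Standards
- Passwords hashed with bcrypt; access tokens are short-lived signed JWTs

## Logging Standards
- Structured JSON logs carrying request_id, level, and timestamp
```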
SPEC Workflow Stages
Stage 1 - User Input Analysis: Parse natural language feature description
Stage 2 - Requirement Clarification: Four-step systematic process
Stage 3 - EARS Pattern Application: Structure requirements using five patterns
Stage 4 - Success Criteria Definition: Establish completion metrics
Stage 5 - Test Scenario Generation: Create verification test cases
Stage 6 - SPEC Document Generation: Produce standardized markdown output
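The standardized output template is defined by MoAI itself; as a rough orientation only, a generated SPEC document under .moai/specs/ can be pictured with sections like the following (the section names here are an assumption, not the canonical template).

```markdown
# SPEC-001: <Feature Name>

## Scope                  <!-- Stages 1-2: what is in and out of scope -->
## Constraints            <!-- performance, security, compatibility, scalability -->
## Requirements (EARS)    <!-- Stage 3: one statement per EARS pattern as needed -->
## Success Criteria       <!-- Stage 4: measurable completion metrics -->
## Test Scenarios         <!-- Stage 5: normal, error, edge, and security cases -->
## Constitution Alignment <!-- stack, naming, and architecture checks -->
```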
EARS Format Deep Dive
Ubiquitous Requirements - Always Active:
- Use case: System-wide quality attributes
- Examples: Logging, input validation, error handling
- Test strategy: Include in all feature test suites as common verification
Event-Driven Requirements - Trigger-Response:
- Use case: User interactions and inter-system communication
- Examples: Button clicks, file uploads, payment completions
- Test strategy: Event simulation with expected response verification
State-Driven Requirements - Conditional Behavior:
- Use case: Access control, state machines, conditional business logic
- Examples: Account status checks, inventory verification, permission checks
- Test strategy: State setup with conditional behavior verification
Unwanted Requirements - Prohibited Actions:
- Use case: Security vulnerabilities, data integrity protection
- Examples: No plaintext passwords, no unauthorized access, no PII in logs
- Test strategy: Negative test cases with prohibited behavior verification
Optional Requirements - Enhancement Features:
- Use case: MVP scope definition, feature prioritization
- Examples: OAuth login, dark mode, offline mode
- Test strategy: Conditional test execution based on implementation status
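The five patterns follow the standard EARS sentence templates. The statements below are hypothetical examples for a login feature, written only to show the shape of each pattern:

```markdown
- Ubiquitous:   The system shall record every authentication attempt as a structured log entry.
- Event-driven: When a user submits valid credentials, the system shall issue a session token.
- State-driven: While an account is locked, the system shall reject login attempts with an explanatory error.
- Unwanted:     If a password is received, then the system shall never persist it in plaintext.
- Optional:     Where OAuth login is enabled, the system shall support Google and GitHub providers.
```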
Requirement Clarification Process
Step 0 - Assumption Analysis (Philosopher Framework):
Before defining scope, surface and validate underlying assumptions using AskUserQuestion.
Assumption Categories:
- Technical Assumptions: Technology capabilities, API availability, performance characteristics
- Business Assumptions: User behavior, market requirements, timeline feasibility
- Team Assumptions: Skill availability, resource allocation, knowledge gaps
- Integration Assumptions: Third-party service reliability, compatibility expectations
Assumption Documentation:
- Assumption Statement: Clear description of what is assumed
- Confidence Level: High, Medium, or Low based on evidence
- Evidence Basis: What supports this assumption
- Risk if Wrong: Consequence if assumption proves false
- Validation Method: How to verify before committing significant effort
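A filled-in assumption record, using the fields above with invented content for illustration, might look like this:

```markdown
- Assumption Statement: The existing user table can store OAuth provider IDs without a schema migration.
- Confidence Level: Medium
- Evidence Basis: The current schema already has an unused external_id column.
- Risk if Wrong: The login SPEC is blocked until a separate migration SPEC is completed.
- Validation Method: Inspect the staging schema and run a spike insert before committing to the SPEC.
```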
Step 0.5 - Root Cause Analysis:
For feature requests or problem-driven SPECs, apply Five Whys:
- Surface Problem: What is the user observing or requesting?
- First Why: What immediate need drives this request?
- Second Why: What underlying problem creates that need?
- Third Why: What systemic factor contributes?
- Root Cause: What fundamental issue must the solution address?
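A worked Five Whys chain for a hypothetical request shows how the root cause can redirect the SPEC away from the surface ask:

```markdown
- Surface Problem: "Add a password-reset email feature."
- First Why: Users locked out of their accounts are filing support tickets.
- Second Why: Sessions expire after 24 hours, so users re-enter passwords they have forgotten.
- Third Why: Session lifetime was shortened as a blanket response to a token-theft incident.
- Root Cause: Token handling lacks rotation and revocation, so the SPEC should address session security rather than only reset emails.
```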
Step 1 - Scope Definition (illustrated here with an authentication feature):
- Identify supported authentication methods
- Define validation rules and constraints
- Determine failure handling strategy
- Establish session management approach
Step 2 - Constraint Extraction:
- Performance Requirements: Response time targets
- Security Requirements: OWASP compliance, encryption standards
- Compatibility Requirements: Supported browsers and devices
- Scalability Requirements: Concurrent user targets
Step 3 - Success Criteria Definition:
- Test Coverage: Minimum percentage target
- Response Time: Percentile targets (P50, P95, P99)
- Functional Completion: All normal scenarios pass verification
- Quality Gates: Zero linter warnings, zero security vulnerabilities
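Concrete targets make these criteria verifiable; the numbers below are placeholders a team would replace with its own thresholds:

```markdown
- Test Coverage: >= 85% line coverage on modules introduced by the SPEC
- Response Time: P50 <= 100 ms, P95 <= 300 ms, P99 <= 800 ms
- Functional Completion: every normal-case scenario in the SPEC passes
- Quality Gates: 0 linter warnings, 0 known security vulnerabilities
```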
Step 4 - Test Scenario Creation:
- Normal Cases: Valid inputs with expected outputs
- Error Cases: Invalid inputs with error handling
- Edge Cases: Boundary conditions and corner cases
- Security Cases: Injection attacks, privilege escalation attempts
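One scenario per category, again using a hypothetical login feature, illustrates the Given-When-Then shape these cases typically take:

```markdown
- Normal:   Given a registered user, when valid credentials are submitted, then a session token is returned.
- Error:    Given a registered user, when a wrong password is submitted, then a 401 is returned without revealing which field failed.
- Edge:     Given a password at the maximum allowed length, when it is submitted, then validation succeeds without truncation.
- Security: Given the login form, when a SQL injection payload is entered as the username, then the input is rejected and no query is executed.
```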
Plan-Run-Sync Workflow Integration
PLAN Phase (/moai:1-plan):
- manager-spec agent analyzes user input
- EARS format requirements generation
- Requirement clarification with user interaction
- SPEC document creation in .moai/specs/ directory
- Git branch creation (optional --branch flag)
- Git Worktree setup (optional --worktree flag)
RUN Phase (/moai:2-run):
- manager-tdd agent loads SPEC document
- RED-GREEN-REFACTOR TDD cycle execution
- moai-workflow-testing skill reference for test patterns
- Domain Expert agent delegation (expert-backend, expert-frontend, etc.)
- Quality validation through manager-quality agent
SYNC Phase (/moai:3-sync):
- manager-docs agent synchronizes documentation
- API documentation generation from SPEC
- README and architecture document updates
- CHANGELOG entry creation
- Version control commit with SPEC reference
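End to end, the three phases are driven by the three commands named above; apart from the --branch and --worktree flags mentioned in this document, no additional options are assumed here:

```text
# PLAN: analyze the feature description and generate the SPEC (optionally on its own branch)
/moai:1-plan "user login with email and password" --branch

# RUN: execute the RED-GREEN-REFACTOR cycle against the generated SPEC
/moai:2-run

# SYNC: update documentation and CHANGELOG, then commit with the SPEC reference
/moai:3-sync
```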
Parallel Development with Git Worktree
Worktree Concept:
- Independent working directories for multiple branches
- Each SPEC gets isolated development environment
- No branch switching needed for parallel work
- Reduced merge conflicts through feature isolation
Worktree Creation:
- The command /moai:1-plan "login feature" "signup feature" --worktree creates one SPEC per feature description
- The result is a project-worktrees directory containing an isolated subdirectory per SPEC, as sketched below
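Under the hood this corresponds to standard git worktree operations; the directory and branch names below are illustrative, and the actual layout MoAI produces may differ:

```bash
# Roughly what the --worktree flag automates: one checkout per SPEC branch
git worktree add ../project-worktrees/spec-login  -b feature/spec-login
git worktree add ../project-worktrees/spec-signup -b feature/spec-signup

# Each directory is an independent working copy; list and clean up as needed
git worktree list
git worktree remove ../project-worktrees/spec-login
```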
Worktree Benefits:
- Parallel Development: Multiple features developed simultaneously
- Team Collaboration: Clear ownership boundaries per SPEC
- Dependency Isolation: Different library versions per feature
- Risk Reduction: Unstable code does not affect other features
---