# Agency Playbook

A practical guide for agencies and consultancies implementing AI coding assistants for clients.

## Understanding Client Contexts

### Client Assessment Framework

Before recommending patterns, evaluate each client along three dimensions (a scoring sketch follows the list):
- **Risk Tolerance**
  - Conservative: Established enterprises, regulated industries
  - Moderate: Growth-stage companies, competitive markets
  - Aggressive: Startups, innovation-focused organizations
- **Technical Maturity**
  - Legacy systems and technical debt
  - Modern infrastructure and practices
  - Cloud-native and DevOps culture
- **Organizational Readiness**
  - Developer openness to AI tools
  - Management support and budget
  - Existing innovation processes
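
One way to make the assessment repeatable is to score each dimension from 1 to 3 and map the weighted total to a starting engagement. A minimal sketch; the weights and thresholds are illustrative assumptions, not calibrated values:

```python
# Illustrative client readiness scorecard. Weights and thresholds are
# assumptions for this sketch -- calibrate them against past engagements.

WEIGHTS = {
    "risk_tolerance": 0.3,            # 1 = conservative, 3 = aggressive
    "technical_maturity": 0.4,        # 1 = legacy-heavy, 3 = cloud-native
    "organizational_readiness": 0.3,  # 1 = skeptical, 3 = eager
}

def readiness_score(scores: dict[str, int]) -> float:
    """Weighted average of the 1-3 scores for each dimension."""
    return sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)

def recommend(score: float) -> str:
    """Map a weighted score to a starting engagement (thresholds assumed)."""
    if score < 1.7:
        return "Assessment Package: build readiness before piloting"
    if score < 2.4:
        return "Pilot Package: prove value on a contained use case"
    return "Implementation Package: ready for production rollout"

client = {"risk_tolerance": 3, "technical_maturity": 3, "organizational_readiness": 2}
score = readiness_score(client)
print(f"score={score:.2f} -> {recommend(score)}")  # score=2.70 -> Implementation Package: ...
```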

## Client Engagement Phases

### Phase 1: Discovery and Assessment

**Patterns to Review:**
- Philosophy and Mindset - Set expectations
- Risk Assessment - Evaluate readiness
- Framework Selection Guide - Tool evaluation

**Deliverables:**
- Readiness assessment report
- Risk analysis and mitigation plan
- Recommended implementation approach
- ROI projections (see the calculation sketch below)
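
ROI projections are easier to defend when the arithmetic is explicit. A minimal sketch that models only recovered developer time; every input below is a placeholder to be replaced with the client's actual numbers:

```python
# Back-of-the-envelope ROI projection for an AI coding assistant.
# All inputs are placeholder assumptions; use measured pilot data.

def annual_roi(
    developers: int,
    hours_saved_per_dev_per_week: float,
    loaded_hourly_rate: float,          # salary plus overhead, per hour
    license_cost_per_dev_per_year: float,
    weeks_per_year: int = 46,           # working weeks after leave/holidays
) -> float:
    """Return ROI as a ratio: (benefit - cost) / cost."""
    benefit = (developers * hours_saved_per_dev_per_week
               * weeks_per_year * loaded_hourly_rate)
    cost = developers * license_cost_per_dev_per_year
    return (benefit - cost) / cost

# Example: 20 developers saving 3 hours/week at a $100 loaded rate,
# with seats at $2,400/year.
print(f"ROI: {annual_roi(20, 3.0, 100.0, 2400.0):.2f}x")  # ROI: 4.75x
```

The hours-saved figure is the input clients will challenge first, so source it from the pilot's own metrics baseline rather than vendor benchmarks.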

### Phase 2: Pilot Implementation

**Patterns to Apply:**
- Core Architecture - Foundation setup
- Initialization Process - Getting started
- Real World Examples - Practical demos

**Deliverables:**
- Pilot environment setup
- Initial use case implementation
- Performance metrics baseline
- Pilot evaluation report

### Phase 3: Production Rollout

**Patterns to Implement:**
- Team Workflows - Team collaboration
- Deployment Guide - Production deployment
- Observability and Monitoring - Operations setup

**Deliverables:**
- Production deployment
- Monitoring and analytics setup
- Training materials and documentation
- Handover and support plan

## Pattern Selection by Client Type

### Startup Clients

**Recommended Approach:**
- Start with low-risk patterns
- Focus on velocity and iteration
- Keep governance overhead minimal

**Key Patterns:**
- Building Your Own AMP - Custom solutions
- Parallel Tool Execution - Speed optimization
- Feature Flag Integration - Rapid iteration

### Mid-Market Clients

**Recommended Approach:**
- Balance low-risk and managed-risk patterns
- Focus on team collaboration
- Introduce governance gradually

**Key Patterns:**
- From Local to Collaborative - Team adoption
- Authentication and Identity - Access management

### Enterprise Clients

**Recommended Approach:**
- Begin by building awareness of high-risk patterns
- Establish comprehensive governance from day one
- Integrate with existing systems

**Key Patterns:**
- Enterprise Integration - System connections
- The Permission System - Access control
- Performance at Scale - Enterprise scale

### Regulated Industry Clients

**Recommended Approach:**
- Compliance-first approach
- Extensive audit trails
- Zero-trust security model

**Key Patterns:**
- Sharing and Permissions - Data classification
- Risk Assessment - Compliance evaluation
- Observability and Monitoring - Audit trails (see the example record below)
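
For regulated clients it helps to agree up front on what an audit trail entry captures for each AI-assisted change. A minimal sketch of one possible record shape; every field name is an illustrative assumption to be adapted to the client's compliance regime:

```python
import json
from datetime import datetime, timezone

# One possible shape for an audit trail entry covering an AI-assisted
# change. Field names are illustrative assumptions, not a standard schema.
entry = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "actor": "jdoe",                      # developer who accepted the change
    "tool": "ai-coding-assistant",        # assistant that produced the suggestion
    "model_version": "example-model-v1",  # pin the model for reproducibility
    "repository": "payments-service",
    "files_touched": ["src/ledger.py"],
    "data_classification": "restricted",  # ties into Sharing and Permissions
    "human_review": {"reviewer": "asmith", "approved": True},
}
print(json.dumps(entry, indent=2))  # ship to the client's log retention system
```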

## Common Client Objections

### "Our code is too sensitive"

**Response Patterns:**
- Implement The Permission System
- Start with non-sensitive projects
- Use on-premise deployment options
- Demonstrate security controls and compliance
"AI will replace our developers"
Response Patterns:
- Focus on augmentation, not replacement
- Show Real World Examples
- Emphasize skill development opportunities
- Demonstrate productivity gains for existing team
"It's too expensive"
Response Patterns:
- Calculate ROI with metrics from the pilot (the Phase 1 sketch above shows the arithmetic)
- Start with small team trials
- Compare tool costs to developer salary costs
- Show Performance Tuning for cost optimization
"We don't trust AI-generated code"
Response Patterns:
- Implement review workflows
- Start with low-risk use cases
- Show System Prompts and Model Settings
- Demonstrate quality improvements

## Pricing and Packaging

### Assessment Package
- 2-week engagement
- Readiness assessment
- Tool evaluation
- Implementation roadmap

### Pilot Package
- 4-6 week engagement
- Environment setup
- Use case implementation
- Team training
- Success metrics

### Implementation Package
- 3-6 month engagement
- Full production deployment
- Integration with existing systems
- Comprehensive training
- Ongoing support setup

### Transformation Package
- 6-12 month engagement
- Organization-wide rollout
- Process transformation
- Center of Excellence (CoE) establishment
- Continuous optimization

## Success Metrics

### Leading Indicators
- Developer activation rate (computed in the sketch after this section)
- Tool usage frequency
- Feature adoption rate
- User satisfaction scores

### Lagging Indicators
- Velocity improvement
- Bug reduction rate
- Time to market decrease
- Developer retention improvement
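
Leading indicators can be computed directly from tool usage telemetry. A minimal sketch, assuming one log record per developer per active day; the event shape is an assumption to adapt to whatever the chosen tool actually exports:

```python
# Compute two leading indicators from raw usage events.
# Event shape is an assumption: one record per developer per active day.

events = [
    {"developer": "alice", "day": "2025-01-06"},
    {"developer": "alice", "day": "2025-01-07"},
    {"developer": "bob", "day": "2025-01-06"},
]
licensed_developers = {"alice", "bob", "carol"}

# Activation rate: share of licensed developers who have used the tool at all.
active = {e["developer"] for e in events}
activation_rate = len(active) / len(licensed_developers)

# Usage frequency: mean active days per activated developer.
days_per_dev: dict[str, set[str]] = {}
for e in events:
    days_per_dev.setdefault(e["developer"], set()).add(e["day"])
usage_frequency = sum(len(d) for d in days_per_dev.values()) / len(days_per_dev)

print(f"activation rate: {activation_rate:.0%}")   # activation rate: 67%
print(f"mean active days: {usage_frequency:.1f}")  # mean active days: 1.5
```

Track these weekly during the pilot so the evaluation report and ROI projection rest on measured data.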

## Resources and Tools

### Assessment Tools
- Risk Assessment template
- Framework Selection Guide
- ROI calculation spreadsheet

### Implementation Resources
- Pattern Template for documentation
- Migration Strategies for transitions
- Lessons Learned for pitfall avoidance

### Case Studies
- AMP Implementation Cases
- Claude Code vs Anon Kode comparison
- Framework Wars Analysis for tool selection