Development process isn’t glamorous, but it’s the difference between projects that deliver on time and projects that drift. After integrating AI tools deeply into my workflow, my process has evolved. Here’s what it actually looks like now.
Discovery and Requirements
This phase is almost entirely human. I talk to clients, understand the business problem, map out what the software needs to do, and figure out the edge cases that aren’t obvious from the initial description.
AI’s role here is limited but useful. After a discovery call, I’ll sometimes use Claude to help me organize and structure my notes — “here’s what I learned, help me identify any gaps in my understanding” or “turn these notes into a requirements doc.” The AI is organizing my thinking, not doing the discovery.
The actual understanding of the business comes from conversation and experience. No shortcut here.
Architecture and Design
This is where I’ve found AI most useful as a thinking partner, not an authority.
For a new project, I’ll describe the domain and constraints and ask “what are the trade-offs between these two approaches?” This produces a structured comparison I can evaluate. The AI doesn’t decide — I decide, informed by the analysis.
For database schema design, I’ll draft a schema and ask for potential problems. This often surfaces things I’d catch in code review later, but earlier is cheaper.
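As a concrete illustration of what that early review surfaces, here is a hypothetical draft schema (the tables and columns are invented for this example) with two issues this kind of question reliably flags: money stored as a float, and a foreign key with no index.

```sql
-- Hypothetical draft: orders references customers, but...
CREATE TABLE orders (
  id          BIGINT PRIMARY KEY,
  customer_id BIGINT NOT NULL REFERENCES customers(id),
  total       FLOAT  -- issue 1: floats accumulate rounding error for money;
                     -- store integer cents or use NUMERIC(12,2) instead
);

-- issue 2: without an index, "orders for this customer" scans the table
CREATE INDEX idx_orders_customer_id ON orders (customer_id);
```

Neither problem is exotic, but catching them before the first migration runs is much cheaper than after production data exists.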
For integration design — how does this application talk to the ERP, or the payment processor, or the shipping API — AI is excellent at helping me plan the contract before I write any code.
Implementation
This is the core of the workflow change. My typical feature build:
1. Write the spec. Before touching the keyboard for code, I write what the feature needs to do. Comments in the file, a spec document, a GitHub issue — doesn’t matter. The act of writing forces clarity.
2. Generate the scaffold. Model, migration, controller, routes, serializer, test file stubs — the AI generates this in seconds and I review it. Most of it is correct, some needs adjustment, and occasionally there’s a significant error I have to catch. Even with the review step, it beats the 20 minutes it would have taken me to write the scaffold by hand.
3. Implement the business logic. This part I write. The complex rules, the edge cases, the domain-specific handling. AI might help me look up syntax or suggest a cleaner way to write something, but the logic itself comes from my understanding of the business.
4. Write tests. AI generates test stubs based on the code. I fill in the edge cases and business rule tests. I run everything and verify behavior against the spec from step 1.
5. Review. I use AI to do a first-pass review — “are there any obvious issues with this implementation?” It catches things sometimes. I also do my own review. Then the code ships.
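The spec-then-logic-then-tests loop above can be sketched in miniature. This is a hedged illustration in plain Ruby: the class name, the tier thresholds, and the discount rules are all hypothetical, not from any real project.

```ruby
# Step 1: the spec, written before any code.
#   - Orders under $100 get no discount.
#   - Orders of $100+ get 5%; orders of $500+ get 10%.
#   - VIP customers get an extra 5% on top, capped at 15% total.

# Step 3: the business logic, which I write by hand.
class DiscountCalculator
  def self.percent_for(total_cents:, vip: false)
    base = if total_cents >= 50_000 then 10
           elsif total_cents >= 10_000 then 5
           else 0
           end
    discount = vip ? base + 5 : base
    [discount, 15].min
  end
end

# Step 4: tests verifying behavior against the spec, edge cases included.
raise "boundary"   unless DiscountCalculator.percent_for(total_cents: 9_999) == 0
raise "tier 1"     unless DiscountCalculator.percent_for(total_cents: 10_000) == 5
raise "vip capped" unless DiscountCalculator.percent_for(total_cents: 50_000, vip: true) == 15
```

The point isn’t the arithmetic; it’s that the tests at the bottom are checked against the spec at the top, not against whatever the code happens to do.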
Integration and External Services
Connecting to third-party APIs is where I’ve found AI most valuable, second only to boilerplate generation.
API integration usually involves:
- Understanding the external API’s contract (endpoints, auth, rate limits)
- Writing the client library
- Writing tests that don’t actually hit the production API
- Handling errors and retries gracefully
AI is good at generating client libraries and test helpers once I’ve described the API contract. It knows the common patterns — exponential backoff, idempotency keys for payment APIs, webhook signature verification — and applies them correctly most of the time.
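Two of those patterns, sketched as a small Ruby helper module. This is a hedged example, not any specific provider’s client: the module name, error class, and backoff parameters are assumptions you would tune per API.

```ruby
require "openssl"

module ApiClientHelpers
  # Errors a client would classify as retryable (timeouts, 429s, 503s).
  TransientError = Class.new(StandardError)

  # Exponential backoff: retry the block with delays of roughly
  # base_delay, 2x, 4x, ... plus a little jitter, up to max_attempts.
  def self.with_retries(max_attempts: 4, base_delay: 0.5)
    attempts = 0
    begin
      attempts += 1
      yield
    rescue TransientError
      raise if attempts >= max_attempts
      sleep(base_delay * (2**(attempts - 1)) + rand * 0.1)
      retry
    end
  end

  # Webhook signature verification: recompute the HMAC-SHA256 of the raw
  # payload and compare in constant time to avoid timing attacks.
  def self.valid_signature?(payload, signature, secret)
    expected = OpenSSL::HMAC.hexdigest("SHA256", secret, payload)
    OpenSSL.secure_compare(expected, signature)
  end
end
```

Usage looks like `ApiClientHelpers.with_retries { client.create_charge(...) }`, with the client raising `TransientError` for retryable failures only — retrying a non-idempotent call blindly is exactly the kind of mistake the review step exists to catch.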
Code Review and Refactoring
I use AI to review my own code before submitting PRs. It’s not a replacement for peer review, but it’s a useful first pass.
More usefully, I use AI for refactoring assistance. “This method is doing too much — help me break it apart” or “this code works but it’s hard to read — what’s a cleaner version?” These are conversations AI handles well.
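Here is the shape of that conversation in miniature. The class below is a hypothetical sketch: what was once a single method that parsed, validated, and summed in one pass, broken into one method per responsibility.

```ruby
# After asking "this method is doing too much — help me break it apart":
# parsing, validation, and totaling each get their own small method.
class OrderReport
  def initialize(lines)
    @lines = lines # e.g. ["widget,2,300", "gadget,1,1250"]
  end

  def total_cents
    parsed_items.sum { |item| item[:qty] * item[:price_cents] }
  end

  private

  def parsed_items
    @lines.map { |line| parse_line(line) }.each { |item| validate!(item) }
  end

  def parse_line(line)
    name, qty, price = line.split(",")
    { name: name, qty: Integer(qty), price_cents: Integer(price) }
  end

  def validate!(item)
    raise ArgumentError, "bad quantity for #{item[:name]}" unless item[:qty] > 0
  end
end
```

The behavior is unchanged, which is the point: each small method is now independently testable and independently readable.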
Debugging
When I’m stuck on a bug, describing it to an AI often helps me find the issue — not because the AI knows my system, but because explaining it clearly to another party (even a machine) forces me to think through the assumptions I’m making.
The “rubber duck debugging” concept is real. AI is a very patient and reasonably intelligent rubber duck.
Deployment and Operations
Mostly automated through existing tooling (CI/CD, deployment scripts, monitoring alerts). AI helps me write and review Dockerfiles, nginx configuration, and CI pipeline configuration. It’s good at this.
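For a sense of what that output looks like, a minimal Dockerfile sketch for a Ruby web app. The base image tag, port, and server command are assumptions; adjust them to your stack.

```dockerfile
# Minimal sketch for a Ruby web app; tag, port, and command are assumptions.
FROM ruby:3.3-slim

WORKDIR /app

# Copy the Gemfiles and install first, so this layer stays cached
# when only application code changes.
COPY Gemfile Gemfile.lock ./
RUN bundle install --jobs 4

COPY . .

EXPOSE 3000
CMD ["bundle", "exec", "puma", "-p", "3000"]
```

The layer-ordering trick (gems before app code) is exactly the kind of detail AI gets right by default and that’s tedious to rediscover by hand.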
The Net Effect
Projects move faster. Not because I’m cutting corners — the spec still gets written, the tests still get written, the review still happens. But the implementation time on each step is compressed.
On a recent project, I estimated I delivered in 6 weeks something that would have taken 8-9 weeks two years ago. That extra 2-3 weeks of capacity gets reinvested in quality — more thorough testing, better documentation, more time spent on the hard architectural decisions.
If you want software built by someone who uses modern tools seriously and has the experience to use them well, let’s talk.