~/sideeffects
§ TRAINING

The AI-Native Engineer

You're not being replaced. You're being promoted.

Live training that turns strong engineers into better ones. Your team learns to ship faster with AI, not get replaced by it.

§ 01 — The problem

Your team has AI tools. Now what?

The ground is shifting under every engineering team. Leadership bought the licenses. Everyone has Copilot. Maybe someone tried Cursor. But the gap between "we have AI tools" and actually shipping faster with them? Bigger than anyone wants to admit.

Most teams are stuck on autocomplete. Meanwhile the engineers who've figured this out are building in a week what used to take a quarter. The difference isn't the tool. It's how you work with it.

§ 02 — The reframe

The job changed. The craft didn't.

AI writes the code now. So what do you do? The same things that always mattered: decide what to build, design how it fits together, and make sure it actually works. That's engineering.

Ten years of experience didn't stop being useful. You just apply it differently. Less typing, more thinking. Less grunt work, more building.

Honestly? It's the most fun we've had writing software in years. Your team should be having this much fun too.

§ 03 — Receipts

Receipts.

  • 3× — GitHub contributions in one year
  • 0 — lines of code written by hand in 6 months
  • 20+ — repos maintained solo with AI workflows

Nick's numbers. Output tripled. Quality held. Staff Engineer, still shipping every day.

Nick built a production AI agent (the WorkOS CLI AI Installer) that configures and ships SDK integrations on its own, and he maintained 20+ repos as a solo DX engineer using these same workflows. This is how we actually work, and it's exactly what we'll teach your team to do.

§ 04 — Curriculum

What your team will learn

A baseline curriculum tailored to your team's stack, experience, and goals.

  1. The AI Mindset

    Before touching any tools — how to think about working with AI. What changes, what doesn't, and why your engineering experience matters more now, not less.

    • From writing code to directing agents — mental models that work
    • Addressing the fear and FUD head-on
    • Where AI excels vs. where you still need a human in the loop
    • Why senior engineers have an advantage, not a disadvantage
  2. Context is Everything

    The difference between "make this work" and "build this well" is the context you provide. Whirlwind tour of AI coding tools — we focus on whatever your team has access to.

    • Prompt engineering for real codebases, not toy examples
    • Structuring context so the agent builds what you actually want
    • Hands-on with your team's tools (Claude Code, Copilot, Cursor, etc.)
    • When to use chat, inline, and agentic modes
  3. Build It Live

    We build something complex together, in real-time. Not slides about best practices. Actual code, actual problems, actual shipping.

    • Pick a real feature or project and build it start to finish
    • Watch the workflow in action — how to direct, iterate, and course-correct
    • Everyone codes along and builds the same thing with their own tools
    • See what "fast" actually looks like when you're working with agents
  4. Architecture-First Thinking

    When code is cheap, design matters more. How to think about systems, tradeoffs, and the decisions that agents can't make for you.

    • Designing systems when generating code costs nothing
    • Tradeoff analysis and making decisions agents can't
    • How to spec work so agents produce better output
    • Avoiding the "it works but it's a mess" trap
  5. Review, Verify, Ship

    Reading code is the job now. How to evaluate what AI wrote, catch the subtle bugs it introduces, and test code you didn't write.

    • Code review patterns for AI-generated code
    • Common failure modes — what agents get wrong and how to spot it
    • Testing strategies when you didn't write the implementation
    • Building confidence in code you didn't type
  6. Working in Parallel

    Nick tripled his GitHub contributions in one year. Here's how to run multiple workstreams at once with agents doing the heavy lifting.

    • Running multiple agents on different tasks simultaneously
    • Managing context across parallel workstreams
    • When to parallelize and when to focus
    • Practical workflows for juggling projects without losing quality
§ 05 — For leaders

For engineering leaders

You got the mandate: adopt AI. But what does that actually mean for your team? How do you measure it? What does "good" look like? This isn't just training for ICs.

We cover how to think about AI adoption as a leader — setting expectations, measuring impact, building conventions that stick, and knowing what to watch out for.

First 30 Days

Team adopts new workflows. You see AI usage shift from autocomplete to agentic coding. PR velocity starts climbing.

60 Days

Engineers are self-sufficient with AI tools. Internal conventions are established. The team has a shared language for how they work with AI.

90 Days

Shipping velocity is measurably higher. Engineers are tackling projects they would've deprioritized before. The team wonders how they worked without this.

§ 06 — Add-ons

Go deeper where you need it.

Everyone gets the core modules. These are examples of add-ons we've built before — or we can create something new based on what your team actually needs.

  • Tool Selection & Evaluation

    Cutting through the noise. Which tools actually work, which are hype, and how to evaluate them for your stack.

  • Internal Conventions & Guardrails

    Setting team standards for AI usage. Style guides, review policies, quality gates.

  • Something Else Entirely

    Have a specific challenge? We'll scope a custom module together on the intro call.

§ 07 — Format

How it works.

Live, not pre-recorded

Real-time instruction with Q&A and pair programming. Remote or on-site — whatever works for your team.

Your stack, your problems

We adapt the curriculum to your languages, frameworks, and the projects you're actually working on.

Flexible format

One intensive day, a series of sessions over a week, or something in between. We scope it on the intro call.

§ 08 — Deliverables

What your team walks away with.

  • AI Workflow Playbook

    A written guide covering the workflows, patterns, and prompts covered in training — tailored to your stack.

  • Working Examples

    The code your team built during the live session — real examples to reference, not throwaway demos.

  • Team Conventions Template

    A starting point for internal AI usage guidelines — review standards, quality gates, and tool recommendations.

  • Session Recording

    Full recording of the training for team members who couldn't attend or anyone who wants a refresher.

  • Optional Follow-Up Session

30-60 days after training, we regroup. Address new questions, reinforce what stuck, and troubleshoot what didn't.

§ 09 — Instructor

Meet your instructor.

Nick Nisi

I'm a Staff Engineer. I haven't written a line of code by hand in six months and my output has never been higher. I built the WorkOS CLI AI Installer (a production agent that ships SDK integrations on its own), I've given 50+ conference talks, and I was a host on JS Party.

50+ Conference Talks · Staff Engineer · JS Party Alum · Open Source

§ 10 — Fit

Is this right for your team?

$ great-fit
  • Teams of 3-50 engineers shipping a product
  • Organizations where leadership said "use AI" but nobody explained how
  • Engineers using Copilot for autocomplete but not much else
  • Eng managers who need to measure and report on AI adoption
$ not-a-fit
  • Looking for a course on building AI/ML models or agents
  • Non-technical teams or individuals looking for 1:1 coaching
  • Teams already shipping with agentic workflows daily
§ 11 — Pricing

Calculate your training investment.

Priced based on team size with volume discounts for larger teams.

  • Volume discounts

    Lower per-head rate for larger teams.

  • Follow-up sessions

    Optional reinforcement after training.

  • Satisfaction guarantee

    Your team ships with confidence or we make it right.

Training cost calculator

Enter your team size (minimum 3 engineers) to estimate your per-engineer rate and total investment.

Follow-Up Session — $2,500

A follow-up session 30-60 days after training to reinforce learnings and address new questions.

Larger team? Let's talk — we'll put together custom pricing.

Schedule a consultation
§ 12 — FAQ

Common questions.

What makes this different from AI/ML courses?
We don't build AI models. We teach engineers how to use AI to ship software faster. No RAG pipelines, no fine-tuning. Just practical workflows you'll use on Monday.
How long is the training?
Depends on your team. Could be a day, could be a week. We figure out the right scope together on the intro call.
What's the ideal team size?
3-15 engineers. Small enough that everyone gets hands-on time, big enough for good discussion.
Do we need AI experience?
Nope. You need to be a solid engineer. That's the prerequisite. We handle the AI part.
What tools do you cover?
Claude Code, Claude Desktop, Cursor, GitHub Copilot, and others. Every team's toolset is different — we'll focus on what your company already has access to and what makes sense for your stack.
Can we customize the curriculum?
That's the whole idea. The six core modules are the baseline. We go deeper wherever your team needs it.
What's the investment?
There's a pricing calculator on this page for a ballpark. Exact numbers depend on team size and what we cover. Let's talk about it on a call.
§ 13 — Next step

Let's talk about your team.

30 minutes. No pitch deck. We'll figure out if this is a good fit and what your team actually needs.