# How to Onboard New Developers with AI Coding Rules
Use AI coding rules to get new team members writing production-quality code from day one. Faster onboarding, fewer review cycles, less tribal knowledge lost.
## The onboarding problem nobody talks about
A new developer joins your team. They're talented. Their take-home project was excellent. You're excited.
Then their first PR lands. It's technically correct, but it looks nothing like your codebase. Error handling is inconsistent. The naming conventions are off. They used a pattern you abandoned six months ago. The PR review turns into a 20-comment thread explaining context that exists nowhere in the docs.
This isn't a people problem. It's a knowledge transfer problem.
Your team has accumulated years of tribal knowledge: decisions made, patterns agreed upon, pitfalls discovered the hard way. That knowledge lives in people's heads, in Slack threads, in buried PR comments. A new hire has no way to absorb it quickly, and you have no efficient way to transmit it.
AI coding rules change this equation entirely.
## What AI coding rules actually encode
When developers write AI coding rules, they're forced to make implicit knowledge explicit. Everything your senior engineers do automatically (preferring named exports, always adding Zod validation at API boundaries, using a specific error-handling pattern) gets written down in a form the AI can act on.
The rules become a living document of how your team writes code. And when a new developer installs those rules into their editor, they get months of accumulated best practices before they write their first line of code.
This is fundamentally different from a style guide or README. A README sits in a browser tab that gets closed. AI rules are active participants in every code generation, edit, and review session. The AI won't let a new hire forget them.
Think about what this means for the most common onboarding problems:
- **Inconsistent patterns**: The AI enforces your patterns on every file, not just the ones a reviewer happens to check.
- **Unknown pitfalls**: "Never use X because Y" goes in the rules. The AI will warn against it automatically.
- **Stack unfamiliarity**: New hires often know the language but not your specific library choices. Rules document why you chose Drizzle over Prisma, how you structure React Query hooks, what your Tailwind conventions are.
- **Unspoken architecture decisions**: The monorepo structure, where shared utilities live, which packages are for internal use only. All of this can be captured.
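Any one of those items can be captured as a small rule file. Here is a hypothetical sketch of a pitfall rule in the `.mdc` format used throughout this post (the pitfall and the "shared date utility" are invented for illustration):

```markdown
---
description: Known pitfall — date parsing
alwaysApply: true
---
## Never parse user-supplied dates with the native Date constructor

Parsing of ambiguous date strings varies by runtime and locale.
Use the team's shared date utility instead. If you believe an
exception is warranted, raise it in the PR description first.
```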
## Setting up a new-hire rules kit
An onboarding rules kit is a curated collection of rules specifically designed to accelerate ramp-up. It's distinct from your general team rules: it goes deeper on context, explains more, and covers things senior engineers take for granted.
Here's a structure that works well:
```
.cursor/rules/
  onboarding/
    stack-overview.mdc    # What we use and why
    architecture.mdc      # Project structure decisions
    patterns.mdc          # Recurring code patterns with examples
    anti-patterns.mdc     # Things we specifically avoid
    workflow.mdc          # PR process, branch naming, commit style
```
### stack-overview.mdc
This file does the job of the technology section of your onboarding doc, but the AI will actually use it:
```markdown
---
description: Tech stack overview for new team members
alwaysApply: true
---
## Our stack (and why)

- **Next.js 15 App Router** — We migrated from Pages Router in Q3 2025. Do not use Pages Router patterns.
- **Drizzle ORM** — Chosen over Prisma for edge compatibility. Schema lives in src/db/schema.ts.
- **Zod** — All external data (API inputs, env vars, form data) is validated with Zod. No exceptions.
- **React Query** — All server state. Do not use useState + useEffect for data fetching.
- **Tailwind CSS v4** — No CSS modules, no styled-components, no inline styles.

When in doubt about a library choice, ask before installing something new.
```
### patterns.mdc
This is the most valuable file. It shows the actual patterns your team uses:
````markdown
---
description: Common code patterns — read this before writing any feature
alwaysApply: true
---
## API routes

All API routes follow this structure. Do not deviate:

```typescript
export async function POST(request: Request) {
  const body = await request.json();
  const validated = MySchema.safeParse(body);
  if (!validated.success) {
    return Response.json({ error: validated.error.flatten() }, { status: 400 });
  }
  // business logic here
  return Response.json({ data: result });
}
```

## React components

- Always use named exports (no default exports for components)
- Define the props interface above the component, never inline
- Co-locate component tests in a tests folder next to the component
````
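As a hedged illustration of those component conventions (the component and props names are hypothetical, and JSX rendering is reduced to a plain string so the sketch stays self-contained):

```typescript
// Hypothetical sketch of the component conventions above:
// props interface declared above the component, named export only.
interface GreetingProps {
  name: string;
}

// Named export, never `export default`.
export function Greeting({ name }: GreetingProps): string {
  // A real component returns JSX; a string stands in for brevity.
  return `Hello, ${name}`;
}
```

The shape of the declaration is what the rule enforces; the AI sees this pattern and reproduces it in every new component.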
### anti-patterns.mdc
Explicitly listing what not to do is underrated:
```markdown
---
description: Patterns we specifically avoid
alwaysApply: true
---
## Do not use these patterns

- **useEffect for data fetching** — Use React Query instead
- **any type** — Use unknown and narrow it, or define a proper type
- **Direct database calls in components** — All DB access goes through server actions or API routes
- **console.log in commits** — Use our logger utility from @/lib/logger
- **Hardcoded strings in UI** — Strings live in constants files, not components
- **Direct Cloudflare API calls** — Use the wrappers in @/lib/cf/

Each of these was a deliberate decision. If you think an exception is warranted, open a discussion first.
```
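To make the "unknown over any" rule concrete, here is a minimal sketch (function name hypothetical) of narrowing untyped input step by step instead of casting it away:

```typescript
// Hypothetical sketch: narrow `unknown` instead of reaching for `any`.
function parseCount(input: unknown): number {
  // Accept finite numbers directly.
  if (typeof input === "number" && Number.isFinite(input)) {
    return input;
  }
  // Accept numeric strings, rejecting empty or non-numeric ones.
  if (typeof input === "string" && input.trim() !== "") {
    const n = Number(input);
    if (Number.isFinite(n)) return n;
  }
  // Anything else is a programming error, surfaced loudly.
  throw new TypeError("expected a finite number");
}
```

The compiler forces each branch to prove what the value is before using it, which is exactly the discipline the rule is meant to teach.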
## Sharing the kit across your team
Writing the rules is the easy part. The hard part is distribution: making sure every new hire gets the current version, and making sure the kit stays updated as your codebase evolves.
This is exactly what agent skills on localskills.sh are built for. You publish the onboarding kit once, and every new hire installs it with a single command. When you update the rules, they pull the latest version.
Here's the workflow:
**1. Publish the kit**

```shell
npm install -g @localskills/cli
localskills login
localskills publish --name your-org/onboarding-kit
```

**2. New hire installs it on day one**

```shell
localskills install your-org/onboarding-kit --target cursor
```

For teams using multiple editors, install into all of them at once:

```shell
localskills install your-org/onboarding-kit --target cursor claude windsurf
```

**3. Keep it current**

```shell
# Run this whenever you update the kit
localskills pull
```
The kit is versioned. You can tag a stable release, roll back if an update causes issues, and track how many developers have installed it. You can read the full publishing walkthrough in Publish Your First Skill.
Organizations can mark skills as private, so only members with your team's API token can install them. This keeps proprietary patterns inside your org.
## What to put in the onboarding kit vs. the general team rules
There's a useful distinction between two kinds of rules:
| Type | Audience | Purpose |
|---|---|---|
| Onboarding rules | New hires, first 90 days | Context-heavy explanations, "why we do this" |
| Team standards rules | Everyone, always | Concise, actionable, the current state |
Your general team AI coding standards should be tight: each rule is something everyone knows, written for the AI to act on. The onboarding kit can be more verbose. A new hire needs to understand that you chose Drizzle for edge compatibility. A senior engineer who's been on the project for two years doesn't need that reminder.
Consider maintaining both: a your-org/team-standards skill that everyone uses, and a your-org/onboarding skill that new hires install for their first few months.
## Measuring the impact
Onboarding improvement is notoriously hard to measure. But AI rules create some natural metrics:
- **PR review cycle time**: Track the average number of review rounds for first PRs from new hires, before and after implementing rules. Most teams see a 40-60% reduction in "this doesn't follow our patterns" comments.
- **Time to first merged PR**: The median time from first commit to first merged PR is a good proxy for ramp-up speed. With AI rules handling the mechanical enforcement, new hires spend less time in review loops and more time building.
- **Rule violation frequency**: If you have linting or CI checks that catch common mistakes, track how often new hires trigger them. A well-maintained rules kit should reduce this over the first few weeks.
- **Qualitative feedback**: Ask new hires directly. "Did the AI rules help you understand how to write code here?" is a simple interview question in your 30-day check-in.
One practical approach: keep a simple log of every PR comment that explains a team convention. After a month with AI rules, check if those comments have decreased. They will.
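If you already track first-commit and first-merge dates, the time-to-first-merged-PR metric is a few lines to compute. A minimal sketch, with hypothetical field names:

```typescript
// Hypothetical sketch: median days from a new hire's first commit
// to their first merged PR, given dates you already record.
interface RampUp {
  firstCommit: Date;
  firstMergedPr: Date;
}

function medianDaysToFirstMerge(hires: RampUp[]): number {
  const msPerDay = 86_400_000;
  const days = hires
    .map(h => (h.firstMergedPr.getTime() - h.firstCommit.getTime()) / msPerDay)
    .sort((a, b) => a - b);
  const mid = Math.floor(days.length / 2);
  // Even count: average the two middle values; odd: take the middle one.
  return days.length % 2 ? days[mid] : (days[mid - 1] + days[mid]) / 2;
}
```

Run it on cohorts before and after you roll out the rules kit, and the trend line tells you whether the kit is pulling its weight.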
## Example onboarding skills to start with
If you're not sure where to begin, these are the most impactful rules to capture first:
- **Error handling philosophy**: Does your team use try/catch everywhere? Return result types? Log to a specific service? New hires will default to whatever they learned last. A clear rule fixes this immediately.
- **Import conventions**: Absolute vs. relative imports, barrel file usage, alias paths. This shows up in every single file.
- **Component structure**: Where props go, how to handle loading and error states, when to use server vs. client components in Next.js.
- **Testing expectations**: What requires tests, what doesn't, how to mock services, what test file naming convention you use.
- **Git workflow**: Commit message format, branch naming, PR description template. Boring, but it creates real friction when mismatched.
Start with these five categories and you'll cover 80% of the patterns that cause onboarding friction.
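For the error-handling category in particular, if your team standardizes on result types rather than thrown exceptions, the rule can point at a shared helper. A minimal sketch, with hypothetical names:

```typescript
// Hypothetical sketch of a result type a team rule might standardize on.
type Result<T, E = Error> =
  | { ok: true; value: T }
  | { ok: false; error: E };

// Example: wrap a throwing API so callers handle failure explicitly.
function tryParseJson(text: string): Result<unknown, SyntaxError> {
  try {
    return { ok: true, value: JSON.parse(text) };
  } catch (err) {
    return { ok: false, error: err as SyntaxError };
  }
}
```

A rule that names the helper and shows one call site is enough for the AI to apply the pattern to every new function a new hire writes.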
## A practical day-one onboarding script
Here's what a day-one setup looks like with localskills.sh:
```shell
# Clone the repo
git clone https://github.com/your-org/your-project

# Install team skills
npm install -g @localskills/cli
localskills login
localskills install your-org/onboarding-kit --target cursor claude
localskills install your-org/team-standards --target cursor claude

# Verify installation
localskills list
```
Add this to your onboarding doc. The whole thing takes under two minutes. From that point on, the new hire's editor is configured with your team's knowledge, not a blank slate.
## The bigger picture
The best onboarding documentation is documentation that can't be ignored. A README can be skimmed and forgotten. A Notion doc gets outdated. AI coding rules are enforced in real time, every time code is written.
When you invest in writing good team rules, you're not just improving your AI setup. You're building a knowledge base that makes your team more resilient to turnover, more consistent in output, and faster at integrating new people.
The tribal knowledge problem is one of the oldest in software engineering. AI rules are the first practical tool for getting that knowledge out of people's heads and into every developer's editor, every day.
Ready to build your team's onboarding kit? Create a free account, install the CLI, and publish your first skill in under 10 minutes. Get started at localskills.sh/signup.
```shell
npm install -g @localskills/cli
localskills login
localskills publish --name your-org/onboarding-kit
```