How to Share AI Coding Rules Across a Remote Team
Learn how to keep a distributed team's AI coding assistants consistent using shared rules, versioned updates, and role-based access control.
The remote team AI problem
Your team is spread across time zones. Every engineer uses Cursor, Claude Code, or Windsurf to write code. Shared AI coding rules are supposed to keep everyone consistent, but they only deliver that consistency when each developer has configured their tools the same way - and that rarely happens.
Alice in Berlin has rules that enforce your component naming conventions. Bob in São Paulo has a slightly different version he updated last month. Your new hire in Singapore has no rules at all and is generating code that doesn't match any pattern the rest of the team uses.
This is the AI drift problem: your coding assistants diverge from each other, and from your actual standards, silently and continuously.
The output looks like code. It compiles. But it doesn't match your patterns, it breaks your linting rules, or it introduces conventions that nobody agreed to. Code review catches some of it. The rest accumulates as technical debt.
Why manual rule sharing breaks down
The instinct is to solve this with a shared git repository or a Notion page. Somebody writes the rules, commits them, and tells everyone to copy-paste them into their AI tool. It works for about two weeks.
Here's what actually happens:
Rules go stale. Your stack evolves. You migrate from Pages Router to App Router, adopt Drizzle, switch testing frameworks. The rules file in the onboarding doc doesn't get updated. New hires adopt old patterns.
Everyone has a slightly different version. Engineers copy the rules once and then modify them. Bob adds a section about your internal API client. Alice removes the testing rules because she finds them restrictive. You now have eight versions of the rules file across the team, and nobody knows which one is canonical.
Cross-tool maintenance is a nightmare. If your team uses multiple AI coding tools (Cursor and Claude Code are a common combination), you're maintaining separate rule files for each. Cursor uses .mdc files in .cursor/rules/, Claude Code uses markdown in .claude/rules/, and Windsurf uses .md files in .windsurf/rules/. Keeping three copies synchronized manually doesn't scale.
New hires start from zero. Onboarding is already hard. Adding "figure out how to configure your AI tool to match our conventions" to the list is friction you don't need.
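The cross-tool burden is easy to underestimate. Here is a sketch of the manual sync script teams end up writing (the filenames and placeholder content are illustrative; only the directory and extension conventions come from the tools themselves):

```shell
#!/usr/bin/env sh
# Manual sync from one canonical rules file into each tool's directory.
# This is exactly the glue that centralized rules make unnecessary.
set -eu
echo "# Team rules" > rules.md                # canonical file (placeholder content)
mkdir -p .cursor/rules .claude/rules .windsurf/rules
cp rules.md .cursor/rules/team.mdc            # Cursor reads .mdc files
cp rules.md .claude/rules/team.md             # Claude Code reads markdown
cp rules.md .windsurf/rules/team.md           # Windsurf reads .md files
```

Every rules change now means re-running this script on every machine, in every project - which is precisely the step people forget.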
What centralized AI rules look like
The fix is to treat your AI rules like any other shared dependency: version them, publish them, and install them automatically.
Instead of copy-pasting rules, you publish them to a registry once:
# Publish your team's rules from any project
localskills publish --name acme-corp/api-conventions
Then any engineer on the team installs them with a single command:
localskills install acme-corp/api-conventions --target cursor claude windsurf
The CLI places the rules in the right location for each tool automatically. No manual file management. No copy-pasting.
When you update the rules (because your stack changed, or you refined a pattern), you publish a new version. Everyone's local install stays in sync:
# Pull the latest version of all installed skills
localskills pull
This is the same mental model as npm install or pip install, applied to AI tool configuration.
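If your team already lives in npm, you can even lean on that analogy directly and tie rule updates to dependency installs. A hypothetical package.json fragment (the `|| true` keeps installs from failing on machines without the CLI):

```json
{
  "scripts": {
    "postinstall": "localskills pull || true"
  }
}
```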
Organizing rules for distributed teams
Large teams need more structure than a single rules file. The most effective approach is to split rules by role and responsibility.
Core rules (everyone installs these)
Core rules cover the fundamentals that apply regardless of what someone works on: language conventions, import ordering, naming patterns, error handling.
## Language and style
- TypeScript strict mode everywhere - no `any` types
- Named exports only - no default exports from component files
- Async/await over Promise chains
- Error handling: always catch at the boundary, log with context
## Naming conventions
- Components: PascalCase (UserProfile, not user-profile)
- Hooks: camelCase with use prefix (useAuthSession)
- API routes: lowercase with hyphens (/api/user-profile)
- Database tables: snake_case (user_sessions, not userSessions)
Specialized rules (role-specific installs)
Backend engineers install API and database rules. Frontend engineers install component and styling rules. Full-stack engineers install both.
# Backend engineer setup
localskills install acme-corp/core acme-corp/api-conventions acme-corp/database --target cursor claude
# Frontend engineer setup
localskills install acme-corp/core acme-corp/components acme-corp/styling --target cursor claude
# Full-stack setup
localskills install acme-corp/core acme-corp/api-conventions acme-corp/database acme-corp/components --target cursor claude
You can document this in your onboarding guide with exact commands. New engineers run their role's install commands on day one and have a correctly configured AI tool immediately.
This maps naturally to how remote teams already organize access. Backend engineers don't need frontend styling rules in their AI context window, and including them dilutes the signal.
Automatic updates across time zones
One of the hardest problems for remote teams is staying synchronized when nobody shares office hours.
When a staff engineer in Berlin improves the database query patterns on Tuesday morning, you want those improvements to propagate to engineers in São Paulo and Singapore before they write new code, not six months later when someone notices inconsistencies in code review.
With versioned skills, this is automatic. The Berlin engineer publishes a new version:
localskills publish --name acme-corp/database --message "Add connection pooling patterns, update query batching examples"
Engineers with the skill installed can pull the update whenever they start their day:
localskills pull
Or configure auto-pull in their shell profile so it runs automatically:
# In ~/.zshrc or ~/.bashrc
alias dev="localskills pull && code ."
Symlinks make this automatic. The skill is installed as a symlink in your project's rules directory, so updates are reflected immediately without modifying project files.
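A minimal sketch of why the symlink approach works (the paths are illustrative, not the CLI's actual on-disk layout):

```shell
# A central store directory stands in for wherever the CLI keeps skills.
store=$(mktemp -d)
mkdir -p .cursor/rules
echo "v1 rules" > "$store/database.mdc"
ln -sf "$store/database.mdc" .cursor/rules/database.mdc   # the "install"
cat .cursor/rules/database.mdc    # prints: v1 rules
echo "v2 rules" > "$store/database.mdc"                   # a "pull" rewrites the store
cat .cursor/rules/database.mdc    # prints: v2 rules -- nothing in the project changed
```

Because the project directory only holds a link, a pull that rewrites the store is instantly visible to every tool reading through it.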
Role-based access control for private rules
Not all rules should be public. Your internal API conventions, proprietary patterns, and business logic that manifests in code style shouldn't be on a public registry.
Private skills are scoped to your organization. Only authenticated members of your team can install them:
# Publish as private (organization members only)
localskills publish --name acme-corp/internal-patterns --visibility private
# Engineers authenticate with a team token
localskills login --token $ACME_CORP_TOKEN
localskills install acme-corp/internal-patterns --target cursor claude windsurf
Team tokens are not tied to individual user accounts, which means they survive employee turnover. When an engineer leaves, you don't need to rotate tokens across every machine: the team token stays valid, and you revoke that engineer's individual account access instead.
This matters for remote teams where engineers use personal machines and manage their own development environments. A team token in the environment gives the CLI the credentials it needs without requiring centralized machine management.
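In practice that usually means loading the token from a secrets manager in the shell profile rather than hardcoding it in dotfiles. A sketch, assuming the 1Password CLI and a made-up vault path:

```shell
# In ~/.zshrc or ~/.bashrc -- the op:// vault path is a placeholder.
export ACME_CORP_TOKEN="$(op read 'op://engineering/localskills/team-token')"
```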
See what agent skills are and how they work for a deeper explanation of the authentication model.
Tracking adoption across the team
Sharing rules is only half the problem. Knowing whether engineers are actually using them is the other half.
Remote teams can't walk to someone's desk and ask if they've updated their AI tool configuration. You need visibility into adoption.
Skills on localskills.sh have built-in analytics:
- Install counts: How many unique installs has this skill received?
- Version distribution: What percentage of your team is on the latest version?
- Pull activity: When did each install last pull an update?
If you push a critical update (say, rules that enforce a new security pattern after an incident), you can see how quickly it propagates across the team. Engineers still on old versions get a nudge in the next code review.
This isn't surveillance. It's the same visibility you have with any shared dependency: you want to know that your team is actually using the updated version.
Cross-tool consistency
Remote teams often have tool diversity that office teams don't. Engineers use what they prefer. One engineer swears by Cursor. Another uses Claude Code. A third is experimenting with Windsurf. In a distributed team, standardizing everyone onto a single tool is harder and less worthwhile.
The real requirement is that your AI conventions are consistent regardless of which tool someone uses. A rule that says "use the Drizzle query builder, not raw SQL" should apply whether the engineer is using Cursor, Claude Code, or any other AI coding assistant.
With a unified skill published to localskills.sh, one publish command covers all tools:
# Install the same skill into all three tools at once
localskills install acme-corp/database --target cursor claude windsurf
The CLI transforms the content into the right format for each tool: .mdc for Cursor's .cursor/rules/ folder, markdown for Claude Code's .claude/ directory (rules go to .claude/rules/, skills to .claude/skills/), and .md files for Windsurf's .windsurf/rules/ directory.
Read more about using AI coding rules across different tools to understand the format differences and how the CLI handles them.
Putting it into practice
Here's a practical workflow for rolling this out on a remote team.
Week 1: Audit and consolidate
Gather the current AI rules files from your team. There will be multiple versions. Identify the best elements of each:
- What conventions do all versions agree on? Those go in core rules.
- What's role-specific? Split into specialized rule sets.
- What's outdated or team-specific noise? Remove it.
Publish the consolidated version to localskills.sh under your organization's namespace.
Week 2: Roll out to the team
Update your onboarding documentation with install commands for each role. Send the team a Slack message with their specific install command. Keep it to one line. Friction in adoption matters.
# Example message to the backend team:
localskills install acme-corp/core acme-corp/api acme-corp/database --target cursor claude
Week 3: Establish update cadence
Pick an owner for each skill, ideally the engineer or team lead most responsible for that domain. Define a lightweight process for proposing rule changes: a comment in the relevant Slack channel, a PR against the rules document, whatever matches your team culture.
Set an expectation that engineers run localskills pull at the start of each work session, or automate it in their dev setup.
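Teams that would rather tie updates to git activity than to shell startup can use a git hook instead. A sketch of a post-merge hook (the `command -v` guard keeps it harmless on machines without the CLI):

```shell
#!/bin/sh
# .git/hooks/post-merge -- runs after every `git pull` that merges changes,
# so rule updates arrive alongside code updates.
command -v localskills >/dev/null 2>&1 && localskills pull
exit 0
```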
Ongoing: Refine based on code review
Code review is your feedback signal. When you catch a pattern that the AI tool keeps generating incorrectly, that's a missing or unclear rule. Update the skill, publish a new version, and the fix propagates automatically.
Over time, your AI tools drift toward your actual conventions instead of away from them.
Why this matters for remote work specifically
In a co-located team, rule drift gets caught faster. Engineers talk, see each other's code directly, and the culture corrects over time. Remote teams don't have those ambient correction mechanisms.
Every AI tool misconfiguration is a silent divergence that compounds. The engineer in Singapore writes code using patterns that no one on the rest of the team has ever seen, because their AI tool was configured with rules from six months ago. The Berlin engineer's AI generates code that looks right but breaks the component conventions that were updated last quarter.
Centralized, versioned AI rules are the same solution remote teams already apply to dependencies, linting configs, and CI pipelines: define it once, distribute it, and keep it synchronized automatically.
The difference is that your AI tool configuration is now part of that infrastructure instead of floating in a Notion doc.
Want to set up centralized AI rules for your team today? Create a free account on localskills.sh, publish your first skill, and share it with your team in minutes.
Also see: publishing your first skill and setting team AI coding standards.
# Install the CLI and get started
npm install -g @localskills/cli
localskills login
localskills publish