Design Token Linting

January 15, 2026 • 5 min read

Design token linting applies automated rules to catch token quality issues before they enter the system. Just as code linters catch style and correctness issues in source code, token linters catch naming violations, invalid values, and structural problems in token definitions. Linting provides fast feedback during authoring, preventing issues rather than fixing them later.

What Is Design Token Linting

Design token linting is the automated analysis of token definitions against configurable rules. Linters examine token files and report violations, typically distinguishing between errors (must fix) and warnings (should consider fixing).

Token linting operates on source token files, catching issues during authoring before transformation and distribution. This early detection prevents problems from propagating to generated outputs and consuming applications.
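The error-versus-warning distinction can be made concrete in the lint report itself. A minimal sketch, where the rule names and the `severity` field are illustrative rather than part of any standard:

```javascript
// Sketch: lint results that distinguish errors from warnings.
// Rule names and the `severity` field are illustrative.
const results = [
  { rule: 'no-broken-references', severity: 'error',   path: 'color.text.muted' },
  { rule: 'require-description',  severity: 'warning', path: 'spacing.lg' },
];

// Errors fail the lint run; warnings are reported but do not block it.
const errors = results.filter((r) => r.severity === 'error');
const exitCode = errors.length > 0 ? 1 : 0;
```

Returning a nonzero exit code only for errors lets warnings accumulate visibly without blocking commits or builds.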

How Design Token Linting Works

Token linters process token source files, applying rules that check various aspects of token quality.

Naming rules enforce consistent naming patterns:

// Rule: token names must use kebab-case
const namePattern = /^[a-z][a-z0-9]*(-[a-z0-9]+)*$/;

function validateTokenName(name) {
  const segments = name.split('.');
  for (const segment of segments) {
    if (!namePattern.test(segment)) {
      return { valid: false, message: `Segment "${segment}" is not kebab-case` };
    }
  }
  return { valid: true };
}
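Applied to full dotted token paths, the rule accepts kebab-case segments and rejects camelCase ones. A standalone sketch restating the pattern above:

```javascript
// The kebab-case pattern from the rule above, applied per dot-separated segment
const namePattern = /^[a-z][a-z0-9]*(-[a-z0-9]+)*$/;
const isKebabPath = (name) => name.split('.').every((s) => namePattern.test(s));

isKebabPath('color.brand-primary'); // true
isKebabPath('color.brandPrimary'); // false: camelCase segment
```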

Structure rules verify organizational patterns:

// Rule: color tokens must have category prefix
function validateColorStructure(token) {
  if (token.type === 'color' && !token.name.startsWith('color.')) {
    return { valid: false, message: 'Color tokens must start with "color."' };
  }
  return { valid: true };
}

Value rules check that values meet expectations:

// Rule: spacing values must be multiples of 4
function validateSpacingValue(token) {
  if (token.type === 'spacing') {
    // parseInt tolerates unit suffixes like "16px"; guard against non-numeric values
    const value = parseInt(token.value, 10);
    if (Number.isNaN(value)) {
      return { valid: false, message: `Spacing value "${token.value}" is not numeric` };
    }
    if (value % 4 !== 0) {
      return { valid: false, message: `Spacing value ${value} is not a multiple of 4` };
    }
  }
  return { valid: true };
}

Metadata rules ensure required information is present:

// Rule: all tokens must have descriptions
function validateDescription(token) {
  if (!token.description || token.description.trim() === '') {
    return { valid: false, message: 'Token must have a description' };
  }
  return { valid: true };
}

Common Questions

What tools support token linting?

Several approaches exist for token linting, from general JSON linters to specialized token tools.

JSON Schema validation uses standard JSON Schema to validate token file structure:

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "patternProperties": {
    ".*": {
      "type": "object",
      "required": ["value"],
      "properties": {
        "value": { "type": ["string", "number", "object"] },
        "description": { "type": "string" },
        "type": { "type": "string" }
      }
    }
  }
}
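The core structural requirement of that schema can also be checked without any dependencies. A minimal stand-in sketch (in practice the JSON Schema would be handed to a validator library such as Ajv; the function name here is illustrative):

```javascript
// Minimal stand-in for the schema above: every top-level entry
// must be an object carrying a `value` property.
function checkTokenShape(tokens) {
  const violations = [];
  for (const [name, def] of Object.entries(tokens)) {
    if (typeof def !== 'object' || def === null || !('value' in def)) {
      violations.push(`${name}: missing required "value"`);
    }
  }
  return violations;
}

checkTokenShape({ 'color-primary': { value: '#0055ff' }, broken: {} });
// reports one violation, for "broken"
```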

Custom linting scripts implement organization-specific rules:

// token-lint.js
const fs = require('fs');

const rules = [validateNaming, validateTypes, validateDescriptions];

// Recursively visit every token: any object carrying a `value` property
function walkTokens(node, visit, path = []) {
  for (const [key, child] of Object.entries(node)) {
    if (child && typeof child === 'object') {
      if ('value' in child) {
        visit(child, [...path, key].join('.'));
      } else {
        walkTokens(child, visit, [...path, key]);
      }
    }
  }
}

function lintTokenFile(filepath) {
  const tokens = JSON.parse(fs.readFileSync(filepath, 'utf8'));
  const errors = [];

  walkTokens(tokens, (token, path) => {
    for (const rule of rules) {
      const result = rule(token, path);
      if (!result.valid) {
        errors.push({ path, message: result.message });
      }
    }
  });

  return errors;
}

Style Dictionary hooks can add validation during the build process. In Style Dictionary v4, a registered preprocessor runs against the parsed token tree before transforms:

StyleDictionary.registerPreprocessor({
  name: 'validate-descriptions',
  preprocessor: (dictionary) => {
    // Walk the raw token tree; anything with a `value` is a token
    const walk = (node, path = []) => {
      for (const [key, child] of Object.entries(node)) {
        if (child && typeof child === 'object') {
          if ('value' in child) {
            if (!child.description) {
              console.warn(`Token ${[...path, key].join('.')} missing description`);
            }
          } else {
            walk(child, [...path, key]);
          }
        }
      }
    };
    walk(dictionary);
    return dictionary;
  }
});
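Registering alone is not enough: preprocessors are opt-in and must be listed by name in the build configuration. A sketch, assuming a preprocessor registered under the illustrative name `validate-descriptions`:

```javascript
// config.js (sketch) — enable the preprocessor by its registered name
module.exports = {
  preprocessors: ['validate-descriptions'], // must match the name passed at registration
  source: ['tokens/**/*.json'],
  platforms: {
    // platform outputs as usual
  },
};
```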

How should linting integrate with development?

Linting should provide feedback at multiple stages of development.

Editor integration provides immediate feedback while editing. VS Code extensions can run linting on file save:

// .vscode/settings.json
{
  "json.schemas": [{
    "fileMatch": ["tokens/**/*.json"],
    "url": "./schemas/token-schema.json"
  }]
}

Pre-commit hooks run linting before commits are created:

#!/bin/bash
# .husky/pre-commit
npx token-lint tokens/**/*.json

CI pipeline runs comprehensive linting on pull requests:

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: npm ci
      - run: npm run lint-tokens

The progression from editor to commit to CI provides multiple opportunities to catch issues, with increasing comprehensiveness at each stage.

How should linting rules evolve?

Linting rules should evolve with the token system and organizational learning.

Start minimal with rules addressing clear, uncontroversial issues. Broken references, invalid values, and missing required fields are good starting points.
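The broken-reference case can be expressed as a rule like the following sketch, which assumes the common curly-brace alias syntax (`{path.to.token}`) and takes a precomputed set of valid token paths (the function name and signature are illustrative):

```javascript
// Rule sketch: alias values must resolve to an existing token path
function validateReference(token, allPaths) {
  const match = typeof token.value === 'string' && token.value.match(/^\{(.+)\}$/);
  if (match && !allPaths.has(match[1])) {
    return { valid: false, message: `Reference "${match[1]}" does not resolve` };
  }
  return { valid: true };
}

const paths = new Set(['color.brand.primary']);
validateReference({ value: '{color.brand.primary}' }, paths);   // valid
validateReference({ value: '{color.brand.secondary}' }, paths); // invalid
```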

Add rules based on problems encountered. When issues slip through to production, consider whether a lint rule could have caught them.

Review rules periodically to ensure they remain relevant. Rules that never catch issues may be unnecessary. Rules that catch too many issues may need refinement.

Communicate changes when adding or modifying rules. Team members need to understand what rules exist and why.

Version lint configuration alongside tokens. As tokens evolve, lint rules should evolve correspondingly, tracked in version control.

Summary

Design token linting applies automated rules to catch quality issues during authoring. Rules address naming, structure, values, and metadata. Tools range from JSON Schema validation to custom scripts to build-time hooks. Integration at editor, commit, and CI stages provides multiple feedback opportunities. Effective linting rules evolve based on encountered problems and periodic review, supporting token quality without creating unnecessary friction.
