Hi everyone,
I wanted to share some technical insights and challenges I faced while building a CLI tool for OpenAPI auditing. Instead of a simple 'I made this' post, I’d like to discuss the architectural decisions and the pitfalls of distributing Node-based CLIs.
The 'Why': Beyond Schema Validation
Standard validation only tells you whether the YAML is valid. But in a large-scale environment, we found that 'valid' isn't the same as 'good'. We struggled with specs that passed validation cleanly yet were inconsistent and hard to maintain in practice.
Technical Hurdles: Distribution & Environment
If you're building a CLI in Node, these were my 'lessons learned' this week:
- Every CLI entry file needs `#!/usr/bin/env node` as its first line. Without it, Windows users might end up with the .js file associated with their IDE, breaking the execution flow.
- `./config` works in dev, but fails when the package is run via npx. I had to move to absolute path resolution using `__dirname` to ensure the assets are found regardless of the user's current working directory.

The Roadmap: Enforcing DRY Specs
The next technical challenge I'm tackling is enforcing a 'Reference-Only' architecture. The goal is to traverse the AST to ensure every component (schemas, responses) is a $ref, preventing the bloated 10k-line YAML files that become impossible to maintain.
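For context on the shebang lesson: the shebang only works if package.json also declares a `bin` entry, which is what npm/npx use to generate the launcher (and the Windows shim). A minimal sketch — the file name `cli.js` and the package layout here are illustrative, not necessarily what the repo actually uses:

```json
{
  "name": "auditapi",
  "version": "0.1.0",
  "bin": {
    "auditapi": "./cli.js"
  }
}
```

With this in place, `npx auditapi` resolves the `bin` entry and executes `cli.js`, whose first line must be the `#!/usr/bin/env node` shebang.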
I'd love to hear from anyone else who has built dev-tools: How do you handle opinionated rule-sets without alienating users?
Code for context: https://github.com/vicente32/auditapi