Packages
CLI
Convert or migrate OpenAPI definitions to the AgentBridge protocol
Usage
Run the CLI to convert an OpenAPI/Swagger file to AgentBridge.

The CLI will prompt you for the `ANTHROPIC_API_KEY` environment variable if it isn't already set, but a key is required either way. You can find your API key in the Anthropic console.
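A minimal invocation might look like the following sketch; the `agentbridge` command name and the positional input path are illustrative assumptions, not confirmed by this documentation:

```bash
# Illustrative sketch: the `agentbridge` command name and positional input path are assumptions.
export ANTHROPIC_API_KEY="<your-api-key>"   # omit this to be prompted for the key instead
agentbridge ./openapi.yaml                  # writes agentbridge.json by default
```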
Options
| Option | Alias | Description |
|---|---|---|
| `-V` | `--version` | Output the version number |
| `-o` | `--output <path>` | Output file path (default: `agentbridge.json`) |
| `--no-cache` | - | Disable LLM response caching |
| `-d` | `--debug` | Enable debug mode with detailed output |
| `-v` | `--verbose` | Enable verbose logging with detailed processing information |
| `-c` | `--concurrency <number>` | Number of endpoints to process in parallel (default: `3`) |
| `-h` | `--help` | Display help for command |
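Several of these options can be combined in a single run. In this sketch the `agentbridge` command name and the positional input file are assumptions:

```bash
# Hypothetical run: custom output path, fresh LLM responses (no cache), verbose logging.
agentbridge ./petstore.yaml -o ./out/petstore.agentbridge.json --no-cache --verbose
```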
Performance tuning
Concurrency
The `-c` or `--concurrency` flag controls how many endpoints are processed simultaneously. Higher values can speed up processing for large APIs but may increase the likelihood of hitting rate limits. The default is `3`.
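As a sketch (again assuming an `agentbridge` command and a positional input file), a large spec might be processed with higher concurrency:

```bash
# Hypothetical run: process 8 endpoints in parallel; lower this value if rate limit errors appear.
agentbridge ./large-api.yaml --concurrency 8
```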
Rate limit handling
The CLI automatically handles Anthropic API rate limits:
- Uses the `retry-after` header from Anthropic for precise backoff timing
- Implements exponential backoff with jitter when headers aren't available
- Scales retry attempts based on concurrency settings
- Provides detailed logs about rate limit status when in verbose mode
If you frequently encounter rate limit errors, try the following (a sample rerun is sketched after this list):
- Reducing the concurrency value
- Using the cache (enabled by default) to reduce API calls in subsequent runs
- Upgrading your Anthropic API tier for higher rate limits
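Under the same assumptions as the earlier sketches (an `agentbridge` command and a positional input file), a rerun that leans on the cache and lower concurrency might look like:

```bash
# Hypothetical rerun: one endpoint at a time; responses cached from the previous run
# (caching is on by default) reduce the number of new API calls.
agentbridge ./openapi.yaml -c 1 --verbose
```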
Features
- Parse OpenAPI/Swagger specifications
- Enhance API descriptions with AI
- Detect multi-step workflows
- Map data flows between endpoints
- Debug mode with detailed processing information
- Token usage tracking and cost estimation
- Intelligent rate limit handling with automatic retries
- Parallel processing of endpoints with configurable concurrency