API Essentials
Concurrency
How many requests can you run in parallel?
Our API allows up to 5 concurrent requests at a time. Requests beyond this limit are automatically queued and processed as capacity becomes available.
How It Works
- Maximum of 5 requests can run in parallel
- Additional requests are automatically queued
- Requests are processed in order as slots become available
- No action required on your end to handle queuing
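Although the server queues excess requests for you, capping concurrency on the client side keeps your requests from sitting in a long server-side queue. A minimal sketch using a semaphore, where `call_api` is a placeholder for your real HTTP call (not part of the documented API):

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 5                      # the documented limit
slots = threading.Semaphore(MAX_CONCURRENT)
lock = threading.Lock()
in_flight = 0
peak = 0                                # highest number of simultaneous calls observed

def call_api(i):
    """Stand-in for a real request; blocks while 5 calls are already running."""
    global in_flight, peak
    with slots:
        with lock:
            in_flight += 1
            peak = max(peak, in_flight)
        time.sleep(0.01)                # simulate network latency
        with lock:
            in_flight -= 1
    return i

# Submit 20 jobs; the semaphore ensures at most 5 run at once.
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(call_api, range(20)))
```

The pool can be as wide as you like; the semaphore, not the pool size, enforces the limit of 5.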
Best Practices
Request Management
- Design your application to handle queued requests gracefully
- Monitor request status to track progress
- Consider implementing timeouts for time-sensitive operations
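One way to implement a client-side timeout is to run the request in a worker and give up waiting after a deadline. This is a hedged sketch; `slow_request` and the timeout values are illustrative, not part of the API:

```python
import concurrent.futures
import time

def call_with_timeout(fn, seconds):
    """Run fn, but stop waiting after `seconds` (the request may still be queued server-side)."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(fn)
    try:
        return future.result(timeout=seconds)
    except concurrent.futures.TimeoutError:
        future.cancel()
        return None                      # caller decides how to handle the timeout
    finally:
        pool.shutdown(wait=False)

def slow_request():
    time.sleep(0.3)                      # stands in for a request stuck in the queue
    return "done"
```

Returning `None` (or raising a custom error) lets the caller fall back to cached data or retry later.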
Batch Processing
- Use bulk endpoints when possible to maximize efficiency
- Keep batch sizes reasonable (100-1000 items recommended)
- Monitor job status for async operations
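Splitting a large workload into batches within the recommended range can be done with a small helper. This is a hypothetical sketch; the 500-item default is an assumption within the suggested 100-1000 window:

```python
def chunk(items, batch_size=500):
    """Split items into batches sized for a bulk endpoint."""
    if not 100 <= batch_size <= 1000:
        raise ValueError("batch_size should stay within the recommended 100-1000 range")
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# 1200 items become three batches: 500, 500, and a final 200.
batches = chunk(list(range(1200)), batch_size=500)
```

Each batch can then be sent to the bulk endpoint as one request instead of hundreds of individual calls.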
Error Handling
- Implement proper error handling for concurrent and queued requests
- Watch for timeout errors on long-running queued requests
- Have fallback logic for handling failed requests
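Retry-with-fallback can be sketched as below. `RequestTimeout`, the retry counts, and the backoff values are placeholders for illustration, not part of the documented API:

```python
import time

class RequestTimeout(Exception):
    """Stand-in for a timeout error on a long-queued request."""

def with_retries(fn, attempts=3, backoff=0.01, fallback=None):
    """Call fn up to `attempts` times, then return `fallback` if it keeps failing."""
    for attempt in range(attempts):
        try:
            return fn()
        except RequestTimeout:
            if attempt == attempts - 1:
                return fallback                      # fallback logic after the final failure
            time.sleep(backoff * (2 ** attempt))     # exponential backoff between tries
```

A typical fallback is cached or partial data, so a queued request that times out degrades gracefully instead of failing outright.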
Tips for Optimal Performance
- Space out non-urgent requests to minimize queuing
- Use async endpoints for long-running operations
- Monitor your API usage patterns
- Consider request priority when designing your implementation
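Since the API processes queued requests in arrival order, any prioritization has to happen on your side before submission. A minimal sketch using a heap, with hypothetical task names (lower number = more urgent):

```python
import heapq

queue = []
heapq.heappush(queue, (2, "analytics export"))      # low priority
heapq.heappush(queue, (0, "user-facing lookup"))    # most urgent
heapq.heappush(queue, (1, "report refresh"))

# Pop tasks most-urgent-first, then submit them to the API in this order.
order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
```

Submitting urgent work first means that if queuing does occur, it is the non-urgent requests that wait.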