Activity shows individual transactions as they flow through Fenra. Use it to debug issues, investigate anomalies, or monitor live usage.
## What You’ll See
A table of transactions with:
| Column | Description |
|---|---|
| Timestamp | When the transaction was processed |
| Provider | OpenAI, Anthropic, etc. |
| Model | Specific model used |
| Cost | Calculated cost in USD |
| Feature | Your feature name (if provided) |
| Environment | Production, staging, etc. |
Click any row to see full details.
## Transaction Details
Each transaction includes:
- **Cost Breakdown**: Input cost, output cost, total
- **Usage Metrics**: Token counts, image counts, etc.
- **Context**: All metadata you sent (feature, environment, user, session)
- **Request ID**: For correlating with your logs
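As a rough sketch, a transaction's details can be pictured as a record like the one below. The field names and values here are assumptions for illustration, not Fenra's actual payload schema — check your own data for the exact keys.

```python
# Illustrative shape of one transaction's details. All keys and values are
# made up for this sketch; they mirror the fields described above.
transaction = {
    "timestamp": "2024-05-01T12:34:56Z",
    "provider": "openai",
    "model": "gpt-4o",
    # Cost breakdown: input cost, output cost, total (USD).
    "cost": {"input": 0.0021, "output": 0.0084, "total": 0.0105},
    # Usage metrics: token counts, image counts, etc.
    "usage": {"input_tokens": 700, "output_tokens": 560},
    # Context: the metadata you sent with the request.
    "context": {
        "feature": "chat-summaries",
        "environment": "production",
        "user": "user_123",
        "session": "sess_456",
    },
    # Request ID: the handle for correlating with your own logs.
    "request_id": "req_abc123",
}
```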
## Filtering
Find specific transactions:
| Filter | Use Case |
|---|---|
| Date/Time | Last hour, 24 hours, or custom range |
| Provider/Model | Find transactions for specific models |
| Cost Range | Find expensive transactions (min/max cost) |
| Environment | Production vs. staging |
| Feature | Specific product features |
| Request ID | Find a specific transaction by ID |
Use Cost Range to find outliers. Set a minimum cost (e.g., $1) to surface expensive transactions.
## Real-Time Mode
Toggle real-time mode to see transactions as they arrive:
- New transactions appear at the top
- Auto-refresh every few seconds
- Perfect for monitoring during deploys or tests
## Common Use Cases
### Debugging
When something looks wrong:
- Filter to the relevant time window
- Find the suspicious transactions
- Check the usage metrics and metadata
- Correlate with your application logs using request_id
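The correlation step above amounts to searching your application logs for the transaction's request ID. A minimal sketch, assuming your app writes the request ID into each log line (the log format here is invented for illustration):

```python
# Sketch: find every application log line that mentions a given request ID.
# The log format below is hypothetical -- adapt to however your app logs.
def find_log_lines(log_lines, request_id):
    """Return the log lines that mention the given request ID."""
    return [line for line in log_lines if request_id in line]

logs = [
    "2024-05-01T12:34:55Z INFO  starting summarize job req_abc123",
    "2024-05-01T12:34:56Z ERROR retrying provider call req_abc123",
    "2024-05-01T12:34:57Z INFO  unrelated request req_zzz999",
]

matches = find_log_lines(logs, "req_abc123")  # the two lines for this request
```

In practice the same search is often a `grep` over your log files or a query in your log aggregator, keyed on the request ID.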
### Investigating Cost Spikes
When the dashboard shows a spike:
- Filter to the spike period
- Sort by cost (highest first)
- Look for unusually expensive transactions
- Check what model, feature, or user caused them
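If you prefer to run the same investigation on exported data, the steps above translate directly: sort by cost, pull the expensive rows, then tally spend per model. The field names mirror the Activity columns but are assumptions here, not a documented export schema:

```python
# Hedged sketch of the spike workflow on an exported transaction list.
from collections import defaultdict

transactions = [
    {"model": "gpt-4o", "feature": "chat", "cost": 2.40},
    {"model": "gpt-4o-mini", "feature": "search", "cost": 0.03},
    {"model": "gpt-4o", "feature": "chat", "cost": 1.10},
]

# Sort by cost, highest first.
by_cost = sorted(transactions, key=lambda t: t["cost"], reverse=True)

# Surface unusually expensive transactions (e.g. at or above a $1 floor).
expensive = [t for t in by_cost if t["cost"] >= 1.0]

# See which model accounts for the spend.
cost_per_model = defaultdict(float)
for t in transactions:
    cost_per_model[t["model"]] += t["cost"]
```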
### Monitoring
During launches or experiments:
- Enable real-time mode
- Filter to the relevant feature/environment
- Watch transactions flow in
- Catch issues immediately
## Export
Download transaction data:
- **CSV**: For spreadsheet analysis
- **JSON**: For programmatic processing
Exports respect your current filters.
For large time ranges:
- Use filters to narrow results
- Pagination keeps things responsive
- For deep analysis, export and use external tools
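As one example of that deeper analysis, a CSV export can be aggregated with nothing but the standard library. The column names below are assumptions — match them to the header row of your actual export:

```python
# Sketch: per-feature cost totals from a CSV export, stdlib only.
# The sample data stands in for a real exported file.
import csv
import io
from collections import defaultdict

export = """timestamp,provider,model,cost,feature,environment
2024-05-01T12:00:00Z,openai,gpt-4o,0.50,chat,production
2024-05-01T12:01:00Z,anthropic,claude-sonnet,0.25,search,production
2024-05-01T12:02:00Z,openai,gpt-4o,0.75,chat,staging
"""

totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(export)):
    totals[row["feature"]] += float(row["cost"])
```

With a real file, replace `io.StringIO(export)` with `open("export.csv")`.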