This guide covers how to send xAI usage data to Fenra.

Grok API

xAI’s API follows the OpenAI chat completions format. After each call, forward the token counts from the response’s usage object to Fenra:
async function chat(messages) {
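  // Call xAI's chat completions endpoint with the provided messages.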
  const response = await fetch('https://api.x.ai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.XAI_API_KEY}`
    },
    body: JSON.stringify({
      model: 'grok-beta',
      messages
    })
  });

  const result = await response.json();

  // Send to Fenra
  await fetch('https://api.fenra.io/ingest/usage', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Api-Key': process.env.FENRA_API_KEY
    },
    body: JSON.stringify({
      provider: 'xai',
      model: result.model,
      usage: [{
        type: 'tokens',
        metrics: {
          input_tokens: result.usage.prompt_tokens,
          output_tokens: result.usage.completion_tokens,
          total_tokens: result.usage.total_tokens
        }
      }],
      context: {
        billable_customer_id: process.env.BILLABLE_CUSTOMER_ID
      }
    })
  });

  return result;
}
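
For example, a call could look like the following. The prompt is illustrative, and the response shape follows the OpenAI chat completions format that xAI returns:

// Illustrative usage: send a prompt, then log the assistant's reply.
// Usage is reported to Fenra inside chat() before the result is returned.
chat([
  { role: 'user', content: 'Summarize the weather on Mars in one sentence.' }
]).then(result => {
  console.log(result.choices[0].message.content);
});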

Supported Models

Fenra supports all xAI models. Common models include:
Model              Description
grok-beta          Latest Grok model
grok-vision-beta   Grok with vision
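
To target a different model, parameterize the model field in the request body. The helper below is an illustrative variant of the chat() function above; it only builds the xAI request, and the Fenra reporting shown earlier stays the same:

// Illustrative sketch: same endpoint and auth as above, with the model
// name passed in rather than hard-coded. Default is assumed for example purposes.
async function chatWithModel(messages, model = 'grok-beta') {
  const response = await fetch('https://api.x.ai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.XAI_API_KEY}`
    },
    body: JSON.stringify({ model, messages })
  });
  // Report usage to Fenra as shown in the earlier example, using
  // result.model and result.usage from this response.
  return response.json();
}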

Next Steps