Integration

SDK Integration

Use NeuronEdge with your existing OpenAI or Anthropic SDKs: point them at the NeuronEdge base URL and add your NeuronEdge API key. No new libraries to install, no code rewrites.

Quick Integration

The fastest way to integrate NeuronEdge is to reconfigure your existing LLM SDK:

OpenAI SDK (TypeScript)

typescript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: 'https://api.neuronedge.ai/v1/openai',  // Change base URL
  defaultHeaders: {
    'Authorization': `Bearer ${process.env.NEURONEDGE_API_KEY}`,  // Add NeuronEdge key
  },
});

// Use normally - PII is automatically protected
const response = await openai.chat.completions.create({
  model: 'gpt-5.2',
  messages: [{ role: 'user', content: 'Hello, my name is John Smith' }],
});

OpenAI SDK (Python)

python
from openai import OpenAI
import os

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.neuronedge.ai/v1/openai",  # Change base URL
    default_headers={
        "Authorization": f"Bearer {os.environ['NEURONEDGE_API_KEY']}",  # Add NeuronEdge key
    },
)

# Use normally - PII is automatically protected
response = client.chat.completions.create(
    model="gpt-5.2",
    messages=[{"role": "user", "content": "Hello, my name is John Smith"}],
)

Anthropic SDK (TypeScript)

typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  baseURL: 'https://api.neuronedge.ai/v1/anthropic',  // Change base URL
  defaultHeaders: {
    'Authorization': `Bearer ${process.env.NEURONEDGE_API_KEY}`,
  },
});

// Use normally - PII is automatically protected
const response = await anthropic.messages.create({
  model: 'claude-sonnet-4-20250514',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello, my name is John Smith' }],
});

Anthropic SDK (Python)

python
import anthropic
import os

client = anthropic.Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"],
    base_url="https://api.neuronedge.ai/v1/anthropic",  # Change base URL
    default_headers={
        "Authorization": f"Bearer {os.environ['NEURONEDGE_API_KEY']}",  # Add NeuronEdge key
    },
)

# Use normally - PII is automatically protected
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, my name is John Smith"}],
)

Framework Compatibility

NeuronEdge works with any framework that uses standard LLM SDKs:

LangChain
LlamaIndex
Vercel AI SDK
Haystack
LiteLLM
Guidance
Semantic Kernel
AutoGen
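
As one example from the list above, LangChain's ChatOpenAI wrapper forwards a configuration object to the underlying OpenAI client, so the same base URL and header changes apply. A minimal sketch, assuming the @langchain/openai package (exact field names may vary by version):

typescript
import { ChatOpenAI } from '@langchain/openai';

// Route LangChain's OpenAI chat model through NeuronEdge.
// The `configuration` object is passed to the underlying OpenAI client,
// so the same baseURL / header changes shown above apply here.
const model = new ChatOpenAI({
  model: 'gpt-5.2',
  apiKey: process.env.OPENAI_API_KEY,
  configuration: {
    baseURL: 'https://api.neuronedge.ai/v1/openai',         // Change base URL
    defaultHeaders: {
      'Authorization': `Bearer ${process.env.NEURONEDGE_API_KEY}`,  // Add NeuronEdge key
    },
  },
});

// Use normally - PII is automatically protected
const reply = await model.invoke('Hello, my name is John Smith');

Other frameworks follow the same pattern: wherever the framework lets you configure the underlying SDK client, set the base URL and the Authorization header there.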

Get Started

The easiest way to start is with your existing SDK. Just change two lines:

  1. Set baseURL to https://api.neuronedge.ai/v1/{provider}
  2. Add your NeuronEdge API key to the Authorization header

That's it. All your existing code works unchanged with automatic PII protection.
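
If you are calling the API directly rather than through an SDK, the same two changes apply at the HTTP level. A minimal sketch, assuming the gateway exposes the standard OpenAI chat completions route under the base URL shown above (check the API reference for the exact path and for how the upstream provider key is supplied):

typescript
// Assumption: the OpenAI-compatible /chat/completions path is served
// under the base URL shown above, as the OpenAI SDK would construct it.
const response = await fetch('https://api.neuronedge.ai/v1/openai/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${process.env.NEURONEDGE_API_KEY}`,  // NeuronEdge key
  },
  body: JSON.stringify({
    model: 'gpt-5.2',
    messages: [{ role: 'user', content: 'Hello, my name is John Smith' }],
  }),
});

const data = await response.json();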