Token.js

Integrate 200+ LLMs with one TypeScript SDK using OpenAI's format. Free and open source. No proxy server required.

Features

  • Use OpenAI's format to call 200+ LLMs from 10 providers.

  • Supports tools, JSON outputs, image inputs, streaming, and more.

  • Runs completely on the client side. No proxy server needed.

  • Free and open source under MIT.

Supported Providers

  • AI21

  • Anthropic

  • AWS Bedrock

  • Cohere

  • Gemini

  • Groq

  • Mistral

  • OpenAI

  • Perplexity

  • OpenRouter

  • Any other model provider with an OpenAI-compatible API

Setup

Installation
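
Install the SDK from npm (the package name `token.js` is assumed here):

```shell
npm install token.js
```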

Usage

Import the Token.js client and call the create function with a prompt in OpenAI's format. Specify the model and LLM provider using their respective fields.
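
A minimal sketch of what this looks like. The client class name `TokenJS` and the `chat.completions.create` method follow Token.js's OpenAI-style conventions, and the `provider` and `model` values are illustrative:

```typescript
import { TokenJS } from 'token.js'

// Create the Token.js client
const tokenjs = new TokenJS()

async function main() {
  // Call the create function with a provider, a model, and a
  // prompt in OpenAI's message format
  const completion = await tokenjs.chat.completions.create({
    provider: 'openai',
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Hello!' }],
  })

  console.log(completion.choices[0].message.content)
}

main()
```

Because the request and response follow OpenAI's format, switching providers is a matter of changing the `provider` and `model` fields.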

Access Credentials

We recommend using environment variables to configure the credentials for each LLM provider.
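
For example (the variable names below are illustrative; each provider reads its own key):

```shell
# Set the API key for each provider you plan to call
export OPENAI_API_KEY="<your OpenAI key>"
export ANTHROPIC_API_KEY="<your Anthropic key>"
```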

Streaming

Token.js supports streaming responses for every provider that offers them.
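
A sketch of streaming usage, assuming the same OpenAI-style API as above: passing `stream: true` makes `create` return an async iterable of chunks.

```typescript
import { TokenJS } from 'token.js'

const tokenjs = new TokenJS()

async function main() {
  // Request a streamed response with stream: true
  const result = await tokenjs.chat.completions.create({
    stream: true,
    provider: 'openai',
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Tell me about yourself.' }],
  })

  // Print each chunk's content as it arrives
  for await (const part of result) {
    process.stdout.write(part.choices[0]?.delta?.content ?? '')
  }
}

main()
```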

Function Calling

Token.js supports the function calling tool for all providers and models that offer it.
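
A sketch using an OpenAI-format tool definition (the `get_current_weather` function and its parameters are hypothetical):

```typescript
import { TokenJS } from 'token.js'

const tokenjs = new TokenJS()

// A tool definition in OpenAI's function-calling format
const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'get_current_weather',
      description: 'Get the current weather in a given location',
      parameters: {
        type: 'object',
        properties: {
          location: {
            type: 'string',
            description: 'The city, e.g. San Francisco',
          },
        },
        required: ['location'],
      },
    },
  },
]

async function main() {
  const result = await tokenjs.chat.completions.create({
    provider: 'openai',
    model: 'gpt-4o',
    messages: [{ role: 'user', content: "What's the weather in Boston?" }],
    tools,
    tool_choice: 'auto',
  })

  // The model's requested tool calls, if any
  console.log(result.choices[0].message.tool_calls)
}

main()
```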

Feature Compatibility

This table provides an overview of the features that Token.js supports from each LLM provider.

Provider            Chat Completion   Streaming   Function Calling Tool   JSON Output   Image Input

OpenAI
Anthropic
Bedrock
Mistral
Cohere
AI21
Gemini
Groq
Perplexity
OpenRouter
OpenAI Compatible

Legend

Symbol    Description

          Supported by Token.js
          Not supported by the LLM provider, so Token.js cannot support it

Note: Certain LLMs, particularly older or weaker models, do not support some features in this table. For details about these restrictions, see our LLM provider documentation.

License

Token.js is free and open source software licensed under the MIT License.
