Connect OpenAI
Assistants API in minutes

OpenAI is an AI research and deployment company that provides access to various large language models (LLMs) through its platform's REST API. Connectors let you quickly integrate an LLM or Assistant with your existing APIs - all without writing custom resolver code.

Try out the Connector!

  • Click ► GenerateStory to run the operation. You just got back real data from a ChatGPT model! Don't see GenerateStory? Try closing the tab in Sandbox.

    Want to try it with your OpenAI account? Just add your Authorization header to the request - don't worry, we don't log any data or headers from this demo.

  • Toggle the Response dropdown and select Query Plan. This shows the strategy the router uses to execute an incoming operation efficiently.

  • Switch the Query Plan dropdown to Connectors Debugger to inspect the details of each request.
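Under the hood, a Connector like GenerateStory maps a GraphQL field to an HTTP call against OpenAI's Chat Completions endpoint. A minimal Python sketch of the kind of request it might issue - the model name, prompt, and helper function here are illustrative assumptions, not the demo's actual implementation:

```python
import json

# OpenAI's Chat Completions endpoint (the demo's actual route may differ).
OPENAI_CHAT_URL = "https://api.openai.com/v1/chat/completions"

def build_generate_story_request(prompt: str, api_key: str) -> dict:
    """Assemble the URL, headers, and JSON body for the upstream call.

    Returns the pieces without sending them, so the shape of the
    request is easy to inspect.
    """
    return {
        "url": OPENAI_CHAT_URL,
        "headers": {
            # This is the Authorization header you would supply to try
            # the demo with your own account.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "gpt-4o-mini",  # illustrative model choice
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

request = build_generate_story_request("Tell me a short story", "sk-...")
print(request["url"])
```

The router performs the equivalent of this call for you on every GenerateStory operation, then maps the JSON response back into your GraphQL schema.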

Get started with your LLM or Assistant

  • Create a new graph in GraphOS and follow the "Set up your local development environment" instructions.

  • Update your local router.yaml file to configure your OPENAI_API_KEY, and make sure your supergraph.yaml uses the same subgraph name (chat).

# router.yaml
connectors:
  subgraphs:
    chat.openai:
      $config:
        apiKey: ${env.OPENAI_API_KEY}
        
# supergraph.yaml
federation_version: =2.10.0
subgraphs:
  chat:
    routing_url: http://openai
    schema:
      file: connector.graphql

# terminal
export OPENAI_API_KEY=...
rover dev \
  --supergraph-config supergraph.yaml \
  --router-config router.yaml
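The supergraph.yaml above points at a connector.graphql file, which is where the Connector itself is declared. As a rough sketch only - the directive versions, field names, response mapping, and model are assumptions, not the demo's actual schema - a Connectors schema for this API might look like:

```graphql
extend schema
  @link(url: "https://specs.apollo.dev/federation/v2.10", import: ["@key"])
  @link(url: "https://specs.apollo.dev/connect/v0.1", import: ["@source", "@connect"])
  @source(
    name: "openai"
    http: {
      baseURL: "https://api.openai.com/v1"
      # Reads the apiKey value set under $config in router.yaml
      headers: [{ name: "Authorization", value: "Bearer {$config.apiKey}" }]
    }
  )

type Query {
  generateStory: Story
    @connect(
      source: "openai"
      http: {
        POST: "/chat/completions"
        body: """
        model: "gpt-4o-mini"
        messages: [{ role: "user", content: "Tell me a short story" }]
        """
      }
      selection: """
      story: choices->first.message.content
      """
    )
}

type Story {
  story: String
}
```

The @source directive centralizes the base URL and auth header, and each @connect maps one field to an HTTP request plus a selection that picks fields out of the JSON response.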

Connectors resources