

OpenAI is an AI research and deployment company that provides access to various large language models (LLMs) through its Platform REST API. Connectors let you quickly integrate an LLM or Assistant with your existing APIs, all without writing much code.
Click ► GenerateStory to run the operation. You just got back real data from a ChatGPT model! Don't see GenerateStory? Try closing the tab in Sandbox.
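If you're curious what that operation looks like, here is a rough sketch; the field and argument names below are assumptions for illustration, so check the operation tab in Sandbox for the real definition:

```graphql
# Hypothetical shape of the GenerateStory operation —
# open it in Sandbox to see the actual fields and arguments
query GenerateStory {
  generateStory(prompt: "Tell me a short story about a robot")
}
```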
Want to try it with your OpenAI account? Just add your Authorization header to the request - don't worry, we don't log any data or headers from this demo.
Toggle the Response dropdown and select Query Plan. This shows the router's strategy for executing an incoming operation efficiently.
Switch the Query Plan dropdown to Connectors Debugger to inspect the details of each request.
Create a new graph in GraphOS and follow the "Set up your local development environment" instructions.
Update your local router.yaml file to configure your OPENAI_API_KEY - be sure to update your supergraph.yaml to use the same subgraph name (chat).
# router.yaml
connectors:
  subgraphs:
    chat.openai:
      $config:
        apiKey: ${env.OPENAI_API_KEY}
# supergraph.yaml
federation_version: =2.10.0
subgraphs:
  chat:
    routing_url: http://openai
    schema:
      file: connector.graphql
Copy the prebuilt Connector file content into the .graphql file in your local project.
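The prebuilt Connector file contains everything you need, but as a rough illustration of what a Connector schema looks like, here is a simplified sketch; the field name, model, and selection mapping below are assumptions, not the prebuilt file's contents:

```graphql
extend schema
  @link(url: "https://specs.apollo.dev/connect/v0.1", import: ["@connect", "@source"])
  @source(
    name: "openai"
    http: {
      baseURL: "https://api.openai.com/v1"
      # apiKey is resolved from $config in router.yaml
      headers: [{ name: "Authorization", value: "Bearer {$config.apiKey}" }]
    }
  )

type Query {
  # Hypothetical field — the prebuilt schema defines its own operations;
  # the request body mapping is omitted here for brevity
  generateStory(prompt: String!): String
    @connect(
      source: "openai"
      http: { POST: "/chat/completions" }
      selection: "$.choices->first.message.content"
    )
}
```

The `@source` directive declares the upstream REST API once, and each `@connect` field maps a GraphQL field to an HTTP request against it.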
Run rover dev with your OPENAI_API_KEY.
export OPENAI_API_KEY=...
rover dev \
  --supergraph-config supergraph.yaml \
  --router-config router.yaml
Learn more about Connectors by checking out the Connectors documentation.
Have questions? Check out the Apollo Community for Connectors to discuss.