Set up AI
In this guide, you will learn how to get started with the Amplify AI kit. This includes defining your AI backend with Conversation and Generation routes and securely connecting to them from your frontend application.
Prerequisites
Before you begin, you will need:
You will also need an AWS account that is set up for local development and has access to the Bedrock foundation model(s) you want to use. You can request access to Bedrock models by going to the Amazon Bedrock console and requesting model access.
Create an Amplify backend
Run the create amplify script in your project directory:
npm create amplify@latest
Then run the Amplify sandbox to start your local cloud sandbox:
npx ampx sandbox
This will provision the cloud resources you define in your amplify folder, watch for changes, and redeploy them as needed.
Build your AI backend
To build an AI backend, you define AI 'routes' in your Amplify Data schema. An AI route is like an API endpoint for interacting with backend AI functionality. There are currently two types of routes:
- Conversation: A conversation route is a streaming, multi-turn API. Conversations and messages are automatically stored in DynamoDB so users can resume them. Examples include any chat-based AI experience or conversational UI.
- Generation: A generation route is a single, synchronous request-response API; it is just an AppSync query. Examples include generating alt text for an image, generating structured data from unstructured input, and summarization.
To define AI routes, open your amplify/data/resource.ts file and use a.generation() and a.conversation() in your schema.
```ts
import { a, defineData, type ClientSchema } from '@aws-amplify/backend';

const schema = a.schema({
  // This will add a new conversation route to your Amplify Data backend.
  chat: a.conversation({
    aiModel: a.ai.model('Claude 3 Haiku'),
    systemPrompt: 'You are a helpful assistant',
  })
    .authorization((allow) => allow.owner()),

  // This adds a new generation route to your Amplify Data backend.
  generateRecipe: a.generation({
    aiModel: a.ai.model('Claude 3 Haiku'),
    systemPrompt: 'You are a helpful assistant that generates recipes.',
  })
    .arguments({
      description: a.string(),
    })
    .returns(
      a.customType({
        name: a.string(),
        ingredients: a.string().array(),
        instructions: a.string(),
      })
    )
    .authorization((allow) => allow.authenticated()),
});

// Export the schema type for the frontend client and define the data resource.
// The default authorization mode matches the `userPool` authMode used on the client later.
export type Schema = ClientSchema<typeof schema>;

export const data = defineData({
  schema,
  authorizationModes: {
    defaultAuthorizationMode: 'userPool',
  },
});
```
If the Amplify sandbox is running, it will pick up the changes when you save this file and redeploy the necessary resources for you.
Connect your frontend
Once the cloud sandbox is up and running, it will also create an amplify_outputs.json
file, which includes relevant connection information to your AI routes and other Amplify configuration.
To connect your frontend code to your backend, you need to:
- Configure the Amplify library with the Amplify client configuration file (amplify_outputs.json).
- Generate a new API client from the Amplify library.
- Make an API request with end-to-end type-safety.
Install the client libraries
Install the Amplify client library in your project:
npm add aws-amplify
Configure the libraries
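Call Amplify.configure with the generated outputs file early in your application's startup code. A minimal sketch, assuming amplify_outputs.json sits one directory above your frontend source (adjust the import path for your project layout):

```ts
import { Amplify } from 'aws-amplify';
// Path is an assumption; point this at the amplify_outputs.json generated by the sandbox
import outputs from '../amplify_outputs.json';

Amplify.configure(outputs);
```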
Generate the data client
Next, generate a type-safe frontend client to talk to your backend, using your data schema and the generateClient() function provided by the Amplify libraries.
```ts
import { generateClient } from "aws-amplify/api";
import type { Schema } from "../amplify/data/resource";

export const client = generateClient<Schema>({ authMode: "userPool" });
```
Use a generation
```ts
import { client } from './client';

const { data } = await client.generations.generateRecipe({
  description: 'A gluten free chocolate cake',
});
```
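Like other Amplify Data operations, the response also includes an errors field you can check before using the result. A short sketch, reusing the same client and route:

```ts
import { client } from './client';

const { data, errors } = await client.generations.generateRecipe({
  description: 'A gluten free chocolate cake',
});

if (errors) {
  console.error(errors);
} else {
  // data is typed from the .returns() definition on the generateRecipe route
  console.log(data?.name, data?.ingredients, data?.instructions);
}
```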
Use a conversation
AI conversations are scoped to a user, so your users will need to be logged in with Amplify auth. The easiest way to do this is with the Authenticator component.
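For example, in a React app you can wrap your UI with the Authenticator so only signed-in users reach the chat. A sketch, assuming @aws-amplify/ui-react is installed:

```tsx
import { Authenticator } from '@aws-amplify/ui-react';
import '@aws-amplify/ui-react/styles.css';

export default function App() {
  return (
    // Only renders its children once the user has signed in
    <Authenticator>
      {({ user, signOut }) => (
        <main>
          <h1>Hello, {user?.username}</h1>
          <button onClick={signOut}>Sign out</button>
        </main>
      )}
    </Authenticator>
  );
}
```

Once the user is signed in, you can create a conversation and stream responses: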
```ts
import { client } from './client';

// Create a new conversation
const { data: conversation } = await client.conversations.chat.create();

// Assistant messages come back as websocket events
// over a subscription
conversation.onStreamEvent({
  next: (event) => {
    console.log(event);
  },
  error: (error) => {
    console.log(error);
  },
});

// When sending user messages you only need to send
// the latest message; the conversation history
// is stored in DynamoDB and retrieved in Lambda
conversation.sendMessage({
  content: [{ text: "hello" }],
});
```
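Because conversations are stored in DynamoDB, a user can pick an existing conversation back up later. A sketch, assuming the conversation route client exposes list() and get() helpers as in the Amplify AI kit conversation API:

```ts
import { client } from './client';

// List the current user's conversations (assumes at least one exists)
const { data: conversations } = await client.conversations.chat.list();

// Resume the most recent conversation by id
const { data: resumed } = await client.conversations.chat.get({
  id: conversations[0].id,
});

// Continue streaming assistant responses on the resumed conversation
resumed?.onStreamEvent({
  next: (event) => console.log(event),
  error: (error) => console.error(error),
});
```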