Context
For LLMs to provide high-quality answers to users' questions, they need the right information. Sometimes this information is contextual, based on the user or the state of the application. To support this, you can send `aiContext` with any user message to the LLM; it can be any structured or unstructured data that might be useful.
```ts
import { generateClient } from "aws-amplify/data";
import type { Schema } from "../amplify/data/resource";

const client = generateClient<Schema>({ authMode: 'userPool' });

const { data: conversation } = await client.conversations.chat.create();

conversation.sendMessage({
  content: [{ text: "hello" }],
  // aiContext can be any shape
  aiContext: { username: "danny" },
});
```