The `model` is responsible for deciding which tools to call and for generating the final response.
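As a minimal sketch, a model can be created from the agents context inside a workflow endpoint (helper name and model id shown here follow common Workflow examples; verify against the current Workflow reference):

```typescript
import { serve } from "@upstash/workflow/nextjs";

export const { POST } = serve(async (context) => {
  // Default OpenAI-backed model; reads OPENAI_API_KEY from the environment.
  const model = context.agents.openai("gpt-4o");
  // ...define agents that use `model` here
});
```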
First, add `QSTASH_TOKEN` and `OPENAI_API_KEY` to your environment:
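For example, in a `.env` file (placeholder values):

```shell
# Token for Upstash QStash, which Workflow uses to run steps
QSTASH_TOKEN="<your-qstash-token>"
# API key for the default OpenAI-backed model
OPENAI_API_KEY="<your-openai-api-key>"
```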
To use an OpenAI-compatible provider, set `baseURL` and `apiKey`:
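For instance, an OpenAI-compatible provider such as DeepSeek could be configured like this (provider, URL, and model id are illustrative, not prescribed by this page):

```typescript
// Inside a workflow: point the OpenAI-compatible helper at another
// provider by overriding baseURL and apiKey. Values are illustrative.
const model = context.agents.openai("deepseek-chat", {
  baseURL: "https://api.deepseek.com",
  apiKey: process.env.DEEPSEEK_API_KEY!,
});
```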
You can also use other providers through the `create<Provider>` methods exported from AI SDK provider packages. For example, to use Anthropic:
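A sketch using the AI SDK's Anthropic package (`@ai-sdk/anthropic` and `createAnthropic` are the AI SDK's own names; how the provider is wired into the agent goes through the `agents.AISDKModel` wrapper, whose exact options you should verify against the Workflow reference):

```typescript
import { createAnthropic } from "@ai-sdk/anthropic";

// AI SDK provider factory; apiKey defaults to ANTHROPIC_API_KEY,
// shown explicitly here for clarity.
const anthropic = createAnthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// The resulting provider is then wrapped for the agent via
// agents.AISDKModel (exact wrapper options: see the Workflow docs).
```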
You can also pass call-time parameters for the agent's LLM requests through the `agentCallParams` parameter:
These are forwarded to `agents.AISDKModel` as the `agentCallParams` parameter.
Tool `execute`/`invoke` functions are wrapped with `context.run` to run them as a Workflow step. This means you can't use steps like `context.call`, `context.notify`, etc. in `execute`/`invoke` right away. If you want to define steps in `invoke`/`execute`, you can use the `executeAsStep` option:
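A sketch of a tool that defines its own steps, assuming a `WorkflowTool` helper with an `executeAsStep` flag as described above (treat the field names as approximate and check them against the Workflow reference):

```typescript
import { WorkflowTool } from "@upstash/workflow";
import { z } from "zod";

// `context` here is the surrounding WorkflowContext.
const fetchTool = new WorkflowTool({
  description: "Fetch a page through a workflow step",
  schema: z.object({ url: z.string() }),
  invoke: async ({ url }) => {
    // With executeAsStep disabled, invoke is NOT wrapped in
    // context.run, so context.call can be used directly here.
    const { body } = await context.call("fetch-page", { url, method: "GET" });
    return body;
  },
  // Disable the automatic context.run wrapping:
  executeAsStep: false,
});
```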
- `model`: The LLM model that the agent will use.
- `name`: The name of the agent. This name is used when naming the `context.call` steps for this agent (`context.call` is used when calling the LLM provider, such as OpenAI).
- `maxSteps`: The maximum number of times this agent can call the LLM provider (e.g., OpenAI).
- `tools`: The list of tools available to the agent for completing tasks.
- `background`: A description of the agent, used as a system prompt to provide context for the agent's behavior.

The Manager LLM first called the `academic` agent. The `academic` agent used the `wikiTool` three times, each time with a different Japanese city, and summarized the results. Next, the Manager LLM called the `mathematician` agent, which used its `calculate` tool to compute the total population of the three cities and returned the result to the Manager LLM. With information about the cities and their total population, the Manager LLM generated the final response.
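The control flow above can be mimicked in plain TypeScript, with made-up population figures standing in for real `wikiTool` lookups (purely illustrative; no Workflow APIs involved):

```typescript
// Hypothetical stand-in for the academic agent's wikiTool:
// returns a made-up population figure for a city.
const wikiTool = (city: string): number =>
  ({ Tokyo: 14_000_000, Osaka: 2_750_000, Kyoto: 1_460_000 } as Record<string, number>)[city] ?? 0;

// Stand-in for the mathematician agent's calculate tool.
const calculate = (populations: number[]): number =>
  populations.reduce((sum, p) => sum + p, 0);

// Manager flow: the academic agent looks up each city...
const cities = ["Tokyo", "Osaka", "Kyoto"];
const populations = cities.map(wikiTool);

// ...then the mathematician agent totals the results,
// which the manager uses to generate the final response.
const total = calculate(populations);
```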