Add LLM reasoning to a Jido agent with jido_ai, configure a provider, and run your first AI-enhanced command.
Complete the "Your first agent" tutorial before starting. You need an API key for OpenAI or another supported provider.
jido_ai adds LLM reasoning on top of the core Jido runtime. req_llm is the provider-agnostic transport layer that jido_ai uses underneath.
Mix.install([
  {:jido, "~> 2.0"},
  {:jido_ai, github: "agentjido/jido_ai", branch: "main"},
  {:req_llm, "~> 1.6"}
])
# Suppress verbose runtime logs so we only see warnings and errors
Logger.configure(level: :warning)
Set your OpenAI API key. In Livebook, add OPENAI_API_KEY as a Livebook Secret; Livebook exposes secrets to your code with an LB_ prefix, so it becomes the LB_OPENAI_API_KEY environment variable. The cell below checks both the Livebook secret and a plain environment variable.
openai_key = System.get_env("LB_OPENAI_API_KEY") || System.get_env("OPENAI_API_KEY")
if openai_key do
  ReqLLM.put_key(:openai_api_key, openai_key)
  :configured
else
  IO.puts("Set OPENAI_API_KEY as a Livebook Secret or environment variable to run the LLM cells.")
  :no_key
end
In the previous tutorial you built a deterministic agent with typed state and actions. This tutorial adds LLM reasoning to a Jido agent.
The Jido.AI.Agent macro gives your agent a system prompt, a model, and the full Jido lifecycle. The module below defines a greeter that generates a short welcome message.
defmodule MyApp.Greeter do
  use Jido.AI.Agent,
    name: "greeter",
    description: "Generates a friendly greeting",
    tools: [],
    model: "openai:gpt-4o-mini",
    system_prompt: """
    You are a friendly greeter.
    Generate a short, warm welcome message.
    One or two sentences maximum.
    """
end
model: "openai:gpt-4o-mini" selects a fast, inexpensive model. You can swap in any model string supported by req_llm (for example "anthropic:claude-haiku-4-5"). tools: [] means no external tool calls for now.
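Swapping providers changes only the model string; the rest of the agent definition is identical. Here is a minimal sketch of an Anthropic-backed variant, assuming your Anthropic key is configured for req_llm the same way the OpenAI key was above (the module name MyApp.GreeterClaude is just an example, not part of the library):

```elixir
defmodule MyApp.GreeterClaude do
  # Same agent shape as MyApp.Greeter, pointed at an Anthropic model.
  # Assumes an Anthropic API key has been registered with req_llm.
  use Jido.AI.Agent,
    name: "greeter_claude",
    description: "Generates a friendly greeting",
    tools: [],
    model: "anthropic:claude-haiku-4-5",
    system_prompt: """
    You are a friendly greeter.
    Generate a short, warm welcome message.
    One or two sentences maximum.
    """
end
```

You would then start and query it exactly like the OpenAI version: Jido.AgentServer.start_link(agent: MyApp.GreeterClaude) followed by ask_sync/2.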
The agent server needs the Jido runtime (a registry, a dynamic supervisor, and a task supervisor). Define a Jido module and start it:
defmodule MyApp.Jido do
  use Jido, otp_app: :my_app
end
{:ok, _} = MyApp.Jido.start_link(name: Jido)
Start the agent through Jido.AgentServer and send it a prompt:
{:ok, pid} = Jido.AgentServer.start_link(agent: MyApp.Greeter)
MyApp.Greeter.ask_sync(pid, "Say hello to someone just getting started with Jido.")
If you see a greeting string, your AI integration is working. If you get a provider error, verify that your API key is set correctly.
Here is the flow you just ran:

- MyApp.Jido.start_link/1 started the Jido runtime (registry, agent supervisor, and task supervisor).
- Jido.AgentServer.start_link/1 spawned a process for your agent and registered it in the runtime.
- ask_sync/2 sent the prompt to the agent process, which called the model through req_llm.

Your agent module contains no HTTP calls, no API client code, and no mutable state. The LLM interaction is handled by the runtime, so your agent definition stays declarative and testable.
When you move from Livebook to a Mix project, add the dependencies to mix.exs and configure the provider key in config/runtime.exs.
mix.exs
defp deps do
  [
    {:jido, "~> 2.0"},
    {:jido_ai, github: "agentjido/jido_ai", branch: "main"},
    {:req_llm, "~> 1.6"}
  ]
end
config/runtime.exs
import Config
config :req_llm, openai_api_key: System.get_env("OPENAI_API_KEY")
config :logger, level: :warning
Export the key in the shell session where you run your app:
export OPENAI_API_KEY="your-api-key-here"
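If you deploy with a different provider, the same runtime configuration pattern applies. A sketch for Anthropic, assuming :anthropic_api_key is the key name req_llm expects (check the req_llm documentation for your provider's exact key name):

```elixir
# config/runtime.exs (hypothetical Anthropic variant)
config :req_llm, anthropic_api_key: System.get_env("ANTHROPIC_API_KEY")
```

The corresponding shell export would then be ANTHROPIC_API_KEY instead of OPENAI_API_KEY.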
Then add your Jido app to your application’s supervision tree:
my_app/jido.ex
defmodule MyApp.Jido do
  use Jido, otp_app: :my_app
end
my_app/application.ex
defmodule MyApp.Application do
  use Application

  @impl true
  def start(_type, _args) do
    children = [
      ...
      {MyApp.Jido, name: Jido}
    ]

    opts = [strategy: :one_for_one, name: MyApp.Supervisor]
    Supervisor.start_link(children, opts)
  end
end
From here, explore the rest of the jido_ai surface, such as adding entries to tools: [] so your agent can call external tools.