This guide explains how to use the built-in `OllamaAgent` node type to interact with local Ollama models in your Greentic flows. Agents support generation, embeddings, and structured assistant prompts with tool calling and state updates.
The `OllamaAgent` wraps a call to your local Ollama instance using the `ollama-rs` client. It supports the following modes:
- `chat`: Structured prompting with JSON output (default)
- `generate`: Plain-text generation
- `embed`: Generate vector embeddings for text

Example YAML:
```yaml
generate_reply:
  ollama:
    task: "Summarise the payload"
    model: "llama3"
    mode: generate
    ollama_url: "http://localhost:11434"
```
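As a sketch of how such a node definition could be checked before a flow runs, here is a small validator in Python. The field names and defaults mirror the example above; the validation logic itself and the fallback model name are illustrative assumptions, not Greentic's actual (Rust) implementation.

```python
# Sketch: validate an "ollama" node definition.
# Field names and defaults follow the YAML example; everything else
# (including the "llama3" fallback model) is an assumption.

VALID_MODES = {"chat", "generate", "embed"}

def validate_ollama_node(node: dict) -> dict:
    """Return a normalised config dict, raising ValueError on bad input."""
    if "task" not in node:
        raise ValueError("'task' is required")
    mode = node.get("mode", "chat")  # chat is the documented default
    if mode not in VALID_MODES:
        raise ValueError(f"unknown mode: {mode!r}")
    return {
        "task": node["task"],
        "model": node.get("model", "llama3"),  # assumed fallback
        "mode": mode,
        "ollama_url": node.get("ollama_url", "http://localhost:11434"),
    }

cfg = validate_ollama_node({"task": "Summarise the payload", "mode": "generate"})
```

Missing optional fields are filled with the documented defaults, so downstream code can rely on every key being present.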
Fields:

- `task`: Required. Used as part of the input to the LLM.
- `model`: Optional. Model name (e.g., `llama3:instruct`, `mistral`, etc.)
- `mode`: Optional. One of `generate`, `chat`, or `embed`. Default: `chat`.
- `ollama_url`: Optional. Default: `http://localhost:11434`.
- `tool_names`: Optional. List of tools this agent can call.
- `model_options`: Optional. Advanced options (e.g., temperature).

Chat is the default and most powerful mode. The agent receives:
- `task`: Instruction describing what to do
- `payload`: Latest message or extracted data
- `state`: Context memory of the session
- `connections`: Allowed follow-ups
- `tools`: Optional callable tools

The model must return structured JSON with:
```json
{
  "payload": { ... },
  "state": {
    "add": [...],
    "update": [...],
    "delete": [...]
  },
  "tool_call": {
    "name": "weather_api",
    "action": "forecast",
    "input": { "q": "London" }
  },
  "connections": ["next_node"],
  "reply_to_origin": false
}
```
Empty fields should be omitted. Responses that do not match this structure are logged as errors.
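To illustrate the contract above, a consumer of a chat-mode reply might sanity-check it like this. This is a Python sketch of the schema only; Greentic's actual parsing lives in its Rust runtime.

```python
import json

# Sketch: parse and sanity-check a chat-mode reply against the schema
# above. Empty fields are omitted by the model, so every key is optional.

def parse_agent_reply(raw: str) -> dict:
    reply = json.loads(raw)
    state = reply.get("state", {})
    for op in ("add", "update", "delete"):
        if op in state and not isinstance(state[op], list):
            raise ValueError(f"state.{op} must be a list")
    tool_call = reply.get("tool_call")
    if tool_call is not None and "name" not in tool_call:
        raise ValueError("tool_call requires a 'name'")
    return reply

raw = ('{"payload": {"city": "London"}, '
       '"connections": ["next_node"], "reply_to_origin": false}')
reply = parse_agent_reply(raw)
```

Here the reply omits `state` and `tool_call` entirely, which the schema allows; only present fields are validated.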
For simple completion tasks:

```yaml
generate_summary:
  ollama:
    mode: generate
    task: "Create summary"
    model: "llama3"
```
Payload:

```json
{ "prompt": "Explain why the sky is blue" }
```

Returns:

```json
{ "generated_text": "..." }
```
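Under the hood, generate mode maps to a single completion request against Ollama's `/api/generate` endpoint. The sketch below builds such a request body in Python; how Greentic actually combines `task` with the incoming `prompt` is an assumption made for illustration.

```python
import json

# Sketch: the request body an agent in generate mode could send to
# Ollama's /api/generate endpoint. Joining `task` and `prompt` with a
# blank line is an illustrative assumption.

def build_generate_request(task: str, prompt: str, model: str = "llama3") -> str:
    return json.dumps({
        "model": model,
        "prompt": f"{task}\n\n{prompt}",
        "stream": False,  # request one complete JSON response, not a stream
    })

body = build_generate_request("Create summary", "Explain why the sky is blue")
```

Setting `"stream": false` asks Ollama for a single JSON object instead of a stream of partial responses, which matches the single `generated_text` result shown above.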
To compute vector embeddings:

```yaml
vectorise:
  ollama:
    mode: embed
    task: "Embed this"
    model: "llama3-embed"
```
Payload:

```json
{ "text": "The quick brown fox." }
```

Returns:

```json
{ "embeddings": [0.123, -0.456, ...] }
```
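Embedding vectors like the one in `embeddings` are typically compared with cosine similarity to measure semantic closeness. A minimal, dependency-free sketch:

```python
import math

# Sketch: compare two embedding vectors (as returned in "embeddings")
# with cosine similarity, the usual metric for semantic closeness.

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0.
same = cosine_similarity([1.0, 0.0], [1.0, 0.0])
ortho = cosine_similarity([1.0, 0.0], [0.0, 1.0])
```

In a flow, you would store the vectors produced by an `embed` node and rank candidate texts by this score.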
You’re now ready to use AI agents inside Greentic to power dynamic, tool-calling, state-aware flows!