Vision Agents provides a conversation system that maintains context across interactions. The system supports persistent storage through Stream Chat (default) and in-memory storage for development.
Vision Agents requires a Stream account for real-time transport. Persistent conversations use Stream Chat automatically.

Persistent Conversations (Default)

By default, agents use StreamConversation, which persists messages to Stream Chat; no additional setup is required.
from vision_agents.core import User, Agent, Runner
from vision_agents.core.agents import AgentLauncher
from vision_agents.plugins import getstream, gemini

async def create_agent(**kwargs) -> Agent:
    return Agent(
        edge=getstream.Edge(),
        agent_user=User(name="Assistant", id="agent"),
        instructions="Remember details from our conversation across sessions.",
        llm=gemini.Realtime(),
    )

async def join_call(agent: Agent, call_type: str, call_id: str, **kwargs) -> None:
    await agent.create_user()
    call = await agent.create_call(call_type, call_id)

    async with agent.join(call):
        # Conversation automatically:
        # - Stores user messages from STT
        # - Stores agent responses from LLM
        # - Persists to Stream Chat
        # - Maintains context across sessions
        await agent.simple_response("Hello! I'll remember our conversation.")
        await agent.finish()

if __name__ == "__main__":
    Runner(AgentLauncher(create_agent=create_agent, join_call=join_call)).cli()

Messages are streamed to an ephemeral endpoint before being persisted, which keeps UI updates real-time without affecting performance.

In-Memory Conversations

For development and testing, use InMemoryConversation, which keeps messages in process memory and does not persist them:
from vision_agents.core import User, Agent
from vision_agents.core.agents.conversation import InMemoryConversation
from vision_agents.plugins import getstream, gemini

async def create_agent(**kwargs) -> Agent:
    llm = gemini.LLM()
    # Override the default StreamConversation with an in-memory conversation.
    llm._conversation = InMemoryConversation("Be friendly", [])

    return Agent(
        edge=getstream.Edge(),
        agent_user=User(name="Assistant", id="agent"),
        instructions="You're a conversational AI assistant.",
        llm=llm,
    )

Custom Conversation Storage

Implement the Conversation abstract base class for custom storage:
from vision_agents.core.conversation import Conversation, Message

class CustomConversation(Conversation):
    def add_message(self, message: Message, completed: bool = True):
        """Add a message to your custom storage."""
        pass

    def update_message(self, message_id: str, input_text: str, user_id: str,
                      replace_content: bool, completed: bool):
        """Update an existing message."""
        pass
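
For example, a minimal list-backed implementation might look like the sketch below. It assumes these two methods are the only ones you need to override and that Message exposes id and content attributes (see the table in the next section); check both assumptions against the Conversation and Message definitions in your installed version.

class ListConversation(Conversation):
    """Sketch: keep messages in a plain Python list."""

    def __init__(self):
        self._messages: list[Message] = []

    def add_message(self, message: Message, completed: bool = True):
        # Append the message; a streaming (incomplete) message can still be
        # finalized later through update_message.
        self._messages.append(message)

    def update_message(self, message_id: str, input_text: str, user_id: str,
                       replace_content: bool, completed: bool):
        # Find the message by id and either replace or extend its text.
        for message in self._messages:
            if message.id == message_id:
                if replace_content:
                    message.content = input_text
                else:
                    message.content += input_text
                break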

Message Structure

Each message includes:
Field        Description
content      Message text
role         user or assistant
user_id      Sender identifier
timestamp    When the message was sent
id           Unique message identifier
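
As an illustration, a custom storage backend might serialize these fields before writing them out. The attribute-style access and the message_to_json helper below are assumptions for illustration, based on the field names above; adapt them to however Message exposes its data.

import json

def message_to_json(message: Message) -> str:
    # Hypothetical helper: assumes Message exposes the fields above as attributes.
    return json.dumps({
        "id": message.id,
        "content": message.content,
        "role": message.role,
        "user_id": message.user_id,
        "timestamp": str(message.timestamp),
    })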

Next Steps