The Dapr Conversation API is currently in alpha. This page presents the recommended, minimal patterns for using it effectively with the Python SDK.

Prerequisites

For full, end‑to‑end flows and provider setup, see:

Plain conversation (no tools)

from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

# Build a single‑turn Alpha2 input
user_msg = conversation.create_user_message("What's Dapr?")
alpha2_input = conversation.ConversationInputAlpha2(messages=[user_msg])

with DaprClient() as client:
    resp = client.converse_alpha2(
        name="echo",  # replace with your LLM component name
        inputs=[alpha2_input],
        temperature=1,
    )

    for msg in resp.to_assistant_messages():
        if msg.of_assistant.content:
            print(msg.of_assistant.content[0].text)

Key points:

Tool calling with the @conversation.tool decorator

Decorator-based tools offer a clean, ergonomic approach. Define a function with clear type hints and a detailed docstring (both help the LLM understand how and when to invoke the tool), then decorate it with @conversation.tool. Registered tools can be passed to the LLM and invoked via tool calls.

from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

@conversation.tool
def get_weather(location: str, unit: str = 'fahrenheit') -> str:
    """Get current weather for a location."""
    # Replace with a real implementation
    return f"Weather in {location} (unit={unit})"

user_msg = conversation.create_user_message("What's the weather in Paris?")
alpha2_input = conversation.ConversationInputAlpha2(messages=[user_msg])

with DaprClient() as client:
    response = client.converse_alpha2(
        name="openai",  # your LLM component
        inputs=[alpha2_input],
        tools=conversation.get_registered_tools(),  # tools registered by @conversation.tool
        tool_choice='auto',
        temperature=1,
    )

    # Inspect assistant messages, including any tool calls
    for msg in response.to_assistant_messages():
        if msg.of_assistant.tool_calls:
            for tc in msg.of_assistant.tool_calls:
                print(f"Tool call: {tc.function.name} args={tc.function.arguments}")
        elif msg.of_assistant.content:
            print(msg.of_assistant.content[0].text)

Notes:

Minimal multi‑turn with tools

This is the go‑to loop for tool‑using conversations:

from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

@conversation.tool
def get_weather(location: str, unit: str = 'fahrenheit') -> str:
    """Get current weather for a location."""
    return f"Weather in {location} (unit={unit})"

history: list[conversation.ConversationMessage] = [
    conversation.create_user_message("What's the weather in San Francisco?"),
]

with DaprClient() as client:
    # Turn 1
    resp1 = client.converse_alpha2(
        name="openai",
        inputs=[conversation.ConversationInputAlpha2(messages=history)],
        tools=conversation.get_registered_tools(),
        tool_choice='auto',
        temperature=1,
    )

    # Append assistant messages; execute tool calls; append tool results
    for msg in resp1.to_assistant_messages():
        history.append(msg)
        for tc in msg.of_assistant.tool_calls:
            # IMPORTANT: validate inputs and enforce guardrails in production
            tool_output = conversation.execute_registered_tool(
                tc.function.name, tc.function.arguments
            )
            history.append(
                conversation.create_tool_message(
                    tool_id=tc.id, name=tc.function.name, content=str(tool_output)
                )
            )

    # Turn 2 (LLM sees tool result)
    history.append(conversation.create_user_message("Should I bring an umbrella?"))
    resp2 = client.converse_alpha2(
        name="openai",
        inputs=[conversation.ConversationInputAlpha2(messages=history)],
        tools=conversation.get_registered_tools(),
        temperature=1,
    )

    for msg in resp2.to_assistant_messages():
        history.append(msg)
        if not msg.of_assistant.tool_calls and msg.of_assistant.content:
            print(msg.of_assistant.content[0].text)

Tips:

Functions as tools: alternatives

When decorators aren’t practical, two options exist.

A) Automatic schema from a typed function:

from enum import Enum
from dapr.clients.grpc import conversation

class Units(Enum):
    CELSIUS = 'celsius'
    FAHRENHEIT = 'fahrenheit'

def get_weather(location: str, unit: Units = Units.FAHRENHEIT) -> str:
    """Get current weather for a location."""
    return f"Weather in {location} (unit={unit.value})"

fn = conversation.ConversationToolsFunction.from_function(get_weather)
weather_tool = conversation.ConversationTools(function=fn)

B) Manual JSON Schema (fallback):

from dapr.clients.grpc import conversation

fn = conversation.ConversationToolsFunction(
    name='get_weather',
    description='Get current weather',
    parameters={
        'type': 'object',
        'properties': {
            'location': {'type': 'string'},
            'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']},
        },
        'required': ['location'],
    },
)
weather_tool = conversation.ConversationTools(function=fn)
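
Either way, you pass the tool to the request yourself. The sketch below assumes that a ConversationTools instance (the weather_tool built in A or B) can be supplied directly via the tools parameter, mirroring the registered-tools examples above, and that tool-call arguments arrive as a JSON string; adjust the dispatch to your own implementation.

import json

from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

with DaprClient() as client:
    resp = client.converse_alpha2(
        name="openai",  # your LLM component, as in the earlier examples
        inputs=[conversation.ConversationInputAlpha2(
            messages=[conversation.create_user_message("What's the weather in Paris?")]
        )],
        tools=[weather_tool],  # the ConversationTools instance built in A) or B)
        tool_choice='auto',
        temperature=1,
    )

    # Tools built this way are not auto-registered, so dispatch any suggested
    # calls to your own implementation instead of execute_registered_tool().
    for msg in resp.to_assistant_messages():
        for tc in msg.of_assistant.tool_calls:
            if tc.function.name == 'get_weather':
                args = json.loads(tc.function.arguments)  # assumes JSON-encoded args
                print(get_weather(location=args.get('location', '')))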

Async variant

Use the asynchronous client and async tool execution helpers as needed.

import asyncio
from dapr.aio.clients import DaprClient as AsyncDaprClient
from dapr.clients.grpc import conversation

@conversation.tool
def get_time() -> str:
    """Return the current UTC time as an ISO 8601 string (stubbed)."""
    return '2025-01-01T12:00:00Z'

async def main():
    async with AsyncDaprClient() as client:
        msg = conversation.create_user_message('What time is it?')
        inp = conversation.ConversationInputAlpha2(messages=[msg])
        resp = await client.converse_alpha2(
            name='openai', inputs=[inp], tools=conversation.get_registered_tools()
        )
        for m in resp.to_assistant_messages():
            if m.of_assistant.content:
                print(m.of_assistant.content[0].text)

asyncio.run(main())

If you need to execute tools asynchronously (e.g., network I/O), implement async functions and use conversation.execute_registered_tool_async with timeouts.
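
For example, an async tool plus a small wrapper that bounds its runtime might look like the sketch below. It assumes execute_registered_tool_async takes the same (name, arguments) pair as its synchronous counterpart shown earlier; lookup_order is a hypothetical tool used only for illustration.

import asyncio

from dapr.clients.grpc import conversation

@conversation.tool
async def lookup_order(order_id: str) -> str:
    """Look up an order's shipping status (stand-in for real network I/O)."""
    await asyncio.sleep(0.1)
    return f"order {order_id}: shipped"

async def run_tool_call(tc, timeout: float = 5.0) -> str:
    # Bound each tool's runtime so a slow call cannot stall the conversation loop.
    return await asyncio.wait_for(
        conversation.execute_registered_tool_async(
            tc.function.name, tc.function.arguments
        ),
        timeout=timeout,
    )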

Safety and validation (must‑read)

An LLM may suggest tool calls. Treat all model‑provided parameters as untrusted input.

Recommendations:

See also inline notes in dapr/clients/grpc/conversation.py (e.g., tool(), ConversationTools, execute_registered_tool) for parameter binding and error handling details.
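
As a concrete illustration, a minimal validation step before executing a suggested call might look like the sketch below. The allow-list and argument checks are illustrative only; the tool-call shape and the execute_registered_tool usage follow the multi-turn example above, and arguments are assumed to be a JSON string.

import json

from dapr.clients.grpc import conversation

ALLOWED_TOOLS = {'get_weather'}  # explicit allow-list of tools the app may run

def safe_execute(tc) -> str:
    """Validate a model-suggested tool call before executing it."""
    if tc.function.name not in ALLOWED_TOOLS:
        raise ValueError(f"tool not allowed: {tc.function.name}")

    # Assumes arguments arrive as a JSON string, as in the examples above.
    args = json.loads(tc.function.arguments)
    location = args.get('location')
    if not isinstance(location, str) or not location or len(location) > 100:
        raise ValueError("invalid 'location' argument")

    # Only execute after the checks pass; forward the original arguments,
    # mirroring the multi-turn example above.
    return str(conversation.execute_registered_tool(tc.function.name, tc.function.arguments))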

Key helper methods (quick reference)

This section summarizes the helper utilities from dapr.clients.grpc.conversation used throughout the examples:

- create_user_message / create_tool_message: build user messages and tool-result messages for the conversation history.
- ConversationInputAlpha2: wraps a list of ConversationMessage objects for a converse_alpha2 call.
- @conversation.tool, get_registered_tools, execute_registered_tool / execute_registered_tool_async: register tools, collect them for a request, and invoke a suggested tool call.
- ConversationToolsFunction.from_function and ConversationTools: build tool definitions without the decorator.
- to_assistant_messages() on the response: iterate assistant replies, including any tool calls.

Tip: The @conversation.tool decorator is the easiest way to create a tool. It auto-generates the schema from your function, allows an optional namespace/name override, and auto-registers the tool (you can set register=False to defer registration).
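
A minimal sketch of those options follows; the name= keyword for the override is an assumption based on the tip above, while register=False comes from the tip itself. Check the tool() docstring in dapr/clients/grpc/conversation.py for the exact parameter names.

from dapr.clients.grpc import conversation

# 'name' is an assumed keyword for the override; 'register' comes from the tip above.
@conversation.tool(name='weather_lookup', register=False)
def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Weather in {location}"

# With register=False the tool is not listed by get_registered_tools();
# register or pass it explicitly when you are ready to expose it to the LLM.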