Connect your existing API to an AI Voice Agent and integrate it into a Flutter app using ready-made templates.

Prerequisites

  • Python 3.8+ and Flutter SDK 3.0+
  • API keys: DEEPGRAM_API_KEY, GEMINI_API_KEY
  • Existing API documentation (OpenAPI, codebase, or API endpoints)

Step 1: Generate Postman Collection

Use Cursor, Windsurf, Antigravity, or Claude Code to generate a Postman collection from your API. Copy the prompt below and paste it along with your API documentation:
You are an API documentation extraction wizard.

Objective: Generate a complete Postman Collection (schema v2.1) in **raw JSON** for this project. The collection should represent every HTTP, GraphQL, gRPC, or WebSocket endpoint you can infer from the codebase, including request/response details and example payloads when available.

Guidelines:
1. Inspect source files for routing definitions, controllers, services, schemas, DTOs, GraphQL resolvers, RPC service definitions, etc.
2. For each endpoint include:
   • Name (human-readable)
   • Method and full path (e.g. GET /api/v1/users/:id)
   • Description – populate the Postman description field with a concise summary (use inline comments, annotations, or best inference)
   • Request details: params, query, headers, body schema/example
   • Response examples with status codes and body schema
3. Group related endpoints into folders that mirror their module/domain structure.
4. If the project already contains an OpenAPI/Swagger or Postman file, merge and enrich it rather than starting from scratch.
5. Follow the official Postman Collection v2.1 JSON schema exactly—no extra keys.
6. Output **only** the JSON. Do not wrap in markdown fences, do not add commentary.

URL Format Requirements:
- ALWAYS use {{base_url}} as a Postman variable instead of hardcoded URLs
- Structure URLs in the following format:
  "url": {
    "raw": "{{base_url}}/path/to/endpoint",
    "protocol": "https",
    "host": ["example", "com"],
    "path": ["path", "to", "endpoint"]
  }
- The raw field should always start with {{base_url}} followed by the path
- Extract the host from the codebase if available, otherwise use a placeholder like ["api", "example", "com"]
- Path segments should be split into the path array

Collection Structure:
- Include info object with _postman_id, name, description, and schema fields
- Use nested item arrays for folders and requests:
  - Folders are items with a name, item array (containing requests), and optional description
  - Requests are items with name, request object, and optional response array
- Include description fields at collection level (in info), folder level, and request level
- Response examples should include:
  - name (e.g., "Successful Response")
  - originalRequest (copy of the request)
  - status (e.g., "OK")
  - code (HTTP status code, e.g., 200)
  - _postman_previewlanguage (e.g., "json")
  - header array with Content-Type
  - cookie array (empty if none)
  - body with example response JSON as a string
Action: Paste your API documentation or codebase into the LLM along with the prompt above, then save the output as postman.json.
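
LLM output is not always valid, so it is worth sanity-checking the generated file before wiring it into the server. The sketch below is an illustrative helper (not part of Kuralit) that verifies the two properties the prompt insists on: a complete info object and raw URLs that start with {{base_url}}.

```python
def check_collection(collection: dict) -> list:
    """Return a list of problems found in a Postman v2.1 collection dict."""
    problems = []
    info = collection.get("info", {})
    for field in ("_postman_id", "name", "schema"):
        if field not in info:
            problems.append(f"info is missing '{field}'")

    def walk(items):
        for item in items:
            if "item" in item:        # folder: recurse into its nested items
                walk(item["item"])
            elif "request" in item:   # request: check its raw URL prefix
                raw = item["request"].get("url", {}).get("raw", "")
                if not raw.startswith("{{base_url}}"):
                    problems.append(
                        f"request '{item.get('name')}' does not use {{{{base_url}}}}"
                    )

    walk(collection.get("item", []))
    return problems
```

Run it against your generated file with `check_collection(json.loads(Path("postman.json").read_text()))`; an empty list means both checks passed.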

Step 2: Set Up Environment Variables

Create a .env file in your project directory:
# STT (Speech-to-Text)
DEEPGRAM_API_KEY=your-deepgram-api-key-here

# LLM (Large Language Model)
GEMINI_API_KEY=your-gemini-api-key-here

# Server Authentication (optional, defaults to "demo-api-key")
KURALIT_API_KEY=demo-api-key

# API Base URL (optional, defaults to "http://localhost:35814")
API_BASE_URL=http://localhost:35814
Action: Replace the placeholder values with your actual API keys.
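
To confirm the file is complete before starting the server, a small stdlib-only check (illustrative, not part of Kuralit) can parse the .env and flag required keys that are missing or empty. KURALIT_API_KEY and API_BASE_URL are skipped because they have defaults.

```python
from pathlib import Path

# Keys the server cannot start without; the other two fall back to defaults.
REQUIRED = ("DEEPGRAM_API_KEY", "GEMINI_API_KEY")

def missing_env_keys(env_path: str = ".env") -> list:
    """Parse KEY=value lines from a .env file; return required keys that are absent or empty."""
    values = {}
    for line in Path(env_path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return [k for k in REQUIRED if not values.get(k)]
```

Calling `missing_env_keys()` in your project directory returns an empty list when both API keys are set.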

Step 3: Configure Python Server

Create server.py:
import os
import uvicorn
from pathlib import Path
from kuralit.server.agent_session import AgentSession
from kuralit.server.websocket_server import create_app
from kuralit.tools.api import RESTAPIToolkit

def validate_api_key(api_key: str) -> bool:
    return api_key == os.getenv("KURALIT_API_KEY", "demo-api-key")

# Load Postman collection
collection_path = Path("postman.json")
tools = []

if collection_path.exists():
    api_toolkit = RESTAPIToolkit.from_postman_collection(
        collection_path=str(collection_path),
        base_url=os.getenv("API_BASE_URL", "http://localhost:35814")
    )
    tools.append(api_toolkit)
    print(f"✅ Loaded {len(api_toolkit.get_functions())} API tools")

# Create agent with API tools
agent = AgentSession(
    stt="deepgram/nova-2:en-US",
    llm="gemini/gemini-2.0-flash-001",
    vad="silero/v3",
    turn_detection="multilingual/v1",
    tools=tools if tools else None,
    instructions="You have access to REST API tools. Use them when users make API-related requests.",
)

app = create_app(
    api_key_validator=validate_api_key,
    agent_session=agent,
)

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
Action: Run python server.py. Your server is now listening at ws://localhost:8000/ws
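
Authentication in create_app is just the validate_api_key function above. The standalone sketch below restates that check to show how the KURALIT_API_KEY default interacts with the key a client supplies:

```python
import os

def validate_api_key(api_key: str) -> bool:
    # Mirrors server.py: accept the key from KURALIT_API_KEY,
    # or "demo-api-key" when the env var is unset
    return api_key == os.getenv("KURALIT_API_KEY", "demo-api-key")

os.environ.pop("KURALIT_API_KEY", None)
assert validate_api_key("demo-api-key")      # default accepted while unset

os.environ["KURALIT_API_KEY"] = "my-secret"
assert not validate_api_key("demo-api-key")  # default rejected once a real key is set
assert validate_api_key("my-secret")
```

In other words, set KURALIT_API_KEY in production so the well-known demo key stops working.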

Step 4: Integrate Flutter App

Use Cursor, Windsurf, Antigravity, or Claude Code to integrate the Kuralit Agent Overlay into your Flutter app. Copy the prompt below and paste it:
I need to integrate the Kuralit Agent Overlay into my Flutter app. Please:

1. **Add the kuralit_sdk dependency** from [pub.dev](https://pub.dev/packages/kuralit_sdk) to my pubspec.yaml:
   dependencies:
     kuralit_sdk: ^0.1.1
   
   Then run flutter pub get to install the package.

2. **Initialize the Kuralit SDK** in my screen/widget's initState():
   - Use Kuralit.init() with KuralitConfig (serverUrl, apiKey, appId, debug)
   - Generate a session ID using Kuralit.generateSessionId()
   - Optionally connect to the server with Kuralit.connect() (or let the overlay handle connection)
   - Set an _isInitialized flag to track initialization status

3. **Add a button/trigger** that calls KuralitAgentOverlay.show(context, sessionId: sessionId) - it's just one line to open the full-screen overlay!

4. **Handle connection errors** gracefully (try-catch around Kuralit.connect())

The overlay automatically handles everything:
- WebSocket connection management
- Audio recording and streaming (with automatic permission requests)
- Real-time transcription display
- Conversation history with beautiful animated UI
- Voice and text input modes (switchable)
- Golden animated borders and visual effects
- Error handling with user-friendly messages

**Key benefit:** Opening the overlay is just one line - KuralitAgentOverlay.show(context, sessionId: sessionId)

**Implementation requirements:**
- Follow Flutter best practices
- Add all necessary imports (package:kuralit_sdk/kuralit.dart)
- Make configuration values (API key, App ID, server URL) easily configurable
- Add proper error handling for connection failures
- Ensure it works on both iOS and Android
- Handle microphone permissions gracefully

Please implement this integration and show me where to add my API credentials.

---

## Reference Implementation

Here's the complete reference code you can use:

import 'package:flutter/material.dart';
import 'package:kuralit_sdk/kuralit.dart';

class HomeScreen extends StatefulWidget {
  const HomeScreen({super.key});

  @override
  State<HomeScreen> createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  String? _sessionId;
  bool _isInitialized = false;

  @override
  void initState() {
    super.initState();
    _initializeSDK();
  }

  void _initializeSDK() {
    Kuralit.init(KuralitConfig(
      serverUrl: 'ws://10.0.2.2:8000/ws', // Android emulator alias for host localhost; replace with your server URL
      apiKey: 'demo-api-key',             // Replace with your API key
      appId: 'my-app',                    // Replace with your App ID
      debug: true,
    ));

    _sessionId = Kuralit.generateSessionId();
    _connect();
    setState(() => _isInitialized = true);
  }

  Future<void> _connect() async {
    try {
      await Kuralit.connect();
    } catch (e) {
      debugPrint('Connection failed: $e');
    }
  }

  void _openOverlay() {
    if (_sessionId == null) return;
    // Open agent overlay - just one line!
    KuralitAgentOverlay.show(
      context,
      sessionId: _sessionId!,
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Agent Overlay Example')),
      body: Center(
        child: ElevatedButton(
          onPressed: _isInitialized ? _openOverlay : null,
          child: const Text('Open Agent Overlay'),
        ),
      ),
    );
  }
}

**Note:** Replace the serverUrl, apiKey, and appId values with your actual Kuralit credentials.
Action: Paste this prompt into your LLM along with your Flutter app code, and it will generate the complete integration for you!

What You’ve Built

  • AI Voice Agent with access to your API endpoints
  • WebSocket server running and ready
  • Flutter app with a ready-made agent overlay template
  • Voice and text interaction with your API-powered agent

Next Steps