ainativeui

Tutorial · 02

Respond to canvas events

Responding to canvas events from the host

Tee canvas events into your app's own logic — analytics, UI choreography, side effects.

Overview

Every time the user interacts with the canvas — taps a button, commits a slider, completes a widget — a UIEvent flows through the session's event bus to the agent loop. By default the host doesn't see these events; the loop handles them and the canvas re-renders.

But sometimes the host needs to react too. Dim the chat input while the AI is processing. Trigger a haptic on puzzleCompleted. Log every value_committed to your analytics pipeline. Swap an icon when a widget emits dismiss. These are host concerns — they shouldn't go through the agent.

AICanvasView's onCanvasEvent callback gives the host a parallel view of every event without taking the event away from the agent loop. The callback is a tee, not a replacement: events flow to both you and the loop simultaneously.

When to use this

Right uses for onCanvasEvent:

  • Logging events to analytics or your telemetry pipeline
  • Driving host-side UI choreography (dim input, show a toast)
  • Triggering haptics or sound effects
  • Updating a toolbar badge based on widget state changes

Wrong uses — the agent should handle these:

  • Routing logic ("if user taps chip X, render Y") — let the agent's system prompt teach this; the recipe doctrine handles it
  • State persistence — use Session.save(to:id:) and the SessionStore protocol, not the event stream
  • Authoritative business logic — events are observational; the loop is the source of truth

The shape

AICanvasView(session: session) { event in
    // Fires for every UIEvent the renderer dispatches.
    print("Canvas event: \(event.kind) on \(event.nodeID)")
}

The closure is @Sendable (UIEvent) -> Void. It runs on the main actor (canvas events fire from SwiftUI's render thread). Don't block — push work to a background Task if you need to.
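The safe pattern is to do only cheap, value-level work inline and hop off the main actor for anything slower. A minimal sketch of that split, using a stand-in `CanvasEvent` struct in place of the library's `UIEvent` (only the two fields this sketch needs):

```swift
import Foundation

// Stand-in for the library's UIEvent — just the fields used below.
struct CanvasEvent: Sendable {
    let kind: String
    let nodeID: String
}

// Pure, cheap formatting is fine to run inline on the main actor.
func summary(of event: CanvasEvent) -> String {
    "Canvas event: \(event.kind) on \(event.nodeID)"
}

// Anything slower (disk, network, logging frameworks) moves to a
// background Task so the callback returns immediately.
func handle(_ event: CanvasEvent, sink: @escaping @Sendable (String) -> Void) {
    let line = summary(of: event)            // capture a Sendable value first
    Task.detached(priority: .background) {   // then hop off the main actor
        sink(line)
    }
}
```

Capturing the formatted string before detaching keeps the closure's captures `Sendable` and avoids touching the event object from another executor.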

Worked example: dim the chat input while a widget runs

struct ChatScreen: View {
    let session: Session
    @State private var input = ""
    @State private var widgetActive = false

    var body: some View {
        VStack(spacing: 0) {
            AICanvasView(session: session) { event in
                Task { @MainActor in
                    switch event.kind {
                    case "started", "tick":
                        widgetActive = true
                    case "completed", "puzzleCompleted", "stopped", "cancelled":
                        widgetActive = false
                    default:
                        break
                    }
                }
            }
            .aiBuiltinWidgets()
            .aiBuiltinRecipes()

            HStack {
                TextField("Say something…", text: $input)
                    .disabled(widgetActive)
                    .opacity(widgetActive ? 0.4 : 1.0)
                Button("Send", action: send)
                    .disabled(widgetActive)
            }
            .padding()
        }
    }

    private func send() {
        // Hand `input` to your session/agent API here, then clear the field.
        input = ""
    }
}

The host now reflects the widget's running state without touching the agent loop or subclassing Session.
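The switch in the callback is a pure mapping from event kind to UI state, so it can be pulled out of the view and unit-tested on its own. A sketch using the same kind strings as the example above (your widgets may emit others):

```swift
// Returns the new value for `widgetActive` given an event kind,
// or nil when the kind doesn't affect it.
func widgetActive(afterEventKind kind: String) -> Bool? {
    switch kind {
    case "started", "tick":
        return true
    case "completed", "puzzleCompleted", "stopped", "cancelled":
        return false
    default:
        return nil
    }
}
```

With this helper the callback body collapses to `if let active = widgetActive(afterEventKind: event.kind) { widgetActive = active }`, and the kind-to-state table lives in one testable place.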

Worked example: telemetry pipeline

AICanvasView(session: session) { event in
    Task.detached(priority: .background) {
        await analytics.record(.canvasEvent(
            kind: event.kind,
            nodeID: event.nodeID.value,
            timestamp: Date()
        ))
    }
}

The detached(priority: .background) means the analytics write doesn't hold up rendering. The agent loop still receives the same event in parallel.
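The `analytics` object in the snippet is hypothetical. One way to back it is an actor that buffers records and flushes them in batches, so each canvas event costs a single await and the upload cost is amortized — a sketch under that assumption:

```swift
import Foundation

// Hypothetical analytics sink: buffers canvas-event records and
// flushes them in batches. An actor serializes access from the
// detached Tasks the callback spawns.
actor CanvasAnalytics {
    struct Record: Sendable {
        let kind: String
        let nodeID: String
        let timestamp: Date
    }

    private var buffer: [Record] = []
    private let batchSize: Int
    private let upload: @Sendable ([Record]) async -> Void

    init(batchSize: Int = 20,
         upload: @escaping @Sendable ([Record]) async -> Void) {
        self.batchSize = batchSize
        self.upload = upload
    }

    func record(_ record: Record) async {
        buffer.append(record)
        guard buffer.count >= batchSize else { return }
        let batch = buffer
        buffer.removeAll()
        await upload(batch)   // network cost paid once per batch
    }
}
```

The `upload` closure is injected so the buffering policy stays independent of whatever transport your telemetry pipeline actually uses.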

Common event kinds

The canvas dispatches many event kinds depending on what the user touches. The most common:

  • tap — a control button or container tap-gesture fired
  • value_committed — a stateful control (toggle/slider/picker/textField) finished a user edit (drag-end, Enter, selection change)
  • value_changed — keystroke / mid-drag (filtered out by default at `EventRouting/meaningfulOnly`; subscribe via `.all` to see them)
  • submit — a code_editor widget's Run button
  • completed / puzzleCompleted / gameOver — widget finished
  • dismiss — a presentation (sheet, alert, etc.) was dismissed
  • reorder — drag-to-reorder fired in a reorderable stack

Use the event payloads to disambiguate which node fired (the nodeID) and what the user did (the payload JSON). Suggestion-cluster chips emit a tap with payload { suggestionID, manifests } per the Phase 2 contract — host code can short-circuit on suggestionID == "..." to trigger custom logic before the agent sees the tap.
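Assuming the payload arrives as JSON data, pulling out the suggestionID is a small Codable decode. A sketch that deliberately models only that one field (the shape of `manifests` isn't specified in the contract excerpt, and JSONDecoder ignores unknown keys):

```swift
import Foundation

// Minimal model of a suggestion-chip tap payload: only the field
// the host needs to short-circuit on. Extra keys like `manifests`
// are ignored by JSONDecoder.
struct SuggestionTap: Codable {
    let suggestionID: String
}

// Returns the suggestionID, or nil when the payload isn't a
// suggestion-chip tap (or isn't valid JSON).
func suggestionID(fromPayload data: Data) -> String? {
    try? JSONDecoder().decode(SuggestionTap.self, from: data).suggestionID
}
```

Returning nil for non-matching payloads lets the host check one optional and fall through for every other event kind.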

Where to go next

  • `01-DropIntoAnExistingChat` — if you haven't set up AICanvasView
  • `03-RegisteringDomainWidgets` — the events from your custom widgets also flow through this callback