Ghost UI

AI Sees. AI Tests. AI Iterates.

Test in milliseconds, not seconds. Iterate 10x faster. 80% fewer tokens.

The Old Way
~50KB per screenshot

Screenshots burn tokens. LLMs can't interact.

The Aware Way
<1KB per snapshot

Text snapshots + ghost actions. Full control.

// Key Features

Built for Speed

Not retrofitted accessibility APIs. Designed for rapid AI-driven iteration.

Ghost Actions

Tap, type, and scroll without moving the mouse. The LLM invokes actions directly while you keep working.
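
A minimal sketch of the idea in Swift, assuming a registry keyed by element id. AwareRegistry, registerTap, and ghostTap are illustrative names, not the actual Aware API:

import Foundation

// Illustrative only: handlers registered under a stable id can be invoked
// in-process, so the cursor never moves.
final class AwareRegistry {
    static let shared = AwareRegistry()
    private var taps: [String: () -> Void] = [:]

    func registerTap(_ id: String, handler: @escaping () -> Void) {
        taps[id] = handler
    }

    func ghostTap(_ id: String) {
        taps[id]?()   // direct invocation, no mouse movement
    }
}

// Usage sketch: register the login handler, then trigger it as a ghost action.
AwareRegistry.shared.registerTap("login-btn") { print("login tapped") }
AwareRegistry.shared.ghostTap("login-btn")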

Text Snapshots

80% fewer tokens than screenshots. Complete UI state as structured text that LLMs understand.

Staleness Detection

Auto-detect when React or SwiftUI state goes stale. Catch rendering bugs that are nearly impossible to find manually.
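
One way to picture the check, as a hedged sketch: a view is stale when the value it renders no longer matches the backing model. RenderedField and isStale are hypothetical names, not Aware's detection mechanism:

struct RenderedField {
    let id: String
    let displayedValue: String   // what the snapshot says the UI shows
}

func isStale(_ field: RenderedField, modelValue: String) -> Bool {
    // Stale = the view still renders an old value after the model changed.
    field.displayedValue != modelValue
}

// Example: the model now holds a new email, but the field still shows the old one.
let field = RenderedField(id: "email-field", displayedValue: "old@example.com")
print(isStale(field, modelValue: "new@example.com"))   // true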

Cross-Platform

Works with SwiftUI, React/Web, iOS Simulator, and any macOS app via Accessibility APIs.

How It Works

One Line of Code

Add instrumentation with a simple modifier. Get rich text snapshots.

LoginView.swift
// Add Aware instrumentation
Button("Login") { login() }
    .awareTappable("login-btn")

TextField("Email", text: $email)
    .awareInput("email-field")
Text Snapshot
~200 bytes
# UI Snapshot (compact)
login-btn:button(80x40@120,300){"Login"}[enabled=true]→tap
email-field:input(300x40@50,200){""}[focused=true]→type
LLM Integration

The Ghost Workflow

LLM reads, reasons, acts, and verifies. All without touching your mouse.

Step 1

Snapshot

LLM reads text UI state

Step 2

Reason

Understands layout & state

Step 3

Command

Sends ghost action

Step 4

Execute

Aware invokes directly

Step 5

Verify

Confirms state change
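
Condensed into code, the loop looks roughly like this. Everything below is a stand-in (stubbed functions, plain strings), a sketch of the loop's shape rather than Aware's real API:

typealias Snapshot = String
typealias GhostCommand = String

// 1. Snapshot: in Aware this would be the compact text format shown earlier.
func takeSnapshot() -> Snapshot {
    "login-btn:button(80x40@120,300){\"Login\"}[enabled=true]→tap"
}

// 2. Reason: the LLM reads the snapshot and decides the next action.
func askLLM(_ snapshot: Snapshot) -> GhostCommand {
    "tap login-btn"
}

// 3 + 4. Command & Execute: the action is invoked directly; no mouse movement.
func execute(_ command: GhostCommand) {
    print("ghost action:", command)
}

// 5. Verify: take a fresh snapshot and confirm the state actually changed.
let before = takeSnapshot()
execute(askLLM(before))
let after = takeSnapshot()
print(after == before ? "warning: no state change detected" : "state changed")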

Platform Support

Works Everywhere

Native support for Apple platforms, web, and any macOS app.

SwiftUI

iOS & macOS

.aware() view modifiers

React/Web

useAware() hooks

AwareProvider context

iOS Simulator

via simctl

ui_simulator tools

macOS Apps

Accessibility APIs

ui_ax_* tools
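
For the macOS case, the underlying system calls are public. A short sketch of reading window titles for a running app through the Accessibility APIs, the layer the ui_ax_* tools presumably sit on; it requires the Accessibility permission in System Settings:

import ApplicationServices
import AppKit

// List the window titles of an app by walking its accessibility tree.
func windowTitles(forPID pid: pid_t) -> [String] {
    let app = AXUIElementCreateApplication(pid)

    var windowsRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(app, kAXWindowsAttribute as CFString, &windowsRef) == .success,
          let windows = windowsRef as? [AXUIElement] else { return [] }

    return windows.compactMap { window in
        var titleRef: CFTypeRef?
        guard AXUIElementCopyAttributeValue(window, kAXTitleAttribute as CFString, &titleRef) == .success
        else { return nil }
        return titleRef as? String
    }
}

// Example: inspect the frontmost application.
if let front = NSWorkspace.shared.frontmostApplication {
    print(windowTitles(forPID: front.processIdentifier))
}
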
80%

Token Savings

0ms

Mouse Movement

4

Platforms

<1KB

Snapshot Size

Let AI see what it builds

Download Breathe and experience Ghost UI testing.