`content/docs/ai-prompts/prompts.mdx`
---
title: AI Prompts for OACP
last_edited: '2026-04-11T00:00:00.000Z'
tocIsHidden: false
---

Copy-paste prompts for Claude, ChatGPT, or any AI assistant. These help you plan integrations, write `oacp.json`, and generate receiver code.

## Prompt 1: Plan OACP integration

```
I want to add OACP (Open App Capability Protocol) to my Android app.

Read the OACP documentation at https://oacp.dev and plan how to add voice control capabilities to my app.

My app is: [describe your app and what it does]

For each feature that could be voice-controlled, suggest:
- A capability ID (snake_case)
- Whether it should be a broadcast (background) or activity (foreground) action
- What parameters it needs
- Good aliases and examples for voice matching

Output a complete oacp.json file and an OacpReceiver implementation.
```

## Prompt 2: Review oacp.json

```
Review this oacp.json file for completeness and voice matching quality.

Check:
- Are there enough aliases? (aim for 5+ per capability)
- Are the examples realistic user utterances?
- Are disambiguationHints present for similar-sounding actions?
- Are parameter types and constraints correct?
- Is __APPLICATION_ID__ used consistently?
- Are confirmation semantics appropriate?

[paste your oacp.json here]
```
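For reference, a single capability entry that would pass the checklist above might look like the sketch below. The field names `aliases`, `examples`, and `disambiguationHints` come straight from the checklist; the surrounding structure and the other field names are illustrative assumptions, not the normative schema — check the OACP docs for the exact shape.

```json
{
  "id": "start_recording",
  "description": "Start a new audio recording",
  "aliases": ["record", "start recording", "begin recording", "record audio", "capture audio"],
  "examples": ["start recording", "record a voice memo"],
  "disambiguationHints": "Starts a NEW recording; use stop_recording to end one already in progress.",
  "parameters": [],
  "action": "__APPLICATION_ID__.ACTION_START_RECORDING"
}
```

Note the five-plus aliases and the `__APPLICATION_ID__` token in the action string — two of the most common things the review prompt catches.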

## Prompt 3: Generate OacpReceiver

```
Generate a Kotlin OacpReceiver for these OACP capabilities.

Use the OACP Android SDK (org.oacp.android):
- Extend OacpReceiver
- Override onAction()
- Use OacpParams for type-safe parameter access
- Return OacpResult.success() or OacpResult.error()
- Match actions with endsWith() for build variant safety

Capabilities:
[paste your oacp.json capabilities array here]
```
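The `endsWith()` point deserves a concrete illustration. Debug builds often append a suffix to the `applicationId` (e.g. `.debug`), so an exact string comparison against the release action name fails; suffix matching survives both variants. A minimal, SDK-free sketch of the convention (the action and capability names are hypothetical):

```kotlin
// Sketch of the endsWith() convention for matching broadcast actions.
// A debug build's applicationId may carry a suffix (e.g. ".debug"), so
// "com.example.recorder.debug.ACTION_START_RECORDING" must still match.
fun matchCapability(action: String): String? = when {
    action.endsWith(".ACTION_START_RECORDING") -> "start_recording"
    action.endsWith(".ACTION_STOP_RECORDING") -> "stop_recording"
    else -> null
}

fun main() {
    // Matches regardless of any build-variant package suffix:
    println(matchCapability("com.example.recorder.debug.ACTION_START_RECORDING"))
}
```

Inside a real `onAction()` override, the same `when` block would dispatch to the handler for each capability instead of returning an ID.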

## Prompt 4: Write OACP.md context

```
Write an OACP.md file for my Android app. This file gives AI assistants context about what the app does and how to disambiguate between capabilities.

My app: [describe your app]
My capabilities: [paste capability IDs and descriptions]

The file should:
- Start with the app name and a one-line description
- List each capability with plain-English usage notes
- Include disambiguation rules for similar actions
- Note any edge cases or defaults
```

## Prompt 5: Add OACP to a specific open source app

```
I want to add OACP support to [app name], an open-source Android app.

The source code is at: [GitHub URL]

Read the app's source code and:
1. Identify 3-5 features that could be voice-controlled
2. For each, determine if it's a foreground (needs UI) or background action
3. Write a complete oacp.json with proper aliases, examples, and parameters
4. Write an OacpReceiver that handles each action
5. Show the AndroidManifest.xml additions needed
6. Write an OACP.md context file

Follow the OACP v0.3 protocol. Use __APPLICATION_ID__ for all package-name-dependent values.
```
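To set expectations for step 5, the manifest additions generally register the discovery ContentProvider and the dispatch BroadcastReceiver. The sketch below shows only the *shape*: the class names, authority string, and action name are illustrative assumptions — the real component names come from the OACP docs — but the `__APPLICATION_ID__` token is used exactly as the prompt requires.

```xml
<!-- Illustrative shape only; real class and authority names come from the OACP docs. -->
<provider
    android:name=".oacp.OacpManifestProvider"
    android:authorities="__APPLICATION_ID__.oacp"
    android:exported="true" />

<receiver
    android:name=".oacp.AppOacpReceiver"
    android:exported="true">
    <intent-filter>
        <action android:name="__APPLICATION_ID__.ACTION_EXAMPLE" />
    </intent-filter>
</receiver>
```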
`content/docs/contributing/contributing.mdx`
---
title: Contributing
last_edited: '2026-04-11T00:00:00.000Z'
tocIsHidden: false
---

Three ways to contribute, ordered by impact.

## 1. Add OACP to an app

This is the single most impactful thing you can do. Every app that ships `oacp.json` makes the entire ecosystem more useful.

Pick any open-source Android app. Fork it. Add OACP. Open a PR.

Here's the process:

1. Pick an app (ideally one you already use)
2. Follow the [Add OACP guide](/docs/get-started/add-oacp)
3. Fork the app on GitHub
4. Add the OACP integration (`oacp.json`, ContentProvider, BroadcastReceiver)
5. Test with Hark or `adb`
6. Open a PR against the original repo
7. If accepted, let us know and we'll add it to the [ecosystem](/docs/ecosystem/apps)

Not sure where to start? The [AI Prompts](/docs/ai-prompts/prompts) page has copy-paste prompts that help you plan an integration, write `oacp.json`, and generate a receiver.

## 2. Test Hark and file issues

Install Hark, try voice commands with [ecosystem apps](/docs/ecosystem/apps), and file issues when things break.

Report issues at: [github.com/OpenAppCapabilityProtocol/hark/issues](https://github.com/OpenAppCapabilityProtocol/hark/issues)

What to report:

- Discovery failures (app not found)
- Wrong action matched
- Parameter extraction errors
- Crash logs
- UX suggestions

## 3. Protocol feedback

Open issues on the protocol repo: [github.com/OpenAppCapabilityProtocol/oacp/issues](https://github.com/OpenAppCapabilityProtocol/oacp/issues)

Things we want feedback on:

- Missing fields in `oacp.json`
- Edge cases in discovery
- Confirmation semantics
- Entity provider patterns
- Cross-platform concerns (iOS, desktop)

## Development setup (for Hark contributors)

If you want to work on Hark itself:

```bash
git clone https://github.com/OpenAppCapabilityProtocol/hark
```

Requirements:

- Flutter 3.29+
- Android SDK 35
- Kotlin 2.0
- An arm64 Android device (emulator won't work for on-device models)

Build and run:

```bash
flutter run
```

Key conventions:

- **State management**: Riverpod 3.x. No codegen, no ChangeNotifier.
- **Platform bridge**: Pigeon. Regenerate with:
```bash
dart run pigeon --input packages/hark_platform/pigeons/messages.dart
```
- **Branching**: Always branch from main. Never commit directly to main.
`content/docs/ecosystem/apps.mdx`
---
title: OACP Apps
last_edited: '2026-04-11T00:00:00.000Z'
tocIsHidden: false
---

Every app below supports OACP. Install one alongside Hark and you can control it by voice. Each demonstrates a different OACP pattern -- from simple foreground launches to background queries with async results and parameter extraction.

| App | What it does | Capabilities | Source |
|-----|-------------|-------------|--------|
| OACP Test App | Counter and battery demo | Increment, decrement, set, reset counter; get battery, get counter | [GitHub](https://github.com/OpenAppCapabilityProtocol/oacp) |
| Breezy Weather | Weather forecasts | Weather queries | [GitHub](https://github.com/OpenAppCapabilityProtocol/breezy-weather) |
| Binary Eye | QR/barcode scanner | Scan, create barcodes | [GitHub](https://github.com/OpenAppCapabilityProtocol/BinaryEye) |
| Voice Recorder | Audio recording | Start/stop recording | [GitHub](https://github.com/OpenAppCapabilityProtocol/Voice-Recorder) |
| Libre Camera | Camera app | Take photos | [GitHub](https://github.com/OpenAppCapabilityProtocol/librecamera) |
| Wikipedia | Encyclopedia | Search articles, open pages | [GitHub](https://github.com/OpenAppCapabilityProtocol/apps-android-wikipedia) |
| ArchiveTune | Internet Archive music | Play music, search | [GitHub](https://github.com/OpenAppCapabilityProtocol/ArchiveTune) |
| Live Darbar | Sikh devotional streaming | Play live stream | [GitHub](https://github.com/0xharkirat/live_darbar) |

## What each app demonstrates

| App | OACP pattern |
|-----|-------------|
| **OACP Test App** | Foreground + background actions, integer parameters, async results. The full kitchen sink. |
| **Breezy Weather** | Async background query. The app never opens -- it broadcasts the forecast back to Hark. |
| **Binary Eye** | Foreground camera launch for scanning, plus parameter-driven barcode creation. |
| **Voice Recorder** | Simple foreground action. One command starts recording. |
| **Wikipedia** | Entity providers (language selection) and foreground navigation to articles. |
| **ArchiveTune** | Parameter extraction. Hark pulls song name and artist from a single utterance. |
| **Libre Camera** | Simple foreground action. Opens the camera to take a photo. |
| **Live Darbar** | Simple foreground action. Starts a live devotional stream. |

## Add your app

Got an app with OACP support? We want to list it here.

Check out the [Contributing guide](/docs/contributing/contributing) for how to add OACP to an open-source app and get it listed. You can also use the [AI Prompts](/docs/ai-prompts/prompts) page to get help writing your `oacp.json` and receiver.
`content/docs/faq.mdx`
---
title: FAQ
last_edited: '2026-04-11T00:00:00.000Z'
tocIsHidden: false
---

### Does OACP require Google Play Services?

No. OACP uses standard Android APIs: ContentProvider for discovery, BroadcastReceiver and Intent for dispatch. No proprietary dependencies.

### Does it work on custom ROMs?

Yes. OACP relies only on standard AOSP APIs. If your ROM supports ContentProvider and BroadcastReceiver (all of them do), OACP works.

### What Android versions are supported?

The OACP Android SDK supports Android 5.0+ (API 21). Hark itself requires Android 8.0+ (API 26) because of speech recognition and overlay features.

### Is Hark on the Play Store?

Not yet. Hark is distributed as an APK from GitHub releases. Play Store publishing is planned.

### Can I use OACP without Hark?

Yes, in principle. Hark is currently the only assistant that supports OACP. But the protocol is open and assistant-agnostic by design.

Google Assistant, Claude, ChatGPT, Perplexity, or any other assistant could adopt OACP. The protocol just needs a ContentProvider scanner and an Intent dispatcher. If apps ship `oacp.json` and assistants implement discovery, OACP works without Hark.

Hark will always remain the open-source reference assistant.

### Does OACP work with Flutter apps?

Yes. Place `oacp.json` in `android/app/src/main/assets/oacp.json` (not in Flutter's `assets/` folder). The ContentProvider and receiver are native Android components, so you write them in Kotlin even in a Flutter project.

### Does OACP work with React Native apps?

Yes, same approach as Flutter. The OACP integration is entirely on the native Android side.

### What about iOS?

Under evaluation. The protocol concepts (capability manifest, discovery, invocation) can map to iOS, but the transport layer would be different. There is no ContentProvider or BroadcastReceiver on iOS.

### Is the protocol stable?

OACP is at v0.3. The schema is functional and apps are using it. v1.0 is the stability target with schema lock, compatibility guarantees, and long-term IDs. Breaking changes before v1.0 are possible but will be documented.

### How is OACP different from Android 15 AppFunctions?

AppFunctions is Google's vendor-specific API shipping with Android 15+. OACP works on Android 5.0+, is open and assistant-agnostic, and provides richer metadata (aliases, examples, disambiguation hints, entity providers).

The two may converge. OACP is exploring an annotation processor that generates both `oacp.json` and AppFunctions metadata from a single source.

### How big is the SDK?

The AAR is 27 KB. Six classes, zero transitive dependencies beyond `androidx.annotation`.

### Does Hark send data to the cloud?

No. All AI inference runs on-device using EmbeddingGemma (308M) for intent matching and Qwen3 (0.6B) for parameter extraction. No data leaves the device.