Orchex Beta Promise
Version: 3.0 | Last Updated: 2026-02-23 | Status: Beta ready (multi-LLM pivot)
What We Promise Beta Users
The Core Promise
Turn a plan into parallel AI agents that can't break each other's code.
Paste a plan doc into your AI assistant. Orchex splits it into parallel streams — each locked to its own files, each self-healing on failure. 5 LLM providers, dependency-aware waves, ownership enforcement. What took 2 hours of serial prompting becomes 12 minutes.
What You Get
Feature-Gated Local (Free)
- Ownership enforcement — Streams can only modify their declared files
- Parallel execution — Up to 5 streams, 2 waves
- Single provider — One LLM at a time
- Manual retry — Re-run failed streams (self-healing requires Pro+)
- BYOK — Use your own API key
- Works offline — No account needed for local use
Cloud Trial ($5 Credit, 30 Days)
- Full cloud features — Unlimited streams/waves, all 5 providers
- `orchex learn` — Markdown plan → parallel executable streams
- Observability dashboard — Stream timelines, diffs, cost tracking
- No credit card required — $5 credit to experience cloud execution
- Direct developer access — Feedback goes straight to the builder
- Shape the roadmap — Your use cases influence what gets built
What We Ask
Minimum (Required)
- Try it on a real project — Not a toy example, something you actually want to build
- Report bugs — Email support@orchex.dev or use the in-app feedback form
Appreciated (Optional)
- Share your use case — What are you building? How does orchex help?
- Documentation feedback — What was confusing? What's missing?
- Spread the word — If it works for you, tell others
What We Don't Promise
Not Yet
- Enterprise features — SSO, SCIM, audit logs for compliance
- Guaranteed uptime — Beta means things will break
- 24/7 support — Solo developer, response times vary
Never
- Magic — You still need to think about your architecture
- Replacement for understanding — Orchex executes your plan, it doesn't make the plan
- Autonomous decision-making — You define streams, you review results
- File modifications outside ownership — Streams can only modify their declared files
The Beta Journey
| Phase | Users | Focus | Your Role |
|---|---|---|---|
| Closed Beta | 50 | Core feedback loop | Direct communication, high-touch |
| Open Beta | 100 | Self-serve validation | Test onboarding, find doc gaps |
| GA | 500+ | Product-market fit | Use it for real work |
Closed Beta Criteria
- Invite-only (signup approval required)
- Target: 80%+ complete onboarding checklist
- Focus: First orchestration success, documentation feedback
Open Beta Criteria
- Public signup enabled
- Self-serve onboarding
- Focus: Can you succeed without hand-holding?
Who This Is For
Perfect Fit
- OpenAI and Gemini users wanting parallel orchestration
- Developers wanting provider flexibility — not locked to one LLM
- Teams needing file safety — ownership enforcement prevents accidents
- Solo developers building features with AI assistance
- Small teams (2-5) shipping faster than competitors
- Local LLM users running Ollama for privacy
Good Fit
- Anyone using GPT, Gemini, Claude, or local models
- Developers frustrated by unreliable AI agents
- Teams wanting consistent orchestration across different LLMs
Not Yet
- Enterprises needing regulatory compliance certifications
- Large teams (20+) with complex coordination needs
- Organizations requiring managed API keys
Not Targeting
- Developers working on small tasks (1-3 files) — serial execution is fine for small work
The Problem We Solve
Before Orchex
You: "GPT, update the types file"
[Wait 2 min]
You: "Now update the API file"
[Wait 2 min — oh no, it modified the config file too]
You: "Now update the tests"
[Wait 2 min — test failure, manual retry]
You: "Now update the docs"
[Wait 2 min]
Total: 8+ minutes of waiting, files modified unexpectedly, manual error handling
After Orchex
You: Define 4 streams (types, api, tests, docs) with ownership
Orchex: Runs all 4 in parallel with your LLM
Orchex: Each stream can only modify its declared files
Orchex: Test failure → categorizes error → auto-generates fix stream
Total: ~2 minutes, file safety guaranteed, self-healing built-in
The Transformation
| Before | After |
|---|---|
| Single LLM lock-in | Multi-LLM (OpenAI, Gemini, Claude, Ollama) |
| Agents modify any file | Ownership enforcement — declared files only |
| Sequential sessions | Parallel execution |
| Generic error retry | 10 error categories with intelligent retry |
| Manual retry on errors | Self-healing generates fixes with error context |
| Babysitting AI | AI works while you do something else |
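The "dependency-aware waves" in the table amount to topologically grouping streams so that a stream only runs once everything it depends on has finished. The sketch below is illustrative only — the dict-based plan format and function name are invented, not Orchex's actual API:

```python
def plan_waves(deps: dict[str, set[str]]) -> list[list[str]]:
    """Group streams into waves; a stream is ready once all its deps ran."""
    done: set[str] = set()
    waves: list[list[str]] = []
    remaining = dict(deps)
    while remaining:
        # Every stream whose dependencies are all satisfied joins this wave.
        ready = sorted(s for s, d in remaining.items() if d <= done)
        if not ready:
            raise ValueError("dependency cycle detected")
        waves.append(ready)
        done.update(ready)
        for s in ready:
            del remaining[s]
    return waves

# The four-stream example above: api depends on types; tests depend on both.
waves = plan_waves({
    "types": set(),
    "api": {"types"},
    "tests": {"types", "api"},
    "docs": set(),
})
# → [['docs', 'types'], ['api'], ['tests']]
```

Streams within a wave have no edges between them, which is what makes it safe to run them in parallel.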
Feedback Channels
Bug Reports
- Email: support@orchex.dev
- Include: What you did, what happened, what you expected
Feature Requests
- Email: support@orchex.dev with subject "Feature Request"
Direct Feedback
- Email: founders@orchex.dev
- For: Use cases, general feedback, partnership inquiries
Our Commitment to You
- Transparency — We'll be honest about what works and what doesn't
- Responsiveness — Bug reports get attention (may not be instant, but they're seen)
- Iteration — Your feedback shapes the product
- Respect — Your time matters; we won't waste it with hype
FAQ
What AI providers are supported?
OpenAI (GPT-4, GPT-4.1), Google Gemini (1.5, 2.0), Anthropic Claude, DeepSeek (V3, Coder, Reasoner), and local models via Ollama. Configure your provider with a single environment variable.
Is it really free?
- Local use: Feature-gated free. Max 5 streams, 2 waves, 1 provider. Enough to try, not enough for real workloads.
- Cloud trial: $5 one-time credit, 30 days. Full features, no credit card.
- Paid tiers: Pro $19/mo (100 runs), Team $49/user/mo (500 runs), Enterprise custom.
What is ownership enforcement?
Each stream declares which files it can modify in its owns array. Orchex rejects any file operations outside that list. This prevents agents from accidentally modifying files they shouldn't touch.
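The enforcement rule can be sketched as a simple membership check on normalized paths. The class and function names below are illustrative, not Orchex's actual implementation:

```python
from pathlib import Path

class OwnershipError(Exception):
    """Raised when a stream touches a file outside its owns list."""

def check_ownership(owns: list[str], target: str) -> None:
    # Normalize both sides so "./src/api.ts" and "src/api.ts" compare equal.
    owned = {Path(p).as_posix() for p in owns}
    if Path(target).as_posix() not in owned:
        raise OwnershipError(f"stream does not own {target!r}")

# A stream that owns only the types file cannot write to the config file.
check_ownership(["src/types.ts"], "./src/types.ts")   # allowed
try:
    check_ownership(["src/types.ts"], "orchex.config.json")
except OwnershipError:
    pass  # rejected, as expected
```

Running the check before every write is what turns a declared `owns` list into a hard guarantee rather than a convention.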
Do you store my code?
Local: No code leaves your machine. Cloud: Context sent for LLM calls, not persisted on our servers.
What if orchex breaks my code?
Orchex creates backups before applying changes. Ownership enforcement prevents most accidental modifications. You can always rollback.
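Backup-before-write can be sketched as copying each file aside before a change lands; the function names and backup directory here are invented for illustration, not Orchex's actual mechanism:

```python
import shutil
from pathlib import Path

def backup_then_write(path: Path, new_text: str, backup_dir: Path) -> None:
    # Copy the original aside first so the change can be rolled back.
    backup_dir.mkdir(parents=True, exist_ok=True)
    if path.exists():
        shutil.copy2(path, backup_dir / path.name)
    path.write_text(new_text)

def rollback(path: Path, backup_dir: Path) -> None:
    # Restore the pre-change copy over the modified file.
    shutil.copy2(backup_dir / path.name, path)
```

Combined with ownership enforcement, this bounds the blast radius of a bad run to files that were both declared and backed up.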
Can I use this at work?
Yes, subject to your company's policies on AI tools. BYOK means you control API spend with your preferred provider.
What happens after beta?
Feature-gated local stays free. Cloud pricing is published. No bait-and-switch. Your $5 trial credit will be honored.
This is our promise. We intend to keep it.