Hero OS Self-Documenting Platform — Built-in Book, Settings UI, AI-Guided Configuration #31

Open
opened 2026-03-18 00:00:56 +00:00 by mik-tf · 0 comments

## Vision

Hero OS should be self-documenting and self-configuring. A user launches the container and can understand, configure, and operate the entire system without leaving it — through documentation (hero_books), visual configuration (Settings), or AI conversation (hero_shrimp). All three share one source of truth: a built-in book that ships with every hero_zero container.


## Architecture

```
┌─────────────────────────────────────────────────┐
│                   hero_zero                     │
│                                                 │
│   ┌──────────┐    MCP     ┌──────────────┐      │
│   │  hero_   │◄─────────►│  hero_shrimp  │      │
│   │  books   │           │  (AI assist)  │      │
│   │          │           │               │      │
│   │ Built-in │           │ Suggested     │      │
│   │ Hero OS  │           │ questions     │      │
│   │ Guide    │           │ + context     │      │
│   └────┬─────┘           └──────┬────────┘      │
│        │                        │               │
│        │ same source of truth   │               │
│        │                        │               │
│   ┌────▼────────────────────────▼───────┐       │
│   │        Settings (⚙ gear icon)       │       │
│   │                                     │       │
│   │  Env vars tab: view/set all config  │       │
│   │  Simple mode ←→ Advanced mode       │       │
│   │  Categories: AI, Grid, DB, etc.     │       │
│   └──────────────────┬──────────────────┘       │
│                      │                          │
│              reads / writes                     │
│                      │                          │
│   ┌──────────────────▼───────────────────┐      │
│   │     Canonical env vars (Layer 1)     │      │
│   │  source ~/hero/cfg/env/* at launch   │      │
│   │  docker run -e KEY=val               │      │
│   └──────────────────────────────────────┘      │
└─────────────────────────────────────────────────┘
```

## Two-Layer Configuration Model

| Layer | How | Who | When |
|-------|-----|-----|------|
| **1. Runtime env vars** | `docker run -e GROQ_API_KEY=...` or `source ~/hero/cfg/env/*` | Sysadmin / DevOps / scripts | At container launch |
| **2. Settings UI** | Gear icon (⚙) → Settings → Env Vars tab | End user | At runtime, live in browser |

Both layers coexist. Layer 1 provides defaults at boot. Layer 2 reads those values and lets users view, modify, or add new ones. If everything is set via env vars, Settings shows "all configured." If launched bare, Settings guides the user through setup.
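The Layer 1 boot convention (`source ~/hero/cfg/env/*`) amounts to sourcing every fragment in a directory. A minimal sketch, using a temporary directory in place of the real path:

```shell
# Sketch of the Layer 1 boot step: source every file in the env dir.
# /tmp/hero_env_demo stands in for the real ~/hero/cfg/env/ path.
ENV_DIR=/tmp/hero_env_demo
mkdir -p "$ENV_DIR"
printf 'export GROQ_API_KEY=demo-key\n' > "$ENV_DIR/ai.env"

for f in "$ENV_DIR"/*; do
  . "$f"   # each fragment exports one or more variables
done

echo "GROQ_API_KEY=$GROQ_API_KEY"
```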


## Component Breakdown

| Component | Location | What it does |
|-----------|----------|--------------|
| **Built-in book** | `hero_services/data/books/hero_os_guide/` | Markdown ebook shipped as seed data in every hero_zero container. Covers all services, all env vars, all capabilities. |
| **Settings env vars tab** | `hero_os` (gear icon → Settings page) | New tab in the existing Settings view. Shows categorized env vars with status (set/missing/optional). Simple mode with friendly labels, Advanced mode with raw var names + custom entries. |
| **Settings backend** | `hero_osis` or `hero_services` | RPC endpoint to read current env state and persist overrides to disk. |
| **AI book access** | `hero_books` MCP + `hero_shrimp` | Shrimp reads the built-in book via MCP. Answers questions with full Hero OS context. Can cross-reference docs with live system state. |
| **Suggested questions** | `hero_archipelagos/intelligence/ai/` | One-click starter prompts in the AI assistant UI. |
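The read half of the settings backend boils down to checking which known variables are present in the environment. A rough sketch, where the two-variable list and the `demo` value are illustrative and the actual RPC shape is still to be designed:

```shell
# Illustrative read path for a settings backend: check each known
# variable and report set/missing. OPENROUTER_API_KEY is simulated here.
export OPENROUTER_API_KEY=demo
unset GROQ_API_KEY
report=""
for var in OPENROUTER_API_KEY GROQ_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    line="$var: set"
  else
    line="$var: missing"
  fi
  echo "$line"
  report="$report$line
"
done
```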

## Built-in Book Structure

Ships in `hero_services/data/books/hero_os_guide/` and is copied to `dist/var/books/` by the build pipeline (which already exists for demo books).

```
hero_os_guide/
├── getting_started/
│   ├── intro.md              — What is Hero OS, what can it do
│   ├── quickstart.md         — Launch, first login, basic tour
│   ├── configuration.md      — Two-layer config model, Settings UI
│   └── local_first.md        — What works without cloud APIs
│                               (embedder, voice, books, redis)
├── services/
│   ├── overview.md           — Architecture, zinit, Unix sockets
│   ├── ai_assistant.md       — Shrimp, conversations, AI providers
│   ├── voice.md              — Speech-to-text, local VAD, transforms
│   ├── embedder.md           — Local embedding, ONNX models
│   ├── books.md              — Documentation, search, Q&A extraction
│   ├── redis.md              — Data management
│   ├── auth.md               — Authentication, SSO
│   ├── compute.md            — VM management, cloud infrastructure
│   ├── indexer.md            — Full-text search
│   ├── proxy.md              — Service routing
│   ├── foundry.md            — Code forge, git management
│   ├── aibroker.md           — LLM routing, model selection
│   └── osis.md               — Core data platform, contexts
├── configuration/
│   ├── env_reference.md      — Complete variable registry with
│   │                           categories, descriptions, where to
│   │                           get each key
│   ├── ai_providers.md       — Which AI keys do what, how to choose,
│   │                           cost vs local tradeoffs
│   ├── grid_setup.md         — Mnemonic, network, TFGrid deployment
│   └── advanced.md           — Custom vars, multi-env, CI/CD secrets
└── development/
    ├── architecture.md       — Repos, crates, build system
    ├── contributing.md       — Branches, PRs, forge workflow
    └── extending.md          — Adding services, islands, books
```
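The pipeline's copy step can be sketched as a plain recursive copy; the temporary paths below stand in for the real `hero_services/data/books/` → `dist/var/books/` locations:

```shell
# Sketch of the build-pipeline step that stages built-in books.
# /tmp paths stand in for hero_services/data/books/ and dist/var/books/.
SRC=/tmp/demo_books/hero_os_guide
DST=/tmp/demo_dist/var/books
mkdir -p "$SRC/getting_started" "$DST"
printf '# Intro\n' > "$SRC/getting_started/intro.md"
cp -r "$SRC" "$DST/"
ls "$DST/hero_os_guide/getting_started"
```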

## Settings UI — Env Vars Tab

Accessed via: **⚙ gear icon (top-right) → Settings → Environment Variables**

### Simple Mode (default)

| Category | Variable | Status | Description |
|----------|----------|--------|-------------|
| **AI** | OpenRouter API Key | ✅ Set | Primary LLM provider |
| **AI** | Groq API Key | ⚠ Not set | Fast inference + Whisper transcription |
| **Infrastructure** | Forgejo Token | ✅ Set | Git operations + CI |
| **Grid** | TFGrid Mnemonic | ⚠ Not set | Wallet for cloud deployments |
| **Local** | Embedder | ✅ Built-in | No configuration needed |
| **Local** | Voice (VAD) | ✅ Built-in | No configuration needed |

- Green = configured, amber = required but missing, grey = optional
- "Built-in" for local-first services that need no keys
- Click any row to edit the value
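The three status colors reduce to a small classification rule. A sketch, where the per-variable required/optional flag is an assumption about how the variable registry might be annotated:

```shell
# Status classification sketch: set -> configured (green),
# required but unset -> missing (amber), optional and unset -> optional (grey).
status() {
  value="$1"
  required="$2"
  if [ -n "$value" ]; then
    echo configured
  elif [ "$required" = yes ]; then
    echo missing
  else
    echo optional
  fi
}

status "sk-or-demo" yes   # -> configured
status "" yes             # -> missing
status "" no              # -> optional
```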

### Advanced Mode

- Shows raw variable names (`OPENROUTER_API_KEY`, `GROQ_API_KEY`)
- Add custom variables
- Import/export as an env file
- Shows which service consumes each variable
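The export half of import/export could emit standard `KEY=value` lines for every configured variable, skipping unset ones. A sketch with an illustrative two-variable list and demo value:

```shell
# Export-as-env-file sketch: write KEY=value lines for set variables only.
# The variable list and demo value are illustrative.
export OPENROUTER_API_KEY=demo-key
unset GROQ_API_KEY
OUT=/tmp/hero_export_demo.env
: > "$OUT"
for var in OPENROUTER_API_KEY GROQ_API_KEY; do
  val="$(printenv "$var" || true)"
  if [ -n "$val" ]; then
    printf '%s=%s\n' "$var" "$val" >> "$OUT"
  fi
done
cat "$OUT"
```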

## AI Suggested Questions

On first launch or in an empty conversation, the AI assistant shows clickable starter prompts:

| Suggested Question | What happens |
|--------------------|--------------|
| "What is Hero OS and what can it do?" | AI reads the intro chapter, gives a personalized overview based on what's configured |
| "Help me set up my AI provider keys" | AI checks current env state, guides through missing keys, explains tradeoffs |
| "What services are running right now?" | AI queries zinit status, cross-references with book descriptions |
| "Show me what I can do with local voice transcription" | AI reads the voice chapter, demonstrates capabilities |
| "How do I deploy to the ThreeFold Grid?" | AI reads the grid setup chapter, checks if a mnemonic is set |
| "What can I do without any API keys?" | AI highlights local-first capabilities: embedder, voice, books, redis, file management |

These are not a static FAQ — the AI adapts its suggestions to the user's current configuration and context.
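Configuration-aware suggestions could be selected by checking which keys are missing; a sketch (the prompt wording and variable list are illustrative):

```shell
# Sketch: surface a setup prompt only for variables that are not set.
export OPENROUTER_API_KEY=demo
unset GROQ_API_KEY
missing=""
for var in OPENROUTER_API_KEY GROQ_API_KEY; do
  if [ -z "$(printenv "$var")" ]; then
    missing="$missing$var "
    echo "Suggest: Help me set up $var"
  fi
done
```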


## Self-Extending Model

When a new service joins Hero OS:

  1. Add a chapter to the book → AI automatically learns about it
  2. Add env vars to the config reference → Settings UI picks them up
  3. Add a suggested question if relevant → users discover the feature naturally

## Workstreams

| # | Workstream | Repo | Depends on | Priority |
|---|------------|------|------------|----------|
| 1 | Write built-in book content | `hero_services/data/books/` | Nothing — can start now | High |
| 2 | Settings env vars tab (UI) | `hero_os` | Book (for var descriptions) | High |
| 3 | Settings backend (read/write vars) | `hero_osis` or `hero_services` | Settings UI design | High |
| 4 | Register book as MCP source for Shrimp | `hero_books` + `hero_shrimp` | Book content exists | Medium |
| 5 | Add suggested questions to AI island | `hero_archipelagos` | MCP integration works | Medium |
| 6 | Advanced mode (import/export, custom vars) | `hero_os` | Basic settings working | Low |