rename client and move incomplete projects to research
2
research/repl/.gitignore
vendored
Normal file
@@ -0,0 +1,2 @@
target
temp_db_for_example_worker_default_worker
5
research/repl/.rhai_repl_history.txt
Normal file
@@ -0,0 +1,5 @@
#V2
.edit
quit
.edit
exit
1752
research/repl/Cargo.lock
generated
Normal file
File diff suppressed because it is too large
21
research/repl/Cargo.toml
Normal file
@@ -0,0 +1,21 @@
[package]
name = "ui_repl"
version = "0.1.0"
edition = "2024" # Keep 2024 unless issues arise

[dependencies]
tokio = { version = "1", features = ["macros", "rt-multi-thread", "time", "sync"] } # "time" for potential timeouts, "sync" for the worker shutdown channel
url = "2" # For parsing Redis URL
tracing = "0.1" # For logging
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
log = "0.4" # rhai_dispatcher uses the log crate
rustyline = { version = "13.0.0", features = ["derive"] } # For enhanced REPL input
tempfile = "3.8" # For creating temporary files for editing

rhai_dispatcher = { path = "../client" }
anyhow = "1.0" # For simpler error handling

rhailib_worker = { path = "../worker", package = "rhailib_worker" }
rhailib_engine = { path = "../engine" }
heromodels = { path = "../../../db/heromodels", features = ["rhai"] }
rhai = { version = "1.18.0" } # Match the version used by worker/engine
77
research/repl/README.md
Normal file
@@ -0,0 +1,77 @@
# Rhai REPL CLI for Circle WebSocket Servers

This crate provides a command-line interface (CLI) for executing Rhai scripts on remote Circle WebSocket servers. It includes both an interactive REPL and a non-interactive example.

## Prerequisites

1. **Circle Orchestrator Running**: Ensure the `circles_orchestrator` is running. This application manages and starts the individual Circle WebSocket servers.

   To run the orchestrator:
   ```bash
   cd /path/to/herocode/circles/cmd
   cargo run
   ```

   By default, this starts servers based on the `circles.json` configuration (e.g., "Alpha Circle" on `ws://127.0.0.1:8081/ws`).

2. **Redis Server**: Ensure a Redis server is running and accessible at `redis://127.0.0.1:6379` (the default used by the orchestrator and its components).

## Usage

Navigate to this crate's directory:

```bash
cd /path/to/herocode/circles/ui_repl
```

### 1. Interactive REPL

The main binary of this crate is an interactive REPL.

**To run with the default WebSocket URL (`ws://127.0.0.1:8081/ws`):**

```bash
cargo run
```

**To specify a WebSocket URL:**

```bash
cargo run ws://<your-circle-server-ip>:<port>/ws
# Example for "Beta Circle" if configured on port 8082:
# cargo run ws://127.0.0.1:8082/ws
```

Once connected, you can:

- Type single-line Rhai scripts directly and press Enter.
- Use Vi keybindings to edit the current input line (provided by `rustyline`).
- Type `.edit` to open your `$EDITOR` (or `vi` by default) for multi-line script input. Save and close the editor to execute the script.
- Type `.run <filepath>` (or `run <filepath>`) to execute a Rhai script from a local file.
- Type `exit` or `quit` to close the REPL.

Command history is saved to `.rhai_repl_history.txt` in the directory where you run the REPL.
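The command handling above can be sketched as a small input dispatcher. This is an illustrative simplification only; the actual REPL in `src/main.rs` performs these checks inline, and the `ReplCommand` enum and `parse_input` function here are hypothetical names introduced for the example:

```rust
// Illustrative sketch of the REPL's input dispatch; not the crate's actual API.
#[derive(Debug, PartialEq)]
enum ReplCommand {
    Exit,
    Edit,
    RunFile(String),
    Script(String),
    Empty,
}

fn parse_input(line: &str) -> ReplCommand {
    let input = line.trim();
    if input.is_empty() {
        ReplCommand::Empty
    } else if input.eq_ignore_ascii_case("exit") || input.eq_ignore_ascii_case("quit") {
        ReplCommand::Exit
    } else if input.eq_ignore_ascii_case(".edit") {
        ReplCommand::Edit
    } else if let Some(path) = input
        .strip_prefix(".run ")
        .or_else(|| input.strip_prefix("run "))
    {
        ReplCommand::RunFile(path.trim().to_string())
    } else {
        // Anything else is treated as a Rhai script to send to the worker.
        ReplCommand::Script(input.to_string())
    }
}
```

Anything that is not a recognized command falls through to script execution, which is why plain Rhai expressions can be typed directly at the prompt.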

### 2. Non-Interactive Example (`connect_and_play`)

This example connects to a WebSocket server, sends a predefined Rhai script, prints the response, and then disconnects.

**To run with the default WebSocket URL (`ws://127.0.0.1:8081/ws`):**

```bash
cargo run --example connect_and_play
```

**To specify a WebSocket URL for the example:**

```bash
cargo run --example connect_and_play ws://<your-circle-server-ip>:<port>/ws
# Example:
# cargo run --example connect_and_play ws://127.0.0.1:8082/ws
```

The example script is:

```rhai
let a = 10;
let b = 32;
let message = "Hello from example script!";
message + " Result: " + (a + b)
```

It evaluates to the string `Hello from example script! Result: 42`.

## Logging

Both the REPL and the example use the `tracing` crate for logging. Control log levels with the `RUST_LOG` environment variable. For example, to see debug logs from the `circle_client_ws` library:

```bash
RUST_LOG=info,circle_client_ws=debug cargo run --example connect_and_play
```
53
research/repl/docs/ARCHITECTURE.md
Normal file
@@ -0,0 +1,53 @@
# Architecture of the `ui_repl` Crate

The `ui_repl` crate provides an interactive Read-Eval-Print Loop (REPL) interface for the rhailib ecosystem, enabling real-time script development, testing, and execution with integrated worker management.

## Core Architecture

```mermaid
graph TD
    A[REPL Interface] --> B[Script Execution]
    A --> C[Worker Management]
    A --> D[Client Integration]

    B --> B1[Local Engine Execution]
    B --> B2[Remote Worker Execution]
    B --> B3[Script Editing]

    C --> C1[Worker Lifecycle]
    C --> C2[Task Distribution]
    C --> C3[Status Monitoring]

    D --> D1[Redis Client]
    D --> D2[Task Submission]
    D --> D3[Result Retrieval]
```

## Key Features

### Interactive Development
- **Enhanced Input**: Rustyline for advanced command-line editing
- **Script Editing**: Temporary file editing with external editors
- **Syntax Highlighting**: Enhanced script development experience

### Dual Execution Modes
- **Local Execution**: Direct engine execution for development
- **Remote Execution**: Worker-based execution for production testing
- **Seamless Switching**: Easy mode transitions during development

### Integrated Worker Management
- **Worker Spawning**: Automatic worker process management
- **Lifecycle Control**: Start, stop, and restart worker processes
- **Status Monitoring**: Real-time worker health and performance

## Dependencies

- **Rhai Client**: Integration with the rhailib client for remote execution
- **Rhailib Engine**: Direct engine access for local execution
- **Rhailib Worker**: Embedded worker management capabilities
- **Enhanced CLI**: Rustyline for a superior REPL experience
- **Async Runtime**: Tokio for concurrent operations

## Usage Patterns

The REPL serves as the primary development interface for rhailib, providing developers with immediate feedback and testing capabilities for Rhai scripts and business logic.
198
research/repl/examples/connect_and_play.rs
Normal file
@@ -0,0 +1,198 @@
use anyhow::Context;
use rhai_dispatcher::{RhaiDispatcher, RhaiDispatcherError, RhaiTaskDetails};
use std::env;
use std::sync::Arc;
use std::time::Duration;
use tokio::sync::mpsc;
use tracing_subscriber::EnvFilter;

use heromodels::db::hero::OurDB;
use rhailib_engine::create_heromodels_engine;
use rhailib_worker::spawn_rhai_worker;
use std::path::PathBuf;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    tracing_subscriber::fmt()
        .with_env_filter(
            EnvFilter::from_default_env()
                .add_directive("connect_and_play=info".parse().unwrap())
                .add_directive("rhai_dispatcher=info".parse().unwrap()),
        )
        .init();

    let args: Vec<String> = env::args().collect();
    let redis_url = args.get(1).cloned().unwrap_or_else(|| {
        let default_url = "redis://127.0.0.1/".to_string();
        println!("No Redis URL provided. Defaulting to: {}", default_url);
        default_url
    });
    let worker_name = args.get(2).cloned().unwrap_or_else(|| {
        let default_worker = "default_worker".to_string();
        println!("No worker name provided. Defaulting to: {}", default_worker);
        default_worker
    });

    // Define the DB path for the worker
    let db_path_str = format!("./temp_db_for_example_worker_{}", worker_name);
    let db_path = PathBuf::from(&db_path_str);

    // Create a shutdown channel for the worker
    let (shutdown_tx, shutdown_rx) = mpsc::channel::<()>(1);

    // Spawn a worker in the background
    let worker_redis_url = redis_url.clone();
    let worker_circle_name_for_task = worker_name.clone();
    let db_path_for_task = db_path_str.clone();

    log::info!(
        "[Main] Spawning worker for circle '{}' with DB path '{}'",
        worker_circle_name_for_task,
        db_path_for_task
    );

    let worker_join_handle = tokio::spawn(async move {
        log::info!(
            "[BG Worker] Starting for circle '{}' on Redis '{}'",
            worker_circle_name_for_task,
            worker_redis_url
        );
        // The `reset: true` in OurDB::new handles pre-cleanup if the directory exists.
        let db = Arc::new(
            OurDB::new(&db_path_for_task, true)
                .expect("Failed to create temp DB for example worker"),
        );
        let mut engine = create_heromodels_engine(db);
        engine.set_max_operations(0);
        engine.set_max_expr_depths(0, 0);
        engine.set_optimization_level(rhai::OptimizationLevel::Full);

        if let Err(e) = spawn_rhai_worker(
            1, // dummy circle_id
            worker_circle_name_for_task.clone(),
            engine,
            worker_redis_url.clone(),
            shutdown_rx, // Pass the receiver from main
            false,       // preserve_tasks
        )
        .await
        {
            log::error!(
                "[BG Worker] Failed to spawn or worker error for circle '{}': {}",
                worker_circle_name_for_task,
                e
            );
        } else {
            log::info!(
                "[BG Worker] Worker for circle '{}' shut down gracefully.",
                worker_circle_name_for_task
            );
        }
    });

    // Give the worker a moment to start up
    tokio::time::sleep(Duration::from_secs(1)).await;

    println!(
        "Initializing RhaiDispatcher for Redis at {} to target worker '{}'...",
        redis_url, worker_name
    );
    let client = RhaiDispatcher::new(&redis_url)
        .with_context(|| format!("Failed to create RhaiDispatcher for Redis URL: {}", redis_url))?;
    println!("RhaiDispatcher initialized.");

    let script = "let a = 10; let b = 32; let message = \"Hello from example script!\"; message + \" Result: \" + (a + b)";
    println!("\nSending script:\n```rhai\n{}\n```", script);

    let timeout = Duration::from_secs(30);
    match client
        .submit_script_and_await_result(&worker_name, script.to_string(), None, timeout)
        .await
    {
        Ok(task_details) => {
            println!("\nWorker response:");
            if let Some(ref output) = task_details.output {
                println!("Output: {}", output);
            }
            if let Some(ref error_msg) = task_details.error {
                eprintln!("Error: {}", error_msg);
            }
            if task_details.output.is_none() && task_details.error.is_none() {
                println!(
                    "Worker finished with no explicit output or error. Status: {}",
                    task_details.status
                );
            }
        }
        Err(e) => match e {
            RhaiDispatcherError::Timeout(task_id) => {
                eprintln!("\nError: Script execution timed out for task_id: {}.", task_id);
            }
            RhaiDispatcherError::RedisError(redis_err) => {
                eprintln!(
                    "\nError: Redis communication failed: {}. Check Redis connection and server status.",
                    redis_err
                );
            }
            RhaiDispatcherError::SerializationError(serde_err) => {
                eprintln!(
                    "\nError: Failed to serialize/deserialize task data: {}.",
                    serde_err
                );
            }
            RhaiDispatcherError::TaskNotFound(task_id) => {
                eprintln!("\nError: Task {} not found after submission.", task_id);
            } /* All RhaiDispatcherError variants are handled, so a `_` arm is not needed
                 unless RhaiDispatcherError becomes non-exhaustive in the future. */
        },
    }

    println!("\nExample client operations finished. Shutting down worker...");

    // Send the shutdown signal to the worker
    if let Err(e) = shutdown_tx.send(()).await {
        eprintln!(
            "[Main] Failed to send shutdown signal to worker: {} (worker might have already exited or an error occurred)",
            e
        );
    }

    // Wait for the worker to finish
    log::info!("[Main] Waiting for worker task to join...");
    if let Err(e) = worker_join_handle.await {
        eprintln!("[Main] Error waiting for worker task to join: {:?}", e);
    } else {
        log::info!("[Main] Worker task joined successfully.");
    }

    // Clean up the database directory
    log::info!("[Main] Cleaning up database directory: {}", db_path.display());
    if db_path.exists() {
        if let Err(e) = std::fs::remove_dir_all(&db_path) {
            eprintln!(
                "[Main] Failed to remove database directory '{}': {}",
                db_path.display(),
                e
            );
        } else {
            log::info!(
                "[Main] Successfully removed database directory: {}",
                db_path.display()
            );
        }
    } else {
        log::info!(
            "[Main] Database directory '{}' not found, no cleanup needed.",
            db_path.display()
        );
    }

    println!("Example fully completed and cleaned up.");
    Ok(())
}
275
research/repl/src/main.rs
Normal file
@@ -0,0 +1,275 @@
use anyhow::Context;
use rhai_dispatcher::{RhaiDispatcher, RhaiDispatcherBuilder, RhaiDispatcherError};
use rustyline::error::ReadlineError;
use rustyline::{Config, DefaultEditor, EditMode};
use std::env;
use std::fs;
use std::process::Command;
use std::time::Duration;
use tempfile::Builder as TempFileBuilder;
use tracing_subscriber::EnvFilter;

// Default timeout for script execution
const DEFAULT_SCRIPT_TIMEOUT_SECONDS: u64 = 30;

async fn execute_script(client: &RhaiDispatcher, circle_name: &str, script_content: String) {
    if script_content.trim().is_empty() {
        println!("Script is empty, not sending.");
        return;
    }
    println!(
        "Sending script to worker '{}':\n---\n{}\n---",
        circle_name, script_content
    );

    let timeout = Duration::from_secs(DEFAULT_SCRIPT_TIMEOUT_SECONDS);

    match client
        .new_play_request()
        .worker_id(circle_name)
        .script(&script_content)
        .timeout(timeout)
        .await_response()
        .await
    {
        Ok(task_details) => {
            if let Some(output) = &task_details.output {
                println!("worker: {}", output);
            }
            if let Some(error_msg) = &task_details.error {
                eprintln!("Worker error: {}", error_msg);
            }
            if task_details.output.is_none() && task_details.error.is_none() {
                println!(
                    "Worker finished with no explicit output or error. Status: {}",
                    task_details.status
                );
            }
        }
        Err(e) => match e {
            RhaiDispatcherError::Timeout(task_id) => {
                eprintln!("Error: Script execution timed out for task_id: {}.", task_id);
            }
            RhaiDispatcherError::RedisError(redis_err) => {
                eprintln!(
                    "Error: Redis communication failed: {}. Check Redis connection and server status.",
                    redis_err
                );
            }
            RhaiDispatcherError::SerializationError(serde_err) => {
                eprintln!(
                    "Error: Failed to serialize/deserialize task data: {}.",
                    serde_err
                );
            }
            RhaiDispatcherError::TaskNotFound(task_id) => {
                eprintln!(
                    "Error: Task {} not found after submission (this should be rare).",
                    task_id
                );
            }
        },
    }
}

async fn run_repl(redis_url: String, circle_name: String) -> anyhow::Result<()> {
    println!(
        "Initializing Rhai REPL for worker '{}' via Redis at {}...",
        circle_name, redis_url
    );

    let client = RhaiDispatcherBuilder::new()
        .redis_url(&redis_url)
        .caller_id("ui_repl") // Set a caller_id
        .build()
        .with_context(|| format!("Failed to create RhaiDispatcher for Redis URL: {}", redis_url))?;

    // No explicit connect() needed for rhai_dispatcher; connections are handled per-operation or pooled.
    println!(
        "RhaiDispatcher initialized. Ready to send scripts to worker '{}'.",
        circle_name
    );
    println!(
        "Type Rhai scripts, '.edit' to use $EDITOR, '.run <path>' to execute a file, or 'exit'/'quit'."
    );
    println!("Vi mode enabled for input line.");

    let config = Config::builder()
        .edit_mode(EditMode::Vi)
        .auto_add_history(true) // Automatically add to history
        .build();
    let mut rl = DefaultEditor::with_config(config)?;

    let history_file = ".rhai_repl_history.txt"; // Simple history file in the current directory
    if rl.load_history(history_file).is_err() {
        // No history found or error loading; not critical
    }

    let prompt = format!("rhai ({}) @ {}> ", circle_name, redis_url);

    loop {
        let readline = rl.readline(&prompt);
        match readline {
            Ok(line) => {
                let input = line.trim();

                if input.eq_ignore_ascii_case("exit") || input.eq_ignore_ascii_case("quit") {
                    println!("Exiting REPL.");
                    break;
                } else if input.eq_ignore_ascii_case(".edit") {
                    // Create a temp file with a .rhai suffix
                    let temp_file = TempFileBuilder::new()
                        .prefix("rhai_script_") // Optional: add a prefix
                        .suffix(".rhai")
                        .tempfile_in(".") // Create in the current directory for simplicity
                        .with_context(|| "Failed to create temp file")?;

                    // The temp file can be pre-populated if needed:
                    // use std::io::Write; // Add this import if using write_all
                    // if let Err(e) = temp_file.as_file().write_all(b"// Start your Rhai script here\n") {
                    //     eprintln!("Failed to write initial content to temp file: {}", e);
                    // }

                    let temp_path = temp_file.path().to_path_buf();
                    let editor_cmd_str = env::var("EDITOR").unwrap_or_else(|_| "vi".to_string());

                    let mut editor_parts = editor_cmd_str.split_whitespace();
                    let editor_executable = editor_parts.next().unwrap_or("vi"); // Default to vi if $EDITOR is an empty string
                    let editor_args: Vec<&str> = editor_parts.collect();

                    println!(
                        "Launching editor: '{}' with args: {:?} for script editing. Save and exit the editor to execute.",
                        editor_executable, editor_args
                    );

                    let mut command = Command::new(editor_executable);
                    command.args(editor_args); // Add any arguments from $EDITOR (like -w)
                    command.arg(&temp_path); // Add the temp file path as the last argument

                    let status = command.status();

                    match status {
                        Ok(exit_status) if exit_status.success() => {
                            match fs::read_to_string(&temp_path) {
                                Ok(script_content) => {
                                    execute_script(&client, &circle_name, script_content).await;
                                }
                                Err(e) => {
                                    eprintln!("Error reading temp file {:?}: {}", temp_path, e)
                                }
                            }
                        }
                        Ok(exit_status) => eprintln!(
                            "Editor exited with status: {}. Script not executed.",
                            exit_status
                        ),
                        Err(e) => eprintln!(
                            "Failed to launch editor '{}': {}. Ensure it's in your PATH.",
                            editor_executable, e
                        ),
                    }
                    // temp_file is automatically deleted when it goes out of scope
                } else if input.starts_with(".run ") || input.starts_with("run ") {
                    let parts: Vec<&str> = input.splitn(2, ' ').collect();
                    if parts.len() == 2 {
                        let file_path = parts[1];
                        println!("Attempting to run script from file: {}", file_path);
                        match fs::read_to_string(file_path) {
                            Ok(script_content) => {
                                execute_script(&client, &circle_name, script_content).await;
                            }
                            Err(e) => eprintln!("Error reading file {}: {}", file_path, e),
                        }
                    } else {
                        eprintln!("Usage: .run <filepath>");
                    }
                } else if !input.is_empty() {
                    execute_script(&client, &circle_name, input.to_string()).await;
                }
                // rl.add_history_entry(line.as_str()) is handled by auto_add_history(true)
            }
            Err(ReadlineError::Interrupted) => {
                // Ctrl-C
                println!("Input interrupted. Type 'exit' or 'quit' to close.");
                continue;
            }
            Err(ReadlineError::Eof) => {
                // Ctrl-D
                println!("Exiting REPL (EOF).");
                break;
            }
            Err(err) => {
                eprintln!("Error reading input: {:?}", err);
                break;
            }
        }
    }

    if rl.save_history(history_file).is_err() {
        // Failed to save history; not critical
    }

    // No explicit disconnect for RhaiDispatcher as it manages connections internally.
    println!("Exited REPL.");
    Ok(())
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    tracing_subscriber::fmt()
        .with_env_filter(
            EnvFilter::from_default_env()
                .add_directive("ui_repl=info".parse()?)
                .add_directive("rhai_dispatcher=info".parse()?),
        )
        .init();

    let args: Vec<String> = env::args().collect();

    let redis_url_str = if args.len() > 1 {
        args[1].clone()
    } else {
        let default_url = "redis://127.0.0.1/".to_string();
        println!("No Redis URL provided. Defaulting to: {}", default_url);
        default_url
    };

    let circle_name_str = if args.len() > 2 {
        args[2].clone()
    } else {
        let default_circle = "default_worker".to_string();
        println!(
            "No worker/circle name provided. Defaulting to: {}",
            default_circle
        );
        default_circle
    };

    println!(
        "Usage: {} [redis_url] [worker_name]",
        args.get(0).map_or("ui_repl", |s| s.as_str())
    );
    println!(
        "Example: {} redis://127.0.0.1/ my_rhai_worker",
        args.get(0).map_or("ui_repl", |s| s.as_str())
    );

    // Basic validation of the Redis URL (scheme only).
    // More robust validation could parse it with redis::ConnectionInfo.
    if !redis_url_str.starts_with("redis://") {
        eprintln!(
            "Warning: Redis URL '{}' does not start with 'redis://'. Attempting to use it anyway.",
            redis_url_str
        );
    }

    if let Err(e) = run_repl(redis_url_str, circle_name_str).await {
        eprintln!("REPL error: {:#}", e);
        std::process::exit(1);
    }

    Ok(())
}
3
research/rhai_engine_ui/.gitignore
vendored
Normal file
@@ -0,0 +1,3 @@
/target
/dist
Cargo.lock
30
research/rhai_engine_ui/Cargo.toml
Normal file
@@ -0,0 +1,30 @@
[package]
name = "rhai-engine-ui"
version = "0.1.0"
edition = "2021"

[dependencies]
yew = { version = "0.21", features = ["csr"] }
wasm-bindgen = "0.2"
wasm-logger = "0.2"
gloo-net = "0.4"
gloo-timers = "0.3.0"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
web-sys = { version = "0.3", features = ["HtmlInputElement"] }
log = "0.4"
chrono = { version = "0.4", features = ["serde"] }
wasm-bindgen-futures = "0.4"

# Server-side dependencies (optional)
tokio = { version = "1", features = ["full"], optional = true }
axum = { version = "0.7", optional = true }
tower = { version = "0.4", optional = true }
tower-http = { version = "0.5.0", features = ["fs", "cors"], optional = true }
rand = { version = "0.8", optional = true }
redis = { version = "0.25", features = ["tokio-comp"], optional = true }
deadpool-redis = { version = "0.15.0", features = ["rt_tokio_1"], optional = true }

[features]
# This feature enables the server-side components
server = ["tokio", "axum", "tower", "tower-http", "rand", "redis", "deadpool-redis"]
42
research/rhai_engine_ui/README.md
Normal file
@@ -0,0 +1,42 @@
# Rhai Engine Worker UI

A Yew-based WASM interface for monitoring Rhai workers.

## Prerequisites

- Rust: Install from [rust-lang.org](https://www.rust-lang.org/tools/install)
- Trunk: Install with `cargo install trunk`
- A backend service providing the necessary API endpoints (see below).

## Backend API Requirements

This UI expects a backend service to be running that can provide data from Redis. The UI makes requests to the following (example) endpoints:

- `GET /api/worker/{worker_name}/tasks_and_stats`: Returns initial `WorkerData` including a list of `TaskSummary` and initial `QueueStats`.
  - `WorkerData`: `{ "queue_stats": { "current_size": u32, "color_code": "string" }, "tasks": [TaskSummary] }`
  - `TaskSummary`: `{ "hash": "string", "created_at": i64, "status": "string" }`
- `GET /api/worker/{worker_name}/queue_stats`: Returns current `QueueStats` for polling.
  - `QueueStats`: `{ "current_size": u32, "color_code": "string" }`
- `GET /api/task/{task_hash}`: Returns `TaskDetails`.
  - `TaskDetails`: `{ "hash": "string", "created_at": i64, "status": "string", "script_content": "string", "result": "optional_string", "error": "optional_string" }`
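The backend is free to compute `color_code` however it likes. As an illustration only (the threshold values below are invented for this sketch and are not part of the API contract), a server might derive the color from the current queue size:

```rust
// Hypothetical mapping from queue size to the `color_code` field.
// The thresholds (10 and 50) are illustrative assumptions, not spec.
fn color_code_for_queue_size(current_size: u32) -> &'static str {
    match current_size {
        0..=9 => "green",   // queue is healthy
        10..=49 => "yellow", // queue is backing up
        _ => "red",          // queue is overloaded
    }
}
```

The UI treats `color_code` as an opaque string, so the backend can change this policy without a frontend update.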

**Note:** The API endpoints are currently hardcoded with relative paths (e.g., `/api/...`). This assumes the backend API is served from the same host and port as the Trunk development server, or that a proxy is configured.

## Development

1. Navigate to the `rhai_engine_ui` directory:
   ```bash
   cd research/rhai_engine_ui/
   ```
2. Run the development server:
   ```bash
   trunk serve --port 8081
   ```
3. Open your browser to `http://127.0.0.1:8081`.

## Building for Release

```bash
trunk build --release
```

This will output static files to the `dist` directory.
2
research/rhai_engine_ui/Trunk.toml
Normal file
@@ -0,0 +1,2 @@
[build]
target = "index.html"
57
research/rhai_engine_ui/docs/ARCHITECTURE.md
Normal file
@@ -0,0 +1,57 @@
# Architecture of the `rhai-engine-ui` Crate

The `rhai-engine-ui` crate provides a web-based user interface for interacting with the rhailib ecosystem, offering both client-side and server-side components for comprehensive Rhai script management and execution.

## Core Architecture

```mermaid
graph TD
    A[Web UI] --> B[Client-Side Components]
    A --> C[Server-Side Components]
    A --> D[Integration Layer]

    B --> B1[Yew Frontend]
    B --> B2[WebAssembly Runtime]
    B --> B3[Browser Interface]

    C --> C1[Axum Web Server]
    C --> C2[Redis Integration]
    C --> C3[API Endpoints]

    D --> D1[Task Submission]
    D --> D2[Real-time Updates]
    D --> D3[Result Display]
```

## Key Features

### Frontend (WebAssembly)
- **Yew Framework**: Modern Rust-based web frontend
- **Real-time Interface**: Live updates and interactive script execution
- **Browser Integration**: Native web technologies with Rust performance

### Backend (Optional Server)
- **Axum Web Server**: High-performance async web server
- **Redis Integration**: Direct connection to rhailib task queues
- **API Layer**: RESTful endpoints for task management

### Dual Architecture
- **Client-Only Mode**: Pure WebAssembly frontend for development
- **Full-Stack Mode**: Complete web application with server backend
- **Feature Flags**: Configurable deployment options

## Dependencies

### Frontend Dependencies
- **Yew**: Component-based web framework
- **WebAssembly**: Browser runtime for Rust code
- **Web APIs**: Browser integration and DOM manipulation

### Backend Dependencies (Optional)
- **Axum**: Modern web framework
- **Redis**: Task queue integration
- **Tower**: Middleware and service abstractions

## Deployment Options

The UI can be deployed as a static WebAssembly application for development use, or as a full-stack web application with server-side Redis integration for production environments.
15
research/rhai_engine_ui/index.html
Normal file
@@ -0,0 +1,15 @@
<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <title>Rhai Worker UI</title>
    <link rel="preconnect" href="https://fonts.googleapis.com">
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
    <link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap" rel="stylesheet">
    <link data-trunk rel="css" href="styles.css" />
    <!-- Trunk will inject a script tag here for the WASM loader -->
</head>
<body>
    <!-- The Yew app will be rendered here -->
</body>
</html>
388
research/rhai_engine_ui/src/app.rs
Normal file
@@ -0,0 +1,388 @@
|
use gloo_net::http::Request;
use gloo_timers::callback::Interval;
use serde::{Deserialize, Serialize};
use wasm_bindgen_futures::spawn_local;
use web_sys::HtmlInputElement;
use yew::prelude::*;
use yew::{html, Component, Context, Html, TargetCast};

// --- Data Structures (placeholders, to be refined based on backend API) ---

#[derive(Clone, PartialEq, Serialize, Deserialize, Debug)]
pub struct QueueStats {
    pub current_size: u32,
    pub color_code: String, // e.g., "green", "yellow", "red"
}

#[derive(Clone, PartialEq, Serialize, Deserialize, Debug)]
pub struct TaskSummary {
    pub hash: String,
    pub created_at: i64,
    pub status: String,
}

#[derive(Clone, Debug, Serialize, Deserialize, PartialEq)]
pub struct TaskDetails {
    pub hash: String,
    pub created_at: i64,
    pub status: String,
    pub script_content: String,
    pub result: Option<String>,
    pub error: Option<String>,
}

// Combined structure for the initial fetch
#[derive(Clone, PartialEq, Serialize, Deserialize, Debug)]
pub struct WorkerDataResponse {
    pub queue_stats: Option<QueueStats>,
    pub tasks: Vec<TaskSummary>,
}

// --- Component ---

pub enum Msg {
    UpdateWorkerName(String),
    FetchData,
    SetWorkerData(Result<WorkerDataResponse, String>),
    SetQueueStats(Result<QueueStats, String>),
    ViewTaskDetails(String), // Task hash
    SetTaskDetails(Result<TaskDetails, String>),
    ClearTaskDetails,
    IntervalTick, // For the interval timer, to trigger queue stats fetch
}

pub struct App {
    worker_name_input: String,
    worker_name_to_monitor: Option<String>,
    tasks_list: Vec<TaskSummary>,
    current_queue_stats: Option<QueueStats>,
    selected_task_details: Option<TaskDetails>,
    error_message: Option<String>,
    is_loading_initial_data: bool,
    is_loading_task_details: bool,
    queue_poll_timer: Option<Interval>,
}

impl Component for App {
    type Message = Msg;
    type Properties = ();

    fn create(_ctx: &Context<Self>) -> Self {
        Self {
            worker_name_input: "".to_string(),
            worker_name_to_monitor: None,
            tasks_list: Vec::new(),
            current_queue_stats: None,
            selected_task_details: None,
            error_message: None,
            is_loading_initial_data: false,
            is_loading_task_details: false,
            queue_poll_timer: None,
        }
    }

    fn update(&mut self, ctx: &Context<Self>, msg: Self::Message) -> bool {
        match msg {
            Msg::UpdateWorkerName(name) => {
                self.worker_name_input = name;
                true
            }
            Msg::FetchData => {
                if self.worker_name_input.trim().is_empty() {
                    self.error_message = Some("Please enter a worker name.".to_string());
                    return true;
                }
                let worker_name = self.worker_name_input.trim().to_string();
                self.worker_name_to_monitor = Some(worker_name.clone());
                self.error_message = None;
                self.tasks_list.clear();
                self.current_queue_stats = None;
                self.selected_task_details = None;
                self.is_loading_initial_data = true;

                let link = ctx.link().clone();
                let tasks_url = format!("/api/worker/{}/tasks_and_stats", worker_name);
                spawn_local(async move {
                    match Request::get(&tasks_url).send().await {
                        Ok(response) => {
                            if response.ok() {
                                match response.json::<WorkerDataResponse>().await {
                                    Ok(data) => link.send_message(Msg::SetWorkerData(Ok(data))),
                                    Err(e) => link.send_message(Msg::SetWorkerData(Err(format!(
                                        "Failed to parse worker data: {}",
                                        e
                                    )))),
                                }
                            } else {
                                link.send_message(Msg::SetWorkerData(Err(format!(
                                    "API error: {} {}",
                                    response.status(),
                                    response.status_text()
                                ))));
                            }
                        }
                        Err(e) => link.send_message(Msg::SetWorkerData(Err(format!(
                            "Network error fetching worker data: {}",
                            e
                        )))),
                    }
                });

                // Set up polling for queue stats
                let link_for_timer = ctx.link().clone();
                let timer = Interval::new(5000, move || {
                    // Poll every 5 seconds
                    link_for_timer.send_message(Msg::IntervalTick);
                });
                if let Some(old_timer) = self.queue_poll_timer.take() {
                    old_timer.cancel(); // Cancel the previous timer, if any
                }
                self.queue_poll_timer = Some(timer);
                true
            }
            Msg::IntervalTick => {
                if let Some(worker_name) = &self.worker_name_to_monitor {
                    let queue_stats_url = format!("/api/worker/{}/queue_stats", worker_name);
                    let link = ctx.link().clone();
                    spawn_local(async move {
                        match Request::get(&queue_stats_url).send().await {
                            Ok(response) => {
                                if response.ok() {
                                    match response.json::<QueueStats>().await {
                                        Ok(stats) => {
                                            link.send_message(Msg::SetQueueStats(Ok(stats)))
                                        }
                                        Err(e) => link.send_message(Msg::SetQueueStats(Err(
                                            format!("Failed to parse queue stats: {}", e),
                                        ))),
                                    }
                                } else {
                                    link.send_message(Msg::SetQueueStats(Err(format!(
                                        "API error (queue_stats): {} {}",
                                        response.status(),
                                        response.status_text()
                                    ))));
                                }
                            }
                            Err(e) => link.send_message(Msg::SetQueueStats(Err(format!(
                                "Network error fetching queue stats: {}",
                                e
                            )))),
                        }
                    });
                }
                false // No direct re-render; SetQueueStats will trigger it
            }
            Msg::SetWorkerData(Ok(data)) => {
                self.tasks_list = data.tasks;
                self.current_queue_stats = data.queue_stats;
                self.error_message = None;
                self.is_loading_initial_data = false;
                true
            }
            Msg::SetWorkerData(Err(err_msg)) => {
                self.error_message = Some(err_msg);
                self.is_loading_initial_data = false;
                if let Some(timer) = self.queue_poll_timer.take() {
                    timer.cancel();
                }
                true
            }
            Msg::SetQueueStats(Ok(stats)) => {
                self.current_queue_stats = Some(stats);
                // Don't clear the main error message here, as this is a background update
                true
            }
            Msg::SetQueueStats(Err(err_msg)) => {
                log::error!("Failed to update queue stats: {}", err_msg);
                // Optionally show a non-blocking error for queue stats
                self.current_queue_stats = None;
                true
            }
            Msg::ViewTaskDetails(hash) => {
                self.is_loading_task_details = true;
                self.selected_task_details = None; // Clear previous details
                let task_details_url = format!("/api/task/{}", hash);
                let link = ctx.link().clone();
                spawn_local(async move {
                    match Request::get(&task_details_url).send().await {
                        Ok(response) => {
                            if response.ok() {
                                match response.json::<TaskDetails>().await {
                                    Ok(details) => {
                                        link.send_message(Msg::SetTaskDetails(Ok(details)))
                                    }
                                    Err(e) => link.send_message(Msg::SetTaskDetails(Err(format!(
                                        "Failed to parse task details: {}",
                                        e
                                    )))),
                                }
                            } else {
                                link.send_message(Msg::SetTaskDetails(Err(format!(
                                    "API error (task_details): {} {}",
                                    response.status(),
                                    response.status_text()
                                ))));
                            }
                        }
                        Err(e) => link.send_message(Msg::SetTaskDetails(Err(format!(
                            "Network error fetching task details: {}",
                            e
                        )))),
                    }
                });
                true
            }
            Msg::SetTaskDetails(Ok(details)) => {
                self.selected_task_details = Some(details);
                self.error_message = None; // Clear general error if task details load
                self.is_loading_task_details = false;
                true
            }
            Msg::SetTaskDetails(Err(err_msg)) => {
                self.error_message = Some(format!("Error loading task details: {}", err_msg));
                self.selected_task_details = None;
                self.is_loading_task_details = false;
                true
            }
            Msg::ClearTaskDetails => {
                self.selected_task_details = None;
                true
            }
        }
    }

    fn view(&self, ctx: &Context<Self>) -> Html {
        let link = ctx.link();
        let on_worker_name_input = link.callback(|e: InputEvent| {
            let input: HtmlInputElement = e.target_unchecked_into();
            Msg::UpdateWorkerName(input.value())
        });

        html! {
            <div class="container">
                <h1>{ "Rhai Worker Monitor" }</h1>

                <div class="input-group">
                    <input type="text"
                        placeholder="Enter Worker Name (e.g., worker_default)"
                        value={self.worker_name_input.clone()}
                        oninput={on_worker_name_input.clone()}
                        disabled={self.is_loading_initial_data}
                        onkeypress={link.callback(move |e: KeyboardEvent| {
                            if e.key() == "Enter" { Msg::FetchData } else { Msg::UpdateWorkerName(e.target_unchecked_into::<HtmlInputElement>().value()) }
                        })}
                    />
                    <button onclick={link.callback(|_| Msg::FetchData)} disabled={self.is_loading_initial_data || self.worker_name_input.trim().is_empty()}>
                        { if self.is_loading_initial_data { "Loading..." } else { "Load Worker Data" } }
                    </button>
                </div>

                if let Some(err) = &self.error_message {
                    <p class="error">{ err }</p>
                }

                if self.worker_name_to_monitor.is_some() && !self.is_loading_initial_data && self.error_message.is_none() {
                    <h2>{ format!("Monitoring: {}", self.worker_name_to_monitor.as_ref().unwrap()) }</h2>

                    <h3>{ "Queue Status" }</h3>
                    <div class="queue-visualization">
                        {
                            if let Some(stats) = &self.current_queue_stats {
                                // TODO: Implement actual color coding and bar visualization
                                html! { <p>{format!("Tasks in queue: {} ({})", stats.current_size, stats.color_code)}</p> }
                            } else {
                                html! { <p>{ "Loading queue stats..." }</p> }
                            }
                        }
                    </div>

                    <h3>{ "Tasks" }</h3>
                    { self.view_tasks_table(ctx) }
                    { self.view_selected_task_details(ctx) }

                } else if self.is_loading_initial_data {
                    <p>{ "Loading worker data..." }</p>
                }
            </div>
        }
    }
}

impl App {
    fn view_tasks_table(&self, ctx: &Context<Self>) -> Html {
        if self.tasks_list.is_empty()
            && self.worker_name_to_monitor.is_some()
            && !self.is_loading_initial_data
        {
            return html! { <p>{ "No tasks found for this worker, or worker not found." }</p> };
        }
        if !self.tasks_list.is_empty() {
            html! {
                <table class="task-table">
                    <thead>
                        <tr>
                            <th>{ "Hash (click to view)" }</th>
                            <th>{ "Created At (UTC)" }</th>
                            <th>{ "Status" }</th>
                        </tr>
                    </thead>
                    <tbody>
                        { for self.tasks_list.iter().map(|task| self.view_task_row(ctx, task)) }
                    </tbody>
                </table>
            }
        } else {
            html! {}
        }
    }

    fn view_task_row(&self, ctx: &Context<Self>, task: &TaskSummary) -> Html {
        let task_hash_clone = task.hash.clone();
        let created_at_str = chrono::DateTime::from_timestamp(task.created_at, 0).map_or_else(
            || "Invalid date".to_string(),
            |dt| dt.format("%Y-%m-%d %H:%M:%S").to_string(),
        );
        html! {
            <tr onclick={ctx.link().callback(move |_| Msg::ViewTaskDetails(task_hash_clone.clone()))}
                style="cursor: pointer;">
                <td>{ task.hash.chars().take(12).collect::<String>() }{ "..." }</td>
                <td>{ created_at_str }</td>
                <td>{ &task.status }</td>
            </tr>
        }
    }

    fn view_selected_task_details(&self, ctx: &Context<Self>) -> Html {
        if self.is_loading_task_details {
            return html! { <p>{ "Loading task details..." }</p> };
        }
        if let Some(details) = &self.selected_task_details {
            let created_at_str = chrono::DateTime::from_timestamp(details.created_at, 0)
                .map_or_else(
                    || "Invalid date".to_string(),
                    |dt| dt.format("%Y-%m-%d %H:%M:%S UTC").to_string(),
                );
            html! {
                <div class="task-details-modal">
                    <h4>{ format!("Task Details: {}", details.hash) }</h4>
                    <p><strong>{ "Created At: " }</strong>{ created_at_str }</p>
                    <p><strong>{ "Status: " }</strong>{ &details.status }</p>
                    <p><strong>{ "Script Content:" }</strong></p>
                    <pre>{ &details.script_content }</pre>
                    if let Some(result) = &details.result {
                        <p><strong>{ "Result:" }</strong></p>
                        <pre>{ result }</pre>
                    }
                    if let Some(error) = &details.error {
                        <p><strong>{ "Error:" }</strong></p>
                        <pre style="color: red;">{ error }</pre>
                    }
                    <button onclick={ctx.link().callback(|_| Msg::ClearTaskDetails)}>{ "Close Details" }</button>
                </div>
            }
        } else {
            html! {}
        }
    }
}
184
research/rhai_engine_ui/src/main.rs
Normal file
@@ -0,0 +1,184 @@
// The 'app' module is shared between the server and the client.
mod app;

// --- SERVER-SIDE CODE --- //

#[cfg(feature = "server")]
mod server {
    use axum::{
        extract::{Path, State},
        http::{Method, StatusCode},
        routing::get,
        Json, Router,
    };
    use deadpool_redis::{Config, Pool, Runtime};
    use redis::{from_redis_value, AsyncCommands, FromRedisValue, Value};
    use std::collections::HashMap;
    use std::env;
    use std::net::SocketAddr;
    use tower_http::cors::{Any, CorsLayer};
    use tower_http::services::ServeDir;

    // Import the shared application state and data structures
    use crate::app::{QueueStats, TaskDetails, TaskSummary, WorkerDataResponse};

    const REDIS_TASK_DETAILS_PREFIX: &str = "rhai_task_details:";
    const REDIS_QUEUE_PREFIX: &str = "rhai_tasks:";

    // The main function to run the server
    pub async fn run() {
        let redis_url = env::var("REDIS_URL").unwrap_or_else(|_| "redis://127.0.0.1/".to_string());
        let cfg = Config::from_url(redis_url);
        let pool = cfg
            .create_pool(Some(Runtime::Tokio1))
            .expect("Failed to create Redis pool");

        let cors = CorsLayer::new()
            .allow_methods([Method::GET])
            .allow_origin(Any);

        let app = Router::new()
            .route(
                "/api/worker/:worker_name/tasks_and_stats",
                get(get_worker_data),
            )
            .route("/api/worker/:worker_name/queue_stats", get(get_queue_stats))
            .route("/api/task/:hash", get(get_task_details))
            .nest_service("/", ServeDir::new("dist"))
            .with_state(pool)
            .layer(cors);

        let addr = SocketAddr::from(([127, 0, 0, 1], 3000));
        println!("Backend server listening on http://{}", addr);
        println!("Serving static files from './dist' directory.");

        let listener = tokio::net::TcpListener::bind(addr).await.unwrap();
        axum::serve(listener, app).await.unwrap();
    }

    // --- API Handlers (Live Redis Data) ---

    async fn get_worker_data(
        State(pool): State<Pool>,
        Path(worker_name): Path<String>,
    ) -> Result<Json<WorkerDataResponse>, (StatusCode, String)> {
        let mut conn = pool.get().await.map_err(internal_error)?;
        let queue_key = format!("{}{}", REDIS_QUEUE_PREFIX, worker_name);

        let task_ids: Vec<String> = conn
            .lrange(&queue_key, 0, -1)
            .await
            .map_err(internal_error)?;
        let mut tasks = Vec::new();

        for task_id in task_ids {
            let task_key = format!("{}{}", REDIS_TASK_DETAILS_PREFIX, task_id);
            let task_details: redis::Value =
                conn.hgetall(&task_key).await.map_err(internal_error)?;
            if let Ok(summary) = task_summary_from_redis_value(&task_details) {
                tasks.push(summary);
            }
        }

        let queue_stats = get_queue_stats_internal(&mut conn, &worker_name).await?;

        Ok(Json(WorkerDataResponse {
            tasks,
            queue_stats: Some(queue_stats),
        }))
    }

    async fn get_queue_stats(
        State(pool): State<Pool>,
        Path(worker_name): Path<String>,
    ) -> Result<Json<QueueStats>, (StatusCode, String)> {
        let mut conn = pool.get().await.map_err(internal_error)?;
        let stats = get_queue_stats_internal(&mut conn, &worker_name).await?;
        Ok(Json(stats))
    }

    async fn get_task_details(
        State(pool): State<Pool>,
        Path(hash): Path<String>,
    ) -> Result<Json<TaskDetails>, (StatusCode, String)> {
        let mut conn = pool.get().await.map_err(internal_error)?;
        let task_key = format!("{}{}", REDIS_TASK_DETAILS_PREFIX, hash);
        let task_details: redis::Value = conn.hgetall(&task_key).await.map_err(internal_error)?;
        let details = task_details_from_redis_value(&task_details).map_err(internal_error)?;
        Ok(Json(details))
    }

    // --- Internal Helper Functions ---

    async fn get_queue_stats_internal(
        conn: &mut deadpool_redis::Connection,
        worker_name: &str,
    ) -> Result<QueueStats, (StatusCode, String)> {
        let queue_key = format!("{}{}", REDIS_QUEUE_PREFIX, worker_name);
        let size: u32 = conn.llen(&queue_key).await.map_err(internal_error)?;
        let color_code = match size {
            0..=10 => "green",
            11..=50 => "yellow",
            _ => "red",
        }
        .to_string();
        Ok(QueueStats {
            current_size: size,
            color_code,
        })
    }

    fn internal_error<E: std::error::Error>(err: E) -> (StatusCode, String) {
        (StatusCode::INTERNAL_SERVER_ERROR, err.to_string())
    }

    fn task_summary_from_redis_value(v: &Value) -> redis::RedisResult<TaskSummary> {
        let map: HashMap<String, String> = from_redis_value(v)?;
        Ok(TaskSummary {
            hash: map.get("hash").cloned().unwrap_or_default(),
            created_at: map
                .get("createdAt")
                .and_then(|s| s.parse().ok())
                .unwrap_or_default(),
            status: map
                .get("status")
                .cloned()
                .unwrap_or_else(|| "Unknown".to_string()),
        })
    }

    fn task_details_from_redis_value(v: &Value) -> redis::RedisResult<TaskDetails> {
        let map: HashMap<String, String> = from_redis_value(v)?;
        Ok(TaskDetails {
            hash: map.get("hash").cloned().unwrap_or_default(),
            created_at: map
                .get("createdAt")
                .and_then(|s| s.parse().ok())
                .unwrap_or_default(),
            status: map
                .get("status")
                .cloned()
                .unwrap_or_else(|| "Unknown".to_string()),
            script_content: map.get("script").cloned().unwrap_or_default(),
            result: map.get("output").cloned(),
            error: map.get("error").cloned(),
        })
    }
}

// --- MAIN ENTRY POINTS --- //

// Main function for the server binary
#[cfg(feature = "server")]
#[tokio::main]
async fn main() {
    server::run().await;
}

// Main function for the WASM client (compiles when the 'server' feature is not enabled)
#[cfg(not(feature = "server"))]
fn main() {
    wasm_logger::init(wasm_logger::Config::default());
    log::info!("Rhai Worker UI starting...");
    yew::Renderer::<app::App>::new().render();
}
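The queue color coding in `get_queue_stats_internal` is a pure size-to-color mapping, so it can be sketched and checked in isolation. The function name `color_for_queue_size` below is hypothetical (not part of the source); the thresholds match those in the handler above.

```rust
// Standalone sketch of the queue-size -> color mapping used by
// get_queue_stats_internal (same thresholds; function name is illustrative).
fn color_for_queue_size(size: u32) -> &'static str {
    match size {
        0..=10 => "green",   // small backlog
        11..=50 => "yellow", // growing backlog
        _ => "red",          // overloaded
    }
}

fn main() {
    assert_eq!(color_for_queue_size(0), "green");
    assert_eq!(color_for_queue_size(10), "green");
    assert_eq!(color_for_queue_size(11), "yellow");
    assert_eq!(color_for_queue_size(51), "red");
    println!("ok");
}
```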
173
research/rhai_engine_ui/styles.css
Normal file
@@ -0,0 +1,173 @@
/* --- Dark, Sleek, and Modern UI --- */

:root {
    --bg-color: #1a1a1a;
    --primary-color: #252525;
    --secondary-color: #333333;
    --font-color: #e0e0e0;
    --highlight-color: #00aaff;
    --border-color: #444444;
    --error-color: #ff4d4d;
    --error-bg-color: rgba(255, 77, 77, 0.1);
}

body {
    font-family: 'Inter', sans-serif;
    margin: 0;
    padding: 40px 20px;
    background-color: var(--bg-color);
    color: var(--font-color);
    -webkit-font-smoothing: antialiased;
    -moz-osx-font-smoothing: grayscale;
}

.container {
    background-color: transparent;
    max-width: 900px;
    margin: auto;
}

h1, h2, h3, h4 {
    color: var(--font-color);
    font-weight: 600;
    margin-bottom: 20px;
}

h1 {
    text-align: center;
    font-size: 2.5em;
    letter-spacing: -1px;
}

.input-group {
    margin-bottom: 30px;
    display: flex;
    gap: 10px;
}

input[type="text"] {
    flex-grow: 1;
    padding: 12px 15px;
    border: 1px solid var(--border-color);
    border-radius: 6px;
    font-size: 1em;
    background-color: var(--primary-color);
    color: var(--font-color);
    transition: border-color 0.3s, box-shadow 0.3s;
}

input[type="text"]:focus {
    outline: none;
    border-color: var(--highlight-color);
    box-shadow: 0 0 0 3px rgba(0, 170, 255, 0.2);
}

button {
    padding: 12px 20px;
    background-color: var(--highlight-color);
    color: #ffffff;
    border: none;
    border-radius: 6px;
    cursor: pointer;
    font-size: 1em;
    font-weight: 500;
    transition: background-color 0.3s;
}

button:hover {
    background-color: #0088cc;
}

button:disabled {
    background-color: var(--secondary-color);
    cursor: not-allowed;
}

.error {
    color: var(--error-color);
    margin-top: 20px;
    text-align: center;
    padding: 12px;
    border: 1px solid var(--error-color);
    background-color: var(--error-bg-color);
    border-radius: 6px;
}

.task-table {
    width: 100%;
    border-collapse: collapse;
    margin-top: 30px;
}

.task-table th, .task-table td {
    border-bottom: 1px solid var(--border-color);
    padding: 15px;
    text-align: left;
}

.task-table th {
    font-weight: 600;
    color: #a0a0a0;
    text-transform: uppercase;
    font-size: 0.85em;
    letter-spacing: 0.5px;
}

.task-table tr {
    transition: background-color 0.2s;
}

.task-table tr:hover {
    background-color: var(--primary-color);
    cursor: pointer;
}

.queue-visualization {
    margin-top: 30px;
    padding: 25px;
    border: 1px solid var(--border-color);
    background-color: var(--primary-color);
    border-radius: 8px;
    text-align: center;
    font-size: 1.2em;
    font-weight: 500;
}

.task-details-modal {
    margin-top: 30px;
    padding: 25px;
    border: 1px solid var(--border-color);
    background-color: var(--primary-color);
    border-radius: 8px;
}

.task-details-modal h4 {
    margin-top: 0;
    font-size: 1.5em;
}

.task-details-modal p {
    margin: 12px 0;
    color: #c0c0c0;
}

.task-details-modal p strong {
    color: var(--font-color);
    font-weight: 500;
}

.task-details-modal pre {
    background-color: var(--bg-color);
    padding: 15px;
    border-radius: 6px;
    white-space: pre-wrap;
    word-break: break-all;
    max-height: 250px;
    overflow-y: auto;
    border: 1px solid var(--border-color);
    font-family: 'Courier New', Courier, monospace;
}

.task-details-modal button {
    margin-top: 20px;
}