21 Commits

SHA1 Message Date
bf20aeb88e Misc Doc Updates 2025-11-16 18:16:08 -07:00
50d5fe4738 Added Enrollment Code Argument Examples to README 2025-11-16 18:13:52 -07:00
824bd614d0 Added Enrollment Code Argument to Linux Bootstrapper Script 2025-11-16 18:09:28 -07:00
e36367b538 Re-implemented enrollmentcode commandline arguments for bootstrapper script. 2025-11-16 18:05:18 -07:00
f123ecf9bc Fixed Installed Software Search Breaking the WebUI 2025-11-16 17:54:24 -07:00
b2120d7385 Simplified & Reworked Enrollment Code System to be Site-Specific 2025-11-16 17:40:24 -07:00
65bee703e9 Re-commit 2025-11-16 07:27:47 -07:00
3091baecaf Allowed device JWTs to fetch repo hash for updater script 2025-11-16 07:12:02 -07:00
575be16214 Fixed Python Path Issues for Borealis Agent if Agent is moved to a different location on endpoint. 2025-11-16 06:56:28 -07:00
025255a102 Cleaned up references to legacy server in codex instructors. 2025-11-16 03:01:57 -07:00
547d04d965 Restructured Codex Instructor Files 2025-11-16 02:56:33 -07:00
3d914b70bc Scheduled Jobs: Fixed Job Table Column Spacing Issues 2025-11-15 14:02:14 -07:00
695e9ff73f Fixed Scheduled Job List Checkboxes Automatically Unchecking Themselves 2025-11-15 12:54:14 -07:00
98f678266b Adjusted "http" to "https" in documentation. 2025-11-15 06:51:47 -07:00
dbd176f4b4 Updated Launch Commands in README 2025-11-15 06:51:10 -07:00
0b56573d7e Updated Scheduled Job List Screenshot 2025-11-15 06:48:19 -07:00
f4fb060114 Additional scheduled job list UI adjustments 2025-11-15 06:37:37 -07:00
48a2a152e7 Quick Jobs Now Dispatch and Log Like Normal Jobs 2025-11-15 06:26:31 -07:00
7a599cdef7 Locked-down endpoints: /api/agents, /api/devices, /api/devices/<guid>, /api/device/details/<hostname>, /api/device/description/<hostname>, /api/device_list_views, /api/device_list_views/<view_id>, /api/sites, /api/sites/delete, /api/sites/device_map, /api/sites/assign, /api/sites/rename, /api/repo/current_hash, /api/agent/hash_list, /api/scripts/quick_run, /api/ansible/quick_run, /api/device/activity/<hostname>, /api/device/activity/job/<job_id>, /api/server/time. 2025-11-15 05:33:46 -07:00
b44aff64a3 Fixed Login Being Invisible on Edge/Chrome 2025-11-15 05:15:59 -07:00
305b108892 Removed Import Assemblies 2025-11-15 02:29:51 -07:00
55 changed files with 1910 additions and 1427 deletions

.github/FUNDING.yml vendored Normal file
View File

@@ -0,0 +1,2 @@
# These are supported funding model platforms
ko_fi: bunnylab

.vscode/tasks.json vendored Normal file
View File

@@ -0,0 +1,56 @@
{
"version": "2.0.0",
"tasks": [
{
"label": "Borealis - Engine (Production)",
"type": "shell",
"command": "powershell.exe",
"args": [
"-NoLogo",
"-NoProfile",
"-ExecutionPolicy", "Bypass",
"-File", "${workspaceFolder}\\Borealis.ps1",
"-EngineProduction"
],
"presentation": {
"reveal": "always",
"panel": "shared"
},
"problemMatcher": []
},
{
"label": "Borealis - Engine (Dev)",
"type": "shell",
"command": "powershell.exe",
"args": [
"-NoLogo",
"-NoProfile",
"-ExecutionPolicy", "Bypass",
"-File", "${workspaceFolder}\\Borealis.ps1",
"-EngineDev"
],
"presentation": {
"reveal": "always",
"panel": "shared"
},
"problemMatcher": []
},
{
"label": "Borealis - Agent",
"type": "shell",
"command": "powershell.exe",
"args": [
"-NoLogo",
"-NoProfile",
"-ExecutionPolicy", "Bypass",
"-File", "${workspaceFolder}\\Borealis.ps1",
"-Agent"
],
"presentation": {
"reveal": "always",
"panel": "shared"
},
"problemMatcher": []
}
]
}

View File

@@ -1,80 +1,14 @@
## Borealis Agent
- **Purpose**: Runs under the packaged Python virtual environment (`Data/Agent` mirrored to `Agent/`) and provides outbound-only connectivity, device telemetry, scripting, and UI capabilities.
- **Bootstrap**: `Borealis.ps1` prepares dependencies, activates the agent venv, and launches the agent alongside the Engine runtime (legacy server boot remains available for parity checks).
- **Runtime Paths**: Do not edit `/Agent`; make changes in `Data/Agent` so the runtime copy stays ephemeral. Runtime folders are wiped regularly.
# Borealis Codex Engagement Index
### Logging
- General log: `Agent/Logs/agent.log`; rotate daily to `agent.log.YYYY-MM-DD` and never delete automatically.
- Subsystems (e.g., `ansible`, `webrtc`, `scheduler`) must log to `Agent/Logs/<service>.log` and follow the same rotation policy.
- Installation output writes to `Agent/Logs/install.log`; keep ad-hoc diagnostics (e.g., `system_last.ps1`, ansible traces) under `Agent/Logs/` so runtime state stays self-contained.
- When troubleshooting with operators, prepend each line with `<timestamp>-<service-name>-<log-data>` and confirm whether to keep or remove verbose logging after resolution.
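A minimal sketch of that logging policy in Python (the helper name, log directory, and use of `TimedRotatingFileHandler` are illustrative assumptions, not the agent's actual implementation):
import logging
import os
from logging.handlers import TimedRotatingFileHandler

def build_service_logger(service: str, log_dir: str = "Agent/Logs") -> logging.Logger:
    os.makedirs(log_dir, exist_ok=True)
    handler = TimedRotatingFileHandler(
        os.path.join(log_dir, f"{service}.log"),
        when="midnight",
        backupCount=0,  # 0 keeps every rotated file; nothing is deleted automatically
        encoding="utf-8",
    )
    handler.suffix = "%Y-%m-%d"  # rotated files land as <service>.log.YYYY-MM-DD
    handler.setFormatter(logging.Formatter(f"%(asctime)s-{service}-%(message)s"))
    logger = logging.getLogger(f"borealis.{service}")
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger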
Use this file as the entrypoint for Codex instructions. Domain-specific guidance lives in `/Docs/Codex` so we can scale without bloating this page.
### Security
- Generates device-wide Ed25519 keys on first launch (`Certificates/Agent/Identity/` with DPAPI protection on Windows, `chmod 600` elsewhere).
- Stores refresh/access tokens encrypted and pins them to the Engine certificate fingerprint; mismatches force re-enrollment.
- Uses a dedicated `ssl.SSLContext` seeded with the Engine's TLS bundle for REST and Socket.IO traffic.
- Validates all script payloads with Ed25519 signatures issued by the backend before execution (a verification sketch follows this list).
- Enforces outbound-only communication; every API/WebSocket call flows through `AgentHttpClient.ensure_authenticated` to refresh tokens proactively.
- Logs bootstrap, enrollment, token refresh, and signature events under `Agent/Logs/`.
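A hedged sketch of the signature check described above, using the `cryptography` package (the helper name and byte handling are illustrative; the agent's real verification path may differ):
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def payload_is_trusted(public_key_bytes: bytes, signature: bytes, payload: bytes) -> bool:
    # Reject the script unless the backend-issued Ed25519 signature matches the payload.
    try:
        Ed25519PublicKey.from_public_bytes(public_key_bytes).verify(signature, payload)
        return True
    except InvalidSignature:
        return False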
## Where to Read
- Agent: `Docs/Codex/BOREALIS_AGENT.md` (runtime paths, logging, security, roles, platform parity, Ansible status).
- Engine: `Docs/Codex/BOREALIS_ENGINE.md` (migration tracker, architecture, logging, security/API parity, platform parity, Ansible state).
- Shared: `Docs/Codex/SHARED.md` with UI guidance at `Docs/Codex/USER_INTERFACE.md`.
### Execution Contexts & Roles
- Roles auto-discover from `Data/Agent/Roles/` and require no loader changes.
- Naming convention: `role_<Purpose>.py`, with `ROLE_NAME`, `ROLE_CONTEXTS`, and optional lifecycle hooks (`register_events`, `on_config`, `stop_all`); a minimal role sketch follows this list.
- Standard roles: `role_DeviceInventory.py`, `role_Screenshot.py`, `role_ScriptExec_CURRENTUSER.py`, `role_ScriptExec_SYSTEM.py`, `role_Macro.py`.
- SYSTEM tasks depend on scheduled-task creation rights; failures should surface cleanly through Engine logging.
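A hypothetical role module illustrating the convention above (for example `Data/Agent/Roles/role_Example.py`; the context strings and hook signatures are assumptions drawn from the naming convention, not the loader's actual contract):
ROLE_NAME = "Example"
ROLE_CONTEXTS = ["CURRENTUSER"]  # assumed context label; a SYSTEM role would declare that instead

def register_events(sio):
    # Optional hook: attach realtime handlers when the agent connects.
    def _on_ping(data):
        sio.emit("example_pong", {"echo": data})
    sio.on("example_ping", _on_ping)

def on_config(config: dict):
    # Optional hook: react to agent_settings.json changes.
    pass

def stop_all():
    # Optional hook: tear down background work before reconfiguration or shutdown.
    pass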
Precedence: follow domain docs first; fall back to Shared when there is overlap. If domain and Shared disagree, domain wins.
### Platform Parity
- Windows remains the reference environment. Linux (`Borealis.sh`) trails in feature parity (venv setup, supervision, role loading) and must be aligned before macOS work continues.
### Ansible Support (Unfinished)
- Agent and Engine scaffolding exists but is unreliable: expect stalled or silent failures, inconsistent recap delivery, and incomplete packaging of required collections.
- Windows blockers: `ansible.windows.*` modules generally require PSRP/WinRM, SYSTEM context lacks loopback remoting guarantees, and interpreter paths vary.
- Guidance: treat Ansible features as disabled; do not file bugs until the packaging and controller story is complete.
- Future direction includes credential management, selectable connection types, reliable live output/cancel semantics, and packaged collections.
## Borealis Engine
- **Role**: The actively developed successor to the legacy server (`Data/Server/server.py`), aiming for feature parity across Python services, REST APIs, WebSockets, and the Flask/Vite frontends while improving stability, flexibility, and troubleshooting.
- **Migration Tracker**: `Engine/Data/Engine/CODE_MIGRATION_TRACKER.md` records stages, active tasks, and completed work. Stages 1–5 (bootstrap, configuration parity, API scaffolding, testing, and legacy bridge) are complete; Stage 6 (WebUI migration) is in progress; Stage 7 (WebSocket migration) is queued.
- **Architecture**: Runs via `Data/Engine/server.py` with NodeJS + Vite for live development and Flask for production serving and API endpoints. `Borealis.ps1` launches the Engine by default while keeping the legacy server switch available for regression comparisons.
- **Runtime Paths**: Edit code in `Data/Engine`; runtime copies are placed under `/Engine` and discarded frequently. `/Server` remains untouched unless explicitly running the legacy path.
### Development Guidelines
- Every new Python module under `Data/Engine` or `Engine/Data/Engine` must start with the standard commentary header describing purpose and any API endpoints (see the example after this list). Add the header to existing modules that lack it before further edits.
- Reference the migration tracker before making Engine changes to avoid jumping ahead of the approved stage.
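For reference, a header of the kind this guideline calls for might look like the following (the wording is illustrative; compare the device-management module header shown later in this diff):
# ======================================================
# Data/Engine/services/API/example.py
# Purpose: One or two lines describing what this module provides.
#
# API Endpoints (if applicable):
# - GET /api/example (Token Authenticated) - Short description of the endpoint.
# ======================================================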
### Logging
- General log: `Engine/Logs/engine.log` with daily rotation (`engine.log.YYYY-MM-DD`); do not auto-delete rotated files.
- Subsystems should log to `Engine/Logs/<service>.log`; installation output belongs in `Engine/Logs/install.log`.
- Adhere to the centralized logging policy and keep Engine-specific artifacts within `Engine/Logs/` to preserve the runtime boundary.
### Security & API Parity
- Shares the mutual trust model with the legacy server: Ed25519 device identities, EdDSA-signed access tokens, pinned Borealis root CA, TLS 1.3-only serving, and Authorization headers plus service-context markers on every device API.
- Implements DPoP proof validation, short-lived access tokens (15 min), and SHA-256-hashed refresh tokens with 30-day lifetime and explicit reuse errors; the token lifetimes are sketched after this list.
- Enrollment workflows include operator approval queues, conflict detection, auditor recording, and pruning of expired codes/refresh tokens.
- Background jobs and service adapters preserve compatibility with legacy database schemas while allowing gradual API takeover.
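An illustrative sketch of those token lifetimes (not the Engine's implementation; the names and storage shape are assumptions):
import hashlib
import secrets
from datetime import datetime, timedelta, timezone

ACCESS_TOKEN_TTL = timedelta(minutes=15)
REFRESH_TOKEN_TTL = timedelta(days=30)

def issue_refresh_token() -> tuple[str, str, datetime]:
    raw = secrets.token_urlsafe(48)  # returned to the device exactly once
    stored_hash = hashlib.sha256(raw.encode()).hexdigest()  # only the SHA-256 hash is persisted
    return raw, stored_hash, datetime.now(timezone.utc) + REFRESH_TOKEN_TTL

def refresh_token_is_valid(presented: str, stored_hash: str, expires_at: datetime) -> bool:
    if datetime.now(timezone.utc) >= expires_at:
        return False  # expired or reused tokens get an explicit error from the caller
    return hashlib.sha256(presented.encode()).hexdigest() == stored_hash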
### WebUI & WebSocket Migration
- Static/template handling resides in `Data/Engine/services/WebUI`, with deployment copy paths wired through `Borealis.ps1` and TLS-aware URL generation intact.
- Pending tasks in Stage 6: add the migration switch in the legacy server for WebUI delegation and finish porting device/admin API endpoints into Engine services (current active task).
- Stage 7 will introduce `register_realtime` hooks, Engine-side Socket.IO handlers, integration checks, and legacy delegation updates.
### Platform Parity
- Windows support is the primary target. Ensure Engine tooling remains aligned with the agent experience; Linux packaging must catch up before macOS work resumes.
### Ansible Support (Shared State)
- Mirrors the agent's unfinished story; Engine adapters should treat Ansible orchestration as experimental until packaging, connection management, and logging mature.
## Borealis Server (Legacy)
- **Role**: The historical Flask runtime under `Data/Server/server.py`. It remains available for reference and parity testing while the Engine takes over; no new features should land here.
- **Usage**: Launch only when comparing behaviour or during migration fallback scenarios. `Borealis.ps1` can still mirror `Data/Server` into `/Server`, but the staging tree itself must remain untouched.
- **Runtime Paths**: `/Server` and `Data/Server` are read-only for day-to-day work; edit Engine staging instead.
### Logging
- Legacy logs write to `Logs/Server/<service>.log` with the same rotation policy (`<service>.log.YYYY-MM-DD`). Installation logs belong in `Logs/Server/install.log`. Avoid changes unless investigating historical behaviour.
### Security Posture
- Shares the same mutual-authentication and TLS posture as the Engine. Device authentication checks GUID normalization, SSL fingerprint matches, token version counters, and quarantine flags before admitting requests.
- Refresh tokens remain hashed (SHA-256) and DPoP-bound; reuse after revocation or expiry returns explicit errors. Enrollment workflows preserve operator approvals and auditing.
### Platform Notes
- Exists primarily to document past behaviour and assist the Engine migration. Future platform parity work should target the Engine; the legacy server will be deprecated once feature parity is confirmed.
## UI / AG Grid
- MagicUI styling language and AG Grid rules are consolidated in `Docs/Codex/USER_INTERFACE.md`.
- Visual example: `Data/Engine/web-interface/src/Admin/Page_Template.jsx` (reference only—no business logic). Use it to mirror layout, spacing, and selection column behavior.

View File

@@ -10,7 +10,8 @@ param(
[switch]$EngineTests,
[switch]$EngineProduction,
[switch]$EngineDev,
[string]$InstallerCode = ''
[Alias('enrollmentcode','Enrollmentcode')]
[string]$EnrollmentCode = ''
)
# Preselect menu choices from CLI args (optional)
@@ -851,19 +852,76 @@ function InstallOrUpdate-BorealisAgent {
$existingServerUrl = $null
Run-Step "Create Virtual Python Environment" {
if (-not (Test-Path (Join-Path $venvFolderPath 'Scripts\Activate'))) {
$pythonForVenv = $pythonExe
if (-not (Test-Path $pythonForVenv)) {
$pyCmd = Get-Command py -ErrorAction SilentlyContinue
$pythonCmd = Get-Command python -ErrorAction SilentlyContinue
if ($pyCmd) { $pythonForVenv = $pyCmd.Source }
elseif ($pythonCmd) { $pythonForVenv = $pythonCmd.Source }
else {
Write-Host "Python not found. Install Python or run Server setup (option 1)." -ForegroundColor Red
exit 1
}
$venvActivate = Join-Path $venvFolderPath 'Scripts\Activate'
$pyvenvCfg = Join-Path $venvFolderPath 'pyvenv.cfg'
$pythonForVenv = $pythonExe
if (-not (Test-Path $pythonForVenv)) {
$pyCmd = Get-Command py -ErrorAction SilentlyContinue
$pythonCmd = Get-Command python -ErrorAction SilentlyContinue
if ($pyCmd) { $pythonForVenv = $pyCmd.Source }
elseif ($pythonCmd) { $pythonForVenv = $pythonCmd.Source }
else {
Write-Host "Python not found. Install Python or run Server setup (option 1)." -ForegroundColor Red
exit 1
}
}
$expectedPython = $pythonForVenv
$expectedPythonNorm = $null
$expectedHomeNorm = $null
try {
if (Test-Path $expectedPython -PathType Leaf) {
$expectedPython = (Resolve-Path $expectedPython -ErrorAction Stop).ProviderPath
}
} catch { $expectedPython = $pythonForVenv }
if ($expectedPython) {
$expectedPythonNorm = $expectedPython.ToLowerInvariant()
try {
$expectedHome = Split-Path -Path $expectedPython -Parent
} catch { $expectedHome = $null }
if ($expectedHome) { $expectedHomeNorm = $expectedHome.ToLowerInvariant() }
}
$venvNeedsUpgrade = $false
if (Test-Path $pyvenvCfg -PathType Leaf) {
try {
$cfgLines = Get-Content -Path $pyvenvCfg -ErrorAction Stop
$cfgMap = @{}
foreach ($line in $cfgLines) {
$trimmed = $line.Trim()
if (-not $trimmed -or $trimmed.StartsWith('#')) { continue }
$parts = $trimmed -split '=', 2
if ($parts.Count -ne 2) { continue }
$cfgMap[$parts[0].Trim().ToLowerInvariant()] = $parts[1].Trim()
}
$cfgExecutable = $cfgMap['executable']
$cfgHome = $cfgMap['home']
if ($cfgExecutable -and -not (Test-Path $cfgExecutable -PathType Leaf)) {
$venvNeedsUpgrade = $true
} elseif ($cfgHome -and -not (Test-Path $cfgHome -PathType Container)) {
$venvNeedsUpgrade = $true
} else {
if ($cfgExecutable -and $expectedPythonNorm) {
try { $resolvedExe = (Resolve-Path $cfgExecutable -ErrorAction Stop).ProviderPath } catch { $resolvedExe = $cfgExecutable }
$resolvedExeNorm = if ($resolvedExe) { $resolvedExe.ToLowerInvariant() } else { $null }
if ($resolvedExeNorm -and $resolvedExeNorm -ne $expectedPythonNorm) { $venvNeedsUpgrade = $true }
}
if (-not $venvNeedsUpgrade -and $cfgHome -and $expectedHomeNorm) {
try { $resolvedHome = (Resolve-Path $cfgHome -ErrorAction Stop).ProviderPath } catch { $resolvedHome = $cfgHome }
$resolvedHomeNorm = if ($resolvedHome) { $resolvedHome.ToLowerInvariant() } else { $null }
if ($resolvedHomeNorm -and $resolvedHomeNorm -ne $expectedHomeNorm) { $venvNeedsUpgrade = $true }
}
}
} catch { $venvNeedsUpgrade = $true }
}
if (-not (Test-Path $venvActivate)) {
& $pythonForVenv -m venv $venvFolderPath
} elseif ($venvNeedsUpgrade) {
Write-Host "Detected relocated Agent virtual environment. Rebuilding interpreter bindings..." -ForegroundColor Yellow
& $pythonForVenv -m venv --upgrade $venvFolderPath
}
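For readability, a rough Python equivalent of the relocation check above (the shipped logic is the PowerShell; this sketch only mirrors its intent):
from pathlib import Path

def venv_needs_upgrade(pyvenv_cfg: Path, expected_python: Path) -> bool:
    # Compare pyvenv.cfg's recorded interpreter/home against the interpreter we
    # expect to use; a mismatch or missing path means the venv was relocated and
    # should be rebuilt with `python -m venv --upgrade`.
    if not pyvenv_cfg.is_file():
        return False
    cfg = {}
    for line in pyvenv_cfg.read_text(encoding="utf-8").splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, value = line.split("=", 1)
        cfg[key.strip().lower()] = value.strip()
    executable = cfg.get("executable")
    home = cfg.get("home")
    if executable and not Path(executable).is_file():
        return True
    if home and not Path(home).is_dir():
        return True
    if executable and Path(executable).resolve() != expected_python.resolve():
        return True
    if home and Path(home).resolve() != expected_python.parent.resolve():
        return True
    return False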
if (Test-Path $agentSourcePath) {
# Cleanup Previous Agent Folder & Create New Folder
@@ -977,6 +1035,7 @@ function InstallOrUpdate-BorealisAgent {
config_file_watcher_interval = 2
agent_id = ''
regions = @{}
enrollment_code = ''
installer_code = ''
}
$config = [ordered]@{}
@@ -1001,40 +1060,44 @@ function InstallOrUpdate-BorealisAgent {
$config['regions'] = @{}
}
$existingInstallerCode = ''
if ('installer_code' -in $config.Keys -and $null -ne $config['installer_code']) {
$existingInstallerCode = [string]$config['installer_code']
$existingEnrollmentCode = ''
if ('enrollment_code' -in $config.Keys -and $null -ne $config['enrollment_code']) {
$existingEnrollmentCode = [string]$config['enrollment_code']
} elseif ('installer_code' -in $config.Keys -and $null -ne $config['installer_code']) {
$existingEnrollmentCode = [string]$config['installer_code']
}
$providedInstallerCode = ''
if ($InstallerCode -and $InstallerCode.Trim()) {
$providedInstallerCode = $InstallerCode.Trim()
} elseif ($env:BOREALIS_INSTALLER_CODE -and $env:BOREALIS_INSTALLER_CODE.Trim()) {
$providedInstallerCode = $env:BOREALIS_INSTALLER_CODE.Trim()
$providedEnrollmentCode = ''
if ($EnrollmentCode -and $EnrollmentCode.Trim()) {
$providedEnrollmentCode = $EnrollmentCode.Trim()
} elseif ($env:BOREALIS_ENROLLMENT_CODE -and $env:BOREALIS_ENROLLMENT_CODE.Trim()) {
$providedEnrollmentCode = $env:BOREALIS_ENROLLMENT_CODE.Trim()
}
if (-not $providedInstallerCode) {
$defaultDisplay = if ($existingInstallerCode) { $existingInstallerCode } else { '' }
Write-Host ""; Write-Host "Set an installer code for agent enrollment." -ForegroundColor DarkYellow
$inputCode = Read-Host ("Installer Code [{0}] (e.g. A4E1-••••-••••-••••-••••-••••-••••-350A)" -f $defaultDisplay)
if (-not $providedEnrollmentCode) {
$defaultDisplay = if ($existingEnrollmentCode) { $existingEnrollmentCode } else { '' }
Write-Host ""; Write-Host "Set an enrollment code for agent enrollment." -ForegroundColor DarkYellow
$inputCode = Read-Host ("Enrollment Code [{0}] (e.g. A4E1-••••-••••-••••-••••-••••-••••-350A)" -f $defaultDisplay)
if ($inputCode -and $inputCode.Trim()) {
$providedInstallerCode = $inputCode.Trim()
$providedEnrollmentCode = $inputCode.Trim()
} elseif ($defaultDisplay) {
$providedInstallerCode = $defaultDisplay
$providedEnrollmentCode = $defaultDisplay
} else {
$providedInstallerCode = ''
$providedEnrollmentCode = ''
}
}
$config['installer_code'] = $providedInstallerCode
$config['enrollment_code'] = $providedEnrollmentCode
# Retain legacy key to avoid breaking existing agent readers
$config['installer_code'] = $providedEnrollmentCode
try {
$configJson = $config | ConvertTo-Json -Depth 10
[System.IO.File]::WriteAllText($configPath, $configJson, $utf8NoBom)
if ($providedInstallerCode) {
Write-Host "Installer code saved to agent_settings.json." -ForegroundColor Green
if ($providedEnrollmentCode) {
Write-Host "Enrollment code saved to agent_settings.json." -ForegroundColor Green
} else {
Write-Host "Installer code cleared in agent_settings.json." -ForegroundColor Yellow
Write-Host "Enrollment code cleared in agent_settings.json." -ForegroundColor Yellow
}
} catch {
Write-AgentLog -FileName 'Install.log' -Message ("[CONFIG] Failed to persist agent_settings.json: {0}" -f $_.Exception.Message)

View File

@@ -5,7 +5,7 @@
# - Bundles portable NodeJS into Dependencies/NodeJS to keep a known-good version (no root required)
# - Mirrors Windows flow: create Engine venv, stage Data/Engine, stage web-interface, Vite dev or prod build, Flask launch
# - Supports flags: --server/--agent (agent kept for compatibility), --vite/--flask, --quick, --engine-tests,
# --EngineProduction, --EngineDev (auto-select server mode), plus interactive menu
# --EngineProduction, --EngineDev (auto-select server mode), --enrollmentcode, plus interactive menu
# NOTE: This script focuses on ENGINE parity. Agent paths remain but are not the goal here.
set -o errexit
@@ -33,19 +33,20 @@ QUICK_FLAG=0
ENGINE_TESTS_FLAG=0
ENGINE_PROD_FLAG=0
ENGINE_DEV_FLAG=0
INSTALLER_CODE=""
ENROLLMENT_CODE=""
while (( "$#" )); do
case "$1" in
-Server|--server) SERVER_FLAG=1 ;;
-Agent|--agent) AGENT_FLAG=1 ;;
-Agent|--agent|--Agent) AGENT_FLAG=1 ;;
-Vite|--vite) VITE_FLAG=1 ;;
-Flask|--flask) FLASK_FLAG=1 ;;
-Quick|--quick) QUICK_FLAG=1 ;;
-EngineTests|--engine-tests) ENGINE_TESTS_FLAG=1 ;;
-EngineProduction|--engine-production) ENGINE_PROD_FLAG=1 ;;
-EngineDev|--engine-dev) ENGINE_DEV_FLAG=1 ;;
-InstallerCode|--installer-code) shift; INSTALLER_CODE="${1:-}" ;;
# Enrollment: prefer lowercase --enrollmentcode, keep old alias for compatibility
-EnrollmentCode|--EnrollmentCode|--enrollmentcode|--enrollment-code) shift; ENROLLMENT_CODE="${1:-}" ;;
*) ;; # ignore unknown for flexibility
esac
shift || true
@@ -86,6 +87,126 @@ write_vite_log() {
printf "%s-%s-%s\n" "$(date +%FT%T)" "$svc" "$msg" >> "${logdir}/vite.log"
}
# ---- Agent (settings-only parity) ----
configure_agent_settings() {
echo -e "${GREEN}Configuring Borealis Agent settings...${RESET}"
local settings_dir="${SCRIPT_DIR}/Agent/Borealis/Settings"
local legacy_settings_dir="${SCRIPT_DIR}/Agent/Settings"
local server_url_path="${settings_dir}/server_url.txt"
local config_path="${settings_dir}/agent_settings.json"
mkdir -p "${settings_dir}"
if [[ ! -f "${server_url_path}" && -f "${legacy_settings_dir}/server_url.txt" ]]; then
cp -f "${legacy_settings_dir}/server_url.txt" "${server_url_path}" 2>/dev/null || true
fi
local default_url="https://localhost:5000"
local current_url="${default_url}"
if [[ -n "${BOREALIS_SERVER_URL:-}" ]]; then
current_url="${BOREALIS_SERVER_URL}"
elif [[ -f "${server_url_path}" ]]; then
current_url="$(head -n 1 "${server_url_path}" || echo "${default_url}")"
fi
if [[ -t 0 ]]; then
read -r -p "Server URL [${current_url}]: " input_url
else
input_url=""
fi
input_url="${input_url:-${current_url}}"
input_url="$(echo -n "${input_url}" | tr -d '\r' | sed 's/^[[:space:]]*//;s/[[:space:]]*$//')"
if [[ -z "${input_url}" ]]; then input_url="${default_url}"; fi
printf "%s" "${input_url}" > "${server_url_path}"
local provided_code="${ENROLLMENT_CODE:-}"
if [[ -z "${provided_code}" && -n "${BOREALIS_ENROLLMENT_CODE:-}" ]]; then
provided_code="${BOREALIS_ENROLLMENT_CODE}"
fi
if [[ -z "${provided_code}" ]]; then
local existing_code=""
if [[ -f "${config_path}" ]]; then
if command -v python3 >/dev/null 2>&1 || command -v python >/dev/null 2>&1; then
existing_code="$(CONFIG_PATH="${config_path}" python3 - <<'PY' 2>/dev/null || CONFIG_PATH="${config_path}" python - <<'PY' 2>/dev/null || true
import json, os
path = os.environ.get("CONFIG_PATH")
try:
with open(path, "r", encoding="utf-8") as fh:
data = json.load(fh)
if isinstance(data, dict):
print(data.get("enrollment_code") or data.get("installer_code") or "")
except Exception:
pass
PY
)"
fi
fi
existing_code="${existing_code:-}"
if [[ -t 0 ]]; then
read -r -p "Enrollment Code [${existing_code}]: " input_code
else
input_code=""
fi
if [[ -n "${input_code// }" ]]; then
provided_code="${input_code}"
elif [[ -n "${existing_code}" ]]; then
provided_code="${existing_code}"
else
provided_code=""
fi
fi
local py_bin
py_bin="$(command -v python3 || command -v python || true)"
if [[ -n "${py_bin}" ]]; then
CONFIG_PATH="${config_path}" ENROLLMENT_CODE_VALUE="${provided_code}" "${py_bin}" - <<'PY'
import json, os
path = os.environ["CONFIG_PATH"]
code = os.environ.get("ENROLLMENT_CODE_VALUE", "")
defaults = {
"config_file_watcher_interval": 2,
"agent_id": "",
"regions": {},
"enrollment_code": "",
"installer_code": "",
}
data = defaults.copy()
if os.path.exists(path):
try:
with open(path, "r", encoding="utf-8") as fh:
existing = json.load(fh)
if isinstance(existing, dict):
data.update(existing)
except Exception:
pass
data["enrollment_code"] = code
data["installer_code"] = code
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, "w", encoding="utf-8") as fh:
json.dump(data, fh)
PY
else
cat > "${config_path}" <<EOF
{
"config_file_watcher_interval": 2,
"agent_id": "",
"regions": {},
"enrollment_code": "${provided_code}",
"installer_code": "${provided_code}"
}
EOF
fi
if [[ -n "${provided_code}" ]]; then
echo -e "${GREEN}Enrollment code saved to agent_settings.json.${RESET}"
else
echo -e "${YELLOW}Enrollment code cleared in agent_settings.json.${RESET}"
fi
echo -e "${INFO} Agent runtime remains Windows-first; Linux flow configures settings only."
}
# ---- Dependency Installation (Linux) ----
install_shared_dependencies() {
detect_distro
@@ -382,7 +503,7 @@ main_menu() {
read -r -p "Enter a number: " choice
case "$choice" in
1) server_menu ;;
2) echo -e "${YELLOW}Agent management is out-of-scope for this parity pass. Use Windows for full Agent features.${RESET}" ;;
2) configure_agent_settings ;;
3) exit 0 ;;
*) echo -e "${RED}Invalid selection. Exiting...${RESET}"; exit 1 ;;
esac
@@ -393,6 +514,11 @@ if [[ $SERVER_FLAG -eq 1 && $AGENT_FLAG -eq 1 ]]; then
echo -e "${RED}Cannot use --server and --agent together.${RESET}"; exit 1
fi
if [[ $AGENT_FLAG -eq 1 ]]; then
configure_agent_settings
exit $?
fi
# Auto-select main menu option for Server when EngineProduction/EngineDev provided
if [[ $ENGINE_PROD_FLAG -eq 1 || $ENGINE_DEV_FLAG -eq 1 ]]; then
SERVER_FLAG=1

View File

@@ -17,11 +17,83 @@ try {
if (-not (Test-Path $logsAgent)) { New-Item -ItemType Directory -Path $logsAgent -Force | Out-Null }
$wrapperLog = Join-Path $logsAgent 'service_wrapper.log'
function Write-WrapperLog {
param([string]$Message)
if (-not $Message) { return }
try {
"[{0}] {1}" -f (Get-Date -Format s), $Message | Out-File -FilePath $wrapperLog -Append -Encoding utf8
} catch {}
}
$venvBin = Join-Path $scriptDir '..\Scripts'
$pyw = Join-Path $venvBin 'pythonw.exe'
$py = Join-Path $venvBin 'python.exe'
$agentPy = Join-Path $scriptDir 'agent.py'
$pyvenvCfg = Join-Path $agentRoot 'pyvenv.cfg'
$expectedPy = Join-Path $projRoot 'Dependencies\Python\python.exe'
if (-not (Test-Path $expectedPy -PathType Leaf) -and (Test-Path $py -PathType Leaf)) {
$expectedPy = $py
}
$expectedPyNorm = $null
$expectedHomeNorm = $null
try {
if (Test-Path $expectedPy -PathType Leaf) {
$expectedPy = (Resolve-Path $expectedPy -ErrorAction Stop).ProviderPath
}
} catch {}
if ($expectedPy) {
$expectedPyNorm = $expectedPy.ToLowerInvariant()
try { $expectedHome = Split-Path -Path $expectedPy -Parent } catch { $expectedHome = $null }
if ($expectedHome) { $expectedHomeNorm = $expectedHome.ToLowerInvariant() }
}
$venvNeedsRepair = $false
if (Test-Path $pyvenvCfg -PathType Leaf) {
try {
$cfgLines = Get-Content -Path $pyvenvCfg -ErrorAction Stop
$cfgMap = @{}
foreach ($line in $cfgLines) {
$trimmed = $line.Trim()
if (-not $trimmed -or $trimmed.StartsWith('#')) { continue }
$parts = $trimmed -split '=', 2
if ($parts.Count -ne 2) { continue }
$cfgMap[$parts[0].Trim().ToLowerInvariant()] = $parts[1].Trim()
}
$cfgExecutable = $cfgMap['executable']
$cfgHome = $cfgMap['home']
if ($cfgExecutable -and -not (Test-Path $cfgExecutable -PathType Leaf)) {
$venvNeedsRepair = $true
} elseif ($cfgHome -and -not (Test-Path $cfgHome -PathType Container)) {
$venvNeedsRepair = $true
} else {
if ($cfgExecutable -and $expectedPyNorm) {
try { $resolvedExe = (Resolve-Path $cfgExecutable -ErrorAction Stop).ProviderPath } catch { $resolvedExe = $cfgExecutable }
$resolvedExeNorm = if ($resolvedExe) { $resolvedExe.ToLowerInvariant() } else { $null }
if ($resolvedExeNorm -and $resolvedExeNorm -ne $expectedPyNorm) { $venvNeedsRepair = $true }
}
if (-not $venvNeedsRepair -and $cfgHome -and $expectedHomeNorm) {
try { $resolvedHome = (Resolve-Path $cfgHome -ErrorAction Stop).ProviderPath } catch { $resolvedHome = $cfgHome }
$resolvedHomeNorm = if ($resolvedHome) { $resolvedHome.ToLowerInvariant() } else { $null }
if ($resolvedHomeNorm -and $resolvedHomeNorm -ne $expectedHomeNorm) { $venvNeedsRepair = $true }
}
}
} catch { $venvNeedsRepair = $true }
}
if ($venvNeedsRepair -and (Test-Path $expectedPy -PathType Leaf)) {
Write-WrapperLog ("Detected relocated Agent virtual environment. Rebuilding interpreter bindings with {0}" -f $expectedPy)
try {
& $expectedPy -m venv --upgrade $agentRoot | Out-Null
Write-WrapperLog "Agent virtual environment successfully rebound to current project path."
} catch {
Write-WrapperLog ("Agent virtual environment repair failed: {0}" -f $_.Exception.Message)
}
}
if (-not (Test-Path $pyw) -and -not (Test-Path $py)) { throw "Python not found under: $venvBin" }
if (-not (Test-Path $agentPy)) { throw "Agent script not found: $agentPy" }

View File

@@ -94,7 +94,8 @@ CREATE TABLE IF NOT EXISTS enrollment_install_codes (
used_by_guid TEXT,
max_uses INTEGER,
use_count INTEGER,
last_used_at TEXT
last_used_at TEXT,
site_id INTEGER
);
CREATE TABLE IF NOT EXISTS enrollment_install_codes_persistent (
id TEXT PRIMARY KEY,
@@ -109,7 +110,8 @@ CREATE TABLE IF NOT EXISTS enrollment_install_codes_persistent (
last_used_at TEXT,
is_active INTEGER NOT NULL DEFAULT 1,
archived_at TEXT,
consumed_at TEXT
consumed_at TEXT,
site_id INTEGER
);
CREATE TABLE IF NOT EXISTS device_approvals (
id TEXT PRIMARY KEY,
@@ -118,6 +120,7 @@ CREATE TABLE IF NOT EXISTS device_approvals (
hostname_claimed TEXT,
ssl_key_fingerprint_claimed TEXT,
enrollment_code_id TEXT,
site_id INTEGER,
status TEXT,
client_nonce TEXT,
server_nonce TEXT,
@@ -145,7 +148,8 @@ CREATE TABLE IF NOT EXISTS sites (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT,
description TEXT,
created_at INTEGER
created_at INTEGER,
enrollment_code_id TEXT
);
CREATE TABLE IF NOT EXISTS device_sites (
device_hostname TEXT PRIMARY KEY,
@@ -270,9 +274,51 @@ def engine_harness(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> Iterator[
"2025-10-01T00:00:00Z",
),
)
site_code_id = "SITE-CODE-0001"
site_code_value = "SITE-MAIN-CODE"
site_code_created = "2025-01-01T00:00:00Z"
site_code_expires = "2030-01-01T00:00:00Z"
cur.execute(
"INSERT INTO sites (id, name, description, created_at) VALUES (?, ?, ?, ?)",
(1, "Main Lab", "Primary integration site", 1_700_000_000),
"""
INSERT INTO enrollment_install_codes (
id,
code,
expires_at,
created_by_user_id,
used_at,
used_by_guid,
max_uses,
use_count,
last_used_at,
site_id
) VALUES (?, ?, ?, ?, NULL, NULL, 0, 0, NULL, ?)
""",
(site_code_id, site_code_value, site_code_expires, "admin", 1),
)
cur.execute(
"""
INSERT INTO enrollment_install_codes_persistent (
id,
code,
created_at,
expires_at,
created_by_user_id,
used_at,
used_by_guid,
max_uses,
last_known_use_count,
last_used_at,
is_active,
archived_at,
consumed_at,
site_id
) VALUES (?, ?, ?, ?, ?, NULL, NULL, 0, 0, NULL, 1, NULL, NULL, ?)
""",
(site_code_id, site_code_value, site_code_created, site_code_expires, "admin", 1),
)
cur.execute(
"INSERT INTO sites (id, name, description, created_at, enrollment_code_id) VALUES (?, ?, ?, ?, ?)",
(1, "Main Lab", "Primary integration site", 1_700_000_000, site_code_id),
)
cur.execute(
"INSERT INTO device_sites (device_hostname, site_id, assigned_at) VALUES (?, ?, ?)",
@@ -294,6 +340,7 @@ def engine_harness(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> Iterator[
hostname_claimed,
ssl_key_fingerprint_claimed,
enrollment_code_id,
site_id,
status,
client_nonce,
server_nonce,
@@ -302,7 +349,7 @@ def engine_harness(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> Iterator[
updated_at,
approved_by_user_id
)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
(
"approval-1",
@@ -310,7 +357,8 @@ def engine_harness(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> Iterator[
None,
"pending-device",
"aa:bb:cc:dd",
None,
site_code_id,
1,
"pending",
"client-nonce",
"server-nonce",

View File

@@ -10,6 +10,7 @@ from __future__ import annotations
from typing import Any
import pytest
from Data.Engine.auth import jwt_service as jwt_service_module
from Data.Engine.integrations import github as github_integration
from Data.Engine.services.API.devices import management as device_management
@@ -24,8 +25,39 @@ def _client_with_admin_session(harness: EngineTestHarness):
return client
def _device_headers() -> dict:
jwt_service = jwt_service_module.load_service()
token = jwt_service.issue_access_token(
"GUID-TEST-0001",
"ff:ff:ff",
1,
expires_in=900,
)
return {"Authorization": f"Bearer {token}"}
def _patch_repo_call(monkeypatch: pytest.MonkeyPatch, calls: dict) -> None:
class DummyResponse:
def __init__(self, status_code: int, payload: Any):
self.status_code = status_code
self._payload = payload
def json(self) -> Any:
return self._payload
request_exception = getattr(github_integration.requests, "RequestException", RuntimeError)
def fake_get(url: str, headers: Any, timeout: int) -> DummyResponse:
calls["count"] += 1
if calls["count"] == 1:
return DummyResponse(200, {"commit": {"sha": "abc123"}})
raise request_exception("network error")
monkeypatch.setattr(github_integration.requests, "get", fake_get)
def test_list_devices(engine_harness: EngineTestHarness) -> None:
client = engine_harness.app.test_client()
client = _client_with_admin_session(engine_harness)
response = client.get("/api/devices")
assert response.status_code == 200
payload = response.get_json()
@@ -38,7 +70,7 @@ def test_list_devices(engine_harness: EngineTestHarness) -> None:
def test_list_agents(engine_harness: EngineTestHarness) -> None:
client = engine_harness.app.test_client()
client = _client_with_admin_session(engine_harness)
response = client.get("/api/agents")
assert response.status_code == 200
payload = response.get_json()
@@ -50,7 +82,7 @@ def test_list_agents(engine_harness: EngineTestHarness) -> None:
def test_device_details(engine_harness: EngineTestHarness) -> None:
client = engine_harness.app.test_client()
client = _client_with_admin_session(engine_harness)
response = client.get("/api/device/details/test-device")
assert response.status_code == 200
payload = response.get_json()
@@ -103,25 +135,9 @@ def test_device_list_views_lifecycle(engine_harness: EngineTestHarness) -> None:
def test_repo_current_hash_uses_cache(engine_harness: EngineTestHarness, monkeypatch: pytest.MonkeyPatch) -> None:
calls = {"count": 0}
class DummyResponse:
def __init__(self, status_code: int, payload: Any):
self.status_code = status_code
self._payload = payload
_patch_repo_call(monkeypatch, calls)
def json(self) -> Any:
return self._payload
request_exception = getattr(github_integration.requests, "RequestException", RuntimeError)
def fake_get(url: str, headers: Any, timeout: int) -> DummyResponse:
calls["count"] += 1
if calls["count"] == 1:
return DummyResponse(200, {"commit": {"sha": "abc123"}})
raise request_exception("network error")
monkeypatch.setattr(github_integration.requests, "get", fake_get)
client = engine_harness.app.test_client()
client = _client_with_admin_session(engine_harness)
first = client.get("/api/repo/current_hash?repo=test/test&branch=main")
assert first.status_code == 200
assert first.get_json()["sha"] == "abc123"
@@ -133,8 +149,23 @@ def test_repo_current_hash_uses_cache(engine_harness: EngineTestHarness, monkeyp
assert calls["count"] == 1
def test_agent_hash_list_permissions(engine_harness: EngineTestHarness) -> None:
def test_repo_current_hash_allows_device_token(engine_harness: EngineTestHarness, monkeypatch: pytest.MonkeyPatch) -> None:
calls = {"count": 0}
_patch_repo_call(monkeypatch, calls)
client = engine_harness.app.test_client()
response = client.get(
"/api/repo/current_hash?repo=test/test&branch=main",
headers=_device_headers(),
)
assert response.status_code == 200
payload = response.get_json()
assert payload["sha"] == "abc123"
assert calls["count"] == 1
def test_agent_hash_list_permissions(engine_harness: EngineTestHarness) -> None:
client = _client_with_admin_session(engine_harness)
forbidden = client.get("/api/agent/hash_list", environ_base={"REMOTE_ADDR": "192.0.2.10"})
assert forbidden.status_code == 403
allowed = client.get("/api/agent/hash_list", environ_base={"REMOTE_ADDR": "127.0.0.1"})
@@ -177,21 +208,20 @@ def test_sites_lifecycle(engine_harness: EngineTestHarness) -> None:
assert delete_resp.status_code == 200
def test_admin_enrollment_code_flow(engine_harness: EngineTestHarness) -> None:
def test_site_enrollment_code_rotation(engine_harness: EngineTestHarness) -> None:
client = _client_with_admin_session(engine_harness)
create_resp = client.post(
"/api/admin/enrollment-codes",
json={"ttl_hours": 1, "max_uses": 2},
)
assert create_resp.status_code == 201
code_id = create_resp.get_json()["id"]
sites_resp = client.get("/api/sites")
assert sites_resp.status_code == 200
sites = sites_resp.get_json()["sites"]
assert sites and sites[0]["enrollment_code"]
site_id = sites[0]["id"]
original_code = sites[0]["enrollment_code"]
list_resp = client.get("/api/admin/enrollment-codes")
codes = list_resp.get_json()["codes"]
assert any(code["id"] == code_id for code in codes)
delete_resp = client.delete(f"/api/admin/enrollment-codes/{code_id}")
assert delete_resp.status_code == 200
rotate_resp = client.post("/api/sites/rotate_code", json={"site_id": site_id})
assert rotate_resp.status_code == 200
rotated = rotate_resp.get_json()
assert rotated["id"] == site_id
assert rotated["enrollment_code"] and rotated["enrollment_code"] != original_code
def test_admin_device_approvals(engine_harness: EngineTestHarness) -> None:

View File

@@ -31,19 +31,29 @@ def _iso(dt: datetime) -> str:
return dt.astimezone(timezone.utc).isoformat()
def _seed_install_code(db_path: os.PathLike[str], code: str) -> str:
def _seed_install_code(db_path: os.PathLike[str], code: str, site_id: int = 1) -> str:
record_id = str(uuid.uuid4())
baseline = _now()
issued_at = _iso(baseline)
expires_at = _iso(baseline + timedelta(days=1))
with sqlite3.connect(str(db_path)) as conn:
columns = {row[1] for row in conn.execute("PRAGMA table_info(sites)")}
if "enrollment_code_id" not in columns:
conn.execute("ALTER TABLE sites ADD COLUMN enrollment_code_id TEXT")
conn.execute(
"""
INSERT OR IGNORE INTO sites (id, name, description, created_at, enrollment_code_id)
VALUES (?, ?, ?, ?, ?)
""",
(site_id, f"Test Site {site_id}", "Seeded site", int(baseline.timestamp()), record_id),
)
conn.execute(
"""
INSERT INTO enrollment_install_codes (
id, code, expires_at, used_at, used_by_guid, max_uses, use_count, last_used_at
) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
id, code, expires_at, used_at, used_by_guid, max_uses, use_count, last_used_at, site_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
(record_id, code, expires_at, None, None, 1, 0, None),
(record_id, code, expires_at, None, None, 1, 0, None, site_id),
)
conn.execute(
"""
@@ -60,8 +70,9 @@ def _seed_install_code(db_path: os.PathLike[str], code: str) -> str:
last_used_at,
is_active,
archived_at,
consumed_at
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
consumed_at,
site_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
(
record_id,
@@ -77,8 +88,17 @@ def _seed_install_code(db_path: os.PathLike[str], code: str) -> str:
1,
None,
None,
site_id,
),
)
conn.execute(
"""
UPDATE sites
SET enrollment_code_id = ?
WHERE id = ?
""",
(record_id, site_id),
)
conn.commit()
return record_id
@@ -124,7 +144,7 @@ def test_enrollment_request_creates_pending_approval(engine_harness: EngineTestH
cur = conn.cursor()
cur.execute(
"""
SELECT hostname_claimed, ssl_key_fingerprint_claimed, client_nonce, status, enrollment_code_id
SELECT hostname_claimed, ssl_key_fingerprint_claimed, client_nonce, status, enrollment_code_id, site_id
FROM device_approvals
WHERE approval_reference = ?
""",
@@ -133,11 +153,12 @@ def test_enrollment_request_creates_pending_approval(engine_harness: EngineTestH
row = cur.fetchone()
assert row is not None
hostname_claimed, fingerprint, stored_client_nonce, status, stored_code_id = row
hostname_claimed, fingerprint, stored_client_nonce, status, stored_code_id, stored_site_id = row
assert hostname_claimed == "agent-node-01"
assert stored_client_nonce == client_nonce_b64
assert status == "pending"
assert stored_code_id == install_code_id
assert stored_site_id == 1
expected_fingerprint = crypto_keys.fingerprint_from_spki_der(public_der)
assert fingerprint == expected_fingerprint
@@ -206,7 +227,7 @@ def test_enrollment_poll_finalizes_when_approved(engine_harness: EngineTestHarne
with sqlite3.connect(str(harness.db_path)) as conn:
cur = conn.cursor()
cur.execute(
"SELECT guid, status FROM device_approvals WHERE approval_reference = ?",
"SELECT guid, status, site_id FROM device_approvals WHERE approval_reference = ?",
(approval_reference,),
)
approval_row = cur.fetchone()
@@ -215,6 +236,11 @@ def test_enrollment_poll_finalizes_when_approved(engine_harness: EngineTestHarne
(final_guid,),
)
device_row = cur.fetchone()
cur.execute(
"SELECT site_id FROM device_sites WHERE device_hostname = ?",
(device_row[0] if device_row else None,),
)
site_row = cur.fetchone()
cur.execute(
"SELECT COUNT(*) FROM refresh_tokens WHERE guid = ?",
(final_guid,),
@@ -241,15 +267,18 @@ def test_enrollment_poll_finalizes_when_approved(engine_harness: EngineTestHarne
persistent_row = cur.fetchone()
assert approval_row is not None
approval_guid, approval_status = approval_row
approval_guid, approval_status, approval_site_id = approval_row
assert approval_status == "completed"
assert approval_guid == final_guid
assert approval_site_id == 1
assert device_row is not None
hostname, fingerprint, token_version = device_row
assert hostname == "agent-node-02"
assert fingerprint == crypto_keys.fingerprint_from_spki_der(public_der)
assert token_version >= 1
assert site_row is not None
assert site_row[0] == 1
assert refresh_count == 1
assert install_row is not None

View File

@@ -10,8 +10,11 @@
from __future__ import annotations
import logging
import secrets
import sqlite3
import time
import uuid
from datetime import datetime, timedelta, timezone
from pathlib import Path
from typing import Optional, Sequence
@@ -24,6 +27,15 @@ _DEFAULT_ADMIN_HASH = (
)
def _iso(dt: datetime) -> str:
return dt.astimezone(timezone.utc).isoformat()
def _generate_install_code() -> str:
raw = secrets.token_hex(16).upper()
return "-".join(raw[i : i + 4] for i in range(0, len(raw), 4))
def initialise_engine_database(database_path: str, *, logger: Optional[logging.Logger] = None) -> None:
"""Ensure the Engine database has the required schema and default admin account."""
@@ -46,6 +58,7 @@ def initialise_engine_database(database_path: str, *, logger: Optional[logging.L
_ensure_activity_history(conn, logger=logger)
_ensure_device_list_views(conn, logger=logger)
_ensure_sites(conn, logger=logger)
_ensure_site_enrollment_codes(conn, logger=logger)
_ensure_users_table(conn, logger=logger)
_ensure_default_admin(conn, logger=logger)
_ensure_ansible_recaps(conn, logger=logger)
@@ -92,7 +105,8 @@ def _restore_persisted_enrollment_codes(conn: sqlite3.Connection, *, logger: Opt
used_by_guid,
max_uses,
use_count,
last_used_at
last_used_at,
site_id
)
SELECT
p.id,
@@ -103,7 +117,8 @@ def _restore_persisted_enrollment_codes(conn: sqlite3.Connection, *, logger: Opt
p.used_by_guid,
p.max_uses,
p.last_known_use_count,
p.last_used_at
p.last_used_at,
p.site_id
FROM enrollment_install_codes_persistent AS p
WHERE p.is_active = 1
ON CONFLICT(id) DO UPDATE
@@ -114,7 +129,8 @@ def _restore_persisted_enrollment_codes(conn: sqlite3.Connection, *, logger: Opt
used_by_guid = excluded.used_by_guid,
max_uses = excluded.max_uses,
use_count = excluded.use_count,
last_used_at = excluded.last_used_at
last_used_at = excluded.last_used_at,
site_id = excluded.site_id
"""
)
conn.commit()
@@ -185,7 +201,8 @@ def _ensure_sites(conn: sqlite3.Connection, *, logger: Optional[logging.Logger])
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT UNIQUE NOT NULL,
description TEXT,
created_at INTEGER
created_at INTEGER,
enrollment_code_id TEXT
)
"""
)
@@ -199,6 +216,10 @@ def _ensure_sites(conn: sqlite3.Connection, *, logger: Optional[logging.Logger])
)
"""
)
cur.execute("PRAGMA table_info(sites)")
columns = {row[1] for row in cur.fetchall()}
if "enrollment_code_id" not in columns:
cur.execute("ALTER TABLE sites ADD COLUMN enrollment_code_id TEXT")
except Exception as exc:
if logger:
logger.error("Failed to ensure site tables: %s", exc, exc_info=True)
@@ -208,6 +229,147 @@ def _ensure_sites(conn: sqlite3.Connection, *, logger: Optional[logging.Logger])
cur.close()
def _ensure_site_enrollment_codes(conn: sqlite3.Connection, *, logger: Optional[logging.Logger]) -> None:
cur = conn.cursor()
try:
cur.execute("SELECT id, enrollment_code_id FROM sites")
sites = cur.fetchall()
if not sites:
return
now = datetime.now(tz=timezone.utc)
long_expiry = _iso(now + timedelta(days=3650))
for site_id, current_code_id in sites:
active_code_id: Optional[str] = None
if current_code_id:
cur.execute(
"SELECT id, site_id FROM enrollment_install_codes WHERE id = ?",
(current_code_id,),
)
existing = cur.fetchone()
if existing:
active_code_id = current_code_id
if existing[1] is None:
cur.execute(
"UPDATE enrollment_install_codes SET site_id = ? WHERE id = ?",
(site_id, current_code_id),
)
cur.execute(
"UPDATE enrollment_install_codes_persistent SET site_id = COALESCE(site_id, ?) WHERE id = ?",
(site_id, current_code_id),
)
if not active_code_id:
cur.execute(
"""
SELECT id, code, created_at, expires_at, max_uses, last_known_use_count, last_used_at, site_id
FROM enrollment_install_codes_persistent
WHERE site_id = ? AND is_active = 1
ORDER BY datetime(created_at) DESC
LIMIT 1
""",
(site_id,),
)
row = cur.fetchone()
if row:
active_code_id = row[0]
if row[7] is None:
cur.execute(
"UPDATE enrollment_install_codes_persistent SET site_id = ? WHERE id = ?",
(site_id, active_code_id),
)
cur.execute(
"""
INSERT OR REPLACE INTO enrollment_install_codes (
id,
code,
expires_at,
created_by_user_id,
used_at,
used_by_guid,
max_uses,
use_count,
last_used_at,
site_id
)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
(
row[0],
row[1],
row[3] or long_expiry,
"system",
None,
None,
row[4] or 0,
row[5] or 0,
row[6],
site_id,
),
)
if not active_code_id:
new_id = str(uuid.uuid4())
code_value = _generate_install_code()
issued_at = _iso(now)
cur.execute(
"""
INSERT OR REPLACE INTO enrollment_install_codes (
id,
code,
expires_at,
created_by_user_id,
used_at,
used_by_guid,
max_uses,
use_count,
last_used_at,
site_id
)
VALUES (?, ?, ?, 'system', NULL, NULL, 0, 0, NULL, ?)
""",
(new_id, code_value, long_expiry, site_id),
)
cur.execute(
"""
INSERT OR REPLACE INTO enrollment_install_codes_persistent (
id,
code,
created_at,
expires_at,
created_by_user_id,
used_at,
used_by_guid,
max_uses,
last_known_use_count,
last_used_at,
is_active,
archived_at,
consumed_at,
site_id
)
VALUES (?, ?, ?, ?, 'system', NULL, NULL, 0, 0, NULL, 1, NULL, NULL, ?)
""",
(new_id, code_value, issued_at, long_expiry, site_id),
)
active_code_id = new_id
if active_code_id and active_code_id != current_code_id:
cur.execute(
"UPDATE sites SET enrollment_code_id = ? WHERE id = ?",
(active_code_id, site_id),
)
conn.commit()
except Exception as exc:
conn.rollback()
if logger:
logger.error("Failed to ensure site enrollment codes: %s", exc, exc_info=True)
else:
raise
finally:
cur.close()
def _ensure_users_table(conn: sqlite3.Connection, *, logger: Optional[logging.Logger]) -> None:
cur = conn.cursor()
try:

View File

@@ -155,7 +155,8 @@ def _ensure_install_code_table(conn: sqlite3.Connection) -> None:
used_by_guid TEXT,
max_uses INTEGER NOT NULL DEFAULT 1,
use_count INTEGER NOT NULL DEFAULT 0,
last_used_at TEXT
last_used_at TEXT,
site_id INTEGER
)
"""
)
@@ -188,6 +189,13 @@ def _ensure_install_code_table(conn: sqlite3.Connection) -> None:
ADD COLUMN last_used_at TEXT
"""
)
if "site_id" not in columns:
cur.execute(
"""
ALTER TABLE enrollment_install_codes
ADD COLUMN site_id INTEGER
"""
)
def _ensure_install_code_persistence_table(conn: sqlite3.Connection) -> None:
@@ -207,7 +215,8 @@ def _ensure_install_code_persistence_table(conn: sqlite3.Connection) -> None:
last_used_at TEXT,
is_active INTEGER NOT NULL DEFAULT 1,
archived_at TEXT,
consumed_at TEXT
consumed_at TEXT,
site_id INTEGER
)
"""
)
@@ -274,6 +283,13 @@ def _ensure_install_code_persistence_table(conn: sqlite3.Connection) -> None:
ADD COLUMN last_used_at TEXT
"""
)
if "site_id" not in columns:
cur.execute(
"""
ALTER TABLE enrollment_install_codes_persistent
ADD COLUMN site_id INTEGER
"""
)
def _ensure_device_approval_table(conn: sqlite3.Connection) -> None:
@@ -287,6 +303,7 @@ def _ensure_device_approval_table(conn: sqlite3.Connection) -> None:
hostname_claimed TEXT NOT NULL,
ssl_key_fingerprint_claimed TEXT NOT NULL,
enrollment_code_id TEXT NOT NULL,
site_id INTEGER,
status TEXT NOT NULL,
client_nonce TEXT NOT NULL,
server_nonce TEXT NOT NULL,
@@ -297,6 +314,16 @@ def _ensure_device_approval_table(conn: sqlite3.Connection) -> None:
)
"""
)
cur.execute("PRAGMA table_info(device_approvals)")
columns = {row[1] for row in cur.fetchall()}
if "site_id" not in columns:
cur.execute(
"""
ALTER TABLE device_approvals
ADD COLUMN site_id INTEGER
"""
)
cur.execute(
"""
CREATE INDEX IF NOT EXISTS idx_da_status
@@ -309,6 +336,12 @@ def _ensure_device_approval_table(conn: sqlite3.Connection) -> None:
ON device_approvals(ssl_key_fingerprint_claimed, status)
"""
)
cur.execute(
"""
CREATE INDEX IF NOT EXISTS idx_da_site
ON device_approvals(site_id)
"""
)
def _create_devices_table(cur: sqlite3.Cursor) -> None:

View File

@@ -28,6 +28,7 @@ if TYPE_CHECKING: # pragma: no cover - typing aide
from .. import EngineServiceAdapters
from ...assemblies.service import AssemblyRuntimeService
from ...auth import RequestAuthContext
def _assemblies_root() -> Path:
@@ -404,9 +405,18 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
if assembly_cache is None:
raise RuntimeError("Assembly cache is not initialised; ensure Engine bootstrap executed.")
assembly_runtime = AssemblyRuntimeService(assembly_cache, logger=adapters.context.logger)
auth = RequestAuthContext(
app=app,
dev_mode_manager=adapters.dev_mode_manager,
config=adapters.config,
logger=adapters.context.logger,
)
@blueprint.route("/api/scripts/quick_run", methods=["POST"])
def scripts_quick_run():
user, error = auth.require_user()
if error:
return jsonify(error[0]), error[1]
data = request.get_json(silent=True) or {}
rel_path_input = data.get("script_path")
rel_path_normalized = _normalize_script_relpath(rel_path_input)
@@ -419,6 +429,7 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
return jsonify({"error": "Missing script_path or hostnames[]"}), 400
rel_path_canonical = rel_path_normalized
username = (user.get("username") if isinstance(user, dict) else None) or "unknown"
assembly_source = "runtime"
assembly_guid: Optional[str] = None
@@ -455,7 +466,7 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
except Exception as exc: # pragma: no cover - defensive guard
service_log(
"assemblies",
f"quick job failed to resolve script path={rel_path_input!r}: {exc}",
f"quick job failed to resolve script path={rel_path_input!r} user={username}: {exc}",
level="ERROR",
)
return jsonify({"error": "Failed to resolve script path"}), 500
@@ -470,7 +481,7 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
if not within_scripts or not os.path.isfile(abs_path_str):
service_log(
"assemblies",
f"quick job requested missing or out-of-scope script input={rel_path_input!r} normalized={rel_path_canonical}",
f"quick job requested missing or out-of-scope script input={rel_path_input!r} normalized={rel_path_canonical} user={username}",
level="WARNING",
)
return jsonify({"error": "Script not found"}), 404
@@ -597,7 +608,7 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
results.append({"hostname": host, "job_id": job_id, "status": "Running"})
service_log(
"assemblies",
f"quick job queued hostname={host} path={rel_path_canonical} run_mode={run_mode} source={assembly_source}",
f"quick job queued hostname={host} path={rel_path_canonical} run_mode={run_mode} source={assembly_source} requested_by={username}",
)
except Exception as exc:
if conn is not None:
@@ -611,10 +622,16 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
@blueprint.route("/api/ansible/quick_run", methods=["POST"])
def ansible_quick_run():
_, error = auth.require_user()
if error:
return jsonify(error[0]), error[1]
return jsonify({"error": "Ansible quick run is not yet available in the Engine runtime."}), 501
@blueprint.route("/api/device/activity/<hostname>", methods=["GET", "DELETE"])
def device_activity(hostname: str):
_, error = auth.require_user()
if error:
return jsonify(error[0]), error[1]
conn = None
try:
conn = adapters.db_conn_factory()
@@ -657,6 +674,9 @@ def register_execution(app: "Flask", adapters: "EngineServiceAdapters") -> None:
@blueprint.route("/api/device/activity/job/<int:job_id>", methods=["GET"])
def device_activity_job(job_id: int):
_, error = auth.require_user()
if error:
return jsonify(error[0]), error[1]
conn = None
try:
conn = adapters.db_conn_factory()

View File

@@ -357,19 +357,23 @@ class AdminDeviceService:
da.hostname_claimed,
da.ssl_key_fingerprint_claimed,
da.enrollment_code_id,
da.site_id,
da.status,
da.client_nonce,
da.server_nonce,
da.created_at,
da.updated_at,
da.approved_by_user_id,
u.username AS approved_by_username
u.username AS approved_by_username,
s.name AS site_name
FROM device_approvals AS da
LEFT JOIN users AS u
ON (
CAST(da.approved_by_user_id AS TEXT) = CAST(u.id AS TEXT)
OR LOWER(da.approved_by_user_id) = LOWER(u.username)
)
LEFT JOIN sites AS s
ON s.id = da.site_id
"""
status_norm = (status_filter or "").strip().lower()
if status_norm and status_norm != "all":
@@ -409,17 +413,19 @@ class AdminDeviceService:
"hostname_claimed": hostname,
"ssl_key_fingerprint_claimed": fingerprint_claimed,
"enrollment_code_id": row[5],
"status": row[6],
"client_nonce": row[7],
"server_nonce": row[8],
"created_at": row[9],
"updated_at": row[10],
"approved_by_user_id": row[11],
"site_id": row[6],
"status": row[7],
"client_nonce": row[8],
"server_nonce": row[9],
"created_at": row[10],
"updated_at": row[11],
"approved_by_user_id": row[12],
"hostname_conflict": conflict,
"alternate_hostname": alternate,
"conflict_requires_prompt": requires_prompt,
"fingerprint_match": fingerprint_match,
"approved_by_username": row[12],
"approved_by_username": row[13],
"site_name": row[14],
}
)
finally:
@@ -578,4 +584,3 @@ def register_admin_endpoints(app, adapters: "EngineServiceAdapters") -> None:
return jsonify(payload), status
app.register_blueprint(blueprint)

View File

@@ -4,24 +4,26 @@
#
# API Endpoints (if applicable):
# - POST /api/agent/details (Device Authenticated) - Ingests hardware and inventory payloads from enrolled agents.
# - GET /api/devices (No Authentication) - Returns a summary list of known devices for the WebUI transition.
# - GET /api/devices/<guid> (No Authentication) - Retrieves a single device record by GUID, including summary fields.
# - GET /api/device/details/<hostname> (No Authentication) - Returns full device details keyed by hostname.
# - GET /api/agents (Token Authenticated) - Lists online collectors grouped by hostname and run context.
# - GET /api/devices (Token Authenticated) - Returns a summary list of known devices for the WebUI transition.
# - GET /api/devices/<guid> (Token Authenticated) - Retrieves a single device record by GUID, including summary fields.
# - GET /api/device/details/<hostname> (Token Authenticated) - Returns full device details keyed by hostname.
# - POST /api/device/description/<hostname> (Token Authenticated) - Updates the human-readable description for a device.
# - GET /api/device_list_views (No Authentication) - Lists saved device table view definitions.
# - GET /api/device_list_views/<int:view_id> (No Authentication) - Retrieves a specific saved device table view definition.
# - GET /api/device_list_views (Token Authenticated) - Lists saved device table view definitions.
# - GET /api/device_list_views/<int:view_id> (Token Authenticated) - Retrieves a specific saved device table view definition.
# - POST /api/device_list_views (Token Authenticated) - Creates a custom device list view for the signed-in operator.
# - PUT /api/device_list_views/<int:view_id> (Token Authenticated) - Updates an existing device list view definition.
# - DELETE /api/device_list_views/<int:view_id> (Token Authenticated) - Deletes a saved device list view.
# - GET /api/sites (No Authentication) - Lists known sites and their summary metadata.
# - GET /api/sites (Token Authenticated) - Lists known sites and their summary metadata.
# - POST /api/sites (Token Authenticated (Admin)) - Creates a new site for grouping devices.
# - POST /api/sites/delete (Token Authenticated (Admin)) - Deletes one or more sites by identifier.
# - GET /api/sites/device_map (No Authentication) - Provides hostname to site assignment mapping data.
# - GET /api/sites/device_map (Token Authenticated) - Provides hostname to site assignment mapping data.
# - POST /api/sites/assign (Token Authenticated (Admin)) - Assigns a set of devices to a given site.
# - POST /api/sites/rename (Token Authenticated (Admin)) - Renames an existing site record.
# - GET /api/repo/current_hash (No Authentication) - Fetches the current agent repository hash (with caching).
# - POST /api/sites/rotate_code (Token Authenticated (Admin)) - Rotates the static enrollment code for a site.
# - GET /api/repo/current_hash (Device or Token Authenticated) - Fetches the current agent repository hash (with caching).
# - GET/POST /api/agent/hash (Device Authenticated) - Retrieves or updates an agent hash record bound to the authenticated device.
# - GET /api/agent/hash_list (Loopback Restricted) - Returns stored agent hash metadata for localhost diagnostics.
# - GET /api/agent/hash_list (Token Authenticated (Admin + Loopback)) - Returns stored agent hash metadata for localhost diagnostics.
# ======================================================
"""Device management endpoints for the Borealis Engine API."""
@@ -30,10 +32,11 @@ from __future__ import annotations
import json
import logging
import os
import secrets
import sqlite3
import time
import uuid
from datetime import datetime, timezone
from datetime import datetime, timedelta, timezone
from pathlib import Path
from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple
@@ -41,7 +44,7 @@ from flask import Blueprint, jsonify, request, session, g
from itsdangerous import BadSignature, SignatureExpired, URLSafeTimedSerializer
from ....auth.guid_utils import normalize_guid
from ....auth.device_auth import require_device_auth
from ....auth.device_auth import DeviceAuthError, require_device_auth
if TYPE_CHECKING: # pragma: no cover - typing aide
from .. import EngineServiceAdapters
@@ -117,6 +120,11 @@ def _is_internal_request(remote_addr: Optional[str]) -> bool:
return False
def _generate_install_code() -> str:
raw = secrets.token_hex(16).upper()
return "-".join(raw[i : i + 4] for i in range(0, len(raw), 4))
def _row_to_site(row: Tuple[Any, ...]) -> Dict[str, Any]:
return {
"id": row[0],
@@ -124,6 +132,11 @@ def _row_to_site(row: Tuple[Any, ...]) -> Dict[str, Any]:
"description": row[2] or "",
"created_at": row[3] or 0,
"device_count": row[4] or 0,
"enrollment_code_id": row[5],
"enrollment_code": row[6] or "",
"enrollment_code_expires_at": row[7] or "",
"enrollment_code_last_used_at": row[8] or "",
"enrollment_code_use_count": row[9] or 0,
}
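As a hedged illustration (field names follow the mapping above; the values are placeholders, not taken from the diff), each entry returned by the site listing now carries the joined enrollment-code columns:

example_site = {
    "id": 3,
    "name": "Main Office",
    "description": "",
    "created_at": 1731790000,
    "device_count": 12,
    "enrollment_code_id": "6f1c9a2e-placeholder-uuid",
    "enrollment_code": "0F3A-9C21-88DE-41B7-65C0-DA12-34EF-77BE",
    "enrollment_code_expires_at": "2035-11-14T01:23:45+00:00",
    "enrollment_code_last_used_at": "",
    "enrollment_code_use_count": 0,
}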
@@ -418,6 +431,29 @@ class DeviceManagementService:
return {"error": "unauthorized"}, 401
return None
def _require_device_or_login(self) -> Optional[Tuple[Dict[str, Any], int]]:
user = self._current_user()
if user:
return None
manager = getattr(self.adapters, "device_auth_manager", None)
if manager is None:
return {"error": "unauthorized"}, 401
try:
ctx = manager.authenticate()
g.device_auth = ctx
return None
except DeviceAuthError as exc:
payload: Dict[str, Any] = {"error": exc.message}
retry_after = getattr(exc, "retry_after", None)
if retry_after:
payload["retry_after"] = retry_after
return payload, getattr(exc, "status_code", 401) or 401
except Exception:
self.service_log("server", "/api/repo/current_hash auth failure", level="ERROR")
return {"error": "unauthorized"}, 401
def _require_admin(self) -> Optional[Tuple[Dict[str, Any], int]]:
user = self._current_user()
if not user:
@@ -1079,26 +1115,83 @@ class DeviceManagementService:
# Site management helpers
# ------------------------------------------------------------------
def _site_select_sql(self) -> str:
return """
SELECT s.id,
s.name,
s.description,
s.created_at,
COALESCE(ds.cnt, 0) AS device_count,
s.enrollment_code_id,
ic.code,
ic.expires_at,
ic.last_used_at,
ic.use_count
FROM sites AS s
LEFT JOIN (
SELECT site_id, COUNT(*) AS cnt
FROM device_sites
GROUP BY site_id
) AS ds ON ds.site_id = s.id
LEFT JOIN enrollment_install_codes AS ic
ON ic.id = s.enrollment_code_id
"""
def _fetch_site_row(self, cur: sqlite3.Cursor, site_id: int) -> Optional[Tuple[Any, ...]]:
cur.execute(self._site_select_sql() + " WHERE s.id = ?", (site_id,))
return cur.fetchone()
def _issue_site_enrollment_code(self, cur: sqlite3.Cursor, site_id: int, *, creator: str) -> Dict[str, Any]:
now = datetime.now(tz=timezone.utc)
issued_iso = now.isoformat()
expires_iso = (now + timedelta(days=3650)).isoformat()
code_id = str(uuid.uuid4())
code_value = _generate_install_code()
creator_value = creator or "system"
cur.execute(
"""
INSERT INTO enrollment_install_codes (
id, code, expires_at, created_by_user_id, used_at, used_by_guid,
max_uses, use_count, last_used_at, site_id
)
VALUES (?, ?, ?, ?, NULL, NULL, 0, 0, NULL, ?)
""",
(code_id, code_value, expires_iso, creator_value, site_id),
)
cur.execute(
"""
INSERT OR REPLACE INTO enrollment_install_codes_persistent (
id,
code,
created_at,
expires_at,
created_by_user_id,
used_at,
used_by_guid,
max_uses,
last_known_use_count,
last_used_at,
is_active,
archived_at,
consumed_at,
site_id
)
VALUES (?, ?, ?, ?, ?, NULL, NULL, 0, 0, NULL, 1, NULL, NULL, ?)
""",
(code_id, code_value, issued_iso, expires_iso, creator_value, site_id),
)
return {
"id": code_id,
"code": code_value,
"created_at": issued_iso,
"expires_at": expires_iso,
}
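A quick hedged illustration (not part of the diff) of the expiry window used above: site codes are stamped with a 3650-day ISO-8601 UTC expiry, roughly ten years, which makes them effectively static for the life of the site:

from datetime import datetime, timedelta, timezone

now = datetime.now(tz=timezone.utc)
print(now.isoformat())                                   # e.g. 2025-11-16T18:05:00+00:00
print((now + timedelta(days=3650)).isoformat())          # roughly ten years out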
def list_sites(self) -> Tuple[Dict[str, Any], int]:
conn = self._db_conn()
try:
cur = conn.cursor()
cur.execute(
"""
SELECT s.id,
s.name,
s.description,
s.created_at,
COALESCE(ds.cnt, 0) AS device_count
FROM sites AS s
LEFT JOIN (
SELECT site_id, COUNT(*) AS cnt
FROM device_sites
GROUP BY site_id
) AS ds ON ds.site_id = s.id
ORDER BY LOWER(s.name) ASC
"""
)
cur.execute(self._site_select_sql() + " ORDER BY LOWER(s.name) ASC")
rows = cur.fetchall()
sites = [_row_to_site(row) for row in rows]
return {"sites": sites}, 200
@@ -1112,6 +1205,8 @@ class DeviceManagementService:
if not name:
return {"error": "name is required"}, 400
now = int(time.time())
user = self._current_user() or {}
creator = user.get("username") or "system"
conn = self._db_conn()
try:
cur = conn.cursor()
@@ -1120,14 +1215,16 @@ class DeviceManagementService:
(name, description, now),
)
site_id = cur.lastrowid
code_info = self._issue_site_enrollment_code(cur, site_id, creator=creator)
cur.execute("UPDATE sites SET enrollment_code_id = ? WHERE id = ?", (code_info["id"], site_id))
conn.commit()
cur.execute(
"SELECT id, name, description, created_at, 0 FROM sites WHERE id = ?",
(site_id,),
)
row = cur.fetchone()
row = self._fetch_site_row(cur, site_id)
if not row:
return {"error": "creation_failed"}, 500
self.service_log(
"server",
f"site created id={site_id} code_id={code_info['id']} by={creator}",
)
return _row_to_site(row), 201
except sqlite3.IntegrityError:
conn.rollback()
@@ -1147,13 +1244,19 @@ class DeviceManagementService:
try:
norm_ids.append(int(value))
except Exception:
continue
return {"error": "invalid id"}, 400
if not norm_ids:
return {"status": "ok", "deleted": 0}, 200
conn = self._db_conn()
try:
cur = conn.cursor()
placeholders = ",".join("?" * len(norm_ids))
cur.execute(
f"SELECT id FROM enrollment_install_codes WHERE site_id IN ({placeholders})",
tuple(norm_ids),
)
code_ids = [row[0] for row in cur.fetchall() if row and row[0]]
now_iso = datetime.now(tz=timezone.utc).isoformat()
cur.execute(
f"DELETE FROM device_sites WHERE site_id IN ({placeholders})",
tuple(norm_ids),
@@ -1163,6 +1266,30 @@ class DeviceManagementService:
tuple(norm_ids),
)
deleted = cur.rowcount
cur.execute(
f"""
UPDATE enrollment_install_codes_persistent
SET is_active = 0,
archived_at = COALESCE(archived_at, ?)
WHERE site_id IN ({placeholders})
""",
(now_iso, *norm_ids),
)
if code_ids:
code_placeholders = ",".join("?" * len(code_ids))
cur.execute(
f"DELETE FROM enrollment_install_codes WHERE id IN ({code_placeholders})",
tuple(code_ids),
)
cur.execute(
f"""
UPDATE enrollment_install_codes_persistent
SET is_active = 0,
archived_at = COALESCE(archived_at, ?)
WHERE id IN ({code_placeholders})
""",
(now_iso, *code_ids),
)
conn.commit()
return {"status": "ok", "deleted": deleted}, 200
except Exception as exc:
@@ -1263,20 +1390,7 @@ class DeviceManagementService:
return {"error": "site not found"}, 404
conn.commit()
cur.execute(
"""
SELECT s.id,
s.name,
s.description,
s.created_at,
COALESCE(ds.cnt, 0) AS device_count
FROM sites AS s
LEFT JOIN (
SELECT site_id, COUNT(*) AS cnt
FROM device_sites
GROUP BY site_id
) ds ON ds.site_id = s.id
WHERE s.id = ?
""",
self._site_select_sql() + " WHERE s.id = ?",
(site_id_int,),
)
row = cur.fetchone()
@@ -1293,6 +1407,56 @@ class DeviceManagementService:
finally:
conn.close()
def rotate_site_enrollment_code(self, site_id: Any) -> Tuple[Dict[str, Any], int]:
try:
site_id_int = int(site_id)
except Exception:
return {"error": "invalid site_id"}, 400
user = self._current_user() or {}
creator = user.get("username") or "system"
now_iso = datetime.now(tz=timezone.utc).isoformat()
conn = self._db_conn()
try:
cur = conn.cursor()
cur.execute("SELECT enrollment_code_id FROM sites WHERE id = ?", (site_id_int,))
row = cur.fetchone()
if not row:
return {"error": "site not found"}, 404
existing_code_id = row[0]
if existing_code_id:
cur.execute("DELETE FROM enrollment_install_codes WHERE id = ?", (existing_code_id,))
cur.execute(
"""
UPDATE enrollment_install_codes_persistent
SET is_active = 0,
archived_at = COALESCE(archived_at, ?)
WHERE id = ?
""",
(now_iso, existing_code_id),
)
code_info = self._issue_site_enrollment_code(cur, site_id_int, creator=creator)
cur.execute(
"UPDATE sites SET enrollment_code_id = ? WHERE id = ?",
(code_info["id"], site_id_int),
)
conn.commit()
site_row = self._fetch_site_row(cur, site_id_int)
if not site_row:
return {"error": "site not found"}, 404
self.service_log(
"server",
f"site enrollment code rotated site_id={site_id_int} code_id={code_info['id']} by={creator}",
)
return _row_to_site(site_row), 200
except Exception as exc:
conn.rollback()
self.logger.debug("Failed to rotate site enrollment code", exc_info=True)
return {"error": str(exc)}, 500
finally:
conn.close()
def repo_current_hash(self) -> Tuple[Dict[str, Any], int]:
refresh_flag = (request.args.get("refresh") or "").strip().lower()
force_refresh = refresh_flag in {"1", "true", "yes", "force", "refresh"}
@@ -1584,21 +1748,37 @@ def register_management(app, adapters: "EngineServiceAdapters") -> None:
@blueprint.route("/api/agents", methods=["GET"])
def _list_agents():
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.list_agents()
return jsonify(payload), status
@blueprint.route("/api/devices", methods=["GET"])
def _list_devices():
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.list_devices()
return jsonify(payload), status
@blueprint.route("/api/devices/<guid>", methods=["GET"])
def _device_by_guid(guid: str):
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.get_device_by_guid(guid)
return jsonify(payload), status
@blueprint.route("/api/device/details/<hostname>", methods=["GET"])
def _device_details(hostname: str):
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.get_device_details(hostname)
return jsonify(payload), status
@@ -1615,11 +1795,19 @@ def register_management(app, adapters: "EngineServiceAdapters") -> None:
@blueprint.route("/api/device_list_views", methods=["GET"])
def _list_views():
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.list_views()
return jsonify(payload), status
@blueprint.route("/api/device_list_views/<int:view_id>", methods=["GET"])
def _get_view(view_id: int):
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.get_view(view_id)
return jsonify(payload), status
@@ -1679,6 +1867,10 @@ def register_management(app, adapters: "EngineServiceAdapters") -> None:
@blueprint.route("/api/sites", methods=["GET"])
def _sites_list():
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.list_sites()
return jsonify(payload), status
@@ -1707,6 +1899,10 @@ def register_management(app, adapters: "EngineServiceAdapters") -> None:
@blueprint.route("/api/sites/device_map", methods=["GET"])
def _sites_device_map():
requirement = service._require_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.sites_device_map(request.args.get("hostnames"))
return jsonify(payload), status
@@ -1730,13 +1926,31 @@ def register_management(app, adapters: "EngineServiceAdapters") -> None:
payload, status = service.rename_site(data.get("id"), (data.get("new_name") or "").strip())
return jsonify(payload), status
@blueprint.route("/api/sites/rotate_code", methods=["POST"])
def _sites_rotate_code():
requirement = service._require_admin()
if requirement:
payload, status = requirement
return jsonify(payload), status
data = request.get_json(silent=True) or {}
payload, status = service.rotate_site_enrollment_code(data.get("site_id"))
return jsonify(payload), status
@blueprint.route("/api/repo/current_hash", methods=["GET"])
def _repo_current_hash():
requirement = service._require_device_or_login()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.repo_current_hash()
return jsonify(payload), status
@blueprint.route("/api/agent/hash_list", methods=["GET"])
def _agent_hash_list():
requirement = service._require_admin()
if requirement:
payload, status = requirement
return jsonify(payload), status
payload, status = service.agent_hash_list()
return jsonify(payload), status

View File

@@ -103,7 +103,8 @@ def register(
used_by_guid,
max_uses,
use_count,
last_used_at
last_used_at,
site_id
FROM enrollment_install_codes
WHERE code = ?
""",
@@ -121,6 +122,7 @@ def register(
"max_uses",
"use_count",
"last_used_at",
"site_id",
]
record = dict(zip(keys, row))
return record
@@ -140,16 +142,16 @@ def register(
if expiry <= _now():
return False, None
try:
max_uses = int(record.get("max_uses") or 1)
max_uses_raw = record.get("max_uses")
max_uses = int(max_uses_raw) if max_uses_raw is not None else 0
except Exception:
max_uses = 1
if max_uses < 1:
max_uses = 1
max_uses = 0
unlimited = max_uses <= 0
try:
use_count = int(record.get("use_count") or 0)
except Exception:
use_count = 0
if use_count < max_uses:
if unlimited or use_count < max_uses:
return True, None
guid = normalize_guid(record.get("used_by_guid"))
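A hedged sketch (not from the diff) of what the reworked guard amounts to: a max_uses of zero or less now means unlimited uses, which is how the static site-bound codes behave:

def code_has_uses_left(max_uses_raw, use_count):
    max_uses = int(max_uses_raw) if max_uses_raw is not None else 0
    unlimited = max_uses <= 0          # site codes store max_uses = 0
    return unlimited or use_count < max_uses

print(code_has_uses_left(0, 9999))     # True  - static site code never exhausts
print(code_has_uses_left(1, 1))        # False - legacy single-use code already consumed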
@@ -392,6 +394,24 @@ def register(
try:
cur = conn.cursor()
install_code = _load_install_code(cur, enrollment_code)
site_id = install_code.get("site_id") if install_code else None
if site_id is None:
log(
"server",
"enrollment request rejected missing_site_binding "
f"host={hostname} fingerprint={fingerprint[:12]} code_mask={_mask_code(enrollment_code)}",
context_hint,
)
return jsonify({"error": "invalid_enrollment_code"}), 400
cur.execute("SELECT 1 FROM sites WHERE id = ?", (site_id,))
if cur.fetchone() is None:
log(
"server",
"enrollment request rejected missing_site_owner "
f"host={hostname} fingerprint={fingerprint[:12]} code_mask={_mask_code(enrollment_code)}",
context_hint,
)
return jsonify({"error": "invalid_enrollment_code"}), 400
valid_code, reuse_guid = _install_code_valid(install_code, fingerprint, cur)
if not valid_code:
log(
@@ -427,6 +447,7 @@ def register(
SET hostname_claimed = ?,
guid = ?,
enrollment_code_id = ?,
site_id = ?,
client_nonce = ?,
server_nonce = ?,
agent_pubkey_der = ?,
@@ -437,6 +458,7 @@ def register(
hostname,
reuse_guid,
install_code["id"],
install_code.get("site_id"),
client_nonce_b64,
server_nonce_b64,
agent_pubkey_der,
@@ -451,11 +473,11 @@ def register(
"""
INSERT INTO device_approvals (
id, approval_reference, guid, hostname_claimed,
ssl_key_fingerprint_claimed, enrollment_code_id,
ssl_key_fingerprint_claimed, enrollment_code_id, site_id,
status, client_nonce, server_nonce, agent_pubkey_der,
created_at, updated_at
)
VALUES (?, ?, ?, ?, ?, ?, 'pending', ?, ?, ?, ?, ?)
VALUES (?, ?, ?, ?, ?, ?, ?, 'pending', ?, ?, ?, ?, ?)
""",
(
record_id,
@@ -464,6 +486,7 @@ def register(
hostname,
fingerprint,
install_code["id"],
install_code.get("site_id"),
client_nonce_b64,
server_nonce_b64,
agent_pubkey_der,
@@ -535,7 +558,7 @@ def register(
cur.execute(
"""
SELECT id, guid, hostname_claimed, ssl_key_fingerprint_claimed,
enrollment_code_id, status, client_nonce, server_nonce,
enrollment_code_id, site_id, status, client_nonce, server_nonce,
agent_pubkey_der, created_at, updated_at, approved_by_user_id
FROM device_approvals
WHERE approval_reference = ?
@@ -553,6 +576,7 @@ def register(
hostname_claimed,
fingerprint,
enrollment_code_id,
site_id,
status,
client_nonce_stored,
server_nonce_b64,
@@ -643,6 +667,17 @@ def register(
device_record = _ensure_device_record(cur, effective_guid, hostname_claimed, fingerprint)
_store_device_key(cur, effective_guid, fingerprint)
if site_id:
assigned_at = int(time.time())
cur.execute(
"""
INSERT INTO device_sites(device_hostname, site_id, assigned_at)
VALUES (?, ?, ?)
ON CONFLICT(device_hostname)
DO UPDATE SET site_id=excluded.site_id, assigned_at=excluded.assigned_at
""",
(device_record.get("hostname"), site_id, assigned_at),
)
# Mark install code used
if enrollment_code_id:
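The ON CONFLICT upsert added above makes the site assignment idempotent. A hedged standalone sketch (the real table's unique constraint on device_hostname is assumed, and SQLite 3.24+ is required for upsert syntax):

import sqlite3, time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE device_sites(device_hostname TEXT PRIMARY KEY, site_id INTEGER, assigned_at INTEGER)"
)
for site in (1, 2):
    conn.execute(
        "INSERT INTO device_sites(device_hostname, site_id, assigned_at) VALUES(?, ?, ?) "
        "ON CONFLICT(device_hostname) DO UPDATE SET site_id=excluded.site_id, assigned_at=excluded.assigned_at",
        ("HOST-01", site, int(time.time())),
    )
print(conn.execute("SELECT * FROM device_sites").fetchall())   # one row; the latest enrollment wins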
@@ -656,13 +691,14 @@ def register(
except Exception:
prior_count = 0
try:
allowed_uses = int(usage_row[1]) if usage_row else 1
allowed_uses = int(usage_row[1]) if usage_row else 0
except Exception:
allowed_uses = 1
allowed_uses = 0
unlimited = allowed_uses <= 0
if allowed_uses < 1:
allowed_uses = 1
new_count = prior_count + 1
consumed = new_count >= allowed_uses
consumed = False if unlimited else new_count >= allowed_uses
cur.execute(
"""
UPDATE enrollment_install_codes
@@ -767,4 +803,3 @@ def _mask_code(code: str) -> str:
if len(trimmed) <= 6:
return "***"
return f"{trimmed[:3]}***{trimmed[-3:]}"

View File

@@ -13,6 +13,8 @@ from typing import TYPE_CHECKING, Any, Dict
from flask import Blueprint, Flask, jsonify
from ...auth import RequestAuthContext
if TYPE_CHECKING: # pragma: no cover - typing aide
from .. import EngineServiceAdapters
@@ -31,13 +33,22 @@ def _serialize_time(now_local: datetime, now_utc: datetime) -> Dict[str, Any]:
}
def register_info(app: Flask, _adapters: "EngineServiceAdapters") -> None:
def register_info(app: Flask, adapters: "EngineServiceAdapters") -> None:
"""Expose server telemetry endpoints used by the admin interface."""
blueprint = Blueprint("engine_server_info", __name__)
auth = RequestAuthContext(
app=app,
dev_mode_manager=adapters.dev_mode_manager,
config=adapters.config,
logger=adapters.context.logger,
)
@blueprint.route("/api/server/time", methods=["GET"])
def server_time() -> Any:
_, error = auth.require_user()
if error:
return jsonify(error[0]), error[1]
now_utc = datetime.now(timezone.utc)
now_local = now_utc.astimezone()
payload = _serialize_time(now_local, now_utc)

View File

@@ -48,7 +48,6 @@ import GithubAPIToken from "./Access_Management/Github_API_Token.jsx";
import ServerInfo from "./Admin/Server_Info.jsx";
import PageTemplate from "./Admin/Page_Template.jsx";
import LogManagement from "./Admin/Log_Management.jsx";
import EnrollmentCodes from "./Devices/Enrollment_Codes.jsx";
import DeviceApprovals from "./Devices/Device_Approvals.jsx";
// Networking Imports
@@ -115,10 +114,12 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
const [userDisplayName, setUserDisplayName] = useState(null);
const [editingJob, setEditingJob] = useState(null);
const [jobsRefreshToken, setJobsRefreshToken] = useState(0);
const [quickJobDraft, setQuickJobDraft] = useState(null);
const [assemblyEditorState, setAssemblyEditorState] = useState(null); // { mode: 'script'|'ansible', row, nonce }
const [sessionResolved, setSessionResolved] = useState(false);
const initialPathRef = useRef(window.location.pathname + window.location.search);
const pendingPathRef = useRef(null);
const quickJobSeedRef = useRef(0);
const [notAuthorizedOpen, setNotAuthorizedOpen] = useState(false);
// Top-bar search state
@@ -228,8 +229,6 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
return "/admin/server_info";
case "page_template":
return "/admin/page_template";
case "admin_enrollment_codes":
return "/admin/enrollment-codes";
case "admin_device_approvals":
return "/admin/device-approvals";
default:
@@ -284,7 +283,6 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
if (path === "/access_management/credentials") return { page: "access_credentials", options: {} };
if (path === "/admin/server_info") return { page: "server_info", options: {} };
if (path === "/admin/page_template") return { page: "page_template", options: {} };
if (path === "/admin/enrollment-codes") return { page: "admin_enrollment_codes", options: {} };
if (path === "/admin/device-approvals") return { page: "admin_device_approvals", options: {} };
return { page: "devices", options: {} };
} catch {
@@ -380,6 +378,45 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
navigateByPathRef.current = navigateByPath;
}, [navigateTo, navigateByPath]);
const handleQuickJobLaunch = useCallback(
(hostnames) => {
const list = Array.isArray(hostnames) ? hostnames : [hostnames];
const normalized = Array.from(
new Set(
list
.map((host) => (typeof host === "string" ? host.trim() : ""))
.filter((host) => Boolean(host))
)
);
if (!normalized.length) {
return;
}
quickJobSeedRef.current += 1;
const primary = normalized[0];
const extraCount = normalized.length - 1;
const deviceLabel = extraCount > 0 ? `${primary} +${extraCount} more` : primary;
setEditingJob(null);
setQuickJobDraft({
id: `${Date.now()}_${quickJobSeedRef.current}`,
hostnames: normalized,
deviceLabel,
initialTabKey: "components",
scheduleType: "immediately",
placeholderAssemblyLabel: "Choose Assembly",
});
navigateTo("create_job");
},
[navigateTo]
);
const handleConsumeQuickJobDraft = useCallback((draftId) => {
setQuickJobDraft((prev) => {
if (!prev) return prev;
if (draftId && prev.id !== draftId) return prev;
return null;
});
}, []);
// Build breadcrumb items for current view
const breadcrumbs = React.useMemo(() => {
const items = [];
@@ -471,10 +508,6 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
items.push({ label: "Developer Tools" });
items.push({ label: "Page Template", page: "page_template" });
break;
case "admin_enrollment_codes":
items.push({ label: "Admin Settings", page: "server_info" });
items.push({ label: "Installer Codes", page: "admin_enrollment_codes" });
break;
case "admin_device_approvals":
items.push({ label: "Admin Settings", page: "server_info" });
items.push({ label: "Device Approvals", page: "admin_device_approvals" });
@@ -1004,7 +1037,6 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
useEffect(() => {
const requiresAdmin = currentPage === 'server_info'
|| currentPage === 'admin_enrollment_codes'
|| currentPage === 'admin_device_approvals'
|| currentPage === 'access_credentials'
|| currentPage === 'access_github_token'
@@ -1039,6 +1071,7 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
onSelectDevice={(d) => {
navigateTo("device_details", { device: d });
}}
onQuickJobLaunch={handleQuickJobLaunch}
/>
);
case "agent_devices":
@@ -1047,17 +1080,19 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
onSelectDevice={(d) => {
navigateTo("device_details", { device: d });
}}
onQuickJobLaunch={handleQuickJobLaunch}
/>
);
case "ssh_devices":
return <SSHDevices />;
return <SSHDevices onQuickJobLaunch={handleQuickJobLaunch} />;
case "winrm_devices":
return <WinRMDevices />;
return <WinRMDevices onQuickJobLaunch={handleQuickJobLaunch} />;
case "device_details":
return (
<DeviceDetails
device={selectedDevice}
onQuickJobLaunch={handleQuickJobLaunch}
onBack={() => {
navigateTo("devices");
setSelectedDevice(null);
@@ -1078,8 +1113,19 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
return (
<CreateJob
initialJob={editingJob}
onCancel={() => { navigateTo("jobs"); setEditingJob(null); }}
onCreated={() => { navigateTo("jobs"); setEditingJob(null); setJobsRefreshToken(Date.now()); }}
quickJobDraft={quickJobDraft}
onConsumeQuickJobDraft={handleConsumeQuickJobDraft}
onCancel={() => {
navigateTo("jobs");
setEditingJob(null);
setQuickJobDraft(null);
}}
onCreated={() => {
navigateTo("jobs");
setEditingJob(null);
setJobsRefreshToken(Date.now());
setQuickJobDraft(null);
}}
/>
);
@@ -1144,9 +1190,6 @@ const LOCAL_STORAGE_KEY = "borealis_persistent_state";
case "page_template":
return <PageTemplate isAdmin={isAdmin} />;
case "admin_enrollment_codes":
return <EnrollmentCodes />;
case "admin_device_approvals":
return <DeviceApprovals />;

View File

@@ -294,6 +294,12 @@ export default function DeviceApprovals() {
minWidth: 100,
Width: 100,
},
{
headerName: "Site",
field: "site_name",
valueGetter: (p) => p.data?.site_name || (p.data?.site_id ? `Site ${p.data.site_id}` : "—"),
minWidth: 160,
},
{ headerName: "Created", field: "created_at", valueFormatter: (p) => formatDateTime(p.value), minWidth: 160 },
{ headerName: "Updated", field: "updated_at", valueFormatter: (p) => formatDateTime(p.value), minWidth: 160 },
{

View File

@@ -1,6 +1,6 @@
////////// PROJECT FILE SEPARATION LINE ////////// CODE AFTER THIS LINE ARE FROM: <ProjectRoot>/Data/Engine/web-interface/src/Devices/Device_Details.jsx
import React, { useState, useEffect, useMemo, useCallback, useRef } from "react";
import React, { useState, useEffect, useMemo, useCallback } from "react";
import {
Box,
Tabs,
@@ -29,7 +29,6 @@ import "prismjs/components/prism-powershell";
import "prismjs/components/prism-batch";
import "prismjs/themes/prism-okaidia.css";
import Editor from "react-simple-code-editor";
import QuickJob from "../Scheduling/Quick_Job.jsx";
import { AgGridReact } from "ag-grid-react";
import { ModuleRegistry, AllCommunityModule, themeQuartz } from "ag-grid-community";
@@ -248,7 +247,7 @@ const GRID_COMPONENTS = {
HistoryActionsCell,
};
export default function DeviceDetails({ device, onBack }) {
export default function DeviceDetails({ device, onBack, onQuickJobLaunch }) {
const [tab, setTab] = useState(0);
const [agent, setAgent] = useState(device || {});
const [details, setDetails] = useState({});
@@ -266,11 +265,9 @@ export default function DeviceDetails({ device, onBack }) {
const [outputTitle, setOutputTitle] = useState("");
const [outputContent, setOutputContent] = useState("");
const [outputLang, setOutputLang] = useState("powershell");
const [quickJobOpen, setQuickJobOpen] = useState(false);
const [menuAnchor, setMenuAnchor] = useState(null);
const [clearDialogOpen, setClearDialogOpen] = useState(false);
const [assemblyNameMap, setAssemblyNameMap] = useState({});
const softwareGridApiRef = useRef(null);
// Snapshotted status for the lifetime of this page
const [lockedStatus, setLockedStatus] = useState(() => {
// Prefer status provided by the device list row if available
@@ -281,6 +278,18 @@ export default function DeviceDetails({ device, onBack }) {
const now = Date.now() / 1000;
return now - tsSec <= 300 ? "Online" : "Offline";
});
const quickJobTargets = useMemo(() => {
const values = [];
const push = (value) => {
const normalized = typeof value === "string" ? value.trim() : "";
if (!normalized) return;
if (!values.includes(normalized)) values.push(normalized);
};
push(agent?.hostname);
push(device?.hostname);
return values;
}, [agent, device]);
const canLaunchQuickJob = quickJobTargets.length > 0 && typeof onQuickJobLaunch === "function";
useEffect(() => {
setConnectionError("");
@@ -707,19 +716,6 @@ export default function DeviceDetails({ device, onBack }) {
}, []);
const softwareRows = useMemo(() => details.software || [], [details.software]);
const handleSoftwareGridReady = useCallback(
(params) => {
softwareGridApiRef.current = params.api;
params.api.setQuickFilter(softwareSearch);
},
[softwareSearch]
);
useEffect(() => {
if (softwareGridApiRef.current) {
softwareGridApiRef.current.setQuickFilter(softwareSearch);
}
}, [softwareSearch]);
const getSoftwareRowId = useCallback(
(params) => `${params.data?.name || "software"}-${params.data?.version || ""}-${params.rowIndex}`,
@@ -1250,7 +1246,7 @@ export default function DeviceDetails({ device, onBack }) {
paginationPageSize={20}
paginationPageSizeSelector={[20, 50, 100]}
animateRows
onGridReady={handleSoftwareGridReady}
quickFilterText={softwareSearch}
getRowId={getSoftwareRowId}
components={GRID_COMPONENTS}
theme={myTheme}
@@ -1626,11 +1622,11 @@ export default function DeviceDetails({ device, onBack }) {
>
<MoreHorizIcon fontSize="small" />
</IconButton>
<Menu
anchorEl={menuAnchor}
open={Boolean(menuAnchor)}
onClose={() => setMenuAnchor(null)}
PaperProps={{
<Menu
anchorEl={menuAnchor}
open={Boolean(menuAnchor)}
onClose={() => setMenuAnchor(null)}
PaperProps={{
sx: {
bgcolor: "rgba(8,12,24,0.96)",
color: "#fff",
@@ -1639,9 +1635,11 @@ export default function DeviceDetails({ device, onBack }) {
}}
>
<MenuItem
disabled={!canLaunchQuickJob}
onClick={() => {
setMenuAnchor(null);
setQuickJobOpen(true);
if (!canLaunchQuickJob) return;
onQuickJobLaunch && onQuickJobLaunch(quickJobTargets);
}}
>
Quick Job
@@ -1748,13 +1746,6 @@ export default function DeviceDetails({ device, onBack }) {
}}
/>
{quickJobOpen && (
<QuickJob
open={quickJobOpen}
onClose={() => setQuickJobOpen(false)}
hostnames={[agent?.hostname || device?.hostname].filter(Boolean)}
/>
)}
</Box>
);
}

View File

@@ -21,7 +21,6 @@ import CachedIcon from "@mui/icons-material/Cached";
import { AgGridReact } from "ag-grid-react";
import { ModuleRegistry, AllCommunityModule, themeQuartz } from "ag-grid-community";
import { DeleteDeviceDialog, CreateCustomViewDialog, RenameCustomViewDialog } from "../Dialogs.jsx";
import QuickJob from "../Scheduling/Quick_Job.jsx";
import AddDevice from "./Add_Device.jsx";
ModuleRegistry.registerModules([AllCommunityModule]);
@@ -339,6 +338,7 @@ function formatUptime(seconds) {
export default function DeviceList({
onSelectDevice,
onQuickJobLaunch,
filterMode = "all",
title,
showAddButton,
@@ -351,7 +351,7 @@ export default function DeviceList({
const [confirmOpen, setConfirmOpen] = useState(false);
// Track selection by agent id to avoid duplicate hostname collisions
const [selectedIds, setSelectedIds] = useState(() => new Set());
const [quickJobOpen, setQuickJobOpen] = useState(false);
const canLaunchQuickJob = selectedIds.size > 0 && typeof onQuickJobLaunch === "function";
const [addDeviceOpen, setAddDeviceOpen] = useState(false);
const [addDeviceType, setAddDeviceType] = useState(null);
const computedTitle = useMemo(() => {
@@ -1739,18 +1739,26 @@ export default function DeviceList({
<Button
variant="contained"
size="small"
disabled={selectedIds.size === 0}
disabled={!canLaunchQuickJob}
disableElevation
onClick={() => setQuickJobOpen(true)}
onClick={() => {
if (!canLaunchQuickJob) return;
const hostnames = rows
.filter((r) => selectedIds.has(r.id))
.map((r) => r.hostname)
.filter((hostname) => Boolean(hostname));
if (!hostnames.length) return;
onQuickJobLaunch(hostnames);
}}
sx={{
borderRadius: 999,
px: 2.2,
textTransform: "none",
fontWeight: 600,
background: selectedIds.size === 0 ? "rgba(148,163,184,0.2)" : "linear-gradient(135deg, #34d399, #22d3ee)",
color: selectedIds.size === 0 ? MAGIC_UI.textMuted : "#041224",
border: selectedIds.size === 0 ? "1px solid rgba(148,163,184,0.35)" : "none",
boxShadow: selectedIds.size === 0 ? "none" : "0 0 24px rgba(45, 212, 191, 0.45)",
background: canLaunchQuickJob ? "linear-gradient(135deg, #34d399, #22d3ee)" : "rgba(148,163,184,0.2)",
color: canLaunchQuickJob ? "#041224" : MAGIC_UI.textMuted,
border: canLaunchQuickJob ? "none" : "1px solid rgba(148,163,184,0.35)",
boxShadow: canLaunchQuickJob ? "0 0 24px rgba(45, 212, 191, 0.45)" : "none",
}}
>
Quick Job
@@ -2090,13 +2098,6 @@ export default function DeviceList({
onConfirm={handleDelete}
/>
{quickJobOpen && (
<QuickJob
open={quickJobOpen}
onClose={() => setQuickJobOpen(false)}
hostnames={rows.filter((r) => selectedIds.has(r.id)).map((r) => r.hostname)}
/>
)}
{assignDialogOpen && (
<Popover
open={assignDialogOpen}

View File

@@ -1,372 +0,0 @@
import React, { useCallback, useEffect, useMemo, useState, useRef } from "react";
import {
Box,
Paper,
Typography,
Button,
Stack,
Alert,
FormControl,
InputLabel,
MenuItem,
Select,
CircularProgress,
Tooltip
} from "@mui/material";
import {
ContentCopy as CopyIcon,
DeleteOutline as DeleteIcon,
Refresh as RefreshIcon,
Key as KeyIcon,
} from "@mui/icons-material";
import { AgGridReact } from "ag-grid-react";
import { ModuleRegistry, AllCommunityModule, themeQuartz } from "ag-grid-community";
// IMPORTANT: Do NOT import global AG Grid CSS here to avoid overriding other pages.
// We rely on the project's existing CSS and themeQuartz class name like other MagicUI pages.
ModuleRegistry.registerModules([AllCommunityModule]);
// Match the palette used on other pages (see Site_List / Device_List)
const MAGIC_UI = {
shellBg:
"radial-gradient(120% 120% at 0% 0%, rgba(76, 186, 255, 0.16), transparent 55%), " +
"radial-gradient(120% 120% at 100% 0%, rgba(214, 130, 255, 0.18), transparent 60%), #040711",
panelBg:
"linear-gradient(135deg, rgba(10, 16, 31, 0.98) 0%, rgba(6, 10, 24, 0.94) 60%, rgba(15, 6, 26, 0.96) 100%)",
panelBorder: "rgba(148, 163, 184, 0.35)",
textBright: "#e2e8f0",
textMuted: "#94a3b8",
accentA: "#7dd3fc",
accentB: "#c084fc",
};
// Generate a scoped Quartz theme class (same pattern as other pages)
const gridTheme = themeQuartz.withParams({
accentColor: "#8b5cf6",
backgroundColor: "#070b1a",
browserColorScheme: "dark",
fontFamily: { googleFont: "IBM Plex Sans" },
foregroundColor: "#f4f7ff",
headerFontSize: 13,
});
const themeClassName = gridTheme.themeName || "ag-theme-quartz";
const TTL_PRESETS = [
{ value: 1, label: "1 hour" },
{ value: 3, label: "3 hours" },
{ value: 6, label: "6 hours" },
{ value: 12, label: "12 hours" },
{ value: 24, label: "24 hours" },
];
const determineStatus = (record) => {
if (!record) return "expired";
const maxUses = Number.isFinite(record?.max_uses) ? record.max_uses : 1;
const useCount = Number.isFinite(record?.use_count) ? record.use_count : 0;
if (useCount >= Math.max(1, maxUses || 1)) return "used";
if (!record.expires_at) return "expired";
const expires = new Date(record.expires_at);
if (Number.isNaN(expires.getTime())) return "expired";
return expires.getTime() > Date.now() ? "active" : "expired";
};
const formatDateTime = (value) => {
if (!value) return "—";
const date = new Date(value);
if (Number.isNaN(date.getTime())) return value;
return date.toLocaleString();
};
const maskCode = (code) => {
if (!code) return "—";
const parts = code.split("-");
if (parts.length <= 1) {
const prefix = code.slice(0, 4);
return `${prefix}${"•".repeat(Math.max(0, code.length - prefix.length))}`;
}
return parts
.map((part, idx) => (idx === 0 || idx === parts.length - 1 ? part : "•".repeat(part.length)))
.join("-");
};
export default function EnrollmentCodes() {
const [codes, setCodes] = useState([]);
const [loading, setLoading] = useState(false);
const [error, setError] = useState("");
const [feedback, setFeedback] = useState(null);
const [statusFilter, setStatusFilter] = useState("all");
const [ttlHours, setTtlHours] = useState(6);
const [generating, setGenerating] = useState(false);
const [maxUses, setMaxUses] = useState(2);
const gridRef = useRef(null);
const fetchCodes = useCallback(async () => {
setLoading(true);
setError("");
try {
const query = statusFilter === "all" ? "" : `?status=${encodeURIComponent(statusFilter)}`;
const resp = await fetch(`/api/admin/enrollment-codes${query}`, { credentials: "include" });
if (!resp.ok) {
const body = await resp.json().catch(() => ({}));
throw new Error(body.error || `Request failed (${resp.status})`);
}
const data = await resp.json();
setCodes(Array.isArray(data.codes) ? data.codes : []);
} catch (err) {
setError(err.message || "Unable to load codes");
} finally {
setLoading(false);
}
}, [statusFilter]);
useEffect(() => { fetchCodes(); }, [fetchCodes]);
const handleGenerate = useCallback(async () => {
setGenerating(true);
try {
const resp = await fetch("/api/admin/enrollment-codes", {
method: "POST",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ ttl_hours: ttlHours, max_uses: maxUses }),
});
if (!resp.ok) {
const body = await resp.json().catch(() => ({}));
throw new Error(body.error || `Request failed (${resp.status})`);
}
await fetchCodes();
setFeedback({ type: "success", message: "New installer code created" });
} catch (err) {
setFeedback({ type: "error", message: err.message });
} finally {
setGenerating(false);
}
}, [ttlHours, maxUses, fetchCodes]);
const handleCopy = (code) => {
if (!code) return;
try {
if (navigator.clipboard?.writeText) {
navigator.clipboard.writeText(code);
setFeedback({ type: "success", message: "Code copied to clipboard" });
}
} catch (_) {}
};
const handleDelete = async (id) => {
if (!id) return;
if (!window.confirm("Delete this installer code?")) return;
try {
const resp = await fetch(`/api/admin/enrollment-codes/${id}`, {
method: "DELETE",
credentials: "include",
});
if (!resp.ok) {
const body = await resp.json().catch(() => ({}));
throw new Error(body.error || `Request failed (${resp.status})`);
}
await fetchCodes();
setFeedback({ type: "success", message: "Code deleted" });
} catch (err) {
setFeedback({ type: "error", message: err.message });
}
};
const columns = useMemo(() => [
{
headerName: "Status",
field: "status",
cellRenderer: (params) => {
const status = determineStatus(params.data);
const color =
status === "active" ? "#34d399" :
status === "used" ? "#7dd3fc" :
"#fbbf24";
return <span style={{ color, fontWeight: 600 }}>{status}</span>;
},
minWidth: 100
},
{
headerName: "Installer Code",
field: "code",
cellRenderer: (params) => (
<span style={{ fontFamily: "monospace", color: "#7dd3fc" }}>{maskCode(params.value)}</span>
),
minWidth: 340
},
{ headerName: "Expires At",
field: "expires_at",
valueFormatter: p => formatDateTime(p.value)
},
{ headerName: "Created By", field: "created_by_user_id" },
{
headerName: "Usage",
valueGetter: (p) => `${p.data.use_count || 0} / ${p.data.max_uses || 1}`,
cellStyle: { fontFamily: "monospace" },
width: 120
},
{ headerName: "Last Used", field: "last_used_at", valueFormatter: p => formatDateTime(p.value) },
{ headerName: "Used By GUID", field: "used_by_guid" },
{
headerName: "Actions",
cellRenderer: (params) => {
const record = params.data;
const disableDelete = (record.use_count || 0) !== 0;
return (
<Stack direction="row" spacing={1} justifyContent="flex-end">
<Tooltip title="Copy code">
<span>
<Button size="small" onClick={() => handleCopy(record.code)}>
<CopyIcon fontSize="small" />
</Button>
</span>
</Tooltip>
<Tooltip title={disableDelete ? "Only unused codes can be deleted" : "Delete code"}>
<span>
<Button size="small" disabled={disableDelete} onClick={() => handleDelete(record.id)}>
<DeleteIcon fontSize="small" />
</Button>
</span>
</Tooltip>
</Stack>
);
},
width: 160
}
], []);
const defaultColDef = useMemo(() => ({
sortable: true,
filter: true,
resizable: true,
flex: 1,
minWidth: 140,
}), []);
return (
<Paper
sx={{
m: 0,
p: 0,
display: "flex",
flexDirection: "column",
flexGrow: 1,
minWidth: 0,
height: "100%",
borderRadius: 0,
border: `1px solid ${MAGIC_UI.panelBorder}`,
background: MAGIC_UI.shellBg,
boxShadow: "0 25px 80px rgba(6, 12, 30, 0.8)",
overflow: "hidden",
}}
elevation={0}
>
{/* Hero header */}
<Box sx={{ p: 3 }}>
<Stack direction="row" spacing={2} alignItems="center" justifyContent="space-between">
<Stack direction="row" spacing={1} alignItems="center">
<KeyIcon sx={{ color: MAGIC_UI.accentA }} />
<Typography variant="h6" sx={{ color: MAGIC_UI.textBright, fontWeight: 700 }}>
Enrollment Installer Codes
</Typography>
</Stack>
<Stack direction="row" spacing={1}>
<Button
variant="contained"
disabled={generating}
startIcon={generating ? <CircularProgress size={16} color="inherit" /> : null}
onClick={handleGenerate}
sx={{ background: "linear-gradient(135deg,#7dd3fc,#c084fc)", borderRadius: 999 }}
>
{generating ? "Generating…" : "Generate Code"}
</Button>
<Button variant="outlined" startIcon={<RefreshIcon />} onClick={fetchCodes} disabled={loading}>
Refresh
</Button>
</Stack>
</Stack>
</Box>
{/* Controls */}
<Box sx={{ p: 2, display: "flex", gap: 2, alignItems: "center", flexWrap: "wrap" }}>
<FormControl size="small" sx={{ minWidth: 140 }}>
<InputLabel>Status</InputLabel>
<Select value={statusFilter} label="Status" onChange={(e) => setStatusFilter(e.target.value)}>
<MenuItem value="all">All</MenuItem>
<MenuItem value="active">Active</MenuItem>
<MenuItem value="used">Used</MenuItem>
<MenuItem value="expired">Expired</MenuItem>
</Select>
</FormControl>
<FormControl size="small" sx={{ minWidth: 160 }}>
<InputLabel>Duration</InputLabel>
<Select value={ttlHours} label="Duration" onChange={(e) => setTtlHours(Number(e.target.value))}>
{TTL_PRESETS.map((p) => (
<MenuItem key={p.value} value={p.value}>
{p.label}
</MenuItem>
))}
</Select>
</FormControl>
<FormControl size="small" sx={{ minWidth: 160 }}>
<InputLabel>Allowed Uses</InputLabel>
<Select value={maxUses} label="Allowed Uses" onChange={(e) => setMaxUses(Number(e.target.value))}>
{[1, 2, 3, 5].map((n) => (
<MenuItem key={n} value={n}>
{n === 1 ? "Single use" : `${n} uses`}
</MenuItem>
))}
</Select>
</FormControl>
</Box>
{feedback && (
<Box sx={{ px: 3 }}>
<Alert severity={feedback.type} onClose={() => setFeedback(null)}>
{feedback.message}
</Alert>
</Box>
)}
{error && (
<Box sx={{ px: 3 }}>
<Alert severity="error">{error}</Alert>
</Box>
)}
{/* Grid wrapper — all overrides are SCOPED to this instance via inline CSS vars */}
<Box
className={themeClassName}
sx={{
flex: 1,
p: 2,
overflow: "hidden",
}}
// Inline style ensures the CSS variables only affect THIS grid instance
style={{
"--ag-background-color": "#070b1a",
"--ag-foreground-color": "#f4f7ff",
"--ag-header-background-color": "#0f172a",
"--ag-header-foreground-color": "#cfe0ff",
"--ag-odd-row-background-color": "rgba(255,255,255,0.02)",
"--ag-row-hover-color": "rgba(125,183,255,0.08)",
"--ag-selected-row-background-color": "rgba(64,164,255,0.18)",
"--ag-font-family": "'IBM Plex Sans', 'Helvetica Neue', Arial, sans-serif",
"--ag-border-color": "rgba(125,183,255,0.18)",
"--ag-row-border-color": "rgba(125,183,255,0.14)",
"--ag-border-radius": "8px",
}}
>
<AgGridReact
ref={gridRef}
rowData={codes}
columnDefs={columns}
defaultColDef={defaultColDef}
animateRows
pagination
paginationPageSize={20}
/>
</Box>
</Paper>
);
}

View File

@@ -1,17 +1,3 @@
////////// PROJECT FILE SEPARATION LINE ////////// CODE AFTER THIS LINE ARE FROM: <ProjectRoot>/Data/Engine/web-interface/src/GUI_STYLING_GUIDE.md
# This guide moved
# Borealis MagicUI Styling Guide
- **Aurora Shells**: Page containers should sit on aurora gradients that blend deep navy (#040711) with soft cyan/violet blooms plus a subtle border (`rgba(148,163,184,0.35)`) and low, velvety drop shadows to create depth without harsh edges.
- **Full-Bleed Canvas**: Let hero shells run edge-to-edge inside the content column (no dark voids); reserve inset padding for interior cards so gradients feel immersive.
- **Glass Panels**: Primary panels/cards use glassmorphic layers (`rgba(15,23,42,0.7)`), rounded 16-24px corners, blurred backdrops, and micro borders; add radial light flares via pseudo-elements for motion while keeping content readable.
- **Hero Storytelling**: Each view begins with a stat-forward hero—gradient StatTiles (min 160px) and uppercase pills (HERO_BADGE_SX) summarize live signals, active filters, and selections so telemetry feels alive at a glance.
- **Summary Data Grids**: When metadata runs long (device summary, network facts) render it with AG Grid inside a glass wrapper—two columns (Field/Value), matte navy background, and no row striping so it reads like structured cards instead of spreadsheets.
- **Tile Palettes**: Online tiles lean cyan→green, stale tiles go orange→red, “needs update” stays violet→cyan, and secondary metrics fade from cyan into desaturated steel so each KPI has a consistent hue family across pages.
- **Hardware Islands**: Storage, memory, and network blocks reuse the Quartz AG Grid theme inside rounded glass shells with flat fills (no gradients); show readable numeric columns (`Capacity`, `Used`, `Free`, `%`) instead of custom bars so panels match the Device Inventory surface.
- **Action Surfaces**: Control bars (view selectors, tool strips) live inside translucent glass bands with generous spacing; selectors get filled dark inputs with cyan hover borders, while primary actions are pill-shaped gradients and secondary controls use soft-outline icon buttons.
- **Anchored Controls**: Align view selectors/utility buttons directly with grid edges, keeping the controls in a single horizontal row that feels docked to the data surface; reserve glass backdrops for hero sections so the content canvas stays flush.
- **Buttons & Chips**: Reserve gradient pills (`linear-gradient(135deg,#34d399,#22d3ee)` for success, `#7dd3fc→#c084fc` for creation) for primary CTAs; neutral actions rely on rounded outlines with `rgba(148,163,184,0.4)` borders and uppercase microcopy for supporting tokens.
- **Rainbow Accents**: When highlighting creation CTAs (e.g., Add Device), use dark-fill pill buttons with rainbow border gradients (dual-layer background clip) so the surface stays matte while the perimeter shimmers through the aurora palette, and pair every CTA glow with the same teal halo used on Quick Job for cohesion.
- **AG Grid Treatment**: Stick with the Quartz theme but override backgrounds so headers are matte navy, alternating rows have subtle opacity shifts, and interactions (hover/selection) glow with cyan/magenta washes; rounded wrappers, soft borders, and inset selection glows keep the grid cohesive with the rest of the MagicUI surface.
- **Overlays & Menus**: Menus, popovers, and dialogs share the same `rgba(8,12,24,0.96)` canvas, blurred backdrops, and thin steel borders; keep typography bright, inputs filled with deep blue glass, and accent colors aligned (cyan for confirm, mauve for destructive).
The MagicUI styling language and AG Grid rules now live in `Docs/Codex/Shared/ui/README.md` alongside references to `Data/Engine/web-interface/src/Admin/Page_Template.jsx`. Use that doc as the single source of truth for UI patterns and AG Grid behavior.

View File

@@ -243,20 +243,6 @@ export default function Login({ onLogin }) {
position: relative;
width: var(--card-w);
border-radius: var(--radius-xl);
padding: 1px; /* border thickness */
background:
conic-gradient(from 0deg,
rgba(88,166,255,.0),
rgba(88,166,255,.45),
rgba(34,211,238,.0),
rgba(125, 255, 183, 0.3),
rgba(34,211,238,.0),
rgba(88,166,255,.45),
rgba(167,139,250,.0));
-webkit-mask: linear-gradient(#000 0 0) content-box, linear-gradient(#000 0 0);
-webkit-mask-composite: xor;
mask-composite: exclude;
/* rotation disabled per bugfix */
box-shadow:
0 0 0 1px rgba(255,255,255,0.06) inset,
0 10px 30px rgba(0,0,0,.5);
@@ -267,7 +253,18 @@ export default function Login({ onLogin }) {
}
.shine-inner {
border-radius: inherit;
background: linear-gradient(180deg, rgba(17,22,36,.9), rgba(11,14,20,.9));
position: relative;
border: 1px solid transparent;
background:
linear-gradient(180deg, rgba(17,22,36,.92), rgba(11,14,20,.92)) padding-box,
conic-gradient(from 0deg,
rgba(88,166,255,.0),
rgba(88,166,255,.45),
rgba(34,211,238,.0),
rgba(125, 255, 183, 0.3),
rgba(34,211,238,.0),
rgba(88,166,255,.45),
rgba(167,139,250,.0)) border-box;
backdrop-filter: blur(10px);
padding: 28px 24px;
}
@@ -487,4 +484,3 @@ export default function Login({ onLogin }) {
</>
);
}

View File

@@ -24,7 +24,6 @@ import {
VpnKey as CredentialIcon,
PersonOutline as UserIcon,
GitHub as GitHubIcon,
Key as KeyIcon,
Dashboard as PageTemplateIcon,
AdminPanelSettings as AdminPanelSettingsIcon,
ReceiptLong as LogsIcon,
@@ -61,7 +60,6 @@ function NavigationSidebar({ currentPage, onNavigate, isAdmin = false }) {
"winrm_devices",
"agent_devices",
"admin_device_approvals",
"admin_enrollment_codes",
].includes(currentPage),
automation: ["jobs", "assemblies", "community"].includes(currentPage),
filters: ["filters", "groups"].includes(currentPage),
@@ -194,12 +192,6 @@ function NavigationSidebar({ currentPage, onNavigate, isAdmin = false }) {
label="Device Approvals"
pageKey="admin_device_approvals"
/>
<NavItem
icon={<KeyIcon fontSize="small" />}
label="Enrollment Codes"
pageKey="admin_enrollment_codes"
indent
/>
<NavItem
icon={<DevicesIcon fontSize="small" />}
label="Devices"

View File

@@ -355,7 +355,7 @@ function ComponentCard({ comp, onRemove, onVariableChange, errors = {} }) {
);
}
export default function CreateJob({ onCancel, onCreated, initialJob = null }) {
export default function CreateJob({ onCancel, onCreated, initialJob = null, quickJobDraft = null, onConsumeQuickJobDraft }) {
const [tab, setTab] = useState(0);
const [jobName, setJobName] = useState("");
const [pageTitleJobName, setPageTitleJobName] = useState("");
@@ -376,6 +376,7 @@ export default function CreateJob({ onCancel, onCreated, initialJob = null }) {
const [assembliesLoading, setAssembliesLoading] = useState(false);
const [assembliesError, setAssembliesError] = useState("");
const assemblyExportCacheRef = useRef(new Map());
const quickDraftAppliedRef = useRef(null);
const loadCredentials = useCallback(async () => {
setCredentialLoading(true);
@@ -508,6 +509,25 @@ export default function CreateJob({ onCancel, onCreated, initialJob = null }) {
const [selectedTargets, setSelectedTargets] = useState({}); // map hostname->bool
const [deviceSearch, setDeviceSearch] = useState("");
const [componentVarErrors, setComponentVarErrors] = useState({});
const [quickJobMeta, setQuickJobMeta] = useState(null);
const primaryComponentName = useMemo(() => {
if (!components.length) return "";
const first = components[0] || {};
const candidates = [
first.displayName,
first.name,
first.component_name,
first.script_name,
first.script_path,
first.path
];
for (const candidate of candidates) {
if (typeof candidate === "string" && candidate.trim()) {
return candidate.trim();
}
}
return "";
}, [components]);
const [deviceRows, setDeviceRows] = useState([]);
const [deviceStatusFilter, setDeviceStatusFilter] = useState(null);
const [deviceOrderBy, setDeviceOrderBy] = useState("hostname");
@@ -917,9 +937,28 @@ export default function CreateJob({ onCancel, onCreated, initialJob = null }) {
return true;
}, [jobName, components.length, targets.length, scheduleType, startDateTime, remoteExec, selectedCredentialId, execContext, useSvcAccount]);
const handleJobNameInputChange = useCallback((value) => {
setJobName(value);
setQuickJobMeta((prev) => {
if (!prev?.allowAutoRename) return prev;
if (!prev.currentAutoName) return prev;
if (value.trim() !== prev.currentAutoName.trim()) {
return { ...prev, allowAutoRename: false };
}
return prev;
});
}, []);
const [confirmOpen, setConfirmOpen] = useState(false);
const editing = !!(initialJob && initialJob.id);
useEffect(() => {
if (editing) {
quickDraftAppliedRef.current = null;
setQuickJobMeta(null);
}
}, [editing]);
// --- Job History (only when editing) ---
const [historyRows, setHistoryRows] = useState([]);
const [historyOrderBy, setHistoryOrderBy] = useState("started_ts");
@@ -1550,6 +1589,60 @@ export default function CreateJob({ onCancel, onCreated, initialJob = null }) {
return base;
}, [editing]);
useEffect(() => {
if (editing) return;
if (!quickJobDraft || !quickJobDraft.id) return;
if (quickDraftAppliedRef.current === quickJobDraft.id) return;
quickDraftAppliedRef.current = quickJobDraft.id;
const uniqueTargets = [];
const pushTarget = (value) => {
const normalized = typeof value === "string" ? value.trim() : "";
if (!normalized) return;
if (!uniqueTargets.includes(normalized)) uniqueTargets.push(normalized);
};
const incoming = Array.isArray(quickJobDraft.hostnames) ? quickJobDraft.hostnames : [];
incoming.forEach(pushTarget);
setTargets(uniqueTargets);
setSelectedTargets({});
setComponents([]);
setComponentVarErrors({});
const normalizedSchedule = String(quickJobDraft.scheduleType || "immediately").trim().toLowerCase() || "immediately";
setScheduleType(normalizedSchedule);
const placeholderAssembly = (quickJobDraft.placeholderAssemblyLabel || "Choose Assembly").trim() || "Choose Assembly";
const deviceLabel = (quickJobDraft.deviceLabel || uniqueTargets[0] || "Selected Device").trim() || "Selected Device";
const initialName = `Quick Job - ${placeholderAssembly} - ${deviceLabel}`;
setJobName(initialName);
setPageTitleJobName(initialName.trim());
setQuickJobMeta({
id: quickJobDraft.id,
deviceLabel,
allowAutoRename: true,
currentAutoName: initialName
});
const targetTabKey = quickJobDraft.initialTabKey || "components";
const tabIndex = tabDefs.findIndex((t) => t.key === targetTabKey);
if (tabIndex >= 0) setTab(tabIndex);
else if (tabDefs.length > 1) setTab(1);
if (typeof onConsumeQuickJobDraft === "function") {
onConsumeQuickJobDraft(quickJobDraft.id);
}
}, [editing, quickJobDraft, tabDefs, onConsumeQuickJobDraft]);
useEffect(() => {
if (!quickJobMeta?.allowAutoRename) return;
if (!primaryComponentName) return;
const deviceLabel = quickJobMeta.deviceLabel || "Selected Device";
const newName = `Quick Job - ${primaryComponentName} - ${deviceLabel}`;
if (jobName === newName) return;
setJobName(newName);
setPageTitleJobName(newName.trim());
setQuickJobMeta((prev) => {
if (!prev) return prev;
if (!prev.allowAutoRename) return prev;
return { ...prev, currentAutoName: newName };
});
}, [primaryComponentName, quickJobMeta, jobName]);
return (
<Paper sx={{ m: 2, p: 0, bgcolor: "#1e1e1e", overflow: "auto" }} elevation={2}>
<Box sx={{ p: 2, pb: 1 }}>
@@ -1604,7 +1697,7 @@ export default function CreateJob({ onCancel, onCreated, initialJob = null }) {
}}
placeholder="Example Job Name"
value={jobName}
onChange={(e) => setJobName(e.target.value)}
onChange={(e) => handleJobNameInputChange(e.target.value)}
onBlur={(e) => setPageTitleJobName(e.target.value.trim())}
InputLabelProps={{ shrink: true }}
error={jobName.trim().length === 0}

View File

@@ -1,659 +0,0 @@
import React, { useEffect, useState, useCallback, useMemo, useRef } from "react";
import {
Dialog,
DialogTitle,
DialogContent,
DialogActions,
Button,
Box,
Typography,
Paper,
FormControlLabel,
Checkbox,
TextField,
FormControl,
InputLabel,
Select,
MenuItem,
CircularProgress,
Chip
} from "@mui/material";
import { Folder as FolderIcon, Description as DescriptionIcon } from "@mui/icons-material";
import { SimpleTreeView, TreeItem } from "@mui/x-tree-view";
import { DomainBadge } from "../Assemblies/Assembly_Badges";
import {
buildAssemblyIndex,
buildAssemblyTree,
normalizeAssemblyPath,
parseAssemblyExport
} from "../Assemblies/assemblyUtils";
const DIALOG_SHELL_SX = {
backgroundImage: "linear-gradient(120deg,#040711 0%,#0b1222 55%,#020617 100%)",
border: "1px solid rgba(148,163,184,0.35)",
boxShadow: "0 28px 60px rgba(2,6,12,0.65)",
borderRadius: 3,
color: "#e2e8f0",
overflow: "hidden"
};
const GLASS_PANEL_SX = {
backgroundColor: "rgba(15,23,42,0.78)",
border: "1px solid rgba(148,163,184,0.35)",
borderRadius: 3,
boxShadow: "0 16px 40px rgba(2,6,15,0.45)",
backdropFilter: "blur(22px)"
};
const PRIMARY_PILL_GRADIENT = "linear-gradient(135deg,#34d399,#22d3ee)";
const SECONDARY_PILL_GRADIENT = "linear-gradient(135deg,#7dd3fc,#c084fc)";
export default function QuickJob({ open, onClose, hostnames = [] }) {
const [assemblyPayload, setAssemblyPayload] = useState({ items: [], queue: [] });
const [assembliesLoading, setAssembliesLoading] = useState(false);
const [assembliesError, setAssembliesError] = useState("");
const [selectedAssemblyGuid, setSelectedAssemblyGuid] = useState("");
const [running, setRunning] = useState(false);
const [error, setError] = useState("");
const [runAsCurrentUser, setRunAsCurrentUser] = useState(false);
const [mode, setMode] = useState("scripts"); // 'scripts' | 'ansible'
const [credentials, setCredentials] = useState([]);
const [credentialsLoading, setCredentialsLoading] = useState(false);
const [credentialsError, setCredentialsError] = useState("");
const [selectedCredentialId, setSelectedCredentialId] = useState("");
const [useSvcAccount, setUseSvcAccount] = useState(true);
const [variables, setVariables] = useState([]);
const [variableValues, setVariableValues] = useState({});
const [variableErrors, setVariableErrors] = useState({});
const [variableStatus, setVariableStatus] = useState({ loading: false, error: "" });
const assemblyExportCacheRef = useRef(new Map());
const loadAssemblies = useCallback(async () => {
setAssembliesLoading(true);
setAssembliesError("");
try {
const resp = await fetch("/api/assemblies");
if (!resp.ok) {
const detail = await resp.text();
throw new Error(detail || `HTTP ${resp.status}`);
}
const data = await resp.json();
assemblyExportCacheRef.current.clear();
setAssemblyPayload({
items: Array.isArray(data?.items) ? data.items : [],
queue: Array.isArray(data?.queue) ? data.queue : []
});
} catch (err) {
console.error("Failed to load assemblies:", err);
setAssemblyPayload({ items: [], queue: [] });
setAssembliesError(err?.message || "Failed to load assemblies");
} finally {
setAssembliesLoading(false);
}
}, []);
const assemblyIndex = useMemo(
() => buildAssemblyIndex(assemblyPayload.items, assemblyPayload.queue),
[assemblyPayload.items, assemblyPayload.queue]
);
const scriptTreeData = useMemo(
() => buildAssemblyTree(assemblyIndex.grouped?.scripts || [], { rootLabel: "Scripts" }),
[assemblyIndex]
);
const ansibleTreeData = useMemo(
() => buildAssemblyTree(assemblyIndex.grouped?.ansible || [], { rootLabel: "Ansible Playbooks" }),
[assemblyIndex]
);
const selectedAssembly = useMemo(() => {
if (!selectedAssemblyGuid) return null;
const guid = selectedAssemblyGuid.toLowerCase();
return assemblyIndex.byGuid?.get(guid) || null;
}, [selectedAssemblyGuid, assemblyIndex]);
const loadAssemblyExport = useCallback(
async (assemblyGuid) => {
const cacheKey = assemblyGuid.toLowerCase();
if (assemblyExportCacheRef.current.has(cacheKey)) {
return assemblyExportCacheRef.current.get(cacheKey);
}
const resp = await fetch(`/api/assemblies/${encodeURIComponent(assemblyGuid)}/export`);
if (!resp.ok) {
throw new Error(`Failed to load assembly (HTTP ${resp.status})`);
}
const data = await resp.json();
assemblyExportCacheRef.current.set(cacheKey, data);
return data;
},
[]
);
useEffect(() => {
if (!open) {
setSelectedAssemblyGuid("");
return;
}
setSelectedAssemblyGuid("");
setError("");
setVariables([]);
setVariableValues({});
setVariableErrors({});
setVariableStatus({ loading: false, error: "" });
setUseSvcAccount(true);
setSelectedCredentialId("");
if (!assemblyPayload.items.length && !assembliesLoading) {
loadAssemblies();
}
}, [open, loadAssemblies, assemblyPayload.items.length, assembliesLoading]);
useEffect(() => {
if (!open) return;
setSelectedAssemblyGuid("");
setVariables([]);
setVariableValues({});
setVariableErrors({});
setVariableStatus({ loading: false, error: "" });
}, [mode, open]);
useEffect(() => {
if (!open || mode !== "ansible") return;
let canceled = false;
setCredentialsLoading(true);
setCredentialsError("");
(async () => {
try {
const resp = await fetch("/api/credentials");
if (!resp.ok) throw new Error(`HTTP ${resp.status}`);
const data = await resp.json();
if (canceled) return;
const list = Array.isArray(data?.credentials)
? data.credentials.filter((cred) => {
const conn = String(cred.connection_type || "").toLowerCase();
return conn === "ssh" || conn === "winrm";
})
: [];
list.sort((a, b) => String(a?.name || "").localeCompare(String(b?.name || "")));
setCredentials(list);
} catch (err) {
if (!canceled) {
setCredentials([]);
setCredentialsError(String(err.message || err));
}
} finally {
if (!canceled) setCredentialsLoading(false);
}
})();
return () => {
canceled = true;
};
}, [open, mode]);
useEffect(() => {
if (!open) {
setSelectedCredentialId("");
}
}, [open]);
useEffect(() => {
if (mode !== "ansible" || useSvcAccount) return;
if (!credentials.length) {
setSelectedCredentialId("");
return;
}
if (!selectedCredentialId || !credentials.some((cred) => String(cred.id) === String(selectedCredentialId))) {
setSelectedCredentialId(String(credentials[0].id));
}
}, [mode, credentials, selectedCredentialId, useSvcAccount]);
const renderNodes = (nodes = []) =>
nodes.map((n) => (
<TreeItem
key={n.id}
itemId={n.id}
label={
<Box sx={{ display: "flex", alignItems: "center" }}>
{n.isFolder ? (
<FolderIcon fontSize="small" sx={{ mr: 1, color: "#ccc" }} />
) : (
<DescriptionIcon fontSize="small" sx={{ mr: 1, color: "#ccc" }} />
)}
<Typography variant="body2" sx={{ color: "#e6edf3" }}>{n.label}</Typography>
</Box>
}
>
{n.children && n.children.length ? renderNodes(n.children) : null}
</TreeItem>
));
const onItemSelect = useCallback(
(_e, itemId) => {
const treeData = mode === "ansible" ? ansibleTreeData : scriptTreeData;
const node = treeData.map[itemId];
if (node && !node.isFolder && node.assemblyGuid) {
setSelectedAssemblyGuid(node.assemblyGuid);
setError("");
setVariableErrors({});
}
},
[mode, ansibleTreeData, scriptTreeData]
);
const deriveInitialValue = (variable) => {
const { type, default: defaultValue } = variable;
if (type === "boolean") {
if (typeof defaultValue === "boolean") return defaultValue;
if (defaultValue == null) return false;
const str = String(defaultValue).trim().toLowerCase();
if (!str) return false;
return ["true", "1", "yes", "on"].includes(str);
}
if (type === "number") {
if (defaultValue == null || defaultValue === "") return "";
if (typeof defaultValue === "number" && Number.isFinite(defaultValue)) {
return String(defaultValue);
}
const parsed = Number(defaultValue);
return Number.isFinite(parsed) ? String(parsed) : "";
}
return defaultValue == null ? "" : String(defaultValue);
};
useEffect(() => {
if (!selectedAssemblyGuid) {
setVariables([]);
setVariableValues({});
setVariableErrors({});
setVariableStatus({ loading: false, error: "" });
return;
}
let canceled = false;
(async () => {
setVariableStatus({ loading: true, error: "" });
try {
const exportDoc = await loadAssemblyExport(selectedAssemblyGuid);
if (canceled) return;
const parsed = parseAssemblyExport(exportDoc);
const defs = Array.isArray(parsed.variables) ? parsed.variables : [];
setVariables(defs);
const initialValues = {};
defs.forEach((v) => {
if (!v || !v.name) return;
initialValues[v.name] = deriveInitialValue(v);
});
setVariableValues(initialValues);
setVariableErrors({});
setVariableStatus({ loading: false, error: "" });
} catch (err) {
if (canceled) return;
setVariables([]);
setVariableValues({});
setVariableErrors({});
setVariableStatus({ loading: false, error: err?.message || String(err) });
}
})();
return () => {
canceled = true;
};
}, [selectedAssemblyGuid, loadAssemblyExport]);
const handleVariableChange = (variable, rawValue) => {
const { name, type } = variable;
if (!name) return;
setVariableValues((prev) => ({
...prev,
[name]: type === "boolean" ? Boolean(rawValue) : rawValue
}));
setVariableErrors((prev) => {
if (!prev[name]) return prev;
const next = { ...prev };
delete next[name];
return next;
});
};
const buildVariablePayload = () => {
const payload = {};
variables.forEach((variable) => {
if (!variable?.name) return;
const { name, type } = variable;
const hasOverride = Object.prototype.hasOwnProperty.call(variableValues, name);
const raw = hasOverride ? variableValues[name] : deriveInitialValue(variable);
if (type === "boolean") {
payload[name] = Boolean(raw);
} else if (type === "number") {
if (raw === "" || raw === null || raw === undefined) {
payload[name] = "";
} else {
const num = Number(raw);
payload[name] = Number.isFinite(num) ? num : "";
}
} else {
payload[name] = raw == null ? "" : String(raw);
}
});
return payload;
};
const onRun = async () => {
if (!selectedAssembly) {
setError(mode === "ansible" ? "Please choose a playbook to run." : "Please choose a script to run.");
return;
}
if (mode === "ansible" && !useSvcAccount && !selectedCredentialId) {
setError("Select a credential to run this playbook.");
return;
}
if (variables.length) {
const errors = {};
variables.forEach((variable) => {
if (!variable) return;
if (!variable.required) return;
if (variable.type === "boolean") return;
const hasOverride = Object.prototype.hasOwnProperty.call(variableValues, variable.name);
const raw = hasOverride ? variableValues[variable.name] : deriveInitialValue(variable);
if (raw == null || raw === "") {
errors[variable.name] = "Required";
}
});
if (Object.keys(errors).length) {
setVariableErrors(errors);
setError("Please fill in all required variable values.");
return;
}
}
setRunning(true);
setError("");
try {
let resp;
const variableOverrides = buildVariablePayload();
const normalizedPath = normalizeAssemblyPath(
mode === "ansible" ? "ansible" : "script",
selectedAssembly.path || "",
selectedAssembly.displayName
);
if (mode === "ansible") {
resp = await fetch("/api/ansible/quick_run", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
playbook_path: normalizedPath,
hostnames,
variable_values: variableOverrides,
credential_id: !useSvcAccount && selectedCredentialId ? Number(selectedCredentialId) : null,
use_service_account: Boolean(useSvcAccount)
})
});
} else {
resp = await fetch("/api/scripts/quick_run", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
script_path: normalizedPath,
hostnames,
run_mode: runAsCurrentUser ? "current_user" : "system",
variable_values: variableOverrides
})
});
}
const contentType = String(resp.headers.get("content-type") || "");
let data = null;
if (contentType.includes("application/json")) {
data = await resp.json().catch(() => null);
} else {
const text = await resp.text().catch(() => "");
if (text && text.trim()) {
data = { error: text.trim() };
}
}
if (!resp.ok) {
const message = data?.error || data?.message || `HTTP ${resp.status}`;
throw new Error(message);
}
onClose && onClose();
} catch (err) {
setError(String(err.message || err));
} finally {
setRunning(false);
}
};
const credentialRequired = mode === "ansible" && !useSvcAccount;
const disableRun =
running ||
!selectedAssembly ||
(credentialRequired && (!selectedCredentialId || !credentials.length));
const activeTreeData = mode === "ansible" ? ansibleTreeData : scriptTreeData;
const treeItems = Array.isArray(activeTreeData.root) ? activeTreeData.root : [];
const targetCount = hostnames.length;
const hostPreview = hostnames.slice(0, 3).join(", ");
const remainingHosts = Math.max(targetCount - 3, 0);
return (
<Dialog
open={open}
onClose={running ? undefined : onClose}
fullWidth
maxWidth="lg"
PaperProps={{ sx: DIALOG_SHELL_SX }}
>
<DialogTitle sx={{ pb: 0 }}>
<Box
sx={{
display: "flex",
flexDirection: { xs: "column", sm: "row" },
justifyContent: "space-between",
gap: 2
}}
>
<Box>
<Typography sx={{ fontWeight: 600, letterSpacing: 0.4 }}>Quick Job</Typography>
<Typography variant="body2" sx={{ color: "rgba(226,232,240,0.78)" }}>
Dispatch {mode === "ansible" ? "playbooks" : "scripts"} through the runner.
</Typography>
</Box>
<Box sx={{ display: "flex", gap: 1.5, flexWrap: "wrap" }}>
<Paper sx={{ ...GLASS_PANEL_SX, px: 2, py: 1 }}>
<Typography variant="caption" sx={{ textTransform: "uppercase", color: "rgba(226,232,240,0.7)", letterSpacing: 1 }}>
Targets
</Typography>
<Typography variant="h6">{hostnames.length || "—"}</Typography>
</Paper>
<Paper sx={{ ...GLASS_PANEL_SX, px: 2, py: 1 }}>
<Typography variant="caption" sx={{ textTransform: "uppercase", color: "rgba(226,232,240,0.7)", letterSpacing: 1 }}>
Mode
</Typography>
<Typography variant="h6">{mode === "ansible" ? "Ansible" : "Script"}</Typography>
</Paper>
</Box>
</Box>
</DialogTitle>
<DialogContent>
<Box sx={{ display: 'flex', alignItems: 'center', gap: 1, mb: 1 }}>
<Button size="small" variant={mode === 'scripts' ? 'outlined' : 'text'} onClick={() => setMode('scripts')} sx={{ textTransform: 'none', color: '#58a6ff', borderColor: '#58a6ff' }}>Scripts</Button>
<Button size="small" variant={mode === 'ansible' ? 'outlined' : 'text'} onClick={() => setMode('ansible')} sx={{ textTransform: 'none', color: '#58a6ff', borderColor: '#58a6ff' }}>Ansible</Button>
</Box>
<Typography variant="body2" sx={{ color: "#aaa", mb: 1 }}>
Select a {mode === 'ansible' ? 'playbook' : 'script'} to run on {hostnames.length} device{hostnames.length !== 1 ? "s" : ""}.
</Typography>
{assembliesError ? (
<Typography variant="body2" sx={{ color: "#ff8080", mb: 1 }}>{assembliesError}</Typography>
) : null}
{mode === 'ansible' && (
<Box sx={{ display: "flex", alignItems: "center", gap: 1.5, flexWrap: "wrap", mb: 2 }}>
<FormControlLabel
control={
<Checkbox
checked={useSvcAccount}
onChange={(e) => {
const checked = e.target.checked;
setUseSvcAccount(checked);
if (checked) {
setSelectedCredentialId("");
} else if (!selectedCredentialId && credentials.length) {
setSelectedCredentialId(String(credentials[0].id));
}
}}
size="small"
/>
}
label="Use Configured svcBorealis Account"
sx={{ mr: 2 }}
/>
<FormControl
size="small"
sx={{ minWidth: 260 }}
disabled={useSvcAccount || credentialsLoading || !credentials.length}
>
<InputLabel sx={{ color: "#aaa" }}>Credential</InputLabel>
<Select
value={selectedCredentialId}
label="Credential"
onChange={(e) => setSelectedCredentialId(e.target.value)}
sx={{ bgcolor: "#1f1f1f", color: "#fff" }}
>
{credentials.map((cred) => {
const conn = String(cred.connection_type || "").toUpperCase();
return (
<MenuItem key={cred.id} value={String(cred.id)}>
{cred.name}
{conn ? ` (${conn})` : ""}
</MenuItem>
);
})}
</Select>
</FormControl>
{useSvcAccount && (
<Typography variant="body2" sx={{ color: "#aaa" }}>
Runs with the agent&apos;s svcBorealis account.
</Typography>
)}
{credentialsLoading && <CircularProgress size={18} sx={{ color: "#58a6ff" }} />}
{!credentialsLoading && credentialsError && (
<Typography variant="body2" sx={{ color: "#ff8080" }}>{credentialsError}</Typography>
)}
{!useSvcAccount && !credentialsLoading && !credentialsError && !credentials.length && (
<Typography variant="body2" sx={{ color: "#ff8080" }}>
No SSH or WinRM credentials available. Create one under Access Management.
</Typography>
)}
</Box>
)}
<Box sx={{ display: "flex", gap: 2 }}>
<Paper sx={{ flex: 1, p: 1, bgcolor: "#1e1e1e", maxHeight: 400, overflow: "auto" }}>
<SimpleTreeView sx={{ color: "#e6edf3" }} onItemSelectionToggle={onItemSelect}>
{assembliesLoading ? (
<Box sx={{ display: "flex", alignItems: "center", gap: 1, px: 1, py: 0.5, color: "#7db7ff" }}>
<CircularProgress size={18} sx={{ color: "#58a6ff" }} />
<Typography variant="body2">Loading assemblies</Typography>
</Box>
) : treeItems.length ? (
renderNodes(treeItems)
) : (
<Typography variant="body2" sx={{ color: "#888", p: 1 }}>
{mode === 'ansible' ? 'No playbooks found.' : 'No scripts found.'}
</Typography>
)}
</SimpleTreeView>
</Paper>
<Box sx={{ width: 320 }}>
<Typography variant="subtitle2" sx={{ color: "#ccc", mb: 1 }}>Selection</Typography>
{selectedAssembly ? (
<Box sx={{ display: "flex", flexDirection: "column", gap: 0.5 }}>
<Box sx={{ display: "flex", alignItems: "center", gap: 1 }}>
<Typography variant="body2" sx={{ color: "#e6edf3" }}>{selectedAssembly.displayName}</Typography>
<DomainBadge domain={selectedAssembly.domain} size="small" />
</Box>
<Typography variant="body2" sx={{ color: "#aaa" }}>{selectedAssembly.path}</Typography>
</Box>
) : (
<Typography variant="body2" sx={{ color: "#888" }}>
{mode === 'ansible' ? 'No playbook selected' : 'No script selected'}
</Typography>
)}
<Box sx={{ mt: 2 }}>
{mode !== 'ansible' && (
<>
<FormControlLabel
control={<Checkbox size="small" checked={runAsCurrentUser} onChange={(e) => setRunAsCurrentUser(e.target.checked)} />}
label={<Typography variant="body2">Run as currently logged-in user</Typography>}
/>
<Typography variant="caption" sx={{ color: "#888" }}>
Unchecked = Run-As BUILTIN\SYSTEM
</Typography>
</>
)}
</Box>
<Box sx={{ mt: 3 }}>
<Typography variant="subtitle2" sx={{ color: "#ccc", mb: 1 }}>Variables</Typography>
{variableStatus.loading ? (
<Typography variant="body2" sx={{ color: "#888" }}>Loading variables</Typography>
) : variableStatus.error ? (
<Typography variant="body2" sx={{ color: "#ff4f4f" }}>{variableStatus.error}</Typography>
) : variables.length ? (
<Box sx={{ display: "flex", flexDirection: "column", gap: 1.5 }}>
{variables.map((variable) => (
<Box key={variable.name}>
{variable.type === "boolean" ? (
<FormControlLabel
control={(
<Checkbox
size="small"
checked={Boolean(variableValues[variable.name])}
onChange={(e) => handleVariableChange(variable, e.target.checked)}
/>
)}
label={
<Typography variant="body2">
{variable.label}
{variable.required ? " *" : ""}
</Typography>
}
/>
) : (
<TextField
fullWidth
size="small"
label={`${variable.label}${variable.required ? " *" : ""}`}
type={variable.type === "number" ? "number" : variable.type === "credential" ? "password" : "text"}
value={variableValues[variable.name] ?? ""}
onChange={(e) => handleVariableChange(variable, e.target.value)}
InputLabelProps={{ shrink: true }}
sx={{
"& .MuiOutlinedInput-root": { bgcolor: "#1b1b1b", color: "#e6edf3" },
"& .MuiInputBase-input": { color: "#e6edf3" }
}}
error={Boolean(variableErrors[variable.name])}
helperText={variableErrors[variable.name] || variable.description || ""}
/>
)}
{variable.type === "boolean" && variable.description ? (
<Typography variant="caption" sx={{ color: "#888", ml: 3 }}>
{variable.description}
</Typography>
) : null}
</Box>
))}
</Box>
) : (
<Typography variant="body2" sx={{ color: "#888" }}>No variables defined for this assembly.</Typography>
)}
</Box>
{error && (
<Typography variant="body2" sx={{ color: "#ff4f4f", mt: 1 }}>{error}</Typography>
)}
</Box>
</Box>
</DialogContent>
<DialogActions>
<Button onClick={onClose} disabled={running} sx={{ color: "#58a6ff" }}>Cancel</Button>
<Button onClick={onRun} disabled={disableRun}
sx={{ color: disableRun ? "#666" : "#58a6ff" }}
>
Run
</Button>
</DialogActions>
</Dialog>
);
}

View File

@@ -1,13 +1,3 @@
// /////////////////////////////////////////////////////////////////////////////
// Scheduled_Jobs_List — Borealis MagicUI styling parity with Page Template
// - Aurora gradient shell
// - Small Material icon LEFT of title
// - Subtitle under title
// - Top-right utility buttons (Refresh, Create Job, Settings)
// - AG Grid Quartz theme + square checkboxes, rounded chrome
// - Keeps all original logic + renderers
// /////////////////////////////////////////////////////////////////////////////
import React, {
useCallback,
useEffect,
@@ -87,6 +77,23 @@ const gradientButtonSx = {
},
};
const FILTER_OPTIONS = [
{ key: "all", label: "All" },
{ key: "immediate", label: "Immediate" },
{ key: "scheduled", label: "Scheduled" },
{ key: "recurring", label: "Recurring" },
{ key: "completed", label: "Completed" },
];
const AUTO_SIZE_COLUMNS = [
"name",
"componentsMeta",
"target",
"occurrence",
"lastRun",
"nextRun",
"resultsCounts",
];
function ResultsBar({ counts }) {
const total = Math.max(1, Number(counts?.total_targets || 0));
const sections = [
@@ -159,10 +166,40 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
const [error, setError] = useState("");
const [bulkDeleteOpen, setBulkDeleteOpen] = useState(false);
const [selectedIds, setSelectedIds] = useState(() => new Set());
const [jobFilterMode, setJobFilterMode] = useState("all");
const [assembliesPayload, setAssembliesPayload] = useState({ items: [], queue: [] });
const [assembliesLoading, setAssembliesLoading] = useState(false);
const [assembliesError, setAssembliesError] = useState("");
const gridApiRef = useRef(null);
const autoSizeTrackedColumns = useCallback(() => {
const api = gridApiRef.current;
if (!api) return;
const run = () => {
try {
api.autoSizeColumns(AUTO_SIZE_COLUMNS, true);
} catch {
/* ignore auto-size errors triggered during async refresh */
}
};
if (typeof window !== "undefined" && typeof window.requestAnimationFrame === "function") {
window.requestAnimationFrame(run);
} else {
setTimeout(run, 0);
}
}, []);
const deriveRowKey = useCallback((row, index = "") => {
if (row && row.id != null && row.id !== "") {
return String(row.id);
}
if (row && row.name) {
return String(row.name);
}
if (index !== undefined && index !== null && index !== "") {
return `__row_${index}`;
}
return "";
}, []);
const assembliesCellRenderer = useCallback((params) => {
const list = params?.data?.componentsMeta || [];
@@ -302,14 +339,54 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
? `${j.targets.length} device${j.targets.length !== 1 ? "s" : ""}`
: "";
const occurrence = pretty(j.schedule_type || "immediately");
const fallbackTargetCount = Array.isArray(j.targets) ? j.targets.length : 0;
const resultsCounts = {
total_targets: Array.isArray(j.targets) ? j.targets.length : 0,
pending: Array.isArray(j.targets) ? j.targets.length : 0,
total_targets: fallbackTargetCount,
pending: fallbackTargetCount,
...(j.result_counts || {})
};
if (resultsCounts && resultsCounts.total_targets == null) {
resultsCounts.total_targets = Array.isArray(j.targets) ? j.targets.length : 0;
if (resultsCounts.total_targets == null || Number.isNaN(Number(resultsCounts.total_targets))) {
resultsCounts.total_targets = fallbackTargetCount;
}
const normalizeCount = (value) => {
const num = Number(value);
return Number.isFinite(num) ? num : 0;
};
const totalTargets = normalizeCount(resultsCounts.total_targets);
const pendingCount = normalizeCount(resultsCounts.pending);
const runningCount = normalizeCount(resultsCounts.running);
const successCount = normalizeCount(resultsCounts.success);
const failedCount = normalizeCount(resultsCounts.failed);
const expiredCount = normalizeCount(resultsCounts.expired);
const timedOutCount = normalizeCount(resultsCounts.timed_out || resultsCounts.timedOut);
const totalFinished = successCount + failedCount + expiredCount + timedOutCount;
const allTargetsEvaluated =
totalTargets > 0
? totalFinished >= totalTargets && pendingCount === 0 && runningCount === 0
: pendingCount === 0 && runningCount === 0;
const everyTargetSuccessful =
totalTargets > 0
? successCount >= totalTargets && pendingCount === 0 && runningCount === 0
: pendingCount === 0 &&
runningCount === 0 &&
failedCount === 0 &&
expiredCount === 0 &&
timedOutCount === 0;
const jobExpiredFlag =
expiredCount > 0 || String(j.last_status || "").toLowerCase() === "expired";
const scheduleRaw = String(j.schedule_type || "").toLowerCase();
const isImmediateType = scheduleRaw === "immediately";
const isScheduledType = scheduleRaw === "once";
const showImmediate = isImmediateType && !allTargetsEvaluated;
const showScheduled = isScheduledType && !allTargetsEvaluated;
const canComplete = isImmediateType || isScheduledType;
const showCompleted = canComplete && (jobExpiredFlag || everyTargetSuccessful);
const categoryFlags = {
immediate: showImmediate,
scheduled: showScheduled,
recurring: !isImmediateType && !isScheduledType,
completed: showCompleted
};
return {
id: j.id,
name: j.name,
@@ -322,6 +399,7 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
result: j.last_status || (j.next_run_ts ? "Scheduled" : ""),
resultsCounts,
enabled: Boolean(j.enabled),
categoryFlags,
raw: { ...j, components: normalizedComponents }
};
});
@@ -330,7 +408,7 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
setSelectedIds((prev) => {
if (!prev.size) return prev;
const valid = new Set(
mappedRows.map((row, index) => row.id ?? row.name ?? String(index))
mappedRows.map((row, index) => deriveRowKey(row, index))
);
let changed = false;
const next = new Set();
@@ -353,7 +431,7 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
}
}
},
[assemblyIndex]
[assemblyIndex, deriveRowKey]
);
useEffect(() => {
@@ -376,34 +454,63 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
};
}, [loadJobs, refreshToken]);
const handleGridReady = useCallback((params) => {
gridApiRef.current = params.api;
}, []);
const handleGridReady = useCallback(
(params) => {
gridApiRef.current = params.api;
autoSizeTrackedColumns();
},
[autoSizeTrackedColumns]
);
const filterCounts = useMemo(() => {
const totals = { all: rows.length, immediate: 0, scheduled: 0, recurring: 0, completed: 0 };
rows.forEach((row) => {
if (row?.categoryFlags?.immediate) totals.immediate += 1;
if (row?.categoryFlags?.scheduled) totals.scheduled += 1;
if (row?.categoryFlags?.recurring) totals.recurring += 1;
if (row?.categoryFlags?.completed) totals.completed += 1;
});
return totals;
}, [rows]);
const filteredRows = useMemo(() => {
if (jobFilterMode === "all") return rows;
return rows.filter((row) => row?.categoryFlags?.[jobFilterMode]);
}, [rows, jobFilterMode]);
const activeFilterLabel = useMemo(() => {
const match = FILTER_OPTIONS.find((option) => option.key === jobFilterMode);
return match ? match.label : jobFilterMode;
}, [jobFilterMode]);
useEffect(() => {
const api = gridApiRef.current;
if (!api) return;
if (loading) {
api.showLoadingOverlay();
} else if (!rows.length) {
} else if (!filteredRows.length) {
api.showNoRowsOverlay();
} else {
api.hideOverlay();
}
}, [loading, rows]);
}, [loading, filteredRows]);
useEffect(() => {
const api = gridApiRef.current;
if (!api) return;
api.forEachNode((node) => {
const shouldSelect = selectedIds.has(node.id);
const nodeKey = deriveRowKey(node?.data, node?.rowIndex);
const shouldSelect = nodeKey && selectedIds.has(nodeKey);
if (node.isSelected() !== shouldSelect) {
node.setSelected(shouldSelect);
}
});
}, [rows, selectedIds]);
}, [deriveRowKey, filteredRows, selectedIds]);
const anySelected = selectedIds.size > 0;
useEffect(() => {
if (!filteredRows.length || loading) return;
autoSizeTrackedColumns();
}, [filteredRows, loading, autoSizeTrackedColumns]);
const handleSelectionChanged = useCallback(() => {
const api = gridApiRef.current;
@@ -411,20 +518,18 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
const selectedNodes = api.getSelectedNodes();
const next = new Set();
selectedNodes.forEach((node) => {
if (node?.id != null) {
next.add(String(node.id));
const nodeKey = deriveRowKey(node?.data, node?.rowIndex);
if (nodeKey) {
next.add(nodeKey);
}
});
setSelectedIds(next);
}, []);
}, [deriveRowKey]);
const getRowId = useCallback((params) => {
return (
params?.data?.id ??
params?.data?.name ??
String(params?.rowIndex ?? "")
);
}, []);
const getRowId = useCallback(
(params) => deriveRowKey(params?.data, params?.rowIndex),
[deriveRowKey]
);
const nameCellRenderer = useCallback(
(params) => {
@@ -525,23 +630,26 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
field: "name",
cellRenderer: nameCellRenderer,
sort: "asc",
minWidth: 220,
minWidth: 150,
cellClass: "auto-col-tight",
},
{
headerName: "Assembly(s)",
field: "componentsMeta",
minWidth: 260,
cellRenderer: assembliesCellRenderer
minWidth: 180,
cellRenderer: assembliesCellRenderer,
cellClass: "auto-col-tight",
},
{ headerName: "Target", field: "target", minWidth: 140 },
{ headerName: "Recurrence", field: "occurrence", minWidth: 160 },
{ headerName: "Last Run", field: "lastRun", minWidth: 160 },
{ headerName: "Next Run", field: "nextRun", minWidth: 160 },
{ headerName: "Target", field: "target", minWidth: 140, cellClass: "auto-col-tight" },
{ headerName: "Recurrence", field: "occurrence", minWidth: 150, cellClass: "auto-col-tight" },
{ headerName: "Last Run", field: "lastRun", minWidth: 150, cellClass: "auto-col-tight" },
{ headerName: "Next Run", field: "nextRun", minWidth: 150, cellClass: "auto-col-tight" },
{
headerName: "Results",
field: "resultsCounts",
minWidth: 280,
cellRenderer: resultsCellRenderer,
cellClass: "auto-col-tight",
sortable: false,
filter: false
},
@@ -549,8 +657,9 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
headerName: "Enabled",
field: "enabled",
minWidth: 140,
maxWidth: 160,
flex: 1,
cellRenderer: enabledCellRenderer,
cellClass: "auto-col-tight",
sortable: false,
filter: false,
resizable: false,
@@ -659,6 +768,74 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
{/* Content area — a bit more top space below subtitle */}
<Box sx={{ mt: "28px", px: 2, pb: 2, flexGrow: 1, minHeight: 0, display: "flex", flexDirection: "column" }}>
<Box sx={{ display: "flex", flexWrap: "wrap", alignItems: "center", justifyContent: "space-between", gap: 1.5, mb: 2, px: 0.5 }}>
<Box
sx={{
display: "inline-flex",
alignItems: "center",
gap: 0.75,
background: "linear-gradient(120deg, rgba(8,12,24,0.92), rgba(4,7,17,0.85))",
borderRadius: 999,
border: "1px solid rgba(148,163,184,0.35)",
boxShadow: "0 18px 48px rgba(2,8,23,0.45)",
padding: "4px"
}}
>
{FILTER_OPTIONS.map((option) => {
const active = jobFilterMode === option.key;
return (
<Box
key={option.key}
component="button"
type="button"
onClick={() => setJobFilterMode(option.key)}
sx={{
border: "none",
outline: "none",
background: active ? "linear-gradient(135deg,#7dd3fc,#c084fc)" : "transparent",
color: active ? "#041224" : "#cbd5e1",
fontWeight: 600,
fontSize: 13,
px: 2,
py: 0.5,
borderRadius: 999,
cursor: "pointer",
display: "inline-flex",
alignItems: "center",
gap: 0.6,
boxShadow: active ? "0 0 18px rgba(125,211,252,0.35)" : "none",
transition: "all 0.2s ease",
}}
>
<Box component="span" sx={{ userSelect: "none" }}>{option.label}</Box>
<Box
component="span"
sx={{
minWidth: 28,
textAlign: "center",
borderRadius: 999,
fontSize: 12,
fontWeight: 600,
px: 0.75,
py: 0.1,
color: active ? "#041224" : "#94a3b8",
backgroundColor: active ? "rgba(4,18,36,0.2)" : "rgba(15,23,42,0.65)",
border: active ? "1px solid rgba(4,18,36,0.3)" : "1px solid rgba(148,163,184,0.3)"
}}
>
{filterCounts[option.key] ?? 0}
</Box>
</Box>
);
})}
</Box>
<Typography variant="body2" sx={{ color: AURORA_SHELL.subtext }}>
{jobFilterMode === "all"
? `Showing ${filterCounts.all || 0} jobs`
: `Showing ${filterCounts[jobFilterMode] || 0} ${activeFilterLabel} job${(filterCounts[jobFilterMode] || 0) === 1 ? "" : "s"}`}
</Typography>
</Box>
<Box
className={themeClassName}
sx={{
@@ -709,6 +886,10 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
alignItems: "center",
justifyContent: "center",
},
"& .ag-cell.auto-col-tight": {
paddingRight: 0,
paddingLeft: 0,
},
}}
style={{
"--ag-background-color": "#070b1a",
@@ -748,7 +929,7 @@ export default function ScheduledJobsList({ onCreateJob, onEditJob, refreshToken
)}
<AgGridReact
rowData={rows}
rowData={filteredRows}
columnDefs={columnDefs}
defaultColDef={defaultColDef}
animateRows

View File

@@ -6,12 +6,15 @@ import {
Button,
IconButton,
Tooltip,
CircularProgress,
} from "@mui/material";
import AddIcon from "@mui/icons-material/Add";
import LocationCityIcon from "@mui/icons-material/LocationCity";
import DeleteIcon from "@mui/icons-material/DeleteOutline";
import EditIcon from "@mui/icons-material/Edit";
import RefreshIcon from "@mui/icons-material/Refresh";
import ContentCopyIcon from "@mui/icons-material/ContentCopy";
import { AgGridReact } from "ag-grid-react";
import { ModuleRegistry, AllCommunityModule, themeQuartz } from "ag-grid-community";
import { CreateSiteDialog, ConfirmDeleteDialog, RenameSiteDialog } from "../Dialogs.jsx";
@@ -69,6 +72,7 @@ export default function SiteList({ onOpenDevicesForSite }) {
const [deleteOpen, setDeleteOpen] = useState(false);
const [renameOpen, setRenameOpen] = useState(false);
const [renameValue, setRenameValue] = useState("");
const [rotatingId, setRotatingId] = useState(null);
const gridRef = useRef(null);
const fetchSites = useCallback(async () => {
@@ -83,6 +87,42 @@ export default function SiteList({ onOpenDevicesForSite }) {
useEffect(() => { fetchSites(); }, [fetchSites]);
const handleCopy = useCallback(async (code) => {
const value = (code || "").trim();
if (!value) return;
try {
await navigator.clipboard.writeText(value);
} catch {
window.prompt("Copy enrollment code", value);
}
}, []);
const handleRotate = useCallback(async (site) => {
if (!site?.id) return;
const confirmRotate = window.confirm(
"Are you sure you want to rotate the enrollment code associated with this site? "
+ "If there are automations that deploy agents to endpoints, the enrollment code associated with them will need to also be updated."
);
if (!confirmRotate) return;
setRotatingId(site.id);
try {
const resp = await fetch("/api/sites/rotate_code", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ site_id: site.id }),
});
if (resp.ok) {
const updated = await resp.json();
setRows((prev) => prev.map((row) => (row.id === site.id ? { ...row, ...updated } : row)));
}
} catch {
// Silently fail the rotate if the request errors; grid will refresh on next fetch.
} finally {
setRotatingId(null);
fetchSites();
}
}, [fetchSites]);
const columnDefs = useMemo(() => [
{
headerName: "",
@@ -105,9 +145,51 @@ export default function SiteList({ onOpenDevicesForSite }) {
</span>
),
},
{
headerName: "Agent Enrollment Code",
field: "enrollment_code",
minWidth: 320,
flex: 1.2,
cellRenderer: (params) => {
const code = params.value || "—";
const site = params.data || {};
const busy = rotatingId === site.id;
return (
<Box sx={{ display: "flex", alignItems: "center", gap: 1 }}>
<Tooltip title="Rotate Code">
<span>
<IconButton
size="small"
onClick={() => handleRotate(site)}
disabled={busy}
sx={{ color: MAGIC_UI.accentA, border: "1px solid rgba(148,163,184,0.35)" }}
>
{busy ? <CircularProgress size={16} color="inherit" /> : <RefreshIcon fontSize="small" />}
</IconButton>
</span>
</Tooltip>
<Typography variant="body2" sx={{ fontFamily: "monospace", color: MAGIC_UI.textBright }}>
{code}
</Typography>
<Tooltip title="Copy">
<span>
<IconButton
size="small"
onClick={() => handleCopy(code)}
disabled={!code || code === "—"}
sx={{ color: MAGIC_UI.textMuted }}
>
<ContentCopyIcon fontSize="small" />
</IconButton>
</span>
</Tooltip>
</Box>
);
},
},
{ headerName: "Description", field: "description", minWidth: 220 },
{ headerName: "Devices", field: "device_count", minWidth: 120 },
], [onOpenDevicesForSite]);
], [onOpenDevicesForSite, handleRotate, handleCopy, rotatingId]);
const defaultColDef = useMemo(() => ({
sortable: true,

BIN
Dependencies/Dependencies/7zip/7z.dll vendored Normal file

Binary file not shown.

BIN
Dependencies/Dependencies/7zip/7z.exe vendored Normal file

Binary file not shown.

View File

@@ -0,0 +1,35 @@
# Codex Guide: Borealis Agent
Use this doc for agent-only work (Borealis agent runtime under `Data/Agent`, mirrored to `/Agent`). For shared guidance, see `Docs/Codex/SHARED.md`.
## Scope & Runtime Paths
- Purpose: outbound-only connectivity, device telemetry, scripting, UI helpers.
- Bootstrap: `Borealis.ps1` preps dependencies, activates the agent venv, and co-launches the Engine.
- Edit in `Data/Agent`, not `/Agent`; runtime copies are ephemeral and wiped regularly.
## Logging
- Primary log: `Agent/Logs/agent.log` with daily rotation to `agent.log.YYYY-MM-DD` (never auto-delete rotated files).
- Subsystems: log to `Agent/Logs/<service>.log` with the same rotation policy.
- Install/diagnostics: `Agent/Logs/install.log`; keep ad-hoc traces (e.g., `system_last.ps1`, ansible) under `Agent/Logs/` to keep runtime state self-contained.
- Troubleshooting: prefix lines with `<timestamp>-<service-name>-<log-data>`; ask operators whether verbose logging should stay after resolution.
## Security
- Generates device-wide Ed25519 keys on first launch (`Certificates/Agent/Identity/`; DPAPI on Windows, `chmod 600` elsewhere).
- Refresh/access tokens are encrypted and pinned to the Engine certificate fingerprint; mismatches force re-enrollment.
- Uses dedicated `ssl.SSLContext` seeded with the Engine TLS bundle for REST + Socket.IO traffic.
- Validates script payloads with backend-issued Ed25519 signatures before execution; a verification sketch follows this list.
- Outbound-only; API/WebSocket calls flow through `AgentHttpClient.ensure_authenticated` for proactive refresh. Logs bootstrap, enrollment, token refresh, and signature events in `Agent/Logs/`.
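A minimal sketch of the signature check described above, using the `cryptography` package. The raw 32-byte public-key format and detached-signature layout are illustrative assumptions; the agent's actual payload format is not reproduced here.
```python
# Illustrative sketch only: verify a detached Ed25519 signature over a script payload.
# The raw public-key bytes and detached-signature layout are assumptions, not the
# agent's actual wire format.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey


def verify_script_payload(public_key_bytes: bytes, signature: bytes, script_bytes: bytes) -> bool:
    """Return True only when the backend-issued signature matches the payload."""
    public_key = Ed25519PublicKey.from_public_bytes(public_key_bytes)
    try:
        public_key.verify(signature, script_bytes)
        return True
    except InvalidSignature:
        return False
```
A payload that fails this check should never be executed; surface the failure through the agent's signature-event logging described above.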
## Execution Contexts & Roles
- Auto-discovers roles from `Data/Agent/Roles/`; no loader changes needed.
- Naming: `role_<Purpose>.py` with `ROLE_NAME`, `ROLE_CONTEXTS`, and optional hooks (`register_events`, `on_config`, `stop_all`); a minimal skeleton is sketched after this list.
- Standard roles: `role_DeviceInventory.py`, `role_Screenshot.py`, `role_ScriptExec_CURRENTUSER.py`, `role_ScriptExec_SYSTEM.py`, `role_Macro.py`.
- SYSTEM tasks depend on scheduled-task creation rights; failures should surface through Engine logging.
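A minimal role-module sketch following the naming convention above. Only the module and constant names come from the convention itself; the hook signatures (a Socket.IO client for `register_events`, a config dict for `on_config`) are assumptions for illustration.
```python
# Data/Agent/Roles/role_Example.py (hypothetical role; discovered automatically by the loader)
# Hook signatures below are illustrative assumptions, not the agent's actual contract.
ROLE_NAME = "Example"
ROLE_CONTEXTS = ["SYSTEM", "CURRENTUSER"]  # execution contexts this role supports


def register_events(sio):
    """Optional hook: attach Socket.IO handlers when the role loads."""

    @sio.on("example_task")
    def _handle(payload):
        sio.emit("example_result", {"ok": True, "echo": payload})


def on_config(config):
    """Optional hook: react to refreshed agent configuration."""
    return None


def stop_all():
    """Optional hook: stop any background work this role started."""
    return None
```
Because roles are auto-discovered from `Data/Agent/Roles/`, dropping a file like this in place is enough; no loader changes are needed.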
## Platform Parity
- Windows is the reference. Linux (`Borealis.sh`) lags in venv setup, supervision, and role loading; align Linux before macOS work continues.
## Ansible Support (Unfinished)
- Agent + Engine scaffolding exists but is unreliable: expect stalled/silent failures, inconsistent recap, missing collections.
- Windows blockers: `ansible.windows.*` usually needs PSRP/WinRM; SYSTEM context lacks loopback remoting guarantees; interpreter paths vary.
- Treat Ansible features as disabled until packaging/controller story is complete. Future direction: credential mgmt, selectable connections, reliable live output/cancel, packaged collections.

View File

@@ -0,0 +1,35 @@
# Codex Guide: Borealis Engine
Use this doc for Engine work (successor to the legacy server). For shared guidance, see `Docs/Codex/SHARED.md`.
## Scope & Runtime Paths
- Bootstrap: `Borealis.ps1` launches the Engine and/or Agent. An equivalent bootstrap script exists for Linux (`Borealis.sh`).
- Edit in `Data/Engine`; runtime copies live under `/Engine` and are discarded every time the engine is launched.
## Architecture
- Runtime: `Data/Engine/server.py` with NodeJS + Vite for live dev and Flask for production serving/API endpoints.
## Development Guidelines
- Every Python module under `Data/Engine` or `Engine/Data/Engine` starts with the standard commentary header (purpose + API endpoints). Add the header to any existing module before further edits.
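The exact wording of the standard header is not reproduced in this guide, so treat the snippet below as an illustrative shape (purpose plus served endpoints) rather than the canonical template.
```python
# example_service.py (hypothetical module under Data/Engine/services/)
# Purpose: demonstrate the commentary-header convention: state what the module does
#          and which API endpoints it serves before any code follows.
# API Endpoints:
#   GET  /api/example        - list example records
#   POST /api/example/save   - persist an example record
```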
## Logging
- Primary log: `Engine/Logs/engine.log` with daily rotation (`engine.log.YYYY-MM-DD`); do not auto-delete rotated files.
- Subsystems: `Engine/Logs/<service>.log`; install output to `Engine/Logs/install.log`.
- Keep Engine-specific artifacts within `Engine/Logs/` to preserve the runtime boundary.
## Security & API Parity
- Mirrors legacy mutual trust: Ed25519 device identities, EdDSA-signed access tokens, pinned Borealis root CA, TLS 1.3-only serving, Authorization headers + service-context markers on every device API.
- Implements DPoP validation, short-lived access tokens (~15 min), SHA-256-hashed refresh tokens (30-day) with explicit reuse errors.
- Enrollment: operator approvals, conflict detection, auditor recording, pruning of expired codes/refresh tokens.
- Background jobs and service adapters maintain compatibility with legacy DB schemas while enabling gradual API takeover.
## WebUI & WebSocket Migration
- Static/template handling: `Data/Engine/services/WebUI`; deployment copy paths are wired through `Borealis.ps1` with TLS-aware URL generation.
- Stage 6 tasks: migration switch in the legacy server for WebUI delegation and porting device/admin API endpoints into Engine services.
- Stage 7 (queued): `register_realtime` hooks, Engine-side Socket.IO handlers, integration checks, legacy delegation updates.
## Platform Parity
- Windows is primary target. Keep Engine tooling aligned with the agent experience; Linux packaging must catch up before macOS work resumes.
## Ansible Support (Shared State)
- Mirrors the agent's unfinished story: treat orchestration as experimental until packaging, connection management, and logging mature.

6
Docs/Codex/SHARED.md Normal file
View File

@@ -0,0 +1,6 @@
# Codex Guide: Shared Conventions
Cross-cutting guidance that applies to both Agent and Engine work. Domain-specific rules live in `Docs/Codex/BOREALIS_AGENT.md` and `Docs/Codex/BOREALIS_ENGINE.md`.
- UI & AG Grid: see `Docs/Codex/USER_INTERFACE.md` for MagicUI styling language and AG Grid patterns (with references to live templates).
- Add further shared topics here (e.g., triage process, security posture deltas) instead of growing `AGENTS.md`.

View File

@@ -0,0 +1,35 @@
# Codex Guide: Shared UI (MagicUI + AG Grid)
Applies to all Borealis frontends. Use `Data/Engine/web-interface/src/Admin/Page_Template.jsx` as the canonical visual reference (no API/business logic). Keep this doc as the single source of truth for styling rules and AG Grid behavior.
## Page Template Reference
- Purpose: visual-only baseline for new pages; copy structure but wire your data in real pages.
- Header: small Material icon left of the title, subtitle beneath, utility buttons on the top-right.
- Shell: full-bleed aurora gradient container; avoid gutters on the Paper.
- Selection column (for bulk actions): pinned left, square checkboxes, header checkbox enabled, ~52px fixed width, no menu/sort/resize; rely on AG Grid built-ins.
- Typography/buttons: IBM Plex Sans, gradient primary buttons, rounded corners (~8px), themed Quartz grid wrapper.
## MagicUI Styling Language (Visual System)
- Aurora shells: gradient backgrounds blending deep navy (#040711) with soft cyan/violet blooms, subtle borders (`rgba(148,163,184,0.35)`), and low, velvety shadows.
- Full-bleed canvas: hero shells run edge-to-edge; inset padding lives inside cards so gradients feel immersive.
- Glass panels: glassmorphic layers (`rgba(15,23,42,0.7)`), rounded 16–24px corners, blurred backdrops, micro borders, optional radial flares for motion.
- Hero storytelling: start views with stat-forward heroes—gradient StatTiles (min 160px) and uppercase pills (HERO_BADGE_SX) summarizing live signals/filters.
- Summary data grids: use AG Grid inside a glass wrapper (two columns Field/Value), matte navy background, no row striping.
- Tile palettes: online cyan→green; stale orange→red; “needs update” violet→cyan; secondary metrics fade from cyan into desaturated steel for consistent hue families.
- Hardware islands: storage/memory/network blocks reuse Quartz theme in rounded glass shells with flat fills; present numeric columns (Capacity/Used/Free/%) to match Device Inventory.
- Action surfaces: control bars live in translucent glass bands; filled dark inputs with cyan hover borders; primary actions are pill-shaped gradients; secondary controls are soft-outline icon buttons.
- Anchored controls: align selectors/utility buttons with grid edges in a single row; reserve glass backdrops for hero sections so content stays flush.
- Buttons & chips: gradient pills for primary CTAs (`linear-gradient(135deg,#34d399,#22d3ee)` success; `#7dd3fc→#c084fc` creation); neutral actions use rounded outlines with `rgba(148,163,184,0.4)` borders and uppercase microcopy.
- Rainbow accents: for creation CTAs, use dark-fill pills with rainbow border gradients + teal halo (shared with Quick Job).
- AG Grid treatment: Quartz theme with matte navy headers, subtle alternating row opacity, cyan/magenta interaction glows, rounded wrappers, soft borders, inset selection glows.
- Overlays/menus: `rgba(8,12,24,0.96)` canvas, blurred backdrops, thin steel borders; bright typography; deep blue glass inputs; cyan confirm, mauve destructive accents.
## AG Grid Column Behavior (All Tables)
- Auto-size value columns and let the last column absorb remaining width so views span available space.
- Declare `AUTO_SIZE_COLUMNS` near the grid component (exclude the fill column).
- Helper: store the grid API in a ref and call `api.autoSizeColumns(AUTO_SIZE_COLUMNS, true)` inside `requestAnimationFrame` (or `setTimeout(...,0)` fallback); swallow errors because it can run before rows render.
- Hook the helper into both `onGridReady` and a `useEffect` watching the dataset (e.g., `[filteredRows, loading]`); skip while `loading` or when there are zero rows.
- Column defs: apply shared `cellClass: "auto-col-tight"` (or equivalent) to every auto-sized column for consistent padding. Last column keeps the class for styling consistency.
- CSS override: add `& .ag-cell.auto-col-tight { padding-left: 0; padding-right: 0; }` in the theme scope.
- Fill column: last column `{ flex: 1, minWidth: X }` (no width/maxWidth) to stretch when horizontal space remains.
- Example: follow the scaffolding in `Engine/web-interface/src/Scheduling/Scheduled_Jobs_List.jsx` and the structure in `Data/Engine/web-interface/src/Admin/Page_Template.jsx`.

20
Docs/Docs/assemblies.md Normal file
View File

@@ -0,0 +1,20 @@
# Assemblies Runtime Reference
## Database Layout
- Three SQLite databases live under `Data/Engine/Assemblies` (`official.db`, `community.db`, `user_created.db`) and mirror to `Engine/Assemblies` at runtime.
- Automatic JSON → SQLite imports for the official domain have been retired; the staged `official.db` now serves as the authoritative store unless you invoke a manual sync.
- Payload binaries/JSON are stored under `Payloads/<payload-guid>` in both staging and runtime directories; the AssemblyCache references payload GUIDs instead of embedding large blobs.
- WAL mode with shared-cache is enabled on every connection; queue flushes copy the refreshed `.db`, `-wal`, and `-shm` files into the runtime mirror.
- `AssemblyCache.describe()` reveals dirty/clean state per assembly, helping operators spot pending writes before shutdown or sync operations.
## Dev Mode Controls
- User-created domain mutations remain open to authenticated operators; community/official writes require an administrator with Dev Mode enabled.
- Toggle Dev Mode via `POST /api/assemblies/dev-mode/switch` or the Assemblies admin controls; state expires automatically based on the server-side TTL.
- Privileged actions (create/update/delete, cross-domain clone, queue flush, official sync, import into protected domains) emit audit entries under `Engine/Logs/assemblies.log`.
- When Dev Mode is disabled, API responses return `dev_mode_required` to prompt admins to enable overrides before retrying protected mutations.
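A minimal sketch of that flow with `requests`. The JSON body for the switch endpoint, the shape of the `dev_mode_required` response, and the example mutation URL are assumptions; authentication is omitted and the pinned CA path is hypothetical.
```python
# Illustrative sketch: enable Dev Mode, then attempt a protected assembly mutation.
# Request bodies, error shape, and the mutation endpoint are assumptions; auth is omitted.
import requests

BASE = "https://borealis.example.local:5000"        # hypothetical Engine URL
CA_BUNDLE = "Certificates/borealis-root-ca.pem"     # assumed path to the pinned root CA

session = requests.Session()
session.verify = CA_BUNDLE

switch = session.post(f"{BASE}/api/assemblies/dev-mode/switch", json={"enabled": True})
switch.raise_for_status()

resp = session.post(f"{BASE}/api/assemblies", json={"domain": "official", "name": "demo"})
if resp.status_code >= 400 and "dev_mode_required" in resp.text:
    print("Dev Mode is not active (or expired); enable it and retry the mutation.")
```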
## Backup Guidance
- Regularly snapshot `Data/Engine/Assemblies` and `Data/Engine/Assemblies/Payloads` alongside the mirrored runtime copies to preserve both metadata and payload artifacts.
- Include the queue inspection endpoint (`GET /api/assemblies`) in maintenance scripts to verify no dirty entries remain before capturing backups; a sketch follows after this list.
- Maintain the staged databases directly; to publish new official assemblies copy the curated `official.db` into `Data/Engine/Assemblies` before restarting the Engine.
- Future automation will extend to scheduled backups and staged restore helpers; until then, ensure filesystem backups capture both SQLite databases and payload directories atomically.
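A maintenance-script sketch for the pre-backup check mentioned above. The `items`/`queue` fields match the payload the WebUI already consumes; the per-item `dirty` flag, hostname, and CA-bundle path are assumptions.
```python
# Illustrative pre-backup check: refuse to snapshot while queued flushes or dirty entries remain.
# The per-item "dirty" flag and CA-bundle path are assumptions; authentication is omitted.
import sys
import requests

resp = requests.get(
    "https://borealis.example.local:5000/api/assemblies",
    verify="Certificates/borealis-root-ca.pem",
)
resp.raise_for_status()
data = resp.json()

queued = data.get("queue") or []
dirty = [item for item in (data.get("items") or []) if item.get("dirty")]

if queued or dirty:
    print(f"Backup aborted: {len(queued)} queued flush(es), {len(dirty)} dirty assemblies.")
    sys.exit(1)

print("Assemblies quiescent; safe to snapshot Data/Engine/Assemblies and its Payloads directory.")
```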

Binary image assets (WebUI screenshots) added or updated; image previews and dimensions not shown.

View File

@@ -189,9 +189,41 @@ function Start-AgentScheduledTasks {
function Stop-AgentPythonProcesses {
param(
[string[]]$ProcessNames = @('python', 'pythonw')
[string[]]$ProcessNames = @('python', 'pythonw'),
[string]$ProjectRoot,
[switch]$SkipEngine = $true
)
$enginePids = @()
$engineRoot = ''
$dataEngineRoot = ''
if ($ProjectRoot) {
try { $engineRoot = (Join-Path $ProjectRoot 'Engine') } catch {}
try { $dataEngineRoot = (Join-Path $ProjectRoot 'Data\Engine') } catch {}
}
if ($SkipEngine) {
try {
$cims = Get-CimInstance -ClassName Win32_Process -Filter "Name='python.exe' OR Name='pythonw.exe'" -ErrorAction Stop
foreach ($proc in $cims) {
try {
$procPid = [int]$proc.ProcessId  # avoid shadowing PowerShell's read-only automatic $pid variable
} catch { continue }
$cmd = ($proc.CommandLine -as [string])
$exePath = ($proc.ExecutablePath -as [string])
$isEngine = $false
foreach ($marker in @($engineRoot, $dataEngineRoot, '\Engine\', '\Data\Engine\', 'Engine\server.py', 'Data\Engine\server.py')) {
if (-not $marker) { continue }
try {
if ($cmd -and $cmd.ToLowerInvariant().Contains($marker.ToLowerInvariant())) { $isEngine = $true; break }
if ($exePath -and $exePath.ToLowerInvariant().Contains($marker.ToLowerInvariant())) { $isEngine = $true; break }
} catch {}
}
if ($isEngine -and ($enginePids -notcontains $procPid)) { $enginePids += $procPid }
}
} catch {}
}
foreach ($name in ($ProcessNames | Where-Object { -not [string]::IsNullOrWhiteSpace($_) } | Select-Object -Unique)) {
$name = $name.Trim()
if (-not $name) { continue }
@@ -206,13 +238,29 @@ function Stop-AgentPythonProcesses {
foreach ($proc in $processes) {
$procId = $null
$procName = $null
$procPath = $null
try {
$procId = $proc.Id
$procName = $proc.ProcessName
$procPath = $proc.Path
} catch {}
if ($procId -eq $null) { continue }
$isEngineProc = $false
try {
if ($enginePids -and ($enginePids -contains $procId)) { $isEngineProc = $true }
foreach ($marker in @($engineRoot, $dataEngineRoot)) {
if (-not $marker) { continue }
if ($procPath -and $procPath.ToLowerInvariant().StartsWith($marker.ToLowerInvariant())) { $isEngineProc = $true; break }
}
} catch {}
if ($SkipEngine -and $isEngineProc) {
Write-Host "Skipping Engine python process: PID $procId ($procName)" -ForegroundColor Cyan
continue
}
if (-not $procName) { $procName = $name }
$stopped = $false
@@ -1732,7 +1780,7 @@ function Invoke-BorealisAgentUpdate {
} else {
Write-UpdateLog "No managed tasks were running when update started." 'DEBUG'
}
Run-Step "Updating: Terminate Running Python Processes" { Stop-AgentPythonProcesses }
Run-Step "Updating: Terminate Running Python Processes" { Stop-AgentPythonProcesses -ProjectRoot $scriptDir -SkipEngine }
$updateSucceeded = $false
try {
@@ -1797,4 +1845,4 @@ function Invoke-BorealisAgentUpdate {
}
}
Invoke-BorealisAgentUpdate
Invoke-BorealisAgentUpdate

View File

@@ -56,11 +56,23 @@ Site List:
### Installation
1) Start the Server:
- Windows: `./Borealis.ps1 -Server -Flask` *Production Server @ http://localhost:5000*
- Windows: `./Borealis.ps1 -Server -Vite` *Development Server @ http://localhost:5173*
- Windows: `./Borealis.ps1 -EngineProduction` *Production Server @ https://localhost:5000*
- Windows: `./Borealis.ps1 -EngineDev` *Development Server @ https://localhost:5173*
2) (*Optional*) Install the Agent (*elevated PowerShell*):
- Windows: `./Borealis.ps1 -Agent`
## Automated Agent Enrollment
If you plan on deploying the agent via Group Policy or another existing automation platform, you can use the following command-line arguments to install the agent automatically with an enrollment code pre-injected. *The enrollment code below is simply an example*.
**Windows**:
```powershell
.\Borealis.ps1 -Agent -EnrollmentCode "E925-448B-626D-D595-5A0F-FB24-B4D6-6983"
```
**Linux**:
```sh
./Borealis.sh --Agent --EnrollmentCode "E925-448B-626D-D595-5A0F-FB24-B4D6-6983"
```
### Reverse Proxy Configuration
Traefik Dynamic Config: `Replace Service URL with actual IP of Borealis server`
```yml
@@ -111,6 +123,7 @@ The process that agents go through when authenticating securely with a Borealis
- All device APIs now require Authorization: Bearer headers and a service-context (e.g. SYSTEM or CURRENTUSER) marker; missing, expired, mismatched, or revoked credentials are rejected before any business logic runs (see the request sketch after this list). Operator-driven revocation and device-quarantine logic is not yet implemented.
- Replay and credential-theft defenses layer in DPoP proof validation (thumbprint binding) on the server side, plus short-lived access tokens (15 min) and 30-day refresh tokens hashed via SHA-256.
- Centralized logging under Logs/Server and Agent/Logs captures enrollment approvals, rate-limit hits, signature failures, and auth anomalies for post-incident review.
- The Engine's operator-facing API endpoints (device inventory, assemblies, job history, etc.) require an authenticated operator session or bearer token; unauthenticated requests are rejected with 401/403 responses before any inventory or script metadata is returned, and the requesting user is logged with each quick-run dispatch.
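
For illustration, a minimal sketch of a request carrying the credentials described above; the endpoint path, the `X-Service-Context` header name, and the omission of the DPoP proof are assumptions for readability, not the actual wire format:

```ts
// Sketch of a device API call under the requirements above. The header names
// and endpoint path are illustrative placeholders.
async function fetchDeviceRecord(baseUrl: string, accessToken: string, hostname: string) {
  const res = await fetch(`${baseUrl}/api/devices/${encodeURIComponent(hostname)}`, {
    headers: {
      Authorization: `Bearer ${accessToken}`, // short-lived (15 min) access token
      'X-Service-Context': 'SYSTEM',          // hypothetical service-context marker
    },
  });
  if (res.status === 401 || res.status === 403) {
    // Missing, expired, mismatched, or revoked credentials are rejected up front.
    throw new Error(`Request rejected before any business logic ran (${res.status})`);
  }
  return res.json();
}
```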
#### Server Security
- Auto-manages PKI: a persistent Borealis root CA (ECDSA SECP384R1) signs leaf certificates with localhost SANs, applies tightened filesystem permissions, and produces a combined bundle for agent identity / cert pinning.
- Script delivery is code-signed with an Ed25519 key stored under Certificates/Server/Code-Signing; agents refuse any payload whose signature or hash does not match the pinned public key.
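
As a rough illustration of the refusal behavior described above (not the agent's actual Python implementation), a sketch of Ed25519 signature verification against a pinned public key; the file paths are placeholders:

```ts
import { createPublicKey, verify } from 'node:crypto';
import { readFileSync } from 'node:fs';

// Sketch: only trust a delivered script if its Ed25519 signature verifies
// against the pinned code-signing public key. Paths are placeholders.
function isPayloadTrusted(scriptPath: string, signaturePath: string, pinnedKeyPath: string): boolean {
  const payload = readFileSync(scriptPath);
  const signature = readFileSync(signaturePath);
  const publicKey = createPublicKey(readFileSync(pinnedKeyPath, 'utf8'));
  // Ed25519 verification in Node passes `null` for the digest algorithm.
  return verify(null, payload, publicKey, signature);
}
```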