Mirror of https://github.com/bunny-lab-io/Borealis.git (synced 2025-12-16 07:25:48 -07:00)

Commit: Document parity plan and add engine unit tests

@@ -63,3 +63,5 @@
 - 12.1 Stand up Engine end-to-end in a staging environment, exercising enrollment, token refresh, agent connections, and jobs.
 - 12.2 Document any divergences and address them with follow-up commits.
 - 12.3 Once satisfied, coordinate cut-over steps (switch entrypoint, deprecate legacy server) as a future initiative.
+
+- Documentation and test coverage for this phase now live in `Data/Engine/README.md` and `Data/Engine/tests/` to guide the remaining staging work.

@@ -2,6 +2,37 @@
 
 The Engine is an additive server stack that will ultimately replace the legacy Flask app under `Data/Server`. It is safe to run the Engine entrypoint (`Data/Engine/bootstrapper.py`) side-by-side with the legacy server while we migrate functionality feature-by-feature.
 
+## Architectural roles
+
+The Engine is organized around explicit dependency layers so each concern stays testable and replaceable:
+
+- **Configuration (`Data/Engine/config/`)** parses environment variables into immutable settings objects that the bootstrapper hands to factories and integrations.
+- **Builders (`Data/Engine/builders/`)** transform external inputs (HTTP headers, JSON payloads, scheduled job definitions) into validated immutable records that services can trust.
+- **Domain models (`Data/Engine/domain/`)** house pure value objects, enums, and error types with no I/O so services can express intent without depending on Flask or SQLite.
+- **Repositories (`Data/Engine/repositories/`)** encapsulate all SQLite access and expose protocol methods that return domain models. They are injected into services through the container so persistence can be swapped or mocked.
+- **Services (`Data/Engine/services/`)** host business logic such as device authentication, enrollment, job scheduling, GitHub artifact lookups, and real-time agent coordination. Services depend only on repositories, integrations, and builders.
+- **Integrations (`Data/Engine/integrations/`)** wrap external systems (GitHub today) and keep HTTP/token handling outside the services that consume them.
+- **Interfaces (`Data/Engine/interfaces/`)** provide thin HTTP/Socket.IO adapters that translate requests to builder/service calls and serialize responses. They contain no business rules of their own.
+
+The runtime factory (`Data/Engine/runtime.py`) wires these layers together and attaches the resulting container to the Flask app created in `Data/Engine/server.py`.
+
 ## Environment configuration
 
 The Engine mirrors the legacy defaults so it can boot without additional configuration. These environment variables are read by `Data/Engine/config/environment.py`:
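
As an aside on this hunk: the layering it documents can be pictured with a short sketch. Everything below is illustrative rather than the Engine's real API -- `EngineSettings`, `ENGINE_DB_PATH`, `ENGINE_BIND_PORT`, `build_runtime`, and `EngineContainer` are hypothetical stand-ins for whatever `Data/Engine/config/environment.py` and `Data/Engine/runtime.py` actually define; it only shows how frozen settings, an injected repository, a service, and a container might compose.

```python
# Illustrative sketch only -- hypothetical names, not the Engine's actual API.
from __future__ import annotations

import os
import sqlite3
from dataclasses import dataclass
from typing import Protocol


@dataclass(frozen=True)
class EngineSettings:
    """Immutable settings parsed once from the environment (config layer)."""

    database_path: str
    bind_port: int


def load_settings(env: dict[str, str] | None = None) -> EngineSettings:
    """Parse environment variables into a frozen settings object."""
    source = dict(os.environ) if env is None else env
    return EngineSettings(
        database_path=source.get("ENGINE_DB_PATH", "engine.db"),  # hypothetical variable names
        bind_port=int(source.get("ENGINE_BIND_PORT", "5000")),
    )


class DeviceRepository(Protocol):
    """Repository layer: persistence hidden behind a protocol that returns plain values."""

    def count_devices(self) -> int: ...


class SQLiteDeviceCounter:
    """A toy SQLite-backed repository implementing the protocol above."""

    def __init__(self, conn: sqlite3.Connection) -> None:
        self._conn = conn

    def count_devices(self) -> int:
        row = self._conn.execute("SELECT COUNT(*) FROM devices").fetchone()
        return int(row[0])


class DeviceService:
    """Service layer: business logic that depends only on the injected repository."""

    def __init__(self, repository: DeviceRepository) -> None:
        self._repository = repository

    def fleet_size(self) -> int:
        return self._repository.count_devices()


@dataclass(frozen=True)
class EngineContainer:
    """What a runtime factory might hand to the HTTP/Socket.IO interface layer."""

    settings: EngineSettings
    device_service: DeviceService


def build_runtime(settings: EngineSettings) -> EngineContainer:
    """Wire configuration -> persistence -> services, in the spirit of runtime.py."""
    conn = sqlite3.connect(settings.database_path)
    conn.execute("CREATE TABLE IF NOT EXISTS devices (guid TEXT PRIMARY KEY)")  # demo schema only
    return EngineContainer(settings=settings, device_service=DeviceService(SQLiteDeviceCounter(conn)))


if __name__ == "__main__":
    container = build_runtime(load_settings({"ENGINE_DB_PATH": ":memory:"}))
    print(container.device_service.fleet_size())  # prints 0 against the empty demo table
```

The real Engine presumably wires far more than this, but the shape matches the README: interfaces receive a fully built container and never construct their own repositories or services.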

@@ -95,3 +126,50 @@ Step 11 migrates the GitHub artifact provider into the Engine:
 - `Data/Engine/interfaces/http/github.py` exposes `/api/repo/current_hash` and `/api/github/token` through the Engine stack while keeping business logic in the service layer.
 
 The service container now wires `github_service`, giving other interfaces and background jobs a clean entry point for GitHub functionality.
+
+## Final parity checklist
+
+Step 12 tracks the final integration work required before switching over to the Engine entrypoint:
+
+1. Stand up the Engine in a staging environment and exercise enrollment, token refresh, scheduler operations, and the agent real-time channel side-by-side with the legacy server.
+2. Capture any behavioural differences uncovered during staging and file them for follow-up fixes before the cut-over.
+3. When satisfied with parity, coordinate the entrypoint swap (point production tooling at `Data/Engine/bootstrapper.py`) and plan the deprecation of `Data/Server`.
+
+## Performing unit tests
+
+Targeted unit tests cover the most important domain, builder, repository, and migration behaviours without requiring Flask or external services. Run them with the standard library test runner:
+
+```bash
+python -m unittest discover Data/Engine/tests
+```
+
+The suite currently validates:
+
+- Domain normalization helpers for GUIDs, fingerprints, and authentication failures.
+- Device authentication and refresh-token builders, including error handling for malformed requests.
+- SQLite schema migrations to ensure the Engine can provision required tables in a fresh database.
+
+Successful execution prints a summary similar to:
+
+```
+.............
+----------------------------------------------------------------------
+Ran 13 tests in <N>.<M>s
+
+OK
+```
+
+Additional tests should follow the same pattern and live under `Data/Engine/tests/` so this command remains the single entry point for Engine unit verification.
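
Picking up the closing note of that hunk, an additional test module simply sits beside the existing ones and is collected by the same `discover` command. The sketch below is illustrative only: the module name `test_domain_examples.py` is invented, the idempotence check assumes `DeviceGuid` accepts its own canonical output, and the second assertion just repeats behaviour the committed tests already establish for `sanitize_service_context`.

```python
# Data/Engine/tests/test_domain_examples.py -- hypothetical module name, shown as a pattern only.
import unittest

from Data.Engine.domain.device_auth import DeviceGuid, sanitize_service_context


class AdditionalDomainExampleTests(unittest.TestCase):
    def test_guid_normalization_is_idempotent(self) -> None:
        # Normalizing an already-canonical GUID should leave it unchanged.
        first = DeviceGuid("{de305d54-75b4-431b-adb2-eb6b9e546014}")
        second = DeviceGuid(first.value)
        self.assertEqual(second.value, "DE305D54-75B4-431B-ADB2-EB6B9E546014")

    def test_service_context_filters_invalid_characters(self) -> None:
        self.assertEqual(sanitize_service_context("sys tem!"), "SYSTEM")


if __name__ == "__main__":  # pragma: no cover - convenience for local runs
    unittest.main()
```

Because it lives under `Data/Engine/tests/`, `python -m unittest discover Data/Engine/tests` picks it up without any extra registration.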

@@ -8,16 +8,28 @@ from .device_auth import (
     RefreshTokenRequest,
     RefreshTokenRequestBuilder,
 )
-from .device_enrollment import (
-    EnrollmentRequestBuilder,
-    ProofChallengeBuilder,
-)
 
 __all__ = [
     "DeviceAuthRequest",
     "DeviceAuthRequestBuilder",
     "RefreshTokenRequest",
     "RefreshTokenRequestBuilder",
-    "EnrollmentRequestBuilder",
-    "ProofChallengeBuilder",
 ]
+
+try:  # pragma: no cover - optional dependency shim
+    from .device_enrollment import (
+        EnrollmentRequestBuilder,
+        ProofChallengeBuilder,
+    )
+except ModuleNotFoundError as exc:  # pragma: no cover - executed when crypto deps missing
+    _missing_reason = str(exc)
+
+    def _missing_builder(*_args: object, **_kwargs: object) -> None:
+        raise ModuleNotFoundError(
+            "device enrollment builders require optional cryptography dependencies"
+        ) from exc
+
+    EnrollmentRequestBuilder = _missing_builder  # type: ignore[assignment]
+    ProofChallengeBuilder = _missing_builder  # type: ignore[assignment]
+else:
+    __all__ += ["EnrollmentRequestBuilder", "ProofChallengeBuilder"]

@@ -45,7 +45,6 @@ def _require(value: Optional[str], field: str) -> str:
 class EnrollmentCode:
     """Installer code metadata loaded from the persistence layer."""
 
-    record_id: Optional[str] = None
     code: str
     expires_at: datetime
     max_uses: int
@@ -53,6 +52,7 @@ class EnrollmentCode:
     used_by_guid: Optional[DeviceGuid]
     last_used_at: Optional[datetime]
     used_at: Optional[datetime]
+    record_id: Optional[str] = None
 
     def __post_init__(self) -> None:
         if not self.code:
@@ -69,7 +69,6 @@ class EnrollmentCode:
         used_by = record.get("used_by_guid")
         used_by_guid = DeviceGuid(used_by) if used_by else None
         return cls(
-            record_id=str(record.get("id") or "") or None,
             code=_require(record.get("code"), "code"),
             expires_at=_parse_iso8601(record.get("expires_at")) or datetime.now(tz=timezone.utc),
             max_uses=int(record.get("max_uses") or 1),
@@ -77,6 +76,7 @@ class EnrollmentCode:
             used_by_guid=used_by_guid,
             last_used_at=_parse_iso8601(record.get("last_used_at")),
             used_at=_parse_iso8601(record.get("used_at")),
+            record_id=str(record.get("id") or "") or None,
         )
 
     @property

@@ -9,12 +9,7 @@ from .connection import (
     connection_factory,
     connection_scope,
 )
-from .device_repository import SQLiteDeviceRepository
-from .enrollment_repository import SQLiteEnrollmentRepository
-from .github_repository import SQLiteGitHubRepository
-from .job_repository import SQLiteJobRepository
 from .migrations import apply_all
-from .token_repository import SQLiteRefreshTokenRepository
 
 __all__ = [
     "SQLiteConnectionFactory",
@@ -22,10 +17,31 @@ __all__ = [
     "connect",
     "connection_factory",
     "connection_scope",
+    "apply_all",
+]
+
+try:  # pragma: no cover - optional dependency shim
+    from .device_repository import SQLiteDeviceRepository
+    from .enrollment_repository import SQLiteEnrollmentRepository
+    from .github_repository import SQLiteGitHubRepository
+    from .job_repository import SQLiteJobRepository
+    from .token_repository import SQLiteRefreshTokenRepository
+except ModuleNotFoundError as exc:  # pragma: no cover - triggered when auth deps missing
+    def _missing_repo(*_args: object, **_kwargs: object) -> None:
+        raise ModuleNotFoundError(
+            "Engine SQLite repositories require optional authentication dependencies"
+        ) from exc
+
+    SQLiteDeviceRepository = _missing_repo  # type: ignore[assignment]
+    SQLiteEnrollmentRepository = _missing_repo  # type: ignore[assignment]
+    SQLiteGitHubRepository = _missing_repo  # type: ignore[assignment]
+    SQLiteJobRepository = _missing_repo  # type: ignore[assignment]
+    SQLiteRefreshTokenRepository = _missing_repo  # type: ignore[assignment]
+else:
+    __all__ += [
     "SQLiteDeviceRepository",
     "SQLiteRefreshTokenRepository",
     "SQLiteJobRepository",
     "SQLiteEnrollmentRepository",
     "SQLiteGitHubRepository",
-    "apply_all",
-]
+    ]

Data/Engine/tests/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
+"""Test suite for the Borealis Engine."""

Data/Engine/tests/test_builders_device_auth.py (new file, 74 lines)
@@ -0,0 +1,74 @@
+import unittest
+
+from Data.Engine.builders.device_auth import (
+    DeviceAuthRequestBuilder,
+    RefreshTokenRequestBuilder,
+)
+from Data.Engine.domain.device_auth import DeviceAuthErrorCode, DeviceAuthFailure
+
+
+class DeviceAuthRequestBuilderTests(unittest.TestCase):
+    def test_build_successful_request(self) -> None:
+        request = (
+            DeviceAuthRequestBuilder()
+            .with_authorization("Bearer abc123")
+            .with_http_method("post")
+            .with_htu("https://example.test/api")
+            .with_service_context("currentUser")
+            .with_dpop_proof("proof")
+            .build()
+        )
+
+        self.assertEqual(request.access_token, "abc123")
+        self.assertEqual(request.http_method, "POST")
+        self.assertEqual(request.htu, "https://example.test/api")
+        self.assertEqual(request.service_context, "CURRENTUSER")
+        self.assertEqual(request.dpop_proof, "proof")
+
+    def test_missing_authorization_raises_failure(self) -> None:
+        builder = (
+            DeviceAuthRequestBuilder()
+            .with_http_method("GET")
+            .with_htu("/health")
+        )
+
+        with self.assertRaises(DeviceAuthFailure) as ctx:
+            builder.build()
+
+        self.assertEqual(ctx.exception.code, DeviceAuthErrorCode.MISSING_AUTHORIZATION)
+
+
+class RefreshTokenRequestBuilderTests(unittest.TestCase):
+    def test_refresh_request_requires_all_fields(self) -> None:
+        request = (
+            RefreshTokenRequestBuilder()
+            .with_payload({"guid": "de305d54-75b4-431b-adb2-eb6b9e546014", "refresh_token": "tok"})
+            .with_http_method("post")
+            .with_htu("https://example.test/api")
+            .with_dpop_proof("proof")
+            .build()
+        )
+
+        self.assertEqual(request.guid.value, "DE305D54-75B4-431B-ADB2-EB6B9E546014")
+        self.assertEqual(request.refresh_token, "tok")
+        self.assertEqual(request.http_method, "POST")
+        self.assertEqual(request.htu, "https://example.test/api")
+        self.assertEqual(request.dpop_proof, "proof")
+
+    def test_refresh_request_missing_guid_raises_failure(self) -> None:
+        builder = (
+            RefreshTokenRequestBuilder()
+            .with_payload({"refresh_token": "tok"})
+            .with_http_method("POST")
+            .with_htu("https://example.test/api")
+        )
+
+        with self.assertRaises(DeviceAuthFailure) as ctx:
+            builder.build()
+
+        self.assertEqual(ctx.exception.code, DeviceAuthErrorCode.INVALID_CLAIMS)
+        self.assertIn("missing guid", ctx.exception.detail)
+
+
+if __name__ == "__main__":  # pragma: no cover - convenience for local runs
+    unittest.main()

Data/Engine/tests/test_domain_device_auth.py (new file, 59 lines)
@@ -0,0 +1,59 @@
+import unittest
+
+from Data.Engine.domain.device_auth import (
+    DeviceAuthErrorCode,
+    DeviceAuthFailure,
+    DeviceFingerprint,
+    DeviceGuid,
+    sanitize_service_context,
+)
+
+
+class DeviceGuidTests(unittest.TestCase):
+    def test_guid_normalization_accepts_braces_and_lowercase(self) -> None:
+        guid = DeviceGuid("{de305d54-75b4-431b-adb2-eb6b9e546014}")
+        self.assertEqual(guid.value, "DE305D54-75B4-431B-ADB2-EB6B9E546014")
+
+    def test_guid_rejects_empty_string(self) -> None:
+        with self.assertRaises(ValueError):
+            DeviceGuid("")
+
+
+class DeviceFingerprintTests(unittest.TestCase):
+    def test_fingerprint_normalization_trims_and_lowercases(self) -> None:
+        fingerprint = DeviceFingerprint(" AA:BB:CC ")
+        self.assertEqual(fingerprint.value, "aa:bb:cc")
+
+    def test_fingerprint_rejects_blank_input(self) -> None:
+        with self.assertRaises(ValueError):
+            DeviceFingerprint(" ")
+
+
+class ServiceContextTests(unittest.TestCase):
+    def test_sanitize_service_context_returns_uppercase_only(self) -> None:
+        self.assertEqual(sanitize_service_context("system"), "SYSTEM")
+
+    def test_sanitize_service_context_filters_invalid_chars(self) -> None:
+        self.assertEqual(sanitize_service_context("sys tem!"), "SYSTEM")
+
+    def test_sanitize_service_context_returns_none_for_empty_result(self) -> None:
+        self.assertIsNone(sanitize_service_context("@@@"))
+
+
+class DeviceAuthFailureTests(unittest.TestCase):
+    def test_to_dict_includes_retry_after_and_detail(self) -> None:
+        failure = DeviceAuthFailure(
+            DeviceAuthErrorCode.RATE_LIMITED,
+            http_status=429,
+            retry_after=30,
+            detail="too many attempts",
+        )
+        payload = failure.to_dict()
+        self.assertEqual(
+            payload,
+            {"error": "rate_limited", "retry_after": 30.0, "detail": "too many attempts"},
+        )
+
+
+if __name__ == "__main__":  # pragma: no cover - convenience for local runs
+    unittest.main()

Data/Engine/tests/test_sqlite_migrations.py (new file, 32 lines)
@@ -0,0 +1,32 @@
+import sqlite3
+import unittest
+
+from Data.Engine.repositories.sqlite import migrations
+
+
+class MigrationTests(unittest.TestCase):
+    def test_apply_all_creates_expected_tables(self) -> None:
+        conn = sqlite3.connect(":memory:")
+        try:
+            migrations.apply_all(conn)
+            cursor = conn.cursor()
+            tables = {
+                row[0]
+                for row in cursor.execute(
+                    "SELECT name FROM sqlite_master WHERE type='table'"
+                )
+            }
+
+            self.assertIn("devices", tables)
+            self.assertIn("refresh_tokens", tables)
+            self.assertIn("enrollment_install_codes", tables)
+            self.assertIn("device_approvals", tables)
+            self.assertIn("scheduled_jobs", tables)
+            self.assertIn("scheduled_job_runs", tables)
+            self.assertIn("github_token", tables)
+        finally:
+            conn.close()
+
+
+if __name__ == "__main__":  # pragma: no cover - convenience for local runs
+    unittest.main()