integration tests #43

Open
opened 2026-01-05 10:01:32 +00:00 by despiegk · 0 comments

With AI, create integration tests: basically directories of rhai scripts which test all rhai capabilities.
Start with os & crypt & vault ...
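A minimal sketch of what one such script could look like. Only the core rhai syntax (`throw`, `print`, string concat) is certain here; the `file_write` / `exist` / `delete` function names are placeholders, not the confirmed SAL rhai API:

```rhai
// Hypothetical integration test for the os module.
// NOTE: file_write() / exist() / delete() are assumed names,
// not confirmed SAL bindings -- adjust to the real API.
let path = "/tmp/sal_itest_file";
file_write(path, "hello");
if !exist(path) {
    throw "os test failed: " + path + " was not created";
}
delete(path);
if exist(path) {
    throw "os test failed: " + path + " was not deleted";
}
print("os basic test passed");
```

The key point: the script both exercises the functionality and asserts on the result, so a non-zero exit (an uncaught `throw`) is the test failure signal.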

For the more advanced ones we should use a Kubernetes env and the driver we have to run those in other environments.
When we run e.g. an ubuntu24.04 in privileged mode we can even test container stuff...

For tests on e.g. ubuntu24.04 we can take a dedicated machine on Hetzner to test e.g. cloudhypervisor...
We also have a driver for Hetzner to re-install a machine, and then we can connect over ssh and test.

It's probably not a bad idea to create a driver with a minimal UI where we can see which tests passed and which are still busy,
and where we can restart a failed test from the UI. We could also use dagu for it, or other suggestions are welcome?
It can e.g. walk over the dir and see what can run in parallel and what can't; dagu can even execute remotely.
Then schedule it in dagu... If we go this approach, make a SAL for dagu and use it this way.
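If we go the dagu route, each test directory could become one step in a dagu DAG, with `depends` expressing what cannot run in parallel. A sketch only: the directory names and the exact `herodo` invocation are assumptions:

```yaml
# integration-tests.yaml -- hypothetical dagu DAG
steps:
  - name: os
    command: herodo tests/os
  - name: crypt
    command: herodo tests/crypt      # no depends: can run in parallel with os
  - name: vault
    command: herodo tests/vault
    depends:
      - crypt                        # run vault only after crypt passed
```

Steps without `depends` are free to run concurrently, which is exactly the "walk over the dir and see what can go in parallel" idea.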

But at the start we can just use "herodo adir" to test dir per dir and fix stuff; the larger-scale automation can come later.
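The dir-per-dir loop can be sketched in shell. Here `run_dir` is a self-contained stub so the loop logic is runnable as-is; in real use it would simply call `herodo "$d"` (the `tests/` layout is also an assumption):

```shell
# Sketch of the "dir per dir" test loop.
# Set up a throwaway tests/ tree so the example is self-contained.
tests=$(mktemp -d)
mkdir -p "$tests/os" "$tests/vault"
echo 'let x = 1;' > "$tests/os/basic.rhai"
echo 'let y = 2;' > "$tests/vault/basic.rhai"

run_dir() {                      # stand-in stub; real version: herodo "$1"
    ls "$1"/*.rhai > /dev/null   # stub check: the dir contains rhai scripts
}

failed=0
for d in "$tests"/*/; do
    if run_dir "$d"; then
        echo "PASS $d"
    else
        echo "FAIL $d"
        failed=1
    fi
done
[ "$failed" -eq 0 ] && echo "all directories passed"
```

Once the directories are stable, the same loop body is what a dagu step (or any other scheduler) would run per directory.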

The rhai scripts should both execute the functionality and test it in the same script.

If we need to change or fix rhai scripts, do so, and make sure the rhaidocs are adjusted accordingly.


Reference
geomind_research/herolib_rust#43