# Macha - AI-Powered Autonomous System Administrator

Macha is an AI-powered autonomous system administrator for NixOS that monitors system health, diagnoses issues, and can take corrective actions with appropriate approval workflows.

## Features

- **Autonomous Monitoring**: Continuous health checks with configurable intervals
- **Multi-Host Management**: SSH-based management of multiple NixOS hosts
- **Tool Calling**: Comprehensive system administration tools via Ollama LLM
- **Queue-Based Architecture**: Serialized LLM requests to prevent resource contention
- **Knowledge Base**: ChromaDB-backed learning system for operational wisdom
- **Approval Workflows**: Safety-first approach with configurable autonomy levels
- **Notification System**: Gotify integration for alerts

## Quick Start

### As a NixOS Flake Input

Add to your `flake.nix`:
```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    macha-autonomous.url = "git+https://git.coven.systems/lily/macha-autonomous";
  };

  outputs = { self, nixpkgs, macha-autonomous }: {
    nixosConfigurations.yourhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux"; # adjust for your platform
      modules = [
        macha-autonomous.nixosModules.default
        {
          services.macha-autonomous = {
            enable = true;
            autonomyLevel = "suggest"; # observe, suggest, auto-safe, auto-full
            checkInterval = 300;
            ollamaHost = "http://localhost:11434";
            model = "gpt-oss:latest";
          };
        }
      ];
    };
  };
}
```
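Then apply the configuration with a flake-based rebuild, e.g. `sudo nixos-rebuild switch --flake .#yourhost` (replace `yourhost` with your host name).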
## Configuration Options

See `module.nix` for full configuration options, including:

- Autonomy levels (observe, suggest, auto-safe, auto-full)
- Check intervals
- Ollama host and model settings
- Git repository monitoring
- Service user/group configuration

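As a rough illustration, a fuller configuration might look like the sketch below. Option names beyond those shown in the Quick Start (for example `user`, `group`, and `gitRepos`) are guesses, not confirmed by this README; `module.nix` is the source of truth for names and types.

```nix
# Illustrative sketch only; options marked "assumed" are hypothetical.
# Consult module.nix for the authoritative option names.
services.macha-autonomous = {
  enable = true;
  autonomyLevel = "auto-safe";             # observe, suggest, auto-safe, auto-full
  checkInterval = 600;                     # check interval (the Quick Start uses 300)
  ollamaHost = "http://localhost:11434";
  model = "gpt-oss:latest";
  # Assumed options (names hypothetical):
  # user = "macha";
  # group = "macha";
  # gitRepos = [ "/etc/nixos" ];
};
```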
## CLI Tools

- `macha-chat` - Interactive chat interface
- `macha-ask` - Single-question interface
- `macha-check` - Trigger immediate health check
- `macha-approve` - Approve pending actions
- `macha-logs` - View service logs
- `macha-issues` - Query issue database
- `macha-knowledge` - Query knowledge base
- `macha-systems` - List managed systems
- `macha-notify` - Send Gotify notification
## Architecture

- **Agent**: Core AI logic with tool calling
- **Orchestrator**: Main monitoring loop
- **Executor**: Safe action execution
- **Queue System**: Serialized Ollama requests with priorities
- **Context DB**: ChromaDB for system context and learning
- **Tools**: System administration capabilities
## Requirements

- NixOS with flakes enabled
- Ollama service running
- Python 3 with `requests`, `psutil`, and `chromadb`

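If you want Ollama on the same machine, one way to satisfy that requirement is the upstream nixpkgs `services.ollama` module (assuming it is available in your nixpkgs revision); the snippet below is illustrative and not part of this flake. You will still need to pull the configured model, e.g. with `ollama pull`.

```nix
# Illustrative: run Ollama locally via the upstream nixpkgs module.
# Its default endpoint (http://localhost:11434) matches ollamaHost above.
{
  services.ollama = {
    enable = true;
    # acceleration = "cuda"; # or "rocm"; optional and hardware-dependent
  };
}
```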
## Documentation

See `DESIGN.md` for comprehensive architecture documentation.

## License

This project is licensed under the **Peer Production License (PPL)** - see the [LICENSE](LICENSE) file for details.

The PPL is a copyfarleft license. In brief:

- ✅ Worker-owned cooperatives and collectives may use it commercially
- ✅ Non-profit organizations may use it for any purpose
- ✅ Individuals may use it for personal/non-commercial purposes
- ❌ For-profit companies using wage labor may not use it commercially

For more information about the Peer Production License, see: https://wiki.p2pfoundation.net/Peer_Production_License

## Author

Lily Miller (with assistance from Claude 4.5 Sonnet)