# Macha - AI-Powered Autonomous System Administrator
Macha is an AI-powered autonomous system administrator for NixOS that monitors system health, diagnoses issues, and can take corrective actions with appropriate approval workflows.
## Features
- Autonomous Monitoring: Continuous health checks with configurable intervals
- Multi-Host Management: SSH-based management of multiple NixOS hosts
- Tool Calling: Comprehensive system administration tools via Ollama LLM
- Queue-Based Architecture: Serialized LLM requests to prevent resource contention
- Knowledge Base: ChromaDB-backed learning system for operational wisdom
- Approval Workflows: Safety-first approach with configurable autonomy levels
- Notification System: Gotify integration for alerts
## Quick Start

### As a NixOS Flake Input

Add to your `flake.nix`:
```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    macha-autonomous.url = "git+https://git.coven.systems/lily/macha-autonomous";
  };

  outputs = { self, nixpkgs, macha-autonomous }: {
    nixosConfigurations.yourhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux"; # adjust for your host
      modules = [
        macha-autonomous.nixosModules.default
        {
          services.macha-autonomous = {
            enable = true;
            autonomyLevel = "suggest"; # observe, suggest, auto-safe, auto-full
            checkInterval = 300;       # seconds between health checks
            ollamaHost = "http://localhost:11434";
            model = "gpt-oss:latest";
          };
        }
      ];
    };
  };
}
```
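Then rebuild the host to activate the service: `sudo nixos-rebuild switch --flake .#yourhost` (substituting your host name).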
## Configuration Options

See `module.nix` for the full set of configuration options, including the following (a hedged sketch appears after this list):

- Autonomy levels (`observe`, `suggest`, `auto-safe`, `auto-full`)
- Check intervals
- Ollama host and model settings
- Git repository monitoring
- Service user/group configuration
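As a rough illustration, a fuller configuration might look like the sketch below. The `autonomyLevel`, `checkInterval`, `ollamaHost`, and `model` options appear in the Quick Start example above; the option names for git repository monitoring and the service user/group (`gitRepos`, `user`, `group` here) are assumptions for illustration only, so consult `module.nix` for the real names.

```nix
{
  services.macha-autonomous = {
    enable = true;

    # Options shown in the Quick Start example above.
    autonomyLevel = "auto-safe";           # observe | suggest | auto-safe | auto-full
    checkInterval = 600;                   # seconds between health checks
    ollamaHost = "http://localhost:11434";
    model = "gpt-oss:latest";

    # HYPOTHETICAL option names for git repository monitoring and
    # service user/group configuration -- check module.nix for the
    # actual options before uncommenting.
    # gitRepos = [ "/etc/nixos" ];
    # user = "macha";
    # group = "macha";
  };
}
```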
## CLI Tools

- `macha-chat` - Interactive chat interface
- `macha-ask` - Single-question interface
- `macha-check` - Trigger immediate health check
- `macha-approve` - Approve pending actions
- `macha-logs` - View service logs
- `macha-issues` - Query issue database
- `macha-knowledge` - Query knowledge base
- `macha-systems` - List managed systems
- `macha-notify` - Send Gotify notification
## Architecture
- Agent: Core AI logic with tool calling
- Orchestrator: Main monitoring loop
- Executor: Safe action execution
- Queue System: Serialized Ollama requests with priorities
- Context DB: ChromaDB for system context and learning
- Tools: System administration capabilities
## Requirements

- NixOS with flakes enabled
- Ollama service running
- Python 3 with `requests`, `psutil`, `chromadb`
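On NixOS, the first two requirements can be met with something like the following minimal sketch. Both options are standard NixOS settings (`services.ollama` is the upstream Ollama module); the flake's own module is assumed to provide the Python dependencies for the service itself.

```nix
{
  # Enable flakes, required to consume this repository as a flake input.
  nix.settings.experimental-features = [ "nix-command" "flakes" ];

  # Run Ollama locally so Macha can reach it at http://localhost:11434.
  services.ollama.enable = true;
}
```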
## Documentation

See `DESIGN.md` for comprehensive architecture documentation.
## License
[Add your license here]
## Author
Lily Miller