Macha - AI-Powered Autonomous System Administrator

Macha is an AI-powered autonomous system administrator for NixOS that monitors system health, diagnoses issues, and can take corrective actions with appropriate approval workflows.

Features

  • Autonomous Monitoring: Continuous health checks with configurable intervals
  • Multi-Host Management: SSH-based management of multiple NixOS hosts
  • Tool Calling: Comprehensive system administration tools via Ollama LLM
  • Queue-Based Architecture: Serialized LLM requests to prevent resource contention
  • Knowledge Base: ChromaDB-backed learning system for operational wisdom
  • Approval Workflows: Safety-first approach with configurable autonomy levels
  • Notification System: Gotify integration for alerts
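
The tool-calling feature maps onto Ollama's /api/chat endpoint, which accepts a tools list of function schemas alongside the messages. A minimal sketch of how such a request body might be assembled — the run_command tool name and its schema here are illustrative, not Macha's actual definitions (those live in its SysadminTools class):

```python
def build_chat_payload(model, question, tools):
    """Assemble an Ollama /api/chat request body with tool definitions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "tools": tools,
        "stream": False,
    }

# Hypothetical tool schema for illustration only.
run_command_tool = {
    "type": "function",
    "function": {
        "name": "run_command",
        "description": "Execute a shell command on a managed host",
        "parameters": {
            "type": "object",
            "properties": {
                "host": {"type": "string"},
                "command": {"type": "string"},
            },
            "required": ["host", "command"],
        },
    },
}

payload = build_chat_payload(
    "gpt-oss:latest", "Is sshd running on web01?", [run_command_tool]
)
```

When the model decides to call a tool, the response carries the function name and arguments back, and the agent dispatches them to the matching tool implementation.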

Quick Start

As a NixOS Flake Input

Add to your flake.nix:

{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    macha-autonomous.url = "git+https://git.coven.systems/lily/macha-autonomous";
  };

  outputs = { self, nixpkgs, macha-autonomous }: {
    nixosConfigurations.yourhost = nixpkgs.lib.nixosSystem {
      modules = [
        macha-autonomous.nixosModules.default
        {
          services.macha-autonomous = {
            enable = true;
            autonomyLevel = "suggest";  # observe, suggest, auto-safe, auto-full
            checkInterval = 300;
            ollamaHost = "http://localhost:11434";
            model = "gpt-oss:latest";
          };
        }
      ];
    };
  };
}

Configuration Options

See module.nix for full configuration options including:

  • Autonomy levels (observe, suggest, auto-safe, auto-full)
  • Check intervals
  • Ollama host and model settings
  • Git repository monitoring
  • Service user/group configuration
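
The four autonomy levels form an escalating policy for when an action may run without a human in the loop. A minimal sketch of how such a gate might be expressed — the needs_approval function and the action_is_safe flag are illustrative assumptions, not Macha's actual API:

```python
from enum import Enum

class AutonomyLevel(Enum):
    OBSERVE = "observe"      # report only, never act
    SUGGEST = "suggest"      # propose actions, all require approval
    AUTO_SAFE = "auto-safe"  # run pre-vetted safe actions automatically
    AUTO_FULL = "auto-full"  # run any proposed action automatically

def needs_approval(level: AutonomyLevel, action_is_safe: bool) -> bool:
    """Return True if a proposed action must wait for macha-approve."""
    if level is AutonomyLevel.AUTO_FULL:
        return False
    if level is AutonomyLevel.AUTO_SAFE:
        return not action_is_safe
    # observe and suggest never act on their own
    return True
```

Under this reading, auto-safe is the middle ground: routine remediations proceed immediately while anything risky still queues for macha-approve.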

CLI Tools

  • macha-chat - Interactive chat interface
  • macha-ask - Single-question interface
  • macha-check - Trigger immediate health check
  • macha-approve - Approve pending actions
  • macha-logs - View service logs
  • macha-issues - Query issue database
  • macha-knowledge - Query knowledge base
  • macha-systems - List managed systems
  • macha-notify - Send Gotify notification

Architecture

  • Agent: Core AI logic with tool calling
  • Orchestrator: Main monitoring loop
  • Executor: Safe action execution
  • Queue System: Serialized Ollama requests with priorities
  • Context DB: ChromaDB for system context and learning
  • Tools: System administration capabilities
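
The queue system's job is to funnel every Ollama request through a single worker so concurrent callers (the monitoring loop, chat sessions) never contend for the model. A minimal stdlib sketch of that pattern, with illustrative names — Macha's real queue adds its own priorities and request types:

```python
import threading
from queue import PriorityQueue

class LLMQueue:
    """Serialize LLM requests: one worker thread drains a priority queue."""

    def __init__(self, handler):
        self._queue = PriorityQueue()
        self._handler = handler  # callable that actually talks to Ollama
        self._counter = 0        # tie-breaker keeps equal priorities FIFO
        self._lock = threading.Lock()
        threading.Thread(target=self._worker, daemon=True).start()

    def submit(self, priority, prompt):
        """Enqueue a request; lower priority number is served first."""
        done = threading.Event()
        box = {}
        with self._lock:
            self._counter += 1
            self._queue.put((priority, self._counter, prompt, done, box))
        return done, box

    def _worker(self):
        while True:
            _, _, prompt, done, box = self._queue.get()
            box["result"] = self._handler(prompt)  # one request in flight
            done.set()
```

Usage: submit returns an Event plus a result holder, so a caller can block with done.wait() while the worker guarantees only one request reaches Ollama at a time.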

Requirements

  • NixOS with flakes enabled
  • Ollama service running
  • Python 3 with the requests, psutil, and chromadb packages

Documentation

See DESIGN.md for comprehensive architecture documentation.

License

[Add your license here]

Author

Lily Miller
