Routes LLM requests to a local model (Ollama, LM Studio, llamafile) before falling back to cloud APIs.
openclaw install @joelnishanth/local-first-llm
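The local-first routing described above can be sketched as below. This is an illustrative assumption of how such a router might work, not the plugin's actual implementation; the function names (`local_available`, `choose_backend`, `complete`) and the `llama3` model name are hypothetical, though `http://localhost:11434` and `/api/generate` are Ollama's real defaults.

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def local_available(base_url: str = OLLAMA_URL, timeout: float = 0.5) -> bool:
    """Probe the local server; any HTTP response means a local model is up."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False


def choose_backend(local_up: bool) -> str:
    """Local-first policy: prefer the local model, else fall back to cloud."""
    return "local" if local_up else "cloud"


def complete(prompt: str) -> str:
    """Route a completion request: try the local model first, then cloud."""
    if choose_backend(local_available()) == "local":
        req = urllib.request.Request(
            f"{OLLAMA_URL}/api/generate",
            data=json.dumps(
                {"model": "llama3", "prompt": prompt, "stream": False}
            ).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]
    # Fallback path: wire up your cloud provider's client here (placeholder).
    raise RuntimeError("no local model running and no cloud client configured")
```

The health probe is the key design choice: a cheap timeout-bounded request decides the route per call, so a local server that goes down mid-session degrades gracefully to the cloud path instead of failing.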