Autonomously manage and run local Ollama models for continuous operation with no internet connection required.
```bash
openclaw install @and-ray-m/offline-llama
```
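Once installed, the plugin drives a locally running Ollama server over Ollama's standard HTTP API. As a rough illustration of what offline inference looks like, the sketch below queries that API directly in TypeScript. It assumes Ollama is serving on its default port (`http://localhost:11434`), and the model name `llama3.2` is purely illustrative; pull whatever model you use first with `ollama pull <model>`.

```ts
// Minimal sketch: chat with a local Ollama model, no internet required.
// Assumes an Ollama server on its default port and a previously
// pulled model ("llama3.2" here is an illustrative placeholder).

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatResponse {
  message: ChatMessage;
}

async function chatOffline(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",               // illustrative model name
      messages: [{ role: "user", content: prompt }],
      stream: false,                   // return one JSON object instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as ChatResponse;
  return data.message.content;
}

chatOffline("Hello from a local model!").then(console.log);
```

Setting `stream: false` keeps the example to a single request/response pair; a long-running agent would more likely stream tokens and keep the connection open.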