When installing Node.js, there is no need to install the optional build tools; install them later only if you work with C or C++ modules

Install WSL2
Open PowerShell as Administrator and run:
Enable the WSL feature
dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart
Enable the Virtual Machine Platform
dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart
Reboot
Step 2.2: Install WSL2 + Ubuntu
Install WSL2 with Ubuntu as the default distro
wsl --install -d Ubuntu
When installation finishes, set your username and password (input is hidden)

The username must be lowercase and start with a lowercase letter or an underscore
Invalid username. A valid username must start with a lowercase letter or underscore, and can contain lowercase letters, digits, underscores, and dashes.
If the two password entries do not match, you will be asked to type them again
New password:
Retype new password:
Sorry, passwords do not match.
passwd: Authentication token manipulation error
passwd: password unchanged
Try again? [y/N] y
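The username rule quoted in that error message can be expressed as a small regex check. A minimal sketch to pre-check a name before running the installer (the helper name is mine, not part of Ubuntu's setup):

```shell
# Hypothetical helper: validate a proposed WSL username against the rule from
# the error message above (start with a lowercase letter or underscore; then
# lowercase letters, digits, underscores, and dashes)
is_valid_username() {
  printf '%s' "$1" | grep -Eq '^[a-z_][a-z0-9_-]*$'
}

is_valid_username "dev_user" && echo "ok"        # → ok
is_valid_username "Admin" || echo "rejected"     # → rejected
```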

Confirm the installation succeeded
wsl --status
wsl --list --verbose
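In the `wsl --list --verbose` output, the VERSION column for Ubuntu should read 2. As a sketch of what to look for, here is a hypothetical parse of sample output (the sample text is assumed; real output may differ slightly in spacing):

```shell
# Assumed sample of `wsl --list --verbose` output
sample='  NAME      STATE           VERSION
* Ubuntu    Running         2'

# Extract the VERSION column for a given distro name;
# the leading "*" marks the default distro and is stripped first
wsl_version_of() {
  printf '%s\n' "$2" | tr -d '*' | awk -v d="$1" '$1 == d { print $3 }'
}

wsl_version_of Ubuntu "$sample"   # → 2
```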

After installation, Ubuntu appears in the Start menu

Launch the newly installed Ubuntu

Update the system packages
sudo apt update && sudo apt upgrade -y
You may need to enter your password again during the update (input is hidden)


Install common tools
sudo apt install -y curl git wget build-essential

Step 2.4: WSL2 kernel update (important)
Run in PowerShell (ensures WSL2 uses the latest kernel)
wsl --update
wsl --shutdown

Part 3: Install Docker Desktop
Go to [https://www.docker.com/products/docker-desktop/](https://www.docker.com/products/docker-desktop/) and download the Windows version
Launch Docker and make sure that under Settings > Resources > WSL integration, "Enable integration with my default WSL distro" is checked; also enable Ubuntu, then click Apply & Restart

Step 3.3: Verify Docker is working
Run in a WSL2 (Ubuntu) terminal:
docker --version
Shows: Docker version 29.2.1, build a5c7197
docker compose version
Shows: Docker Compose version v5.0.2
docker run hello-world
This downloads Docker's hello-world image

A hello-world container and image will appear in Docker; you can delete them (this step only confirms Docker works inside WSL)
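The leftover container and image can also be removed from the same terminal; a sketch using standard Docker commands (run only after the hello-world test has finished):

```shell
# Remove any stopped containers created from the hello-world image,
# then remove the image itself
docker rm $(docker ps -aq --filter "ancestor=hello-world")
docker rmi hello-world
```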


Step 3.4: Enable GPU support (NVIDIA, optional)
Test GPU availability in PowerShell
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
This downloads a CUDA image into Docker and shows the CUDA status of the current device

Install Ollama natively on Windows
Go to [https://ollama.com/download](https://ollama.com/download) and download the Windows installer
OpenClaw requires a model with the tools tag. The table below shows which models qualify:
| Model | tools | vision | thinking | cloud |
| --- | --- | --- | --- | --- |
| llama3.2:latest | ✅ | ❌ | ❌ | ❌ |
| qwen3:8b | ✅ | ❌ | ✅ | ❌ |
| qwen3.5:cloud | ✅ | ✅ | ✅ | ✅ |
| qwen3-coder-next:cloud | ✅ | ❌ | ❌ | ✅ |
For example, llama3.2 and qwen3 carry the tools tag, so they can be used with OpenClaw's Agent mode
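One way to check the tags for a model you have pulled locally is the `ollama show` command, whose output in recent Ollama versions includes a Capabilities section (this is an assumption about your Ollama version; the model's page on ollama.com also lists its tags):

```shell
# Look for "tools" under Capabilities in the output (the model must be pulled first)
ollama show qwen3:8b
```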


Cloud models (requires an Ollama account login)
ollama pull qwen3-coder-next:cloud
ollama pull qwen3-coder:480b-cloud
ollama pull qwen3.5:cloud
Local models
ollama pull qwen3:8b
ollama pull llama3.2:latest # 3B, the lightest local fallback
For cloud models, free Ollama accounts have time and weekly usage limits; upgrading increases the quota

Ollama settings that must be changed
Check "Expose Ollama to the network" so Docker can reach it
Set the context length; OpenClaw's minimum is 16000
For Qwen3 8B and larger, 65536 is recommended; fall back to 32768 if RAM is insufficient
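If you prefer not to use the settings GUI, newer Ollama builds also read a default context length from an environment variable; this is an assumption about your Ollama version, so treat it as an alternative sketch rather than the documented path:

```shell
# PowerShell / cmd on Windows: persist a default context window for Ollama
# (OLLAMA_CONTEXT_LENGTH is assumed to be honored by your build; restart Ollama afterwards)
setx OLLAMA_CONTEXT_LENGTH 65536
```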

Part 5: Install OpenClaw (Docker install)
Step 5.1: Clone the official repository
cd ~
git clone https://github.com/openclaw/openclaw
If the clone succeeded, you can move into the openclaw folder
cd openclaw

This folder is where everything lives: it is the Docker install package, and both installation and startup happen here

Step 5.2: Run the Docker setup script
First confirm the script exists
ls -la docker-setup.sh
Run it only if it exists
bash docker-setup.sh
This starts pulling images and building the Docker containers

When installation finishes, the setup wizard appears

You can also see in Docker that openclaw is running

Use the left/right arrow keys to accept the license


I understand this is powerful and inherently risky. Continue? => yes
Onboarding mode => Manual
What do you want to set up? => Local gateway (this machine)
Workspace directory => /home/node/.openclaw/workspace (default)
Model/auth provider => Skip for now
Filter models by provider => pick anything; there is no Ollama or skip option here
Default model => Enter model manually
Default model => enter anything; we will fix it in the config file later
Gateway port => the port OpenClaw will use
Gateway bind => Loopback (127.0.0.1) (local machine only)
Gateway auth => Token
Tailscale exposure => Off (No Tailscale exposure)
Gateway token (blank to generate) => leave blank


Configure chat channels now? => No (skip chat channels for now)
Configure skills now? (recommended) => Yes (set up skills first)
Install missing skill dependencies
🧩 clawhub, 🐙 github, 📄 nano-pdf, 💎 obsidian, 🎙️ openai-whisper, 🗣️ sag, 🧾 summarize, 𝕏 xurl
Show Homebrew install command? => No
If you are on macOS
If you are on WSL / Ubuntu (Linux)
Preferred node manager for skill installs => npm
Set GOOGLE_PLACES_API_KEY for goplaces? No
Set GEMINI_API_KEY for nano-banana-pro? No
Set NOTION_API_KEY for notion? No
Set OPENAI_API_KEY for openai-image-gen? No
Set OPENAI_API_KEY for openai-whisper-api? No
Set ELEVENLABS_API_KEY for sag? No

Enable hooks? => Skip for now


If a skill fails to install, there are two ways to handle it
Option 1 (recommended, simple)
Skip those skills for now
Option 2 (if you want to fix it)
Manually install the corresponding CLI the Ubuntu way (not with brew)
Case B: uv not installed (nano-pdf)
This one is simple: nano-pdf needs uv and you have not installed it yet.
✅ Fix
Install uv first (WSL Ubuntu)
curl -LsSf https://astral.sh/uv/install.sh | sh
source ~/.bashrc
uv --version
If you plan to use nano-pdf later, it is worth installing.


Enable zsh shell completion for openclaw? => whether to enable zsh tab completion for the command; under WSL simply choose No
And that completes the installation

If Docker shows this error
2026-02-24T15:17:42.199+00:00 Gateway failed to start: Error: non-loopback
Control UI requires gateway.controlUi.allowedOrigins (set explicit origins), or
set gateway.controlUi.dangerouslyAllowHostHeaderOriginFallback=true to use
Host-header origin fallback mode

it usually means something is wrong in openclaw.json

Open openclaw.json and find the gateway section
"gateway": {
  "port": 18789,
  "mode": "local",
  "bind": "loopback",
  "auth": {
    "mode": "token",
    "token": "4f477d2cf49c09bdcdf0e10577fc22c3db074982a22b311c"
  },
  "tailscale": {
    "mode": "off",
    "resetOnExit": false
  },
  "nodes": {
    "denyCommands": [
      "camera.snap",
      "camera.clip",
      "screen.record",
      "calendar.add",
      "contacts.add",
      "reminders.add"
    ]
  }
},
Add a controlUi field
"controlUi": {
  "allowedOrigins": [
    "http://127.0.0.1:18789",
    "http://localhost:18789"
  ]
},
The result should look like this:
"gateway": {
  "port": 18789,
  "mode": "local",
  "bind": "loopback",
  "controlUi": {
    "allowedOrigins": [
      "http://127.0.0.1:18789",
      "http://localhost:18789"
    ]
  },
  "auth": {
    "mode": "token",
    "token": "4f477d2cf49c09bdcdf0e10577fc22c3db074982a22b311c"
  },
  "tailscale": {
    "mode": "off",
    "resetOnExit": false
  },
  "nodes": {
    "denyCommands": [
      "camera.snap",
      "camera.clip",
      "screen.record",
      "calendar.add",
      "contacts.add",
      "reminders.add"
    ]
  }
},
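Hand-editing JSON makes mismatched braces easy, so a quick syntax check before restarting is cheap. A sketch using python3 (preinstalled on Ubuntu), shown here on an inline sample; point it at your real openclaw.json path instead:

```shell
# Validate JSON syntax; replace the inline sample with your file, e.g.:
#   python3 -m json.tool path/to/openclaw.json > /dev/null && echo "JSON OK"
printf '%s' '{"gateway": {"port": 18789, "bind": "loopback"}}' \
  | python3 -m json.tool > /dev/null && echo "JSON OK"
```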
After the change, Docker starts normally

The web page is reachable

Switch to the Overview page


Using your TOKEN, you can obtain a pairing request ID
For example, my TOKEN = 4f477d2cf49c09bdcdf0e10577fc22c3db074982a22b311c
docker compose exec -e OPENCLAW_GATEWAY_TOKEN=4f477d2cf49c09bdcdf0e10577fc22c3db074982a22b311c openclaw-gateway node dist/index.js devices list
Note: you must first enter the TOKEN at localhost:18789/overview before devices list can find the pairing request
Note that different browsers (including incognito windows) and different IPs each count as separate devices, and each must be paired again

You will get a request ID [61de2211-77bf-4b46-b0e6-31700fb6d96a]
Use the TOKEN and the request ID to pair and enable OpenClaw
docker compose exec -e OPENCLAW_GATEWAY_TOKEN=4f477d2cf49c09bdcdf0e10577fc22c3db074982a22b311c openclaw-gateway node dist/index.js devices approve 61de2211-77bf-4b46-b0e6-31700fb6d96a
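Because the token appears in both exec commands, it can help to keep it and the request ID in shell variables so the commands stay in sync. A sketch (the variable names are mine, not OpenClaw's); it only prints the command, so drop the echo to actually run it:

```shell
TOKEN="4f477d2cf49c09bdcdf0e10577fc22c3db074982a22b311c"   # from gateway.auth.token
REQ_ID="61de2211-77bf-4b46-b0e6-31700fb6d96a"              # from devices list

# Print the approve command with both values substituted in
echo docker compose exec -e OPENCLAW_GATEWAY_TOKEN="$TOKEN" \
  openclaw-gateway node dist/index.js devices approve "$REQ_ID"
```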

"Your AI assistant, now without the $3,499 headset." is a humorous tagline shown at random when the OpenClaw CLI starts. It is not a payment notice, a license message, or a request for money; it is just a joke about the price of a certain headset.
At localhost:18789/overview you can see that OpenClaw has started successfully

The connection now works; the next step is hooking up Ollama
Open openclaw.json and find models; for now it only contains the ollama{} entry created earlier

Find the agents section

Rather than editing in place, it is safest to locate both models and agents, delete the two fields, and paste in the replacements wholesale, to avoid mismatched braces
"models": {
  "mode": "merge",
  "providers": {
    "ollama": {
      "baseUrl": "http://host.docker.internal:11434",
      "apiKey": "ollama-local",
      "auth": "api-key",
      "api": "ollama",
      "models": [
        {
          "id": "qwen3-coder-next:cloud",
          "name": "Qwen3 Coder Next (Cloud)",
          "api": "ollama",
          "reasoning": false,
          "input": ["text"],
          "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
          "contextWindow": 262144,
          "maxTokens": 16384
        },
        {
          "id": "qwen3-coder:480b-cloud",
          "name": "Qwen3 Coder 480B (Cloud)",
          "api": "ollama",
          "reasoning": false,
          "input": ["text"],
          "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
          "contextWindow": 262144,
          "maxTokens": 16384
        },
        {
          "id": "qwen3.5:cloud",
          "name": "Qwen3.5 Vision (Cloud)",
          "api": "ollama",
          "reasoning": false,
          "input": ["text", "image"],
          "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
          "contextWindow": 131072,
          "maxTokens": 16384
        },
        {
          "id": "qwen3:8b",
          "name": "Qwen3 8B (Local)",
          "api": "ollama",
          "reasoning": false,
          "input": ["text"],
          "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
          "contextWindow": 32768,
          "maxTokens": 8192
        },
        {
          "id": "llama3.2:latest",
          "name": "Llama 3.2 3B (Local, 備援)",
          "api": "ollama",
          "reasoning": false,
          "input": ["text"],
          "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
          "contextWindow": 32768,
          "maxTokens": 4096
        }
      ]
    }
  }
},
"agents": {
  "defaults": {
    "models": {
      "ollama/qwen3-coder-next:cloud": { "alias": "Coder-Next-Cloud" },
      "ollama/qwen3-coder:480b-cloud": { "alias": "Coder-480B-Cloud" },
      "ollama/qwen3.5:cloud": { "alias": "Qwen3.5-Vision-Cloud" },
      "ollama/qwen3:8b": { "alias": "Qwen3-8B-Local" },
      "ollama/llama3.2:latest": { "alias": "Llama3.2-Local" }
    },
    "workspace": "/home/node/.openclaw/workspace",
    "compaction": {
      "mode": "safeguard"
    },
    "blockStreamingBreak": "message_end",
    "maxConcurrent": 4,
    "subagents": {
      "maxConcurrent": 8
    }
  },
  "list": [
    {
      "id": "main",
      "default": true,
      "model": {
        "primary": "ollama/qwen3-coder-next:cloud",
        "fallbacks": [
          "ollama/qwen3.5:cloud",
          "ollama/qwen3-coder:480b-cloud",
          "ollama/qwen3:8b",
          "ollama/llama3.2:latest"
        ]
      },
      "identity": {
        "name": "小助理",
        "theme": "親切的繁體中文 AI 助理",
        "emoji": "🤖",
        "avatar": ""
      }
    }
  ]
},
The full openclaw.json is also attached here, but the TOKEN must be changed

After restarting the Docker openclaw-gateway service, open the web page and go to the agents view; you will find the assistant created above. Select a model and save (note: the selection is lost after refreshing or re-entering the page)
If you need to change the model priority, open openclaw.json and adjust the order inside agents
For example, I made Qwen3-8B-Local the primary model



You can go back to the chat and ask the AI questions, though it sometimes answers in Simplified Chinese

Open the .openclaw\workspace folder and you will see files such as USER.md and SOUL.md

The files are attached here and can be copied over directly (back up the originals first)
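The author's replacement files are not reproduced above. As a purely illustrative sketch (my wording, not the attached files), a language rule like the following in SOUL.md or USER.md is the kind of instruction that curbs Simplified-Chinese replies:

```
## Language
- Always reply in Traditional Chinese (Taiwan usage); never use Simplified characters.
- 一律使用繁體中文(台灣用語)回覆,禁止使用簡體字。
```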






I asked it to create a folder and a file

The target file can be found under .openclaw\workspace\hello_world
