Compare commits

18 Commits

Author | SHA1 | Message | Date
Windmill Bot | 19f64db8ed | Auto-sync: 2026-03-20 09:00:01 | 2026-03-20 09:00:01 +00:00
Windmill Bot | d29dcc2b61 | Auto-sync: 2026-03-13 06:30:01 | 2026-03-13 06:30:01 +00:00
Windmill Bot | 280c12ccdf | Auto-sync: 2026-03-05 06:30:01 | 2026-03-05 06:30:01 +00:00
Windmill Bot | ee88aa8c32 | Auto-sync: 2026-03-05 06:00:01 | 2026-03-05 06:00:01 +00:00
Windmill Bot | 5d3dd18224 | Auto-sync: 2026-03-03 17:00:01 | 2026-03-03 17:00:01 +00:00
Windmill Bot | 6f099e3665 | Auto-sync: 2026-03-03 07:30:01 | 2026-03-03 07:30:01 +00:00
Windmill Bot | c5b49c015d | Auto-sync: 2026-03-03 06:00:01 | 2026-03-03 06:00:01 +00:00
Windmill Bot | 873d834dad | Auto-sync: 2026-03-03 05:00:01 | 2026-03-03 05:00:01 +00:00
Windmill Bot | 72483e045b | Auto-sync: 2026-03-03 03:00:01 | 2026-03-03 03:00:01 +00:00
Windmill Bot | 2a81b7ea35 | Auto-sync: 2026-03-02 12:00:01 | 2026-03-02 12:00:01 +00:00
Windmill Bot | 4666fa23fc | Auto-sync: 2026-03-02 11:30:01 | 2026-03-02 11:30:01 +00:00
Windmill Bot | 27854af3a7 | Auto-sync: 2026-03-02 08:00:02 | 2026-03-02 08:00:02 +00:00
Windmill Bot | e646a87e6b | Auto-sync: 2026-03-02 05:30:01 | 2026-03-02 05:30:01 +00:00
Windmill Bot | 434fa33670 | Auto-sync: 2026-03-02 05:00:01 | 2026-03-02 05:00:01 +00:00
Windmill Bot | 65846cf6f6 | Auto-sync: 2026-03-02 04:30:01 | 2026-03-02 04:30:01 +00:00
Windmill Bot | bdd1f5c689 | Auto-sync: 2026-03-02 04:00:01 | 2026-03-02 04:00:01 +00:00
Windmill Bot | 33a4f5ad7b | Auto-sync: 2026-03-01 17:29:03 | 2026-03-01 17:29:03 +00:00
Windmill Bot | dcca6ee056 | sync branch: exclude infra files, track only workflow definitions | 2026-03-02 02:24:04 +09:00
30 changed files with 606 additions and 1045 deletions

.gitignore vendored
View File

@@ -46,3 +46,12 @@ workflows/.wmill/tmp/
!workflows/g/
!workflows/wmill.yaml
!workflows/wmill-lock.yaml
# Do not track infrastructure files on the sync branch
docker-compose.yml
docker-compose-dev.yml
Caddyfile
SERVER_SETUP.md
env.host
sync_to_git.sh
mcp/
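The new entries above stop tracking the infrastructure files while the existing `!workflows/...` negations keep the workflow definitions tracked. A quick way to sanity-check such rules is `git check-ignore`; the snippet below builds a throwaway repo containing only the added entries (file names taken from the hunk above):

```shell
# Build a temporary repo and verify which paths the new ignore rules match.
tmpdir=$(mktemp -d) && cd "$tmpdir"
git init -q .
printf '%s\n' docker-compose.yml docker-compose-dev.yml Caddyfile SERVER_SETUP.md env.host sync_to_git.sh mcp/ > .gitignore
git check-ignore -q docker-compose.yml && echo "docker-compose.yml: ignored"
git check-ignore -q workflows/wmill.yaml || echo "workflows/wmill.yaml: still tracked"
```

`git check-ignore -v` additionally prints which `.gitignore` line matched, which helps when negation patterns interact.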

View File

@@ -1,35 +0,0 @@
{
layer4 {
:25 {
proxy {
to windmill_server:2525
}
}
}
}
{$BASE_URL} {
bind {$ADDRESS}
# LSP - Language Server Protocol for code intelligence (windmill_extra:3001)
reverse_proxy /ws/* http://windmill_extra:3001
# Multiplayer - Real-time collaboration, Enterprise Edition (windmill_extra:3002)
# Uncomment and set ENABLE_MULTIPLAYER=true in docker-compose.yml
# reverse_proxy /ws_mp/* http://windmill_extra:3002
# Debugger - Interactive debugging via DAP WebSocket (windmill_extra:3003)
# Set ENABLE_DEBUGGER=true in docker-compose.yml to enable
handle_path /ws_debug/* {
reverse_proxy http://windmill_extra:3003
}
# Search indexer, Enterprise Edition (windmill_indexer:8002)
# reverse_proxy /api/srch/* http://windmill_indexer:8002
# Default: Windmill server
reverse_proxy /* http://windmill_server:8000
# TLS with custom certificates
# tls /certs/cert.pem /certs/key.pem
}

View File

@@ -1,155 +0,0 @@
# Windmill server setup guide (VPS migration)
Deployment steps for the production VPS.
This assumes Traefik is already running (i.e. the `traefik-net` network exists).
## Prerequisites
- Traefik is running on the server and the `traefik-net` network exists.
- The domain `windmill.keinafarm.net` points at the server's IP.
## Step 1: Prepare the repository
Clone the repository to any location on the server (e.g. `/home/windmill/windmill`).
**Important**: this directory path matters because Windmill's Git sync feature depends on it.
```bash
mkdir -p /home/windmill
cd /home/windmill
git clone https://gitea.keinafarm.net/akira/windmill.git windmill
cd windmill
```
## Step 2: Configure environment variables
Create a `.env` file and apply the production settings.
```bash
cp .env .env.production
nano .env
```
Review and adjust the following:
- `DATABASE_URL`: `postgres://postgres:<your strong password>@db/windmill?sslmode=disable`
- `POSTGRES_PASSWORD`: the same password as above
- `WM_IMAGE`: `ghcr.io/windmill-labs/windmill:main`
## Step 3: Start
`docker-compose.yml` is configured for production (Traefik integration included):
```bash
docker-compose up -d
```
## Step 4: Set up the Git sync workflow
This enables saving workflows registered in Windmill to Git.
When the `git_sync` flow runs on schedule, changes to scripts/flows in the Windmill DB are automatically committed and pushed to the Gitea repository.
### 4-1. Obtain a Windmill API token
1. Log in at `https://windmill.keinafarm.net` in a browser
2. Click **Settings** → **Account** at the bottom left
3. In the **Tokens** section, click **Create token**
4. Enter a label (e.g. `git-sync`) and create the token
5. Copy the displayed token (used in a later step)
### 4-2. Import the workflow definitions (first run only)
Import the definition files under `workflows/` in the repository into the Windmill DB.
```bash
# Enter the Windmill server container
docker exec -it windmill_server /bin/bash
# Inside the container: install windmill-cli
npm install -g windmill-cli
# Move to the directory containing wmill.yaml and push
cd /workspace/workflows
wmill sync push \
--token "<token from step 4-1>" \
--base-url "http://localhost:8000" \
--workspace admins \
--yes
exit
```
> **Note**: `wmill sync push` applies disk → DB.
> Conversely, `wmill sync pull` applies DB → disk.
> Since the scheduled `git_sync` flow runs `sync pull`,
> **scripts edited directly in the UI are correctly written back to disk on the next sync pull.**
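To keep the two directions straight, the same flags from step 4-2 apply to both commands (the token placeholder below is not from the repository):

```bash
# disk → DB: import the repo's definitions into Windmill (step 4-2)
wmill sync push --token "<token>" --base-url "http://localhost:8000" --workspace admins --yes
# DB → disk: export the workspace state; this is what the scheduled git_sync flow runs
wmill sync pull --token "<token>" --base-url "http://localhost:8000" --workspace admins --yes
```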
### 4-3. Configure Gitea credentials (for git push)
Embed a Gitea access token in the remote URL on the server so the `git_sync` flow can `git push` to Gitea.
```bash
# Run on the server host
cd ~/windmill
# Check the current remote URL
git remote -v
# Switch to a URL that embeds the Gitea access token
git remote set-url origin https://<username>:<gitea token>@gitea.keinafarm.net/akira/windmill.git
```
> **Creating a Gitea token**: log in to Gitea (`https://gitea.keinafarm.net`) →
> top-right avatar → Settings → Applications → Generate New Token
### 4-4. Set the WM_TOKEN variable
In the Windmill web UI, register the variable used by the `git_sync` flow.
1. Click **Variables** in the left menu
2. Click **+ Variable**
3. Enter the following:
- **Path**: `u/antigravity/wm_token`
- **Value**: the Windmill API token from step 4-1
- **Is Secret**: ✅ on
4. Click **Save**
> **Note**: the variable is referenced as `$WM_TOKEN` inside the `git_sync` flow script (`a.sh`).
> In the flow's input settings, confirm this variable is bound correctly.
### 4-5. Manually test the git_sync flow
1. Open the **`u/antigravity/git_sync`** flow in the Windmill UI
2. Run it manually with the **Run** button
3. Check the execution log on the **Runs** page
4. On success, an auto-sync commit should appear in the Gitea repository
### 4-6. Verify the schedule
`git_sync.schedule.yaml` registers a schedule that runs the flow automatically every 2 minutes.
Under **Schedules** in the left menu, confirm the schedule is enabled.
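For reference, a schedule file of this kind typically carries a cron expression; the sketch below is an assumed shape only (field names are illustrative, not taken from the repository):

```yaml
# Hypothetical sketch of git_sync.schedule.yaml - run every 2 minutes
summary: git_sync auto-run
schedule: '*/2 * * * *'
timezone: Asia/Tokyo
enabled: true
```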
---
## Troubleshooting
### Files on disk revert to old content
Because the `git_sync` flow runs `wmill sync pull` (DB → disk), the state in the Windmill DB overwrites files on disk.
Editing scripts **directly in the Windmill UI** is the reliable approach.
### git push fails
```bash
# On the server, check that the remote URL includes the token
cd ~/windmill
git remote -v
# It should have the form https://<user>:<token>@gitea.keinafarm.net/...
```
### Starting locally (development)
Use `docker-compose-dev.yml` for local startup:
```bash
docker-compose -f docker-compose-dev.yml up -d
```
### Checking logs
```bash
docker-compose logs -f
```

View File

@@ -1,182 +0,0 @@
version: "3.9"
x-logging: &default-logging
driver: "json-file"
options:
max-size: "${LOG_MAX_SIZE:-20m}"
max-file: "${LOG_MAX_FILE:-10}"
compress: "true"
networks:
traefik-net:
external: true # network managed by Traefik
windmill-internal:
driver: bridge # for Windmill-internal traffic
services:
db:
deploy:
replicas: 1
image: postgres:16
shm_size: 1g
restart: unless-stopped
volumes:
- db_data:/var/lib/postgresql/data
expose:
- 5432
environment:
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
POSTGRES_DB: windmill
healthcheck:
test: [ "CMD-SHELL", "pg_isready -U postgres" ]
interval: 10s
timeout: 5s
retries: 5
logging: *default-logging
networks:
- windmill-internal
windmill_server:
image: ${WM_IMAGE}
container_name: windmill_server
pull_policy: always
deploy:
replicas: 1
restart: unless-stopped
expose:
- 8000
- 2525
environment:
- DATABASE_URL=${DATABASE_URL}
- MODE=server
depends_on:
db:
condition: service_healthy
volumes:
- worker_logs:/tmp/windmill/logs
- /home/windmill/windmill:/workspace
labels:
# Traefik configuration
- "traefik.enable=true"
# HTTPS router
- "traefik.http.routers.windmill.rule=Host(`windmill.keinafarm.net`)"
- "traefik.http.routers.windmill.entrypoints=websecure"
- "traefik.http.routers.windmill.tls=true"
- "traefik.http.routers.windmill.tls.certresolver=letsencrypt"
- "traefik.http.services.windmill.loadbalancer.server.port=8000"
# HTTP-to-HTTPS redirect
- "traefik.http.routers.windmill-http.rule=Host(`windmill.keinafarm.net`)"
- "traefik.http.routers.windmill-http.entrypoints=web"
- "traefik.http.routers.windmill-http.middlewares=windmill-https-redirect"
- "traefik.http.middlewares.windmill-https-redirect.redirectscheme.scheme=https"
networks:
- traefik-net
- windmill-internal
logging: *default-logging
windmill_worker:
image: ${WM_IMAGE}
pull_policy: always
deploy:
replicas: 3
resources:
limits:
cpus: "1"
memory: 2048M
restart: unless-stopped
environment:
- DATABASE_URL=${DATABASE_URL}
- MODE=worker
- WORKER_GROUP=default
depends_on:
db:
condition: service_healthy
volumes:
- /var/run/docker.sock:/var/run/docker.sock
- worker_dependency_cache:/tmp/windmill/cache
- worker_logs:/tmp/windmill/logs
- /home/windmill/windmill:/workspace
networks:
- windmill-internal
logging: *default-logging
windmill_worker_native:
image: ${WM_IMAGE}
pull_policy: always
deploy:
replicas: 1
resources:
limits:
cpus: "1"
memory: 2048M
restart: unless-stopped
environment:
- DATABASE_URL=${DATABASE_URL}
- MODE=worker
- WORKER_GROUP=native
- NUM_WORKERS=8
- SLEEP_QUEUE=200
depends_on:
db:
condition: service_healthy
volumes:
- worker_logs:/tmp/windmill/logs
networks:
- windmill-internal
logging: *default-logging
windmill_indexer:
image: ${WM_IMAGE}
pull_policy: always
deploy:
replicas: 0 # change to 1 if needed
restart: unless-stopped
expose:
- 8002
environment:
- PORT=8002
- DATABASE_URL=${DATABASE_URL}
- MODE=indexer
depends_on:
db:
condition: service_healthy
volumes:
- windmill_index:/tmp/windmill/search
- worker_logs:/tmp/windmill/logs
networks:
- windmill-internal
logging: *default-logging
windmill_extra:
image: ghcr.io/windmill-labs/windmill-extra:latest
pull_policy: always
restart: unless-stopped
expose:
- 3001
- 3002
- 3003
environment:
- ENABLE_LSP=true
- ENABLE_MULTIPLAYER=false
- ENABLE_DEBUGGER=true
- DEBUGGER_PORT=3003
- ENABLE_NSJAIL=false
- REQUIRE_SIGNED_DEBUG_REQUESTS=false
- WINDMILL_BASE_URL=http://windmill_server:8000
volumes:
- lsp_cache:/pyls/.cache
networks:
- windmill-internal
logging: *default-logging
# Caddy is not used (Traefik handles routing)
# caddy:
# deploy:
# replicas: 0
volumes:
db_data: null
worker_dependency_cache: null
worker_logs: null
worker_memory: null
windmill_index: null
lsp_cache: null

View File

@@ -1,202 +0,0 @@
x-logging: &default-logging
driver: "json-file"
options:
max-size: "${LOG_MAX_SIZE:-20m}"
max-file: "${LOG_MAX_FILE:-10}"
compress: "true"
networks:
traefik-net:
external: true # existing Traefik network on the server
windmill-internal:
driver: bridge
services:
db:
deploy:
replicas: 1
image: postgres:16
shm_size: 1g
restart: unless-stopped
volumes:
- db_data:/var/lib/postgresql/data
expose:
- 5432
environment:
POSTGRES_PASSWORD: ${DATABASE_PASSWORD}
POSTGRES_DB: windmill
healthcheck:
test: [ "CMD-SHELL", "pg_isready -U postgres" ]
interval: 10s
timeout: 5s
retries: 5
logging: *default-logging
networks:
- windmill-internal
windmill_server:
image: ${WM_IMAGE}
container_name: windmill_server
pull_policy: if_not_present
deploy:
replicas: 1
restart: unless-stopped
expose:
- 8000
environment:
- DATABASE_URL=${DATABASE_URL}
- MODE=server
- BASE_URL=https://windmill.keinafarm.net
- OAUTH_REDIRECT_BASE_URL=https://windmill.keinafarm.net
- GOOGLE_OAUTH_ENABLED=true
- GOOGLE_OAUTH_CLIENT_ID=${GOOGLE_OAUTH_CLIENT_ID}
- GOOGLE_OAUTH_CLIENT_SECRET=${GOOGLE_OAUTH_CLIENT_SECRET}
depends_on:
db:
condition: service_healthy
volumes:
- worker_logs:/tmp/windmill/logs
# For Git sync, mount the current directory (repository root) at /workspace
# so the container can access the .git directory and run git push
- .:/workspace
labels:
- "traefik.enable=true"
# HTTPS router
- "traefik.http.routers.windmill.rule=Host(`windmill.keinafarm.net`)"
- "traefik.http.routers.windmill.entrypoints=websecure"
- "traefik.http.routers.windmill.tls=true"
- "traefik.http.routers.windmill.tls.certresolver=letsencrypt"
- "traefik.http.services.windmill.loadbalancer.server.port=8000"
# HTTP-to-HTTPS redirect
- "traefik.http.routers.windmill-http.rule=Host(`windmill.keinafarm.net`)"
- "traefik.http.routers.windmill-http.entrypoints=web"
- "traefik.http.routers.windmill-http.middlewares=windmill-https-redirect"
- "traefik.http.middlewares.windmill-https-redirect.redirectscheme.scheme=https"
networks:
- traefik-net
- windmill-internal
logging: *default-logging
windmill_worker:
image: ${WM_IMAGE}
pull_policy: if_not_present
deploy:
replicas: 3
resources:
limits:
cpus: "1"
memory: 2048M
restart: unless-stopped
environment:
- DATABASE_URL=${DATABASE_URL}
- MODE=worker
- WORKER_GROUP=default
depends_on:
db:
condition: service_healthy
volumes:
- /var/run/docker.sock:/var/run/docker.sock
- worker_dependency_cache:/tmp/windmill/cache
- worker_logs:/tmp/windmill/logs
# Mounted in case workers also need Git sync
- .:/workspace
networks:
- windmill-internal
logging: *default-logging
windmill_worker_native:
image: ${WM_IMAGE}
pull_policy: if_not_present
deploy:
replicas: 1
resources:
limits:
cpus: "1"
memory: 2048M
restart: unless-stopped
environment:
- DATABASE_URL=${DATABASE_URL}
- MODE=worker
- WORKER_GROUP=native
- NUM_WORKERS=8
- SLEEP_QUEUE=200
depends_on:
db:
condition: service_healthy
volumes:
- worker_logs:/tmp/windmill/logs
networks:
- windmill-internal
logging: *default-logging
windmill_extra:
image: ghcr.io/windmill-labs/windmill-extra:${WM_VERSION}
pull_policy: if_not_present
restart: unless-stopped
expose:
- 3001
- 3002
- 3003
environment:
- ENABLE_LSP=true
- ENABLE_MULTIPLAYER=false
- ENABLE_DEBUGGER=true
- DEBUGGER_PORT=3003
- ENABLE_NSJAIL=false
- REQUIRE_SIGNED_DEBUG_REQUESTS=false
- WINDMILL_BASE_URL=http://windmill_server:8000
volumes:
- lsp_cache:/pyls/.cache
networks:
- windmill-internal
logging: *default-logging
labels:
# WebSocket routing for LSP etc. (replacement for the Caddyfile)
- "traefik.enable=true"
# Routing to LSP (/ws/* -> 3001)
- "traefik.http.routers.windmill-lsp.rule=Host(`windmill.keinafarm.net`) && PathPrefix(`/ws/`)"
- "traefik.http.routers.windmill-lsp.entrypoints=websecure"
- "traefik.http.routers.windmill-lsp.tls=true"
- "traefik.http.services.windmill-lsp.loadbalancer.server.port=3001"
# Routing to the debugger (/ws_debug/* -> 3003)
- "traefik.http.routers.windmill-debug.rule=Host(`windmill.keinafarm.net`) && PathPrefix(`/ws_debug/`)"
- "traefik.http.routers.windmill-debug.entrypoints=websecure"
- "traefik.http.routers.windmill-debug.tls=true"
- "traefik.http.services.windmill-debug.loadbalancer.server.port=3003"
windmill_mcp:
build:
context: ./mcp
dockerfile: Dockerfile
container_name: windmill_mcp
restart: unless-stopped
expose:
- 8001
environment:
- WINDMILL_TOKEN=${WINDMILL_TOKEN}
- WINDMILL_URL=https://windmill.keinafarm.net
- WINDMILL_WORKSPACE=admins
- MCP_TRANSPORT=sse
- MCP_HOST=0.0.0.0
- MCP_PORT=8001
labels:
- "traefik.enable=true"
# HTTPS router
- "traefik.http.routers.windmill-mcp.rule=Host(`windmill_mcp.keinafarm.net`)"
- "traefik.http.routers.windmill-mcp.entrypoints=websecure"
- "traefik.http.routers.windmill-mcp.tls=true"
- "traefik.http.routers.windmill-mcp.tls.certresolver=letsencrypt"
- "traefik.http.services.windmill-mcp.loadbalancer.server.port=8001"
# HTTP → HTTPS redirect
- "traefik.http.routers.windmill-mcp-http.rule=Host(`windmill_mcp.keinafarm.net`)"
- "traefik.http.routers.windmill-mcp-http.entrypoints=web"
- "traefik.http.routers.windmill-mcp-http.middlewares=windmill-https-redirect"
networks:
- traefik-net
logging: *default-logging
volumes:
db_data: null
worker_dependency_cache: null
worker_logs: null
lsp_cache: null

View File

@@ -1,5 +0,0 @@
WM_IMAGE=ghcr.io/windmill-labs/windmill:main
POSTGRES_PASSWORD=MyS3cur3P@ssw0rd!2024
DATABASE_URL=postgresql://postgres:${POSTGRES_PASSWORD}@db:5432/windmill
LOG_MAX_SIZE=20m
LOG_MAX_FILE=10

View File

@@ -1,14 +0,0 @@
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY windmill_mcp.py .
ENV MCP_TRANSPORT=sse
ENV MCP_HOST=0.0.0.0
ENV MCP_PORT=8001
CMD ["python", "windmill_mcp.py"]

View File

@@ -1,2 +0,0 @@
mcp>=1.0.0
httpx>=0.27.0

View File

@@ -1,346 +0,0 @@
#!/usr/bin/env python3
"""Windmill MCP Server - Claude が Windmill を直接操作できるようにする"""
import os
import json
import sys
import httpx
from mcp.server.fastmcp import FastMCP
WINDMILL_URL = os.environ.get("WINDMILL_URL", "https://windmill.keinafarm.net")
WINDMILL_TOKEN = os.environ.get("WINDMILL_TOKEN", "")
WINDMILL_WORKSPACE = os.environ.get("WINDMILL_WORKSPACE", "admins")
if not WINDMILL_TOKEN:
print("Error: WINDMILL_TOKEN 環境変数が設定されていません", file=sys.stderr)
sys.exit(1)
mcp = FastMCP("windmill")
def _headers() -> dict:
return {"Authorization": f"Bearer {WINDMILL_TOKEN}"}
def _api(path: str) -> str:
return f"{WINDMILL_URL}/api/w/{WINDMILL_WORKSPACE}/{path}"
@mcp.tool()
def windmill_list_flows(per_page: int = 20) -> str:
"""Windmill のフロー一覧を取得する
Args:
per_page: 取得件数最大100
"""
resp = httpx.get(
_api("flows/list"),
headers=_headers(),
params={"per_page": min(per_page, 100)},
timeout=30,
)
resp.raise_for_status()
flows = resp.json()
if not flows:
return "フローが見つかりませんでした"
lines = [
f"- {f['path']}: {f.get('summary', '(概要なし)')}" for f in flows
]
return "\n".join(lines)
@mcp.tool()
def windmill_get_flow(path: str) -> str:
"""指定したパスのフロー定義(スクリプト含む)を取得する
Args:
path: フローのパス (例: u/antigravity/git_sync)
"""
resp = httpx.get(_api(f"flows/get/{path}"), headers=_headers(), timeout=30)
resp.raise_for_status()
return json.dumps(resp.json(), indent=2, ensure_ascii=False)
@mcp.tool()
def windmill_run_flow(path: str, args: str = "{}") -> str:
"""フローをトリガーして実行する
Args:
path: フローのパス (例: u/antigravity/git_sync)
args: JSON形式の入力引数 (例: {"key": "value"})
"""
try:
args_dict = json.loads(args)
except json.JSONDecodeError as e:
return f"Error: argsのJSON形式が不正です: {e}"
resp = httpx.post(
_api(f"jobs/run/f/{path}"),
headers=_headers(),
json=args_dict,
timeout=30,
)
resp.raise_for_status()
job_id = resp.text.strip().strip('"')
return (
f"フローを開始しました。\n"
f"ジョブID: {job_id}\n"
f"詳細URL: {WINDMILL_URL}/run/{job_id}?workspace={WINDMILL_WORKSPACE}"
)
@mcp.tool()
def windmill_list_recent_jobs(
limit: int = 20,
success_only: bool = False,
failure_only: bool = False,
script_path_filter: str = "",
) -> str:
"""最近のジョブ一覧を取得する
Args:
limit: 取得件数最大100
success_only: Trueにすると成功ジョブのみ表示
failure_only: Trueにすると失敗ジョブのみ表示
script_path_filter: パスで絞り込む (例: u/antigravity/git_sync)
"""
params: dict = {"per_page": min(limit, 100)}
if success_only:
params["success"] = "true"
if failure_only:
params["success"] = "false"
if script_path_filter:
params["script_path_filter"] = script_path_filter
resp = httpx.get(_api("jobs/list"), headers=_headers(), params=params, timeout=30)
resp.raise_for_status()
jobs = resp.json()
if not jobs:
return "ジョブが見つかりませんでした"
lines = []
for j in jobs:
success = j.get("success")
if success is True:
status = "[OK]"
elif success is False:
status = "[FAIL]"
else:
status = "[RUNNING]"
path = j.get("script_path", "unknown")
started = (j.get("started_at") or "")[:19] or "pending"
job_id = j.get("id", "")
lines.append(f"{status} [{started}] {path} (ID: {job_id})")
return "\n".join(lines)
@mcp.tool()
def windmill_get_job_logs(job_id: str) -> str:
"""ジョブの詳細情報とログを取得する
Args:
job_id: ジョブのIDwindmill_list_recent_jobs で確認できる)
"""
resp = httpx.get(_api(f"jobs_u/get/{job_id}"), headers=_headers(), timeout=30)
resp.raise_for_status()
job = resp.json()
success = job.get("success")
if success is True:
state = "成功 [OK]"
elif success is False:
state = "失敗 [FAIL]"
else:
state = "実行中 [RUNNING]"
result_parts = [
f"ジョブID: {job_id}",
f"パス: {job.get('script_path', 'N/A')}",
f"状態: {state}",
f"開始: {job.get('started_at', 'N/A')}",
f"終了: {job.get('created_at', 'N/A')}",
]
log_resp = httpx.get(
_api(f"jobs_u/getlogs/{job_id}"), headers=_headers(), timeout=30
)
if log_resp.status_code == 200:
result_parts.append("\n--- ログ ---")
result_parts.append(log_resp.text)
result_val = job.get("result")
if result_val is not None:
result_parts.append("\n--- 実行結果 ---")
result_parts.append(
json.dumps(result_val, indent=2, ensure_ascii=False)
if isinstance(result_val, (dict, list))
else str(result_val)
)
return "\n".join(result_parts)
@mcp.tool()
def windmill_create_flow(path: str, summary: str, flow_definition: str, description: str = "") -> str:
"""新しいフローを作成する
Args:
path: フローのパス (例: u/admin/my_flow)
summary: フローの概要
flow_definition: フローの定義 (JSON形式の文字列)
description: フローの詳細説明 (省略可)
"""
try:
flow_value = json.loads(flow_definition)
except json.JSONDecodeError as e:
return f"Error: flow_definitionのJSON形式が不正です: {e}"
payload = {
"path": path,
"summary": summary,
"description": description,
"value": flow_value,
}
resp = httpx.post(
_api("flows/create"),
headers=_headers(),
json=payload,
timeout=30,
)
resp.raise_for_status()
return (
f"フローを作成しました。\n"
f"パス: {path}\n"
f"URL: {WINDMILL_URL}/flows/edit/{path}?workspace={WINDMILL_WORKSPACE}"
)
@mcp.tool()
def windmill_update_flow(path: str, summary: str, flow_definition: str, description: str = "") -> str:
"""既存のフローを更新する
Args:
path: フローのパス (例: u/admin/my_flow)
summary: フローの概要
flow_definition: フローの定義 (JSON形式の文字列)
description: フローの詳細説明 (省略可)
"""
try:
flow_value = json.loads(flow_definition)
except json.JSONDecodeError as e:
return f"Error: flow_definitionのJSON形式が不正です: {e}"
payload = {
"path": path,
"summary": summary,
"description": description,
"value": flow_value,
}
resp = httpx.post(
_api(f"flows/edit/{path}"),
headers=_headers(),
json=payload,
timeout=30,
)
resp.raise_for_status()
return (
f"フローを更新しました。\n"
f"パス: {path}\n"
f"URL: {WINDMILL_URL}/flows/edit/{path}?workspace={WINDMILL_WORKSPACE}"
)
@mcp.tool()
def windmill_create_script(
path: str, language: str, content: str, summary: str = "", description: str = ""
) -> str:
"""新しいスクリプトを作成する(既存パスの場合は新バージョンを登録する)
Args:
path: スクリプトのパス (例: u/admin/my_script)
language: 言語 (python3, deno, bun, bash など)
content: スクリプトのソースコード
summary: スクリプトの概要 (省略可)
description: スクリプトの詳細説明 (省略可)
"""
payload = {
"path": path,
"language": language,
"content": content,
"summary": summary,
"description": description,
}
resp = httpx.post(
_api("scripts/create"),
headers=_headers(),
json=payload,
timeout=30,
)
resp.raise_for_status()
hash_val = resp.text.strip().strip('"')
return (
f"スクリプトを作成しました。\n"
f"パス: {path}\n"
f"ハッシュ: {hash_val}\n"
f"URL: {WINDMILL_URL}/scripts/edit/{path}?workspace={WINDMILL_WORKSPACE}"
)
@mcp.tool()
def windmill_list_scripts(per_page: int = 20) -> str:
"""Windmill のスクリプト一覧を取得する
Args:
per_page: 取得件数最大100
"""
resp = httpx.get(
_api("scripts/list"),
headers=_headers(),
params={"per_page": min(per_page, 100)},
timeout=30,
)
resp.raise_for_status()
scripts = resp.json()
if not scripts:
return "スクリプトが見つかりませんでした"
lines = [
f"- {s['path']} [{s.get('language', '?')}]: {s.get('summary', '(概要なし)')}"
for s in scripts
]
return "\n".join(lines)
@mcp.tool()
def windmill_get_script(path: str) -> str:
"""指定したパスのスクリプトのソースコードを取得する
Args:
path: スクリプトのパス (例: u/antigravity/test_git_sync)
"""
resp = httpx.get(_api(f"scripts/get/{path}"), headers=_headers(), timeout=30)
resp.raise_for_status()
script = resp.json()
result_parts = [
f"パス: {script.get('path', 'N/A')}",
f"言語: {script.get('language', 'N/A')}",
f"概要: {script.get('summary', 'N/A')}",
"",
"--- コード ---",
script.get("content", "(コードなし)"),
]
return "\n".join(result_parts)
if __name__ == "__main__":
transport = os.environ.get("MCP_TRANSPORT", "stdio")
if transport == "sse":
host = os.environ.get("MCP_HOST", "0.0.0.0")
port = int(os.environ.get("MCP_PORT", "8001"))
mcp.run(transport="sse", host=host, port=port)
else:
mcp.run(transport="stdio")

View File

@@ -1,68 +0,0 @@
#!/bin/bash
# Windmill Workflow Git Auto-Sync Script for Gitea
# This script automatically commits and pushes Windmill workflows to Gitea
set -e
# Colored output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
echo -e "${GREEN}=== Windmill Workflow Git Sync (Gitea) ===${NC}"
# Move to the working directory
cd /workspace
# Set PATH
export PATH=~/.npm-global/bin:$PATH
# Git config (workaround for the safe.directory error)
git config --global --add safe.directory /workspace
git config --global user.email "bot@example.com"
git config --global user.name "Windmill Bot"
# Pull the latest from Windmill
echo -e "${YELLOW}Pulling from Windmill...${NC}"
wmill sync pull --skip-variables --skip-secrets --skip-resources --yes
# Check whether there are changes
if [[ -n $(git status --porcelain) ]]; then
echo -e "${YELLOW}Changes detected, committing to Git...${NC}"
# Stage the changes
git add -A
# Commit
TIMESTAMP=$(date '+%Y-%m-%d %H:%M:%S')
git commit -m "Auto-sync: ${TIMESTAMP}
Synced workflows from Windmill workspace"
# Push to Gitea
echo -e "${YELLOW}Pushing to Gitea...${NC}"
git push origin main || {
echo -e "${RED}Failed to push to Gitea. Check credentials.${NC}"
# Hint for when the token or credentials are not set up
echo -e "${YELLOW}Hint: Ensure you have set up git credentials or use a token in the remote URL.${NC}"
exit 1
}
echo -e "${GREEN}✓ Changes pushed to Gitea${NC}"
else
echo -e "${GREEN}✓ No changes detected${NC}"
fi
echo -e "${GREEN}=== Sync Complete ===${NC}"

View File

@@ -0,0 +1 @@
# py: 3.12

View File

@@ -0,0 +1,219 @@
from __future__ import annotations
import re
import subprocess
import time
from typing import Any
def main(
task_contract: dict[str, Any],
steps: list[dict[str, Any]],
context: dict[str, Any] | None = None,
) -> dict[str, Any]:
"""Standalone Windmill runner flow for Butler delegated step execution.
This file is intentionally self-contained so it can be pasted or synced to
Windmill without requiring the Butler repository on the worker.
"""
timeout_sec = _resolve_timeout_sec(task_contract)
resolved_context = dict(context or {})
results: list[dict[str, Any]] = []
for idx, raw_step in enumerate(steps):
result = _execute_step(raw_step, idx, timeout_sec, resolved_context)
if raw_step.get("kind") in {"cmd", "check"} and result["exit_code"] != 0:
err_type = _classify_error(str(raw_step.get("value", "")), result["stderr"] or result["stdout"])
if err_type == "transient":
time.sleep(30)
result = _execute_step(raw_step, idx, timeout_sec, resolved_context)
results.append(result)
if result["exit_code"] != 0:
evidence = _build_evidence(results)
evidence["ok"] = False
return {
"ok": False,
"summary": _failure_summary(raw_step, result),
"failed_step_index": idx,
"step_results": results,
"evidence": evidence,
}
evidence = _build_evidence(results)
evidence["ok"] = True
return {
"ok": True,
"summary": f"Executed {len(results)} step(s) successfully.",
"step_results": results,
"evidence": evidence,
}
def _resolve_timeout_sec(task_contract: dict[str, Any]) -> int:
constraints = task_contract.get("constraints", {})
max_minutes = constraints.get("max_minutes", 1)
try:
return max(1, int(max_minutes) * 60)
except (TypeError, ValueError):
return 60
def _execute_step(
step: dict[str, Any],
step_index: int,
timeout_sec: int,
context: dict[str, Any],
) -> dict[str, Any]:
kind = str(step.get("kind", "")).strip()
value = str(step.get("value", "") or "")
if kind == "wait":
started = time.perf_counter()
seconds = _parse_wait_seconds(value)
time.sleep(seconds)
duration_ms = int((time.perf_counter() - started) * 1000)
return _step_result(step_index, kind, value, 0, "", "", duration_ms)
if kind == "mcp_call":
return _execute_mcp_call(step, step_index, timeout_sec, context)
started = time.perf_counter()
try:
proc = subprocess.run(
value,
shell=True,
capture_output=True,
timeout=timeout_sec,
text=True,
)
duration_ms = int((time.perf_counter() - started) * 1000)
return _step_result(
step_index,
kind,
value,
proc.returncode,
proc.stdout,
proc.stderr,
duration_ms,
)
except subprocess.TimeoutExpired as exc:
duration_ms = int((time.perf_counter() - started) * 1000)
stdout = exc.stdout if isinstance(exc.stdout, str) else ""
return _step_result(
step_index,
kind,
value,
124,
stdout,
f"timeout after {timeout_sec}s",
duration_ms,
)
def _execute_mcp_call(
step: dict[str, Any],
step_index: int,
timeout_sec: int,
context: dict[str, Any],
) -> dict[str, Any]:
"""Placeholder for future Windmill-side MCP execution.
The first real connectivity test uses `check` steps, so we keep the
deployment artifact dependency-free for now and fail explicitly if a flow
attempts `mcp_call`.
"""
_ = timeout_sec, context
server = str(step.get("server", "") or "").strip()
tool = str(step.get("tool", "") or "").strip()
return _step_result(
step_index,
"mcp_call",
tool,
1,
"",
f"mcp_call is not supported in the standalone Windmill runner yet (server={server}, tool={tool})",
0,
)
def _step_result(
step_index: int,
kind: str,
value: str,
exit_code: int,
stdout: str,
stderr: str,
duration_ms: int,
) -> dict[str, Any]:
return {
"step_index": step_index,
"kind": kind,
"value": value,
"exit_code": exit_code,
"stdout": stdout,
"stderr": stderr,
"duration_ms": duration_ms,
}
def _build_evidence(results: list[dict[str, Any]]) -> dict[str, Any]:
executed_commands = [str(result.get("value", "")) for result in results]
key_outputs: list[str] = []
error_lines: list[str] = []
for result in results:
stdout = str(result.get("stdout", "") or "")
stderr = str(result.get("stderr", "") or "")
if stdout:
key_outputs.extend(stdout.splitlines()[:5])
if stderr:
lines = stderr.splitlines()
error_lines.extend(lines[:5])
if len(lines) > 5:
error_lines.extend(lines[-5:])
return {
"executed_commands": executed_commands,
"key_outputs": key_outputs,
"error_head_tail": "\n".join(error_lines) if error_lines else None,
}
def _failure_summary(step: dict[str, Any], result: dict[str, Any]) -> str:
kind = str(step.get("kind", "") or "")
stderr = str(result.get("stderr", "") or "")
stdout = str(result.get("stdout", "") or "")
if kind == "mcp_call":
return stderr or stdout or "mcp_call failed."
return stderr or stdout or f"{kind} step failed."
def _classify_error(command: str, output: str) -> str:
lowered = (command + "\n" + output).lower()
transient_markers = [
"timeout",
"timed out",
"temporarily unavailable",
"connection reset",
"connection aborted",
"connection refused",
"503",
"502",
"rate limit",
]
for marker in transient_markers:
if marker in lowered:
return "transient"
return "permanent"
def _parse_wait_seconds(value: str) -> float:
normalized = value.strip().lower()
if re.fullmatch(r"\d+(\.\d+)?s", normalized):
return float(normalized[:-1])
if re.fullmatch(r"\d+(\.\d+)?", normalized):
return float(normalized)
raise ValueError(f"Invalid wait value: {value}")

View File

@@ -0,0 +1,47 @@
summary: Butler generic runner - delegated step execution
description: >-
Receives a serialized TaskContract and resolved step list from Butler,
executes steps server-side with Butler-compatible semantics
(cmd/check/wait/retry), and returns ok/summary/step_results/evidence.
value:
modules:
- id: a
summary: Execute Butler task steps
value:
type: rawscript
content: '!inline execute_butler_task_steps.py'
input_transforms:
context:
type: javascript
expr: flow_input.context
steps:
type: javascript
expr: flow_input.steps
task_contract:
type: javascript
expr: flow_input.task_contract
lock: '!inline execute_butler_task_steps.lock'
language: python3
schema:
$schema: 'https://json-schema.org/draft/2020-12/schema'
type: object
order:
- task_contract
- steps
- context
properties:
context:
type: object
description: 'Execution context (target, payload)'
default: {}
steps:
type: array
description: Resolved SOP step list
items:
type: object
task_contract:
type: object
description: Serialized Butler TaskContract
required:
- task_contract
- steps

View File

@@ -0,0 +1 @@
# py: 3.12

View File

@@ -0,0 +1,2 @@
def main():
print('こんにちは、世界')

View File

@@ -0,0 +1,12 @@
summary: Print greeting
description: ''
value:
modules:
- id: a
value:
type: rawscript
content: '!inline a.py'
input_transforms: {}
lock: '!inline a.lock'
language: python3
schema: null

View File

@@ -0,0 +1 @@
# py: 3.12

View File

@@ -0,0 +1,3 @@
def main():
from datetime import datetime
print(datetime.now().strftime('%H:%M:%S'))

View File

@@ -0,0 +1,12 @@
summary: Display current time on startup
description: ''
value:
modules:
- id: a
value:
type: rawscript
content: '!inline a.py'
input_transforms: {}
lock: '!inline a.lock'
language: python3
schema: null

View File

@@ -53,7 +53,7 @@ ACCOUNTS = [
# Xserver (keinafarm.com): 6 accounts
{
"name": "xserver_akiracraftwork",
"account_code": "xserver",
"account_code": "xserver1",
"host": "sv579.xserver.jp",
"port": 993,
"user_var": "u/admin/XSERVER1_IMAP_USER",
@@ -63,7 +63,7 @@ ACCOUNTS = [
},
{
"name": "xserver_service",
"account_code": "xserver",
"account_code": "xserver2",
"host": "sv579.xserver.jp",
"port": 993,
"user_var": "u/admin/XSERVER2_IMAP_USER",
@@ -73,7 +73,7 @@ ACCOUNTS = [
},
{
"name": "xserver_midori",
"account_code": "xserver",
"account_code": "xserver3",
"host": "sv579.xserver.jp",
"port": 993,
"user_var": "u/admin/XSERVER3_IMAP_USER",
@@ -83,7 +83,7 @@ ACCOUNTS = [
},
{
"name": "xserver_kouseiren",
"account_code": "xserver",
"account_code": "xserver4",
"host": "sv579.xserver.jp",
"port": 993,
"user_var": "u/admin/XSERVER4_IMAP_USER",
@@ -93,7 +93,7 @@ ACCOUNTS = [
},
{
"name": "xserver_post",
"account_code": "xserver",
"account_code": "xserver5",
"host": "sv579.xserver.jp",
"port": 993,
"user_var": "u/admin/XSERVER5_IMAP_USER",
@@ -103,7 +103,7 @@ ACCOUNTS = [
},
{
"name": "xserver_sales",
"account_code": "xserver",
"account_code": "xserver6",
"host": "sv579.xserver.jp",
"port": 993,
"user_var": "u/admin/XSERVER6_IMAP_USER",
@@ -218,6 +218,14 @@ def process_message(mail, uid, account, api_key, api_url, gemini_key, line_token
"""Process one message. Returns: 'skipped' / 'not_important' / 'notified'"""
account_code = account["account_code"]
forwarding_map = account.get("forwarding_map", {})
recipient_map = {
"akira@keinafarm.com": "xserver1",
"service@keinafarm.com": "xserver2",
"midori@keinafarm.com": "xserver3",
"kouseiren@keinafarm.com": "xserver4",
"post@keinafarm.com": "xserver5",
"sales@keinafarm.com": "xserver6",
}
# Fetch the message
_, data = mail.uid("FETCH", str(uid), "(RFC822)")
@@ -246,15 +254,15 @@ def process_message(mail, uid, account, api_key, api_url, gemini_key, line_token
body_preview = extract_body_preview(msg, max_chars=500)
# Forwarding detection: override account_code when the To: header's domain is present in forwarding_map
if forwarding_map:
# Recipient correction: derive account_code from the To: header (prevents misclassification on forwarded/duplicate delivery)
to_raw = msg.get("To", "")
if to_raw:
to_addr = extract_email_address(to_raw)
to_domain = to_addr.split("@")[-1] if "@" in to_addr else ""
if to_domain in forwarding_map:
account_code = forwarding_map[to_domain]
print(f" [forwarding detected] To:{to_addr} → account: {account_code}")
mapped = forwarding_map.get(to_addr) or forwarding_map.get(to_domain) or recipient_map.get(to_addr)
if mapped:
account_code = mapped
print(f" [recipient corrected] To:{to_addr} → account: {account_code}")
print(f" From: {sender_email_addr}, Subject: {subject[:50]}")
@@ -402,6 +410,12 @@ ACCOUNT_LABELS = {
"gmail": "Gmail (メイン)",
"gmail_service": "Gmail (サービス用)",
"hotmail": "Hotmail",
"xserver1": "Xserver (akira@keinafarm.com)",
"xserver2": "Xserver (service@keinafarm.com)",
"xserver3": "Xserver (midori@keinafarm.com)",
"xserver4": "Xserver (kouseiren@keinafarm.com)",
"xserver5": "Xserver (post@keinafarm.com)",
"xserver6": "Xserver (sales@keinafarm.com)",
"xserver": "Xserver",
}
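The recipient correction in the hunk above resolves `account_code` through a three-stage lookup: exact address in `forwarding_map`, then its domain, then the hard-coded per-address map, falling back to the account's own code. A standalone sketch of that chain (the function name is hypothetical; the map values are taken from the diff):

```python
RECIPIENT_MAP = {
    "akira@keinafarm.com": "xserver1",
    "service@keinafarm.com": "xserver2",
    "midori@keinafarm.com": "xserver3",
    "kouseiren@keinafarm.com": "xserver4",
    "post@keinafarm.com": "xserver5",
    "sales@keinafarm.com": "xserver6",
}

def resolve_account_code(to_addr: str, forwarding_map: dict, default: str) -> str:
    """Mirror the diff's lookup order: exact address, then domain, then recipient map."""
    to_domain = to_addr.split("@")[-1] if "@" in to_addr else ""
    mapped = (forwarding_map.get(to_addr)
              or forwarding_map.get(to_domain)
              or RECIPIENT_MAP.get(to_addr))
    return mapped or default
```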


@@ -0,0 +1,5 @@
{
"dependencies": {}
}
//bun.lock
<empty>


@@ -0,0 +1,21 @@
summary: Read text aloud on an Echo device via TTS
description: Make a specified Echo device read the given text aloud
lock: '!inline u/admin/alexa_speak.script.lock'
kind: script
schema:
type: object
properties:
device:
type: object
description: ''
default: null
format: dynselect-device
originalType: DynSelect_device
text:
type: string
description: ''
default: null
originalType: string
required:
- device
- text


@@ -0,0 +1,69 @@
/**
 * alexa_speak.ts
 * Windmill script that makes a specified Echo device read text aloud
 *
 * Parameters:
 *   device - device chosen from the dropdown (internally the serial number)
 *   text   - text to read aloud
 */
const ALEXA_API_URL = "http://alexa_api:3500";
type DeviceOption = { value: string; label: string };
const FALLBACK_DEVICE_OPTIONS: DeviceOption[] = [
{ value: "G0922H085165007R", label: "プレハブ (G0922H085165007R)" },
{ value: "G8M2DB08522600RL", label: "リビングエコー1 (G8M2DB08522600RL)" },
{ value: "G8M2DB08522503WF", label: "リビングエコー2 (G8M2DB08522503WF)" },
{ value: "G0922H08525302K5", label: "オフィスの右エコー (G0922H08525302K5)" },
{ value: "G0922H08525302J9", label: "オフィスの左エコー (G0922H08525302J9)" },
{ value: "G8M2HN08534302XH", label: "寝室のエコー (G8M2HN08534302XH)" },
];
// Windmill dynamic select: define `DynSelect_device` and `device()` for the argument named `device`
export type DynSelect_device = string;
export async function device(): Promise<DeviceOption[]> {
try {
const res = await fetch(`${ALEXA_API_URL}/devices`);
if (!res.ok) return FALLBACK_DEVICE_OPTIONS;
const devices = (await res.json()) as Array<{
name?: string;
serial?: string;
family?: string;
}>;
const options = devices
.filter((d) => d.family === "ECHO" && d.serial)
.map((d) => ({
value: d.serial as string,
label: `${d.name ?? d.serial} (${d.serial})`,
}))
.sort((a, b) => a.label.localeCompare(b.label, "ja"));
return options.length > 0 ? options : FALLBACK_DEVICE_OPTIONS;
} catch {
return FALLBACK_DEVICE_OPTIONS;
}
}
export async function main(
device: DynSelect_device,
text: string,
): Promise<{ ok: boolean; device: string; text: string }> {
const res = await fetch(`${ALEXA_API_URL}/speak`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ device, text }),
});
if (!res.ok) {
const body = await res.json().catch(() => ({}));
throw new Error(
`alexa-api error ${res.status}: ${JSON.stringify(body)}`
);
}
return await res.json();
}
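The error path above parses the response body as JSON but falls back to an empty object when parsing fails (`res.json().catch(() => ({}))`). The same defensive pattern, sketched as a hypothetical Python helper:

```python
import json

def parse_error_body(raw: str) -> dict:
    """Parse an API error body as JSON, falling back to an empty dict when the
    body is not valid JSON or not an object (mirrors res.json().catch(() => ({})))."""
    try:
        parsed = json.loads(raw)
        return parsed if isinstance(parsed, dict) else {}
    except (json.JSONDecodeError, TypeError):
        return {}
```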


@@ -0,0 +1,16 @@
description: ''
args: {}
cron_version: v2
email: akiracraftwork@gmail.com
enabled: true
is_flow: true
no_flow_overlap: false
on_failure_exact: false
on_failure_times: 1
on_recovery_extra_args: {}
on_recovery_times: 1
on_success_extra_args: {}
schedule: 0 0 * * * *
script_path: u/akiracraftwork/hourly_chime
timezone: Asia/Tokyo
ws_error_handler_muted: false
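`0 0 * * * *` is a six-field, seconds-first cron expression: fire at second 0, minute 0 of every hour. The equivalent next-fire computation can be sketched as (hypothetical helper, not part of Windmill's scheduler):

```python
from datetime import datetime, timedelta

def next_top_of_hour(now: datetime) -> datetime:
    """Next fire time for cron '0 0 * * * *': the hour boundary strictly after `now`."""
    floor = now.replace(minute=0, second=0, microsecond=0)
    return floor + timedelta(hours=1)
```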


@@ -0,0 +1,5 @@
{
"dependencies": {}
}
//bun.lock
<empty>


@@ -0,0 +1,29 @@
export async function main(
device: string = "オフィスの右エコー",
prefix: string = "現在時刻は",
suffix: string = "です"
) {
const now = new Date();
const hhmm = new Intl.DateTimeFormat("ja-JP", {
timeZone: "Asia/Tokyo",
hour: "2-digit",
minute: "2-digit",
hour12: false,
}).format(now); // e.g. "09:30"
const [h, m] = hhmm.split(":");
const text = `${prefix}${Number(h)}${Number(m)}${suffix}`;
const res = await fetch("http://alexa_api:3500/speak", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ device, text }),
});
if (!res.ok) {
const body = await res.text();
throw new Error(`alexa-api error ${res.status}: ${body}`);
}
return { ok: true, device, text };
}
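The script above strips leading zeros by round-tripping each component through `Number()`, so "09:05" is spoken as "9時5分". The same formatting step, sketched in Python (function name and defaults are illustrative):

```python
def build_announcement(hhmm: str, prefix: str = "現在時刻は", suffix: str = "です") -> str:
    """Turn a zero-padded HH:MM string into the spoken text, dropping
    leading zeros the same way the TS script does with Number()."""
    h, m = hhmm.split(":")
    return f"{prefix}{int(h)}時{int(m)}分{suffix}"
```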


@@ -0,0 +1,88 @@
summary: Cuckoo clock feature
description: Announces the current time via Alexa at the top of every hour; sends a LINE notification on failure.
value:
modules:
- id: a
value:
type: rawscript
content: '!inline a.ts'
input_transforms:
device:
type: static
value: オフィスの右エコー
prefix:
type: static
value: 現在時刻は
suffix:
type: static
value: です
lock: '!inline a.lock'
language: bun
failure_module:
id: failure
summary: LINE notification on error
value:
type: rawscript
content: |
import * as wmill from "windmill-client";
export async function main() {
const token = await wmill.getVariable("u/admin/LINE_CHANNEL_ACCESS_TOKEN");
const to = await wmill.getVariable("u/admin/LINE_TO");
const message = [
"\u26a0\ufe0f \u9ce9\u6642\u8a08\u30a8\u30e9\u30fc",
"",
"Alexa TTS API \u304c\u5931\u6557\u3057\u307e\u3057\u305f\u3002",
"Cookie\u306e\u671f\u9650\u5207\u308c\u306e\u53ef\u80fd\u6027\u304c\u3042\u308a\u307e\u3059\u3002",
"",
"\u5bfe\u51e6: auth4.js \u3067 Cookie \u3092\u518d\u53d6\u5f97\u3057\u3066\u304f\u3060\u3055\u3044\u3002"
].join("\n");
const res = await fetch("https://api.line.me/v2/bot/message/push", {
method: "POST",
headers: {
"Authorization": `Bearer ${token}`,
"Content-Type": "application/json",
},
body: JSON.stringify({
to: to,
messages: [{ type: "text", text: message }],
}),
});
if (!res.ok) {
const body = await res.text();
throw new Error(`LINE API error ${res.status}: ${body}`);
}
return { notified: true };
}
input_transforms: {}
lock: |
{
"dependencies": {
"windmill-client": "latest"
}
}
//bun.lock
{
"lockfileVersion": 1,
"configVersion": 1,
"workspaces": {
"": {
"dependencies": {
"windmill-client": "latest",
},
},
},
"packages": {
"windmill-client": ["windmill-client@1.661.0", "", {}, "sha512-vEosrP1NKVHJMi6gEnKnvd3QrNeoy0W0PYqAIIKvg0B4K4ejpw9zbvrytVvoSb7XC3Fb9PzYdvGFqdfaVCCTvg=="],
}
}
language: bun
schema:
$schema: 'https://json-schema.org/draft/2020-12/schema'
type: object
properties: {}
required: []


@@ -2,10 +2,10 @@
set -e
export PATH=/usr/bin:/usr/local/bin:/usr/sbin:/sbin:/bin:$PATH
GREEN=""
YELLOW=""
RED=""
NC=""
GREEN="\033[0;32m"
YELLOW="\033[1;33m"
RED="\033[0;31m"
NC="\033[0m"
echo -e "${GREEN}=== Windmill Workflow Git Sync ===${NC}"
@@ -22,6 +22,19 @@ git config --global --add safe.directory "$REPO_ROOT"
git config --global user.email "bot@keinafarm.net"
git config --global user.name "Windmill Bot"
# Use the sync branch
CURRENT_BRANCH=$(git -C "$REPO_ROOT" rev-parse --abbrev-ref HEAD)
if [ "$CURRENT_BRANCH" != "sync" ]; then
echo -e "${YELLOW}Switching to sync branch...${NC}"
git -C "$REPO_ROOT" fetch origin sync
git -C "$REPO_ROOT" checkout sync
fi
echo -e "${YELLOW}Pulling from origin/sync...${NC}"
git -C "$REPO_ROOT" pull --rebase origin sync || {
echo -e "${RED}Failed to pull from remote. Continuing...${NC}"
}
echo -e "${YELLOW}Pulling from Windmill...${NC}"
cd "$WMILL_DIR"
wmill sync pull --config-dir /workspace/wmill_config --skip-variables --skip-secrets --skip-resources --yes || exit 1
@@ -32,15 +45,12 @@ if [[ -n $(git status --porcelain) ]]; then
git add -A
TIMESTAMP=$(date "+%Y-%m-%d %H:%M:%S")
git commit -m "Auto-sync: ${TIMESTAMP}"
echo -e "${YELLOW}Pushing to Gitea...${NC}"
git pull --rebase origin main || {
echo -e "${RED}Failed to pull from remote. Trying push anyway...${NC}"
}
git push origin main || {
echo -e "${YELLOW}Pushing to Gitea (sync branch)...${NC}"
git push origin sync || {
echo -e "${RED}Failed to push.${NC}"
exit 1
}
echo -e "${GREEN}Changes pushed to Gitea${NC}"
echo -e "${GREEN}Changes pushed to Gitea (sync branch)${NC}"
else
echo -e "${GREEN}No changes detected${NC}"
fi


@@ -1,5 +1,5 @@
summary: Git Sync Workflow
description: Automatically sync Windmill workflows to Git repository
description: Automatically sync Windmill workflows to Git repository (sync branch)
value:
modules:
- id: a
@@ -9,9 +9,4 @@ value:
input_transforms: {}
lock: ''
language: bash
schema:
$schema: 'https://json-schema.org/draft/2020-12/schema'
type: object
order: []
properties: {}
required: []
schema: null


@@ -5,17 +5,28 @@ locks:
'f/app_custom/system_heartbeat__flow+step2:_データ検証.py': d7f4e6e04ed116ba3836cb32793a0187a69359a3f2a807b533030b01d42bed39
'f/app_custom/system_heartbeat__flow+step3:_httpヘルスチェック.py': 5d3bce0ddb4f521444bf01bc80670e7321933ad09f935044f4d6123c658ca7a8
'f/app_custom/system_heartbeat__flow+step4:_年度判定_&_最終レポート.py': 6889bfac9a629fa42cf0505cbc945ba3782c59e1697b8493ce6101ef5ffa8b32
f/mail/mail_filter__flow+__flow_hash: 08d8ca9e024f743def5c1e8a90e424da3d9884628074f02c37dbb4c00599e9e9
f/mail/mail_filter__flow+メール取得・判定・通知.py: 0b9cc3ff72d6f3445d46005a657903ae8f195104d1623b47079d13691811c602
f/butler/execute_task_steps__flow+__flow_hash: 4b331a51d9f4bd6fbfc4714a859a08df86184f81fd902a382725541c002bdca8
f/butler/execute_task_steps__flow+execute_butler_task_steps.py: 90e90680a89ff3e7bd05d6c32513e9893b0c2064ae1c9e3dc3e2f3e05bad2166
f/dev/hello_world__flow+__flow_hash: 08a256433d5978b05d08e2ba6cfa8e4324c23be4875c9775777d683f32c6015e
f/dev/hello_world__flow+a.py: 63bf18351b5b0e81067254a03c9811e6bb388c890ad72e18092ac5ec2690a456
f/dev/konnnichiha__flow+__flow_hash: 0d40e9e9fe2cf6944028d671b6facb9e0598d41abc3682993d5339800188b8f1
f/dev/konnnichiha__flow+a.py: 932c967ebcf32abf2e923458c22d63973933b9b4451d0495846b2b720ff25d6d
f/dev/textout__flow+__flow_hash: 869322134a2ea15f54c3b35adf533a495b407d946ddd0b0e9c20d77316479c8b
f/dev/textout__flow+a.py: c4062ee04d2177a398ab3eb23dee0536088d183e8cf22f1d890b05a1bd6e518c
f/mail/mail_filter__flow+__flow_hash: 5790f99e6189a6ed1acabf57f9e6777fb1dc8a334facc1d1b1d26a08be8558a0
f/mail/mail_filter__flow+メール取得・判定・通知.py: b105f1a8414e7ee395f0e3ec1b9515766b4cb630d1fe5205b0493170a727237e
f/shiraou/shiraou_notification__flow+__flow_hash: 94825ff4362b6e4b6d165f8e17a51ebf8e5ef4da3e0ec1407a94b614ecab19dd
f/shiraou/shiraou_notification__flow+変更確認・line通知.py: ac80896991cce8132cfbf34d5dae20d3c09de5bc74a55c500e4c8705dd6a9d88
f/weather/weather_sync__flow+__flow_hash: 8af44676b2a175c1cc105028682f18e4bfbf7bf9de2722263a7d85c13c825f08
f/weather/weather_sync__flow+気象データ取得・同期.py: 86c9953ec7346601eaa13c681e2db5c01c9a5b4b45a3c47e8667ad3c47557029
g/all/setup_app__app+__app_hash: d71add32e14e552d1a4c861c972a50d9598b07c0af201bbadec5b59bbd99d7e3
g/all/setup_app__app+change_account.deno.ts: 3c592cac27e9cdab0de6ae19270bcb08c7fa54355ad05253a12de2351894346b
u/admin/alexa_speak: e5bef63ab682e903715056cf24b4a94e87a14d4db60d8d29cd7c579359b56c72
u/admin/hub_sync: aaf9fd803fa229f3029d1bb02bbe3cc422fce680cad39c4eec8dd1da115de102
u/antigravity/git_sync__flow+__flow_hash: 66cdf1feb6136bb87f65a050266840e7b074a136f4b752bd01dbe524eb8f05d7
u/antigravity/git_sync__flow+a.sh: 3094bf5aed54e3232c6e0260fa0b3f3849f7fc19930ec2a8395fcfe437cdbe8f
u/akiracraftwork/hourly_chime__flow+__flow_hash: 79974bee69ff196e45a08b74e9539d8a3b50885ef0abba6907a00530809984fa
u/akiracraftwork/hourly_chime__flow+a.ts: b27320279be1d14184a210632e15d0e89d701243545d2d73cdd20e11dd413c53
u/antigravity/git_sync__flow+__flow_hash: 5a7194ef6bf1ce5529e70ae74fdb4cd05a0da662c78bfa355bb7e98698689ae6
u/antigravity/git_sync__flow+a.sh: ac7fdc83548f305fed33389129b79439e0c40077ed39a410477c77d08dca0ca9
u/antigravity/hello_world_demo__flow+__flow_hash: 0adc341960f8196454876684f85fe14ef087ba470322d2aabc99b37bf61edac9
u/antigravity/hello_world_demo__flow+a.ts: 53669a285c16d4ba322888755a33424521f769e9ebf64fc1f0cb21f9952b5958
u/antigravity/test_git_sync: 3aa9e66ad8c87f1c2718d41d78ce3b773ce20743e4a1011396edbe2e7f88ac51