Add vision-LLM import pipeline for hex terrain classification

Adds a 3-phase pipeline to populate a HexMap from a source image using
Claude's vision capabilities instead of naive pixel-colour matching:

- Phase 1 (extract-submaps): crops annotated submap PNGs per hex
  (centre + 6 neighbours with SVG overlay).
- Phase 2: Claude classifies the submaps in a Code session, writing
  classifications.json (resumable across sessions).
- Phase 3 (import-from-json): reads classifications.json into the DB.

Also adds assemble-map.ts to reconstruct a full image from a Leaflet tile
pyramid (used to recover the Aventurien source map from kiepenkerl), and
CLAUDE.md documenting the approach, the scale constants
(PIXELS_PER_MEILE=8, hexSize=40 = 10 Meilen/Hex), and the Nord/Mitte/Süd
region split for the Aventurien map.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
.gitignore (vendored): 4 lines changed

@@ -4,3 +4,7 @@ dist-server/
 *.log
 tiles/
 data/*.db
+
+# Pipeline artifacts: locally generated maps, tiles, submaps
+pipeline/source/
+pipeline/submaps/
CLAUDE.md (new file): 191 lines

@@ -0,0 +1,191 @@
# Hexifyer — CLAUDE.md

## Project Overview

Hexifyer is a hex-grid overlay tool for fantasy maps.
A Leaflet map (tiles from a source image) is overlaid with an editable hex grid.
Each hex stores a terrain type (base terrain plus linear features such as rivers and roads).

**Stack:** Vanilla TypeScript · Vite · Leaflet · Express · sql.js (SQLite WASM)

---

## Architecture
```
core/     — pure logic: coordinates, terrain types, HexMap data structure
server/   — Express API + sql.js DB (hex_maps, hexes, hex_features)
src/      — frontend: Leaflet map, SVG renderer, UI components
pipeline/ — CLI scripts for data import and map preparation
tests/    — Vitest tests for core/
```

---
## Hex Coordinate System

**Flat-top axial coordinates** (q, r).

```typescript
// Axial → Pixel
x = origin.x + size * (3/2) * q
y = origin.y + size * (√3/2 * q + √3 * r)
```

Axial coordinates are **sheared** in pixel space: lines of constant r
run diagonally in the image. This matters for the region split (see below).
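The inverse conversion (pixel → axial) is not spelled out above. A minimal sketch, inverting the two formulas and snapping to the nearest hex via standard cube rounding — the function and type names here are illustrative, not the project's actual API in `core/coords.ts`:

```typescript
interface PixelCoord { x: number; y: number; }
interface AxialCoord { q: number; r: number; }

// Invert the flat-top axial→pixel formulas, then round to the nearest hex.
function pixelToAxial(p: PixelCoord, size: number, origin: PixelCoord): AxialCoord {
  const x = p.x - origin.x;
  const y = p.y - origin.y;
  const q = ((2 / 3) * x) / size;
  const r = ((-1 / 3) * x + (Math.sqrt(3) / 3) * y) / size;
  return axialRound(q, r);
}

// Cube rounding: round q/r/s, then repair the component with the largest error
// so that q + r + s === 0 still holds.
function axialRound(q: number, r: number): AxialCoord {
  const s = -q - r;
  let rq = Math.round(q), rr = Math.round(r), rs = Math.round(s);
  const dq = Math.abs(rq - q), dr = Math.abs(rr - r), ds = Math.abs(rs - s);
  if (dq > dr && dq > ds) rq = -rr - rs;
  else if (dr > ds) rr = -rq - rs;
  return { q: rq, r: rr };
}
```

For example, hex (3, -2) at size 40 lands at pixel (180, ≈-34.64), and `pixelToAxial` maps that point back to (3, -2).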
---

## Pipeline: Generating a Hex Map from a Source Image
### Why not pixel-based?

The naive approach (take the pixel colour at the hex centre and map it to
the nearest terrain colour) fails in practice:

- **Visually similar terrains:** hills, meadows, and forest edges often
  have overlapping colour values. A meadow in evening light looks like a
  forest; a bare hill looks like mountains.
- **No contextual knowledge:** whether a pixel belongs to a hill or a
  plain is often only recognisable from its surroundings: the cast
  shadows, nearby contour lines, adjacent bodies of water.
- **Linear features cannot be derived:** whether a river or a road runs
  through a hex, and above all through which edges, cannot be determined
  from a single pixel.
### Submap Approach with a Vision LLM

For each hex to be classified, a **submap crop** is extracted from the
source image:

```
centre hex (outlined in red) + all 6 neighbour hexes (grey)
→ ca. 220×220 px at hexSize=40
→ annotated PNG: pipeline/submaps/<map-id>/<q>_<r>.png
```
The vision LLM (Claude) receives:
1. The annotated image, for visual pattern recognition (textures, shapes, shadows)
2. Already-classified neighbours as text context, for feature continuity

```
Already classified neighbours:
NE: forest
E: plains + river(edges: W+E)  ← river enters from W, leaves to E
SE: unknown
SW: plains
...
```

This lets the model infer, for example: *"the river that enters the
eastern neighbour via its W edge must leave this hex via its E edge"*,
even when the colours in the image alone are ambiguous.
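The `river(edges: W+E)` notation corresponds to the edge bitmask the importer uses (`EDGE_BIT` in `pipeline/import-from-image.ts` assigns NE=1, E=2, SE=4, SW=8, W=16, NW=32). A minimal sketch of encoding and decoding such a mask, with helper names that are illustrative rather than the project's actual API:

```typescript
// Edge bits as defined by EDGE_BIT in pipeline/import-from-image.ts.
const EDGE_BIT = { NE: 1, E: 2, SE: 4, SW: 8, W: 16, NW: 32 } as const;
type EdgeName = keyof typeof EDGE_BIT;

// "river(edges: W+E)" → OR together the bits of the crossed edges.
function encodeEdges(edges: EdgeName[]): number {
  return edges.reduce((mask, e) => mask | EDGE_BIT[e], 0);
}

// Decode a mask back into edge names.
function decodeEdges(mask: number): EdgeName[] {
  return (Object.keys(EDGE_BIT) as EdgeName[]).filter(e => (mask & EDGE_BIT[e]) !== 0);
}

const riverMask = encodeEdges(['W', 'E']); // 16 | 2 = 18
```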
### Processing Order

Hexes are processed **column by column** (q ascending, r ascending within
each q). When hex (q, r) is reached, its **NW** and **W** neighbours have
therefore always been classified already.
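This order is a plain lexicographic sort on (q, r), as implemented by `sortCoords` in `pipeline/extract-submaps.ts`. A self-contained sketch with a small example:

```typescript
interface AxialCoord { q: number; r: number; }

// Column-by-column order: q ascending, r ascending within each column.
function sortCoords(coords: AxialCoord[]): AxialCoord[] {
  return [...coords].sort((a, b) => (a.q !== b.q ? a.q - b.q : a.r - b.r));
}

const order = sortCoords([
  { q: 1, r: 0 }, { q: 0, r: 2 }, { q: 0, r: 1 }, { q: 1, r: -1 },
]);
// Every hex in column q-1 precedes every hex in column q, so the
// already-visited side of the grid always includes the W/NW neighbours.
```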
### Three-Phase Pipeline

```
Phase 1 — Extract (pipeline/extract-submaps.ts)
    source image → annotated PNG submaps + manifest.json
    No API key required.

Phase 2 — Classify (Claude Code session)
    Claude reads the PNGs with the Read tool, classifies in batches,
    and writes pipeline/submaps/<id>/classifications.json.
    Can be resumed across multiple sessions.

Phase 3 — Import (pipeline/import-from-json.ts)
    classifications.json → SQLite DB (hexes + hex_features)
    No API key required.
```
**Alternative:** `pipeline/import-from-image.ts` combines all three phases
in a single script, but requires your own `ANTHROPIC_API_KEY`.
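The exact schema of `classifications.json` is not shown in this diff. A plausible per-hex entry, inferred from the `ClassifiedHex` interface in `pipeline/import-from-image.ts` (a base terrain plus features with edge masks) — the field names below are an assumption, not the project's confirmed format:

```typescript
// Assumed shape, mirroring ClassifiedHex from pipeline/import-from-image.ts;
// the actual classifications.json schema may differ.
interface Classification {
  q: number;
  r: number;
  base: string; // base terrain id, e.g. "plains"
  features: Array<{ terrainId: string; edgeMask: number }>;
}

const sample: Classification = {
  q: 12,
  r: 30,
  base: 'plains',
  // river crossing the E (bit 2) and W (bit 16) edges
  features: [{ terrainId: 'river', edgeMask: 2 | 16 }],
};
```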
---

## Aventurien Map (DSA 4.1, 2016)

### Source Image
`pipeline/source/aventurien-8000x12000.jpg` — reconstructed from the
tile pyramid of the kiepenkerl project (git.davoryn.de/calic/kiepenkerl).
Not in the repo (`.gitignore`); generate it locally:
```bash
# Fetch tiles (zoom 6, 1504 tiles)
docker cp kiepenkerl:/app/dist-server/public/tiles/6 pipeline/source/aventurien-tiles-z6

# Assemble the image
npm run pipeline:assemble -- pipeline/source/aventurien-tiles-z6 pipeline/source/aventurien.jpg

# Crop to exactly 8000×12000
node -e "import('sharp').then(({default:s})=>s('pipeline/source/aventurien.jpg').extract({left:0,top:0,width:8000,height:12000}).jpeg({quality:92}).toFile('pipeline/source/aventurien-8000x12000.jpg').then(i=>console.log(i)))"
```
### Scale

`PIXELS_PER_MEILE = 8` (from kiepenkerl `src/engine/route-calc.ts`)

| Scale | `hexSize` | Hexes (8000×12000) |
|---|---|---|
| 10 Meilen/hex | **40 px** | ~17,800 |
| 5 Meilen/hex | 20 px | ~71,000 |

Default for this map: **10 Meilen/hex, hexSize=40**.
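The Meilen-per-hex figure follows from the flat-top hex width (2 × hexSize pixels) divided by `PIXELS_PER_MEILE`, matching the value the extractor logs and writes into `manifest.json`. A quick check:

```typescript
const PIXELS_PER_MEILE = 8;

// A flat-top hex is 2 * hexSize pixels wide (corner to corner).
function meilenPerHex(hexSize: number): number {
  return (hexSize * 2) / PIXELS_PER_MEILE;
}

console.log(meilenPerHex(40)); // 10 Meilen/hex
console.log(meilenPerHex(20)); // 5 Meilen/hex
```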
### Region Split (axial r coordinate)

Axial boundaries cut through pixel space **diagonally** (r isolines slope
from top-left to bottom-right in the image). This is correct and
intentional: the boundaries stay consistent and extensible.
| Region | r range | Content |
|---|---|---|
| **Nord** | r < 25 | Ifirns Ozean, Thorwal, Gjalskerland |
| **Mitte** | 25 ≤ r ≤ 90 | Mittelreich, Horasreich, Bornland, Raschtulswall, Khom |
| **Süd** | r > 90 | Al'Anfa, Südmeer islands |

Each session can add one region without touching the others.
### Map Entry in the DB
```bash
# Create once (e.g. via the API against a running server, or directly):
curl -X POST http://localhost:3001/api/maps \
  -H 'Content-Type: application/json' \
  -d '{"name":"Aventurien 10M/Hex","image_width":8000,"image_height":12000,
       "hex_size":40,"origin_x":0,"origin_y":0,"min_zoom":0,"max_zoom":6}'
```
### Workflow (Mitte region)

```bash
# 1. Extract submaps (Mitte: r=25..90)
npm run pipeline:extract -- \
  pipeline/source/aventurien-8000x12000.jpg <map-id> \
  --region 0,25,133,90

# 2. Claude classifies in this session
#    (submaps/<map-id>/classifications.json is filled incrementally)

# 3. Import the results
npm run pipeline:import -- \
  pipeline/submaps/<map-id>/classifications.json <map-id>
```
---

## Development

```bash
npm run dev        # Vite + Express in parallel
npm run test       # Vitest
npm run pipeline:assemble -- <tiles-dir> <output.jpg>
npm run pipeline:extract -- <image> <map-id> [--region q0,r0,q1,r1]
npm run pipeline:import -- <classifications.json> <map-id>
```
package-lock.json (generated): 45 lines changed

@@ -8,6 +8,7 @@
       "name": "hexifyer",
       "version": "0.1.0",
       "dependencies": {
+        "@anthropic-ai/sdk": "^0.89.0",
         "concurrently": "^9.2.1",
         "cors": "^2.8.6",
         "express": "^5.2.1",
@@ -25,6 +26,33 @@
         "vitest": "^3.0.0"
       }
     },
+    "node_modules/@anthropic-ai/sdk": {
+      "version": "0.89.0",
+      "resolved": "https://registry.npmjs.org/@anthropic-ai/sdk/-/sdk-0.89.0.tgz",
+      "integrity": "sha512-nyGau0zex62EpU91hsHa0zod973YEoiMgzWZ9hC55WdiOLrE4AGpcg4wXI7lFqtvMLqMcLfewQU9sHgQB6psow==",
+      "dependencies": {
+        "json-schema-to-ts": "^3.1.1"
+      },
+      "bin": {
+        "anthropic-ai-sdk": "bin/cli"
+      },
+      "peerDependencies": {
+        "zod": "^3.25.0 || ^4.0.0"
+      },
+      "peerDependenciesMeta": {
+        "zod": {
+          "optional": true
+        }
+      }
+    },
+    "node_modules/@babel/runtime": {
+      "version": "7.29.2",
+      "resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.29.2.tgz",
+      "integrity": "sha512-JiDShH45zKHWyGe4ZNVRrCjBz8Nh9TMmZG1kh4QTK8hCBTWBi8Da+i7s1fJw7/lYpM4ccepSNfqzZ/QvABBi5g==",
+      "engines": {
+        "node": ">=6.9.0"
+      }
+    },
     "node_modules/@emnapi/runtime": {
       "version": "1.9.2",
       "resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.9.2.tgz",
@@ -2304,6 +2332,18 @@
       "dev": true,
       "license": "MIT"
     },
+    "node_modules/json-schema-to-ts": {
+      "version": "3.1.1",
+      "resolved": "https://registry.npmjs.org/json-schema-to-ts/-/json-schema-to-ts-3.1.1.tgz",
+      "integrity": "sha512-+DWg8jCJG2TEnpy7kOm/7/AxaYoaRbjVB4LFZLySZlWn8exGs3A4OLJR966cVvU26N7X9TWxl+Jsw7dzAqKT6g==",
+      "dependencies": {
+        "@babel/runtime": "^7.18.3",
+        "ts-algebra": "^2.0.0"
+      },
+      "engines": {
+        "node": ">=16"
+      }
+    },
     "node_modules/leaflet": {
       "version": "1.9.4",
       "resolved": "https://registry.npmjs.org/leaflet/-/leaflet-1.9.4.tgz",
@@ -3067,6 +3107,11 @@
        "tree-kill": "cli.js"
      }
    },
+    "node_modules/ts-algebra": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/ts-algebra/-/ts-algebra-2.0.0.tgz",
+      "integrity": "sha512-FPAhNPFMrkwz76P7cdjdmiShwMynZYN6SgOujD1urY4oNm80Ou9oMdmbR45LotcKOXoy7wSmHkRFE6Mxbrhefw=="
+    },
     "node_modules/tslib": {
       "version": "2.8.1",
       "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz",
package.json

@@ -11,7 +11,10 @@
     "test": "vitest run",
     "test:watch": "vitest",
     "server": "tsx server/index.ts",
-    "pipeline:tiles": "tsx pipeline/generate-tiles.ts"
+    "pipeline:tiles": "tsx pipeline/generate-tiles.ts",
+    "pipeline:assemble": "tsx pipeline/assemble-map.ts",
+    "pipeline:extract": "tsx pipeline/extract-submaps.ts",
+    "pipeline:import": "tsx pipeline/import-from-json.ts"
   },
   "devDependencies": {
     "@types/cors": "^2.8.19",
@@ -24,6 +27,7 @@
     "vitest": "^3.0.0"
   },
   "dependencies": {
+    "@anthropic-ai/sdk": "^0.89.0",
     "concurrently": "^9.2.1",
     "cors": "^2.8.6",
     "express": "^5.2.1",
pipeline/assemble-map.ts (new file): 112 lines

@@ -0,0 +1,112 @@
/**
 * Reconstruct a full map image from a Leaflet tile pyramid (zoom level z).
 *
 * Usage:
 *   npx tsx pipeline/assemble-map.ts <tiles-dir> <output-image>
 *
 * The tiles-dir must contain subdirectories named by x-index, each containing
 * y-index.jpg files (standard Leaflet tile layout: {z}/{x}/{y}.jpg).
 *
 * Example:
 *   npx tsx pipeline/assemble-map.ts pipeline/source/aventurien-tiles-z6 pipeline/source/aventurien.jpg
 */

import sharp from 'sharp';
import { readdirSync, existsSync } from 'fs';
import { join, resolve } from 'path';

const TILE_SIZE = 256;

async function assembleTiles(tilesDir: string, outputPath: string) {
  if (!existsSync(tilesDir)) {
    console.error(`Tiles directory not found: ${tilesDir}`);
    process.exit(1);
  }

  // Discover grid dimensions from directory structure
  const xDirs = readdirSync(tilesDir)
    .map(Number)
    .filter(n => !isNaN(n))
    .sort((a, b) => a - b);

  if (xDirs.length === 0) {
    console.error('No tile columns found in', tilesDir);
    process.exit(1);
  }

  const maxX = Math.max(...xDirs);

  // Count rows from first column
  const firstColDir = join(tilesDir, String(xDirs[0]));
  const yFiles = readdirSync(firstColDir)
    .map(f => parseInt(f))
    .filter(n => !isNaN(n))
    .sort((a, b) => a - b);
  const maxY = Math.max(...yFiles);

  const tilesX = maxX + 1;
  const tilesY = maxY + 1;
  const totalWidth = tilesX * TILE_SIZE;
  const totalHeight = tilesY * TILE_SIZE;

  console.log(`Grid: ${tilesX}×${tilesY} tiles → canvas ${totalWidth}×${totalHeight}px`);

  const composites: sharp.OverlayOptions[] = [];
  let loaded = 0;

  for (const x of xDirs) {
    const colDir = join(tilesDir, String(x));
    const yFiles = readdirSync(colDir)
      .map(f => parseInt(f))
      .filter(n => !isNaN(n))
      .sort((a, b) => a - b);

    for (const y of yFiles) {
      const tilePath = join(colDir, `${y}.jpg`);
      if (!existsSync(tilePath)) continue;
      composites.push({
        input: tilePath,
        left: x * TILE_SIZE,
        top: y * TILE_SIZE,
      });
      loaded++;
    }
  }

  console.log(`Compositing ${loaded} tiles...`);

  // sharp handles all composites in a single call for counts in this range
  await sharp({
    create: { width: totalWidth, height: totalHeight, channels: 3, background: { r: 42, g: 85, b: 116 } },
  })
    .composite(composites)
    .jpeg({ quality: 92 })
    .toFile(outputPath);

  console.log(`Assembled → ${outputPath}`);

  // Print map config hint
  console.log(`\nMap config hint:`);
  console.log(`  imageSize: [${totalWidth}, ${totalHeight}]`);
  console.log(`  PIXELS_PER_MEILE: 8`);
  console.log(`  10 Meilen/Hex → hexSize: 40 (~${Math.round(totalWidth / 80)}×${Math.round(totalHeight / 69)} hexes)`);
  console.log(`  5 Meilen/Hex  → hexSize: 20 (~${Math.round(totalWidth / 40)}×${Math.round(totalHeight / 35)} hexes)`);
}

const [tilesDir, outputPath] = process.argv.slice(2);
if (!tilesDir || !outputPath) {
  console.error('Usage: npx tsx pipeline/assemble-map.ts <tiles-dir> <output-image>');
  process.exit(1);
}
assembleTiles(resolve(tilesDir), resolve(outputPath)).catch(err => {
  console.error('Failed:', err);
  process.exit(1);
});
pipeline/extract-submaps.ts (new file): 224 lines

@@ -0,0 +1,224 @@
/**
 * Phase 1: Extract hex submaps from a source image for Claude-vision classification.
 *
 * Usage:
 *   npx tsx pipeline/extract-submaps.ts <image> <map-id> [options]
 *
 * Options:
 *   --region <q0,r0,q1,r1>   Only process hexes within this axial bounding box
 *
 * Output:
 *   submaps/<map-id>/manifest.json   Map config + processing order
 *   submaps/<map-id>/<q>_<r>.png     Annotated submap per hex (center in red)
 *
 * The manifest lists hexes in processing order (q asc, r asc) so Claude can
 * use left/top neighbours as context when classifying each hex.
 */

import sharp from 'sharp';
import { mkdirSync, writeFileSync, existsSync } from 'fs';
import { join, resolve, dirname } from 'path';
import { fileURLToPath } from 'url';
import { initDb, getDb } from '../server/db.js';
import { axialToPixel, hexVertices } from '../core/coords.js';
import { gridBoundsForImage } from '../core/hex-grid.js';
import { HexEdge, EDGE_DIRECTIONS, ALL_EDGES, type AxialCoord, type PixelCoord } from '../core/types.js';

const ROOT = resolve(dirname(fileURLToPath(import.meta.url)), '..');

// ─── Constants ────────────────────────────────────────────────────────────────

const PIXELS_PER_MEILE = 8;
const CROP_RADIUS_FACTOR = 2.8;

// ─── DB helpers ──────────────────────────────────────────────────────────────

interface MapConfig {
  id: number;
  image_width: number;
  image_height: number;
  hex_size: number;
  origin_x: number;
  origin_y: number;
}

function loadMapConfig(mapId: number): MapConfig {
  const db = getDb();
  const rows = db.exec(
    'SELECT id, image_width, image_height, hex_size, origin_x, origin_y FROM hex_maps WHERE id = ?',
    [mapId],
  );
  if (rows.length === 0 || rows[0].values.length === 0) {
    throw new Error(`Map ${mapId} not found in DB`);
  }
  const [id, image_width, image_height, hex_size, origin_x, origin_y] = rows[0].values[0] as number[];
  return { id, image_width, image_height, hex_size, origin_x, origin_y };
}

// ─── Submap extraction ────────────────────────────────────────────────────────

function buildHexSvg(
  center: AxialCoord,
  hexSize: number,
  origin: PixelCoord,
  cropLeft: number,
  cropTop: number,
  cropW: number,
  cropH: number,
): string {
  const EDGE_NAME: Record<HexEdge, string> = {
    [HexEdge.NE]: 'NE', [HexEdge.E]: 'E', [HexEdge.SE]: 'SE',
    [HexEdge.SW]: 'SW', [HexEdge.W]: 'W', [HexEdge.NW]: 'NW',
  };

  const hexCoords: Array<{ coord: AxialCoord; label?: string }> = [
    { coord: center },
    ...ALL_EDGES.map(e => {
      const d = EDGE_DIRECTIONS[e];
      return { coord: { q: center.q + d.q, r: center.r + d.r }, label: EDGE_NAME[e] };
    }),
  ];

  const polys = hexCoords.map(({ coord, label }, i) => {
    const px = axialToPixel(coord, hexSize, origin);
    const verts = hexVertices(px.x - cropLeft, px.y - cropTop, hexSize);
    const pts = verts.map(v => `${v.x.toFixed(1)},${v.y.toFixed(1)}`).join(' ');
    const isCenter = i === 0;

    const poly = `<polygon points="${pts}" fill="${isCenter ? 'rgba(255,60,60,0.10)' : 'none'}" `
      + `stroke="${isCenter ? '#ff3c3c' : '#ffffff'}" `
      + `stroke-width="${isCenter ? 2.5 : 1.2}" stroke-opacity="${isCenter ? 1 : 0.6}"/>`;

    if (!label) return poly;

    // Small direction label
    const lx = (px.x - cropLeft).toFixed(1);
    const ly = (px.y - cropTop).toFixed(1);
    const text = `<text x="${lx}" y="${ly}" text-anchor="middle" dominant-baseline="middle" `
      + `font-family="monospace" font-size="${Math.max(8, hexSize * 0.35)}" fill="white" opacity="0.7">${label}</text>`;
    return poly + text;
  });

  return `<svg width="${cropW}" height="${cropH}" xmlns="http://www.w3.org/2000/svg">\n`
    + polys.join('\n') + '\n</svg>';
}

async function extractSubmap(
  imagePath: string,
  coord: AxialCoord,
  hexSize: number,
  origin: PixelCoord,
  imageWidth: number,
  imageHeight: number,
  outputPath: string,
): Promise<void> {
  const px = axialToPixel(coord, hexSize, origin);
  const r = Math.ceil(hexSize * CROP_RADIUS_FACTOR);

  const left = Math.max(0, Math.round(px.x - r));
  const top = Math.max(0, Math.round(px.y - r));
  const right = Math.min(imageWidth, Math.round(px.x + r));
  const bottom = Math.min(imageHeight, Math.round(px.y + r));
  const w = right - left;
  const h = bottom - top;

  const svg = buildHexSvg(coord, hexSize, origin, left, top, w, h);

  await sharp(imagePath)
    .extract({ left, top, width: w, height: h })
    .composite([{ input: Buffer.from(svg), top: 0, left: 0 }])
    .png({ compressionLevel: 6 })
    .toFile(outputPath);
}

// ─── Processing order ─────────────────────────────────────────────────────────

function sortCoords(coords: AxialCoord[]): AxialCoord[] {
  return [...coords].sort((a, b) => a.q !== b.q ? a.q - b.q : a.r - b.r);
}

// ─── Main ─────────────────────────────────────────────────────────────────────

async function main() {
  const args = process.argv.slice(2);
  const imagePath = args[0];
  const mapId = parseInt(args[1], 10);

  if (!imagePath || isNaN(mapId)) {
    console.error('Usage: npx tsx pipeline/extract-submaps.ts <image> <map-id> [--region q0,r0,q1,r1]');
    process.exit(1);
  }

  let regionFilter: { q0: number; r0: number; q1: number; r1: number } | null = null;
  const regionIdx = args.indexOf('--region');
  if (regionIdx !== -1) {
    const [q0, r0, q1, r1] = args[regionIdx + 1].split(',').map(Number);
    regionFilter = { q0, r0, q1, r1 };
    console.log(`Region filter: q=[${q0},${q1}] r=[${r0},${r1}]`);
  }

  if (!existsSync(imagePath)) {
    console.error(`Image not found: ${imagePath}`);
    process.exit(1);
  }

  await initDb();
  const cfg = loadMapConfig(mapId);
  const { image_width, image_height, hex_size, origin_x, origin_y } = cfg;
  const origin: PixelCoord = { x: origin_x, y: origin_y };

  const meilenPerHex = (hex_size * 2) / PIXELS_PER_MEILE;
  console.log(`Map ${mapId}: ${image_width}×${image_height}px, hexSize=${hex_size}px = ${meilenPerHex} Meilen/Hex`);

  let { coords } = gridBoundsForImage(image_width, image_height, hex_size, origin);

  if (regionFilter) {
    const { q0, r0, q1, r1 } = regionFilter;
    coords = coords.filter(c => c.q >= q0 && c.q <= q1 && c.r >= r0 && c.r <= r1);
  }

  const sorted = sortCoords(coords);
  console.log(`Total hexes to extract: ${sorted.length}`);

  const outDir = join(ROOT, 'pipeline', 'submaps', String(mapId));
  mkdirSync(outDir, { recursive: true });

  // Write manifest
  const manifest = {
    mapId,
    imageWidth: image_width,
    imageHeight: image_height,
    hexSize: hex_size,
    originX: origin_x,
    originY: origin_y,
    meilenPerHex,
    hexes: sorted.map(c => ({ q: c.q, r: c.r })),
  };
  writeFileSync(join(outDir, 'manifest.json'), JSON.stringify(manifest, null, 2));
  console.log(`Manifest written: ${sorted.length} hexes`);

  let done = 0;
  for (const coord of sorted) {
    const filename = `${coord.q}_${coord.r}.png`;
    const outPath = join(outDir, filename);

    // Skip if already extracted (resumable)
    if (existsSync(outPath)) {
      done++;
      continue;
    }

    await extractSubmap(imagePath, coord, hex_size, origin, image_width, image_height, outPath);
    done++;

    if (done % 100 === 0 || done === sorted.length) {
      process.stdout.write(`  ${done}/${sorted.length}\r`);
    }
  }

  console.log(`\nExtracted ${done} submaps → ${outDir}`);
  console.log(`\nNext step: Claude classifies submaps in this session.`);
  console.log(`Then run: npx tsx pipeline/import-from-json.ts ${outDir}/classifications.json ${mapId}`);
}

main().catch(err => { console.error('Fatal:', err); process.exit(1); });
pipeline/import-from-image.ts (new file): 394 lines

@@ -0,0 +1,394 @@
/**
|
||||||
|
* One-time hex map terrain import from a source image using Claude vision.
|
||||||
|
*
|
||||||
|
* Usage:
|
||||||
|
* npx tsx pipeline/import-from-image.ts <image> <map-id> [options]
|
||||||
|
*
|
||||||
|
* Options:
|
||||||
|
* --model haiku|sonnet Vision model (default: sonnet)
|
||||||
|
* --dry-run Classify without writing to DB
|
||||||
|
* --save-every <n> Persist DB every N hexes (default: 50)
|
||||||
|
*
|
||||||
|
* Processing order: column-by-column (q ascending, r ascending within q),
|
||||||
|
* so NW and W neighbours are always already classified when a hex is reached.
|
||||||
|
*/
|
||||||
|
|
||||||
|
import Anthropic from '@anthropic-ai/sdk';
|
||||||
|
import sharp from 'sharp';
|
||||||
|
import { resolve, dirname } from 'path';
|
||||||
|
import { fileURLToPath } from 'url';
|
||||||
|
import { initDb, getDb, saveDb } from '../server/db.js';
|
||||||
|
import { axialToPixel, hexVertices } from '../core/coords.js';
|
||||||
|
import { gridBoundsForImage } from '../core/hex-grid.js';
|
||||||
|
import { TERRAIN_TYPES } from '../core/terrain.js';
|
||||||
|
import { HexEdge, EDGE_DIRECTIONS, ALL_EDGES, type AxialCoord, type PixelCoord } from '../core/types.js';
|
||||||
|
|
||||||
|
// ─── Model config ────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
const MODELS = {
|
||||||
|
haiku: 'claude-haiku-4-5-20251001',
|
||||||
|
sonnet: 'claude-sonnet-4-6',
|
||||||
|
} as const;
|
||||||
|
|
||||||
|
type ModelKey = keyof typeof MODELS;
|
||||||
|
|
||||||
|
// ─── Edge mask helpers ───────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
const EDGE_BIT: Record<HexEdge, number> = {
|
||||||
|
[HexEdge.NE]: 1,
|
||||||
|
[HexEdge.E]: 2,
|
||||||
|
[HexEdge.SE]: 4,
|
||||||
|
[HexEdge.SW]: 8,
|
||||||
|
[HexEdge.W]: 16,
|
||||||
|
[HexEdge.NW]: 32,
|
||||||
|
};
|
||||||
|
|
||||||
|
const EDGE_NAME: Record<HexEdge, string> = {
|
||||||
|
[HexEdge.NE]: 'NE',
|
||||||
|
[HexEdge.E]: 'E',
|
||||||
|
[HexEdge.SE]: 'SE',
|
||||||
|
[HexEdge.SW]: 'SW',
|
||||||
|
[HexEdge.W]: 'W',
|
||||||
|
[HexEdge.NW]: 'NW',
|
||||||
|
};
|
||||||
|
|
||||||
|
// ─── Submap extraction ───────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
const CROP_RADIUS_FACTOR = 2.8; // multiplied by hexSize → radius of crop around center
|
||||||
|
|
||||||
|
interface Crop {
|
||||||
|
left: number;
|
||||||
|
top: number;
|
||||||
|
width: number;
|
||||||
|
height: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
function computeCrop(
|
||||||
|
cx: number,
|
||||||
|
cy: number,
|
||||||
|
hexSize: number,
|
||||||
|
imageWidth: number,
|
||||||
|
imageHeight: number,
|
||||||
|
): Crop {
|
||||||
|
const r = Math.ceil(hexSize * CROP_RADIUS_FACTOR);
|
||||||
|
const left = Math.max(0, Math.round(cx - r));
|
||||||
|
const top = Math.max(0, Math.round(cy - r));
|
||||||
|
const right = Math.min(imageWidth, Math.round(cx + r));
|
||||||
|
const bottom = Math.min(imageHeight, Math.round(cy + r));
|
||||||
|
return { left, top, width: right - left, height: bottom - top };
|
||||||
|
}

/** Build SVG overlay: center hex in red, neighbours in grey. */
function buildHexOverlaySvg(
  center: AxialCoord,
  hexSize: number,
  origin: PixelCoord,
  crop: Crop,
): string {
  const hexCoords = [center, ...ALL_EDGES.map(e => {
    const d = EDGE_DIRECTIONS[e];
    return { q: center.q + d.q, r: center.r + d.r };
  })];

  const polys = hexCoords.map((coord, i) => {
    const px = axialToPixel(coord, hexSize, origin);
    const verts = hexVertices(px.x - crop.left, px.y - crop.top, hexSize);
    const pts = verts.map(v => `${v.x.toFixed(1)},${v.y.toFixed(1)}`).join(' ');
    const isCenter = i === 0;
    return `<polygon points="${pts}" fill="${isCenter ? 'rgba(255,68,68,0.08)' : 'none'}" `
      + `stroke="${isCenter ? '#ff4444' : '#cccccc'}" `
      + `stroke-width="${isCenter ? 3 : 1.5}" stroke-opacity="0.9"/>`;
  });

  return `<svg width="${crop.width}" height="${crop.height}" xmlns="http://www.w3.org/2000/svg">\n`
    + polys.join('\n') + '\n</svg>';
}

async function extractSubmap(
  imagePath: string,
  center: AxialCoord,
  hexSize: number,
  origin: PixelCoord,
  imageWidth: number,
  imageHeight: number,
): Promise<string> {
  const px = axialToPixel(center, hexSize, origin);
  const crop = computeCrop(px.x, px.y, hexSize, imageWidth, imageHeight);
  const svg = buildHexOverlaySvg(center, hexSize, origin, crop);

  const buf = await sharp(imagePath)
    .extract(crop)
    .composite([{ input: Buffer.from(svg), top: 0, left: 0 }])
    .png()
    .toBuffer();

  return buf.toString('base64');
}

// ─── Neighbour context ───────────────────────────────────────────────────────

interface ClassifiedHex {
  base: string;
  features: Array<{ terrainId: string; edgeMask: number }>;
}

function buildNeighbourContext(
  center: AxialCoord,
  classified: Map<string, ClassifiedHex>,
): string {
  const lines: string[] = [];
  for (const edge of ALL_EDGES) {
    const d = EDGE_DIRECTIONS[edge];
    const key = `${center.q + d.q},${center.r + d.r}`;
    const name = EDGE_NAME[edge];
    const result = classified.get(key);
    if (result) {
      let desc = result.base;
      if (result.features.length > 0) {
        const feats = result.features.map(f => {
          const exitEdges = ALL_EDGES
            .filter(e => f.edgeMask & EDGE_BIT[e])
            .map(e => EDGE_NAME[e])
            .join('+');
          return `${f.terrainId}(edges:${exitEdges})`;
        }).join(', ');
        desc += ` + ${feats}`;
      }
      lines.push(`  ${name}: ${desc}`);
    } else {
      lines.push(`  ${name}: unknown`);
    }
  }
  return lines.join('\n');
}

// ─── Claude tool definition ──────────────────────────────────────────────────

const AREA_IDS = TERRAIN_TYPES.filter(t => t.category === 'area').map(t => t.id);
const LINEAR_IDS = TERRAIN_TYPES.filter(t => t.category === 'linear').map(t => t.id);

const classifyTool: Anthropic.Tool = {
  name: 'classify_hex',
  description: 'Classify the terrain of the center hex (red outline) in the map image.',
  input_schema: {
    type: 'object' as const,
    properties: {
      base: {
        type: 'string',
        enum: AREA_IDS,
        description: 'Primary area terrain filling the center hex.',
      },
      features: {
        type: 'array',
        items: {
          type: 'object' as const,
          properties: {
            terrainId: { type: 'string', enum: LINEAR_IDS },
            edgeMask: {
              type: 'integer',
              minimum: 0,
              maximum: 63,
              description: '6-bit mask of edges the feature crosses. '
                + 'Bits: NE=1, E=2, SE=4, SW=8, W=16, NW=32.',
            },
          },
          required: ['terrainId', 'edgeMask'],
        },
        description: 'Linear features (rivers, roads, coastlines) passing through the hex. '
          + 'Omit if none visible.',
      },
      reasoning: {
        type: 'string',
        description: 'One sentence explaining the classification.',
      },
    },
    required: ['base', 'features', 'reasoning'],
  },
};

const SYSTEM_PROMPT = `You are a fantasy cartography expert classifying hexagonal map regions.
You will receive a cropped section of a hand-drawn or digitally painted fantasy map.
The CENTER hex is outlined in RED. Surrounding hexes are outlined in grey for context.

Terrain types (area):
${TERRAIN_TYPES.filter(t => t.category === 'area')
  .map(t => `  ${t.id}: ${t.name}`).join('\n')}

Linear features (can overlay area terrain):
${TERRAIN_TYPES.filter(t => t.category === 'linear')
  .map(t => `  ${t.id}: ${t.name} — use edgeMask to indicate which hex edges it crosses`).join('\n')}

Edge mask bits: NE=1, E=2, SE=4, SW=8, W=16, NW=32.
A river entering from the W edge and exiting via the E edge → edgeMask = 18 (W|E = 16+2).

Focus on the CENTER hex. Use neighbour context only to infer continuity of rivers/roads.`;

// ─── API call ────────────────────────────────────────────────────────────────

async function classifyHex(
  client: Anthropic,
  model: string,
  submapBase64: string,
  neighborContext: string,
): Promise<ClassifiedHex> {
  const userText = `Classify the center hex (red outline).\n\nNeighbour terrain:\n${neighborContext}`;

  const response = await client.messages.create({
    model,
    max_tokens: 512,
    system: SYSTEM_PROMPT,
    tools: [classifyTool],
    tool_choice: { type: 'any' },
    messages: [{
      role: 'user',
      content: [
        {
          type: 'image',
          source: { type: 'base64', media_type: 'image/png', data: submapBase64 },
        },
        { type: 'text', text: userText },
      ],
    }],
  });

  const toolUse = response.content.find(b => b.type === 'tool_use');
  if (!toolUse || toolUse.type !== 'tool_use') {
    throw new Error('No tool_use block in response');
  }

  const input = toolUse.input as {
    base: string;
    features: Array<{ terrainId: string; edgeMask: number }>;
    reasoning: string;
  };
  return { base: input.base, features: input.features ?? [] };
}

// ─── DB helpers ─────────────────────────────────────────────────────────────

interface MapConfig {
  id: number;
  image_width: number;
  image_height: number;
  hex_size: number;
  origin_x: number;
  origin_y: number;
}

function loadMapConfig(mapId: number): MapConfig {
  const db = getDb();
  const rows = db.exec(
    'SELECT id, image_width, image_height, hex_size, origin_x, origin_y FROM hex_maps WHERE id = ?',
    [mapId],
  );
  if (rows.length === 0 || rows[0].values.length === 0) {
    throw new Error(`Map ${mapId} not found`);
  }
  const [id, image_width, image_height, hex_size, origin_x, origin_y] = rows[0].values[0] as number[];
  return { id, image_width, image_height, hex_size, origin_x, origin_y };
}

function writeHex(mapId: number, coord: AxialCoord, result: ClassifiedHex): void {
  const db = getDb();
  db.run(
    `INSERT INTO hexes (map_id, q, r, base_terrain, updated_at)
     VALUES (?, ?, ?, ?, datetime('now'))
     ON CONFLICT(map_id, q, r)
     DO UPDATE SET base_terrain = excluded.base_terrain, updated_at = datetime('now')`,
    [mapId, coord.q, coord.r, result.base],
  );
  const idRows = db.exec('SELECT id FROM hexes WHERE map_id = ? AND q = ? AND r = ?', [mapId, coord.q, coord.r]);
  const hexId = idRows[0].values[0][0] as number;
  db.run('DELETE FROM hex_features WHERE hex_id = ?', [hexId]);
  for (const f of result.features) {
    if (f.edgeMask === 0) continue;
    db.run('INSERT INTO hex_features (hex_id, terrain_id, edge_mask) VALUES (?, ?, ?)', [hexId, f.terrainId, f.edgeMask]);
  }
}

// ─── Processing order ────────────────────────────────────────────────────────

/** Sort axial coords: q ascending, then r ascending within each q column. */
function sortCoords(coords: AxialCoord[]): AxialCoord[] {
  return [...coords].sort((a, b) => a.q !== b.q ? a.q - b.q : a.r - b.r);
}
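Why this order matters: with q ascending, a hex's lower-q (western-side) neighbours are classified before the hex itself, so buildNeighbourContext can usually report real terrain rather than "unknown". A minimal standalone sketch of the comparator (hypothetical `Axial` stand-in for `AxialCoord`):

```typescript
// Column-major ordering: q ascending, then r ascending within each column.
interface Axial { q: number; r: number }

function sortCoords(coords: Axial[]): Axial[] {
  return [...coords].sort((a, b) => a.q !== b.q ? a.q - b.q : a.r - b.r);
}

const order = sortCoords([{ q: 1, r: 0 }, { q: 0, r: 1 }, { q: 0, r: 0 }]);
console.log(order.map(c => `(${c.q},${c.r})`).join(' '));  // (0,0) (0,1) (1,0)
```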

// ─── Main ────────────────────────────────────────────────────────────────────

async function main() {
  const args = process.argv.slice(2);

  const imagePath = args[0];
  const mapId = parseInt(args[1], 10);

  if (!imagePath || isNaN(mapId)) {
    console.error('Usage: npx tsx pipeline/import-from-image.ts <image> <map-id> [--model haiku|sonnet] [--dry-run] [--save-every n]');
    process.exit(1);
  }

  const modelKey = (args.includes('--model') ? args[args.indexOf('--model') + 1] : 'sonnet') as ModelKey;
  const dryRun = args.includes('--dry-run');
  const saveEvery = args.includes('--save-every') ? parseInt(args[args.indexOf('--save-every') + 1], 10) : 50;

  if (!(modelKey in MODELS)) {
    console.error(`Unknown model "${modelKey}". Use: haiku or sonnet`);
    process.exit(1);
  }

  const model = MODELS[modelKey];
  console.log(`Model: ${modelKey} (${model})`);
  console.log(`Dry run: ${dryRun}`);

  await initDb();
  const mapConfig = loadMapConfig(mapId);
  const { image_width, image_height, hex_size, origin_x, origin_y } = mapConfig;
  const origin: PixelCoord = { x: origin_x, y: origin_y };

  console.log(`Map ${mapId}: ${image_width}×${image_height}px, hexSize=${hex_size}, origin=(${origin_x},${origin_y})`);

  const { coords } = gridBoundsForImage(image_width, image_height, hex_size, origin);
  const sorted = sortCoords(coords);
  console.log(`Total hexes: ${sorted.length}`);

  const client = new Anthropic();
  const classified = new Map<string, ClassifiedHex>();
  let done = 0;
  let errors = 0;

  for (const coord of sorted) {
    const key = `${coord.q},${coord.r}`;
    done++;

    const neighborContext = buildNeighbourContext(coord, classified);

    let result: ClassifiedHex;
    try {
      const submap = await extractSubmap(imagePath, coord, hex_size, origin, image_width, image_height);
      result = await classifyHex(client, model, submap, neighborContext);
    } catch (err) {
      console.error(`  [${done}/${sorted.length}] (${coord.q},${coord.r}) ERROR: ${err}`);
      errors++;
      result = { base: 'plains', features: [] };
    }

    classified.set(key, result);

    const featureStr = result.features.length > 0
      ? ` + ${result.features.map(f => f.terrainId).join(', ')}`
      : '';
    console.log(`  [${done}/${sorted.length}] (${coord.q},${coord.r}) → ${result.base}${featureStr}`);

    if (!dryRun) {
      writeHex(mapId, coord, result);
      if (done % saveEvery === 0) {
        saveDb();
        console.log(`  [saved at ${done}]`);
      }
    }
  }

  if (!dryRun) {
    saveDb();
    console.log(`\nDone. ${done} hexes written, ${errors} errors.`);
  } else {
    console.log(`\nDry run complete. ${done} hexes classified (not written), ${errors} errors.`);
  }
}

main().catch(err => {
  console.error('Fatal:', err);
  process.exit(1);
});

94 pipeline/import-from-json.ts Normal file
@@ -0,0 +1,94 @@
/**
 * Phase 3: Import hex classifications from JSON into the DB.
 *
 * Usage:
 *   npx tsx pipeline/import-from-json.ts <classifications.json> <map-id>
 *
 * Input JSON format (array):
 * [
 *   { "q": 0, "r": 0, "base": "forest", "features": [] },
 *   { "q": 1, "r": 0, "base": "plains", "features": [{ "terrainId": "river", "edgeMask": 18 }] },
 *   ...
 * ]
 *
 * Edge mask bits: NE=1, E=2, SE=4, SW=8, W=16, NW=32
 */

import { readFileSync } from 'fs';
import { resolve } from 'path';
import { initDb, getDb, saveDb } from '../server/db.js';

interface HexClassification {
  q: number;
  r: number;
  base: string;
  features: Array<{ terrainId: string; edgeMask: number }>;
}

async function main() {
  const [jsonPath, mapIdStr] = process.argv.slice(2);

  if (!jsonPath || !mapIdStr) {
    console.error('Usage: npx tsx pipeline/import-from-json.ts <classifications.json> <map-id>');
    process.exit(1);
  }

  const mapId = parseInt(mapIdStr, 10);
  const hexes: HexClassification[] = JSON.parse(readFileSync(resolve(jsonPath), 'utf-8'));
  console.log(`Importing ${hexes.length} hexes into map ${mapId}...`);

  await initDb();
  const db = getDb();

  // Verify map exists
  const mapRows = db.exec('SELECT id FROM hex_maps WHERE id = ?', [mapId]);
  if (mapRows.length === 0 || mapRows[0].values.length === 0) {
    console.error(`Map ${mapId} not found`);
    process.exit(1);
  }

  db.run('BEGIN TRANSACTION');
  let written = 0;

  try {
    for (const hex of hexes) {
      // Upsert hex
      db.run(
        `INSERT INTO hexes (map_id, q, r, base_terrain, updated_at)
         VALUES (?, ?, ?, ?, datetime('now'))
         ON CONFLICT(map_id, q, r)
         DO UPDATE SET base_terrain = excluded.base_terrain, updated_at = datetime('now')`,
        [mapId, hex.q, hex.r, hex.base],
      );

      const idRows = db.exec(
        'SELECT id FROM hexes WHERE map_id = ? AND q = ? AND r = ?',
        [mapId, hex.q, hex.r],
      );
      const hexId = idRows[0].values[0][0] as number;

      db.run('DELETE FROM hex_features WHERE hex_id = ?', [hexId]);
      for (const f of (hex.features ?? [])) {
        if (!f.edgeMask) continue;
        db.run(
          'INSERT INTO hex_features (hex_id, terrain_id, edge_mask) VALUES (?, ?, ?)',
          [hexId, f.terrainId, f.edgeMask],
        );
      }

      written++;
    }

    db.run("UPDATE hex_maps SET updated_at = datetime('now') WHERE id = ?", [mapId]);
    db.run('COMMIT');
    saveDb();
    console.log(`Done: ${written} hexes written.`);
  } catch (err) {
    db.run('ROLLBACK');
    console.error('Transaction failed:', err);
    process.exit(1);
  }
}

main().catch(err => { console.error('Fatal:', err); process.exit(1); });