Compare commits

...

52 Commits

SHA1        Date                        Message — checks

89ea421404  2026-01-26 18:23:58 +01:00  chore: canonical prologue + hide legacy prologue from archicratie index — CI failing after 49s; SMOKE successful in 12s
33a5bad49e  2026-01-26 13:27:46 +01:00  chore: add docx chapter splits and manifest — CI successful in 1m7s; SMOKE successful in 27s
56f1be0906  2026-01-26 11:05:55 +01:00  chore: track sources docx/pdf; document anchor aliases; add smoke workflow — CI successful in 1m5s; SMOKE successful in 22s
459ba6bf24  2026-01-23 18:49:09 +01:00  ci: remove ping debug workflow — CI successful in 57s
441860889e  2026-01-23 18:44:19 +01:00  ci: rewrite workflow clean (triggers + build before verify) — CI successful in 58s; PING successful in 18s
11fa4eb73f  2026-01-23 17:27:42 +01:00  ci: add ping workflow (debug trigger) — PING successful in 22s
00e6e260dc  2026-01-23 17:01:37 +01:00  ci: trigger
7efd2e10c2  2026-01-23 17:01:27 +01:00  ci: fix workflow yaml (valid triggers)
78f2703aa5  2026-01-23 15:35:27 +01:00  ci: trigger
ad8e245ec7  2026-01-23 15:35:13 +01:00  ci: fix triggers (push any branch) + manual dispatch; remove stray quote
e8771187ab  2026-01-23 15:33:27 +01:00  ci: trigger
3b30a3894e  2026-01-23 15:33:04 +01:00  ci: fix triggers (push any branch) + manual dispatch; remove stray quote
8c8e13baad  2026-01-23 15:15:01 +01:00  ci: run dist alias verification after build (single verify step)
fb5aac70cb  2026-01-23 15:03:33 +01:00  ci: fix step order (build before dist alias verification)
3d4ab82047  2026-01-23 14:08:03 +01:00  ci: verify all anchor aliases are injected in dist — CI failing after 26s
44974a676d  2026-01-23 13:41:10 +01:00  docs: add roadmap for CI + anchor aliases — CI successful in 50s
12d73fc26e  2026-01-23 13:17:22 +01:00  docs: CI baseline + handoff + workflow snapshot — CI successful in 52s
587af3e997  2026-01-23 12:02:57 +01:00  docs: passation CI/runner DNS baseline — CI successful in 56s
800226a404  2026-01-23 09:47:25 +01:00  ci: verify anchor aliases are injected into dist — CI successful in 54s
fee143e86f  2026-01-23 09:33:16 +01:00  ci: add anchor aliases schema checker — CI successful in 53s
15f0679d2e  2026-01-23 09:19:06 +01:00  ci: validate anchor aliases schema — CI failing after 1m21s
4294d566ee  2026-01-22 18:27:01 +01:00  docs: record CI baseline (runner host net + node22) — CI successful in 50s
812d074148  2026-01-22 18:12:07 +01:00  ci: use node 22 container to satisfy engines — CI successful in 2m0s
ae2715a14c  2026-01-22 18:00:02 +01:00  ci: checkout from event.json (no external actions, no GITEA_* vars) — CI failing after 32s
0888d6b424  2026-01-22 17:42:36 +01:00  ci: no external checkout; no apt; bash + node dns harden — CI failing after 23s
7cee744208  2026-01-22 12:38:16 +01:00  ci: rerun (runner network fix) — CI failing after 2m3s
9823d70896  2026-01-22 10:26:28 +01:00  ci: rerun after runner network/dns fix — CI failing after 22s
939e6ae9ac  2026-01-21 19:39:01 +01:00  ci: trigger — CI failing after 1m56s
b1391cea6e  2026-01-21 19:31:42 +01:00  Revert "Merge pull request 'ci: checkout without external actions (no github.com)' (#43) from fix/ci-no-external-actions into master" — CI failing after 1m55s
            This reverts commit 92b01a43b2, reversing changes made to b6b9855f58.
56d511caf2  2026-01-21 19:31:42 +01:00  Revert "Merge pull request 'ci: stabilize DNS for job container (fix apt resolution)' (#44) from fix/ci-dns into master"
            This reverts commit 058004e865, reversing changes made to 92b01a43b2.
45b76b9c44  2026-01-21 19:31:42 +01:00  Revert "Merge pull request 'ci: remove apt dependency + force LAN DNS in job container' (#45) from fix/ci-no-apt-dns into master"
            This reverts commit 0e6e92e327, reversing changes made to 058004e865.
ce42bdfe04  2026-01-21 19:31:42 +01:00  Revert "Merge pull request 'ci: fix shell (dash) by removing pipefail' (#46) from fix/ci-sh-no-pipefail into master"
            This reverts commit d1caff6b21, reversing changes made to 0e6e92e327.
599ece37b2  2026-01-21 19:31:42 +01:00  Revert "Merge pull request 'ci: fix shell (dash) by removing pipefail' (#47) from fix/ci-sh-no-pipefail into master"
            This reverts commit cec0a75fc8, reversing changes made to d1caff6b21.
cec0a75fc8  2026-01-21 18:33:25 +01:00  Merge pull request 'ci: fix shell (dash) by removing pipefail' (#47) from fix/ci-sh-no-pipefail into master (Reviewed-on: #47) — CI failing after 36s
9a59e9a6cf  2026-01-21 18:32:48 +01:00  ci: fix shell (dash) by removing pipefail — CI (push) failing after 42s; CI (pull_request) failing after 36s
d1caff6b21  2026-01-21 14:09:41 +01:00  Merge pull request 'ci: fix shell (dash) by removing pipefail' (#46) from fix/ci-sh-no-pipefail into master (Reviewed-on: #46) — CI failing after 1m21s
7e13b1166d  2026-01-21 14:09:04 +01:00  ci: fix shell (dash) by removing pipefail — CI (push) failing after 1m20s; CI (pull_request) failing after 1m1s
0e6e92e327  2026-01-21 13:58:07 +01:00  Merge pull request 'ci: remove apt dependency + force LAN DNS in job container' (#45) from fix/ci-no-apt-dns into master (Reviewed-on: #45) — CI failing after 24s
0f94676b27  2026-01-21 13:56:13 +01:00  ci: remove apt dependency + force LAN DNS in job container — CI (push) failing after 2m29s; CI (pull_request) failing after 10s
058004e865  2026-01-21 13:25:52 +01:00  Merge pull request 'ci: stabilize DNS for job container (fix apt resolution)' (#44) from fix/ci-dns into master (Reviewed-on: #44) — CI cancelled
cc088df702  2026-01-21 13:19:12 +01:00  ci: stabilize DNS for job container (fix apt resolution) — CI (push) cancelled; CI (pull_request) cancelled
92b01a43b2  2026-01-21 10:34:37 +01:00  Merge pull request 'ci: checkout without external actions (no github.com)' (#43) from fix/ci-no-external-actions into master (Reviewed-on: #43) — CI failing after 39m47s
01f41432f0  2026-01-21 10:33:51 +01:00  ci: checkout without external actions (no github.com) — CI (push) failing after 4m28s; CI (pull_request) failing after 4m24s
b6b9855f58  2026-01-20 22:01:05 +01:00  Merge pull request 'feat/m2-apply-ticket-confort' (#42) from feat/m2-apply-ticket-confort into master (Reviewed-on: #42) — CI failing after 33s
30d5a20572  2026-01-20 21:54:36 +01:00  m2: apply-ticket supports --close (+ PR guard) — CI (push) failing after 33s; CI (pull_request) failing after 34s
0abf98aa1f  2026-01-20 20:24:57 +01:00  m2: apply-ticket fallback anchor/chemin parsing — CI failing after 34s
1e894e7a1f  2026-01-20 19:59:41 +01:00  Merge pull request 'm2: apply-ticket supports --alias and --commit' (#41) from feat/m2-apply-ticket-confort into master (Reviewed-on: #41) — CI failing after 42s
d87d8c0a8f  2026-01-20 19:53:05 +01:00  m2: apply-ticket supports --alias and --commit — CI (push) failing after 34s; CI (pull_request) failing after 34s
5c00593e67  2026-01-20 19:09:55 +01:00  p0: restore anchor alias p-8-e7075fe3 -> p-8-0e65838d — CI failing after 43s
b2b3d5621b  2026-01-20 19:00:56 +01:00  p0: add anchor aliases for tickets #39 and #40 — CI failing after 30s
4c7b6a772c  2026-01-20 18:57:12 +01:00  edit: apply ticket #40 (/archicratie/prologue/#p-5-85126fa5) — CI failing after 27s
11e45eb9d0  2026-01-20 18:57:03 +01:00  edit: apply ticket #39 (/archicratie/prologue/#p-3-76df8102)
51 changed files with 1468 additions and 122 deletions

.gitea/workflows/ci.yaml (new file)

@@ -0,0 +1,100 @@
name: CI
on:
  push:
  pull_request:
    branches: [master]
  workflow_dispatch:
env:
  NODE_OPTIONS: --dns-result-order=ipv4first
defaults:
  run:
    shell: bash
jobs:
  build-and-anchors:
    runs-on: ubuntu-latest
    container:
      image: mcr.microsoft.com/devcontainers/javascript-node:22-bookworm
    steps:
      - name: Tools sanity
        run: |
          set -euo pipefail
          git --version
          node --version
          npm --version
          npm ping --registry=https://registry.npmjs.org
      - name: Checkout (from event.json, no external actions)
        run: |
          set -euo pipefail
          export EVENT_JSON="/var/run/act/workflow/event.json"
          test -f "$EVENT_JSON" || { echo "❌ Missing $EVENT_JSON"; exit 1; }
          eval "$(node --input-type=module -e 'import fs from "node:fs";
          const ev = JSON.parse(fs.readFileSync(process.env.EVENT_JSON,"utf8"));
          const repo =
            ev?.repository?.clone_url ||
            (ev?.repository?.html_url ? (ev.repository.html_url.replace(/\/$/,"") + ".git") : "");
          const sha =
            ev?.after ||
            ev?.pull_request?.head?.sha ||
            ev?.head_commit?.id ||
            ev?.sha ||
            "";
          if (!repo) throw new Error("No repository url in event.json");
          if (!sha) throw new Error("No sha in event.json");
          process.stdout.write(`REPO_URL=${JSON.stringify(repo)}\nSHA=${JSON.stringify(sha)}\n`);
          ')"
          echo "Repo URL: $REPO_URL"
          echo "SHA: $SHA"
          rm -rf .git
          git init -q
          git remote add origin "$REPO_URL"
          git fetch --depth 1 origin "$SHA"
          git -c advice.detachedHead=false checkout -q FETCH_HEAD
          git log -1 --oneline
      - name: Anchor aliases schema
        run: |
          set -euo pipefail
          node scripts/check-anchor-aliases.mjs
      - name: NPM harden
        run: |
          set -euo pipefail
          npm config set fetch-retries 5
          npm config set fetch-retry-mintimeout 20000
          npm config set fetch-retry-maxtimeout 120000
          npm config set registry https://registry.npmjs.org
          npm config get registry
      - name: Install deps
        run: |
          set -euo pipefail
          npm ci
      - name: Inline scripts syntax check
        run: |
          set -euo pipefail
          node scripts/check-inline-js.mjs
      - name: Build (includes postbuild injection + pagefind)
        run: |
          set -euo pipefail
          npm run build
      - name: Anchors contract
        run: |
          set -euo pipefail
          npm run test:anchors
      - name: Verify anchor aliases injected in dist
        run: |
          set -euo pipefail
          node scripts/verify-anchor-aliases-in-dist.mjs


@@ -1,35 +1,103 @@
 name: CI
 on:
-  push:
-    branches: ["**"]
+  push: {}
   pull_request:
     branches: ["master"]
+  workflow_dispatch: {}
+env:
+  NODE_OPTIONS: --dns-result-order=ipv4first
+defaults:
+  run:
+    shell: bash
 jobs:
   build-and-anchors:
     runs-on: ubuntu-latest
     container:
-      image: node:20-bookworm-slim
+      image: mcr.microsoft.com/devcontainers/javascript-node:22-bookworm
     steps:
-      - name: Install git (needed by checkout)
+      - name: Tools sanity
         run: |
-          apt-get update
-          apt-get install -y --no-install-recommends git ca-certificates
+          set -euo pipefail
+          git --version
+          node --version
+          npm --version
+          npm ping --registry=https://registry.npmjs.org
-      - name: Checkout
-        uses: actions/checkout@v4
+      - name: Checkout (from event.json, no external actions)
+        run: |
+          set -euo pipefail
+          EVENT_JSON="/var/run/act/workflow/event.json"
+          test -f "$EVENT_JSON" || (echo "❌ Missing $EVENT_JSON" && exit 1)
+          eval "$(node - <<'NODE'
+          import fs from "node:fs";
+          const ev = JSON.parse(fs.readFileSync("/var/run/act/workflow/event.json","utf8"));
+          const repo =
+            ev?.repository?.clone_url ||
+            (ev?.repository?.html_url ? (ev.repository.html_url.replace(/\/$/,'') + ".git") : "");
+          const sha =
+            ev?.after ||
+            ev?.pull_request?.head?.sha ||
+            ev?.head_commit?.id ||
+            ev?.sha ||
+            "";
+          if (!repo) { console.error("No repository.clone_url/html_url in event.json"); process.exit(1); }
+          if (!sha) { console.error("No sha/after/pull_request.head.sha in event.json"); process.exit(1); }
+          console.log(`REPO_URL=${JSON.stringify(repo)}`);
+          console.log(`SHA=${JSON.stringify(sha)}`);
+          NODE
+          )"
+          echo "Repo URL: $REPO_URL"
+          echo "SHA: $SHA"
+          rm -rf .git
+          git init
+          git remote add origin "$REPO_URL"
+          git fetch --depth 1 origin "$SHA"
+          git checkout -q FETCH_HEAD
+          git log -1 --oneline
+      - name: Anchor aliases schema
+        run: |
+          set -euo pipefail
+          node scripts/check-anchor-aliases.mjs
+      - name: NPM harden
+        run: |
+          set -euo pipefail
+          npm config set fetch-retries 5
+          npm config set fetch-retry-mintimeout 20000
+          npm config set fetch-retry-maxtimeout 120000
+          npm config set registry https://registry.npmjs.org
+          npm config get registry
       - name: Install deps
-        run: npm ci
+        run: |
+          set -euo pipefail
+          npm ci
       - name: Inline scripts syntax check
-        run: node scripts/check-inline-js.mjs
+        run: |
+          set -euo pipefail
+          node scripts/check-inline-js.mjs
-      - name: Build
-        run: npm run build
+      - name: Build (includes postbuild injection + pagefind)
+        run: |
+          set -euo pipefail
+          npm run build
       - name: Anchors contract
-        run: npm run test:anchors
+        run: |
+          set -euo pipefail
+          npm run test:anchors
+      - name: Verify anchor aliases injected in dist
+        run: |
+          set -euo pipefail
+          node scripts/verify-anchor-aliases-in-dist.mjs


@@ -0,0 +1,9 @@
name: SMOKE
on: [push, workflow_dispatch]
jobs:
  smoke:
    runs-on: ubuntu-latest
    steps:
      - run: node -v && npm -v
      - run: echo "runner OK"

.gitignore

@@ -9,8 +9,24 @@ dist/
 # Local environments (we version .env.example instead)
 .env*
 # Local working folders (keep out of the repo)
-sources/
+# --- sources: version the upstream (docx/pdf), not the artifacts ---
+sources/**
+!sources/
+!sources/docx/
+!sources/docx/**
+!sources/pdf/
+!sources/pdf/**
+# Artifacts and noise
+sources/logs/**
+sources/**/layouts-backups/**
+sources/**/*.bak
+sources/**/*.BROKEN.*
+sources/**/*.step*-fix.bak
+sources/**/*.bak.issue-*
 # LibreOffice/Office lock files
 **/.~lock.*#
 # Astro generated
 .astro/

docs/CI-BASELINE.md (new file)

@@ -0,0 +1,33 @@
# CI-BASELINE — Gitea Actions + Synology runner (DS220+)

VALIDATED baseline:
- runner: container.network = host
- CI job: Node 22 container (matches engines)
- checkout: no GitHub, based on workflow/event.json
- zero apt-get in the workflow
- Node DNS hardening: NODE_OPTIONS=--dns-result-order=ipv4first

## Runner (DS220+) — reference configuration

File: /data/config.yaml inside the runner container (e.g. gitea-act-runner)

Expected container section:

container:
  network: host
  options: >-
    --add-host=gitea.archicratie.trans-hands.synology.me:192.168.1.20
    -e NODE_OPTIONS=--dns-result-order=ipv4first

Why: on this infrastructure, the Docker bridge DNS (127.0.0.11) produced ESERVFAIL / EAI_AGAIN and apt that could not resolve.
Host networking stabilizes resolution (npm registry, deb.debian.org, etc.).

## NAS smoke test (must pass)

docker run --rm --network host mcr.microsoft.com/devcontainers/javascript-node:22-bookworm bash -lc "npm ping --registry=https://registry.npmjs.org"

## Symptom -> cause -> action
- EAI_AGAIN / ESERVFAIL: runner not on host network -> restore container.network: host + restart the runner
- EBADENGINE: wrong Node version -> use the Node 22 container
- MODULE_NOT_FOUND scripts/check-anchor-aliases.mjs: file not committed -> git add/commit/push

docs/CI-WORKFLOW.md (new file)

@@ -0,0 +1,123 @@
# CI-WORKFLOW — snapshot of .gitea/workflows/ci.yml

name: CI
on:
  push:
  pull_request:
    branches: ["master"]
env:
  NODE_OPTIONS: --dns-result-order=ipv4first
defaults:
  run:
    shell: bash
jobs:
  build-and-anchors:
    runs-on: ubuntu-latest
    container:
      image: mcr.microsoft.com/devcontainers/javascript-node:22-bookworm
    steps:
      - name: Tools sanity
        run: |
          set -euo pipefail
          git --version
          node --version
          npm --version
          npm ping --registry=https://registry.npmjs.org
      # Checkout WITHOUT an external action (no github.com)
      - name: Checkout (from event.json, no external actions)
        run: |
          set -euo pipefail
          EVENT_JSON="/var/run/act/workflow/event.json"
          if [ ! -f "$EVENT_JSON" ]; then
            echo "ERROR: missing $EVENT_JSON"
            ls -la /var/run/act/workflow || true
            exit 1
          fi
          # 1) Read the repo URL from event.json
          REPO_URL="$(node -e '
            const fs=require("fs");
            const ev=JSON.parse(fs.readFileSync(process.argv[1],"utf8"));
            let url = ev.repository?.clone_url || ev.repository?.html_url || "";
            if (!url) process.exit(2);
            if (!url.endsWith(".git")) url += ".git";
            process.stdout.write(url);
          ' "$EVENT_JSON")"
          # 2) Read the SHA (push -> after, PR -> pull_request.head.sha)
          SHA="$(node -e '
            const fs=require("fs");
            const ev=JSON.parse(fs.readFileSync(process.argv[1],"utf8"));
            const sha =
              ev.after ||
              ev.pull_request?.head?.sha ||
              ev.head_commit?.id ||
              "";
            process.stdout.write(sha);
          ' "$EVENT_JSON")"
          if [ -z "$SHA" ]; then
            echo "ERROR: cannot find SHA in event.json"
            node -e 'const ev=require(process.argv[1]); console.log(Object.keys(ev));' "$EVENT_JSON" || true
            exit 1
          fi
          echo "Repo URL: $REPO_URL"
          echo "SHA: $SHA"
          # 3) Add a token if available (do NOT print the token)
          AUTH_URL="$REPO_URL"
          if [ -n "${GITHUB_TOKEN:-}" ] && [[ "$REPO_URL" == https://* ]]; then
            AUTH_URL="${REPO_URL/https:\/\//https:\/\/oauth2:${GITHUB_TOKEN}@}"
          elif [ -n "${GITEA_TOKEN:-}" ] && [[ "$REPO_URL" == https://* ]]; then
            AUTH_URL="${REPO_URL/https:\/\//https:\/\/oauth2:${GITEA_TOKEN}@}"
          fi
          # 4) Minimal clone + checkout of the exact SHA
          rm -rf .git || true
          git init .
          # Optional if your Gitea serves a "non-standard" TLS certificate:
          # git config --global http.sslVerify false
          git remote add origin "$AUTH_URL"
          git fetch --depth=1 origin "$SHA"
          git checkout -q FETCH_HEAD
          git log -1 --oneline
      - name: Anchor aliases schema
        run: node scripts/check-anchor-aliases.mjs
      - name: NPM harden
        run: |
          set -euo pipefail
          npm config set fetch-retries 5
          npm config set fetch-retry-mintimeout 20000
          npm config set fetch-retry-maxtimeout 120000
          npm config set registry https://registry.npmjs.org
          npm config get registry
      - name: Install deps
        run: npm ci
      - name: Inline scripts syntax check
        run: node scripts/check-inline-js.mjs
      - name: Build
        run: npm run build
      - name: Verify anchor aliases injected
        run: node scripts/verify-anchor-aliases-in-dist.mjs
      - name: Anchors contract
        run: npm run test:anchors

docs/HANDOFF-SESSION.md (new file)

@@ -0,0 +1,25 @@
# HANDOFF — Session summary (handover)

## Mission
Make the Gitea Actions CI (Synology) reliable and secure the paragraph anchors:
- versioned oldId -> newId mapping
- build-time injection into dist to preserve deep links

## Root causes identified
1) Unstable DNS in the job containers via the Docker bridge (127.0.0.11) on this infrastructure
2) External GitHub checkout impossible/undesirable + GITEA_* variables sometimes absent
3) engines requires Node >=22 <23 => EBADENGINE with Node 20

## Validated resolution (baseline)
- Runner: container.network = host
- Job: Node 22 image
- Checkout: via workflow/event.json (not actions/checkout)
- Workflow: no apt-get
- Anchors:
  - src/anchors/anchor-aliases.json (per route)
  - scripts/inject-anchor-aliases.mjs injects <span id="oldId"> before the element with id="newId"
  - scripts/check-anchor-aliases.mjs validates the schema in CI
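
The injection step described above can be sketched as follows — a hypothetical, simplified version of what scripts/inject-anchor-aliases.mjs does (the real script may differ; the `para-alias` class name is taken from the grep hint used elsewhere in these docs):

```javascript
// Hypothetical sketch of build-time alias injection (simplified).
// For each oldId -> newId pair, prepend an empty <span> carrying the old id
// to the element that now carries the new id, so historical deep links keep
// resolving without any client-side JS.
function injectAliases(html, aliases) {
  let out = html;
  for (const [oldId, newId] of Object.entries(aliases)) {
    // Match the opening tag of the element whose id attribute is newId.
    const re = new RegExp(`<([a-z0-9]+)([^>]*\\bid="${newId}"[^>]*)>`, "i");
    out = out.replace(re, `<span id="${oldId}" class="para-alias"></span><$1$2>`);
  }
  return out;
}

const page = '<p id="p-8-0e65838d">Paragraph text</p>';
console.log(injectAliases(page, { "p-8-e7075fe3": "p-8-0e65838d" }));
// → <span id="p-8-e7075fe3" class="para-alias"></span><p id="p-8-0e65838d">Paragraph text</p>
```

The ids above are the real old/new pair from commit 5c00593e67; everything else is illustrative.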
## Current state
- CI passes (host net + Node 22 + event.json checkout + no apt)
- Alias injection verified locally in dist/…/index.html

docs/ROADMAP.md (new file)

@@ -0,0 +1,157 @@
# ROADMAP — CI (Gitea Actions on Synology) + anchors (build-time aliases)

Goal: let a successor pick this up without guessing anything.
This document describes:
- the current stable state (baseline)
- the invariants that must not be broken
- the next "core mission" steps (robust primary anchors + durable CI)
- the quick debugging method

---

## 0) Current state (VALIDATED baseline)

### CI (Gitea Actions)
- ✅ Job runs in a Node 22 container (matches `engines`)
- ✅ Checkout **without GitHub actions**, from `workflow/event.json`
- ✅ Zero `apt-get` in the workflow
- ✅ `npm ci` + build + anchors tests + alias schema validation
- ✅ Post-build alias injection confirmed in the logs

### Runner (DS220+)
- ✅ `container.network: host` in the runner's `/data/config.yaml`
- ✅ `NODE_OPTIONS=--dns-result-order=ipv4first` passed to the job containers
- ✅ `--add-host=gitea.archicratie.trans-hands.synology.me:192.168.1.20`

Reason: the Docker bridge DNS (127.0.0.11) is unstable on this infrastructure → EAI_AGAIN / ESERVFAIL (npm, debian).

Reference: `docs/CI-BASELINE.md` + `docs/CI-WORKFLOW.md` + `docs/HANDOFF-SESSION.md`.

---

## 1) Invariants (do NOT "optimize")

These are guardrails. Remove them and the same failures come back.

1) Runner:
- keep `container.network: host` (as long as the bridge DNS infrastructure is not fixed)
- keep `-e NODE_OPTIONS=--dns-result-order=ipv4first`
2) Workflow:
- do not reintroduce `apt-get`
- do not depend on `actions/checkout@...`
- keep a Node 22 container as long as `package.json engines` requires `>=22 <23`
3) Anchors:
- canonical file: `src/anchors/anchor-aliases.json`
- build-time injection: `scripts/inject-anchor-aliases.mjs`
- anchors test: `scripts/check-anchors.mjs`
- alias schema validation: `scripts/check-anchor-aliases.mjs`

---

## 2) Core mission (re-anchoring)

Editorial objective:
- preserve deep links (anchors) despite editing (moves, insertions, corrections)
- avoid "by index" resolution (fragile)
- make anchor migration **deterministic, versioned, tested**

Technical translation:
- when a `newId` replaces an `oldId`, version `oldId -> newId` **per page**
- at build time, inject an invisible DOM alias carrying the old `id` before the target element

---

## 3) Next milestones (recommended order)

### Milestones A — Quality lockdown (short term, "rock solid")
A1) CI: prove the injection (not just "build ok")
- add a test that walks `src/anchors/anchor-aliases.json` and checks in `dist/<route>/index.html`:
  - presence of `<span id="oldId" ...>`
  - presence of the element with `id="newId"`
  - and ideally: the alias placed "just before" the target (proximity)
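
The A1 idea could start from a check like this (a sketch under assumptions: the alias data for one route maps oldId → newId, and a plain substring scan over the static dist HTML is enough; the function name is illustrative, not the real scripts/verify-anchor-aliases-in-dist.mjs):

```javascript
// Sketch: for one route's built HTML, verify that every alias span and its
// target element are present. Assumed shapes; not the actual project script.
function checkRouteAliases(distHtml, routeAliases) {
  const problems = [];
  for (const [oldId, newId] of Object.entries(routeAliases)) {
    if (!distHtml.includes(`id="${oldId}"`)) problems.push(`missing alias span: ${oldId}`);
    if (!distHtml.includes(`id="${newId}"`)) problems.push(`missing target element: ${newId}`);
  }
  return problems;
}

const ok = '<span id="p-8-e7075fe3"></span><p id="p-8-0e65838d">…</p>';
console.log(checkRouteAliases(ok, { "p-8-e7075fe3": "p-8-0e65838d" })); // → []
```

A real version would also check proximity (alias "just before" the target), which a substring scan cannot prove.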
A2) CI: forbid duplicate IDs (SEO/DOM risk)
- in the `dist` pages, detect duplicate `id="..."` attributes
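
As a sketch, the A2 duplicate-id check could look like this (assumption: a regex scan over the static dist HTML is adequate; illustrative, not the project's actual script):

```javascript
// Count every id="..." attribute in an HTML string and report the ids that
// occur more than once (duplicate DOM ids break anchors and are invalid HTML).
function findDuplicateIds(html) {
  const counts = new Map();
  for (const m of html.matchAll(/\bid="([^"]+)"/g)) {
    counts.set(m[1], (counts.get(m[1]) ?? 0) + 1);
  }
  return [...counts.entries()].filter(([, n]) => n > 1).map(([id]) => id);
}

console.log(findDuplicateIds('<p id="a"></p><span id="a"></span><p id="b"></p>'));
// → [ 'a' ]
```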
A3) CI: actionable artifacts / logs
- when a test fails: print the `route`, `oldId`, `newId`, an HTML excerpt, and the line

### Milestones B — Editor ergonomics (mid term)
B1) `apply-ticket.mjs`: strengthen the `--alias` mode
- if a paragraph is replaced: write the alias automatically
- on conflict: clear message "oldId already mapped / newId not found"
B2) `check-anchors.mjs`: alias suggestions
- when it detects "removed X / added Y" with the same `p-8-...` prefix
- generate a proposal, with a `--write-aliases` option (or patch output)

### Milestones C — Long-term robustness (ops)
C1) Runner: reduce the "host network" risk
- isolate the runner on the LAN (dedicated network/firewall)
- restrict labels/queues to the repos that need them
- document how to restore `/data/config.yaml`
C2) Version the decisions
- every CI/runner change: documented in `docs/` + committed (no untracked "magic fix")

---

## 4) Standard procedure (dev -> PR -> merge)

### Add/modify content
1) edit the sources (docx/import, etc.)
2) if paragraph IDs change:
- apply `scripts/apply-ticket.mjs --alias` when possible
- otherwise edit `src/anchors/anchor-aliases.json` (per route)

### Check locally
- `npm test`
- or at minimum:
  - `npm run build`
  - check the injection: `grep -n "para-alias" dist/<route>/index.html`

### PR & merge
- one PR = one logical ticket
- CI must pass
- merge only when anchors + aliases are consistent

---

## 5) Quick debugging (when it breaks)

### CI fails with "DNS / npm"
Typical symptoms:
- `EAI_AGAIN`, `ESERVFAIL`, `Temporary failure resolving`
Actions:
1) check the runner config: `/data/config.yaml` contains `network: host`
2) check the job container: the logs show `network="host"`
3) NAS smoke test:
- `docker run --rm --network host mcr.microsoft.com/devcontainers/javascript-node:22-bookworm bash -lc "npm ping --registry=https://registry.npmjs.org"`

### CI fails with "EBADENGINE"
- Node is not 22 → fix the job image (Node 22)

### CI fails with "MODULE_NOT_FOUND scripts/..."
- file not committed
- `git status --porcelain` then `git add/commit/push`

### Alias injection missing
- check that `postbuild` actually calls `inject-anchor-aliases.mjs`
- check that `src/anchors/anchor-aliases.json` matches the schema (per route)

---

## 6) Definition of "DONE" (when the mission counts as accomplished)

1) CI stable over 30+ consecutive runs (push + PR + merge)
2) Any paragraph change that breaks anchors produces:
- either an automatic alias via tooling
- or an explicit CI failure (with a proposed patch)
3) Injected aliases tested (proof in dist) + no duplicate IDs
4) Documentation up to date (baseline + decisions + procedures)

---

End.


@@ -28,3 +28,10 @@ The test compares, page by page, the list of paragraph IDs present in `

 ## Failure policy (pragmatic)
 The test fails when a page's churn exceeds a threshold (default: 20%) on a "sufficiently large" page.

+## Build-time aliases
+- `src/anchors/anchor-aliases.json`
+- `scripts/inject-anchor-aliases.mjs`
+- `scripts/check-anchor-aliases.mjs`
+- and remember: *an alias = backward compatibility for historical links, with no JS*
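
For reference, the per-route shape these three files revolve around is presumably something like the following (illustrative example only — the authoritative schema is whatever scripts/check-anchor-aliases.mjs validates; the route and id pair below come from the commit log, the nesting is an assumption):

```json
{
  "/archicratie/prologue/": {
    "p-8-e7075fe3": "p-8-0e65838d"
  }
}
```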

package.json

@@ -10,9 +10,10 @@
   "postbuild": "node scripts/inject-anchor-aliases.mjs && npx pagefind --site dist",
   "import": "node scripts/import-docx.mjs",
   "apply:ticket": "node scripts/apply-ticket.mjs",
-  "test": "npm run build && npm run test:anchors && node scripts/check-inline-js.mjs",
+  "test": "npm run test:aliases && npm run build && npm run test:anchors && node scripts/check-inline-js.mjs",
   "test:anchors": "node scripts/check-anchors.mjs",
-  "test:anchors:update": "node scripts/check-anchors.mjs --update"
+  "test:anchors:update": "node scripts/check-anchors.mjs --update",
+  "test:aliases": "node scripts/check-anchor-aliases.mjs"
 },
 "dependencies": {
   "@astrojs/mdx": "^4.3.13",

scripts/apply-ticket.mjs

@@ -4,23 +4,44 @@ import path from "node:path";
import process from "node:process";
import { spawnSync } from "node:child_process";
/**
 * apply-ticket — applies a proposed correction from a Gitea ticket
 *
 * Designed to:
 * - take a [Correction]/[Fact-check] ticket (issue) with Path + Anchor + Proposal
 * - find the right paragraph in the .mdx
 * - replace it cleanly
 * - optionally: write an old->new anchor alias (build-time) into src/anchors/anchor-aliases.json
 * - optionally: commit automatically
 * - optionally: close the ticket (after the commit)
 */
function usage(exitCode = 0) {
  console.log(`
apply-ticket — applies a proposed correction from a Gitea ticket (robust)

Usage:
-  node scripts/apply-ticket.mjs <issue_number> [--dry-run] [--no-build]
+  node scripts/apply-ticket.mjs <issue_number> [--dry-run] [--no-build] [--alias] [--commit] [--close]

Flags:
  --dry-run  : changes nothing, prints BEFORE/AFTER
  --no-build : does not run "npm run build" (INCOMPATIBLE with --alias)
  --alias    : after applying, adds the anchor alias (old -> new) to src/anchors/anchor-aliases.json
  --commit   : git add + git commit automatically (includes the alias if --alias)
  --close    : automatically closes the ticket after the commit (+ comment with the SHA)

Env (recommended):
-  FORGE_API   = API base (LAN), e.g. http://192.168.1.20:3000 (avoids DNS)
-  FORGE_BASE  = web base, e.g. https://gitea.xxx.tld
-  FORGE_TOKEN = PAT (with access to the repo + issues)
+  FORGE_API   = API base (LAN), e.g. http://192.168.1.20:3000
+  FORGE_BASE  = web base, e.g. https://gitea.xxx.tld (fallback if FORGE_API is absent)
+  FORGE_TOKEN = PAT (repo + issues access)
+  GITEA_OWNER = owner (optional if auto-detected from the git remote)
+  GITEA_REPO  = repo (optional if auto-detected from the git remote)

Notes:
- If dist/<path>/index.html is missing, the script runs "npm run build" unless --no-build.
- Automatic backup: <file>.bak.issue-<N> (only when writing)
- With --alias: the script rebuilds to identify the NEW id, then writes the old->new alias.
- Automatically refuses Pull Requests (PRs): they are not editorial tickets.
`);
  process.exit(exitCode);
}
@@ -36,10 +57,40 @@ if (!Number.isFinite(issueNum) || issueNum <= 0) {
const DRY_RUN = argv.includes("--dry-run");
const NO_BUILD = argv.includes("--no-build");
const DO_ALIAS = argv.includes("--alias");
const DO_COMMIT = argv.includes("--commit");
const DO_CLOSE = argv.includes("--close");

if (DO_ALIAS && NO_BUILD) {
  console.error("❌ --alias is incompatible with --no-build (risk of a wrong alias).");
  console.error("➡️ Re-run without --no-build.");
  process.exit(1);
}
if (DRY_RUN && (DO_ALIAS || DO_COMMIT || DO_CLOSE)) {
  console.warn("⚠️ --dry-run: --alias/--commit/--close are ignored (nothing is written).");
}
if (DO_CLOSE && DRY_RUN) {
  console.error("❌ --close is incompatible with --dry-run.");
  process.exit(1);
}
if (DO_CLOSE && !DO_COMMIT) {
  console.error("❌ --close requires --commit (we never close a ticket without a commit).");
  process.exit(1);
}
if (typeof fetch !== "function") {
  console.error("❌ fetch() is unavailable in this Node. Use Node 18+ (or newer).");
  process.exit(1);
}

const CWD = process.cwd();
const CONTENT_ROOT = path.join(CWD, "src", "content");
const DIST_ROOT = path.join(CWD, "dist");
const ALIASES_FILE = path.join(CWD, "src", "anchors", "anchor-aliases.json");

/* -------------------------- text / matching utils -------------------------- */

function normalizeText(s) {
  return String(s ?? "")
@@ -57,11 +108,11 @@ function normalizeText(s) {
// very pragmatic stripping
function stripMd(mdx) {
  let s = String(mdx ?? "");
-  s = s.replace(/`[^`]*`/g, " "); // inline code
+  s = s.replace(/`[^`]*`/g, " ");              // inline code
   s = s.replace(/!\[[^\]]*\]\([^)]+\)/g, " "); // images
-  s = s.replace(/\[[^\]]*\]\([^)]+\)/g, " "); // links
-  s = s.replace(/[*_~]/g, " "); // emphasis-ish
-  s = s.replace(/<[^>]+>/g, " "); // html tags
+  s = s.replace(/\[[^\]]*\]\([^)]+\)/g, " ");  // links
+  s = s.replace(/[*_~]/g, " ");                // emphasis-ish
+  s = s.replace(/<[^>]+>/g, " ");              // html tags
  s = s.replace(/\s+/g, " ").trim();
  return s;
}
@@ -74,13 +125,78 @@ function tokenize(s) {
.filter((w) => w.length >= 4);
}
function scoreText(candidate, targetText) {
const tgt = tokenize(targetText);
const blk = tokenize(candidate);
if (!tgt.length || !blk.length) return 0;
const tgtSet = new Set(tgt);
const blkSet = new Set(blk);
let hit = 0;
for (const w of tgtSet) if (blkSet.has(w)) hit++;
// Bonus si un long préfixe ressemble
const tgtNorm = normalizeText(stripMd(targetText));
const blkNorm = normalizeText(stripMd(candidate));
const prefix = tgtNorm.slice(0, Math.min(180, tgtNorm.length));
const prefixBonus = prefix && blkNorm.includes(prefix) ? 1000 : 0;
// Ratio bonus (0..100)
const ratio = hit / Math.max(1, tgtSet.size);
const ratioBonus = Math.round(ratio * 100);
return prefixBonus + hit + ratioBonus;
}
function bestBlockMatchIndex(blocks, targetText) {
let best = { i: -1, score: -1 };
for (let i = 0; i < blocks.length; i++) {
const sc = scoreText(blocks[i], targetText);
if (sc > best.score) best = { i, score: sc };
}
return best;
}
function splitParagraphBlocks(mdxText) {
const raw = String(mdxText ?? "").replace(/\r\n/g, "\n");
return raw.split(/\n{2,}/);
}
function isLikelyExcerpt(s) {
const t = String(s || "").trim();
if (!t) return true;
if (t.length < 120) return true;
if (/[.…]$/.test(t)) return true;
if (normalizeText(t).includes("tronqu")) return true;
return false;
}
/* ------------------------------ system utilities --------------------------- */
function run(cmd, args, opts = {}) {
const r = spawnSync(cmd, args, { stdio: "inherit", ...opts });
if (r.error) throw r.error;
if (r.status !== 0) throw new Error(`Command failed: ${cmd} ${args.join(" ")}`);
}
function runQuiet(cmd, args, opts = {}) {
const r = spawnSync(cmd, args, { encoding: "utf8", stdio: "pipe", ...opts });
if (r.error) throw r.error;
if (r.status !== 0) {
const out = (r.stdout || "") + (r.stderr || "");
throw new Error(`Command failed: ${cmd} ${args.join(" ")}\n${out}`);
}
return r.stdout || "";
}
async function fileExists(p) {
try {
await fs.access(p);
return true;
} catch {
return false;
}
}
function getEnv(name, fallback = "") {
@@ -96,21 +212,31 @@ function inferOwnerRepoFromGit() {
return { owner: m.groups.owner, repo: m.groups.repo };
}
function gitHasStagedChanges() {
// `git diff --cached --quiet` exits 1 when the index differs from HEAD
const r = spawnSync("git", ["diff", "--cached", "--quiet"]);
return r.status === 1;
}
/* ------------------------------ ticket parsing ----------------------------- */
function escapeRegExp(s) {
return String(s).replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}
function pickLine(body, key) {
const re = new RegExp(`^\\s*${escapeRegExp(key)}\\s*:\\s*([^\\n\\r]+)`, "mi");
const m = String(body || "").match(re);
return m ? m[1].trim() : "";
}
function pickHeadingValue(body, headingKey) {
const re = new RegExp(
`^##\\s*${escapeRegExp(headingKey)}[^\\n]*\\n([\\s\\S]*?)(?=\\n##\\s|\\n\\s*$)`,
"mi"
);
const m = String(body || "").match(re);
if (!m) return "";
const lines = m[1].split(/\r?\n/).map((l) => l.trim());
for (const l of lines) {
if (!l) continue;
if (l.startsWith("<!--")) continue;
@@ -120,18 +246,25 @@ function pickHeadingValue(body, headingKey) {
}
function pickSection(body, markers) {
const text = String(body || "").replace(/\r\n/g, "\n");
const idx = markers
.map((m) => ({ m, i: text.toLowerCase().indexOf(m.toLowerCase()) }))
.filter((x) => x.i >= 0)
.sort((a, b) => a.i - b.i)[0];
if (!idx) return "";
const start = idx.i + idx.m.length;
const tail = text.slice(start);
const stops = [
"\n## ",
"\nJustification",
"\n---",
"\n## Justification",
"\n## Sources",
"\nProblème identifié",
"\nSources proposées",
"\n## Proposition",
"\n## Problème",
];
let end = tail.length;
for (const s of stops) {
@@ -144,83 +277,84 @@ function pickSection(body, markers) {
function unquoteBlock(s) {
return String(s ?? "")
.split(/\r?\n/)
.map((l) => l.replace(/^\s*>\s?/, ""))
.join("\n")
.trim();
}
function normalizeChemin(chemin) {
let c = String(chemin || "").trim();
if (!c) return "";
if (!c.startsWith("/")) c = "/" + c;
if (!c.endsWith("/")) c = c + "/";
return c;
}
function extractAnchorIdAnywhere(text) {
const s = String(text || "");
const m = s.match(/#?(p-\d+-[0-9a-f]{8})/i);
return m ? m[1] : "";
}
function extractCheminFromAnyUrl(text) {
const s = String(text || "");
// Example: http://localhost:4321/archicratie/prologue/#p-3-xxxx
// or: /archicratie/prologue/#p-3-xxxx
const m = s.match(/(\/[a-z0-9\-]+\/[a-z0-9\-\/]+\/)#p-\d+-[0-9a-f]{8}/i);
return m ? m[1] : "";
}
/* ------------------------- reading HTML paragraphs ------------------------- */
function cleanHtmlInner(inner) {
let s = String(inner ?? "");
s = s.replace(
/<span[^>]*class=["'][^"']*para-tools[^"']*["'][^>]*>[\s\S]*?<\/span>/gi,
" "
);
s = s.replace(/<[^>]+>/g, " ");
s = s.replace(/\s+/g, " ").trim();
s = s.replace(/\b(¶|Citer|Proposer|Copié)\b/gi, "").replace(/\s+/g, " ").trim();
return s;
}
async function readHtmlParagraphText(htmlPath, anchorId) {
const html = await fs.readFile(htmlPath, "utf-8");
const re = new RegExp(
`<p[^>]*\\bid=["']${escapeRegExp(anchorId)}["'][^>]*>([\\s\\S]*?)<\\/p>`,
"i"
);
const m = html.match(re);
if (!m) return "";
return cleanHtmlInner(m[1]);
}
async function readAllHtmlParagraphs(htmlPath) {
const html = await fs.readFile(htmlPath, "utf-8");
const out = [];
const re = /<p\b[^>]*\sid=["'](p-\d+-[0-9a-f]{8})["'][^>]*>([\s\S]*?)<\/p>/gi;
let m;
while ((m = re.exec(html))) {
out.push({ id: m[1], text: cleanHtmlInner(m[2]) });
}
return out;
}
/* ------------------------- locating the content file ----------------------- */
async function findContentFileFromChemin(chemin) {
const clean = normalizeChemin(chemin).replace(/^\/+|\/+$/g, "");
const parts = clean.split("/").filter(Boolean);
if (parts.length < 2) return null;
const collection = parts[0];
const slugPath = parts.slice(1).join("/");
const root = path.join(CONTENT_ROOT, collection);
if (!(await fileExists(root))) return null;
const exts = [".mdx", ".md"];
async function walk(dir) {
const entries = await fs.readdir(dir, { withFileTypes: true });
for (const e of entries) {
@@ -240,36 +374,137 @@ async function findContentFileFromChemin(chemin) {
}
return null;
}
return await walk(root);
}
/* -------------------------------- build helper ----------------------------- */
async function ensureBuildIfNeeded(distHtmlPath) {
if (NO_BUILD) return;
if (await fileExists(distHtmlPath)) return;
console.log(" dist manquant pour cette page → build (npm run build) …");
run("npm", ["run", "build"], { cwd: CWD });
if (!(await fileExists(distHtmlPath))) {
throw new Error(`dist toujours introuvable après build: ${distHtmlPath}`);
}
}
/* ----------------------------- API Gitea helpers --------------------------- */
async function fetchIssue({ forgeApiBase, owner, repo, token, issueNum }) {
const url = `${forgeApiBase.replace(/\/+$/, "")}/api/v1/repos/${owner}/${repo}/issues/${issueNum}`;
const res = await fetch(url, {
headers: {
Authorization: `token ${token}`,
Accept: "application/json",
"User-Agent": "archicratie-apply-ticket/2.0",
},
});
if (!res.ok) {
const t = await res.text().catch(() => "");
throw new Error(`HTTP ${res.status} fetching issue: ${url}\n${t}`);
}
return await res.json();
}
async function closeIssue({ forgeApiBase, owner, repo, token, issueNum, comment }) {
const base = forgeApiBase.replace(/\/+$/, "");
const headers = {
Authorization: `token ${token}`,
Accept: "application/json",
"Content-Type": "application/json",
"User-Agent": "archicratie-apply-ticket/2.0",
};
if (comment) {
const urlC = `${base}/api/v1/repos/${owner}/${repo}/issues/${issueNum}/comments`;
await fetch(urlC, { method: "POST", headers, body: JSON.stringify({ body: comment }) });
}
const url = `${base}/api/v1/repos/${owner}/${repo}/issues/${issueNum}`;
const res = await fetch(url, { method: "PATCH", headers, body: JSON.stringify({ state: "closed" }) });
if (!res.ok) {
const t = await res.text().catch(() => "");
throw new Error(`HTTP ${res.status} closing issue: ${url}\n${t}`);
}
}
/* ------------------------------ Aliases helpers ---------------------------- */
async function loadAliases() {
try {
const s = await fs.readFile(ALIASES_FILE, "utf8");
const obj = JSON.parse(s);
return obj && typeof obj === "object" ? obj : {};
} catch {
return {};
}
}
function sortObjectKeys(obj) {
return Object.fromEntries(Object.keys(obj).sort().map((k) => [k, obj[k]]));
}
async function saveAliases(obj) {
let out = obj || {};
for (const k of Object.keys(out)) {
if (out[k] && typeof out[k] === "object") out[k] = sortObjectKeys(out[k]);
}
out = sortObjectKeys(out);
await fs.mkdir(path.dirname(ALIASES_FILE), { recursive: true });
await fs.writeFile(ALIASES_FILE, JSON.stringify(out, null, 2) + "\n", "utf8");
}
async function upsertAlias({ chemin, oldId, newId }) {
const route = normalizeChemin(chemin);
if (!oldId || !newId) throw new Error("Alias: oldId/newId requis");
if (oldId === newId) return { changed: false, reason: "same" };
const data = await loadAliases();
if (!data[route]) data[route] = {};
const prev = data[route][oldId];
if (prev && prev !== newId) {
throw new Error(
`Alias conflict: ${route}#${oldId} already mapped to ${prev} (new=${newId})`
);
}
if (prev === newId) return { changed: false, reason: "already" };
data[route][oldId] = newId;
await saveAliases(data);
return { changed: true, reason: "written" };
}
async function computeNewIdFromDistByContent(distHtmlPath, afterBlock) {
const paras = await readAllHtmlParagraphs(distHtmlPath);
if (!paras.length) throw new Error(`Aucun <p id="p-..."> trouvé dans ${distHtmlPath}`);
let best = { id: null, score: -1 };
const target = stripMd(afterBlock).slice(0, 1200);
for (const p of paras) {
const sc = scoreText(p.text, target);
if (sc > best.score) best = { id: p.id, score: sc };
}
if (!best.id || best.score < 60) {
throw new Error(
`Impossible d'identifier le nouvel id dans dist (score trop faible: ${best.score}).\n` +
`➡️ Vérifie que la proposition correspond bien à UN paragraphe.`
);
}
return best.id;
}
/* ----------------------------------- MAIN ---------------------------------- */
async function main() {
const token = getEnv("FORGE_TOKEN");
if (!token) {
@@ -279,7 +514,7 @@ async function main() {
const inferred = inferOwnerRepoFromGit() || {};
const owner = getEnv("GITEA_OWNER", inferred.owner || "");
const repo = getEnv("GITEA_REPO", inferred.repo || "");
if (!owner || !repo) {
console.error("❌ Impossible de déterminer owner/repo. Fix: export GITEA_OWNER=... GITEA_REPO=...");
process.exit(1);
@@ -294,19 +529,54 @@ async function main() {
console.log(`🔎 Fetch ticket #${issueNum} from ${owner}/${repo}`);
const issue = await fetchIssue({ forgeApiBase, owner, repo, token, issueNum });
// PR guard (a Pull Request = "Demande d'ajout" = not an editorial ticket)
if (issue?.pull_request) {
console.error(`❌ #${issueNum} est une Pull Request (demande d'ajout), pas un ticket éditorial.`);
console.error(`➡️ Ouvre un ticket [Correction]/[Fact-check] depuis le site (Proposer), puis relance apply-ticket sur ce numéro.`);
process.exit(2);
}
const body = String(issue.body || "").replace(/\r\n/g, "\n");
const title = String(issue.title || "");
let chemin =
pickLine(body, "Chemin") ||
pickHeadingValue(body, "Chemin") ||
extractCheminFromAnyUrl(body) ||
extractCheminFromAnyUrl(title);
let ancre =
pickLine(body, "Ancre") ||
pickHeadingValue(body, "Ancre paragraphe") ||
pickHeadingValue(body, "Ancre");
ancre = (ancre || "").trim();
if (ancre.startsWith("#")) ancre = ancre.slice(1);
// Fallback when the ticket is malformed
if (!ancre) ancre = extractAnchorIdAnywhere(title) || extractAnchorIdAnywhere(body);
chemin = normalizeChemin(chemin);
const currentFull = pickSection(body, [
"Texte actuel (copie exacte du paragraphe)",
"## Texte actuel (copie exacte du paragraphe)",
]);
const currentEx = pickSection(body, [
"Texte actuel (extrait)",
"## Assertion / passage à vérifier",
"Assertion / passage à vérifier",
]);
const texteActuel = unquoteBlock(currentFull || currentEx);
const prop1 = pickSection(body, [
"Proposition (texte corrigé complet)",
"## Proposition (texte corrigé complet)",
]);
const prop2 = pickSection(body, [
"Proposition (remplacer par):",
"## Proposition (remplacer par)",
]);
const proposition = (prop1 || prop2).trim();
if (!chemin) throw new Error("Ticket: Chemin introuvable dans le body.");
@@ -319,13 +589,13 @@ async function main() {
if (!contentFile) throw new Error(`Fichier contenu introuvable pour Chemin=${chemin}`);
console.log(`📄 Target content file: ${path.relative(CWD, contentFile)}`);
const distHtmlPath = path.join(DIST_ROOT, chemin.replace(/^\/+|\/+$/g, ""), "index.html");
await ensureBuildIfNeeded(distHtmlPath);
// Target text: prefer the full text from the ticket; fall back to dist when it looks like an excerpt
let targetText = texteActuel;
let distText = "";
if (await fileExists(distHtmlPath)) {
distText = await readHtmlParagraphText(distHtmlPath, ancre);
}
@@ -344,14 +614,13 @@ async function main() {
const best = bestBlockMatchIndex(blocks, targetText);
// safety threshold
if (best.i < 0 || best.score < 40) {
console.error("❌ Match trop faible: je refuse de remplacer automatiquement.");
console.error(`➡️ Score=${best.score}. Recommandation: ticket avec 'Texte actuel (copie exacte du paragraphe)'.`);
// debug: top 5
const ranked = blocks
.map((b, i) => ({ i, score: scoreText(b, targetText), excerpt: stripMd(b).slice(0, 140) }))
.sort((a, b) => b.score - a.score)
.slice(0, 5);
@@ -388,10 +657,74 @@ async function main() {
}
await fs.writeFile(contentFile, updated, "utf-8");
console.log("✅ Applied.");
let aliasChanged = false;
let newId = null;
if (DO_ALIAS) {
console.log("🔁 Rebuild to compute new anchor ids (npm run build) …");
run("npm", ["run", "build"], { cwd: CWD });
if (!(await fileExists(distHtmlPath))) {
throw new Error(`dist introuvable après build: ${distHtmlPath}`);
}
newId = await computeNewIdFromDistByContent(distHtmlPath, afterBlock);
const res = await upsertAlias({ chemin, oldId: ancre, newId });
aliasChanged = res.changed;
if (aliasChanged) {
console.log(`✅ Alias ajouté: ${chemin} ${ancre} -> ${newId}`);
// Update dist without a full rebuild (inject only)
run("node", ["scripts/inject-anchor-aliases.mjs"], { cwd: CWD });
} else {
console.log(` Alias déjà présent ou inutile (${ancre} -> ${newId}).`);
}
// quick safety checks
run("npm", ["run", "test:anchors"], { cwd: CWD });
run("node", ["scripts/check-inline-js.mjs"], { cwd: CWD });
}
if (DO_COMMIT) {
const files = [path.relative(CWD, contentFile)];
if (DO_ALIAS && aliasChanged) files.push(path.relative(CWD, ALIASES_FILE));
run("git", ["add", ...files], { cwd: CWD });
if (!gitHasStagedChanges()) {
console.log(" Nothing to commit (aucun changement staged).");
return;
}
const msg = `edit: apply ticket #${issueNum} (${chemin}#${ancre})`;
run("git", ["commit", "-m", msg], { cwd: CWD });
const sha = runQuiet("git", ["rev-parse", "--short", "HEAD"], { cwd: CWD }).trim();
console.log(`✅ Committed: ${msg} (${sha})`);
if (DO_CLOSE) {
const comment = `✅ Appliqué par apply-ticket.\nCommit: ${sha}`;
await closeIssue({ forgeApiBase, owner, repo, token, issueNum, comment });
console.log(`✅ Ticket #${issueNum} fermé.`);
}
return;
}
// manual mode
console.log("Next (manuel) :");
console.log(` git diff -- ${path.relative(CWD, contentFile)}`);
console.log(
` git add ${path.relative(CWD, contentFile)}${
DO_ALIAS ? " src/anchors/anchor-aliases.json" : ""
}`
);
console.log(` git commit -m "edit: apply ticket #${issueNum} (${chemin}#${ancre})"`);
if (DO_CLOSE) {
console.log(" (puis relance avec --commit --close pour fermer automatiquement)");
}
}
main().catch((e) => {


@@ -0,0 +1,63 @@
import fs from "node:fs";
import path from "node:path";
const ALIASES_PATH = path.join(process.cwd(), "src", "anchors", "anchor-aliases.json");
if (!fs.existsSync(ALIASES_PATH)) {
console.log(" Aucun fichier d'aliases (src/anchors/anchor-aliases.json). Skip.");
process.exit(0);
}
let data;
try {
data = JSON.parse(fs.readFileSync(ALIASES_PATH, "utf8"));
} catch (e) {
console.error("❌ JSON invalide dans src/anchors/anchor-aliases.json");
console.error(e?.message || e);
process.exit(1);
}
if (!data || typeof data !== "object" || Array.isArray(data)) {
console.error("❌ Le JSON doit être un objet { route: { oldId: newId } }");
process.exit(1);
}
let pages = 0;
let aliases = 0;
for (const [route, mapping] of Object.entries(data)) {
pages++;
if (typeof route !== "string" || !route.trim()) {
console.error("❌ Route invalide (clé): doit être une string non vide", { route });
process.exit(1);
}
// Optional but healthy: routes shaped like "/xxx/yyy/"
if (!route.startsWith("/") || !route.endsWith("/")) {
console.error("❌ Route invalide: doit commencer et finir par '/'", { route });
process.exit(1);
}
if (!mapping || typeof mapping !== "object" || Array.isArray(mapping)) {
console.error("❌ Mapping invalide: doit être un objet { oldId: newId }", { route });
process.exit(1);
}
for (const [oldId, newId] of Object.entries(mapping)) {
if (typeof oldId !== "string" || typeof newId !== "string") {
console.error("❌ oldId/newId doivent être des strings", { route, oldId, newId });
process.exit(1);
}
if (!oldId.trim() || !newId.trim()) {
console.error("❌ oldId/newId ne doivent pas être vides", { route, oldId, newId });
process.exit(1);
}
if (oldId === newId) {
console.error("❌ oldId doit différer de newId", { route, oldId });
process.exit(1);
}
aliases++;
}
}
console.log(`✅ anchor-aliases.json OK: pages=${pages} aliases=${aliases}`);


@@ -0,0 +1,228 @@
import fs from "node:fs/promises";
import path from "node:path";
function escapeRegExp(s) {
return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}
function routeToHtmlPath(distDir, route) {
if (typeof route !== "string") throw new Error(`Route must be a string, got ${typeof route}`);
// Normalise: route must be like "/a/b/" or "/"
let r = route.trim();
if (!r.startsWith("/")) r = "/" + r;
if (r !== "/" && !r.endsWith("/")) r = r + "/";
const segments = r.split("/").filter(Boolean); // removes empty
if (segments.length === 0) return path.join(distDir, "index.html");
return path.join(distDir, ...segments, "index.html");
}
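routeToHtmlPath normalises a route to its built page under dist. A quick sketch of the expected mappings (the function is restated here so the example runs stand-alone; output shown with POSIX separators):

```javascript
import path from "node:path";

// Same normalisation as the verifier: force leading/trailing "/" then join segments.
function routeToHtmlPath(distDir, route) {
  if (typeof route !== "string") throw new Error(`Route must be a string, got ${typeof route}`);
  let r = route.trim();
  if (!r.startsWith("/")) r = "/" + r;
  if (r !== "/" && !r.endsWith("/")) r = r + "/";
  const segments = r.split("/").filter(Boolean); // removes empty segments
  if (segments.length === 0) return path.join(distDir, "index.html");
  return path.join(distDir, ...segments, "index.html");
}

console.log(routeToHtmlPath("dist", "/")); // dist/index.html
console.log(routeToHtmlPath("dist", "archicratie/prologue")); // dist/archicratie/prologue/index.html
```

Note that a missing leading or trailing slash is tolerated on input, so a sloppy route in anchor-aliases.json still resolves to the same index.html.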
function countIdAttr(html, id) {
const re = new RegExp(`\\bid="${escapeRegExp(id)}"`, "g");
let c = 0;
while (re.exec(html)) c++;
return c;
}
function snippetAround(html, idx, beforeLines = 2, afterLines = 4) {
const lines = html.split("\n");
// compute line number
const upto = html.slice(0, Math.max(0, idx));
const lineNo = upto.split("\n").length; // 1-based
const start = Math.max(1, lineNo - beforeLines);
const end = Math.min(lines.length, lineNo + afterLines);
const out = [];
for (let i = start; i <= end; i++) {
out.push(`${String(i).padStart(5, " ")}| ${lines[i - 1]}`);
}
return out.join("\n");
}
function parseArgs(argv) {
const args = {
dist: "dist",
aliases: path.join("src", "anchors", "anchor-aliases.json"),
strict: true,
};
for (let i = 2; i < argv.length; i++) {
const a = argv[i];
if (a === "--dist" && argv[i + 1]) args.dist = argv[++i];
else if (a === "--aliases" && argv[i + 1]) args.aliases = argv[++i];
else if (a === "--non-strict") args.strict = false;
else if (a === "-h" || a === "--help") {
console.log(`Usage:
node scripts/verify-anchor-aliases-in-dist.mjs [--dist dist] [--aliases src/anchors/anchor-aliases.json] [--non-strict]
Checks that every (route, oldId->newId) alias is injected into the built HTML in dist.`);
process.exit(0);
} else {
console.error("Unknown arg:", a);
process.exit(2);
}
}
return args;
}
const { dist, aliases, strict } = parseArgs(process.argv);
const CWD = process.cwd();
const distDir = path.isAbsolute(dist) ? dist : path.join(CWD, dist);
const aliasesPath = path.isAbsolute(aliases) ? aliases : path.join(CWD, aliases);
let data;
try {
data = JSON.parse(await fs.readFile(aliasesPath, "utf8"));
} catch (e) {
console.error(`❌ Cannot read/parse aliases JSON: ${aliasesPath}`);
console.error(e?.message || e);
process.exit(1);
}
if (!data || typeof data !== "object" || Array.isArray(data)) {
console.error("❌ anchor-aliases.json must be an object of { route: { oldId: newId } }");
process.exit(1);
}
let pages = 0;
let aliasesCount = 0;
let checked = 0;
const failures = [];
for (const [route, mapping] of Object.entries(data)) {
pages++;
if (!mapping || typeof mapping !== "object" || Array.isArray(mapping)) {
failures.push({ route, msg: "Mapping must be an object oldId->newId." });
continue;
}
const htmlPath = routeToHtmlPath(distDir, route);
let html;
try {
html = await fs.readFile(htmlPath, "utf8");
} catch (e) {
failures.push({
route,
msg: `Missing built page: ${htmlPath}. Did you run 'npm run build'?`,
});
continue;
}
for (const [oldId, newId] of Object.entries(mapping)) {
aliasesCount++;
checked++;
if (typeof oldId !== "string" || typeof newId !== "string") {
failures.push({ route, oldId, newId, htmlPath, msg: "oldId/newId must be strings." });
continue;
}
const oldCount = countIdAttr(html, oldId);
const newCount = countIdAttr(html, newId);
if (oldCount === 0) {
failures.push({
route,
oldId,
newId,
htmlPath,
msg: `oldId not found in HTML (expected injected alias span).`,
});
continue;
}
if (newCount === 0) {
failures.push({
route,
oldId,
newId,
htmlPath,
msg: `newId not found in HTML (target missing).`,
});
continue;
}
// Strictness: ensure uniqueness
if (strict && oldCount !== 1) {
failures.push({
route,
oldId,
newId,
htmlPath,
msg: `oldId occurs ${oldCount} times (expected exactly 1).`,
});
continue;
}
if (strict && newCount !== 1) {
failures.push({
route,
oldId,
newId,
htmlPath,
msg: `newId occurs ${newCount} times (expected exactly 1).`,
});
continue;
}
// Require para-alias class on the injected span (contract)
const reAliasSpan = new RegExp(
`<span[^>]*\\bid="${escapeRegExp(oldId)}"[^>]*\\bclass="[^"]*\\bpara-alias\\b[^"]*"[^>]*>\\s*<\\/span>`,
"i"
);
if (!reAliasSpan.test(html)) {
failures.push({
route,
oldId,
newId,
htmlPath,
msg: `Injected alias span exists but does not match expected contract (missing class="...para-alias...").`,
});
continue;
}
// Adjacency: alias span immediately before the element carrying newId
const reAdjacent = new RegExp(
`<span[^>]*\\bid="${escapeRegExp(oldId)}"[^>]*\\bclass="[^"]*\\bpara-alias\\b[^"]*"[^>]*>\\s*<\\/span>\\s*<[^>]*\\bid="${escapeRegExp(
newId
)}"`,
"is"
);
if (!reAdjacent.test(html)) {
const oldIdx = html.indexOf(`id="${oldId}"`);
const newIdx = html.indexOf(`id="${newId}"`);
failures.push({
route,
oldId,
newId,
htmlPath,
msg:
`oldId & newId are present, but alias is NOT immediately before target.\n` +
`--- Context around oldId (line approx) ---\n${snippetAround(html, oldIdx)}\n\n` +
`--- Context around newId (line approx) ---\n${snippetAround(html, newIdx)}\n`,
});
continue;
}
}
}
if (failures.length) {
console.error(`❌ Alias injection verification FAILED.`);
console.error(`Checked: pages=${pages}, aliases=${aliasesCount}, verified_pairs=${checked}, strict=${strict}`);
console.error("");
for (const f of failures) {
console.error("------------------------------------------------------------");
console.error(`Route: ${f.route}`);
if (f.htmlPath) console.error(`HTML: ${f.htmlPath}`);
if (f.oldId) console.error(`oldId: ${f.oldId}`);
if (f.newId) console.error(`newId: ${f.newId}`);
console.error(`Reason: ${f.msg}`);
}
process.exit(1);
}
console.log(`✅ verify-anchor-aliases-in-dist OK: pages=${pages} aliases=${aliasesCount} strict=${strict}`);
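The adjacency contract enforced above can be exercised in isolation: an empty span carrying the old id and the para-alias class must sit immediately before the element that now owns the paragraph id. A sketch using one of the alias pairs from anchor-aliases.json, with invented paragraph text:

```javascript
// Hypothetical fragment honouring the injected-alias contract.
const html =
  '<span id="p-3-76df8102" class="para-alias"></span>' +
  '<p id="p-3-539ac0fd">Texte du paragraphe</p>';

const escapeRegExp = (s) => s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
const oldId = "p-3-76df8102";
const newId = "p-3-539ac0fd";

// Same shape as the verifier's reAdjacent: alias span, optional whitespace, target element.
const reAdjacent = new RegExp(
  `<span[^>]*\\bid="${escapeRegExp(oldId)}"[^>]*\\bclass="[^"]*\\bpara-alias\\b[^"]*"[^>]*>\\s*<\\/span>\\s*<[^>]*\\bid="${escapeRegExp(newId)}"`,
  "is"
);

console.log(reAdjacent.test(html)); // true

// Any element between the span and the target breaks adjacency.
const broken = html.replace("</span>", "</span><hr>");
console.log(reAdjacent.test(broken)); // false
```

Only whitespace may separate the alias span from the target, which is what lets the `:target` styling and scroll behaviour of the old anchor land on the right paragraph.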

BIN
sources/docx/.DS_Store vendored Normal file

Binary file not shown.

BIN
sources/docx/archicrat-ia/.DS_Store vendored Normal file

Binary file not shown.

BIN
sources/docx/lexique/.DS_Store vendored Normal file

Binary file not shown.

172
sources/manifest.yml Normal file

@@ -0,0 +1,172 @@
version: 1
docs:
# =========================
# Archicratie — essay-thesis "ArchiCraT-IA"
# =========================
- source: sources/docx/archicrat-ia/Prologue—Archicratie-fondation_et_finalite_sociopolitique_et_historique-version_officielle.docx
collection: archicratie
slug: archicrat-ia/prologue
title: "Prologue — Fondation et finalité sociopolitique et historique"
order: 10
- source: sources/docx/archicrat-ia/Chapitre_1—Fondements_epistemologiques_et_modelisation_Archicratie-version_officielle.docx
collection: archicratie
slug: archicrat-ia/chapitre-1
title: "Chapitre 1 — Fondements épistémologiques et modélisation"
order: 20
- source: sources/docx/archicrat-ia/Chapitre_2Archeogenese_des_regimes_de_co-viabilite-version_officielle.docx
collection: archicratie
slug: archicrat-ia/chapitre-2
title: "Chapitre 2 — Archéogenèse des régimes de co-viabilité"
order: 30
- source: sources/docx/archicrat-ia/Chapitre_3—Philosophies_du_pouvoir_et_Archicration-pour_une_topologie_differenciee_des_regimes_regulateurs-version_officielle.docx
collection: archicratie
slug: archicrat-ia/chapitre-3
title: "Chapitre 3 — Philosophies du pouvoir et archicration"
order: 40
- source: sources/docx/archicrat-ia/Chapitre_4—Vers_une_histoire_archicratique_des_revolutions_industrielles-version_officielle.docx
collection: archicratie
slug: archicrat-ia/chapitre-4
title: "Chapitre 4 — Histoire archicratique des révolutions industrielles"
order: 50
- source: sources/docx/archicrat-ia/Chapitre_5—Problematiques_des_tensions_des_co-viabilites_et_des_regulations_archicratiques-version_officielle.docx
collection: archicratie
slug: archicrat-ia/chapitre-5
title: "Chapitre 5 — Tensions, co-viabilités et régulations"
order: 60
- source: sources/docx/archicrat-ia/Conclusion-Archicrat-IA-version_officielle.docx
collection: archicratie
slug: archicrat-ia/conclusion
title: "Conclusion — ArchiCraT-IA"
order: 70
# =========================
# IA — practical case study (1 page = 1 chapter)
# NOTE: the monolithic "Cas_IA-... .docx" is NOT included in the manifest.
# =========================
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Introduction_generale—Mettre_en_scene_un_systeme_IA.docx
collection: ia
slug: cas-pratique/introduction
title: "Cas pratique — Introduction générale : Mettre en scène un système IA"
order: 110
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_I—Epreuve_de_detectabilite.docx
collection: ia
slug: cas-pratique/chapitre-1
title: "Cas pratique — Chapitre I : Épreuve de détectabilité"
order: 120
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_II—Epreuve_topologique.docx
collection: ia
slug: cas-pratique/chapitre-2
title: "Cas pratique — Chapitre II : Épreuve topologique"
order: 130
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_III—Epreuve_archeogenetique.docx
collection: ia
slug: cas-pratique/chapitre-3
title: "Cas pratique — Chapitre III : Épreuve archéogénétique"
order: 140
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_IV—Epreuve_morphologique.docx
collection: ia
slug: cas-pratique/chapitre-4
title: "Cas pratique — Chapitre IV : Épreuve morphologique"
order: 150
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_V—Epreuve_historique.docx
collection: ia
slug: cas-pratique/chapitre-5
title: "Cas pratique — Chapitre V : Épreuve historique"
order: 160
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_VI—Epreuve_de_co-viabilite.docx
collection: ia
slug: cas-pratique/chapitre-6
title: "Cas pratique — Chapitre VI : Épreuve de co-viabilité"
order: 170
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Chapitre_VII—Gestes_archicratiques_concrets_pour_un_systeme_IA.docx
collection: ia
slug: cas-pratique/chapitre-7
title: "Cas pratique — Chapitre VII : Gestes archicratiques concrets"
order: 180
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Conclusion.docx
collection: ia
slug: cas-pratique/conclusion
title: "Cas pratique — Conclusion"
order: 190
- source: sources/docx/cas-ia/Cas_IA-Archicratie_et_gouvernance_des_systemes_IA-Annexe—Glossaire_archicratique_pour_audit_des_systemes_IA.docx
collection: ia
slug: cas-pratique/annexe-glossaire-audit
title: "Cas pratique — Annexe : Glossaire archicratique pour audit des systèmes IA"
order: 195
# =========================
# Traité — Ontodynamique générative (1 page = 1 chapter)
# NOTE: the monolithic "Traite-...-version_officielle.docx" is NOT included in the manifest.
# =========================
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Introduction-version_officielle.docx
collection: traite
slug: ontodynamique/introduction
title: "Traité — Introduction"
order: 210
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Chapitre_1—Le_flux_ontogenetique-version_officielle.docx
collection: traite
slug: ontodynamique/chapitre-1
title: "Traité — Chapitre 1 : Le flux ontogénétique"
order: 220
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Chapitre_2—economie_du_reel-version_officielle.docx
collection: traite
slug: ontodynamique/chapitre-2
title: "Traité — Chapitre 2 : Économie du réel"
order: 230
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Chapitre_3—Le_reel_comme_systeme_regulateur-version_officielle.docx
collection: traite
slug: ontodynamique/chapitre-3
title: "Traité — Chapitre 3 : Le réel comme système régulateur"
order: 240
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Chapitre_4—Arcalite-structures_formes_invariants-version_officielle.docx
collection: traite
slug: ontodynamique/chapitre-4
title: "Traité — Chapitre 4 : Arcalité — structures, formes, invariants"
order: 250
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Chapitre_5-Cratialite-forces_flux_gradients-version_officielle.docx
collection: traite
slug: ontodynamique/chapitre-5
title: "Traité — Chapitre 5 : Cratialité — forces, flux, gradients"
order: 260
- source: sources/docx/traite/Traite-Ontodynamique_Generative-Fondements_Archicratie-Chapitre_6—Archicration-version_officielle.docx
collection: traite
slug: ontodynamique/chapitre-6
title: "Traité — Chapitre 6 : Archicration"
order: 270
# =========================
# Glossaire / Lexique
# =========================
- source: sources/docx/lexique/Lexique_general_archicratique.docx
collection: glossaire
slug: lexique-general
title: "Lexique général archicratique"
order: 900
- source: sources/docx/lexique/MINI-GLOSSAIRE_DES_VERBES_DE_LA_SCENE_ARCHICRATIQUE.docx
collection: glossaire
slug: mini-glossaire-verbes
title: "Mini-glossaire des verbes de la scène archicratique"
order: 910

sources/pdf/.gitkeep (new empty file)


@@ -1,6 +1,8 @@
 {
   "/archicratie/prologue/": {
-    "p-8-e7075fe3": "p-8-0e65838d"
+    "p-8-e7075fe3": "p-8-0e65838d",
+    "p-3-76df8102": "p-3-539ac0fd",
+    "p-5-85126fa5": "p-5-285d27a7"
   }
 }
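The alias map diffed above pairs each legacy anchor id with its canonical replacement, and one of the commits in this range ("verify all anchor aliases are injected in dist") checks the built output against it. A minimal sketch of such a check, assuming the map shape shown here (the function name and the `id="…"` string test are illustrative, not the repo's actual implementation):

```javascript
// Hypothetical checker (names illustrative): report alias pairs whose ids
// are missing from the built HTML of the corresponding page.
function missingAliases(aliasMap, htmlByPage) {
  const missing = [];
  for (const [page, aliases] of Object.entries(aliasMap)) {
    const html = htmlByPage[page] ?? "";
    for (const [legacyId, canonicalId] of Object.entries(aliases)) {
      // A dist page should expose both the legacy anchor and its canonical target.
      if (!html.includes(`id="${legacyId}"`) || !html.includes(`id="${canonicalId}"`)) {
        missing.push(`${page}: ${legacyId} -> ${canonicalId}`);
      }
    }
  }
  return missing;
}
```

A CI step could fail the build whenever this list is non-empty.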


@@ -1,5 +1,5 @@
 ---
-title: "Prologue — Fondation et finalité socio-politique et historique"
+title: "Prologue — (ancien emplacement)"
 edition: "archicratie"
 status: "modele_sociopolitique"
 level: 1
@@ -7,7 +7,14 @@ version: "0.1.0"
 concepts: []
 links: []
 order: 0
-summary: ""
+summary: "⚠️ Ancien emplacement — le Prologue a été déplacé."
+deprecated: true
+canonical: "/archicratie/archicrat-ia/prologue/"
 ---
+---
+⚠️ **Le Prologue a été déplacé.**
+➡️ Consulte la version canon ici : **/archicratie/archicrat-ia/prologue/**
+---
 # **Réguler sans dominer : brèche archicratique dans la pensée du pouvoir**
@@ -17,12 +24,11 @@ C'est cette perte de prise sur le réel que ce livre souhaite prendre au sérieux…
 Cette tenue du monde n'équivaut ni à la paix civile, ni à la stabilité des institutions, ni à l'ordre établi. C'est une difficulté conceptuelle que d'envisager *la possibilité pour un ordre de durer sans s'effondrer*, alors même qu'il est traversé en permanence par des forces et des légitimités qui le travaillent, l'éprouvent, le modifient, l'usent, le contestent, le prolongent ou le sapent. Cette possibilité de tenir le monde commun, nous la nommons *co-viabilité*.
-Le terme n'est pas trivial. Il ne s'agit pas simplement d'une viabilité partagée, ni d'une coexistence pacifique, ni même d'une durabilité écologique élargie. Il s'agit d'un état dynamique, instable, fragile, dans lequel un ensemble — une société, d'un système biologique, d'une formation historique, d'un milieu technique ou d'un monde institué — parvient à maintenir une *existence viable*, *malgré et grâce à ses tensions constitutives*.
+Le terme n'est pas trivial. Il ne s'agit pas simplement d'une viabilité partagée, ni d'une coexistence pacifique, ni même d'une durabilité écologique élargie. Il s'agit d'un état dynamique, instable, fragile, dans lequel un ensemble — une société, un système biologique, une formation historique, un milieu technique ou un monde institué — parvient à maintenir une existence viable, malgré et grâce à ses tensions constitutives.
 La *co-viabilité* ne désigne ni un état d'équilibre, ni une finalité normative. Elle nomme un état dynamique et instable, dans lequel un monde — société, milieu technique, formation historique — tient non pas par homogénéité ou harmonie, mais parce qu'il parvient à réguler ce qui le menace sans se détruire lui-même. Il compose entre des éléments hétérogènes — forces d'inertie et d'innovation, attachements profonds et ruptures nécessaires — sans chercher à les unifier. C'est cette disposition active, faite de compromis fragiles et d'ajustements toujours révisables, que nous tenons pour première, et non dérivée.
-Ce qui revient à dire que la question politique — au sens fort — n'a peut-être jamais été qui commande ? Mais bien plus : *Comment un ordre tient-il malgré ce qui le défait ?* *Quels sont les dispositifs qui permettent à une société de ne pas se désagréger sous l'effet de ses propres contradictions ?* *Comment sont régulées les tensions qui traversent le tissu du monde commun sans le déchirer ?*\
-Cette bascule de perspective prolonge des intuitions anciennes. Max Weber (*Économie et société*, 1922) rappelait que ce qui fait tenir un ordre, ce n'est pas seulement la force ou la loi, mais les « chances de validité » socialement reconnues. Norbert Elias (*La dynamique de l'Occident*, 1939/1975) montrait, quant à lui, que les sociétés se maintiennent par des équilibres toujours précaires entre interdépendances, rivalités et pacifications. Notre démarche s'inscrit dans ce sillage : travailler cette interrogation sur les *conditions de viabilité d'un monde commun*.
+Ce qui revient à dire que la question politique — au sens fort — n'a peut-être jamais été qui commande ? Mais bien plus : Comment un ordre tient-il malgré ce qui le défait ? Quels sont les dispositifs qui permettent à une société de ne pas se désagréger sous l'effet de ses propres contradictions ? Comment sont régulées les tensions qui traversent le tissu du monde commun sans le déchirer ? Cette bascule de perspective prolonge des intuitions anciennes. Max Weber (Économie et société, 1922) rappelait que ce qui fait tenir un ordre, ce n'est pas seulement la force ou la loi, mais les « chances de validité » socialement reconnues. Norbert Elias (La dynamique de l'Occident, 1939/1974) montrait, quant à lui, que les sociétés se maintiennent par des équilibres toujours précaires entre interdépendances, rivalités et pacifications. Notre démarche s'inscrit dans ce sillage : travailler cette interrogation sur les conditions de viabilité d'un monde commun.
 Ce changement de perspective implique une rupture profonde dans la manière même de poser la question politique. Pendant des siècles, les sociétés ont pensé le politique à partir de principes transcendants — Dieu, Nature, Volonté générale, Pacte social. Ces principes, supposés extérieurs aux conflits du présent, garantissaient l'ordre en surplomb. Comme le rappelle Michel Foucault, il n'y a pas de principe extérieur au jeu des forces : seulement des rapports de pouvoir situés, modulés, réversibles. C'est précisément cette exigence — trouver dans les relations elles-mêmes les ressources nécessaires pour maintenir des mondes vivables — qui définit notre époque.


@@ -2,7 +2,8 @@
 import SiteLayout from "../../layouts/SiteLayout.astro";
 import { getCollection } from "astro:content";
-const entries = await getCollection("archicratie");
+const entries = (await getCollection('archicratie'))
+  .filter((e) => e.slug !== "prologue");
 entries.sort((a, b) => (a.data.order ?? 9999) - (b.data.order ?? 9999));
 ---
 <SiteLayout title="Essai-thèse — Archicratie">


@@ -7,9 +7,9 @@
   "p-0-d7974f88",
   "p-1-2ef25f29",
   "p-2-edb49e0a",
-  "p-3-76df8102",
+  "p-3-539ac0fd",
   "p-4-8ed4f807",
-  "p-5-85126fa5",
+  "p-5-285d27a7",
   "p-6-3515039d",
   "p-7-64a0ca9c",
   "p-8-0e65838d",
@@ -142,6 +142,8 @@
   "p-135-c19330ce",
   "p-136-17f1cf51",
   "p-137-d8f1539e",
+  "p-3-76df8102",
+  "p-5-85126fa5",
   "p-8-e7075fe3"
 ],
 "atlas/00-demarrage/index.html": [