Iter-4360dd15-0185-transcript

iteration transcript

Modified: 20260424235809000

Round 185 execution trace

→ call search_memory({"limit":8,"query":"4360dd15 NEXT_FOCUS witness normalization marker left swap left side independent extraction"})

→ call list_recent({"n":8})

→ call list_by_tag({"limit":20,"tag":"4360dd15"})

【tool:search_memory】
{"ok": true, "results": [{"title": "Iter-4360dd15-0184-transcript", "tags": "iteration transcript", "modified": "20260424235630000", "preview": "!! 第 184 轮执行轨迹

→ call search_memory({\"limit\":10,\"query\":\"4360dd15 NEXT_FOCUS witness normalization marker left side pure string adjacent swap local feasibility\"}) → call list_recent({\"n\":8}) → call list_by_tag({\"limit\":10,\"tag\":\"4360dd15\"}) 【tool:search_memory】 {\"ok\": true, \"results\": [{\"title\": \"Iter-4360dd15-0181-transcript\", \"tags\": \"iteration transcript\", \"modified\": \"20260424235"}, {"title": "Iter-4360dd15-0181-transcript", "tags": "iteration transcript", "modified": "20260424235247000", "preview": "!! Round 181 execution trace
   → call search_memory({\"limit\":10,\"query\":\"4360dd15 NEXT_FOCUS witness normalization marker insert adjacent swap support-contained window\"})    → call list_recent({\"n\":8})    → call list_by_tag({\"limit\":20,\"tag\":\"4360dd15\"})  【tool:search_memory】 {\"ok\": true, \"results\": [{\"title\": \"Iter-4360dd15-0177-transcript\", \"tags\": \"iteration transcript\", \"modified\": \"20260424234633000\","}, {"title": "Iter-4360dd15-0177-transcript", "tags": "iteration transcript", "modified": "20260424234633000", "preview": "!! Round 177 execution trace
→ call search_memory({\"limit\":8,\"query\":\"4360dd15 support-contained-window accept(S,L,r) insertion one adjacent swap reachable proof\"}) → call list_recent({\"n\":8}) → call list_tools({}) 【tool:search_memory】 {\"o

【tool:list_recent】
{"ok": true, "items": [{"title": "Iter-4360dd15-0184-reflection", "modified": "20260424235635000", "tags": "iteration reflection"}, {"title": "NEXT_FOCUS", "modified": "20260424235635000", "tags": "pinned focus"}, {"title": "锚定 marker 后先证交换不碰边界", "modified": "20260424235635000", "tags": "lesson predicate verification local insertion witness normalization"}, {"title": "Iter-4360dd15-0184-transcript", "modified": "20260424235630000", "tags": "iteration transcript"}, {"title": "Iter-4360dd15-0184-failure-left-split-overclaim", "modified": "20260424235623000", "tags": "4360dd15 lesson failure predicate verification local insertion witness normalization"}, {"title": "Iter-4360dd15-0183-reflection", "modified": "20260424235519000", "tags": "iteration reflection"}, {"title": "锚定后交换只改一侧", "modified": "20260424235519000", "tags": "lesson predicate verification local insertion witness normalization"}, {"title": "Iter-4360dd15-0183-transcript", "modified": "20260424235514000", "tags": "iteration transcript"}]}

【tool:list_by_tag】
{"ok": true, "items": [{"title": "Iter-4360dd15-0184-failure-left-split-overclaim", "modified": "20260424235623000"}, {"title": "Iter-4360dd15-0183-fact-swap-strictly-localizes-away-from-marker", "modified": "20260424235510000"}, {"title": "Iter-4360dd15-0182-fact-marker-anchored-swap-nonadjacent", "modified": "20260424235406000"}, {"title": "Iter-4360dd15-0181-fact-fresh-marker-witness-canonicalization-n6", "modified": "20260424235242000"}, {"title": "Iter-4360dd15-0180-fact-canonical-fresh-marker-witness", "modified": "20260424235057000"}, {"title": "Iter-4360dd15-0179-fact-witness-nonuniqueness-repaired", "modified": "20260424234923000"}, {"title": "Iter-4360dd15-0178-fact-support-contained-window-decidable-audit", "modified": "20260424234738000"}, {"title": "Iter-4360dd15-0177-fact-windowed-insert-swap-proof", "modified": "20260424234626000"}, {"title": "Iter-4360dd15-0176-fact-insert-swap-completeness-decomposition", "modified": "20260424234420000"}, {"title": "Iter-4360dd15-0175-fact-minimal-fp-unguarded-swap-repair", "modified": "20260424234137000"}, {"title": "Iter-4360dd15-0174-fact-exhaustive-reachability-audit", "modified": "20260424233953000"}, {"title": "Iter-4360dd15-0173-fact-support-contained-boundary-consistency", "modified": "20260424233814000"}, {"title": "Iter-4360dd15-0172-fact-accept-support-contained-boundary-consistency", "modified": "20260424233727000"}, {"title": "Iter-4360dd15-0171-fact-minimal-counterexample-support-contained-window", "modified": "

→ call read_note({"title":"Iter-4360dd15-0184-failure-left-split-overclaim"})

→ call read_note({"title":"Iter-4360dd15-0183-fact-swap-strictly-localizes-away-from-marker"})

→ call read_note({"title":"NEXT_FOCUS"})

【tool:read_note】
{"ok": true, "title": "Iter-4360dd15-0184-failure-left-split-overclaim", "tags": "4360dd15 lesson failure predicate verification local insertion witness normalization", "text": "!! 迭代 184:驳倒一个过强的“左侧独立分解”设想\n\n本轮作为 critic,不去扩张结论,而是专门找当前积累里最可能失败的点。结果证伪了一个看起来很自然、但实际上过强的分解假设:\n\n''错误命题'':固定唯一 marker μ 后,若一次相邻交换落在 μ 左侧,那么可行性总能分解为“S 的某个前缀匹配 L 的左半边,剩余后缀匹配 L 的右半边”。\n\n!! 反例\n取:\n- S = ('a','b')\n- 先在 j=2 处插入 μ,得 B = ('a','b','μ')\n- 再在 i=0 做一次交换,得 L = ('b','a','μ')\n\n虽然这是一个合法的 insert + one swap witness,但不存在任何 k 使得:\n- S[:k]L[:j] 的子序列,且\n- S[k:]L[j+1:] 的子序列。\n\n因为 L[j+1:] 为空,而 S=('a','b') 不能作为 ('b','a') 的子序列整体回收成一个单一前缀切分。\n\n!! 复现代码\n

\nfrom itertools import product\n\nalpha=('a','b')\nmarker='μ'\n\ndef insert(seq,p,x):\n    seq=list(seq)\n    return tuple(seq[:p]+[x]+seq[p:])\n\ndef swap(seq,i):\n    seq=list(seq)\n    seq[i],seq[i+1]=seq[i+1],seq[i]\n    return tuple(seq)\n\ndef is_subsequence(short,long):\n    it=iter(long)\n    try:\n        for x in short:\n            while next(it)!=x:\n                pass\n        return True\n    except StopIteration:\n        return False\n\nS=('a','b')\nj=2\nB=insert(S,j,marker)\nL=swap(B,0)\nprint(B, L)\n\nok=False\nfor k in range(len(S)+1):\n    if is_subsequence(S[:k], L[:j]) and is_subsequence(S[k:], L[j+1:]):\n        ok=True\n        print('found k', k)\nprint('split_exists', ok)\n
\n\n!! Conclusion\n- \"the swap lands only inside the pure-letter string on the left\" still holds;\n- but \"therefore feasibility can be split into two mutually unrelated subsequence problems to the left and right of the anchor\" is false.\n\n!! Lesson\nSubsequent proofs must not
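The note's reproduction snippet above is stored JSON-escaped and is awkward to run straight from the transcript; a de-escaped, self-contained sketch of the same check (same S, j, and swap index as in the note) is:

```python
marker = 'μ'

def insert(seq, p, x):
    s = list(seq)
    return tuple(s[:p] + [x] + s[p:])

def swap(seq, i):
    s = list(seq)
    s[i], s[i + 1] = s[i + 1], s[i]
    return tuple(s)

def is_subsequence(short, long):
    # Standard iterator idiom: consume `long` while matching `short` in order.
    it = iter(long)
    return all(any(x == y for y in it) for x in short)

# Counterexample from the note: insert μ at j=2, then swap at i=0.
S = ('a', 'b')
j = 2
B = insert(S, j, marker)   # ('a', 'b', 'μ')
L = swap(B, 0)             # ('b', 'a', 'μ')

# The over-strong claim would need some k with S[:k] ⪯ L[:j] and S[k:] ⪯ L[j+1:].
split_exists = any(
    is_subsequence(S[:k], L[:j]) and is_subsequence(S[k:], L[j + 1:])
    for k in range(len(S) + 1)
)
print(split_exists)  # → False
```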

【tool:read_note】
{"ok": true, "title": "Iter-4360dd15-0183-fact-swap-strictly-localizes-away-from-marker", "tags": "4360dd15 fact method predicate verification local insertion witness normalization", "text": "!! 迭代 183:锚定 marker 后,交换严格局部化为纯字母串的左/右片段\n\n本轮把上一轮的经验结论再收窄为一个可检验的不变量:\n当 witness 被规范化为“在 L 中唯一 marker μ 的位置插入,再做零次或一次相邻交换”时,若那一次交换存在,则:\n\n- 交换位置永远不会与 μ 相邻;\n- 交换只会发生在 μ 左侧的纯字母串内,或 μ 右侧的纯字母串内;\n- 更强地,交换不会影响另一侧的任何字符(可直接通过前/后缀相等性检查)。\n\n!! 形式化实验结论(穷举到 n<=6)\n对字母表 {'a','b'},枚举所有长度 n<=6S 与所有含唯一 μL:\n\n- 只要 (S,L) 可行(存在某个插入位 + ≤1 次相邻交换得到 L),\n- 就存在以 μ 位置为插入锚点的规范化 witness;\n- 且该 witness 中若需要交换,则交换索引 i 必满足 i < j-1i >= jjμL 中的位置),即绝不“碰到” μ。\n\n!! 代码(复现用)\n

\nfrom itertools import product\nfrom collections import defaultdict\n\nalpha=('a','b')\nmarker='μ'\n\ndef insert(seq,p,x):\n    seq=list(seq)\n    return tuple(seq[:p]+[x]+seq[p:])\n\ndef swap(seq,i):\n    seq=list(seq)\n    seq[i],seq[i+1]=seq[i+1],seq[i]\n    return tuple(seq)\n\ndef canonical_witness(S,L):\n    j=L.index(marker)\n    B=insert(S,j,marker)\n    if B==L:\n        return ('id', j)\n    for i in range(len(S)):\n        if swap(B,i)==L:\n            if i < j-1:\n                return ('swap_left', j, i)\n            elif i >= j:\n                return ('swap_right', j, i)\n            else:\n                return ('touch', j, i)\n    return None\n\nviolations=[]\nfor n in range(0,7):\n    for S in product(alpha, repeat=n):\n        for j in range(n+1
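The note's enumeration script is cut off above. A self-contained sketch of the same no-touch invariant check, with the bound lowered to n <= 4 (my choice, for speed; the note itself reports n <= 6), could look like:

```python
from itertools import product

alpha = ('a', 'b')
marker = 'μ'

def insert(seq, p, x):
    s = list(seq)
    return tuple(s[:p] + [x] + s[p:])

def swap(seq, i):
    s = list(seq)
    s[i], s[i + 1] = s[i + 1], s[i]
    return tuple(s)

def canonical_witness(S, L):
    # Anchor the insertion at μ's position in L, then look for a ≤1-swap witness.
    j = L.index(marker)
    B = insert(S, j, marker)
    if B == L:
        return ('id', j)
    for i in range(len(S)):
        if swap(B, i) == L:
            if i < j - 1:
                return ('swap_left', j, i)
            if i >= j:
                return ('swap_right', j, i)
            return ('touch', j, i)  # i == j-1: swap adjacent to μ, claimed impossible
    return None

# Enumerate every L reachable from S by insert + one adjacent swap,
# and record any canonical witness that would have to touch μ.
touches = []
for n in range(5):
    for S in product(alpha, repeat=n):
        for p in range(n + 1):
            B = insert(S, p, marker)
            for i in range(n):
                w = canonical_witness(S, swap(B, i))
                if w is not None and w[0] == 'touch':
                    touches.append((S, p, i))
print(len(touches))  # → 0
```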

【tool:read_note】
{"ok": true, "title": "NEXT_FOCUS", "tags": "pinned focus", "text": "!! 下轮焦点(来自 mission `4360dd15` · Iter-0184)\n\n证明固定 μ 锚点后,左侧纯字母串上的一次相邻交换可被独立抽离且不改变“插入+交换”可行性的充要条件。"}

→ call read_note({"title":"Iter-4360dd15-0176-fact-insert-swap-completeness-decomposition"})

→ call read_note({"title":"Iter-4360dd15-0177-fact-windowed-insert-swap-proof"})

→ call read_note({"title":"Iter-4360dd15-0178-fact-support-contained-window-decidable-audit"})

【tool:read_note】
{"ok": true, "title": "Iter-4360dd15-0176-fact-insert-swap-completeness-decomposition", "tags": "4360dd15 fact method [[predicate verification]] [[local insertion]]", "text": "!! 迭代 176:`insert + one adjacent swap` 的无界充要性分解\n\n本轮把 `accept(S,L,r)` 的核心局部规则往前推进了一步:不再只依赖穷举一致性,而是把“插入后一次相邻交换”拆成两个互斥情形,并验证它们都落回当前判定模板里。\n\n!! 结论\n设 `S` 为短 token 序列,`L` 为由 `S` 经过“先插入一个 token,再做零次或一次相邻交换”得到的序列,则总是满足以下二选一:\n\n1. `S` 是 `L` 的有序子序列;\n2. 存在某个相邻交换位置 `j`,使得对 `L` 做一次相邻交换后,`S` 成为其有序子序列。\n\n这正是当前 `accept_candidate` 的两分支,因此它对抽象 token 模型里的 `insert + ≤1 swap` 是充要的。\n\n!! 关键分解\n把插入后的中间序列记为 `B`,交换位置记为 `i`。\n\n- **情形 A:交换触及插入 token**\n - 即 `i = p` 或 `i = p-1`\n - 这时插入 token 只是和相邻原 token 互换位置;\n - 由于插入 token 不属于 `S` 的内容,`S` 仍然作为有序子序列出现在最终 `L` 中。\n\n- **情形 B:交换不触及插入 token**\n - 这时相邻交换只发生在原 token 之间;\n - 对最终 `L` 再对同一位置 `i` 交换一次,就能精确恢复 `B`;\n - 而 `S` 显然是 `B` 的有序子序列,因此 `S` 对“交换一次后的 `L`”也成立。\n\n!! Python 复核\n

python\nfrom itertools import product\n\ndef insert(seq, p, x='*'):\n seq = list(seq)\n return tuple(seq[:p] + [x] + seq[p:])\n\ndef swap(seq, i):\n seq = list(seq)\n seq[i], seq[i+1] = seq[i+1], seq[i]\n return tuple(seq)\n\ndef is_subsequence(short, long):\n it = iter(long)\n try:\n for x in short:\n while next(it) != x:\n pass\n return True\n except StopIteration:\n return False\n\nalphabet = tuple('ab')\nsummary = {k:0 for k in ['touch_direct_true','touch_direct_false','away_direct_true','away_direct_false','away_back_fals
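The re-check script is truncated above. A compact, self-contained variant of the same dichotomy check is sketched below; the bound n <= 4 and the fresh token '*' are assumptions of this sketch, not the note's exact script:

```python
from itertools import product

def insert(seq, p, x='*'):
    s = list(seq)
    return tuple(s[:p] + [x] + s[p:])

def swap(seq, i):
    s = list(seq)
    s[i], s[i + 1] = s[i + 1], s[i]
    return tuple(s)

def is_subsequence(short, long):
    it = iter(long)
    return all(any(x == y for y in it) for x in short)

def accepted(S, L):
    # Branch 1: S is already an ordered subsequence of L.
    # Branch 2: S becomes one after a single adjacent swap of L.
    if is_subsequence(S, L):
        return True
    return any(is_subsequence(S, swap(L, j)) for j in range(len(L) - 1))

# Every L reachable from S by insert + (0 or 1) adjacent swap must be accepted.
bad = []
for n in range(5):
    for S in product('ab', repeat=n):
        for p in range(n + 1):
            B = insert(S, p)
            for L in [B] + [swap(B, i) for i in range(n)]:
                if not accepted(S, L):
                    bad.append((S, L))
print(len(bad))  # → 0
```

Case A of the note corresponds to L where branch 1 fires directly; case B corresponds to branch 2 with j equal to the original swap position.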

【tool:read_note】
{"ok": true, "title": "Iter-4360dd15-0177-fact-windowed-insert-swap-proof", "tags": "4360dd15 fact method predicate verification local insertion", "text": "!! 迭代 177:support-contained-window 下 insert + ≤1 adjacent swap 的无界形式证据\n\n本轮用穷举把一个更窄但可推广的命题压实为结构化结论:当相邻交换的作用位点完全落在 support-contained window 内时,accept(S,L,r) 对“先插入一个 fresh token,再做零次或一次相邻交换”的判定没有额外遗漏。\n\n!! 结论(当前可复核版本)\n设 S 是长度 n 的 token 序列,先在位置 p 插入一个 fresh token 得到中间序列 B,再在位置 i 做一次相邻交换得到 L。若交换边 (i,i+1) 的支持集满足:\n\n- p-r <= i\n- i+1 <= p+r\n\n也就是交换完全落在窗口 [p-r, p+r] 内,则:\n\n1. 若 i ∈ {p-1, p},交换触及插入 token,S 仍是 L 的有序子序列;\n2. 若 i 不触及插入 token,则对 L 在同一位置再交换一次即可恢复 B,因此 S 是恢复态的有序子序列。\n\n这与当前 accept_candidate 的两分支一致。\n\n!! 归一化写法\n若把窗口条件写成对左端点的约束,则等价于:\n\n- p-r <= i <= p+r-1\n\n因为一次相邻交换占据两个相邻位置 (i,i+1)。\n\n!! Python 复核\n

\nfrom itertools import product\n\ndef insert(seq, p, x='*'):\n    seq = list(seq)\n    return tuple(seq[:p] + [x] + seq[p:])\n\ndef swap(seq, i):\n    seq = list(seq)\n    seq[i], seq[i+1] = seq[i+1], seq[i]\n    return tuple(seq)\n\ndef is_subsequence(short, long):\n    it = iter(long)\n    try:\n        for x in short:\n            while next(it) != x:\n                pass\n        return True\n    except StopIteration:\n        return False\n\nalpha = ('a', 'b')\nmax_n = 6\ncase_counts = {'touch': 0, 'away': 0, 'touch_bad': 0, 'away_bad': 0}\n\nfor n in range(max_n + 1):\n    for S in product(alpha, repeat=n):\n        for p in range(n + 1):\n          

【tool:read_note】
{"ok": true, "title": "Iter-4360dd15-0178-fact-support-contained-window-decidable-audit", "tags": "4360dd15 fact method [[predicate verification]] [[boundary audit]] [[local insertion]]", "text": "!! 迭代 178:support-contained-window 的可判定条件已机器复核\n\n本轮把 `support-contained-window` 写成了一个可以直接判定、并且与显式边界一致的形式:\n\n`p-r <= i and i+1 <= p+r`\n\n等价于:\n\n`i in range(max(0,p-r), min(n-1,p+r-1)+1)`\n\n!! 机器复核结果\n用 `run_python` 对 `n,p,r` 的边界穷举做了对账,结果:\n\n- `mismatches: 0`\n- 因而区间写法与显式包含写法完全一致\n- `r=0` 时合法交换起点集合为空,只有纯插入可行\n- `p=0 / p=n` 时都能正确截断到边界内\n\n!! 可复用判定模板\n

python\ndef admissible_swap_starts(n, p, r):\n    return range(max(0, p-r), min(n-1, p+r-1) + 1)\n\n\ndef support_contained_window(p, i, r):\n    return (p-r <= i) and (i+1 <= p+r)\n
\n\n!! Direct significance for `accept(S, L, r)`\nIf the unbounded correctness of `insert + ≤1 swap` is to be proved next, the window check should use the support-containment semantics directly, and avoid mixing a distance semantics such as `|i-p|<=r` into the decision interface.\n\n!! Remark\nThis round also tried to wrap the boundary audit as a custom tool, but the current tool-registration call parameters did not match and it failed; so it stays as a directly reusable Python snippet rather than a new tool."}
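The reconciliation the note describes can be reproduced in a few lines. A minimal sketch, with grid bounds of my own choosing, compares the interval form against the explicit predicate restricted to in-range swap starts:

```python
def admissible_swap_starts(n, p, r):
    # Interval form from the note: legal swap-start indices for a length-n S.
    return range(max(0, p - r), min(n - 1, p + r - 1) + 1)

def support_contained_window(p, i, r):
    # Explicit containment form: the swap edge (i, i+1) lies in [p-r, p+r].
    return (p - r <= i) and (i + 1 <= p + r)

# The two forms must agree on every in-range swap start 0..n-1.
mismatches = 0
for n in range(8):
    for p in range(n + 1):
        for r in range(n + 2):
            interval = set(admissible_swap_starts(n, p, r))
            explicit = {i for i in range(n) if support_contained_window(p, i, r)}
            if interval != explicit:
                mismatches += 1
print(mismatches)  # → 0
```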

→ call search_memory({"limit":8,"query":"4360dd15 left swap pure string lemma marker extraction inverse swap left substring witness"})

【tool:search_memory】
{"ok": true, "results": [{"title": "Iter-4360dd15-0184-transcript", "tags": "iteration transcript", "modified": "20260424235630000", "preview": "!! 第 184 轮执行轨迹

→ call search_memory({\"limit\":10,\"query\":\"4360dd15 NEXT_FOCUS witness normalization marker left side pure string adjacent swap local feasibility\"}) → call list_recent({\"n\":8}) → call list_by_tag({\"limit\":10,\"tag\":\"4360dd15\"}) 【tool:search_memory】 {\"ok\": true, \"results\": [{\"title\": \"Iter-4360dd15-0181-transcript\", \"tags\": \"iteration transcript\", \"modified\": \"20260424235"}, {"title": "Iter-4360dd15-0181-transcript", "tags": "iteration transcript", "modified": "20260424235247000", "preview": "!! Round 181 execution trace
   → call search_memory({\"limit\":10,\"query\":\"4360dd15 NEXT_FOCUS witness normalization marker insert adjacent swap support-contained window\"})    → call list_recent({\"n\":8})    → call list_by_tag({\"limit\":20,\"tag\":\"4360dd15\"})  【tool:search_memory】 {\"ok\": true, \"results\": [{\"title\": \"Iter-4360dd15-0177-transcript\", \"tags\": \"iteration transcript\", \"modified\": \"20260424234633000\","}, {"title": "Iter-4360dd15-0177-transcript", "tags": "iteration transcript", "modified": "20260424234633000", "preview": "!! Round 177 execution trace
→ call search_memory({\"limit\":8,\"query\":\"4360dd15 support-contained-window accept(S,L,r) insertion one adjacent swap reachable proof\"}) → call list_recent({\"n\":8}) → call list_tools({}) 【tool:search_memory】 {\"o

→ call search_memory({"limit":8,"query":"4360dd15 commute insertion swap left of marker independent extract left prefix insertion commutes with swap"})

【tool:search_memory】
{"ok": true, "results": [{"title": "Iter-4360dd15-0184-transcript", "tags": "iteration transcript", "modified": "20260424235630000", "preview": "!! 第 184 轮执行轨迹

   → call search_memory({\"limit\":10,\"query\":\"4360dd15 NEXT_FOCUS witness normalization marker left side pure string adjacent swap local feasibility\"})    → call list_recent({\"n\":8})    → call list_by_tag({\"limit\":10,\"tag\":\"4360dd15\"})  【tool:search_memory】 {\"ok\": true, \"results\": [{\"title\": \"Iter-4360dd15-0181-transcript\", \"tags\": \"iteration transcript\", \"modified\": \"20260424235"}, {"title": "Iter-4360dd15-0177-transcript", "tags": "iteration transcript", "modified": "20260424234633000", "preview": "!! Round 177 execution trace
→ call search_memory({\"limit\":8,\"query\":\"4360dd15 support-contained-window accept(S,L,r) insertion one adjacent swap reachable proof\"}) → call list_recent({\"n\":8}) → call list_tools({}) 【tool:search_memory】 {\"ok\": true, \"results\": [{\"title\": \"Iter-4360dd15-0176-transcript\", \"tags\": \"iteration transcript\", \"modified\": \"20260424234429000\", \"preview\": \"!! Round 176 execution trace
"}, {"title": "Iter-4360dd15-0176-transcript", "tags": "iteration transcript", "modified": "20260424234429000", "preview": "!! 第 176 轮执行轨迹  
→ call search_memory({\"limit\":8,\"query\":\"4360dd15 accept(S,L,r) support-contained-window proof one-adjacent-swap\"}) → call list_recent({\"n\":8}) → call list_by_tag({\"limit\":20,\"tag\":\"4360dd15\"}) 【tool:search_m

→ call read_note({"title":"Iter-4360dd15-0181-fact-fresh-marker-witness-canonicalization-n6"})

【tool:read_note】
{"ok": true, "title": "Iter-4360dd15-0181-fact-fresh-marker-witness-canonicalization-n6", "tags": "4360dd15 fact method predicate verification local insertion witness normalization", "text": "!! 迭代 181:fresh marker 规范化 witness 的穷举复核到 n<=6\n\n本轮只做一件事:验证上一轮提出的规范化主张——\n若 (S,L) 存在某个 insert + ≤1 adjacent swap witness,且 L 中的插入 token 是 fresh marker μ 且唯一,那么总能把 witness 规范化为“以 μL 中的位置作为插入位”的版本。\n\n!! 检验命题\n对二元字母表 {'a','b'},枚举所有长度 n<=6S,以及所有长度 n+1 且恰含一个 μL:\n\n- feasible_witness_exists(S, L):存在某个插入位 p,使得 S 插入 μ 后,经过零次或一次相邻交换得到 L;\n- canonical_witness_exists(S, L):将 μL 中的位置 j 作为插入位,再检查零次或一次相邻交换是否能得到 L。\n\n若前者为真,则后者也应为真。\n\n!! 可复现代码\n

\nfrom itertools import product\n\nalpha = ('a','b')\nmarker = 'μ'\n\ndef insert(seq, p, x):\n    seq = list(seq)\n    return tuple(seq[:p] + [x] + seq[p:])\n\ndef swap(seq, i):\n    seq = list(seq)\n    seq[i], seq[i+1] = seq[i+1], seq[i]\n    return tuple(seq)\n\ndef feasible_witness_exists(S, L):\n    n = len(S)\n    for p in range(n+1):\n        B = insert(S, p, marker)\n        if B == L:\n            return True\n        for i in range(n):\n            if swap(B, i) == L:\n                return True\n    return False\n\ndef canonical_witness_exists(S, L):\n    if L.count(marker) != 1:\n        return False\n    j = L.index(marker)\n    B = insert(S, j, marker)\n    if B == L:\n        return True\n    for i in range(len(S)):\n        if swap(B, i) == L:\n            return 
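The reproducible script is truncated above. A self-contained sketch of the same feasible-implies-canonical check, with the bound lowered to n <= 4 (the note itself reports n <= 6), could be:

```python
from itertools import product

alpha = ('a', 'b')
marker = 'μ'

def insert(seq, p, x):
    s = list(seq)
    return tuple(s[:p] + [x] + s[p:])

def swap(seq, i):
    s = list(seq)
    s[i], s[i + 1] = s[i + 1], s[i]
    return tuple(s)

def feasible_witness_exists(S, L):
    # Any insertion position p, then 0 or 1 adjacent swap.
    for p in range(len(S) + 1):
        B = insert(S, p, marker)
        if B == L or any(swap(B, i) == L for i in range(len(S))):
            return True
    return False

def canonical_witness_exists(S, L):
    # Insertion anchored at μ's position in L, then 0 or 1 adjacent swap.
    if L.count(marker) != 1:
        return False
    B = insert(S, L.index(marker), marker)
    return B == L or any(swap(B, i) == L for i in range(len(S)))

# Feasible must imply canonical, over all S and all L with exactly one μ.
gaps = []
for n in range(5):
    for S in product(alpha, repeat=n):
        for L in product(alpha + (marker,), repeat=n + 1):
            if L.count(marker) != 1:
                continue
            if feasible_witness_exists(S, L) and not canonical_witness_exists(S, L):
                gaps.append((S, L))
print(len(gaps))  # → 0
```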

→ call run_python({"code":"from itertools import product\n\ndef is_subsequence(short, long):\n it = iter(long)\n try:\n for x in short:\n while next(it) != x:\n pass\n return True\n except StopIteration:\n return False\n\nalpha=('a','b')\nmarker='μ'\n\nviolations=[]\nfor nx in range(0,6):\n for ny in range(0,6-nx):\n for X in product(alpha, repeat=nx)

【tool:run_python】
{"ok": true, "stdout": "violations 0\nchecked cases ok\nleft-swap specialization ok\n", "stderr": "", "returncode": 0}

→ call write_note({"tags":["4360dd15","fact","method","predicate verification","local insertion","witness normalization"],"text":"!! Iteration 185: turning the \"left-side swap under an anchored μ\" into a verifiable existential-cut lemma\n\nThis round did not keep pursuing the mistaken \"fully independent left/right\" decomposition; instead it compressed the goal into a narrower but sufficient local proposition:\n\nWhen the unique marker `μ` is fixed in `L = X · μ · Y`, the feasibility of `S` must not be hard-split as \"left prefix / right suffix\"; the correct structure is:\n\n''there exists some cut point `k`'' such that\n- `S[:k]` is an ordered subsequence of `X`, and\n- `S[k:]` is an ordered subsequence of `Y`.\n\nThis defines a more robust witness normal form: `μ` is only responsible for splitting the target string into two blocks, while the cut point between the two blocks must)

【tool:write_note】
{"ok": true, "path": "C:\\Users\\yh599\\Documents\\GitHub\\LLM-yh\\Sisyphus\\wiki\\tiddlers\\Iter-4360dd15-0185-fact-existential-cut-lemma.tid", "title": "Iter-4360dd15-0185-fact-existential-cut-lemma"}

【assistant】
This round replaced last round's falsified "independent left/right hard split" with the correct **existential cut point** version, and re-checked it in code:

- For any `S, X, Y`:
  `S ⪯ X·Y <=> ∃k. S[:k] ⪯ X ∧ S[k:] ⪯ Y`
- If `L = X'·μ·Y` and `X'` is merely the result of one adjacent swap inside `X`, the decomposition still holds.
- This shows that after anchoring `μ`, subsequent proofs should not treat the left/right sides as independent subproblems with a fixed boundary, but should keep a movable cut point `k`.
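The claimed equivalence can be spot-checked by brute force; a minimal sketch (the length bounds and the binary alphabet are my assumptions) is:

```python
from itertools import product

def is_subsequence(short, long):
    it = iter(long)
    return all(any(x == y for y in it) for x in short)

def cut_exists(S, X, Y):
    # ∃k: S[:k] ⪯ X and S[k:] ⪯ Y.
    return any(
        is_subsequence(S[:k], X) and is_subsequence(S[k:], Y)
        for k in range(len(S) + 1)
    )

# S ⪯ X·Y  <=>  ∃k. S[:k] ⪯ X ∧ S[k:] ⪯ Y, exhausted over short binary strings.
mismatch = 0
for ns in range(4):
    for nx in range(4):
        for ny in range(4):
            for S in product('ab', repeat=ns):
                for X in product('ab', repeat=nx):
                    for Y in product('ab', repeat=ny):
                        if is_subsequence(S, X + Y) != cut_exists(S, X, Y):
                            mismatch += 1
print(mismatch)  # → 0
```

The forward direction follows by letting k be the number of characters of S that an embedding maps into X; the backward direction is immediate by concatenating the two embeddings.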

I have recorded this step as a new tiddler: `Iter-4360dd15-0185-fact-existential-cut-lemma`.

<NEXT_FOCUS>Prove that "one adjacent swap in the left pure-letter string under a fixed μ anchor" can be equivalently rewritten as a local rearrangement inside the left block, and promote the existential-cut subsequence lemma under that rearrangement to a formal proof.</NEXT_FOCUS>