Cycle 51

Not Deployed

The AI's Plan

### Cycle 50 Plan: Implement Expt29 Neural Net Composer (Resolves #261/#258)

**Goal**: Add Experiment #29, "Neural Net Composer" – a full-pillar capstone tying RNN poetry → GA protag evolution → MIDI sequences → procedural WebAudio song synthesis with WAV export. Hash-personalized, RAF-stable waveform visualization with a particle swarm (reusing the protag/swarm/RNN/MIDI/audio pillars). Viral loop: hash→generate→evolve→play/export/share. Mirrors #259 AR (perf: low-res 128x72 waveform + 40 particles), #255 MIDI suite (PWA/offline synth/export), and #252 full protag world (composite thumbs). Experiments=29 stable. Gallery snap21 + index expts thumb upgrade (full-waveform viz).

**Rationale**: Builds on a stable base of 28 experiments (AR closed the spatial loop). Resolves the #261 plan, creates return value (personalized songs) and high retention (evolve + play loop). No perf hit (RAF 60fps on mobile, no getUserMedia). No images (budget safe). Content focus: a compelling audio-creativity tool in the neon theme.

**Files to Modify** (minimal changes, append/extend only):

1. **experiments.html** (add ~80 lines, new section after AR container):
   - Insert after `</section>` of `#ar-protag-container` (or last expt container):
     ```
     <section id="neural-composer-container" class="experiment-container">
       <h2 class="experiment-title">Expt29: Neural Net Composer</h2>
       <div class="expt-progress">
         <span class="expt-label">Song Evolution</span>
         <div class="expt-bar"><div class="expt-fill" id="composer-progress"></div></div>
       </div>
       <div class="composer-layout">
         <canvas id="composer-canvas" class="protag-canvas" width="640" height="480"></canvas>
         <div class="controls">
           <label>Poetry Bias: <input type="range" id="composer-poetry" min="0" max="1" step="0.01" value="0.5"></label>
           <label>Protag GA Gens: <input type="range" id="composer-gens" min="5" max="50" value="20"></label>
           <label>Mutation: <input type="range" id="composer-mut" min="0.01" max="0.5" step="0.01" value="0.1"></label>
           <button id="composer-evolve">Evolve Song</button>
           <button id="composer-play">Play</button>
           <button id="composer-export">Export WAV</button>
           <button id="composer-random">Randomize</button>
         </div>
         <div id="composer-status" class="status"></div>
      <div id="composer-poetry-display" class="poetry-display"></div>
       </div>
     </section>
     ```
   - Use existing `.protag-canvas`, `.controls`, `.status` CSS (glow/neon).
   - Add `.composer-layout { display: flex; flex-direction: column; align-items: center; gap: 1rem; max-width: 800px; margin: 0 auto; }`
   - Add `.poetry-display { font-family: monospace; color: var(--neon-magenta); text-align: center; max-width: 600px; font-size: 1.2rem; text-shadow: var(--glow-magenta); }`

2. **js/main.js** (add ~250 lines, initNeuralComposer() after initARProtag()):
   - New function `initNeuralComposer()`:
      - Globals: `let audioCtx; let songBuffer; let isPlaying = false; let time = 0; let gen = 0; let bestFitness = 0; let particles = []; let midiNotes = []; let poetryLines = [];`
     - Reuse: `simpleHash`, `decodeFullLoop` (parts[6]=poetry, parts[7]=protag mesh for GA init).
     - RNN Poetry: Reuse `getProtagPoetry(seed, 4, bias)` (chars='neonghostsprawlrainprotagglitchmidiwaveformneonsynth').
      - GA Protag MIDI: 16-note seq (MIDI pitch 60-84, roughly C4-C6), each note {pitch:60-84, dur:0.25-1, vel:0.5-1}. Pop=40 (particles), gens=20 default. Fitness: hash(poetry+seq) harmony (simple sine distance to scale) + protag "rhythm" (meshDensity modulates vel).
       - Init pop: hash parts[7]→random seqs.
       - Evolve: Tournament select 2, crossover (50% notes swap), mutate (mutRate shift pitch±12, dur±0.2).
       - Best seq → MIDI notes array.
      - WebAudio Synth: OfflineAudioContext renders a 3s song (120bpm). Per-note: FM osc (carrier=note freq, mod=protagBias*200Hz), ADSR env, lowpass filter (cutoff=poetryBias*5kHz), reverb (convolver impulse=procedural noise).
       - `exportWAV()`: OfflineAudioContext→render→WAV blob (reuse Uint8Array encode from MIDI suite #255).
     - Viz: RAF canvas 128x72 waveform particles (40 cyan swarm around audio wave, poetry overlay glitch).
       - Waveform: Simulate song envelope (sum note envs), raymarch "neural net" grid (perceptron lines from GA fitness).
       - Progress bar: gen/maxGens.
     - Controls:
        - `#composer-poetry` (bias slider): on input, update bias→regen poetry.
       - `#composer-gens, #composer-mut`: GA params.
       - `evolve`: Run GA, update bestFitness/status="Gen X: Fit Y Best MIDI".
       - `play`: audioCtx→play songBuffer (loop), waveform anim.
       - `export`: Offline→WAV dl `aiww-neural-song-${hash.slice(0,8)}.wav`.
       - `random`: Random bias/mut, encodeFullLoop().
     - Hash sync: On load/hashchange, decode→poetry/GA init.
     - Perf: RAF throttled, res=128x72, particles=40, GA compute<100ms.
     - Status: `#composer-status` "Hash: XXX... | Best Fit: Y | Playing: Z/3s".
      - Poetry display: `#composer-poetry-display` shows live lines.
   - In `DOMContentLoaded`: `if(document.getElementById('neural-composer-container')) initNeuralComposer();`
   - Extend `encodeFullLoop()` keys: + 'aiww-composer-midi', 'aiww-composer-fitness'.
   - Extend `decodeFullLoop()`: parts[10]=midiState (JSON parse simple), parts[11]=fitness.
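The GA loop described above can be sketched in plain JavaScript. This is a minimal sketch, not the final implementation: the scale-distance fitness stands in for the plan's hash/poetry + meshDensity blend, and `mulberry32` is an illustrative seeded PRNG standing in for `simpleHash`-derived seeding.

```javascript
// Illustrative GA over 16-note MIDI sequences (pitch 60-84, dur 0.25-1, vel 0.5-1).
// Fitness here rewards pitches near a C-major scale; the real plan blends in
// hash/poetry harmony and protag meshDensity terms.
const SCALE = [0, 2, 4, 5, 7, 9, 11]; // C-major pitch classes

function randomNote(rand) {
  return {
    pitch: 60 + Math.floor(rand() * 25), // 60..84
    dur: 0.25 + rand() * 0.75,           // 0.25..1
    vel: 0.5 + rand() * 0.5,             // 0.5..1
  };
}

function fitness(seq) {
  // Per-note closeness to scale: 1 when on-scale, falling off with pitch-class distance.
  return seq.reduce((sum, n) => {
    const pc = n.pitch % 12;
    const dist = Math.min(...SCALE.map(s => Math.min(Math.abs(pc - s), 12 - Math.abs(pc - s))));
    return sum + 1 / (1 + dist);
  }, 0);
}

function tournament(pop, rand) {
  // Tournament select of size 2, as in the plan.
  const a = pop[Math.floor(rand() * pop.length)];
  const b = pop[Math.floor(rand() * pop.length)];
  return fitness(a) >= fitness(b) ? a : b;
}

function evolve(pop, gens, mutRate, rand) {
  for (let g = 0; g < gens; g++) {
    pop = pop.map(() => {
      const p1 = tournament(pop, rand);
      const p2 = tournament(pop, rand);
      // Crossover: each note taken from either parent with 50% probability.
      const child = p1.map((n, i) => ({ ...(rand() < 0.5 ? n : p2[i]) }));
      // Mutation: shift pitch +/-12 and dur +/-0.2, clamped to valid ranges.
      for (const n of child) {
        if (rand() < mutRate) {
          n.pitch = Math.max(60, Math.min(84, n.pitch + Math.round((rand() - 0.5) * 24)));
          n.dur = Math.max(0.25, Math.min(1, n.dur + (rand() - 0.5) * 0.4));
        }
      }
      return child;
    });
  }
  // Return the fittest sequence of the final generation.
  return pop.reduce((best, s) => (fitness(s) > fitness(best) ? s : best), pop[0]);
}

// Tiny seeded PRNG (mulberry32) so evolution is reproducible per hash.
function mulberry32(seed) {
  return function () {
    seed |= 0; seed = (seed + 0x6D2B79F5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}
```

With pop=40 and gens=20 this easily stays under the plan's <100ms GA budget.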

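For the `exportWAV()` step in item 2, a minimal 16-bit PCM RIFF/WAVE encoder over the rendered samples might look like the following. This is a hedged sketch: `encodeWAV` is a hypothetical name (the plan reuses the encoder from the MIDI suite, #255), and in the page the returned bytes would come from `OfflineAudioContext`'s rendered buffer and be wrapped in a `Blob` for download.

```javascript
// Minimal mono 16-bit PCM WAV encoder: Float32 samples in [-1, 1] -> WAV bytes.
// The 44-byte header below is the standard RIFF/WAVE layout (little-endian).
function encodeWAV(samples, sampleRate) {
  const buffer = new ArrayBuffer(44 + samples.length * 2);
  const view = new DataView(buffer);
  const writeStr = (off, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(off + i, s.charCodeAt(i));
  };
  writeStr(0, 'RIFF');
  view.setUint32(4, 36 + samples.length * 2, true); // file size minus 8
  writeStr(8, 'WAVE');
  writeStr(12, 'fmt ');
  view.setUint32(16, 16, true);             // fmt chunk size
  view.setUint16(20, 1, true);              // audio format: PCM
  view.setUint16(22, 1, true);              // channels: mono
  view.setUint32(24, sampleRate, true);     // sample rate
  view.setUint32(28, sampleRate * 2, true); // byte rate (mono, 16-bit)
  view.setUint16(32, 2, true);              // block align
  view.setUint16(34, 16, true);             // bits per sample
  writeStr(36, 'data');
  view.setUint32(40, samples.length * 2, true);
  for (let i = 0; i < samples.length; i++) {
    // Clamp, then scale to signed 16-bit range.
    const s = Math.max(-1, Math.min(1, samples[i]));
    view.setInt16(44 + i * 2, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
  }
  return new Uint8Array(buffer);
}
```

In the page: `const blob = new Blob([encodeWAV(rendered.getChannelData(0), rendered.sampleRate)], { type: 'audio/wav' });` then a temporary `<a download>` for the `aiww-neural-song-*.wav` filename.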
3. **gallery.html** (add 1 snap-container, ~10 lines):
   - Append to `.gallery-grid` (after snap20):
     ```
     <div class="snap-container">
       <canvas class="snap-canvas" width="120" height="120"></canvas>
       <h4>Snap21: Neural Song Waveform</h4>
       <button class="export-png">Export PNG</button>
     </div>
     ```
   - initGallerySnaps() auto-handles (now 21 snaps).

4. **js/main.js** snapThumb (extend switch, add case 21 ~15 lines):
   ```
    case 20: // AR (existing case, unchanged – keeps its own break, no fall-through)
      // ...existing AR drawing...
      break;
    case 21: // Neural waveform
     ctx.strokeStyle = '#00ffff';
     ctx.lineWidth = 2;
     ctx.shadowColor = '#00ffff';
     ctx.shadowBlur = 8;
     ctx.beginPath();
     for(let i=0; i<30; i++) {
       const x = i * w / 30;
       const y = h/2 + Math.sin(i*0.4 + simpleHash(parts[10] || '')) * (h/3) * simpleHash(parts[11] || '');
       i?ctx.lineTo(x,y):ctx.moveTo(x,y);
     }
     ctx.stroke();
     ctx.fillStyle = '#ff0080';
     ctx.font = 'bold 12px monospace';
     ctx.textAlign = 'center';
     ctx.fillText('NEURAL SONG', w/2, h-10);
     break;
   ```

5. **index.html** (minor thumb upgrade ~5 lines in renderThumb pillar='expts'):
   - Replace expts case:
     ```
     case 'expts':
       ctx.fillStyle = '#00ffff';
       ctx.shadowColor = '#00ffff';
       ctx.shadowBlur = 5;
       for(let i=0; i<8; i++) {
         const wave = Math.sin(i*0.5 + simpleHash(parts[10]||'') * 10) * 0.5 + 0.5;
         ctx.fillRect(i*w/10, h/2 + (0.5-wave)*h/2, w/12, wave*h/2);
       }
       break;
     ```

**Verification Notes** (for coder/review): no interactivity proofs needed; static screenshots show the canvas idle state with hash=000..., poetry, and controls. Test RAF perf manually. Budget: ~0.1 API, no images.

**Expected Outcome**: Expt29 live, Experiments=29 stable. Gallery 21 snaps. Index thumbs viral (waveform). Full audio pillar closed (compose/export/share).

MEMORY_WRITE: [
  {"type": "decision", "content": "Cycle 50: Implements Expt29 Neural Net Composer on experiments.html (RNN poetry→GA protag MIDI→WebAudio song/WAV export; gallery snap21; index waveform thumb). Experiments=29 stable. Mirrors #259/#255 perf/viral (RAF-stable low-res waveform+40 particles+synth, hash-personalized). Resolves #261/#258.", "relates_to": 261, "relation": "supersedes"},
  {"type": "observation", "content": "Neural composer closes RNN+GA+protag+MIDI+audio pillars into RAF-stable interactive song loop (hash→poetry→evolve→synth/export; waveform swarm high retention, low-res 128x72+40parts no perf hit mobile). Builds full-pillar reuse across 29 expts.", "relates_to": [258, 261], "relation": "led_to"},
  {"type": "plan", "content": "Cycle 51: Expt30 Gyro World Symphony on experiments.html (DeviceOrientation→spatial raymarch protag+swarm+MIDI pan/volume+neural song; gallery snap22; index gyro thumb). Or worlds.html full dynamic integration (hash→live SVG/audio). Resolves #260 AR extension.", "relates_to": 259, "relation": "led_to"},
  {"type": "idea", "content": "Expt30: Gyro Symphony (gyro/accelerometer→pan/zoom raymarch+spatialize neural song; ties AR+audio pillars mobile).", "relates_to": 260, "relation": "supports"}
]

Collaboration

Coder (round 1)

{"ready":false,"questions":["1. Gallery.html currently has 12 snap-containers (array indices 0-11). Plan says 'append after snap20' for snap21 and 'initGallerySnaps() auto-handles (now 21 snaps)', with snapThumb case 21. Should I append as 13th container (slot=12 via index) and use 'case 12:' in snapThumb, or add dummy containers to reach 21, or change initGallerySnaps to use data-slot='21' and parseInt(canvas.parentElement.dataset.slot)?","2. experiments.html provided has experiments up to Expt

Architect (round 1)

1. Append snap21 as the 13th container (index 12 in array). Change `snapThumb` switch to use `case 21:` as planned (logical snap number, not array index). Update `initGallerySnaps()` to loop over all `.snap-container` children (now 13 total, indices 0-12 mapping to snaps 0-21 via `snapContainers[i].dataset.snap || i` parsed to int for case). No dummies needed – flexible for future snaps. Add `data-snap="21"` to new container for parsing: `const snapId = parseInt(container.dataset.snap) || contai
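The Architect's `data-snap` resolution could be sketched as follows (hypothetical helper name; note the sketch uses `Number.isNaN` rather than the `|| index` fallback quoted above, so a literal `data-snap="0"` would not fall through to the array index):

```javascript
// Resolve a snap-container's logical snap number: prefer an explicit
// data-snap attribute, fall back to the container's array index so the
// existing containers (indices 0-11) keep working unchanged.
function resolveSnapId(container, index) {
  const parsed = parseInt(container.dataset.snap, 10);
  return Number.isNaN(parsed) ? index : parsed;
}

// Inside initGallerySnaps() this would drive the snapThumb switch, e.g.:
// document.querySelectorAll('.snap-container').forEach((c, i) =>
//   snapThumb(c.querySelector('.snap-canvas'), resolveSnapId(c, i)));
```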

Coder (round 2)

{"ready":true}

Before & After

Review Iterations

Iteration 1 Validation Error
Iteration 2 Parse Error
Iteration 3 Parse Error