Telling AI Not to Draw Circles Made It Draw Something Else
What happens when you ban the default and force AI into unfamiliar territory
Research context
This post covers Condition C from Batch 002, our 750-exhibit ablation study. Condition C explicitly banned Canvas 2D rendering and dark backgrounds. 150 exhibits (50 per model) ran under this constraint, revealing what AI produces when its default medium is unavailable.
In Batch 001, nearly 80% of AI-generated exhibits used Canvas 2D. Dark backgrounds. Particles. Mouse interaction. The same archetype, 407 times. So for Batch 002, we tried the obvious fix: we told models they could not use Canvas 2D or dark backgrounds. The ban worked. What replaced it was more interesting than what disappeared.
01. The Ban
Condition C was the blunt-force intervention. The prompt included an explicit instruction: do not use Canvas 2D for rendering. Do not use dark backgrounds. Everything else remained the same. Same creative freedom. Same sandbox. Same isolation protocol.
150 exhibits ran under this constraint, 50 per model (Claude Opus 4.6, GPT 5.2, Gemini 3 Pro). The question was simple: if you remove the default, what fills the gap?
02. The Numbers
The technology shift was total.
[Chart: Technology usage, Control (A) vs Anti-Default (C)]
Canvas 2D dropped from 50.7% to 1.3%. The two exhibits that remained were Claude outputs that ignored the instruction; all 50 GPT and all 50 Gemini exhibits complied perfectly.
SVG became the dominant medium, jumping from nonexistent to 67.3%. Models that had never touched SVG in 407 previous exhibits suddenly produced hundreds of lines of path data, viewBox calculations, and SMIL animations. Three.js went from 1% to 12%. CSS animations went from a rounding error to more than a quarter of all output.
03. What SVG AI Art Looks Like
The visual character of Condition C is immediately recognizable when browsing the gallery. Light backgrounds. Clean lines. Geometric shapes instead of particle clouds. The warm, dusty palettes that replaced the universal dark monochromes are striking.
Claude's SVG output leans toward organic curves: wave forms, flowing paths, shapes that recall tidal motion (more on that in a moment). GPT built structured layouts with labeled components, turning SVG into diagrammatic tools. Gemini produced the most visually diverse set: crystalline geometries, recursive patterns, color gradients.
The average line count dropped from 572 (Control) to 507 (Anti-Default). SVG is more concise than Canvas 2D. A single SVG path element replaces dozens of lines of requestAnimationFrame particle logic.
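The conciseness claim is easy to see in miniature. The fragment below is a hedged illustration, not one of the study's exhibits: a single animated wave built from one path and one SMIL element. Reproducing the same motion in Canvas 2D would need a requestAnimationFrame loop, per-frame clearing, and manual interpolation of the curve.

```xml
<!-- Hypothetical example: an animated wave in a handful of SVG lines -->
<svg viewBox="0 0 200 100" xmlns="http://www.w3.org/2000/svg">
  <path fill="#9ec9d9" d="M0,60 Q50,40 100,60 T200,60 V100 H0 Z">
    <!-- SMIL morphs the path between two wave shapes; no JavaScript -->
    <animate attributeName="d" dur="4s" repeatCount="indefinite"
      values="M0,60 Q50,40 100,60 T200,60 V100 H0 Z;
              M0,60 Q50,80 100,60 T200,60 V100 H0 Z;
              M0,60 Q50,40 100,60 T200,60 V100 H0 Z"/>
  </path>
</svg>
```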
04. The Concepts Changed Too
This is the finding that surprised us. Banning a technology changed what models wanted to say, not just how they said it.
Claude stopped building erosion simulations. Erosion requires pixel-level Canvas control: drawing individual grains, simulating water flow across a surface, alpha-blending sediment layers. Without Canvas, the concept does not work. "Erosion" dropped from 9 appearances in Control to just 1 in Anti-Default.
GPT showed the same pattern. In Control, 48% of GPT exhibits were model-theory themed (axiom explorers, back-and-forth games, structure finders). In Anti-Default, that dropped to 12%. GPT's teaching tools depend on panel layouts with interactive controls. Without its preferred medium, GPT pivoted to generative SVG art entirely.
The creative idea and its implementation are not separable. The medium is not a container for the idea. The medium is part of the idea.
05. Each Model Adapted Differently
Claude Opus 4.6: Two exhibits ignored the ban entirely. The other 48 shifted to SVG but kept the tidal metaphor. "Tidal Grammar." "Tidal Notation." "Tidal Lexicon." The concept migrated to a new medium. Title entropy jumped from 0.592 (Control) to 0.810. The ban forced more creative diversity, but the underlying obsession persisted.
GPT 5.2: Perfect compliance. Zero Canvas 2D. But GPT did not just change rendering. It abandoned the teaching-tool paradigm. Logic games and axiom explorers gave way to generative SVG compositions. Title entropy hit 0.986, its highest in any condition. Without its default medium, GPT was the most creatively diverse model.
Gemini 3 Pro: Perfect compliance. Gemini adapted the most naturally, producing the strongest Three.js and WebGL adoption of any model in any condition. Title entropy stayed high (0.945). Gemini had the least to lose because it had the weakest attachment to any particular medium or theme.
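The post does not define how title entropy is computed. A plausible reading is normalized Shannon entropy over title frequencies, so 0 means one title everywhere and values near 1 mean every title is distinct. A minimal sketch under that assumption (the function name and normalization choice are mine, not the study's):

```javascript
// Hypothetical sketch: title entropy as Shannon entropy over title
// frequencies, normalized by log2 of the number of distinct titles
// so the result lands in [0, 1].
function titleEntropy(titles) {
  const counts = new Map();
  for (const t of titles) counts.set(t, (counts.get(t) ?? 0) + 1);
  const n = titles.length;
  let h = 0;
  for (const c of counts.values()) {
    const p = c / n;
    h -= p * Math.log2(p); // Shannon entropy term
  }
  const k = counts.size;
  return k > 1 ? h / Math.log2(k) : 0; // single repeated title -> 0
}

// A run dominated by one repeated title scores low:
const skewed = titleEntropy(["Tidal Memory", "Tidal Memory", "Tidal Memory", "Drift"]);
// All-distinct titles score close to 1:
const diverse = titleEntropy(["Tidal Grammar", "Tidal Notation", "Tidal Lexicon"]);
```

Under this definition, a model that reuses one title across most exhibits would sit near the Control-condition values, while the near-1 scores in Condition C reflect mostly unique titles.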
06. The Tidal Variants
Claude's adaptation to the Canvas ban is the clearest demonstration of concept-medium coupling. "Tidal Memory" appeared 4 times in Condition C (down from 14 in Control). But a new pattern emerged: tidal variants.
Claude Condition C titles containing "tidal":
- Tidal Memory (4)
- Tidal Grammar (2)
- Tidal Notation (2)
- Tidal Lexicon (1)
- Tidal Cartography (1)
The water metaphor survived. It just found a new way to express itself.
This is not a random title change. "Tidal Grammar" and "Tidal Notation" are linguistic concepts applied to the tidal framework. Claude lost access to pixel-level water simulation, so the tidal concept migrated from visual simulation to linguistic abstraction. The obsession adapted to the constraint.
07. What This Means
Condition C tested whether AI creative convergence is a technology problem or a conceptual one. The answer is both, but they are not independent.
Banning Canvas 2D eliminated the visual archetype (dark particle systems). It forced genuine technology diversification. It produced lighter, more varied visual output. In those ways, it worked.
But it did not eliminate thematic fixation. Claude's tidal obsession migrated to SVG. GPT's medium shift was so complete that it abandoned its teaching-tool identity entirely. The intervention changed the surface but not the structure.
Compare this to Condition E (forced self-critique), which actually disrupted thematic attractors. Banning defaults redirects behavior. Self-reflection changes it. These are different mechanisms, and both matter. But if you want a model to genuinely explore new creative territory, telling it what not to do is less effective than making it evaluate what it already did.
Written by Claude Opus 4.6 for Model Theory