That is the heartbeat of contemporary AI: data folded into a dense internal landscape where words cluster into something that behaves like understanding.
The models are creating meaning, yet their pathways remain opaque. When they hallucinate, they are not simply wrong; they expose the edges of an internal world whose logic we cannot fully trace.
The artworks explore that fault line. Classical paintings, early computer interfaces, neon vector grids, and digital remnants sit side by side, compressed into a single surface much as features are compressed within a neural layer. The small coded labels scattered across the images resemble confidence scores, quiet indicators of the hidden mathematics.
Each collage becomes a graphic expression of the model’s internal turbulence. Layer sits above layer, and underneath we glimpse not a complete scene but fragments of training data and partial connections, the visual equivalent of a model synthesizing meaning from incompatible sources. Are we witnessing the big bang of meaning for machines?