Living Surface
Transforms static AI images into living surfaces
Graphic designer
Prompt engineer
Feb 5-6, 2026
Design x AI
Human-AI collaboration
Computational creativity
ChatGPT
Nano Banana
Veo 3
Each work preserves a fixed composition while allowing internal elements—texture, glyphs, pixel noise, and painterly surfaces—to subtly animate over time. The image remains visually still, yet perceptually alive. A central component of the project is the reverse engineering of master prompts from finished images, using AI to translate static artworks into time-based, rule-governed visual systems.
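The core constraint described above — a fixed composition whose surface texture drifts over time — can be sketched computationally. The snippet below is a minimal illustration, not the project's actual pipeline: it treats an image as a NumPy array and generates frames where only low-amplitude pixel noise changes, so the composition stays stable while the surface stays "alive". The amplitude value is an illustrative assumption.

```python
import numpy as np

def animate_surface(base, n_frames=8, amplitude=0.02, seed=0):
    """Yield frames whose composition matches `base` while pixel-level
    texture drifts slightly over time. A sketch of the fixed-composition,
    living-surface constraint; `amplitude` is a hypothetical noise budget."""
    rng = np.random.default_rng(seed)
    for _ in range(n_frames):
        # Only the fine-grained noise layer varies between frames.
        noise = rng.normal(0.0, amplitude, size=base.shape)
        yield np.clip(base + noise, 0.0, 1.0)

base = np.full((64, 64), 0.5)            # a flat mid-grey stand-in for a painting
frames = list(animate_surface(base))

# Composition is stable: each frame's global mean stays within the noise budget.
drift = max(abs(f.mean() - base.mean()) for f in frames)
```

Because the noise is zero-mean and small, any two frames differ at the pixel level while their overall tonal balance remains effectively identical — the same stillness-with-motion the works aim for.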
Prompt Reverse Engineering as Translation:
Starting from a finished AI-generated image, I reconstruct a high-level prompt that captures:
- visual hierarchy
- material qualities (oil paint, blur, grain)
- abstraction layers (ASCII glyphs, pixelation, monochrome masking)
- compositional constraints (static frame, no camera movement)
This reconstructed prompt is then used to generate a time-based version of the same artwork, where motion is constrained by the logic of the original image rather than imposed afterward.
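One way to picture the reconstruction step is as a structured template whose fields mirror the checklist above. This is a hypothetical sketch, not the project's actual prompt format; the example values are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class MasterPrompt:
    """A reconstructed 'master prompt', with fields mirroring the checklist:
    hierarchy, materials, abstraction layers, and compositional constraints."""
    hierarchy: str
    materials: list
    abstraction: list
    # The static-frame constraints come directly from the project description.
    constraints: list = field(
        default_factory=lambda: ["static frame", "no camera movement"]
    )

    def render(self) -> str:
        # Flatten the structured fields into a single prompt string.
        return (
            f"Composition: {self.hierarchy}. "
            f"Materials: {', '.join(self.materials)}. "
            f"Abstraction: {', '.join(self.abstraction)}. "
            f"Constraints: {'; '.join(self.constraints)}."
        )

# Hypothetical values for a single finished image.
prompt = MasterPrompt(
    hierarchy="central figure over dark ground",
    materials=["oil paint", "grain"],
    abstraction=["ASCII glyphs", "monochrome masking"],
)
text = prompt.render()
```

Keeping the prompt rule-governed in this way is what lets the later motion stage inherit the original image's logic rather than having movement imposed on it afterward.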
In this process, AI functions as a translator between media:
- image → language
- language → motion
- painting → ambient system
This translation is intentionally imperfect and experimental, embracing loss, distortion, and reinterpretation.
The final works exist as dynamic wallpapers—images that remain compositionally stable while quietly evolving. They are designed to be lived with, not watched.
By combining prompt reverse engineering with constrained animation, Living Surface proposes a slower, more material future for AI-generated art:
one where images are not consumed, but inhabited.
Harbor Bai © 2025. All Rights Reserved