What do you do when you're throwing a ceilidh party with two huge screens begging for something fun? You build a real-time music visualizer, of course! That's exactly how my "Live Music Artwork" project began: a quest to transform the vibrant energy of Celtic music into mesmerizing visuals that dance along with fiddles, flutes, and bodhráns.

But here's where it gets exciting: this project became a fascinating exploration of design-first development, where I leveraged AI, specifically Claude Sonnet 4, as a powerful design tool to rapidly prototype and iterate. Instead of getting bogged down in the intricacies of code implementation, I was able to focus entirely on UX decisions and the creative direction, with the AI handling the heavy lifting of bringing those designs to life.

You can experience the live visualizer here: Live Music Artwork

The Spark: A Party and Two Big Screens

The initial motivation was pure fun: a ceilidh party, large projection screens, and a desire for something visually engaging that complemented the live music. Generic visualizers wouldn't cut it; I wanted something deeply responsive and specifically tuned to the unique soul of Celtic performances. This set the stage for an exciting challenge: how to articulate that creative vision through prompts and see it manifest.

Guiding the AI: From Sound to User Experience

The core of any music visualizer is its ability to "hear" and interpret sound. My partnership with Claude allowed me to focus on the user experience of the audio processing rather than the underlying code. I prompted for key functionalities, focusing on what the visualizer needed to do to genuinely react to Celtic music:

  • Real-time microphone capture using the Web Audio API
  • Frequency analysis (FFT) to break down the sound
  • Beat detection specifically tuned for traditional Celtic rhythms
  • Musical note recognition with instrument-specific frequency ranges

This approach meant I was always thinking from a design perspective: "How will this sound interpretation enhance the visual experience?" or "What kind of responsiveness do I need for a lively jig?" The AI then handled the implementation, freeing me to constantly test, refine, and iterate on these UX decisions.
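Concretely, here is roughly what that audio pipeline looks like in the browser. This is a minimal sketch of a Web Audio API setup matching the requirements above, not the project's actual code:

```javascript
// Minimal sketch: microphone capture feeding an FFT analyser.
// Names and parameter values are illustrative.
async function startAudioAnalysis() {
  // Ask the browser for microphone access.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(stream);

  // The AnalyserNode performs the FFT for us.
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 2048;               // yields 1024 frequency bins
  analyser.smoothingTimeConstant = 0.8;  // steadier visuals between frames
  source.connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);

  function tick() {
    analyser.getByteFrequencyData(bins); // current spectrum, 0-255 per bin
    // ...feed `bins` to beat detection, note recognition, and the renderer...
    requestAnimationFrame(tick);
  }
  tick();
}
```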

Example UX-Focused Prompt for Celtic Music
"I need the visualizer to feel truly connected to Celtic music. 
The user experience should capture:
- Bodhrán (frame drum) - the heartbeat of the music (60-250 Hz)
- Fiddle - the melodic voice (G3 to E7 range)  
- Irish flute - the soaring melody lines (C4 to C7 range)
- Accordion - the harmonic foundation (C2 to C6 range)

The beat detection needs to feel natural with jigs and reels - 
users should see visual responses that match the musical energy 
they're experiencing, not just generic audio spikes."
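Those instrument ranges translate directly into slices of the analyser's frequency bins. Here is a hedged sketch of one way to measure each instrument's energy; the band edges come from the prompt above, while the helper itself is illustrative:

```javascript
// Frequency bands from the prompt, in Hz (G3 = 196, E7 ≈ 2637, etc.).
const BANDS = {
  bodhran:   { lo: 60,  hi: 250 },   // frame drum: the rhythmic heartbeat
  fiddle:    { lo: 196, hi: 2637 },  // G3 to E7
  flute:     { lo: 262, hi: 2093 },  // C4 to C7
  accordion: { lo: 65,  hi: 1047 },  // C2 to C6
};

// Average normalised energy (0..1) of one band in the FFT output.
function bandEnergy(bins, band, sampleRate, fftSize) {
  const hzPerBin = sampleRate / fftSize;
  const from = Math.floor(band.lo / hzPerBin);
  const to = Math.min(Math.ceil(band.hi / hzPerBin), bins.length - 1);
  let sum = 0;
  for (let i = from; i <= to; i++) sum += bins[i];
  return sum / (to - from + 1) / 255;
}
```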

The Creative Evolution: Rapid Prototyping a Visual Language

Before landing on the buoyant balloons, my creative journey with Claude was a prime example of rapid prototyping and iteration in action. Each step was a prompt-driven design experiment, allowing me to quickly visualize and evaluate concepts:

The Design Iteration Process

  • Pulse: A fundamental reaction, useful for confirming basic audio input, but quickly deemed too simplistic for the dynamic music
  • Wind: An attempt to introduce more organic, flowing movement, but still not quite the right fit for the celebratory energy
  • Leaves: An exploration of natural elements, but the motion and interaction needed further refinement
  • Balloons: The breakthrough! Playful and celebratory, perfectly capturing the spirit I envisioned for the party

This fast-paced iteration, driven by immediate visual feedback from the AI's generated code, let me narrow down the possibilities quickly and converge on the balloon concept with confidence.

Rapid Prototyping Through Design Prompts
Design Brief: "Create visuals that feel joyful and musical"

Iteration 1: "Simple pulse that grows with volume"
→ Feedback: Too basic, lacks personality

Iteration 2: "Wind-like movement responding to frequencies"  
→ Feedback: More dynamic but too abstract

Iteration 3: "Falling leaves that dance to Celtic rhythms"
→ Feedback: Natural feel but motion isn't quite right

Final Solution: "Floating balloons with realistic physics - 
celebratory, whimsical, perfect for a party atmosphere"
→ Success! Captures the exact energy needed

My Favorite Feature: The Popping Balloons (and User-Centered Refinement)

Once the "Balloon Float" visualization was in place, it became the focus of my most intensive UX work. The AI provided the initial framework for the aesthetic elements:

  • Realistic balloon physics: The natural movement, wobble effects, and string attachments were all initial AI suggestions that brought the balloons to life
  • Music-responsive colors: The intelligent shifting of hues, saturation, and lightness based on frequency, volume, and bass energy was also an AI-driven innovation (a color-mapping sketch follows this list)
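As an illustration of that color behavior, here is a minimal sketch of how hue, saturation, and lightness might be derived from the audio features. The mapping constants are my own guesses, not the project's tuned values:

```javascript
// Music-responsive colour: hue tracks the dominant frequency, saturation
// tracks overall volume, lightness lifts on bass energy.
// `volume` and `bassEnergy` are assumed to be normalised to 0..1.
function balloonColor(dominantHz, volume, bassEnergy) {
  const hue = Math.min(300, (dominantHz / 2000) * 300); // low notes red, high notes violet
  const sat = 40 + volume * 60;                          // louder music = more vivid
  const light = 45 + bassEnergy * 25;                    // bass hits brighten the scene
  return `hsl(${hue.toFixed(0)}, ${sat.toFixed(0)}%, ${light.toFixed(0)}%)`;
}
```

Keeping the mapping in one pure function like this makes it easy to iterate on the palette without touching the physics.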

However, the intelligent popping system was where my direct involvement in refining the user experience truly took center stage. I prompted Claude to develop a sophisticated, multi-layered popping mechanism that felt intuitive and musically aligned:

User Experience-Focused Popping Design

Volume Spike Detection (Primary): My goal was to visually punctuate the sudden, exciting accents in live music: a sharp drum hit, a sudden crescendo. I worked with the AI to refine the logic so that balloons pop in proportion to sudden volume increases, keeping the visuals tied to the dynamics the audience actually hears.

UX-Driven Popping System Design
"The popping needs to feel musically intuitive to party guests. 
When they hear a sharp bodhrán hit or musical accent, 
they should immediately see balloons respond. The experience should feel:

- Immediate: Visual response matches audio excitement  
- Proportional: Bigger musical moments = more balloons popping
- Balanced: Not overwhelming, maintains visual flow
- Surprising: Adds delight without being predictable

Technical requirements:
- Track rolling average volume for spike detection
- Pop 1 balloon at 1.5x average, multiple for 3x+ spikes  
- Include beat detection backup for consistent rhythm
- Collision detection prevents visual overcrowding"
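The technical requirements in that prompt boil down to a small amount of state. Here is a sketch of the rolling-average spike rule; the window size and function names are illustrative:

```javascript
// Rolling-average volume spike detection, per the thresholds above:
// pop one balloon at 1.5x the average, several at 3x and beyond.
const history = [];
const WINDOW = 60; // roughly one second at 60 analysis frames per second

function onVolumeSample(volume, popBalloons) {
  history.push(volume);
  if (history.length > WINDOW) history.shift();
  const avg = history.reduce((a, b) => a + b, 0) / history.length;
  if (avg === 0) return;

  if (volume >= 3 * avg) {
    popBalloons(3);  // big musical moment: pop several
  } else if (volume >= 1.5 * avg) {
    popBalloons(1);  // modest accent: pop one
  }
}
```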

Beat Detection (Secondary): As a consistent visual anchor, I prompted for balloons to also pop on detected musical beats.

Collision Detection: To maintain visual balance and prevent overcrowding, I specifically asked the AI to implement a system where overlapping balloons would automatically pop, ensuring a clean and engaging visual flow.
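As an illustration of that overcrowding rule, here is a sketch that treats each balloon as a circle and pops one of any overlapping pair. A quadratic scan is fine for the few dozen balloons on screen; the structure is mine, not the project's actual code:

```javascript
// Pop one balloon from every overlapping pair to keep the scene uncluttered.
// Assumes each balloon has x, y, radius, and a popped flag set by pop().
function cullOverlaps(balloons, pop) {
  for (let i = 0; i < balloons.length; i++) {
    for (let j = i + 1; j < balloons.length; j++) {
      const a = balloons[i], b = balloons[j];
      if (a.popped || b.popped) continue;
      const dx = a.x - b.x, dy = a.y - b.y;
      const minDist = a.radius + b.radius;
      if (dx * dx + dy * dy < minDist * minDist) {
        pop(b); // keep one of each pair so the sky never empties at once
      }
    }
  }
}
```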

This process of prompting, testing, and refining these complex behaviors, always with the end-user's experience in mind, was incredibly rewarding.

Design-First Development: Crafting the User Experience

Even with AI generating the code, the final user experience was paramount. I guided Claude to include:

An "Audio Test Mode": This was crucial for validating the AI's audio processing and for anyone setting up the visualizer for a party. It provides clear visual feedback, enabling quick troubleshooting purely from a user perspective.

Intuitive Interactive Controls: Simple sliders for sensitivity, mode switching, and a debug info toggle were all prompted to ensure seamless operation during a live event. Fullscreen mode and keyboard shortcuts were essential UX features for a smooth performance.

User Experience Requirements Prompt
"The interface needs to work for two distinct user scenarios:

1. Setup Phase (Technical User):
   - Audio Test Mode with clear visual feedback
   - Sensitivity controls for different microphones/environments  
   - Debug information for troubleshooting
   - Easy mode switching between test and performance

2. Party Phase (Social Context):
   - Clean, distraction-free visual experience
   - Fullscreen mode for projection systems
   - Minimal interface during performance
   - Keyboard shortcuts for quick control (Space = start/stop)

The transition between these modes should be seamless and intuitive."
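For the party phase, the keyboard handling can stay very small. Here is a sketch under those requirements; only the Space binding comes from the prompt above, and the helper functions are hypothetical:

```javascript
// Minimal performance-phase controls. toggleVisualizer() and
// toggleDebugOverlay() are hypothetical helpers standing in for the
// project's own start/stop and debug logic.
document.addEventListener('keydown', (e) => {
  if (e.code === 'Space') {
    e.preventDefault();                           // stop the page from scrolling
    toggleVisualizer();                           // start/stop, per the prompt
  } else if (e.key === 'f') {
    document.documentElement.requestFullscreen(); // standard Fullscreen API
  } else if (e.key === 'd') {
    toggleDebugOverlay();                         // show/hide debug info
  }
});
```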

Exploring New Methodologies: Learning to Lead AI

While I may not be able to explain every line of JavaScript, this project has been a deep dive into exploring new methodologies for design-first development. It's taught me invaluable lessons in:

  • Translating abstract design ideas into actionable AI prompts
  • Rapid prototyping and iterative refinement through AI collaboration
  • Focusing on user experience decisions over low-level implementation details
  • The power of AI as a strategic design partner, not just a coding assistant

Design-First Development Methodology
Traditional Process:
Design → Prototype → Code → Test → Refine → Deploy

AI-Assisted Design-First Process:
Vision → Prompt → Generate → Test → Refine Vision → Re-prompt

Benefits:
- Focus stays on UX and design decisions
- Rapid iteration without technical bottlenecks  
- Immediate visual feedback on design concepts
- More ambitious creative goals become achievable

"Live Music Artwork" isn't just a fun visualizer for my ceilidh party; it's a testament to a new way of building. It showcases how a clear vision and a strong focus on design, combined with intelligent AI collaboration, can bring complex, interactive experiences to life, pushing the boundaries of traditional development. It's about directing, refining, and enjoying the magic that unfolds when human creativity leads AI capabilities.

Bringing Ceilidh to Life

The party was a success! Watching the balloons dance to live fiddle music, seeing guests' faces light up when the balloons popped in sync with the bodhrán, and experiencing the joy of technology enhancing rather than overwhelming traditional music—it was everything I'd hoped for.

More importantly, this project opened my eyes to a new way of creating digital experiences. By leveraging AI as a design tool rather than just a coding assistant, I was able to focus entirely on the user experience, creative vision, and iterative refinement that makes for truly engaging interactions.

The best way to understand this is to experience it yourself. Visit Live Music Artwork and try it with your favorite music.

Whether you're a designer curious about AI-assisted workflows, interested in rapid prototyping methodologies, or just want to see balloons dance to your favorite tunes, I hope this project inspires you to explore the exciting possibilities that emerge when design-first thinking leads AI capabilities.

Ready to see sound?

Try the live music visualizer with your own music!

Want to discuss creative coding or collaborate on a project?