Mindbath

In a fast-paced urban environment, finding moments for conscious rest can be difficult. Mindbath is an interactive instrument that enables active meditation through sound and visuals. Combining calming generative visuals with a playable instrument, the installation offers audiences a serene escape from the city and invites them to immerse themselves in a tranquil blend of visual and auditory experience.

TIME
Sep 2024 - Dec 2024 (14 Weeks)

TEAM
Elyanna, Sihan

ROLE
Coding, Visual design with TouchDesigner, Form making

TOOLS
Arduino, Python, JavaScript, TouchDesigner

Project Objective

In a fast-paced urban environment, it can be difficult to pause completely and find a moment for conscious rest or meditation. By using a sensor to translate physical interaction into real-time meditative visuals, users create their own visual and auditory experience, fostering a rare sense of communal mindfulness. The instrument is designed to be both calming and accessible to all, regardless of musical ability, inviting urban dwellers to bring shared meditation into their daily routines.

Components

Instrument

We chose a singing bowl for its traditional use in meditation and its accessibility, making it easy for anyone to produce a soothing and harmonious tone.

Sensor

After iterating through several different sensors, we decided on a piezoelectric sensor, which converts mechanical stress into an electric charge and produces an AC signal at its output.
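
Because a piezo disc outputs an alternating signal that swings above and below zero, a common first step is to rectify it and track a decaying peak envelope, turning raw vibration into a usable strike intensity. A minimal sketch of that idea in Python; the sample values and decay constant are illustrative, not our exact pipeline:

```python
def strike_envelope(samples, decay=0.95):
    """Rectify an AC piezo signal and track a decaying peak envelope."""
    level = 0.0
    for s in samples:
        level = max(abs(s), level * decay)  # full-wave rectify, then let peaks decay
        yield level

# Example: a strike burst followed by the bowl ringing down
readings = [0, 120, -340, 210, -60, 15, -5, 0]
print(list(strike_envelope(readings)))
```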

Physical Form

Physical Prototype

We crafted the form by layering 10 boards to add depth and enhance the audience’s sense of immersion, envisioning visuals that create a captivating, engaging experience on the surface. A projector and a mirror display the visuals onto the form.

Final Form

After evaluating the hardness and size of various materials, we selected 28x44-inch chipboard. Ultimately, 11 layers of board were stacked to achieve the desired depth.

Technical Iteration

TouchDesigner

We explored TouchDesigner to expand our visualization techniques and to generate sound from hand tracking. Our goal was an interactive experience in which users control different sounds and notes with specific hand movements, with each finger eventually corresponding to a unique note.
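
Inside TouchDesigner, one way to wire a tracked gesture channel to sound is a CHOP Execute DAT that writes a note frequency into a Constant CHOP driving the oscillator’s pitch. A minimal sketch; the channel name `fingers` and the operator name `pitch_hz` are assumptions, not our exact network:

```python
# CHOP Execute DAT callback: fires when the watched channel's value changes.
# Assumes an incoming channel 'fingers' (0-5 extended fingers) and a
# Constant CHOP named 'pitch_hz' whose value0 drives the oscillator pitch.

PENTATONIC = [261.63, 293.66, 329.63, 392.00, 440.00, 523.25]  # C pentatonic, Hz

def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'fingers':
        op('pitch_hz').par.value0 = PENTATONIC[int(val) % len(PENTATONIC)]
    return
```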

Serial Input from the Sensor
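
A small Python bridge can read the Arduino’s serial stream and forward readings to TouchDesigner, for example over OSC into an OSC In CHOP. A sketch assuming the Arduino prints one analog reading per line at 9600 baud; the port name and OSC address are hypothetical:

```python
import serial                                 # pyserial
from pythonosc.udp_client import SimpleUDPClient

PORT = '/dev/ttyUSB0'                         # hypothetical; use your Arduino's port
client = SimpleUDPClient('127.0.0.1', 7000)   # port of an OSC In CHOP in TouchDesigner

with serial.Serial(PORT, 9600, timeout=1) as ser:
    while True:
        line = ser.readline().decode('ascii', errors='ignore').strip()
        if not line:
            continue
        try:
            value = float(line)               # one sensor reading per line
        except ValueError:
            continue                          # skip malformed lines
        client.send_message('/piezo', value)  # forward to TouchDesigner
```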

The generative visuals are inspired by fractal geometry and the spirograph-like patterns found in nature, reflecting a harmonious, meditative state. Incorporating these organic, repetitive shapes enhances the calming experience, fostering a sense of balance and immersion.
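
For reference, a spirograph curve is a hypotrochoid: the path of a pen held at distance d from the center of a small circle of radius r rolling inside a larger circle of radius R. A short Python sketch of the parametric form; the radii and resolution here are illustrative:

```python
import math

def hypotrochoid(R=8.0, r=3.0, d=5.0, steps=3000, turns=6):
    """Yield (x, y) points along a spirograph (hypotrochoid) curve."""
    k = (R - r) / r
    for i in range(steps):
        t = 2 * math.pi * turns * i / steps
        x = (R - r) * math.cos(t) + d * math.cos(k * t)
        y = (R - r) * math.sin(t) - d * math.sin(k * t)
        yield x, y

points = list(hypotrochoid())  # feed these into any 2D renderer
```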

Hand Tracking for Gesture Input

The distance between the two hands controls the number of sides in the polygon. These interactions create layered shapes of varying complexity that resonate with the physical form, while the gesture controls invite the audience to engage and explore through playful interaction.
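
A sketch of how the palm-distance mapping might be computed with a hand-tracking library such as MediaPipe; the landmark indices are MediaPipe’s own, but the side-count range is an illustrative assumption:

```python
import math
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2)
WRIST = 0  # MediaPipe hand-landmark index for the wrist

def polygon_sides(frame_bgr, min_sides=3, max_sides=12):
    """Map the distance between two tracked hands to a polygon side count."""
    results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks or len(results.multi_hand_landmarks) < 2:
        return None  # need both hands in frame
    a = results.multi_hand_landmarks[0].landmark[WRIST]
    b = results.multi_hand_landmarks[1].landmark[WRIST]
    dist = math.hypot(a.x - b.x, a.y - b.y)  # normalized image coordinates
    return min(max_sides, min_sides + round(dist * (max_sides - min_sides)))
```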

Interaction

  • Meditate by using the singing bowl and rotating the stick.

  • Control the shape and color based on the distance between your palms.

  • Close your fist to clear the visuals from the canvas; a sketch of how this gesture can be detected follows the list.
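
One common heuristic for detecting a closed fist from MediaPipe landmarks is to check that every fingertip has curled past its middle (PIP) knuckle. A sketch sharing the assumptions of the snippet above; it presumes a roughly upright hand, since image y grows downward:

```python
FINGERTIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tip landmarks
PIP_JOINTS = [6, 10, 14, 18]   # the corresponding middle (PIP) joints

def is_fist(hand_landmarks):
    """True when every fingertip sits below its PIP joint in image space."""
    lm = hand_landmarks.landmark
    return all(lm[tip].y > lm[pip].y for tip, pip in zip(FINGERTIPS, PIP_JOINTS))
```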

Takeaways

🤖 Creative Technology Integration – Using Arduino for sensor-based interaction and TouchDesigner for real-time visual generation enhances user engagement.

💭 Consider how different inputs, such as instruments and hand gestures, deepen user engagement by blending movement, sound, and visuals to enhance focus and relaxation.
