Bleached

The Problem: Carbon-driven coral bleaching is distant, abstract, and underwater—making it invisible and hard to connect to. Standard museum displays, like touchscreens and videos, often fail to create lasting engagement.

Our design goal: Create an immersive experience that transforms user presence into a visceral reminder of environmental impact.

The Solution: Bleached is an interactive 360° installation designed to educate users about coral bleaching through audio-visual immersion. Built with generative visuals, Arduino sensors, Unity, and spatial audio, the installation surrounds users with a thriving reef that degrades in response to their proximity, transforming passive engagement into a cause-and-effect learning experience.

Client

CSULB Innovation Lab

Project

Digital Installation

Year

2023

Tools

Unity, Igloo 360 Room, Arduino, Premiere, After Effects, Figma

Team

Retish Aditya, Keira Wong, Aleyna Akkan

Context

Developed in one month at CSULB’s Innovation Lab, the project prototypes a scalable framework for interactive educational environments. It explores how sensor-driven feedback, environmental storytelling, and immersive media can create more memorable and emotionally resonant learning experiences. The core design principles—real-time feedback, spatial responsiveness, and multimodal storytelling—position Bleached as a case study for future museum or exhibition-based interactive installations.

Research

Inspired by the Aquarium of the Pacific’s most memorable exhibits, we sought to reverse the typical passive experience. Our vision was to create a thriving underwater world that reacted to human presence—not just to inform, but to confront. When guests entered the room, their physical proximity triggered a cascade of visual and sonic degradation, directly connecting their presence to environmental collapse.

To better understand how visitors engage with environmental education in cultural institutions, we studied 8 interactive installations at the Aquarium of the Pacific. Our goal was to identify which types of experiences resonated most with guests across age groups, and why.

We noticed that only 3 installations elicited meaningful engagement—usually those involving physical interaction, unexpected triggers, or living organisms. Meanwhile, digital displays like iPads or video walls received minimal attention and quick drop-off.

We also conducted informal interviews and observations of parents, children, and educators to gather insights into how they approached and remembered different exhibits.

From this research, two clear pain points emerged:

  1. Lack of Feedback – Most digital installations felt static and did not respond to user presence. This made them easy to ignore and forget.

  2. Interface Fatigue – Devices like tablets and touchscreens felt too familiar. Guests didn’t feel curious, challenged, or emotionally drawn in.

These insights led us to reframe our problem:

How might we create an immersive, educational installation that uses interaction and feedback to make environmental consequences feel immediate and personal?

Development

We pitched four early concepts based on metaphor and interaction:

  • Dramatization – Use visual contrast (fire vs. ocean) to provoke urgency

  • Past & Future – Force visitors to witness a deteriorated future environment

  • Source & Impact – Connect CO₂ emissions directly to ocean acidification

The fourth concept, an underwater perspective on the death of coral life driven by the presence of museumgoers, became our final direction.

From there, we built Bleached using the following pipeline:

  • Visuals: We used generative AI to design vibrant reef scenes and composited them into two animated 8K panoramas—one alive, one bleached. Given the scale of the dome’s screens, we needed original high-definition source imagery, so we generated the underwater coral reef images with AI and then animated them.

  • Audio: A calm underwater soundscape faded into a distorted, hollow tone as the interaction progressed. We layered copyright-free audio to build the spatial underwater experience.

  • Interaction Logic: We used an Arduino with ultrasonic sensors to detect user proximity and duration, feeding that data into Unity for real-time playback control.

  • Environment: The installation was deployed inside CSULB’s Igloo Dome, a 360° immersive space that allowed full spatial projection.
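The sensing step in the pipeline above can be sketched as follows. This is a minimal illustration under stated assumptions, not the project’s actual code: it assumes an HC-SR04-style ultrasonic sensor whose echo-pulse duration (in microseconds) is forwarded to the host, and it converts that duration into a smoothed distance reading. The function names and smoothing constant are hypothetical.

```python
def echo_to_cm(echo_us: float) -> float:
    """Convert an ultrasonic echo-pulse duration (µs) to distance (cm).

    Sound travels roughly 0.0343 cm/µs; the pulse covers the distance
    twice (out and back), so we halve the result.
    """
    return echo_us * 0.0343 / 2.0


class SmoothedProximity:
    """Exponential moving average to steady noisy readings before
    they drive the visuals."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha          # smoothing factor in (0, 1]
        self.value = None           # last smoothed distance, in cm

    def update(self, raw_cm: float) -> float:
        if self.value is None:
            self.value = raw_cm
        else:
            self.value = self.alpha * raw_cm + (1 - self.alpha) * self.value
        return self.value
```

In the installation itself, this role was filled by the Arduino polling the sensor and streaming values to Unity over serial; the sketch only shows the shape of the conversion and smoothing.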

Two levels of interaction were designed:

  • Short-term presence faded the reef gradually

  • Sustained presence triggered full bleaching, reinforcing the message of cumulative impact
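The two-level interaction above can be sketched as a simple accumulator: while a visitor is detected, bleaching builds with dwell time, so brief visits only partially fade the reef while sustained presence saturates it. The rates and class name here are illustrative assumptions, not the installation’s actual tuning, and the slow recovery when the room empties is added purely for the sketch.

```python
class BleachController:
    """Maps presence duration to a bleaching level in [0, 1].

    0.0 = thriving reef, 1.0 = fully bleached. A renderer such as
    Unity would read this level each frame to crossfade between the
    living and bleached panoramas.
    """

    def __init__(self, bleach_rate: float = 0.05, recover_rate: float = 0.01):
        self.bleach_rate = bleach_rate      # per second while a visitor is near
        self.recover_rate = recover_rate    # per second when the room is empty
        self.level = 0.0

    def update(self, dt: float, visitor_present: bool) -> float:
        if visitor_present:
            self.level += self.bleach_rate * dt
        else:
            self.level -= self.recover_rate * dt
        self.level = min(1.0, max(0.0, self.level))   # clamp to [0, 1]
        return self.level

    @property
    def fully_bleached(self) -> bool:
        return self.level >= 1.0
```

At these example rates, a few seconds of presence fades the reef only slightly, while about 20 seconds of sustained presence drives it to full bleaching, mirroring the cumulative-impact message.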

What we learned

During final reviews and walkthroughs, we observed reactions and collected feedback from students and designers. These were our key takeaways:

  • Narrative Extension Was Missing

    The experience was effective in-the-moment but didn’t offer guests a way to continue learning. Printed materials, QR codes, or digital extensions could drive longer-term impact.

  • Sensor Precision Matters

    Our proximity sensor lacked the fidelity to tie user movement closely to the reef’s condition. Future iterations would benefit from motion tracking or gesture-based inputs.

  • Real vs. Generated Media

    While AI-generated visuals were efficient and stylized, they lacked biological realism. Live footage or photorealistic rendering could deepen emotional resonance.

  • Emotional Pacing Was Effective

    The gradual degradation of the reef created a strong emotional arc. Visitors expressed surprise and discomfort—indicators that the system effectively externalized abstract climate data into felt experience.

Next Steps

We envision future versions that expand multi-user interactions, embed real reef footage, integrate scent and haptic cues, and offer tangible educational extensions. Most importantly, we want to evolve the installation into a mobile toolkit for schools and public spaces—bringing climate storytelling where it’s needed most.