Kaiju is an XR stage short film that blends live-action performance with real-time virtual production in Unreal Engine. The film tells the story of a city under siege by a colossal kaiju, with the protagonist experiencing the devastation firsthand. Built around LED volume technology and motion capture, Kaiju merges practical and digital effects into a seamless cinematic experience.

LED Volume
VFX
Virtual Production
Environment Design
Contributions
Concept & Story Development
"Kaiju" began as a concept rooted in reimagining the monster movie format through the lens of virtual production and real-time environments. The film is told from the ground-level perspective of civilians, reporters, and military responders during the sudden invasion of a city by a colossal creature. Rather than focusing on the monster itself, the narrative centers around human reaction, media panic, and urban collapse. Framed initially as a news broadcast, the film quickly spirals into chaos as sirens wail, distant tremors shake the screen, and explosions illuminate the skyline. The goal was to tell a visceral story with minimal exposition, relying instead on environmental storytelling, visual escalation, and emotional timing. I structured the narrative into a classic three-act arc: tension, escalation, and a crescendo of destruction, leaving viewers suspended in a moment just before everything comes undone.




Previsualization & Planning
During the pre-production phase, I sketched a series of storyboards and motion diagrams to map camera angles, environment builds, and the dynamic flow of each scene. Since we were working with an XR stage, every shot needed to be meticulously blocked to accommodate camera tracking, actor placement, and the parallax requirements of the LED volume. I created a previs edit with greybox environments inside Unreal Engine to test the pacing and visual rhythm early on, iterating in real time with the team. These low-fidelity sequences became the foundation for aligning the virtual and physical aspects of production, ensuring that the performers could interact believably with digital environments while giving us room to fine-tune lighting, framing, and visual effects before stepping onto the stage.


Environment Design in Unreal Engine
The world of Kaiju was built entirely within Unreal Engine 5.3, where I designed a sprawling urban environment that could shift in tone and lighting throughout the film. I developed modular city blocks using Marketplace assets and high-fidelity textures from Quixel Megascans to give a grounded and cinematic feel to the setting. Each building facade was constructed with destruction states in mind — from pristine glass towers in the opening to scorched and collapsing structures in the climax. Environmental storytelling was critical: abandoned vehicles, scattered papers, downed power lines, and flickering streetlamps helped convey panic without ever needing to show the creature directly. To simulate urban decay and warzone chaos, I created looping particle effects for smoke, fire, debris, and dust using Niagara, layering them with time-of-day lighting changes to create an increasingly volatile atmosphere.
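As a rough illustration of this setup, the sketch below shows how a looping Niagara smoke emitter and a slow time-of-day shift on the sun light could be driven from a single helper actor. The class AAtmosphereDirector and its properties (SmokeSystem, SunLight, SunDegreesPerSecond) are illustrative names, not the production code itself.

```cpp
// AtmosphereDirector.h (sketch) -- small helper actor that keeps the atmosphere
// "alive": a looping Niagara smoke system plus a slow sun rotation.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "AtmosphereDirector.generated.h"

UCLASS()
class AAtmosphereDirector : public AActor
{
    GENERATED_BODY()

public:
    AAtmosphereDirector() { PrimaryActorTick.bCanEverTick = true; }

    // Looping smoke/dust system authored in Niagara, assigned in the editor.
    UPROPERTY(EditAnywhere, Category = "Atmosphere")
    class UNiagaraSystem* SmokeSystem = nullptr;

    // Sun light whose pitch drives the tonal shift across the film.
    UPROPERTY(EditAnywhere, Category = "Atmosphere")
    class ADirectionalLight* SunLight = nullptr;

    // Degrees of sun rotation per second; small values give a slow drift.
    UPROPERTY(EditAnywhere, Category = "Atmosphere")
    float SunDegreesPerSecond = 0.25f;

    virtual void BeginPlay() override;
    virtual void Tick(float DeltaSeconds) override;

private:
    UPROPERTY()
    class UNiagaraComponent* SmokeComponent = nullptr;
};

// AtmosphereDirector.cpp (sketch)
#include "AtmosphereDirector.h"
#include "NiagaraFunctionLibrary.h"
#include "NiagaraComponent.h"
#include "Engine/DirectionalLight.h"

void AAtmosphereDirector::BeginPlay()
{
    Super::BeginPlay();

    if (SmokeSystem)
    {
        // bAutoDestroy = false keeps the looping emitter alive for the whole take.
        SmokeComponent = UNiagaraFunctionLibrary::SpawnSystemAtLocation(
            GetWorld(), SmokeSystem, GetActorLocation(), GetActorRotation(),
            FVector(1.f), /*bAutoDestroy=*/false, /*bAutoActivate=*/true);
    }
}

void AAtmosphereDirector::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (SunLight)
    {
        // Slowly lower the sun so the scene moves from neutral daylight
        // toward the low, fiery angles of the climax.
        FRotator SunRotation = SunLight->GetActorRotation();
        SunRotation.Pitch -= SunDegreesPerSecond * DeltaSeconds;
        SunLight->SetActorRotation(SunRotation);
    }
}
```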
Virtual Cinematography & Sequencer Workflow
As the cinematographer, I used Unreal’s CineCameraActors and Sequencer to design and orchestrate each shot. The project called for both grounded, handheld-style compositions and sweeping cinematic visuals to create a feeling of being embedded in the chaos. I set up multiple virtual lenses that mimicked real-world film optics, and integrated depth of field, custom LUTs, and chromatic aberration to reinforce the tone. Sequencer timelines were organized per scene, each containing camera movement, character animation, lighting adjustments, fog intensity, and Niagara system triggers. For transitions, I choreographed camera movement that mimicked practical rigs like jibs, cranes, and shoulder cams, blending traditional filmmaking language with real-time technology. Each camera path was tested live with LED wall previews, ensuring accuracy between Unreal renders and what was captured on set.
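The snippet below is a minimal sketch of configuring one such virtual lens on a CineCameraActor. The focal length, aperture, filmback, and focus distance are illustrative values rather than the exact settings used on the film, and SetupHandheldLens is a hypothetical helper function.

```cpp
#include "CineCameraActor.h"
#include "CineCameraComponent.h"

// Configure a CineCameraActor to behave like a fast 35mm prime on a
// Super 35 body, with focus pulled manually to a mark on the stage floor.
void SetupHandheldLens(ACineCameraActor* CameraActor)
{
    if (!CameraActor)
    {
        return;
    }

    UCineCameraComponent* Lens = CameraActor->GetCineCameraComponent();

    // Wide-open 35mm prime for a shallow, documentary-style look.
    Lens->CurrentFocalLength = 35.f;
    Lens->CurrentAperture = 2.0f;

    // Super 35 filmback so the field of view matches a physical camera package.
    Lens->Filmback.SensorWidth = 24.89f;
    Lens->Filmback.SensorHeight = 18.67f;

    // Manual focus at 3.5 m (Unreal units are centimeters).
    Lens->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
    Lens->FocusSettings.ManualFocusDistance = 350.f;
}
```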
Live-Action Integration on the XR Stage
One of the most distinctive aspects of Kaiju was the integration of real actors into Unreal environments using an LED volume. On set, I worked closely with the camera operator and gaffer to sync lighting and actor blocking with the digital backdrop. I designed each shot to accommodate realistic shadow casting and light reflections off the LED wall, using HDRI textures and DMX-controlled lighting to blur the line between physical and digital space. The actors' performances were informed by pre-rendered sequences, which allowed them to react naturally to environmental destruction, whether looking toward collapsing buildings, dodging invisible shockwaves, or responding to distant monster sounds. The spatial sync between physical and virtual elements was meticulously managed using nDisplay and real-time camera tracking, resulting in immersive in-camera VFX with minimal need for post-compositing.
VFX, Animation & Creature Presence
Even though the Kaiju itself is never fully revealed, its presence is constantly felt through the environment and action. I designed the illusion of its size and movement using a combination of indirect cues: shaking camera movements, falling debris, shadows sweeping over buildings, and reactions from both characters and the environment. Animation-wise, I implemented soldier movements via Mixamo animations, carefully timed with external events like nearby explosions or aircraft flybys. Helicopters were animated to circle the Kaiju’s implied location, while missile trails and timed blasts were created using Niagara particle emitters and mesh trajectories. The creature’s footsteps were expressed through earth tremors and real-time physics events such as bouncing cars and shattering windows, keeping the suspense intact while letting the audience’s imagination complete the picture.
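The sketch below shows one way an off-screen footstep can be made to be felt rather than seen: a camera shake on the live camera combined with a radial impulse that bounces nearby simulating props such as cars and debris. TriggerImpliedFootstep, FootstepShakeClass, and the radius and strength values are hypothetical placeholders, not the film's actual implementation.

```cpp
#include "Engine/World.h"
#include "Engine/OverlapResult.h"
#include "GameFramework/PlayerController.h"
#include "Components/PrimitiveComponent.h"
#include "Camera/CameraShakeBase.h"

// Sell an unseen footstep through camera shake plus physics reactions.
void TriggerImpliedFootstep(UWorld* World, const FVector& FootstepOrigin,
                            TSubclassOf<UCameraShakeBase> FootstepShakeClass)
{
    if (!World)
    {
        return;
    }

    // Shake the live camera so the tremor reads in the captured footage.
    if (APlayerController* PC = World->GetFirstPlayerController())
    {
        PC->ClientStartCameraShake(FootstepShakeClass, /*Scale=*/1.f);
    }

    // Find simulating props within 30 meters of the implied footfall.
    const float RadiusCm = 3000.f;
    TArray<FOverlapResult> Overlaps;
    World->OverlapMultiByObjectType(
        Overlaps, FootstepOrigin, FQuat::Identity,
        FCollisionObjectQueryParams(ECC_PhysicsBody),
        FCollisionShape::MakeSphere(RadiusCm));

    for (const FOverlapResult& Overlap : Overlaps)
    {
        if (UPrimitiveComponent* Prim = Overlap.GetComponent())
        {
            if (Prim->IsSimulatingPhysics())
            {
                // Linear falloff keeps cars at the edge of the radius subtle.
                Prim->AddRadialImpulse(FootstepOrigin, RadiusCm,
                                       /*Strength=*/50000.f,
                                       ERadialImpulseFalloff::RIF_Linear,
                                       /*bVelChange=*/true);
            }
        }
    }
}
```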
Sound Design & Emotional Atmosphere
Sound played a massive role in building tension and making the virtual world feel alive. I developed a layered soundscape in Adobe Audition, beginning with subtle newsroom murmurs and distant city ambiance that eventually gave way to blaring sirens, deep growls, radio static, and the chaos of war. Carefully synced audio peaks were matched to environmental events — glass breaking as a blast goes off, propellers slicing through the sky, and terrified breaths from civilians crouching behind rubble. To add realism, I included reverb filters and low-pass effects based on location perspective: for example, indoor shots had muffled outside destruction, while wide outdoor shots echoed with depth and open space. The Kaiju’s sound signature itself was a blend of low-frequency bass pulses and distorted organic growls, designed to be felt more than heard.
Editing, Color & Final Delivery
The final film was edited in Adobe Premiere Pro and color graded in DaVinci Resolve. I used Unreal’s Movie Render Queue to export high-resolution sequences in EXR and ProRes formats, ensuring maximum visual fidelity. I color matched the XR stage shots with Unreal renders for seamless transitions and used color to express tonal shifts: neutral tones for the opening, washed-out blues and reds during panic scenes, and warm, fiery hues during the final act. I added minimal titles and lower thirds to preserve the immersive feel, keeping the broadcast aesthetic intact in the early scenes and letting it deteriorate as the film spiraled into chaos. The final render was exported in both 1080p for web delivery and 4K for submission to film festivals and portfolio archives.
Reflection & Takeaways
Kaiju was an intense and rewarding dive into the possibilities of virtual production, blending real-time Unreal workflows with the collaborative demands of filmmaking. This project pushed me to think like a director, environment artist, VFX lead, and technical supervisor all at once. From building entire cities to syncing real actors with LED volumes, the process sharpened my understanding of cinematic storytelling in the age of XR. I walked away with a deeper knowledge of scene optimization, DMX lighting control, sequencer shot management, and the subtle art of letting visual and sonic suggestion evoke scale and emotion without showing too much. It remains one of my most complete and ambitious productions, a true XR hybrid made possible through a mix of narrative focus, technical execution, and creative problem-solving.