When Game of Thrones premiered in 2011, the television landscape was ill-equipped to handle the visual complexity of high-fantasy creatures. At the time, dragons were often relegated to low-budget Syfy movies or static, distant shots. However, as the series progressed, the “children” of Daenerys Targaryen—Drogon, Rhaegal, and Viserion—became the gold standard for visual effects (VFX) in digital storytelling.
The story of what happens to these dragons is not just a narrative arc of war and loss; it is a chronicle of a decade-long breakthrough in computer graphics, hardware processing, and artificial intelligence. To understand the dragons, one must look past the scales and fire to the sophisticated technological framework that brought them to life.

The Architecture of Digital Life: Sculpting and Rigging
The creation of a dragon begins with a digital skeleton, a process known as rigging. In the early seasons, the dragons were small enough to be handled by basic animation software, but as they grew to the size of Boeing 747s, the underlying technology had to evolve to simulate the weight and physics of such massive entities.
Biomechanics and Skeletal Rigging
The tech teams at Pixomondo and later Scanline VFX used advanced biomechanical modeling to ensure the dragons moved realistically. They didn’t just animate a shape; they built an internal skeletal structure and a musculature system that reacted to movement. Using software like Autodesk Maya, technicians developed custom “rigs” that allowed animators to manipulate joints while the software automatically calculated how the surrounding “flesh” and “muscle” should bulge or stretch. This anatomical grounding prevented the weightless, rubbery movement that instantly betrays a digital creature.
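To make the driven-rig idea concrete, here is a minimal Python sketch of the principle, not the production setup: a joint’s flexion angle drives a normalized muscle-bulge weight, the kind of value that would feed a corrective blendshape or muscle deformer inside Maya. The angle range and easing curve are illustrative assumptions.

```python
def muscle_bulge(joint_angle_deg: float,
                 rest_angle: float = 0.0,
                 max_angle: float = 90.0,
                 max_bulge: float = 1.0) -> float:
    """Map a joint's flexion angle to a normalized 'bulge' weight.

    In a Maya-style rig this value would drive a corrective blendshape
    or a muscle deformer so the flesh swells as the joint bends.
    A smoothstep curve keeps the transition from looking mechanical.
    (Illustrative sketch only; the ranges are assumed, not from the show.)
    """
    t = (joint_angle_deg - rest_angle) / (max_angle - rest_angle)
    t = max(0.0, min(1.0, t))                 # clamp to the valid range
    return max_bulge * (3 * t**2 - 2 * t**3)  # smoothstep easing

# As a hypothetical wing joint folds from 0 to 90 degrees,
# the shoulder muscle weight ramps smoothly from 0.0 to 1.0.
for angle in (0, 30, 60, 90):
    print(angle, round(muscle_bulge(angle), 3))
```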
Skin Texturing and Subsurface Scattering
One of the most significant tech hurdles was the rendering of dragon skin. To make the scales look authentic, engineers employed “Subsurface Scattering” (SSS). This technology simulates how light penetrates a translucent surface (like skin or scales) and scatters, creating a glow that distinguishes organic matter from plastic or metal. By the final seasons, the dragons’ skin featured millions of individual polygons, each assigned its own reflective properties so the surface reacted convincingly to the changing light of the North or the bright sun of King’s Landing.
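The principle is easier to see in code. The sketch below uses “wrap lighting,” a cheap diffuse trick often used as a stand-in for full subsurface scattering; it is illustrative Python, not the shaders used on the show.

```python
def wrap_diffuse(n_dot_l: float, wrap: float = 0.5) -> float:
    """Cheap subsurface-style shading term (a stand-in for true SSS).

    Plain Lambert diffuse is max(N.L, 0); 'wrap lighting' lets light
    bleed past the terminator, mimicking light that has scattered
    beneath the surface. wrap=0 is plain Lambert, wrap=1 is fully wrapped.
    """
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

# A grazing angle (N.L = -0.2) is pure black under Lambert shading,
# but picks up a soft glow once wrap lighting is applied.
print(wrap_diffuse(-0.2, wrap=0.0))   # 0.0 -> hard, plastic-looking terminator
print(wrap_diffuse(-0.2, wrap=0.5))   # 0.2 -> soft, organic falloff
```

That soft falloff at the edge of the light is what separates living scales from painted plastic.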
The Physics of Flight and Fire: Simulating Realism
In the world of VFX, two of the most difficult elements to render are fluid dynamics (fire) and complex collisions (flight). For Game of Thrones, the tech stack had to be upgraded annually to keep up with the increasing scale of the dragon sequences.
Aerodynamics and Wing Simulation
To make the dragons’ flight feel earned, the VFX teams studied the wing-to-body ratios of fruit bats and eagles. They implemented aerodynamics simulators that calculated air resistance against the dragons’ wings. When Drogon dove toward an enemy, the software simulated the way the wing membrane would vibrate—a phenomenon known as “flutter”—based on the simulated wind speed. This level of granular detail required massive computational power, often utilizing parallel processing across hundreds of servers to render a single frame of flight.
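Because drag and flutter both scale with the square of airspeed, a dive changes the forces dramatically. The toy Python sketch below applies the standard drag equation alongside a simple flutter term; the wing area, drag coefficient, and membrane tension are invented numbers used only to illustrate the scaling.

```python
AIR_DENSITY = 1.225  # kg/m^3, roughly sea-level air

def drag_force(speed_ms: float, wing_area_m2: float, drag_coeff: float = 1.2) -> float:
    """Classic drag equation: F = 0.5 * rho * v^2 * Cd * A (newtons)."""
    return 0.5 * AIR_DENSITY * speed_ms**2 * drag_coeff * wing_area_m2

def membrane_flutter(speed_ms: float, tension: float = 400.0) -> float:
    """Toy model: flutter amplitude grows with dynamic pressure and is
    damped by how tightly the membrane is stretched (made-up units)."""
    dynamic_pressure = 0.5 * AIR_DENSITY * speed_ms**2
    return dynamic_pressure / tension

# Tripling airspeed from 20 m/s to 60 m/s raises drag and flutter
# roughly ninefold, because both scale with velocity squared.
for v in (20, 60):
    print(v, round(drag_force(v, wing_area_m2=180.0)), round(membrane_flutter(v), 2))
```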
Procedural Fire Generation and Fluid Dynamics
Early seasons used a mix of practical fire and digital overlays. However, as the “Dracarys” moments grew in scale, the show turned to procedural fire generation using SideFX Houdini. Houdini builds effects procedurally, simulating the physical behavior of fluids and gases rather than relying on hand-drawn animation. Instead of “drawing” fire, the tech team programmed the physical properties of the dragon’s breath (velocity, temperature, and fuel density) and let the software simulate the combustion. This allowed the fire to interact realistically with the environment, curling around corners or reflecting off the water of Blackwater Bay.
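A production Pyro solver is far beyond the scope of a blog post, but the idea of programming the fire instead of drawing it can be shown with a toy one-dimensional step in Python: a temperature field is carried downstream by the flow, then cools. Everything here is a simplified stand-in for what Houdini actually computes in 3D with pressure, buoyancy, and fuel.

```python
def advect_and_cool(temps, velocity_cells_per_step, cooling_rate=0.9):
    """One explicit step of a toy fire simulation on a 1-D grid.

    Each cell's temperature is carried downstream by the flow (advection)
    and then loses energy to the surroundings (cooling). Real solvers add
    pressure projection, buoyancy, and combustion of a fuel field.
    """
    shifted = [0.0] * velocity_cells_per_step + temps[:-velocity_cells_per_step]
    return [t * cooling_rate for t in shifted]

# A burst of breath: a hot pocket injected at the mouth (cell 0)
# travels along the grid and fades as it goes.
field = [1000.0] + [0.0] * 9
for _ in range(4):
    field = advect_and_cool(field, velocity_cells_per_step=2)
    print([round(t) for t in field])
```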

Motion Capture and Human-Machine Interaction
One of the most critical aspects of the dragons’ presence was their interaction with the human cast. To bridge the gap between a digital asset and a physical actor, HBO utilized cutting-edge “Simulcam” technology and sophisticated on-set rigs.
The Role of the Simulcam and VR
During the filming of the later seasons, directors used “Simulcam” technology—originally pioneered by James Cameron for Avatar. This allowed the camera operators to see a low-resolution, real-time digital version of the dragon through their viewfinders while filming an empty landscape. This tech integration ensured that the camera’s movement felt organic, as if the operator were actually tracking a massive flying creature, rather than guessing its position. In some instances, VR headsets were used by directors to “scout” digital sets, allowing them to place the digital dragons in the environment before a single frame was shot.
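At its core, that viewfinder preview relies on the standard “over” composite: a low-resolution proxy render of the dragon is layered onto the live camera plate every frame. The Python below is a single-pixel illustration with invented values, not the real-time system used on set.

```python
def alpha_over(fg, alpha, bg):
    """Standard 'over' composite: blend a proxy render of the dragon
    onto the live plate, weighted by the proxy's opacity.
    Values are per-channel floats in [0, 1]."""
    return [f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg)]

# Hypothetical pixel: a grey proxy dragon at 80% opacity over a blue-sky plate.
dragon_pixel = [0.35, 0.35, 0.35]
sky_pixel    = [0.20, 0.45, 0.80]
print([round(c, 2) for c in alpha_over(dragon_pixel, 0.8, sky_pixel)])
```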
The “Big One”: Mechanical Rigs and Motion Bases
For Emilia Clarke (Daenerys), interacting with her dragons required a physical interface. The production used a “motion base,” or gimbal: a massive, computer-controlled hydraulic rig that could roll, pitch, and yaw. The movement of this rig was synced precisely with the pre-animated flight paths of the digital dragon. This synchronization meant that when the digital dragon banked left in the final render, the actress was physically banking left on set, ensuring that her hair, clothes, and body weight shifted believably and the final composite read as seamless.
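Conceptually, the sync works by sampling the same animation curve that drives the digital dragon and feeding it to the rig’s controller. The sketch below is a hypothetical Python illustration with invented keyframe values, not the production motion-control code.

```python
def sample_flight_path(keys, t):
    """Linearly interpolate a pre-animated flight curve at time t.

    `keys` is a list of (time_s, roll_deg, pitch_deg) tuples exported
    from the animation package; replaying the same curve on the gimbal
    keeps the actress and the digital dragon banking in lockstep.
    """
    keys = sorted(keys)
    for (t0, r0, p0), (t1, r1, p1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return r0 + a * (r1 - r0), p0 + a * (p1 - p0)
    return keys[-1][1], keys[-1][2]

# Hypothetical maneuver: level flight, a 25-degree left bank, then recovery.
flight = [(0.0, 0.0, 0.0), (2.0, -25.0, 5.0), (4.0, 0.0, 0.0)]
for t in (0.0, 1.0, 2.0, 3.0):
    roll, pitch = sample_flight_path(flight, t)
    print(f"t={t:.1f}s  gimbal roll={roll:+.1f}  pitch={pitch:+.1f}")
```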
The Infrastructure of Scale: Cloud Computing and Rendering Pipelines
The sheer volume of data required to bring the dragons to life in the final season was staggering. As the dragons grew, so did the “render time”—the amount of time a computer takes to process all the lighting, textures, and physics into a finished image.
Scalable Rendering and GPU Optimization
By the time the series reached its climax, the rendering requirements exceeded the capacity of traditional local servers. The production shifted toward high-performance computing (HPC) and cloud-based rendering pipelines. Using thousands of GPUs (Graphics Processing Units) working in tandem, the team could process the complex “Global Illumination” algorithms required for the dragons. These algorithms calculate how light bounces off the dragon’s scales and onto the environment, such as the snow in Winterfell or the stone of the Red Keep. Without the shift to GPU-accelerated rendering, a single episode would have taken years to process.
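The reason GPUs help so much is that global illumination is built from enormous numbers of independent light samples. The toy Monte Carlo estimate below, written in Python with an invented emitter brightness, shows that per-sample independence: every sample can be computed without waiting on any other, which is exactly the kind of work a farm of GPUs chews through in parallel.

```python
import math
import random

def bounced_light(samples: int = 10_000) -> float:
    """Toy, unnormalized Monte Carlo estimate of light bounced onto a
    patch of snow from a glowing surface overhead (a stand-in for
    firelit scales). Each sample is independent of every other sample,
    which is why the workload parallelizes so well.
    """
    emitter_radiance = 5.0  # hypothetical brightness of the scales
    total = 0.0
    for _ in range(samples):
        # Cosine-weighted hemisphere sample above the snow patch.
        theta = math.asin(math.sqrt(random.random()))
        # Count the sample if it points into the emitter's 30-degree cone.
        if theta < math.radians(30):
            total += emitter_radiance
    return total / samples

print(round(bounced_light(), 3))   # converges near 1.25 in this toy setup
```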
AI and Machine Learning in Post-Production
In the final stages of the series, machine learning began to play a role in “denoising” and “rotoscoping.” AI tools were trained to identify the edges of the dragons, helping the VFX artists separate the digital creatures from the background more efficiently. This allowed for faster iterations; if a director wanted to change the dragon’s head position slightly, AI-assisted tools could re-calculate the lighting and shadows in a fraction of the time it took in earlier seasons.
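The core idea behind an edge-aware denoiser can be sketched in a few lines: smooth only across pixels that belong to the same object, so the creature’s silhouette survives. The Python below uses a hand-written rule where a production machine-learning denoiser would learn the weighting from training data; the pixel and mask values are invented.

```python
def denoise_row(pixels, edge_mask, radius=1):
    """Toy edge-aware denoiser for a single row of render samples.

    Noisy values are smoothed by averaging neighbors, but only across
    pixels the mask marks as the same object, so the boundary between
    dragon and sky stays crisp instead of smearing.
    """
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        same = [pixels[j] for j in range(lo, hi) if edge_mask[j] == edge_mask[i]]
        out.append(sum(same) / len(same))
    return out

# Noisy samples straddling a dragon/sky boundary (mask flips at index 3).
pixels = [0.9, 1.1, 1.0, 0.1, 0.3, 0.2]
mask   = [1,   1,   1,   0,   0,   0]
print([round(p, 2) for p in denoise_row(pixels, mask)])
```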

The Legacy of Dragon VFX in the Modern Tech Landscape
The technological journey of the dragons in Game of Thrones did more than just provide a spectacle for television; it moved the entire VFX industry forward. The techniques developed for skeletal rigging, fluid simulation, and Simulcam integration have now become standard practice in high-end streaming content.
What happens to the dragons is, in the end, a transition from physical limitation to digital infinity. While the narrative saw the loss of Viserion and Rhaegal, and the departure of Drogon into the unknown, the technology created to sustain them lives on. Today, the lessons learned from the Thrones pipeline are being applied to real-time rendering engines like Unreal Engine 5, which are used in the successor series House of the Dragon.
We have entered an era where the distinction between a practical effect and a digital one is almost non-existent. The dragons of Westeros were the catalysts for this change, proving that with enough processing power, sophisticated algorithms, and a robust digital pipeline, the impossible can be rendered into reality. They represent a milestone in the marriage of art and technology, leaving behind a legacy that continues to define the boundaries of what is possible on screen.