A personal experiment in making something fake look like it belongs.
No brief, no client — just curiosity about whether I could place a 3D object convincingly into footage I found online. The drone footage was a free clip. The helicopter asset was sourced the same way.
The asset was rigged and animated in Blender, rendered with Cycles, then tracked and composited in Nuke. I posted the result to r/Blender and r/vfx mostly to see what people would notice. The comments were more useful than I expected.
Getting the motion right before touching the composite
The helicopter was animated and rendered entirely in Blender using the Cycles render engine. Getting the movement to read as physically plausible — rotor blur, banking angle, the way weight shifts in a turn — required more iteration than the compositing work that followed. A static object dropped into footage reads as fake immediately. Motion sells it.
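The banking angle mentioned above has a physical anchor worth noting: in a coordinated turn, bank angle follows directly from speed and turn radius. A minimal sketch of that relationship (illustrative only, not part of the project files; the speed and radius are made-up numbers):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def coordinated_bank_angle(speed_ms: float, turn_radius_m: float) -> float:
    """Bank angle in degrees for a coordinated turn.

    In a coordinated turn the horizontal component of lift supplies the
    centripetal force, so tan(phi) = v^2 / (g * r).
    """
    return math.degrees(math.atan2(speed_ms ** 2, G * turn_radius_m))

# A helicopter at 30 m/s carving a 150 m radius turn banks about 31 degrees.
angle = coordinated_bank_angle(30.0, 150.0)
```

Eyeballing the animation against a quick check like this is often enough to catch a turn that is visibly too flat or too steep.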
Tracking the world before placing something in it
Nuke handled both the 3D camera track and the final composite. The track gave the render camera the same movement as the drone — without it, the helicopter would slide against the background regardless of how good the render looked. Matching light, grain, and colour grade came after that foundation was solid.
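The sliding problem has a simple geometric cause: if the render camera does not copy the plate camera's motion, a fixed 3D object stays put on screen while the background moves. A toy pinhole-projection sketch of that failure (illustrative only, not the Nuke setup; focal length and positions are invented):

```python
def project(point, cam_x, focal=1000.0, width=1920.0):
    """Project a 3D point (x, y, z) through a pinhole camera translated
    to cam_x along the x axis. Returns the horizontal pixel coordinate."""
    x, y, z = point
    return width / 2 + focal * (x - cam_x) / z

helicopter = (5.0, 2.0, 50.0)   # static world point, 50 m from the camera
drone_path = [0.0, 0.5, 1.0]    # drone camera drifts 1 m right over 3 frames

# Tracked: the render camera copies the drone's motion, so the helicopter's
# screen position shifts in step with the background and stays locked in.
tracked = [project(helicopter, cam_x) for cam_x in drone_path]

# Untracked: the render camera sits still at x = 0 while the plate moves on,
# so the helicopter's screen position never changes -- it slides.
untracked = [project(helicopter, 0.0) for _ in drone_path]
```

The `tracked` positions drift left across the frames exactly as a background feature would, while the `untracked` positions are identical on every frame, which is the slide the eye picks up instantly.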
The one element that didn't fully land was the helicopter's shadow on the mountain surface. A projected shadow only reads as convincing if the geometry catching it matches the real terrain, and reconstructing an accurate point cloud of that uneven surface was beyond what the footage and tools could reliably give me at the time.
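Why terrain accuracy matters so much here can be shown with a toy shadow cast: march from the object along the light direction until the ray meets a heightfield, and note that the contact point depends directly on the terrain shape. A hedged sketch (not what ran in Nuke; the terrain function and positions are invented):

```python
def terrain_height(x):
    """Toy mountainside: a constant slope with a bump between x = 8 and 10."""
    return 0.2 * x + (1.0 if 8.0 < x < 10.0 else 0.0)

def shadow_contact(obj_x, obj_h, light_dx, light_dh, step=0.01, max_t=100.0):
    """Step along the light ray from (obj_x, obj_h) until it drops below
    the terrain. Returns the x position where the shadow lands, or None."""
    t = 0.0
    while t < max_t:
        x = obj_x + light_dx * t
        h = obj_h + light_dh * t   # light_dh < 0: the ray descends
        if h <= terrain_height(x):
            return x
        t += step
    return None

# Helicopter hovering at x = 0, 5 m up, with a low sun raking to the right.
contact = shadow_contact(0.0, 5.0, light_dx=1.0, light_dh=-0.5)
```

Change the slope or move the bump and the contact point jumps, which is exactly the problem: a point cloud that gets the terrain slightly wrong puts the shadow somewhere the eye knows it shouldn't be.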
Community feedback as a design method
Posting to the Blender and VFX subreddits turned the project into something more useful than a finished piece — it became a conversation. Comments identified exactly the moments that broke the illusion.