
The wonderful VFX of "Dog in the Woods"

Updated: Jul 25, 2019


Directing duo Christian Chapman and Paul Jason Hoffman’s micro-budget film, their debut narrative short and first use of visual effects, envisions the wild sensory experiences of a German Shepherd’s late-night forest prowl.


Dog in the Woods is a 5-minute live-action film from Connecticut-based directors Christian Chapman and Paul Jason Hoffman of Resonator Films. The duo’s debut narrative short as well as their first experience with visual effects, Dog in the Woods shows how perseverance and “dogged” determination (sorry, had to get that in at least once) enabled the pair to work around a tight budget and produce a unique and compelling piece.

The film, which premiered earlier this year at Slamdance and was selected at the Florida, St. Kilda, Newport Beach and Pendance film festivals, tells the story of Alice, a jet-black German Shepherd, as she bolts through the woods one night in search of her wild side. After teaching themselves the basics of VFX supervision, Chapman and Hoffman assembled and directed an international team of VFX artists led by Dresden-based designer Marc Zimmermann (Conscious Existence) and Andy Thomas (Synthetic Nature); together, they visualized the vibrant, otherworldly dimensions of a dog’s forest prowl and the sensations she experiences during her journey.

Having never worked with visual effects, and hamstrung by an incredibly small budget, the two were forced to learn quickly; mistakes were plentiful and costly, but ultimately, instructive. “Our biggest challenge was lack of experience,” Hoffman reveals. “We entered this project having never worked with VFX, so we made some mistakes in production and even in pre-production that cost us time and money in post. For example, had we known the basic concept of tracking, we could have consulted a VFX supervisor and used tracking markers on set. Instead, our tracking artists had to match move some of our shots frame by frame. Also, our lack of experience in VFX meant we started with zero contacts in the VFX freelance world, as well as zero knowledge of how to price each job and how to best communicate our ideas with our collaborators. It was a steep learning curve, starting from scratch.”
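For readers unfamiliar with the term: tracking markers are high-contrast reference points placed on set so software can follow them automatically, rather than an artist lining up a solve frame by frame. As a rough illustration of the general idea, and not the toolchain this production actually used, the sketch below follows a few marker points through footage with OpenCV’s Lucas-Kanade optical flow; the file name and starting coordinates are placeholders.

```python
# Illustrative sketch only: follow a few marker points across a plate with
# OpenCV's Lucas-Kanade optical flow. "plate.mp4" and the marker coordinates
# are invented, not assets from the production.
import cv2
import numpy as np

cap = cv2.VideoCapture("plate.mp4")        # hypothetical footage file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Approximate 2D positions of physical tracking markers in the first frame.
points = np.array([[120, 340], [860, 200], [1500, 610]],
                  dtype=np.float32).reshape(-1, 1, 2)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Track each marker from the previous frame into the current one.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)
    frame_idx += 1
    print(frame_idx, points.reshape(-1, 2).tolist())
    prev_gray = gray
```

With markers in the plate, a tracker can lock onto those points and hand a stable 2D (or, with a solver, 3D) track to the compositor, which is exactly the manual work the team ended up paying for in post.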


Their first major task was creating what they referred to as the “VFX Direction Storyboard,” a document that became the film’s visual effects production blueprint. According to Chapman, “This [document] was a 40-something page PDF that would later serve as the foundation for communicating with our VFX artists. But at first, it was a way for us to hash out the design, timing, positioning, and physics for each visual effect. We started by scouring the internet for beautiful images found in nature – deep sea jellyfish, northern lights, microscope slides of plant matter, etc. – then compiling them into reference folders. From there, we grabbed a still for every shot to which we wanted to apply visual effects, and brought them into Photoshop. We circled back to our reference folders, brought the appropriate reference images into Photoshop, and merged them with our stills. Then we used a Wacom tablet to illustrate drawings and notes on top of each image. Throughout this process, we kept looking at an H.R. Giger illustration from Alien on our office wall, depicting the anatomy of the iconic predator. That got us pretty psyched.”
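The document itself was assembled by hand in Photoshop, but the underlying bookkeeping (themed reference folders on one side, one annotated still per effects shot on the other) is easy to picture. The sketch below is a purely hypothetical helper, not anything Resonator Films used, that stages each shot still alongside its assigned reference images so the two can be opened side by side for markup; all paths, shot names, and themes are invented.

```python
# Hypothetical organizer for a "VFX direction storyboard" workflow: each
# effects shot gets a working folder containing its still plus copies of the
# reference images assigned to it. Paths and shot names are made up.
import shutil
from pathlib import Path

STILLS = Path("stills")        # one frame grab per VFX shot, e.g. shot_012.png
REFS = Path("references")      # themed folders: jellyfish/, aurora/, plant_slides/
OUT = Path("storyboard_work")

# Which reference theme(s) each shot should be merged with in Photoshop.
SHOT_REFS = {
    "shot_012": ["jellyfish"],
    "shot_027": ["aurora", "plant_slides"],
}

for shot, themes in SHOT_REFS.items():
    work = OUT / shot
    work.mkdir(parents=True, exist_ok=True)
    shutil.copy(STILLS / f"{shot}.png", work / f"{shot}_still.png")
    for theme in themes:
        for ref in sorted((REFS / theme).glob("*.jpg")):
            shutil.copy(ref, work / f"ref_{theme}_{ref.name}")
    staged = len(list(work.glob("ref_*")))
    print(f"{shot}: still + {staged} reference images staged")
```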

Once the VFX “plan” was established, the directing pair set out to find artists, establish a pipeline and enable effective means of communication during the production. “We were hellbent on creating effects that would impress even the most seasoned Hollywood VFX artists, but also confined to working with a tight budget,” Chapman continues. “Luckily, after about two months searching through YouTube, Vimeo, and Upwork, we were able to find a very passionate and talented group of about 10-12 VFX freelancers from all corners of the globe. With our direction in place, and our freelance VFX roster assembled, we created a pipeline that paired each shot’s needs - tracking, design, roto and compositing - with the artists whose skills seemed most suitable for the task.”
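That pairing of shots, tasks, and freelancers is essentially a small scheduling table. A minimal sketch of how such an assignment might be represented is below; the shot IDs, task lists, and artist names are all hypothetical, not a recreation of the film’s actual roster.

```python
# Minimal sketch of a shot/task/artist assignment table for a freelance VFX
# pipeline. Shot IDs, task lists, and artist names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Shot:
    shot_id: str
    tasks: list[str]                               # e.g. tracking, roto, design, comp
    assignments: dict[str, str] = field(default_factory=dict)

    def assign(self, task: str, artist: str) -> None:
        if task not in self.tasks:
            raise ValueError(f"{self.shot_id} has no task '{task}'")
        self.assignments[task] = artist

shots = [
    Shot("shot_012", ["tracking", "design", "comp"]),
    Shot("shot_027", ["tracking", "roto", "design", "comp"]),
]

shots[0].assign("tracking", "artist_a")
shots[0].assign("design", "artist_b")
shots[1].assign("roto", "artist_c")

for s in shots:
    unfilled = [t for t in s.tasks if t not in s.assignments]
    print(s.shot_id, "assigned:", s.assignments, "| still open:", unfilled)
```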

“For the time-consuming task of revisions, we often communicated with our VFX collaborators through Skype calls with screen shares, so we could make edits in real time,” the director adds. “We also took stills from their drafts and made notes in Photoshop, similar to the process we used when creating the initial ‘VFX Direction Storyboard.’ Thankfully, everyone was very patient and committed to the vision, so it all worked out in the end. But it wasn’t easy.”


Regarding production tools, Zimmermann explains, “On the software side I used Cinema4D and After Effects. The rotten spiderweb effect, for example, was created using the native cloth simulation tools of Cinema4D. I created a custom texture with the X-Particles plugin using a particle system emitting from a heavily cut static 3D mesh and an effector that created connections between the particles. Once pre-rendered, that texture was refined in After Effects with a bit of turbulent displace and vecto
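Zimmermann’s description, particles emitted from a cut-up mesh with an effector drawing connections between them, follows the familiar “plexus” pattern. As a very loose, software-agnostic illustration of that idea (not a recreation of the X-Particles setup), the sketch below scatters random points and links every pair closer than a threshold, which is the basic mechanic behind that kind of web texture.

```python
# Loose illustration of a particle-connection ("plexus") texture: scatter
# points, then draw a line between every pair closer than a threshold.
# This stands in for the described X-Particles setup, not a recreation of it.
import numpy as np
from PIL import Image, ImageDraw

rng = np.random.default_rng(7)
size, n_points, max_dist = 1024, 250, 90.0

pts = rng.uniform(0, size, (n_points, 2))
img = Image.new("L", (size, size), 0)
draw = ImageDraw.Draw(img)

for i in range(n_points):
    for j in range(i + 1, n_points):
        d = float(np.hypot(*(pts[i] - pts[j])))
        if d < max_dist:
            # Fade each strand with distance so the web looks tattered.
            draw.line([tuple(pts[i]), tuple(pts[j])],
                      fill=int(255 * (1 - d / max_dist)))

img.save("web_texture.png")   # grayscale texture ready for further refinement
```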