“Age of Sail” was directed by John Kahrs, and produced at Chromosphere, Evil Eye Pictures, and Google Spotlight Stories. Working on this story, with this crew, has been an unforgettable experience. I’ll have lots more to say about it in future posts, but for now: enjoy the show!
After years of not doing anything particularly special for Halloween, this year we decided to start early and actually make our own costumes. Of course our plans were way too ambitious, so despite the fact that we started in August, by Halloween morning only one of our three costumes was actually finished.
The idea: inflatable nudibranchs. Nudibranchs (aka sea slugs) are marine invertebrates with incredible, psychedelic designs. If they look like something from a science fiction book cover, that may be because the designers of sci-fi aliens have been quietly stealing ideas from nudibranchs for decades.
I’d never made anything inflatable before, and barely ever used a sewing machine, so I learned a lot on this project. The material we used is this incredibly lightweight but sturdy stuff called 1-ounce calendered HyperD diamond ripstop nylon. (I chose this particular kind based on one negative review where a customer had made a quilt, and complained that it was hard to stuff into a sack because it kept trapping air inside it. Which was of course exactly what we wanted it to do!)
We did a lot of experiments to figure out the mechanics of inflatable structures. It’s pretty hard to visualize what 3D shape you’ll get from a bunch of flat parts until you’ve stitched them all together and filled it up with air. (Although I did find some interesting graphics research that solves the inverse problem!)
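For what it’s worth, the forward direction is at least easy to compute for simple shapes. A sphere approximated by flat “gore” panels (think beach ball) has panels whose width at any latitude is just the sphere’s circumference at that latitude divided by the number of panels. A quick sketch — the function name and numbers here are mine, not from any pattern-making reference:

```python
import math

def gore_width(radius, num_gores, latitude_deg):
    """Width of one flat gore panel at a given latitude,
    for approximating a sphere of the given radius."""
    lat = math.radians(latitude_deg)
    circumference = 2 * math.pi * radius * math.cos(lat)
    return circumference / num_gores

# A 30 cm sphere built from 8 panels: panel widths from equator to pole
for lat in range(0, 91, 30):
    print(lat, round(gore_width(30, 8, lat), 1))
```

Plot those widths against latitude and you get the familiar lens-shaped panel. Of course, a nudibranch is nothing like a sphere, which is why we ended up doing so much trial and error instead.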
The trickiest part was figuring out how to attach the costume to a person’s body. With most inflatable costumes you can buy in stores, your whole body goes inside the inflated part of the costume, with just your hands and feet sticking out of elastic cuffs. This seemed like it’d be really hot and uncomfortable over a long night of running around trick-or-treating. So I made ours an outside-the-body design, with two belts of nylon webbing to attach to your torso, and a drawstring hood for your head. We found the ripstop nylon tore surprisingly easily at the points where the straps attach, so I reinforced those areas with a second layer, based loosely on how sails are reinforced at the clew.
One part I’m pretty happy with is the quilting of the blue fringe around the body. Without it, the body puffed up into a big potato shape. But I stitched in a pattern of alternating lines to make a little zigzag maze for the air to flow through. The result was a pretty decent match for the crinkly fringe of a real nudibranch’s foot.
The air blower is mounted on the ventral side of the body, below the waist, where it would have a chance at clear airflow. And for illumination we ran two strands of LED fairy lights down the interior, from the tips of the blue cephalic tentacles down to the tail. This was okay, but not perfect: I would have preferred to light up more of the orange cerata sticking off the back. (The lighting was pretty rushed, and definitely something I’d like to do better next time…)
By now I am completely hooked on this inflatable costume idea. Which is good, because I’ve still got my own blue water dragon nudibranch costume to finish… but that’ll have to wait ’til next year!
This just in: I’m coming up to Vancouver this weekend for the Vancouver International Film Festival’s “VIFF Immersed” event. We’re showing Age of Sail (a Canadian premiere!) and Back to the Moon in VR. I’ll also be doing a talk about both projects in the “New Realities in Storytelling” conference, Saturday September 29th from 3:30-4:15 in the Reliance Theatre at Emily Carr University. Looks like there’ll be lots of other interesting VR-related talks happening all weekend! Here’s the full conference program. The VR exhibition will be running from Sunday to Tuesday in the “Hangar” building at the Centre for Digital Media. Tickets are available here.
But who’s that guy with the beard, and why does he keep blocking the screen?
I’m off to Annecy for another week packed solid with VR demos, film screenings and talks with the Google Spotlight Stories crew! I’ll be on a panel called “VR: the New Animation Playground” on Wednesday at 11am. We’ll be showing four projects at the Bonlieu Salle de Création: “Isle of Dogs: Behind the Scenes in VR” (Wes Anderson, with Felix & Paul Studios), “Piggy” (Jan Pinkava and Mark Oftedal), “Back to the Moon” (FX Goby and Hélène Leroux, with Nexus and the Google Doodles team) and also a sneak peek of the story I’m working on right now, “Age of Sail” (directed by John Kahrs, in collaboration with Chromosphere). The creators will also be doing a panel Wednesday at 6pm: “Animation Everywhere!”
If you’re on that side of the planet, come hang out! (But bring an umbrella, I hear it’s gonna be raining all week!)
One of the highlights of my time at DreamWorks was getting to help an amazing team of animators design their next-generation animation software, Premo. The software we’d been using up to that point, Emo, was originally written at PDI in the 1980s. While a lot of talented engineers had improved it over the years, there was only so much they could build on top of a foundation so old that it predated the GPU! We often dreamed about what the ideal animation tool would be like, if we could somehow start again from scratch.
Incredibly, in 2008, we got the opportunity to do exactly that, and the idea for Premo was born. Rex Grignon led the design effort, and brought on Jason Schleifer, Fred Nilsson, Jason Reisig, Simon Otto, Dave Torres and myself to flesh out the zillions of tiny details that matter so much. We worked closely with engineers Bruce Wilson, Seath Ahrens, Morgwn Mccarty, Brendan Duncan and many others (see this article for the full list!) as they brought their own expertise to bear on all those tiny details, and turned our fluffy wishful ideas into real working code. After years of development, the animators on How to Train Your Dragon 2 got to take Premo for its inaugural flight, and were spoiled forever by the best software any of us had ever seen. We knew we had something very special on our hands, but I wondered: would anyone outside the company ever know about it?
Now, a decade later, the Academy of Motion Picture Arts and Sciences has honored Premo with a Sci-Tech Award!
Here’s a video of the awards ceremony, with Sir Patrick Stewart introducing some of the team:
And here’s a terrific blog post from Nimble Collective (the company that Rex, Jason and Bruce went on to co-found after DreamWorks) about FIFICRD, our shamelessly awkward acronym for our fiercely held beliefs about how great software can and should be:
I’m beyond proud to see this enormous group effort get the recognition it deserves. Go Premo! FIFICRD FOREVER!
I can’t believe the Annecy Animation Festival is less than a week away! And man, it’s going to be a busy week. Most days you’ll be able to find me in the Bonlieu Salle de Création: Tuesday, June 13 (showing Gorillaz “Saturnz Barz” on Daydream), Thursday, June 15 (showing a VR teaser for Jorge Gutierrez’s Son of Jaguar on Vive) and Friday, June 16 (showing Sonaria by Scot Stafford and Chromosphere, also on Vive). Stop by and say hi if you’re there!
I’ve long been a fan of nonsense. (It seem me who they look where to sit down one’s.) Unlike its near relatives—noise, lies, and bullshit—real nonsense is surprisingly hard to construct, because the sense-making instinct runs deep in us humans. So when I see a performance like Vanessa Bayer’s as “Dawn Lazarus” in this SNL skit I tend to take notice. Which then leads me down a rabbit hole of related arts like double-talk (which is easier if you do it in a foreign language) and good old Engrish menu fails. Meanwhile, the glorious internet digs up treasures from the pre-Google era, like the player names in Fighting Baseball:
Now, in 2017, machine learning comes on the scene and opens entirely new frontiers for nonsense-lovers. Witness this attempt to train a neural network to generate paint color names based on the contents of the Sherwin-Williams catalog (producing some names that could have come straight out of LiarTown, USA):
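That experiment used a neural network, but you can get a taste of the same flavor of nonsense with something much dumber: a character-level Markov chain. This is just a toy sketch of the general technique, not the original experiment — the training list below is a handful of plausible-sounding names I made up, not the actual Sherwin-Williams catalog:

```python
import random
from collections import defaultdict

# A few made-up paint color names to stand in for real training data.
training_names = ["Sand Dollar", "Sea Salt", "Misty Lavender",
                  "Burnished Clay", "Dusty Rose", "Copper Penny"]

def build_model(names, order=2):
    """Record which character follows each run of `order` characters."""
    model = defaultdict(list)
    for name in names:
        padded = "^" * order + name + "$"   # ^ = start padding, $ = end marker
        for i in range(len(padded) - order):
            model[padded[i:i+order]].append(padded[i+order])
    return model

def generate(model, order=2, rng=random):
    """Walk the chain from the start state until we draw the end marker."""
    state, out = "^" * order, ""
    while True:
        ch = rng.choice(model[state])
        if ch == "$":
            return out
        out += ch
        state = state[1:] + ch

model = build_model(training_names)
print(generate(model))
```

With only six training names it mostly regurgitates its inputs, but feed it a big enough catalog and the seams between names start producing the good stuff.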
I wasn’t sure about machine learning before, but it’s growing on me.
The theatrical print of “Pearl” will be screened next week at the San Francisco International Film Festival, in a collection of family-friendly animated shorts. I’ll be there at the 10am Sunday screening at the Castro Theater, along with some of the other filmmakers, for Q&A. Stop by and say hi if you’re in the area!
At the VR Storytelling Meetup last night, an interesting conversation with the other panelists got me thinking about frame rates again. Apparently, for filmmakers shooting live-action 360º video, the high frame rate required for playback in a VR device can be a bit of an obstacle. Not just technically, but psychologically: it’s a turnoff for the audience.
I felt that emotional turnoff when I finally saw Peter Jackson’s first Hobbit movie at 48 frames per second. It was astonishing and beautiful in the sweeping exterior shots. But when it was just characters sitting and talking, it felt… fake. I found myself scrutinizing the makeup, looking for flaws and finding them. At the time I attributed it to a cultural bias: because I grew up in an era when high quality entertainment came in the form of 24p films, and cheesy soap operas were 60i video, I must subconsciously associate high frame rates with low quality.
But what if there’s more to it than that?
In a recent interview about Pearl, Patrick Osborne pointed out that simplifying the visual style, removing texture and detail, leaves room for the audience to put themselves into the characters. It lowers a barrier to empathy. Scott McCloud said as much in Understanding Comics. This is why I’ve always preferred non-photorealism over realism. It’s what you leave out that counts.
What if a similar mechanic is at work around the question of frame rates? The secondhand report from the live action VR filmmaker was that at 60fps, it felt too obvious that the people were actors. You could look at a background character and tell instantly that they were pretending. Leaving aside the possibility that it may have just been bad acting: is it possible that the high frame rate itself lets you see through the ruse? Could it be the density of information you’re receiving that pushes your perceptiveness over some threshold, and makes you a sharper lie detector?
And if that turns out to be the case: how should filmmakers respond?