This just in: I’m coming up to Vancouver this weekend for the Vancouver International Film Festival’s “VIFF Immersed” event. We’re showing Age of Sail (a Canadian premiere!) and Back to the Moon in VR. I’ll also be doing a talk about both projects in the “New Realities in Storytelling” conference, Saturday September 29th from 3:30-4:15 in the Reliance Theatre at Emily Carr University. Looks like there’ll be lots of other interesting VR-related talks happening all weekend! Here’s the full conference program. The VR exhibition will be running from Sunday to Tuesday in the “Hangar” building at the Centre for Digital Media. Tickets are available here.
But who’s that guy with the beard, and why does he keep blocking the screen?
I’m off to Annecy for another week packed solid with VR demos, film screenings and talks with the Google Spotlight Stories crew! I’ll be on a panel called “VR: the New Animation Playground” on Wednesday at 11am. We’ll be showing four projects at the Bonlieu Salle de Création: “Isle of Dogs: Behind the Scenes in VR” (Wes Anderson, with Felix & Paul Studios), “Piggy” (Jan Pinkava and Mark Oftedal), “Back to the Moon” (FX Goby and Hélène Leroux, with Nexus and the Google Doodles team) and also a sneak peek of the story I’m working on right now, “Age of Sail” (directed by John Kahrs, in collaboration with Chromosphere). The creators will also be doing a panel Wednesday at 6pm: “Animation Everywhere!”
If you’re on that side of the planet, come hang out! (But bring an umbrella, I hear it’s gonna be raining all week!)
One of the highlights of my time at DreamWorks was getting to help an amazing team of animators design their next-generation animation software, Premo. The software we’d been using up to that point, Emo, was originally written at PDI in the 1980s. While a lot of talented engineers had improved it over the years, there was only so much they could build on top of a foundation so old that it predated the GPU! We often dreamed about what the ideal animation tool would be like, if we could somehow start again from scratch.
Incredibly, in 2008, we got the opportunity to do exactly that, and the idea for Premo was born. Rex Grignon led the design effort, and brought on Jason Schleifer, Fred Nilsson, Jason Reisig, Simon Otto, Dave Torres and myself to flesh out the zillions of tiny details that matter so much. We worked closely with engineers Bruce Wilson, Seath Ahrens, Morgwn Mccarty, Brendan Duncan and many others (see this article for the full list!) as they brought their own expertise to bear on all those tiny details, and turned our fluffy wishful ideas into real working code. After years of development, the animators on How to Train Your Dragon 2 got to take Premo for its inaugural flight, and were spoiled forever by the best software any of us had ever seen. We knew we had something very special on our hands, but I wondered: would anyone outside the company ever know about it?
Now, a decade later, the Academy of Motion Picture Arts and Sciences has honored Premo with a Sci-Tech Award!
Here’s a video of the awards ceremony, with Sir Patrick Stewart introducing some of the team:
And here’s a terrific blog post from Nimble Collective (the company that Rex, Jason and Bruce went on to co-found after DreamWorks) about FIFICRD, our shamelessly awkward acronym for our fiercely held beliefs about how great software can and should be:
I’m beyond proud to see this enormous group effort get the recognition it deserves. Go Premo! FIFICRD FOREVER!
I can’t believe the Annecy Animation Festival is less than a week away! And man, it’s going to be a busy week. Most days you’ll be able to find me in the Bonlieu Salle de Création: Tuesday, June 13 (showing Gorillaz “Saturnz Barz” on Daydream), Thursday, June 15 (showing a VR teaser for Jorge Gutierrez’s Son of Jaguar on Vive) and Friday, June 16 (showing Sonaria by Scot Stafford and Chromosphere, also on Vive). Stop by and say hi if you’re there!
I’ve long been a fan of nonsense. (It seem me who they look where to sit down one’s.) Unlike its near relatives—noise, lies, and bullshit—real nonsense is surprisingly hard to construct, because the sense-making instinct runs deep in us humans. So when I see a performance like Vanessa Bayer’s as “Dawn Lazarus” in this SNL skit, I tend to take notice. Which then leads me down a rabbit hole of related arts like double-talk (which is easier if you do it in a foreign language) and good old Engrish menu fails. Meanwhile, the glorious internet digs up treasures from the pre-Google era, like the player names in Fighting Baseball:
Now, in 2017, machine learning comes on the scene and opens entirely new frontiers for nonsense-lovers. Witness this attempt to train a neural network to generate paint color names based on the contents of the Sherwin-Williams catalog (producing some names that could have come straight out of LiarTown, USA):
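That paint-name experiment used a character-level recurrent neural network trained on the real catalog. As a rough sketch of the same “learn names from a list, then hallucinate new ones” idea, here’s a much simpler character-level Markov chain instead of an RNN. Everything here is a stand-in: the tiny `CATALOG` is made-up names, not Sherwin-Williams data, and the order-2 context size is an arbitrary choice.

```python
import random

# A handful of invented paint names standing in for the real catalog.
CATALOG = [
    "misty lavender", "desert sand", "ocean breeze", "burnt sienna",
    "dusty rose", "forest mist", "golden hour", "stormy sky",
]

def build_model(names, order=2):
    """Map each `order`-character context to the characters seen after it."""
    model = {}
    for name in names:
        padded = "^" * order + name + "$"  # ^ marks the start, $ the end
        for i in range(len(padded) - order):
            ctx = padded[i:i + order]
            model.setdefault(ctx, []).append(padded[i + order])
    return model

def generate(model, order=2, max_len=20, rng=random):
    """Walk the chain from the start context until we hit an end marker."""
    ctx = "^" * order
    out = []
    while len(out) < max_len:
        nxt = rng.choice(model.get(ctx, ["$"]))
        if nxt == "$":
            break
        out.append(nxt)
        ctx = ctx[1:] + nxt
    return "".join(out)

model = build_model(CATALOG)
print(generate(model))  # e.g. a mashup like "dusty sienna"
```

With a catalog this small the output mostly parrots the training names; the charmingly wrong results in the original experiment come from a bigger model digesting thousands of real names.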
I wasn’t sure about machine learning before, but it’s growing on me.
The theatrical print of “Pearl” will be screened next week at the San Francisco International Film Festival, in a collection of family-friendly animated shorts. I’ll be there at the 10am Sunday screening at the Castro Theater, along with some of the other filmmakers, for Q&A. Stop by and say hi if you’re in the area!
At the VR Storytelling Meetup last night, an interesting conversation with the other panelists got me thinking about frame rates again. Apparently, for filmmakers shooting live-action 360º video, the high frame rate required for playback in a VR device can be a bit of an obstacle. Not just technically, but psychologically: it’s a turnoff for the audience.
I felt that emotional turnoff when I finally saw Peter Jackson’s first Hobbit movie at 48 frames per second. It was astonishing and beautiful in the sweeping exterior shots. But when it was just characters sitting and talking, it felt… fake. I found myself scrutinizing the makeup, looking for flaws and finding them. At the time I attributed it to a cultural bias: because I grew up in an era when high-quality entertainment came in the form of 24p films, and cheesy soap operas were 60i video, I must subconsciously associate high frame rates with low quality.
But what if there’s more to it than that?
In a recent interview about Pearl, Patrick Osborne pointed out that simplifying the visual style, removing texture and detail, leaves room for the audience to put themselves into the characters. It lowers a barrier to empathy. Scott McCloud said as much in Understanding Comics. This is why I’ve always preferred non-photorealism over realism. It’s what you leave out that counts.
What if a similar mechanism is at work with frame rates? The secondhand report from the live-action VR filmmaker was that at 60fps, it felt too obvious that the people were actors. You could look at a background character and tell instantly that they were pretending. Leaving aside the possibility that it may have just been bad acting: is it possible that the high frame rate itself lets you see through the ruse? Could it be the density of information you’re receiving that pushes your perceptiveness over some threshold, and makes you a sharper lie detector?
And if that turns out to be the case: how should filmmakers respond?