This just in: next Wednesday, June 1, we’ll be screening “Pearl” in 2D, 360° and VR at an SF-SIGGRAPH event in San Francisco. We’ll also be doing a talk with some behind-the-scenes footage. Seating is limited, so if you’re in the Bay Area and want to attend, sign up now!
“Pearl” goes live today!
The short I’ve been working on for the past year finally goes live today! It should play nicely on any iPhone, iPad or Android device made in the past few years. Here are all the ways to see it:
If you’re on iOS: download the Google Spotlight Stories app.
If you’re on Android: visit Google Spotlight Stories using the YouTube app.
(If you’re on a desktop computer or older device, fear not: you can still see Pearl, as a non-interactive 360° video. But the experience is made for mobile, so that’s really the best way to see it.)
Whatever device you’re using, I recommend plugging in your very best pair of headphones so you can enjoy the 360° ambisonic sound and music!
Some great press about “Pearl”
It’s been wonderful to watch people react to “Pearl” as we take it out on tour. Here are some of the nice reviews and interviews that have come out since our debut at Tribeca:
USA Today, May 17: Google to show off animated VR short at I/O
Cartoon Brew, May 14: Observations from the VR Front: ‘Pearl’, ‘Invasion!’, and ‘Allumette’
Filmmaker Magazine, April 22: Tribeca 2016: Patrick Osborne on Winning an Oscar and his Animated VR Piece Pearl
The Verge, April 22: The best virtual reality from the 2016 Tribeca Film Festival
The Verge, April 21: The brains behind Pearl on bringing Google’s new Spotlight Story to Tribeca
Inverse, April 18: Google’s VR Film ‘Pearl’ Combines Disney Charm and Cutting Edge Tech
Variety, April 16: Google Shows Off First Spotlight Story on HTC Vive at Tribeca
“Pearl” in 360° at Tribeca Interactive Playground
Okay, one more update for those of you in New York: we will also be doing the first ever live public demo of Patrick Osborne’s Spotlight Story “Pearl” in full 360° at Tribeca’s Interactive Playground on Saturday, April 16th. We’ll be there all day.
Here’s where you can find tickets to the event. Hope to see you there!
“Pearl” to premiere at Tribeca Film Festival
News is finally starting to come out about the project I’ve been working on for the past year at Google Spotlight Stories. It’s an interactive 360° story, directed by Patrick Osborne, called “Pearl”. We’re simultaneously making a 2D film version of the story, which will have its world premiere at the Tribeca Film Festival on Sunday, April 17th at 6pm.
It’s been an amazing experience so far, full of exciting artistic and technical challenges, and it’s opened my mind to some astonishing new things. I’ll post more about it when we’re done, but in the meantime, Cartoon Brew has a nice writeup with some images from the show. And if you’re in New York that weekend, swing by and say hi!
Inceptionism and learning envy
A few days ago, the image above started going around the social networks, attributed to “a friend working on AI”. It was apparently a deliberate leak, and now we know where it came from: a research group at Google working on neural networks for image recognition. Their research is worth reading about, and the images are fascinating: Inceptionism.
I have to admit, machine learning makes me jealous. Why does the machine get to do the learning? Why can’t I do the learning? But projects like this make me feel a little better. When the black box opens and we see the writhing mass inside, we get to learn how the machine does the learning. Everyone wins.
And the machines still have some catching up to do. As soon as I saw the amazing gallery of inceptionist images, I recognized the “inceptionist style” from the mysterious grey squirrel. Could a neural network do that?
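(For fellow tinkerers: the basic trick behind these images, as I understand it, is gradient ascent on the input image itself. You pick a layer of a trained recognition network and nudge the pixels to amplify whatever that layer already responds to. Here’s a bare-bones Python sketch using torchvision’s GoogLeNet as a stand-in; the layer choice and the filename are placeholders, and the real technique adds normalization, octaves, and jitter that I’m skipping here.)

```python
# Toy "inceptionism": amplify what one layer of a pretrained network sees.
# A sketch, not the Google team's actual code. Requires torch, torchvision,
# and Pillow; "squirrel.jpg" is a placeholder for any photo you like.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

model = models.googlenet(weights="IMAGENET1K_V1").eval()
activations = {}
# Hook one mid-level layer; "inception4c" is an arbitrary choice.
model.inception4c.register_forward_hook(
    lambda mod, inp, out: activations.update(feat=out))

img = TF.to_tensor(Image.open("squirrel.jpg").convert("RGB"))
img = img.unsqueeze(0).requires_grad_(True)

for _ in range(20):
    model(img)
    loss = activations["feat"].norm()   # "show me more of what you see"
    loss.backward()
    with torch.no_grad():
        # Gradient ascent on the pixels, normalized for a steady step size.
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
        img.clamp_(0.0, 1.0)

TF.to_pil_image(img.detach().squeeze(0)).save("dreamed.jpg")
```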
The New Chair, rough cut (1998)
[Video: “The New Chair, rough cut (1998)” on Vimeo]
Here’s a rough cut of part of a short film I started while working at the University of Washington. I did all the animation, and developed the Loose and Sketchy rendering style.
The character animation is pretty janky overall, though a few of the shots hold up quite well. I sure have learned a lot since 1998!
Brick-a-Brac (1995)
[Video: “Brick-a-Brac (1995)” on Vimeo]
Here is my first short film. I made this at PDI in 1995, during a gap between commercials. I modeled and rigged the characters, did most of the animation, and developed the wobbly ink-line look.
Each pigeon’s torso was a metaball surface driven by a series of spheres along a spline between the head and the body, which were both separate IK joints (so I could easily get that pigeon-head movement style without counter-animating). The eyes, beak, legs and wings were separate objects, each of which got rendered in its own separate pass. Each layer had its vector outline traced (using a tool originally written for scanning corporate photostats for flying logos!). I processed the curves using a procedural scripting language to give them some physics and personality, and then rendered them as black ink lines with varying thickness (using a tool written by Drew Olbrich). Finally, I ran the rendered lines through some image-processing filters to get the edge-darkening effect, and did some iterated stochastic silhouette dilation to add random ink blotches where the lines were thickest. Simple, really! ;-)
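For anyone who wants to play with that kind of line treatment today, here’s a toy Python sketch of the wobble-and-vary-the-thickness idea. To be clear, this is nothing like the actual 1995 PDI tools (those were in-house code); it just perturbs a traced outline along its normals and strokes it with a tapering width.

```python
# A toy version of the wobbly, variable-thickness ink line described above.
# Not the original PDI toolchain; just an illustration. Requires numpy + Pillow.
import numpy as np
from PIL import Image, ImageDraw

def wobble(points, amplitude=3.0, frequency=0.15):
    """Displace each point along its normal by a smooth oscillation
    (a stand-in for the procedural 'physics and personality' pass)."""
    pts = np.asarray(points, dtype=float)
    tangents = np.gradient(pts, axis=0)
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True) + 1e-9
    offset = amplitude * np.sin(frequency * np.arange(len(pts)))
    return pts + normals * offset[:, None]

def stroke(draw, points, base_width=4.0):
    """Stroke a polyline with ink that's thickest mid-stroke and tapers out."""
    n = len(points)
    for i, (x, y) in enumerate(points):
        w = base_width * np.sin(np.pi * (i + 0.5) / n) + 0.5
        draw.ellipse([x - w, y - w, x + w, y + w], fill="black")

img = Image.new("RGB", (256, 256), "white")
draw = ImageDraw.Draw(img)
t = np.linspace(0.0, 2.0 * np.pi, 200)
outline = np.stack([128 + 80 * np.cos(t), 128 + 80 * np.sin(t)], axis=1)
stroke(draw, wobble(outline))
img.save("wobbly_line.png")
```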
The Coffee Ring Effect
Why do coffee stains always have a dark ring around the edge? It’s because the water’s surface is curved: it evaporates more quickly near the edges, causing it to flow outward from the middle, carrying coffee particles with it. We cited some early research demonstrating this effect in our 1997 watercolor paper, but now there’s video that actually shows the process happening at a microscopic scale. (Thanks Eric for the link!)
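Here’s a back-of-the-envelope numpy toy of that mechanism, just to show the gist. (The velocity profile is made up; the real physics involves a pinned contact line and a diverging evaporative flux, which this cartoon version only gestures at.)

```python
# Toy 1D coffee-ring model: evaporation is fastest at the rim, so an outward
# flow carries particles there, where they stick. Velocity profile is invented.
import numpy as np

rng = np.random.default_rng(0)
R = 1.0                            # drop radius; the contact line stays pinned
r = rng.uniform(0.0, R, 20000)     # radial positions of suspended particles

for _ in range(400):
    # Outward flow that diverges near the edge, replacing evaporated water.
    v = 0.002 * r / (R - r + 0.01)
    r = np.minimum(r + v, R)       # particles that reach the rim stick there

# Once the drop "dries", the deposit piles up at the edge: the dark ring.
hist, _ = np.histogram(r, bins=10, range=(0.0, R))
print(hist)                        # the outermost bin dominates
```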
Painterly rendering in your pocket.
My friends Dan Wexler and Gilles Dezeustre have just released a new app for your iPhone/iPad. It’s called Glaze. It’s a painterly rendering filter for your photos, and it’s really cool. It’s based on the traditional brushstroke-based model pioneered by folks like Paul Haeberli, Pete Litwinowicz and Aaron Hertzmann, but it adds some neat new twists: face detection to guarantee that eyes and other important features come out with the right amount of detail; a genetic algorithm for mutating and doing artificial selection on painting styles; and a really slick iOS interface that makes all of the above completely effortless and transparent. It also runs blazingly fast, considering all the work that must be going on under the hood. They’ll be giving a talk about the details at SIGGRAPH next week. (Here’s an abstract of their talk… wish I could go!)
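If you’ve never played with the brushstroke approach, the core idea is charmingly simple: sample colors from the photo and splat strokes back onto a canvas, coarse ones first, fine ones on top. Here’s a minimal Python sketch in that spirit. To be clear, this is the textbook Haeberli-style baseline, not Glaze’s actual algorithm, and the filenames are placeholders.

```python
# Minimal Haeberli-style painterly rendering: sample the photo's colors and
# lay down randomly oriented dabs, big strokes first, detail strokes on top.
# Not Glaze's code, just the classic baseline. Requires numpy + Pillow.
import numpy as np
from PIL import Image, ImageDraw

def paint(src_path, out_path, layers=((12, 800), (6, 3000), (3, 9000)), seed=0):
    src = Image.open(src_path).convert("RGB")
    w, h = src.size
    pixels = np.asarray(src)
    rng = np.random.default_rng(seed)
    canvas = Image.new("RGB", (w, h), "white")
    draw = ImageDraw.Draw(canvas)
    for radius, count in layers:            # coarse-to-fine stroke layers
        for _ in range(count):
            x = int(rng.integers(0, w))
            y = int(rng.integers(0, h))
            color = tuple(int(c) for c in pixels[y, x])  # sample source color
            angle = rng.uniform(0.0, np.pi)              # random dab direction
            dx = radius * np.cos(angle)
            dy = radius * np.sin(angle)
            draw.line([x - dx, y - dy, x + dx, y + dy],
                      fill=color, width=radius)
    canvas.save(out_path)

# paint("thumb_photo.jpg", "painted.png")   # placeholder filenames
```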
What I find the app does best, so far, is to turn my garbage photos into beautiful art. This, for example, is a picture my thumb took by accident as I was putting away my phone. The original photo was blurry, out of focus, and weirdly composed. But the painting’s handmade feeling makes your eye linger on the details, and the results are just lovely. (Everyone’s Instagram is about to get a lot prettier!)