It was Si Lumb who tweeted me about Pixar’s Inside Out contextual visuals.
Now I know this isn’t anything new (films have had region differences for a long while), but it’s good to see it discussed openly, and it was interesting to read about how (we think) they do it.
It’s interesting to note that the bottom five entries of the list, starting with “Thai Food,” remain consistent throughout (maybe Disney/Marvel Studios’ digital wizards couldn’t replace the stuff that Chris Evans’ hand passed over), but the top items change a lot.
Which leads me to think it’s all done in post-production using things like Impossible Software?
Post-producing this stuff is a mistake in my mind, but then again I’m working on the future of this kind of thing with Perceptive Media. I also imagine the writer and director had no time to think about variations for different countries, or weren’t paid enough?
Rather than write up my thoughts on how to do this with digital cinema (isn’t this part of the promise of digital cinema?), especially as I’m writing a paper with Anna Frew about this, I thought it was about time I wrote something about the project I’m currently working on.
Visual Perceptive Media
Visual Perceptive Media is a short film which changes based on the person watching it. It uses data from a phone application to build a profile of the viewer via their music collection and some basic questions. That profile is then used to decide which variations to apply to the media when it is watched.
The variations are applied in real time and include different music, different colour grading, different video assets and effects, and much more. We’re using the WebAudioAPI, WebGL and other open web technologies.
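To make that a bit more concrete, here’s a rough sketch (not our actual code, and all the names are made up for illustration) of how a viewer profile could drive real-time variations using the open web APIs mentioned above:

```typescript
// A minimal, hypothetical sketch of profile-driven variations.
// None of these types or functions are from the real project.

interface ViewerProfile {
  favouriteGenres: string[]; // derived from the phone app's music scan
  energy: number;            // 0..1, from the basic questions
}

interface Variation {
  musicTrackUrl: string; // which score to play under the film
  warmth: number;        // colour-grade tint, -1 (cool) .. 1 (warm)
  musicGain: number;     // 0..1 music mix level
}

// Map the profile to a set of variation parameters.
function chooseVariation(profile: ViewerProfile): Variation {
  const likesElectronic = profile.favouriteGenres.includes("electronic");
  return {
    musicTrackUrl: likesElectronic ? "score-electronic.mp3" : "score-orchestral.mp3",
    warmth: profile.energy > 0.5 ? 0.3 : -0.2,
    musicGain: 0.6 + 0.3 * profile.energy,
  };
}

// Apply the variation while the film plays: Web Audio handles the music mix,
// and a CSS filter stands in here for a proper WebGL colour-grading shader.
function applyVariation(video: HTMLVideoElement, v: Variation): void {
  const ctx = new AudioContext();
  const music = new Audio(v.musicTrackUrl);
  const source = ctx.createMediaElementSource(music);
  const gain = ctx.createGain();
  gain.gain.value = v.musicGain;
  source.connect(gain).connect(ctx.destination);

  video.style.filter =
    v.warmth > 0 ? `sepia(${v.warmth})` : `hue-rotate(${v.warmth * -30}deg)`;

  video.play();
  music.play();
}
```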
What makes this different or unique…?
- We had buy-in from the scriptwriter and director (Julius Amedume was both, and amazing) right from the very start, which makes a massive difference. The scripts were written with all this in mind.
- It was shot and edited with its intended purpose of real-time variations in mind.
- Most things we (BBC R&D) have done in the responsive/perceptive area have been audio-based, and this I would say is a bit of a moonshot moment, like Breaking Out three years ago! Just what I feel the BBC should be doing.
- Keeping with the core principle of Perceptive Media, the app, which Manchester-based startup Percepiv (formerly moment.us; I wonder if working with us had a hand in the name change?) created using their own closely related technology, mainly uses implicit data to build the profile. You can check out music+personality on your own Android or iPhone now.
It’s going to be very cool, and I believe the technology has got to the point where we can do this so seamlessly that people won’t even know or realise (this is something we will be testing in our lab). As Brian McHarg says, there are going to be some interesting water cooler conversations, but the slight variations are going to be even more subtle and interesting.
@BrianMcHarg @cubicgarden yeah, less "did you see", more "what was it like for you?" I'd love to see more like "let's play" coverage for tv.
— Si Lumb (@si_lumb) July 26, 2015
This is no branching narrative
I have been using the word variations throughout this post because I really want us to get away from the notion of edits or versions. I recently had the joy of going to Learn Do Share Warsaw, and was thinking about how to explain our thinking with the Visual Perceptive Media project. How do you explain something which has 2 film genres, 6 established endings, 20+ music genres and an endless number of lengths and effects?
This certainly isn’t a branching narrative, and the idea of branching narrative is simply not apt here. If it were a branching narrative, it would have upwards of 240 versions, not including any of the more subtle effects to increase your viewing enjoyment. I think of them as variations, and the language works when you consider the Photoshop Variations tool. This was very handy when talking to others not so familiar with Perceptive Media, but it’s only a step, and it makes you think there might be editions…
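For what it’s worth, that 240 figure is just the combinations multiplied out (treating the “20+” music genres as 20, and ignoring all the subtler tweaks):

```typescript
// Back-of-the-envelope count of "versions" if every combination were a
// separate branch; the real project layers on far more subtle variations.
const filmGenres = 2;
const endings = 6;
const musicGenres = 20; // "20+" in practice

console.log(filmGenres * endings * musicGenres); // 240
```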
I was talking to my manager Phil about it before heading to Warsaw and came up with something closer to the tesseract/hypercube in Interstellar (if you’ve not seen it, spoiler alert!).
Unlimited isn’t quite right, but the notion of time and variations which intersect is much closer to the idea. I said to Si Lumb that maybe the way to show this would be in VR, as I certainly can’t visualise it easily.
When it’s up and running, I’d love people to have a go so we can get some serious feedback.
Perceptive media breaks into the ad industry #pmedia @cubicgarden https://t.co/cp66ikh2SN
— Tony Churnside (@TonyChurnside) July 24, 2015
On a loosely related subject, Tony Churnside also tweeted me about Perceptive Media breaking into the advertising industry.