Perceptive media meets the visual

[Image: Visual Perceptive Media real-time grading – changing the colour grade]

The Next Web broke the story after seeing a tweet from BBC R&D on Thursday, but others have followed.

So what is this visual perceptive media thing?

Imagine a world where the narrative, background music, colour grading and general feel of a drama are shaped in real time to suit your personality. This is called Visual Perceptive Media and we are making it now in our lab in MediaCityUK.

The ability to customise or even personalise media (video in this case) in a browser, using no special back-end technology or delivery mechanism, is fascinating. It's all JavaScript, client-side technologies and standard HTTP in a modern web browser. Because of this it's open, not proprietary, and I believe scalable (the way it should be). This also means that when we do make it public, as many people as possible can experience it, fitting with the BBC's public purpose.
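To give a flavour of what client-side-only personalisation can look like, here's a minimal sketch (not our actual code; the "warmth" value and its mapping are invented for illustration) that regrades an HTML5 video with nothing more than a CSS filter:

```javascript
// Minimal illustration only – not the Visual Perceptive Media code.
// Grade an HTML5 video entirely on the client by mapping a single value
// (a made-up "warmth" score between 0 and 1) onto CSS filters.
const video = document.querySelector('video');

function applyGrade(warmth) {
  // Warmer settings get a slightly saturated, sepia-tinted image;
  // cooler ones get a desaturated, higher-contrast look.
  const sepia = Math.round(warmth * 40);          // 0–40%
  const saturate = 80 + Math.round(warmth * 60);  // 80–140%
  const contrast = 120 - Math.round(warmth * 20); // 100–120%
  video.style.filter =
    `sepia(${sepia}%) saturate(${saturate}%) contrast(${contrast}%)`;
}

// The driving value could come from anywhere – a questionnaire,
// listening data, even shoe size. Here it is simply hard-coded.
applyGrade(0.7);
```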

More details of the project will emerge soon, but I wanted to make certain things clear.

The project isn't a one-off; it's part of a line of projects around our object-based media ambitions. Some others were shown at Edinburgh this summer, and IP Studio is a big part of it. There have even been some projects very similar to Visual Perceptive Media, including Forecaster.

Perceptive Media (implicit) has always been about audience experiences and sits as an alternative to responsive media (explicit). Breaking Out and the Perceptive Radio, like this project, are new experiences we have been building, underpinned by IP technology and a rethinking of our notions of media as a solid, monolithic block.

Lego Bricks

You are already seeing this happen with the movement around Stems in music. However, while audio manipulation in the open environment of the web is relatively easy via the Web Audio API, there's no real unified API like it for video. SMIL was meant to be that, but it got sidelined as HTML5 pushed the capabilities into browsers rather than media players.
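On the audio side, the Web Audio API already makes this kind of layering straightforward. As a rough sketch (the stem file names and gain values here are invented), you can pull down individual stems and balance them per listener:

```javascript
// Rough sketch of mixing music stems in the browser with the Web Audio API.
// Stem URLs and gain values are invented for illustration.
const ctx = new AudioContext();

async function loadStem(url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  return ctx.decodeAudioData(arrayBuffer);
}

async function playMix(gains) {
  const stems = ['drums.ogg', 'bass.ogg', 'strings.ogg'];
  const buffers = await Promise.all(stems.map(loadStem));

  buffers.forEach((buffer, i) => {
    const source = ctx.createBufferSource();
    const gain = ctx.createGain();
    source.buffer = buffer;
    gain.gain.value = gains[i]; // per-listener balance of each stem
    source.connect(gain).connect(ctx.destination);
    source.start();
  });
}

// e.g. push the strings up and the drums down for one listener
playMix([0.4, 0.8, 1.0]);
```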

We have been working in this area and have looked at many options, including Popcorn.js. In the end we started creating a video compositor library and recently open-sourced it. Without that library, the project would still be in our lab.
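The library is worth a look in its own right, but the underlying idea is simple enough to sketch without it. This is just an illustration of the technique, not the library's API: draw your video sources onto a canvas every frame and blend between them.

```javascript
// Bare-bones illustration of client-side video compositing:
// crossfade two <video> elements onto a <canvas> each frame.
// This is NOT the API of the open-sourced compositor library.
const canvas = document.querySelector('canvas');
const ctx2d = canvas.getContext('2d');
const clipA = document.querySelector('#clipA');
const clipB = document.querySelector('#clipB');

let mix = 0; // 0 = all clipA, 1 = all clipB

function render() {
  ctx2d.clearRect(0, 0, canvas.width, canvas.height);
  ctx2d.globalAlpha = 1 - mix;
  ctx2d.drawImage(clipA, 0, 0, canvas.width, canvas.height);
  ctx2d.globalAlpha = mix;
  ctx2d.drawImage(clipB, 0, 0, canvas.width, canvas.height);

  mix = Math.min(1, mix + 0.002); // slow crossfade between the two sources
  requestAnimationFrame(render);
}

clipA.play();
clipB.play();
render();
```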

There has been some criticism about the personality side of things.

Data ethics is something we have been thinking and talking about a lot. Earlier this year we created a microsite summing up some of our thoughts and raising the opinions of some industry experts. The question of the filter bubble was talked about by many, but we didn't include it in the short documentaries; maybe now would be a good time to dig them out.

But before I dive into the deep end, it's important to say we are using personality simply as a proxy for changing things. It could have been anything; someone even suggested we could have used shoe size. We chose personality after meeting Preceptiv a long while ago and being impressed by their technology.

The next thing was to connect the data to changeable aspects of a film. Filmmakers are very good at this, and working with Julius Amedume (film director and writer) we explored the links between personality and effect. Colour grade and music, along with shot choices, were the ones we felt were most achievable.
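As a purely hypothetical illustration of that mapping (the trait names, thresholds and variant labels here are invented, not what Julius and the team settled on), the decision logic can be as simple as:

```javascript
// Hypothetical mapping of personality data to film variants.
// Trait names, thresholds and variant labels are made up for illustration.
function chooseVariants(traits) {
  return {
    // e.g. a more extraverted viewer gets a warmer grade and busier score
    grade: traits.extraversion > 0.5 ? 'warm' : 'cool',
    music: traits.openness > 0.6 ? 'electronic' : 'orchestral',
    shots: traits.neuroticism > 0.5 ? 'close-ups' : 'wide',
  };
}

console.log(chooseVariants({ extraversion: 0.7, openness: 0.4, neuroticism: 0.2 }));
// -> { grade: 'warm', music: 'orchestral', shots: 'wide' }
```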

There's a lot more I could say, most of which was said at the This Way Up conference panel: The Film Is Not Enough.

The day before (Wednesday) we did our first somewhat public, but closed-door, reveal of a very early preview of Visual Perceptive Media with 16 industry people. It was originally meant to be a smaller number, but demand was such that we increased the numbers and the machines needed to view it. The technical challenges did cause problems, but with the help of Anna from AND Festival, myself and Andy from R&D got some good feedback. We are still crunching the feedback, but I expect the frank discussions will be the most enlightening.

The panel discussion on Thursday was great. I gave the following presentation after Gabby asked me to give more context to the video here. I was my usual firestarter self and maybe caused people to think quite a bit. The trend towards events around film is welcome and there are some great people doing amazing things, but I was questioning film itself. We should demand more from the medium of film…

Some of the feedback afterwards was quite amazing. I had everything from "This will not work!" (I spent 15 productive minutes talking with one person about this) to in-depth questioning of what we have done so far and how, which revealed nothing.

I had a good chuckle at this tweet and must remember to bring it up at my next appraisal.

I generally don't want to say too much because the research should speak for itself, but it's certainly got people thinking and talking, and hopefully more of the BBC R&D projects around object-based media will start to complete the picture of what's possible and show the incredible value the BBC brings to the UK.

https://twitter.com/AndyRae_/status/672436090389794816

