Mozilla Festival 2021 – It's all virtual and you are invited!

Me with a face covering in 2016
Even in 2016, I was ready for the pandemic?

It's finally here: Mozilla Festival 2021, and it's looking excellent.

I can’t tell you how long I have spent this evening looking at the hu

My adaptive podcasting workshop: Imagine being able to craft personalised podcasts which take advantage of data and sensors to wrap the listener in a story. Then imagine being able to do this for many people at once. This is what we call adaptive podcasting, and the best part is it's free, open…

I can finally tell you that two of my three submitted sessions were accepted. The big one is a workshop around adaptive podcasting, which will happen on Monday 15th at 2015–2115 GMT. Don't worry, there are calendar invites for all the sessions, including mine.

It's an hour-long workshop and it's unlikely we will get to the point of creating podcasts, but there will be follow-up sessions on the Mozfest Slack and the Storytellers United community.

Of course adaptive podcasting will appear elsewhere outside of Mozfest, so keep an eye on the blog for more information about that.

My other session is the advantages of neurodiversity, a follow-up to 2019's the advantage of dyslexia, which was highly influenced by the amazing book of the same name.

Advantage of Neurodiversity - Art: Last year we explored the advantages of dyslexia at the brand new Neurodiversity space. This year we are back, looking for people to explore and understand the advantages of different types of neurodiversity.

Look out for this one, as this art piece relies on your thoughts around the advantages of neurodiversity.

Adaptive bedtime lullabies

Oura lullabies

I gave Oura’s sleep story a try the other night. It was pretty good; I was pretty much asleep in under 10 minutes. I say 10 minutes because I couldn’t help but think how this could be so much better as an adaptive narrative or even an adaptive podcast?

Especially with the subject being around the moon.

I get that the bedtime/sleep story is meant to be something to fall asleep to, but imagine it fitting/adapting slightly to the moon phase, how your day has been, etc. Oura is sitting on a ton of personal data, and their system keeps that data secure for the user.

Perfect for personalised adaptive narratives.
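To make the moon phase idea concrete, here is a rough sketch of how a player could pick a lullaby variant from the lunar cycle. This is purely illustrative: the variant names and thresholds are my own assumptions, not anything Oura or the BBC actually does, and the phase calculation is an approximation based on the mean synodic month rather than an astronomy-grade model.

```javascript
// Approximate lunar phase from a date, using the mean synodic month.
// Hypothetical sketch: variant names and thresholds are illustrative only.
const SYNODIC_MONTH = 29.53058867; // mean days between new moons
const REF_NEW_MOON = Date.UTC(2000, 0, 6, 18, 14); // a known new moon (UTC)

function moonPhase(date) {
  const days = (date.getTime() - REF_NEW_MOON) / 86400000;
  // Fraction of the cycle elapsed: 0 = new moon, 0.5 = full moon
  return (((days % SYNODIC_MONTH) + SYNODIC_MONTH) % SYNODIC_MONTH) / SYNODIC_MONTH;
}

function pickLullabyVariant(date) {
  const phase = moonPhase(date);
  if (phase < 0.25) return "waxing-calm";
  if (phase < 0.5) return "bright-moon";
  if (phase < 0.75) return "waning-soft";
  return "dark-sky";
}
```

A real adaptive narrative engine would of course blend this with other signals (sleep data, how the day went), but even a single implicit signal like this is enough to make each night's story slightly different.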

Join us in exploring object-based media making tools


Like Visual Perceptive Media? Like the concept of Perceptive Radio? Like the JavaScript libraries we have put out in an open and public way? We want you to come on board and join us…!

We (BBC R&D) have been exploring the new reality of creating object based media through a range of prototypes. I have been exploring the implicit uses of data and sensors to change the objects; or as we started calling it a while ago Perceptive Media.

The big issue is that realistically creating and authoring these new types of stories requires a lot of technical knowledge and doesn’t easily sit in the standard content creation workflow. Or does it? We want to bring together people in a workshop format to explore the potential of creating accessible tools for authors and producers, ultimately seeding a community of practice through open experimentation and learning from each other.

The core of the workshop will focus on the question…

Is it desirable and feasible for the community of technical developers and media explorers to build an open set of tools for use by storytellers and producers?

Against the backdrop of the Sheffield International Documentary Festival, the workshop on Monday 13th June will bring people together; we are putting out a call for interested parties to work with us, with the aim of understanding how to develop tools which can benefit storytellers, designers, producers and developers.

We are calling for people, universities, startups, hackers and companies with a serious interest in opening up this area to reach out and join us. Apply for a ticket and we will be in touch.

Speaking at Bucharest technology week

Bucharest - Dâmbovița River

This week, on Thursday (26th May 2016), I’ll be speaking at the Enterprise-IT Summit during Bucharest Technology Week, a celebration of the positive impact technology can have on our personal and professional lives. It’s going to be at the Athénée Palace Hilton in Bucharest.

I’ll be talking on the future of contextual and adaptive media, something we are really trying to understand along with others, as you can see in this call for participation for a workshop on Monday 13th June.

You can browse the event agenda and full speaker line-up on the website.

I had never been to Eastern Europe until I went to Poland last year, and I have never been to Romania, but I am really looking forward to meeting all the great people involved in the digital and tech scene out there. It will be fun to test their creative thinking in a little workshop following my talk on the same subject.

What Cinema can learn from Broadcasting?


It’s weirdly ironic that I wrote a blog post about what cinema can learn from TV three years ago, almost to the day of the This Way Up conference in December that I’m about to talk at.

The This Way Up conference is a film exhibition innovation conference which launched last year. It returns with a jam-packed two-day event that promises to inspire and enlighten, provoke and challenge, connect and share.

I’ll be doing two things on behalf of BBC R&D.

The first one is on Wednesday: a lunchtime workshop around an unreleased Perceptive Media project I have been working on for most of the year.

Lunchtime Lab: BBC Perceptive Media Want to contribute to the evolution of storytelling? BBC Research and Development’s North Lab, based at MediaCityUK in Salford, showcase their latest experiment in a top secret, closed door workshop. A select group of THIS WAY UP attendees will try out a new smartphone app before being shown a premiere of a short film that looks to change the way we engage. Further details are strictly under wraps, but the BBC are looking for volunteers to take part in this limited study and to share and discuss their experiences with other participants. Workshop led by Ian Forrester, BBC R&D North lab. Results from the workshop will be revealed at Thursday’s The Film is Not Enough session.

It’s really research in the wild and we have no idea how the audience will react to this. The results will be intriguing, to say the least.

On the Thursday I’ll be on a panel talking about the changes which need to happen to regain the cinema audience.

The Film is not Enough – With the rise of event cinema, alternative content, enhanced screenings, sing-a-longs and tweet-a-longs, is there a danger that the original purpose of cinemas is being lost as audiences demand novelty and gimmickry? This panel will hear from those folk changing audience perceptions and expectations of what ‘coming to the cinema’ means. Panel includes: Tony Jones (Cambridge Film Festival), Jo Wingate (Sensoria), Rhidian Davis (BFI), Gaby Jenks (Abandon Normal Devices – chair), Lisa Brook (Live Cinema), and Ian Forrester (BBC Research & Development).

I’ll talk about details of the project experienced on Wednesday and explain why this is a good and scalable way to regain the TV and maybe the cinema audience. The panel should be good, with a range of viewpoints and Gaby Jenks from Abandon Normal Devices chairing the debate.

What cinema can learn from broadcast will be driven home by the keynote from Nick North, the director of Audiences at the BBC.

Look out for more details soon… but there’s already plenty of interest…

Context is queen?

I wanted my grandmother’s poker face…

I’m hearing a lot of talk about how 2013 is “The year responsive design starts to get weird”… or rather how it’s going to be all about responsive design (what happened to adaptive design, who knows).

Think it’s hard to adapt your content to mobile, tablet, and desktop? Just wait until you have to ask how this will also look on the smart TV. Or the refrigerator door. Or on the bathroom mirror.

Or on a user’s eye.

They’re all coming…if they aren’t already here. It doesn’t take much imagination or deep reading of the tech press to know that in 2013 more and more devices will connect to the internet and become another way for people to consume internets.

We’ll see the first versions of Google’s Project Glass in 2013. A set of smart glasses will put the internet on a user’s eyes for the first time. Reaction to early sneak peeks is a mix of mockery and amazement, mostly depending on your propensity for tech lust. We don’t know much about them, other than some tantalizing video, but Google is making them, so it’s a safe bet that Chrome For Your Eyes will be in there. And that means some news organization in 2013 is going to ask: “How does this look jammed right into a user’s eyeballs?”

Stop! Nieman Lab is forgetting something major! And I could argue they are still thinking in a publishing/broadcasting mindset.

Yes, the C word: Context…

Ironically this is something Robert Scoble actually gets in his blog post, The coming automatic, freaky, contextual world and why we’re writing a book about it.

A TV guide that shows you stuff to watch. Automatically. Based on who you are. A contextual system that watches Gmail and Google Calendar and tells you stuff that it learns. A photo app that sends photos to each other automatically if you photograph them together. And then there’s the Google Glasses (AKA Project Glass) that will tell you stuff about your world before you knew you needed to know. There is a new toy coming this Christmas that will entertain your kids and change depending on the context they are in (it will know it’s a rainy day, for instance, and will change their behavior accordingly)

Context is what’s missing. In the mindset of pushing content around (broadcast and publishing) and into people’s faces, responsive design sounds like a good idea. As soon as you add context to the mix, it doesn’t sound so great. Actually, it sounds downright annoying or even intrusive. I do understand it’s the best we’ve got right now, but as sensors become more common, we’ll finally be able to understand context and hopefully be able to build perceptive systems.

We have already demonstrated that sensors don’t have to be cameras, gyroscopes, etc. The referrer, operating system, screen resolution, cookies and so on are all bits of data which can (some more reliably than others) be used to understand context.
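As a rough sketch of what I mean, a page could turn those ordinary request signals into a crude context object without any special sensors at all. Everything here is illustrative: the signal names, categories and thresholds are my own assumptions, not any real Perceptive Media API.

```javascript
// Hypothetical sketch: inferring rough context from ordinary web signals
// (user agent, referrer, screen size, local hour). In a browser these would
// come from navigator.userAgent, document.referrer, screen.width and Date.
function inferContext({ userAgent = "", referrer = "", screenWidth = 0, hour = 12 }) {
  const context = {};

  // Device class from the user agent string, falling back to screen size
  context.device = /Mobile|Android|iPhone/.test(userAgent)
    ? "mobile"
    : screenWidth >= 1920 ? "large-screen" : "desktop";

  // Did the visitor arrive from a search engine?
  context.arrivedFromSearch = /google\.|bing\.|duckduckgo\./.test(referrer);

  // Rough time-of-day bucket for pacing or tone decisions
  context.timeOfDay = hour < 6 ? "night"
    : hour < 12 ? "morning"
    : hour < 18 ? "afternoon" : "evening";

  return context;
}
```

None of these signals is reliable on its own, which is rather the point: a perceptive system weighs several weak implicit signals rather than demanding one explicit answer from the user.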

I can come up with many scenarios where the responsive part gets in the way unless you are also considering the context. In a few years’ time, we’ll look back at this period and laugh, wondering what the heck we were thinking…

I’m with Scoble on this one… Context and Content are the Queen and King.