Adobe Audition uses XML, like Audacity files

https://cubicgarden.com/2019/03/03/hooray-audacity-files-are-xml/

Today I tried to open an Adobe Audition file which a Salford student sent me for a potential perceptive podcast. I knew it wouldn’t open but I wanted to see which applications Ubuntu would suggest.

Instead it opened in the Atom editor, and I was surprised to find a perfectly reasonable XML file. A quick search confirmed it.

Like Audacity projects and FinalCutXML, these files can easily be transformed with XSLT or any other programming language. Extremely useful for future user interfaces. Surely someone will do something with this one day?
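Because the session is just XML, you can poke at it with completely standard tooling. Here’s a minimal Python sketch using the standard library’s ElementTree; note the element and attribute names below are invented for illustration and are not the real .sesx schema.

```python
import xml.etree.ElementTree as ET

# Illustrative stand-in for an Audition-style session file.
# The tags and attributes here are assumptions, not the real schema.
sample = """<?xml version="1.0"?>
<session name="podcast-episode">
  <tracks>
    <track name="voice">
      <clip file="intro.wav" start="0"/>
      <clip file="interview.wav" start="4500"/>
    </track>
  </tracks>
</session>"""

root = ET.fromstring(sample)
# Walk every clip in the session and list the media it references.
for clip in root.iter("clip"):
    print(clip.get("file"), clip.get("start"))
```

The same traversal works whether the file came from Audition, Audacity or Final Cut, which is what makes XML project formats so handy for building new interfaces on top of existing tools.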

My Data: Public spaces / Private data

Mydata 2019 conference card

I’m back at MyData this year, this time with more colleagues, PublicSpaces.net and the Finnish public broadcaster YLE.

If you are at MyData, our event is in Hall H from 14:00 – 15:45 on the opening day, Wednesday 25th September.

More and more people live their lives online, and we are encouraged to view the internet as a public space. However, the personal data we bring to this space can be used in many inappropriate ways: Instagram stories are scraped to target advertising; faces in family photographs are used to train the ML systems that will scan crowds for suspects; the devices we thought we owned end up owning us; and our browsing histories are stored and scanned by governments and private companies. This creates a tension for public service organisations as they try to deliver value to audiences and users online.

In this session, experts from BBC Research & Development, the Finnish Broadcasting Company YLE, and PublicSpaces will consider how to resolve these tensions, and look at some specific interventions aimed at providing value to audiences and communities through the responsible use of private data in online public spaces.

The format will be four brief talks and a round table discussion.

  • Chair: Rhianne Jones (BBC)
  • PublicSpaces and an internet for the common good: Sander van der Waal (PublicSpaces)
  • The Living Room of the Future: Ian Forrester (BBC)
  • How public service media can engage online: Aleksi Rossi (YLE)
  • Data Stewardship and the BBC Box: Jasmine Cox / Max Leonard (BBC)

If this interests you, don’t forget to add yourself to the London event with a similar name. Public Spaces, Private Data: can we build a better internet?

Computational photography is just the start

Tree scene with sunlight
Far Cry 5 / A Run in the Park

I found it interesting to read how virtual photography (taking photos in videogames) could be imaging’s next evolution. A while ago I mentioned how stunning computational photography was when using my Google Pixel 2’s Night Sight mode.

There’s a project BBC R&D has been working on for a while which fits directly into the frame of computational media. We have named it REB, or Render Engine Broadcasting. Like OBM (object-based media), there’s a lot of computation used in the production of media, but I think there are a ton of more interesting research questions on the user/client/audience side.

It’s clear computational media is going to be a big trend in the next few years (if not now?). You may have heard about deepfakes in the news, and that’s just one end of the scale. Have a look through this Flickr group. It’s worth remembering that HDR (high dynamic range) is an early, widely accepted type of computational photography. I expect in-game/virtual photography is next, which is why I’ve shown in-game photography to make the point about where we go next.
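To make “computational” concrete, here is a toy Python sketch of the idea behind HDR merging (exposure fusion): each bracketed exposure contributes most where its pixels are well exposed, near mid-grey. Real pipelines like Night Sight are vastly more sophisticated; all the numbers below are invented and this only shows the core intuition.

```python
import math

def well_exposedness(v, mid=0.5, sigma=0.2):
    # Gaussian weight that peaks at mid-grey (pixel values are 0..1 floats).
    return math.exp(-((v - mid) ** 2) / (2 * sigma ** 2))

def fuse(exposures):
    # exposures: list of equal-length pixel lists, one per bracketed shot.
    fused = []
    for pixels in zip(*exposures):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights) or 1.0
        # Each output pixel is a weighted blend of the same pixel
        # across all exposures, favouring the well-exposed shots.
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

under = [0.05, 0.10, 0.40]  # dark frame: keeps highlight detail
over  = [0.45, 0.80, 0.98]  # bright frame: lifts the shadows
merged = fuse([under, over])
```

Every merged pixel lands between the two source exposures, pulled towards whichever frame exposed it best; that blending, not any single capture, is the “photograph”.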

Hellblade: Senua's Sacrifice / Up There

Just as we assume every picture we see has been Photoshopped, we will have to assume all media has been modified, computed or even completely generated. Computational capture and machine vision/learning really are things we have to grapple with. Media literacy, and tools to more easily identify computational media, are what’s missing. But the computational genie is out of the bottle and can’t be put back.

There are also many good things about computational media, beyond sheer consumption.

While I cannot deny that my real world photography experience aids my virtual photography through the use of compositional techniques, directional lighting, depth of field, etc. there is nothing that you cannot learn through experience. In fact, virtual photography has also helped to develop my photography skills outside of games by enabling me to explore styles of imagery that I would not normally have engaged with. Naturally, my interest in detail still comes through but in the virtual world I have not only found a liking for portraiture that I simply don’t have with real humans, but can also conveniently experiment with otherwise impractical situations (where else can you photograph a superhero evading a rocket propelled grenade?) or capture profound emotions rarely exhibited openly in the real world!

Virtual photography has begun to uncover a huge wealth of artistic talent as people capture images of the games they love, in the way they interpret them; how you do it really is up to you.

It’s a new type of media, with a new sensibility and a new type of craft…

Of course it’s not all perfect.

https://twitter.com/iainthomson/status/1165755171923587072

Black Mirror choices can be snooped on?

Magic box

I have so much to say about Bandersnatch; most of it has been written here. But it’s clear that Netflix hasn’t given up on the medium and is even doubling down on it.

Something popped into my feed about a research paper saying you can snoop on the choices of people using Netflix’s interactive system. I’m hardly surprised, as it’s typical network analysis plus GDPR requests. But it reminds me how important the work we have done with perceptive media is.

I best explain it as delivering (broadcasting) the experience as a contained bundle which unfolds within the safe space (maybe the living room) of the audience. Nothing is sent back to the cloud/base. This is closer to the concept of broadcast, and means the audience/user(s) and their data aren’t surveilled by the provider. This is exactly how podcasts used to work before podcast providers started focusing on metrics and providing apps which spy on their listeners. I would suggest the recent buyout of Gimlet Media by Spotify might point this way too.
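To show why the snooping is so unsurprising, here’s a hypothetical Python sketch of the traffic-analysis idea: if each narrative branch is delivered as a video segment with a distinct size, an observer who only sees encrypted byte counts can still guess the choice. The branch names and segment sizes below are invented, not Netflix’s.

```python
# Hypothetical per-branch segment sizes an observer could learn by
# fetching each branch themselves beforehand (invented numbers).
branch_sizes = {
    "accept": 4_812_331,  # bytes for the "accept" branch segment
    "refuse": 5_107_220,  # bytes for the "refuse" branch segment
}

def guess_choice(observed_bytes):
    # Pick the branch whose known size is closest to what was seen
    # on the wire; encryption hides content but not volume, and
    # padding/overhead only makes this inexact, not impossible.
    return min(branch_sizes,
               key=lambda b: abs(branch_sizes[b] - observed_bytes))

print(guess_choice(4_800_000))  # an eavesdropper's guess: accept
```

A contained, pre-delivered bundle defeats this entirely: every audience member downloads identical bytes, so there is nothing for the wire to leak.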

Of course this broadcast/delivery model doesn’t work too well for surveillance capitalism, but that’s frankly not my problem; and all audience interaction data should (especially under HDI) be explicitly agreed upon before it is shared or exported.

I might be idealistic about this all but frankly I know I’m on the right side of history and maybe the coming backlash.

27-28th Feb is Manchester’s first Storytellers United Hackjam

storytellers united hackjam

On Wednesday 27th – Thursday 28th February, Manchester hosts its first Storytellers United Hackjam.

The hackjam is run with support from BBC R&D and BBC Academy, MMU’s School of Digital Arts (SODA), Storytellers United, Popathon, University of York’s Digital Creativity labs and Creative England.

It’s a 36-hour hackathon around responsive/perceptive/adaptive media experiences. Participants work in teams to brainstorm ideas and create prototypes of their own storytelling experiences. They will compete against the clock, not against each other, sharing knowledge and expertise as they go. They won’t be alone, as they will have excellent mentoring from industry experts sharing their knowledge and experience. It’s all part of BBC Academy’s Manchester Digital Cities week.

The hackjam is only part of the story. On the late afternoon of Thursday 28th Feb there will be a mini-conference titled Storytelling in the Internet Age, where promising prototypes will be demoed to the audience.

Collaborating together

Ideal participants are from the creative sectors such as,

  • Freelancers, sole traders and SMEs working in new media fields combining data with media
  • Producers and Directors interested in adaptive and non-linear narratives, may have tried twine, eko, inkle, etc
  • Developers and designers with an interest in audio & video combined with data, who have used JavaScript libs like VideoContext.js, Seriously.js, etc
  • Students and Academics with a deep interest in object based media, adaptive narratives, interactive digital narrative
  • Artists exploring mixed media and non-linear narratives

Tickets are free but count as an expression of interest, with no guaranteed entry.

See you there!

Perceptive theme park rides?

Tony tweeted me about this thrill machine which uses body data to influence how the ride operates. The link came via Mashable, and I was able to trace it back to the original:

“…while building this attraction I also wanted to change the usual one-sided relation – a situation where the body is overwhelmed by physical impressions but the machine itself remains indifferent, inattentive for what the body goes through. Neurotransmitter 3000 should therefore be more intimate, more reciprocal. That’s why I’ve developed a system to control the machine with biometric data. Using sensors, attached to the body of the passenger – measuring his heart rate, muscle tension, body temperature and orientation and gravity – the data is translated into variations in motion. And so, man and machine intensify their bond. They re-meet in a shared interspace, where human responsiveness becomes the input for a bionic conversation.”

https://danieldebruin.com/neurotransmitter-3000

It’s a good idea but unfortunately wouldn’t work on rollercoasters, which are my thing. Or could it? For example, everyone’s hands up in the air means what? The ride goes faster? How on earth would that work? How meaningful would it be if you could actually do this?

It’s one of the research questions we attempted to explore in the Living Room of the Future: how can you combine different people’s personal data to construct an experience which is meaningful, and not simply an average of it all?

These global changes don’t seem that meaningful or useful. Maybe it’s about micro changes, like those mentioned previously.
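A toy Python sketch of that global-versus-micro trade-off: mapping riders’ heart rates into ride behaviour. All the numbers, thresholds and mappings here are invented, purely to make the contrast concrete.

```python
# Invented biometric readings for three riders (beats per minute).
heart_rates = {"seat_1": 92, "seat_2": 140, "seat_3": 118}

def global_speed(rates, base=1.0):
    # One "global" change: scale the whole ride by the average
    # arousal of everyone on board. Individual reactions vanish
    # into the mean, which is why it feels less meaningful.
    avg = sum(rates.values()) / len(rates)
    return base * (avg / 100)

def micro_effects(rates, threshold=130):
    # "Micro" changes: a per-seat decision, e.g. easing the motion
    # for whoever is closest to panic, keeps each person's data
    # meaningful rather than averaged away.
    return {seat: ("ease off" if bpm > threshold else "hold")
            for seat, bpm in rates.items()}

print(global_speed(heart_rates))   # one number for the whole train
print(micro_effects(heart_rates))  # a decision per rider
```

The global version reduces three people to one speed multiplier; the micro version responds to each rider, which is closer to the reciprocity the Neurotransmitter 3000 quote describes.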

Of course, others have been working on this type of thing too.

Did Netflix scorch the earth of interactive digital narrative?

Netflix - Black mirror
Bandersnatch

Everyone is talking about Black Mirror: Bandersnatch, and to be fair, after watching 5hrs 13mins of it and seeing every version/variation, it’s quite something. But even before it launched there were problems.

I agree it’s slick, but it’s also very interesting to read Charlie Brooker’s thoughts on the experience of creating it.

Creator Charlie Brooker told The New York Times that he won’t be making more interactive episodes of the Netflix series – so no more difficult cereal choices in the future.
Asked what advice he had for anyone attempting to make interactive TV, Brooker added: “Run away. It’s harder than you think.”

I wonder if Bandersnatch will ultimately cause people to avoid IDNs (Interactive Digital Narratives) or adaptive narratives. It would be a real shame if it did but as Tom says in reply to my thoughts earlier today

I do wonder if Netflix has done some damage by doing something so extreme? Something of a firework which everyone saw, and which caused a fire as it rained down on people’s heads?

Maybe James is right, along with Tom? Explicit Interactive Digital Narratives have been done to death. You only have to look at the stuff Marian was doing in the mid-to-late 2000s with shapeshifting media.

I predict that in a year or so, people will have forgotten Bandersnatch (packed away on a top shelf, as James says), but this isn’t good news for all those other productions and experiments which may not be as smart but are a genuine pleasure to be part of.

Will funding for IDN dry up or boom because of Bandersnatch? Hard to tell at this stage.

What I would like from Netflix is some data/numbers on repeat viewings, paths people take, etc. If I was writing a paper, this would be a good experiment to be in on.

Less blogging recently…

You might have noticed less blogging from me recently. There are a number of reasons, mainly to do with being on holiday in Portugal & Spain. But I’m also working on the Living Room of the Future project, something I highly recommend you sign up to experience.

I did about 6 Pacemaker mixes while away on holiday, but I would say only 3, maybe 4, are worth publishing. So look out for them on Mixcloud.com.

Leaving Madrid, recorded on the plane back to Manchester

  1. First attempt – Tomcraft
  2. Energy Flash (Graffiti on mars remix) – Joey Beltram
  3. Flight 643 (oliver klein remix) – Ferry Corsten
  4. Fractal – Bednar
  5. I feel wonderful (cosmic gate’s from AM to PM mix) – Cosmic gate feat Jan Johnston
  6. She wants him (Blake Jarrells panty dropper mix) – Moussa Clark & Terrafunka
  7. Opium – Jerome Isma-Ae & Alastor
  8. Suru (martin roth electrance remix) – super8 & tab
  9. Anomaly (Eeemus’s Higgs Boson remix) – Gordey Tsukanov

The heights of Lisbon, recorded during the evenings in Lisbon.

  1. Open up – Leftfield
  2. Loneliness (club mix) – Tomcraft
  3. Whites of her eyes – Simon Patterson
  4. Delores – Indecent Noise
  5. From Russia with love (matt darey mix) – Matt Darey pres DSP
  6. Jump the next train (Vadim Zhukov dub) – Young Parisians feat Ben Lost
  7. Labyrinth (Paul Keyan remix) – Lee Cassells
  8. Strange world (M.I.K.E’s rework 2006) – Push
  9. Souvenir De Chine – Fire & Ice
  10. Take me away (into the night) (purple haze remix) – 4 Strings
  11. Sweet little girl (Voolgarizm remix) – Mario Piu
  12. Tenshi – Gouryella
  13. Uncommon world – Bryan Kearney
  14. We are one (instrumental mix) – Dave 202
  15. Why does my heart feel so bad (Ferry Corsten remix) – Moby
  16. Anahera (extended mix) – Ferry Corsten pres Gouryella

Raving in Albufeira, recorded on a long bus ride from Albufeira to Faro

  1. Sunset (bird of prey) – Fatboy Slim
  2. Rheinkraft (extended mix) – Oliver Klein
  3. DJ Culture (Joey Beltram mix) – Kevin Saunderson
  4. Revolving Doors (club mix) – Ronski Speed
  5. Wrist block (Joey Beltram remix) – Side Four
  6. Running up the hill (Jerome isma-ae bootleg mix) – Placebo
  7. Flat Beat – Mr Oizo
  8. Shnorkel – Miki Litvak & Ido Ophir
  9. Valhalla (tonerush remix) – OneBeat
  10. Higher state of consciousness (dirty south remix) – Josh Wink
  11. Aumento – Joey Beltram
  12. EDM Death Machine – Knife Party
  13. A9 – Ariel
  14. Brainwashed (Club mix) – Tomcraft
  15. Gouryella (extended mix) – Gouryella
  16. Anahera (extended mix) – Ferry Corsten pres Gouryella

I finally took up the gratitude habit and started publishing them here. Standard Notes has quite a nice system to publish notes but also keep parts secret if you choose. It’s like what I imagined for mydreamscape ages ago.

Rethinking Podcasting

Reinventing podcasting
Ok maybe less reinvent and more rethink?

I hinted at Perceptive Podcasting previously in a post about being busy. I have finally come out of that busy period and am UK bound as my passport is due to expire.

Just before the busy period, I drafted a post about Perceptive Podcasting and why it’s not simply another unique project. It went up on the BBC R&D blog recently which is wonderful because I can point to that rather than the other way around.

Perceptive Radio v1

Since we first launched the Perceptive Radio v1 in 2013 as a concept of what Perceptive Media (implicit interaction from sensors & data, adapting media objects) could become, the radios have always been a framework to explore further into adaptive object-based media experiences. But we have always acknowledged the growing power of the smartphone and how it could be the container for so much more.

Even when we created the Perceptive Radio v2 with Lancaster University and Mudlark, it was modelled around an Android phone and extending its sensors. The possibilities of IoT storytelling with object-based media were deep in my mind, along with research questions.

As a person who saw the revolution of podcasting in 2000, I was always interested in the fact that it’s downloaded audio, generally consumed and created in a personal way, unlike radio in my view. I’ve also been watching the rise in popularity of podcasting again; heck, TechCrunch asks if it could save the world 🙂

Of course I’ve started a few podcasts myself (recently Techgrumps and Lovegrumps) and love the fact it’s quite easy to get started and can feel quite personal. I also find the diversity of podcasting quite interesting; for example I’ve been listening to The Guilty Feminist, Friends Like Us and Risk! for quite some time and find them fascinating every time.

Why a client for podcasts?

In 2017, you are seeing more web services hosting podcasts, like Stitcher (heck, even Spotify is hosting some). On the server side there is a lot you can do, like dynamically changing adverts, geo-fencing media, etc. 60db is one such service doing nice things with podcasts, but they are limited in what they can do, as they said in a comment on a similar post. But doing all this server-side is a pain and tends to break the podcast idea of downloadable audio (even if you have 4G everywhere); it feels more like the radio model of tuning in.

Imagine if you could do the server-side type of processing on the actual device, and even unlock the pools of sensor data with the user’s consent? And imagine if creators could use this in storytelling too!
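A minimal Python sketch of what that client-side model could look like: the podcast ships as a bundle of media objects plus selection rules, and the device assembles the playlist from local sensor data that never leaves the phone. The object names, conditions and sensor fields are all invented for illustration, not the real Perceptive Podcast format.

```python
# An invented bundle: each story part offers variants keyed by a
# condition, with "any" as the unconditional fallback.
bundle = {
    "intro":   {"day": "intro_bright.mp3", "night": "intro_calm.mp3"},
    "story":   {"any": "story_main.mp3"},
    "weather": {"rain": "scene_rain.mp3", "clear": "scene_clear.mp3"},
}

def assemble(bundle, sensors):
    # All decisions happen on the device; nothing is reported back,
    # which is the difference from server-side dynamic insertion.
    playlist = []
    for part, variants in bundle.items():
        for condition, obj in variants.items():
            if condition == "any" or sensors.get(condition):
                playlist.append(obj)
                break
    return playlist

local_sensors = {"night": True, "rain": True}  # read on-device only
print(assemble(bundle, local_sensors))
# ['intro_calm.mp3', 'story_main.mp3', 'scene_rain.mp3']
```

The listener gets a tailored edit, the creator gets new storytelling levers, and the provider never sees the sensor data; the only thing that crossed the network was the bundle itself.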

It’s personal, dynamic and responsive without being creepy or infringing on personal liberties. It adapts to changes in context in real time. It dances with interactivity, and we are also exploring the value and feasibility of object-based media approaches for engaging with audiences. We believe this offers the key to creating increasingly immersive media experiences, as it gives more story possibilities to the writer/director/producer. But it also provides levels of tailored accessibility we have yet to imagine.

So many possibilities, and it’s made in a very open way to encourage others to try making content in an object-based way too.

Keep an eye on bbc.co.uk/taster and the bbc.co.uk/rd/blog for details soon.

Deja’Vu or generated reality

I saw “AI artist conjures up convincing fake worlds from memories” via Si Lumb and instantly thought about my experience of watching Vanilla Sky for the first time.

It could be incredible and terrifying for perceptive media, but alas, the best technology always sits right on the fence, waiting for someone to drag it in one direction or another.

Human & AI Powered Creativity in Storytelling from TOA Berlin 2017

I already wrote about TOA Berlin and the different satellite events I took part in. I remember how tired I was, getting to Berlin late and then being on stage early doors, with multiple changes on public transport; I should have just taken a cab really.

No idea what was up with my voice, but it certainly sounds a little odd.

Anyhow lots of interesting ideas were bunched into the slide deck, and certainly caused a number of long conversations afterwards.

Someday all content will be made this way…

Lego Bricks

This is adapted from the BBC R&D blog post, but I felt it was important enough to repost on my own blog.

Object-based media (OBM) is something that BBC R&D has been working on for quite some time. OBM underpins many media experiences, including the one I keep banging on about: perceptive media.

I’ve spoken to thousands of producers, creators and developers across Europe about object-based work and the experiences. Through those discussions it’s become clear that people have many questions, there has been confusion about what OBM is, and other people would like to know how to get involved themselves.

So because of this… BBC R&D started a community of practice because we really do believe “Someday all content will be made this way.”

A community of practice brings together people and companies who are already working in the adaptive narrative field. BBC R&D believes the object-based approach is the key to content creation of the future: one which uses the attributes of the internet to let us all make more personal, interactive, responsive content, and by learning together we can turn it into something which powers media beyond the scope of the BBC.

There are three big aims for the community of practice…

  • Awareness: Seek out people and organisations already interested in or working on adaptive narratives through talks, workshops and conferences
  • Advocacy: Demonstrating best practice in our work and methods as we explore object-based media, connecting people through networks like the Storytellers United Slack channel, and helping share perspectives and knowledge.
  • Access: Early access to emerging software tools, to trial and shape the new technology together.

These aims are hugely important for the success and progress of object-based media.

As a start, we’re running a few events around the UK, because conferences are great but sometimes you just want to ask someone questions and get a better sense of what and why. Our current plan is linked from the BBC R&D post, which I update every time a new event goes live.

Back at the Quantified Self conference in June

Quantified Self 2011

I’m back at the Quantified Self conference; it’s been a few years due to scheduling and other conflicts. It’s actually been a while since I talked about the quantified self, mainly because I feel it’s so mainstream now that few people even know what it is, although they use things like Strava, Fitbits, etc.

The line-up for the Quantified Self conference is looking very good, there are plenty of good sessions for almost every palate, and I’ll be heading up this session while at the conference.

Using Your Data To Influence Your Environment

With home automation tools, it is now possible for your personal data to influence your environment. Soon, your personal data could be used to influence how a movie is shown to you! Let’s talk about the implications and ethics of data being used this way.

It’s basically centred around the notion that our presence affects the world around us, directly linking perceptive media and the quantified self together. Of course I’m hoping to tease out some of the complexity of data ethics with people who fully understand this and have skin in the game, as such.

I’m also looking to report back on this conference and restart the Manchester Quantified Self group, which went quiet a while ago.

Barbican’s black mirror exhibit

Black Mirror s1 ep 2: 15 million Merits

Interesting news recently: London’s Barbican will be opening an exhibit around the Black Mirror episode S1 E2, 15 Million Merits.

I’ll be personally interested to see how far down the perceptive media (or, as I used to call it, intrusive TV) route they go. I’ll also be interested to see if they use the chance to educate the public about data ethics and the value of data, like the Science Museum has done.

https://twitter.com/BarbicanCentre/status/850308312025489408