The collective is a community of cross-sector organisations and community representatives, working to (first) establish Greater Manchester as an equitable, inclusive and sustainable exemplar for responsible tech, through putting people first in its creation.
The collective is one of those interesting groups doing a lot around what I’m calling the public service internet ecosystem. Another group out of Manchester is Open Data Manchester.
I have been pretty busy recently and haven’t had as much time for blogging. To be fair, my Mastodon microblogging has increased quite a bit, can’t think why…
During the busy last few months, I gave a talk at Durham’s Bright ideas gathering. It was a really good event which felt like a TEDx, with a number of different topics and speakers.
Originally I was going to give a talk about the recently launched Adaptive Podcasting, but gave it more context around why it’s an important project. Along the way we stop at the big changes coming to the BBC, including my own personal view of moving to Manchester.
Thanks to Herb and the team, who delivered another excellent conference even in the middle of train strikes.
I shared the slides on Slideshare (which is still a thing, it seems).
With the Android app/player you can listen to adaptive podcasts. With the app/player installed, you can load and listen to podcasts you have made yourself. There is of course RSS support, providing the ability to load in a series of adaptive podcasts (replacing the default feed from BBC R&D).
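To give a rough idea of how a feed might reference adaptive podcasts, here is a minimal sketch of an RSS item with an enclosure pointing at an adaptive podcast file. This is only an assumption on my part: the URL, MIME type and any conventions the app/player actually expects are defined by the project, not here.

<!-- Illustrative only: one feed item pointing at an adaptive podcast file -->
<item>
  <title>Episode 1 - an adaptive walk</title>
  <enclosure url="https://example.org/episodes/episode1.xml" type="application/xml" length="12345" />
  <pubDate>Mon, 01 Aug 2022 09:00:00 GMT</pubDate>
</item>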
With access to the web editor on BBC Makerbox, you can visually create adaptive podcasts in a few minutes. Its node-like interface runs completely client side, meaning there is no server-side processing. Just like the app/player, which makes zero server callbacks to the BBC. Pure Javascript/HTML/CSS.
If you find the web editor not advanced or in-depth enough for you, there is the XML specification, which is based on SMIL, so the code can be written by hand or even generated. We even considered other editors like Audacity.
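To give a flavour of what that looks like, here is an illustrative SMIL-style fragment sequencing and layering audio. It is a sketch only, not copied from the actual specification; the real schema, including the parts that react to device sensors and variables, lives with the project itself.

<!-- Illustrative only: a SMIL-style sequence of audio clips -->
<smil>
  <body>
    <seq>
      <audio src="intro.mp3" />
      <par>
        <!-- layered: background bed and voice track play together -->
        <audio src="background-music.mp3" />
        <audio src="voice-track.mp3" />
      </par>
      <audio src="outro.mp3" />
    </seq>
  </body>
</smil>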
But I wanted to thank all the people who helped in making this go from the Perceptive Radio to Adaptive Podcasting. So far I have started a GitHub page, but will write the history of how this happened when I have more time. Partly because it’s an interesting story, but also because it demonstrates the power of collaboration, relationships, communities and the messy timeline of innovation.
It’s moving along thanks to some great friends who have done such a great job editing, structuring and shaping the book. But one thing I turned my attention to a while ago is the illustrations.
I did pay for an artist out of my own money but wasn’t quite happy with every single illustration for each chapter, so only had about half done. The rest I’m talking to another artist about, but recently I have been quite impressed with AI art generators like DALL-E 2, Midjourney and Nightcafe.
The generated works are strange and abstract enough to fit with what I’m looking for in the book. Not only that, the ownership and copyright seem to be working out (from what I have read about using DALL-E 2).
(c) Copyright. OpenAI will not assert copyright over Content generated by the API for you or your end users.
I have certainly seen the AI bias in some of the images generated. For example, if I don’t say what gender or race the person is, the AI defaults to male and white. It’s only when I deliberately say Black male/female that it switches. I would also say the images of Black women are not as fully thought out as those of white women. Because I’m generating pictures of dating, it always defaults to straight dating unless I add something to the query. Likewise the women are always thin, never curvy, unless specified. Actually, a few times I got women who were pregnant. Of course every single query takes credit (money), making it costly to really test its bias; I’m sure someone’s already on this.
The big question I have is: if I were to use DALL-E for illustrations in my book, what would that say or mean for my stance around AI, bias and data use? To be honest, I’m actually thinking about generating the front and back covers in full colour, rather than the in-book illustrations.
Maybe I should be less worried about this? Or even better, I was thinking about ways to not just make clear it’s AI generated, but show the process of selection or something similar?
To quote Buckminster Fuller “You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.”
Ian thinks: Everyone is throwing their weight behind FIDO and it looks extremely useful. Finally, something which is user friendly, easy to use and secure.
Ian thinks: Mozilla’s research into those apps many people used during the pandemic and various lockdowns is simply a horror story. There has to be a better solution which doesn’t rely on misplaced trust?
Ian thinks: On a similar note to the previous one, the Consumer Reports article is full of very useful tips to protect yourself. These are good for almost every single app, I would say.
Ian thinks: Dove’s self esteem project is consistently doing great things for society. Deep faked mothers talking to their daughters while sitting next to their real mothers is just incredible and so well thought out.
Ian thinks: Andy Yen, Proton’s CEO, gave a talk in the European Parliament hinting at this announcement. Taking on Google with a non-surveillance business model is intriguing, as scale isn’t as critical for success?
Ian thinks: The Dutch coalition Publicspaces had their 2nd conference in May, and a good number of the English language sessions are well worth your time. Always challenging and full of good threads to tug on.
Ian thinks: This is a sobering and somewhat forgotten side of the digital revolution. If left to market forces, I can’t see things getting any better. Only a public service internet can really make the difference.
Ian thinks: Although The Register adds a level of snark to the idea, there is something in it which rings true. Regulating algorithms could really provide a level of trust, comfort and agency which just doesn’t exist right now.
Ian thinks: I love these projects explaining and educating diverse communities to take control of technologies and avoid being disadvantaged by them.
Ian thinks: In the middle of the hype, there is very little looking back and learning the lessons of previous generations. Dare I say it, those who don’t learn their history are doomed to repeat it.
My final setup was something I had been playing with for ages, mainly via a self-installed WordPress on my Raspberry Pi. I found problems when installing Hyperaudio and in the end decided to go with a static website. I chose Publii as it has a Linux client and I could just write the HTML directly (so many static site tools use Markdown and other formats, which would have made working with Hyperaudio more difficult than it needs to be).
With the site creation out the way, I needed somewhere to host it.
Originally I was going to use Yunohost, but I couldn’t find a simple webserver to just host the static files. Instead I found a proxy server, which points at my NAS, which is running a very simple webserver. Of course the NAS has plenty of space, it’s also where the mixes sit, and it has an excellent redundancy and backup system.
Originally I saw Hyperaudio for its ability to tie a knot between the written word and the audio (& video). It wasn’t till I saw a demo of the WebMon (Web Monetization) functionality that I understood it could be the thing I need for DJ mixes.
With correctly written HTML, I can tell Hyperaudio what it should do, and with Mark’s help we had a prototype up and running.
<li class="active" data-wm="$ilp.uphold.com/B69UrXkYeQPr">
<span data-m="0">Activator, I know you can (That kid chris mix) - Whatever girl</span></li>
<li data-wm="$ilp.uphold.com/3h66mKZLrgQZ"><span data-m="127000">Air traffic (Erik De Koning remix) - Three drives</span></li>
<li data-wm="$ilp.uphold.com/B69UrXkYeQPr"><span data-m="445000">Chinook - Markus Schulz pres. Dakota</span></li>
<li data-wm="$ilp.uphold.com/3h66mKZLrgQZ"><span data-m="632000">Opium (Quivver remix) - Jerome Isma-Ae & Alastor</span></li>
Each tune has a start time set using the data-m attribute, which is in milliseconds. I already had all the timing data in the old CUE files I created a long time ago, and Mark helped me out with a nice script which saved me manually copying and pasting (I also considered writing an XSLT to do the conversion). In between sleeping and relaxing with Covid, I got a number of mixes up, changed the theming and finally got to grips with the static file uploading process; the results you can see on the site.
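As a quick worked example of the conversion (this is just the Chinook entry above restated, not a new tune): a track starting 7 minutes 25 seconds into the mix is 7 x 60 + 25 = 445 seconds, which becomes a data-m of 445 x 1000 = 445000 milliseconds.

<!-- 7 min 25 sec = (7 x 60 + 25) seconds = 445 seconds = 445000 ms -->
<span data-m="445000">Chinook - Markus Schulz pres. Dakota</span>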
Payment and royalties
You will also notice each tune/list item also has a data-wm attribute containing a $ilp payment pointer. Currently they point to myself and Mark Boas. Obviously I would change them to the payment pointers of the artists/producers/DJs involved, but I don’t know any who have them so far.
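For anyone who hasn’t come across Web Monetization before, the page-level version is simply a meta tag in the head of the page carrying a payment pointer (the pointer below is a placeholder, not a real one); my understanding is that Hyperaudio uses the per-item data-wm values to switch which pointer is active as each tune plays.

<!-- page-level Web Monetization tag; the pointer here is a placeholder -->
<meta name="monetization" content="$ilp.uphold.com/EXAMPLE">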
Which leads nicely on to the next challenge for WebMix. I did/do have a plan to do a mix of dance music from artists who have payment pointers, but that is still in the pipeline. Alongside this, Mark and I thought about some kind of database/Airtable/spreadsheet with payment pointers cross-linked to artists’ Discogs profiles.
Back to the current experiment: here is Opium (Quivver Remix) – Jerome Isma-Ae & Alastor. You could imagine one payment pointer, agreed between all those involved, being used to pay out each time it’s played on the site. (I am very aware this is very simplistic and music royalties are a total nightmare!) But the point of the payment pointer is to hide the complexity behind one simple pointer; how the money is divided afterwards is up to the parties involved. I’m imagining a management agent, organisation or even, dare I say it, a DAO being responsible for the payment pointer. There are already things like revshare, which means you can have multiple people/entities behind one payment pointer, and there’s interest in this space. Long tail economics could certainly benefit here.
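As a rough illustration of the revshare idea, one common approach is probabilistic revenue sharing: on each page load you pick a single payment pointer at random, weighted by each party’s share, so over many plays the money divides roughly along those lines. This is a sketch only, with made-up pointers and splits.

<!-- Sketch of probabilistic revenue sharing: pick one payment pointer per page load,
     weighted by each party's share. Pointers and splits are placeholders. -->
<script>
  const shares = [
    { pointer: "$ilp.uphold.com/artist", weight: 60 },
    { pointer: "$ilp.uphold.com/remixer", weight: 30 },
    { pointer: "$ilp.uphold.com/dj", weight: 10 }
  ];
  const total = shares.reduce((sum, s) => sum + s.weight, 0);
  let pick = Math.random() * total;
  const chosen = shares.find(s => (pick -= s.weight) < 0) || shares[shares.length - 1];
  // set the active payment pointer for this page load
  const meta = document.createElement("meta");
  meta.name = "monetization";
  meta.content = chosen.pointer;
  document.head.appendChild(meta);
</script>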
Anyway, it’s a long, complex area which I’m best staying out of…?
The main point is it’s all working, and expect more updates soon… I know Mark has other ideas, while I still need to get older mixes up. I would also like to tie the whole thing to something federated, or at the very least set up an ActivityPub feed.
The Ethical Dilemma Cafe Manchester happened last week, on Tuesday 26th-Wednesday 27th April. It was quite something to build, prepare and experience.
Building on the ethical dilemma cafe at Mozfest 2014, we took the idea into a real working cafe, complete with the public coming and going and experiencing the dilemma.
When I say the dilemma, what do I mean? In 2014…
The café offered popcorn, juice, and smoothies not found anywhere else at the festival, but to enter the café, you had to cross a boundary that required a ridiculous data user agreement. As part of this agreement, your personal information would be plastered through the festival’s halls hours later. This experience was about getting out of a chair and experiencing the dilemma in a real, tangible way. Would you read the agreement in order to obtain a glass of juice? Ignore the agreement and quench your thirst in ignorant bliss? Or read the agreement and walk away, and try to find snacks elsewhere because the agreement was unacceptable?
In 2022, with mobile phones now less leaky about data and a ton of frankly new challenges (some are explored in our virtual Mozfest 2022 session), we decided to explore both QR codes and the personal data sharing problem.
People scanned a QR code and signed up to a fake cafe ordering system with their email or social media login. After that, they were forced to answer a question before being presented with a QR code which could be scanned for a hot drink (or, looking at the very very long receipt, a cold drink). If you went for a second, third, etc drink, you got more and much more personal questions. We had 5 levels of questions, and the single 5th question was deeply personal. Is the coffee really worth it?
Sometimes, almost at random, the QR code would switch to a public rick roll (making clear you should be careful what you scan), but most of the time you got the web app, which made use of whatever data you handed over.
The biggest output was the questions and answers shown on a screen right on the cafe bar. Of course there were some intriguing answers to our questions.
I’m still wondering who wrote the answer with my name in it?
The dilemma was just the start, as there were a whole number of talks, workshops and exhibits/interventions.
On the exhibits end we had everything from the human values postcards by BBC R&D and Is Everybody Happy by Open Data Manchester, to presence robots (the reverse metaverse), to the Caravan of the Future.
Talks included Designing the Internet for Children with the ICO, Keeping Trusted News Safe Online with BBC R&D, and Trustworthy AI – what do we mean when we say it, with Mozilla.
Talks were kept to 15 minutes, as they went out to the whole cafe, and people were encouraged to take a table to keep the conversation going afterwards, in typical Mozfest style.
Finally, the workshops included Materialising the Immaterial with Northumbria University, Designing the Internet for Children with the ICO, Why Might You Personalise Your News with BBC R&D, and Common Voice / Contribute-a-ton with Mozilla.
In the usual Mozfest style there were plenty of great moments, for example when the traffic warden came to check out the Caravan of the Future.
There was plenty of interest in the reverse metaverse (presence bots), which was one of the projects running throughout the 2 days. Like the original ethical dilemma cafe, we wanted to expose people to work in progress rather than a museum where everything is perfectly working. When they worked, they really worked well.
To get a real sense of the reverse metaverse / presence bot, I recorded Jasmine for a short while with a remote person.
Does It Understand Me is a speech-to-text system trained using similar (or the same) algorithms as the Amazon Alexa. It was so weird to see that when it got a word wrong, it guessed something so strange. Like Deliveroo and Kindle?
Having the public come into the space was a positive, as many of the regulars popped in and ended up going to a workshop or checking out a few of the interventions. Even better was having the staff of the feel good cafe joining in and enjoying the event. There were a few times when I overheard people asking what was going on, and the staff suggesting they check out the loom, the human values postcards, etc.
The concept really came together well over the two days. It’s something which will come back in other forms. Keep an eye out for future iterations of the ethical dilemma cafe soon.
Massive thanks to everyone involved in the Ethical Dilemma Cafe: so many people from the Mozilla Foundation, who took over a hotel in the Northern Quarter (it was so strange seeing people I usually only see on Zoom or in London just 10 minutes away from my home); all the partners who took a leap of faith with the concept, bringing their research and passion to the cafe; the cafe itself and the amazing woman (can’t remember her name) who really went with the concept; all the people who helped promote it and encouraged others to join us over the 2 days; my colleagues who pulled out all the stops to make things like the coffee with strings and the reverse metaverse bots happen, all amazing alongside the talks and workshops which fitted nicely with our partners; and the security guard who worked 2 full days and whose presence was just right. Finally, thank you to all the people who travelled, sometimes from quite far, to make the event, because without you there would be no ethical dilemma cafe.
There are likely people I have forgotten, and I have deliberately not named anyone above in case I miss someone by name. But I thank everybody, especially Sarah, Lucie, Jasmine, Marc, Henry, Iain, Julian, Sam, Laura, Paul, Jesse, Bob, Steph, Lianne, Jimmy, Bill, Zach, Michael, Juliet, Georgina, Todd, Charlie, etc.
March has been so busy, and I really enjoyed the start of the month at the virtual Mozilla Festival 2022 (which reminds me I must write that up, maybe in my new conference style as suggested by Bill Thompson).
More information will be revealed but you can now book tickets to guarantee your time in the cafe. Simply click register for ticket.
We are limiting numbers for the safety and comfort of everybody, including the volunteers and staff. We will also follow the government guidance on Covid-19.
Don’t forget to check out #mozfestedc (Mozilla Festival Ethical Dilemma Cafe) for more announcements.
Remember a healthy internet means a healthy society and a healthy you.
While hearing Cass Sunstein talk about his new book Sludge, I thought surely this is friction?
Not knocking it, I’m already known for saying friction can be a good thing.
One of the things I really like about the design guidance is taking the time to stop. A lot of the time we rush design in a frictionless way, but we need to deliberately add friction.
In the background there had been talk about what the ethical dilemma cafe would look like in 2020. By the time Jasmine and I talked about it here, there was enough momentum between Mozilla’s Internet Health Report and BBC R&D’s research into the public service internet to really make it happen.
With Mozilla Festival currently mainly virtual, it was a good time to try a more distributed festival. Hence: why not run the ethical dilemma cafe locally in Manchester, in a real cafe with real hot drinks and with the general public too? Heck yes!
In 2014 we worried about hidden microphones, secret cameras and toys with prying eyes. We asked for off buttons, clearer privacy terms and control over our own data. What has changed since then? Are our worries still valid? What are the new areas of concern? Or are we just more accepting of relinquishing control?
The Ethical Dilemma Cafe is a relaxing space to grab a free coffee and meet fellow festival participants. However there is a catch!
You will have the opportunity to let your personal data take you on a journey through a space full of wonder and intrigue, where you will uncover the power of data and algorithms and how they shape your world, whether you’re aware of it or not. But nothing in this world is free; the dilemma you face is your willingness to cross the threshold and be complicit in the interpretation of how your data defines you and your community, in perpetuity.
This year the Cafe will show you how your data is reflecting your identity in the digital world. How measurement, categorisation, and labelling of humans by machines determines the barriers and privilege you experience. It will prompt you to question if the established metrics are measuring the right things, at an appropriate granularity and how their influence touches your online and offline experiences.
If you are local to Manchester, join us on April 26-27 2022.
If you are local to Manchester or can travel from around the UK, you don’t want to miss this 2-day event. Put it in your calendar now: Tuesday 26th & Wednesday 27th April.
I had a strange dream last night… No not that kind of a dream!
I have been doing a number of sleeping/dreaming tests during the pandemic, including trying to build back up my ability to lucid dream.
With that, I had quite an amazing one last night (Wednesday 2nd Feb 2022).
I lucid dreamed I was seeing Google’s new attempt at the tablet market. There were different sized tablets, from 7 inches to 13 inches. I assume it might have been the news about root access to the reMarkable 2 earlier the same day which got me thinking. The noteworthy part of the dream was that the sales room was in a traditional Japanese wooden house (the complete opposite of Apple’s white and glass stores), a distance from the centre of town, surrounded by lots of grass and water. Plus the tablets were housed in wood rather than plastic or metal.
There were many pastel colours, from yellow, purple and red to greens, browns and blacks. The tablets were quite thin but comfortable to hold, and supported both finger control and a thin pen-like stylus. The screen had some different kind of technology, like colour ink but more vivid. I expected to see a new futuristic version of Android installed on picking one up, but instead was greeted with Fuchsia.
Other features? Multiple-day battery, Google Tensor chip, light to carry, the usual wireless connections including Bluetooth, wifi, NFC and 5G, but no cameras and not waterproof, only dust proof. All for 349 pounds?
Later the same day I heard the news story that Google is rethinking tablets again. Honestly I had no idea, but however great the tablets in my dream were, I don’t think Google is going to do wood tablets… or will they *wink*
Over the last few months there’s been a ton of interest in the metaverse, and we all know why. It’s been annoying seeing people swooning over something which others started building decades earlier.
This got me thinking about the values and ethics which make the public service internet so important and so different from the corporate metaverse. But rather than think it out alone, I wrote a proposal for Mozfest 2022 to explore this in a discussion with a number of people. Evaluating emerging technology to understand its benefits and its problems, in order to hopefully shape it for the benefit of the public and society, is the goal of the session.
I’m extremely proud to say it was accepted and in March this year, I will lead the session sketching out the stark differences.
I almost want to add Web3 to the line-up, but I believe there will be plenty to cover in the metaverse alone.
It’s impressive, perfectly timed and fits perfectly with the questions raised by The Matrix. The textures are good but not perfect, though I am impressed with the luminosity, which helps it sit within the environment. The biggest giveaway is the movement of people, but things like objects are pretty close.
By the way, I know a lot of people are not blown away by Matrix 4, but I have seen it 5 times now and still rate it 8/10. Just below the original but ahead of 2 and 3.
MozFest is a unique hybrid: part art, tech and society convening, part maker festival, and the premiere gathering for activists in diverse global movements fighting for a more humane digital world.
There are 9 Spaces created by the Wranglers that address urgent issues such as: digital privacy; neurodiversity and wellbeing; intersectionality in tech; and climate and sustainability. MozFest is looking for collaborative, participatory and inclusive sessions, workshops, skillshares, immersive art projects, and more that interrogate these issues and drive forward the conversations around Trustworthy AI.
I spent some time in the spa recently and listened to a conversation about Android vs iOS in the steam room. I didn’t take part, but found it interesting to hear how people were describing both and their dis/advantages.
There was a point when one person mentioned the customization of Android vs iOS, something like “you only just got widgets last year”.
But there is something which I have been thinking about in that general space.
Most phones are super similar, and the software is what makes the difference; it’s why I stick to the Google phones. I’m not keen on Samsung’s opinionated software choices, although I understand people do find much comfort in the pre-installed software and decisions. I think of it like Debian vs Ubuntu (of sorts). When Ubuntu came with Unity, I always installed GNOME Shell. That was easy enough to do, but the equivalent is very difficult on a phone (replacing Samsung’s UI with plain Android).
But back to phones…
The customization is key… I was originally concerned when Google was following Apple’s approach a while ago, but then they seemed to understand the power of Android being yours and leaned right into customization.
Having upgraded to Android 12 a couple of days ago, I really like the system. Material You is surprising and just right, even in dark mode.
I am using Yatse remote, which changes the background of my phone depending on what I am watching. That change persists until I watch something else. I thought it might cause a clash, but it doesn’t, and it still manages to always look good. The colour palette works no matter what. What would Jony Ive and Steve Jobs make of this design approach? Can’t imagine they would be fans. One of the objections I had about Objectified, the film/documentary, is the lack of customization.
I found this video which sums up what I’m thinking. I look forward to seeing Material you on my new Pixel 6 soon.