The future of public service broadcasting is closer than you think

The other day a family member said to me:

I remember ages ago you talking about something you were researching around personal data stores. Then I saw something similar on the most recent Click.

I looked it up, knowing colleagues in the team had recently done an interview with BBC Click, and found the above video. It’s great to see it being used in unique ways, and it always reminds me of the great BBC News story “Why the BBC does not want your data”.

To me it’s clear a personal data store is a key part of the public service internet ecosystem. There is much more to it, but people-controlled data is a key cornerstone. It’s also why the Perceptive Radio sits next to MyPDS/Datapod in the Manchester Museum of Science and Industry (MOSI).

MyPDS and Perceptive Radio

My role in the personal data store project ended a while ago, but I’m still involved in tangential research around all of this, part of it being the Living Room of the Future and other new research.

The public service internet is one step closer… every day!

Join us! It’s going to be great!

Web Monetised DJ mixes, anyone?

It’s Mozilla Festival 2022 virtual week and the grand Web Monetization experiment is underway.

While thinking about the experiment and the ability to tip people, I thought about this aspect within mixes. Originally I thought about it per mix, as Web Monetization is page-level, although there are plans for link-level monetization in store.

Then I saw a bunch of Hyperaudio experiments with WebMon. This got me thinking: imagine if every artist/label had a payment pointer?
It’s not like we don’t have the precise timing metadata, especially when recording a mix digitally.

WebMon Mixing

For example, here is the Pacemaker editor, which gives you the exact times when tunes are used and not used. The mix is my latest one, the Incidental Contact High mix; I do love that mix!

With the advantage of metadata lookup, it wouldn’t take a lot to correctly identify the tune and auto-discover the payment pointer of the artist/label. For example, here is Protoculture, which appears 3 times in the mix. With something like Hyperaudio, it would be pretty straightforward to automatically send a stream or micropayment to the artist/label every time the track is played within a mix.
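To make the idea concrete, here is a rough TypeScript sketch of how a player could swap the page’s Web Monetization payment pointer as the mix plays through its tracklist. This is not how Hyperaudio actually does it; the timings, the `#mix-player` element id and the payment pointers are made-up placeholders just to show the shape of the thing.

```typescript
// A minimal sketch: swap the page-level Web Monetization payment pointer
// at track boundaries while a mix plays. Timings would come from something
// like the Pacemaker export; all pointers below are hypothetical.

interface TrackSegment {
  start: number;          // seconds into the mix
  end: number;
  artist: string;
  paymentPointer: string; // e.g. "$wallet.example/artist" (placeholder)
}

const tracklist: TrackSegment[] = [
  { start: 0,   end: 312, artist: "Protoculture",   paymentPointer: "$wallet.example/protoculture" },
  { start: 312, end: 655, artist: "Another artist", paymentPointer: "$wallet.example/another-artist" },
];

// The Coil-era draft is page-level: whoever is named in the
// <meta name="monetization"> tag receives the stream, so we update that tag.
function setPaymentPointer(pointer: string): void {
  let meta = document.querySelector<HTMLMetaElement>('meta[name="monetization"]');
  if (!meta) {
    meta = document.createElement("meta");
    meta.name = "monetization";
    document.head.appendChild(meta);
  }
  if (meta.content !== pointer) {
    meta.content = pointer;
  }
}

const audio = document.querySelector<HTMLAudioElement>("#mix-player");
if (audio) {
  audio.addEventListener("timeupdate", () => {
    const current = tracklist.find(
      (t) => audio.currentTime >= t.start && audio.currentTime < t.end
    );
    if (current) {
      setPaymentPointer(current.paymentPointer);
    }
  });
}
```

Because the current draft is page-level, whoever is in the meta tag at any given moment receives the stream, which is exactly what the per-track swap takes advantage of.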

With all this in mind, I’m thinking about creating an experiment.

What if I was to do a mix using Creative Commons Attribution-licensed music from artists who have payment pointers, then provide it through Hyperaudio on my site?

Wouldn’t that be a really interesting experiment?

Following what Coil & Mozilla have done with the tipping experiment, I could use payment pointers for a number of charities instead?

Coil tipping

My first tip went to Hyperaudio!

It certainly feels like a perfect DJ Hackday project.

I have refined the idea on the WebMon community site.

Project description

The existing models for distributing DJ mixes are frankly painful, with many DJs having to fight take-down notices and copyright flags.
I am investigating ways to self-host and share DJ mixes with the care and attention a DJ would like to bring to the mix, and to include a way to pay the artists/creators of the music in the mix.

Ways in Which I Am Web Monetizing These Resources

Currently I am Web Monetizing the whole of the site, but I am going to change the audio player to HyperaudioLite and take advantage of the new feature to pay per section of the audio.
As a DJ, my main interest is to share the mix with as many people as possible, without limits and constraints. I will turn off WebMon for myself and use the payment pointers of the artists instead. As I expect many artists have not heard of WebMon, I recommend using the payment pointers of charities and non-profits in the meantime (the same ones Mozilla have used throughout the Mozilla virtual festival).

As more artists and labels start to support Web Monetization and get payment pointers, it will be easy to reroute the payments to the new pointers, and even split payments between groups/collaborations.
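For the splitting idea, the Web Monetization community already has a pattern usually called probabilistic revenue sharing: because only one payment pointer can be active on a page at a time, you pick one at random on each visit, weighted by the agreed split, and the revenue averages out over many visits. A minimal sketch, with placeholder pointers and weights:

```typescript
// Probabilistic revenue sharing sketch: choose ONE payment pointer per page
// load, weighted by the agreed share. Pointers and weights are placeholders.

interface Share {
  paymentPointer: string;
  weight: number; // relative share of the revenue
}

const shares: Share[] = [
  { paymentPointer: "$wallet.example/charity-one", weight: 50 },
  { paymentPointer: "$wallet.example/charity-two", weight: 30 },
  { paymentPointer: "$wallet.example/hyperaudio",  weight: 20 },
];

function pickPointer(list: Share[]): string {
  const total = list.reduce((sum, s) => sum + s.weight, 0);
  let roll = Math.random() * total;
  for (const s of list) {
    roll -= s.weight;
    if (roll <= 0) return s.paymentPointer;
  }
  return list[list.length - 1].paymentPointer;
}

// Each individual visitor only ever streams to a single pointer, but over
// many visits the split approaches the chosen weights.
const meta = document.createElement("meta");
meta.name = "monetization";
meta.content = pickPointer(shares);
document.head.appendChild(meta);
```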

Ideally I’d like to see this fit within fediverse systems like Funkwhale, reel2bits or Castopod, enabling support for future forms of sharing, ignoring and distributing.

The Grand Mozilla Festival Web Monetization Experiment is go

Being a Mozfest ambassador, I was already aware of this, but at the end of last week it was announced: the grand Mozfest 2022 Web Monetization experiment is go.

In short, every Mozfest ticket holder will receive a free Coil account with 5 dollars of WebMon funding. On top of all the usual Coil benefits, you will see changes in the virtual Mozfest, with the ability to tip speakers for their workshops. Tipping is quite new, but an interesting addition to WebMon.

Bet you wish you had booked yourself a ticket for Mozfest now?

You are in luck: there are still tickets, and the advantages of being involved just got a lot better!

Erica’s video sums it up with that huge dog and cute kitty.

What is The Grand MozFest Web Monetization Experiment, you ask?

It is an experiment to see how the creative minds of MozFest Community can apply the Web Monetization Standard to their MozFest resources and assets to raise money for an Internet Health initiative of their choosing, inclusive of their own work.

What Does this Mean for MozFest Attendees?

This means that every MozFest attendee will receive a 6-month pre-paid Coil account* stocked with $10 US worth of tips to use on Web Monetized resources and assets at MozFest, in addition to $5 US / month of micropayments to stream to Web Monetized resources and assets that you spend time on each month.

I look forward to seeing how the experiment changes how virtual Mozfest works in 2022 and beyond (maybe). It’s certainly something I can imagine many other conferences trying to copy in years to come.

It happened, and I have updated my Coil account after getting my email from Mozilla.

https://twitter.com/cubicgarden/status/1496657712570155013

A rallying call for distributed rather than decentralised

Centralised, decentralised and distributed network models

I’m currently writing my presentation for the Mozilla Festival on the metaverse vs the public service internet, and thinking about Web3 and the metaverse quite a bit in reflection on a truly digital public space…

There has been so much talk about Web 3.0 and crypto. The recent interview with Tim O’Reilly (heads-up: I know him and have been to many of his conferences in the past) adds to the pile of critical thought about all this. I especially found Small Technology’s (Aral & Laura, who I also know well) web0 manifesto an interesting thought.

web3 = decentralisation + blockchain + NFTs + metaverse
web0 = web3 – blockchain – NFTs – metaverse
web0 = decentralisation

web0 is the decentralised web.

Pulling all the “corporate right-libertarian Silicon Valley bullshit” out of Web3 leaves us with a decentralised web.

Something I believe is a landmark on the way to the future destination of the distributed web. (I’m aware “web” isn’t quite the right term, it should really be “internet”, but most people experience the internet via the web…)

I think about this a lot as I look at the very notion of a public service internet and the very idea of a public service stack. The decentralisation move still has elements of neoliberalism, which puts dependence on the individual. This is fine if you have the time, resources and knowledge. Those without are out of luck?

As you can imagine, not everyone has these, but in a distributed model you can trust others to support/help/collaborate to lessen the cognitive/environmental/time load. This gives everybody the ability to benefit from a distributed internet.

If that isn’t the future, I’m not sure what is?

To me, the distributed model is a superset, supporting the decentralised and even some aspects of centralised models. Federated is also interesting to me but for many different reasons.

NFTs and Crypto by Cracked

I found the recent batch of Cracked videos quite fun to watch, but they also prompted a ton of head shaking.

The bubble aspects of NFTs and crypto are of course in full effect, along with the massive number of camps fighting for their legitimacy. This all has a massive impact on the notion of Web3, which is a whole can of worms I’m not going to delve into right now.

Facebook interest categories unveiled

My Facebook interest categories

I heard today via Daily Tech News that Facebook was finally stopping advertisers targeting users via interest categories.

These categories are a pile of junk, as you can see from mine before I removed them all.

I mean, Thought? Dyson? Information?

If you want to see or remove your own:

Users can see their profile’s interest groups by navigating on desktop to Settings and Privacy > Settings > Ads > Ad Settings > Categories used to reach you > Interest Categories. If you don’t want to receive ads based on a certain interest, you can opt out.

Of course Facebook has many more data points, but frankly removing some of these is good. Although I do wonder how many people will actually investigate their own interest categories?

The future of the browser conference

I was very happy to take part in the Future of the Browser conference, along with my amazing colleague Jasmine. It was a different kind of conference, and this was very clear from the introduction with Amber in the future (amazing and well worth watching)!

Each section started with a really different video from a different artist. This was a great move (most conferences generally start with a keynote before the conference talks); this conference turned that on its head before jumping into a number of deep dives around aspects of the future browser. All the videos are now up online, so go have a watch.

Since the conference, I have installed Brave on my Ubuntu machine and Puma on my mobile.

There were quite a few new developments I hadn’t really seen before.

Project Fugu

Project Fugu is an effort to close gaps in the web’s capabilities, enabling new classes of applications to run on the web; in short, things like controlling Zigbee, USB and Bluetooth devices. I heard about this a while ago and thought: the Living Room of the Future, controlled via a browser.
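As a flavour of what that could look like, here is a hedged Web Bluetooth sketch (one of the Fugu-era capabilities) for switching a light on from a web page. The service/characteristic UUIDs and the “on” byte are invented placeholders; real devices will expose their own values.

```typescript
// Sketch: drive a Bluetooth light from a page via the Web Bluetooth API.
// Types for navigator.bluetooth come from the @types/web-bluetooth package.
// All UUIDs and the payload byte below are hypothetical placeholders.

const LIGHT_SERVICE = "0000aaaa-0000-1000-8000-00805f9b34fb";        // placeholder
const LIGHT_CHARACTERISTIC = "0000bbbb-0000-1000-8000-00805f9b34fb"; // placeholder

async function switchLightOn(): Promise<void> {
  // Web Bluetooth requires a user gesture (e.g. a button click) to prompt.
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [LIGHT_SERVICE] }],
  });

  const server = await device.gatt!.connect();
  const service = await server.getPrimaryService(LIGHT_SERVICE);
  const characteristic = await service.getCharacteristic(LIGHT_CHARACTERISTIC);

  // Write whatever byte the device expects for "on"; 0x01 is a guess.
  await characteristic.writeValue(new Uint8Array([0x01]));
}

document.querySelector("#light-on")?.addEventListener("click", () => {
  switchLightOn().catch(console.error);
});
```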

Handshake

Handshake is a decentralized, permission-less naming protocol where every peer is validating and in charge of managing the root DNS naming zone with the goal of creating an alternative to existing Certificate Authorities and naming systems. Names on the internet (top level domains, social networking handles, etc.) ultimately rely upon centralized actors with full control over a system which are relied upon to be honest, as they are vulnerable to hacking, censorship, and corruption.
I think that says it all, something which many people have pointed out many times as a major problem with the DNS system.

Handshake is an experiment which seeks to explore those new ways in which the necessary tools to build a more decentralized internet. Services on the internet have become more centralized beginning in the 1990s, but do not fulfill the original decentralized vision of the internet. Email became Gmail, usenet became reddit, blog replies became facebook and Medium, pingbacks became twitter, squid became Cloudflare, even gnutella became The Pirate Bay. Centralization exists because there is a need to manage spam, griefing, and sockpuppet/sybil attacks.

Unlock protocol

As most of you know, I have Web Monetization enabled on this blog and on my mixes site. But Unlock is aimed at subscriptions and memberships. This is great news, except it’s not a standard yet, but it looks promising.

The coming age of the 402

I had heard Matt talk previously during May’s Manchester Futurists. There are a lot of unexplored areas in the HTTP status codes, including 402 (Payment Required), which could be used to do new and interesting things like WebMon; this was also covered at Manchester Futurists and Future of the Browser.
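To illustrate why 402 is interesting, here is a toy Node/TypeScript sketch (not anything Matt showed) where the server answers 402 Payment Required and advertises a payment pointer until the client presents some proof of payment. The header names, pointer and token check are invented for illustration only.

```typescript
// Toy HTTP 402 flow: refuse the resource with 402 and point the client at a
// (placeholder) payment pointer; serve it once some "proof of payment" arrives.

import { createServer } from "node:http";

const PAYMENT_POINTER = "$wallet.example/cubicgarden"; // placeholder pointer

const server = createServer((req, res) => {
  // A real system would verify a receipt or token with the payment provider;
  // here anything in the hypothetical "payment-token" header counts as paid.
  const paid = Boolean(req.headers["payment-token"]);

  if (!paid) {
    res.writeHead(402, {
      "Content-Type": "application/json",
      // Hypothetical header telling the client where to send payment.
      "X-Payment-Pointer": PAYMENT_POINTER,
    });
    res.end(JSON.stringify({ error: "Payment Required", paymentPointer: PAYMENT_POINTER }));
    return;
  }

  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("Here is the paid-for resource.");
});

server.listen(8080);
```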

Better ways to archive and save the page.

Saving the page has been a nightmare for a long while, and I found Webrecorder and ArchiveWeb.page quite interesting solutions. It’s also interesting to think about the lengths we have gone to to stop people saving the page.

This is why I like these community driven conferences… Big thanks to @caseorganic, @anselm and all the speakers.

Plotting and harvesting Chia cryptocurrency for a greener crypto future?

My Chia farming setup

I have to admit that over the last 2 months I set up a cryptocurrency rig in my flat. Now, when most people think about cryptocurrency they think of Bitcoin and the absolutely insane amount of power going into mining bitcoins. This is why, when I saw Bram Cohen (creator of BitTorrent) talk about creating something different (proof of space and time), I was interested. To be fair, since BitTorrent I’ve been watching what he’s been up to; Bram is just one of those serial entrepreneurs I keep an eye on.

After hearing about Chia, I downloaded the Linux app and got it up and running on an old laptop I use for bits and bobs. I would have used one of my Raspberry Pis if I had Ubuntu installed on them. I plugged in an external USB-to-SATA SSD, which I had been using to run my old Dell XPS 13 work laptop when its internal drive got screwed up. Then I plugged in an old USB-to-SATA caddy/docking bay with one of my old 2TB mechanical hard drives from my old server (pre-NAS).

My Chia plot

Then I left it plotting and harvesting my single plot for a month or so.

At the time, the estimated time to win a Chia was 8 months. As I had the laptop on doing other things all the time, it wasn’t a big deal for me. Actually, removing my server and replacing it with the NAS, 2 Raspberry Pis and this laptop uses less electricity than my single home-made server with 7 drives and 4 fans did. I hear most of you say WTF! I do have a lot of devices on in my flat, and my electricity usage is high compared to the typical single person, but everything else (heating, water, etc.) is low.

It was about 4-5 weeks in, while telling someone about Chia, that I noticed I had harvested 2 Chia unbeknownst to me. To be clear, I have a single 100-gig plot, and although I tried setting up 2 plots in parallel afterwards, I decided it was too much for my old laptop’s little quad-core CPU and switched back to a single 100-gig plot (to be clear, it’s more the parallel part which was the problem, and the CPU is only really used while plotting).

Chia CPU and Memory load in Htop

With all this in mind, I was introduced to the Chia subreddit, where I saw people building massive rigs to plot and harvest. It’s quite insane, and then there’s hearing how Chia is being blamed for shortages of HDDs and SSDs. Of course, the reason most people (including myself) are interested in Chia is proof of space and time rather than proof of work. This realistically could be far more sustainable than proof-of-work models like Bitcoin. I say “could” because seeing these massive rigs seems to throw oil over the notion of Chia’s green attributes.

Although it’s tempting to add some more plots, I’m not going to change my setup, because it’s sustainable for me. Little has changed on my network or on my physical desk. Getting in early was a very good thing, but I got lucky with 2 Chia already.

Yesterday a friend mentioned Elon had tweeted about Tesla not taking Bitcoin for their electric cars.

I can’t say anything profound about Chia, except that I’m more than interested because it’s not just a speculative currency like Bitcoin, although the price is super surprising for a new cryptocurrency. I said similar about Ethereum because of smart contracts, NFTs and other things. The currency side is only slightly interesting, while things like ChiaLisp for identity pique my interest.

Adventures in the Metaverse with Immersive Arts lab – Feb 11th 2021

The Bridge: metaverse

I will be giving a keynote talk at the Immersive Arts lab’s Adventures in the Metaverse.

Fancy an adventure in the Metaverse? Or interested in what the next evolution of the internet is going to be?…

As computing becomes spatial, virtual and more mobile, and as the building blocks of the internet change to a blockchain network, in the next decade the internet as you know it is going to fundamentally change. Come and learn more and join the future.

I’m going to focus on the public service internet side of things, rather than the layers on top of it; other speakers will cover that better than myself. Imagine what kind of data can be collected in virtual reality systems like the Oculus Rift by Facebook. Imagine the possibilities for awful abuse. There is also the big risk of pushing out smaller platforms which are more focused on civic and public purposes.

We are only scratching the surface here… I recommend getting a ticket and hearing more on the 11th Feb.

What is Bluesky doing which others can’t do?

A leaf, blue skies and clouds

Following Twitter CEO Jack Dorsey’s discussion about de-platforming Trump, there was mention of decentralising Twitter, and of Bluesky.

He first made mention of this in 2019 in a number of tweets.

Researchers involved with bluesky reveal to TechCrunch an initiative still in its earliest stages that could fundamentally shift the power dynamics of the social web.

Bluesky is aiming to build a “durable” web standard that will ultimately ensure that platforms like Twitter have less centralized responsibility in deciding which users and communities have a voice on the internet. While this could protect speech from marginalized groups, it may also upend modern moderation techniques and efforts to prevent online radicalization.

When I first heard about Bluesky there was little information; then at some point during the pandemic I heard about the iOS-only app Planetary. My instant thought was: oh no, they’re going to try and bypass all the excellent work which has already been done by others, especially with ActivityPub now a W3C recommendation.

I looked beyond the TechCrunch post (which is full of little odd bits) to see what I could dig up about Bluesky. Looking at the GitHub repo for Planetary, it seems to be based on the Scuttlebot.io protocol? It’s good to also see Scuttlebutt-to-ActivityPub and RSS bridges too. As it’s Scuttlebot, there are other clients for many other platforms.

So my question is: what difference does it make over what already exists?
I get that if Twitter were to become a client of the protocol, that would generally be a good thing, and I imagine the publicity for decentralised systems would be welcome, but beyond that? Will their business model change? Will anything change? I guess, does anything need to change from Twitter’s point of view?
On top of all this, will all the efforts that came before be forgotten now Twitter throws its hat into the ring? That would be awful, given all the hard work others have put in for years and years.