Public Service Internet monthly newsletter (Sept 2023)

Two FBI agents casually question Reality Winner outside her house

We live in incredible times with such possibilities, that is clear. Although it's easily dismissed with the unthinkable: the awful end of apps, the breach of UK voters' data and Zoom's new business model.

To quote Buckminster Fuller: "You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete."

You are seeing aspects of this with Meta's Threads supporting Mastodon's verification, the White House cracking down on data brokers and the ABC in Australia closing down almost all of its Twitter accounts.


Cobalt, the hidden side of the energy revolution

Ian thinks: This documentary by German broadcaster DW is a real eye-opener about the inequality happening in the Democratic Republic of Congo and its effect on any potential energy revolution.

The really deep problems of Venture Capital

Ian thinks: This interview outlines not just the well-known problems of taking capital from a VC but how deeply broken the whole ecosystem is. Nothing says it better: the marketplace won't solve everything, and it may make things worse.

No one gets fired for buying Microsoft?

Ian thinks: Local-first, but when will companies and the C-suite care enough to take it seriously? You might have thought the serious damage of ransomware and malware would be enough of a factor to mitigate such damage in future.

AI right down the middle

Ian thinks: Although I was sceptical of the talk, there were some good points made about the problem of markets. I especially liked the points about responsibilities, coordination and the public/society interest.

A race to the bottom where everyone loses

Ian thinks: I hadn't heard the term Moloch before, but Liv makes it clear what the term means and gives great examples of it. However, Liv and Tristan brush over the thought that this is the market and capitalism in effect.

The world is a mess, let's fix it?

Ian thinks: DEF CON 31 showed a number of incredible revelations, from hacking voting machines to the Cult of the Dead Cow's open-source, privacy-focused, peer-to-peer networked framework.

When are we going to finally listen to the minorities who know?

Ian thinks: This fine group of black women have been ringing the bell about the real problems of AI for such a long time. They refuse to be silenced and, as we are starting to understand, they were absolutely right.

The awful truth about facial recognition and black people

Ian thinks: It's as simple as this: facial recognition can't tell black people apart. So why the heck are we still deploying it? It's a question I just can't wrap my head around. If you don't trust Business Insider, read the actual paper here.

Reality asks what's in the public interest?

Ian thinks: This film ended up in a lot of small cinemas, but the true story of NSA whistleblower Reality Winner is portrayed exactly as the FBI recording captured it. It's quite compelling and raises questions about the public interest and what happened next.


Find the archive here

Facial recognition’s ‘dirty little secret’: Millions of online photos scraped without consent

By Olivia Solon

Facial recognition can log you into your iPhone, track criminals through crowds and identify loyal customers in stores.

The technology — which is imperfect but improving rapidly — is based on algorithms that learn how to recognize human faces and the hundreds of ways in which each one is unique.

To do this well, the algorithms must be fed hundreds of thousands of images of a diverse array of faces. Increasingly, those photos are coming from the internet, where they’re swept up by the millions without the knowledge of the people who posted them, categorized by age, gender, skin tone and dozens of other metrics, and shared with researchers at universities and companies.
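To make "categorized by age, gender, skin tone and dozens of other metrics" a little more concrete, here is a minimal, purely illustrative Python sketch of what such an annotation record might look like and how a training-set builder could bucket a scraped collection by those metrics. The field names and values below are my own assumptions for illustration, not taken from IBM's actual dataset.

from collections import Counter

# Hypothetical annotation records, loosely modelled on the kind of
# metadata the article describes (age, gender, skin tone and so on).
# None of these field names or values come from IBM's real dataset.
annotations = [
    {"photo_id": "flickr_0001", "age": 34, "gender": "female", "skin_tone": "type_v"},
    {"photo_id": "flickr_0002", "age": 51, "gender": "male", "skin_tone": "type_ii"},
    {"photo_id": "flickr_0003", "age": 28, "gender": "female", "skin_tone": "type_ii"},
]

# Bucket the collection by (gender, skin tone), the sort of grouping a
# training-set builder might use to check how "diverse" the faces are.
buckets = Counter((a["gender"], a["skin_tone"]) for a in annotations)

for (gender, tone), count in buckets.items():
    print(f"{gender} / {tone}: {count} photo(s)")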

When I first heard about this story I was annoyed but didn't think too much about it. Then, further down the story, it's clear they used Creative Commons Flickr photos.

“This is the dirty little secret of AI training sets. Researchers often just grab whatever images are available in the wild,” said NYU School of Law professor Jason Schultz.

The latest company to enter this territory was IBM, which in January released a collection of nearly a million photos that were taken from the photo hosting site Flickr and coded to describe the subjects’ appearance. IBM promoted the collection to researchers as a progressive step toward reducing bias in facial recognition.

But some of the photographers whose images were included in IBM’s dataset were surprised and disconcerted when NBC News told them that their photographs had been annotated with details including facial geometry and skin tone and may be used to develop facial recognition algorithms. (NBC News obtained IBM’s dataset from a source after the company declined to share it, saying it could be used only by academic or corporate research groups.)

And then there is a checker to see if your photos were used in the teaching of machines. After typing my username, I found out I have 207 photo(s) in the IBM dataset. This is one of them:

Not my choice of photo, just the one which comes up when using the website

Georg Holzer uploaded his photos to Flickr to remember great moments with his family and friends, and he used Creative Commons licenses to allow nonprofits and artists to use his photos for free. He did not expect more than 700 of his images to be swept up to study facial recognition technology.

“I know about the harm such a technology can cause,” he said over Skype, after NBC News told him his photos were in IBM’s dataset. “Of course, you can never forget about the good uses of image recognition such as finding family pictures faster, but it can also be used to restrict fundamental rights and privacy. I can never approve or accept the widespread use of such a technology.”

I have a similar view to Georg: I publish almost all my Flickr photos under a Creative Commons non-commercial ShareAlike licence. I swear this has been broken. I'm also not sure if the pictures are all private or not, but I'm going to find out thanks to GDPR.

There may, however, be legal recourse in some jurisdictions thanks to the rise of privacy laws acknowledging the unique value of photos of people’s faces. Under Europe’s General Data Protection Regulation, photos are considered “sensitive personal information” if they are used to confirm an individual’s identity. Residents of Europe who don’t want their data included can ask IBM to delete it. If IBM doesn’t comply, they can complain to their country’s data protection authority, which, if the particular photos fall under the definition of “sensitive personal information,” can levy fines against companies that violate the law.

Expect a GDPR request soon, IBM! It's the least I can do to send a message that I wasn't happy with this.