I lost all trust in Zoom yesterday…

British PM on Zoom
Wonder how many people have tried to dial into that Zoom ID?

Yesterday I was on a Zoom call which was hijacked, or zoombombed, with something not just horrible but totally illegal. Because of this I have pretty much lost all trust in Zoom.

This is of course very difficult, as it's what we use at work, and being in the middle of the Covid-19 lockdown makes things tricky. Because of this, I'm going to keep using it, but with much more caution, and I'm going to be a lot more forceful about the hosting side of it.

It's clear war-dialling public Zoom meetings is easy and well used by unscrupulous groups of people. Zoom could make shareable links much harder to war-dial, similar to the way Google Docs uses long combinations of letters and numbers to build a much longer URL, which is a lot harder to war-dial.
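
To put rough numbers on the difference (my own back-of-the-envelope arithmetic, nothing from Zoom or Google):

# keyspace of an 11-digit numeric meeting ID vs a 40-character
# alphanumeric token; the gap is what makes war-dialling impractical
echo "10^11" | bc   # 100000000000 possible numeric IDs
echo "62^40" | bc   # astronomically more mixed-case alphanumeric URLs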

Zoom's defaults are set up for a semi-trusted corporate environment. I understand the Covid-19 pandemic changed everything, but there have been many updates, and only now are the defaults just about safe. Their share price has rocketed, but they are only now putting security ahead of more features?

Their idea of end-to-end encryption is a total mess, on top of the security findings saying some calls are being routed via China. Today they announced you can choose your routing, but you need to pay for it. More governments and companies are blocking Zoom because they just don't trust it.

Likewise neither do I… but I will use it… with caution.

I have been thinking about equivalent moments of losing trust in a service but carrying on using it, and thought of two.

  1. I lost trust in Facebook a long while ago but still use it for volleyball events and the occasional post about something I feel could be important for friends, family and the public who don't read my blog (as it's posted on the internet already, I post publicly, adopting the IndieWeb POSSE approach, much to the surprise of some friends). For example, today I posted there about what happened on Zoom yesterday.
    Facebook was hardly trustworthy to start with, and over and over again they have taken liberties with our data.
  2. There was a point when Windows Vista was pushed as the successor to Windows XP and I didn't like what Microsoft had done to it. To be fair, I didn't trust them and saw shadows of where things were heading, so I switched to Ubuntu. I know the new Microsoft is quite different of course, but the damage was done.

If you are hosting a Zoom call, please do lock it down; there are a number of guides to help, including this one.

Sharing is caring with plex server

The last 90 days of stats on my Plex server

I always expected an increase in my Plex server usage from my friends as the UK entered the Covid-19 lockdown, especially when I heard all the streaming services were dropping back to SD. To be fair the demand hasn't been as big as I expected, but it's noticeable compared to the last few months of 2019.

Most popular clients

The most popular clients seem to be Chrome and Chromecasts (not sure if Chromecasts sometimes report themselves as Chrome). Thankfully Xbox use has dropped, because that client requires everything to be transcoded, unlike Tizen, which will happily take the original file with no transcoding if there's enough bandwidth.

Plex concurrent users

I have however hit three concurrent streams again, and I'm keeping an eye out to see if I can hit four or even more. No idea what that would do to my AMD-based Plex server? I do have the Intel Xeon-based HP Z800, for which I recently bought an Nvidia Quadro K620 GPU, hoping to finally get hardware-accelerated streaming working on it.

That's a task for Easter maybe…

Update 13/4/2020

4 concurrent users finally

I noticed my Plex server finally reached four concurrent users over the last weekend.

New kid on the block: Joplin for notes

Joplin clients

I have a bit of history with note-taking apps, having started with Evernote back when I had a Windows Pocket PC and moving through a bunch of different apps till I finally settled on Standard Notes.

It's really good, and I've been using the listed feature for my gratitude diary. However I checked out Joplin recently and quite like some of the features.

Evernote import works perfectly, meaning I get my rich Evernote notes back complete with attachments. Actually the way it handles attachments is a lot closer to Evernote. Syncing is done in a number of different ways, including Dropbox, WebDAV, etc., all with encryption, meaning it uses the service as a file container, much like how I use KeePass right now. I had tried to set up a Standard Notes sync server in the past but didn't really put the effort in.
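
As a rough sketch of what that sync setup looks like with the Joplin terminal client (the URL and credentials below are placeholders; sync.target 6 is WebDAV according to the Joplin docs, but double check for your version):

# install the Joplin terminal client (needs Node.js/npm)
npm install -g joplin
# point syncing at a WebDAV server; target 6 is WebDAV
joplin config sync.target 6
joplin config sync.6.path https://example.com/webdav/joplin
joplin config sync.6.username myuser
joplin config sync.6.password mypassword
# turn on end-to-end encryption before the first sync
joplin e2ee enable
joplin sync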

Standard Notes uses Markdown, which is good and quick, but I like Joplin's hybrid of Markdown and HTML. This is pushed further with web clippers for Firefox and Chrome, making Joplin extremely useful for capturing online resources.

I like both but will find it tricky to use both; I noticed, for example, that Joplin is really painful to use on my ereader, while Standard Notes is super smooth. They are built for different purposes, but working out what to do with each is a bit of a dilemma right now. Currently I have my Evernote backup and my Standard Notes in Joplin thanks to export/import. Of course there are lots of duplicate notes which I really need to go through and delete, as I had already imported my Evernote notes into Standard Notes previously.

Getting on the self-hosted train again

Map of the fediverse.space

A long time ago, accessing cubicgarden.com meant hitting a server sitting directly in my home. I used to run Blojsom on top of the Resin application server, self-hosting from my 512k ADSL line with 256k up (remember how fast that used to be!?).

There were a lot of problems, I grant you that, but it mainly worked OK, although I didn't like the sysadmin side of it all, as I was using Windows 2000 as the operating system. At some point I decided to switch to WordPress, only because PHP hosting was cheaper than Java, although I got some incredible breaks during my time. In 2014 I moved my blog to WP Engine thanks to dotBen.

That was a while ago, and since then I have massively upgraded my connection speed to 1 gigabit up and down thanks to Hyperoptic, and upgraded my server quite a bit (6-core AMD with 16 gig of memory). The first thing I did was install Plex server.

Since then I have been slowly adding more services to my server, the most noteworthy being Tiny Tiny RSS, Icecast2, Plex and ZeroTier VPN (which I'm considering changing to WireGuard with the recent announcements). Tiny Tiny RSS is useful as I don't like what Feedly and others are doing with my data. ZeroTier VPN is very cool and very much like the old and forgotten Hamachi. Because it uses internal (non-routable) IP addresses, any device I have connected can reach those addresses as if they were on an internal network. This ultimately means I can access all my services, including Tiny Tiny RSS, without opening up ports on my firewall and exposing them to the internet.
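
For anyone curious, getting a machine onto a ZeroTier network is only a couple of commands (the 16-character network ID below is a placeholder):

# install ZeroTier via their install script (read it before piping to shell)
curl -s https://install.zerotier.com | sudo bash
# join the network by its ID, then authorise the machine in ZeroTier Central
sudo zerotier-cli join 1234567890abcdef
# check the interface and the internal IP it was assigned
sudo zerotier-cli listnetworks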

Anyway, I've been thinking about adding more services to my server, including Wekan (alternative to Trello), Pixelfed (fediverse Instagram), WiseMapping (web-based mindmapping tool), Wallabag (alternative to Instapaper), a Standard Notes server, Mastodon (fediverse Twitter), Funkwhale (fediverse Spotify), LanguageTool (alternative to Grammarly) and Matrix (powerful alternative to Slack).

Doing it under Ubuntu isn't a problem, as there are lots of tutorials, and plenty of them use Docker to manage everything.

But there is an issue, it seems, when installing multiple services on top of each other. Most of the tutorials require Apache or Nginx and then some SQL database. The tutorials are written as if you are running just one service alone, and things become trickier when services compete for the same ports, etc. Trying to move the ports and database tables around is sometimes tricky to follow.

Right now, I'm focused on doing one service at a time, or really getting to grips with Docker, which was meant to make this easier to deal with???
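
The port clashes at least are something Docker's port mapping can solve. A minimal sketch (nginx is just a stand-in here; real services like Wekan also need their database containers alongside):

# both services use port 80 inside their own container, but each
# gets a unique port on the host, so they never clash
docker run -d --name service-one -p 8081:80 nginx
docker run -d --name service-two -p 8082:80 nginx
# a single reverse proxy on 80/443 can then route to them by hostname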

OK, so why self-hosting (and there are a lot of self-hosted services, as I found here) and all the hassle?

I found something which sums it up nicely from a different but connected context.

Decentralized, peer-to-peer networks are evolutionarily superior to the bastardized corporate ‘sharing economy’ platforms like Uber and Lyft. Their billion-dollar budgets won’t save them from the inevitability of the blockchain-based peer-to-peer economy.

The decentralization revolution is here.

Dropping RescueTime for ActivityWatch

Activity Watch logo

Every once in a while I tend to weigh up the different systems and applications I use, especially the benefits they bring me.

One such application is RescueTime.

I used it in the past, and over the last few months I reinstalled it again. However, this time I tried to automate the reports out of the free account and pretty much failed. The only way I could really do it was to pay for the pro account at the cost of (a discounted) $6.75 per month.

So enough, I thought… A little look around AlternativeTo, and I decided to give ActivityWatch a try.

ActivityWatch is an app that automatically tracks how you spend time on your devices.

It is open source, privacy-first, cross-platform, and a great alternative to services like RescueTime, ManicTime, and WakaTime.

It can help you keep track of time spent on different projects, kick bad screen habits, or just understand how you spend your time.

It's pretty good and doesn't drain my laptop while watching it. Of course, being local and under my control only, I don't really need to worry so much about what's collected. You can of course limit things as you go, turn off tracking, or just delete the data at any time.

I have it on my Dell XPS laptop and on my work phone, and it's good except for one thing: currently there is no sync server, so each device has its own server. But they are working on this… Once they do, I'll likely install it on my server and put the client on more of my devices.

The other thing I'm hoping for is to see more use of the stopwatch ActivityWatch bucket (buckets are the pools of data collected), since Project Hamster is currently being rethought and I like to track my work progress alongside my activity.
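
Because everything is local, the buckets are easy to poke at directly. A quick sketch against the local aw-server REST API (port 5600 is the default; the bucket ID below is just an example, as it varies per machine):

# list all buckets on this machine's ActivityWatch server
curl http://localhost:5600/api/0/buckets/
# pull the latest events from one bucket (bucket ID is an example)
curl "http://localhost:5600/api/0/buckets/aw-watcher-window_myhost/events?limit=5"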

As a whole the project has a lot of potential, and I hope it will be worth the wait for the features mentioned above.

Adobe Audition uses XML, like Audacity files

https://cubicgarden.com/2019/03/03/hooray-audacity-files-are-xml/

Today I tried to open an Adobe Audition file which a Salford student sent me for a potential perceptive podcast. I knew it wouldn't open, but I wanted to see which applications Ubuntu would suggest.

Instead it opened in the Atom editor, and I was surprised to find a reasonable XML file. A quick search confirmed it.

Similar to Audacity projects and Final Cut XML, they can all be easily transformed with XSLT or any other programming language. Extremely useful for future user interfaces. Surely someone will do something with this one day?
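
As a quick sketch of what I mean, assuming the file is a .sesx Audition session (the XML-based format; the filename below is made up):

# pretty-print the session XML to see its structure
xmllint --format podcast-session.sesx | head -40
# quick and dirty: list the clip and file references by attribute
xmllint --format podcast-session.sesx | grep -i 'name='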

Windows 10 inside of Ubuntu 18.04, the way it should be…?

Windows 10 inside of Ubuntu

It's been forever since I moved from Windows to Linux, and the idea of running Windows is quite scary for me now. I made the decision I was never going to use Windows Vista (remember that pile of crap!), wiped my main computer and installed Ubuntu 6.06.

It felt strange downloading Windows 10 from Microsoft's site (all 5.4 gig of it!) using the licence key which came with the Dell XPS, easily extracted using this terminal command.
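
For anyone without the link handy, the usual trick on Linux is reading the OEM key out of the ACPI MSDM table, something along these lines (assuming your firmware stores it there):

# the OEM Windows licence key is embedded in the ACPI MSDM table;
# the last printable string is usually the 25-character key
sudo strings /sys/firmware/acpi/tables/MSDM | tail -1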

There certainly was a disturbance going on, as it installed then attempted to get a ton of updates.

Due to a number of changes coming soon, it seems sensible to virtualise Windows inside of Ubuntu for certain future tasks at work. Of course others think it should be the other way around, but they are sadly mistaken…

I’m happy to say it works but I could really do with some more memory, as 16gig is tight for my daily usage. Shame I can’t upgrade the memory easily on my Dell XPS 13.

Upgrading the Pacemaker device with an SD card, not this time!


It's a short story (though not in effort and time). I tried it after a colleague suggested an SD card instead of an SSD a while ago, but found the card reader, a Kalea Informatique adapter, didn't support SD cards over 32 gig. The description said 64 gig, but every time I restored the Pacemaker firmware it would only format 32 gig. Even using GParted (like Partition Magic) to extend the FAT32 partition caused the Pacemaker to no longer be accessible by my laptop (it comes up as a generic Linux storage device).

I haven't given up; I'm returning that adapter and have bought another Kalea converter, but the CompactFlash version. Yes, in short it would be ZIF/LIF to CompactFlash to SD card.

Hopefully this will actually work

Dada says there might be a problem?

Grandpa's Pocket Ledger & My Field Notes

Following on from the great work being done by the Databox project team, which recently appeared in BBC News covering the work BBC R&D have done with it, including the Living Room of the Future and the BBC Box project, I was impressed to learn about the Dada wiki.

The Defense Against the Dark Artefacts (DADA) project is a collaboration between the Universities of Cambridge, Nottingham, and Imperial, addressing challenges in security and privacy related to smart home devices. These challenges result from the current, widely-adopted approaches in which cloud services underpin home IoT devices, where network infrastructure protection is minimal and little or no isolation is provided between attached devices and the data traffic they carry.

It addresses these challenges by:

  1. designing and implementing mechanisms for device traffic monitoring with a precise look at packet traces and device profiles;
  2. applying learning technologies to detect devices’ abnormal behavior;
  3. introducing techniques for dealing with traffic anomalies and restoring home network operability;
  4. putting the homeowner in the center of management by informing them of possible security threats and offering a choice of defences.

Although I used the wrong technology, this is what I was pointing towards in my blog post titled your home needs a blockchain. All the things in Human Data Interaction – Legibility, Agency and Negotiability – would apply if Dada were a Databox application.

Interestingly, Dada isn't the only one in this field. Recently Princeton released IoT Inspector to do something similar.

Today, we release Princeton IoT Inspector, a open-source tool that lets you inspect IoT traffic in your home network right from the browser. With a one-click install process, you can watch how your IoT devices watch you within minutes of setup.

However, IoT Inspector is a tool for inspection, while Dada is a tool and a place to upload data for analysis to benefit the research community. Of course you don't have to upload the data and could maybe do the analysis locally (this would fit the Databox model perfectly). There is a privacy policy of course, but I expect it will be expanded in the near future.

We understand that any uploaded device trace might contain personal application data. While we need to analyse the uploaded traces to extract IoT features in order to form ML training datasets, we do not aim to analyse nor store your personal data. Therefore, the processed traces are anonymised and all sensitive application payload is removed before the actual analysis starts.

After analysis is done, our servers store the anonymised trace and the extracted features such as packet headers, addresses, ports and payload size (but not the payload itself).

Of course uploading the data for research purposes could be incredibly useful. For example, imagine you bought a device which is already in the Dada database. You check the device and it seems to be sending a lot of traffic to odd places. You check the version number, firmware, etc., but it's still consuming a lot of traffic, which is odd. Maybe it was hacked or hijacked? With a public database, it's possible to check. Even better, with a Databox application it could be done automatically if the user(s) allow it.

Some of you may be thinking this is insane stuff, but can I remind you of the house that spied on me, and the follow-up which armed people with tools.

Even Mozilla went as far as creating a buyer's guide to help people choose IoT devices, with more information than what's usually available to you in the shop or without proper research. Now there are loads of stories about IoT devices being hijacked by hackers (hmmm, possible) and, more likely, by the companies who make the hardware to bring in new features…


Google Titan key security problem?

I was sure I tooted/tweeted a thank you to the Google team at Berlin's re:publica conference, but it looks like it never quite happened due to connectivity issues with the wifi at certain points of the day.

So first of all I want to say thanks for giving me a Titan security key for spending time listening to the changes Google had made to their security, as announced at Google I/O 2019.

I was surprised to see Google there, with all the ill feeling about the five stacks, their monopoly and business practices.

But before I could get home and try the key/system, I saw a bunch of problems with the key.

Google Titan Bluetooth Security Key Can Be Used to Hack Paired Devices

Titan-ic disaster: Bluetooth blunder sinks Google’s 2FA keys, free replacements offered

Obviously I was a little concerned, although I had not added the Titan key to my Google two-factor auth yet.

After a bunch of reading, it seems it's not completely flawed. The Google security blog confirms my research.

The problem is with the Bluetooth fob, which, to be honest, while super convenient, wasn't the most secure idea in the world. The Bluetooth stack is limited in its range, but because of that it hasn't got as much security as most things on the net.

Due to a misconfiguration in the Titan Security Keys’ Bluetooth pairing protocols, it is possible for an attacker who is physically close to you at the moment you use your security key — within approximately 30 feet — to (a) communicate with your security key, or (b) communicate with the device to which your key is paired. In order for the misconfiguration to be exploited, an attacker would have to align a series of events in close coordination:

When you’re trying to sign into an account on your device, you are normally asked to press the button on your BLE security key to activate it. An attacker in close physical proximity at that moment in time can potentially connect their own device to your affected security key before your own device connects. In this set of circumstances, the attacker could sign into your account using their own device if the attacker somehow already obtained your username and password and could time these events exactly.

Before you can use your security key, it must be paired to your device. Once paired, an attacker in close physical proximity to you could use their device to masquerade as your affected security key and connect to your device at the moment you are asked to press the button on your key. After that, they could attempt to change their device to appear as a Bluetooth keyboard or mouse and potentially take actions on your device.

This all being a big mistake, Google has offered a replacement key. However, because my key hasn't been added to my account yet, I got a message saying no action is required, but there is an email to override this. After double-checking, though, my key is a type T3, meaning it wasn't affected.

Good work Google…

This is not Plex on your GPU

https://www.flickr.com/photos/nvidia/30153594058/

I hadn't really thought Plex Media Server could massively benefit from a GPU; to be fair, it's not really a thing you put in a headless server? But after reading about it, I borrowed an Nvidia Quadro PCI Express card and, after some small issues getting the proprietary drivers working, gave it a try.

I threw together a shell script to log the CPU and GPU heat to a text file called heat.txt:

#!/bin/bash
# log GPU and CPU temperatures plus fan speed to heat.txt every 10 seconds
while true; do
  date >> heat.txt
  nvidia-smi -q -d TEMPERATURE | grep 'GPU Current Temp' >> heat.txt
  sensors | grep -e 'CPU Temperature' -e 'CPU Fan Speed' \
    -e 'MB Temperature' >> heat.txt
  sleep 10
done

I know there's a better way, but it was quick and dirty.

From the short tests I did, it seemed the CPU kicked into high gear for a minute or two before handing off to what I thought was the GPU. However… during a stream encode of 4K H.264 content to 1080p H.264, while directly streaming at the same time, I got these results.

Thu 20 Dec 20:23:51 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 1650 RPM
> CPU Temperature: +71.0°C
> MB Temperature: +34.0°C
Thu 20 Dec 20:24:01 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 1599 RPM
> CPU Temperature: +68.0°C
> MB Temperature: +34.0°C
Thu 20 Dec 20:24:11 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 1261 RPM
> CPU Temperature: +59.0°C
> MB Temperature: +34.0°C
Thu 20 Dec 20:24:21 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 1167 RPM
> CPU Temperature: +54.0°C
> MB Temperature: +34.0°C

A while later, once the transcoding stopped:

Thu 20 Dec 20:37:40 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 725 RPM
> CPU Temperature: +37.0°C
> MB Temperature: +35.0°C
Thu 20 Dec 20:37:50 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 724 RPM
> CPU Temperature: +37.0°C
> MB Temperature: +35.0°C
Thu 20 Dec 20:38:00 GMT 2018;
> GPU Temperature: 33.0°C
> CPU Fan: 725 RPM
> CPU Temperature: +37.0°C
> MB Temperature: +35.0°C

As you can see, with proper testing it was clear the GPU isn't being used for transcoding (unless the CPU is magically doing something else, but looking at htop, it's clearly Plex transcoding). This was confirmed when doing more research on the issue.

It seems the problem is my AMD processor, and if I were to switch to an Intel one it should work with the Nvidia GPU?
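
For anyone wanting a quicker check than my heat log, nvidia-smi can report the encoder and decoder load directly; if the enc column stays at zero during a transcode, the GPU isn't being used:

# sample GPU utilisation once a second; the enc/dec columns show
# NVENC/NVDEC activity, which should be non-zero during a GPU transcode
nvidia-smi dmon -s u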

So this brings me to the idea of maybe changing parts of my server.

Si pointed me at PCPartPicker, which is alright, but I don't really understand why some Linux operating systems are not listed under operating systems. I listed most of my parts here, and to be fair, changing the CPU, motherboard and case, and of course getting my own GPU, might be a good idea?

Making Slack usable on x64 Linux?

 

Slack

It's been a while since I reinstalled my work laptop; one thing I haven't reinstalled properly is the Slack app.

The amount of times I used to start it up and go and make a tea, because it would make my Ubuntu install act like Windows 95. Most of the time I would come back to find my laptop completely frozen.

I tried reducing the number of Slack workspaces I had attached to the app, but it made little difference. So I decided to hell with the Slack app, which seems to be a wrapper for Chrome, with each Slack workspace being another instance of Chrome!

This time I'm using Slack in Firefox and limiting how many I have open at a time. I noticed if you log in to the different Slacks, the cookie will hold them open for you without using the resources. This can be done from the main page using the workspace options.

Slack home

I also noticed the enterprise Slack version has a front page which can be used to reach the other Slacks.

Recently I decided to give the Flatpak build of Slack a try. Interestingly, I found you can launch the Slack app from the Slack pages mentioned above.
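
If anyone wants to try the same, the Flatpak build installs from Flathub in a couple of commands (app ID taken from the Flathub listing):

# add the Flathub repo if it isn't already configured
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
# install and launch the Flathub build of Slack
flatpak install flathub com.slack.Slack
flatpak run com.slack.Slack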

It sounds like a lot of hassle, but it works and means my Ubuntu system is fully usable.

Hopefully this will be useful for other Linux Slack users?

I bought a Chromebook

https://www.youtube.com/watch?v=YDIhZZJQWRw

The other day my work Dell XPS 13, which has been running Ubuntu 16.04.1, asked me to upgrade. This message had been coming up for a while, but I decided it was time for an upgrade: 18.04 was running well on my server, and well, it was time.

However, the upgrade broke and I was left with Ubuntu 18.04 dropping into BusyBox. I had backups, but as it was a BBC R&D build of Ubuntu, I needed to go into work for them to reinstall it. All of this was just before I went away to MyData 2018 in Helsinki. On top of that, my Ubuntu server also had a problem.

Double whammy!

It was clear I could reinstall Ubuntu quickly, but I would need to do a bunch of configuration and that takes time. I have a task to create a live CD with a bunch of configurations just for me, in case something similar happens again.

I'd been looking at Chromebooks since I bought one for my parents ages ago, and had seen how ChromeOS has matured. I'm not the only one. It was the ability to run Android and Linux apps which pushed me to get one.

Google Makes it Easier to Run Linux Apps on Chromebooks

So I bought the Asus Chromebook Flip C302, and I'm quite impressed with it. The size is good and the performance is good. As a backup laptop it's ideal. It's also kind of a solution to my lack of a decent tablet now my Nexus 7 is pretty much dead. I was tempted by the Google Pixelbook, but it seemed too close to what the Dell XPS 13 is for.

I did consider getting a second-hand XPS and sticking ChromeOS on it myself, actually.

Dataportability and Dock.io

Dock.io stack

You may have gotten an invite to dock.io, a service which reminds me of the late Atomkeep:

Atomkeep helps users sync their profile information on social networks, job boards and other Internet sites. Users gain a streamlined way to validate and control their social identity across multiple sites.

The nice thing about Dock.io is they are doing things more correctly. The potential of blockchain is being talked about everywhere, but it's great to have these services showing the actual potential.

I always found Atomkeep interesting, but it was heavy on trust and APIs. Dock.io benefits from data portability and GDPR, as I was able to get my LinkedIn data dump and drop it into dock.io. Export and import, now that's good! Dock.io reminds me of Open Humans, as you can have applications which run on top of the protocol, which then talks to the actual data.

So far so good; I'm sure to write more about it soon, including the use of Ethereum and IPFS.

Do you trust Grammarly?

grammarly - better writing made easy

Been looking at Grammarly for a while, and to be fair they have been advertising massively too. Obviously Google and Facebook know I'm dyslexic, and I imagine Grammarly are targeting people like me.

But I'm not keen on the process of sending the text to their centralised servers. I understand why, but I think there is another way to do this; however, that way conflicts with their business model. Maybe it's another case of something which should be a public service, not left to the private sector?

I'm not the only one asking questions; I have been browsing the terms and conditions too, and I'm not keen on what I've read so far. The privacy policy alone speaks volumes.

I've been using LanguageTool, as their privacy policy seems more reasonable to me, and it can work offline and in a more decentralised manner.
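
Running it offline is straightforward with the standalone server download from languagetool.org; a sketch of how I'd start and query it locally (the jar name and port are from their docs, but double check your version):

# run the embedded LanguageTool HTTP server on localhost
java -cp languagetool-server.jar org.languagetool.server.HTTPServer --port 8081
# check some text against the local server via the v2 API
curl -d "language=en-GB" -d "text=This are a test." http://localhost:8081/v2/check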

I'd be interested to hear how others get on with it; maybe the benefits greatly outweigh the data-ethics concerns?