Alternative user interfaces

I studied interaction design at university and have always had an appreciation for good interaction and interface design. Recently I have seen a few examples which have got me a little excited.

Ubuntu’s scopes
I like Ubuntu’s Unity paradigm of scopes and lenses, even though I prefer to use Gnome Shell as my default on the desktop. The scopes and lenses really make a lot of sense, and it was fascinating to see Ubuntu apply them across their phone and tablet. It will be interesting to see how it works on Ubuntu TV, if that’s still ongoing.

Pebble timeline
When I first saw the Pebble Time interface, I instantly thought: when are they going to roll that across their existing line of smartwatches? If not, maybe I will invest in one of the new ones. Dividing an interface into the future, present and past on a watch makes a lot more sense than anything else I have seen to date, including the Apple Watch.

Android Material Design
Ice Cream Sandwich (Android 4.0) was a massive step up in style for Android, but Android 5.0 Lollipop was really the first Android release where the interaction design was thought about at a deeper level.

I don’t necessarily like the style of flat plates of colour (the Google Hangouts app, for example, is just the wrong kind of green for my palette), but the interaction model is nice, although I have spotted a few places where certain apps break the rules.

Perceptive learning resources

Future of StoryTelling

For the last few Wednesdays I have been watching the Future of StoryTelling hangouts online. I first heard about them from Matt Locke and Frank Rose last year when I gatecrashed a planned hangout with Perceptive Radio.

The Future of StoryTelling speaker Hangout series continues on Wednesday, January 15th, with a discussion about interactive gaming, and how great entertainment can transport you from your daily life and immerse you in another world.

You can watch the whole thing here on YouTube, along with last week’s with Google Creative Labs’ Robert Wong. This week’s includes my question, which is based on my noticing that interaction and narrative keep getting thrown around together when they are quite different things.

The guest this week was Microsoft’s Shannon Loftis, General Manager at Xbox Entertainment Studios. She said a lot of things I agreed with, but her swapping of narrative for interactive made me pause and think about the origins of Perceptive Media.

I’m not going to say games and interactive experiences are not storytelling; I would be very wrong. But what surprises me is that Microsoft have this amazing device with cutting-edge sensors, and they sound like they are doing some perception, yet they are only using it for games? Shannon even talks about the golden age of television, then slides off into games again.

Real shame…

Anyway, there was a question asking what all this could mean for children. Most of the guests gave answers I couldn’t disagree with, but Charles Melcher (founder of Future of StoryTelling) jumped in with something quite profound.

I clipped it and put it on Archive.org, but it’s something I’ve been thinking about since the early days of Perceptive Media.

The beauty of media which adapts, responds or, as I prefer, perceives the audience and the context is that it can unfold one way for one person and another way for someone else. Like Charles, I’m dyslexic and sometimes just can’t get my head around learning resources which are written for the majority of people.

I understand why it’s been that way: the cost of creating multiple versions of a learning resource makes it a bad idea from a resourcing point of view. But that only applies if you build your resources in a solid, non-flexible way (like a blob); then you’re going to run into the same problem described. However, if you have something more fluid (generative) or object-based, you can change aspects on the fly.

Simple example: a book (any book) vs an ereader (like a Kindle). I’m sure I’ve talked about this before, but line length is a common issue for people who are dyslexic. We tend to lose which line we’re on for a split second.

I can reshape the line lengths to make it more readable for myself (that’s interactive). An ereader with sensors could follow my eye patterns and reshape the line lengths and fonts to give me the best reading experience (now that’s perceptive). This all works because the text is digital and therefore an object which can be manipulated.
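To make that concrete, here is a rough sketch in Python (my own illustration, not any ereader’s actual API) of reflowing the same digital text to different line lengths. The width value is a stand-in for whatever drives it: a manual reader setting in the interactive case, or an estimate derived from eye-tracking in the perceptive case.

```python
import textwrap

def reflow(text, max_chars_per_line):
    """Reflow digital text to a given line length.

    Because the text is just data (an object), the same paragraph can
    be laid out differently for each reader without a new edition.
    """
    return textwrap.fill(text, width=max_chars_per_line)

sample = ("Because the text is digital it is an object which can be "
          "manipulated, so the same paragraph can unfold one way for "
          "one reader and another way for someone else.")

print(reflow(sample, 70))  # a typical default line length
print()
print(reflow(sample, 35))  # shorter lines, easier to keep your place on
```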

Back to Charles: a resource which can be manipulated by a person is good, but one which can be manipulated by a process of data and sensors is even better (if they are working to aid you). Combining and aggregating resources gets you to a position where you can weave a story together. I won’t bore you with my “campfire == perceptive media, and this is what humans do” thoughts, but I do feel this is the future of storytelling. Charles’s vision is achievable and it’s something I’d love to talk to BBC Learning about in more depth.

I’ll be honest and say not only has this one got me writing, but I also started writing after hearing Robert Wong talk last week about leadership and inspiring people.

Imagine XBMC with Leap…

Ever since the Microsoft Kinect was hacked to work with non-Xbox machines, XBMC hackers have been modifying their setups to support gesture control. So popular was the idea of controlling media with gestures that even the BBC adopted it in the Xbox version of iPlayer. However, the limits of the Kinect were soon being discovered by the XBMC hackers.

After the first rush of controlling media using your whole body came the idea of using just your arm, then finally just the hand. But the Microsoft Kinect didn’t have the sensor density to support this. Now Leap Motion have brought out their own Kinect-style solution.

XBMC users should love the Leap Motion, especially with driver support for Windows, Mac and Linux.

It supports not only fingers but even pencils and pens too: all the things needed to really make the XBMC interface amazing.
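As a rough sketch of the glue (assuming XBMC’s web server is enabled on the default port 8080, and treating the gesture names as placeholders for whatever your Leap Motion layer actually reports), mapping recognised gestures onto XBMC’s JSON-RPC input calls could look something like this:

```python
import requests

# Assumption: XBMC's web server is enabled (default port 8080),
# which exposes the JSON-RPC API at /jsonrpc.
XBMC_URL = "http://localhost:8080/jsonrpc"

def xbmc_call(method, params=None):
    """Send a single JSON-RPC request to XBMC."""
    payload = {"jsonrpc": "2.0", "id": 1, "method": method}
    if params:
        payload["params"] = params
    return requests.post(XBMC_URL, json=payload).json()

# Hypothetical mapping from a recognised hand gesture to an XBMC action.
# The gesture names on the left are placeholders for whatever the Leap
# software reports; the methods on the right are real XBMC JSON-RPC calls.
GESTURE_ACTIONS = {
    "swipe_left":  lambda: xbmc_call("Input.Left"),
    "swipe_right": lambda: xbmc_call("Input.Right"),
    "finger_tap":  lambda: xbmc_call("Input.Select"),
    "palm_push":   lambda: xbmc_call("Player.PlayPause", {"playerid": 1}),
}

def handle_gesture(name):
    action = GESTURE_ACTIONS.get(name)
    if action:
        action()

# e.g. handle_gesture("swipe_right") moves right in the XBMC interface
```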