The Verse, a wired 3D future

So I was blown away by a new game which Revision3's Co-op (ex 1UP Show) previewed. According to the site, this is the description:

Eskil Steenberg is the sole developer of the game Love. He dropped by the week of GDC to give an extended demo of this 200-player, persistent, and uniquely beautiful game world in which players have complete control–even over the very landscape. Created with tools of his own making, including a 3D modeler and renderer, Love is an incredible example of just how far a solo project can go.

It's all highly impressive stuff, so I hit the web to find out more about the game and the tools Eskil built to create it. What I found was something very different from just a game. Eskil has a complete technical demo online which you can download and play with. The editor (Loq Airou) is also downloadable, but the whole project seems to be a front for Verse, a real-time network protocol that lets 3D apps talk to each other. Like a 3D-aware XMPP? Blender3D already has Verse support and so does GIMP via a plugin; a plugin has also been built for 3D Studio Max, but that's about it for now. So back to Love: Love is a side project of Verse, and the Love engine is essentially just a client using Verse. It's quite a bit to get your head around, but currently the whole thing is freely available. Eskil has said he might make it either donation-ware or open source in the future, which is great news. I think I'm going to have a play tomorrow to see if I can get it working.

Verse sounds utterly amazing, and it's good to read some of the thinking behind it. Wired did cover this a while back, but I missed it.


Gwibber, the dashboard for streams and flow?

I partly talked about this before, but there's been a series of updates which I couldn't help but blog about.

So I was talking to Miles about a client which could support much more than microblogging, and we were surprised that no one has actually built a clever client app which supports microblogging + RSS + XMPP. Well, the closest we could find to that idea is an OS X application called Eventbox. Actually, this blog entry does a much better job of explaining what it can do, and what a difference it makes for advanced users.

Imagine the dashboard of Facebook (credit to Stowe Boyd) but under your control, using the services you prefer. Fan of Flickr? Just add it and the RSS feed. Prefer Photobucket? Use that instead. It's a bit like the lifestreaming services such as Plaxo, MyBlogLog and FriendFeed. The application/client should be clever enough to look at the service and work out, maybe through some discovery service/XML, what's possible with it. So for example adding Twitter will allow you to post and read, while a Flickr feed won't. It would be cool to also finally start adding some of those comment services into the mix, so for example allow BackType comments when you add an RSS feed from a blog. Hell, why not add a proper MetaWeblog/Atom blog editor too?
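Here's a minimal sketch in Python of that capability-discovery idea: the client keeps a small registry of what each service can do and falls back to read-only RSS for anything it doesn't recognise. The service names and fields are purely illustrative, not taken from any real client.

    SERVICE_CAPABILITIES = {
        "twitter": {"read": True, "post": True, "search": True},
        "flickr": {"read": True, "post": False, "search": False},
        "rss": {"read": True, "post": False, "search": False},
    }

    def capabilities_for(service_type):
        """Return what the client should enable for a given account type."""
        return SERVICE_CAPABILITIES.get(service_type, SERVICE_CAPABILITIES["rss"])

    for name in ("twitter", "flickr", "photobucket"):
        print(name, capabilities_for(name))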

Anyway, Eventbox is close and seems to be on the right track, and I was starting to get worried that once again the Linux platform would be left behind in this area. But I was wrong, actually deadly wrong, because right under my nose was Gwibber.

I've been using it for a while now and it's actually fended off competition from AIR apps like TweetDeck (far too Twitterfied) and Twhirl (crashes a lot) for my Ubuntu desktop. But what shocked me today when talking to Miles was the new protocols it supports. I had done updates and never knew about the new features.

[Screenshot: Gwibber 0.72]

So now there's support for:

  • RSS/Atom
  • Digg
  • Laconica
  • Twitter
  • Pidgin
  • Ping.fm
  • Facebook
  • Jaiku
  • Pownce
  • Flickr
  • Identi.ca

So that's most of the microblogging services, including the recently defunct Pownce and the open source Laconica. RSS, including automatic discovery for Digg and Flickr. Then some of the interesting ones: Facebook, with the ability to also send messages behind the Facebook walled garden. Ping.fm support means you can send from Gwibber to all those other services such as Brightkite, Rejaw, etc. But the one which is strange and most exciting is Pidgin support. The problem is, there is no documentation for the Pidgin part and the account settings say you can send only. So after some playing around, I worked out that when you send a message from Gwibber, it also sets your status in Pidgin. This is cool, but I also want the ability to receive XMPP messages straight into Gwibber.
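For anyone else curious, setting the Pidgin status from Python goes over DBus. This is a hedged sketch of roughly what I assume Gwibber's send-only Pidgin account does; the service and method names follow the Pidgin DBus documentation as I remember it, so double-check before relying on them.

    import dbus

    bus = dbus.SessionBus()
    obj = bus.get_object("im.pidgin.purple.PurpleService",
                         "/im/pidgin/purple/PurpleObject")
    purple = dbus.Interface(obj, "im.pidgin.purple.PurpleInterface")

    def set_pidgin_status(message):
        # Reuse the current status type (available, away, etc.) and swap the text.
        current_type = purple.PurpleSavedstatusGetType(
            purple.PurpleSavedstatusGetCurrent())
        saved = purple.PurpleSavedstatusNew("", current_type)
        purple.PurpleSavedstatusSetMessage(saved, message)
        purple.PurpleSavedstatusActivate(saved)

    set_pidgin_status("Posted from my microblogging client")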

[Screenshot: Gwibber 0.72 account settings]

Actually, Gwibber has the structure to really move forward. I've already seen multiple types of authentication, from username/password to an OAuth-like Facebook auth. Each protocol gets its own colour which you can set, and you can enable receive, send and search on each one (where the protocol supports it). Search works well, but I'd like to see some kind of watch or pounce system like you get in Pidgin or Specto. Finally, it would be useful to support the NewsGator API (yep, I switched from Bloglines to NewsGator) for RSS, so you can properly manage the RSS and not end up reading the same news over and over again.
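The watch/pounce idea could be as simple as scanning every incoming message for a list of terms and firing a desktop notification. This little sketch shells out to notify-send; the terms and the message shape are of course just examples of mine.

    import subprocess

    WATCH_TERMS = ["apml", "data portability", "xtech"]

    def check_watches(sender, text):
        # Fire a desktop notification if any watched term shows up in a message.
        hits = [term for term in WATCH_TERMS if term in text.lower()]
        if hits:
            subprocess.call(["notify-send",
                             "Watched term from %s" % sender,
                             "%s (matched: %s)" % (text, ", ".join(hits))])

    check_watches("someone", "Slides from my Data Portability talk are up")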



And the social stacks fit together like that…

One of the things I really missed out on, but have been following, is the developments around the open stack. I kind of prefer 'social stack', but I can see a lot of benefit to 'open' over 'social'. Anyway, this work has been pioneered by some really good people including David Recordon, Chris Messina, Sebastian Küpers, etc. (sorry, too many names to list). Today I was struck by Jyri's blog post about Chris Messina's talk at a recent event.

In his presentation at Friday's event, Chris Messina demonstrated the use case of subscribing to someone who lives on a foreign Web service.

In what follows I'll expand on Chris' story by discussing another use case, where you add the
foreign friend to your address book without needing to go to their site.

Imagine I want to add a friend, David Recordon to my contacts. I
know his email address, so I click 'add contact' in my client and enter
his email.

My client translates David's email address into his OpenID URL, probably using a method called Email to URL Translation.

Now that my client knows where to find David on the Web, it goes out to David's URL and fetches a little file that contains machine-readable pointers
to David's public profile and the photos, status messages, bookmarks,
blogs, and other feeds he publishes. The enabling standards at work
here are likely to be XRDS-Simple and Portable Contacts.

This loop is simply referred to as 'discovery'.

Once my client is done, it is ready to display its findings to me.
Here's a mock-up to illustrate what I might see (the same mock is in
Chris' slides):

[Mock-up: David's contact card and feeds]

After selecting David's contact information and some of his feeds, I
click 'Save', and a subscription request is sent to these services. They
return a few of David's most recent public updates to me.

The next time David logs into these services, he sees a standard new
subscriber notification. His service can perform discovery on me to
display my name and profile summary to him, and allow him to
reciprocate.

David may also choose to allow me to see some of his private information, such as his contact details. The enabling standard here is of course OAuth.

I have never needed to join any of the services David uses; in fact,
I don't even need to know their names. It is irrelevant to me if he
uses Twitter, Plurk, or Friendfeed to publish his status updates or
prefers Flickr, Photobucket, or Picasa for sharing his photos. All I care about is seeing his updates and being able to respond to them using my own client.

Information wants to be free, and social objects want to travel.

The thing this reminds me of is when Tim Berners-Lee wrote about the Semantic Web and how agents would talk to services. You can follow how it works without even knowing the different technologies too well. So while these guys figure out the web side of things, these other guys earn a mention for their work on the services stuff, and Control Yourself for their work on OpenMicroBlogging.
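To make the flow above a little more concrete, here's a very rough Python sketch of the discovery loop. The email-to-URL fallback and the XRDS parsing are simplified placeholders of my own; the real rules live in the EAUT, XRDS-Simple and Portable Contacts specs.

    from urllib.request import urlopen
    from xml.etree import ElementTree

    XRD_NS = "{xri://$xrd*($v*2.0)}"

    def email_to_url(email):
        # Step 1: translate an email address into a profile URL. A real client
        # follows the EAUT discovery rules on the email's domain; this fallback
        # just guesses a well-known pattern for illustration.
        user, domain = email.split("@", 1)
        return "http://%s/%s" % (domain, user)

    def discover_services(xrds_url):
        # Step 2: fetch the XRDS-Simple document (a real client finds its URL
        # from the profile page) and pull out the advertised service endpoints,
        # e.g. Portable Contacts, photo feeds, status feeds.
        tree = ElementTree.fromstring(urlopen(xrds_url).read())
        services = {}
        for service in tree.iter(XRD_NS + "Service"):
            stype = service.findtext(XRD_NS + "Type")
            uri = service.findtext(XRD_NS + "URI")
            if stype and uri:
                services[stype] = uri
        return services

    print(email_to_url("david@example.org"))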


Social Mediaflows with Tarpipe

A friend of mine, Mike Lott, sent me a link to Lifehacker, where they talk about Tarpipe.

Tarpipe streamlines your updates to various social web sites, creating simple or complex workflows to update several buckets in one fell swoop. Let's say you want to do something simple like upload a new picture to Flickr and then tweet about it on Twitter. Normally you'd need to upload the photo to Flickr, find the URL of the pic, run it through some sort of URL shrinker, and then update your Twitter account with the shrunken link to the Flickr page. It may not seem like all that much work, but Tarpipe can tackle this entire process in one step—all you have to do is send one email.

Tarpipe creates custom email addresses that, when emailed, run through a pre-defined set of actions to update any service you define. Creating a custom workflow will look very familiar if you've ever used Yahoo Pipes, but rather than creating custom RSS feeds like Pipes, Tarpipes creates custom social media workflows. The site supports integration with Pownce, Flickr, PhotoBucket, Tumblr, Plurk, Evernote, Delicious, TinyURL, FriendFeed, Twitter, and tons more, so if you use more than one of these sites, Tarpipe could come in really handy.

And seriously…

I've not been this excited since Ping.fm (no, Pixelpipe didn't excite me).

These guys have done everything right, like Ping.fm. Every chance they have to use OAuth for authentication, they're using it; OpenID is the default way to join up and get an account; and they support everything from Twitter to Identi.ca. The use of email to control everything is a little odd, but there is support for an API, and I'm sure instant messaging and other methods are not far behind. Most of you already know I favour pipeline interfaces for complex operations, and until now I've been pinning my hopes on Conduit, which supports many more services but is really a syncing application rather than a pipelining application. Anyway, I've only played with Tarpipe for a few minutes, so I'll hopefully have more to say and show once I get going tomorrow.
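For comparison, this is roughly the manual dance from the Lifehacker example that a single Tarpipe email replaces: shorten the photo's page URL and post it. The TinyURL call is the public api-create endpoint; the Twitter call uses the basic-auth statuses/update API of the time, so treat the endpoint and credentials as illustrative.

    import urllib.parse
    import urllib.request

    def shorten(url):
        # TinyURL's api-create endpoint returns the shortened URL as plain text.
        api = "http://tinyurl.com/api-create.php?" + urllib.parse.urlencode({"url": url})
        return urllib.request.urlopen(api).read().decode("utf-8")

    def tweet(status, username, password):
        # Era-specific: Twitter's old basic-auth REST API.
        mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
        mgr.add_password(None, "http://twitter.com/", username, password)
        opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
        data = urllib.parse.urlencode({"status": status}).encode("utf-8")
        return opener.open("http://twitter.com/statuses/update.json", data).read()

    short = shorten("http://www.flickr.com/photos/example/1234567890/")
    print("New photo: " + short)
    # tweet("New photo: " + short, "username", "password")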


Web of Flow

I think Stowe Boyd is a very clever man; his thoughts behind social tools run very deep. And rightly so: while the rest of us were trying to grapple with social anything, he coined the term 'social tools' and understood the power of these tools and the conversation. I kind of liken him to people like Doc Searls and Howard Rheingold, but instantly more accessible.

A lot of people don't like his presentation style, which is more of a jumble of mini-thoughts and pointers. So when someone pointed me at Phil Windley's piece about Stowe's latest talk, I knew what the bulk of the post would be about.

Although Phil may not have enjoyed the talk much, I certainly did. It also got me thinking.

He shows his desktop: Snackr,
Friendfeed flow UI, Flickr, Twitterfox, and so on. These are all
flow apps. There are dozens of streams now and there will be lots
more in the future. These differ on the basis of the social
interactions they enable. There will be 5 or 6 themes, but lots of
implementations.

This leads to a model called “lifestreaming.” People are continually
broadcasting their life to groups of friends and even strangers.
People know where you are and ask you questions about things in your
life because of life streaming.

If you take a look at one of my desktops from yesterday, when I was watching the US elections (go Obama), you can clearly see some common elements between Stowe's and mine.

In Stowe's talk and screenshot he's got the friends' activity stream as a page up on the right, but using RSS there's no need to have that at all. Actually, I noticed my microblogging client Gwibber supports not only microblogging services but also Facebook and Flickr. I think with some hacking around in the Python code I can get it to have a generic RSS input too. Another interesting element is Snackr, which is the scrolling RSS-driven marquee at the bottom. If we could get Gwibber to spit out RSS too, that would be cool for Snackr. But I can't help but feel the guys at Faraday Media have already ventured into this area before with Touchstone/Particls. Unfortunately, having the attention engine on your machine wasn't the best of ideas, which is where something new I also heard about at the Web 2.0 Expo could come in useful on the relevancy side.
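A generic RSS input for Gwibber wouldn't need much more than feedparser; here each entry gets flattened into the same sender/text/time shape the microblogging accounts use. The field names are my own guess, not Gwibber's internals.

    import feedparser

    def rss_messages(feed_url):
        # Flatten a feed into simple message dictionaries a stream UI can show.
        feed = feedparser.parse(feed_url)
        source = feed.feed.get("title", feed_url)
        for entry in feed.entries:
            yield {
                "sender": source,
                "text": entry.get("title", ""),
                "url": entry.get("link", ""),
                "time": entry.get("updated", ""),
            }

    for message in rss_messages("http://example.org/blog/feed.rss"):
        print("%(sender)s: %(text)s (%(url)s)" % message)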

I'm not one to hide my ideas, but this time I want to try hacking around with some software to see what I can build into either Gwibber or Snackr.


A glimmer of what a semantic web could be like


Freebase Parallax: A new way to browse and explore data from David Huynh on Vimeo.

I love the idea of Freebase; even Tim O'Reilly was upbeat about the project when it first launched over a year ago. But this video of a Freebase project called Parallax is simply amazing. It looks closer to the Semantic Web than almost anything else I've seen. There's so much going on in this area now: you have things like DBpedia, Freebase, MusicBrainz, Last.fm, etc… Then lots of linked data projects using FOAF, Atom, RDFa, etc. The W3C and other standards bodies are working on things like RuleML, APML, etc. And there's even stuff you can run yourself, like the many semantic wikis and blogs which are starting to pop up. There's some real progress being made; the Semantic Web is closer than we think…
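If you want a small taste of the kind of querying the Parallax video shows, the public DBpedia SPARQL endpoint is a good playground. This uses the SPARQLWrapper library; the query asks for films directed by a given person, and the exact property names are illustrative rather than a polished query.

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://dbpedia.org/sparql")
    sparql.setQuery("""
        PREFIX dbo: <http://dbpedia.org/ontology/>
        PREFIX dbr: <http://dbpedia.org/resource/>
        SELECT ?film WHERE { ?film dbo:director dbr:Ridley_Scott . } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        print(row["film"]["value"])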


I was that close to installing Laconica today

Evan from Laconica/Identi.ca/Wikitravel and much more was on FLOSS Weekly this week. Here's the blurb from twit.tv:

Laconica, the open source microblogging tool implementing the OpenMicroBlogging standard used in Identi.ca.

Guest: Evan Prodromou for Laconica and Identi.ca

It's quite a long discussion, but there's lots of lovely information about why Evan started Laconica and some of the changes he's planning for the project. I was a little shocked to find Laconica isn't based on a messaging protocol, but there are plans afoot to change towards that. Installing Laconica could do with being a little easier, though. Some features include a Twitter-compatible API, tags, federation, Jaiku-like replies, FOAF and OAuth. Actually, I missed Dan Brickley's blog post about it.
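The Twitter-compatible API is worth dwelling on: it should mean most existing Twitter client code works against a Laconica instance with just the base URL swapped. Here's a hedged sketch of reading the Identi.ca public timeline, assuming the path mirrors Twitter's convention of the time, so check your instance's docs.

    import json
    from urllib.request import urlopen

    def public_timeline(base_url):
        # base_url would be something like http://identi.ca/api
        data = urlopen(base_url + "/statuses/public_timeline.json").read()
        for notice in json.loads(data):
            print("%s: %s" % (notice["user"]["screen_name"], notice["text"]))

    # public_timeline("http://identi.ca/api")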

In a world where Twitter is falling apart (don't get me wrong, I love Twitter, but when is the IM bot coming back? And then there's the total removal of the SMS service for the UK and parts of Europe), Jaiku seems to be stalled and going nowhere thanks to Google buying up most of the developers, and the rest of the copiers are copying the Twitter silo to the pixel. It's great to have Ping.fm and Identi.ca busting these silos. By the way, I love the company name: controlyourself.ca.


The attention economy is only just around the corner

APML logo

One of my proposals for Xtech 2008 was accepted. I submitted two, one around APML and the other around data portability. As expected, the data portability one got dropped, I guess because it's maybe a little light and fluffy for a conference like Xtech. So I'll be talking about APML, which hopefully will be up to version 1.0 by then. I'll be covering both the actual XML format and the practical uses and services which have sprung up around APML.

The attention economy was talked about to death at the end of '06. Through all the hype, a couple of guys from down under started to make sense of attention and proposed APML (Attention Profiling Markup Language). Unfortunately, little is known about APML and there is a lot of misinformation about it. As one of the working group members, I will run through what it is, its purpose and why it's important.
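For those who have never seen one, an APML file is just XML describing weighted concepts. This little sketch builds a minimal profile with ElementTree; the element and attribute names follow my memory of the 0.6 draft, so check them against the published spec.

    from xml.etree import ElementTree as ET

    apml = ET.Element("APML", version="0.6")
    body = ET.SubElement(apml, "Body", defaultprofile="Home")
    profile = ET.SubElement(body, "Profile", name="Home")
    concepts = ET.SubElement(ET.SubElement(profile, "ImplicitData"), "Concepts")

    # Each concept is a topic plus a weighting, roughly -1.0 to 1.0.
    for key, value in [("data portability", "0.92"), ("apml", "0.81"), ("xtech", "0.45")]:
        ET.SubElement(concepts, "Concept", {"key": key, "value": value, "from": "example.org"})

    print(ET.tostring(apml).decode("utf-8"))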



Data Portability for Educators

These are the slides I used for the educational JISC event. The event went really well, and although there are over 80 slides, I managed to whizz through them in about 30 minutes, leaving plenty of time for questions. Someone commented they were pretty blown away and would need to review the slides again because there was just so much information to take in. Another lecturer commented that she will spend more time in the future looking at EULAs and data portability features before recommending services to students. So a good result all round.

Yes, there was a geekdinner about data portability which I was part of. Imp has put up a complete video of the night, which I'm never going to watch (I hate watching myself on video). Enjoy.



DataPortabilityAndMe videos

So finally I have edited and re-encoded my videos for the DataPortabilityAndMe video project. Now, before you all moan about the quality or the very strange eye movements, bear in mind this was shot in Hamburg at about 2am in a hotel room, while I was trying to talk clearly but not loudly so as not to wake my neighbours. I had also just come from a girl geekdinner where I was drinking wine, and I had just finished an espresso before turning the camera towards myself. So yes, generally not the best of conditions, but I just needed to do it, or the nerves would have gotten hold of me and I would never have done it.

The quality is a little poorer than expected because it had been run through three different encoders by the end. It was shot on my Sanyo HD1 camera at high-definition 720p resolution using the lower quality MPEG-4 codec (I wish I'd used the higher quality one now). The footage was copied from SD card to hard drive and transcoded into MPEG-4 video and MPEG-1 audio so Kdenlive could edit it without the audio error I was getting when importing the Sanyo HD1 footage directly. After editing in Kdenlive (which I have to say is one of the best desktop editing tools for GNU/Linux at the moment), I exported the video in a range of formats for uploading to Blip.tv. In the end I found it easier to just output from Kdenlive using DivX video and MP3 audio, then once again transcode that using VLC into MPEG-4 which Blip could understand. Yes, it's a pain, but I finally got there. Next time I would convert once to HDV files and encode just once more before sending to Blip.tv.
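For the curious, the transcode-before-editing step looks something like this when approximated with ffmpeg from Python. I actually used VLC and Kdenlive, so the codec flags here are an assumption about roughly equivalent settings rather than what I ran.

    import subprocess

    def transcode(source, target):
        # Re-encode the camera footage so the editor can cope with the audio.
        subprocess.check_call([
            "ffmpeg",
            "-i", source,             # original Sanyo HD1 footage
            "-vcodec", "mpeg4",       # MPEG-4 video
            "-acodec", "libmp3lame",  # MP3 audio
            target,
        ])

    # transcode("clip0001.mp4", "clip0001-edit.avi")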

Some of you might have noticed I also shot two videos. DataPortabilityAndMe (Large version) is me talking for less than five minutes, and the longer version where I, quote, “wax lyrical” (yes, I wish I had not said that too) is called DataPortabilityAndMe Adhoc Talk (Small version). Enjoy, and let me know what you all think.



Kevin Rose talks about Data Portability on Diggnation

So it started off being about Facebook applications on any site and their global JavaScript library. But before long Kevin Rose is informing Alex about the advantages of this move for Facebook and some of its users. Kevin then points out that the disadvantage is for the users, because it's still tied to Facebook. He briefly mentions OpenSocial, then starts talking about DataPortability. During the following 3 to 4 minutes Alex challenges Kevin about Digg.com and its data portability stance, and to be fair Kevin admits he's all for data portability in Digg.com. This is obviously very fitting given the announcement a while back that Digg had joined the DataPortability group.

For someone who's one of the founders of the DataPortability group, I've been quite quiet about it. Don't get me wrong, I'm lurking a lot and I already have my fingers in certain data portability pies. You may have seen some data portability videos around; well, I'm glad to say I have completed mine, and I'm just trying to edit it with Kdenlive and Pitivi but not having much luck. It seems Kdenlive doesn't like my Sanyo's MPEG-4 audio format, so I need to convert the files into something else first using VLC. Pitivi is strange and does weird things to the video, which means it won't play in much, including the great VLC.

Big thanks to Kevin Rose for giving me permission to clip this video and put it up on Blip.tv. Originally, not only was I having problems with encoding, but Blip kept removing my video because I was breaching Revision3's copyright. So after a brief email directly to Kevin, he replied yes, but said he would have liked to have seen the video first. I told him that if he doesn't like it I will take it down.

There is also now a geekdinner about data portability in London. If you're interested in this subject and in the London area on Wednesday 27th February, come along for a good debate about the whole project and subject.


Either Google's really good, or should I be scared?

Google's good?

alt.support.divorce and diet.low-carb, hmmm, I guess I should be scared. Google couldn't have worked this out from my Gmail, because I don't tend to use that much for personal emails. My Google Reader subscriptions and blog might be the source, if this isn't just a fluke. The problem is that I can't see the same page without logging in, otherwise I would be able to tell for sure. Now, if Google only supported APML, then I could know for sure. Yes, another reason for APML, along with Tristan's thoughts on the Radio Labs blog.



Xtech 2008 proposals

I have put in a couple of proposals for Xtech 2008, which this year is in Dublin. The theme for the year is 'the web on the go', which actually fits in nicely with our thoughts for Over the Air (more details about that really soon). The way I see it, data portability is a movement a bit like data on the go; in actual fact you can't have data on the go without some data portability. Roll on the brief descriptions…

The truth behind Data Portability
Data portability is in a way one of the greatest freedoms users and developers can have. Portability of data underpins the web of data, APIs and the ability to move data to other services, platforms and devices. It is silo busting and is deeply woven into the debate over social platforms, identity and mobile data. In this talk, I will explore the problems, solutions and gamut of policy decisions.

The attention economy is only just around the corner
The attention economy was talked about to death at the end of '06. Through all the hype, a couple of guys from down under started to make sense of attention and proposed APML (Attention Profiling Markup Language). Unfortunately, little is known about APML and there is a lot of misinformation about it. As one of the working group members, I will run through what it is, its purpose and why it's important.



Things to make you move

APML logo

I have not written much about APML for a while, partly because I've been focused on data portability. But I saw this a while ago and forgot to blog it in December.

Attention data is an emerging concept in web circles that focusses on making people the owners of the data about the websites they visit, or pay attention to. Currently, this kind of information builds up in the individual websites that we visit, but in most cases that information is never shared with us. The Attention movement seeks to remedy that by making data about what we pay attention to available in a portable format.

That format is known as APML, or Attention Profile Markup Language. And, with some simple integration with one of the APML providers, Engagd, we’ve now made it possible for Ma.gnolia members to build an attention profile from their bookmarks.

A new setting in your profile controls, found on the newly-renamed Bookmarks tab, allows you to turn on the generation of APML from your bookmarks feed. After the profile has had some time to build up, you can grab a copy of your profile data.

When I first read this, my finger hovered over the sign-up button. Yes, I was that close to converting from Delicious to Ma.gnolia. And let's be honest, Ma.gnolia's OpenID signup would have made things so simple. I bet it would have taken all of about five minutes to sign up and be using it just like I use Delicious right now. It's not too late either… If anything will make me move to or join a new service, it's going to be APML support and OpenID. OAuth support is a nice touch but not essential. RSS/Atom, an API and microformats should now be a given. If they're not there, obviously the service isn't serious about its data portability policy, or you could say it doesn't really care about its advanced participants.
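The Ma.gnolia/Engagd idea in miniature is simple: turn the tag counts from a bookmarks feed into weighted concepts, which is essentially what an implicit APML profile holds. The feed parsing and the weighting scheme below are mine, purely to show the shape of it.

    import feedparser
    from collections import Counter

    def attention_profile(bookmarks_feed_url):
        # Count the tags on each bookmark, then normalise so the most-used tag
        # scores 1.0 and everything else scales down from there.
        feed = feedparser.parse(bookmarks_feed_url)
        counts = Counter()
        for entry in feed.entries:
            for tag in entry.get("tags", []):
                counts[tag["term"].lower()] += 1
        if not counts:
            return {}
        peak = float(max(counts.values()))
        return {tag: round(count / peak, 2) for tag, count in counts.items()}

    print(attention_profile("http://example.org/bookmarks.rss"))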

Great news about Yahoo and OpenID, by the way. I look forward to seeing OpenID support on Flickr, Delicious, etc. very soon.



Data Portability Video


DataPortability – Connect, Control, Share, Remix from Smashcut Media on Vimeo.


CREDITS:
Written, designed and edited by Michael Pick, smashcut-media.com

Music – “Bongo Avenger” – Eric & Ryan Kilkenny: CC Attrib. Non Commercial

Hands photo: Scol22 – Stock.xchng

Additional images: istockphoto

Animated Flourishes: Andrew Kramer

So I was impressed with the production value of the video but felt it needed more snap. Chris asked what I meant by “snap”, but I think you will know what I mean and agree: it certainly needs more snappiness. Not that I'm slagging it off; actually it's really good and well worth sending around to people who don't know or understand the whole data portability movement.

I've also finally put in an Xtech 2008 proposal for data portability. Here is my short description:

Data portability is in a way one of the greatest freedoms users and developers can have. Portability of data underpins the web of data, APIs and the ability to move data to other services, platforms and devices. It is silo busting and is deeply woven into the debate over social platforms, identity and mobile data. In this talk, I will explore the problems, solutions and gamut of policy decisions.
