Why I love the idea of APML

APML support so far

I decided to split up my posts about the Girl Geekdinner because something happened later, when we got to the pub in Victoria afterwards.

Walid from Trustedplaces.com was showing me some of the new features they're planning. Obviously these are not to be repeated, so I won't. But we got talking about the Trustedplaces taste tester, and Walid pointed out a site I'd never seen before called Imagini. How we got on to that subject was profiling. I was suggesting to Walid that it would be great if they made the profiling data available to the user, so they could tweak it or share it. Glyn asked about the business motivations for doing so. I didn't really have an answer, except that it would be very cool.

So why?

Well, Imagini tries to map out who you are by asking you about 13 questions. Its results are poor and very general. But worse still, once you've done all that work, you get rewarded with a widget, some facts about yourself according to them, some travel sites you might like, and being added to their facewall. The author calls it VisualDNA; there's lots more about VisualDNA, including this part, which talks about the reasoning behind it.

Did you know that businesses around the globe spend a staggering $18 billion per year on market research, trying to work out better ways of understanding what we all want? On top of this, about another $350 billion is spent every year advertising to persuade us to buy what’s been produced and available…

We think that this is totally outdated and simply not a sustainable way to carry on. It just makes sense that the future must be about producing less whilst meeting people's needs more. We believe that the changing way in which we are all using the internet will make this possible by enabling people to get together and share information about what they like, want and need.

Our view is that the way to start assisting this process is to open up a completely new method of communication – a language that everyone who can see can interact with and understand – a language of images that enables people to understand each other in a different way.

The reason we have chosen images as a way of doing this is because about 90% of the way we all communicate is non-verbal. This 90% is made up of all sorts of different components that include many visual aspects such as how we look, act and behave.


This may sound cool, but I'm left thinking: what else is in it for me? Now imagine it created an APML (Attention Profile Markup Language) file along with everything else. Then that would be something special.

This got me thinking too: what if other, more established places like Trustedplaces, Last.FM, etc. also gave away an APML file as part of each user's profile?

One of the things I loved about APML is the Implicit Data (U-AR) and Explicit Data (I-AM) elements. You can just imagine how simple it would be to output APML from something like Last.FM. (What's below isn't true APML markup, just my lazy JSON-like writing.)

Implicit (U-AR) last.fm {
  concept { Ferry Corsten = 0.87 }
  concept { Armin Van Buuren = 0.90 }
  concept { Sugar Babes = 0.1 }
  concept { Lemonhead = 0.00001 }
}
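For comparison, here's roughly what that could look like as actual APML. This is a hand-written sketch based on my reading of the APML draft spec, so treat the exact element and attribute names (and the made-up timestamps) as approximate rather than gospel:

```xml
<APML version="0.6" xmlns="http://www.apml.org/apml-0.6">
  <Head>
    <Title>last.fm attention profile</Title>
  </Head>
  <Body defaultprofile="music">
    <Profile name="music">
      <ImplicitData>
        <Concepts>
          <Concept key="Ferry Corsten" value="0.87" from="last.fm" updated="2007-05-05T12:00:00Z"/>
          <Concept key="Armin Van Buuren" value="0.90" from="last.fm" updated="2007-05-05T12:00:00Z"/>
          <Concept key="Sugar Babes" value="0.1" from="last.fm" updated="2007-05-05T12:00:00Z"/>
          <Concept key="Lemonhead" value="0.00001" from="last.fm" updated="2007-05-05T12:00:00Z"/>
        </Concepts>
      </ImplicitData>
    </Profile>
  </Body>
</APML>
```

The nice part is how little of this Last.FM would have to invent: the key/value pairs fall straight out of the listening data they already have.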

Anyway, thinking about Glyn's question about the business angle, I still don't quite have an answer, except to say I've been following Steve Gibson's Security Now, which recently has been talking about multifactor authentication.

  1. Something you know
  2. Something you have
  3. Something you are
  4. Someone you know

Well, I was thinking APML could be useful for 1 and 3, but I started thinking about a 5th factor: something you know about someone. So a question could be: does friend1 prefer Ferry Corsten, Armin Van Buuren, Sugar Babes or Lemonhead? Maybe? Or maybe not?
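Just to make the idea concrete, here's a rough Python sketch of how a service holding a friend's attention profile could generate that kind of challenge question. Everything here (the scores, the function name, the question wording) is made up for illustration, not any real API:

```python
import random

# Hypothetical: a friend's implicit attention concepts (artist -> score),
# as in the last.fm style example earlier in the post.
friend_concepts = {
    "Ferry Corsten": 0.87,
    "Armin Van Buuren": 0.90,
    "Sugar Babes": 0.1,
    "Lemonhead": 0.00001,
}

def make_challenge(concepts, rng=random):
    """Return (question, options, answer) for a 'something you know
    about someone' authentication challenge."""
    answer = max(concepts, key=concepts.get)  # the friend's clear favourite
    options = list(concepts)
    rng.shuffle(options)  # don't leak the answer by its position
    question = "Which of these does your friend prefer: %s?" % ", ".join(options)
    return question, options, answer

question, options, answer = make_challenge(friend_concepts)
print(question)
# The verifier would then check the user's pick against `answer`.
```

Whether a burglar could guess your friend's taste in trance is another matter, but that's the general shape of it.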

Anyway, I look forward to seeing more applications and services using APML or something like it. I think there's a business reason behind APML, but I can't put my finger on it right now. Hopefully someone like Trustedplaces gets it before Digg, who just announced something similar to Trustedplaces.


Particls – now you can all pay attention…

Particls on my desktop

The private alpha ends today. Yep, the guys from Faraday Media have made Particls available to anyone who wants it. Go get it now.

For those who don't know Particls is an extensible attention platform. It learns what you like to consume and gives you more of that. I have been using it for quite some time now and have found it very useful.

  • For users: Particls is a filtered news reader or widget that learns what you care about and alerts you to important news and information while you work. More at www.particls.com
  • For bloggers and site owners: Particls allows bloggers and site owners to create a custom version of the application. Particls will share revenue with partners. More at www.particls.com/intouch
  • For developers: Particls is freely extensible by developers. Reach into corporate databases and web APIs to grab and display data in new and interesting ways. More at http://www.particls.com/extensions/
  • How much is it: Particls is a free download with some ads. Later, an ad-free Pro version will be available for a small subscription fee. It is free for Partners to create custom versions.

So Particls is the biggest step forward in the debate over attention. Some of the scenarios people have talked about can be played out in Particls. For example, if Particls knows what you're browsing, it can throw up an alert from a site owner offering a 20% discount if you buy that item you were searching for on ebay.co.uk right now. And that's just the start of things.

I once outlined a scenario where Particls looks at your Microsoft Money account and what's in your Amazon wishlist. It notices you always get paid on the 28th of the month, so through clever logic it pops up alerts with discounts for some of the items on your wishlist when you have enough money to pay for them.

This is quite scary but possible, and it raises the issue of people taking control of their attention data, which is where APML fits in perfectly. One of the things which has always impressed me about Particls is the ability to look at the result of their I-AM/U-AR engine in XML and adjust it accordingly. This means you can erase a large section of your personal attention data without too much hassle. It also means you can import from somewhere else, like another attention engine or the keywords from your lifestream, for example.
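To show how simple that kind of adjustment could be, here's a hedged Python sketch that erases the weak concepts from an APML-style XML fragment. The fragment and the threshold are my own invention, not Particls' actual I-AM/U-AR format:

```python
import xml.etree.ElementTree as ET

# Hypothetical APML-style fragment; a real attention engine's
# XML would have more structure than this.
xml_data = """\
<Concepts>
  <Concept key="Ferry Corsten" value="0.87"/>
  <Concept key="Armin Van Buuren" value="0.90"/>
  <Concept key="Sugar Babes" value="0.1"/>
  <Concept key="Lemonhead" value="0.00001"/>
</Concepts>
"""

def erase_below(root, threshold):
    """Drop every Concept whose attention value is under the threshold."""
    for concept in list(root.findall("Concept")):  # copy: we mutate while iterating
        if float(concept.get("value")) < threshold:
            root.remove(concept)
    return root

root = erase_below(ET.fromstring(xml_data), 0.5)
print([c.get("key") for c in root.findall("Concept")])
# → ['Ferry Corsten', 'Armin Van Buuren']
```

A few lines of script, and a whole chunk of your attention profile is gone — which is exactly the kind of control over your own data that the attention debate is about.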

So, enough chatter: you can download it for the PC here, or check out the options for the Mac while they develop their native Mac version.
