Social news with digg

Diggnation video

I have been thinking about using some other tool to capture news content which doesn't quite make its way onto Slashdot and Boing Boing. MetaFilter is one which Tom uses religiously, but I don't quite like the old style of web product it is. Del.icio.us popular is good, but the content tends not to be purely news; sometimes you get links related to the news instead of the actual news. The same applies to Technorati popular.

So I'm testing out Digg, which has a Slashdot-type model but allows for a much more long-tail approach via friend aggregation as well as the whole Digg nation. Talking of which, there is a show hosted by Kevin Rose and Alex Albrecht called Diggnation, where they take the top few stories from a week on Digg and simply talk about them. It's kind of a Slashdot review but with comments turned on. The show comes in audio and video podcast forms, and I have to give huge kudos to Jon for making their feeds compatible with an RSS TV setup. Do check it out if you have an Azureus-with-RSS-automation type setup. But back to Diggnation for a moment. Currently Diggnation only deals with the most dugg/popular news, but there's nothing stopping someone doing a show about the most dugg/popular content between certain groups of friends or within a certain category. If they adopted a tagging option too, the possibilities would be endless.

For now Digg is getting the thumbs up from me, though I would love to see Digg move away from the website. What I mean is tools and applications which mean I don't need to log into digg.com to do everything. At the moment you can get RSS feeds for pretty much anything in Digg, but it would be great to see APIs for posting and digging news. Maybe directly in the RSS reader via a RESTful URL, like how a trackback ping works, so you can see what's being dugg and digg it yourself by simply clicking a link at the bottom of the entry.
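To sketch the idea: this is purely hypothetical, since Digg has no posting or digging API today. The endpoint below is imaginary; the point is just that a trackback-style RESTful URL would let an aggregator append a one-click "digg this" link to each entry.

```python
# Hypothetical sketch only -- the endpoint here is imaginary, not a real
# Digg API. It shows how an aggregator could build a trackback-style
# one-click digg link for each RSS entry.
from urllib.parse import urlencode

def digg_link(story_url, user):
    """Build a hypothetical one-click digg URL for an RSS entry."""
    base = "http://digg.com/api/digg"  # imaginary endpoint, not a real API
    return base + "?" + urlencode({"url": story_url, "user": user})

print(digg_link("http://example.com/story", "cubicgarden"))
```

An aggregator could render that URL as a link under every item, exactly like trackback ping URLs appear today.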

Comments [Comments]
Trackbacks [0]

The matrix revisited, this time by the fans…

Matrix remixed

From the CCB, the editors of The Matrix Dezionized, a special fan-made edition of The Matrix Reloaded and The Matrix Revolutions:

We got the idea when we finally watched Matrix Revolutions and couldn't believe how bad it was: pathetic, without purpose. The most important thing is to realize what is wrong, why this movie is not working, what makes it so bad. We decided that the complete plot strand about Zion and Zion's battle against the machines was distracting from the main Matrix plot rather than improving it in any way. So we edited both DVDs, took out plenty of scenes wherever needed, and made a new edition about 90 minutes shorter (the length of an entire movie, yes), combining them into one final Matrix chapter.
What you get is a complete story, the story of Neo and his being the One. Plenty of people have already watched this and the feedback was nothing but great. Still we had to redo the thing over and over again, just to get rid of possible flaws, bad cuts or whatever seemed unnecessary to us. Of course you can have a different opinion. And if you do and still like the idea of a better Matrix 2 movie, make your own edition and let's watch it. Choice is the problem of the Matrix universe, but choice is the thing that means freedom for us. We all like to be able to choose between editions and switch to the one we like best, no matter who has created it. This is not meant to take any credit from the great Wachowski brothers; we admire your work. We are not interested in credit at all and absolutely not in making profit or money. Our interest is to improve an already existing work of art. See for yourself, decide for yourself.

Watch the trailer here (encoded using Xvid)

I'm currently watching another remix titled The Matrix Regenerated. The Matrix Regenerated has an alternative ending and seems to be a little more subtle with the cuts. There is a point between parts 2 and 3 which has the cliffhanger about Neo; well, they did some very good work to change that and make it a smooth-flowing section. Generally I would have liked to have seen a better-quality 2-CD version, as some scenes are a little pixelated due to the low bitrate. The sound is also plain stereo, which sucks because the Matrix was made for an AC3 surround sound system. But I imagine editing in AC3 is still really difficult. I guess these people, like myself, already bought the DVDs and copied them to their hard drives for editing; it really makes me want to do the same for a couple of films I own – Donnie Darko and The Matrix included.

Wired did a shamefully small piece about the greatest graffiti artist I have ever seen – Banksy. Just as a side point, I grew up in Bristol, and that means living with and seeing Banksy anywhere and everywhere. I was flicking through his latest outdoor collections and came across this one, along with some text which I've copied out below.

Rat pouring away toxic sewage

Imagine a city where graffiti wasn't illegal, a city where everybody could draw wherever they liked. Where every street was awash with a million colours and little phrases. Where standing at a bus stop was never boring. A city that felt like a living breathing thing which belonged to everybody, not just estate agents and barons of big business. Imagine a city like that and stop leaning against the wall – it's wet.

If you're not quite getting the connection between Banksy and the Matrix remixes, you're missing out on remix culture. Not until now have I really thought about Banksy being the perfect example of remix culture. His help pages are an inspiration to all those involved in remix culture.

  • Think from outside the box – Be highly creative.
  • Collapse the box and take a fucking sharp knife to it – If the box is too restrictive, take a fucking sharp knife to it and build your own. Creative Commons and BitTorrent are great examples of this in action.
  • Leave the house before you find something worth staying in for – Don't sit on your arse and simply consume. The internet was always meant to be a read/write medium; take advantage of that fact. Why can't this read/write ability apply to other mediums?
  • Remember crime against property is not real crime. People look at an oil painting and admire the use of brushstrokes to convey meaning. People look at a graffiti painting and admire the use of a drainpipe to gain access – There are lots of parallels between graffiti artists and internet remixers. It's a crime, yes, but they're not really villains.
  • The time of getting fame for your name on its own is over. Artwork that is only about wanting to be famous will never make you famous. Any fame is a by-product of making something that means something. You don't go to a restaurant and order a meal because you want to have a shit – This is very much the net too. Fame is a by-product of your actions online. You are no one till you start leaving a digital footprint.



Makoto Shinkai somewhat in Conversation

Blurry shot of the conversation

So yesterday I did make it to the NFT for Makoto Shinkai in Conversation, which kicked off a short season of anime at the NFT. To start off we were treated to She and Her Cat, followed by the insanely popular Voices of a Distant Star. The nicely air-conditioned NFT2 cinema was perfectly dark, specially when compared to the Filmworks cinemas, where half the lights were on throughout the film (highly distracting). After which the somewhat-conversation started. There was tons of clapping while Makoto Shinkai walked down the cinema followed by his translator. Well, it seemed that way at first. See, the translator was very tired and didn't want to translate slightly difficult questions; she was struggling with questions about what anime Makoto is looking forward to this year! Then I asked the tricky question about what Makoto felt about remixes, fansubs and re-versions of anime for other audiences (such as the UK one). We must have spent about 10 minutes on it. In the end Makoto gave an interesting answer. First, he was not totally aware of alternative versions of his films, but he believes in building on top of peers' work, and taking other people's anime films apart is fine. When you post it to the net for others to learn from and build upon, you're on tricky ground. He somewhat supported this, but followed with a comment that his publishers and producers would not support such a movement.

Other questions included the usual: what influences you, did you really make the anime yourself, what applications do you use? Adobe Photoshop and After Effects. A Japanese lady asked in Japanese how old he was and did he have a girlfriend? Makoto covered his mouth when he replied out of shyness, but at only 32 he still looks 24. After the questions and answers, Makoto signed tickets and DVDs (got mine done) outside in the lobby. Generally it was a good event let down by the translator, which I understand is a bloody difficult job, but it's what you do.


Please tell them to give it a rest…

Give it a rest please!

So I went and watched The Skeleton Key with Sarah on Wednesday night at the Filmworks, and walked past this lit-up poster. It's one thing to warn people not to record the movie, but quite another to start throwing together the words terrorist and pirate. I won't even get started on the simple words which have been twisted and abused recently (the last 20 years). All I can say is, this is a seriously worrying trend. Without going into a huge post about the perception of piracy in this day and age, I'll stop here and repeat what Sarah said. In her own words: What idiot would pay money for a DVD when they could download it for free? (big smile) You've got to love a little humour on such a negative subject.


Saxon 8.5 with collection()

Via Cafeconleche some really good news about the new version of Saxon.

Michael Kay has released version 8.5 of Saxon, his XSLT 2.0 and XQuery processor. Saxon 8.5 is published in two versions for both of which Java 1.4 or later is required. Saxon 8.5B is an open source product published under the Mozilla Public License 1.0 that “implements the 'basic' conformance level for XSLT 2.0 and XQuery.” Saxon 8.5SA is a £250.00 payware version that “allows stylesheets and queries to import an XML Schema, to validate input and output trees against a schema, and to select elements and attributes based on their schema-defined type. Saxon-SA also incorporates a free-standing XML Schema validator. In addition Saxon-SA incorporates some advanced extensions not available in the Saxon-B product. These include a try/catch capability for catching dynamic errors, improved error diagnostics, support for higher-order functions, and additional facilities in XQuery including support for grouping, advanced regular expression analysis, and formatting of dates and numbers.” Besides bug fixes, version 8.5 adds Unicode normalization and enables the collection() function to process a directory.

The collection() function is of interest because it gives XSLT the ability to read a whole directory of documents without any prior knowledge of the file names, unlike the document() function, which needs an exact URI. This was one of the useful things Cocoon does really well via its directory generator. Anyhow, more details can be found on the XMLhack blog, including exactly how to call the function.
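For a feel of what that adds, here's a rough Python analogue (not Saxon itself, just the same idea): process every XML file in a directory without naming each file up front, where document() would need an exact URI per file.

```python
# Rough Python analogue of XSLT 2.0's collection(): gather and parse all
# XML documents in a directory without knowing the filenames in advance.
import os
import xml.etree.ElementTree as ET

def load_collection(directory):
    """Parse every .xml file in a directory, like collection('dir?select=*.xml')."""
    docs = []
    for name in sorted(os.listdir(directory)):
        if name.endswith(".xml"):
            docs.append(ET.parse(os.path.join(directory, name)).getroot())
    return docs
```

In Saxon the equivalent would be something like `collection('file:///data?select=*.xml')` inside a stylesheet; the point is the stylesheet no longer needs to know the file system in advance.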


Social networks vs social tools

Honestly, I don't really go for these social networks for the sake of social networking. LinkedIn, Friendster, etc. are some I've looked at but have not been that interested in. The one I invested the most time in is Microsoft Wallop, which is still an invite-only beta. At the moment it's simply reading my RSS feed from this blog and using it as a base for my own Wallop blog. Weirdly enough, it also counts anything I link to as a resource of my own, which it then stores. Anyhow, I've recently started using O'Reilly's Connection beta, which is another social network but focused around developers, net-type people and designers. The ability to get FOAF data out seems really simple, and I have to admit the whole thing feels straightforward rather than the super-elaborate affair of Wallop or Orkut.

I do however have to wonder, once again, how long I'll stay with this one. All the social networks I use all the time tend to be more like social tools. Del.icio.us, Flickr, Audioscrobbler and, even recently, Yahoo's new killer/social tools seem to take a different view of social networking. They're much more integrated into daily life than a place to hang out. If O'Reilly's Connection were to open things up a little more, for example using other services for job listings, etc., then maybe it would stand a better chance of staying relevant. I do have a feeling that they will base it around their books and conferences in the future, but will that be enough? I mean, is it a career development site? No is my gut answer, but what room is there for Connection in the face of the other tools?

I was kind of upset that it didn't import my current FOAF profile. I mean, I would have uploaded it if they'd asked, but I would expect it to look at my website URL and read the linked FOAF file without a bother. But oh no, I need to add all the data again via multiple HTML forms, which really sucks. Isn't this the point of FOAF in the first place? I have no idea if there's the ability to see inside the network from outside. For example, here is my profile view and the FOAF data which is created from it. Please note, although I said I was based in the UK, it still puts US in my FOAF data. Maybe this is a bug in their processing? It must be, because for some reason it hasn't even got my surname – Forrester! I'm also not happy about the fact it uses many redirects to make up the FOAF data. http://connection.oreilly.com/users/profile.public.php?user_id=1671/ is not my homepage, sorry.
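This really shouldn't be hard. A minimal sketch of what I'd expect signup to do, fetch the FOAF file linked from my homepage and pull the basics out rather than making me retype everything into HTML forms. (ElementTree with namespaces only; real FOAF is RDF and would deserve a proper RDF parser.)

```python
# Minimal sketch: extract the basics from a FOAF document instead of
# asking the user to re-enter them by hand. Not full RDF handling.
import xml.etree.ElementTree as ET

FOAF = "http://xmlns.com/foaf/0.1/"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

def read_foaf_basics(foaf_xml):
    """Pull name, surname and homepage out of a FOAF document string."""
    root = ET.fromstring(foaf_xml)
    person = root.find(".//{%s}Person" % FOAF)
    homepage = person.find("{%s}homepage" % FOAF)
    return {
        "name": person.findtext("{%s}name" % FOAF),
        "surname": person.findtext("{%s}surname" % FOAF),
        "homepage": homepage.get("{%s}resource" % RDF) if homepage is not None else None,
    }
```

A service that did this would at least get my surname and homepage right from day one.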

There is no way I can use this as my FOAF profile till it's sorted out and reliable. Also note that everyone in my FOAF profile has to be registered with Connection for it to add an entry to my FOAF data. This is deeply inward-looking, and it is why I think Yahoo 360 will succeed where others failed. I mean, think about it for a quick second. Buy a social network/tool like Flickr, with millions of users and billions of pieces of user-generated data, then buy yourself onto people's desktops through Konfabulator. The possibilities are pretty endless; now I can see where the 360 nature comes from.


Great Night at Donnie Darko in the Park

The screen from our spot

We had a great evening and night at Donnie Darko in the park. Listening to the national symphony orchestra playing live versions of the Donnie Darko theme and background music was great while people found places to sit. My favourite, titled The Tangent Universe, was very enjoyable to hear live. Then at about 9:30pm there was a short interview with Richard Kelly, where he thanked us all for attending the showing and supporting the film. He also talked about the new film he's currently working on. Kelly sounded confident in proving to the critics that he is not a one-trick pony. After which the film started, with lots of clapping and cheering from the now huge audience. When I first sat down, I never expected people to be sitting all the way back at the entrance, which was some distance away from the screen, but it certainly happened. I'm so glad the version they played of Donnie Darko was the original version, not the watered-down director's cut. No offense, but that was obviously made for those who didn't quite get the original version. I highly recommend this experience if there is a Stella screening with a film you really enjoy.


Makoto Shinkai in Conversation

Makoto Shinkai in conversation



Makoto Shinkai in Conversation: Friday 12 August at the NFT, South Bank, London

Via my lovely self-sacrificing wife, who doesn't even like anime (her own words). It looks like it's Japanese animation month at the NFT. The highlight of the month is a discussion with Makoto Shinkai, who created the simply amazing Voices of a Distant Star.

Voices of a Distant Star is the moving tale of Mikako and Noboru, high-school sweethearts who try to keep their romance alive against all odds. Mikako is sent into space as part of a team to save mankind from an alien threat, leaving Noboru at home with texting as their only means of contact. The further she moves away from earth, the longer the texts take to come back, and soon a whole lifetime stands between them. Made solely on a Macintosh computer, using off-the-shelf software, this is an astonishing achievement, and we are delighted to welcome Makoto Shinkai to the NFT to talk about it and his other films, including the earlier She and Her Cat.

Like the Donnie Darko screening this weekend, I'll be attending. London certainly is a great place to live.


Kill A9 and delicious now, Technorati later?

Yahoo search!

Well, it seems Yahoo are really making moves recently. After their purchases of Konfabulator and Flickr, there is now a two-version beta of their new search service: the web 1.0 version, which saves searches and competes with A9, and the tagging web 2.0 version, which competes with Del.icio.us. Oh, and you had better add Podcast Alley and the other podcast directories to the Yahoo! hit list now they have an audio search service. And how could I forget the Yahoo ad service, which is aimed at the small website and blogger market. No need to add Google to the Yahoo! hit list; they have been on there for quite some time. I'm sure a shopping API would really land some damage on Froogle.


The uphill battle of embracing new media?

This is so weird. I was reading through Ben Metcalfe's massive rant about the BBC News away day at the same time as hearing the breaking news about the Toronto air crash, followed by the BBC's coverage. I started thinking how different Ben's experience of his away day was compared to my World Service away day (which, to be fair, was within a new media team). But slowly this entry turned into a realization that his observations are actually not that far off from my own. Ben sums up with this, which I'm in two minds about.

So life goes on. Us ‘new media’ folk continue to push the boundaries and the ‘old media’ folk continue to prop up the established broadcast mediums. On the outside we try and look like we’re all connected and know what each other are doing. And most of the time it works – we appear to be a progressive organisation working together in harmony. Inside, people like myself are desperately trying to pull the old guard, kicking and screaming, into the 21st Century.

The rest of my entry…

At which point I was effectively ostracised by the group as to these people it all appeared too radical, too unfamiliar (and probably too scary).

In my own experience, few ideas are too radical, and if something is unfamiliar, people will ask further questions until it relates to something they have experience of. Even in future brainstorming sessions with language services, which tend to be radio focused, they really push the ideas out there and demand more experiences, like how they use the internet.

However, it was clear how threatened most of them felt. Here was a medium they barely understood, with behaviours and opportunities they had no comprehension of, being communicated to them by someone who, for a few of them, had lived for fewer years than they had worked at the BBC.

I have not seen much of this on my away day, but I know exactly what Ben means in other aspects of my work life. Miles always said the great divide going into the near future is not those with or without net access; it's those who get it and those who don't. Honestly, it's worrying, because those who don't get it are really holding back the ideas of those who do. I won't go into details, but just recently I had a long discussion about tagging vs categorisation. I'm fine with having such a discussion, but you need to understand, or at least have tried, both sides of the coin to really get it. So generally tagging will never be taken seriously by the old media people, because they don't get Flickr, don't get social software, emergence, etc. It makes things really difficult when suggesting new ideas and ways forward which really could benefit our audience.

It is quite disapointing just how much they don’t ‘get it’; that they assume that only “professional old school news people” can come up with these ideas and as such further assume no one else outside of a news background might have already thought of such an idea.

I get this all the time, but with a different take. During 7/7 (the London bombs) I checked out Flickr, Google News, Yahoo News, the BBC, Wikinews and Technorati. Most of those sources are outside of the professional old-school news media. Some of the people who say they get it [Type 3 if we're going by Ben's observations] actually don't ever consider looking anywhere different. They claim to be forward thinking but turn their back on new media when push comes to shove. Ben uses the word embrace, and I really think this is key. It's like applying rock and roll values to dance music: there may be room for overlap, but you need to embrace it to truly understand it – to get it!

What’s dangerous is that they tend to want to apply their old media values to it rather than embrace the already established culture of openness, freedom of expression and equality that is so much more apparent on the Internet than in the traditional broadcast industry. (This is, of course, nothing new as we’ve seen this elsewhere – for example the music and film industry taking on p2p)

Exactly… The question is, where do we go from here? The example of someone moving around to get experience of other professions is a good way forward, and I would suggest that you do not need to leave your job to do so. I'm sure I'm not revealing any World Service secrets, but the attachment scheme basically allows people to move around the World Service without forsaking their jobs. I'm sure the attachment scheme isn't unique to the World Service, but for it to work there needs to be an embracement. Luckily there seems to be a lot more of that from where I'm sitting.


The perfect desktop aggregator

It feels like I'm on this never-ending quest to find the perfect aggregator. I cannot even remember how many I have tested and played with on my computers.

BlogBridge
RSSOwl
GreatNews
FireANT
BlogMatrix Sparks!
BlogMatrix Jaeger
FeedReader
AmphetaDesk
Flock
SharpReader
NewsGator
FeedDemon
Straw
NewsMonster
Akregator
NetNewsWire
and more which I can't remember right now…

None of them quite have all the features needed to make me stick.

Up till recently I was using BlogMatrix Jaeger, but the lack of support from the authors has me worried, plus there are some really silly bugs which are very annoying. Let's not go there right now. Before that I was using RSSOwl, which was good even back in the older versions, but its lack of synchronization drove me spare. There's nothing worse than reading through a ton of interesting news on one client, then opening another and finding all of it highlighted as unread again! Well, there is, but trust me, it becomes a pain after a while, switching back and forth between different machines.

Just recently I tried BlogBridge, which is OK, but resource hungry and not all that useful with 225 RSS feeds split into about 11 categories. Geez, I'm not even close to Robert Scoble and others, who have over 400 RSS feeds which need to be sorted, filtered and aggregated. And honestly, I have not even started with my search feeds from Yahoo, PubSub and Blogdigger. So I expect the perfect aggregator will need to support almost 300 feeds without breaking into a sweat when I go for a search.

Attachment support

Some other things are needed in the perfect aggregator. Podcasting support: it needs to be as flexible as Jaeger, where it doesn't always need the enclosure tag to get stuff. And when I say stuff I mean not just MP3s but videos like on Channel 9 etc. BitTorrent support would be cool but not essential; at least passing the BitTorrent files on to Azureus would be very useful.

Currently I'm trying FireANT to deal with all my attachment support, but it doesn't do a very good job with anything else. It actually kept crashing when loading my OPML. Yes, I could just input the feeds which support podcasting, but what happens when someone I usually read starts podcasting? I don't fancy having to copy and paste the URL just for a couple of entries. Jaeger works well because it highlights that an RSS feed also has attachments (which can be enclosures or simply links to rich media in the post). This was great, for example, when EPIC 2015 came out: someone I read regularly did a review of the changes between EPIC 2014 and EPIC 2015 and linked directly to the QuickTime version. Jaeger automatically downloaded it because I have an option set to download everything without asking me, so there was no need to fetch the movie as it had already been done. This doesn't always work so well, however; I'm subscribed to the TED feed, which usually has a lot of PDFs attached with badly thought-out file names, so I end up deleting a load of PDFs every week.

Talking of cleaning up files, the perfect aggregator will need a user-defined time limit like in Jaeger. This basically removes all downloaded files unless they're marked (maybe tagged, which I'll go into later – the idea being that if you tag it, you care enough to keep it). This stops the hard drive being filled up with junk and makes synchronization onto a mobile/portable audio device a lot easier. At the moment I have Jaeger deleting stuff after 2 weeks, which works well for me, but you can define anything from never to every day. I expect a smarter way would be to have the time limit but also prioritise files by tagging, categories, read/unread status or even file size. I mean, seriously, if I haven't read or even glanced at the entry, will I be interested in the attachment? I would say no.
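The housekeeping logic above can be sketched in a few lines. This is my own guess at how Jaeger-style cleanup works (names and defaults are made up): delete downloaded enclosures older than a user-defined limit, skipping anything marked or tagged as a keeper.

```python
# A sketch of time-limit housekeeping: remove downloaded enclosures older
# than max_age_days unless they are in the "keep" set (tagged items).
import os
import time

def clean_downloads(directory, max_age_days=14, keep=()):
    """Remove files older than max_age_days, skipping names listed in keep."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if name in keep or not os.path.isfile(path):
            continue
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Prioritising by read/unread status or file size would just mean feeding a different `keep` set into the same routine.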
On this same thread, I would like to see more TVRSS features in the perfect aggregator. The plugin for Azureus is good for downloading media content, but I think that would be better done in an aggregator, with the plugin simply interfacing with it, using its output to decide which files to download, or picking up torrent files from a defined directory. So basically the perfect aggregator would need a regular expressions engine. Or hey, why not just use the search and smart search (almost playlist) feature, which then also generates RSS of the output that the TVRSS plugin could use? So, for example, if Kevin Rose releases another Systm or thebroken, I would have that feed set to automatically download the media or torrent file (which would be passed on to Azureus). If I'm using one of the many RSS feeds with torrent media links as items, I simply run a regex or simple search on it, pop that into a smart search folder and tell the aggregator to give me RSS of whatever's in that folder. Azureus can do the rest.
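The smart-search core of that idea is tiny. A sketch (the item shape and show names here are illustrative): run a regex over incoming feed items and collect the torrent enclosures that match, ready to hand to Azureus, for example by dropping them into a directory the TVRSS plugin watches.

```python
# Sketch of a smart-search folder: regex over item titles, returning the
# matching torrent enclosure URLs for a downloader to pick up.
import re

def smart_search(items, pattern):
    """Return enclosure URLs of items whose title matches the pattern."""
    rx = re.compile(pattern, re.IGNORECASE)
    return [item["enclosure"] for item in items if rx.search(item["title"])]

# Illustrative feed items, not a real feed:
episodes = [
    {"title": "Systm episode 4", "enclosure": "http://example.com/systm4.torrent"},
    {"title": "Random show", "enclosure": "http://example.com/other.torrent"},
]
matches = smart_search(episodes, r"systm|thebroken")
```

Re-serialising `matches` as an RSS feed is then all the TVRSS plugin would need.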


RSS in and RSS out

Another thing which I rarely see is RSS output. Now this may sound odd, but I really want to be able to subscribe to the output of an aggregated source. For example, I have an RSS screensaver on my machines; rather than it going and pulling the exact same feeds once again, wouldn't it be great if it pulled RSS from the perfect aggregator? Yeah, starts to make sense, right? Also, there are other applications, like widgets, which I would rather have pull from my perfect aggregator than from the web again. Currently I have a Jabber bot which tells me what's new on Slashdot; that's good, but a little widget which just runs the latest headline in the corner of my screen would be highly useful. So generally, a local RSS aggregator which also serves XHTML and RSS would be ideal. I have found ways to make Jaeger do this, but you can only have one or the other, and it only serves to the local machine, aka no one on the same network can access my aggregator. I know AmphetaDesk does do this, but it's not very usable and looks kind of crap to my mind. Maybe I should revisit it, because it's been a while now, and I do believe it supports themes or some kind of skinning.
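A toy sketch of the "RSS out" idea (port and item values are placeholders): the aggregator renders what it has already pulled back out as RSS and serves it over HTTP, so a screensaver or widget reads from the aggregator instead of hitting every source feed again.

```python
# Toy "RSS out" server: re-serve already-aggregated items as RSS 2.0 so
# local apps (screensavers, widgets) read from here, not the source feeds.
from http.server import BaseHTTPRequestHandler, HTTPServer
from xml.sax.saxutils import escape

def build_rss(title, items):
    """Render already-aggregated (title, link) pairs back out as RSS 2.0."""
    entries = "".join(
        "<item><title>%s</title><link>%s</link></item>" % (escape(t), escape(l))
        for t, l in items
    )
    return ('<?xml version="1.0"?><rss version="2.0"><channel>'
            "<title>%s</title>%s</channel></rss>" % (escape(title), entries))

class FeedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = build_rss("My aggregator",
                         [("Example item", "http://example.com/")]).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/rss+xml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# Bind to 0.0.0.0 so other machines on the network can read the feed too:
# HTTPServer(("0.0.0.0", 8901), FeedHandler).serve_forever()
```

Binding to the network address rather than localhost is exactly the bit Jaeger currently won't do.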
But serving feeds back out also raises the speed issue.

Operating speed and cross platform capabilities

Blogbridge was so laggy when I had 225 feeds in it. RSS Owl copes well for a java application. While Jaeger and my new favorate Great news have no problems with 225 feeds and more. There seems to be a whole rash of .net based aggregators which work ok, but tend to be quite slow too. I do not know why this is, maybe theres some odd configuration on my laptop and desktop machine? It doesn't run as slow as some java aggreagtors but when loading 225 feeds in, its not far off. The compiled windows versions obvioudly run very quick, weirdly having an internal parser seems to be quicker still.
I have another odd requirement which I do not believe the perfect aggregator should service. I own a HP Ipaq (pocketpc) and I do read news on it quite a lot. Pocketpc RSS readers are slowly getting much better but are running into the same problems as there desktop counter-parts. But I don't believe the perfect aggregator should have a pocketpc version. Nono, that would be too much, plus through the steps talked about in this entry, any decent RSS reader on a pocketpc, palm, symbian device or smartphone device should beable to suck down an opml file using synchronization. Its not ideal because the synchronization would work both ways allowing you to also upload read and unread items along with downloading them. This feature seems to be a couple versions off pocketrss at least. I did notice one of its rivals now supports bloglines synchronization which is what Great news also supports well – more on this later.
But generally the perfect aggregator would support all the main platforms and be open source, so at least people can port it to platforms like Solaris if needed. When I say main platforms I mean Windows, Linux and OS X if you're really pushing me. I like the idea of BeOS, but how many people still use it? If it's open then it's at least somewhat portable.

While I'm talking about the nitty-gritty of the application, it should also support all languages in all types of encoding. Most aggregators fall back to IE or a Mozilla core for display, which is OK, but this would be best served as a choice by the user. On the Mac, obviously Safari would be a choice too. Ben Metcalfe recently linked to a stress test for RSS readers, checking whether they were susceptible to common dirty HTML tricks. This includes things like iframes, remote exploits via JavaScript and other nasty things. I did a test with Jaeger and it passed all the tests with no problem, because whenever there was any HTML-type stuff it would pass that on to Firefox, which has all the ad-blocking, popup-checking, etc. features I need and have defined already. I finally tried this on Great News and it crashed – I submitted a bug report, but I'm worried because Great News uses IE for its internal browsing and there seems to be no way to change it to the Firefox core. So I assume a lot of those dirty HTML tricks will work in Great News? Unless Great News doesn't pass any of that crap on to the IE core renderer?
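The sort of defensive filtering an aggregator should do before handing entries to any embedded browser can be sketched quite simply. This is only an illustration of the idea – dropping dangerous tags and event attributes – not how Jaeger or Great News actually handle it:

```python
from html.parser import HTMLParser

BLOCKED = {"script", "iframe", "object", "embed", "style"}

class FeedSanitizer(HTMLParser):
    """Strip dangerous tags and on* event attributes from feed HTML."""
    def __init__(self):
        super().__init__()
        self.out = []
        self.skip = 0  # depth inside a blocked tag

    def handle_starttag(self, tag, attrs):
        if tag in BLOCKED:
            self.skip += 1
            return
        if self.skip:
            return
        # drop onclick/onload/... handlers and javascript: URLs
        safe = [(k, v or "") for k, v in attrs
                if not k.startswith("on")
                and not (v or "").lower().startswith("javascript:")]
        self.out.append("<%s%s>" % (tag, "".join(' %s="%s"' % a for a in safe)))

    def handle_endtag(self, tag):
        if tag in BLOCKED:
            self.skip = max(0, self.skip - 1)
        elif not self.skip:
            self.out.append("</%s>" % tag)

    def handle_data(self, data):
        if not self.skip:
            self.out.append(data)

def sanitize(html):
    parser = FeedSanitizer()
    parser.feed(html)
    return "".join(parser.out)
```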
This may seem like the overboard comments of an open source fan, but it is important if, like me, you haven't really set up IE with your own preferences because you're using something else such as Firefox. Security is a large problem and will get worse if the perfect aggregator also becomes a good way to exploit systems. RSS spam is only the start of things.

Tagging and Categorization

Talking of Great News, I'm really starting to love the ability to tag and categorise RSS items. Jaeger has this but it's not quite the same. Both also support a feature which I can only call search folders: basically you define a set search and the aggregator will mark or highlight new items which match the criteria. This is essential and very useful. But tagging is great for reminding yourself of useful things which you may not have searched for. For example, I have set up a tag in Great News which is simply called blogthis. When I have time during lunch, I quickly run through the entries tagged blogthis and sometimes blog them. I guess the perfect aggregator would have compatibility with del.icio.us, Technorati, etc. so I could use the same tags to categorise entries and maybe store them in del.icio.us using its public APIs. I personally think we are only scratching the surface with tagging and categories, and maybe something combining Thunderbird's straightforward search and smart search folders with the tagging ability of Technorati and del.icio.us would be useful for making sense of content in an aggregator.
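A search folder is really just a saved query run over incoming items, and tagging on match is a small step further. A toy sketch – the folder names and matching rules here are made up for illustration:

```python
import re

# Hypothetical search folders: a name mapped to a saved query,
# mimicking the mark/highlight-on-match feature described above.
SEARCH_FOLDERS = {
    "blogthis": re.compile(r"svg|creative archive", re.I),
    "security": re.compile(r"exploit|vulnerab", re.I),
}

def auto_tag(item):
    """Return the set of search-folder tags an incoming item matches."""
    text = item.get("title", "") + " " + item.get("description", "")
    return {tag for tag, pattern in SEARCH_FOLDERS.items()
            if pattern.search(text)}
```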

Synchronization

Ah, my new favourite talking point when it comes to RSS aggregators. Most RSS readers only read RSS and don't allow any RSS out; likewise, most aggregators let you import OPML but not export it with equal ease. But let's get the basics right before considering advanced features.
The perfect aggregator should support at least:
OPML input via file and URL.
OPML output via file.
On the next step up in perfection:
OPML input and output via FTP.
If you next consider the usage of iDisk and .Mac drives (better known as WebDAV/DeltaV), the natural next step up would be:
OPML input and output via WebDAV (insecure and secure – HTTPS).
Honestly, I do not know of any which have WebDAV support, but it will make sense with more services like iDisk and .Mac drives. Some, like Jaeger, support FTP for synchronization and storage.
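The basic OPML round trip is only a few lines in something like Python. A sketch, handling just the xmlUrl attribute that subscription lists rely on – the helper names are mine:

```python
import xml.etree.ElementTree as ET

def opml_import(opml_text):
    """Pull the feed URLs out of an OPML subscription list."""
    root = ET.fromstring(opml_text)
    return [o.get("xmlUrl") for o in root.iter("outline") if o.get("xmlUrl")]

def opml_export(feeds, title="subscriptions"):
    """Write feeds back out as OPML, ready to save to file, FTP or WebDAV."""
    root = ET.Element("opml", version="1.1")
    ET.SubElement(ET.SubElement(root, "head"), "title").text = title
    body = ET.SubElement(root, "body")
    for url in feeds:
        ET.SubElement(body, "outline", type="rss", xmlUrl=url)
    return ET.tostring(root, encoding="unicode")
```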
The final and preferred way of synchronizing is via web service. Generally that means using something else to send messages back and forth. There are growing proposals in this area; one I like is attention.xml, which has backing from Technorati. At this moment most people are using some modified OPML to do synchronization, which works but is so ad hoc it's untrue. I have never seen an OPML file carry modified and unread information across from one application to another; they tend to only work within their own application domain, which sucks. The most used web-service-type synchronization is done using Bloglines' open API service. For example, in Great News all I needed to do was enter the email address and password I use to log on to Bloglines, and it downloaded not only my subscriptions but which items are read and unread. I do not believe the API supports locked items or any other states, but read and unread is usually all you need. When you're done with a feed in Great News, it sends the changes back to Bloglines. Perfect, you would say, but what if I do not want to use Bloglines? Well, at the moment you could adopt the NewsGator package, which is an RSS aggregator plus a web service which supports syncing between the two of them, but outside of that you're kind of stuck.
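To give a flavour of how simple the Bloglines side is, here's a sketch of building the authenticated subscription-list request in Python. I'm going from memory on the listsubs endpoint and the basic-auth scheme, so treat both as assumptions rather than gospel:

```python
import base64
import urllib.request

def bloglines_subs_request(email, password):
    """Build an authenticated request for the Bloglines listsubs call.

    The endpoint URL and HTTP basic-auth scheme here are as I
    understand the Bloglines sync API; treat them as assumptions.
    """
    req = urllib.request.Request("http://rpc.bloglines.com/listsubs")
    token = base64.b64encode(("%s:%s" % (email, password)).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    return req

# urllib.request.urlopen(req).read() would then return the
# subscriptions as OPML, which the aggregator syncs against.
```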

This may be an area where Yahoo could really blow away the market, allowing synchronization with your My Yahoo subscribed feeds and maybe further with the Yahoo! 360 service. But it strikes me that My Yahoo isn't a Bloglines and isn't meant to be either. There are few online aggregators as popular and as powerful as Bloglines at this moment, so unless My MSN or My Yahoo step up and give real aggregation features, Bloglines will be the one and only service to sync with. Maybe in the future others will come along and even adopt the Bloglines API in an attempt to shortcut into the synchronization market?
So in the case of the perfect aggregator, Bloglines synchronization is key and needs to be there, above syncing over OPML.

Other features

Here's a list of key features which are highly recommended:
Automatically generated search feeds and PubSub support (also known as watch feeds). PocketRSS has this nice feature which allows you to enter a search term, and it automatically makes a new RSS search feed and subscribes to it. Practically this is usually a slight URL change using a RESTful API, but it's dead handy when doing research.
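Under the hood a watch feed is usually just a templated URL. A sketch with a placeholder service, since the real endpoints vary and the base URL below is not a real one:

```python
import urllib.parse

# Hypothetical RESTful search-feed patterns; real services differ,
# so these base URLs are placeholders, not actual endpoints.
SEARCH_FEEDS = {
    "blogsearch": "http://search.example.com/rss?q={query}",
}

def search_feed_url(service, term):
    """Turn a search term into a subscribable RSS search-feed URL."""
    return SEARCH_FEEDS[service].format(query=urllib.parse.quote(term))
```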
The ability to filter a large feed (cite the case of torrents and regex). Filtering on top of a search RSS feed (like those of Yahoo, Technorati and Blogdigger) is essential, because not everything you get through is of interest. I would almost go as far as to say search and torrent feeds should be treated slightly differently: they tend to update quickly and contain many items. It would be good to automatically remove old items, or archive them away like the Azureus RSS plugin does.
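The torrent-feed case really is just a user-supplied regex run over item titles; a minimal sketch:

```python
import re

def filter_feed(items, pattern):
    """Keep only feed items whose titles match a user-supplied regex."""
    wanted = re.compile(pattern, re.I)
    return [item for item in items if wanted.search(item["title"])]
```

For a torrent feed you might filter with something like r"S\d\dE\d\d" to keep only episode releases and drop the rest.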
Comment support. The ability to automatically see the comments on a blog entry, or at least click through to see them, is handy and shouldn't go amiss in the perfect aggregator.
Blog this support. This is usually just a link which sends the permalink URL and maybe the content to an external blogging application. I don't think this is quite standardized yet, but it shouldn't be overlooked in the perfect aggregator. I would also suggest Email this be included in this same space.
Search this item. The ability to search all related, or at least linking, blogs for a given item is vital when doing research or gauging views on it. Blogdigger and Technorati do this well, allowing you to search for all blogs which link to that item or search via related terms used in it. Having the feedback inside the aggregator is tricky, but having the feature in the perfect aggregator and sending the results to a browser is good enough. Like the search feed, this is usually just a RESTful method call with the correct URL.
All types of autodiscovery supported. It should support drag and drop of both the RSS link and the page link, which should cause the aggregator to search for all alternate link feeds in the head of the HTML page. It should also support feed:// and that weird USM (Universal Subscribe Mechanism) feature.
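Autodiscovery itself is just scanning the page head for alternate links; a sketch of the core of it:

```python
from html.parser import HTMLParser

class FeedDiscoverer(HTMLParser):
    """Collect <link rel="alternate"> feed URLs from an HTML page."""
    FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link" and a.get("rel") == "alternate"
                and a.get("type") in self.FEED_TYPES and a.get("href")):
            self.feeds.append(a["href"])

def discover_feeds(html):
    parser = FeedDiscoverer()
    parser.feed(html)
    return parser.feeds
```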
For display, as mentioned before, it should support RSS output but also some simple templating system for displaying the HTML to the local browser or network client. CSS should be used for style, while something like VM (Velocity), XSLT or Groovy should be used for layout; I'm favouring XSLT. It should be trivial to add different flavors or templates depending on a simple URL query string. Yep, this borrows from Blojsom's flavor idea, but it works well.
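The flavor idea can be illustrated with even the most naive templating; a toy sketch where the flavor names and template strings are mine, not Blojsom's:

```python
from string import Template

# Hypothetical "flavors": one template per output style, selected by
# a query string such as ?flavor=plain, borrowing Blojsom's idea.
FLAVORS = {
    "plain": Template('<li><a href="$link">$title</a></li>'),
    "full": Template('<div class="item"><h2>$title</h2><p>$summary</p></div>'),
}

def render(items, flavor="plain"):
    """Render feed items to HTML using the chosen flavor template."""
    tmpl = FLAVORS[flavor]
    return "\n".join(tmpl.safe_substitute(item) for item in items)
```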
Social aggregation. The idea of seeing what your friends tag and recommend is not exactly new, but it would be great to see their recommendations directly in the aggregator. Ampheterate was one way of doing this, but it requires too much commitment from the user. Attention.xml has ways to deal with this which are well worth checking out.
Microformat and RSS extension support via a community. I was originally thinking just RSS extension and microformat support, but I'm thinking that if the core RSS parser engine simply outputs what it gets in, the template engine could deal with the extra RSS support. However, this is not strictly true. For example, Amazon's A9 OpenSearch is of little use after it's passed the parser, so it would be good for the parser to understand those attributes and elements, and it could be argued that Microsoft's list extensions should also be handled at the parser. Still, most of the grunt work would be done at the display/template engine. I'm not sure how easy it would be to patch the parser, but having templates which can be shared around a community certainly makes a lot of sense. Saying that, Flock, a decent but underdeveloped aggregator, uses XSLT for all its parsing, so it's possible I guess.
Aggregator engine separated from the front end. Memory is a worry no matter what operating system you may be using. It would be really nice to have the actual RSS catcher/fetcher and parser separate from the output and GUI. So, for example, you could set up all your feeds and just let the engine run every couple of hours, collecting a store of RSS and attachments. You could then run the GUI to search or view the feeds, add or remove feeds, or do anything else like this. I know there's one application which actually does this quite well already: Blogwave is not your typical RSS aggregator; it has a front end, but that's mainly to set up the actions and the like. Once it's set up it will just download RSS and run a set process on it if specified. It's good at what it does, and I believe it is possible to get another RSS reader/aggregator to read the downloaded feeds, but I have not tried this combination yet. I assume it's possible to do more with Blogwave than I'm suggesting, but why bother? Let something like Jaeger, Great News or Feedreader do the GUI stuff and just build links and remote calls to Blogwave.
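The engine half of that split is tiny: a headless pass that downloads each feed and caches it on disk, leaving the GUI (or a widget, or a screensaver) to read the cache. A sketch – the store layout and file-naming scheme are my own choices, not Blogwave's:

```python
import hashlib
import pathlib
import urllib.request

STORE = pathlib.Path("feedstore")

def cache_path(url):
    """Deterministic on-disk name for a feed URL."""
    return STORE / (hashlib.md5(url.encode()).hexdigest() + ".xml")

def fetch_and_store(feed_urls):
    """Headless engine pass: download each feed and cache the raw XML.

    Any front end can then read from STORE without touching the
    network at all.
    """
    STORE.mkdir(exist_ok=True)
    for url in feed_urls:
        try:
            data = urllib.request.urlopen(url, timeout=30).read()
        except OSError:
            continue  # skip unreachable feeds; try again next run
        cache_path(url).write_bytes(data)
```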
In the same vein, I would like to see more RSS screensavers, widgets, etc. which make use of the RSS output. I currently also read RSS news via Bloglines on my Xbox using Xbox Media Center's Bloglines Python plugin. It's pretty cool and allows for syncing back and forth, as it's based on the Bloglines API, but is it really necessary for each mini application to make calls to Bloglines each time? As mentioned, I have a Bloglines script on the Xbox, Great News installed on two machines so far and an RSS screensaver on two machines, and I haven't even started on the widgets yet. Each one of those is making calls to Bloglines and to individual blogs and news sites. Wouldn't it be so much more efficient if they all made calls to a proxy of some kind instead? I expect Blogwave isn't quite enough for this, but maybe it is?
One-, two- and three-pane support. Yeah, this is always a conversational firestarter. I like Jaeger's one-pane solution, but I've grown up on three-pane RSS readers with Feedreader, RSSOwl and now Great News. Sparks! has the ability to switch to a two-pane view if needed. So I pose the question: why not support all three types of RSS reading?
So you can use your browser for RSS reading using the templating engine, à la Jaeger and Amphetadesk.
Two- and three-pane reading via the browser core, à la SharpReader, Feedreader, FeedDemon, etc. If anyone knows any other types of mass RSS reading outside of these three methods, please let me know.
Automatically mark items as read when selected. Unbelievably, I have encountered aggregators which do not support this feature, so you have to manually mark items as read or wait anything up to 10 seconds before they count as read. A standard feature which should be included.
Alternative grouping. This reflects the tagging and categorization section, but the thrust of this feature is to get away from hierarchical folders towards categorised RSS feeds. Folders work OK, but honestly they get annoying to manage. Tagging the feeds means they can come together in groups whenever needed. This should apply not only to the feed but also to the individual items.
User-selectable views or stylesheets. Great News does this so well using CSS that I can't praise the feature enough; it's hard to go back to anything different now.
User-configurable keyboard navigation. This should allow anyone to set up their own key combinations for going to the next/previous unread/read items.

Some may ask why I don't just build this magic aggregator which supports all these great features. Well, honestly I would if I could. I was planning to build something using combinations of other aggregators out there. For example, the ability to have just an aggregator engine which simply parses RSS without a front end is kind of almost there if I use something like Blogwave, or even something more complex like Apache Cocoon. Yeah, overkill for a lot of programmers, but a step in the right direction.
The main purpose of this mini essay/long blog entry was to share my growing conclusions about RSS aggregators. I'm actually hoping that within a year most of the things I and many others have suggested will be there as standard in most RSS aggregators. I expect that, like the iTunes 4.9 integration of podcasting, Longhorn/Windows Vista will not cover all the bases well first time, but it will get better. However, there is a burning need for advanced RSS aggregation beyond the usual RSS reader. Even in the naming I have started to make the distinction between the two. RSS readers would include the XBMC Bloglines plugin, my Pocket PC reader (PocketRSS), widgets, screensavers and tons of simple but effective desktop readers. Aggregators, on the other hand, must have the ability to output content too, so I would include Amphetadesk, Flock, FeedDemon, Jaeger and Sparks! in this list. I'm also thinking aggregators are much more hackable or remixable using standard technologies such as templating, scripts and other things.

Comments [Comments]
Trackbacks [0]

Recovering from a great geek weekend

The scale of participation at Open Tech 2005

So after a couple of days I was finally able to get my notes together and email some of the people I met. Open Tech was simply great this year. The line-up was full of stars, including Ted Nelson, Jeremy Zawodny and Danny O'Brien.

Ted was entertaining as usual, but his projects, including Transliterature, have not moved on a whole lot. Generally, philosopher Ted is in my mind right about the problems with operating systems, but the way he goes about it tends to be restrictive and confusing to say the least.

Xanadu alternative views

Ben Metcalfe was in his element at the official launch of backstage.bbc.co.uk (note the lack of beta now), which kicked off well, except BBC News published the story a little too early, which spoiled it for people like myself who read their aggregator before heading to Open Tech 05. Anyhow, Ben did a great job of presenting the competition and answering all the questions, and even managed my tricky question about people from around the world using backstage.bbc.co.uk. I did want to get the point over that backstage.bbc.co.uk isn't just a developer network; it's also for designers who want to submit ideas, thoughts and even get involved.

Ben surrounded by backstagers

Jeremy Zawodny was very interesting and pointed out a couple of things.

  • The rumours about Yahoo working on a Technorati killer are true.
  • The aggregator will support microformats and RSS extensions, including some from Yahoo's rivals.
  • Yahoo will REALLY be opening up more APIs. Zawodny failed to comment, or kept very quiet, on the Konfabulator takeover.
  • Yahoo is counting RSS/Atom as a type of API, not just as a syndication format.

Yahoo! hearts BBC Creative Archive

I then stuck around for Hacking the TV Stream, where BBC R&D and BBCi staff showed off the biggest PVR (promiscuous video recorder), the Dirac codec and how to hack Freeview (DVB) broadcast streams. I didn't know how easy it was to do, and it came as a surprise that the BBC is encouraging people to do this under a backstage non-commercial-type licence. There was also some reference to UKNova in one of the presentations, which I keep meaning to send to the UKNova members. Yes, the BBC is fully aware of UKNova, and the people at Open Tech had a good laugh when it was mentioned.

Uknova slide

Some of the other highlights included Tom Reynolds, who now seems to be turning into one of those A-class British bloggers; Don Young from Amazon services, who talked about all the APIs and services Amazon is opening; Lee Bryant's collaborative archives, which triggered a whole load of thoughts about how this could and should work across languages; and the Greasemonkey presentations by Simon Willison and Rob McKinnon, who I later talked to at an Indian restaurant about a number of things including Ruby, SVG, Cocoon, Python, American politics, media and much more. I also have to say Nicola Smyth and her partner were good company for our quite geeky conversation. It's just so rare to meet someone as into SVG as myself.
Talking of which, I finally met Matt Webb, Ben Hammersley, the NTK guys and many more. It's a shame I missed the discussion on where the British EFF was. Cory Doctorow filled me in on the main points during the afternoon break, but I can't wait to see the videos of the debate. It also made its way onto Slashdot.

I still can't believe the whole event only cost 5 pounds; I would have happily paid 20-plus pounds for such an event. Talking of which, geekdinner prices are getting really silly now: 20 pounds for some cheap nibbles, loud music and no drinks. Yes, the company is great, but we could all just meet up somewhere free and talk around a pub table. I'm just thankful Ben came up with the idea of popping down to Tesco and getting a ready meal beforehand. Geekdinner really needs to slum it for a bit, otherwise people will get pissed off and stop going. It's not even like the organiser is making any money; it's all going to the venue owner, and in that case I would rather spend my money elsewhere, not give it to some stuck-up Piccadilly bar where a cranberry juice costs 2 pounds something.

Overall it was an enjoyable weekend, except for the rain which soaked me while riding between Open Tech and the restaurant, which I couldn't find for over an hour! Hope to see everyone next year. As usual there are photos on Flickr and tons of talk around the blogosphere.

Comments [Comments]
Trackbacks [0]