Geek dinner with Paul Boag

Paul Boag and Marcus from Boagworld

Another successful and great Geekdinner, this time with Paul Boag from the Boagworld design podcast. This time around I decided to try a local designer doing good stuff for the community. Paul Boag fitted the bill perfectly since I've been listening to his podcast for a while now. Around 30 people showed up and enjoyed the evening with Paul. Now this may seem small compared to the 70 for Dave Shea, but it's not far off Tim O'Reilly and Molly type attendance. What I found interesting is the percentage of new faces to regulars. It always seems to stay around the same no matter who the guest is. I've not worked out if this is good or bad yet, but we'll see at the next one I guess. Something else which did change this time was the percentage of women. It went down, which is a shame, but I know quite a few women who simply couldn't make it this time.

So as usual people turned up before even I got there, which is always a surprise. Paul and Marcus turned up just after myself, which was about 18:30 ish. Between 19:20 and 19:45 the bulk of the people came, which is usual. And before you knew it the food was being handed out. This was great because usually people ask me when the food's coming and I usually have to say soon, honestly soon. But this time it was so early that it even took me by surprise; I had not even collected the money and given out sticky name tags yet. So I'm glad to say that the Polar Bear will become the home of Geekdinner unless they have it booked for other events (they have June booked out for the World Cup or something). It's big enough for 100 people and we have complete control over the music, noise and lighting. There's a complete DJ and mic setup, so it's possible to record talks and even finally broadcast the speakers to the whole room without the speaker shouting out loud and losing their voice.

But back to the night. By the time everyone had a chance to get their food and digest it, I had introduced Paul and we were away. Paul's talk was mainly a public reply to Molly's post How to sniff out a Rotten Standardista. I won't recite much of what Paul said, but it was pretty funny in parts and it was recorded by Marcus and another person (can't remember your name, sorry). I know Boagworld is planning a special podcast with the recordings they made, so I'm hoping to be linking to the podcast soon enough. But after talking about Molly's post and the fact that we look inside of the design community too much, Paul started on Web 2.0. He was convinced it all smells too much like a bubble. Before long Paul was being beaten to death with questions and comments. In the end, Paul had to give in and make it clear that something like Digg.com smells like a bubble-type site, while Flickr before it was bought was not, because they at least had a business plan. Honestly Paul was great, he really took it all on the chin and was glad to admit he might have changed his mind a little. Paul also gave a nice talk about how we and the pros are actually making it up as we go along; if we were more honest about this, things might actually be better overall. So without spilling any more tasty beans, I've got to say Paul was fantastic and Marcus was great too (I can't believe his arm never fell asleep, holding the mic for almost an hour I believe?). I actually closed my eyes at one point and seriously it sounded like the banter of a usual Boagworld podcast. Great stuff and thanks again to Paul and Marcus for making the trip up to London and taking part in a great Geekdinner.

For those who don't know, there was a special competition which included a trip to Austin, Texas for the South by Southwest Interactive conference (SXSW). Not only did the jammy winner get tickets to the conference, but he also got his hotel and flights paid for by the little bit of sponsorship which sits on Geekdinner.co.uk now. The names were pulled out of my grey hat by Paul Boag and Simon won, much to the disappointment of everyone else who entered. Kind of wish I could have entered myself really too. Simon's already offered to fly the London Geekdinner flag for us and plans to meet up with a couple of people who have already booked their tickets to go. Good luck Simon, have a great time!

I spoke to so many people during the rest of the evening, but I would like to say thanks to everyone for coming and making it a great Geekdinner. There were fewer people, but it was nice and friendly and there was plenty of food to go around for all. Oh, by the way, Jen and other vegetarians: the Polar Bear serves food on single plates, so the meat is easily avoidable. Tom asked me if it was OK for a vegan and I couldn't honestly say yes, but I believe even he was able to find something good and safe to eat. Another reason to stay at the Polar Bear, I guess? Whatever it is, there is a Flickr group I set up here and my own photos tagged geekdinnerwithboag here. Can't wait to arrange the next one now.

Oh, by the way, Simon (that lucky guy) and Rachel both have good reviews of the evening too.

Comments [Comments]
Trackbacks [0]

High Definition format wars, who cares?

Blu ray DVD

I don't believe I've put my foot down on either side of the next generation DVD war. You've got the whole HD DVD vs Blu-ray thing, and then you've got every large company coming down on one side or the other. Some of them, like Samsung, have pledged to support both in the same drive or at least the same unit. But in this world of networks, do we really need another disc format? Well, I personally don't think so. I remember the day when I got the Toshiba Tablet PC I'm typing this on now. It has no optical drive, and I always thought this was kind of scary because it assumes you're always in a networked environment or dependent on memory cards. But it's worked out for the best, because all my media is on a network drive and it's been simple to copy stuff from CD via another machine. Anyway, I found this really interesting thread on Slashdot today by WhiteWolf666.

The 3rd technology has already emerged.

H.264 on standard DVD, with the upgrade path being ANY sort of higher capacity device.

H.264 means you can do 1080p (not 1080i, but 1080 progressive) with 5.1 audio in 1 MB/sec. That's about 3.5 GB per hour. That gives you 2.5 hours of 1080p on a standard DVD disk. You can squeeze the main title in 2, and then use the remainder for all the other stuff in SD. Or, make it a two disk set. Both of these will cost FAR, FAR less than blu-ray or HD-DVD.

H.264 enables SD TV over standard broadband, NOW. Take a look at this: http://www.apple.com/macosx/cnbc/. That's technically 480p content. It's playing at 675 kbit/sec, or 84.73 KB/sec. 720p content is similarly small; you'll have no problems whatsoever fitting everything you'd want on a single title blu-ray disk onto a standard dvd if you're encoding with H.264 at 720p.

I suspect with a really smart encoder, using intelligent VBR type stuff, you can get 1080p down to an average of 800-900 KB/sec. Perhaps even less. If someone can get the standard DVD above the 3 hour of footage barrier, blu-ray/HD-DVD immediately become a niche market, at least until HDTV 2.0 comes out. Oh; and new displays, as well. But even with _today's_ setup, you can fire up Final Cut Studio, and produce a 2.5 hour feature length movie, slap it on a standard DVD in 1080p, and then put all your extras on the second disk.

H.264 enables 1080i HDTV on a standard dual layer DVD. You need a beefy processor to play it back, but various manufacturers have already produced embedded decoders. H.264 is the future of IPTV, of satellite transmission, even cable transmission. Most likely, the “upgrade” path is H.264 on standard disks, and then the elimination of disks altogether.

Why would I _EVER_ carry a pile of blu-ray disks around when I could simply walk with an iPod, or a mobile phone, or a flash disk, or some other portable media library, and wirelessly (bluetooth 4.5, or 802.11n, or whatever) “rent” a video from the blockbuster kiosk? Heck; strip out the middleman; just buy the movie from iMovie store, or Amazon's movies, or Walmart Video Online. Whatever; it doesn't matter.
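The arithmetic in that comment is easy to sanity-check. Here's a quick Python sketch; the disc capacities are the standard DVD-5/DVD-9 figures, and the comment's 2.5 hours of 1080p seems to assume a dual-layer disc:

```python
# Sanity-check of the Slashdot comment's H.264 numbers.
# At 1 MB/sec (the quoted 1080p bitrate), how many hours fit on a DVD?

MB_PER_SEC = 1.0                         # quoted H.264 1080p rate, video + 5.1 audio
GB_PER_HOUR = MB_PER_SEC * 3600 / 1000   # decimal GB, as disc capacities are quoted

DVD5 = 4.7   # single-layer DVD capacity in GB
DVD9 = 8.5   # dual-layer DVD capacity in GB

print(f"{GB_PER_HOUR:.1f} GB per hour")        # ~3.6 GB/hour, close to the quoted 3.5
print(f"{DVD5 / GB_PER_HOUR:.1f} h on DVD-5")  # ~1.3 hours on a single layer
print(f"{DVD9 / GB_PER_HOUR:.1f} h on DVD-9")  # ~2.4 hours, i.e. the quoted 2.5 on dual layer
```

So the numbers roughly hold up, provided "standard DVD" means a dual-layer pressed disc, which is what most commercial films ship on anyway.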

Someone asks afterwards where the popular content is, which reminds me of this great quote from Doc Searls' Linux Journal article titled The Home produced movie revolution.

The next era, the one in which the bulk of producers will emerge from a mass market formerly filled only with consumers, will begin when video customers begin to realize they can produce higher-definition video than what they can get over their cable and satellite connections. That will happen quickest for customers who buy 1920 x 1080 screens to take full advantage of their new 1920 x 1080 camcorders, while spending under $2000 for both.

This may seem like a dream, but I'm not so sure we're that far off now. For example, I met the woman behind the amazing and very brave Modfilms site. The creative thinking behind Modfilms is staggering, and not only that, she's fully committed to it being a success. And let's be honest, how many times have you wanted to remove parts of a film?


Blogs will not save old media alone

Blogburst

Since being alerted to Blogburst by Ben Metcalfe, I've had an RSS search filter running throughout the day. I've seen some really interesting posts filter through.

Well, first, what is Blogburst? The TechCrunch guys have pretty much got this covered, but their post has sparked a lot of conversation about their coverage of Blogburst. For example, Scott Karp over at Publishing 2.0.

Michael Arrington declares that BlogBurst Can Save Big (print) Media. To suggest that the lack of blog content is all that ails Old Media is deeply naive. Old Media needs to follow bloggers into the new content creation frontier, but that in itself will NOT solve the problem of business models.

And he's very right, it's not about simply adding blog content to an already ageing medium. Anyone who does is seriously mistaken if they think that's the end of the deal. It's about a conversation, not simply publishing. Scott moves on.

But why do publishers need BlogBurst as a middleman? Why can’t publishers hire an editor whose job it is to go out into the blogosphere and pull in the best and most relevant content, which is already easily and freely available through RSS feeds?

Agreed, but for some reason this does not happen. I can't work out why; there are already enough tools to keep track of what's going on in a given subject, and RSS can make these things more automated. But back to the point, it simply does not happen. Like Scott says, "I will give BlogBurst credit for the understanding that the blogosphere needs a human filter to extract value," and that's where I totally agree too. Blogburst is a human filter on the blogosphere, and this is a welcome new model.

The rest of Scott's post is about TechCrunch, which I'm not that interested in, but yes, this could reflect badly on Blogburst if taken out of context. Mark Evans talks about the sign-up issue for bloggers.

Another question is why would a blogger sign up unless they really, really want exposure and/or traffic. Blogburst takes a blogger's content and provides the following: “visibility and exposure”, “new readers”, “authority and credibility” and “the opportunity to take your blog to the next level” (whatever that means). The downside is there's no economic incentive for the blogger and little guarantee readers are going to visit your blog unless they click on your byline. For anyone really trying to build a brand, they should want and encourage people to visit their blogs.

Point taken; it's an interesting question which I personally don't have a complete answer for yet. Blackfriars' Marketing talks about the scale issue and how effective human filtering is at the million-plus mark.

This is a system that works great with 100 or 1,000 blogs, but collapses under its own weight with 100,000 or a million blogs. No editor or reporter is going to wade through a reading list of 1,000 entries, but that could easily happen with big categories like News and Opinion or Technology. If that happens, editors will go back to reading Memeorandum.com or TailRank.com.

One second, who said they were using Memeorandum or TailRank? I certainly don't see any signs that this is true. Anyway, moving on…

My suggestion to BlogBurst: take a page out of Web 2.0 and allow members of the newspaper community to vote feeds and stories up and down in the rankings. Otherwise, a successful BlogBurst could do just that — burst.

And I'm in total agreement, it's about collaborative filtering.
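As a sketch of what that voting idea could look like (purely hypothetical, nothing to do with BlogBurst's actual system), members of the newspaper community vote feeds up or down and the ranking simply follows the totals:

```python
from collections import defaultdict

# Hypothetical collaborative-filtering sketch: editors vote feeds
# up (+1) or down (-1), and the feed ranking follows the vote totals.
votes = defaultdict(int)

def vote(feed, delta):
    votes[feed] += delta

def ranked():
    # Highest-voted feeds first.
    return sorted(votes, key=votes.get, reverse=True)

vote("boagworld.com", +1)
vote("boagworld.com", +1)
vote("some-other-blog.example", -1)
print(ranked())  # ['boagworld.com', 'some-other-blog.example']
```

Crude, but it shows why this scales where a single human editor doesn't: the filtering work is spread across every member of the community.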

What I find most interesting is BlogBurst's powerful Publisher Workbench. It's an API between their system and the internal content management systems of the mainstream publishers. How effective it is, we shall see, but it's a good move, and being SOAP and XML (REST) means any internal development team could work with their service. I'm hoping this will cause Bloglines to release an API for their Citations service. I'm also wondering when Feedster and Blogdigger will consider collaborative filtering as another option alongside their machine filtering. Back to saving old media for the closing: old media needs to engage in the conversation if they would like to be saved as such. It's no good just dipping in and out of these conversations. The smart ones are already moving out of the slowly descending circles. Hint hint.


An XSL transformation mindset

Someone asks on Metafilter.

When you imagine XSLT transformations happening in your mind's eye, what does it look like?

It's a really good question and opens up a whole range of thinking about the differences in people's thought processes. So first Jeff talks about the question.

This is a very powerful question to ask, because ancient, procedurally oriented developers like me sometimes have trouble following the non-linear, pattern-driven processing that takes place when an XSLT template is applied to a tree of XML elements. In fact I have noticed that non-developers sometimes have an easier time with XSLT than do experienced developers, because they don't try as hard to figure out what is happening beneath the covers.

I would kind of agree with that statement. There's something about XSL and XML which just makes sense in my head. I'm not from a traditional software or computer science background, so I still find it weird to be called a programmer by some of my peers. John wrote this fantastic comment.

My first project with XSLT a few years back was to actually generate XSLT *from* XML and XSLT and forced me to break my ideas of how it worked. When I finally got the whole “it happens all at once” approach, it started to make sense. However, every programmer that I've brought on board to an XSLT project since has had trouble getting out of the procedural thinking and that ends up being the biggest source for their mistakes.

Unfortunately, like MagicEye images, some people just aren't able to unfocus their minds in the right way to really grok XSLT beyond the simplest examples.

I have heard of programmers comparing XSLT to Prolog and even Lisp. I'm not sure how true this is, but it's certain that you can't approach XSLT in a regular way. Recursion is one of those things which seems to drive people mad; in XSL there's a lot of recursion and declaration, which seems to fit the way I think. I always wanted to create an SVG of an XSLT process, so you can see in lines and boxes which templates are being called and add some kind of dimension to XSL. I'm sure it's not that hard, and even my experiments with transforming Cocoon's sitemap file into SVG didn't require too much work. Talking of recursion, someone posted this nice animated GIF of how it all works. There's no doubt that XSL requires a different mindset, and working with a programming language like Java or Perl will be more of a hindrance than an advantage.

I posted this question to a few of the XSL developers I know and got a variety of answers. In my own mind I see lots of lines and trees which get broken into branches.
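To make the "pattern-driven, all at once" idea a bit more concrete, here's a toy sketch, in Python rather than XSLT itself, of how apply-templates dispatches on patterns instead of following a linear program. The template bodies here are made up for illustration; real XSLT matches on far richer patterns than element names:

```python
import xml.etree.ElementTree as ET

def apply_templates(node, templates):
    # Like <xsl:apply-templates/>: find the rule whose pattern matches this
    # node and let it produce the output. No loop drives the document; the
    # patterns do.
    rule = templates.get(node.tag, default_template)
    return rule(node, templates)

def default_template(node, templates):
    # Like XSLT's built-in rule: recurse into the children and join results.
    return "".join(apply_templates(child, templates) for child in node)

doc = ET.fromstring("<doc><title>XSLT</title><para>mindset</para></doc>")

# Toy "templates", one per matched element name.
templates = {
    "title": lambda n, t: f"<h1>{n.text}</h1>",
    "para":  lambda n, t: f"<p>{n.text}</p>",
}

print(apply_templates(doc, templates))  # <h1>XSLT</h1><p>mindset</p>
```

Notice there's no for-loop over the document in the "program" itself; the tree drives which templates fire, which is exactly the inversion procedural programmers trip over.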


Virtual Private Java Hosting choices

So I'd almost made my mind up when I thought I'd check out what people were saying about 4java.ca. The first result got my attention, as it's someone looking for a private Java servlet hoster too. Erik also decided on 4Java.com but was considering Addr.com, which I've honestly never seen before. I was a little put off by their image-driven site but quickly found the lighter version, which is much easier to navigate and compare prices. However, I couldn't really find anything about the actual hosting besides the fact that they support Java servlets, so I checked out the comments to the blog entry. Hub.org gets my vote for worst designed and styled hoster site, but their prices are nice and cheap at only 11.99 dollars for a basic Tomcat VPS setup. They also run Cocoon 2.1.3, OpenLDAP and Jabber as standard! Now that's something worth shouting about. KGBinternet are simply too expensive for what you get; I mean, you have to pay 60 dollars a month to get 15gig of bandwidth a month. Cubicgarden.com is currently using about 6gig in webpage access and RSS, and I expect that to grow over the coming year. Back to the blog, someone else points Erik to eApps, which was one I considered a while ago.

Later this evening I spent some time talking to the live help at Hub.org, and they pointed me to the Cocoon hosters list, but that was not very fruitful. I have to say the person on the other end did seem to know what they were talking about, and seemed to be less sales and more sysop. At one point I did send the link to my blog post about being told to leave Interadvantage, and he remarked it was the first time he'd ever seen someone told to leave for such reasons. I also spent some time talking to Miles about memory usage in Java servlet containers. He had some great questions to ask any Java hoster: What's the permanent generation set to on the JVM, if at all? What's the command line used to start the JVM? What percentage of customers go over their bandwidth in a month? And could I have email addresses of 3 customers to correspond with? I've yet to get any response back from Hub.org, but they're certainly looking like the right choice if I can keep my bandwidth down to less than 8gig a month. As Miles has remarked already, I'm not using 304s in Blojsom and I'm not gzipping responses, so realistically it shouldn't be a big problem. Those famous last words…
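For anyone wondering what "using 304s" actually buys you: when a feed reader sends an If-Modified-Since header and nothing has changed, the server can answer with an empty 304 instead of re-sending the whole page or RSS feed. A rough sketch of the logic (illustrative only, not actual Blojsom code):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def respond(last_modified, if_modified_since=None):
    """Return (status, body) for a conditional GET."""
    if if_modified_since is not None:
        client_time = parsedate_to_datetime(if_modified_since)
        if last_modified <= client_time:
            return 304, b""   # unchanged: empty body, big bandwidth saving
    return 200, b"...full rendered page or RSS feed..."

last_mod = datetime(2006, 3, 1, tzinfo=timezone.utc)

# A reader polling with the timestamp it saw last time gets a 304...
print(respond(last_mod, format_datetime(last_mod))[0])  # 304
# ...while a first-time visitor gets the full response.
print(respond(last_mod)[0])  # 200
```

Given that a quarter of the page loads were crawler.bloglines.com polling the feed, answering those polls with 304s (plus gzipping the rest) should cut the transfer numbers dramatically.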


SXSW interactive competition, updated

Taken from the geekdinner website.

We are pleased to tell y'all we have joined forces with the lovely folks @ SXSW Interactive to give one lucky Geek Dinner attendee a free SXSW Interactive ticket.

To be in with a chance of winning this amazing prize, in the spirit of the Podcast Geek Dinner on February 23rd, please SMS the word “Podcast” followed by your Name, URL, Email, and Location to 60300.

The lucky winner will be announced by Paul Boag at the February 23rd Geek Dinner. You need to attend to win; if you're the winner and did not attend, another name will be drawn from Ian's Hat!
p.s. standard sms charges apply [this is not a premium rate shortcode!]. no terms and conditions. you win, you get the ticket.

So get txting, and good luck

The thing which has been putting people off is the flight and hotel prices. But it's no longer a worry.

Thanks, now, to our buddies at Shopzilla this SXSWi ticket now comes with a return flight, and hotel. Sweet!

Lee was able to get the flights and hotel sorted for the one lucky winner too. So in total we're talking a prize of about 1000+ pounds! It's such a good deal that honestly even I'm tempted to enter, but I know I can't really. Imagine if I won; people would spam this blog forever with cries of fix. Anyhow, I would enter if you can, because right now there is a 1 in about 35 chance that you will win the ticket. That's fantastic odds, and even if you don't win you still get a great night out next Thursday with Paul Boag and the rest of the geeks. Like Lee says, good luck, and I hope to be pulling your name out of my magic hat.


It's all about Yahoo, again?

The one to look out for is certainly Yahoo. This week they released their UI library into the public domain under a BSD licence and then showed off their design patterns, which I sent around to our designers for consideration. I also got the chance to read through Tom Coates' fantastic presentation at the Carson Workshops Future of Web Apps summit, and to top everything off, Yahoo is now hiring semantic web developers? Yahoo is once again on a roll. No wonder Tom Coates moved. Oh, by the way, don't forget to check out Simon's notes, which are great when flicking through the PDF presentation offline.

Forgot to mention Jeremy Zawodny has taken the main points and broken them down into something which can be translated for product managers. Yahoo are certainly on a run!


Geek and Geekhag podcast number three – Dark side of the mobile

Me and Sarah's third podcast is now available online. Enjoy, and please leave a comment if you've enjoyed it or simply hate it. This time we slide from topic to topic and get interrupted by a few people at 1am. If I need to tie this podcast down to something, I would say it's about the dark side of the mobile phone: Bluetooth, toothing, happy slapping and now happy shagging. Luckily we never got into that last one, which is terrible to even think about.


Moving cubicgarden.com again

For quite some time I've been having issues with running Blojsom on a shared Java host. See, it's possible but not ideal; it would seem Blojsom is best placed in a dedicated servlet container where it can have room to move. So for quite some time I've been holding on to my very cheap shared hosting by Interadvantage. The system administrator has been helping me out for quite some time, but it just seems Cubicgarden.com is generally growing in popularity and outgrowing its small plot of internet land. For the last 2 months I've been trying to cut down on the system hit by using OSCache and Log4j to solve the errors I might be getting. But it came to a head just recently… here's a slightly edited email I received.

I've noticed that whenever your site gets hit hard, our server's load goes way up and other sites become unresponsive. This is particularly a problem in the mornings, from about 8 – 12 AM our time. I assume all the geeks over here who are addicted to your blog get their RSS feed (because you get a lot of RSS traffic then), and then go hit your blog if they see something of interest.

I also took a look at your stats, and about 25% of your page loads come from crawler.bloglines.com.

At any rate, your site is successful, and active, and it depends on Blojsom. When it's getting hit, your site dominates a loaded Dual Xeon server, so I think it's unlikely we can reduce the load to an adequate level just by tweaking Blojsom. Sadly, I again need to encourage you to look for another host. I don't think it's fair for you to try and host your blog in a shared environment; I really think you should put it on a VPS or dedicated server so that Blojsom can't hog CPU and memory that is being shared by other sites.

This came to a head this morning because our company president was trying to do something on your server at 10:00 AM, and he was very upset by the performance. He wants resolution to this situation, and suggested we give you 30 days to find a different host.

It has been fun working with you, and I'm sorry to write this note. I do wish you the best in your future endeavors.

Gratefully,

interAdvantage Administrator

So once again I'm on the lookout for a good Java hoster which provides virtual private servers for people on tight budgets. So far the cheapest I've found is 4Java.ca's private Tomcat at 14.95 Canadian dollars a month for Tomcat 5 with 600meg of space and 10gig of data transfer. But Daily Razor's RazorBLAZE package attracts me because they also support Cocoon; it costs 19.95 American dollars a month for 5gig of space and 80gig of data transfer, which is fantastic in comparison. VPS Land seems OK too at 3gig of space and 40gig of bandwidth. One of the things I loved about Interadvantage, which seems hard to come by, is the friendly and knowledgeable system administrator. The system admin has been working with me for quite some time, and I know for a fact that this email was something he didn't want to write. We tried to get Cocoon working in a shared environment but came to the conclusion that it was not possible without serious security overrides. So please don't blame the ISP for this letter; it's my fault for trying to slot a popular blog and amazing blog software into a shared environment. I'm sorry to the other people on the same server and I'll be moving soon.
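Since transfer is the real constraint here, a quick back-of-the-envelope check of each plan's headroom against the roughly 6gig a month the site uses now (prices are in different currencies, so this only compares quotas, not value for money):

```python
# Monthly transfer quotas from the shortlisted plans, versus current usage.
plans_gb = {
    "4Java.ca private Tomcat": 10,
    "DailyRazor RazorBLAZE":   80,
    "VPSLand":                 40,
}

current_usage_gb = 6  # cubicgarden.com's rough monthly transfer today

for name, quota in plans_gb.items():
    print(f"{name}: {quota / current_usage_gb:.1f}x current traffic")
```

10gig only leaves about 1.7x headroom for a blog that's still growing, which is why the 80gig plan looks so attractive despite the higher price.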

So if you have any other hosts which do Java servlets, allow for at least 5gig of transfer data a month and cost as little as 10 pounds a month, do please recommend them to me in the comments.
