Sunday, November 9, 2008

Browsing Time Machine backups

[Image: OS X Leopard Time Machine, by antwanp via Flickr]

Although Time Machine is a fantastic consumer-targeted backup solution whose simplicity is hard to beat, it unfortunately has its drawbacks. It saved me quite some time when my graphics card died, but when I wanted to browse through my backups on another computer (in my case one running Ubuntu 8.10), I was quite surprised by some details of its implementation.

It appears that although you can browse through your Users directory, you can't just dive into an arbitrary subdirectory and copy the file you are looking for to your current computer.

What you have to do is this (thanks to Carson!):

First, run ls -l, which returns something like this:

me@desktop:/media/disk/Backups.backupdb/Computer Name/Latest/Disk/Users/username$ ls -l

-r--r--r-- 9891110 root 10239884 0 2008-04-21 20:19 Applications
-r--r--r-- 9861491 root 10239885 0 2008-04-21 20:19 Desktop
-r--r--r-- 10192194 root 10239886 0 2008-04-21 20:19 Dev
-r--r--r-- 10197872 root 10239887 0 2008-04-21 20:19 Documents

Here you have to pay attention to the first number after the permission bits (for example, 10192194 in the Dev row). In ordinary ls -l output this column is the hard-link count, which for a directory roughly equals the number of entries in it, but in a Time Machine backup this number is a unique identifier. To actually access your files, go to the top-level directory of your external hard drive and look for a directory named .HFS+ Private Directory Data - note the dot before the name, because it is a hidden directory.

Then you can either run ls in that directory (but it will take some time, and you get a bunch of directories whose names tell you nothing about their contents), or you can just type cd dir_10192194 (in this case I decided to look into the latest contents of the Dev directory, whose identifier appears in the listing above); that directory correctly lists all the files in it. Note that if you want to descend into a subdirectory, you have to repeat the process and look it up in .HFS+ Private Directory Data again.

So restoring to a Windows/Linux system is possible, it's just not something you really want to be doing (especially for a large number of files). For that I would probably write a Python script or a new bash command (cdtm, as in "change directory, Time Machine") to ease the pain of constantly checking which directory you have to go into to find your files.
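A first sketch of such a script might look like this: it walks a path inside the backup and, whenever a component turns out to be one of Time Machine's zero-byte placeholder files, hops into the matching dir_<link count> entry under .HFS+ Private Directory Data. The function name and the exact private-directory name are my assumptions (on some volumes the name reportedly ends with a carriage return), so treat this as a starting point rather than a finished tool:

```python
import os

def resolve_tm_path(mount, path, private=".HFS+ Private Directory Data"):
    """Resolve `path` inside a Time Machine backup mounted at `mount`.

    Directories in the backup tree show up as zero-byte files whose
    hard-link count (the number after the permission bits in `ls -l`)
    identifies the real directory, stored as dir_<count> in the private
    directory at the top of the drive.
    """
    current = mount
    for part in path.split("/"):
        if not part:
            continue
        candidate = os.path.join(current, part)
        st = os.lstat(candidate)
        if not os.path.isdir(candidate) and st.st_size == 0 and st.st_nlink > 1:
            # A placeholder file: the real directory lives in the private area.
            current = os.path.join(mount, private, "dir_%d" % st.st_nlink)
        else:
            current = candidate
    return current
```

With something like this on the backup drive, cdtm could then be a one-line shell function that cd's into whatever path the script prints.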

Which could be a nice little project after I get my Macbook Pro back...


When your Macbook Pro's graphic card dies

[Image: Time Machine (Apple software), via Wikipedia]

As major publications have noted several times already, NVIDIA made quite a mistake when they produced the G84/G86 graphics chips with faulty parts. So when I became an early adopter and bought myself a MacBook Pro in July 2007, I was also given one of these graphics cards, and until now its performance was nothing but breathtaking. Unfortunately it appears that my usage patterns (probably a bit above-average carrying around, but not every day) finally brought it to its knees, and on Friday evening it died.

Since AppleCare isn't available in Slovenia, I was more than pleased when Apple announced that they would be extending the warranty on these graphics cards to two years - I was just a month or so out of warranty.

With my mind at rest about the possible costs of repair, I decided it was time to see how I could make a fresh backup of the data on the hard drive. Since I could clearly see that my MBP successfully logged in and connected to my home Wi-Fi network (a broadcast ping - ping -b - helped with that), my first thought was to use SSH to log into the computer and then begin the painful process of copying somewhere around 100 GB of data over 802.11g Wi-Fi to a USB hard drive. But that didn't work, because I had closed port 22 for SSH. For a laptop, that seemed quite reasonable at the time.

After discussing options like taking the disk out and using a SATA/IDE-to-USB converter to plug it into another computer, mounting my MBP as a target disk on a friend's Mac, etc., I was finally struck with divine inspiration: "If it can connect to Wi-Fi then obviously everything starts up as normal, therefore I can just use Time Machine." The second I thought of it, I wondered how I could not have thought of it before.

A few minutes on, and Time Machine was happily making backups.

So remember, kids: make backups, and use a backup solution that does everything by itself, without waiting for you to press a button that says "Yes, make me a backup".

Wednesday, August 20, 2008

Neat chairs lying around Ljubljana

Since I am fully occupied this summer (first a trip to Belgium, Paris and Amsterdam, now an internship at Zemanta) I must plead guilty to charges of neglecting my blog. ;)

For a start I would like to touch on a lighter theme - does anyone know what all those chairs lying around Ljubljana are for (besides sitting on them :-)? I have a feeling they are there for some kind of artistic reason - and if not that, were they at least put there for some special occasion?

"Ljubljana chairs" (that is written on them) is just a bit too general search term for Google...


Thursday, June 12, 2008

Visor, turns Terminal to quake-like console

A few days ago I went to Blacktree to download Quicksilver and give it another try (when I started with OS X I didn't see the value in it; a few months later it turned out that Spotlight is a bit limiting) and found a rather interesting add-on for the standard Terminal - Visor. Visor lets you press a custom key combination to show the Terminal - just like Quake's in-game console for advanced commands and finer tweaking of the game.

In the time I've been using it, it has turned out to be quite useful. When you need to quickly do something in the console it is already at your fingertips, but if you someday feel the need for a conventional stand-alone console you just press Command+N to open one.

I find it particularly useful because I can summon the console by just pressing alt+space instead of constantly trying to find the console with alt+tab+tab+tab ("Oh no, I just pressed tab one time too many and now landed on another Space. Again.")


Tuesday, June 10, 2008

Download photos from Picasa Web

[Image: iPhoto 7, via Wikipedia]

A few days ago I had the problem of downloading pictures from one of the Picasa Web galleries. There was no download button, and even if there were, I would still need Picasa to download them. At first I was a bit disappointed by this and thought I would have to write a script myself or use a separate program to get them. But then I realized that Picasa Web makes an RSS feed for every gallery unless the user specifically disables it.

I launched iPhoto and quickly found the solution to my problem - Subscribe to Photo Feed (Command+U) in the File menu. Voila! Problem solved and the entire album downloaded, without the hassle of downloading another program just for a bunch of photos, as one of the related articles suggests.


Monday, May 19, 2008

Status updates on your pizzas

[Image: 365 2008 05 12 - Day 222, by brotherxii via Flickr]

By pure coincidence I stumbled upon an interesting script - while reading Amit Gupta's Tumblr I noticed he linked to a Python script that checks at regular intervals on the status of the pizza you ordered.
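Stripped of the pizza-specific parts, the idea reduces to a simple polling loop; the endpoint and the status strings below are made up for illustration:

```python
import time
import urllib.request

def watch_order(url, interval=30, fetch=None, report=print):
    """Poll `url` every `interval` seconds and report status changes.

    `fetch` can be swapped out for testing; by default it reads the
    status as plain text from the (hypothetical) tracking endpoint.
    """
    if fetch is None:
        fetch = lambda: urllib.request.urlopen(url).read().decode().strip()
    last = None
    while True:
        status = fetch()
        if status != last:
            report("pizza status: " + status)
            last = status
        if status == "Delivered":
            return status
        time.sleep(interval)
```

The real script presumably scrapes whatever page the pizzeria exposes; the loop around it stays the same.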

But more than the script, I was pleasantly surprised by the idea - pizzas were being delivered to our homes long before the internet, but only now has the concept of live tracking come to life. For the time being, only in the US.

It will take some time to become widely used, even though ordering meals to your home is quite popular among students in Ljubljana and would probably be welcomed by them. When one is hungry, minutes just seem to last forever, and I can imagine that knowing what is happening to your meal would make a bit of a difference.

Personally I am not a fan of ordering food - with the notable exception of pizzas - everything just seems a bit weird to me when they deliver it to your door. But such live tracking would be awesome for more than just food.

Friday, May 16, 2008

Mail costs - US to Slovenia

[Image: think pink, from Flickr]

A few days ago I decided that my iPod Touch was going to need a bit of protection if I was going to carry it around. After a bit of thought I decided to order the Sena UltraSlim Pouch, since I wasn't too thrilled about some of the other cases, which look a bit too chunky for everyday use.

Not too pricey, and with a special offer of 9.99 dollars for international shipping it seemed a pretty good deal. I grabbed my debit card and ordered it straight away. The only problem was that I forgot to read the fine print regarding international shipping.

Only after making the transaction did I take the time to read the fine print and see that Slovenia was one of the countries not eligible for the special offer. It seems that although Slovenia joined the EU in 2004 and has been presiding over it for the past five months or so, we still sometimes find ourselves receiving special treatment. In my case this special treatment meant I would have to pay an additional 30 dollars to FedEx for the costs of getting a package from civilization to Slovenia - the same amount as the case itself.

If I was a bit disappointed in FedEx (I presume it's their policy/offer), I do have to give some praise to Sena - after I contacted their support about my sloppiness, their average response time was about 5 minutes and they reversed the transaction straight away. Although the funds had already been withdrawn from my account, they should happily reappear in a few business days according to their representative. I will probably try to get the same case, but this time some other way.

Oh well, need to think pink and forget about banana republic...

Friday, May 2, 2008

View PDFs on iPod Touch with FilemarkMaker

[Image: iPod Touch, by riccardodivirgilio via Flickr]

After playing around with my iPod Touch for a few days (I haven't jailbroken it yet, though) I quickly came upon the idea of reading PDFs on it - and as simple as that may sound, it turns out it's not. Although you can sync quite a few things with your Mac, you can't sync PDFs.

At first I tried just importing PDFs into iTunes, but you still can't copy them over to the iPod. Then I tried to save and view a PDF as an image, but although I was using pretty decent images, the iPod still shrank them down to 640x480 or so, which made them unreadable.

The two ways that I found to actually work were mailing PDFs to myself and reading them in the Mail application, or using Filemark Maker. Mailing PDFs would be acceptable, but it still feels a bit awkward to me personally, so I decided that for the time being I will be using Filemark Maker.

Basically, what Filemark Maker does is use the data: URI scheme to encode PDFs as URLs which you can bookmark and - after syncing your bookmarks with your iPod Touch - open and read as if they were located locally on the file system.

It works well with small files and until you bookmark a couple of PDFs - after that I noticed a bit of lag (about 5-10 seconds) when starting Safari for the first time after restarting the iPod. I suspect it caches the bookmarks afterwards, and then all works well.

I am still wondering, however, why Apple decided not to support syncing of PDFs, or to give iPod Touch and iPhone users some other way of storing PDFs and the other formats that Safari/Mail support...

Saturday, March 29, 2008

Zemanta went public

Five days ago.

And the guys are receiving quite some positive feedback, which they more than deserve. I won't go into details about what they are doing and how - they have some interesting videos on their web site that you can check out, and as of Wednesday you can download Zemanta's plugin for Firefox and try it yourself - but the best testament to their technology (at least on the page you are looking at now) can be seen in my blog posts since I began using and testing it at the beginning of February.

Even before I began using their technology as a novice blogger, I had an opportunity to work with it in London (on the WordPress plugin) for a brief period. Even back then, when their technology was still in its infancy, I was often pleasantly surprised by the results it returned. But as I left London and didn't have any actual contact with it for two months, I was all the more surprised when I saw how much their product had matured and how useful it had actually become once I started to look at it as a user and not as a developer.

On several occasions it proved to be at least as useful a tool as Google - if not more so. One such example was when I was writing a blog post and, sometime during the process, took notice of the articles it suggested to me. The good thing was that those articles weren't strictly on the topic I was writing about - which may, intuitively, sound like something one would want to avoid at all costs. But it turned out that the suggested articles helped me develop my ideas a bit more thoroughly.

Google is a very useful tool for finding precise information, but it fails when you want it to cover a somewhat broader field of knowledge. Either your search is very precise and in the process loses any interesting bits of information that could turn out to be useful for developing ideas and gathering knowledge, or it's too broad and you get a whole bunch of information (the majority of which happens to be useless to you) that quickly overwhelms you and causes you to lose focus.

In part this is a result of the method used for searching - Google has to find good information based on a few keywords, while Zemanta has the advantage of extracting data from a lot more input. While Google's way turns out to be more useful the majority of the time, Zemanta's way is more useful when you are writing something like a blog post (or any other piece of writing, for that matter) that can benefit from general links and related articles (besides the images and tags that Zemanta also provides).

Since Zemanta's product is still in its alpha phase, I am more than eagerly waiting to see how the technology will mature and what kind of suggestions it will give me in my future posts.

Tuesday, March 25, 2008

Tiger users still a majority

[Image: source Flickr]

I was more than a bit surprised yesterday when I first saw the statistics published by the Omni Group regarding their user base and what operating system their users are running. Sure enough, this data is not a definitive indicator and probably only Apple can give more accurate numbers, but it's probably good enough that we can at least draw some conclusions from it.

The Omni Group develops a whole bunch of different applications focused both on business users and on home users. I haven't been able to find any statistics about how large their user base actually is, and will for the time being presume that their large array of products gives a rather good sample of users - if, however, anyone finds this too quick a jump to conclusions, feel free to enlighten me in the comments. ;-)

At first sight of the data I was almost a bit disappointed by the number of Leopard users. 32% seemed rather low for a system that was released at the end of October and brought with it a number of interesting and exciting features. But then I gave the whole idea of 32% another thought - if one third of the users on a particular platform upgraded, in roughly six months, to the flagship version of a company's operating system, that doesn't sound so bad. Since Microsoft released Windows Vista in the recent past, it makes for a rather interesting comparison of the speed of adoption of a new platform.

It turns out that only 8.7% of Windows users have opened their hearts to Vista (either voluntarily or when they purchased new computers). Since Vista became available in retail stores on January 30, 2007, it had been on the market for about 420 days, which in turn means its approximate rate of adoption is about 0.02% of Windows users per day, or about 0.6% per month. For Leopard this rate is almost ten times higher - since its release at the end of October 2007 it had been on the market for about 150 days and was adopted at an average rate of 0.2% per day, or about 6% per month.
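The back-of-the-envelope arithmetic, spelled out (using the shares and day counts estimated above):

```python
# Adoption shares (in %) and days on the market, as estimated above.
vista_share, vista_days = 8.7, 420
leopard_share, leopard_days = 32.0, 150

vista_daily = vista_share / vista_days        # ~0.02% of Windows users per day
leopard_daily = leopard_share / leopard_days  # ~0.21% of Mac users per day

print(round(vista_daily, 3), round(vista_daily * 30, 2))      # 0.021 0.62
print(round(leopard_daily, 3), round(leopard_daily * 30, 1))  # 0.213 6.4
print(round(leopard_daily / vista_daily, 1))                  # 10.3
```

So the "almost ten times" claim checks out: the ratio comes to a bit over 10.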

Based on the data I used, this means that Mac users are migrating to the newest platform at almost ten times the rate of Windows users. This puts Apple in a rather favourable position compared to Microsoft. If Microsoft has to maintain compatibility for such a long time (at a constant adoption rate, Windows XP would still be in use 10+ years after Vista was released - until 2017 or more), it cannot implement new features at the pace Apple can - or at least it cannot make them requirements until a large majority of its users migrate to the new platform. Apple, on the other hand, can implement new features and deprecate old ones far more easily and more often.

(Luckily Microsoft doesn't wait 10+ years for its users to upgrade - Windows XP will enjoy full support until April 14, 2009, after which it will enter Extended Support lasting until 2014, although personally I think Microsoft will extend both.)

For developers this means that when Apple said Carbon would not enjoy the 64-bit benefits found in Leopard and encouraged developers to get their hands dirty with Cocoa, they could sleep a bit easier knowing that their users would shortly follow the development cycle that Apple dictates. On the other hand, developers with huge code bases written in Carbon probably enjoyed a number of sleepless nights after first hearing Steve Jobs announce 64-bit support for Cocoa only.

In the end it is only fair to write an honest disclaimer - both the Omni Group and W3Schools say that their data is not to be fully trusted, to which I can only add that you should trust my conclusions even less. I only did some quick math with the basic data that both of these companies provide online, in an effort to see how users of different platforms view the flagship products of their beloved companies and how this love translates into the purchase of a new operating system or a new computer. The data clearly misses all those users who migrated to OS X or Windows from other platforms (and probably a bunch of other edge cases). Basically, my analysis is just a quick look at a subject that is almost impossible to analyse with great certainty, and it should be treated in that manner.

Monday, March 10, 2008

iPhone SDK and Apple's deal to developers

[Image: source Flickr]

When Apple held their press event for the announcement of the SDK last Thursday, I was watching closely to see what would come of it - after all, there was a whole bunch of rumors and predictions about the direction Apple could take and how it could either win the hearts of developers or burn in hell with its platform by its side.

As far as I'm concerned, Apple more or less won my heart. Although I have already downloaded[1] the iPhone SDK and installed it, I haven't yet programmed anything with it - one notable obstacle is that I am far more into web development and have almost no experience whatsoever with Objective-C. But as I spend more and more time reading about Objective-C and watching various iPhone Application Framework videos of what can be done (and how), I am becoming more confident in saying that Apple's approach is the right one.

Not only does a developer get a stable environment that Apple itself used to develop its mobile version of OS X, one with a lot of similarities to the desktop/server version of OS X (something any Mac developer will appreciate, because they can start programming without having to learn yet another platform) - they also get access to a huge user base that is accustomed to using iTunes and ready to pay for things they like.

The App Store - the store Apple will use to distribute software to users and give them a one-click solution for purchasing all software ever developed for the platform - is what in my opinion distinguishes Apple's platform from other already-developed and marketed solutions. It will be installed on the iPhone and integrated into iTunes, and will therefore have a tremendous user base - and its user base is what gives Apple an edge over its competition. It consists of people who pay for their music, are a bit more tech-savvy than the average population, and will therefore appreciate the ease of use and user experience that Apple gives them. And gladly pay for software that is useful to them.

The business model also seems rather fair to developers - if you want to charge for your applications, you share 30% of your income with Apple and the remaining 70% goes into your pocket. I find it a bit less friendly if you are prepared to give your application away for free - you will still have to get yourself an iPhone Developer Program account, which will cost you $99 per year - but I do not mind paying those 99 bucks if the official support provides good enough value.

Another very pleasing sight is how other software companies have begun stepping up their efforts for the emerging platform - my biggest surprise was Sun's announcement that they will develop Java for the iPhone. I understand how they can build it with the technology provided, but I am very interested in how they will navigate the various legal issues. Not that I mind Java being developed for the iPhone - on the contrary, I will be more than pleased to use the whole bunch of already-developed Java applications.

How all this will come together in reality remains to be seen in July, when Apple will release the final version of the SDK and give users the second version of its iPhone software, which will include support for applications from third-party developers. I will, however, try to get familiar with Objective-C and start playing with the SDK soon - and hopefully there will be a follow-up to this blog post in the near future discussing my experiences with development itself.

[1] you need an Apple Developer Connection account in order to download the SDK - a free account will do

Saturday, March 1, 2008

"Apple Cripples Non-Apple Software"

[Image: source Flickr]

It came as no surprise to me when I first saw this sensationalistic title appear on Slashdot two days ago. But it was sad to see how a wonderful piece of work like Vladimir Vukićević's blog post got distorted and taken out of context (or at least part of it) for the purpose of the argument.

While looking at how to give Firefox 3 for OS X a performance boost, Vladimir did quite some research and testing to locate the origins of the bottlenecks that made the OS X version of Firefox a black sheep in the past - at least compared to its Windows and Linux siblings.

During his research he discovered that Apple was using some undocumented methods to give Safari a boost in performance. Not a very nice thing to do, for sure, yet he still emphasized that there are other, non-programmatic ways of gaining the same boost. But the news of Apple's Evil had already begun to spread in the wild and get blown out of proportion.

Personally, I think Apple was right this time not to publish this part of the API - David Hyatt himself said that this code was more of a hack and that they themselves are not happy with that part of WebKit's code. As any developer knows, publishing hacks is never a really good idea - even using hacks is not a good idea, but sometimes, due to time constraints or some other reason, you just have to.

On one side, using them can get you into the kind of awkward position Apple has arguably found itself in; yet publishing information about them is still wrong - you lose the ability to fix the problems and remove the hacks in the next release, and are faced with maintaining them for the foreseeable future. Unless, of course, you want to be hated by developers using your API for making such (dramatic) changes overnight.

But one is still left wondering whether there are other hidden and undiscovered cookies in Apple's jar. They are, after all, only a company, and even if most of its employees are open-source minded, a genius or two probably hides in the dark corners of Apple who thinks otherwise and could destroy Apple's reputation on this matter, creating a picture of another company with Microsoftish malpractice.

Sunday, February 24, 2008

Facebook - your lifelong partner

[Image: source Wikipedia]

I never really knew a lot of people who were involved in social networking sites. Sure, there were a few people I knew over the internet, or a person here or there that I knew personally, but that was about it. But when I joined the university this quickly changed, and friend invitations became a daily phenomenon on Facebook - as predicted by Metcalfe's law, the number of people using Facebook increased faster and faster as more people began using it and as it became a useful (or not) tool for nurturing old and new friendships.

Personally, I never liked to share all the juicy details of my life or my conversations with other people - maybe some tiny paranoid bit of me was always thinking about what could happen if something went wrong and published information ended up in the wrong hands. That's probably why I hadn't joined any other social networking sites (with the notable exception of LinkedIn) before I was almost forced to.

Unfortunately it didn't take long before I started reading all kinds of horror stories about Facebook. News that someone lost his job because he called his boss an idiot, or got kicked out of school for posting pictures of drinking parties on his MySpace profile, is really becoming "business as usual". And as long as I and the majority of my friends weren't using those sites, the problem didn't seem important enough to bother with or give another thought - after all, everything was happening in the United States, and Europe is still lagging behind in social network usage.

Or is it?

Viadeo recently published survey data showing that 62 percent of British employers check Facebook, Bebo or MySpace to see what kind of dirty things their future staff have done in the past, and at least a quarter have already rejected a candidate because of their findings. With the continuing trend of increasing social network usage, the rest of Europe will probably follow - if it hasn't already.

In the end, the simplest conclusion is to enjoy social networking sites while they benefit you in college, when you are still meeting new people; but when you grow up and get yourself a serious job, the best course of action would be to just erase your Facebook/MySpace/whatever profile and forget about those things you did during college. Well, for Facebook at least, this is where things get messy. Until recently you could only "disable" your account - Facebook was hoping you would one day come back, and to ease your pain you could just reactivate your account and continue where you left off - which is still not good enough when you just want your account deleted, plain and simple.

Nipon Das, who faced this problem, had to threaten legal action against Facebook before the company got serious about it and finally deleted his account - which, according to The New York Times, still didn't prevent a reporter from sending him an email message. The deletion process did get a tiny bit better just two days after the publication of that article in the NYT, but a simple and obvious "Delete" button is still nowhere to be found.

What, then, should you do about your online identity? Probably the best solution is the one that was here all along and that none of those involved in the horror stories really thought about - common sense. Would you show your parents how you got drunk and puked all over someone's apartment? Would you want your boss to see it? Probably not.

Friday, February 8, 2008

Cross-domain XMLHttpRequest

[Image: source Wikipedia]

During my presentation at Barcamp I talked about cross-site XMLHttpRequests and promised I'd write a follow-up to my presentation as well. As any web developer who didn't go into hiding before GMail came to life knows, XMLHttpRequests can be a very helpful tool for your users - sure, there are some accessibility and security issues, but, provided you use it correctly, it can be an enormous benefit to the user experience.

One thing that plagued XMLHttpRequests (at least if you were a legit user) was the same-origin policy that you had to follow. At some point, probably due to a bunch of mails, even Yahoo posted a tutorial on how to work around the issue with a proxy. But fear no more! Firefox 3 brings us support for cross-site XMLHttpRequests, which gives us options to control who can access our content with remote JavaScript calls. The W3C Access Control draft goes into very specific detail on how exactly browsers should implement this feature, but I won't cover it in that much detail because Firefox 3 still exhibits some problems with it. It does, however, bring us a nice new feature that we can play with and give a try.

My lightning talk was just a quick introduction to this subject - due to time constraints I was also unable to show a practical example (hey, that's the whole idea of lightning talks! :) - and my example here will be just as minimal. It was built with Django - basically just the code you get when you start a new project and nothing else.

On the client side (i.e. in the browser) your code remains pretty much the same as it would be if you were making a same-domain request:

window.onload = function(){
    var xhr = new XMLHttpRequest();"POST", "http://localhost:8001/", true);
    xhr.onreadystatechange = function(){
        if ( xhr.readyState == 4 ) {
            if ( xhr.status == 200 ) {
                document.body.innerHTML = "I received: " + xhr.responseText;
            } else {
                document.body.innerHTML = "Some error occurred.";
            }
        }
    };
    xhr.send(null);
};
(Basically I just took John Resig's code for the client side, since he first published this at the start of January.)

(Also note that I am making calls from a Django instance running at port 8000 to an instance running at port 8001 - if you check Mozilla's same-origin policy you will see that different ports are also forbidden.)

Then of course comes the interesting part, the server-side code - John published his blog post with PHP code, and I will publish it for Django, but since it's so simple, it should be easy to port to whatever language or framework you work with. :) The interesting part, of course, is controlling who can access your code and in what way.

from django.http import HttpResponse

def foo(request):
    r = HttpResponse("some content")
    r["Access-Control"] = "allow <localhost:8000>"
    return r
And that's it! Access control allows a whole bunch of interesting features - for example, you can allow all sites [1] to access your resources (probably useful for mashups), limit access to specific request types [2], etc.

"allow <*>" [1]
"allow <> method POST" [2]

But during my experiments I discovered that, at least for POST requests, Firefox still returned uncaught exceptions. If anyone has any idea what went wrong, please don't hesitate to tell me in the comments or mail me, so that I can publish it.

Now, if only IE would bring something equally useful...

Tuesday, February 5, 2008

Barcamp Klagenfurt 2008, #2

(As promised, I'm also adding my impressions regarding "Blogs and their connectivity network" and "BIG Weblog projects")

The "BIG Weblog projects" was a presentation on what is needed for a successful blog project and what kind of benefits that brings to your blog. The guy (I do have to apologise for not remembering his name and forgetting to take notes at the start of his presentation) has a cooking blog and decided to write 2 recipes for each game played (one from every country) for the entire duration of world football cup of 2006. I thought that searching for recipes themselfes was a trouble by itself but he decided to test every recipe as well! His entire preparatins lasted for a year and sure brought some rewards - his traffic was 100% up during the duration of football cup and remained 50% higher when the championship ended.

Source: Wikipedia
Max Kossatz had something even more interesting for me, as I had already done some hacking with Google Maps. In his project he looked at how blogs are interconnected - as it turns out, although some blogs are heavily connected to the rest of the world, others are completely isolated (link farms, for example) - and mashed that information up with Google Maps. The result is quite impressive - I attached an image below so that you can see it for yourself.

He asked us not to run to Slashdot with the link, so I won't be publishing it for now. ;-)

I also have to thank Werner for giving me a lift from the pizzeria to the train station after the buses stopped running. Thanks! :)

Monday, February 4, 2008

Barcamp Klagenfurt 2008

As I mentioned in my introductory post, I was getting high on Web 2.0 hype in Klagenfurt at the user-led unconference Barcamp. When I took the train to Klagenfurt, I never expected it to be such a positive experience - after seeing a video from San Francisco's Barcamp (I'm including it below), I didn't really expect that something organized almost in my neighbourhood could bring together so many interesting people with equally interesting lectures.

The first lecture that I attended, and obviously one that had some influence on me, was the lecture by Monika Meurer titled "Time & ideas for blogging". Her idea of getting yourself a routine and following it as much as possible made me brave enough to start rethinking blogging. The sheer amount of effort and work that she and her husband put into their blogs was almost jaw-dropping - it turns out that she prepares material for a month or so in advance and then uses the Timestamp feature of Wordpress so that posts are evenly distributed. One could write enough posts to cover one's holidays, use Timestamp, and one's readers would never even notice one was out of town! :)

Following that, I got myself a reality check with Stefan Jäger's presentation of his trip to North Korea. When you constantly annoy yourself with everyday troubles, a lecture like this shows you another perspective on your life and how life goes on in other, less fortunate parts of the world. When you get stripped of your mobile phone as you land at the airport, you know the situation has to be bad, but then things just get worse in NK. I'm still waiting to see whether Stefan will publish his slides on the net so that you can take a look yourselves. I know that most of its people don't care about internet access and mobile phones due to North Korea's economic position, but it's just unbelievable to hear that government officials have access to both, while ordinary citizens have to get permits just to travel across the country.

After seeing "Blogs and their connectivity network" and "BIG Weblog Projects" (I'll write a follow-up on them tomorrow), I slowly got to the part where you could see me in front of the board. I attended the lightning talks (5 minutes, everybody, everything :)) and gave a quick talk about cross-site XMLHttpRequests.

Me giving a speech
Me giving a speech (yes, my hair does look a bit weird :)

I'll be publishing my keynote and some code samples in a follow-up tomorrow or the day after - I would like to do some code clean-up and make it as concise as possible.

Here We Go (Again)

After much pressure from Jure (although I might be grateful to him someday :)) and after getting high on a bit of Web 2.0 hype at the Barcamp in Klagenfurt, I did some thinking and came to the conclusion that it's time to give blogging another go (as far as I remember, I did have something resembling a blog in the past, but that didn't turn out very well).

Now, let's finish this introductory post and get the show on the road with some thoughts about Barcamp! :)