Use an authenticated feed in Google Reader

You currently can't subscribe to an authenticated feed (for example in Basecamp) in Google Reader.

If you want to do it anyway, you can use this script of mine, which talks to the server that requires authentication and passes through all the headers (so that cookies and "not modified" requests come through as well): download authenticated-feed-passthru.php


<?php
// Change this URL to the authenticated feed you want to proxy.
$url = "https://username:password@proj.basecamphq.com/projects/123/feed/recent_items_rss";

$ch = curl_init($url);

// Forward POST requests (including their fields) to the origin server.
if (isset($_SERVER['REQUEST_METHOD']) && strtolower($_SERVER['REQUEST_METHOD']) == 'post') {
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $_POST);
}

curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_HEADER, true);

// Pass through all incoming request headers except Host, so that cookies and
// conditional requests (If-Modified-Since / If-None-Match) reach the server.
$headers = array();
foreach ($_SERVER as $name => $value) {
    if (substr($name, 0, 5) != 'HTTP_') continue;
    if ($name == "HTTP_HOST") continue;
    // e.g. HTTP_USER_AGENT becomes "User-Agent: ..."
    $headers[] = str_replace(' ', '-', ucwords(strtolower(str_replace('_', ' ', substr($name, 5))))) . ": " . $value;
}
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Split the response into the header block and the body at the first blank line.
list($header, $contents) = preg_split('/([\r\n][\r\n])\\1/', curl_exec($ch), 2);
curl_close($ch);

// Replay the response headers (status line included) to the client.
foreach (preg_split('/[\r\n]+/', $header) as $headerLine) {
    if ($headerLine !== '') {
        header($headerLine);
    }
}

echo $contents;

If you don't mind giving away your credentials, you can also use Free My Feed.

Posted in web

New Feature for HN Collapsible Threads: Collapse Whole Thread

I have added a feature to the HN Collapsible Threads bookmarklet that enables you to close a whole thread from any point within the thread.

This is useful when you are reading a thread, decide that you have had enough of it, and want to move on to the next one. Previously you had to scroll all the way up to the top post and collapse that one.
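In case you wonder how this can work: the bookmarklet simply walks back up from the comment you are looking at to the top-level comment of the thread and collapses that one. Here is a minimal sketch of the idea (not the actual bookmarklet code); it assumes HN-style markup in which each comment is a table row whose nesting depth is encoded in the width of a spacer image in its first cell.


function indentOf(row) {
    // Depth of a comment row, assuming it is encoded in a spacer image's width.
    var img = row.querySelector('td img');
    return img ? img.width : 0;
}

function collapseWholeThread(row) {
    // Walk backwards from the current comment to the thread's top-level comment...
    while (row && indentOf(row) > 0) {
        row = row.previousElementSibling;
    }
    if (!row) return;
    // ...and hide every row that belongs to that thread.
    var next = row.nextElementSibling;
    while (next && indentOf(next) > 0) {
        next.style.display = 'none';
        next = next.nextElementSibling;
    }
}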

Drag this to your bookmarks bar: collapsible threads

Install Greasemonkey script

Posted in web

Safari Extension: Clean URLs

I have picked up and am further developing a fork of Grant Heaslip's Safari extension URL Cleanser, which removes all sorts of unnecessary junk from the URL so that you can easily pass a clean URL on to someone else (a rough sketch of the idea follows after the list). Things being removed include:

  • Google Analytics parameters (utm_source=, utm_medium, etc.)
  • YouTube-related parameters (feature=)
  • Partner tracking parameters for NYTimes, Macworld, CNN, CBC Canada and The Star
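The stripping itself boils down to filtering the query string against a list of known junk parameters. Here is a rough sketch of that idea (not the extension's actual code; the parameter list below is illustrative, not complete):


// Parameters considered junk; the real extension knows more of them.
var junkParams = /^(utm_source|utm_medium|utm_campaign|utm_term|utm_content|feature|partner)$/;

function cleanUrl(url) {
    var hashParts = url.split('#');
    var hash = hashParts.length > 1 ? '#' + hashParts.slice(1).join('#') : '';
    var baseParts = hashParts[0].split('?');
    if (baseParts.length === 1) return url; // no query string, nothing to strip
    var kept = baseParts[1].split('&').filter(function (pair) {
        return !junkParams.test(pair.split('=')[0]);
    });
    return baseParts[0] + (kept.length ? '?' + kept.join('&') : '') + hash;
}

// Example: cleanUrl("http://example.com/story?id=1&utm_source=twitter")
// returns "http://example.com/story?id=1".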

You can download my version here: url_cleanser.safariextz

Posted in web

Title Junk: Solve it with Javascript

There has been some back and forth between John Gruber and others about HTML <title> tags, with Gruber complaining (and rightly so) that for SEO reasons titles are filled up with junk that has little to do with the real page content.

The writers of cam.ly suggest using the SEO title in the HTML and having something proper displayed in Google by means of an OpenSearch description. But this still doesn't solve the problem of bloated window titles and bookmarks.

So my solution to that: use JavaScript. If you want to give your readers a good title while still presenting the SEO title to Google, simply set the title to something nicer with JavaScript after the page has loaded:


document.title = "Title Junk: Solve it with JavaScript";

Everyone happy. Except those who have JavaScript disabled maybe.

I have also created a tiny WordPress plugin that does just that: title-junk.zip

Discussion on Hacker News

Posted in web

Reddit-like Collapsible Threads for Hacker News

I enjoy reading and participating at Hacker News, run by Y Combinator and Paul Graham.

One thing that needs improvement is reading the comments there. It sometimes happens that the first comment develops into a huge thread, and then the second top-level comment (which might be just as worth reading) disappears somewhere far down the page.

Collapsible Threads at Hacker News through a bookmarklet

Reddit has combatted this common problem by making threads easily collapsible. I think it is worth having this on Hacker News as well, so I implemented it and wrapped it into a bookmarklet, so that you can use the functionality on demand at Hacker News.

Drag this to your bookmarks bar: collapsible threads

Once it is available in your bookmarks bar, go to Hacker News and click it while viewing a comments page. A [+] symbol will appear next to each thread. Click it to collapse the thread, and it will change to a [-]; click that to expand the thread again.
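If you are curious how the collapsing works, here is a rough sketch of the approach (a simplified illustration, not the bookmarklet's source): it assumes that each comment is a table row whose nesting depth is encoded in the width of a spacer image, so collapsing a thread means hiding all subsequent rows that are indented more deeply than the clicked one.


function indentOf(row) {
    // Depth of a comment row, assuming it is encoded in a spacer image's width.
    var img = row.querySelector('td img');
    return img ? img.width : 0;
}

function toggleThread(row, collapsed) {
    // Hide or show every row that belongs to the subthread below `row`.
    var depth = indentOf(row);
    var next = row.nextElementSibling;
    while (next && indentOf(next) > depth) {
        next.style.display = collapsed ? 'none' : '';
        next = next.nextElementSibling;
    }
}

function addToggles(rows) {
    // Put a [+] link in front of every comment row (rows is an array of <tr> elements).
    rows.forEach(function (row) {
        var collapsed = false;
        var link = document.createElement('a');
        link.href = 'javascript:void(0)';
        link.textContent = '[+]';
        link.onclick = function () {
            collapsed = !collapsed;
            link.textContent = collapsed ? '[-]' : '[+]';
            toggleThread(row, collapsed);
        };
        row.cells[0].insertBefore(link, row.cells[0].firstChild);
    });
}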

I have licensed the source code under an MIT License. Click here to view the source code of hackernews-collapsible-threads.js. (For caching reasons the bookmarklet currently loads hackernews-collapsible-threads-v6.js, which is just the same.)

The Hacker News HTML is quite fragile in the sense that the comments section of a page cannot be identified in a truly unique way (for example, it has no HTML id attribute), so the script might break when the layout of the page changes. This is why the bookmarklet is really just a loader for the script on my server. I have tuned the HTTP headers so that your browser caches the script properly, so the speed of my server should not affect how fast the bookmarklet loads.
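Such a loader bookmarklet is essentially just a script-tag injector along these lines (shown unminified for readability; the domain is a placeholder for my server, and the real bookmarklet is the same thing squeezed into a single javascript: URL):


javascript:(function () {
    // Inject the hosted script into the current page.
    var s = document.createElement('script');
    s.src = 'https://example.com/hackernews-collapsible-threads-v6.js';
    document.getElementsByTagName('head')[0].appendChild(s);
})();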

Enjoy :)

If you use Hacker News at a URL other than news.ycombinator.com or hackerne.ws, use this bookmarklet: collapsible threads (no domain check)

Update March 18, 2011: Paul Biggar has contributed a Greasemonkey script that also works on Firefox 4. I have adapted it so that it also works in Safari and Chrome (using NinjaKit), which basically involved copying the jQuery script above mine.

Install Greasemonkey script

Install Paul Biggar's Greasemonkey script

Update November 22, 2011: Eemeli Aro has sent me a little CSS tweak so that the lines don't move around when collapsing. The code downloadable above contains his tweak. Thank you!

Posted in web

Even Faster Web Sites, a book by Steve Souders

Steve Souders has recently released something of a sequel to his previous book "High Performance Web Sites" (HPWS), which I reviewed earlier. With Even Faster Web Sites he and his co-authors (specialists in their fields, such as Doug Crockford (JavaScript: The Good Parts) on JavaScript) elaborate on some of the rules Steve postulated in HPWS.

It needs to be stated up front that if you haven't read and followed Steve's first book, you should go and do that before picking up this one. It's a must-read that makes it pretty easy to understand why your page might be slow and how to improve it.

In "Even Faster Web Sites", Steve and his co-authors walk a fine line between fast and maintainable code. While most techniques described in his first book could be integrated with an intelligent deployment process, it is much harder with "Even Faster Web Sites".

In the chapters that Steve wrote himself for "Even Faster Web Sites," he is pretty much obsessed with analyzing when, in what sequence, and how parallel the parts of a web page are loaded. Having resources transferred in parallel leads to the highest gains in page loading speed. The enemy of parallel downloads is the script tag, so Steve spends (as in HPWS, but in greater detail in this book) quite a few pages analyzing which technique for embedding external scripts leads to which loading sequence for the resources of the page.

Steve also covers interesting techniques such as splitting the initial payload of a web site (lazy loading) and chunked HTTP responses, which allow sending back the first parts of the response even before the script generating the page has finished. Downgrading to HTTP/1.0 can only be considered a hard-core technique that just a few huge sites such as Wikipedia are using right now, and it should be regarded as covered for educational reasons only.

There is a section focusing on optimizing images which thankfully takes the deployment process into consideration and shows how to automate the image optimization techniques the authors suggest.

My only real disappointment with "Even Faster Web Sites" is the section by Nicholas C. Zakas. He writes about how to write efficient JavaScript but fails to prove that his advice helps. To be fair: in the first section of the chapter he shows benchmarks and draws conclusions that I can confirm in the real world (accessing properties of objects and their child objects can be expensive). But then he gives advice for writing code that can hardly be called maintainable (e.g. re-ordering and nesting if-statements (!), re-writing loops as repeated statements (!!!)) and doesn't even prove that this makes the code any faster. I suspect that the gains of these micro-optimizations are negligible, so chapters like these should be included in an appendix, if at all.

Speaking of appendices, I love what Steve has put in here: he shows a selection of the finest performance tools that can be found in the field.

This book can help you make your site dangerously fast. You also need to be dangerously careful what tips you follow and how you try to keep your site maintainable at the same time. "Even Faster Web Sites" is great for people who can't get enough of site optimization and therefore a worthy sequel to "High Performance Web Sites," but just make sure that you also read and follow Steve's first book first.

The book has been published by O'Reilly in June 2009, ISBN 9780596522308.

Posted in web

Website Optimization, a book by Andrew B. King


This time I'm reviewing a book by Andy King. Unlike High Performance Web Sites by Steve Souders, it doesn't focus solely on the speed side of optimization; it adds the art of Search Engine Optimization to form a compelling mix in a single book.

If you have a website that underperforms your expectations, this single book can be your one-stop shop to get all the knowledge you need.

Andy uses interesting examples of how he succeeded in improving his clients' pages, which nicely illustrate what he has described in theory beforehand. He not only focuses on how to make your website show up at high ranks in search engines (what he calls "natural SEO"), but also discusses in detail how to use pay-per-click (PPC) ads to drive even more people to one's site. I especially liked how Andy describes how to find the best keywords to pick and how to monitor the success of PPC campaigns.

The part about optimization for speed feels a little too separate in the book. It is a good read and provides content similar to Steve Souders' book, though the level of detail feels a little awkward considering how different the audience for the SEO part of the book is. Still, programmers can easily gain deep knowledge about how to make pages load fast.

Unfortunately Andy misses out a little on bringing all of this into a grand picture. Why would I want to do not only SEO but also optimize the speed of the page? There is a chapter meant to "bridge" the topics, but it turns out to be about how to properly do statistics and use the correct metrics. Important, but not enough to really connect the topics (and actually I would have expected this bridging beforehand).

Altogether I would have structured things a little differently. For example, it's the content that makes search engines find a page and makes people return to it, yet Andy explains how to pick the right keywords for the content first and only afterwards tells the reader how to create that content. Everything is there; I had just hoped for a different organization.

All in all, the book really deserves the broad title "Website Optimization." Other books leave out SEO, which is usually what people mean when they want to optimize their websites (or have them optimized).

I really liked that the topics are combined in one book, and I highly recommend it for everyone who wants to get his or her website into shape.

The book has been published by O'Reilly in July 2008, ISBN 9780596515089. Also take a look at the Website Optimization Secrets companion site.

Thanks to Andy for providing me a review copy of this book.

Facebook discloses its users to 3rd party web sites

Q&A with Dave Morin of Facebook

Just a quick post, because what I read at Joshua Porter's blog somewhat alarms me: Facebook's Brilliant but Evil design.

I feel more and more reassured in my decision not to use Facebook and in my bad feeling about them.

The gist is this: when you buy something at a participating web site (Ethan Zuckerman shows how it is done at overstock.com), Facebook discloses to that third-party web site that you are a Facebook user, and hands over some more details about you, while you are only visiting that third-party page (and not facebook.com)!

This goes against the idea of separate domains on the Internet. Joshua fortunately also goes into technical detail about how this could be done.

In my opinion Facebook users should quit the service and protest heavily against these practices. But I am afraid few of them will even notice that this is happening.

Posted in web

This was FOWA Expo 2007


I attended this year's Future of Web Apps Expo at London's ExCeL centre.

There were a ton of interesting speakers and I enjoyed listening a lot. Amongst others there were Steve Souders of Yahoo (High Performance Web Sites), Paul Graham of Y Combinator (The future of web startups), Matt Mullenweg of WordPress.com (The architecture of WordPress.com; he was the only one to go into some detail) and Kevin Rose of digg (Launching Startups).

I also enjoyed Robin Christopherson's talk very much. He is vision-impaired and showed how he browses the web (it is amazing how fast he had set the speed of his screen reader; I understand why, and guess that most vision-impaired people turn up the speed, yet it still feels awkward to listen to) and which challenges arise from that. Unfortunately Chris Shiflett only held a workshop, which I did not attend.

The conference was clearly not so much for developers (at some points I would have greatly enjoyed some delving into code), so I am trying to keep my eyes open for even nerdier conferences :) Any suggestions?

On the evening of the first day a "live" Diggnation episode was recorded, which was pretty fun.

According to Ryan Carson, audio files of the talks will be published on www.futureofwebapps.com soon. Thanks to Carsonified for putting on this great conference. I hope I will be able to return next year.

I have posted more photos to flickr.


High Performance Web Sites, a book by Steve Souders

I'd like to introduce you to this great book by Steve Souders. There already have been several reports on the Internet about it, for example on the Yahoo Developers Blog. There is also a video of Steve Souders talking about the book.

The book is structured into 14 rules, which, when applied properly, can vastly improve the speed of a web site or web application.

Alongside the book he also introduced YSlow, an extension for the Firefox extension Firebug. YSlow helps developers see how well their site complies with the rules Steve has set up.

I had the honour to do the technical review on this book, and I love it. Apart from some standard techniques (for example employing HTTP headers like Expires or Last-Modified/ETag), Steve certainly has some tricks up his sleeve:

For instance, he shows how it is possible to reduce the number of HTTP requests for first-time visitors (by inlining the script sources) while still filling up their cache for the next page load (see page 59ff).

The small downside of this book is that some rules need to be taken with care when applied to smaller environments; for example, it does not make sense (from a cost-benefit perspective) for everyone to employ a CDN. A book just can't be perfect for all readers.

If you are interested in web site performance and have a developer background, then buy this book (or read it online). It is certainly something for you.

The book has been published by O'Reilly in September 2007, ISBN 9780596529307.

Some more links on the topic:


Webkit catching up with Firefox and Firebug

Webkit, the rendering toolkit that powers Apple's Safari web browser, has been getting a lot of love lately (iPhone, Windows beta version).

But for developers it has always been hard to debug and inspect web applications running in Safari. Since June 2006 a decent debugger has existed in Drosera (for Webkit only so far, though; it's not going to happen for Safari 2).

And now the (already existing, but somewhat weird-looking) Web Inspector has gotten a makeover:

Webkit: New Inspector

This is a big step: it not only gives web developers the chance to identify precisely why this or that DOM element is shown the way it is, but also allows a look into how the web page loads, much like Firebug on Firefox.

As a neat extra, you can view how your components add to the loading time of the page.

Webkit: Transfer Time

Even though Webkit is in some ways just mimicking Firebug, this is a good step for the future of web development on Safari. Even more so as the new Webkit builds contain fewer of the usual browser quirks that make programming for Safari difficult in the Ajax world.

The Webkit nightly builds provide the new feature via a right click on the page and selecting "Inspect Element". For more info, see the post on the Surfin' Safari Webkit blog.

Finally one more pic, because it's quite beautiful :)

Webkit: CSS/DOM


Posted in web