
Overriding caching limitations with JavaScript

I've mentioned monitor.vrypan.net, my Greek blog aggregator (and now a search engine too!), a few times. One of the problems I had was that page generation is very demanding in CPU cycles and memory (many SQL queries, and some recursive functions). The obvious solution is caching, which is actually a good one in this case, since I know when I update the data (when new feeds are fetched) and nothing changes in between.

However, I wanted to add some personalization by highlighting new posts for each visitor. This means that each visitor would see a different page, depending on when their last visit was. How can this be done on a cached page? An elegant solution I found was to use a dynamically generated JavaScript file to make some "adjustments" to the cached page after it is loaded.

Here is how.

First of all, all posts are inserted into a table named monitor_posts. This table has a unique id column. So all I needed to do was set a cookie with the last ID shown, and on the next visit highlight the entries with a bigger ID.

The generated (and cached) HTML has entries like

<div id="ITEM_xxxxx"><div id="TITLE_xxxxx">post title</div> ...more html here... </div>

where xxxxx is monitor_posts.id. I wrote a PHP script, mark_new.php, that dynamically generates a JavaScript file; it looks like this:

<?php
// The last post ID this visitor has already seen, stored in a cookie.
if (isset($_COOKIE['lastID'])) $lastID = $_COOKIE['lastID'];
else $lastID = 0;

$db = DB::connect($DSN);
if (DB::isError($db)) { echo DB::errorMessage($db); die(); }

// The highest post ID currently in the database; it becomes the new cookie value.
$res = $db->query("SELECT MAX(id) AS M FROM monitor_posts");
$row = $res->fetchRow(DB_FETCHMODE_ASSOC);
$newID = $row['M'];

echo "
function mark_new_posts() {
    var lastID = " . $lastID . ";
    var main = document.getElementById('main');
    var i, j, feed, title, itemID;
    for (i = 0; i < main.childNodes.length; i++) {
        feed = main.childNodes[i];
        for (j = 0; j < feed.childNodes.length; j++) {
            if (feed.childNodes[j].id && feed.childNodes[j].id.substring(0, 5) == 'ITEM_') {
                itemID = feed.childNodes[j].id.substring(5);
                // Entries with an ID greater than the one stored in the cookie are new.
                if (parseInt(itemID) > lastID) {
                    title = document.getElementById('TITLE_' + itemID);
                    title.style.color = '#aa0000';
                }
            }
        }
    }
    SetCookie('lastID', '" . $newID . "');
}
";
?>

Then I include the dynamically generated JS file in my "static" cached page:

<script language="JavaScript" type="text/javascript" src="/2005/06/10/overriding-caching-limitations-with-javascript/mark_new.php"></script>

and I call it "onLoad":

<body onload="mark_new_posts();">

Obviously, generating the JS file with PHP is not very demanding: the two queries it performs are fast. And of course, the same technique could be based on a session that involves user authentication, etc.
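
For example, a session-based variation would only change how the "last seen" marker is read and stored. This is a rough sketch, not code from the actual site, and the $_SESSION['last_seen_id'] key is hypothetical:

<?php
// Hypothetical sketch: keep the per-user "last seen" post ID in the PHP session
// (e.g. for authenticated users) instead of a cookie.
session_start();

$lastID = isset($_SESSION['last_seen_id']) ? $_SESSION['last_seen_id'] : 0;

$db = DB::connect($DSN);
$res = $db->query("SELECT MAX(id) AS M FROM monitor_posts");
$row = $res->fetchRow(DB_FETCHMODE_ASSOC);
$newID = $row['M'];

// ...emit the same mark_new_posts() JavaScript as above, using $lastID...

// Remember the newest ID for this user's next visit; no SetCookie() call needed.
$_SESSION['last_seen_id'] = $newID;
?>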

Don't get stuck on details like why I traverse the whole DOM tree and not... This is a specific example for a specific application, with its own quirks and twists. What I find interesting in this solution is how you can use server-side caching to serve a "generic" HTML page and then use it as a canvas to draw on with a dynamically generated JavaScript file that adds personalization or other features.