Posts

Showing posts from 2009

Fun with Layars

Last night I installed Layar on my phone, and had some fun checking out the Twitter and Wikipedia layers. So I signed up for an API key, and 30 seconds later saw a tweet mentioning the California Data Camp. Perfect! After a rare and blissful sleep-in, I wandered over to see what was going on at Citizen Space, thinking I'd try to get a proof-of-concept demo showing some City of SF data in Layar. Turns out, despite a number of interesting conversations taking precedence over my coding, I managed to get a simple demo working, and even win an Honorary Mention (and an iPod touch) for my efforts. And a couple of Layars (crime data and handicapped parking spaces) are just waiting for publishing approval from Layar, and will hopefully be available in a few hours. Just search for "datasf" in your Layar app. Since GeoDjango was the reason I was able to get a mockup going so quickly, I thought I'd just write a few short notes on the steps I took…
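To give a rough feel for the kind of endpoint a Layar layer needs behind it, here's a minimal sketch of a getPOIs-style handler in plain Python. The hotspot field names, the lat/lon-in-microdegrees convention, and the `"datasf"` layer name are assumptions about the early Layar API, not the actual code from the demo (which used GeoDjango's spatial queries rather than a hand-rolled distance filter):

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def get_pois(user_lat, user_lon, radius_m, points):
    """Build a Layar-style getPOIs response for points within radius_m.

    `points` is a list of (id, title, lat, lon) tuples. The response
    shape here (hotspots list, microdegree integers, errorCode) is an
    assumption about the early Layar API format.
    """
    hotspots = []
    for pid, title, lat, lon in points:
        d = haversine_m(user_lat, user_lon, lat, lon)
        if d <= radius_m:
            hotspots.append({
                "id": pid,
                "title": title,
                "lat": int(lat * 1e6),   # microdegrees, per the assumed format
                "lon": int(lon * 1e6),
                "distance": round(d),
            })
    return {"layer": "datasf", "hotspots": hotspots,
            "errorCode": 0, "errorString": "ok"}
```

In the real thing, the distance filter would be a single GeoDjango queryset call against a PostGIS-backed model, which is exactly why the mockup came together so quickly.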

Tiling Kibera

The upcoming Map Kibera project acquired some imagery recently, and I got ahold of it yesterday to set up a quick TileCache preview. There have actually been quite a few requests here recently for getting some tiles up quickly from various sets of source imagery, so I thought I'd write a few blog posts on some different ways to go about it. First, I'm assuming the end user will be requesting tiles, and that these tiles will be projected in Spherical Mercator for viewing on the web in a client like OpenLayers or Google Maps (so I'm skipping over the bits for creating tiles that might be used in a client like Google Earth). With that in mind, there are a few ways to get your tiles. Note that the Kibera imagery is a nice simple example, because the area of the imagery is not that large (about 25 square km), and the source file is only a couple hundred megs as an uncompressed TIF. Option A: Pre-generate all your tiles in advance. The easiest way to generate all your tiles…
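To get a feel for what "pre-generate all your tiles" actually commits you to, here's a small sketch that counts the Google/OSM-style Spherical Mercator tiles covering a bounding box across a range of zoom levels, using the standard slippy-map tile math. The Kibera coordinates in the usage note below are illustrative guesses, not the actual imagery footprint:

```python
import math


def lonlat_to_tile(lon, lat, zoom):
    """Map a WGS84 point to a Google/OSM-style tile (x, y) at a zoom level."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_r) + 1 / math.cos(lat_r)) / math.pi) / 2.0 * n)
    return x, y


def tiles_for_bbox(min_lon, min_lat, max_lon, max_lat, zmin, zmax):
    """Count the tiles needed to pre-render a bounding box across zoom levels."""
    total = 0
    for z in range(zmin, zmax + 1):
        x0, y0 = lonlat_to_tile(min_lon, max_lat, z)  # top-left tile
        x1, y1 = lonlat_to_tile(max_lon, min_lat, z)  # bottom-right tile
        total += (x1 - x0 + 1) * (y1 - y0 + 1)
    return total
```

Something like `tiles_for_bbox(36.77, -1.32, 36.80, -1.30, 10, 18)` (a rough Nairobi-area box) shows why pre-generation is only easy for small areas: the count roughly quadruples with every extra zoom level, which is exactly what makes the ~25 km² Kibera case so tractable.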

Featureserver on AppEngine

AppEngine is awesome. The more we use it, the more we like it. Recently, someone contacted us who needed a site up, in a hurry, to serve up some points on a Google map. The catch was there were about 50k points (so it seemed server-side clustering might be nice). Also, they wanted to be able to serve up at _least_ tens of millions of requests a day. And maybe quite a lot more. Given the scaling requirements, it seemed like AppEngine might be a nice fit, since then we wouldn't have to worry so much about tons of caching, or ensuring clients made similar bounding box requests, and so forth. And as for the posting/getting of points to/from AppEngine, we decided to go for using FeatureServer as a base. If you're not familiar with FeatureServer, a quick overview: it makes it easy to (amongst other things) post/update your features to some datastore, and pull them out with bounding box and/or attribute queries in a variety of vector formats (KML, JSON, WFS, etc). Al…
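Server-side clustering of 50k points can be as simple as snapping points to a coarse grid over the requested bounding box and returning one centroid-plus-count per occupied cell. Here's a minimal sketch of that idea; it's a stand-in illustration, not the clustering scheme the actual site (or FeatureServer) used:

```python
from collections import defaultdict


def grid_cluster(points, min_lon, min_lat, max_lon, max_lat, cols=8, rows=8):
    """Collapse (lon, lat) points into per-cell clusters over a bbox grid.

    Returns a list of (lon, lat, count) tuples: the centroid and size of
    each non-empty cell. Points outside the bbox are ignored, mirroring
    a bounding-box query.
    """
    cells = defaultdict(list)
    for lon, lat in points:
        if not (min_lon <= lon <= max_lon and min_lat <= lat <= max_lat):
            continue
        # Clamp so points on the max edge land in the last cell.
        cx = min(int((lon - min_lon) / (max_lon - min_lon) * cols), cols - 1)
        cy = min(int((lat - min_lat) / (max_lat - min_lat) * rows), rows - 1)
        cells[(cx, cy)].append((lon, lat))
    clusters = []
    for pts in cells.values():
        n = len(pts)
        clusters.append((sum(p[0] for p in pts) / n,
                         sum(p[1] for p in pts) / n, n))
    return clusters
```

The appeal on AppEngine is that a response like this is small and cacheable per bounding box, so tens of millions of requests a day mostly never touch the datastore.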

Walk On

A few months ago, Walkscore.com launched their new API, aimed at real estate sites, academic and other large-scale studies, and anyone else who might want programmatic access to the walkability of a set of locations. Recently, a number of large sites (zillow.com, Estately.com, BaseEstate.com, and ColoProperty.com, amongst many others) have started to incorporate Walk Score in their listings, which has led to the happy challenge of making sure the site can handle the popularity. When the good folks at Walkscore first contacted me about designing the API, we talked about the likelihood we'd be serving many, many millions of daily requests shortly after launch. This quickly led into discussions about what framework we wanted to build it on, and how much IT we were interested in taking on. Eventually, we decided to use Google's AppEngine. Given the current, and quickly growing, popularity of the API, I think this ended up being a great choice - no worries about having to even…