I spend most of my time using Ruby on Rails, but enjoy exploring other web application frameworks. There’s a wonderful diversity of frameworks out there, with more popping up all the time, but I’ve found two challenges in trying to find ones I want to investigate:

1. Keeping track of them all.
2. Knowing which ones are worth looking at. This is far more challenging. When looking at a list of names of frameworks I’ve never heard of, the question that comes to mind is “which ones are people actually using?” Not that popularity is a perfect proxy for how good a framework is, but I do put some faith in what fellow developers are into, and there are definite benefits to having a good-sized community. To judge popularity, I look at GitHub stats for projects hosted there, and failing that I look at how much traffic the framework’s site gets and how many people are linking to it.
After numerous occasions of such exploration, it occurred to me that I may not be the only one who does this sort of thing. I decided to formalize my methods for turning up nuggets of web framework goodness and share the results with others. I was also inspired by The Ruby Toolbox, which beautifully does what I’m talking about for Ruby plugins/gems. But unlike Ruby, where essentially everything interesting is happening on GitHub these days, not all (or even most) web frameworks have their source code hosted on GitHub. This meant employing some different tactics for rating popularity.
The Popularity Contest
Since GitHub stats aren’t universally available, I decided to look at other measures of popularity. Traffic statistics seemed a reasonable approach, but these too weren’t universally available: many web frameworks don’t have their own domain (e.g. Tapestry lives at a subdomain of apache.org), and traffic statistics are typically only available at the domain level. I also considered looking at things like Google search trends or Twitter mentions, but the problem with these is one of specificity. For example, I can be fairly sure that any time I see SproutCore come up on Twitter it’s referring to the framework, but Cappuccino mentions could very well just be talking about the caffeinated beverage. So the third measure I’m employing is the number of inbound links, since these are specific and can be checked against any URL.
So inbound links seem to be the lowest common denominator for measuring popularity, but it would be a shame to neglect the other popularity measures where they’re available. Plus, inbound links are sometimes biased: for instance, many Django apps have a “Powered by Django” link at the bottom, so Django has an enormous number of inbound links. What I decided to do is take whichever of the three popularity measures (GitHub stats, site traffic, inbound links) are available for a framework and average them. But since the raw statistics are on very different scales, I first put each one on a 0–100 scale, and then simply average whichever scores are available. It’s not incredibly sophisticated, but seems to get the job done.
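The scoring scheme above can be sketched in a few lines of Ruby. This is only an illustration: the framework names and raw numbers below are made up, and the post doesn’t specify the exact normalization, so min-max scaling each measure to 0–100 across all frameworks is an assumption here.

```ruby
# Hypothetical raw stats per framework; nil marks a measure that isn't
# available (e.g. no GitHub repo, no domain-level traffic data).
RAW = {
  "Rails"      => { github_stars: 52_000, traffic: 90_000, inbound_links: 400_000 },
  "Tapestry"   => { github_stars: nil,    traffic: nil,    inbound_links: 30_000 },
  "SproutCore" => { github_stars: 4_000,  traffic: 12_000, inbound_links: 50_000 },
}

MEASURES = %i[github_stars traffic inbound_links]

# Min-max scale one measure to 0-100 across all frameworks,
# passing nil through for frameworks missing that measure.
def scaled(measure)
  values = RAW.values.map { |h| h[measure] }.compact
  min, max = values.minmax
  RAW.transform_values do |h|
    raw = h[measure]
    next nil if raw.nil?
    max == min ? 100.0 : (raw - min) * 100.0 / (max - min)
  end
end

# Average whichever of the scaled measures are available per framework.
def scores
  per_measure = MEASURES.to_h { |m| [m, scaled(m)] }
  RAW.keys.to_h do |name|
    available = MEASURES.map { |m| per_measure[m][name] }.compact
    [name, (available.sum / available.size).round(1)]
  end
end

puts scores.sort_by { |_, score| -score }.to_h
```

With these made-up numbers, a framework missing a measure (like Tapestry here) is simply averaged over the measures it does have, rather than being penalized with a zero for the missing ones.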
See the results at:
Currently there are really just three components to the site: ranking charts, individual stats for each framework, and a blog. This is just the first iteration though, so there’s plenty more I can envision adding.