These days Middleman is my tool of choice for building web pages.
I've become a big fan of building static web pages. Before Middleman, I was in love with Jekyll, but I ultimately found it too limited.
When some people hear "static" web page, they immediately envision that you're building every single page by hand and that your content probably doesn't change much. Otherwise, you'd surely go mad from all of the hand-updating.
If that were the case, I would hate that.
But these days, "static" web pages can be so much more.
Would you ever guess that Cyclocane, a frequently updated tropical cyclone tracker, was actually a static page? I'm guessing not. And to be clear, I'm not talking about it being served as static pages by some caching backend. It's full-blown static pages served off of Amazon S3.
To reveal some of the "magic" here, it's just that the Cyclocane site gets rebuilt with the latest data on a regular basis. I'll talk about that more in a bit.
Cyclocane works well as a static page because, though the data is dynamic in nature, the updates are controlled by the server, not the user. What data needs to be retrieved/kept is known in advance.
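To make that concrete, here's a minimal sketch of the core idea behind a data-driven static build: templates get rendered against data files at build time, so visitors only ever receive finished HTML. This is not Cyclocane's actual code, and the storm data here is made up; in a real Middleman project the YAML would live under `data/` and the framework would handle the rendering, but the mechanics look roughly like this:

```ruby
require "yaml"
require "erb"

# Hypothetical storm data -- in a real Middleman project this would live
# in a file like data/storms.yml and be refreshed by the update script.
raw = <<~YML
  - name: Katrina
    wind_mph: 175
  - name: Sandy
    wind_mph: 115
YML

# One template, rendered once per storm at build time.
template = ERB.new(<<~HTML)
  <h1><%= storm["name"] %></h1>
  <p>Winds: <%= storm["wind_mph"] %> mph</p>
HTML

pages = YAML.safe_load(raw).map do |storm|
  template.result_with_hash(storm: storm)
end

puts pages.first
```

Because all of the rendering happens at build time, a fresh build against updated data files is what makes the site feel "dynamic" even though the server only ever hands out plain HTML.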
So let me give you a different example. wickedwx uses data from the Storm Prediction Center and the National Weather Service to give the current severe weather outlook and things like watches and warnings. Though it was written using Sinatra, this site could've easily been written in Middleman (and in fact I wish it had been), because all of the data sources are known in advance and the update schedule is controlled by the server.
As a counterpoint, let's picture any given weather forecast site. Static would be inappropriate and/or impossible for this, because you can't know who your users are going to be and when they're going to show up and want to know their forecast.
Now, there are some scenarios I can think of where a static forecast site would somewhat work out.
First is the one you definitely don't want to do (or at least I think it would be a bad idea). Let's say you personally had all of the forecast data for the US already on your servers (as opposed to needing to talk to a weather service's API for it). You could consider pregenerating forecast pages for every single possible location in the United States. This would be overkill, and the time spent regenerating every single one of those pages as the data updated likely wouldn't be worth the trade-off of being able to serve a completely static page.
Alternately, if that API source happens to be you (as opposed to some third party), you'd have the option of architecting your application such that the dynamic components lived on something, um, dynamic, and that your frontend application was served through something like Middleman.
Now, I have seen some workarounds discussed for the related problem of search engines not being able to crawl client-side apps. I don't have any links handy, but I believe one method involved redirecting all search engine bots to a separate server that used PhantomJS to generate traditional scrapable pages. I have to figure that this is going to be easier to solve at some point in the future, because "single page apps" seem to be getting more and more popular. Either search engines will get smarter or someone will figure out how to wrap this up in a nice easy-to-implement package.
So given that there are potential downsides to designing around a static site, why would you want to build one?
Until Cyclocane, my Amazon S3 bill was crazy cheap. We're talking pennies per month here for each site hosted on it. And in the aftermath of Cyclocane, my Amazon S3 bill is still low enough that it's not breaking the bank (we're talking a latte a month here).
With S3, unless you end up with a crazy successful site, your domain costs (the $10/year) are quite possibly going to be way more than your hosting bill ever will be.
I also use nearlyfreespeech for a few sites. NFS and S3 are both good, but S3 is cheaper.
You can also use Heroku basically entirely for free, but I don't recommend them for this because of the spin-up time (or whatever it's called).
As I understand it, one of the ways that services like Heroku can offer services so cheaply is that they do things like "disable" your app when it's been idle for too long (i.e., you haven't received any traffic). Okay, so I haven't ever officially clocked it, but it often feels like it takes at least 5-10 seconds for Heroku to respond to the first request after an idle period, though subsequent requests will be super fast. "Pinging" web monitoring services are supposed to be a workaround for making sure that your app never idles, but I personally didn't find success with that method on the site I tried it with, and I think the pennies per month you'd spend on S3 are worth it to avoid using Heroku for something it wasn't really intended for.
Okay, so speed wouldn't actually be a given if you're running on some host with a lot of network latency, but static is about as fast as you can get. I will admit, a caching system serving from memory (like memcached) may actually serve web pages faster (I've never looked into benchmarks), but if you go that route, you've just driven up your costs and complexity.
Static pages just are. There's nothing for the server to do other than to just serve them.
You're also fairly protected against traffic spikes. Now, I'm sure there's some upper limit with S3 where things would start to slow down once you got enough traffic, but I've personally never seen it. At that point, I'd assume you'd deploy your site to multiple regions (or multiple static hosts), and then something as simple as round-robin DNS might be all you'd need rather than a full-blown load balancer. Point is, you can get pretty successful before you have to start worrying about scalability issues.
You can also feel free to write some code without worrying about how inefficient it is, because the user will never know that it takes a god-awful amount of time to build your site; they'll just know that your site loads super quick. (I haven't clocked it recently, but Cyclocane is past the minute mark for site build time, possibly even in the 2-3 minute range by now, especially if there are a lot of active storms.)
Now of course, static pages aren't completely secure. If your web host gets compromised or your personal account gets compromised, you're screwed anyway. But the point here is that your application itself will never be the cause of a security hole, because you can't hack a plain HTML page.
This also means that, unlike something like WordPress, you can upload it and forget about it, if you so choose. It will be just as vulnerable a year from now as it is today. For someone like me who likes to launch a lot of tiny experiments out into the world, this is a blessing, because my tiny project isn't going to somehow bite me in the arse later on.
As I mentioned before, I was in love with Jekyll prior to Middleman. Jekyll was my hammer.
But eventually I outgrew Jekyll and started looking for something else (I'm fairly certain I went searching for a new solution, rather than just happening to hear of Middleman, but anyway).
The problems for me were these: Jekyll had no built-in Haml support, and everything had to be forced into its blog-post model.
Granted, there were forks of Jekyll that would give you Haml support, but for me, that wasn't good enough, because there's no guarantee that a fork is going to stay updated and compatible with the official release. I'd way rather see the support baked in, or at least officially supported through some sort of add-on or plugin system.
As for the second problem, I abused the concept of blog posts a lot for the Jekyll sites I built. For instance, in a gallery of Jekyll sites I built, the time a page was last updated was used as its "post" time, so that the most recently updated sites were shown first (side note: that project got abandoned since no one was interested). But there were plenty of cases where chronological sorting just didn't make any sense. It started to feel unnatural.
With Middleman, I get access to any template type that the tilt gem supports. I can go crazy with the Haml and Slim if I want to.
For me, the faves would probably be Haml and Slim.
In short, I get to build sites using modern tools that make the work much more pleasant, while forgoing the overhead of a traditional dynamic site.
I'm gonna save the story of how Cyclocane updates for another blog post (as a spoiler, it's nothing fancier than a shell script that updates yml files, runs a middleman build, and then uses s3cmd to push the updated site to S3).
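That said, the shape of such a pipeline is simple enough to sketch. The Ruby below is a hypothetical dry-run stand-in, not the real script (the helper script name, paths, and bucket are made up), but the three steps match the ones just described:

```ruby
# Hypothetical rebuild-and-deploy pipeline (dry run). The helper script,
# paths, and bucket name are assumptions for illustration only.
STEPS = [
  "ruby fetch_storm_data.rb",                                # refresh the yml files under data/
  "bundle exec middleman build",                             # regenerate the whole site into build/
  "s3cmd sync --delete-removed build/ s3://example-bucket/", # push only changed files to S3
].freeze

DRY_RUN = true # flip to false to actually shell out

STEPS.each do |cmd|
  if DRY_RUN
    puts "would run: #{cmd}"
  else
    system(cmd) or abort "#{cmd} failed" # stop the pipeline on any failure
  end
end
```

Cron (or any scheduler) runs something like this on whatever interval the data demands; since `s3cmd sync` only uploads files that changed, frequent rebuilds stay cheap.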
I can't say whether Middleman is right for you, but I can tell you this: given the cost of hosting (or virtual lack thereof) and the lack of security worries, Middleman has enabled me to build and deploy a whole bunch of sites that I otherwise would've perpetually put on the back burner because they weren't worth the worry and expense.
Thank you Middleman!