This post appeared originally in our sysadvent series and has been moved here following the discontinuation of the sysadvent microsite.
While Varnish is most famous for its speedy caching capabilities, it is also a general Swiss army knife of web serving. In the spirit of Christmas, here’s Twelve Days of Varnish Cache, or at least, twelve use cases.
And no, none of these examples are proofread or tested. They are only meant as a taste of what Varnish can do. Quality assurance is left as an exercise for the happy reader.
1. Use Varnish to offload the backend and speed up your website
On the first day of Christmas, my varnish served for me
The fastest webcache you will ever see
Varnish was first and foremost built for accelerating your web site. And it is fast. Very fast. Fast, as in going through hyperspace, powered by the Dark Side of the Force. On steroids.
Basic configuration of caching is simple. Install varnish. Configure a backend. HTTP objects with proper caching headers will be stored in the cache, and later requests for the same object will be served from memory. To force caching of specific objects, return hash from vcl_recv. To override cache headers, do so in the backend response. By default, Varnish won’t cache objects with cookies, so when overriding, remember to strip those.
Note: Do not blindly cut and paste the code here. Removing cookies may remove functionality, so be sure to know why and where you introduce caching without cookies.
sub vcl_recv {
if (req.url ~ "^/always/cache/this/image.jpg") {
unset req.http.Cookie;
# return(hash); # Explicitly forcing the cache lookup is actually seldom necessary
}
}
sub vcl_backend_response {
# Strip cookies for static files and set a long cache expiry time.
if (bereq.url ~ "^/always/cache/this/image.jpg") {
unset beresp.http.set-cookie;
set beresp.ttl = 12h;
}
}
Good caching will offload your backends, and make your website a lot faster. So you might end up saving money on hardware and gaining better customer experience. Profit!
There is a lot more to say about cookies and caching. Read, for example, Dridi’s blog post on the subject.
2. Use Varnish as a Layer 7 router
On the second day of Christmas my varnish served for me
Two strange paths
Varnish’s configuration language is quite powerful, making it possible to build a Layer 7 HTTP router that bases its routing decisions on almost anything in the HTTP object.
sub vcl_recv {
set req.backend_hint = default;
if (req.url ~ "^/safe/oldschool/java/app") {
set req.backend_hint = java;
}
if (req.http.cookie ~ "node_js_app_cookie") {
set req.backend_hint = nodejs;
}
if (req.http.User-Agent ~ "(?i)iphone") {
set req.backend_hint = iphone_gets_the_special_treatment;
}
if (req.http.Accept-Language ~ "no") {
set req.backend_hint = norwegians_go_home;
}
}
3. Use Varnish as a Layer 7 firewall
On the third day of Christmas my varnish served for me
Three admins
Protect your /admin area? Easy! Use access lists! And while we’re at it, we might even throw in (static) username and password authentication. (The auth string below is the base64 encoding of username:password.)
acl local_network {
"10.10.2.0/24"; /* (almost) all of this network */
! "10.10.2.42"; /* we don't trust this guy */
}
acl fixed_ip_admins {
"87.238.42.0/23";
"that.nice.guy.example.com";
}
sub vcl_recv {
if (req.url ~ "^/admin") {
if (client.ip ~ local_network || client.ip ~ fixed_ip_admins) {
if (req.http.Authorization !~ "Basic dXNlcm5hbWU6cGFzc3dvcmQ=" ) {
return (synth(401, "Restricted"));
}
else {
return(pass);
}
}
else {
return(synth(403, "Access denied."));
}
}
}
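As a side note, the auth string above can be generated from the shell, for example like this:
printf 'username:password' | base64
And for browsers to actually pop up a login dialog, the 401 response should also carry a WWW-Authenticate header, which can be added in vcl_synth.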
Remember to make sure we are on HTTPS
import std;
sub vcl_recv {
if (std.port(local.ip) == 80 && req.url ~ "^/admin") {
set req.http.x-redir = "https://" + req.http.host + req.url;
return(synth(301));
}
}
sub vcl_synth {
if (resp.status == 301) {
set resp.http.Location = req.http.x-redir;
return (deliver);
}
}
4. Use Varnish as a Load balancer
On the fourth day of Christmas my varnish served for me
Four shiny backs
There are several popular load balancers in the Open Source world. HAProxy, Nginx, and even Apache may do the job.
Varnish does load balancing as well as any other web proxy, with directors for (weighted) round robin, random, fallback (first healthy), hash, and more. Of course, it also does backend health checks. So there is no need to add yet another balancer to the stack.
This example is pasted directly from the Varnish user’s guide. It defines a simple round robin director of two backends with health checks:
backend server1 {
.host = "server1.example.com";
.port = "8080";
.probe = {
.url = "/";
.timeout = 1s;
.interval = 5s;
.window = 5;
.threshold = 3;
}
}
backend server2 {
.host = "server2.example.com";
.port = "8090";
.probe = {
.url = "/";
.timeout = 1s;
.interval = 5s;
.window = 5;
.threshold = 3;
}
}
import directors;
sub vcl_init {
new myroundrobin = directors.round_robin();
myroundrobin.add_backend(server1);
myroundrobin.add_backend(server2);
}
sub vcl_recv {
if (req.url ~ "^/this/goes/to/the/balancer") {
set req.backend_hint = myroundrobin.backend();
}
}
Now you can use the director myroundrobin as a backend. Unhealthy backends will not get traffic. To see health status, try from the shell,
varnishadm backend.list
5. Use Varnish as a Web proxy with SSL termination
On the fifth day of Christmas my varnish served for me
Five valid certs
We have just seen that varnish can be used as a load balancer, so it is a proxy server, but one without SSL support.
As you would expect, the Internet overflows with tips on how to fix that. Many of them advise you to add another web server in front of varnish to terminate SSL. The most common variant seems to be Nginx.
Now, varnish has its own SSL terminator. It’s called hitch! What makes hitch different from other web servers with SSL support? Hitch is not a web server! Its sole purpose is to terminate SSL. So while Nginx and other web servers run a completely unnecessary full HTTP session in front of varnish, hitch just encrypts and decrypts the traffic and hands the rest over to varnish.
Hitch is a modern and extremely capable bit of software. It supports current versions of TLS, and does SNI, OCSP stapling, and more. Given enough hardware, it will handle tens of thousands of concurrent connections without breaking a sweat. Setting up a basic hitch instance is very simple. Install the package. Add a certificate to the configuration and start hitch. The default configuration uses 4 CPU cores.
yum install hitch
echo 'pem-file = "/etc/pki/tls/private/default.example.com.pem"' \
>> /etc/hitch/hitch.conf
systemctl start hitch
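For reference, a minimal /etc/hitch/hitch.conf along the lines of the packaged default might look roughly like this (paths and addresses are examples):
# Listen for TLS on all addresses, port 443
frontend = "[*]:443"
# Hand decrypted traffic to varnish, speaking the PROXY protocol
backend = "[127.0.0.1]:6086"
write-proxy-v2 = on
pem-file = "/etc/pki/tls/private/default.example.com.pem"
workers = 4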
The default install of hitch sends traffic to a varnish server at port 6086, using the proxy protocol. This means you should either change hitch to use the varnish HTTP port, or add a proxy listener to varnish, by adding ‘-a :6086,PROXY’ to the varnishd command line. For the standard systemd service, this may work:
sed 's/-a /-a :6086,PROXY -a /;' /lib/systemd/system/varnish.service \
> /etc/systemd/system/varnish.service
systemctl daemon-reload
Note that varnish listening on port 6086 is not part of the standard varnish SELinux setup, so if you have SELinux enabled, you may need to add that port:
semanage port -a -t varnishd_port_t -p tcp 6086
6. Use Varnish as a Paywall
On the sixth day of Christmas my varnish served for me
Six articles hiding
Varnish Software’s product Varnish Plus, the proprietary sister of the Open Source Varnish Cache, makes an excellent paywall, with a large customer base and a proven track record. While it is probably possible to build a paywall using Varnish Cache and open source vmods, I would advise customers to consider using Varnish Plus. Their paywall includes edge side authorization, that is, an authenticated user may get protected content served from the varnish cache without hitting any backend or auth servers. I know of no other caching server that provides this functionality.
7. Use Varnish to build a CDN
On the seventh day of Christmas my varnish served for me
Seven servers serving
With varnish on your web origin, a-caching and a-routing and a-walling for your local users’ delight, would it not be great to give the same experience to the users over the Hill, and across the Water?
Varnish servers also work well in a chain. Put local varnish cache servers close to your users, even in the cloud, set the origin server(s) as backend, use a GeoDNS service to point your users to the closest server, and off you go. Boom! You built your own CDN with off-the-shelf parts! Bonus commercial: If you use the already mentioned Varnish Plus paywall, users will get protected content delivered from their closest server as well.
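A minimal sketch of the VCL on such an edge node: it simply points at the origin as its backend (the hostname here is just an example), and all the caching tricks from day one apply unchanged on the edge.
vcl 4.0;
backend origin {
# The central origin server, reached over the public Internet
.host = "origin.example.com";
.port = "80";
}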
8. Use Varnish to clean up and normalize
On the eighth day of Christmas my varnish served for me
Eight parsers cleaning
Different web browsers and other clients do a lot of strange things while fetching content. Varnish may help in cleaning up all the mess before sending requests, clean and proper, on to your backend servers.
Here are a few examples, snipped from Varnish’s own wiki:
Use pure regex power to simplify headers
sub vcl_recv {
if (req.http.Accept-Language) {
set req.http.Accept-Language = regsub(req.http.Accept-Language, "(^[^,;]+)[,;]*.*", "\1");
}
}
Clean noise out of headers and normalize them
sub vcl_recv {
if (req.http.Accept-Language) {
if (req.http.Accept-Language ~ "en") { set req.http.Accept-Language = "en"; }
elsif (req.http.Accept-Language ~ "de") { set req.http.Accept-Language = "de"; }
elsif (req.http.Accept-Language ~ "fr") { set req.http.Accept-Language = "fr"; }
else {
# unknown language. Remove the accept-language header and
# use the backend default.
unset req.http.Accept-Language;
}
}
}
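And normalize the Accept-Encoding header, to keep the number of cache variants down: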
sub vcl_recv {
if (req.http.Accept-Encoding) {
if (req.url ~ "\.(jpg|png|gif|gz|tgz|bz2|tbz|mp3|ogg)") {
# No point in compressing these
unset req.http.Accept-Encoding;
}
elsif (req.http.Accept-Encoding ~ "gzip") { set req.http.Accept-Encoding = "gzip"; }
elsif (req.http.Accept-Encoding ~ "deflate") { set req.http.Accept-Encoding = "deflate"; }
else {
# unknown algorithm
unset req.http.Accept-Encoding;
}
}
}
9. Use Varnish to keep your site up
On the ninth day of Christmas my varnish served for me
Nine DC fallouts
There is not much point in building a rock solid cache in front of your app servers if all it shows is a 503 when a backend goes down. Rather, we want to serve static, and even stale, content while our dear devops boys and girls work frenetically to get the apps up again.
Guess what! Varnish helps us again. Grace is built exactly to do this, that is, hide backend problems. For most problems of this kind, Grace is enough.
In simplified terms, grace makes varnish prefer a fresh object, but if one isn’t present, a stale one will be served instead. So we serve the stale object from cache while an updated one is fetched in the background.
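A minimal sketch for Varnish 4 and newer: simply give objects a grace period in vcl_backend_response.
sub vcl_backend_response {
# Keep objects up to one hour past their TTL, so a stale copy
# can be served while a fresh one is fetched, or while the
# backend is down.
set beresp.grace = 1h;
}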
The implementation of saint mode moved from core functionality to a vmod in varnish-4.0, and the grace configuration changed as well. More details and configuration examples may be found in these two blog posts by Varnish Software:
Configure saint mode & grace in Varnish 4.1
Grace in Varnish 4
10. Use Varnish to fix that problem
On the tenth day of Christmas my varnish served for me
Ten backend fixes
We have all seen them. Stupid bugs. Exposed security problems. That ugly CVE report. While we wait for the developers to come up with a fix, we need to plug that hole or fix that app. Varnish helps again. Here are examples for inspiration:
sub vcl_recv {
# The infamous shellshock
if ( req.http.User-Agent ~ "[(][^)]*[)][^{]*[{][^;]*[;][^}]*[}][^;]*[;]" ) {
return( synth( 418, "I'm a teapot" ));
}
}
sub vcl_deliver {
# Force that special header for my app (response headers live in vcl_deliver)
if (req.http.host ~ "^thatapp\." && !resp.http.X-UA-Compatible) {
set resp.http.X-UA-Compatible = "IE=Edge";
}
}
sub vcl_backend_response {
# Force compression in case the backend server forgot to
if ( beresp.http.content-type ~ "^(text|application/x-javascript|application/javascript|application/json)") {
set beresp.do_gzip = true;
}
# Remove that extra debug header from the app before it reaches clients
unset beresp.http.X-LocalDebugHeader;
# The app sends duplicated or garbled date headers. Set it anew
unset beresp.http.Date;
set beresp.http.Date = now;
}
sub vcl_deliver {
# Make this server more anonymous. Do this in vcl_deliver, since headers
# like X-Varnish and Age are added by varnish itself on the way out.
unset resp.http.Via;
unset resp.http.X-Powered-By;
unset resp.http.X-Varnish;
unset resp.http.Age;
unset resp.http.Server;
unset resp.http.X-Any-other-header-here;
set resp.http.Server = "World version 42";
set resp.http.X-Powered-By = "pure electricity";
}
For a real world example, see Are’s hack that overrides a 302 using grace: Varnish and misbehaving application servers
11. Use Varnish to add IPv6 to your web app
On the eleventh day of Christmas my varnish served for me
Eleven too few IPs
When the Internet was invented and standardized, 4.3 billion addresses probably sounded like more than enough for all thinkable usage. Now we know otherwise, and IPv6 is here to save us. That is, as long as all your web servers support IPv6. They probably don’t, yet. So what do you do? Put Varnish in front, of course!
Varnish was built for the IPv6 age, and has supported it since the project started, speaking both IPv4 and IPv6 to both backends and clients. So put your IPv4 only backend on RFC 1918 networks, use Varnish, and save those precious IPv4 addresses where you really need them.
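A minimal sketch of that setup, with example addresses: let varnishd listen on both address families on the public side, while the backend definition keeps pointing at the IPv4-only app server on its private network.
# Listen on both IPv4 and IPv6 (varnishd command line):
#   varnishd -a 0.0.0.0:80 -a [::]:80 -f /etc/varnish/default.vcl
vcl 4.0;
backend app {
# IPv4-only app server on an RFC 1918 network
.host = "10.0.0.10";
.port = "8080";
}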
12. Use Varnish to fence off a DDoS
On the twelfth day of Christmas my varnish served for me
Twelve million requests
Somebody in the customer’s organization went on record and was loud and clear about a subject that sounded harsh in the ears of the Open Internet Kindergarten of the Anonymous legion. Now they are attacking your website with a DDoS, and they have actually torn down all the app-server backends. What to do?
Fencing off a DDoS is not necessarily a simple task, but you may be lucky. Analyze the traffic using tools like varnishncsa or varnishlog. If it is possible to single out the attackers, fencing them off may be possible.
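For instance, varnishtop (which ships with varnish) gives a live, ranked view of log records; ranking the requested URLs often makes an attack pattern obvious:
varnishtop -i ReqURL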
sub vcl_recv {
if ( req.url ~ "/5228/take/down/service" || # The attackers' request is against a specific url
req.method == "POST" || # The attackers' request uses a specific method
req.http.User-Agent ~ "W34r3leg10n" || # The attackers' request matches a specific user-agent
req.http.referer ~ "3vils1te.com" # The attackers are linked from a specific site
) { return (synth(509, "Bandwidth exceeded")); }
}
Serving a static status code is very lightweight, and with a bit of luck, varnish should be able to handle this kind of traffic, as long as you have enough bandwidth available. Then at least, the problem is at the network layer, and no longer hammering your app servers.
Wrap up
Varnish is more than just a super-fast web cache. It is also a general purpose programmable Swiss army knife for your web services. The examples given in this post are in no way exhaustive.
Now fall in while we sing the last verse together:
On the twelfth day of Christmas my varnish served for me
Twelve million requests
Eleven too few IPs
Ten backend fixes
Nine DC fallouts
Eight parsers cleaning
Seven servers serving
Six articles hiding
Five valid certs
Four shiny backs
Three admins
Two strange paths
And the fastest web-cache you will ever see
Twelve bonus points to the readers who found the Tolkien reference.