Varnish Fixes Everything: Practical Approaches in the Varnish Configuration Language

Everyone knows that the unofficial motto of Drupal is, "There's a module for that!" But everyone also knows that the fastest way to having the slowest site is to install every module under the sun. Still, when the client ranks a piece of functionality on their "must-have" list, and the only way to accomplish it involves installing a complicated module or writing your own, what other option do you have, even if it means your site is slow as molasses?
Enter Varnish, the reverse proxy wunderkind. Varnish lives near the top of the Web caching stack, intercepting every request and transforming it in arbitrary ways. At the end of this transformation pipeline, the request can be fulfilled by any of a configurable set of backends, by a cached copy of a previous backend response to a similar query, or even by a synthetic response created within Varnish itself! As a practical matter, this translates to the ability to efficiently load-balance without a dedicated load balancer, run staging and development side by side, filter DoS attacks, and speed up your site to an extent unmatched by anything you've worked with before—even static HTML pages served by nginx aren't as fast as a Drupal home page served by Varnish.
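To make the load-balancing claim concrete, here is a minimal VCL sketch of round-robin balancing across two backends, using the directors vmod and Varnish 4+ syntax. The hostnames and ports are placeholders for illustration, not a production configuration:

```vcl
vcl 4.0;
import directors;

# Two example application servers (placeholder addresses)
backend web1 { .host = "192.0.2.10"; .port = "8080"; }
backend web2 { .host = "192.0.2.11"; .port = "8080"; }

sub vcl_init {
    # Create a round-robin director and register both backends
    new lb = directors.round_robin();
    lb.add_backend(web1);
    lb.add_backend(web2);
}

sub vcl_recv {
    # Every request is handed to the director, which picks a backend in turn
    set req.backend_hint = lb.backend();
}
```

The same director mechanism supports other policies (random, fallback), which is how staging and production backends can coexist behind one Varnish instance.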
We'll start by examining what an HTTP request looks like under the hood, with a quick primer on relevant underpinnings using the OSI Model. Next, we'll consider how a request flows through Varnish from connection to completion, and the various pieces of a VCL (Varnish Configuration Language) program and what they do. Then, we'll dive deep into some real-world examples and look at both simple and not-so-simple solutions and "gotchas". Though this is intended as a beginner-level session, if there's interest or extra time, we'll finish by touching on just how powerful VCL can truly get, by considering the possibilities inherent in synthetic response creation, ESI, and C extensions.
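As a taste of the request-flow and synthetic-response topics above, the following sketch (again assuming Varnish 4+ syntax; the bot pattern and asset regex are illustrative assumptions) shows two of the hooks a request passes through: vcl_recv, where you can short-circuit to a synthetic response or normalize the request, and vcl_synth, where that response body is built entirely inside Varnish:

```vcl
sub vcl_recv {
    # Answer unwanted clients from Varnish itself -- no backend is contacted
    if (req.http.User-Agent ~ "badbot") {
        return (synth(403, "Forbidden"));
    }
    # Strip cookies on static assets so they become cacheable
    if (req.url ~ "\.(css|js|png|jpg|gif)$") {
        unset req.http.Cookie;
    }
}

sub vcl_synth {
    # Build the synthetic response body in-process
    synthetic("<html><body><h1>" + resp.reason + "</h1></body></html>");
    set resp.http.Content-Type = "text/html; charset=utf-8";
    return (deliver);
}
```

This is the simplest case of synthetic response creation; the session will show how the same hooks combine with caching and ESI for more elaborate behavior.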
