How to Hook Varnish Cache Up to Nginx on CentOS 8 or AlmaLinux 8
If you’re running a website that’s been getting a steady stream of visitors and you want to keep the load on your server low, this is the quick‑and‑dirty guide for installing Varnish Cache in front of Nginx on CentOS 8 (or its close cousin AlmaLinux 8). You’ll see the commands, the little quirks that often trip people up, and why each step matters.
Why Bother with Varnish?
Varnish sits between your users and Nginx, caching static and dynamic content so that most requests are served straight from memory instead of hitting the disk or spinning up PHP every time. I’ve seen the difference firsthand: a sudden traffic spike sent our server into the red because every single request had to be processed by Nginx and then by FastCGI. With Varnish in front, the same kind of spike just serves cached pages until the cache expires.
Step 1 – Install the Repos (EPEL & Remi)
Varnish isn’t in the base CentOS repo, so first bring in EPEL (Extra Packages for Enterprise Linux) and the Remi repository, which hosts the latest Varnish packages:
sudo dnf install -y epel-release
sudo dnf install -y https://rpms.remirepo.net/enterprise/remi-release-8.rpm
The Varnish build in the stock CentOS 8 repos lags behind upstream and is missing plenty of bug fixes; with the extra repos enabled you can pull in a current Varnish 6.x with full support for modern features.
Step 2 – Install Varnish
With the repos in place, grab Varnish:
sudo dnf install -y varnish
Once installed, check the version to make sure you’re on a recent build:
varnishd -V # => varnishd (varnish-6.5.1 revision ...)
If you see anything older than that, double‑check your repo config.
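If it does look off, the package probably came from the stock repo. dnf can tell you which repository supplied (or would supply) the package:
dnf info varnish
# Check the "Version" and "From repo" / "Repository" fields in the output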
Step 3 – Stop Nginx Temporarily
We’ll shuffle the ports so that Varnish listens on port 80 and hands traffic off to Nginx on a different port (commonly 8080). First, stop Nginx:
sudo systemctl stop nginx
Don’t worry, we’ll bring it back up later.
Step 4 – Configure Nginx for the Backend
Edit /etc/nginx/conf.d/varnish.conf (or whatever site file you’re using) and point it to the new port:
server {
    listen 8080;
    server_name example.com;
    # … your usual location blocks …
}
Why 8080? Varnish will swallow port 80, so Nginx needs a different one to avoid conflicts.
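A small optional hardening if Varnish and Nginx share the same machine: bind the backend to the loopback address so nobody can bypass the cache by hitting port 8080 directly. A sketch of that variant of the block above:
server {
    # Only reachable locally; public traffic has to come through Varnish on port 80
    listen 127.0.0.1:8080;
    server_name example.com;
    # … your usual location blocks …
}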
Test the config and bring Nginx back up on its new port:
sudo nginx -t && sudo systemctl restart nginx
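Before wiring in Varnish, it’s worth confirming that Nginx actually answers on the new port (example.com is the placeholder from the server block above):
curl -I -H "Host: example.com" http://127.0.0.1:8080
# => HTTP/1.1 200 OK with a Server: nginx header, served directly by Nginx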
Step 5 – Edit Varnish Default Backend
Varnish defines its backend in /etc/varnish/default.vcl. Make sure the default backend points at the port Nginx is now listening on:
sudo vi /etc/varnish/default.vcl
Locate the backend default block and update it:
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
If the backend port doesn’t match the port Nginx is actually listening on, Varnish has nothing to talk to and your site will be down.
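A cheap way to catch typos before they take the site down is to have varnishd compile the VCL without actually starting a server:
sudo varnishd -C -f /etc/varnish/default.vcl
# Dumps the generated C code on success; prints the offending VCL line on a syntax error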
Step 6 – Enable and Start Varnish
Enable the service so it starts on boot:
sudo systemctl enable varnish
sudo systemctl start varnish
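One quirk that trips people up: the packaged varnish.service on these distros typically starts varnishd on port 6081, not 80. If the check below shows 6081, override the unit instead of editing the packaged file (the -s malloc,256m below is just the usual packaged default, adjust to taste):
sudo systemctl edit --full varnish
# In the editor, change the -a option on the ExecStart line, e.g.:
#   ExecStart=/usr/sbin/varnishd -a :80 -f /etc/varnish/default.vcl -s malloc,256m
sudo systemctl daemon-reload
sudo systemctl restart varnish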
Check that it’s listening on port 80:
sudo ss -tulnp | grep ':80 ' # => LISTEN 0.0.0.0:80 ... users:(("varnishd",...))
If something is listening on port 80 but it isn’t Varnish, another service (usually Nginx) is still holding the port; if nothing is listening at all, check systemctl status varnish for startup errors.
Step 7 – Tune the Cache (Optional, but Worth It)
Edit /etc/varnish/default.vcl again if you want to tweak caching policies. A common tweak is to make Varnish bypass caching for logged‑in users:
sub vcl_recv {
    # Bypass the cache whenever a session cookie is present (the cookie name depends on your app)
    if (req.http.Cookie ~ "sessionid") {
        return (pass);
    }
}
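The opposite tweak is just as common: strip cookies from requests for static assets so they stay cacheable even when a session cookie is set. A minimal sketch (the extension list is only an example, trim it to fit your site):
sub vcl_recv {
    # Cookies on static files usually just defeat caching; drop them
    if (req.url ~ "\.(css|js|png|jpe?g|gif|svg|ico|woff2?)$") {
        unset req.http.Cookie;
    }
}
Multiple vcl_recv blocks run in the order they appear in the file, so put this rule above the return (pass) rule if you want static files cached for logged-in visitors too.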
After editing, reload the Varnish config:
sudo varnishreload
Step 8 – Test Everything
Open a browser and go to your site. Use curl -I http://example.com to see if you’re getting a 200 OK and check the headers for X-Varnish. If you see that header, Varnish is doing its job.
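For reference, a healthy response looks roughly like this (the values are illustrative; two IDs in X-Varnish mean the object came from cache, and Age climbs on repeat requests):
curl -I http://example.com
# HTTP/1.1 200 OK
# Server: nginx
# X-Varnish: 32773 32771
# Age: 12
# Via: 1.1 varnish (Varnish/6.5)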
Also run:
varnishstat
to monitor cache hits vs. misses in real time. A healthy site will show a high hit ratio after a few minutes of traffic.
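If you just want the raw numbers rather than the live screen, one-shot mode can filter for the hit and miss counters (the figures here are made up for illustration):
varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss
# MAIN.cache_hit     15230   ...   Cache hits
# MAIN.cache_miss      841   ...   Cache misses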
Common Pitfalls
- Firewall blocking port 80 – On CentOS/AlmaLinux, firewalld may need an extra rule: sudo firewall-cmd --permanent --add-port=80/tcp && sudo firewall-cmd --reload.
- SELinux context errors – If you’re running SELinux in enforcing mode, it may block varnishd from binding to port 80. Either test in permissive mode (sudo setenforce 0) to confirm it’s the culprit, or build a local policy module from the logged denials; see the commands after this list.
- Nginx still on 80 – Double‑check that you didn’t forget to change the listen directive; otherwise Nginx will steal the port and Varnish never starts.
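For the SELinux case, one general recipe, assuming the denials already show up in the audit log (varnishlocal is just an arbitrary name for the local module):
# Temporarily confirm SELinux is the problem
sudo setenforce 0
# If the site comes up, build and load a policy module from the recorded denials
sudo ausearch -m avc -ts recent | audit2allow -M varnishlocal
sudo semodule -i varnishlocal.pp
# Turn enforcing back on
sudo setenforce 1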
Wrap‑up
That’s it. You now have a Varnish front‑end serving cached pages, while your Nginx instance does the heavy lifting behind the scenes on port 8080. Expect lower CPU usage, fewer disk reads, and happier users during traffic spikes.