Configure the Proxy Server

Learn how to install and configure NGINX with Heroku.

Installing NGINX

NGINX is not a Python package, so Heroku can't infer it from the Pipfile. Instead, we need to tell Heroku explicitly to install it by adding the community NGINX buildpack:

$ heroku buildpacks:add heroku-community/nginx
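To verify the setup, the standard Heroku CLI command `heroku buildpacks` lists the buildpacks attached to the app; the Python buildpack should still appear alongside the new one:

```shell
# List the app's buildpacks; both the Python buildpack and
# heroku-community/nginx should be in the output.
$ heroku buildpacks
```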

NGINX configuration

The easiest way to set up NGINX is to serve files from a directory. However, that's not enough here, because we also need to proxy API requests through to Gunicorn, so we have to customize its configuration file. We'll put our NGINX and Gunicorn configuration files in a new config directory at the top of the Git repository. When requests first come in, they'll go to NGINX, so we'll look at the nginx.conf.erb file first. Here it is, in its entirety:
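A quick sketch of that layout, run from the repository root (the Gunicorn file name is an assumption here; use whatever name the later Gunicorn lesson specifies):

```shell
# Create the config directory at the top of the repo and the two
# configuration files referenced in this lesson.
mkdir -p config
touch config/nginx.conf.erb config/gunicorn.conf.py
```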

daemon off;
worker_processes auto;

events {
  use epoll;
  accept_mutex on;
  worker_connections <%= ENV['NGINX_WORKER_CONNECTIONS'] || 1024 %>;
  multi_accept on;
}

http {
  server_tokens off;
  gzip on;
  gzip_min_length 512;

  log_format l2met 'measure#nginx.service=$request_time request_id=$http_x_request_id';
  access_log <%= ENV['NGINX_ACCESS_LOG_PATH'] || 'logs/nginx/access.log' %> l2met;
  error_log <%= ENV['NGINX_ERROR_LOG_PATH'] || 'logs/nginx/error.log' %>;

  include mime.types;
  default_type application/octet-stream;
  sendfile on;

  # Must read the request body within 5 seconds.
  client_body_timeout 5;

  upstream app_server {
    server unix:/tmp/nginx.socket fail_timeout=0;
  }

  server {
    listen <%= ENV["PORT"] %>;
    server_name _;
    keepalive_timeout 5;

    # Serve the client (static) files.
    location / {
      root /app/client/build/;
      index index.html;
    }

    # Proxy API requests to the app server.
    location /api/ {
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header Host $http_host;
      proxy_redirect off;
      proxy_pass http://app_server;
    }
  }
}

The main thing to recognize is that the file has two top-level blocks: events and http. In the events block, NGINX is tuned for the size and OS of the virtual machine it runs on: epoll is Linux's efficient event-notification mechanism, each worker handles up to 1024 simultaneous connections by default, and worker_processes auto starts one worker per CPU core. A browser typically holds about two connections open, so on a four-core machine, for example, this setup supports roughly (4 × 1024) / 2 ≈ 2000 simultaneous users. That's a lot! ...
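That capacity estimate can be sketched as simple shell arithmetic (the core count here is a hypothetical example; actual Heroku dyno sizes vary):

```shell
# Estimate concurrent users: one worker per core (worker_processes auto),
# 1024 connections per worker (the NGINX_WORKER_CONNECTIONS default above),
# and roughly two connections per browser.
cores=4
conns_per_worker=1024
echo $(( cores * conns_per_worker / 2 ))   # prints 2048, i.e. about 2000 users
```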