How to configure Docker with NGINX and PHP application


Introduction

At the end of this tutorial, you will have a Docker NGINX container communicating with a PHP application.
(diagram: browser → NGINX container → Container 1 and Container 2)
We are going to build this simple application workflow using a few Docker commands.
In this tutorial, the browser communicates with the NGINX container, which serves HTTP/HTTPS requests, and the NGINX container in turn communicates with Container 1 and Container 2 to respond to the browser.
We are going to create three containers.

Container 1 and Container 2

Both are PHP 7.0 Apache web servers built on the latest Ubuntu base image. These containers are linked with the NGINX container.
Container 1 is configured to display its index page as Server 1, and Container 2 is configured to display its index page as Server 2.

NGINX

NGINX is used as a reverse proxy server for the HTTP, HTTPS, SMTP, IMAP, and POP3 protocols; it is also used for server load balancing and HTTP caching.

Benefits of using NGINX

  • Secures both containers from unauthorised direct access.
  • Acts as a load balancer for better performance: optimising resource utilisation, maximising throughput, and reducing latency. There are three load-balancing methods supported by NGINX:
    • round-robin — requests to the application servers are distributed in a round-robin fashion,
    • least-connected — the next request is assigned to the server with the least number of active connections,
    • IP-hash — a hash function is used to determine which server should be selected for the next request (based on the client’s IP address).
For more information, please see the official NGINX load-balancing documentation.
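As a quick sketch of how these methods are chosen, each one is configured inside an NGINX upstream block; the hostnames app1 and app2 below are placeholders, not names from this project:

```nginx
# Default: round-robin needs no directive.
upstream backend_rr {
    server app1:80;
    server app2:80;
}

# Least-connected: enabled with the least_conn directive.
upstream backend_lc {
    least_conn;
    server app1:80;
    server app2:80;
}

# IP-hash: requests from the same client IP go to the same server.
upstream backend_ih {
    ip_hash;
    server app1:80;
    server app2:80;
}
```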

Pre-Requirements

Before we get started, ensure that Docker and docker-compose are installed on your Docker host machine.
The full source code for this tutorial can be found on GitHub – RohanMohite/Docker-Nginx-PHP

How to build

Go to your Docker host machine and clone the project locally:
git clone https://github.com/RohanMohite/Docker-Nginx-PHP.git
This project uses port 80 by default, so make sure port 80 is free and available.
Go to the project directory and run the following command.
docker-compose up -d
Because we are using docker-compose, a single command builds and runs all containers from the latest base images.
All three containers should now be up and running, and you can access the service on port 80.
Run the following command to check whether the Docker containers are up and running.
docker ps -a
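If you prefer to inspect the stack through docker-compose itself, the following commands (run from the project directory, as a convenience rather than a required step) give a per-project view:

```shell
docker-compose ps        # list only the containers managed by this compose file
docker-compose logs -f   # follow the combined logs of all three containers
docker-compose down      # stop and remove the containers when you are finished
```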
In your browser, open your Docker host IP address on port 80; your application should be running:
 http://your-docker-host-ip-address:80
Page served from Container 1:
(screenshot: Server 1 page)
Page served from Container 2:
(screenshot: Server 2 page)

Docker magic

Open docker-compose.yml file
server1:
  build: server1

server2:
  build: server2

server_nginx:
  build: server_nginx
  ports:
    - "80:80"
  links:
    - server1:server1
    - server2:server2
In docker-compose.yml, only the NGINX container (server_nginx) is published to the host on port 80 (80:80) and linked with server1 and server2.
The other two containers (server1 and server2) do not expose any ports to the host, so only the NGINX container can communicate with server1 and server2.
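As a side note, links is a legacy Compose feature; on modern Compose versions the same isolation can be sketched with the default project network, over which services reach each other by service name. This is an assumption-labelled alternative, not part of this project's source:

```yaml
# Hypothetical modern-Compose equivalent of the file above.
version: "3"
services:
  server1:
    build: server1
  server2:
    build: server2
  server_nginx:
    build: server_nginx
    ports:
      - "80:80"
    depends_on:
      - server1
      - server2
# With no networks section, Compose places all services on one default
# network: server_nginx can reach server1/server2 by name, while the
# host can only reach the published port 80.
```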
upstream ServerList {
    #ip_hash;
    server server1:80;
    server server2:80;
}
This block is from the NGINX server.conf file.
The first line, #ip_hash; — this commented-out directive is where the load-balancing method is selected (ip_hash for IP-hash, least_conn for least-connected); when no method directive is given, NGINX defaults to round-robin.
The next lines list the upstream servers with their port numbers; NGINX distributes requests between the two containers (server1:80 and server2:80).
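To make the difference between the methods concrete, here is a small standalone Python sketch (not NGINX code, and simplified — real NGINX hashes only part of the client address) of how round-robin and IP-hash pick a backend:

```python
import hashlib

SERVERS = ["server1:80", "server2:80"]

def round_robin(n_requests, servers=SERVERS):
    """Distribute requests cyclically, like NGINX's default method."""
    return [servers[i % len(servers)] for i in range(n_requests)]

def ip_hash(client_ip, servers=SERVERS):
    """Map a client IP to a fixed server, like the ip_hash directive."""
    digest = hashlib.md5(client_ip.encode()).digest()
    return servers[digest[0] % len(servers)]

print(round_robin(4))          # alternates between the two servers
print(ip_hash("203.0.113.7"))  # the same IP always maps to the same server
```

Round-robin spreads load evenly across requests, while IP-hash gives session stickiness: a given client keeps hitting the same backend.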
Summary

In the end, we have an NGINX container acting as a reverse proxy and communicating with both containers.
With this approach, our main containers server1 and server2 are both secure: no one is able to access them directly.  With NGINX set up correctly, it can also act as a load balancer, which improves the performance of the service.