Getting into reverse proxying and cache/load balancing for Node.js servers with nginx on the Digital Ocean cloud

Nginx Load Balancer and Reverse Proxy for Node.js Applications On Digital Ocean

Joe McCann
Digital Ocean is rad. A modern VPS with SSD servers for super cheap. Easy to spin up or down. Cloning and backing up images are a breeze, and they have a solid, easy-to-use API with a great support team.
I recently moved a bunch of my static sites to one machine on Digital Ocean. They are all sites powered by Express via Node.js, but I wanted to put Nginx in front of them to act as a reverse proxy so that one machine could serve up many websites. This was very simple to do, and within an hour I had 7 sites up and running on one machine with Nginx sitting in front of all of them as the reverse proxy.
However, to make these sites highly available, I needed to reconfigure my infrastructure a bit. I ended up needing not one, but three total machines.
  • Machine 1 with Nginx installed (to act as load balancer and reverse proxy)
  • Machine 2 with Node.js installed (to serve up the static sites)
  • Machine 3 which is an exact clone of Machine 2
I created these "droplets", all running Ubuntu 13.04 x64, on Digital Ocean pretty easily and installed Nginx on machine 1 and Node.js on machines 2 and 3.
For all seven of the websites, I updated their respective A records to point to the load balancer's (machine 1) IP address. That way any request for any of the sites would hit the load balancer first, then the load balancer would send the request to an available machine (machine 2 or 3).
Machines 2 and 3 have their own respective IP addresses, which are referenced in the Nginx configuration files. Since they are my Node.js application servers, they obviously have my node sites/apps running on them.
For every site, I have an Nginx config file that is similar to the following:
upstream lb-subprint {
    ip_hash;  # keep each client IP pinned to the same backend
    server 192.241.180.249:3222 weight=10 max_fails=3 fail_timeout=30s; # Node.js app server (machine 2)
    server 192.241.241.152:3222 weight=10 max_fails=3 fail_timeout=30s; # Node.js app server (machine 3)
}

server {
    listen 80;

    server_name www.subprint.com subprint.com;

    access_log /var/log/nginx/nginx.access.subprint.log;
    error_log /var/log/nginx/nginx_error.subprint.log debug;

    location / {
        proxy_pass http://lb-subprint; # load balance requests for "/" across the upstream lb-subprint
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;  # pass WebSocket upgrade headers through to the backend
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }

    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /var/www/nginx-default;
    }
}
All seven of these config files are located in my /etc/nginx/conf.d/ directory on machine 1, the load balancer.
Every one of the node sites/apps listens on a particular port, but not port 80. That way Nginx can proxy each request through to the appropriate site while Nginx itself listens on port 80.
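For reference, each backend app can be as small as a bare-bones Express server bound to its non-80 port. The snippet below is a minimal sketch rather than the actual code from these sites; the file name, the ./public static directory, and the use of port 3222 (the port the upstream block above points at) are assumptions.

// server.js - minimal sketch of one static Express site behind the proxy (hypothetical file name)
const express = require('express');
const app = express();

// Serve the site's static assets from ./public (assumed directory layout)
app.use(express.static('public'));

// Bind to the non-80 port the Nginx upstream block points at (3222 above),
// leaving port 80 free for Nginx on the load balancer.
app.listen(3222, function () {
  console.log('Static site listening on port 3222');
});

Running a copy of an app like this on machines 2 and 3 on the same port is what lets the two server lines in the upstream block stay identical apart from the IP address.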
And voilà, there you have it. A straightforward, highly available setup with Nginx and Node.js on Digital Ocean.


Setting up virtual hosts (server blocks) on Nginx:
https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-virtual-hosts-server-blocks-on-centos-6
