Deploying a PostgreSQL, Express, React, Node.js (PERN) web application: purchasing a domain and mail service from Namecheap, hosting on a DigitalOcean droplet, and integrating the Mailjet service for sending email.
Table of Contents
- Domain Purchase
- Mailjet Setup
- Digital Ocean Setup
- Initial Server Setup
- About My App
- Application cloning
- Hosting Application
- Deploying React.js application using Nginx
- Add a domain to Digital Ocean
- Set up Database
- Set up HTTPS with SSL (Let's Encrypt)
- Configure Environment Variables
- Remote access to PostgreSQL Database
Connect to the webmail client at https://www.privateemail.com.
Open PuTTYgen on your system and generate an SSH key; don't forget to choose RSA as the type of key to generate.
After generating the key, save the public and private keys on your system, then copy the public key and paste it into the droplet's Authentication section.
ssh root@your_server_ip
Once logged in as root, we can add the new user account that we will use to log in from now on.
adduser [name]
Now, we have a new user account with regular account privileges. However, we may sometimes need to do administrative tasks.
To avoid having to log out of our normal user and log back in as the root account, we can set up what is known as “superuser” or root privileges for our normal account.
This will allow our normal user to run commands with administrative privileges by putting the word sudo before each command.
By default, on Ubuntu 16.04, users who belong to the “sudo” group are allowed to use the sudo command.
usermod -aG sudo [name]
Ubuntu 16.04 servers can use the UFW firewall to make sure only connections to certain services are allowed.
Different applications can register their profiles with UFW upon installation.
These profiles allow UFW to manage these applications by name. OpenSSH, the service allowing us to connect to our server now, has a profile registered with UFW.
sudo ufw app list
We need to make sure that the firewall allows SSH connections so that we can log back in next time. We can allow these connections by typing:
sudo ufw allow OpenSSH
sudo ufw allow ssh
sudo ufw allow http
sudo ufw allow https
sudo ufw enable
sudo ufw status
apt-get update && apt-get upgrade
curl https://raw.githubusercontent.com/creationix/nvm/master/install.sh | bash
source ~/.profile
nvm ls-remote
nvm install [version]
nvm use [version]
cd /home
sudo apt-get install git
npm i -g pm2
PM2 is a production process manager for Node.js applications (with a built-in load balancer). Simply put, it allows you to keep Node applications alive forever.
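As a sketch, PM2 can also be driven from an ecosystem file instead of CLI flags. The app name, script path, and port below are placeholder assumptions; adjust them to your project:

```javascript
// ecosystem.config.js — a minimal PM2 configuration sketch.
// "backend-app" and "./bin/www" are placeholders: ./bin/www is the
// entry point express-generator creates; use your own script path.
const config = {
  apps: [
    {
      name: 'backend-app',   // same as pm2 start ... --name [name]
      script: './bin/www',   // entry point of the backend
      instances: 1,          // raise for PM2's cluster mode / load balancing
      autorestart: true,     // restart the process if it crashes
      env: {
        NODE_ENV: 'production',
        PORT: 3001,
      },
    },
  ],
};

module.exports = config;
```

You would then run `pm2 start ecosystem.config.js` instead of passing the script and name on the command line, which keeps the process settings in version control.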
sudo systemctl stop apache2
sudo systemctl disable apache2
sudo apt remove apache2
sudo apt autoremove
sudo apt clean && sudo apt update
sudo apt-get install nginx -y
sudo systemctl enable nginx
systemctl status nginx
Nginx is an open-source, high-performance HTTP server. It is one of the most popular web servers because it is known for its stability, high performance, and simple configuration.
Our Node.js server generally runs on port 3001 (or any other port), and the React application generally runs on port 3000 (or any other port).
Upload your frontend and backend source code to a GitHub repository using git push origin [branch].
git clone https://github.com/backend-app.
cd backend-app && npm install
git clone https://github.com/[frontend-app]
cd frontend-app && npm install
npm run build
sudo swapon --show
free -h
df -h
sudo fallocate -l 1G /swapfile
ls -lh /swapfile
sudo chmod 600 /swapfile
ls -lh /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
sudo swapon --show
free -h
sudo cp /etc/fstab /etc/fstab.bak
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
cat /proc/sys/vm/swappiness
sudo sysctl vm.swappiness=10
This setting will persist until the next reboot. We can set this value automatically at restart by adding the line to our /etc/sysctl.conf file:
sudo nano /etc/sysctl.conf
vm.swappiness=10
cat /proc/sys/vm/vfs_cache_pressure
sudo sysctl vm.vfs_cache_pressure=50
sudo nano /etc/sysctl.conf
vm.vfs_cache_pressure=50
**If Node.js runs out of heap memory**
If you want to increase Node's memory limit globally, not just for a single script, you can export an environment variable like this:
export NODE_OPTIONS=--max_old_space_size=4096
npm run build
scp -r build/ root@[server-ip-address]:/[server-front-end-location]
I am running my app using npm start because I am using express-generator. You can also run node app.js.
cd backend-app && npm start
cd backend-app
pm2 start /[server-file-location] --name [name]
pm2 stop [name | id]
pm2 status
pm2 delete [name | id]
pm2 startup
sudo env PATH=$PATH:/usr/bin /usr/lib/node_modules/pm2/bin/pm2 startup systemd -u ubuntu --hp /home/ubuntu
pm2 save
npm install
cd /etc/nginx/sites-available/
sudo cp default [domain-name.com]
nano [domain-name.com]
server {
    listen 80;
    listen [::]:80;

    root /[location]/build;
    index index.html index.htm index.nginx-debian.html;

    server_name [domain-name.com] [www.domain-name.com];

    location / {
        try_files $uri /index.html;
    }

    location /API {
        proxy_pass http://localhost:3001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_cache_bypass $http_upgrade;
    }
}
Also add security headers. Note that add_header directives from an outer block are not inherited into a location that defines its own add_header directive, which is why we add them inside this location.
location ~* \.(png|jpg|jpeg|gif)$ {
    expires 365d;
    add_header Cache-Control "public, no-transform";
}

location ~* \.(js|css|pdf|html|swf)$ {
    expires 30d;
    add_header Cache-Control "public, no-transform";
    add_header Strict-Transport-Security 'max-age=31536000; includeSubDomains; preload';
    add_header Content-Security-Policy "default-src 'self';script-src 'self';style-src 'self';object-src 'none';base-uri 'self';connect-src 'self';font-src 'self';frame-src 'self';img-src 'self' data:;manifest-src 'self';media-src 'self';worker-src 'none';form-action 'self';";
    add_header X-Content-Type-Options nosniff;
    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-XSS-Protection "1; mode=block";
    add_header Referrer-Policy "strict-origin";
    add_header Permissions-Policy "sync-xhr=(),magnetometer=(),gyroscope=(),fullscreen=(self),payment=()";
}
Redirect [domain].com to www.[domain].com:
if ($host = [domain].com) {
    return 301 https://www.$host$request_uri;
}
gzip on;
gzip_vary on;
gzip_min_length 10240;
gzip_proxied expired no-cache no-store private auth;
gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml;
gzip_disable "MSIE [1-6]\.";
The location /API block checks whether the request path matches /API and, if so, forwards the request to localhost:3001, where our Node.js server is running.
sudo ln -s /etc/nginx/sites-available/[domain-name.com] /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl restart nginx
ns1.digitalocean.com
ns2.digitalocean.com
ns3.digitalocean.com
sudo apt install postgresql postgresql-contrib -y
sudo cat /etc/passwd
sudo -i -u postgres
psql
For this tutorial, the node process will be run under the ubuntu user and for the sake of simplicity, an ubuntu user will be created on Postgres as well. If you want to use a different user feel free to create a different user in postgres.
To create a Postgres user run the following command which will give an interactive prompt for configuring the new user. For the sake of simplicity, the ubuntu user will be a superuser, which is the equivalent of being a root user on Linux.
createuser --interactive
\du
To log in to psql as this user, use the following syntax. By default, psql tries to connect to a database with the same name as the user, so we explicitly specify the postgres database:
psql -d postgres
\conninfo
ALTER USER ubuntu PASSWORD 'password';
CREATE DATABASE [name];
As with most SQL databases, PostgreSQL makes it easy to copy the schema and data from our local development instance over to the Postgres instance running on the production server.
pg_dump -U postgres -f [local-db-name].pgsql -C [local-db-name]
The -U flag specifies the user to log in as; if you are using a different username, update it accordingly. The -f flag writes the database schema and data to a file called [local-db-name].pgsql in the current directory. The -C flag adds the CREATE DATABASE command to the file as well. [local-db-name] is the name of the database on our local dev server that we want to dump; if your database is called something else, update it accordingly. If you leave the database name out altogether, pg_dump dumps all databases.
scp -i [path to pem file] [path to [local-db-name].pgsql] username@[server-ip]:[directory to copy file to]
In this example, the .pem file and the [local-db-name].pgsql file are in the current directory, my server IP is 1.1.1.1, and the username is ubuntu.
psql [db-name] < /[location]/[local-db-name].pgsql
\l
\c dbName
\password
\q
sudo snap install --classic certbot
Execute the following instruction on the command line on the machine to ensure that the certbot command can be run.
sudo ln -s /snap/bin/certbot /usr/bin/certbot
Run this command to get a certificate and have Certbot edit your nginx configuration automatically to serve it, turning on HTTPS access in a single step.
sudo certbot --nginx
We now need to make sure that all of the proper environment variables are setup on our production Ubuntu Server. In our development environment, we made use of a package called dotenv to load up environment variables. In the production environment the environment variables are going to be set on the OS instead of within Node.
Create a file called .env in /home/ubuntu/. The file does not need to be named .env, nor does it need to be stored in /home/ubuntu; these were just the name and location of my choosing. The only thing I recommend avoiding is placing the file in the same directory as the app code, so that we don't accidentally check our environment variables into git and end up exposing our credentials.
source .env
printenv
NODE_ENV=production
Although it’s not required for this example project, it is common practice. Many other projects (depending on how the backend is set up) may require this to be set in a production environment.
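As a sketch, the backend can read these OS-level environment variables straight from process.env. The variable names below (PORT, DATABASE_URL, MAILJET_API_KEY) are example assumptions; use whatever keys your own .env file defines:

```javascript
// config.js — a sketch of production configuration via environment variables.
// PORT, DATABASE_URL, and MAILJET_API_KEY are illustrative names, not
// keys mandated by this guide.
const config = {
  env: process.env.NODE_ENV || 'development',
  port: Number(process.env.PORT) || 3001,
  databaseUrl:
    process.env.DATABASE_URL || 'postgres://localhost:5432/app',
  mailjetApiKey: process.env.MAILJET_API_KEY || '',
};

module.exports = config;
```

With this pattern, the same code works in development (dotenv or shell defaults) and in production, where the values come from the variables exported via the .env file sourced below.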
set -o allexport; source /home/amine/theclassmap/.env; set +o allexport
apt install postgresql postgresql-contrib
update-rc.d postgresql enable
service postgresql start
cd /etc/postgresql/[version]/main
In pg_hba.conf, add a line allowing your public IP (pg_hba.conf expects a CIDR mask, so /32 for a single address):
host all all [your-public-ip-address]/32 md5
In postgresql.conf, set:
listen_addresses = '*'
service postgresql restart
ufw status numbered
ufw delete [number]
ufw allow from [public-IP-address] to any port 5432