
Building an API with Docker, Node.js, Nginx, and Postgres

June 30, 2020

This is part 2 of a two-part post about Docker for web development.

Intro

My previous Docker post was an introduction to what Docker is, and how it can be used to package an application into a Docker image for sharing.

Although my previous post covered essential Docker knowledge, it doesn’t cover what I think is its most useful and practical purpose: enabling developers to run their apps using Docker as a development environment.

This post explains why this is useful, and how to implement it in your own development process.

Specifically, this post covers:

  • Why use Docker for development?
  • Using Docker Compose
  • Networks and Volumes with Docker Compose
  • Problems with dependencies and local development inside containers

A prerequisite here is to have Docker Desktop installed on your host machine, and to have read my previous Docker post for an understanding of the different ways Docker can be used.

Why use Docker for development?

The benefits of Docker discussed in this post are simple: it allows an application to be run using one or more containers which each have their own software versions specified. For example, you can run a specific version of Linux, using a specific version of Node. This is great for sharing because you’ll be sure that it will run on other users’ machines.

But this advantage of specifying software and versions is also ideal for the development process.

I’ve spent countless hours trying to run old side projects from a couple of years ago and encountering problems where a certain NPM package or function doesn’t work anymore because my machine has been updated.

Docker as a dev environment allows an application to be run exactly as it was created, meaning you can jump right back into a project and not have to worry about those dreadful issues.

Another advantage is that, with Docker being host-machine agnostic, you often don’t need the software installed on your host machine at all. You can completely uninstall Ruby or Go or Node and your app will still run. Similarly, it’s much easier to play with other tech stacks, as you don’t need to worry about maintaining all the baggage on your host machine.

The implementation

For the purposes of this post, I’ll be showing how I recently updated one of my GitHub projects to use Docker for development. This project is simple, consisting of a front-end React app (built with Next.js), a server-side Node API, and a Postgres database. With this setup I wanted to have 3 separate containers:

  • One for the Next.js React app
  • One for the server side Node API
  • One for the Postgres database

This will be done using a Dockerfile and Docker Compose. The Dockerfile defines the setup for the local environment and provides a way to package the app up to eventually be production ready, and Docker Compose is used to orchestrate the containers together and allow them to communicate between one another.

Docker Compose makes use of Services, Networks and Volumes to allow containers to work together on one machine.

The same end result can be achieved without Docker Compose, by using the Docker CLI and creating the setup manually, but Docker Compose makes it much faster and easier to get your perfect dev environment ready so you can focus on developing! Services, Networks and Volumes will be covered in more detail later.
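To give a feel for what Compose is saving us, here’s a hedged sketch of the manual CLI equivalent for just the database and API containers; the names, image tag and flags here are illustrative, not from this project:

docker network create myapp_net            # a shared network so the containers can talk
docker volume create myapp_pgdata          # a named volume for the database data

docker run -d --name db --network myapp_net \
  -e POSTGRES_USER=user -e POSTGRES_PASSWORD=password -e POSTGRES_DB=postgresdb \
  -v myapp_pgdata:/var/lib/postgresql/data postgres

docker build -t myapp_api ./api            # build the API image from its Dockerfile
docker run -d --name api_dev --network myapp_net \
  -v "$(pwd)/api:/home/app/api" -p 8001:3001 \
  myapp_api sh -c "npm install && npm run dev"

Multiply that by every service and every rebuild, and the appeal of a single docker-compose up becomes obvious.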

Using Docker for development

For some context before we jump in, here’s my desired project structure for the boilerplate with Docker Compose:

— /api
|- Dockerfile
|- package.json
|- … (other files in the Node API)
— /client
|- Dockerfile
|- package.json
|- … (other files in the Next.js client)
— docker-compose.yml
— README.md

To get started, a Dockerfile is needed for each of the front end and back end parts of the app. Here’s the front end:

FROM node:12.12.0-alpine
WORKDIR /home/app/client
EXPOSE 3000

And here’s the back end one:

FROM node:12.12.0-alpine
WORKDIR /home/app/api
EXPOSE 3001

I’ve run into problems before when specifying the Node version as "node:latest" or similar, which pulls whatever the latest version of Node happens to be. I recommend pinning a specific version, as that’s kind of the whole point of using Docker this way: keeping the versions of everything exactly what they should be to run your app.

You’ll notice the Dockerfiles are very simple compared to the ones in the last post. This is because the last post’s Dockerfiles were set up to package the app to be run, not for development of the application.

These Dockerfiles perform three tasks:

  • FROM: installs a specific version of Node (12.12.0) using the Alpine variant, which is a leaner image taking up much less space than the full version.
  • WORKDIR: sets the directory inside the container where the app will be run.
  • EXPOSE: exposes a port so the app can be accessed from your local machine. More on this later.

The Dockerfiles in the last post also contained instructions to COPY files over from the host machine to the container, and RUN the command to actually start the app. But again, the instructions in the last post were for packaging the app to be run, not used in development.

Similarly, we can’t run the shorter Dockerfiles above using the same docker run CLI command as before. Instead we can use a docker-compose.yml file to run our app:

version: "3"

services:
  client_dev:
    build: ./client
    command: sh -c "npm install && npm run dev"
    ports:
      - 8000:3000
    working_dir: /home/app/client
    volumes:
      - ./client:/home/app/client

  api_dev:
    build: ./api
    command: sh -c "npm run migrate-db && npm install && npm run dev"
    ports:
      - 8001:3001
    working_dir: /home/app/api
    volumes:
      - ./api:/home/app/api
    depends_on:
      - db

  db:
    image: postgres
    environment:
      - POSTGRES_USER=user
      - POSTGRES_DB=postgresdb
      - POSTGRES_PASSWORD=password
      - POSTGRES_HOST_AUTH_METHOD=trust
    volumes:
      - ./db/data/postgres:/var/lib/postgresql/data
    ports:
      - 5432:5432

You can see each of the 3 services defined in this YAML file: client_dev, api_dev and db. These define the 3 containers which Docker Compose will spin up. They can be called whatever you want, but these service names are used when you want to debug using the Docker CLI.

For client_dev and api_dev, there’s a build property. This defines the location of the Dockerfile which Docker Compose uses to build your containers.

There’s also the command property, which defines a command to run once the container is up and running. Once the container is running, we want to ensure the packages from the package.json file are installed, otherwise our app won’t work. You could run the npm install commands separately and manually before starting the containers, but it’s good to have it here so people don’t need to enter commands themselves.
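If you ever do need a one-off command inside a service, for example to add a new package, Compose can run it for you. These are standard Compose commands; the package name is just an example:

docker-compose run --rm client_dev npm install        # one-off npm install in a throwaway container
docker-compose exec api_dev npm install some-package  # run inside the already-running api_dev container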

The ports property is similar to the port forwarding explained in the last post. It enables your host machine to access your containers’ ports on localhost. In our client app, the Next.js app is set to run on port 3000, and the Dockerfile EXPOSEs port 3000 to run the app.

The ports section of the Compose file allows us to then access the app using port 8000. We could have the Compose file use the same ports locally and in the container, such as 3000:3000, but you may have some ports already in use and need to specify different ones for your machine.

The working_dir defines the directory for the Compose file to use to run the project. This is set to the same working directory as defined in the Dockerfile of course, as this is where the app code is located.

Finally, the volumes property is what defines a Volume in the Compose file. As described on the Docker site, Volumes are a way “for persisting data generated by and used by Docker containers”.

In our case, it means we want to use a Volume to keep the code of our app outside of the container, but have the code mounted so it can still be used by the container. I’ll explain this in more detail later.

Now the docker-compose.yml and each Dockerfile are in the right place, we can run docker-compose up, which will spin up each of the 3 containers for us. But what’s actually happening when we run that command?
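Before digging into that, here’s a quick reference of the Compose commands used day to day with this setup (all standard Docker Compose CLI):

docker-compose up          # build (if needed) and start all services in the foreground
docker-compose up -d       # the same, but detached
docker-compose up --build  # force the images to be rebuilt first
docker-compose logs -f     # follow the logs of all services
docker-compose down        # stop and remove the containers and the network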


The docker-compose process explained

Once the Compose command is run, Docker will check the location of the Dockerfiles you have defined in the build part of the Compose file.

Each of these Dockerfiles names a base image it requires, in our example Node, so the Node image is downloaded for us.

The db service in our Compose file also brings in the Postgres database, which is downloaded without the use of a Dockerfile. Instead, this image is defined directly in the Compose file with image: postgres.

Once the base images are downloaded, our own images are created on top of the downloaded Node image. You can confirm this by running docker images and seeing the Node image, plus another image for each service defined in the Compose file:

(Screenshot: docker images output showing the base Node image and one image per Compose service.)

The image name is prefixed with the directory of the Compose file. My project sits in a directory called “nextjs-typescript-jwt-boilerplate”, so images are created with this prefix so we can tell which Docker Compose setup this relates to.

Once the images are created, the containers are then run from the images we’ve just created, all happening automatically. Again, this can be confirmed by entering docker ps -a to show a list of running containers:

(Screenshot: docker ps -a output listing the running containers.)

Volumes

At this point, as we’ve not copied any files from our host machine into the image, the container only contains whatever Node version we’ve chosen. This is where the Volumes come in. In the client app, the Volumes are defined as ./client:/home/app/client.

This means that from our host machine at the location ./client (relative to the docker-compose.yml file) we want to mount this directory in our container at the directory /home/app/client.

This is crucial for our local development environment, because when developing we change files constantly (save, test, debug, build and so on), so mounting our host machine’s files into the container this way is perfect.

The alternative is to COPY the files into the image and, on every save, rebuild the whole image and deploy it as a fresh container. This has obvious overhead and takes time on each rebuild.

As well as using Volumes to speed up the dev process for the client_dev and api_dev services, Volumes are also utilised when running the Postgres image. Due to the nature of containers and images as discussed in the last post, data is not persistent.

When a container crashes or is closed down, all data within the container is gone, and resets back to the data contained in the original image when the container is created again.

Volumes are used with the postgres container to keep a copy of all data on our host machine so it persists when the container is shut down.

Network

As well as having a clear separation between local files and the files contained in the images/containers, it’s important that the containers can communicate with one another.

Luckily, with Docker Compose this is all handled automatically. Without Docker Compose, for example, containers must be added to a Network, which is done by entering commands into the CLI.

However, a project run using Docker Compose automatically creates a Network for all containers in the project.

Running docker network ls will show you all the Networks on your host machine:

(Screenshot: docker network ls output showing the project’s automatically created network.)

The Networks defined by Docker Compose are prefixed with the same name as the image and container prefixes, again to help show where the Network originates from.
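The practical upshot of that Network is that containers can reach each other by service name. For example, inside the api_dev container the database is reachable at the hostname db. A hedged sketch of what a connection could look like, reusing the credentials from the Compose file above (the file name and the use of the pg Pool are assumptions, not taken from the project):

// api/db.js - illustrative sketch only
const { Pool } = require('pg');

const pool = new Pool({
  host: 'db',            // the Compose service name, not localhost
  port: 5432,
  user: 'user',
  password: 'password',
  database: 'postgresdb',
});

module.exports = pool;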

JavaScript CRUD REST API using Node.js, Express, Sequelize, Postgres, Docker and Docker Compose

Let's create a CRUD REST API in JavaScript, using:

  • Node.js
  • Express
  • Sequelize
  • Postgres
  • Docker
  • Docker Compose

All the code is available in the GitHub repository (link in the video description): https://youtube.com/live/Uv-jMWV29rU

Intro

Here is a schema of the architecture of the application we are going to create:

(Diagram: the application architecture.)

We will create 5 endpoints for basic CRUD operations:

  • Create
  • Read all
  • Read one
  • Update
  • Delete

We will create a Node.js application using:

  • Express as a framework
  • Sequelize as an ORM

Then:

  1. We will Dockerize the Node.js application

  2. We will have a Postgres instance, and test it with TablePlus

  3. We will create a Docker Compose file to run both services

  4. We will test the APIs with Postman

Step-by-step guide

Here is a step-by-step guide.

  1. Create a new folder
  2. Step into it
  3. Initialize a new npm project
  4. Install the dependencies (commands below)
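As a quick sketch of steps 1-3 (the folder name is just an example):

mkdir crud-api && cd crud-api   # steps 1-2: create the project folder and step into it
npm init -y                     # step 3: initialize a new npm project with defaults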

npm i express pg sequelize

  • express is the Node.js framework
  • pg is a driver for a connection with a Postgres db
  • sequelize is the ORM so we avoid typing SQL queries

create 4 folders

mkdir controllers routes util models

Open the folder with your favorite IDE. If you have Visual Studio Code, you can open it from the terminal by typing code .

You should now have a folder similar to this one:

(Screenshot: the project folder with the controllers, models, routes and util directories.)

Now let's start coding.

Database connection

Create a file called "database.js" inside the "util" folder.

This file will contain the internal configuration to allow the connection between the Node.js application and the running Postgres instance.

Populate the util/database.js file

const Sequelize = require('sequelize');

const sequelize = new Sequelize(
  process.env.PG_DB,
  process.env.PG_USER,
  process.env.PG_PASSWORD,
  {
    host: process.env.PG_HOST,
    dialect: 'postgres',
  }
);

module.exports = sequelize;

User model

Create a file called "user.js" inside the "models" folder.

This file will contain the model, in this case a user with an auto-incremented id, a name and an email.

Populate the models/user.js file:

const Sequelize = require('sequelize');
const db = require('../util/database');

const User = db.define('user', {
  id: {
    type: Sequelize.INTEGER,
    autoIncrement: true,
    allowNull: false,
    primaryKey: true
  },
  name: Sequelize.STRING,
  email: Sequelize.STRING
});

module.exports = User;

Controllers

This file contains all the functions to execute in order to interact with the database and provide the basic CRUD functionality.

  1. Create a file called "users.js" inside the "controllers" folder
  2. Populate the controllers/users.js file

const User = require('../models/user');

// CRUD Controllers

// get all users
exports.getUsers = (req, res, next) => {
  User.findAll()
    .then(users => {
      res.status(200).json({ users: users });
    })
    .catch(err => console.log(err));
};

// get user by id
exports.getUser = (req, res, next) => {
  const userId = req.params.userId;
  User.findByPk(userId)
    .then(user => {
      if (!user) {
        return res.status(404).json({ message: 'User not found!' });
      }
      res.status(200).json({ user: user });
    })
    .catch(err => console.log(err));
};

// create user
exports.createUser = (req, res, next) => {
  const name = req.body.name;
  const email = req.body.email;
  User.create({ name: name, email: email })
    .then(result => {
      console.log('Created User');
      res.status(201).json({ message: 'User created successfully!', user: result });
    })
    .catch(err => {
      console.log(err);
    });
};

// update user
exports.updateUser = (req, res, next) => {
  const userId = req.params.userId;
  const updatedName = req.body.name;
  const updatedEmail = req.body.email;
  User.findByPk(userId)
    .then(user => {
      if (!user) {
        return res.status(404).json({ message: 'User not found!' });
      }
      user.name = updatedName;
      user.email = updatedEmail;
      return user.save();
    })
    .then(result => {
      res.status(200).json({ message: 'User updated!', user: result });
    })
    .catch(err => console.log(err));
};

// delete user
exports.deleteUser = (req, res, next) => {
  const userId = req.params.userId;
  User.findByPk(userId)
    .then(user => {
      if (!user) {
        return res.status(404).json({ message: 'User not found!' });
      }
      return User.destroy({ where: { id: userId } });
    })
    .then(result => {
      res.status(200).json({ message: 'User deleted!' });
    })
    .catch(err => console.log(err));
};

Routes

Create a file called "users.js" inside the "routes" folder.

Populate the routes/users.js file

const controller = require('../controllers/users');
const router = require('express').Router();

// CRUD Routes /users
router.get('/', controller.getUsers);              // /users
router.get('/:userId', controller.getUser);        // /users/:userId
router.post('/', controller.createUser);           // /users
router.put('/:userId', controller.updateUser);     // /users/:userId
router.delete('/:userId', controller.deleteUser);  // /users/:userId

module.exports = router;

Index file

To run our application we need to create one more file at the root level. This is the file that will be executed by the Docker container.

In the root folder, create a file called index.js.

Populate the index.js file:

const express = require('express');
const bodyparser = require('body-parser');
const sequelize = require('./util/database');
const User = require('./models/user');

const app = express();

app.use(bodyparser.json());
app.use(bodyparser.urlencoded({ extended: false }));

app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
  next();
});

// test route
app.get('/', (req, res, next) => {
  res.send('Hello World');
});

// CRUD routes
app.use('/users', require('./routes/users'));

// error handling
app.use((error, req, res, next) => {
  console.log(error);
  const status = error.statusCode || 500;
  const message = error.message;
  res.status(status).json({ message: message });
});

// sync database
sequelize
  .sync()
  .then(result => {
    console.log("Database connected");
    app.listen(3000);
  })
  .catch(err => console.log(err));

Docker Part

Let's create 3 more files at the root level:

  • .dockerignore (it starts with a dot)
  • Dockerfile (capital D)
  • docker-compose.yml
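The article cuts off before showing their contents, so here is a hedged sketch only, reusing the PG_* variable names from util/database.js and the port 3000 that index.js listens on; every concrete value below (Node version, database name, credentials) is an assumption:

# .dockerignore
node_modules
npm-debug.log

# Dockerfile
FROM node:14-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["node", "index.js"]

# docker-compose.yml
version: '3'
services:
  api:
    build: .
    ports:
      - '3000:3000'
    environment:
      - PG_DB=usersdb
      - PG_USER=postgres
      - PG_PASSWORD=postgres
      - PG_HOST=db
    depends_on:
      - db
  db:
    image: postgres:14-alpine
    environment:
      - POSTGRES_DB=usersdb
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - '5432:5432'

With those in place, docker-compose up --build starts both the API and the database, and the API reaches Postgres at the hostname db.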

Dockerizing a Node+Express+Postgres App (with NGINX)

The following is a step by step guide to dockerizing a basic Node.js application. Before starting you should have a basic understanding of Containers and Images. Here is the basic layout of the app we will be moving into containers.


├── database
│   ├── DATA.sql
├── server
│   ├── index.js
├── package.json

DATA.sql is simply a dump of a Postgres database that is queried by a vanilla Express app (index.js) serving a handful of basic GET routes. A quick note about getting to this point: the DATA.sql file was created using the command below.

The --no-privileges and --no-owner flags are important, as we will be specifying a new superuser when the Postgres database runs inside the Docker container.

pg_dump -h localhost -U sieke --no-privileges --no-owner xxDBNAMExx >> DATA.sql

What is the Goal?

We are going to take the following high level steps to turn our basic API into one that has 5 containers (below).

  1. Set up a dockerfile to create a postgres docker image for running our database.
  2. Set up a dockerfile to create a node docker image for running our API(s).
  3. Create a docker-compose.yml configuration to "link" our 5 containers together.
  4. Set up a dockerfile to create an nginx docker image for load balancer / reverse proxy.
  5. Make some final modifications to our server to make sure it can connect to our new containerized database.
  6. Use docker-compose to launch our images + containers.

(Diagram: one NGINX container acting as a load-balancing reverse proxy in front of three Express API containers, all backed by a single Postgres container.)

The first container will run NGINX and will be used as a load-balancing reverse-proxy that distributes 1/3 of requests to each of our servers. Our express API server will be deployed in 3 separate containers. Each of the 3 containers will point to a single Postgres database.

An added benefit of having incoming requests hit the NGINX server is the ease of adding a caching layer. A few lines added to the config yields incredibly flexible caching options.
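As an illustration (not from the original article), a minimal proxy cache in NGINX needs only a cache zone plus a couple of directives in the location block; the zone and upstream names below are assumptions:

# in the http block: define a cache zone on disk
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=api_cache:10m max_size=100m inactive=60m;

server {
  listen 80;

  location / {
    proxy_cache api_cache;          # use the zone defined above
    proxy_cache_valid 200 1m;       # cache successful responses for one minute
    proxy_pass http://api_servers;  # assumed upstream pointing at the API containers
  }
}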

1. Postgres Container

We're going to create a file called Dockerfile in the /database/ directory. This file will contain the instructions for creating our Postgres image.

├── database
│   ├── DATA.sql
│   ├── Dockerfile

The Dockerfile should contain the following instructions.

FROM postgres:latest

ENV POSTGRES_USER my-username
ENV POSTGRES_PASSWORD my-password
ENV POSTGRES_DB my-database

COPY DATA.sql /docker-entrypoint-initdb.d/

The Dockerfile will pull the latest postgres image. Setting the 3 environment variables will initialize the container with that SQL user/password and create a database. The my-database name should match the name of the local db that you used in your local application.

Finally, the Dockerfile copies the DATA.sql dump file into a special folder inside the instance. The script will automatically run when the instance starts.

2. API (Node + Express) Containers

We're going to create a second Dockerfile in the root directory of our project (same as package.json).

FROM node:12
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
COPY .env_docker .env
ENV PORT=8080
EXPOSE 8080
CMD [ "npm", "start" ]

The above Dockerfile contains the instructions for building our Node.js API image. It creates the image from the base node image, sets the working directory, copies package.json from our local machine into the image, runs npm install, then copies in the rest of the source, exposes port 8080 and sets the start command.
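The original post moves on before showing the Compose file, so purely as a hedged sketch of how the 5 containers from the goal above could be wired together (service names and directory layout are assumptions):

version: '3'
services:
  nginx:
    build: ./nginx            # assumed folder with the NGINX Dockerfile and config
    ports:
      - '80:80'
    depends_on:
      - api1
      - api2
      - api3

  api1:
    build: .                  # the Node Dockerfile shown above
    depends_on:
      - db
  api2:
    build: .
    depends_on:
      - db
  api3:
    build: .
    depends_on:
      - db

  db:
    build: ./database         # the Postgres Dockerfile from step 1
    ports:
      - '5432:5432'

The NGINX config would then declare an upstream listing the three API services by name, something like upstream api_servers { server api1:8080; server api2:8080; server api3:8080; }, and proxy_pass requests to it.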

Example: deploying PostgreSQL on Docker with a backup container and admin tools, plus a Node.js project with Nginx + SSL + MongoDB

First, let's create an image via a Dockerfile for the backups; for everything else we'll use ready-made images from Docker Hub.

In a folder, create two files: a Dockerfile and the script for automatic database backups.

The Dockerfile:

FROM postgres:alpine

COPY db-backup.sh /usr/local/bin/db-backup.sh

RUN echo '0 23 * * * /usr/local/bin/db-backup.sh' > /etc/crontabs/root

CMD ["crond", "-f", "-d", "0"]

  • This file says the image will be built from postgres:alpine
  • The backup script, named db-backup.sh, is copied into it
  • A cron entry is installed that runs the script every day at 23:00
  • And cron is started in the foreground
  • The backup script db-backup.sh itself can look like this:

#!/bin/sh

now=$(date +"%d-%m-%Y_%H-%M")

DB_BASE=`/bin/su postgres -c "/usr/local/bin/psql -h db -qAt -c 'SELECT datname FROM pg_database;'" |
cut -d"|" -f1 | /bin/grep -v template | /bin/grep -v postgres`

echo $DB_BASE

for DB_NAME in $DB_BASE
do
/usr/local/bin/pg_dump -h db -U postgres ${DB_NAME} > "/backup/db_${DB_NAME}_$now.sql"
/usr/local/bin/pg_dump -Fc -h db -U postgres ${DB_NAME} > "/backup/db_${DB_NAME}_$now.dump"
done

find /backup -name "*.sql" -type f -mtime +30 -delete
find /backup -name "*.dump" -type f -mtime +30 -delete

exit 0

To build the image, you can run:

docker build . -t localhost:5000/cron:latest

Dockerfile for the Node.js project

In the Node.js project folder, create a Dockerfile with roughly the following contents:

FROM node

COPY . .

RUN npm install

EXPOSE 5001

# For an application written in Node.js
CMD ["node", "app.js"]

To build it, we can use:

docker build . -t localhost:5000/api:latest

Dockerfile for Nginx

FROM nginx:alpine

RUN apk add --update python3 py3-pip

RUN apk add --no-cache certbot

RUN pip install certbot-nginx

To build it, we can use:

docker build . -t localhost:5000/proxy

The docker-compose.yml file

Next, in another folder, create a docker-compose.yml file:

version: '3.1'

services:
  proxy:
    image: localhost:5000/proxy
    ports:
      - 80:80
      - 443:443
    volumes:
      - 'letsencrypt:/etc/letsencrypt'
      - './nginx.conf:/etc/nginx/nginx.conf'

  api:
    image: localhost:5000/api
    environment:
      EMAIL_PORT: 465

  mongo:
    image: mongo
    restart: always
    environment:
      MONGO_INITDB_ROOT_USERNAME: root
      MONGO_INITDB_ROOT_PASSWORD: example
    volumes:
      - 'mongodb:/data/db'

  mongo-express:
    image: mongo-express
    restart: always
    ports:
      - 8081:8081
    environment:
      ME_CONFIG_MONGODB_ADMINUSERNAME: root
      ME_CONFIG_MONGODB_ADMINPASSWORD: example
      ME_CONFIG_MONGODB_URL: mongodb://root:example@mongo:27017/

  db:
    image: postgres:alpine
    restart: always
    environment:
      POSTGRES_PASSWORD: your-password-here
    volumes:
      - 'pgdata:/var/lib/postgresql/data'

  db-backup:
    image: localhost:5000/cron
    environment:
      PGPASSWORD: your-password-here
    volumes:
      - './backup:/backup'

  adminer:
    image: adminer
    restart: always
    ports:
      - 8080:8080

volumes:
  pgdata:
    external: true

  letsencrypt:
    external: true

  mongodb:
    external: true

Next we need to create a volume, which we will use for the database:

docker volume create pgdata

Another volume, for the SSL certificates:

docker volume create letsencrypt

And one more volume, for Mongo:

docker volume create mongodb

The initial nginx.conf file

Place it in the same folder as docker-compose.yml:

user nginx;
worker_processes auto;

error_log /var/log/nginx/error.log notice;
pid /var/run/nginx.pid;

events {
    worker_connections 1024;
}

http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';

    access_log /var/log/nginx/access.log main;

    sendfile on;
    #tcp_nopush on;

    keepalive_timeout 65;

    #gzip on;

    include /etc/nginx/conf.d/*.conf;

    server {
        root /var/www/html;

        # Add index.php to the list if you are using PHP
        index index.html index.htm index.nginx-debian.html;

        server_name api.polyakovdmitriy.ru;

        location / {
            proxy_pass http://api:5001;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection 'upgrade';
            proxy_set_header Host $host;
            proxy_cache_bypass $http_upgrade;
        }
    }
}

Starting Docker Compose:

docker-compose up -d

This brings up all the containers.

Next you need to go into the proxy container and obtain a certificate.

Enter the container:

docker exec -it <proxy container name> /bin/sh

Obtain the certificate:

certbot --nginx -d api.polyakovdmitriy.ru

Checking when the certificate expires

To check the certificate, you can run:

openssl s_client -servername api.polyakovdmitriy.ru -connect api.polyakovdmitriy.ru:443 2>/dev/null | openssl x509 -noout -dates

Then look at the notAfter value in the output.
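Renewal isn't covered here; as an assumption-labelled example, it can be triggered from the host using the same containerised certbot (the container name is a placeholder):

docker exec -it <proxy container name> certbot renew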

Docker Compose is stopped with:

docker-compose down

A backup of the Mongo database can be made with:

docker exec mongo_1 sh -c "exec mongodump --authenticationDatabase admin -u -p --archive" > d://backup-mongo/all-collections-%date%.archive
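For completeness (not in the source), the matching restore reads the archive back through mongorestore; the user/password values are left elided exactly as in the backup command above:

docker exec -i mongo_1 sh -c "exec mongorestore --authenticationDatabase admin -u -p --archive" < all-collections.archive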


Dockerizing a Node.js Web Application

If you’ve ever developed anything that needs to ‘live’ somewhere besides your local machine, you know that getting an application up and running on a different machine is no simple task.

There are countless considerations to be had, from the very basics of “how do I get my environment variables set” to which runtimes you’ll need and which dependencies those will rely on, not to mention the need to automate the process.

It’s simply not feasible for software teams to rely on a manual deploy process anymore.

A number of technologies have sought to solve this problem of differing environments, automation, and deployment configuration, but the most well-known and perhaps most notable attempt in recent years is Docker.

By the end of this tutorial you should be able to:

  • understand what Docker is and what it does
  • create a simple Dockerfile
  • run a Node.js application using Docker
  • use Continuous Integration to automatically build and test Docker containers

Docker’s homepage describes Docker as follows:

“Docker is an open platform for building, shipping and running distributed applications. It gives programmers, development teams and operations engineers the common toolbox they need to take advantage of the distributed and networked nature of modern applications.”

Put differently, Docker is an abstraction on top of low-level operating system tools that allows you to run one or more containerized processes or applications within one or more virtualized Linux instances.

Before we dive in, it’s important to stress the potential usefulness of Docker in your software development workflow. It’s not a “silver bullet”, but it can be hugely helpful in certain cases. Note the many potential benefits it can bring, including:

  • Rapid application deployment
  • Portability across machines
  • Version control and component reuse
  • Sharing of images/dockerfiles
  • Lightweight footprint and minimal overhead
  • Simplified maintenance

Before you begin this tutorial, ensure Docker is installed on your system. You can find all the example code in this post in the dockerizing-nodejs repository.

Create an empty repository to host your code:

  1. Go to GitHub and sign up.
  2. Use the New button under Repositories to create a new repository.
  3. In Add .gitignore, select Node.
  4. Create the repository.
  5. Clone the repository to your work machine.

We’ll be using a basic Express application as our example Node.js application to run in our Docker container. To keep things moving, we’ll use Express’s scaffolding tool to generate our directory structure and basic files.

$ npx express-generator --no-view src
$ cd src
$ npm install

This should have created a number of files in your directory, including bin and routes directories. Make sure to run npm install so that npm can get all of your Node.js modules set up and ready to use.

We’ll write an addressbook API that stores people’s names in a database.

Routes are how we handle each HTTP request. The express starter project has a few example routes and we’ll add one more to handle our API calls.

  • Create a new file called routes/persons.js with the following content:

// persons.js

var express = require('express');
var router = express.Router();
var db = require('../database');

router.get("/all", function(req, res) {
  db.Person.findAll()
    .then( persons => {
      res.status(200).send(JSON.stringify(persons));
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

router.get("/:id", function(req, res) {
  db.Person.findByPk(req.params.id)
    .then( person => {
      res.status(200).send(JSON.stringify(person));
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

router.put("/", function(req, res) {
  db.Person.create({
    firstName: req.body.firstName,
    lastName: req.body.lastName,
    id: req.body.id
  })
    .then( person => {
      res.status(200).send(JSON.stringify(person));
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

router.delete("/:id", function(req, res) {
  db.Person.destroy({
    where: {
      id: req.params.id
    }
  })
    .then( () => {
      res.status(200).send();
    })
    .catch( err => {
      res.status(500).send(JSON.stringify(err));
    });
});

module.exports = router;

This file implements all the API methods our application will support. We can:

  • Get all persons
  • Create a person
  • Get a single person by id
  • Delete a person

All the routes return the person information encoded in JSON.
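Once the app is wired up and running, the API can be exercised with curl. The mount path /persons and port 3000 are assumptions here (the router has to be mounted in the Express app, and express-generator projects default to port 3000):

# create a person
curl -X PUT http://localhost:3000/persons \
  -H "Content-Type: application/json" \
  -d '{"id": 1, "firstName": "Ada", "lastName": "Lovelace"}'

# list all persons, fetch one, then delete it
curl http://localhost:3000/persons/all
curl http://localhost:3000/persons/1
curl -X DELETE http://localhost:3000/persons/1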

All person routes require a database to store the data. We’ll use a PostgreSQL database to keep our contact details.

  1. Install the PostgreSQL node driver and sequelize ORM:

$ npm install --save pg sequelize

Sequelize handles all our SQL code for us; it will also create the initial tables in the database.

  1. Create a file called database.js
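The post breaks off here, so as a hedged sketch only: a database.js that would satisfy the db.Person calls used in routes/persons.js might look like the following (connection settings and the export shape are assumptions, not the author's code):

// database.js - illustrative sketch
const Sequelize = require('sequelize');

// connection details would normally come from environment variables
const sequelize = new Sequelize(
  process.env.DB_NAME || 'addressbook',
  process.env.DB_USER || 'postgres',
  process.env.DB_PASSWORD || 'postgres',
  {
    host: process.env.DB_HOST || 'localhost',
    dialect: 'postgres',
  }
);

// the Person model referenced by routes/persons.js
const Person = sequelize.define('person', {
  firstName: Sequelize.STRING,
  lastName: Sequelize.STRING,
});

module.exports = { sequelize, Person };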

API with Node.js + PostgreSQL + TypeORM: Project Setup

In this article, you’ll learn how to set up a Node.js project with TypeScript, ExpressJs, PostgreSQL, TypeORM, and Redis.

Related posts:

  • How to run docker containers with docker-compose
  • How to connect a TypeORM Express app to PostgreSQL
  • How to connect a TypeORM Express app to Redis

Prerequisites:

  • Knowledge of Docker and Docker-compose
  • Knowledge of Node.js and Express

The most straightforward way to get PostgreSQL and Redis database instances running on our computer is to use Docker and Docker-compose.

Am going to assume you already have Docker and Docker-compose installed on your machine.

Create a docker-compose.yml file in your root directory and paste the configurations below into it.

docker-compose.yml

version: '3'
services:
  postgres:
    image: postgres:latest
    container_name: postgres
    ports:
      - '6500:5432'
    volumes:
      - progresDB:/var/lib/postgresql/data
    env_file:
      - ./.env

  redis:
    image: redis:alpine
    container_name: redis
    ports:
      - '6379:6379'
    volumes:
      - redisDB:/data

volumes:
  progresDB:
  redisDB:

Later, we’re going to use this VS Code extension to view the data stored in both the PostgreSQL and Redis databases.

To provide the credentials needed by the PostgreSQL Docker image, we need to create a .env file in the root directory.
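The post stops before showing that file; as a sketch, the standard variables the postgres image reads would look like this (values are placeholders):

# .env - placeholder values
POSTGRES_USER=postgres
POSTGRES_PASSWORD=password123
POSTGRES_DB=node_typeorm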
