|Watch Youtube Video 🎥|
If you are reading this, you have probably had issues at some point with database installations on your laptop, or with your local development environment in general.
Same here: I had problems running databases consistently, cleaning them up, or getting different projects to work with the same database. And sometimes it is a nightmare to configure things to support different versions.
But luckily I’m about to tell you how to avoid all of that and keep your dev environment clean and predictable.
It is very simple, but you have to use Docker containers. For example, to use MongoDB locally you run a MongoDB Docker container, and guess what: it is way easier than installing MongoDB locally!
There is some learning curve, but you get the huge benefit of keeping your environment clean, without blowing it up with huge dependency packages and the configuration hell that comes when something fails to work. With a Docker container you can just remove the MongoDB container whenever you are done working on your project, and not worry about any system leftovers.
Let’s walk through a basic example to make it clear how to use a PostgreSQL Docker container with an existing project. A while back we built a TypeScript TypeORM starter kit on this channel. If you haven’t checked it out yet, please make sure you watch it!
This project depends on PostgreSQL and has connection settings, which means that if you are coding locally you need a running database. To run a Docker container for Postgres, we first look it up on Docker Hub, where there is nice documentation: https://hub.docker.com/_/postgres
After digging a bit you will end up with a command like this to run your local database.
docker run --name postgre-db -e POSTGRES_PASSWORD=password -e POSTGRES_DB=node_starter -d postgres
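Breaking that command down (shown as a commented sketch, since running it needs the Docker daemon; the behavior of each flag is as described in the official postgres image docs):

```shell
# Annotated version of the same command:
# docker run \
#   --name postgre-db \               # container name, used later for stop/start/rm
#   -e POSTGRES_PASSWORD=password \   # required: password for the default "postgres" superuser
#   -e POSTGRES_DB=node_starter \     # optional: create this database on first start
#   -d \                              # detached: run in the background
#   postgres                          # official image, latest tag
```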
By checking the running containers with
docker ps you will see that the PostgreSQL port
5432 is not mapped to the host machine, which means your Node.js application won’t be able to access it.
~ docker ps
CONTAINER ID   IMAGE      COMMAND                  CREATED          STATUS          PORTS      NAMES
fc9fd9feb427   postgres   "docker-entrypoint.s…"   30 seconds ago   Up 28 seconds   5432/tcp   postgre-db
This means we have to map the port when we start the container so that it stays accessible from the host machine; the -p flag takes host_port:container_port. Otherwise it is just a dummy Docker container running on your machine. BUT remember to remove the previously created container first, otherwise Docker will complain that the container name is already taken.
~ docker kill postgre-db && docker rm postgre-db
~ docker run --name postgre-db -p 5432:5432 -e POSTGRES_PASSWORD=password -e POSTGRES_DB=node_starter -d postgres
Now if you run
docker ps you will see a nice output showing that our Postgres port is mapped to the host machine, and we can now connect to Postgres from our local Node.js service.
~ docker ps
CONTAINER ID   IMAGE      COMMAND                  CREATED         STATUS         PORTS                    NAMES
fb74c76f8415   postgres   "docker-entrypoint.s…"   5 seconds ago   Up 4 seconds   0.0.0.0:5432->5432/tcp   postgre-db
We just have to configure our Node.js service to use the same PostgreSQL password and database name that we provided when starting the container.
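For instance, if your service reads a connection URL from the environment (the variable name DATABASE_URL is my assumption here, adjust it to whatever your config actually reads), the values line up with the docker run flags like this:

```shell
# How the `docker run` flags map to the app's connection URL.
# The official postgres image creates a superuser named "postgres" by default.
PGUSER=postgres          # default superuser of the official image
PGPASSWORD=password      # from -e POSTGRES_PASSWORD=password
PGDATABASE=node_starter  # from -e POSTGRES_DB=node_starter
PGPORT=5432              # host side of -p 5432:5432

# DATABASE_URL is a hypothetical variable name; use whatever your app expects.
export DATABASE_URL="postgres://${PGUSER}:${PGPASSWORD}@localhost:${PGPORT}/${PGDATABASE}"
echo "$DATABASE_URL"
```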
It is exactly the same for MongoDB, MySQL, or any other database that has an official Docker image, so you can just look them up and run the commands. Another example: MongoDB can be started like this
docker run --name mongodb -p 27017:27017 -d mongo
Here it is a lot easier, because Mongo doesn’t require you to create a database name in advance: you can do that via the connection string, and the schema is automatic, so you need far less configuration. But just as with PostgreSQL, you don’t have to install it locally. It is just a Docker container, so you can remove it anytime you want, keep different versions running for different applications, or change the bound ports without ever touching the database configuration.
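Concretely, the database name just lives in the connection string, and Mongo creates it lazily on first write (the node_starter name below is my assumption, chosen to match the Postgres example):

```shell
# No -e flags needed: the database is named in the URL and created on first use.
MONGO_PORT=27017   # host side of -p 27017:27017

# MONGO_URL is a hypothetical variable name for your app's config.
export MONGO_URL="mongodb://localhost:${MONGO_PORT}/node_starter"
echo "$MONGO_URL"
```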
Yes! It brings a lot of comfort, but be aware that your data disappears whenever you remove the container; you have to define a separate Docker volume to keep your data persistent. For example, with MongoDB you keep your data persistent by mounting the container’s data path to one of your local directories.
docker run --name mongodb -p 27017:27017 -v /my/own/datadir:/data/db mongo
Now your container will put all the MongoDB data and schema files inside your
/my/own/datadir folder, which means that even if you remove the container, you still have your data, if that’s something you care about for local development.
Usually I don’t remove database containers at all. Sometimes I stop them, and after a machine restart I bring them back by simply running
docker start mongodb whenever I need them for local development.
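My day-to-day loop boils down to three commands (shown as a commented sketch, since they need the Docker daemon; the container name matches the examples above):

```shell
# Day-to-day container lifecycle:
# docker stop mongodb    # frees the port and RAM, keeps the container and its data
# docker start mongodb   # brings the same container back, e.g. after a reboot
# docker rm mongodb      # deletes the container; data survives only in a mounted volume
```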
That’s probably it for this short but powerful tip, which has helped me keep my system a lot cleaner than it was before. I actually use Docker for almost every dev-environment application, but that is a completely different story…