I'm building an app running on NodeJS with PostgreSQL, using SequelizeJS as the ORM. To avoid installing a real Postgres daemon and Node.js on my own machine, I'm using containers with docker-compose.
When I run docker-compose up, it starts the pg database:

```
database system is ready to accept connections
```

and the nodejs server, but the server can't connect to the database:
```
Error: connect ECONNREFUSED 127.0.0.1:5432
```
If I run the server without containers (with Node.js and Postgres installed directly on my machine), it works. But I want it to work correctly with containers, and I don't understand what I'm doing wrong.
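For context, the server builds its Sequelize connection straight from DATABASE_URL, roughly like this (a simplified sketch, not my exact file; model definitions are omitted):

```js
// db.js (simplified) – the connection is created from the DATABASE_URL env var
const Sequelize = require('sequelize');

const sequelize = new Sequelize(process.env.DATABASE_URL, {
  dialect: 'postgres',
});

// This is the call that fails with ECONNREFUSED when running in containers
sequelize.authenticate()
  .then(() => console.log('Connected to postgres'))
  .catch((err) => console.error('Unable to connect:', err));

module.exports = sequelize;
```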
Here is the docker-compose.yml file:
```yaml
web:
  image: node
  command: npm start
  ports:
    - "8000:4242"
  links:
    - db
  working_dir: /src
  environment:
    SEQ_DB: mydatabase
    SEQ_USER: username
    SEQ_PW: pgpassword
    PORT: 4242
    DATABASE_URL: postgres://username:pgpassword@127.0.0.1:5432/mydatabase
  volumes:
    - ./:/src
db:
  image: postgres
  ports:
    - "5432:5432"
  environment:
    POSTGRES_USER: username
    POSTGRES_PASSWORD: pgpassword
```
Could someone help me please?
(someone who likes docker :) )
Your DATABASE_URL refers to 127.0.0.1, which is the loopback adapter. This means "connect to myself".

When you run both applications on the same host without Docker, they are both addressable on the same adapter (also known as localhost).

When you run both applications in containers, they are no longer both on localhost. Instead you need to point the web container at the db container's IP address on the docker0 adapter, which docker-compose sets for you.
Change:

```yaml
DATABASE_URL: postgres://username:pgpassword@127.0.0.1:5432/mydatabase
```

to

```yaml
DATABASE_URL: postgres://username:pgpassword@db:5432/mydatabase
```
This works thanks to Docker links: the web container has a file (/etc/hosts) with a db entry pointing to the IP address the db container is on. This is the first place a system (in this case, the container) looks when trying to resolve hostnames.
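If you want to see this for yourself, you can resolve the db hostname from inside the web container. A quick check could look like this (the file name and the docker-compose invocation in the comment are just examples, not part of your setup):

```js
// check-db-host.js – run inside the web container, e.g.:
//   docker-compose run web node check-db-host.js
// It resolves the "db" entry that docker-compose wrote into /etc/hosts.
const dns = require('dns');

dns.lookup('db', (err, address) => {
  if (err) {
    console.error('could not resolve "db":', err.code);
    return;
  }
  console.log('"db" resolves to', address); // the db container's IP
});
```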