Dev Environment ‐ Setup Tools - FullstackCodingGuy/Developer-Fundamentals GitHub Wiki
To verify that Docker is installed and running:

```shell
docker run hello-world
```

Pull the PostgreSQL image and start a container:

```shell
docker pull postgres
docker run --name my-postgres -e POSTGRES_PASSWORD=mysecretpassword -d postgres
```
- Data Source: Bank Customer Churn - https://www.kaggle.com/datasets/radheshyamkollipara/bank-customer-churn
Steps to import

To import CSV data into a PostgreSQL database running in a Docker container, follow these steps.

If you don't already have a PostgreSQL container running, start one:

```shell
docker run --name my-postgres -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mypassword -e POSTGRES_DB=mydatabase -p 5432:5432 -d postgres
```
This starts a PostgreSQL container with:

- User: `myuser`
- Password: `mypassword`
- Database: `mydatabase`
- Port: `5432`
If your CSV file is on your local machine, copy it into the container:

```shell
docker cp mydata.csv my-postgres:/mydata.csv
```

Alternatively, mount a local directory when running the container:

```shell
docker run --name my-postgres -v $(pwd)/data:/data -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mypassword -e POSTGRES_DB=mydatabase -p 5432:5432 -d postgres
```

Then place the CSV file inside the `data/` folder.
Access the running PostgreSQL container:

```shell
docker exec -it my-postgres psql -U myuser -d mydatabase
```
Ensure the table structure matches your CSV data:

```sql
CREATE TABLE mytable (
    id SERIAL PRIMARY KEY,
    name TEXT,
    age INT,
    city TEXT
);
```
Run the following command inside PostgreSQL:

```sql
COPY mytable(name, age, city)
FROM '/mydata.csv'
DELIMITER ','
CSV HEADER;
```
- `DELIMITER ','` → specifies that values are comma-separated.
- `CSV HEADER` → skips the first row if it contains column headers.
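`COPY` maps CSV columns onto the listed table columns by position, so it is worth sanity-checking the header before importing. A minimal Python sketch (the sample contents and column names follow the `mytable` example above; in practice you would read the real `mydata.csv`):

```python
import csv
import io

# Stand-in for the first lines of mydata.csv (assumed to have a header row).
sample = "name,age,city\nAlice,30,Paris\nBob,25,Berlin\n"

# Columns in the same order as COPY mytable(name, age, city).
expected = ["name", "age", "city"]

header = next(csv.reader(io.StringIO(sample)))
assert header == expected, f"Header mismatch: {header} != {expected}"
print("Header matches COPY column list")
```

If the header differs, either rename the CSV columns or reorder the column list in `COPY`.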
Check if the data was imported successfully:

```sql
SELECT * FROM mytable;
```
If the CSV is on your local machine and PostgreSQL is accessible outside Docker:

```shell
cat mydata.csv | docker exec -i my-postgres psql -U myuser -d mydatabase -c "COPY mytable FROM STDIN WITH CSV HEADER DELIMITER ',';"
```
- Permission denied? Run: `ALTER TABLE mytable OWNER TO myuser;`
- Wrong file path? Ensure the file is inside the container.
- Encoding issues? Convert the file to UTF-8 using: `iconv -f ISO-8859-1 -t UTF-8 mydata.csv -o mydata_utf8.csv`
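If `iconv` is not available, the same conversion can be done in Python. A small sketch using an in-memory Latin-1 sample (file names are illustrative):

```python
# Mirror `iconv -f ISO-8859-1 -t UTF-8` in pure Python.
latin1_bytes = "name,city\nJosé,Málaga\n".encode("iso-8859-1")

text = latin1_bytes.decode("iso-8859-1")  # interpret the original encoding
utf8_bytes = text.encode("utf-8")         # re-encode as UTF-8

# The on-disk equivalent would read mydata.csv and write mydata_utf8.csv
# with the same decode/encode pair.
print(utf8_bytes.decode("utf-8"))
```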
If your interactive PostgreSQL session inside Docker is not working, here are possible reasons and solutions:
First, check if your PostgreSQL container is running:

```shell
docker ps
```

If you don't see your container, start it:

```shell
docker start postgres-container
```

Or, if you haven't created one yet:

```shell
docker run --name postgres-container -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mypassword -e POSTGRES_DB=mydatabase -p 5432:5432 -d postgres
```
Try accessing PostgreSQL using:

```shell
docker exec -it postgres-container psql -U myuser -d mydatabase
```

If this doesn't work, check the logs for errors:

```shell
docker logs postgres-container
```
If running `psql` directly through `docker exec` isn't working, enter the container with a shell first:

```shell
docker exec -it postgres-container bash
```

Then manually run:

```shell
psql -U myuser -d mydatabase
```
If `psql` is not found, PostgreSQL might not be running inside the container. Check with:

```shell
pg_isready
```

If it's not ready, restart the service inside the container:

```shell
service postgresql restart
```

or

```shell
pg_ctl -D /var/lib/postgresql/data start
```

Note that in the official `postgres` image the server runs as the container's main process, so `docker restart postgres-container` from the host is often the simplest way to restart it.
If the service is failing, inspect the logs:

```shell
docker logs postgres-container | tail -n 20
```

If you're connecting from outside the container, check if PostgreSQL is listening on port 5432:

```shell
docker inspect postgres-container | grep 5432
```

Or from inside the container:

```shell
netstat -tulnp | grep 5432
```

If it's not listening, try:

```shell
docker restart postgres-container
```
If you see authentication errors, edit `pg_hba.conf` inside the container:

```shell
docker exec -it postgres-container bash
nano /var/lib/postgresql/data/pg_hba.conf
```

Ensure it contains:

```
host    all    all    0.0.0.0/0    md5
```

Then restart PostgreSQL:

```shell
service postgresql restart
```
If `docker exec` isn't working, try connecting from your host:

```shell
psql -h localhost -U myuser -d mydatabase -p 5432
```

If that fails, ensure PostgreSQL is accepting external connections by editing `postgresql.conf`:

```shell
docker exec -it postgres-container bash
nano /var/lib/postgresql/data/postgresql.conf
```

Change:

```
listen_addresses = '*'
```

Then restart PostgreSQL:

```shell
service postgresql restart
```
- Check container status: `docker ps -a`
- Restart PostgreSQL inside the container: `service postgresql restart`
- Run `psql` manually inside the container: `docker exec -it postgres-container psql -U myuser -d mydatabase`
- Inspect logs for errors: `docker logs postgres-container | tail -n 20`
---
To import the Customer-Churn-Records.csv file into the customer_churn table in PostgreSQL, follow these steps:
First, make sure the customer_churn table is created in PostgreSQL. If not, create it using:
```sql
CREATE TABLE customer_churn (
    row_number SERIAL PRIMARY KEY,
    customer_id BIGINT UNIQUE NOT NULL,
    surname TEXT NOT NULL,
    credit_score INT CHECK (credit_score BETWEEN 0 AND 1000),
    geography VARCHAR(50),
    gender VARCHAR(10) CHECK (gender IN ('Male', 'Female')),
    age INT CHECK (age >= 18),
    tenure INT CHECK (tenure >= 0),
    balance DECIMAL(18,2),
    num_of_products INT CHECK (num_of_products >= 0),
    has_cr_card BOOLEAN,
    is_active_member BOOLEAN,
    estimated_salary DECIMAL(18,2),
    exited BOOLEAN,
    complain BOOLEAN,
    satisfaction_score INT CHECK (satisfaction_score BETWEEN 1 AND 5),
    card_type VARCHAR(50),
    point_earned INT CHECK (point_earned >= 0)
);
```
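Kaggle exports commonly use CamelCase headers (e.g. `CustomerId`, `CreditScore`) while the table above uses snake_case. A small Python sketch for deriving the snake_case column names from such headers (the header names here are assumptions for illustration, not taken from the actual file):

```python
import re

def to_snake(name: str) -> str:
    """Convert a CamelCase CSV header to a snake_case column name."""
    s = name.replace(" ", "")                      # "Point Earned" -> "PointEarned"
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)  # "PointEarned" -> "Point_Earned"
    return s.lower()

headers = ["CustomerId", "Surname", "CreditScore", "NumOfProducts"]
print([to_snake(h) for h in headers])
# ['customer_id', 'surname', 'credit_score', 'num_of_products']
```

This only renames for comparison; `COPY` itself matches CSV columns to table columns by position, not by name.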
If PostgreSQL is running inside Docker, copy the file into the container:

```shell
docker cp /mnt/data/Customer-Churn-Records.csv postgres-container:/tmp/customer_churn.csv
```
Run the following command inside psql:

```sql
COPY customer_churn(customer_id, surname, credit_score, geography, gender, age, tenure, balance, num_of_products,
                    has_cr_card, is_active_member, estimated_salary, exited, complain, satisfaction_score,
                    card_type, point_earned)
FROM '/tmp/customer_churn.csv'
DELIMITER ','
CSV HEADER;
```
- `COPY customer_churn (...)` → specifies the table and columns to import into.
- `FROM '/tmp/customer_churn.csv'` → path to the CSV inside the Docker container or server.
- `DELIMITER ','` → comma-separated values.
- `CSV HEADER` → skips the first row (column names).
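A quick consistency check after the import is to compare `SELECT COUNT(*) FROM customer_churn;` with the number of data rows in the file. A sketch of the file-side count (the sample contents are illustrative; in practice you would open `/tmp/customer_churn.csv`):

```python
import csv
import io

# Stand-in for the CSV file.
sample = "customer_id,surname\n1,Smith\n2,Jones\n3,Lee\n"

reader = csv.reader(io.StringIO(sample))
next(reader)                       # skip the header row, as CSV HEADER does
data_rows = sum(1 for _ in reader)
print(data_rows)  # should equal SELECT COUNT(*) after the import
```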
After import, check if the data was successfully loaded:

```sql
SELECT * FROM customer_churn LIMIT 10;
```
If you get a permission error, make sure the PostgreSQL process can read the file (read permission is enough):

```shell
chmod 644 /tmp/customer_churn.csv
```
- Check that the column order in `COPY` matches the CSV.
- Convert boolean fields (e.g., `HasCrCard`) using: `ALTER TABLE customer_churn ALTER COLUMN has_cr_card TYPE BOOLEAN USING has_cr_card::BOOLEAN;`
- If `COPY` fails, use `psql` from your local machine: `cat /mnt/data/Customer-Churn-Records.csv | docker exec -i postgres-container psql -U myuser -d mydatabase -c "COPY customer_churn FROM STDIN WITH CSV HEADER DELIMITER ',';"`
If you don't have superuser privileges, use `\copy` instead (it reads the file client-side):

```sql
\copy customer_churn FROM '/tmp/customer_churn.csv' DELIMITER ',' CSV HEADER;
```
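PostgreSQL's `boolean` input already accepts `0/1`, `t/f`, `true/false`, and `yes/no` in `COPY`, so 0/1 flags normally load as-is. If a file mixes other spellings, a pre-cleaning sketch (the values and column names here are illustrative):

```python
import csv
import io

TRUE_VALUES = {"1", "t", "true", "y", "yes", "on"}

def normalize_bool(value: str) -> str:
    # Map assorted spellings onto PostgreSQL's canonical boolean literals.
    return "true" if value.strip().lower() in TRUE_VALUES else "false"

raw = "has_cr_card,exited\n1,0\nYes,no\n"
reader = csv.reader(io.StringIO(raw))
header = next(reader)
cleaned = [[normalize_bool(v) for v in row] for row in reader]
print(cleaned)
# [['true', 'false'], ['true', 'false']]
```

Write the cleaned rows back out with `csv.writer` before running `COPY` or `\copy`.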
This should successfully import your CSV into PostgreSQL! 🚀