Dev Environment - Setup Tools - FullstackCodingGuy/Developer-Fundamentals GitHub Wiki

Docker

To verify that the Docker daemon is installed and running:

docker run hello-world

Postgres

docker pull postgres
docker run --name my-postgres -e POSTGRES_PASSWORD=mysecretpassword -d postgres

Data Import


Steps to import

To import CSV data into a PostgreSQL database running in a Docker container, follow these steps:


1. Start a PostgreSQL Container

If you don't already have a PostgreSQL container running, start one:

docker run --name my-postgres -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mysecretpassword -e POSTGRES_DB=mydatabase -p 5432:5432 -d postgres

This starts a PostgreSQL container with:

  • User: myuser
  • Password: mysecretpassword
  • Database: mydatabase
  • Port: 5432
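
The values above combine into a standard libpq connection URL that any PostgreSQL client accepts. A minimal sketch (the credentials are the ones assumed in this guide; substitute your own):

```shell
# Assemble a libpq-style connection URL from the container's settings.
PGUSER=myuser
PGPASSWORD=mysecretpassword
PGDATABASE=mydatabase
PGHOST=localhost
PGPORT=5432
URL="postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}:${PGPORT}/${PGDATABASE}"
echo "$URL"
```

Clients such as psql, pgAdmin, and application drivers can all connect with this single string, e.g. `psql "$URL"`.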

2. Copy CSV File into the Container

If your CSV file is on your local machine, copy it into the container:

docker cp mydata.csv my-postgres:/mydata.csv

Alternatively, mount a local directory when running the container:

docker run --name my-postgres -v $(pwd)/data:/data -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mysecretpassword -e POSTGRES_DB=mydatabase -p 5432:5432 -d postgres

Then place the CSV file inside the data/ folder.
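
As a concrete sketch of that layout (file names are the examples used above), create the folder and a small test file, and make it readable by the postgres user inside the container (UID 999 in the official image; world-readable is sufficient):

```shell
mkdir -p data
# A tiny stand-in CSV; replace with your real file.
printf 'name,age,city\nAlice,30,Oslo\n' > data/mydata.csv
chmod 644 data/mydata.csv
ls -l data/mydata.csv
```

Inside the container, the file then appears at /data/mydata.csv.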


3. Connect to PostgreSQL

Access the running PostgreSQL container:

docker exec -it my-postgres psql -U myuser -d mydatabase

4. Create a Table (If Needed)

Ensure the table structure matches your CSV data:

CREATE TABLE mytable (
    id SERIAL PRIMARY KEY,
    name TEXT,
    age INT,
    city TEXT
);
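
A quick way to catch column mismatches before running COPY is to check the field count of every row against the table definition. A sketch using a sample file that matches mytable (id is SERIAL, so it is omitted from the CSV):

```shell
# Sample data matching the mytable columns (name, age, city).
cat > mydata.csv <<'EOF'
name,age,city
Alice,30,Oslo
Bob,42,Lisbon
EOF
# Every row should have exactly 3 comma-separated fields.
RESULT=$(awk -F',' 'NF != 3 { bad = 1 } END { print (bad ? "mismatch" : "ok") }' mydata.csv)
echo "$RESULT"
```

If this prints "mismatch", fix the file (or the COPY column list) before importing; COPY aborts the whole load on the first bad row.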

5. Import CSV Data Using COPY

Run the following command inside PostgreSQL:

COPY mytable(name, age, city)
FROM '/mydata.csv'
DELIMITER ','
CSV HEADER;
  • DELIMITER ',' → Specifies that values are comma-separated.
  • CSV HEADER → Skips the first row if it contains column headers.

6. Verify Import

Check if the data was imported successfully:

SELECT * FROM mytable;

Alternative: Import CSV Using psql from Host

If the CSV is on your local machine and PostgreSQL is accessible outside Docker:

cat mydata.csv | docker exec -i my-postgres psql -U myuser -d mydatabase -c "COPY mytable(name, age, city) FROM STDIN WITH (FORMAT csv, HEADER true, DELIMITER ',');"

Troubleshooting

  • Permission Denied? Server-side COPY reads the file as the postgres server process, so it requires superuser rights (or membership in pg_read_server_files) and a file readable inside the container. Run the import as the postgres superuser, or use client-side \copy instead.
  • Wrong File Path? Ensure the file is inside the container.
  • Encoding Issues? Convert the file to UTF-8 using:
    iconv -f ISO-8859-1 -t UTF-8 mydata.csv -o mydata_utf8.csv
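
The iconv fix can be verified end to end. A minimal sketch with a hypothetical Latin-1 file (the \351 octal escape is the byte 0xE9, 'é' in ISO-8859-1; file names here are illustrative):

```shell
# Create a one-line ISO-8859-1 encoded file.
printf 'Jos\351,34,Madrid\n' > latin1.csv
# Convert it to UTF-8 so PostgreSQL (typically a UTF-8 database) accepts it.
iconv -f ISO-8859-1 -t UTF-8 latin1.csv -o latin1_utf8.csv
cat latin1_utf8.csv
```

If the encoding is wrong, COPY fails with an "invalid byte sequence for encoding" error, so converting first is cheaper than debugging a partial import.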

If your interactive PostgreSQL session inside Docker is not working, here are possible reasons and solutions:


1. Ensure the Container is Running

First, check if your PostgreSQL container is running:

docker ps

If you don’t see your container, start it:

docker start postgres-container

Or, if you haven't created one yet:

docker run --name postgres-container -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mypassword -e POSTGRES_DB=mydatabase -p 5432:5432 -d postgres

2. Execute psql Inside the Container

Try accessing PostgreSQL using:

docker exec -it postgres-container psql -U myuser -d mydatabase

If this doesn’t work, check logs for errors:

docker logs postgres-container

3. Bash into the Container and Manually Run psql

If docker exec isn’t working, enter the container using bash:

docker exec -it postgres-container bash

Then manually run:

psql -U myuser -d mydatabase

If psql is not found, PostgreSQL might not be running inside the container. Check with:

pg_isready

If it’s not ready, note that the official postgres image runs the server as PID 1 with no init system, so service commands will not work inside it. Restart the container from the host instead:

docker restart postgres-container

(Only in a custom image with an init system would service postgresql restart or pg_ctl -D /var/lib/postgresql/data start apply.)

4. Check PostgreSQL Logs for Issues

If the service is failing, inspect the container's logs from the host:

docker logs postgres-container | tail -n 20

5. Verify PostgreSQL Port Binding

If you're connecting from outside the container, check if PostgreSQL is listening on port 5432:

docker inspect postgres-container | grep 5432

Or from inside the container (the official image does not ship netstat; install it first with apt-get update && apt-get install -y net-tools, or use ss from iproute2):

netstat -tulnp | grep 5432

If it’s not listening, try:

docker restart postgres-container

6. Check PostgreSQL Authentication Issues

If you see authentication errors, edit pg_hba.conf inside the container (the official image ships no text editor; install one first, e.g. apt-get update && apt-get install -y nano):

docker exec -it postgres-container bash
nano /var/lib/postgresql/data/pg_hba.conf

Ensure it contains a rule such as:

host all all 0.0.0.0/0 scram-sha-256

(Older setups use md5; recent postgres images default to scram-sha-256.)

Then restart the container so the change takes effect:

docker restart postgres-container

7. Alternative: Connect from Your Local Machine

If docker exec isn’t working, try connecting from your host:

psql -h localhost -U myuser -d mydatabase -p 5432

If that fails, ensure PostgreSQL is accepting external connections by editing postgresql.conf:

docker exec -it postgres-container bash
nano /var/lib/postgresql/data/postgresql.conf

Change:

listen_addresses = '*'

(The official postgres image already sets this by default, so it usually only needs changing in custom images.)

Then restart the container so the change takes effect:

docker restart postgres-container

Final Steps:

  1. Check container status: docker ps -a
  2. Restart the container: docker restart postgres-container
  3. Run psql manually inside the container: docker exec -it postgres-container psql -U myuser -d mydatabase
  4. Inspect logs for errors: docker logs postgres-container | tail -n 20

Importing data from CSV

To import the Customer-Churn-Records.csv file into the customer_churn table in PostgreSQL, follow these steps:


1. Ensure the Table Exists

First, make sure the customer_churn table is created in PostgreSQL.

If not, create it using:

CREATE TABLE customer_churn (
    row_number SERIAL PRIMARY KEY,
    customer_id BIGINT UNIQUE NOT NULL,
    surname TEXT NOT NULL,
    credit_score INT CHECK (credit_score BETWEEN 0 AND 1000),
    geography VARCHAR(50),
    gender VARCHAR(10) CHECK (gender IN ('Male', 'Female')),
    age INT CHECK (age >= 18),
    tenure INT CHECK (tenure >= 0),
    balance DECIMAL(18,2),
    num_of_products INT CHECK (num_of_products >= 0),
    has_cr_card BOOLEAN,
    is_active_member BOOLEAN,
    estimated_salary DECIMAL(18,2),
    exited BOOLEAN,
    complain BOOLEAN,
    satisfaction_score INT CHECK (satisfaction_score BETWEEN 1 AND 5),
    card_type VARCHAR(50),
    point_earned INT CHECK (point_earned >= 0)
);
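
Before writing the COPY column list, it helps to confirm how many columns the file actually has. The header below is an assumption based on the standard Kaggle version of this dataset; verify yours with head -n 1 Customer-Churn-Records.csv:

```shell
# Hypothetical header of Customer-Churn-Records.csv (RowNumber first).
HEADER='RowNumber,CustomerId,Surname,CreditScore,Geography,Gender,Age,Tenure,Balance,NumOfProducts,HasCrCard,IsActiveMember,EstimatedSalary,Exited,Complain,Satisfaction Score,Card Type,Point Earned'
# Count the comma-separated fields in the header line.
NCOLS=$(printf '%s\n' "$HEADER" | awk -F',' '{ print NF }')
echo "$NCOLS"
```

With this header the file has 18 columns, but a COPY column list that omits row_number expects only 17, so either include row_number in the list or strip the first column from the file before importing.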

2. Copy CSV File to the PostgreSQL Container

If PostgreSQL is running inside Docker, copy the file into the container:

docker cp /mnt/data/Customer-Churn-Records.csv postgres-container:/tmp/customer_churn.csv

3. Import CSV into PostgreSQL

Run the following command inside psql:

COPY customer_churn(customer_id, surname, credit_score, geography, gender, age, tenure, balance, num_of_products, 
                    has_cr_card, is_active_member, estimated_salary, exited, complain, satisfaction_score, 
                    card_type, point_earned)
FROM '/tmp/customer_churn.csv'
DELIMITER ','
CSV HEADER;

Explanation:

  • COPY customer_churn (...) → Specifies the table and the columns to import into; the list must match the CSV's column order exactly. If the file's first column is RowNumber, either add row_number to the list or remove that column from the file first.
  • FROM '/tmp/customer_churn.csv' → Path to the CSV inside the Docker container or server.
  • DELIMITER ',' → Comma-separated values.
  • CSV HEADER → Skips the first row (column names).

4. Verify the Data

After import, check if data is successfully loaded:

SELECT * FROM customer_churn LIMIT 10;

5. Handling Issues

Permission Denied Error?

If you get a permission error, make the file readable by the postgres server process (world-readable is enough; 777 is unnecessary):

chmod 644 /tmp/customer_churn.csv

Data Type Mismatch?

  • Check column order in COPY matches the CSV.

  • If a boolean column (e.g., has_cr_card) was created as INT, convert it using:

    ALTER TABLE customer_churn ALTER COLUMN has_cr_card TYPE BOOLEAN USING has_cr_card::BOOLEAN;
  • If COPY fails, use psql from your local machine:

    cat /mnt/data/Customer-Churn-Records.csv | docker exec -i postgres-container psql -U myuser -d mydatabase -c "COPY customer_churn FROM STDIN WITH (FORMAT csv, HEADER true, DELIMITER ',');"

Alternative: Use \copy (For Non-Superusers)

If you don’t have superuser privileges, use psql’s \copy, which reads the file on the client side with your own permissions:

\copy customer_churn FROM '/tmp/customer_churn.csv' DELIMITER ',' CSV HEADER;

This imports the CSV into PostgreSQL.
