I keep...not being able to enter the Postgres database via the terminal when it's running in a Docker container.
So I scrapped my work, re-downloaded the image, took clearer notes with commands and examples of how the prompts should look, and got inside, taking all of 10 minutes.
It's like entering a multi-layered vault.
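For reference, here's the rough sequence from my notes. The container name is from my setup, and I'm assuming the stock postgres image's default postgres user, so swap in your own:

docker ps                                  # confirm the container is running and copy its exact name
docker exec -it RUNTCPIP-DBRep /bin/bash   # open a shell inside the container
psql -U postgres                           # connect to Postgres as the default user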
If you're working with a Docker container through the terminal, keep in mind that container names are case-sensitive.
docker exec -it RUNTCPIP-DBRep /bin/bash
is not the same as
docker exec -it RUNTCPIP-DBREP /bin/bash
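If you're not sure of the exact spelling, docker ps will print the running containers' names exactly as Docker stored them:

docker ps --format "{{.Names}}"   # list only the container names, exact casing included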
Downloading Open Datasets Two Ways For Test Data
Heard of Kaggle? Now you have.
The first way: I took this dataset (free to use) and downloaded it to the Linux system in the container.
I updated packages with apt-get update -y and installed curl (apt-get install -y curl).
(Keep in mind you will likely have to change the output path Kaggle gives you. I used a period [.] and ended up with the file in a directory I could only reach through an odd song and dance of entering and exiting directories. I expected it to download the zip file into the directory I was currently in. That's on me.)
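Putting the first way together, here's a minimal sketch; the download URL and paths are placeholders, not the actual dataset link:

apt-get update                                                  # refresh the package lists
apt-get install -y curl unzip                                   # curl to download, unzip to extract
curl -L -o /tmp/dataset.zip "https://example.com/dataset.zip"   # -o pins the zip to a path you choose
unzip /tmp/dataset.zip -d /tmp/dataset                          # extract into a known directory

Giving curl an explicit -o path is what saves you from the song and dance above.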
The second way: In the end I downloaded the file locally to my Docker folder on Windows 11 and used the docker cp command I learned here, and it worked flawlessly.
I also unzipped it with unzip.
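A sketch of the second way, run from the Windows host; the local path is a placeholder and the container name is the one from earlier:

docker cp C:\Docker\dataset.zip RUNTCPIP-DBRep:/tmp/dataset.zip   # copy the zip from the host into the running container
docker exec -it RUNTCPIP-DBRep /bin/bash                          # get a shell in the container
unzip /tmp/dataset.zip -d /tmp/dataset                            # extract it inside the container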
After talking with people in the industry, we realized that this may not be the best way to have a reusable data-testing environment; with that amount of data, it would take too long to start up each time.
So I suggested a solution where production data is backed up to an S3 bucket every so often, which people could pull from and use for testing.
We could also schedule snapshots of our DynamoDB tables.
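As a rough sketch of the idea (the bucket, database, and table names here are made up), the periodic backup could be a pg_dump pushed to S3, and DynamoDB supports on-demand backups through the AWS CLI:

pg_dump -U postgres mydb | gzip > backup.sql.gz                                       # dump the production database
aws s3 cp backup.sql.gz s3://team-test-data/backup.sql.gz                              # push it to the shared bucket
aws s3 cp s3://team-test-data/backup.sql.gz .                                          # teammates pull it down to seed a test environment
aws dynamodb create-backup --table-name MyTable --backup-name MyTable-test-snapshot   # on-demand DynamoDB backup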