I have an SQL file which creates tables and their data, and I want to use that dump file in my docker-compose setup. The best solution I could come up with was running a curl command to download the dump file from an external URL and then using it in my Docker entrypoint. I also want to automate this process: is it possible to run the curl command in the pipeline and delete the dump file after the containers are running?
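Roughly, the steps I have in mind look like this (the URL, file name, and compose setup are just placeholders):

```sh
# Download the dump from an external URL (placeholder URL)
curl -fsSL https://example.com/dump.sql -o dump.sql

# Start the containers; the compose file is assumed to mount ./dump.sql
docker compose up -d

# Remove the dump once the containers are up and initialised
rm dump.sql
```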
1 Answer
Whether or not you have curl available in your pipeline depends on the image the pipeline is using.
For example, I was recently running a pipeline with the node:18-alpine image. That image is very minimal - it doesn't have many libraries installed - and so commands like curl and bash were not available.
I switched to the node:18 image because it comes with those libraries installed.
If the image you're using has the apt-get package manager installed, you could use it to install curl, e.g. apt-get update && apt-get install -y curl, though I imagine if your image has apt-get it probably already has curl installed.
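For example, a minimal sketch of that install step; the Alpine variant is included because the alpine images mentioned above use apk rather than apt-get:

```sh
# Debian/Ubuntu-based images (e.g. node:18)
apt-get update && apt-get install -y curl

# Alpine-based images (e.g. node:18-alpine)
# apk add --no-cache curl
```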
In terms of your reason for wanting to use curl in the first place, it sounds like what you need is the ability to access a SQL dump file during your Docker build process?
If the goal is to load that dump file into your DB, you can do that without curl by mounting the file as a volume and then executing the SQL commands in the file against the database.
How exactly that works will depend on which RDBMS you are using.
E.g., here's an example of how to use a SQL dump in a container running MySQL, where you simply mount the SQL file into the container as a file named start.sql and MySQL automatically executes the commands in the file: How to import a mysql dump file into a Docker mysql container
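A minimal docker-compose sketch of that approach, assuming the official mysql image (service name, credentials, and file names are placeholders); the official image runs any .sql files mounted into /docker-entrypoint-initdb.d the first time the database is initialised:

```yaml
services:
  db:
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder credentials
      MYSQL_DATABASE: mydb
    volumes:
      # Mount the local dump so MySQL runs it on first initialisation
      - ./dump.sql:/docker-entrypoint-initdb.d/start.sql:ro
```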
Here's another example for PostgreSQL, where you run the import command via the docker exec command: Backup/Restore a dockerized PostgreSQL database
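A rough sketch of that approach, assuming the official postgres image and placeholder container, user, and database names:

```sh
# Pipe the dump into psql inside the running container
cat dump.sql | docker exec -i my-postgres psql -U postgres -d mydb
```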
Hope that helps