Dockerizing a real-world ASP.NET Core application
The day after successfully dockerizing SimplCommerce, I started to take another look at the approach and the code. I also received a bug report that the container failed to start again after being stopped. The thing I was proudest of was that it took only one command to run the entire application, including the database, in a single container. But this approach revealed several drawbacks:
- The Dockerfile is big, and it takes a long time to build: around 15 minutes on Docker Hub.
- Postgres has its own way of initializing the container. I had extended that process to run the Entity Framework migration, import static data and call `dotnet run` to launch the website. But this process runs only once, after the container starts for the first time, and that is the root cause of the bug I mentioned above.
- Putting both the database and the website into one box is generally not a good practice.
I decided to make changes. The first thing I did was separate the database from the website. For the database I use the default postgres image without any customization.
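With the database in its own container, the two pieces can be started separately. A sketch of what that might look like (the network name and password here are assumptions; the container name `simpldb` matches the host name used in the psql commands later in the post):

```shell
# Create a user-defined network so the web container can reach
# the database container by name ("simpldb")
docker network create simplnet

# Run the stock postgres image, no customization needed
docker run -d --name simpldb --network simplnet \
  -e POSTGRES_PASSWORD=yourpassword \
  postgres
```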
"Separating what changes from what stays the same" is always a good practice, so it's a good idea to separate the source code from the SDK. With this in mind I created the simpl-sdk docker image. I need the .NET Core 1.1 project.json SDK, Node.js, gulp-cli and the postgresql client, so it makes sense to start from microsoft/dotnet:1.1.0-sdk-projectjson and install the other tools on top of it.
simpl-sdk Dockerfile
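The original Dockerfile is not reproduced here, but based on the description above, a minimal sketch might look like this (the exact package names, the Node.js version and the nodesource setup script are assumptions):

```dockerfile
# Start from the .NET Core 1.1 SDK image for project.json-based projects
FROM microsoft/dotnet:1.1.0-sdk-projectjson

# Install Node.js, gulp-cli and the postgresql client on top of it
RUN apt-get update \
    && apt-get install -y curl postgresql-client \
    && curl -sL https://deb.nodesource.com/setup_6.x | bash - \
    && apt-get install -y nodejs \
    && npm install -g gulp-cli \
    && rm -rf /var/lib/apt/lists/*
```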
I created a GitHub repository for it and went to Docker Hub to create an automated build repository, similar to the way I did for SimplCommerce earlier.
SimplCommerce Dockerfile
Now the Dockerfile for SimplCommerce becomes very small and clean. Starting from simpl-sdk, it copies the source code into the image, restores NuGet packages, builds all the projects, calls `gulp copy-modules` to copy the build output of the modules to the host, then copies and sets the entry point.
The entry point
By the time the entry point runs, the connection to the database is ready. It runs `dotnet ef database update` to apply the migrations, then uses psql to connect to the database and, if no data is found, imports the static data. Finally it calls `dotnet run` to start the app.
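A sketch of what docker-entrypoint.sh might look like, following those steps (the host name, database name and script path are taken from the psql examples later in the post; the `Core_User` table check is an assumption about how "no data found" is detected):

```shell
#!/bin/bash
set -e

# Apply Entity Framework migrations against the database
dotnet ef database update

# Import static data only if the database is still empty
count=$(echo 'select count(*) from "Core_User";' \
  | psql -h simpldb --username postgres -d simplcommerce -t)
if [ "$count" -eq 0 ]; then
    psql -h simpldb --username postgres -d simplcommerce \
         -a -f /app/src/Database/StaticData_Postgres.sql
fi

# Start the website
dotnet run
```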
It looks simple and straightforward, huh? But it took me a lot of time to make it run smoothly:
- First, I was not familiar with writing shell scripts.
- Second, I was also not familiar with psql.
- Third, there are some weird differences between Linux and Windows.
The first error that I got when starting the container was:

"panic: standard_init_linux.go:175: exec user process caused "no such file or directory" [recovered] panic: standard_init_linux.go:175: exec user process caused "no such file or directory""

What the hell is that? After some googling, I fixed it by changing the line endings in docker-entrypoint.sh from CRLF to LF. In Notepad++, select Edit -> EOL Conversion.
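The same fix can be done from the command line, which is handy on a build machine without an editor. A small demonstration (using GNU sed; `dos2unix` also works where available):

```shell
# A sample script saved with Windows (CRLF) line endings
printf '#!/bin/bash\r\necho hello\r\n' > docker-entrypoint.sh

# Strip the trailing carriage return from every line, in place (GNU sed)
sed -i 's/\r$//' docker-entrypoint.sh

# Verify: grep no longer finds any carriage returns
if ! grep -q $'\r' docker-entrypoint.sh; then
  echo "line endings are LF"
fi
```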
The second error was:

"docker: Error response from daemon: invalid header field value "oci runtime error: container_linux.go:247: starting container process caused \"exec: \\\"/docker-entrypoint.sh\\\": permission denied\"\n"."

Something related to permissions. I continued googling and was able to fix it by adding `RUN chmod 755 /docker-entrypoint.sh` before the `ENTRYPOINT ["/docker-entrypoint.sh"]` line.
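In the Dockerfile, those lines sit together like this (a sketch; the script path matches the error message above):

```dockerfile
COPY docker-entrypoint.sh /docker-entrypoint.sh
# Make the script executable inside the image, otherwise
# the container runtime refuses to exec it
RUN chmod 755 /docker-entrypoint.sh
ENTRYPOINT ["/docker-entrypoint.sh"]
```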
Bonus - Some useful psql commands
Connect to a server, run a query and print the output to a file:
echo 'select count(*) from "Core_User";' | psql -h simpldb --username postgres -d simplcommerce -t > /tmp/varfile.txt
-h: host, -d: database, -t: print just the tuple values
Run an SQL script:
psql -h simpldb --username postgres -d simplcommerce -a -f /app/src/Database/StaticData_Postgres.sql
-h: host, -d: database, -a: echo all queries from the script, -f: file
Connect to a server:
psql -h simpldb --username postgres
Once connected:
\l : list all databases
\c dbname : connect to the dbname database
In a database:
\i path.sql : execute an SQL script file
\dt : list tables
select * from "Core_User"; : execute a query; the semicolon at the end is required, and the table name is case sensitive
Quit psql:
\q : quit