Quickly master the use of Docker to build a development environment

As the platform has grown, development has become increasingly dependent on external environments, especially shared basic services. For the sake of convenience, developers often collaborate against the same public instances of these components. Under parallel development, especially when someone changes the database schema or its data, other developers waste time troubleshooting phantom problems; overall efficiency drops, and remote collaboration becomes much harder. To solve these problems, we will use Docker Compose to help developers build the development environment, so that in the end anyone with Docker installed can stand up the entire environment.

Knowing about a thing and actually achieving it are two completely different matters. Since the day Docker was born, we have dreamed of things like "deploy a project in 15 seconds" and "a version-controlled development environment", along with trendy operations terms such as "rolling deployments" and "software-defined architecture". Industry professionals at the crest of the wave have been defining, redefining, and commercializing many terms and tools, such as "orchestration" and "service discovery", with unprecedented enthusiasm.

I think the catalyst for this trend is the clean interface and abstraction Docker places between applications and infrastructure. Developers can reason about infrastructure without knowing the underlying architecture, and operations staff no longer spend so much time figuring out how to install and manage software. There must be some power hidden beneath this seemingly simple surface that makes everyone's life simpler and more efficient.

The real world is crueler than that. Don't assume that adopting a new technology brings nothing but enjoyment. Having worked on a number of projects over the past few years, I have run into some strange situations, and Docker is no exception. Still, experience gained on one project can usually be applied directly to the next. To gain real proficiency with Docker, you have to immerse yourself in actual projects and hone your skills.

Over the past year, I have been focused on writing my book on Docker fundamentals, Docker in Action.

I've noticed that almost everyone who starts learning Docker struggles with how to create a development environment before they understand how the pieces of the ecosystem relate to one another. Everyone starts out assuming that Docker will make setting up an environment easier, which is not entirely wrong. There are plenty of "containerization" tutorials covering how to create an image and package a tool into a container, but Dockerizing a development environment is a different matter entirely.

As an early adopter, let me share my experience.

I used to be an experienced Java user, but the experience I'm sharing here is not about Java; it is about application development with Go and Node. I have some Go experience and am actively working to improve it. The main problem when getting up to speed in an unfamiliar field is settling on the right workflow. I also hate endlessly installing software on my laptop, which is what drives me to use Docker, or sometimes Vagrant, for these tasks.

The project I'm involved in is a standard REST service written in Go, based on gin, which depends on Redis and NSQ and their client libraries. That means I need to import those client libraries and run local Redis and NSQ instances. More interestingly, I also serve some static resources through NGINX.

For the uninitiated: Go is a programming language that ships with a command-line tool, also called "go", which handles everything from dependency management to compilation to running tests. For a Go project, apart from Git and a good editor, that tool covers almost everything else. There is still a problem, though: I don't want to install Go on my laptop. I only want Git and Docker on my laptop. Keeping the local toolset that small improves portability across environments and lowers the barrier to entry for newcomers.

The project has runtime dependencies, which means the toolset needs to include Docker Compose for simple environment definition and orchestration. Many people feel uneasy at this point: where do we start? With a Dockerfile, or with docker-compose.yml? Let me first show you what I did, and then explain why.

I want my local builds to be fully automatic. I don't like performing steps by hand, and my vim configuration is pretty spartan. I just want to control the runtime environment at a "running or not" level. A localized development environment should be quickly reproducible, not only for productivity but also so the Docker images can be shared. I ended up with a Dockerfile that produces an image containing Go, Node, and my favorite build tool, Gulp. The Dockerfile embeds no application code, and the image contains no gulpfile. Instead, a volume is defined at an established GOPATH (the root of the Go workspace).

Finally, I set gulp as the entrypoint of the image and made the default command the watch task. The resulting image is definitely not what I would call a build artifact. In that sense, the only thing this environment does is provide a running instance that tells us whether the code works. For my scenario, that is exactly right. I'll use "artifact" to refer to the output of a different kind of build.
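The image described above might be sketched as the following Dockerfile. This is a hypothetical reconstruction, not the author's exact file: the base image tag, the Node setup script, and the /go/src/app workspace path are all assumptions.

```dockerfile
# Dev-tooling image: Go + Node + Gulp; no application code is baked in
FROM golang:1.22

# Install Node.js and the Gulp CLI for the build/watch workflow
RUN curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && npm install -g gulp-cli \
    && rm -rf /var/lib/apt/lists/*

# The Go workspace root; local source is bind-mounted here at run time
ENV GOPATH=/go
VOLUME /go/src/app
WORKDIR /go/src/app

# Serve everything through gulp; default to the watch task
ENTRYPOINT ["gulp"]
CMD ["watch"]
```

Because the source arrives through a volume rather than a COPY step, the same image can be reused across checkouts and shared between developers.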

Next I defined the local development environment with Compose. First, I declared all the dependent services using images from Docker Hub and connected them to a "target" service. That service is built from the new Dockerfile, bind-mounts the local source directory onto the mount point the image expects, and exposes a few ports for testing. Then I added a service that continuously loops a suite of integration tests against the target service. Finally, I added an NGINX service and mounted a volume containing its configuration files and static assets. The benefit of using volumes is that you can change configuration and assets without rebuilding the image.
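A docker-compose.yml matching that description might look like the sketch below. The service names, ports, and paths are illustrative assumptions, not the original file:

```yaml
version: "3.8"
services:
  redis:
    image: redis:7
  nsqd:
    image: nsqio/nsq
    command: /nsqd
  target:
    build: .                    # the dev-tooling Dockerfile described above
    volumes:
      - .:/go/src/app           # bind-mount local source into the workspace
    ports:
      - "8080:8080"             # expose the REST service for manual testing
    depends_on: [redis, nsqd]
  integration:
    build: ./integration        # loops a suite of tests against "target"
    depends_on: [target]
  nginx:
    image: nginx:1.25
    volumes:
      - ./conf:/etc/nginx/conf.d:ro         # reuse config without rebuilds
      - ./static:/usr/share/nginx/html:ro   # static assets
    ports:
      - "8081:80"
```

The nginx and integration services can be edited or restarted independently of the target service, which is the point of keeping config and assets in volumes.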

All of this ultimately produces a local development environment on your computer. When you run:

docker-compose up -d

the whole environment comes up and the watch loop starts; there is no need to rebuild the image or restart containers. Every time a .go file changes, Gulp rebuilds and restarts my service inside the running container. It's that simple.

Was this environment easy to create? Not really, but it paid off. Wouldn't it be simpler to skip containers and install Go, Node, and Gulp locally? Maybe, in this one scenario, but then only the dependent services would be running in Docker. I don't like that split.

I used to have to manage multiple versions of these tools, which led to tangled environment variables and artifacts scattered everywhere, and, lacking centralized version control of the environment, I had to keep reminding colleagues about those conflict-prone variables.

Maybe you don't like the environment described above, or your project has different needs. That's fine. This article is not arguing that every tool should run in Docker; if it were, it would mean we hadn't thought about what problem we're trying to solve.

When I designed this environment, I weighed the following questions and concerns, along with some potential answers. When you start working with Docker, you may well arrive at different answers.

When you think about packaging and environments, what do you consider first?

This is indeed the most important question. In this scenario there are several options. I could program directly inside a container running Go, which looks something like this:
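The original listing is missing here; a plausible sketch of the idea is running the go tool inside a throwaway container (the image tag and mount paths are assumptions, not the author's exact command):

```shell
# Run any go subcommand inside a disposable container,
# mounting the current project into the image's Go workspace
docker run --rm -it \
  -v "$PWD":/go/src/app \
  -w /go/src/app \
  golang:1.22 \
  go test ./...
```

Swap `go test ./...` for `go build`, `go vet`, or any other subcommand; nothing is installed on the host, and `--rm` discards the container afterwards.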

In fact, most of the boilerplate in this example can be hidden behind shell aliases or functions, which makes it feel as if Go were installed natively. The container can also plug into the normal Go workflow to create artifacts. These features are useful for many projects, though not necessarily for libraries and packaged software.
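As one hypothetical way to hide that boilerplate (not from the original article), a shell function can shadow the go command so every invocation runs in a container; the image tag and mount path are assumptions:

```shell
# A "go" command that actually runs inside a disposable container.
# Requires Docker on the host; the golang:1.22 tag is an assumption.
go() {
  docker run --rm -it \
    -v "$PWD":/go/src/app \
    -w /go/src/app \
    golang:1.22 go "$@"
}
```

After sourcing this in your shell profile, `go build` or `go test` behaves as if Go were installed locally, while the toolchain itself lives only in the image.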

If you are already using Gulp, make, ant, or similar scripts, you can simply use Docker as a target for those tools.

Alternatively, I could get a more Docker-native experience by using docker build to define and control my builds, like so:
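The original listing is missing; given the onbuild base mentioned below, it was probably close to this minimal Dockerfile (a sketch; note that the golang onbuild variant has since been deprecated, and the exposed port is an assumption):

```dockerfile
# The onbuild variant automatically copies the source and runs
# `go get` and `go build` when an image is built FROM it
FROM golang:onbuild
EXPOSE 8080
```

Building it is just `docker build -t myservice .` from the project root, with all the Go-specific steps inherited from the base image's ONBUILD triggers.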

Driving your build with Docker has several benefits. You can build on previously compiled images, and Dockerfile builds are cached so that only the minimal changed steps are repeated (assuming a well-structured Dockerfile). Finally, the images these builds produce can be shared with other developers.

In this case, I used the onbuild image from the golang repository as a base. It includes some clever logic for downloading dependencies. This approach generates a Docker image that is easy to use in non-production environments. The problem with using it for production-grade images is that extra steps are needed to keep the image small and to include init scripts that verify state before starting and then monitor the service.

Interestingly, Docker itself is built with a collection of scripts, Makefiles, and Dockerfiles. Its build system is fairly robust and handles all kinds of tests, linting, and so on, as well as producing artifacts for various operating systems and architectures. In that setup, containers are the tool used to generate the binaries, built from a local build image.

To go beyond plain docker build, you can use Compose to define a complete development environment.

Compose takes care of environment management, and unsurprisingly the result is very clean. Compose ties everything together: it manages volumes, automatically builds images when they are missing, and aggregates log output. I chose this approach because it simplifies the service dependencies and produces the artifacts I need.
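In day-to-day use, that boils down to a handful of Compose commands (a sketch; the `target` service name is the assumption carried over from earlier):

```shell
docker-compose build            # rebuild images that are out of date
docker-compose up -d            # start the whole environment in the background
docker-compose logs -f target   # follow the aggregated log output of one service
docker-compose down -v          # tear everything down, including volumes
```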

In this example the deliverable is a runtime container, and Compose and Docker are well suited to producing that. In other scenarios you may need a distributable image instead, or you may want the build to produce a binary for your local machine.

If what you want is a distributable image, the source code or precompiled binaries must be embedded into the image at build time. Volumes are not mounted during a build, which means the image has to be rebuilt on every iteration.
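Embedding the source at build time means a COPY step in the Dockerfile, roughly like this (a sketch; paths and the binary name are illustrative):

```dockerfile
FROM golang:1.22
WORKDIR /go/src/app

# The source is baked into an image layer here, so any code change
# invalidates the cache from this step on and forces a rebuild
COPY . .
RUN go build -o /usr/local/bin/myservice .

CMD ["myservice"]
```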

If you want to generate artifacts inside a container, you need a mounted volume, which is easy to set up from the Docker command line or in a Compose file. But be aware that volumes only work in a running container, which means you cannot do this with docker build alone.
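A one-off containerized build that drops the binary onto the host through a mounted volume might look like this (a sketch; the image tag and output path are assumptions):

```shell
# `docker build` cannot mount volumes, so run the compile step
# in a regular container instead
docker run --rm \
  -v "$PWD":/go/src/app \
  -w /go/src/app \
  golang:1.22 \
  go build -o ./bin/myservice .
# the compiled binary lands in ./bin on the host
```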

Summary

There is currently no single "Docker way" to create a development environment. Docker is a toolkit, not a holy book. Rather than adopting someone else's docker build system wholesale, spend some time learning the tool, clarify your own needs, and then create the Docker environment that suits you.

That concludes this article on quickly mastering Docker to build a development environment. For more on building development environments with Docker, search for earlier articles on 123WORDPRESS.COM or browse the related articles below. Thank you for supporting 123WORDPRESS.COM!

You may also be interested in:
  • PyCharm uses Docker images to build a Python development environment
  • Docker builds your own PHP development environment
  • Use docker-compose to build AspNetCore development environment
  • How to quickly build an Oracle development environment using Docker
  • Detailed explanation of setting up Docker development environment on MAC OSX
  • Ubuntu builds a LNMP+Redis development environment based on Docker (picture and text)
  • A detailed tutorial on building a PHP development environment based on Docker
