Improve programmer guide

Signed-off-by: Solomon Hykes <sh.github.6811@hykes.org>
This commit is contained in:
Solomon Hykes 2021-03-27 00:13:43 +00:00
parent d6c51c6df5
commit 6a894afaf2


@@ -23,19 +23,32 @@ The deployment plan lays out every node in the application supply chain, and how
* Hosting infrastructure: compute, storage, networking, databases, CDN, etc.
* Software dependencies: operating systems, languages, libraries, frameworks, etc.
Nodes are interconnected to model the flow of code and data through the supply chain:
source code flows from a git repository to a build system; system dependencies are
combined in a docker image, then uploaded to a registry; configuration files are
generated then sent to a compute cluster or load balancer; etc.
Dagger follows a *reactive* programming model: when a component receives a new input
(for example a new version of source code, or a new setting), it recomputes its outputs,
which then propagate to adjacent nodes, and so on. Thus the flow of data through
the DAG mimics the flow of goods through a supply chain.
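As a hypothetical sketch (the `#Repository` and `#Build` definitions and their field names are invented for illustration), two nodes of a deployment plan might be wired together in Cue by referencing one component's output as another's input:

```cue
// Hypothetical sketch: wiring two nodes of a deployment plan.
src: #Repository & {
	remote: "https://github.com/example/app"
}

build: #Build & {
	// Referencing src's output forms an edge in the DAG.
	// When src.checkout changes, build recomputes its outputs,
	// which propagate to downstream nodes in turn.
	source: src.checkout
}
```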
## Relays
Dagger executes deployments by running *relays*.
A relay is a standalone software component assigned to a single node in the deployment plan.
One relay might fetch source code; another might run the build system; another might upload the container image; etc.
Relays are written in Cue, like the deployment plan they are part of. A relay is made of three parts:
* Inputs: data received from the user, or upstream relays
* A processing pipeline: code executed against each new input
* Outputs: data produced by the processing pipeline
Relays run in parallel, with their inputs and outputs interconnected into a special kind of graph
called a *DAG* (Directed Acyclic Graph). When a relay receives a new input, it runs it through the processing pipeline,
and produces new outputs, which are propagated to downstream relays as inputs, and so on.
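The three parts of a relay might be sketched in Cue as follows (the `#Uploader` name and its fields are invented for illustration):

```cue
// Hypothetical sketch of a relay's anatomy.
#Uploader: {
	// Inputs: data received from the user, or upstream relays
	source: string
	target: string

	// Processing pipeline: code executed against each new input.
	// Real relays attach an LLB pipeline here; elided in this sketch.

	// Outputs: data produced by the pipeline, consumed downstream
	url: string
}
```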
## Using third-party relays
Cue includes a complete package system. This makes it easy to create a complex deployment plan in very few
lines of code, simply by importing relays from third-party packages.
For example, to create a deployment plan involving Github, Heroku and Amazon RDS, one might import the three
corresponding packages:
@@ -58,15 +71,34 @@ backend: heroku.#App & {
db: rds.#Database & {
// RDS configuration values
}
```
## Creating a new relay
Sometimes there is no third-party relay available for a particular node in the deployment plan;
or it may exist but need to be customized.
A relay is typically contained in a Cue definition, with the definition name reflecting its function.
For example, a relay for a git repository might be defined as `#Repository`.
The inputs and outputs of a relay are simply Cue values in the definition.
The processing pipeline is a crucial feature of Dagger. It uses the [LLB](https://github.com/moby/buildkit)
executable format pioneered by the Buildkit project. It allows Dagger relays to run
sophisticated pipelines to ingest and process artifacts such as source code, binaries, database exports, etc.
Best of all, LLB pipelines can securely build and run any docker container, effectively making Dagger
scriptable in any language.
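As an illustrative sketch, a pipeline might fetch a container image and run a command in it. The `op` package and the `#FetchContainer`/`#Exec` operation names here are assumptions for the sake of the example; consult the Dagger package documentation for the actual API:

```cue
// Illustrative sketch: an LLB pipeline attached to a relay.
// The op package and operation names are assumptions, not a confirmed API.
import "dagger.io/dagger/op"

greeting: {
	#up: [
		// Pull a base image to run the pipeline in
		op.#FetchContainer & {ref: "alpine"},
		// Execute an arbitrary command inside the container;
		// any language available in the image can be scripted this way
		op.#Exec & {args: ["sh", "-c", "echo hello from a relay"]},
	]
}
```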
## Docker compatibility
Thanks to its native support for LLB, Dagger is natively compatible with Docker.
This makes it very easy to extend an existing Docker-based workflow, including:
* Reusing Dockerfiles and docker-compose files without modification
* Wrapping other deployment tools in a Dagger relay by running them inside a container
* Robust multi-arch and multi-OS support, including Arm and Windows
* Integration with existing Docker engines and registries
* Integration with Docker for Mac and Docker for Windows on developer machines
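For example, reusing an existing Dockerfile from a relay might look like the sketch below. The `op.#DockerBuild` operation name and its `dockerfile` field are assumptions for illustration, not a confirmed API:

```cue
// Illustrative sketch: building from a Dockerfile inside a relay.
// Operation and field names are assumptions, not a confirmed API.
import "dagger.io/dagger/op"

image: #up: [
	op.#DockerBuild & {
		// An inline Dockerfile, reused without modification;
		// an existing file from a source directory could be used instead
		dockerfile: """
			FROM alpine
			RUN apk add --no-cache curl
			"""
	},
]
```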