- Refactored to keep every transformation of built-in types (e.g. FS,
  Secret, etc.) to/from CUE in the same place (`plancontext`)
- dagger.#Service and dagger.#Secret now follow the new FS-like format
  (e.g. `_service: id: string`); see the sketch below
- Backward compatibility:
  - dagger.#Stream is now an alias for dagger.#Service
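A minimal sketch of what the new shape looks like, assuming the hidden wrapper field for secrets mirrors the `_service` example above (the exact definitions live in the dagger package and may differ):
```cue
// Illustrative only: each built-in type wraps a context ID under a
// hidden field, following the same shape as dagger.#FS.
#Service: {
	_service: id: string
}

// Assumed to mirror #Service; the hidden field name is a guess.
#Secret: {
	_secret: id: string
}

// Backward compatibility: #Stream is simply an alias for #Service.
#Stream: #Service
```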
Signed-off-by: Andrea Luzzardi <aluzzardi@gmail.com>
- Implement dagger.#FS support
- Migrate `context.imports` to dagger.#FS
- Backward compat: dagger.#FS can be passed in lieu of a
  dagger.#Artifact
  - For instance, an import (`dagger.#FS`) can be passed to the current
    `yarn.#Package` implementation (see the sketch below)
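A rough sketch of what that looks like in a configuration; the `alpha.dagger.io/js/yarn` import path and the `source` field name are assumptions based on the existing yarn package:
```cue
package main

import (
	"alpha.dagger.io/dagger"
	"alpha.dagger.io/js/yarn"
)

// An import surfaced as a dagger.#FS (how it gets populated is the
// plan context's concern)
src: dagger.#FS

// Backward compat: the dagger.#FS is accepted where the current
// yarn.#Package still expects a dagger.#Artifact
app: yarn.#Package & {
	source: src
}
```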
Signed-off-by: Andrea Luzzardi <aluzzardi@gmail.com>
This change helps the transition between `dagger input` and `#Plan.context`.
In summary, the codebase now relies on a *context* for execution, with resources referenced by *IDs*.
In the future, *context* will come from a `#Plan.context`.
In the meantime, a bridge converts `dagger input` to a plan context. This allows both *old* and *new* style configurations to co-exist with the same underlying engine.
- Implement `plancontext`. Context holds the execution context for a plan. Currently this includes the platform, local directories, secrets and services (e.g. unix/npipe).
- Contextual data can be registered at any point. In the future, this will be done by `#Plan.context`
- Migrated the `dagger input` codebase to register inputs in a `plancontext`
- Migrated low-level types/operations to the *Context ID* pattern (see the sketch below)
  - `dagger.#Stream` now only includes an `id` (instead of a `unix` path)
  - `dagger.#Secret` still only includes an `id`, but it is now based on `plancontext`
  - `op.#Local` now only includes an `id` (instead of `path`, `include`, `exclude`)
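A minimal sketch of the resulting shapes (the `do` discriminator on `op.#Local` and anything beyond the `id` fields are assumptions):
```cue
// Illustrative only: low-level types now carry an ID that the engine
// resolves against the plan context at execution time.
#Stream: {
	id: string
}

#Secret: {
	id: string
}

// The local directory (path, include, exclude) is registered in the
// plan context up front; the operation only references it by ID.
#Local: {
	do: "local"
	id: string
}
```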
Signed-off-by: Andrea Luzzardi <aluzzardi@gmail.com>
This adds support for loading artifacts (e.g. docker.#Build,
os.#Container, ...) into any arbitrary docker engine (through a
dagger.#Stream for UNIX sockets, or SSH for a remote engine)
Implementation:
- Add op.#SaveImage which serializes an artifact into an arbitrary path
(docker tarball format)
- Add docker.#Load which uses op.#SaveImage to serialize to disk and
executes `docker load` to load it back
Caveats: Because we're doing this in userspace rather than letting
dagger itself load the image, the performance is pretty bad.
The buildkit API is meant for streaming (get a stream of a docker image
and pipe it into `docker load`). Since we operate in userspace, we have
to load the entire docker image into memory, then serialize it in a
single WriteFile LLB operation.
Example:
```cue
package main
import (
	"alpha.dagger.io/dagger"
	"alpha.dagger.io/docker"
)

source:       dagger.#Input & dagger.#Artifact
dockersocket: dagger.#Input & dagger.#Stream

build: docker.#Build & {
	"source": source
}

load: docker.#Load & {
	source: build
	tag:    "testimage"
	socket: dockersocket
}
```
Signed-off-by: Andrea Luzzardi <aluzzardi@gmail.com>