Put into place initial release engineering

This can release x86_64-linux binaries to staging, with ephemeral keys.

I think it's good enough to review at this point, so that we don't keep
adding more stuff to it and make it harder to review.

Change-Id: Ie95e8f35d1252f5d014e819566f170b30eda152e
Parent: 611b1de441
Commit: c32a01f9eb
.gitignore (vendored): 1 addition

@@ -28,3 +28,4 @@ buildtime.bin
 # We generate this with a Nix shell hook
 /.pre-commit-config.yaml
 /.nocontribmsg
+/release
flake.nix: 5 additions, 1 deletion

@@ -207,7 +207,6 @@
       overlays.default = overlayFor (p: p.stdenv);

       hydraJobs = {

         # Binary package for various platforms.
         build = forAllSystems (system: self.packages.${system}.nix);

@@ -297,6 +296,11 @@
         );
       };

+      release-jobs = import ./releng/release-jobs.nix {
+        inherit (self) hydraJobs;
+        pkgs = nixpkgsFor.x86_64-linux.native;
+      };
+
       # NOTE *do not* add fresh derivations to checks, always add them to
       # hydraJobs first (so CI will pick them up) and only link them here
       checks = forAvailableSystems (
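The new `release-jobs` flake output is what the release script evaluates with `nix-eval-jobs` (see `releng/create_release.xsh` below). As a quick sanity check, its attribute names can be listed with `nix eval`; this is a hypothetical helper, not part of the commit:

```python
# Hypothetical check: list the attributes exposed by the `.#release-jobs`
# flake output (expected, per releng/release-jobs.nix: 'build' and 'tarballs').
import json
import subprocess

attrs = json.loads(subprocess.check_output(
    ['nix', 'eval', '--json', '.#release-jobs', '--apply', 'builtins.attrNames']))
print(attrs)
```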
build-release-notes script: 1 line changed

@@ -143,7 +143,7 @@ def run_on_dir(author_info: AuthorInfoDB, d):

     for category in CATEGORIES:
         if entries[category]:
-            print('\n#', category)
+            print('\n##', category)
             do_category(author_info, entries[category])

 def main():
releng/README.md (new file, 132 lines):

# Release engineering

This directory contains the release engineering scripts for Lix.

## Release process

### Prerequisites

* FIXME: validation via misc tests in nixpkgs, among other things? What other
  validation do we need before we can actually release?
* Have a release post ready to go as a PR on the website repo.
* No [release-blocker bugs][release-blockers].

[release-blockers]: https://git.lix.systems/lix-project/lix/issues?q=&type=all&sort=&labels=145&state=open&milestone=0&project=0&assignee=0&poster=0

### Process

The following process can be done either against the staging environment or the
live environment.

For staging, the buckets are `staging-releases`, `staging-cache`, etc.

FIXME: obtainment of signing key for signing cache paths?

First, we prepare the release. `python -m releng prepare` is used for this.

* Gather everything in `doc/manual/rl-next` and put it in
  `doc/manual/src/release-notes/rl-MAJOR.md`.

Then we tag the release with `python -m releng tag`:

* Git HEAD is detached.
* `officialRelease = true` is set in `flake.nix`, this is committed, and a
  release is tagged.
* The tag is merged back into the last branch (either `main` for new releases
  or `release-MAJOR` for maintenance releases) with `git merge -s ours VERSION`,
  creating a history link but ignoring the tree of the release tag.
* Git HEAD is once again detached onto the release tag.

Then, we build the release artifacts with `python -m releng build`:

* Source tarball is generated with `git archive`, then checksummed.
* Manifest for `nix upgrade-nix` is produced and put in `s3://releases` at
  `/manifest.nix` and `/lix/lix-VERSION`.
* Release is built: `hydraJobs.binaryTarball` jobs are built, and joined into a
  derivation that depends on all of them and adds checksum files. This and the
  sources go into `s3://releases/lix/lix-VERSION`.

At this point we have a `release/artifacts` and `release/manual` directory
which are ready to publish, and tags ready for publication. No keys are
required to do this part.

Next, we do the publication with `python -m releng upload`:

* Artifacts for this release are uploaded:
  * s3://releases/manifest.nix, changing the default version of Lix for
    `nix upgrade-nix`.
  * s3://releases/lix/lix-VERSION/ gets the following contents:
    * Binary tarballs
    * Docs: `manual/` (FIXME: should we actually do this? what about putting it
      on docs.lix.systems? I think doing both is correct, since the Web site
      should not be an archive of random old manuals)
    * Docs as a tarball, in addition to the web copy.
    * Source tarball
    * Docker image (FIXME: upload to forgejo registry and github registry [in the future][upload-docker])
  * s3://docs/manual/lix/MAJOR
  * s3://docs/manual/lix/stable
* The tag is uploaded to the remote repo.
* **Manually** build the installer using the scripts in the installer repo and upload it.

  FIXME: This currently requires a local Apple Macintosh® aarch64 computer, but
  we could possibly automate doing it from the aarch64-darwin builder.
* **Manually** push the main/release branch directly to gerrit.
* If this is a new major release, branch off to `release-MAJOR` and push *that* branch
  directly to gerrit as well (FIXME: special creds for doing this as a service
  account so we don't need to have the gerrit perms to shoot ourselves in the
  foot by default, because pushing to main is bad?).

  FIXME: automate branch-off to `release-*` branch.
* **Manually** (FIXME?) switch back to the release branch, which now has the
  correct revision.
* Post!!
  * Merge release blog post to [lix-website].
  * Toot about it! https://chaos.social/@lix_project
  * Tweet about it! https://twitter.com/lixproject

[lix-website]: https://git.lix.systems/lix-project/lix-website

[upload-docker]: https://git.lix.systems/lix-project/lix/issues/252

### Installer

The installer is cross-built to several systems from a Mac using
`build-all.xsh` and `upload-to-lix.xsh` in the installer repo (FIXME: currently
at least; maybe this should be moved here?).

It installs a binary tarball (FIXME: [it should be taught to substitute from
cache instead][installer-substitute]) from some URL; this is the
`hydraJobs.binaryTarball`. The default URLs differ by architecture and are
[configured here][tarball-urls].

[installer-substitute]: https://git.lix.systems/lix-project/lix-installer/issues/13
[tarball-urls]: https://git.lix.systems/lix-project/lix-installer/src/commit/693592ed10d421a885bec0a9dd45e87ab87eb90a/src/settings.rs#L14-L28

## Infrastructure summary

* releases.lix.systems (`s3://releases`):
  * Each release gets a directory: https://releases.lix.systems/lix/lix-2.90-beta.1
    * Binary tarballs: `nix-2.90.0-beta.1-x86_64-linux.tar.xz`, from `hydraJobs.binaryTarball`
    * Manifest: `manifest.nix`, an attrset of the store paths by architecture.
  * Manifest for `nix upgrade-nix` to the latest release at `/manifest.nix`.
* cache.lix.systems (`s3://cache`):
  * Receives all artifacts for released versions of Lix; is a plain HTTP binary cache.
* install.lix.systems (`s3://install`):

  ```
  ~ » aws s3 ls s3://install/lix/
                             PRE lix-2.90-beta.0/
                             PRE lix-2.90-beta.1/
                             PRE lix-2.90.0pre20240411/
                             PRE lix-2.90.0pre20240412/
  2024-05-05 18:59:11    6707344 lix-installer-aarch64-darwin
  2024-05-05 18:59:16    7479768 lix-installer-aarch64-linux
  2024-05-05 18:59:14    7982208 lix-installer-x86_64-darwin
  2024-05-05 18:59:17    8978352 lix-installer-x86_64-linux

  ~ » aws s3 ls s3://install/lix/lix-2.90-beta.1/
  2024-05-05 18:59:01    6707344 lix-installer-aarch64-darwin
  2024-05-05 18:59:06    7479768 lix-installer-aarch64-linux
  2024-05-05 18:59:03    7982208 lix-installer-x86_64-darwin
  2024-05-05 18:59:07    8978352 lix-installer-x86_64-linux
  ```
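The checksum files mentioned above are written next to each artifact as `<name>.sha256`, containing a bare SHA-256 hex digest (see `create_release.xsh` and `release-jobs.nix` below). A minimal verification sketch assuming only that layout; it is an illustration, not part of this commit:

```python
# Check a downloaded artifact against its companion `.sha256` file (a bare hex
# digest, possibly followed by a newline).
import hashlib
import sys
from pathlib import Path


def verify(artifact: Path) -> bool:
    expected = Path(f'{artifact}.sha256').read_text().split()[0]
    hasher = hashlib.sha256()
    with open(artifact, 'rb') as fh:
        while chunk := fh.read(1024 * 1024):
            hasher.update(chunk)
    return hasher.hexdigest() == expected


if __name__ == '__main__':
    sys.exit(0 if verify(Path(sys.argv[1])) else 1)
```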
releng/__init__.py (new file, 17 lines):

from xonsh.main import setup
# Set up xonsh first so that the .xsh modules imported below can be loaded.
setup()
del setup

from releng import environment
from releng import create_release
from releng import keys
from releng import version
from releng import cli


def reload():
    import importlib
    importlib.reload(environment)
    importlib.reload(create_release)
    importlib.reload(keys)
    importlib.reload(version)
    importlib.reload(cli)
releng/__main__.py (new file, 3 lines):

from . import cli

cli.main()
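`__main__.py` is what makes `python -m releng` work; the subcommands and flags it dispatches to are defined in `releng/cli.py` just below. A few illustrative invocations, with flag names taken from that file:

```python
# Typical module-entry-point invocations, per the argparse setup in cli.py:
#
#   python -m releng prepare               # fold doc/manual/rl-next into rl-MAJOR.md
#   python -m releng tag --no-check-git    # tag without the clean-tree check (testing)
#   python -m releng release               # build the release/artifacts directory
#   python -m releng upload --noconfirm    # upload without the interactive prompt
```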
releng/cli.py (new file, 73 lines):

from . import create_release
import argparse
import sys


def do_build(args):
    create_release.build_artifacts(no_check_git=args.no_check_git)


def do_tag(args):
    create_release.do_tag_merge(force_tag=args.force_tag,
                                no_check_git=args.no_check_git)


def do_upload(args):
    create_release.setup_creds()
    create_release.upload_artifacts(force_push_tag=args.force_push_tag,
                                    noconfirm=args.noconfirm)


def do_prepare(args):
    create_release.prepare_release_notes()


def main():
    ap = argparse.ArgumentParser(description='*Lix ur release engineering*')

    def fail(args):
        ap.print_usage()
        sys.exit(1)

    ap.set_defaults(cmd=fail)

    sps = ap.add_subparsers()

    prepare = sps.add_parser('prepare', help='Prepares for a release by moving the release notes over.')
    prepare.set_defaults(cmd=do_prepare)

    tag = sps.add_parser(
        'tag',
        help='Create the tag for the current release in .version and merge it back to the current branch, then switch to it'
    )
    tag.add_argument('--no-check-git',
                     action='store_true',
                     help="Don't check git state before tagging. For testing.")
    tag.add_argument('--force-tag',
                     action='store_true',
                     help='Overwrite the existing tag. For testing.')
    tag.set_defaults(cmd=do_tag)

    build = sps.add_parser(
        'release',
        help='Build an artifacts/ directory with the things that would be released')
    build.add_argument(
        '--no-check-git',
        action='store_true',
        help="Don't check git state before building. For testing.")
    build.set_defaults(cmd=do_build)

    upload = sps.add_parser(
        'upload', help='Upload artifacts to cache and releases bucket')
    upload.add_argument('--force-push-tag',
                        action='store_true',
                        help='Force push the tag. For testing.')
    upload.add_argument(
        '--noconfirm',
        action='store_true',
        help="Don't ask for confirmation. For testing/automation.")
    upload.set_defaults(cmd=do_upload)

    args = ap.parse_args()
    args.cmd(args)
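For orientation, the subcommands above run the functions of `releng/create_release.xsh` (below) in a straight line. A minimal sketch of the same call sequence, assuming a clean tree at the repository root, xonsh installed, and, for the upload step, SSH access to the S3 host to mint an ephemeral key:

```python
# The same call sequence the subparsers above wire up, as one linear script.
from releng import create_release

create_release.prepare_release_notes()                                   # prepare
create_release.do_tag_merge(force_tag=False, no_check_git=False)         # tag
create_release.build_artifacts(no_check_git=False)                       # release
create_release.setup_creds()                                             # ephemeral S3 credentials
create_release.upload_artifacts(force_push_tag=False, noconfirm=False)   # upload
```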
releng/create_release.xsh (new file, 297 lines):

import json
import subprocess
import itertools
import textwrap
from pathlib import Path
import tempfile
import hashlib
import datetime
from . import environment
from . import keys
from .version import VERSION, RELEASE_NAME, MAJOR

$RAISE_SUBPROC_ERROR = True
$XONSH_SHOW_TRACEBACK = True

RELENG_ENV = environment.STAGING

RELEASES_BUCKET = RELENG_ENV.releases_bucket
CACHE_STORE = RELENG_ENV.cache_store_uri()
REPO = RELENG_ENV.git_repo

GCROOTS_DIR = Path('./release/gcroots')
BUILT_GCROOTS_DIR = Path('./release/gcroots-build')
DRVS_TXT = Path('./release/drvs.txt')
ARTIFACTS = Path('./release/artifacts')

RELENG_MSG = "Release created with releng/create_release.xsh"

BUILD_CORES = 16
MAX_JOBS = 2

# TODO
RELEASE_SYSTEMS = ["x86_64-linux"]


def setup_creds():
    key = keys.get_ephemeral_key(RELENG_ENV)
    $AWS_SECRET_ACCESS_KEY = key.secret_key
    $AWS_ACCESS_KEY_ID = key.id
    $AWS_DEFAULT_REGION = 'garage'
    $AWS_ENDPOINT_URL = environment.S3_ENDPOINT


def git_preconditions():
    # verify there is nothing in index ready to stage
    proc = !(git diff-index --quiet --cached HEAD --)
    assert proc.rtn == 0
    # verify there is nothing *stageable* and tracked
    proc = !(git diff-files --quiet)
    assert proc.rtn == 0


def official_release_commit_tag(force_tag=False):
    print('[+] Setting officialRelease in flake.nix and tagging')
    prev_branch = $(git symbolic-ref --short HEAD).strip()

    git switch --detach
    sed -i 's/officialRelease = false/officialRelease = true/' flake.nix
    git add flake.nix
    message = f'release: {VERSION} "{RELEASE_NAME}"\n\nRelease produced with releng/create_release.xsh'
    git commit -m @(message)
    git tag @(['-f'] if force_tag else []) -a -m @(message) @(VERSION)

    return prev_branch


def merge_to_release(prev_branch):
    git switch @(prev_branch)
    # Create a merge back into the release branch so that git tools understand
    # that the release branch contains the tag, without the release commit
    # actually influencing the tree.
    merge_msg = textwrap.dedent("""\
        release: merge release {VERSION} back to mainline

        This merge commit returns to the previous state prior to the release but leaves the tag in the branch history.

        {RELENG_MSG}
    """).format(VERSION=VERSION, RELENG_MSG=RELENG_MSG)
    git merge -m @(merge_msg) -s ours @(VERSION)


def realise(paths: list[str]):
    args = [
        '--realise',
        '--max-jobs',
        MAX_JOBS,
        '--cores',
        BUILD_CORES,
        '--log-format',
        'bar-with-logs',
        '--add-root',
        BUILT_GCROOTS_DIR
    ]
    nix-store @(args) @(paths)


def eval_jobs():
    nej_output = $(nix-eval-jobs --workers 4 --gc-roots-dir @(GCROOTS_DIR) --force-recurse --flake '.#release-jobs')
    return [x for x in (json.loads(s) for s in nej_output.strip().split('\n'))
            if x['system'] in RELEASE_SYSTEMS
            ]


def upload_drv_paths_and_outputs(paths: list[str]):
    proc = subprocess.Popen(
        [
            'nix',
            'copy',
            '-v',
            '--to',
            CACHE_STORE,
            '--stdin',
        ],
        stdin=subprocess.PIPE,
        env=__xonsh__.env.detype(),
    )

    # Send both the .drv paths themselves and all of their outputs (^*).
    proc.stdin.write('\n'.join(itertools.chain(paths, (x + '^*' for x in paths))).encode())
    proc.stdin.close()
    rv = proc.wait()
    if rv != 0:
        raise subprocess.CalledProcessError(rv, proc.args)


def make_manifest(eval_result):
    manifest = {vs['system']: vs['outputs']['out'] for vs in eval_result}

    def manifest_line(system, out):
        return f' {system} = "{out}";'

    manifest_text = textwrap.dedent("""\
        # This file was generated by releng/create_release.xsh in Lix
        {{
        {lines}
        }}
    """).format(lines='\n'.join(manifest_line(s, p) for (s, p) in manifest.items()))

    return manifest_text


def make_git_tarball(to: Path):
    git archive --verbose --prefix=lix-@(VERSION)/ --format=tar.gz -o @(to) @(VERSION)


def confirm(prompt, expected):
    resp = input(prompt)

    if resp != expected:
        raise ValueError('Unconfirmed')


def sha256_file(f: Path):
    hasher = hashlib.sha256()

    with open(f, 'rb') as h:
        while data := h.read(1024 * 1024):
            hasher.update(data)

    return hasher.hexdigest()


def make_artifacts_dir(eval_result, d: Path):
    d.mkdir(exist_ok=True, parents=True)
    version_dir = d / 'lix' / f'lix-{VERSION}'
    version_dir.mkdir(exist_ok=True, parents=True)

    tarballs_drv = next(p for p in eval_result if p['attr'] == 'tarballs')
    cp --no-preserve=mode -r @(tarballs_drv['outputs']['out'])/* @(version_dir)

    # FIXME: upgrade-nix searches for manifest.nix at root, which is rather annoying
    with open(d / 'manifest.nix', 'w') as h:
        h.write(make_manifest(eval_result))

    with open(version_dir / 'manifest.nix', 'w') as h:
        h.write(make_manifest(eval_result))

    print('[+] Make sources tarball')

    filename = f'lix-{VERSION}.tar.gz'
    git_tarball = version_dir / filename
    make_git_tarball(git_tarball)

    file_hash = sha256_file(git_tarball)

    print(f'Hash: {file_hash}')
    with open(version_dir / f'{filename}.sha256', 'w') as h:
        h.write(file_hash)


def prepare_release_notes():
    print('[+] Preparing release notes')
    RELEASE_NOTES_PATH = Path('doc/manual/rl-next')

    if RELEASE_NOTES_PATH.is_dir():
        notes_body = subprocess.check_output(['build-release-notes', '--change-authors', 'doc/manual/change-authors.yml', 'doc/manual/rl-next']).decode()
    else:
        # I guess nobody put release notes on their changes?
        print('[-] Warning: seemingly missing any release notes, not worrying about it')
        notes_body = ''

    rl_path = Path(f'doc/manual/src/release-notes/rl-{MAJOR}.md')

    existing_rl = ''
    try:
        with open(rl_path, 'r') as fh:
            existing_rl = fh.read()
    except FileNotFoundError:
        pass

    date = datetime.datetime.now().strftime('%Y-%m-%d')

    minor_header = f'# Lix {VERSION} ({date})'

    header = f'# Lix {MAJOR} "{RELEASE_NAME}"'
    if existing_rl.startswith(header):
        # strip the header off for minor releases
        lines = existing_rl.splitlines()
        header = lines[0]
        existing_rl = '\n'.join(lines[1:])
    else:
        header += f' ({date})\n\n'

    header += '\n' + minor_header + '\n'

    notes = header
    notes += notes_body
    notes += "\n\n"
    notes += existing_rl

    # make pre-commit happy about one newline
    notes = notes.rstrip()
    notes += "\n"

    with open(rl_path, 'w') as fh:
        fh.write(notes)

    commit_msg = textwrap.dedent("""\
        release: release notes for {VERSION}

        {RELENG_MSG}
    """).format(VERSION=VERSION, RELENG_MSG=RELENG_MSG)

    git add @(rl_path)
    git rm doc/manual/rl-next/*.md

    git commit -m @(commit_msg)


def verify_are_on_tag():
    current_tag = $(git describe --tag).strip()
    assert current_tag == VERSION


def upload_artifacts(noconfirm=False, force_push_tag=False):
    assert 'AWS_SECRET_ACCESS_KEY' in __xonsh__.env

    tree @(ARTIFACTS)

    not noconfirm and confirm(
        f'Would you like to release {ARTIFACTS} as {VERSION}? Type "I want to release this" to confirm\n',
        'I want to release this'
    )

    print('[+] Upload to cache')
    with open(DRVS_TXT) as fh:
        upload_drv_paths_and_outputs([x.strip() for x in fh.readlines() if x])

    print('[+] Upload to release bucket')
    aws s3 cp --recursive @(ARTIFACTS)/ @(RELEASES_BUCKET)/

    print('[+] git push tag')
    git push @(['-f'] if force_push_tag else []) @(REPO) f'{VERSION}:refs/tags/{VERSION}'


def do_tag_merge(force_tag=False, no_check_git=False):
    if not no_check_git:
        git_preconditions()
    prev_branch = official_release_commit_tag(force_tag=force_tag)
    merge_to_release(prev_branch)
    git switch --detach @(VERSION)


def build_artifacts(no_check_git=False):
    if not no_check_git:
        verify_are_on_tag()
        git_preconditions()

    print('[+] Evaluating')
    eval_result = eval_jobs()
    drv_paths = [x['drvPath'] for x in eval_result]

    print('[+] Building')
    realise(drv_paths)

    with open(DRVS_TXT, 'w') as fh:
        fh.write('\n'.join(drv_paths))

    make_artifacts_dir(eval_result, ARTIFACTS)
    print(f'[+] Done! See {ARTIFACTS}')
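To make the data flow above easier to follow: `eval_jobs()` yields one record per `nix-eval-jobs` output line, and the rest of the script reads only a few fields from each. An illustrative record; the store paths are placeholders:

```python
# Only the fields consumed by eval_jobs(), make_manifest() and
# make_artifacts_dir() are shown here; the store paths are made up.
example_job = {
    "attr": "build.x86_64-linux",
    "system": "x86_64-linux",
    "drvPath": "/nix/store/...-lix.drv",
    "outputs": {"out": "/nix/store/...-lix"},
}

# make_manifest([example_job]) then renders roughly:
#
#   # This file was generated by releng/create_release.xsh in Lix
#   {
#    x86_64-linux = "/nix/store/...-lix";
#   }
```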
releng/environment.py (new file, 50 lines):

import dataclasses
import urllib.parse

S3_HOST = 's3.lix.systems'
S3_ENDPOINT = 'https://s3.lix.systems'

DEFAULT_STORE_URI_BITS = {
    'region': 'garage',
    'endpoint': 's3.lix.systems',
    'want-mass-query': 'true',
    'write-nar-listing': 'true',
    'ls-compression': 'zstd',
    'narinfo-compression': 'zstd',
    'compression': 'zstd',
    'parallel-compression': 'true',
}


@dataclasses.dataclass
class RelengEnvironment:
    name: str

    aws_profile: str
    cache_store_overlay: dict[str, str]
    cache_bucket: str
    releases_bucket: str
    git_repo: str

    def cache_store_uri(self):
        qs = DEFAULT_STORE_URI_BITS.copy()
        qs.update(self.cache_store_overlay)
        return self.cache_bucket + "?" + urllib.parse.urlencode(qs)


STAGING = RelengEnvironment(
    name='staging',
    aws_profile='garage_staging',
    cache_bucket='s3://staging-cache',
    cache_store_overlay={
        'secret-key': 'staging.key'
    },
    releases_bucket='s3://staging-releases',
    git_repo='ssh://git@git.lix.systems/lix-project/lix-releng-staging',
)


@dataclasses.dataclass
class S3Credentials:
    name: str
    id: str
    secret_key: str
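A usage note on `cache_store_uri()`: it appends the default store parameters plus the per-environment overlay to the bucket URI, in insertion order. Assuming the `releng` package is importable (importing it pulls in xonsh) and the working directory is the repository root:

```python
from releng.environment import STAGING

print(STAGING.cache_store_uri())
# s3://staging-cache?region=garage&endpoint=s3.lix.systems&want-mass-query=true
# &write-nar-listing=true&ls-compression=zstd&narinfo-compression=zstd
# &compression=zstd&parallel-compression=true&secret-key=staging.key
#
# (a single line in reality; this is the store URI that `nix copy --to`
# receives in create_release.xsh)
```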
releng/keys.py (new file, 18 lines):

import subprocess
import json
from . import environment


def get_ephemeral_key(
        env: environment.RelengEnvironment) -> environment.S3Credentials:
    output = subprocess.check_output([
        'ssh', '-l', 'root', environment.S3_HOST, 'garage-ephemeral-key',
        'new', '--name', f'releng-{env.name}', '--read', '--write',
        '--age-secs', '3600',
        env.releases_bucket.removeprefix('s3://'),
        env.cache_bucket.removeprefix('s3://')
    ])
    d = json.loads(output.decode())
    return environment.S3Credentials(name=d['name'],
                                     id=d['id'],
                                     secret_key=d['secret_key'])
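`get_ephemeral_key` parses whatever JSON the `garage-ephemeral-key` helper prints on the Garage host; only three fields are consumed, so the exact response shape beyond those is an assumption here. An illustrative response with placeholder values:

```python
# Placeholder response: only `name`, `id` and `secret_key` are read above; any
# other fields would simply be ignored.
example_response = {
    "name": "releng-staging",   # matches f'releng-{env.name}' for the staging environment
    "id": "GK...",
    "secret_key": "...",
}
```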
releng/release-jobs.nix (new file, 57 lines):

{ hydraJobs, pkgs }:
let
  inherit (pkgs) lib;
  lix = hydraJobs.build.x86_64-linux;

  systems = [ "x86_64-linux" ];
  dockerSystems = [ "x86_64-linux" ];

  doTarball =
    {
      target,
      targetName,
      rename ? null,
    }:
    ''
      echo "doing: ${target}"
      # expand wildcard
      filename=$(echo ${target}/${targetName})
      basename="$(basename $filename)"

      echo $filename $basename
      cp -v "$filename" "$out"
      ${lib.optionalString (rename != null) ''
        mv "$out/$basename" "$out/${rename}"
        basename="${rename}"
      ''}
      sha256sum --binary $filename | cut -f1 -d' ' > $out/$basename.sha256
    '';

  targets =
    builtins.map (system: {
      target = hydraJobs.binaryTarball.${system};
      targetName = "*.tar.xz";
    }) systems
    ++ builtins.map (system: {
      target = hydraJobs.dockerImage.${system};
      targetName = "image.tar.gz";
      rename = "lix-${lix.version}-docker-image-${system}.tar.gz";
    }) dockerSystems;

  manualTar = pkgs.runCommand "lix-manual-tarball" { } ''
    mkdir -p $out
    cp -r ${lix.doc}/share/doc/nix/manual lix-${lix.version}-manual
    tar -cvzf "$out/lix-${lix.version}-manual.tar.gz" lix-${lix.version}-manual
  '';

  tarballs = pkgs.runCommand "lix-release-tarballs" { } ''
    mkdir -p $out
    ${lib.concatMapStringsSep "\n" doTarball targets}
    cp ${manualTar}/*.tar.gz $out
    cp -r ${lix.doc}/share/doc/nix/manual $out
  '';
in
{
  inherit (hydraJobs) build;
  inherit tarballs;
}
releng/version.py (new file, 6 lines):

import json

version_json = json.load(open('version.json'))
VERSION = version_json['version']
MAJOR = '.'.join(VERSION.split('.')[:2])
RELEASE_NAME = version_json['release_name']
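`version.py` reads `version.json` from the current directory, which is one reason the release scripts are run from the repository root. A sketch of the shape it expects; the values are placeholders, not the real ones:

```python
# version.json is expected to provide at least these two keys:
example_version_json = {
    "version": "2.90.0",         # -> VERSION; MAJOR becomes "2.90"
    "release_name": "SomeName",  # -> RELEASE_NAME, used in the release commit and tag message
}
```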