lix/releng
jade e715e5fd31 releng: fix logging inside interactive xonsh
I don't know exactly when this broke; it seems to have happened with the
24.05 upgrade, so xonsh 0.15.

What happened is that xonsh was trying to intercept log output, which
explodes if you have the logger survive past one command input. This is,
however, impossible to avoid if you are trying to use logging when you
import releng from inside xonsh for interactive use!

The error below is because the memory handler backing the stdout/stderr
of the one command that's just been run was closed after the command
completed.

Change-Id: I2be642aebf93da9818d08ff8b97c2e72ba5ac581

--- Logging error ---
Traceback (most recent call last):
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/logging/__init__.py", line 1113, in emit
    stream.write(msg + self.terminator)
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/xonsh/base_shell.py", line 183, in write
    self.mem.write(s)
ValueError: I/O operation on closed file.
Call stack:
  File "/nix/store/xgdp1p1gv8ni1awnkzyqasnn6gz5wlvx-xonsh-0.15.1/bin/xonsh", line 8, in <module>
    sys.exit(main())
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/xonsh/main.py", line 470, in main
    sys.exit(main_xonsh(args))
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/xonsh/main.py", line 514, in main_xonsh
    shell.shell.cmdloop()
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/xonsh/ptk_shell/shell.py", line 406, in cmd
loop
    line = self.singleline(auto_suggest=auto_suggest)
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/xonsh/ptk_shell/shell.py", line 374, in sin
gleline
    line = self.prompter.prompt(**prompt_args)
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/prompt_toolkit/shortcuts/prompt.py", line 1
026, in prompt
    return self.app.run(
  File "/nix/store/34951j60xcsw6zj4v8lsaf491acv0by3-python3-3.11.9-env/lib/python3.11/site-packages/prompt_toolkit/application/application.py",
 line 1002, in run
    return asyncio.run(coro)
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/runners.py", line 189, in run
    with Runner(debug=debug) as runner:
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/runners.py", line 59, in __enter__
    self._lazy_init()
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/runners.py", line 137, in _lazy_init
    self._loop = events.new_event_loop()
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/events.py", line 810, in new_event_loop
    return get_event_loop_policy().new_event_loop()
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/events.py", line 699, in new_event_loop
    return self._loop_factory()
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/unix_events.py", line 64, in __init__
    super().__init__(selector)
  File "/nix/store/7hnr99nxrd2aw6lghybqdmkckq60j6l9-python3-3.11.9/lib/python3.11/asyncio/selector_events.py", line 54, in __init__
    logger.debug('Using selector: %s', selector.__class__.__name__)
Message: 'Using selector: %s'
Arguments: ('EpollSelector',)
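
One way to avoid this class of failure is to re-resolve sys.stderr every time
a record is emitted, instead of caching whatever stream xonsh had installed
for the previous command. A minimal sketch of such a handler (the name is
hypothetical, and this is not necessarily the exact fix in this change):

    import logging
    import sys

    class CurrentStderrHandler(logging.StreamHandler):
        """Writes to whatever sys.stderr is right now, so it never keeps
        using a per-command stream that xonsh has already closed."""

        def emit(self, record):
            # Refresh the stream for every record instead of reusing the one
            # captured when the handler was constructed.
            self.stream = sys.stderr
            super().emit(record)

    logging.basicConfig(level=logging.INFO, handlers=[CurrentStderrHandler()])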

Change-Id: I90959809129aaf96aad4577599031688599ed85e
2024-06-13 15:17:44 -07:00
__init__.py releng: fix logging inside interactive xonsh 2024-06-13 15:17:44 -07:00
__main__.py Put into place initial release engineering 2024-06-06 20:53:08 -07:00
cli.py releng: support multiple systems 2024-06-13 14:36:03 -07:00
create_release.xsh releng: support multiple systems 2024-06-13 14:36:03 -07:00
docker.xsh releng: add prod environment, ready for release 2024-06-09 20:33:24 -07:00
docker_assemble.py releng: support multiple systems 2024-06-13 14:36:03 -07:00
environment.py releng: add prod environment, ready for release 2024-06-09 20:33:24 -07:00
gitutils.xsh releng: automatically figure out if we should tag latest for docker 2024-06-09 20:33:24 -07:00
keys.py releng: support pushing the manual to docs also 2024-06-06 20:53:08 -07:00
README.md Put into place initial release engineering 2024-06-06 20:53:08 -07:00
release-jobs.nix releng: support multiple systems 2024-06-13 14:36:03 -07:00
version.py Put into place initial release engineering 2024-06-06 20:53:08 -07:00

Release engineering

This directory contains the release engineering scripts for Lix.

Release process

Prerequisites

  • FIXME: validation via misc tests in nixpkgs, among other things? What other validation do we need before we can actually release?
  • Have a release post ready to go as a PR on the website repo.
  • No release-blocker bugs.

Process

The following process can be done either against the staging environment or the live environment.

For staging, the buckets are staging-releases, staging-cache, etc.
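
The staging/live split might be represented roughly like the following sketch
(the real definitions live in environment.py; the docs bucket names here are
assumptions beyond the staging-releases and staging-cache examples above):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Environment:
        releases_bucket: str
        cache_bucket: str
        docs_bucket: str

    LIVE = Environment("releases", "cache", "docs")
    STAGING = Environment("staging-releases", "staging-cache", "staging-docs")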

FIXME: obtainment of signing key for signing cache paths?

First, we prepare the release. python -m releng prepare is used for this.

  • Gather everything in doc/manual/rl-next and put it in doc/manual/src/release-notes/rl-MAJOR.md.
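
That gathering step might look roughly like this (a sketch that assumes plain
concatenation; the real command may order or post-process the fragments
differently):

    from pathlib import Path

    def gather_release_notes(major: str, repo_root: Path = Path(".")) -> Path:
        src_dir = repo_root / "doc/manual/rl-next"
        out = repo_root / "doc/manual/src/release-notes" / f"rl-{major}.md"
        # Concatenate every pending note fragment into the versioned notes file.
        fragments = sorted(src_dir.glob("*.md"))
        body = "\n\n".join(f.read_text().strip() for f in fragments)
        out.write_text(f"# Lix {major} release notes\n\n{body}\n")
        return out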

Then we tag the release with python -m releng tag:

  • Git HEAD is detached.
  • officialRelease = true is set in flake.nix; this is committed, and the release tag is created.
  • The tag is merged back into the last branch (either main for new releases or release-MAJOR for maintenance releases) with git merge -s ours VERSION creating a history link but ignoring the tree of the release tag.
  • Git HEAD is once again detached onto the release tag.
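
The sequence above amounts to something like the following sketch, using plain
git via subprocess (the real releng tag is an xonsh script and handles more
details; the flake.nix edit shown is a naive text replacement):

    import subprocess
    from pathlib import Path

    def sh(*args: str) -> None:
        subprocess.run(args, check=True)

    def set_official_release(value: bool) -> None:
        # Naive text edit of flake.nix, only to make the sketch self-contained.
        flake = Path("flake.nix")
        old = f"officialRelease = {str(not value).lower()};"
        new = f"officialRelease = {str(value).lower()};"
        flake.write_text(flake.read_text().replace(old, new))

    def tag_release(version: str, back_branch: str) -> None:
        sh("git", "checkout", "--detach")             # detach HEAD
        set_official_release(True)
        sh("git", "commit", "-am", f"release: {version}")
        sh("git", "tag", "-a", version, "-m", f"Lix {version}")
        # "-s ours" records the history link without taking the release tag's
        # tree, so the release-only commit never lands on the branch itself.
        sh("git", "checkout", back_branch)
        sh("git", "merge", "-s", "ours", "-m", f"merge {version}", version)
        sh("git", "checkout", "--detach", version)    # back onto the release tag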

Then, we build the release artifacts with python -m releng build:

  • Source tarball is generated with git archive, then checksummed.
  • Manifest for nix upgrade-nix is produced and put in s3://releases at /manifest.nix and /lix/lix-VERSION.
  • Release is built: hydraJobs.binaryTarball jobs are built, and joined into a derivation that depends on all of them and adds checksum files. This and the sources go into s3://releases/lix/lix-VERSION.
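
The first of those steps, producing a checksummed source tarball with git
archive, might look roughly like this (the output file name and the .sha256
sidecar format are assumptions):

    import hashlib
    import subprocess
    from pathlib import Path

    def source_tarball(version: str, outdir: Path) -> Path:
        outdir.mkdir(parents=True, exist_ok=True)
        tarball = outdir / f"lix-{version}.tar.xz"
        # git archive emits a tar of the tagged tree; xz compresses the stream.
        with tarball.open("wb") as out:
            git = subprocess.Popen(
                ["git", "archive", f"--prefix=lix-{version}/", version],
                stdout=subprocess.PIPE,
            )
            subprocess.run(["xz", "-c"], stdin=git.stdout, stdout=out, check=True)
            git.stdout.close()
            if git.wait() != 0:
                raise RuntimeError("git archive failed")
        # Sidecar checksum file next to the tarball.
        digest = hashlib.sha256(tarball.read_bytes()).hexdigest()
        (outdir / f"{tarball.name}.sha256").write_text(f"{digest}  {tarball.name}\n")
        return tarball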

At this point we have release/artifacts and release/manual directories that are ready to publish, and tags ready for publication. No keys are required to do this part.

Next, we do the publication with python -m releng upload:

  • Artifacts for this release are uploaded:

    • s3://releases/manifest.nix, changing the default version of Lix for nix upgrade-nix.
    • s3://releases/lix/lix-VERSION/ gets the following contents
      • Binary tarballs
      • Docs: manual/ (FIXME: should we actually do this? what about putting it on docs.lix.systems? I think doing both is correct, since the Web site should not be an archive of random old manuals)
      • Docs as tarball in addition to web.
      • Source tarball
      • Docker image (FIXME: upload to forgejo registry and github registry in the future)
    • s3://docs/manual/lix/MAJOR
    • s3://docs/manual/lix/stable
  • The tag is uploaded to the remote repo.

  • Manually build the installer using the scripts in the installer repo and upload.

    FIXME: This currently requires a local Apple Macintosh® aarch64 computer, but we could possibly automate doing it from the aarch64-darwin builder.

  • Manually push the main/release branch directly to gerrit.

  • If this is a new major release, branch-off to release-MAJOR and push that branch directly to gerrit as well (FIXME: special creds for doing this as a service account so we don't need to have the gerrit perms to shoot ourselves in the foot by default because pushing to main is bad?).

    FIXME: automate branch-off to release-* branch.

  • Manually (FIXME?) switch back to the release branch, which now has the correct revision.

  • Post!!
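
The artifact upload earlier in this list might look roughly like the following
sketch with an S3 client (boto3 is an assumption, as is manifest.nix sitting
alongside the other artifacts; the actual scripts are authoritative):

    import boto3
    from pathlib import Path

    def upload_release(version: str, artifacts: Path, bucket: str = "releases") -> None:
        s3 = boto3.client("s3")
        # Everything under release/artifacts goes to s3://releases/lix/lix-VERSION/.
        for path in sorted(artifacts.rglob("*")):
            if path.is_file():
                key = f"lix/lix-{version}/{path.relative_to(artifacts)}"
                s3.upload_file(str(path), bucket, key)
        # The top-level manifest.nix switches the default for nix upgrade-nix.
        s3.upload_file(str(artifacts / "manifest.nix"), bucket, "manifest.nix")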

Installer

The installer is cross-built for several systems from a Mac using build-all.xsh and upload-to-lix.xsh in the installer repo (FIXME: currently at least; maybe this should be moved here?).

It installs a binary tarball (FIXME: it should be taught to substitute from cache instead) from some URL; this is the hydraJobs.binaryTarball. The default URLs differ by architecture and are configured here.

Infrastructure summary

  • releases.lix.systems (s3://releases):
    • Each release gets a directory: https://releases.lix.systems/lix/lix-2.90-beta.1
      • Binary tarballs: nix-2.90.0-beta.1-x86_64-linux.tar.xz, from hydraJobs.binaryTarball
      • Manifest: manifest.nix, an attrset of the store paths by architecture.
    • Manifest for nix upgrade-nix to the latest release at /manifest.nix.
  • cache.lix.systems (s3://cache):
    • Receives all artifacts for released versions of Lix; is a plain HTTP binary cache.
  • install.lix.systems (s3://install):
    ~ » aws s3 ls s3://install/lix/
                               PRE lix-2.90-beta.0/
                               PRE lix-2.90-beta.1/
                               PRE lix-2.90.0pre20240411/
                               PRE lix-2.90.0pre20240412/
    2024-05-05 18:59:11    6707344 lix-installer-aarch64-darwin
    2024-05-05 18:59:16    7479768 lix-installer-aarch64-linux
    2024-05-05 18:59:14    7982208 lix-installer-x86_64-darwin
    2024-05-05 18:59:17    8978352 lix-installer-x86_64-linux
    
    ~ » aws s3 ls s3://install/lix/lix-2.90-beta.1/
    2024-05-05 18:59:01    6707344 lix-installer-aarch64-darwin
    2024-05-05 18:59:06    7479768 lix-installer-aarch64-linux
    2024-05-05 18:59:03    7982208 lix-installer-x86_64-darwin
    2024-05-05 18:59:07    8978352 lix-installer-x86_64-linux