This commit adds several meson.build files, which successfully build and
install Lix executables, libraries, and headers. Meson does not yet build
docs or Perl bindings, or run tests; those will be added in subsequent
commits. As such, this commit does not remove the existing build system
or make it the default, and it contains several FIXMEs and TODOs noting
what must be done before the existing autoconf + make build system can be
removed and Meson made the default. This commit does not modify any
source files.
A Meson-enabled build is also added as a Hydra job, and to
`nix flake check`.
Change-Id: I667c8685b13b7bab91e281053f807a11616ae3d4
These now have equivalents in the standard library in C++20. This change
was performed with a custom clang-tidy check, which I will submit later.
It was executed like so:
ninja -C build && run-clang-tidy -checks='-*,nix-*' -load=build/libnix-clang-tidy.so -p .. -fix ../tests | tee -a clang-tidy-result
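For illustration, the kind of rewrite the check performs (assuming the
replaced helpers were prefix/suffix tests such as `hasPrefix`/`hasSuffix`;
the exact set of helpers may differ):

    #include <string_view>

    // Before (helper from util.hh):
    //   if (hasPrefix(path, "/nix/store/")) ...
    // After: the C++20 standard library members do the same thing.
    bool isInStore(std::string_view path)
    {
        return path.starts_with("/nix/store/");
    }

    bool isNarFile(std::string_view name)
    {
        return name.ends_with(".nar");
    }
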
Change-Id: I62679e315ff9e7ce72a40b91b79c3e9fc01b27e9
Don't attempt to `git add` ignored files
(cherry picked from commit 359990dfdc713c80aabd7ea6f7e4528628fbe108)
===
Also added a regression test that isn't upstream, to make sure we're
actually fixing the bug.
Change-Id: I8267a3d0ece9909d8008b7435b90e7b3eee366f6
Combine `AbstractPos`, `PosAdapter`, and `Pos`
(cherry picked from commit 113499d16fc87d53b73fb62fe6242154909756ed)
===
This is a bit cursed because it was originally based on InputAccessor
code that we don't have, and it moved/patched features we likewise don't
have (fetchToStore caching, all the individual accessors,
ContentAddressMethod). The commit is adjusted accordingly (remove the
caching, ignore the accessors, use FileIngestionMethod).
Note that `state.rootPath . CanonPath == abs` and that
computeStorePathForPath works relative to the cwd, so the slight rewrite
in the moved fetchToStore is legal.
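A minimal sketch of the equivalence being relied on, using stand-in types
(the real CanonPath and rootPath live in libutil/libexpr and do much more):

    #include <cassert>
    #include <string>
    #include <utility>

    // Stand-in for CanonPath: stores an already-canonical absolute path.
    struct CanonPath {
        std::string s;
        explicit CanonPath(std::string p) : s(std::move(p))
        {
            assert(!s.empty() && s[0] == '/');
        }
    };

    // Stand-in for state.rootPath: wraps the CanonPath without altering it.
    struct SourcePath { CanonPath path; };
    SourcePath rootPath(CanonPath p) { return { std::move(p) }; }

    int main()
    {
        std::string abs = "/some/absolute/path";
        // rootPath . CanonPath is the identity on the underlying absolute
        // string, so passing the result to a cwd-relative function such as
        // computeStorePathForPath behaves the same as passing `abs` itself.
        assert(rootPath(CanonPath(abs)).path.s == abs);
    }
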
Change-Id: I05fd340c273f0bcc8ffabfebdc4a88b98083bce5
Special-casing the file name is rather ugly, so we no longer do that. Any
{file,http,https} URL is now handled by TarballInputScheme, except for
non-flake inputs (i.e. inputs that have the attribute `flake = false`).
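Roughly the dispatch this amounts to (a sketch, not the actual
TarballInputScheme code; `isFlakeInput` stands in for however
`flake = false` is detected):

    #include <set>
    #include <string>

    // TarballInputScheme claims any {file,http,https} URL, unless the
    // input is marked `flake = false` (then the generic file fetcher
    // handles it, whatever the file name looks like).
    bool tarballSchemeHandles(
        const std::string & scheme, bool isFlakeInput)
    {
        static const std::set<std::string> schemes { "file", "http", "https" };
        return schemes.count(scheme) && isFlakeInput;
    }
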
Whereas `ContentAddressWithReferences` is a sum type, complex because its
different varieties support different notions of reference, and
`ContentAddressMethod` is a nested enum to support that, `ContentAddress`
can be a simple pair of a method and a hash. `ContentAddress` does not
need to be a sum type on the outside, because the choice of method doesn't
affect what type of hashes we can use.
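A sketch of the resulting shape (simplified stand-ins; the real types also
carry parsing/rendering and richer reference information):

    #include <set>
    #include <string>
    #include <variant>

    struct Hash { std::string hash; };
    using StorePathSet = std::set<std::string>;

    enum struct FileIngestionMethod { Flat, Recursive };
    struct TextIngestionMethod {};
    // The "nested enum": a method is either text ingestion or one of the
    // file ingestion methods.
    using ContentAddressMethod =
        std::variant<TextIngestionMethod, FileIngestionMethod>;

    // ContentAddress needs no outer sum type: it is just method + hash,
    // since the method doesn't change which hashes are possible.
    struct ContentAddress {
        ContentAddressMethod method;
        Hash hash;
    };

    // ContentAddressWithReferences stays a sum type, because each variety
    // supports a different notion of references.
    struct TextInfo {
        Hash hash;
        StorePathSet references;
    };
    struct FixedOutputInfo {
        FileIngestionMethod method;
        Hash hash;
        StorePathSet references;
    };
    using ContentAddressWithReferences =
        std::variant<TextInfo, FixedOutputInfo>;
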
Co-Authored-By: Cale Gibbard <cgibbard@gmail.com>
Previously, for tarball flakes, we recorded the original URL of the
tarball flake, rather than the URL to which it ultimately
redirects. Thus, a flake URL like
http://example.org/patchelf-latest.tar that redirects to
http://example.org/patchelf-<revision>.tar was not really usable. We
couldn't record the redirected URL, because sites like GitHub redirect
to CDN URLs that we can't rely on to be stable.
So now we use the redirected URL only if the server returns the
`x-nix-is-immutable` or `x-amz-meta-nix-is-immutable` headers in its
response.
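A sketch of that check (hypothetical names; the real logic lives in the
tarball fetcher and reads whatever the transfer result exposes as response
headers):

    #include <map>
    #include <optional>
    #include <string>

    // Hypothetical shape: case-normalised response headers.
    using Headers = std::map<std::string, std::string>;

    // Record the redirected ("effective") URL only if the server promises
    // that the resource is immutable; otherwise keep the original URL.
    std::optional<std::string> immutableUrl(
        const Headers & headers, const std::string & effectiveUrl)
    {
        if (headers.count("x-nix-is-immutable")
            || headers.count("x-amz-meta-nix-is-immutable"))
            return effectiveUrl;
        return std::nullopt;
    }
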
* Finish converting existing comments for internal API docs
99% of this was just reformatting existing comments. Only two exceptions:
- Expanded upon `BuildResult::status` compat note
- Split up file-level `symbol-table.hh` doc comments to get
per-definition docs
Also fixed a few whitespace goofs, turning leading tabs to spaces and
removing trailing spaces.
Picking up from #8133
* Fix two things from comments
* Use triple-backticks rather than indentation for `dumpPath`
* Convert GNU-style `\`..'` quotes to markdown style in API docs
This will render correctly.
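An illustrative example of that quote conversion (hypothetical doc
comment, not a real hunk from this change):

    // Before (GNU style, which renders badly in the generated docs):
    //   Return true if `path' refers to `something'.
    // After (markdown style, which renders correctly):

    /**
     * Return true if `path` refers to `something`.
     */
    bool example(const char * path);  // hypothetical declaration
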
This introduces the SourcePath type from lazy-trees as an abstraction
for accessing files from inputs that may not be materialized in the
real filesystem (e.g. Git repositories). Currently, however, it's just
a wrapper around CanonPath, so it shouldn't change any behaviour. (On
lazy-trees, SourcePath is a <InputAccessor, CanonPath> tuple.)
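A sketch of the current shape (stand-in types; the real SourcePath and
CanonPath have richer APIs):

    #include <string>

    // Stand-in for CanonPath: a canonicalised absolute path.
    struct CanonPath { std::string abs; };

    // SourcePath: where a file in some input lives. For now it is only a
    // wrapper around CanonPath, so behaviour is unchanged; on lazy-trees
    // it becomes an <InputAccessor, CanonPath> pair, and reads go through
    // the accessor instead of the real filesystem.
    struct SourcePath {
        CanonPath path;

        std::string to_string() const { return path.abs; }
    };
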
We hide them in various ways if the experimental feature isn't enabled.
To do this, we had to move the experimental features list out of
libnixstore, because the settings machinery itself depends on it. To do
that, we made a new `ExperimentalFeatureSettings`.
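Roughly the new arrangement (a simplified sketch; the real
`ExperimentalFeatureSettings` plugs into the settings machinery rather
than holding a bare set):

    #include <set>

    // The feature list now lives below libstore, so the settings machinery
    // itself can ask whether a feature is enabled (e.g. in order to hide
    // experimental settings).
    enum struct ExperimentalFeature { CaDerivations, Flakes, NixCommand };

    struct ExperimentalFeatureSettings {
        std::set<ExperimentalFeature> enabled;

        bool isEnabled(ExperimentalFeature f) const
        {
            return enabled.count(f) != 0;
        }
    };
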
If this documentation is inaccurate in any way, please do not hesitate to
suggest corrections. My understanding of this function comes strictly from
reading the source code and some limited experience implementing fetchers.
Previously we would completely refetch the submodules from the
network, even though the repo might already have them. Now we copy the
.git/modules directory from the repo as an optimisation. This speeds
up evaluating
builtins.fetchTree { type = "git"; url = "/path/to/blender"; submodules = true; }
(where /path/to/blender already has the needed submodules) from 121s
to 57s.
This is still pretty inefficient and a hack, but a better solution is
best done on the lazy-trees branch.
This change also helps in the case where the repo already has the
submodules but the origin is unfetchable for whatever reason (e.g. there
have been cases where Nix in a GitHub Action doesn't have the right
authentication set up).
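Very roughly, the optimisation looks like this (a sketch only; the real
code shells out to git and deals with the cached repo layout):

    #include <filesystem>

    namespace fs = std::filesystem;

    // Before fetching submodules from the network, seed the destination
    // with the source repo's already-fetched submodule objects.
    void copySubmoduleObjects(
        const fs::path & srcRepo, const fs::path & dstRepo)
    {
        auto srcModules = srcRepo / ".git" / "modules";
        if (fs::exists(srcModules))
            fs::copy(srcModules, dstRepo / ".git" / "modules",
                fs::copy_options::recursive | fs::copy_options::skip_existing);
    }
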
We cannot use 'actualUrl', because for file:// repos that's not the
original URL that the repo was fetched from. This is a problem since
submodules may be relative to the original URL.
Fixes e.g.
nix eval --impure --json --expr 'builtins.fetchTree { type = "git"; url = "/path/to/blender"; submodules = true; }'
where /path/to/blender is a clone of
https://github.com/blender/blender.git (which has several relative
submodules like '../blender-addons.git').
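To illustrate why the original URL matters (hypothetical helper, handling
only a single leading `../` for the sake of the example):

    #include <string>

    // .gitmodules may contain relative URLs, which only make sense when
    // resolved against the URL the superproject was originally cloned
    // from, not against a local file:// path.
    std::string resolveSubmoduleUrl(std::string base, const std::string & rel)
    {
        if (!rel.starts_with("../")) return rel;   // already absolute
        base = base.substr(0, base.rfind('/'));    // drop last component
        // e.g. base "https://github.com/blender/blender.git"
        //      + rel "../blender-addons.git"
        //     -> "https://github.com/blender/blender-addons.git"
        return base + "/" + rel.substr(3);
    }
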
With the switch to C++20, the rules for designated initializers became
stricter, and we can no longer use them to initialize base classes. Make
those designators comments instead.
(BTW,
https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2021/p2287r1.html
offers some new syntax for this use case. Hopefully it will be adopted
and we can eventually use it.)
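For illustration (hypothetical types, not the exact code this touches):

    struct Base { int x; };
    struct Derived : Base { int y; };

    int main()
    {
        // Rejected under the stricter C++20 rules: a designated
        // initializer cannot name a base class (P2287 proposes syntax for
        // exactly this):
        //
        //     Derived d { .Base = { 1 }, .y = 2 };
        //
        // So initialize positionally and keep the names as comments:
        Derived d { /* .Base = */ { 1 }, /* .y = */ 2 };
        return d.x + d.y;
    }
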
If we don't have any GitHub token, we won't be able to fetch private
repos, and we are also more likely to run into API rate limits. To
mitigate this, only ever use the GitHub API if we actually have a token.
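A sketch of the gating (hypothetical names; the real check lives in the
github input scheme):

    #include <iostream>
    #include <optional>
    #include <string>

    // Pick a ref-resolution strategy based on token availability. Without
    // a token the API can't see private repos and is quickly rate-limited,
    // so don't touch it at all in that case.
    std::string refResolutionStrategy(
        const std::optional<std::string> & accessToken)
    {
        return accessToken && !accessToken->empty() ? "github-api" : "plain-git";
    }

    int main()
    {
        std::cout << refResolutionStrategy(std::nullopt) << "\n";            // plain-git
        std::cout << refResolutionStrategy(std::string("<token>")) << "\n";  // github-api
    }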