Compare commits

...

71 commits

Author SHA1 Message Date
jade a510d17484
build-time: hide boost stacktrace in a .cc file
Saves about 16s of CPU time. Not a lot but not nothing. Feels more like
the principle of the thing.
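
As an illustration of the general technique (file and function names here are hypothetical, not the actual Lix code), the idea is to keep `<boost/stacktrace.hpp>` out of every translation unit by exposing only a small wrapper:

```
// stacktrace-render.hh (hypothetical): nothing here drags in Boost.
#include <string>
std::string renderCurrentStacktrace();

// stacktrace-render.cc (hypothetical): the heavy header is compiled exactly once.
#include <boost/stacktrace.hpp>
std::string renderCurrentStacktrace()
{
    return boost::stacktrace::to_string(boost::stacktrace::stacktrace());
}
```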

Change-Id: I0992d4024317c20d6985a7977d5649edfb9f46bb
2024-08-28 09:55:09 -07:00
jade 04f8a14833
tree-wide: shuffle headers around for about 30s compile time
This didn't really feel so worth it afterwards, but I did untangle a
bunch of stuff that should not have been tangled.

The general gist of this change is that variant bullshit was adding a
bunch of compile time, and it seems like the only way to deal with
variant-induced compile time is to keep variant types out of headers.
Explicit template instantiation seems to do nothing for them.

I also seem to have gotten some back-end time improvement from
explicitly instantiating regex, but I don't know why. There is no
corresponding front-end time improvement from it: regex is still at the
top of the sinners list.
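
For context, "explicitly instantiating regex" refers to the usual extern-template pattern; a minimal sketch (not the exact Lix declarations) looks like this:

```
#include <regex>

// In a widely-included header: promise that some translation unit provides the
// instantiation, so other TUs stop compiling std::basic_regex<char> themselves.
extern template class std::basic_regex<char>;

// In exactly one .cc file: emit the instantiation once.
template class std::basic_regex<char>;
```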

**** Templates that took longest to instantiate:
 15231 ms: std::basic_regex<char>::_M_compile (28 times, avg 543 ms)
 15066 ms: std::__detail::_Compiler<std::regex_traits<char>>::_Compiler (28 times, avg 538 ms)
 12571 ms: std::__detail::_Compiler<std::regex_traits<char>>::_M_disjunction (28 times, avg 448 ms)
 12454 ms: std::__detail::_Compiler<std::regex_traits<char>>::_M_alternative (28 times, avg 444 ms)
 12225 ms: std::__detail::_Compiler<std::regex_traits<char>>::_M_term (28 times, avg 436 ms)
 11363 ms: nlohmann::basic_json<>::parse<const char *> (21 times, avg 541 ms)
 10628 ms: nlohmann::basic_json<>::basic_json (109 times, avg 97 ms)
 10134 ms: std::__detail::_Compiler<std::regex_traits<char>>::_M_atom (28 times, avg 361 ms)

Back-end time before messing with the regex:
**** Function sets that took longest to compile / optimize:
  8076 ms: void boost::io::detail::put<$>(boost::io::detail::put_holder<$> cons... (177 times, avg 45 ms)
  4382 ms: std::_Rb_tree<$>::_M_erase(std::_Rb_tree_node<$>*) (1247 times, avg 3 ms)
  3137 ms: boost::stacktrace::detail::to_string_impl_base<boost::stacktrace::de... (137 times, avg 22 ms)
  2896 ms: void boost::io::detail::mk_str<$>(std::__cxx11::basic_string<$>&, ch... (177 times, avg 16 ms)
  2304 ms: std::_Rb_tree<$>::_M_get_insert_hint_unique_pos(std::_Rb_tree_const_... (210 times, avg 10 ms)
  2116 ms: bool std::__detail::_Compiler<$>::_M_expression_term<$>(std::__detai... (112 times, avg 18 ms)
  2051 ms: std::_Rb_tree_iterator<$> std::_Rb_tree<$>::_M_emplace_hint_unique<$... (244 times, avg 8 ms)
  2037 ms: toml::result<$> toml::detail::sequence<$>::invoke<$>(toml::detail::l... (93 times, avg 21 ms)
  1928 ms: std::__detail::_Compiler<$>::_M_quantifier() (28 times, avg 68 ms)
  1859 ms: nlohmann::json_abi_v3_11_3::detail::serializer<$>::dump(nlohmann::js... (41 times, avg 45 ms)
  1824 ms: std::_Function_handler<$>::_M_manager(std::_Any_data&, std::_Any_dat... (973 times, avg 1 ms)
  1810 ms: std::__detail::_BracketMatcher<$>::_BracketMatcher(std::__detail::_B... (112 times, avg 16 ms)
  1793 ms: nix::fetchers::GitInputScheme::fetch(nix::ref<$>, nix::fetchers::Inp... (1 times, avg 1793 ms)
  1759 ms: std::_Rb_tree<$>::_M_get_insert_unique_pos(std::__cxx11::basic_strin... (281 times, avg 6 ms)
  1722 ms: bool nlohmann::json_abi_v3_11_3::detail::parser<$>::sax_parse_intern... (19 times, avg 90 ms)
  1677 ms: boost::io::basic_altstringbuf<$>::overflow(int) (194 times, avg 8 ms)
  1674 ms: std::__cxx11::basic_string<$>::_M_mutate(unsigned long, unsigned lon... (249 times, avg 6 ms)
  1660 ms: std::_Rb_tree_node<$>* std::_Rb_tree<$>::_M_copy<$>(std::_Rb_tree_no... (304 times, avg 5 ms)
  1599 ms: bool nlohmann::json_abi_v3_11_3::detail::parser<$>::sax_parse_intern... (19 times, avg 84 ms)
  1568 ms: void std::__detail::_Compiler<$>::_M_insert_bracket_matcher<$>(bool) (112 times, avg 14 ms)
  1541 ms: std::__shared_ptr<$>::~__shared_ptr() (531 times, avg 2 ms)
  1539 ms: nlohmann::json_abi_v3_11_3::detail::serializer<$>::dump_escaped(std:... (41 times, avg 37 ms)
  1471 ms: void std::__detail::_Compiler<$>::_M_insert_character_class_matcher<... (112 times, avg 13 ms)

After messing with the regex (notice std::__detail::_Compiler vanishes
here, but I don't know why):

**** Function sets that took longest to compile / optimize:
  8054 ms: void boost::io::detail::put<$>(boost::io::detail::put_holder<$> cons... (177 times, avg 45 ms)
  4313 ms: std::_Rb_tree<$>::_M_erase(std::_Rb_tree_node<$>*) (1217 times, avg 3 ms)
  3259 ms: boost::stacktrace::detail::to_string_impl_base<boost::stacktrace::de... (137 times, avg 23 ms)
  3045 ms: void boost::io::detail::mk_str<$>(std::__cxx11::basic_string<$>&, ch... (177 times, avg 17 ms)
  2314 ms: std::_Rb_tree<$>::_M_get_insert_hint_unique_pos(std::_Rb_tree_const_... (207 times, avg 11 ms)
  1923 ms: std::_Rb_tree_iterator<$> std::_Rb_tree<$>::_M_emplace_hint_unique<$... (216 times, avg 8 ms)
  1817 ms: bool nlohmann::json_abi_v3_11_3::detail::parser<$>::sax_parse_intern... (18 times, avg 100 ms)
  1816 ms: toml::result<$> toml::detail::sequence<$>::invoke<$>(toml::detail::l... (93 times, avg 19 ms)
  1788 ms: nlohmann::json_abi_v3_11_3::detail::serializer<$>::dump(nlohmann::js... (40 times, avg 44 ms)
  1749 ms: std::_Rb_tree<$>::_M_get_insert_unique_pos(std::__cxx11::basic_strin... (278 times, avg 6 ms)
  1724 ms: std::__cxx11::basic_string<$>::_M_mutate(unsigned long, unsigned lon... (248 times, avg 6 ms)
  1697 ms: boost::io::basic_altstringbuf<$>::overflow(int) (194 times, avg 8 ms)
  1684 ms: nix::fetchers::GitInputScheme::fetch(nix::ref<$>, nix::fetchers::Inp... (1 times, avg 1684 ms)
  1680 ms: std::_Rb_tree_node<$>* std::_Rb_tree<$>::_M_copy<$>(std::_Rb_tree_no... (303 times, avg 5 ms)
  1589 ms: bool nlohmann::json_abi_v3_11_3::detail::parser<$>::sax_parse_intern... (18 times, avg 88 ms)
  1483 ms: non-virtual thunk to boost::wrapexcept<$>::~wrapexcept() (181 times, avg 8 ms)
  1447 ms: nlohmann::json_abi_v3_11_3::detail::serializer<$>::dump_escaped(std:... (40 times, avg 36 ms)
  1441 ms: std::__shared_ptr<$>::~__shared_ptr() (496 times, avg 2 ms)
  1420 ms: boost::stacktrace::basic_stacktrace<$>::init(unsigned long, unsigned... (137 times, avg 10 ms)
  1396 ms: boost::basic_format<$>::~basic_format() (194 times, avg 7 ms)
  1290 ms: std::__cxx11::basic_string<$>::_M_replace_cold(char*, unsigned long,... (231 times, avg 5 ms)
  1258 ms: std::vector<$>::~vector() (354 times, avg 3 ms)
  1222 ms: std::__cxx11::basic_string<$>::_M_replace(unsigned long, unsigned lo... (231 times, avg 5 ms)
  1194 ms: std::_Rb_tree<$>::_M_get_insert_hint_unique_pos(std::_Rb_tree_const_... (49 times, avg 24 ms)
  1186 ms: bool tao::pegtl::internal::sor<$>::match<$>(std::integer_sequence<$>... (1 times, avg 1186 ms)
  1149 ms: std::__detail::_Executor<$>::_M_dfs(std::__detail::_Executor<$>::_Ma... (70 times, avg 16 ms)
  1123 ms: toml::detail::sequence<$>::invoke(toml::detail::location&) (69 times, avg 16 ms)
  1110 ms: nlohmann::json_abi_v3_11_3::basic_json<$>::json_value::destroy(nlohm... (55 times, avg 20 ms)
  1079 ms: std::_Function_handler<$>::_M_manager(std::_Any_data&, std::_Any_dat... (541 times, avg 1 ms)
  1033 ms: nlohmann::json_abi_v3_11_3::detail::lexer<$>::scan_number() (20 times, avg 51 ms)

Change-Id: I10af282bcd4fc39c2d3caae3453e599e4639c70b
2024-08-28 09:55:05 -07:00
jade e6f2af06e6
clang-tidy: fix the fact that we are not linting headers properly
This, however, took fixing a pile of lints that we predictably missed
because of this bug.

Change-Id: I92c36feb4a03f62bc594c2051c7bd7418d13fb08
2024-08-28 09:52:08 -07:00
jade 4d89844207
build: remove about 30 cpu-sec of compile time by explicit instantiation
Apparently the fmt contraption has some extremely popular overloads, and
the boost stuff in there gets built approximately infinite times in
every compilation unit.
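
A hedged sketch of what explicit instantiation of the formatting machinery can look like (boost::format is the template underneath the fmt helpers; the exact instantiations chosen in Lix may differ):

```
#include <boost/format.hpp>

// Header: declare the popular instantiation extern so every TU that formats a
// string stops re-instantiating it.
extern template class boost::basic_format<char>;

// One .cc file: actually instantiate it, once.
template class boost::basic_format<char>;
```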

Change-Id: Ideba2db7d6bf8559e4d91974bab636f5ed106198
2024-08-28 09:52:05 -07:00
Rebecca Turner 422550fd68 Merge "libstore: remove static initializers for Store registrations" into main 2024-08-28 16:43:22 +00:00
jade 5d31e889d7 Merge "treewide: fix a bunch of lints" into main 2024-08-28 03:40:27 +00:00
Pierre Bourdon 4f02255c20
libstore: remove static initializers for Store registrations
Ref #359.

Change-Id: Ia45530ddee25fa9fc399ff10738bb0d8bbc8b221
2024-08-26 16:27:31 -07:00
Rebecca Turner 0dc486a5bf Merge "Fix comment in getHome" into main 2024-08-26 23:17:04 +00:00
jade 0cc285f87b
treewide: fix a bunch of lints
Fixes:
- Identifiers starting with _ are prohibited
- Some drive-by header dependency cleaning, which wound up requiring some
  extra fixups.
- Fucking C style casts, man. C++ made these 1000% worse by letting you
  also do memory corruption with them with references.
  - Remove casts to Expr * where ExprBlackHole is an incomplete type by
    introducing an explicitly-cast eBlackHoleAddr as Expr *.
  - An incredibly illegal cast of the text bytes of the StorePath hash
    into a size_t directly. You can't DO THAT.

    Replaced with actually parsing the hash so we get 100% of the bits
    being entropy, then memcpying the start of the hash (a sketch of the
    idea follows this list). If this shows up in a profile we should just
    make the hash parser faster with a lookup table or something sensible
    like that.
  - This horrendous bit of UB: I thankfully slapped a deprecation warning
    on it, built, and it didn't trigger anywhere, so it was dead code and
    I just deleted it. But holy crap, you *cannot* do that.

    inline void mkString(const Symbol & s)
    {
        mkString(((const std::string &) s).c_str());
    }
- Some wrong lints: lots of wrong macro lints, and one wrong
  suspicious-sizeof lint triggered by the template being instantiated
  with only pointers, even though the calculation is correct for both
  pointers and non-pointers.
- Exceptions in destructors strike again. I tried to catch the
  exceptions that might actually happen rather than all the exceptions
  imaginable. We can let the runtime hard-kill it on other exceptions
  imo.
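
As a rough sketch of the hash fix mentioned above (types and the surrounding base32 parsing are assumed, not copied from Lix): decode the textual hash into raw bytes elsewhere, then take the leading bytes of the *decoded* hash rather than of the text:

```
#include <algorithm>
#include <cstddef>
#include <cstring>
#include <vector>

// decodedHash holds the already-parsed (binary) store path hash.
size_t storePathHashValue(const std::vector<unsigned char> & decodedHash)
{
    size_t h = 0;
    std::memcpy(&h, decodedHash.data(), std::min(sizeof h, decodedHash.size()));
    return h;
}
```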

Change-Id: I71761620846cba64d66ee7ca231b20c061e69710
2024-08-26 16:13:03 -07:00
Rebecca Turner ca08f1217d
rowan: 0.15.15 -> 0.15.16
This fixes an ambiguous pointer comparison warning.

See: https://github.com/rust-analyzer/rowan/pull/162
Change-Id: Iaac2c8cab0051eb97211893ad547d8dfa8fda560
2024-08-26 11:34:43 -07:00
eldritch horrors b6884388a1 add dedicated test for hash mismatch url reporting
the current test relies on derivation build order being deterministic,
which will not remain a reasonable expectation for much longer.

Change-Id: I9be44a7725185f614a9a4c724045b8b1e6962c03
2024-08-25 22:21:32 +00:00
Rebecca Turner 0582999bd1 Merge "Add ApplyConfigOptions" into main 2024-08-25 22:06:45 +00:00
eldritch horrors 398894b856 libstore: make Goal::ex a shared_ptr
this makes WorkResult copyable, and just all around easier to deal with.
in the future we'll need this to let Goal::work() return a promise for a
WorkResult (or even just a Finished) that can be awaited by other goals.
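
A minimal sketch of why this helps (type names illustrative, not the real Lix definitions): a shared_ptr member copies freely, so the result struct becomes copyable and several goals can hold and inspect the same error.

```
#include <memory>
#include <stdexcept>

struct WorkResult
{
    bool success = false;
    // shared ownership: WorkResult can be copied into every goal awaiting it
    std::shared_ptr<std::runtime_error> ex;
};
```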

Change-Id: Ic5a1ce04c5a0f8e683bd00a2ed2b77a2e28989c1
2024-08-25 21:21:55 +00:00
eldritch horrors 30a87b4cd5 libstore: remove unused Goal ctor parameter
Change-Id: I9345fe272d6df5bd592621ce2da369fc1cd36d6d
2024-08-25 20:40:19 +00:00
jade 72f91767a8 Merge "fix: good errors for failures caused by allowSubstitutes" into main 2024-08-25 20:00:58 +00:00
jade 3bf8819fa2 Merge changes Ief8e8ebc,Id3135db0,If1e76169 into main
* changes:
  libutil: delete unused boost context cruft
  build: remove approximately 400 seconds of CPU time (30%)
  fix: use http proxy for s3 access
2024-08-25 19:59:46 +00:00
Rebecca Turner c300efc0e1
Add ApplyConfigOptions
Change-Id: Ic876bcabd0b68e579bbd30ca1755919df43d4813
2024-08-25 12:18:20 -07:00
eldritch horrors cae260a158 libstore: diagnose local build failure in goal
this should be done where we're actually trying to build something, not
in the main worker loop that shouldn't have to be aware of such details

Change-Id: I07276740c0e2e5591a8ce4828a4bfc705396527e
2024-08-25 19:55:47 +02:00
eldritch horrors 04b591dc1d devShell: can we have debuggers?
macos: we have debuggers at home

(the debuggers at home: entitled little brats)

Change-Id: Iefd4b5880da97846a81d601db05d2b46530a2b58
2024-08-24 21:34:13 +02:00
jade 686120ee4a fix: good errors for failures caused by allowSubstitutes
This caused an absolute saga which I would not like anyone else to have
to experience. Let's put in a laser-targeted error message that
diagnoses this exact problem.

Fixes: lix-project/lix#484
Change-Id: I2a79f04aeb4a1b67c10115e5e39501d958836298
2024-08-23 17:49:15 -07:00
Rebecca Turner fabc9f29b8
Fix comment in getHome
The logic in the comment is the opposite of the truth.

Change-Id: I64add84539209782ffa46431f3db1fb306d90b3f
2024-08-23 15:15:21 -07:00
Rebecca Turner c5949bfe31 Merge "libutil/config: unify path setting types" into main 2024-08-23 22:09:11 +00:00
jade 7e677d15a4 libutil: delete unused boost context cruft
This was from before we got rid of the boost coroutines. Now we don't
need any of this code.

Change-Id: Ief8e8ebc184f02f48e30cb253a66b540faa56329
2024-08-23 13:23:33 -07:00
jade af546be205 build: remove approximately 400 seconds of CPU time (30%)
This took parsing time from 1421s or so to 1060s or so. The reason is
entirely nlohmann. All of the stuff below is just Obliterated because it's
built in the PCH instead:

**** Templates that took longest to instantiate:
219051 ms: nlohmann::basic_json<>::parse<const char *> (276 times, avg 793 ms)
169675 ms: nlohmann::basic_json<>::basic_json (1127 times, avg 150 ms)
129416 ms: nlohmann::detail::parser<nlohmann::basic_json<>, nlohmann::detail::i... (276 times, avg 468 ms)
 98155 ms: nlohmann::detail::parser<nlohmann::basic_json<>, nlohmann::detail::i... (276 times, avg 355 ms)
 81322 ms: nlohmann::basic_json<>::json_value::json_value (1405 times, avg 57 ms)
 53531 ms: nlohmann::detail::json_sax_dom_callback_parser<nlohmann::basic_json<... (276 times, avg 193 ms)

This is clang-only. It brings the clang build time to not far from *half*
of the gcc build time.

Also, clang is much less inclined to miscompile coroutines. Maybe we
should just be clang-only.

Change-Id: Id3135db0094e4560830674090e32e6da2c22fcc6
2024-08-23 13:23:33 -07:00
jade 9aacf425dc fix: use http proxy for s3 access
I don't know why the AWS SDK disabled it by default. It would be nice
to have test coverage of the s3 store or proxies, but neither currently
exists.
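
A hedged sketch of the kind of change involved (assuming the SDK's `allowSystemProxy` switch is the relevant knob; this is not copied from the Lix patch):

```
#include <aws/core/client/ClientConfiguration.h>

Aws::Client::ClientConfiguration makeS3Config()
{
    Aws::Client::ClientConfiguration config;
    // Off by default "for legacy reasons"; turning it on makes the SDK honour
    // http_proxy / https_proxy from the environment.
    config.allowSystemProxy = true;
    return config;
}
```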

Fixes: lix-project/lix#433
Change-Id: If1e76169a3d66dbec2e926af0d0d0eccf983b97b
2024-08-23 13:23:33 -07:00
Rebecca Turner 9845637359
lix-clang-tidy: Require Clang >= 16
Nixpkgs Clang on macOS is 16, not 17.

nix-repl> packages.aarch64-darwin.nix-clangStdenv.stdenv.cc
«derivation /nix/store/ycych9qpim4r42hjkznl8f6zmj0jns45-clang-wrapper-16.0.6.drv»

nix-repl> packages.x86_64-linux.nix-clangStdenv.stdenv.cc
«derivation /nix/store/y48dhgidb2vs230r9ayim14q61xwcdg9-clang-wrapper-17.0.6.drv»

Change-Id: Ib267b8882f80eef4db665fb9df50ae285ea68b2b
2024-08-23 12:17:01 -07:00
jade 87fd6e0095 Merge "Revert "libexpr: Replace regex engine with boost::regex"" into main 2024-08-22 22:34:10 +00:00
jade 9896d309cb Revert "libexpr: Replace regex engine with boost::regex"
This reverts commit 447212fa65.

Reason for revert: Regression in eval behaviour bug-compatibility.

Expected behaviour (Nix 2.18.5, macOS and Linux [libstdc++/libc++]):

```
nix-repl> builtins.match "\\.*(.*)" ".keep"
[ "keep" ]

nix-repl> builtins.match "(\\.*)(.*)" ".keep"
[ "." "keep" ]
```

Actual behaviour (boost::regex):

```
nix-repl> builtins.match "\\.*(.*)" ".keep"
[ ".keep" ]

nix-repl> builtins.match "(\\.*)(.*)" ".keep"
[
  "."
  "keep"
]
```

Bug: lix-project/lix#483
Change-Id: Id462eb8586dcd54856cf095f09b3e3a216955b60
2024-08-22 18:35:11 +00:00
sugar🍬🍭🏳️‍⚧️ f2e7f8bab8 Merge "libexpr: Replace regex engine with boost::regex" into main 2024-08-22 07:20:00 +00:00
sugar🍬🍭🏳️‍⚧️ 447212fa65 libexpr: Replace regex engine with boost::regex
This avoids C++'s standard library regexes, which aren't the same
across platforms and have many other issues, like using so much stack
that they overflow when processing a lot of data.

To avoid backwards and forward compatibility issues, regexes are
processed using a function converting libstdc++ regexes into Boost
regexes, escaping characters that Boost needs to have escaped, and
rejecting features that Boost has and libstdc++ doesn't.
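
A very rough sketch of the shim idea (not the actual Lix converter, which properly parses the pattern and also rejects Boost-only features); the point is that characters such as a lone `}`, which libstdc++ accepts, must be escaped before handing the pattern to Boost:

```
#include <string>

std::string toBoostRegexSyntax(const std::string & pattern)
{
    std::string out;
    bool inBrace = false; // inside a {m,n} quantifier
    bool inClass = false; // inside a [...] character class
    for (size_t i = 0; i < pattern.size(); ++i) {
        char c = pattern[i];
        if (c == '\\' && i + 1 < pattern.size()) { // copy escapes verbatim
            out += c;
            out += pattern[++i];
            continue;
        }
        if (!inClass && c == '[') inClass = true;
        else if (inClass && c == ']') inClass = false;
        else if (!inClass && c == '{') inBrace = true;
        else if (!inClass && c == '}' && inBrace) inBrace = false;
        else if (!inClass && c == '}') out += '\\'; // lone '}' needs escaping for Boost
        out += c;
    }
    return out;
}
```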

Related context:

- Original attempt to use `boost::regex` in CppNix, which failed due to
  the Boost ICU dependency being large (disabling ICU is no longer
  necessary, since ICU is only linked in when the dedicated header file
  `boost/regex/icu.hpp` is used): https://github.com/NixOS/nix/pull/3826

- An attempt to use PCRE, rejected due to providing less backwards
  compatibility with `std::regex` than `boost::regex`:
  https://github.com/NixOS/nix/pull/7336

- Second attempt to use `boost::regex`, which failed because regexes
  containing `}` failed to compile (dealt with by writing a wrapper that
  parses the regular expression and escapes `}` characters):
  https://github.com/NixOS/nix/pull/7762

Closes #34. Closes #476.

Change-Id: Ieb0eb9e270a93e4c7eed412ba4f9f96cb00a5fa4
2024-08-22 03:17:55 +02:00
jade 651cc0e5b4 fix: build with meson 1.5 also
nixpkgs delivered us the untimely gift of a meson 1.5 upgrade, which
*does* make our lives easier by allowing us to delete wrap generation
code, but it does so at the cost of renaming all rust crates in such a
way that the wrap logic cannot tolerate the new names on the old meson
version 😭.

It also means that support burden for this is going to be atrocious
until we either give in and vendor meson 1.5 or we make a CI target for
it. Neither seems appealing, though the latter is not super absurd for
ensuring we don't break nixpkgs unstable.

This commit causes meson 1.5 to ignore the .wrap files in subprojects/
entirely (since they have the wrong names lol) and instead use
Cargo.lock, so it now hard-depends on our workspace reshuffling
improvement.

It also deletes the hack that we were using to get the sources of Cargo
deps into meson by using a feature that went unnoticed when this code
was originally written: MESON_PACKAGE_CACHE_DIR:
8a202de6ec/mesonbuild/wrap/wrap.py (L490-L502)

Change-Id: I7a28f12fc2812c6ed7537b60bc3025c141a05874
2024-08-21 17:09:10 +00:00
jade dba615098d build: move to a Cargo workspace
This is purely to let Cargo's dependency resolver do stuff for us; we do
not actually intend to build this stuff with Cargo to begin with.

Change-Id: I4c08d55595c7c27b7096375022581e1e34308a87
2024-08-21 17:09:10 +00:00
alois31 e3c289dbe9
libutil/config: unify path setting types
There have been multiple setting types for paths that are supposed to be
canonicalised, depending on whether zero or one, one, or any number of paths is
to be specified. Naturally, they behaved in slightly different ways in the
code. Simplify things by unifying them and removing special behaviour (mainly
the "multiple paths type can coerce to boolean" thing).

Change-Id: I7c1ce95e9c8e1829a866fb37d679e167811e9705
2024-08-21 17:57:23 +02:00
piegames e38410799b Merge "libexpr: Soft-deprecate ancient let syntax" into main 2024-08-21 11:22:48 +00:00
piegames 0edfea450b libexpr: Soft-deprecate ancient let syntax
Change-Id: I6802b26f038578870ea1fa1ed298f0c4b1f29c4a
2024-08-21 12:59:03 +02:00
jade 3cbbe22fab Merge "flake: fix compiler warning" into main 2024-08-21 10:44:01 +00:00
piegames 0a8888d1c7 treewide: Stop using ancient let syntax
Shows how long these tests have gone without anyone touching them …

Change-Id: I3d0c1209a86283ddb012db4e7d45073264fdd0eb
2024-08-21 06:55:52 +00:00
piegames 7210ed1b87 libexpr: Soft-deprecate __overrides
Change-Id: I787e69e1dad6edc5ccdb747b74a9ccd6e8e13bb3
2024-08-21 06:55:52 +00:00
jade c25c43d8c8 flake: fix compiler warning
GCC was complaining, rightfully, about mixed-sign comparisons in there.
I removed some extra sign mixing too.

Change-Id: I949a618c7405c23d4dc3fd17440ea2d7b5c22c9d
2024-08-20 16:13:17 -07:00
Audrey Dutcher ac6974777e Merge "tests/functional/restricted: Don't use a process substitution" into main 2024-08-20 22:51:38 +00:00
jade 736b5d5913 lix-doc: move under src/
This is required to make more meson stuff easier/possible, and honestly
it *is* now Lix sources anyhow.

Change-Id: Ia6c38fabce9aa5c53768745ee38c5cf344f5c226
2024-08-20 13:38:46 -06:00
Qyriad 95863b258b build: build lix-doc with Meson! 🎉
lix-doc is now built with Meson, with lix-doc's dependencies built as
Meson subprojects, either fetched on demand with .wrap files, or fetched
in advance by Nix with importCargoLock. It even builds statically.

Fixes #256.

Co-authored-by: Lunaphied <lunaphied@lunaphied.me>
Co-authored-by: Jade Lovelace <lix@jade.fyi>

Change-Id: I3a4731ff13278e7117e0316bc0d7169e85f5eb0c
2024-08-20 17:21:13 +00:00
Yureka f1533160aa Merge "libutil: fix conditional for close_range availability" into main 2024-08-20 07:55:01 +00:00
Yureka df49d37b71 libutil: fix conditional for close_range availability
This check is wrong and would cause the close_range() function to be called even when it's not available.

Change-Id: Ide65b36830e705fe772196c37349873353622761
2024-08-20 08:58:25 +02:00
Audrey Dutcher ae628d4af2 tests/functional/restricted: Don't use a process substitution
The <() process substitution syntax doesn't work for this one testcase
in bash for FreeBSD. The exact reason for this is unknown, possibly to
do with pipe vs file vs fifo EOF behavior. The prior behavior was this
test hanging forever, with no children of the bash process.

Change-Id: I71822a4b9dea6059b34300568256c5b7848109ac
2024-08-19 20:37:51 -07:00
Maximilian Bosch 040e783232 flake: don't refetch unmodified inputs by recursive follows
Closes #460

I managed to trigger the issue by having the following inputs (shortened):

    authentik-nix.url = "github:nix-community/authentik-nix";
    authentik-nix.inputs.poetry2nix.inputs.nixpkgs.follows = "nixpkgs";

When evaluating this using

    nix-eval-jobs --flake .#hydraJobs

I got the following error:

    error: cannot update unlocked flake input 'authentik-nix/poetry2nix' in pure mode

The issue we have here is that `authentik-nix/poetry2nix` was written
into the `overrideMap`, which caused Nix to assume it was a new input and
either try to refetch it (#460) or error out in pure mode
(nix-eval-jobs / Hydra).

The testcase unfortunately only checks the output log, making sure that
something *is* logged on the first fetch so that the test doesn't rot
when the logging changes, since I didn't manage to trigger the error
above with the reproducer from #460. In fact, I only managed to trigger
the `cannot update unlocked flake input` error in this context with
`nix-eval-jobs`.

Change-Id: Ifd00091eec9a0067ed4bb3e5765a15d027328807
2024-08-19 19:57:12 +00:00
eldritch horrors e727dbc3a3 libstore: un-enable_shared_from_this Goal
it's no longer needed for anything, and not even a great idea.

Change-Id: Ia7a59e1e3f9d8f4ad2ac3b054e38485157c210a6
2024-08-19 09:13:44 +00:00
eldritch horrors b40369942c libstore: make Worker::childStarted private
this can be a proper WorkResult now. childTerminated is unfortunately a
lot more stubborn and won't be made private for quite a while yet. once
we can get rid of the Worker poll loop that *should* be possible though

Change-Id: I2218df202da5cb84e852f6a37e4c20367495b617
2024-08-19 09:13:44 +00:00
eldritch horrors fca523d661 libstore: turn HookReply into a variant type
we'll need this once we want to pass extra information out of accepting
replies, such as fd sets or possibly even async output reader promises.
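
A sketch of the shape of such a change (alternative names are hypothetical, not the real Lix types): with std::variant, the accepting case can start carrying data that a plain enum could not.

```
#include <variant>
#include <vector>

struct HookAccept { std::vector<int> builderFds; }; // extra data rides along with acceptance
struct HookDecline {};
struct HookPostpone {};

using HookReply = std::variant<HookAccept, HookDecline, HookPostpone>;
```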

Change-Id: I5e2f18cdb80b0d2faf3067703cc18bd263329b3f
2024-08-19 09:13:44 +00:00
eldritch horrors 5e9db09761 libstore: downsize hook pipes
don't keep fds open we're not using. currently this does not cause any
problems, but it does increase the size of our fd table needlessly and
in the future, when we have proper async processing, having builderOut
open in the daemon once the hook has been fully started is problematic

Change-Id: I6e7fb773b280b042873103638d3e04272ca1e4fc
2024-08-19 09:13:44 +00:00
eldritch horrors e513cd2beb libstore: run childStarted as late as possible
otherwise we *technically* give away the output fds before we've read them.

Change-Id: I6ad0d6a1bb553ecfcdd7708f50d34142a425374d
2024-08-19 09:13:44 +00:00
eldritch horrors fb8eb539fc libstore: move respect-timeoutiness to goal method
this is useless to do on the face of it, but it'll make it easier to
convert the entire output handling to use async io and promises soon

Change-Id: I2d1eb62c4bbf8f57bd558b9599c08710a389b1a8
2024-08-19 09:13:44 +00:00
jade 3d14567d0b Merge "doc: fix broken meson deps for various manuals outputs" into main 2024-08-19 04:30:39 +00:00
jade 925e08b858 Merge "build: limit clang-tidy concurrency and respect NIX_BUILD_CORES" into main 2024-08-19 02:55:48 +00:00
eldritch horrors 5cbca85535 libstore: clarify that build log fd and hook log fd are different
only DerivationGoal can set the hook to anything at all. it always sets
buildOutFD to something that is not related to fromHook in any way, and
mixing the two would have rather dire consequences for log consistency.

Change-Id: Ida86727fd1cd5e1ecd78f07f3bde330a346658a8
2024-08-18 22:44:11 +00:00
jade ecfe9345cf build: limit clang-tidy concurrency and respect NIX_BUILD_CORES
Apparently it was impolite to lint with 128 jobs on our CI machine with
128 threads. Let's fix it.

Change-Id: I9ca7306294c6773c6f233690ba49d45a1da6bf7a
2024-08-18 15:39:05 -07:00
jade 84543b459c doc: fix broken meson deps for various manuals outputs
This is incredibly haunted, but it can happen that you change libutil in
a way that breaks the generation of the .json files, and the build then
does not regenerate them. I don't expect they are slow to build, so it
does not seem so bad to just rebuild them every time instead of
extracting a list of all the possible deps.

We want to delete this nonsense anyway and replace it with generated
code.

Change-Id: Ia576d1a3bdee48fbaefbb5ac194354428d179a84
2024-08-18 15:19:15 -07:00
eldritch horrors e2d330aeed libstore: remove DerivationGoal::isReadDesc
all derivation goals need a log fd of some description. let's save this
single fd in a dedicated pointer field for all subclasses so that later
we have just the one spot to change if we turn this into async promises

Change-Id: If223adf90909247363fb823d751cae34d25d0c0b
2024-08-18 22:04:06 +00:00
piegames 007211e7a2 libutil: Optimize feature checks
Instead of doing a linear search on an std::set, we use a bitset enum.
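
A minimal sketch of the idea (enum values illustrative, not the real feature list): membership becomes a single mask test instead of a tree lookup.

```
#include <cstdint>

enum class ExperimentalFeature : uint8_t { Flakes = 0, NixCommand = 1, ReplFlake = 2 };

struct FeatureSet
{
    uint64_t bits = 0;

    void enable(ExperimentalFeature f) { bits |= uint64_t(1) << uint8_t(f); }
    bool contains(ExperimentalFeature f) const { return bits & (uint64_t(1) << uint8_t(f)); }
};
```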

Change-Id: Ide537f6cffdd16d06e59aaeb2e4ac0acb6493421
2024-08-18 16:56:49 +00:00
eldritch horrors 7506d680ac libstore: don't ignore max-build-log-size for ssh-ng
Change-Id: Ieab14662bea6e6f5533325f0e945147be998f9a2
2024-08-18 09:10:05 +00:00
eldritch horrors 38f550708d libstore: add explicit in-build-slot-ness to goals
we don't need to expose information about how busy a Worker is if the
worker can instead tell its work items whether they are in a slot. in
the future we might use this to not start items waiting for a slot if
no slots are currently available, but that requires more preparation.

Change-Id: Ibe01ac536da7e6d6f80520164117c43e772f9bd9
2024-08-18 09:10:05 +00:00
eldritch horrors 176e1058f1 libstore: remove method without definition
Change-Id: I676411752a4b1777045d7211ac1176693f1a3d7d
2024-08-18 09:10:05 +00:00
eldritch horrors 91a74ba82a libstore: remove unused includes in worker code
Change-Id: I6c7fccc4e710e23a22faae2669cb75f2f6da27b4
2024-08-18 09:10:05 +00:00
eldritch horrors b66fd9ff4b libstore: make Worker::removeGoal private
Change-Id: I8583d9ff752f702a10ec52b0330b0d4d4d2614fa
2024-08-18 09:10:05 +00:00
piegames 278fddc317 libexpr: Deprecate URL literals
Closes #437.

Change-Id: I9f67fc965bb4a7e7fd849e5067ac1cb3bab064cd
2024-08-17 20:31:57 +02:00
piegames 49d61b2e4b libexpr: Introduce Deprecated features
They are like experimental features, but opt-in instead of opt-out. They
will allow us to gracefully remove language features. See #437

Change-Id: I9ca04cc48e6926750c4d622c2b229b25cc142c42
2024-08-17 19:47:51 +02:00
piegames 1c080a8239 treewide: Stop using URL literals
They must die

Change-Id: Ibe2b1818b21d98ec1a68836d01d5dad729b8c501
2024-08-17 15:48:10 +00:00
Artemis Tosini 41a0b08e64 meson: Don't use target_machine
The target_machine variable is meant for the target
of cross compilers. We are not a cross compiler, so
we reuse our host_machine-based checks instead.

Fixes Linux→FreeBSD cross, since Meson can't figure
out `target_machine.kernel()` in that case.

Fixes: lix-project/lix#469

Change-Id: Ia46a64c8d507c3b08987a1de1eda171ff5e50df4
2024-08-16 14:24:03 +00:00
Artemis Tosini b016eb0895 Merge "libutil: Add bindPath function from libstore" into main 2024-08-13 19:39:10 +00:00
jade f9a3bf6ccc Update version to 2.92
Change-Id: Ib64d695c50a733e0e739ff193f1ea65ed7cb0a57
2024-08-12 18:06:08 -07:00
Artemis Tosini 3058029fba
libutil: Add bindPath function from libstore
bindPath/doBind is a useful function in build that is used in several
parts of LocalDerivationGoal. Moving this function makes it easier to
split LocalDerivationGoal implementation between several files.

Change-Id: Ic5a0768479c153c1aa3ed425f12604b20bbf0f42
2024-07-27 19:40:40 +00:00
237 changed files with 2330 additions and 1119 deletions

7
.gitignore vendored
View file

@ -9,6 +9,10 @@ GTAGS
# ccls
/.ccls-cache
# auto-generated compilation database
compile_commands.json
rust-project.json
result
result-*
@ -29,3 +33,6 @@ buildtime.bin
/.pre-commit-config.yaml
/.nocontribmsg
/release
# Rust build files when using Cargo (not actually supported for building but it spews the files anyway)
/target/

View file

@ -2,12 +2,6 @@
# It is not intended for manual editing.
version = 3
[[package]]
name = "autocfg"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa"
[[package]]
name = "countme"
version = "3.0.1"
@ -16,15 +10,15 @@ checksum = "7704b5fdd17b18ae31c4c1da5a2e0305a2bf17b5249300a9ee9ed7b72114c636"
[[package]]
name = "dissimilar"
version = "1.0.7"
version = "1.0.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86e3bdc80eee6e16b2b6b0f87fbc98c04bee3455e35174c0de1a125d0688c632"
checksum = "59f8e79d1fbf76bdfbde321e902714bf6c49df88a7dda6fc682fc2979226962d"
[[package]]
name = "expect-test"
version = "1.4.1"
version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30d9eafeadd538e68fb28016364c9732d78e420b9ff8853fa5e4058861e9f8d3"
checksum = "9e0be0a561335815e06dab7c62e50353134c796e7a6155402a64bcff66b6a5e0"
dependencies = [
"dissimilar",
"once_cell",
@ -45,15 +39,6 @@ dependencies = [
"rowan",
]
[[package]]
name = "memoffset"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "488016bfae457b036d996092f6cb448677611ce4449e970ceaf42695203f218a"
dependencies = [
"autocfg",
]
[[package]]
name = "once_cell"
version = "1.19.0"
@ -71,13 +56,12 @@ dependencies = [
[[package]]
name = "rowan"
version = "0.15.15"
version = "0.15.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32a58fa8a7ccff2aec4f39cc45bf5f985cec7125ab271cf681c279fd00192b49"
checksum = "0a542b0253fa46e632d27a1dc5cf7b930de4df8659dc6e720b647fc72147ae3d"
dependencies = [
"countme",
"hashbrown",
"memoffset",
"rustc-hash",
"text-size",
]

6
Cargo.toml Normal file
View file

@ -0,0 +1,6 @@
[workspace]
resolver = "2"
members = ["src/lix-doc"]
[workspace.package]
edition = "2021"

View file

@ -1,9 +1,14 @@
# Usually "experimental" or "deprecated"
kind:
# "xp" or "dp"
kindShort:
with builtins;
with import ./utils.nix;
let
showExperimentalFeature = name: doc: ''
- [`${name}`](@docroot@/contributing/experimental-features.md#xp-feature-${name})
- [`${name}`](@docroot@/contributing/${kind}-features.md#${kindShort}-feature-${name})
'';
in
xps: indent " " (concatStrings (attrValues (mapAttrs showExperimentalFeature xps)))

View file

@ -0,0 +1,18 @@
# Usually "experimental" or "deprecated"
_kind:
# "xp" or "dp"
kindShort:
with builtins;
with import ./utils.nix;
let
showFeature =
name: doc:
squash ''
## [`${name}`]{#${kindShort}-feature-${name}}
${doc}
'';
in
xps: (concatStringsSep "\n" (attrValues (mapAttrs showFeature xps)))

View file

@ -1,13 +0,0 @@
with builtins;
with import ./utils.nix;
let
showExperimentalFeature =
name: doc:
squash ''
## [`${name}`]{#xp-feature-${name}}
${doc}
'';
in
xps: (concatStringsSep "\n" (attrValues (mapAttrs showExperimentalFeature xps)))

View file

@ -20,6 +20,8 @@ conf_file_json = custom_target(
capture : true,
output : 'conf-file.json',
env : nix_env_for_docs,
# FIXME: put the actual lib targets in here? meson have introspection challenge 2024 though.
build_always_stale : true,
)
nix_conf_file_md_body = custom_target(
@ -50,6 +52,8 @@ nix_exp_features_json = custom_target(
command : [ nix, '__dump-xp-features' ],
capture : true,
output : 'xp-features.json',
# FIXME: put the actual lib targets in here? meson have introspection challenge 2024 though.
build_always_stale : true,
)
language_json = custom_target(
@ -57,6 +61,8 @@ language_json = custom_target(
output : 'language.json',
capture : true,
env : nix_env_for_docs,
# FIXME: put the actual lib targets in here? meson have introspection challenge 2024 though.
build_always_stale : true,
)
nix3_cli_json = custom_target(
@ -64,6 +70,8 @@ nix3_cli_json = custom_target(
capture : true,
output : 'nix.json',
env : nix_env_for_docs,
# FIXME: put the actual lib targets in here? meson have introspection challenge 2024 though.
build_always_stale : true,
)
generate_manual_deps = files(
@ -72,9 +80,9 @@ generate_manual_deps = files(
# Generates builtins.md and builtin-constants.md.
subdir('src/language')
# Generates new-cli pages, experimental-features-shortlist.md, and conf-file.md.
# Generates new-cli pages, {experimental,deprecated}-features-shortlist.md, and conf-file.md.
subdir('src/command-ref')
# Generates experimental-feature-descriptions.md.
# Generates {experimental,deprecated}-feature-descriptions.md.
subdir('src/contributing')
# Generates rl-next-generated.md.
subdir('src/release-notes')
@ -106,6 +114,8 @@ manual = custom_target(
nix3_cli_files,
experimental_features_shortlist_md,
experimental_feature_descriptions_md,
deprecated_features_shortlist_md,
deprecated_feature_descriptions_md,
conf_file_md,
builtins_md,
builtin_constants_md,

View file

@ -0,0 +1,21 @@
---
synopsis: "Build failures caused by `allowSubstitutes = false` while being the wrong system now produce a decent error"
issues: [fj#484]
cls: [1841]
category: Fixes
credits: jade
---
Nix allows derivations to set `allowSubstitutes = false` in order to force them to be built locally without querying substituters for them.
This is useful for derivations that are very fast to build (especially if they produce large output).
However, this can shoot you in the foot if the derivation *has* to be substituted, such as when the derivation is for another architecture; that is what `--always-allow-substitutes` is for.
Perhaps such derivations that are known to be impossible to build locally should ignore `allowSubstitutes` (irrespective of remote builders) in the future, but this at least reports the failure and solution directly.
```
$ nix build -f fail.nix
error: a 'unicornsandrainbows-linux' with features {} is required to build '/nix/store/...-meow.drv', but I am a 'x86_64-linux' with features {...}
Hint: the failing derivation has allowSubstitutes set to false, forcing it to be built rather than substituted.
Passing --always-allow-substitutes to force substitution may resolve this failure if the path is available in a substituter.
```

View file

@ -0,0 +1,17 @@
---
synopsis: Deprecated language features
issues: [fj#437]
cls: [1785, 1736, 1735, 1744]
category: Breaking Changes
credits: [piegames, horrors]
---
A system for deprecation (and then the planned removal) of undesired language features has been put into place.
It is controlled via feature flags much like experimental features, except that the deprecations are enabled by default,
and can be disabled via the flags for backwards compatibility (opt-out with `--extra-deprecated-features` or the Nix configuration file).
- `url-literals`: **URL literals** have long been obsolete and discouraged from use, and now they are officially deprecated.
This means that all URLs must be properly put within quotes like all other strings.
- `rec-set-overrides`: **__overrides** is an old arcane syntax which has not been in use for more than a decade.
It is soft-deprecated with a warning only, with the plan to turn that into an error in a future release.
- `ancient-let`: **The old `let` syntax** (`let { body = …; … }`) is soft-deprecated with a warning as well. Use the regular `let … in` instead.

View file

@ -0,0 +1,10 @@
---
synopsis: HTTP proxy environment variables are now respected for S3 binary cache stores
issues: [fj#433]
cls: [1788]
category: Fixes
credits: jade
---
Due to "legacy reasons" (according to the AWS C++ SDK docs), the AWS SDK ignores system proxy configuration by default.
We turned it back on.

View file

@ -192,6 +192,7 @@
- [Hacking](contributing/hacking.md)
- [Testing](contributing/testing.md)
- [Experimental Features](contributing/experimental-features.md)
- [Deprecated Features](contributing/deprecated-features.md)
- [CLI guideline](contributing/cli-guideline.md)
- [C++ style guide](contributing/cxx.md)
- [Release Notes](release-notes/release-notes.md)

View file

@ -1,23 +1,37 @@
xp_features_json = custom_target(
command : [nix, '__dump-xp-features'],
capture : true,
output : 'xp-features.json',
)
experimental_features_shortlist_md = custom_target(
command : nix_eval_for_docs + [
'--expr',
'import @INPUT0@ (builtins.fromJSON (builtins.readFile @INPUT1@))',
'import @INPUT0@ "experimental" "xp" (builtins.fromJSON (builtins.readFile @INPUT1@))',
],
input : [
'../../generate-xp-features-shortlist.nix',
xp_features_json,
'../../generate-features-shortlist.nix',
nix_exp_features_json,
],
capture : true,
output : 'experimental-features-shortlist.md',
env : nix_env_for_docs,
)
dp_features_json = custom_target(
command : [nix, '__dump-dp-features'],
capture : true,
output : 'dp-features.json',
)
deprecated_features_shortlist_md = custom_target(
command : nix_eval_for_docs + [
'--expr',
'import @INPUT0@ "deprecated" "dp" (builtins.fromJSON (builtins.readFile @INPUT1@))',
],
input : [
'../../generate-features-shortlist.nix',
dp_features_json,
],
capture : true,
output : 'deprecated-features-shortlist.md',
env : nix_env_for_docs,
)
# Intermediate step for manpage generation.
# This splorks the output of generate-manpage.nix as JSON,
# which gets written as a directory tree below.
@ -60,6 +74,7 @@ conf_file_md = custom_target(
'../../utils.nix',
conf_file_json,
experimental_features_shortlist_md,
deprecated_features_shortlist_md,
],
output : 'conf-file.md',
env : nix_env_for_docs,

View file

@ -0,0 +1,37 @@
This section describes the notion of *deprecated features*, and how it fits into the big picture of the development of Lix.
# What are deprecated features?
Deprecated features are disabled by default, with the intent to eventually remove them.
Users must explicitly enable them to keep using them, by toggling the associated [deprecated feature flags](@docroot@/command-ref/conf-file.md#conf-deprecated-features).
This allows backwards compatibility and a graceful transition away from undesired features.
# Which features can be deprecated?
Undesired features should be soft-deprecated by yielding a warning when used for a significant amount of time before they can be deprecated.
Legacy obsolete features with little to no usage may go through this process faster.
Deprecated features should have a migration path to a preferred alternative.
# Lifecycle of a deprecated feature
This description is not normative, but a feature removal may roughly happen like this:
1. Add a warning when the feature is being used.
2. Disable the feature by default, putting it behind a deprecated feature flag.
- If disabling the feature started out as an opt-in experimental feature, turn that experimental flag into a no-op or remove it entirely.
For example, `--extra-experimental-features=no-url-literals` becomes `--extra-deprecated-features=url-literals`.
3. Decide on a time frame for how long that feature will still be supported for backwards compatibility, and clearly communicate that in the error messages.
- Sometimes, automatic migration to alternatives is possible, and it should be provided where feasible
- At least one NixOS release cycle should be the minimum
4. Finally remove the feature entirely, only keeping the error message for those still using it.
# Relation to language versioning
Obviously, removing anything breaks backwards compatibility.
In an ideal world, we'd have SemVer controls over the language and its features, cleanly allowing us to make breaking changes.
See https://wiki.lix.systems/books/lix-contributors/page/language-versioning and [RFC 137](https://github.com/nixos/rfcs/pull/137) for efforts on that.
However, we do not live in such an ideal world, and currently this goal is so far away that "just disable it with some back-compat for a couple of years" is the most realistic solution, especially for comparatively minor changes.
# Currently available deprecated features
{{#include @generated@/contributing/deprecated-feature-descriptions.md}}

View file

@ -4,12 +4,25 @@
experimental_feature_descriptions_md = custom_target(
command : nix_eval_for_docs + [
'--expr',
'import @INPUT0@ (builtins.fromJSON (builtins.readFile @INPUT1@))',
'import @INPUT0@ "experimental" "xp" (builtins.fromJSON (builtins.readFile @INPUT1@))',
],
input : [
'../../generate-xp-features.nix',
xp_features_json,
'../../generate-features.nix',
nix_exp_features_json,
],
capture : true,
output : 'experimental-feature-descriptions.md',
)
deprecated_feature_descriptions_md = custom_target(
command : nix_eval_for_docs + [
'--expr',
'import @INPUT0@ "deprecated" "dp" (builtins.fromJSON (builtins.readFile @INPUT1@))',
],
input : [
'../../generate-features.nix',
dp_features_json,
],
capture : true,
output : 'deprecated-feature-descriptions.md',
)

View file

@ -77,12 +77,6 @@
}
```
Finally, as a convenience, *URIs* as defined in appendix B of
[RFC 2396](http://www.ietf.org/rfc/rfc2396.txt) can be written *as
is*, without quotes. For instance, the string
`"http://example.org/foo.tar.bz2"` can also be written as
`http://example.org/foo.tar.bz2`.
- <a id="type-number" href="#type-number">Number</a>
Numbers, which can be *integers* (like `123`) or *floating point*

View file

@ -30,6 +30,14 @@
# FIXME: This hack should be removed when https://git.lix.systems/lix-project/lix/issues/359
# is fixed.
#
# lix-doc is built with Meson in lix-doc/meson.build, and linked into libcmd in
# src/libcmd/meson.build. When building outside the Nix sandbox, Meson will use the .wrap
# files in subprojects/ to download and extract the dependency crates into subprojects/.
# When building inside the Nix sandbox, Lix's derivation in package.nix uses a
# fixed-output derivation to fetch those crates in advance instead, and then symlinks
# them into subprojects/ with the same names that Meson uses when downloading them
# itself -- perfect for --wrap-mode=nodownload, which mesonConfigurePhase uses.
#
# Unit tests are setup in tests/unit/meson.build, under the test suite "check".
#
# Functional tests are a bit more complicated. Generally they're defined in
@ -38,10 +46,11 @@
# be placed in specific directories' meson.build files to create the right directory tree
# in the build directory.
project('lix', 'cpp',
project('lix', 'cpp', 'rust',
version : run_command('bash', '-c', 'echo -n $(jq -r .version < ./version.json)$VERSION_SUFFIX', check : true).stdout().strip(),
default_options : [
'cpp_std=c++2a',
'rust_std=2021',
# TODO(Qyriad): increase the warning level
'warning_level=1',
'debug=true',
@ -138,6 +147,17 @@ if should_pch
# Unlike basically everything else that takes a file, Meson requires the arguments to
# cpp_pch : to be strings and doesn't accept files(). So absolute path it is.
cpp_pch = [meson.project_source_root() / 'src/pch/precompiled-headers.hh']
# Saves about 400s (30% at time of writing) from compile time on-cpu, mostly
# by removing instantiations of nlohmann from every single damned compilation
# unit.
# There is no equivalent in gcc.
if cxx.get_id() == 'clang'
add_project_arguments(
'-fpch-instantiate-templates',
language : 'cpp',
)
endif
else
cpp_pch = []
endif
@ -322,13 +342,6 @@ pegtl = dependency(
nlohmann_json = dependency('nlohmann_json', required : true, include_type : 'system')
# lix-doc is a Rust project provided via buildInputs and unfortunately doesn't have any way to be detected.
# Just declare it manually to resolve this.
#
# FIXME: build this with meson in the future after we drop Make (with which we
# *absolutely* are not going to make it work)
lix_doc = declare_dependency(link_args : [ '-llix_doc' ])
if is_freebsd
libprocstat = declare_dependency(link_args : [ '-lprocstat' ])
endif
@ -417,7 +430,7 @@ check_funcs = [
'strsignal',
'sysconf',
]
if target_machine.kernel() in ['linux', 'freebsd']
if is_linux or is_freebsd
# musl does not have close_range as of 2024-08-10
# patch: https://www.openwall.com/lists/musl/2024/08/01/9
check_funcs += [ 'close_range' ]
@ -543,6 +556,33 @@ if cxx.get_id() in ['clang', 'gcc']
)
endif
# Until Meson 1.5¹, we can't just give Meson a Cargo.lock file and be done with it.
# Meson will *detect* what dependencies are needed from Cargo files; it just won't
# fetch them. The Meson 1.5 feature essentially internally translates Cargo.lock entries
# to .wrap files, and that translation is incredibly straightforward, so let's just
# use a simple Python script to generate the .wrap files ourselves while we wait for
# Meson 1.5. Weirdly, it seems Meson will only detect dependencies from other
# dependency() calls, so we have to specify lix-doc's two top-level dependencies,
# rnix and rowan, manually, and then their dependencies will be recursively translated
# into more dependency() calls.
#
# When Meson translates a Cargo dependency, the string passed to `dependency()` follows
# a fixed format, which is important as the .wrap files' basenames must match the string
# passed to `dependency()` exactly.
# In Meson 1.4, this format is `$packageName-rs`. Meson 1.5 changes this to
# `$packageName-$shortenedVersionString-rs`, because of course it does, but we'll cross
# that bridge when we get there...
#
# [1]: https://github.com/mesonbuild/meson/commit/9b8378985dbdc0112d11893dd42b33b7bc8d1e62
# FIXME: remove (along with its generated wrap files) when we get rid of meson 1.4
run_command(
python,
meson.project_source_root() / 'meson/cargo-lock-to-wraps.py',
meson.project_source_root() / 'Cargo.lock',
meson.project_source_root() / 'subprojects',
check : true,
)
if is_darwin
configure_file(
input : 'misc/launchd/org.nixos.nix-daemon.plist.in',

43
meson/cargo-lock-to-wraps.py Executable file
View file

@ -0,0 +1,43 @@
#!/usr/bin/env python3
import argparse
import tomllib
import sys
DOWNLOAD_URI_FORMAT = 'https://crates.io/api/v1/crates/{crate}/{version}/download'
WRAP_TEMPLATE = """
[wrap-file]
method = cargo
directory = {crate}-{version}
source_url = {url}
source_filename = {crate}-{version}.tar.gz
source_hash = {hash}
""".lstrip()
parser = argparse.ArgumentParser()
parser.add_argument('lockfile', help='path to the Cargo lockfile to generate wraps from')
parser.add_argument('outdir', help="the 'subprojects' directory to write .wrap files to")
args = parser.parse_args()
with open(args.lockfile, 'rb') as f:
lock_toml = tomllib.load(f)
for dependency in lock_toml['package']:
try:
hash = dependency['checksum']
except KeyError:
# The base package, e.g. lix-doc, won't have a checksum, and conveniently
# the base package is also not something we want a wrap file for.
# Doesn't that work out nicely?
continue
crate = dependency['name']
version = dependency['version']
url = DOWNLOAD_URI_FORMAT.format(crate=crate, version=version)
wrap_text = WRAP_TEMPLATE.format(crate=crate, version=version, url=url, hash=hash)
with open(f'{args.outdir}/{crate}-rs.wrap', 'w') as f:
f.write(wrap_text)

View file

@ -0,0 +1,89 @@
#!/usr/bin/env python3
"""
Runs run-clang-tidy. A bit meta. Maybe it will replace run-clang-tidy one day
because the run-clang-tidy UX is so questionable.
"""
# I hereby dedicate this script to fuck you meson.
# I cannot simply write my code to invoke a subprocess in a meson file because
# Meson corrupts backslashes in command line args to subprocesses.
# This is allegedly for "Windows support", but last time I checked Windows
# neither needs nor wants you to corrupt its command lines.
# https://github.com/mesonbuild/meson/issues/1564
import multiprocessing
import subprocess
import os
import sys
from pathlib import Path
def default_concurrency():
return min(multiprocessing.cpu_count(),
int(os.environ.get("NIX_BUILD_CORES", "16")))
def go(exe: str, plugin_path: Path, compile_commands_json_dir: Path, jobs: int,
paths: list[Path], werror: bool, fix: bool):
args = [
# XXX: This explicitly invokes it with python because of a nixpkgs bug
# where clang-unwrapped does not patch interpreters in run-clang-tidy.
# However, making clang-unwrapped depend on python is also silly, so idk.
sys.executable,
exe,
'-quiet',
'-load',
plugin_path,
'-p',
compile_commands_json_dir,
'-j',
str(jobs),
'-header-filter',
r'src/[^/]+/.*\.hh'
]
if werror:
args += ['-warnings-as-errors', '*']
if fix:
args += ['-fix']
args += ['--']
args += paths
os.execvp(sys.executable, args)
def main():
import argparse
ap = argparse.ArgumentParser(description='Runs run-clang-tidy for you')
ap.add_argument('--jobs',
'-j',
type=int,
default=default_concurrency(),
help='Parallel linting jobs to run')
ap.add_argument('--plugin-path',
type=Path,
help='Path to the Lix clang-tidy plugin')
# FIXME: maybe we should integrate this so it just fixes the compdb for you and throws it in a tempdir?
ap.add_argument(
'--compdb-path',
type=Path,
help=
'Path to the directory containing the fixed-up compilation database from clean_compdb'
)
ap.add_argument('--werror',
action='store_true',
help='Warnings get turned into errors')
ap.add_argument('--fix',
action='store_true',
help='Apply fixes for warnings')
ap.add_argument('--run-clang-tidy-path',
default='run-clang-tidy',
help='Path to run-clang-tidy')
ap.add_argument('paths', nargs='*', help='Source paths to check')
args = ap.parse_args()
go(args.run_clang_tidy_path, args.plugin_path, args.compdb_path, args.jobs,
args.paths, args.werror, args.fix)
if __name__ == '__main__':
main()

View file

@ -13,8 +13,8 @@ def process_compdb(compdb: list[dict]) -> list[dict]:
out = []
eat_next = False
for i, arg in enumerate(args):
if arg == '-fpch-preprocess':
# as used with gcc
if arg in ['-fpch-preprocess', '-fpch-instantiate-templates']:
# -fpch-preprocess as used with gcc, -fpch-instantiate-templates as used by clang
continue
elif arg == '-include-pch' or (arg == '-include' and args[i + 1] == 'precompiled-headers.hh'):
# -include-pch some-pch (clang), or -include some-pch (gcc)
@ -30,7 +30,14 @@ def process_compdb(compdb: list[dict]) -> list[dict]:
item['command'] = shlex.join(munch_command(shlex.split(item['command'])))
return item
return [chomp(x) for x in compdb if not x['file'].endswith('precompiled-headers.hh')]
def cmdfilter(item: dict) -> bool:
file = item['file']
return (
not file.endswith('precompiled-headers.hh')
and not file.endswith('.rs')
)
return [chomp(x) for x in compdb if cmdfilter(x)]
def main():

View file

@ -58,26 +58,17 @@ build_all_generated_headers = custom_target(
if lix_clang_tidy_so_found
run_clang_tidy_args = [
'-load',
lix_clang_tidy_so,
'-p',
# We have to workaround a run-clang-tidy bug too, so we must give the
# directory name rather than the actual compdb file.
# https://github.com/llvm/llvm-project/issues/101440
meson.current_build_dir(),
'-quiet',
meson.current_source_dir() / 'clang-tidy-runner.py',
'--run-clang-tidy-path', run_clang_tidy,
'--compdb-path', meson.current_build_dir(),
'--plugin-path', lix_clang_tidy_so,
]
run_target(
'clang-tidy',
command : [
# XXX: This explicitly invokes it with python because of a nixpkgs bug
# where clang-unwrapped does not patch interpreters in run-clang-tidy.
# However, making clang-unwrapped depend on python is also silly, so idk.
python,
run_clang_tidy,
run_clang_tidy_args,
'-warnings-as-errors',
'*',
'--werror',
],
depends : [
build_all_generated_headers,
@ -87,9 +78,8 @@ if lix_clang_tidy_so_found
'clang-tidy-fix',
command : [
python,
run_clang_tidy,
run_clang_tidy_args,
'-fix',
'--fix',
],
depends : [
build_all_generated_headers,

View file

@ -41,6 +41,8 @@
pkg-config,
python3,
rapidcheck,
rustPlatform,
rustc,
sqlite,
toml11,
util-linuxMinimal ? utillinuxMinimal,
@ -49,9 +51,6 @@
busybox-sandbox-shell,
# internal fork of nix-doc providing :doc in the repl
lix-doc ? __forDefaults.lix-doc,
pname ? "lix",
versionSuffix ? "",
officialRelease ? __forDefaults.versionJson.official_release,
@ -83,7 +82,6 @@
configureFlags = prev.configureFlags or [ ] ++ [ (lib.enableFeature true "sigstop") ];
});
lix-doc = callPackage ./lix-doc/package.nix { };
build-release-notes = callPackage ./maintainers/build-release-notes.nix { };
},
}:
@ -139,6 +137,8 @@ let
./meson
./scripts/meson.build
./subprojects
# Required for meson to generate Cargo wraps
./Cargo.lock
]);
functionalTestFiles = fileset.unions [
@ -219,6 +219,7 @@ stdenv.mkDerivation (finalAttrs: {
meson
ninja
cmake
rustc
]
++ [
(lib.getBin lowdown)
@ -258,7 +259,6 @@ stdenv.mkDerivation (finalAttrs: {
lowdown
libsodium
toml11
lix-doc
pegtl
]
++ lib.optionals hostPlatform.isLinux [
@ -288,8 +288,15 @@ stdenv.mkDerivation (finalAttrs: {
env = {
BOOST_INCLUDEDIR = "${lib.getDev boost}/include";
BOOST_LIBRARYDIR = "${lib.getLib boost}/lib";
# Meson allows referencing a /usr/share/cargo/registry shaped thing for subproject sources.
# Turns out the Nix-generated Cargo dependencies are named the same as they
# would be in a Cargo registry cache.
MESON_PACKAGE_CACHE_DIR = finalAttrs.cargoDeps;
};
cargoDeps = rustPlatform.importCargoLock { lockFile = ./Cargo.lock; };
preConfigure =
lib.optionalString (!finalAttrs.dontBuild && !hostPlatform.isStatic) ''
# Copy libboost_context so we don't get all of Boost in our closure.
@ -425,6 +432,10 @@ stdenv.mkDerivation (finalAttrs: {
pre-commit-checks,
contribNotice,
check-syscalls,
# debuggers
gdb,
rr,
}:
let
glibcFix = lib.optionalAttrs (buildPlatform.isLinux && glibcLocales != null) {
@ -504,6 +515,8 @@ stdenv.mkDerivation (finalAttrs: {
]
++ lib.optional (pre-commit-checks ? enabledPackages) pre-commit-checks.enabledPackages
++ lib.optional (lib.meta.availableOn buildPlatform clangbuildanalyzer) clangbuildanalyzer
++ lib.optional (!stdenv.isDarwin) gdb
++ lib.optional (lib.meta.availableOn buildPlatform rr) rr
++ finalAttrs.checkInputs;
shellHook = ''

View file

@ -1,11 +1,7 @@
#include <cstdlib>
#include <cstring>
#include <algorithm>
#include <set>
#include <memory>
#include <string_view>
#include <tuple>
#include <iomanip>
#if __APPLE__
#include <sys/time.h>
#endif
@ -18,6 +14,7 @@
#include "build-result.hh"
#include "store-api.hh"
#include "derivations.hh"
#include "strings.hh"
#include "local-store.hh"
#include "legacy.hh"
#include "experimental-features.hh"

View file

@ -20,13 +20,15 @@ struct SingleBuiltPathBuilt {
DECLARE_CMP(SingleBuiltPathBuilt);
};
using _SingleBuiltPathRaw = std::variant<
namespace built_path::detail {
using SingleBuiltPathRaw = std::variant<
DerivedPathOpaque,
SingleBuiltPathBuilt
>;
}
struct SingleBuiltPath : _SingleBuiltPathRaw {
using Raw = _SingleBuiltPathRaw;
struct SingleBuiltPath : built_path::detail::SingleBuiltPathRaw {
using Raw = built_path::detail::SingleBuiltPathRaw;
using Raw::Raw;
using Opaque = DerivedPathOpaque;
@ -65,17 +67,19 @@ struct BuiltPathBuilt {
DECLARE_CMP(BuiltPathBuilt);
};
using _BuiltPathRaw = std::variant<
namespace built_path::detail {
using BuiltPathRaw = std::variant<
DerivedPath::Opaque,
BuiltPathBuilt
>;
}
/**
* A built path. Similar to a DerivedPath, but enriched with the corresponding
* output path(s).
*/
struct BuiltPath : _BuiltPathRaw {
using Raw = _BuiltPathRaw;
struct BuiltPath : built_path::detail::BuiltPathRaw {
using Raw = built_path::detail::BuiltPathRaw;
using Raw::Raw;
using Opaque = DerivedPathOpaque;

View file

@ -1,6 +1,7 @@
#include "editor-for.hh"
#include "environment-variables.hh"
#include "source-path.hh"
#include "strings.hh"
namespace nix {

View file

@ -50,7 +50,7 @@ libcmd = library(
editline,
lowdown,
nlohmann_json,
lix_doc
liblix_doc,
],
cpp_pch : cpp_pch,
install : true,

View file

@ -926,7 +926,7 @@ void NixRepl::loadFiles()
void NixRepl::loadReplOverlays()
{
if (!evalSettings.replOverlays) {
if (evalSettings.replOverlays.get().empty()) {
return;
}

View file

@ -1,9 +1,8 @@
#pragma once
///@file
#include <algorithm>
#include "error.hh"
#include "types.hh"
#include "pos-idx.hh"
namespace nix {

View file

@ -31,7 +31,7 @@ Value * EvalState::allocValue()
#endif
nrValues++;
return (Value *) p;
return static_cast<Value *>(p);
}
@ -54,10 +54,10 @@ Env & EvalState::allocEnv(size_t size)
void * p = *env1AllocCache;
*env1AllocCache = GC_NEXT(p);
GC_NEXT(p) = nullptr;
env = (Env *) p;
env = static_cast<Env *>(p);
} else
#endif
env = (Env *) gcAllocBytes(sizeof(Env) + size * sizeof(Value *));
env = static_cast<Env *>(gcAllocBytes(sizeof(Env) + size * sizeof(Value *)));
/* We assume that env->values has been cleared by the allocator; maybeThunk() and lookupVar fromWith expect this. */

View file

@ -151,7 +151,7 @@ struct EvalSettings : Config
This is useful for debugging warnings in third-party Nix code.
)"};
PathsSetting replOverlays{this, Paths(), "repl-overlays",
PathsSetting<Paths> replOverlays{this, Paths(), "repl-overlays",
R"(
A list of files containing Nix expressions that can be used to add
default bindings to [`nix

View file

@ -19,6 +19,7 @@
#include "gc-small-vector.hh"
#include "fetch-to-store.hh"
#include "flake/flakeref.hh"
#include "exit.hh"
#include <algorithm>
#include <iostream>
@ -249,6 +250,7 @@ EvalState::EvalState(
.findFile = symbols.create("__findFile"),
.nixPath = symbols.create("__nixPath"),
.body = symbols.create("body"),
.overrides = symbols.create("__overrides"),
}
, repair(NoRepair)
, emptyBindings(0)
@ -2710,20 +2712,29 @@ Expr & EvalState::parseExprFromFile(const SourcePath & path, std::shared_ptr<Sta
}
Expr & EvalState::parseExprFromString(std::string s_, const SourcePath & basePath, std::shared_ptr<StaticEnv> & staticEnv, const ExperimentalFeatureSettings & xpSettings)
Expr & EvalState::parseExprFromString(
std::string s_,
const SourcePath & basePath,
std::shared_ptr<StaticEnv> & staticEnv,
const FeatureSettings & featureSettings
)
{
// NOTE this method (and parseStdin) must take care to *fully copy* their input
// into their respective Pos::Origin until the parser stops overwriting its input
// data.
auto s = make_ref<std::string>(s_);
s_.append("\0\0", 2);
return *parse(s_.data(), s_.size(), Pos::String{.source = s}, basePath, staticEnv, xpSettings);
return *parse(s_.data(), s_.size(), Pos::String{.source = s}, basePath, staticEnv, featureSettings);
}
Expr & EvalState::parseExprFromString(std::string s, const SourcePath & basePath, const ExperimentalFeatureSettings & xpSettings)
Expr & EvalState::parseExprFromString(
std::string s,
const SourcePath & basePath,
const FeatureSettings & featureSettings
)
{
return parseExprFromString(std::move(s), basePath, staticBaseEnv, xpSettings);
return parseExprFromString(std::move(s), basePath, staticBaseEnv, featureSettings);
}

View file

@ -12,6 +12,7 @@
#include "experimental-features.hh"
#include "search-path.hh"
#include "repl-exit-status.hh"
#include "backed-string-view.hh"
#include <map>
#include <optional>
@ -344,8 +345,17 @@ public:
/**
* Parse a Nix expression from the specified string.
*/
Expr & parseExprFromString(std::string s, const SourcePath & basePath, std::shared_ptr<StaticEnv> & staticEnv, const ExperimentalFeatureSettings & xpSettings = experimentalFeatureSettings);
Expr & parseExprFromString(std::string s, const SourcePath & basePath, const ExperimentalFeatureSettings & xpSettings = experimentalFeatureSettings);
Expr & parseExprFromString(
std::string s,
const SourcePath & basePath,
std::shared_ptr<StaticEnv> & staticEnv,
const FeatureSettings & xpSettings = featureSettings
);
Expr & parseExprFromString(
std::string s,
const SourcePath & basePath,
const FeatureSettings & xpSettings = featureSettings
);
Expr & parseStdin();
@ -569,7 +579,7 @@ private:
Pos::Origin origin,
const SourcePath & basePath,
std::shared_ptr<StaticEnv> & staticEnv,
const ExperimentalFeatureSettings & xpSettings = experimentalFeatureSettings);
const FeatureSettings & xpSettings = featureSettings);
/**
* Current Nix call stack depth, used with `max-call-depth` setting to throw stack overflow hopefully before we run out of system stack.
@ -782,4 +792,4 @@ static constexpr std::string_view corepkgsPrefix{"/__corepkgs__/"};
}
#include "eval-inline.hh"
#include "eval-inline.hh" // IWYU pragma: keep

View file

@ -342,8 +342,21 @@ static void updateOverrides(std::map<InputPath, FlakeInput> & overrideMap, const
for (auto & [id, input] : overrides) {
auto inputPath(inputPathPrefix);
inputPath.push_back(id);
// Do not override existing assignment from outer flake
overrideMap.insert({inputPath, input});
/* Given
*
* { inputs.hydra.inputs.nix-eval-jobs.inputs.lix.follows = "lix"; }
*
* then `nix-eval-jobs` doesn't have an override.
* It's neither replaced using follows nor by a different
* URL. Thus no need to add it to overrides and thus re-fetch
* it.
*/
if (input.ref || input.follows) {
// Do not override existing assignment from outer flake
overrideMap.insert({inputPath, input});
}
updateOverrides(overrideMap, input.overrides, inputPath);
}
}

View file

@ -120,6 +120,7 @@ inline T * gcAllocType(size_t howMany = 1)
// However, people can and do request zero sized allocations, so we need
// to check that neither of our multiplicands were zero before complaining
// about it.
// NOLINTNEXTLINE(bugprone-sizeof-expression): yeah we only seem to alloc pointers with this. the calculation *is* correct though!
auto checkedSz = checked::Checked<size_t>(howMany) * sizeof(T);
size_t sz = checkedSz.valueWrapping();
if (checkedSz.overflowed()) {
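The NOLINT sits on a multiplication that is routed through `checked::Checked` so that `howMany * sizeof(T)` cannot silently wrap before being handed to the allocator. A hedged sketch of the same guard using the GCC/Clang `__builtin_mul_overflow` builtin instead of Lix's wrapper (`allocPointers` is an illustrative stand-in, not a Lix function):

#include <cstddef>
#include <cstdio>
#include <cstdlib>

// Overflow-checked sizing: refuse to allocate if howMany * sizeof(void *)
// does not fit in size_t, instead of wrapping and under-allocating.
static void ** allocPointers(std::size_t howMany)
{
    std::size_t bytes;
    if (__builtin_mul_overflow(howMany, sizeof(void *), &bytes)) {
        std::fprintf(stderr, "allocating %zu pointers would overflow size_t\n", howMany);
        std::abort();
    }
    // calloc zeroes the memory, similar to what the GC allocator guarantees.
    void ** p = static_cast<void **>(std::calloc(1, bytes ? bytes : 1));
    if (!p) std::abort();
    return p;
}

int main()
{
    void ** p = allocPointers(16);
    std::free(p);
}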

View file

@ -11,6 +11,7 @@
namespace nix {
ExprBlackHole eBlackHole;
Expr *eBlackHoleAddr = &eBlackHole;
// FIXME: remove, because *symbols* are abstract and do not have a single
// textual representation; see printIdentifier()

View file

@ -11,6 +11,7 @@
#include "eval-error.hh"
#include "pos-idx.hh"
#include "pos-table.hh"
#include "strings.hh"
namespace nix {
@ -48,7 +49,7 @@ protected:
public:
struct AstSymbols {
Symbol sub, lessThan, mul, div, or_, findFile, nixPath, body;
Symbol sub, lessThan, mul, div, or_, findFile, nixPath, body, overrides;
};

View file

@ -115,7 +115,7 @@ struct ExprState
std::unique_ptr<Expr> pipe(PosIdx pos, State & state, bool flip = false)
{
if (!state.xpSettings.isEnabled(Xp::PipeOperator))
if (!state.featureSettings.isEnabled(Xp::PipeOperator))
throw ParseError({
.msg = HintFmt("Pipe operator is disabled"),
.pos = state.positions[pos]
@ -656,10 +656,10 @@ template<> struct BuildAST<grammar::expr::path> : p::maybe_nothing {};
template<> struct BuildAST<grammar::expr::uri> {
static void apply(const auto & in, ExprState & s, State & ps) {
bool noURLLiterals = ps.xpSettings.isEnabled(Xp::NoUrlLiterals);
if (noURLLiterals)
bool URLLiterals = ps.featureSettings.isEnabled(Dep::UrlLiterals);
if (!URLLiterals)
throw ParseError({
.msg = HintFmt("URL literals are disabled"),
.msg = HintFmt("URL literals are deprecated, allow using them with --extra-deprecated-features=url-literals"),
.pos = ps.positions[ps.at(in)]
});
s.pushExpr<ExprString>(ps.at(in), in.string());
@ -668,6 +668,16 @@ template<> struct BuildAST<grammar::expr::uri> {
template<> struct BuildAST<grammar::expr::ancient_let> : change_head<BindingsState> {
static void success(const auto & in, BindingsState & b, ExprState & s, State & ps) {
// Added 2024-09-18. Turn into an error at some point in the future.
// See the documentation on deprecated features for more details.
if (!ps.featureSettings.isEnabled(Dep::AncientLet))
warn(
"%s found at %s. This feature is deprecated and will be removed in the future. Use %s to silence this warning.",
"let {",
ps.positions[ps.at(in)],
"--extra-deprecated-features ancient-let"
);
b.attrs.pos = ps.at(in);
b.attrs.recursive = true;
s.pushExpr<ExprSelect>(b.attrs.pos, b.attrs.pos, std::make_unique<ExprAttrs>(std::move(b.attrs)), ps.s.body);
@ -676,6 +686,12 @@ template<> struct BuildAST<grammar::expr::ancient_let> : change_head<BindingsSta
template<> struct BuildAST<grammar::expr::rec_set> : change_head<BindingsState> {
static void success(const auto & in, BindingsState & b, ExprState & s, State & ps) {
// Before inserting new attrs, check for __overrides and throw an error
// (the error will initially be a warning to ease migration)
if (!featureSettings.isEnabled(Dep::RecSetOverrides) && b.attrs.attrs.contains(ps.s.overrides)) {
ps.overridesFound(ps.at(in));
}
b.attrs.pos = ps.at(in);
b.attrs.recursive = true;
s.pushExpr<ExprAttrs>(b.attrs.pos, std::move(b.attrs));
@ -858,7 +874,7 @@ Expr * EvalState::parse(
Pos::Origin origin,
const SourcePath & basePath,
std::shared_ptr<StaticEnv> & staticEnv,
const ExperimentalFeatureSettings & xpSettings)
const FeatureSettings & featureSettings)
{
parser::State s = {
symbols,
@ -866,7 +882,7 @@ Expr * EvalState::parse(
basePath,
positions.addOrigin(origin, length),
exprSymbols,
xpSettings
featureSettings,
};
parser::ExprState x;

View file

@ -2,6 +2,7 @@
///@file
#include "eval.hh"
#include "logging.hh"
namespace nix::parser {
@ -19,10 +20,11 @@ struct State
SourcePath basePath;
PosTable::Origin origin;
const Expr::AstSymbols & s;
const ExperimentalFeatureSettings & xpSettings;
const FeatureSettings & featureSettings;
void dupAttr(const AttrPath & attrPath, const PosIdx pos, const PosIdx prevPos);
void dupAttr(Symbol attr, const PosIdx pos, const PosIdx prevPos);
void overridesFound(const PosIdx pos);
void addAttr(ExprAttrs * attrs, AttrPath && attrPath, std::unique_ptr<Expr> e, const PosIdx pos);
std::unique_ptr<Formals> validateFormals(std::unique_ptr<Formals> formals, PosIdx pos = noPos, Symbol arg = {});
std::unique_ptr<Expr> stripIndentation(const PosIdx pos,
@ -58,6 +60,17 @@ inline void State::dupAttr(Symbol attr, const PosIdx pos, const PosIdx prevPos)
});
}
inline void State::overridesFound(const PosIdx pos) {
// Added 2024-09-18. Turn into an error at some point in the future.
// See the documentation on deprecated features for more details.
warn(
"%s found at %s. This feature is deprecated and will be removed in the future. Use %s to silence this warning.",
"__overrides",
positions[pos],
"--extra-deprecated-features rec-set-overrides"
);
}
inline void State::addAttr(ExprAttrs * attrs, AttrPath && attrPath, std::unique_ptr<Expr> e, const PosIdx pos)
{
AttrPath::iterator i;
@ -123,6 +136,12 @@ inline void State::addAttr(ExprAttrs * attrs, AttrPath && attrPath, std::unique_
dupAttr(attrPath, pos, j->second.pos);
}
} else {
// Before inserting new attrs, check for __override and throw an error
// (the error will initially be a warning to ease migration)
if (attrs->recursive && !featureSettings.isEnabled(Dep::RecSetOverrides) && i->symbol == s.overrides) {
overridesFound(pos);
}
// This attr path is not defined. Let's create it.
e->setName(i->symbol);
attrs->attrs.emplace(std::piecewise_construct,

View file

@ -136,7 +136,9 @@ class ExternalValueBase
std::ostream & operator << (std::ostream & str, const ExternalValueBase & v);
extern ExprBlackHole eBlackHole;
/** This is just the address of eBlackHole. It exists because eBlackHole has an
* incomplete type at usage sites, so it is not possible to cast. */
extern Expr *eBlackHoleAddr;
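A pre-converted `Expr *` is needed because a derived-to-base pointer conversion requires the derived class to be complete; doing the conversion once, in the `.cc` file that defines `ExprBlackHole`, removes the cast from the header entirely. A compressed single-file sketch of the idea, with illustrative names:

// --- header-like part ---
struct Expr2 { virtual ~Expr2() = default; };
struct BlackHole2;                 // incomplete here
extern BlackHole2 blackHole2;      // declaring an object of incomplete type is fine
extern Expr2 * blackHole2Addr;     // already an Expr2 *, usable anywhere
// A `(Expr2 *) &blackHole2` here would have to fall back to a
// reinterpret_cast-style conversion, because the compiler cannot yet apply
// the derived-to-base adjustment.

// --- .cc-like part ---
struct BlackHole2 : Expr2 {};
BlackHole2 blackHole2;
Expr2 * blackHole2Addr = &blackHole2;   // safe: BlackHole2 is complete here

int main() { return blackHole2Addr == &blackHole2 ? 0 : 1; }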
struct NewValueAs
{
@ -196,6 +198,7 @@ private:
public:
// Discount `using NewValueAs::*;`
// NOLINTNEXTLINE(bugprone-macro-parentheses)
#define USING_VALUETYPE(name) using name = NewValueAs::name
USING_VALUETYPE(integer_t);
USING_VALUETYPE(floating_t);
@ -473,7 +476,7 @@ public:
/// Constructs an evil thunk, whose evaluation represents infinite recursion.
explicit Value(blackhole_t)
: internalType(tThunk)
, thunk({ .env = nullptr, .expr = reinterpret_cast<Expr *>(&eBlackHole) })
, thunk({ .env = nullptr, .expr = eBlackHoleAddr })
{ }
Value(Value const & rhs) = default;
@ -513,7 +516,10 @@ public:
// type() == nThunk
inline bool isThunk() const { return internalType == tThunk; };
inline bool isApp() const { return internalType == tApp; };
inline bool isBlackhole() const;
inline bool isBlackhole() const
{
return internalType == tThunk && thunk.expr == eBlackHoleAddr;
}
// type() == nFunction
inline bool isLambda() const { return internalType == tLambda; };
@ -669,11 +675,6 @@ public:
void mkStringMove(const char * s, const NixStringContext & context);
inline void mkString(const Symbol & s)
{
mkString(((const std::string &) s).c_str());
}
void mkPath(const SourcePath & path);
inline void mkPath(const char * path)
@ -732,7 +733,11 @@ public:
lambda.fun = f;
}
inline void mkBlackhole();
inline void mkBlackhole()
{
internalType = tThunk;
thunk.expr = eBlackHoleAddr;
}
void mkPrimOp(PrimOp * p);
@ -832,18 +837,6 @@ public:
}
};
bool Value::isBlackhole() const
{
return internalType == tThunk && thunk.expr == (Expr*) &eBlackHole;
}
void Value::mkBlackhole()
{
internalType = tThunk;
thunk.expr = (Expr*) &eBlackHole;
}
using ValueVector = GcVector<Value *>;
using ValueMap = GcMap<Symbol, Value *>;
using ValueVectorMap = std::map<Symbol, ValueVector>;

View file

@ -7,6 +7,8 @@
#include "path.hh"
#include "attrs.hh"
#include "url.hh"
#include "ref.hh"
#include "strings.hh"
#include <memory>

View file

@ -1,8 +1,9 @@
#include "progress-bar.hh"
#include "file-system.hh"
#include "sync.hh"
#include "store-api.hh"
#include "names.hh"
#include "terminal.hh"
#include "strings.hh"
#include <map>
#include <thread>

View file

@ -6,6 +6,8 @@
#include "loggers.hh"
#include "current-process.hh"
#include "terminal.hh"
#include "strings.hh"
#include "exit.hh"
#include <algorithm>
#include <exception>

View file

@ -7,12 +7,10 @@
#include "path.hh"
#include "derived-path.hh"
#include "processes.hh"
#include "exit.hh"
#include "strings.hh"
#include <signal.h>
#include <locale>
namespace nix {

View file

@ -10,6 +10,7 @@
#include "nar-accessor.hh"
#include "thread-pool.hh"
#include "signals.hh"
#include "strings.hh"
#include <chrono>
#include <regex>

View file

@ -1,22 +1,15 @@
#include "derivation-goal.hh"
#include "hook-instance.hh"
#include "worker.hh"
#include "builtins.hh"
#include "builtins/buildenv.hh"
#include "references.hh"
#include "finally.hh"
#include "archive.hh"
#include "compression.hh"
#include "common-protocol.hh"
#include "common-protocol-impl.hh"
#include "topo-sort.hh"
#include "common-protocol-impl.hh" // IWYU pragma: keep
#include "local-store.hh" // TODO remove, along with remaining downcasts
#include "logging-json.hh"
#include "substitution-goal.hh"
#include "drv-output-substitution-goal.hh"
#include <regex>
#include <queue>
#include "strings.hh"
#include <fstream>
#include <sys/types.h>
@ -66,7 +59,7 @@ namespace nix {
DerivationGoal::DerivationGoal(const StorePath & drvPath,
const OutputsSpec & wantedOutputs, Worker & worker, BuildMode buildMode)
: Goal(worker, DerivedPath::Built { .drvPath = makeConstantStorePathRef(drvPath), .outputs = wantedOutputs })
: Goal(worker)
, useDerivation(true)
, drvPath(drvPath)
, wantedOutputs(wantedOutputs)
@ -84,7 +77,7 @@ DerivationGoal::DerivationGoal(const StorePath & drvPath,
DerivationGoal::DerivationGoal(const StorePath & drvPath, const BasicDerivation & drv,
const OutputsSpec & wantedOutputs, Worker & worker, BuildMode buildMode)
: Goal(worker, DerivedPath::Built { .drvPath = makeConstantStorePathRef(drvPath), .outputs = wantedOutputs })
: Goal(worker)
, useDerivation(false)
, drvPath(drvPath)
, wantedOutputs(wantedOutputs)
@ -127,6 +120,7 @@ std::string DerivationGoal::key()
void DerivationGoal::killChild()
{
hook.reset();
builderOutFD = nullptr;
}
@ -137,9 +131,9 @@ Goal::Finished DerivationGoal::timedOut(Error && ex)
}
Goal::WorkResult DerivationGoal::work()
Goal::WorkResult DerivationGoal::work(bool inBuildSlot)
{
return (this->*state)();
return (this->*state)(inBuildSlot);
}
void DerivationGoal::addWantedOutputs(const OutputsSpec & outputs)
@ -163,7 +157,7 @@ void DerivationGoal::addWantedOutputs(const OutputsSpec & outputs)
}
Goal::WorkResult DerivationGoal::getDerivation()
Goal::WorkResult DerivationGoal::getDerivation(bool inBuildSlot)
{
trace("init");
@ -171,7 +165,7 @@ Goal::WorkResult DerivationGoal::getDerivation()
exists. If it doesn't, it may be created through a
substitute. */
if (buildMode == bmNormal && worker.evalStore.isValidPath(drvPath)) {
return loadDerivation();
return loadDerivation(inBuildSlot);
}
@ -180,7 +174,7 @@ Goal::WorkResult DerivationGoal::getDerivation()
}
Goal::WorkResult DerivationGoal::loadDerivation()
Goal::WorkResult DerivationGoal::loadDerivation(bool inBuildSlot)
{
trace("loading derivation");
@ -207,11 +201,11 @@ Goal::WorkResult DerivationGoal::loadDerivation()
}
assert(drv);
return haveDerivation();
return haveDerivation(inBuildSlot);
}
Goal::WorkResult DerivationGoal::haveDerivation()
Goal::WorkResult DerivationGoal::haveDerivation(bool inBuildSlot)
{
trace("have derivation");
@ -239,7 +233,7 @@ Goal::WorkResult DerivationGoal::haveDerivation()
});
}
return gaveUpOnSubstitution();
return gaveUpOnSubstitution(inBuildSlot);
}
for (auto & i : drv->outputsAndOptPaths(worker.store))
@ -268,34 +262,39 @@ Goal::WorkResult DerivationGoal::haveDerivation()
through substitutes. If that doesn't work, we'll build
them. */
WaitForGoals result;
if (settings.useSubstitutes && parsedDrv->substitutesAllowed())
for (auto & [outputName, status] : initialOutputs) {
if (!status.wanted) continue;
if (!status.known)
result.goals.insert(
worker.makeDrvOutputSubstitutionGoal(
DrvOutput{status.outputHash, outputName},
buildMode == bmRepair ? Repair : NoRepair
)
);
else {
auto * cap = getDerivationCA(*drv);
result.goals.insert(worker.makePathSubstitutionGoal(
status.known->path,
buildMode == bmRepair ? Repair : NoRepair,
cap ? std::optional { *cap } : std::nullopt));
if (settings.useSubstitutes) {
if (parsedDrv->substitutesAllowed()) {
for (auto & [outputName, status] : initialOutputs) {
if (!status.wanted) continue;
if (!status.known)
result.goals.insert(
worker.makeDrvOutputSubstitutionGoal(
DrvOutput{status.outputHash, outputName},
buildMode == bmRepair ? Repair : NoRepair
)
);
else {
auto * cap = getDerivationCA(*drv);
result.goals.insert(worker.makePathSubstitutionGoal(
status.known->path,
buildMode == bmRepair ? Repair : NoRepair,
cap ? std::optional { *cap } : std::nullopt));
}
}
} else {
trace("skipping substitute because allowSubstitutes is false");
}
}
if (result.goals.empty()) { /* to prevent hang (no wake-up event) */
return outputsSubstitutionTried();
return outputsSubstitutionTried(inBuildSlot);
} else {
state = &DerivationGoal::outputsSubstitutionTried;
return result;
}
}
Goal::WorkResult DerivationGoal::outputsSubstitutionTried()
Goal::WorkResult DerivationGoal::outputsSubstitutionTried(bool inBuildSlot)
{
trace("all outputs substituted (maybe)");
@ -338,7 +337,7 @@ Goal::WorkResult DerivationGoal::outputsSubstitutionTried()
if (needRestart == NeedRestartForMoreOutputs::OutputsAddedDoNeed) {
needRestart = NeedRestartForMoreOutputs::OutputsUnmodifedDontNeed;
return haveDerivation();
return haveDerivation(inBuildSlot);
}
auto [allValid, validOutputs] = checkPathValidity();
@ -354,13 +353,13 @@ Goal::WorkResult DerivationGoal::outputsSubstitutionTried()
worker.store.printStorePath(drvPath));
/* Nothing to wait for; tail call */
return gaveUpOnSubstitution();
return gaveUpOnSubstitution(inBuildSlot);
}
/* At least one of the output paths could not be
produced using a substitute. So we have to build instead. */
Goal::WorkResult DerivationGoal::gaveUpOnSubstitution()
Goal::WorkResult DerivationGoal::gaveUpOnSubstitution(bool inBuildSlot)
{
WaitForGoals result;
@ -424,7 +423,7 @@ Goal::WorkResult DerivationGoal::gaveUpOnSubstitution()
}
if (result.goals.empty()) {/* to prevent hang (no wake-up event) */
return inputsRealised();
return inputsRealised(inBuildSlot);
} else {
state = &DerivationGoal::inputsRealised;
return result;
@ -495,7 +494,7 @@ Goal::WorkResult DerivationGoal::repairClosure()
}
Goal::WorkResult DerivationGoal::closureRepaired()
Goal::WorkResult DerivationGoal::closureRepaired(bool inBuildSlot)
{
trace("closure repaired");
if (nrFailed > 0)
@ -505,7 +504,7 @@ Goal::WorkResult DerivationGoal::closureRepaired()
}
Goal::WorkResult DerivationGoal::inputsRealised()
Goal::WorkResult DerivationGoal::inputsRealised(bool inBuildSlot)
{
trace("all inputs realised");
@ -519,7 +518,7 @@ Goal::WorkResult DerivationGoal::inputsRealised()
if (retrySubstitution == RetrySubstitution::YesNeed) {
retrySubstitution = RetrySubstitution::AlreadyRetried;
return haveDerivation();
return haveDerivation(inBuildSlot);
}
/* Gather information necessary for computing the closure and/or
@ -653,7 +652,7 @@ Goal::WorkResult DerivationGoal::inputsRealised()
return ContinueImmediately{};
}
Goal::WorkResult DerivationGoal::started()
void DerivationGoal::started()
{
auto msg = fmt(
buildMode == bmRepair ? "repairing outputs of '%s'" :
@ -664,10 +663,9 @@ Goal::WorkResult DerivationGoal::started()
act = std::make_unique<Activity>(*logger, lvlInfo, actBuild, msg,
Logger::Fields{worker.store.printStorePath(drvPath), hook ? machineName : "", 1, 1});
mcRunningBuilds = std::make_unique<MaintainCount<uint64_t>>(worker.runningBuilds);
return StillAlive{};
}
Goal::WorkResult DerivationGoal::tryToBuild()
Goal::WorkResult DerivationGoal::tryToBuild(bool inBuildSlot)
{
trace("trying to build");
@ -739,25 +737,35 @@ Goal::WorkResult DerivationGoal::tryToBuild()
&& settings.maxBuildJobs.get() != 0;
if (!buildLocally) {
switch (tryBuildHook()) {
case rpAccept:
/* Yes, it has started doing so. Wait until we get
EOF from the hook. */
actLock.reset();
buildResult.startTime = time(0); // inexact
state = &DerivationGoal::buildDone;
return started();
case rpPostpone:
/* Not now; wait until at least one child finishes or
the wake-up timeout expires. */
if (!actLock)
actLock = std::make_unique<Activity>(*logger, lvlTalkative, actBuildWaiting,
fmt("waiting for a machine to build '%s'", Magenta(worker.store.printStorePath(drvPath))));
outputLocks.unlock();
return WaitForAWhile{};
case rpDecline:
/* We should do it ourselves. */
break;
auto hookReply = tryBuildHook(inBuildSlot);
auto result = std::visit(
overloaded{
[&](HookReply::Accept & a) -> std::optional<WorkResult> {
/* Yes, it has started doing so. Wait until we get
EOF from the hook. */
actLock.reset();
buildResult.startTime = time(0); // inexact
state = &DerivationGoal::buildDone;
started();
return WaitForWorld{std::move(a.fds), false};
},
[&](HookReply::Postpone) -> std::optional<WorkResult> {
/* Not now; wait until at least one child finishes or
the wake-up timeout expires. */
if (!actLock)
actLock = std::make_unique<Activity>(*logger, lvlTalkative, actBuildWaiting,
fmt("waiting for a machine to build '%s'", Magenta(worker.store.printStorePath(drvPath))));
outputLocks.unlock();
return WaitForAWhile{};
},
[&](HookReply::Decline) -> std::optional<WorkResult> {
/* We should do it ourselves. */
return std::nullopt;
},
},
hookReply);
if (result) {
return std::move(*result);
}
}
@ -767,7 +775,7 @@ Goal::WorkResult DerivationGoal::tryToBuild()
return ContinueImmediately{};
}
Goal::WorkResult DerivationGoal::tryLocalBuild() {
Goal::WorkResult DerivationGoal::tryLocalBuild(bool inBuildSlot) {
throw Error(
"unable to build with a primary store that isn't a local store; "
"either pass a different '--store' or enable remote builds."
@ -822,14 +830,16 @@ void replaceValidPath(const Path & storePath, const Path & tmpPath)
int DerivationGoal::getChildStatus()
{
builderOutFD = nullptr;
return hook->pid.kill();
}
void DerivationGoal::closeReadPipes()
{
hook->builderOut.readSide.reset();
hook->fromHook.readSide.reset();
hook->builderOut.reset();
hook->fromHook.reset();
builderOutFD = nullptr;
}
@ -925,7 +935,7 @@ void runPostBuildHook(
proc.getStdout()->drainInto(sink);
}
Goal::WorkResult DerivationGoal::buildDone()
Goal::WorkResult DerivationGoal::buildDone(bool inBuildSlot)
{
trace("build done");
@ -1045,7 +1055,7 @@ Goal::WorkResult DerivationGoal::buildDone()
}
}
Goal::WorkResult DerivationGoal::resolvedFinished()
Goal::WorkResult DerivationGoal::resolvedFinished(bool inBuildSlot)
{
trace("resolved derivation finished");
@ -1116,9 +1126,9 @@ Goal::WorkResult DerivationGoal::resolvedFinished()
return done(status, std::move(builtOutputs));
}
HookReply DerivationGoal::tryBuildHook()
HookReply DerivationGoal::tryBuildHook(bool inBuildSlot)
{
if (!worker.hook.available || !useDerivation) return rpDecline;
if (!worker.hook.available || !useDerivation) return HookReply::Decline{};
if (!worker.hook.instance)
worker.hook.instance = std::make_unique<HookInstance>();
@ -1128,7 +1138,7 @@ HookReply DerivationGoal::tryBuildHook()
/* Send the request to the hook. */
worker.hook.instance->sink
<< "try"
<< (worker.getNrLocalBuilds() < settings.maxBuildJobs ? 1 : 0)
<< (inBuildSlot ? 1 : 0)
<< drv->platform
<< worker.store.printStorePath(drvPath)
<< parsedDrv->getRequiredSystemFeatures();
@ -1140,7 +1150,7 @@ HookReply DerivationGoal::tryBuildHook()
while (true) {
auto s = [&]() {
try {
return readLine(worker.hook.instance->fromHook.readSide.get());
return readLine(worker.hook.instance->fromHook.get());
} catch (Error & e) {
e.addTrace({}, "while reading the response from the build hook");
throw;
@ -1161,14 +1171,14 @@ HookReply DerivationGoal::tryBuildHook()
debug("hook reply is '%1%'", reply);
if (reply == "decline")
return rpDecline;
return HookReply::Decline{};
else if (reply == "decline-permanently") {
worker.hook.available = false;
worker.hook.instance.reset();
return rpDecline;
return HookReply::Decline{};
}
else if (reply == "postpone")
return rpPostpone;
return HookReply::Postpone{};
else if (reply != "accept")
throw Error("bad hook reply '%s'", reply);
@ -1176,9 +1186,9 @@ HookReply DerivationGoal::tryBuildHook()
if (e.errNo == EPIPE) {
printError(
"build hook died unexpectedly: %s",
chomp(drainFD(worker.hook.instance->fromHook.readSide.get())));
chomp(drainFD(worker.hook.instance->fromHook.get())));
worker.hook.instance.reset();
return rpDecline;
return HookReply::Decline{};
} else
throw;
}
@ -1186,7 +1196,7 @@ HookReply DerivationGoal::tryBuildHook()
hook = std::move(worker.hook.instance);
try {
machineName = readLine(hook->fromHook.readSide.get());
machineName = readLine(hook->fromHook.get());
} catch (Error & e) {
e.addTrace({}, "while reading the machine name from the build hook");
throw;
@ -1209,17 +1219,17 @@ HookReply DerivationGoal::tryBuildHook()
}
hook->sink = FdSink();
hook->toHook.writeSide.reset();
hook->toHook.reset();
/* Create the log file and pipe. */
Path logFile = openLogFile();
std::set<int> fds;
fds.insert(hook->fromHook.readSide.get());
fds.insert(hook->builderOut.readSide.get());
worker.childStarted(shared_from_this(), fds, false, false);
fds.insert(hook->fromHook.get());
fds.insert(hook->builderOut.get());
builderOutFD = &hook->builderOut;
return rpAccept;
return HookReply::Accept{std::move(fds)};
}
@ -1279,24 +1289,23 @@ void DerivationGoal::closeLogFile()
}
bool DerivationGoal::isReadDesc(int fd)
{
return fd == hook->builderOut.readSide.get();
}
Goal::WorkResult DerivationGoal::handleChildOutput(int fd, std::string_view data)
{
assert(builderOutFD);
auto tooMuchLogs = [&] {
killChild();
return done(
BuildResult::LogLimitExceeded, {},
Error("%s killed after writing more than %d bytes of log output",
getName(), settings.maxLogSize));
};
// local & `ssh://`-builds are dealt with here.
auto isWrittenToLog = isReadDesc(fd);
if (isWrittenToLog)
{
if (fd == builderOutFD->get()) {
logSize += data.size();
if (settings.maxLogSize && logSize > settings.maxLogSize) {
killChild();
return done(
BuildResult::LogLimitExceeded, {},
Error("%s killed after writing more than %d bytes of log output",
getName(), settings.maxLogSize));
return tooMuchLogs();
}
for (auto c : data)
@ -1311,9 +1320,10 @@ Goal::WorkResult DerivationGoal::handleChildOutput(int fd, std::string_view data
}
if (logSink) (*logSink)(data);
return StillAlive{};
}
if (hook && fd == hook->fromHook.readSide.get()) {
if (hook && fd == hook->fromHook.get()) {
for (auto c : data)
if (c == '\n') {
auto json = parseJSONMessage(currentHookLine);
@ -1321,11 +1331,17 @@ Goal::WorkResult DerivationGoal::handleChildOutput(int fd, std::string_view data
auto s = handleJSONLogMessage(*json, worker.act, hook->activities, true);
// ensure that logs from a builder using `ssh-ng://` as protocol
// are also available to `nix log`.
if (s && !isWrittenToLog && logSink) {
if (s && logSink) {
const auto type = (*json)["type"];
const auto fields = (*json)["fields"];
if (type == resBuildLogLine) {
(*logSink)((fields.size() > 0 ? fields[0].get<std::string>() : "") + "\n");
const std::string logLine =
(fields.size() > 0 ? fields[0].get<std::string>() : "") + "\n";
logSize += logLine.size();
if (settings.maxLogSize && logSize > settings.maxLogSize) {
return tooMuchLogs();
}
(*logSink)(logLine);
} else if (type == resSetPhase && ! fields.is_null()) {
const auto phase = fields[0];
if (! phase.is_null()) {
@ -1530,7 +1546,7 @@ Goal::Finished DerivationGoal::done(
return Finished{
.result = buildResult.success() ? ecSuccess : ecFailed,
.ex = ex ? std::make_unique<Error>(std::move(*ex)) : nullptr,
.ex = ex ? std::make_shared<Error>(std::move(*ex)) : nullptr,
.permanentFailure = buildResult.status == BuildResult::PermanentFailure,
.timedOut = buildResult.status == BuildResult::TimedOut,
.hashMismatch = anyHashMismatchSeen,

View file

@ -14,7 +14,21 @@ using std::map;
struct HookInstance;
typedef enum {rpAccept, rpDecline, rpPostpone} HookReply;
struct HookReplyBase {
struct [[nodiscard]] Accept {
std::set<int> fds;
};
struct [[nodiscard]] Decline {};
struct [[nodiscard]] Postpone {};
};
struct [[nodiscard]] HookReply
: HookReplyBase,
std::variant<HookReplyBase::Accept, HookReplyBase::Decline, HookReplyBase::Postpone>
{
HookReply() = delete;
using variant::variant;
};
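The old tri-state enum (`rpAccept`, `rpDecline`, `rpPostpone`) could not carry the accepted file descriptors, so the reply becomes a variant of small payload structs that callers destructure with `std::visit` plus the usual `overloaded` helper, as `tryToBuild` does above. A minimal, self-contained sketch of that pattern, with illustrative names rather than Lix's:

#include <iostream>
#include <set>
#include <variant>

// Illustrative stand-ins for the reply alternatives; only Accept carries data.
struct Accept { std::set<int> fds; };
struct Decline {};
struct Postpone {};
using Reply = std::variant<Accept, Decline, Postpone>;

// The classic `overloaded` visitor helper (C++17).
template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

int main()
{
    Reply reply = Accept{{3, 4}};
    std::visit(overloaded{
        [](Accept & a)  { std::cout << "accepted, watching " << a.fds.size() << " fds\n"; },
        [](Decline &)   { std::cout << "declined\n"; },
        [](Postpone &)  { std::cout << "postponed\n"; },
    }, reply);
}

Compared with the enum, every caller is now forced to handle all three alternatives, and the accepted fds travel inside the reply instead of through out-of-band state.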
/**
* Unless we are repairing, we don't bother to test validity and just assume it,
@ -186,12 +200,19 @@ struct DerivationGoal : public Goal
*/
std::unique_ptr<HookInstance> hook;
/**
* Builder output is pulled from this file descriptor when not null.
* Owned by the derivation goal or subclass, must not be reset until
* the build has finished and no more output must be processed by us
*/
AutoCloseFD * builderOutFD = nullptr;
/**
* The sort of derivation we are building.
*/
std::optional<DerivationType> derivationType;
typedef WorkResult (DerivationGoal::*GoalState)();
typedef WorkResult (DerivationGoal::*GoalState)(bool inBuildSlot);
GoalState state;
BuildMode buildMode;
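Every goal keeps a `GoalState` pointer-to-member-function and now threads the `inBuildSlot` flag through it; `work(bool)` simply dispatches to whatever state the goal is currently in. A minimal sketch of that dispatch mechanism, independent of the real Goal hierarchy:

#include <iostream>

// Pointer-to-member-function state machine: work(bool) forwards to the
// current state, and each state may switch `state` before returning.
// Illustrative names only.
class MiniGoal
{
    typedef int (MiniGoal::*GoalState)(bool inBuildSlot);
    GoalState state = &MiniGoal::init;

    int init(bool inBuildSlot)
    {
        std::cout << "init, slot=" << inBuildSlot << "\n";
        state = &MiniGoal::build;   // the next work() call lands in build()
        return 0;
    }

    int build(bool inBuildSlot)
    {
        std::cout << "build, slot=" << inBuildSlot << "\n";
        return 1;
    }

public:
    int work(bool inBuildSlot) { return (this->*state)(inBuildSlot); }
};

int main()
{
    MiniGoal g;
    g.work(false);   // runs init
    g.work(true);    // runs build
}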
@ -224,7 +245,7 @@ struct DerivationGoal : public Goal
std::string key() override;
WorkResult work() override;
WorkResult work(bool inBuildSlot) override;
/**
* Add wanted outputs to an already existing derivation goal.
@ -234,23 +255,23 @@ struct DerivationGoal : public Goal
/**
* The states.
*/
WorkResult getDerivation();
WorkResult loadDerivation();
WorkResult haveDerivation();
WorkResult outputsSubstitutionTried();
WorkResult gaveUpOnSubstitution();
WorkResult closureRepaired();
WorkResult inputsRealised();
WorkResult tryToBuild();
virtual WorkResult tryLocalBuild();
WorkResult buildDone();
WorkResult getDerivation(bool inBuildSlot);
WorkResult loadDerivation(bool inBuildSlot);
WorkResult haveDerivation(bool inBuildSlot);
WorkResult outputsSubstitutionTried(bool inBuildSlot);
WorkResult gaveUpOnSubstitution(bool inBuildSlot);
WorkResult closureRepaired(bool inBuildSlot);
WorkResult inputsRealised(bool inBuildSlot);
WorkResult tryToBuild(bool inBuildSlot);
virtual WorkResult tryLocalBuild(bool inBuildSlot);
WorkResult buildDone(bool inBuildSlot);
WorkResult resolvedFinished();
WorkResult resolvedFinished(bool inBuildSlot);
/**
* Is the build hook willing to perform the build?
*/
HookReply tryBuildHook();
HookReply tryBuildHook(bool inBuildSlot);
virtual int getChildStatus();
@ -290,8 +311,6 @@ struct DerivationGoal : public Goal
virtual void cleanupPostOutputsRegisteredModeCheck();
virtual void cleanupPostOutputsRegisteredModeNonCheck();
virtual bool isReadDesc(int fd);
/**
* Callback used by the worker to write to the log.
*/
@ -328,7 +347,7 @@ struct DerivationGoal : public Goal
WorkResult repairClosure();
WorkResult started();
void started();
Finished done(
BuildResult::Status status,

View file

@ -11,7 +11,7 @@ DrvOutputSubstitutionGoal::DrvOutputSubstitutionGoal(
Worker & worker,
RepairFlag repair,
std::optional<ContentAddress> ca)
: Goal(worker, DerivedPath::Opaque { StorePath::dummy })
: Goal(worker)
, id(id)
{
state = &DrvOutputSubstitutionGoal::init;
@ -20,7 +20,7 @@ DrvOutputSubstitutionGoal::DrvOutputSubstitutionGoal(
}
Goal::WorkResult DrvOutputSubstitutionGoal::init()
Goal::WorkResult DrvOutputSubstitutionGoal::init(bool inBuildSlot)
{
trace("init");
@ -30,17 +30,14 @@ Goal::WorkResult DrvOutputSubstitutionGoal::init()
}
subs = settings.useSubstitutes ? getDefaultSubstituters() : std::list<ref<Store>>();
return tryNext();
return tryNext(inBuildSlot);
}
Goal::WorkResult DrvOutputSubstitutionGoal::tryNext()
Goal::WorkResult DrvOutputSubstitutionGoal::tryNext(bool inBuildSlot)
{
trace("trying next substituter");
/* Make sure that we are allowed to start a substitution. Note that even
if maxSubstitutionJobs == 0, we still allow a substituter to run. This
prevents infinite waiting. */
if (worker.runningSubstitutions >= std::max(1U, settings.maxSubstitutionJobs.get())) {
if (!inBuildSlot) {
return WaitForSlot{};
}
@ -78,13 +75,11 @@ Goal::WorkResult DrvOutputSubstitutionGoal::tryNext()
return sub->queryRealisation(id);
});
worker.childStarted(shared_from_this(), {downloadState->outPipe.readSide.get()}, true, false);
state = &DrvOutputSubstitutionGoal::realisationFetched;
return StillAlive{};
return WaitForWorld{{downloadState->outPipe.readSide.get()}, true};
}
Goal::WorkResult DrvOutputSubstitutionGoal::realisationFetched()
Goal::WorkResult DrvOutputSubstitutionGoal::realisationFetched(bool inBuildSlot)
{
worker.childTerminated(this);
maintainRunningSubstitutions.reset();
@ -97,7 +92,7 @@ Goal::WorkResult DrvOutputSubstitutionGoal::realisationFetched()
}
if (!outputInfo) {
return tryNext();
return tryNext(inBuildSlot);
}
WaitForGoals result;
@ -114,7 +109,7 @@ Goal::WorkResult DrvOutputSubstitutionGoal::realisationFetched()
worker.store.printStorePath(localOutputInfo->outPath),
worker.store.printStorePath(depPath)
);
return tryNext();
return tryNext(inBuildSlot);
}
result.goals.insert(worker.makeDrvOutputSubstitutionGoal(depId));
}
@ -123,14 +118,14 @@ Goal::WorkResult DrvOutputSubstitutionGoal::realisationFetched()
result.goals.insert(worker.makePathSubstitutionGoal(outputInfo->outPath));
if (result.goals.empty()) {
return outPathValid();
return outPathValid(inBuildSlot);
} else {
state = &DrvOutputSubstitutionGoal::outPathValid;
return result;
}
}
Goal::WorkResult DrvOutputSubstitutionGoal::outPathValid()
Goal::WorkResult DrvOutputSubstitutionGoal::outPathValid(bool inBuildSlot)
{
assert(outputInfo);
trace("output path substituted");
@ -159,9 +154,9 @@ std::string DrvOutputSubstitutionGoal::key()
return "a$" + std::string(id.to_string());
}
Goal::WorkResult DrvOutputSubstitutionGoal::work()
Goal::WorkResult DrvOutputSubstitutionGoal::work(bool inBuildSlot)
{
return (this->*state)();
return (this->*state)(inBuildSlot);
}

View file

@ -58,20 +58,20 @@ class DrvOutputSubstitutionGoal : public Goal {
public:
DrvOutputSubstitutionGoal(const DrvOutput& id, Worker & worker, RepairFlag repair = NoRepair, std::optional<ContentAddress> ca = std::nullopt);
typedef WorkResult (DrvOutputSubstitutionGoal::*GoalState)();
typedef WorkResult (DrvOutputSubstitutionGoal::*GoalState)(bool inBuildSlot);
GoalState state;
WorkResult init();
WorkResult tryNext();
WorkResult realisationFetched();
WorkResult outPathValid();
WorkResult init(bool inBuildSlot);
WorkResult tryNext(bool inBuildSlot);
WorkResult realisationFetched(bool inBuildSlot);
WorkResult outPathValid(bool inBuildSlot);
WorkResult finished();
Finished timedOut(Error && ex) override { abort(); };
std::string key() override;
WorkResult work() override;
WorkResult work(bool inBuildSlot) override;
JobCategory jobCategory() const override {
return JobCategory::Substitution;

View file

@ -2,6 +2,7 @@
#include "substitution-goal.hh"
#include "derivation-goal.hh"
#include "local-store.hh"
#include "strings.hh"
namespace nix {
@ -16,13 +17,13 @@ void Store::buildPaths(const std::vector<DerivedPath> & reqs, BuildMode buildMod
worker.run(goals);
StringSet failed;
std::optional<Error> ex;
std::shared_ptr<Error> ex;
for (auto & i : goals) {
if (i->ex) {
if (ex)
logError(i->ex->info());
else
ex = std::move(*i->ex);
ex = i->ex;
}
if (i->exitCode != Goal::ecSuccess) {
if (auto i2 = dynamic_cast<DerivationGoal *>(i.get()))

View file

@ -1,5 +1,4 @@
#include "goal.hh"
#include "worker.hh"
namespace nix {

View file

@ -51,7 +51,7 @@ enum struct JobCategory {
Substitution,
};
struct Goal : public std::enable_shared_from_this<Goal>
struct Goal
{
typedef enum {ecSuccess, ecFailed, ecNoSubstituters, ecIncompleteClosure} ExitCode;
@ -112,9 +112,13 @@ public:
struct [[nodiscard]] WaitForGoals {
Goals goals;
};
struct [[nodiscard]] WaitForWorld {
std::set<int> fds;
bool inBuildSlot;
};
struct [[nodiscard]] Finished {
ExitCode result;
std::unique_ptr<Error> ex;
std::shared_ptr<Error> ex;
bool permanentFailure = false;
bool timedOut = false;
bool hashMismatch = false;
@ -127,6 +131,7 @@ public:
WaitForAWhile,
ContinueImmediately,
WaitForGoals,
WaitForWorld,
Finished>
{
WorkResult() = delete;
@ -136,9 +141,9 @@ public:
/**
* Exception containing an error message, if any.
*/
std::unique_ptr<Error> ex;
std::shared_ptr<Error> ex;
Goal(Worker & worker, DerivedPath path)
explicit Goal(Worker & worker)
: worker(worker)
{ }
@ -147,7 +152,7 @@ public:
trace("goal destroyed");
}
virtual WorkResult work() = 0;
virtual WorkResult work(bool inBuildSlot) = 0;
virtual void waiteeDone(GoalPtr waitee) { }
@ -160,6 +165,11 @@ public:
{
}
virtual bool respectsTimeouts()
{
return false;
}
void trace(std::string_view s);
std::string getName() const

View file

@ -2,6 +2,7 @@
#include "file-system.hh"
#include "globals.hh"
#include "hook-instance.hh"
#include "strings.hh"
namespace nix {
@ -26,18 +27,21 @@ HookInstance::HookInstance()
args.push_back(std::to_string(verbosity));
/* Create a pipe to get the output of the child. */
fromHook.create();
Pipe fromHook_;
fromHook_.create();
/* Create the communication pipes. */
toHook.create();
Pipe toHook_;
toHook_.create();
/* Create a pipe to get the output of the builder. */
builderOut.create();
Pipe builderOut_;
builderOut_.create();
/* Fork the hook. */
pid = startProcess([&]() {
if (dup2(fromHook.writeSide.get(), STDERR_FILENO) == -1)
if (dup2(fromHook_.writeSide.get(), STDERR_FILENO) == -1)
throw SysError("cannot pipe standard error into log file");
commonChildInit();
@ -45,16 +49,16 @@ HookInstance::HookInstance()
if (chdir("/") == -1) throw SysError("changing into /");
/* Dup the communication pipes. */
if (dup2(toHook.readSide.get(), STDIN_FILENO) == -1)
if (dup2(toHook_.readSide.get(), STDIN_FILENO) == -1)
throw SysError("dupping to-hook read side");
/* Use fd 4 for the builder's stdout/stderr. */
if (dup2(builderOut.writeSide.get(), 4) == -1)
if (dup2(builderOut_.writeSide.get(), 4) == -1)
throw SysError("dupping builder's stdout/stderr");
/* Hack: pass the read side of that fd to allow build-remote
to read SSH error messages. */
if (dup2(builderOut.readSide.get(), 5) == -1)
if (dup2(builderOut_.readSide.get(), 5) == -1)
throw SysError("dupping builder's stdout/stderr");
execv(buildHook.c_str(), stringsToCharPtrs(args).data());
@ -63,10 +67,11 @@ HookInstance::HookInstance()
});
pid.setSeparatePG(true);
fromHook.writeSide.reset();
toHook.readSide.reset();
fromHook = std::move(fromHook_.readSide);
toHook = std::move(toHook_.writeSide);
builderOut = std::move(builderOut_.readSide);
sink = FdSink(toHook.writeSide.get());
sink = FdSink(toHook.get());
std::map<std::string, Config::SettingInfo> settings;
globalConfig.getSettings(settings);
for (auto & setting : settings)
@ -78,7 +83,7 @@ HookInstance::HookInstance()
HookInstance::~HookInstance()
{
try {
toHook.writeSide.reset();
toHook.reset();
if (pid) pid.kill();
} catch (...) {
ignoreException();
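After this refactor the parent keeps only the pipe ends it actually uses (the read sides of fromHook and builderOut, the write side of toHook) as single owned descriptors instead of whole Pipe pairs. A plain-POSIX sketch of that "keep one end per side of the fork" discipline, using raw fds rather than Lix's AutoCloseFD:

#include <sys/types.h>
#include <unistd.h>
#include <cstddef>
#include <cstdio>
#include <string>

int main()
{
    int fds[2];
    if (pipe(fds) == -1) { std::perror("pipe"); return 1; }

    pid_t pid = fork();
    if (pid == -1) { std::perror("fork"); return 1; }

    if (pid == 0) {                      // child writes, so it drops the read end
        close(fds[0]);
        std::string msg = "hello from the hook\n";
        (void) write(fds[1], msg.data(), msg.size());
        _exit(0);
    }

    close(fds[1]);                       // parent keeps only the read end
    int fromChild = fds[0];              // the lone descriptor that outlives setup

    char buf[64];
    ssize_t n = read(fromChild, buf, sizeof buf);
    if (n > 0) std::fwrite(buf, 1, static_cast<std::size_t>(n), stdout);
    close(fromChild);
    return 0;
}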

View file

@ -10,19 +10,19 @@ namespace nix {
struct HookInstance
{
/**
* Pipes for talking to the build hook.
* Pipe for talking to the build hook.
*/
Pipe toHook;
AutoCloseFD toHook;
/**
* Pipe for the hook's standard output/error.
*/
Pipe fromHook;
AutoCloseFD fromHook;
/**
* Pipe for the builder's standard output/error.
*/
Pipe builderOut;
AutoCloseFD builderOut;
/**
* The process ID of the hook.

View file

@ -1,14 +1,12 @@
#include "local-derivation-goal.hh"
#include "indirect-root-store.hh"
#include "hook-instance.hh"
#include "machines.hh"
#include "store-api.hh"
#include "worker.hh"
#include "builtins.hh"
#include "builtins/buildenv.hh"
#include "path-references.hh"
#include "finally.hh"
#include "archive.hh"
#include "compression.hh"
#include "daemon.hh"
#include "topo-sort.hh"
#include "json-utils.hh"
@ -17,6 +15,8 @@
#include "namespaces.hh"
#include "child.hh"
#include "unix-domain-socket.hh"
#include "mount.hh"
#include "strings.hh"
#include <regex>
#include <queue>
@ -149,17 +149,30 @@ void LocalDerivationGoal::killSandbox(bool getStats)
}
Goal::WorkResult LocalDerivationGoal::tryLocalBuild()
Goal::WorkResult LocalDerivationGoal::tryLocalBuild(bool inBuildSlot)
{
#if __APPLE__
additionalSandboxProfile = parsedDrv->getStringAttr("__sandboxProfile").value_or("");
#endif
unsigned int curBuilds = worker.getNrLocalBuilds();
if (curBuilds >= settings.maxBuildJobs) {
if (!inBuildSlot) {
state = &DerivationGoal::tryToBuild;
outputLocks.unlock();
return WaitForSlot{};
if (0U != settings.maxBuildJobs) {
return WaitForSlot{};
}
if (getMachines().empty()) {
throw Error(
"unable to start any build; either set '--max-jobs' to a non-zero value or enable "
"remote builds.\n"
"https://docs.lix.systems/manual/lix/stable/advanced-topics/distributed-builds.html"
);
} else {
throw Error(
"unable to start any build; remote machines may not have all required system features.\n"
"https://docs.lix.systems/manual/lix/stable/advanced-topics/distributed-builds.html"
);
}
}
assert(derivationType);
@ -230,7 +243,14 @@ Goal::WorkResult LocalDerivationGoal::tryLocalBuild()
try {
/* Okay, we have to build. */
startBuilder();
auto fds = startBuilder();
/* This state will be reached when we get EOF on the child's
log pipe. */
state = &DerivationGoal::buildDone;
started();
return WaitForWorld{std::move(fds), true};
} catch (BuildError & e) {
outputLocks.unlock();
@ -239,12 +259,6 @@ Goal::WorkResult LocalDerivationGoal::tryLocalBuild()
report.permanentFailure = true;
return report;
}
/* This state will be reached when we get EOF on the child's
log pipe. */
state = &DerivationGoal::buildDone;
return started();
}
@ -279,8 +293,10 @@ void LocalDerivationGoal::closeReadPipes()
{
if (hook) {
DerivationGoal::closeReadPipes();
} else
builderOut.close();
} else {
builderOutPTY.close();
builderOutFD = nullptr;
}
}
@ -372,40 +388,7 @@ void LocalDerivationGoal::cleanupPostOutputsRegisteredModeNonCheck()
cleanupPostOutputsRegisteredModeCheck();
}
#if __linux__
static void doBind(const Path & source, const Path & target, bool optional = false) {
debug("bind mounting '%1%' to '%2%'", source, target);
auto bindMount = [&]() {
if (mount(source.c_str(), target.c_str(), "", MS_BIND | MS_REC, 0) == -1)
throw SysError("bind mount from '%1%' to '%2%' failed", source, target);
};
auto maybeSt = maybeLstat(source);
if (!maybeSt) {
if (optional)
return;
else
throw SysError("getting attributes of path '%1%'", source);
}
auto st = *maybeSt;
if (S_ISDIR(st.st_mode)) {
createDirs(target);
bindMount();
} else if (S_ISLNK(st.st_mode)) {
// Symlinks can (apparently) not be bind-mounted, so just copy it
createDirs(dirOf(target));
copyFile(source, target, {});
} else {
createDirs(dirOf(target));
writeFile(target, "");
bindMount();
}
};
#endif
void LocalDerivationGoal::startBuilder()
std::set<int> LocalDerivationGoal::startBuilder()
{
if ((buildUser && buildUser->getUIDCount() != 1)
#if __linux__
@ -466,13 +449,23 @@ void LocalDerivationGoal::startBuilder()
killSandbox(false);
/* Right platform? */
if (!parsedDrv->canBuildLocally(worker.store))
throw Error("a '%s' with features {%s} is required to build '%s', but I am a '%s' with features {%s}",
drv->platform,
concatStringsSep(", ", parsedDrv->getRequiredSystemFeatures()),
worker.store.printStorePath(drvPath),
settings.thisSystem,
concatStringsSep<StringSet>(", ", worker.store.systemFeatures));
if (!parsedDrv->canBuildLocally(worker.store)) {
HintFmt addendum{""};
if (settings.useSubstitutes && !parsedDrv->substitutesAllowed()) {
addendum = HintFmt("\n\nHint: the failing derivation has %s set to %s, forcing it to be built rather than substituted.\n"
"Passing %s to force substitution may resolve this failure if the path is available in a substituter.",
"allowSubstitutes", "false", "--always-allow-substitutes");
}
throw Error({
.msg = HintFmt("a '%s' with features {%s} is required to build '%s', but I am a '%s' with features {%s}%s",
drv->platform,
concatStringsSep(", ", parsedDrv->getRequiredSystemFeatures()),
worker.store.printStorePath(drvPath),
settings.thisSystem,
concatStringsSep<StringSet>(", ", worker.store.systemFeatures),
Uncolored(addendum))
});
}
/* Create a temporary directory where the build will take
place. */
@ -704,12 +697,13 @@ void LocalDerivationGoal::startBuilder()
Path logFile = openLogFile();
/* Create a pseudoterminal to get the output of the builder. */
builderOut = AutoCloseFD{posix_openpt(O_RDWR | O_NOCTTY)};
if (!builderOut)
builderOutPTY = AutoCloseFD{posix_openpt(O_RDWR | O_NOCTTY)};
if (!builderOutPTY)
throw SysError("opening pseudoterminal master");
builderOutFD = &builderOutPTY;
// FIXME: not thread-safe, use ptsname_r
std::string slaveName = ptsname(builderOut.get());
std::string slaveName = ptsname(builderOutPTY.get());
if (buildUser) {
if (chmod(slaveName.c_str(), 0600))
@ -720,12 +714,12 @@ void LocalDerivationGoal::startBuilder()
}
#if __APPLE__
else {
if (grantpt(builderOut.get()))
if (grantpt(builderOutPTY.get()))
throw SysError("granting access to pseudoterminal slave");
}
#endif
if (unlockpt(builderOut.get()))
if (unlockpt(builderOutPTY.get()))
throw SysError("unlocking pseudoterminal");
/* Open the slave side of the pseudoterminal and use it as stderr. */
@ -756,14 +750,13 @@ void LocalDerivationGoal::startBuilder()
/* parent */
pid.setSeparatePG(true);
worker.childStarted(shared_from_this(), {builderOut.get()}, true, true);
/* Check if setting up the build environment failed. */
std::vector<std::string> msgs;
while (true) {
std::string msg = [&]() {
try {
return readLine(builderOut.get());
return readLine(builderOutPTY.get());
} catch (Error & e) {
auto status = pid.wait();
e.addTrace({}, "while waiting for the build environment for '%s' to initialize (%s, previous messages: %s)",
@ -775,7 +768,7 @@ void LocalDerivationGoal::startBuilder()
}();
if (msg.substr(0, 1) == "\2") break;
if (msg.substr(0, 1) == "\1") {
FdSource source(builderOut.get());
FdSource source(builderOutPTY.get());
auto ex = readError(source);
ex.addTrace({}, "while setting up the build environment");
throw ex;
@ -783,6 +776,8 @@ void LocalDerivationGoal::startBuilder()
debug("sandbox setup: " + msg);
msgs.push_back(std::move(msg));
}
return {builderOutPTY.get()};
}
@ -1307,7 +1302,7 @@ void LocalDerivationGoal::addDependency(const StorePath & path)
Path target = chrootRootDir + worker.store.printStorePath(path);
if (pathExists(target)) {
// There is a similar debug message in doBind, so only run it in this block to not have double messages.
// There is a similar debug message in bindPath, so only run it in this block to not have double messages.
debug("bind-mounting %s -> %s", target, source);
throw Error("store path '%s' already exists in the sandbox", worker.store.printStorePath(path));
}
@ -1324,7 +1319,7 @@ void LocalDerivationGoal::addDependency(const StorePath & path)
if (setns(sandboxMountNamespace.get(), 0) == -1)
throw SysError("entering sandbox mount namespace");
doBind(source, target);
bindPath(source, target);
_exit(0);
});
@ -1516,7 +1511,7 @@ void LocalDerivationGoal::runChild()
chmodPath(dst, 0555);
} else
#endif
doBind(i.second.source, chrootRootDir + i.first, i.second.optional);
bindPath(i.second.source, chrootRootDir + i.first, i.second.optional);
}
/* Bind a new instance of procfs on /proc. */
@ -1555,8 +1550,8 @@ void LocalDerivationGoal::runChild()
} else {
if (errno != EINVAL)
throw SysError("mounting /dev/pts");
doBind("/dev/pts", chrootRootDir + "/dev/pts");
doBind("/dev/ptmx", chrootRootDir + "/dev/ptmx");
bindPath("/dev/pts", chrootRootDir + "/dev/pts");
bindPath("/dev/ptmx", chrootRootDir + "/dev/ptmx");
}
}
@ -2596,13 +2591,6 @@ void LocalDerivationGoal::deleteTmpDir(bool force)
}
bool LocalDerivationGoal::isReadDesc(int fd)
{
return (hook && DerivationGoal::isReadDesc(fd)) ||
(!hook && fd == builderOut.get());
}
StorePath LocalDerivationGoal::makeFallbackPath(OutputNameView outputName)
{
return worker.store.makeStorePath(

View file

@ -40,7 +40,7 @@ struct LocalDerivationGoal : public DerivationGoal
* Master side of the pseudoterminal used for the builder's
* standard output/error.
*/
AutoCloseFD builderOut;
AutoCloseFD builderOutPTY;
/**
* Pipe for synchronising updates to the builder namespaces.
@ -211,12 +211,12 @@ struct LocalDerivationGoal : public DerivationGoal
/**
* The additional states.
*/
WorkResult tryLocalBuild() override;
WorkResult tryLocalBuild(bool inBuildSlot) override;
/**
* Start building a derivation.
*/
void startBuilder();
std::set<int> startBuilder();
/**
* Fill in the environment for the builder.
@ -285,8 +285,6 @@ struct LocalDerivationGoal : public DerivationGoal
void cleanupPostOutputsRegisteredModeCheck() override;
void cleanupPostOutputsRegisteredModeNonCheck() override;
bool isReadDesc(int fd) override;
/**
* Delete the temporary directory, if we have one.
*/
@ -359,6 +357,10 @@ protected:
return false;
}
virtual bool respectsTimeouts() override
{
return true;
}
};
}

View file

@ -7,7 +7,7 @@
namespace nix {
PathSubstitutionGoal::PathSubstitutionGoal(const StorePath & storePath, Worker & worker, RepairFlag repair, std::optional<ContentAddress> ca)
: Goal(worker, DerivedPath::Opaque { storePath })
: Goal(worker)
, storePath(storePath)
, repair(repair)
, ca(ca)
@ -39,13 +39,13 @@ Goal::Finished PathSubstitutionGoal::done(
}
Goal::WorkResult PathSubstitutionGoal::work()
Goal::WorkResult PathSubstitutionGoal::work(bool inBuildSlot)
{
return (this->*state)();
return (this->*state)(inBuildSlot);
}
Goal::WorkResult PathSubstitutionGoal::init()
Goal::WorkResult PathSubstitutionGoal::init(bool inBuildSlot)
{
trace("init");
@ -61,11 +61,11 @@ Goal::WorkResult PathSubstitutionGoal::init()
subs = settings.useSubstitutes ? getDefaultSubstituters() : std::list<ref<Store>>();
return tryNext();
return tryNext(inBuildSlot);
}
Goal::WorkResult PathSubstitutionGoal::tryNext()
Goal::WorkResult PathSubstitutionGoal::tryNext(bool inBuildSlot)
{
trace("trying next substituter");
@ -97,23 +97,23 @@ Goal::WorkResult PathSubstitutionGoal::tryNext()
if (sub->storeDir == worker.store.storeDir)
assert(subPath == storePath);
} else if (sub->storeDir != worker.store.storeDir) {
return tryNext();
return tryNext(inBuildSlot);
}
try {
// FIXME: make async
info = sub->queryPathInfo(subPath ? *subPath : storePath);
} catch (InvalidPath &) {
return tryNext();
return tryNext(inBuildSlot);
} catch (SubstituterDisabled &) {
if (settings.tryFallback) {
return tryNext();
return tryNext(inBuildSlot);
}
throw;
} catch (Error & e) {
if (settings.tryFallback) {
logError(e.info());
return tryNext();
return tryNext(inBuildSlot);
}
throw;
}
@ -126,7 +126,7 @@ Goal::WorkResult PathSubstitutionGoal::tryNext()
} else {
printError("asked '%s' for '%s' but got '%s'",
sub->getUri(), worker.store.printStorePath(storePath), sub->printStorePath(info->path));
return tryNext();
return tryNext(inBuildSlot);
}
}
@ -147,7 +147,7 @@ Goal::WorkResult PathSubstitutionGoal::tryNext()
{
warn("ignoring substitute for '%s' from '%s', as it's not signed by any of the keys in 'trusted-public-keys'",
worker.store.printStorePath(storePath), sub->getUri());
return tryNext();
return tryNext(inBuildSlot);
}
/* To maintain the closure invariant, we first have to realise the
@ -158,7 +158,7 @@ Goal::WorkResult PathSubstitutionGoal::tryNext()
result.goals.insert(worker.makePathSubstitutionGoal(i));
if (result.goals.empty()) {/* to prevent hang (no wake-up event) */
return referencesValid();
return referencesValid(inBuildSlot);
} else {
state = &PathSubstitutionGoal::referencesValid;
return result;
@ -166,7 +166,7 @@ Goal::WorkResult PathSubstitutionGoal::tryNext()
}
Goal::WorkResult PathSubstitutionGoal::referencesValid()
Goal::WorkResult PathSubstitutionGoal::referencesValid(bool inBuildSlot)
{
trace("all references realised");
@ -186,14 +186,11 @@ Goal::WorkResult PathSubstitutionGoal::referencesValid()
}
Goal::WorkResult PathSubstitutionGoal::tryToRun()
Goal::WorkResult PathSubstitutionGoal::tryToRun(bool inBuildSlot)
{
trace("trying to run");
/* Make sure that we are allowed to start a substitution. Note that even
if maxSubstitutionJobs == 0, we still allow a substituter to run. This
prevents infinite waiting. */
if (worker.getNrSubstitutions() >= std::max(1U, (unsigned int) settings.maxSubstitutionJobs)) {
if (!inBuildSlot) {
return WaitForSlot{};
}
@ -224,14 +221,12 @@ Goal::WorkResult PathSubstitutionGoal::tryToRun()
}
});
worker.childStarted(shared_from_this(), {outPipe.readSide.get()}, true, false);
state = &PathSubstitutionGoal::finished;
return StillAlive{};
return WaitForWorld{{outPipe.readSide.get()}, true};
}
Goal::WorkResult PathSubstitutionGoal::finished()
Goal::WorkResult PathSubstitutionGoal::finished(bool inBuildSlot)
{
trace("substitute finished");

View file

@ -66,7 +66,7 @@ struct PathSubstitutionGoal : public Goal
std::unique_ptr<MaintainCount<uint64_t>> maintainExpectedSubstitutions,
maintainRunningSubstitutions, maintainExpectedNar, maintainExpectedDownload;
typedef WorkResult (PathSubstitutionGoal::*GoalState)();
typedef WorkResult (PathSubstitutionGoal::*GoalState)(bool inBuildSlot);
GoalState state;
/**
@ -94,17 +94,16 @@ public:
return "a$" + std::string(storePath.name()) + "$" + worker.store.printStorePath(storePath);
}
WorkResult work() override;
WorkResult work(bool inBuildSlot) override;
/**
* The states.
*/
WorkResult init();
WorkResult tryNext();
WorkResult gotInfo();
WorkResult referencesValid();
WorkResult tryToRun();
WorkResult finished();
WorkResult init(bool inBuildSlot);
WorkResult tryNext(bool inBuildSlot);
WorkResult referencesValid(bool inBuildSlot);
WorkResult tryToRun(bool inBuildSlot);
WorkResult finished(bool inBuildSlot);
/**
* Callback used by the worker to write to the log.

View file

@ -1,5 +1,4 @@
#include "charptr-cast.hh"
#include "machines.hh"
#include "worker.hh"
#include "substitution-goal.hh"
#include "drv-output-substitution-goal.hh"
@ -151,7 +150,7 @@ void Worker::goalFinished(GoalPtr goal, Goal::Finished & f)
if (!goal->waiters.empty())
logError(f.ex->info());
else
goal->ex = std::move(f.ex);
goal->ex = f.ex;
}
for (auto & i : goal->waiters) {
@ -198,6 +197,7 @@ void Worker::handleWorkResult(GoalPtr goal, Goal::WorkResult how)
dep->waiters.insert(goal);
}
},
[&](Goal::WaitForWorld & w) { childStarted(goal, w.fds, w.inBuildSlot); },
[&](Goal::Finished & f) { goalFinished(goal, f); },
},
how
@ -232,20 +232,8 @@ void Worker::wakeUp(GoalPtr goal)
}
unsigned Worker::getNrLocalBuilds()
{
return nrLocalBuilds;
}
unsigned Worker::getNrSubstitutions()
{
return nrSubstitutions;
}
void Worker::childStarted(GoalPtr goal, const std::set<int> & fds,
bool inBuildSlot, bool respectTimeouts)
bool inBuildSlot)
{
Child child;
child.goal = goal;
@ -253,7 +241,6 @@ void Worker::childStarted(GoalPtr goal, const std::set<int> & fds,
child.fds = fds;
child.timeStarted = child.lastOutput = steady_time_point::clock::now();
child.inBuildSlot = inBuildSlot;
child.respectTimeouts = respectTimeouts;
children.emplace_back(child);
if (inBuildSlot) {
switch (goal->jobCategory()) {
@ -307,8 +294,8 @@ void Worker::waitForBuildSlot(GoalPtr goal)
{
goal->trace("wait for build slot");
bool isSubstitutionGoal = goal->jobCategory() == JobCategory::Substitution;
if ((!isSubstitutionGoal && getNrLocalBuilds() < settings.maxBuildJobs) ||
(isSubstitutionGoal && getNrSubstitutions() < settings.maxSubstitutionJobs))
if ((!isSubstitutionGoal && nrLocalBuilds < settings.maxBuildJobs) ||
(isSubstitutionGoal && nrSubstitutions < settings.maxSubstitutionJobs))
wakeUp(goal); /* we can do it right away */
else
wantingToBuild.insert(goal);
@ -364,7 +351,12 @@ void Worker::run(const Goals & _topGoals)
awake.clear();
for (auto & goal : awake2) {
checkInterrupt();
handleWorkResult(goal, goal->work());
/* Make sure that we are always allowed to run at least one substitution.
This prevents infinite waiting. */
const bool inSlot = goal->jobCategory() == JobCategory::Substitution
? nrSubstitutions < std::max(1U, (unsigned int) settings.maxSubstitutionJobs)
: nrLocalBuilds < settings.maxBuildJobs;
handleWorkResult(goal, goal->work(inSlot));
actDerivations.progress(
doneBuilds, expectedBuilds + doneBuilds, runningBuilds, failedBuilds
@ -388,18 +380,6 @@ void Worker::run(const Goals & _topGoals)
if (!children.empty() || !waitingForAWhile.empty())
waitForInput();
else {
if (awake.empty() && 0U == settings.maxBuildJobs)
{
if (getMachines().empty())
throw Error("unable to start any build; either increase '--max-jobs' "
"or enable remote builds."
"\nhttps://docs.lix.systems/manual/lix/stable/advanced-topics/distributed-builds.html");
else
throw Error("unable to start any build; remote machines may not have "
"all required system features."
"\nhttps://docs.lix.systems/manual/lix/stable/advanced-topics/distributed-builds.html");
}
assert(!awake.empty());
}
}
@ -434,11 +414,13 @@ void Worker::waitForInput()
// Periodically wake up to see if we need to run the garbage collector.
nearest = before + std::chrono::seconds(10);
for (auto & i : children) {
if (!i.respectTimeouts) continue;
if (0 != settings.maxSilentTime)
nearest = std::min(nearest, i.lastOutput + std::chrono::seconds(settings.maxSilentTime));
if (0 != settings.buildTimeout)
nearest = std::min(nearest, i.timeStarted + std::chrono::seconds(settings.buildTimeout));
if (auto goal = i.goal.lock()) {
if (!goal->respectsTimeouts()) continue;
if (0 != settings.maxSilentTime)
nearest = std::min(nearest, i.lastOutput + std::chrono::seconds(settings.maxSilentTime));
if (0 != settings.buildTimeout)
nearest = std::min(nearest, i.timeStarted + std::chrono::seconds(settings.buildTimeout));
}
}
if (nearest != steady_time_point::max()) {
timeout = std::max(1L, (long) std::chrono::duration_cast<std::chrono::seconds>(nearest - before).count());
@ -491,7 +473,7 @@ void Worker::waitForInput()
if (!goal->exitCode.has_value() &&
0 != settings.maxSilentTime &&
j->respectTimeouts &&
goal->respectsTimeouts() &&
after - j->lastOutput >= std::chrono::seconds(settings.maxSilentTime))
{
handleWorkResult(
@ -507,7 +489,7 @@ void Worker::waitForInput()
else if (!goal->exitCode.has_value() &&
0 != settings.buildTimeout &&
j->respectTimeouts &&
goal->respectsTimeouts() &&
after - j->timeStarted >= std::chrono::seconds(settings.buildTimeout))
{
handleWorkResult(

View file

@ -29,7 +29,6 @@ struct Child
WeakGoalPtr goal;
Goal * goal2; // ugly hackery
std::set<int> fds;
bool respectTimeouts;
bool inBuildSlot;
/**
* Time we last got output on stdout/stderr
@ -153,6 +152,18 @@ private:
*/
void waitForInput();
/**
* Remove a dead goal.
*/
void removeGoal(GoalPtr goal);
/**
* Registers a running child process. `inBuildSlot` means that
* the process counts towards the jobs limit.
*/
void childStarted(GoalPtr goal, const std::set<int> & fds,
bool inBuildSlot);
public:
const Activity act;
@ -224,29 +235,6 @@ public:
*/
GoalPtr makeGoal(const DerivedPath & req, BuildMode buildMode = bmNormal);
/**
* Remove a dead goal.
*/
void removeGoal(GoalPtr goal);
/**
* Return the number of local build processes currently running (but not
* remote builds via the build hook).
*/
unsigned int getNrLocalBuilds();
/**
* Return the number of substitution processes currently running.
*/
unsigned int getNrSubstitutions();
/**
* Registers a running child process. `inBuildSlot` means that
* the process counts towards the jobs limit.
*/
void childStarted(GoalPtr goal, const std::set<int> & fds,
bool inBuildSlot, bool respectTimeouts);
/**
* Unregisters a running child process.
*/

View file

@ -1,4 +1,5 @@
#include "buildenv.hh"
#include "strings.hh"
#include <sys/stat.h>
#include <sys/types.h>

View file

@ -3,6 +3,7 @@
#include "store-api.hh"
#include "archive.hh"
#include "compression.hh"
#include "strings.hh"
namespace nix {

View file

@ -20,6 +20,7 @@ namespace nix {
{ \
return LengthPrefixedProtoHelper<CommonProto, T >::read(store, conn); \
} \
/* NOLINTNEXTLINE(bugprone-macro-parentheses) */ \
TEMPLATE [[nodiscard]] WireFormatGenerator CommonProto::Serialise< T >::write(const Store & store, CommonProto::WriteConn conn, const T & t) \
{ \
return LengthPrefixedProtoHelper<CommonProto, T >::write(store, conn, t); \

View file

@ -1,6 +1,7 @@
#include "args.hh"
#include "content-address.hh"
#include "split.hh"
#include "strings.hh"
namespace nix {

View file

@ -2,7 +2,7 @@
#include "monitor-fd.hh"
#include "worker-protocol.hh"
#include "worker-protocol-impl.hh"
#include "build-result.hh"
#include "build-result.hh" // IWYU pragma: keep
#include "store-api.hh"
#include "store-cast.hh"
#include "gc-store.hh"
@ -12,6 +12,7 @@
#include "finally.hh"
#include "archive.hh"
#include "derivations.hh"
#include "strings.hh"
#include "args.hh"
#include <sstream>

View file

@ -3,11 +3,12 @@
#include "store-api.hh"
#include "globals.hh"
#include "types.hh"
#include "split.hh"
#include "common-protocol.hh"
#include "common-protocol-impl.hh"
#include "fs-accessor.hh"
#include "json-utils.hh"
#include "strings.hh"
#include "backed-string-view.hh"
#include <boost/container/small_vector.hpp>
#include <nlohmann/json.hpp>

View file

@ -5,6 +5,7 @@
#include "path.hh"
#include "outputs-spec.hh"
#include "comparator.hh"
#include "ref.hh"
#include <variant>
@ -78,10 +79,12 @@ struct SingleDerivedPathBuilt {
DECLARE_CMP(SingleDerivedPathBuilt);
};
using _SingleDerivedPathRaw = std::variant<
namespace derived_path::detail {
using SingleDerivedPathRaw = std::variant<
DerivedPathOpaque,
SingleDerivedPathBuilt
>;
}
/**
* A "derived path" is a very simple sort of expression (not a Nix
@ -94,8 +97,8 @@ using _SingleDerivedPathRaw = std::variant<
* - built, in which case it is a pair of a derivation path and an
* output name.
*/
struct SingleDerivedPath : _SingleDerivedPathRaw {
using Raw = _SingleDerivedPathRaw;
struct SingleDerivedPath : derived_path::detail::SingleDerivedPathRaw {
using Raw = derived_path::detail::SingleDerivedPathRaw;
using Raw::Raw;
using Opaque = DerivedPathOpaque;
@ -201,10 +204,12 @@ struct DerivedPathBuilt {
DECLARE_CMP(DerivedPathBuilt);
};
using _DerivedPathRaw = std::variant<
namespace derived_path::detail {
using DerivedPathRaw = std::variant<
DerivedPathOpaque,
DerivedPathBuilt
>;
}
/**
* A "derived path" is a very simple sort of expression that evaluates
@ -216,8 +221,8 @@ using _DerivedPathRaw = std::variant<
* - built, in which case it is a pair of a derivation path and some
* output names.
*/
struct DerivedPath : _DerivedPathRaw {
using Raw = _DerivedPathRaw;
struct DerivedPath : derived_path::detail::DerivedPathRaw {
using Raw = derived_path::detail::DerivedPathRaw;
using Raw::Raw;
using Opaque = DerivedPathOpaque;
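
Both types follow the same shape: the std::variant alias moves into a detail namespace and the public struct inherits from it, re-exporting the variant's constructors. A minimal standalone sketch of that pattern, with invented member types:

    #include <string>
    #include <variant>

    namespace widget::detail {
    using WidgetRaw = std::variant<int, std::string>;
    }

    // The public type inherits the variant's constructors and interface,
    // while the raw alias stays tucked away in the detail namespace.
    struct Widget : widget::detail::WidgetRaw {
        using Raw = widget::detail::WidgetRaw;
        using Raw::Raw;
    };

    int main()
    {
        Widget w = std::string("built");
        return w.index() == 1 ? 0 : 1;  // inherited std::variant interface
    }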

View file

@ -1,3 +1,4 @@
#include "dummy-store.hh"
#include "store-api.hh"
namespace nix {
@ -73,6 +74,8 @@ struct DummyStore : public virtual DummyStoreConfig, public virtual Store
{ unsupported("getFSAccessor"); }
};
static RegisterStoreImplementation<DummyStore, DummyStoreConfig> regDummyStore;
void registerDummyStore() {
StoreImplementations::add<DummyStore, DummyStoreConfig>();
}
}

View file

@ -0,0 +1,8 @@
#pragma once
///@file
namespace nix {
void registerDummyStore();
}

View file

@ -5,6 +5,7 @@
#include "s3.hh"
#include "signals.hh"
#include "compression.hh"
#include "strings.hh"
#if ENABLE_S3
#include <aws/core/client/ClientConfiguration.h>

View file

@ -2,6 +2,7 @@
///@file
#include "box_ptr.hh"
#include "ref.hh"
#include "logging.hh"
#include "serialise.hh"
#include "types.hh"

View file

@ -5,6 +5,7 @@
#include "signals.hh"
#include "finally.hh"
#include "unix-domain-socket.hh"
#include "strings.hh"
#include <queue>
#include <regex>

View file

@ -33,6 +33,16 @@
#include <sys/sysctl.h>
#endif
// All built-in store implementations.
#include "dummy-store.hh"
#include "http-binary-cache-store.hh"
#include "legacy-ssh-store.hh"
#include "local-binary-cache-store.hh"
#include "local-store.hh"
#include "s3-binary-cache-store.hh"
#include "ssh-store.hh"
#include "uds-remote-store.hh"
namespace nix {
@ -396,6 +406,17 @@ static void preloadNSS()
});
}
static void registerStoreImplementations() {
registerDummyStore();
registerHttpBinaryCacheStore();
registerLegacySSHStore();
registerLocalBinaryCacheStore();
registerLocalStore();
registerS3BinaryCacheStore();
registerSSHStore();
registerUDSRemoteStore();
}
static bool initLibStoreDone = false;
void assertLibStoreInitialized() {
@ -433,6 +454,8 @@ void initLibStore() {
unsetenv("TMPDIR");
#endif
registerStoreImplementations();
initLibStoreDone = true;
}
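
With this, the per-file static RegisterStoreImplementation objects are replaced by plain functions that initLibStore() calls at a well-defined point, so the set of built-in stores no longer depends on static-initializer behaviour. A rough sketch of the shape of that pattern, with invented names:

    #include <functional>
    #include <string>
    #include <vector>

    // Invented stand-ins for the real store machinery: each backend contributes
    // a factory entry when its register function is called.
    struct StoreFactory { std::string scheme; std::function<void()> create; };

    std::vector<StoreFactory> & storeFactories()
    {
        static std::vector<StoreFactory> factories;  // no global constructors involved
        return factories;
    }

    void registerExampleStore()   // one such function per backend, like registerDummyStore()
    {
        storeFactories().push_back({"example", [] { /* construct the store here */ }});
    }

    void registerAllStores()      // called once from library initialisation, like initLibStore()
    {
        registerExampleStore();
    }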

View file

@ -634,7 +634,7 @@ public:
line.
)"};
OptionalPathSetting diffHook{
PathsSetting<std::optional<Path>> diffHook{
this, std::nullopt, "diff-hook",
R"(
Absolute path to an executable capable of diffing build

View file

@ -1,3 +1,4 @@
#include "http-binary-cache-store.hh"
#include "binary-cache-store.hh"
#include "filetransfer.hh"
#include "globals.hh"
@ -194,6 +195,8 @@ protected:
}
};
static RegisterStoreImplementation<HttpBinaryCacheStore, HttpBinaryCacheStoreConfig> regHttpBinaryCacheStore;
void registerHttpBinaryCacheStore() {
StoreImplementations::add<HttpBinaryCacheStore, HttpBinaryCacheStoreConfig>();
}
}

View file

@ -0,0 +1,8 @@
#pragma once
///@file
namespace nix {
void registerHttpBinaryCacheStore();
}

View file

@ -1,4 +1,4 @@
#include "ssh-store-config.hh"
#include "legacy-ssh-store.hh"
#include "archive.hh"
#include "pool.hh"
#include "remote-store.hh"
@ -8,6 +8,8 @@
#include "store-api.hh"
#include "path-with-outputs.hh"
#include "ssh.hh"
#include "ssh-store.hh"
#include "strings.hh"
#include "derivations.hh"
namespace nix {
@ -412,6 +414,8 @@ public:
{ unsupported("queryRealisation"); }
};
static RegisterStoreImplementation<LegacySSHStore, LegacySSHStoreConfig> regLegacySSHStore;
void registerLegacySSHStore() {
StoreImplementations::add<LegacySSHStore, LegacySSHStoreConfig>();
}
}

View file

@ -0,0 +1,8 @@
#pragma once
///@file
namespace nix {
void registerLegacySSHStore();
}

View file

@ -61,9 +61,9 @@ template<class Inner, typename... Ts>
LENGTH_PREFIXED_PROTO_HELPER(Inner, std::tuple<Ts...>);
template<class Inner, typename K, typename V>
#define _X std::map<K, V>
LENGTH_PREFIXED_PROTO_HELPER(Inner, _X);
#undef _X
#define DONT_SUBSTITUTE_KV_TYPE std::map<K, V>
LENGTH_PREFIXED_PROTO_HELPER(Inner, DONT_SUBSTITUTE_KV_TYPE);
#undef DONT_SUBSTITUTE_KV_TYPE
template<class Inner, typename T>
std::vector<T>
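
Renaming _X also avoids a reserved identifier (an underscore followed by an uppercase letter is reserved to the implementation), and the indirection exists in the first place because a bare std::map<K, V> cannot be passed to a function-like macro: the top-level comma splits it into two arguments. A standalone illustration with a hypothetical macro:

    #include <map>
    #include <string>

    // A function-like macro expecting exactly one type argument.
    #define DECLARE_HOLDER(T) struct Holder { T value; }

    // DECLARE_HOLDER(std::map<int, std::string>);
    //   ^ error: the top-level comma makes this two macro arguments, not one

    // Hiding the comma behind an object-like macro keeps it a single argument;
    // it is expanded to the full type during substitution.
    #define KV_MAP std::map<int, std::string>
    DECLARE_HOLDER(KV_MAP);
    #undef KV_MAP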

View file

@ -1,3 +1,4 @@
#include "local-binary-cache-store.hh"
#include "binary-cache-store.hh"
#include "globals.hh"
#include "nar-info-disk-cache.hh"
@ -124,6 +125,8 @@ std::set<std::string> LocalBinaryCacheStore::uriSchemes()
return {"file"};
}
static RegisterStoreImplementation<LocalBinaryCacheStore, LocalBinaryCacheStoreConfig> regLocalBinaryCacheStore;
void registerLocalBinaryCacheStore() {
StoreImplementations::add<LocalBinaryCacheStore, LocalBinaryCacheStoreConfig>();
}
}

View file

@ -0,0 +1,8 @@
#pragma once
///@file
namespace nix {
void registerLocalBinaryCacheStore();
}

View file

@ -11,21 +11,21 @@ struct LocalFSStoreConfig : virtual StoreConfig
{
using StoreConfig::StoreConfig;
const OptionalPathSetting rootDir{this, std::nullopt,
const PathsSetting<std::optional<Path>> rootDir{this, std::nullopt,
"root",
"Directory prefixed to all other paths."};
const PathSetting stateDir{this,
const PathsSetting<Path> stateDir{this,
rootDir.get() ? *rootDir.get() + "/nix/var/nix" : settings.nixStateDir,
"state",
"Directory where Lix will store state."};
const PathSetting logDir{this,
const PathsSetting<Path> logDir{this,
rootDir.get() ? *rootDir.get() + "/nix/var/log/nix" : settings.nixLogDir,
"log",
"directory where Lix will store log files."};
const PathSetting realStoreDir{this,
const PathsSetting<Path> realStoreDir{this,
rootDir.get() ? *rootDir.get() + "/nix/store" : storeDir, "real",
"Physical path of the Nix store."};
};

View file

@ -10,6 +10,7 @@
#include "signals.hh"
#include "finally.hh"
#include "compression.hh"
#include "strings.hh"
#include <algorithm>
#include <cstring>

View file

@ -421,4 +421,7 @@ void canonicaliseTimestampAndPermissions(const Path & path);
MakeError(PathInUse, Error);
// Implemented by the relevant platform/ module being used.
void registerLocalStore();
}

View file

@ -1,11 +1,10 @@
#pragma once
///@file
#include "types.hh"
#include <optional>
#include <memory>
#include <sys/types.h>
#include <vector>
namespace nix {

View file

@ -1,7 +1,9 @@
#pragma once
///@file
#include "types.hh"
#include "ref.hh"
#include <set>
#include <vector>
namespace nix {

View file

@ -1,5 +1,6 @@
#include "make-content-addressed.hh"
#include "references.hh"
#include "strings.hh"
namespace nix {

View file

@ -118,12 +118,16 @@ libstore_headers = files(
'derived-path-map.hh',
'derived-path.hh',
'downstream-placeholder.hh',
'dummy-store.hh',
'filetransfer.hh',
'fs-accessor.hh',
'gc-store.hh',
'globals.hh',
'http-binary-cache-store.hh',
'indirect-root-store.hh',
'legacy-ssh-store.hh',
'length-prefixed-protocol-helper.hh',
'local-binary-cache-store.hh',
'local-fs-store.hh',
'local-store.hh',
'lock.hh',
@ -152,8 +156,8 @@ libstore_headers = files(
'serve-protocol-impl.hh',
'serve-protocol.hh',
'sqlite.hh',
'ssh-store-config.hh',
'ssh.hh',
'ssh-store.hh',
'store-api.hh',
'store-cast.hh',
'uds-remote-store.hh',

View file

@ -1,12 +1,12 @@
#include "derivations.hh"
#include "parsed-derivations.hh"
#include "globals.hh"
#include "local-store.hh"
#include "store-api.hh"
#include "thread-pool.hh"
#include "topo-sort.hh"
#include "closure.hh"
#include "filetransfer.hh"
#include "strings.hh"
namespace nix {

View file

@ -1,10 +1,12 @@
#pragma once
///@file
#include "fs-accessor.hh"
#include "ref.hh"
#include <functional>
#include <nlohmann/json_fwd.hpp>
#include "fs-accessor.hh"
namespace nix {

View file

@ -4,6 +4,7 @@
#include "sqlite.hh"
#include "globals.hh"
#include "users.hh"
#include "strings.hh"
#include <sqlite3.h>
#include <nlohmann/json.hpp>

View file

@ -1,4 +1,4 @@
#include "globals.hh"
#include "strings.hh"
#include "nar-info.hh"
#include "store-api.hh"

View file

@ -1,6 +1,7 @@
#include "local-store.hh"
#include "globals.hh"
#include "signals.hh"
#include "strings.hh"
#include <cstring>
#include <sys/types.h>

View file

@ -1,4 +1,5 @@
#include "parsed-derivations.hh"
#include "strings.hh"
#include <nlohmann/json.hpp>
#include <regex>

View file

@ -1,5 +1,6 @@
#include "path-info.hh"
#include "store-api.hh"
#include "strings.hh"
namespace nix {

View file

@ -1,7 +1,6 @@
#include "path-with-outputs.hh"
#include "store-api.hh"
#include <regex>
#include "strings.hh"
namespace nix {

View file

@ -112,3 +112,13 @@ PathSet Store::printStorePathSet(const StorePathSet & paths) const
}
}
std::size_t std::hash<nix::StorePath>::operator()(const nix::StorePath & path) const noexcept
{
// It's already a cryptographic hash of 160 bits (assuming that nobody gives us bogus ones...), so just parse it.
auto h = nix::Hash::parseNonSRIUnprefixed(path.hashPart(), nix::HashType::SHA1);
// This need not be stable across machines, so bit casting the start of it is fine.
size_t r;
memcpy(&r, h.hash, sizeof(r));
return r;
}

View file

@ -2,8 +2,9 @@
///@file
#include <string_view>
#include <string>
#include "types.hh"
#include "types.hh" // IWYU pragma: keep
namespace nix {
@ -89,10 +90,7 @@ const std::string drvExtension = ".drv";
namespace std {
template<> struct hash<nix::StorePath> {
std::size_t operator()(const nix::StorePath & path) const noexcept
{
return * (std::size_t *) path.to_string().data();
}
std::size_t operator()(const nix::StorePath & path) const noexcept;
};
}
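
Moving the body out of the header also replaces the old type-punned read (* (std::size_t *) ...data()) with Hash parsing plus memcpy, avoiding misaligned and strict-aliasing trouble. Usage is unchanged; a hypothetical helper (not part of the diff) still just works:

    #include <unordered_set>
    #include <vector>
    #include "path.hh"

    // Deduplicate store paths; std::unordered_set picks up the
    // std::hash<nix::StorePath> specialization declared above.
    std::unordered_set<nix::StorePath> dedupe(const std::vector<nix::StorePath> & paths)
    {
        return {paths.begin(), paths.end()};
    }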

View file

@ -49,8 +49,12 @@ struct FdLock
~FdLock()
{
if (acquired)
lockFile(fd, ltNone, false);
try {
if (acquired)
lockFile(fd, ltNone, false);
} catch (SysError &) {
ignoreException();
}
}
};
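
Destructors are implicitly noexcept in modern C++, so letting lockFile()'s SysError escape ~FdLock() would call std::terminate (worst of all during stack unwinding from another exception); hence the swallow-and-log shape. A minimal standalone version of the same pattern, outside of Lix:

    #include <cstdio>
    #include <stdexcept>

    // Hypothetical RAII guard whose cleanup step can fail.
    struct Cleanup {
        ~Cleanup()  // implicitly noexcept: an escaping exception would terminate the program
        {
            try {
                releaseResource();
            } catch (const std::exception & e) {
                std::fprintf(stderr, "warning: cleanup failed: %s\n", e.what());
            }
        }

        static void releaseResource() { throw std::runtime_error("unlock failed"); }
    };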

View file

@ -2,6 +2,7 @@
#include "signals.hh"
#include "platform/darwin.hh"
#include "regex.hh"
#include "strings.hh"
#include <sys/proc_info.h>
#include <sys/sysctl.h>
@ -261,4 +262,9 @@ void DarwinLocalDerivationGoal::execBuilder(std::string builder, Strings args, S
posix_spawn(nullptr, builder.c_str(), nullptr, &attrp, stringsToCharPtrs(args).data(), stringsToCharPtrs(envStrs).data());
}
void registerLocalStore() {
StoreImplementations::add<DarwinLocalStore, LocalStoreConfig>();
}
}

View file

@ -1,5 +1,7 @@
#include "platform/fallback.hh"
namespace nix {
static RegisterStoreImplementation<FallbackLocalStore, LocalStoreConfig> regLocalStore;
void registerLocalStore() {
StoreImplementations::add<FallbackLocalStore, LocalStoreConfig>();
}
}

View file

@ -5,6 +5,7 @@
#include "signals.hh"
#include "platform/linux.hh"
#include "regex.hh"
#include "strings.hh"
#include <grp.h>
#include <regex>
@ -25,7 +26,9 @@ namespace {
constexpr const std::string_view nativeSystem = SYSTEM;
}
static RegisterStoreImplementation<LinuxLocalStore, LocalStoreConfig> regLocalStore;
void registerLocalStore() {
StoreImplementations::add<LinuxLocalStore, LocalStoreConfig>();
}
static void readProcLink(const std::string & file, UncheckedRoots & roots)
{

Some files were not shown because too many files have changed in this diff.