forked from lix-project/lix
Merge branch 'path-info' into ca-drv-exotic
commit ff2a8ccfe1

@@ -110,7 +110,7 @@ default, set it to `-`.
 7. A comma-separated list of *mandatory features*. A machine will only
    be used to build a derivation if all of the machine’s mandatory
    features appear in the derivation’s `requiredSystemFeatures`
-   attribute..
+   attribute.

 8. The (base64-encoded) public host key of the remote machine. If omitted, SSH
    will use its regular known-hosts file. Specifically, the field is calculated
@@ -84,7 +84,9 @@ The installer will modify `/etc/bashrc`, and `/etc/zshrc` if they exist.
 The installer will first back up these files with a `.backup-before-nix`
 extension. The installer will also create `/etc/profile.d/nix.sh`.

-You can uninstall Nix with the following commands:
+## Uninstalling
+
+### Linux

 ```console
 sudo rm -rf /etc/profile/nix.sh /etc/nix /nix ~root/.nix-profile ~root/.nix-defexpr ~root/.nix-channels ~/.nix-profile ~/.nix-defexpr ~/.nix-channels

@@ -95,15 +97,95 @@ sudo systemctl stop nix-daemon.service
 sudo systemctl disable nix-daemon.socket
 sudo systemctl disable nix-daemon.service
 sudo systemctl daemon-reload

-# If you are on macOS, you will need to run:
-sudo launchctl unload /Library/LaunchDaemons/org.nixos.nix-daemon.plist
-sudo rm /Library/LaunchDaemons/org.nixos.nix-daemon.plist
 ```

 There may also be references to Nix in `/etc/profile`, `/etc/bashrc`,
 and `/etc/zshrc` which you may remove.

+### macOS
+
+1. Edit `/etc/zshrc` and `/etc/bashrc` to remove the lines sourcing
+   `nix-daemon.sh`, which should look like this:
+
+   ```bash
+   # Nix
+   if [ -e '/nix/var/nix/profiles/default/etc/profile.d/nix-daemon.sh' ]; then
+     . '/nix/var/nix/profiles/default/etc/profile.d/nix-daemon.sh'
+   fi
+   # End Nix
+   ```
+
+   If these files haven't been altered since installing Nix you can simply put
+   the backups back in place:
+
+   ```console
+   sudo mv /etc/zshrc.backup-before-nix /etc/zshrc
+   sudo mv /etc/bashrc.backup-before-nix /etc/bashrc
+   ```
+
+   This will stop shells from sourcing the file and bringing everything you
+   installed using Nix in scope.
+
+2. Stop and remove the Nix daemon services:
+
+   ```console
+   sudo launchctl unload /Library/LaunchDaemons/org.nixos.nix-daemon.plist
+   sudo rm /Library/LaunchDaemons/org.nixos.nix-daemon.plist
+   sudo launchctl unload /Library/LaunchDaemons/org.nixos.darwin-store.plist
+   sudo rm /Library/LaunchDaemons/org.nixos.darwin-store.plist
+   ```
+
+   This stops the Nix daemon and prevents it from being started next time you
+   boot the system.
+
+3. Remove the `nixbld` group and the `_nixbuildN` users:
+
+   ```console
+   sudo dscl . -delete /Groups/nixbld
+   for u in $(sudo dscl . -list /Users | grep _nixbld); do sudo dscl . -delete /Users/$u; done
+   ```
+
+   This will remove all the build users that no longer serve a purpose.
+
+4. Edit fstab using `sudo vifs` to remove the line mounting the Nix Store
+   volume on `/nix`, which looks like this,
+   `LABEL=Nix\040Store /nix apfs rw,nobrowse`. This will prevent automatic
+   mounting of the Nix Store volume.
+
+5. Edit `/etc/synthetic.conf` to remove the `nix` line. If this is the only
+   line in the file you can remove it entirely, `sudo rm /etc/synthetic.conf`.
+   This will prevent the creation of the empty `/nix` directory to provide a
+   mountpoint for the Nix Store volume.
+
+6. Remove the files Nix added to your system:
+
+   ```console
+   sudo rm -rf /etc/nix /var/root/.nix-profile /var/root/.nix-defexpr /var/root/.nix-channels ~/.nix-profile ~/.nix-defexpr ~/.nix-channels
+   ```
+
+   This gets rid of any data Nix may have created except for the store which is
+   removed next.
+
+7. Remove the Nix Store volume:
+
+   ```console
+   sudo diskutil apfs deleteVolume /nix
+   ```
+
+   This will remove the Nix Store volume and everything that was added to the
+   store.
+
+   > **Note**
+   >
+   > After you complete the steps here, you will still have an empty `/nix`
+   > directory. This is an expected sign of a successful uninstall. The empty
+   > `/nix` directory will disappear the next time you reboot.
+   >
+   > You do not have to reboot to finish uninstalling Nix. The uninstall is
+   > complete. macOS (Catalina+) directly controls root directories and its
+   > read-only root will prevent you from manually deleting the empty `/nix`
+   > mountpoint.
+
 # macOS Installation <a name="sect-macos-installation-change-store-prefix"></a><a name="sect-macos-installation-encrypted-volume"></a><a name="sect-macos-installation-symlink"></a><a name="sect-macos-installation-recommended-notes"></a>
 <!-- Note: anchors above to catch permalinks to old explanations -->
@@ -1 +1,16 @@
 # Release X.Y (202?-??-??)
+
+* Various nix commands can now read expressions from stdin with `--file -`.
+
+* `nix store make-content-addressable` has been renamed to `nix store
+  make-content-addressed`.
+
+* New experimental builtin function `builtins.fetchClosure` that
+  copies a closure from a binary cache at evaluation time and rewrites
+  it to content-addressed form (if it isn't already). Like
+  `builtins.storePath`, this allows importing pre-built store paths;
+  the difference is that it doesn't require the user to configure
+  binary caches and trusted public keys.
+
+  This function is only available if you enable the experimental
+  feature `fetch-closure`.
@@ -1,7 +1,8 @@
 ifdef HOST_LINUX

 $(foreach n, nix-daemon.socket nix-daemon.service, $(eval $(call install-file-in, $(d)/$(n), $(prefix)/lib/systemd/system, 0644)))
+$(foreach n, nix-daemon.conf, $(eval $(call install-file-in, $(d)/$(n), $(prefix)/lib/tmpfiles.d, 0644)))

-clean-files += $(d)/nix-daemon.socket $(d)/nix-daemon.service
+clean-files += $(d)/nix-daemon.socket $(d)/nix-daemon.service $(d)/nix-daemon.conf

 endif

misc/systemd/nix-daemon.conf.in (new file, 1 line)
@@ -0,0 +1 @@
+d @localstatedir@/nix/daemon-socket 0755 root root - -
@@ -1,5 +1,6 @@
 [Unit]
 Description=Nix Daemon
+Documentation=man:nix-daemon https://nixos.org/manual
 RequiresMountsFor=@storedir@
 RequiresMountsFor=@localstatedir@
 ConditionPathIsReadWrite=@localstatedir@/nix/daemon-socket
@@ -739,7 +739,7 @@ install_from_extracted_nix() {
     cd "$EXTRACTED_NIX_PATH"

     _sudo "to copy the basic Nix files to the new store at $NIX_ROOT/store" \
-        cp -RLp ./store/* "$NIX_ROOT/store/"
+        cp -RPp ./store/* "$NIX_ROOT/store/"

     _sudo "to make the new store non-writable at $NIX_ROOT/store" \
         chmod -R ugo-w "$NIX_ROOT/store/"
@@ -9,6 +9,8 @@ readonly SERVICE_DEST=/etc/systemd/system/nix-daemon.service
 readonly SOCKET_SRC=/lib/systemd/system/nix-daemon.socket
 readonly SOCKET_DEST=/etc/systemd/system/nix-daemon.socket

+readonly TMPFILES_SRC=/lib/tmpfiles.d/nix-daemon.conf
+readonly TMPFILES_DEST=/etc/tmpfiles.d/nix-daemon.conf
+
 # Path for the systemd override unit file to contain the proxy settings
 readonly SERVICE_OVERRIDE=${SERVICE_DEST}.d/override.conf

@@ -83,6 +85,13 @@ EOF
 poly_configure_nix_daemon_service() {
     if [ -e /run/systemd/system ]; then
         task "Setting up the nix-daemon systemd service"

+        _sudo "to create the nix-daemon tmpfiles config" \
+            ln -sfn /nix/var/nix/profiles/default/$TMPFILES_SRC $TMPFILES_DEST
+
+        _sudo "to run systemd-tmpfiles once to pick that path up" \
+            sytemd-tmpfiles create --prefix=/nix/var/nix
+
         _sudo "to set up the nix-daemon service" \
             systemctl link "/nix/var/nix/profiles/default$SERVICE_SRC"
@@ -300,7 +300,7 @@ connected:

     std::set<Realisation> missingRealisations;
     StorePathSet missingPaths;
-    if (settings.isExperimentalFeatureEnabled(Xp::CaDerivations) && !derivationHasKnownOutputPaths(drv.type())) {
+    if (settings.isExperimentalFeatureEnabled(Xp::CaDerivations) && !drv.type().hasKnownOutputPaths()) {
         for (auto & outputName : wantedOutputs) {
             auto thisOutputHash = outputHashes.at(outputName);
             auto thisOutputId = DrvOutput{ thisOutputHash, outputName };
@@ -204,7 +204,8 @@ Strings editorFor(const Pos & pos)
     if (pos.line > 0 && (
         editor.find("emacs") != std::string::npos ||
         editor.find("nano") != std::string::npos ||
-        editor.find("vim") != std::string::npos))
+        editor.find("vim") != std::string::npos ||
+        editor.find("kak") != std::string::npos))
         args.push_back(fmt("+%d", pos.line));
     args.push_back(pos.file);
     return args;
@@ -134,7 +134,9 @@ SourceExprCommand::SourceExprCommand()
     addFlag({
         .longName = "file",
         .shortName = 'f',
-        .description = "Interpret installables as attribute paths relative to the Nix expression stored in *file*.",
+        .description =
+            "Interpret installables as attribute paths relative to the Nix expression stored in *file*. "
+            "If *file* is the character -, then a Nix expression will be read from standard input.",
         .category = installablesCategory,
         .labels = {"file"},
         .handler = {&file},

@@ -695,7 +697,10 @@ std::vector<std::shared_ptr<Installable>> SourceExprCommand::parseInstallables(
         auto state = getEvalState();
         auto vFile = state->allocValue();

-        if (file)
+        if (file == "-") {
+            auto e = state->parseStdin();
+            state->eval(e, *vFile);
+        } else if (file)
             state->evalFile(lookupFileArg(*state, *file), *vFile);
         else {
             auto e = state->parseExprFromString(*expr, absPath("."));
@@ -21,6 +21,8 @@ struct AttrDb
 {
     std::atomic_bool failed{false};

+    const Store & cfg;
+
     struct State
     {
         SQLite db;

@@ -33,8 +35,9 @@ struct AttrDb

     std::unique_ptr<Sync<State>> _state;

-    AttrDb(const Hash & fingerprint)
-        : _state(std::make_unique<Sync<State>>())
+    AttrDb(const Store & cfg, const Hash & fingerprint)
+        : cfg(cfg)
+        , _state(std::make_unique<Sync<State>>())
     {
         auto state(_state->lock());

@@ -254,10 +257,10 @@ struct AttrDb
             return {{rowId, attrs}};
         }
         case AttrType::String: {
-            std::vector<std::pair<Path, std::string>> context;
+            NixStringContext context;
             if (!queryAttribute.isNull(3))
                 for (auto & s : tokenizeString<std::vector<std::string>>(queryAttribute.getStr(3), ";"))
-                    context.push_back(decodeContext(s));
+                    context.push_back(decodeContext(cfg, s));
             return {{rowId, string_t{queryAttribute.getStr(2), context}}};
         }
         case AttrType::Bool:

@@ -274,10 +277,10 @@ struct AttrDb
     }
 };

-static std::shared_ptr<AttrDb> makeAttrDb(const Hash & fingerprint)
+static std::shared_ptr<AttrDb> makeAttrDb(const Store & cfg, const Hash & fingerprint)
 {
     try {
-        return std::make_shared<AttrDb>(fingerprint);
+        return std::make_shared<AttrDb>(cfg, fingerprint);
     } catch (SQLiteError &) {
         ignoreException();
         return nullptr;

@@ -288,7 +291,7 @@ EvalCache::EvalCache(
     std::optional<std::reference_wrapper<const Hash>> useCache,
     EvalState & state,
     RootLoader rootLoader)
-    : db(useCache ? makeAttrDb(*useCache) : nullptr)
+    : db(useCache ? makeAttrDb(*state.store, *useCache) : nullptr)
     , state(state)
     , rootLoader(rootLoader)
 {

@@ -546,7 +549,7 @@ string_t AttrCursor::getStringWithContext()
         if (auto s = std::get_if<string_t>(&cachedValue->second)) {
             bool valid = true;
             for (auto & c : s->second) {
-                if (!root->state.store->isValidPath(root->state.store->parseStorePath(c.first))) {
+                if (!root->state.store->isValidPath(c.first)) {
                     valid = false;
                     break;
                 }

@@ -563,7 +566,7 @@ string_t AttrCursor::getStringWithContext()
     auto & v = forceValue();

     if (v.type() == nString)
-        return {v.string.s, v.getContext()};
+        return {v.string.s, v.getContext(*root->state.store)};
     else if (v.type() == nPath)
         return {v.path, {}};
     else
@@ -52,7 +52,7 @@ struct misc_t {};
 struct failed_t {};
 typedef uint64_t AttrId;
 typedef std::pair<AttrId, Symbol> AttrKey;
-typedef std::pair<std::string, std::vector<std::pair<Path, std::string>>> string_t;
+typedef std::pair<std::string, NixStringContext> string_t;

 typedef std::variant<
     std::vector<Symbol>,
@@ -24,6 +24,81 @@ LocalNoInlineNoReturn(void throwTypeError(const Pos & pos, const char * s, const
 }


+/* Note: Various places expect the allocated memory to be zeroed. */
+[[gnu::always_inline]]
+inline void * allocBytes(size_t n)
+{
+    void * p;
+#if HAVE_BOEHMGC
+    p = GC_MALLOC(n);
+#else
+    p = calloc(n, 1);
+#endif
+    if (!p) throw std::bad_alloc();
+    return p;
+}
+
+
+[[gnu::always_inline]]
+Value * EvalState::allocValue()
+{
+#if HAVE_BOEHMGC
+    /* We use the boehm batch allocator to speed up allocations of Values (of which there are many).
+       GC_malloc_many returns a linked list of objects of the given size, where the first word
+       of each object is also the pointer to the next object in the list. This also means that we
+       have to explicitly clear the first word of every object we take. */
+    if (!*valueAllocCache) {
+        *valueAllocCache = GC_malloc_many(sizeof(Value));
+        if (!*valueAllocCache) throw std::bad_alloc();
+    }
+
+    /* GC_NEXT is a convenience macro for accessing the first word of an object.
+       Take the first list item, advance the list to the next item, and clear the next pointer. */
+    void * p = *valueAllocCache;
+    *valueAllocCache = GC_NEXT(p);
+    GC_NEXT(p) = nullptr;
+#else
+    void * p = allocBytes(sizeof(Value));
+#endif
+
+    nrValues++;
+    return (Value *) p;
+}
+
+
+[[gnu::always_inline]]
+Env & EvalState::allocEnv(size_t size)
+{
+    nrEnvs++;
+    nrValuesInEnvs += size;
+
+    Env * env;
+
+#if HAVE_BOEHMGC
+    if (size == 1) {
+        /* see allocValue for explanations. */
+        if (!*env1AllocCache) {
+            *env1AllocCache = GC_malloc_many(sizeof(Env) + sizeof(Value *));
+            if (!*env1AllocCache) throw std::bad_alloc();
+        }
+
+        void * p = *env1AllocCache;
+        *env1AllocCache = GC_NEXT(p);
+        GC_NEXT(p) = nullptr;
+        env = (Env *) p;
+    } else
+#endif
+        env = (Env *) allocBytes(sizeof(Env) + size * sizeof(Value *));
+
+    env->type = Env::Plain;
+
+    /* We assume that env->values has been cleared by the allocator; maybeThunk() and lookupVar fromWith expect this. */
+
+    return *env;
+}
+
+
+[[gnu::always_inline]]
 void EvalState::forceValue(Value & v, const Pos & pos)
 {
     forceValue(v, [&]() { return pos; });

@@ -52,6 +127,7 @@ void EvalState::forceValue(Value & v, Callable getPos)
 }


+[[gnu::always_inline]]
 inline void EvalState::forceAttrs(Value & v, const Pos & pos)
 {
     forceAttrs(v, [&]() { return pos; });

@@ -59,6 +135,7 @@ inline void EvalState::forceAttrs(Value & v, const Pos & pos)


 template <typename Callable>
+[[gnu::always_inline]]
 inline void EvalState::forceAttrs(Value & v, Callable getPos)
 {
     forceValue(v, getPos);

@@ -67,6 +144,7 @@ inline void EvalState::forceAttrs(Value & v, Callable getPos)
 }


+[[gnu::always_inline]]
 inline void EvalState::forceList(Value & v, const Pos & pos)
 {
     forceValue(v, pos);

@@ -74,18 +152,5 @@ inline void EvalState::forceList(Value & v, const Pos & pos)
         throwTypeError(pos, "value is %1% while a list was expected", v);
 }

-/* Note: Various places expect the allocated memory to be zeroed. */
-inline void * allocBytes(size_t n)
-{
-    void * p;
-#if HAVE_BOEHMGC
-    p = GC_MALLOC(n);
-#else
-    p = calloc(n, 1);
-#endif
-    if (!p) throw std::bad_alloc();
-    return p;
-}
-
 }
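The comments in the new inline `allocValue`/`allocEnv` above describe the Boehm `GC_malloc_many` protocol: the allocator hands back a linked list of equally sized blocks whose first word points at the next block, and the consumer pops one block at a time and clears that link so the memory it returns is fully zeroed. The following standalone sketch is not Nix code — it uses plain `malloc` in place of the Boehm GC and a hypothetical `FreeListCache` type — but it illustrates the same pop-and-clear discipline:

```cpp
#include <cstdlib>
#include <cstring>
#include <new>

// Illustrative freelist cache: each free block's first word points at the
// next free block, mimicking the list that GC_malloc_many returns.
struct FreeListCache {
    void * head = nullptr;
    size_t blockSize;

    explicit FreeListCache(size_t blockSize) : blockSize(blockSize) {}

    // Refill the cache with a small batch of zeroed blocks, linked through
    // their first word (the real code gets this list from GC_malloc_many).
    void refill(size_t count) {
        for (size_t i = 0; i < count; ++i) {
            void * p = std::malloc(blockSize);
            if (!p) throw std::bad_alloc();
            std::memset(p, 0, blockSize);
            *reinterpret_cast<void **>(p) = head;  // link to previous head
            head = p;
        }
    }

    // Pop one block: advance the list, then clear the link word so the
    // caller sees fully zeroed memory (various places rely on that).
    void * pop() {
        if (!head) refill(16);
        void * p = head;
        head = *reinterpret_cast<void **>(p);      // like *cache = GC_NEXT(p)
        *reinterpret_cast<void **>(p) = nullptr;   // like GC_NEXT(p) = nullptr
        return p;
    }
};
```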
@@ -96,20 +96,20 @@ RootValue allocRootValue(Value * v)
 }


-void printValue(std::ostream & str, std::set<const void *> & seen, const Value & v)
+void Value::print(std::ostream & str, std::set<const void *> * seen) const
 {
     checkInterrupt();

-    switch (v.internalType) {
+    switch (internalType) {
     case tInt:
-        str << v.integer;
+        str << integer;
         break;
     case tBool:
-        str << (v.boolean ? "true" : "false");
+        str << (boolean ? "true" : "false");
         break;
     case tString:
         str << "\"";
-        for (const char * i = v.string.s; *i; i++)
+        for (const char * i = string.s; *i; i++)
             if (*i == '\"' || *i == '\\') str << "\\" << *i;
             else if (*i == '\n') str << "\\n";
             else if (*i == '\r') str << "\\r";

@@ -119,19 +119,19 @@ void printValue(std::ostream & str, std::set<const void *> & seen, const Value &
         str << "\"";
         break;
     case tPath:
-        str << v.path; // !!! escaping?
+        str << path; // !!! escaping?
         break;
     case tNull:
         str << "null";
         break;
     case tAttrs: {
-        if (!v.attrs->empty() && !seen.insert(v.attrs).second)
-            str << "<REPEAT>";
+        if (seen && !attrs->empty() && !seen->insert(attrs).second)
+            str << "«repeated»";
         else {
             str << "{ ";
-            for (auto & i : v.attrs->lexicographicOrder()) {
+            for (auto & i : attrs->lexicographicOrder()) {
                 str << i->name << " = ";
-                printValue(str, seen, *i->value);
+                i->value->print(str, seen);
                 str << "; ";
             }
             str << "}";

@@ -141,12 +141,12 @@ void printValue(std::ostream & str, std::set<const void *> & seen, const Value &
     case tList1:
     case tList2:
     case tListN:
-        if (v.listSize() && !seen.insert(v.listElems()).second)
-            str << "<REPEAT>";
+        if (seen && listSize() && !seen->insert(listElems()).second)
+            str << "«repeated»";
         else {
             str << "[ ";
-            for (auto v2 : v.listItems()) {
-                printValue(str, seen, *v2);
+            for (auto v2 : listItems()) {
+                v2->print(str, seen);
                 str << " ";
             }
             str << "]";

@@ -166,10 +166,10 @@ void printValue(std::ostream & str, std::set<const void *> & seen, const Value &
         str << "<PRIMOP-APP>";
         break;
     case tExternal:
-        str << *v.external;
+        str << *external;
         break;
     case tFloat:
-        str << v.fpoint;
+        str << fpoint;
         break;
     default:
         abort();

@@ -177,10 +177,16 @@ void printValue(std::ostream & str, std::set<const void *> & seen, const Value &
 }


-std::ostream & operator << (std::ostream & str, const Value & v)
+void Value::print(std::ostream & str, bool showRepeated) const
 {
     std::set<const void *> seen;
-    printValue(str, seen, v);
+    print(str, showRepeated ? nullptr : &seen);
+}
+
+
+std::ostream & operator << (std::ostream & str, const Value & v)
+{
+    v.print(str, false);
     return str;
 }
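The refactor above turns the free function `printValue` into `Value::print` and makes the cycle-tracking `seen` set a nullable pointer: passing `nullptr` means "print repeated substructures in full", while passing a set makes already-seen attrsets and lists render as `«repeated»`. A rough standalone sketch of that convention, using a hypothetical `Node` type rather than the real `Value` class, looks like this:

```cpp
#include <iostream>
#include <set>
#include <vector>

// Hypothetical recursive structure standing in for nix::Value.
struct Node {
    int id;
    std::vector<const Node *> children;

    // seen == nullptr: always recurse; otherwise mark revisited nodes.
    void print(std::ostream & str, std::set<const void *> * seen) const {
        if (seen && !children.empty() && !seen->insert(this).second) {
            str << "«repeated»";
            return;
        }
        str << "{ " << id;
        for (auto * c : children) { str << " "; c->print(str, seen); }
        str << " }";
    }

    // Convenience overload mirroring Value::print(str, showRepeated).
    void print(std::ostream & str, bool showRepeated) const {
        std::set<const void *> seen;
        print(str, showRepeated ? nullptr : &seen);
    }
};
```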
@@ -449,8 +455,10 @@ EvalState::EvalState(
     , regexCache(makeRegexCache())
 #if HAVE_BOEHMGC
     , valueAllocCache(std::allocate_shared<void *>(traceable_allocator<void *>(), nullptr))
+    , env1AllocCache(std::allocate_shared<void *>(traceable_allocator<void *>(), nullptr))
 #else
     , valueAllocCache(std::make_shared<void *>(nullptr))
+    , env1AllocCache(std::make_shared<void *>(nullptr))
 #endif
     , baseEnv(allocEnv(128))
     , staticBaseEnv(false, 0)

@@ -499,23 +507,6 @@ EvalState::~EvalState()
 }


-void EvalState::requireExperimentalFeatureOnEvaluation(
-    const ExperimentalFeature & feature,
-    const std::string_view fName,
-    const Pos & pos)
-{
-    if (!settings.isExperimentalFeatureEnabled(feature)) {
-        throw EvalError({
-            .msg = hintfmt(
-                "Cannot call '%2%' because experimental Nix feature '%1%' is disabled. You can enable it via '--extra-experimental-features %1%'.",
-                feature,
-                fName
-            ),
-            .errPos = pos
-        });
-    }
-}
-
-
 void EvalState::allowPath(const Path & path)
 {
     if (allowedPaths)
@@ -727,9 +718,18 @@ LocalNoInlineNoReturn(void throwEvalError(const char * s, const std::string & s2
     throw EvalError(s, s2);
 }

+LocalNoInlineNoReturn(void throwEvalError(const Pos & pos, const Suggestions & suggestions, const char * s, const std::string & s2))
+{
+    throw EvalError(ErrorInfo {
+        .msg = hintfmt(s, s2),
+        .errPos = pos,
+        .suggestions = suggestions,
+    });
+}
+
 LocalNoInlineNoReturn(void throwEvalError(const Pos & pos, const char * s, const std::string & s2))
 {
-    throw EvalError({
+    throw EvalError(ErrorInfo {
         .msg = hintfmt(s, s2),
         .errPos = pos
     });

@@ -773,6 +773,16 @@ LocalNoInlineNoReturn(void throwTypeError(const Pos & pos, const char * s, const
     });
 }

+LocalNoInlineNoReturn(void throwTypeError(const Pos & pos, const Suggestions & suggestions, const char * s, const ExprLambda & fun, const Symbol & s2))
+{
+    throw TypeError(ErrorInfo {
+        .msg = hintfmt(s, fun.showNamePos(), s2),
+        .errPos = pos,
+        .suggestions = suggestions,
+    });
+}
+
+
 LocalNoInlineNoReturn(void throwTypeError(const char * s, const Value & v))
 {
     throw TypeError(s, showType(v));
@@ -876,42 +886,6 @@ inline Value * EvalState::lookupVar(Env * env, const ExprVar & var, bool noEval)
 }


-Value * EvalState::allocValue()
-{
-    /* We use the boehm batch allocator to speed up allocations of Values (of which there are many).
-       GC_malloc_many returns a linked list of objects of the given size, where the first word
-       of each object is also the pointer to the next object in the list. This also means that we
-       have to explicitly clear the first word of every object we take. */
-    if (!*valueAllocCache) {
-        *valueAllocCache = GC_malloc_many(sizeof(Value));
-        if (!*valueAllocCache) throw std::bad_alloc();
-    }
-
-    /* GC_NEXT is a convenience macro for accessing the first word of an object.
-       Take the first list item, advance the list to the next item, and clear the next pointer. */
-    void * p = *valueAllocCache;
-    GC_PTR_STORE_AND_DIRTY(&*valueAllocCache, GC_NEXT(p));
-    GC_NEXT(p) = nullptr;
-
-    nrValues++;
-    auto v = (Value *) p;
-    return v;
-}
-
-
-Env & EvalState::allocEnv(size_t size)
-{
-    nrEnvs++;
-    nrValuesInEnvs += size;
-    Env * env = (Env *) allocBytes(sizeof(Env) + size * sizeof(Value *));
-    env->type = Env::Plain;
-
-    /* We assume that env->values has been cleared by the allocator; maybeThunk() and lookupVar fromWith expect this. */
-
-    return *env;
-}
-
-
 void EvalState::mkList(Value & v, size_t size)
 {
     v.mkList(size);
@@ -1281,8 +1255,15 @@ void ExprSelect::eval(EvalState & state, Env & env, Value & v)
                 }
             } else {
                 state.forceAttrs(*vAttrs, pos);
-                if ((j = vAttrs->attrs->find(name)) == vAttrs->attrs->end())
-                    throwEvalError(pos, "attribute '%1%' missing", name);
+                if ((j = vAttrs->attrs->find(name)) == vAttrs->attrs->end()) {
+                    std::set<std::string> allAttrNames;
+                    for (auto & attr : *vAttrs->attrs)
+                        allAttrNames.insert(attr.name);
+                    throwEvalError(
+                        pos,
+                        Suggestions::bestMatches(allAttrNames, name),
+                        "attribute '%1%' missing", name);
+                }
             }
             vAttrs = j->value;
             pos2 = j->pos;

@@ -1398,8 +1379,17 @@ void EvalState::callFunction(Value & fun, size_t nrArgs, Value * * args, Value &
                 /* Nope, so show the first unexpected argument to the
                    user. */
                 for (auto & i : *args[0]->attrs)
-                    if (!lambda.formals->has(i.name))
-                        throwTypeError(pos, "%1% called with unexpected argument '%2%'", lambda, i.name);
+                    if (!lambda.formals->has(i.name)) {
+                        std::set<std::string> formalNames;
+                        for (auto & formal : lambda.formals->formals)
+                            formalNames.insert(formal.name);
+                        throwTypeError(
+                            pos,
+                            Suggestions::bestMatches(formalNames, i.name),
+                            "%1% called with unexpected argument '%2%'",
+                            lambda,
+                            i.name);
+                    }
                 abort(); // can't happen
             }
         }
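Both hunks above collect the candidate names (`allAttrNames`, `formalNames`) and hand them to `Suggestions::bestMatches` so that the "attribute missing" and "unexpected argument" errors can propose close spellings. The ranking itself is not shown in this diff; a minimal illustrative ranking by edit distance — an assumption about the general idea, not the actual `Suggestions` implementation — could look like:

```cpp
#include <algorithm>
#include <set>
#include <string>
#include <utility>
#include <vector>

// Classic two-row dynamic-programming edit distance.
static size_t editDistance(const std::string & a, const std::string & b) {
    std::vector<size_t> prev(b.size() + 1), cur(b.size() + 1);
    for (size_t j = 0; j <= b.size(); ++j) prev[j] = j;
    for (size_t i = 1; i <= a.size(); ++i) {
        cur[0] = i;
        for (size_t j = 1; j <= b.size(); ++j)
            cur[j] = std::min({ prev[j] + 1, cur[j - 1] + 1,
                                prev[j - 1] + (a[i - 1] == b[j - 1] ? 0 : 1) });
        std::swap(prev, cur);
    }
    return prev[b.size()];
}

// Return up to `limit` candidate names closest to `query`.
static std::vector<std::string> bestMatchesSketch(
    const std::set<std::string> & names, const std::string & query, size_t limit = 3)
{
    std::vector<std::pair<size_t, std::string>> scored;
    for (auto & n : names) scored.push_back({ editDistance(query, n), n });
    std::sort(scored.begin(), scored.end());
    std::vector<std::string> res;
    for (auto & [d, n] : scored)
        if (res.size() < limit) res.push_back(n);
    return res;
}
```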
@@ -1902,13 +1892,22 @@ std::string_view EvalState::forceString(Value & v, const Pos & pos)

 /* Decode a context string ‘!<name>!<path>’ into a pair <path,
    name>. */
-std::pair<std::string, std::string> decodeContext(std::string_view s)
+NixStringContextElem decodeContext(const Store & store, std::string_view s)
 {
     if (s.at(0) == '!') {
         size_t index = s.find("!", 1);
-        return {std::string(s.substr(index + 1)), std::string(s.substr(1, index - 1))};
+        return {
+            store.parseStorePath(s.substr(index + 1)),
+            std::string(s.substr(1, index - 1)),
+        };
     } else
-        return {s.at(0) == '/' ? std::string(s) : std::string(s.substr(1)), ""};
+        return {
+            store.parseStorePath(
+                s.at(0) == '/'
+                ? s
+                : s.substr(1)),
+            "",
+        };
 }

@@ -1920,13 +1919,13 @@ void copyContext(const Value & v, PathSet & context)
 }


-std::vector<std::pair<Path, std::string>> Value::getContext()
+NixStringContext Value::getContext(const Store & store)
 {
-    std::vector<std::pair<Path, std::string>> res;
+    NixStringContext res;
     assert(internalType == tString);
     if (string.context)
         for (const char * * p = string.context; *p; ++p)
-            res.push_back(decodeContext(store, *p));
+            res.push_back(decodeContext(store, *p));
     return res;
 }
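The string-context encoding handled here is the one described in the source comment: `!<name>!<path>` names a single output of a derivation, a string starting with `/` is a plain store path, and any other leading marker byte is stripped before the path is parsed. A minimal string-level sketch of that split — a stand-in for the real `decodeContext`, which additionally turns the path part into a `StorePath` via the store — is:

```cpp
#include <string>
#include <string_view>
#include <utility>

// Illustrative split of a context string into (path-ish part, output name),
// mirroring the branches visible in decodeContext above. The real function
// also calls store.parseStorePath() on the path part.
static std::pair<std::string, std::string> splitContext(std::string_view s) {
    if (s.at(0) == '!') {
        // "!<name>!<path>": output <name> of the derivation at <path>.
        size_t index = s.find("!", 1);
        return { std::string(s.substr(index + 1)), std::string(s.substr(1, index - 1)) };
    }
    // Either a plain store path, or a path with a single marker byte in front.
    return { std::string(s.at(0) == '/' ? s : s.substr(1)), "" };
}
```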
@@ -133,9 +133,14 @@ private:
     /* Cache used by prim_match(). */
     std::shared_ptr<RegexCache> regexCache;

+#if HAVE_BOEHMGC
     /* Allocation cache for GC'd Value objects. */
     std::shared_ptr<void *> valueAllocCache;

+    /* Allocation cache for size-1 Env objects. */
+    std::shared_ptr<void *> env1AllocCache;
+#endif
+
 public:

     EvalState(

@@ -144,12 +149,6 @@ public:
         std::shared_ptr<Store> buildStore = nullptr);
     ~EvalState();

-    void requireExperimentalFeatureOnEvaluation(
-        const ExperimentalFeature &,
-        const std::string_view fName,
-        const Pos & pos
-    );
-
     void addToSearchPath(const std::string & s);

     SearchPath getSearchPath() { return searchPath; }

@@ -347,8 +346,8 @@ public:
     void autoCallFunction(Bindings & args, Value & fun, Value & res);

     /* Allocation primitives. */
-    Value * allocValue();
-    Env & allocEnv(size_t size);
+    inline Value * allocValue();
+    inline Env & allocEnv(size_t size);

     Value * allocAttr(Value & vAttrs, const Symbol & name);
     Value * allocAttr(Value & vAttrs, std::string_view name);

@@ -425,7 +424,7 @@ std::string showType(const Value & v);

 /* Decode a context string ‘!<name>!<path>’ into a pair <path,
    name>. */
-std::pair<std::string, std::string> decodeContext(std::string_view s);
+NixStringContextElem decodeContext(const Store & store, std::string_view s);

 /* If `path' refers to a directory, then append "/default.nix". */
 Path resolveExprPath(Path path);

@@ -509,3 +508,5 @@ extern EvalSettings evalSettings;
 static const std::string corepkgsPrefix{"/__corepkgs__/"};

 }
+
+#include "eval-inline.hh"
@@ -706,8 +706,6 @@ void callFlake(EvalState & state,

 static void prim_getFlake(EvalState & state, const Pos & pos, Value * * args, Value & v)
 {
-    state.requireExperimentalFeatureOnEvaluation(Xp::Flakes, "builtins.getFlake", pos);
-
     std::string flakeRefS(state.forceStringNoCtx(*args[0], pos));
     auto flakeRef = parseFlakeRef(flakeRefS, {}, true);
     if (evalSettings.pureEval && !flakeRef.input.isLocked())

@@ -723,7 +721,30 @@ static void prim_getFlake(EvalState & state, const Pos & pos, Value * * args, Va
         v);
 }

-static RegisterPrimOp r2("__getFlake", 1, prim_getFlake);
+static RegisterPrimOp r2({
+    .name = "__getFlake",
+    .args = {"args"},
+    .doc = R"(
+      Fetch a flake from a flake reference, and return its output attributes and some metadata. For example:
+
+      ```nix
+      (builtins.getFlake "nix/55bc52401966fbffa525c574c14f67b00bc4fb3a").packages.x86_64-linux.nix
+      ```
+
+      Unless impure evaluation is allowed (`--impure`), the flake reference
+      must be "locked", e.g. contain a Git revision or content hash. An
+      example of an unlocked usage is:
+
+      ```nix
+      (builtins.getFlake "github:edolstra/dwarffs").rev
+      ```
+
+      This function is only available if you enable the experimental feature
+      `flakes`.
+    )",
+    .fun = prim_getFlake,
+    .experimentalFeature = Xp::Flakes,
+});

 }
@@ -1,6 +1,7 @@
 #include "get-drvs.hh"
 #include "util.hh"
 #include "eval-inline.hh"
+#include "derivations.hh"
 #include "store-api.hh"
 #include "path-with-outputs.hh"

@@ -102,7 +103,7 @@ StorePath DrvInfo::queryOutPath() const
 }


-DrvInfo::Outputs DrvInfo::queryOutputs(bool onlyOutputsToInstall)
+DrvInfo::Outputs DrvInfo::queryOutputs(bool withPaths, bool onlyOutputsToInstall)
 {
     if (outputs.empty()) {
         /* Get the ‘outputs’ list. */

@@ -112,20 +113,24 @@ DrvInfo::Outputs DrvInfo::queryOutputs(bool onlyOutputsToInstall)

         /* For each output... */
         for (auto elem : i->value->listItems()) {
-            /* Evaluate the corresponding set. */
-            std::string name(state->forceStringNoCtx(*elem, *i->pos));
-            Bindings::iterator out = attrs->find(state->symbols.create(name));
-            if (out == attrs->end()) continue; // FIXME: throw error?
-            state->forceAttrs(*out->value, *i->pos);
-
-            /* And evaluate its ‘outPath’ attribute. */
-            Bindings::iterator outPath = out->value->attrs->find(state->sOutPath);
-            if (outPath == out->value->attrs->end()) continue; // FIXME: throw error?
-            PathSet context;
-            outputs.emplace(name, state->coerceToStorePath(*outPath->pos, *outPath->value, context));
+            std::string output(state->forceStringNoCtx(*elem, *i->pos));
+
+            if (withPaths) {
+                /* Evaluate the corresponding set. */
+                Bindings::iterator out = attrs->find(state->symbols.create(output));
+                if (out == attrs->end()) continue; // FIXME: throw error?
+                state->forceAttrs(*out->value, *i->pos);
+
+                /* And evaluate its ‘outPath’ attribute. */
+                Bindings::iterator outPath = out->value->attrs->find(state->sOutPath);
+                if (outPath == out->value->attrs->end()) continue; // FIXME: throw error?
+                PathSet context;
+                outputs.emplace(output, state->coerceToStorePath(*outPath->pos, *outPath->value, context));
+            } else
+                outputs.emplace(output, std::nullopt);
         }
     } else
-        outputs.emplace("out", queryOutPath());
+        outputs.emplace("out", withPaths ? std::optional{queryOutPath()} : std::nullopt);
     }
     if (!onlyOutputsToInstall || !attrs)
         return outputs;

@@ -13,7 +13,7 @@ namespace nix {
 struct DrvInfo
 {
 public:
-    typedef std::map<std::string, StorePath> Outputs;
+    typedef std::map<std::string, std::optional<StorePath>> Outputs;

 private:
     EvalState * state;

@@ -46,8 +46,9 @@ public:
     StorePath requireDrvPath() const;
     StorePath queryOutPath() const;
     std::string queryOutputName() const;
-    /** Return the list of outputs. The "outputs to install" are determined by `meta.outputsToInstall`. */
-    Outputs queryOutputs(bool onlyOutputsToInstall = false);
+    /** Return the unordered map of output names to (optional) output paths.
+     * The "outputs to install" are determined by `meta.outputsToInstall`. */
+    Outputs queryOutputs(bool withPaths = true, bool onlyOutputsToInstall = false);

     StringSet queryMetaNames();
     Value * queryMeta(const std::string & name);
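With `Outputs` now mapping each output name to `std::optional<StorePath>`, callers that pass `withPaths = false` get the output names without forcing the `outPath` attributes, and must handle the empty case explicitly. A rough sketch of the consuming side — a hypothetical helper with a stand-in path type, not code from this diff — is:

```cpp
#include <iostream>
#include <map>
#include <optional>
#include <string>

// Stand-in for nix::StorePath for the purpose of this sketch.
using StorePathLike = std::string;
using Outputs = std::map<std::string, std::optional<StorePathLike>>;

// Print each output; outputs queried without paths show up as "<unknown>".
static void listOutputs(const Outputs & outputs) {
    for (auto & [name, maybePath] : outputs)
        std::cout << name << " -> " << (maybePath ? *maybePath : "<unknown>") << "\n";
}

int main() {
    Outputs outs = {
        { "out", std::optional<StorePathLike>{"/nix/store/example-hello"} },  // illustrative path
        { "dev", std::nullopt },  // queried with withPaths = false
    };
    listOutputs(outs);
}
```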
@@ -28,6 +28,13 @@ using namespace nix;

 namespace nix {

+static inline Pos makeCurPos(const YYLTYPE & loc, ParseData * data)
+{
+    return Pos(data->origin, data->file, loc.first_line, loc.first_column);
+}
+
+#define CUR_POS makeCurPos(*yylloc, data)
+
 // backup to recover from yyless(0)
 YYLTYPE prev_yylloc;

@@ -37,7 +44,6 @@ static void initLoc(YYLTYPE * loc)
     loc->first_column = loc->last_column = 1;
 }

-
 static void adjustLoc(YYLTYPE * loc, const char * s, size_t len)
 {
     prev_yylloc = *loc;

@@ -147,14 +153,20 @@ or { return OR_KW; }
             try {
                 yylval->n = boost::lexical_cast<int64_t>(yytext);
             } catch (const boost::bad_lexical_cast &) {
-                throw ParseError("invalid integer '%1%'", yytext);
+                throw ParseError({
+                    .msg = hintfmt("invalid integer '%1%'", yytext),
+                    .errPos = CUR_POS,
+                });
             }
             return INT;
           }
 {FLOAT}    { errno = 0;
              yylval->nf = strtod(yytext, 0);
              if (errno != 0)
-                throw ParseError("invalid float '%1%'", yytext);
+                throw ParseError({
+                    .msg = hintfmt("invalid float '%1%'", yytext),
+                    .errPos = CUR_POS,
+                });
              return FLOAT;
            }

@@ -280,7 +292,10 @@ or { return OR_KW; }

 <INPATH_SLASH>{ANY} |
 <INPATH_SLASH><<EOF>> {
-  throw ParseError("path has a trailing slash");
+  throw ParseError({
+      .msg = hintfmt("path has a trailing slash"),
+      .errPos = CUR_POS,
+  });
 }

 {SPATH}    { yylval->path = {yytext, (size_t) yyleng}; return SPATH; }
@@ -23,14 +23,13 @@ MakeError(RestrictedPathError, Error);

 struct Pos
 {
-    FileOrigin origin;
     Symbol file;
-    unsigned int line, column;
-
-    Pos() : origin(foString), line(0), column(0) { }
-    Pos(FileOrigin origin, const Symbol & file, unsigned int line, unsigned int column)
-        : origin(origin), file(file), line(line), column(column) { }
+    uint32_t line;
+    FileOrigin origin:2;
+    uint32_t column:30;
+
+    Pos() : line(0), origin(foString), column(0) { };
+    Pos(FileOrigin origin, const Symbol & file, uint32_t line, uint32_t column)
+        : file(file), line(line), origin(origin), column(column) { };
     operator bool() const
     {
         return line != 0;
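The new `Pos` layout packs the 2-bit `FileOrigin` tag and a 30-bit column into the same 32-bit word that previously held a full `unsigned int` column, shrinking every position record. A standalone sketch of the same packing trick follows; the member names mirror the diff, the `Symbol` member is omitted, and the exact sizes are an assumption about a typical Itanium-ABI compiler rather than a guarantee:

```cpp
#include <cstdint>

enum FileOrigin { foFile, foStdin, foString };

// Old-style layout: enum plus two full-width unsigned ints.
struct PosWide {
    FileOrigin origin;
    uint32_t line, column;
};

// New-style layout: line in one word, origin (2 bits) and column (30 bits)
// typically sharing a second 32-bit allocation unit, as in the diff above.
struct PosPacked {
    uint32_t line;
    FileOrigin origin : 2;
    uint32_t column : 30;
};

static_assert(sizeof(PosPacked) <= sizeof(PosWide),
    "packing the origin tag into the column word should not grow the struct");
```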
@@ -43,8 +43,8 @@ StringMap EvalState::realiseContext(const PathSet & context)
     StringMap res;

     for (auto & i : context) {
-        auto [ctxS, outputName] = decodeContext(i);
-        auto ctx = store->parseStorePath(ctxS);
+        auto [ctx, outputName] = decodeContext(*store, i);
+        auto ctxS = store->printStorePath(ctx);
         if (!store->isValidPath(ctx))
             throw InvalidPathError(store->printStorePath(ctx));
         if (!outputName.empty() && ctx.isDerivation()) {

@@ -694,7 +694,32 @@ static void prim_genericClosure(EvalState & state, const Pos & pos, Value * * ar

 static RegisterPrimOp primop_genericClosure(RegisterPrimOp::Info {
     .name = "__genericClosure",
+    .args = {"attrset"},
     .arity = 1,
+    .doc = R"(
+      Take an *attrset* with values named `startSet` and `operator` in order to
+      return a *list of attrsets* by starting with the `startSet`, recursively
+      applying the `operator` function to each element. The *attrsets* in the
+      `startSet` and produced by the `operator` must each contain value named
+      `key` which are comparable to each other. The result is produced by
+      repeatedly calling the operator for each element encountered with a
+      unique key, terminating when no new elements are produced. For example,
+
+      ```
+      builtins.genericClosure {
+        startSet = [ {key = 5;} ];
+        operator = item: [{
+          key = if (item.key / 2 ) * 2 == item.key
+                then item.key / 2
+                else 3 * item.key + 1;
+        }];
+      }
+      ```
+      evaluates to
+      ```
+      [ { key = 5; } { key = 16; } { key = 8; } { key = 4; } { key = 2; } { key = 1; } ]
+      ```
+    )",
     .fun = prim_genericClosure,
 });

@@ -1114,7 +1139,7 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
                 drv.inputSrcs.insert(j);
                 if (j.isDerivation()) {
                     Derivation jDrv = state.store->readDerivation(j);
-                    if(jDrv.type() != DerivationType::CAFloating)
+                    if(jDrv.type().hasKnownOutputPaths())
                         drv.inputDrvs[j] = jDrv.outputNames();
                 }
             }

@@ -1122,8 +1147,8 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *

             /* Handle derivation outputs of the form ‘!<name>!<path>’. */
             else if (path.at(0) == '!') {
-                auto ctx = decodeContext(path);
-                drv.inputDrvs[state.store->parseStorePath(ctx.first)].insert(ctx.second);
+                auto ctx = decodeContext(*state.store, path);
+                drv.inputDrvs[ctx.first].insert(ctx.second);
             }

             /* Otherwise it's a source file. */

@@ -1171,22 +1196,21 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
                 std::move(h),
                 {});

-            DerivationOutputCAFixed dof { .ca = ca };
+            DerivationOutput::CAFixed dof { .ca = ca };

             drv.env["out"] = state.store->printStorePath(dof.path(*state.store, drvName, "out"));
-            drv.outputs.insert_or_assign("out", DerivationOutput { .output = dof });
+            drv.outputs.insert_or_assign("out", dof);
         }

         else if (contentAddressed) {
             HashType ht = parseHashType(outputHashAlgo);
             for (auto & i : outputs) {
                 drv.env[i] = hashPlaceholder(i);
-                drv.outputs.insert_or_assign(i, DerivationOutput {
-                    .output = DerivationOutputCAFloating {
+                drv.outputs.insert_or_assign(i,
+                    DerivationOutput::CAFloating {
                         .method = ingestionMethod,
                         .hashType = ht,
-                    },
-                });
+                    });
             }
         }

@@ -1200,43 +1224,36 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
             for (auto & i : outputs) {
                 drv.env[i] = "";
                 drv.outputs.insert_or_assign(i,
-                    DerivationOutput {
-                        .output = DerivationOutputInputAddressed {
-                            .path = StorePath::dummy,
-                        },
-                    });
+                    DerivationOutput::Deferred { });
             }

             // Regular, non-CA derivation should always return a single hash and not
             // hash per output.
-            auto hashModulo = hashDerivationModulo(*state.store, Derivation(drv), true);
+            auto hashModulo = hashDerivationModulo(*state.store, drv, true);
             std::visit(overloaded {
-                [&](Hash & h) {
-                    for (auto & i : outputs) {
-                        auto outPath = state.store->makeOutputPath(i, h, drvName);
-                        drv.env[i] = state.store->printStorePath(outPath);
-                        drv.outputs.insert_or_assign(i,
-                            DerivationOutput {
-                                .output = DerivationOutputInputAddressed {
-                                    .path = std::move(outPath),
-                                },
-                            });
+                [&](const DrvHash & drvHash) {
+                    auto & h = drvHash.hash;
+                    switch (drvHash.kind) {
+                    case DrvHash::Kind::Deferred:
+                        /* Outputs already deferred, nothing to do */
+                        break;
+                    case DrvHash::Kind::Regular:
+                        for (auto & [outputName, output] : drv.outputs) {
+                            auto outPath = state.store->makeOutputPath(outputName, h, drvName);
+                            drv.env[outputName] = state.store->printStorePath(outPath);
+                            output = DerivationOutput::InputAddressed {
+                                .path = std::move(outPath),
+                            };
+                        }
+                        break;
                     }
                 },
-                [&](CaOutputHashes &) {
+                [&](const CaOutputHashes &) {
                     // Shouldn't happen as the toplevel derivation is not CA.
                     assert(false);
                 },
-                [&](DeferredHash &) {
-                    for (auto & i : outputs) {
-                        drv.outputs.insert_or_assign(i,
-                            DerivationOutput {
-                                .output = DerivationOutputDeferred{},
-                            });
-                    }
-                },
             },
-            hashModulo);
+            hashModulo.raw());

@@ -1248,12 +1265,9 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *

     /* Optimisation, but required in read-only mode! because in that
        case we don't actually write store derivations, so we can't
-       read them later.
-
-       However, we don't bother doing this for floating CA derivations because
-       their "hash modulo" is indeterminate until built. */
-    if (drv.type() != DerivationType::CAFloating) {
-        auto h = hashDerivationModulo(*state.store, Derivation(drv), false);
+       read them later. */
+    {
+        auto h = hashDerivationModulo(*state.store, drv, false);
         drvHashes.lock()->insert_or_assign(drvPath, h);
     }

@@ -3811,7 +3825,7 @@ RegisterPrimOp::RegisterPrimOp(std::string name, size_t arity, PrimOpFun fun)
         .name = name,
         .args = {},
         .arity = arity,
-        .fun = fun
+        .fun = fun,
     });
 }

@@ -3883,13 +3897,17 @@ void EvalState::createBaseEnv()

     if (RegisterPrimOp::primOps)
         for (auto & primOp : *RegisterPrimOp::primOps)
-            addPrimOp({
-                .fun = primOp.fun,
-                .arity = std::max(primOp.args.size(), primOp.arity),
-                .name = symbols.create(primOp.name),
-                .args = primOp.args,
-                .doc = primOp.doc,
-            });
+            if (!primOp.experimentalFeature
+                || settings.isExperimentalFeatureEnabled(*primOp.experimentalFeature))
+            {
+                addPrimOp({
+                    .fun = primOp.fun,
+                    .arity = std::max(primOp.args.size(), primOp.arity),
+                    .name = symbols.create(primOp.name),
+                    .args = primOp.args,
+                    .doc = primOp.doc,
+                });
+            }

     /* Add a wrapper around the derivation primop that computes the
        `drvPath' and `outPath' attributes lazily. */
@ -16,6 +16,7 @@ struct RegisterPrimOp
|
||||||
size_t arity = 0;
|
size_t arity = 0;
|
||||||
const char * doc;
|
const char * doc;
|
||||||
PrimOpFun fun;
|
PrimOpFun fun;
|
||||||
|
std::optional<ExperimentalFeature> experimentalFeature;
|
||||||
};
|
};
|
||||||
|
|
||||||
typedef std::vector<Info> PrimOps;
|
typedef std::vector<Info> PrimOps;
|
||||||
|
@ -35,6 +36,7 @@ struct RegisterPrimOp
|
||||||
/* These primops are disabled without enableNativeCode, but plugins
|
/* These primops are disabled without enableNativeCode, but plugins
|
||||||
may wish to use them in limited contexts without globally enabling
|
may wish to use them in limited contexts without globally enabling
|
||||||
them. */
|
them. */
|
||||||
|
|
||||||
/* Load a ValueInitializer from a DSO and return whatever it initializes */
|
/* Load a ValueInitializer from a DSO and return whatever it initializes */
|
||||||
void prim_importNative(EvalState & state, const Pos & pos, Value * * args, Value & v);
|
void prim_importNative(EvalState & state, const Pos & pos, Value * * args, Value & v);
|
||||||
|
|
||||||
|
|
|
@ -1,5 +1,6 @@
|
||||||
#include "primops.hh"
|
#include "primops.hh"
|
||||||
#include "eval-inline.hh"
|
#include "eval-inline.hh"
|
||||||
|
#include "derivations.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
|
||||||
namespace nix {
|
namespace nix {
|
||||||
|
@ -82,8 +83,8 @@ static void prim_getContext(EvalState & state, const Pos & pos, Value * * args,
|
||||||
drv = std::string(p, 1);
|
drv = std::string(p, 1);
|
||||||
path = &drv;
|
path = &drv;
|
||||||
} else if (p.at(0) == '!') {
|
} else if (p.at(0) == '!') {
|
||||||
std::pair<std::string, std::string> ctx = decodeContext(p);
|
NixStringContextElem ctx = decodeContext(*state.store, p);
|
||||||
drv = ctx.first;
|
drv = state.store->printStorePath(ctx.first);
|
||||||
output = ctx.second;
|
output = ctx.second;
|
||||||
path = &drv;
|
path = &drv;
|
||||||
}
|
}
|
||||||
|
|
154
src/libexpr/primops/fetchClosure.cc
Normal file
154
src/libexpr/primops/fetchClosure.cc
Normal file
|
@ -0,0 +1,154 @@
|
||||||
|
#include "primops.hh"
|
||||||
|
#include "store-api.hh"
|
||||||
|
#include "make-content-addressed.hh"
|
||||||
|
#include "url.hh"
|
||||||
|
|
||||||
|
namespace nix {
|
||||||
|
|
||||||
|
static void prim_fetchClosure(EvalState & state, const Pos & pos, Value * * args, Value & v)
|
||||||
|
{
|
||||||
|
state.forceAttrs(*args[0], pos);
|
||||||
|
|
||||||
|
std::optional<std::string> fromStoreUrl;
|
||||||
|
std::optional<StorePath> fromPath;
|
||||||
|
bool toCA = false;
|
||||||
|
std::optional<StorePath> toPath;
|
||||||
|
|
||||||
|
for (auto & attr : *args[0]->attrs) {
|
||||||
|
if (attr.name == "fromPath") {
|
||||||
|
PathSet context;
|
||||||
|
fromPath = state.coerceToStorePath(*attr.pos, *attr.value, context);
|
||||||
|
}
|
||||||
|
|
||||||
|
else if (attr.name == "toPath") {
|
||||||
|
state.forceValue(*attr.value, *attr.pos);
|
||||||
|
toCA = true;
|
||||||
|
if (attr.value->type() != nString || attr.value->string.s != std::string("")) {
|
||||||
|
PathSet context;
|
||||||
|
toPath = state.coerceToStorePath(*attr.pos, *attr.value, context);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
else if (attr.name == "fromStore")
|
||||||
|
fromStoreUrl = state.forceStringNoCtx(*attr.value, *attr.pos);
|
||||||
|
|
||||||
|
else
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt("attribute '%s' isn't supported in call to 'fetchClosure'", attr.name),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!fromPath)
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt("attribute '%s' is missing in call to 'fetchClosure'", "fromPath"),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!fromStoreUrl)
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt("attribute '%s' is missing in call to 'fetchClosure'", "fromStore"),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
|
||||||
|
auto parsedURL = parseURL(*fromStoreUrl);
|
||||||
|
|
||||||
|
if (parsedURL.scheme != "http" &&
|
||||||
|
parsedURL.scheme != "https" &&
|
||||||
|
!(getEnv("_NIX_IN_TEST").has_value() && parsedURL.scheme == "file"))
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt("'fetchClosure' only supports http:// and https:// stores"),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
|
||||||
|
auto fromStore = openStore(parsedURL.to_string());
|
||||||
|
|
||||||
|
if (toCA) {
|
||||||
|
if (!toPath || !state.store->isValidPath(*toPath)) {
|
||||||
|
auto remappings = makeContentAddressed(*fromStore, *state.store, { *fromPath });
|
||||||
|
auto i = remappings.find(*fromPath);
|
||||||
|
assert(i != remappings.end());
|
||||||
|
if (toPath && *toPath != i->second)
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt("rewriting '%s' to content-addressed form yielded '%s', while '%s' was expected",
|
||||||
|
state.store->printStorePath(*fromPath),
|
||||||
|
state.store->printStorePath(i->second),
|
||||||
|
state.store->printStorePath(*toPath)),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
if (!toPath)
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt(
|
||||||
|
"rewriting '%s' to content-addressed form yielded '%s'; "
|
||||||
|
"please set this in the 'toPath' attribute passed to 'fetchClosure'",
|
||||||
|
state.store->printStorePath(*fromPath),
|
||||||
|
state.store->printStorePath(i->second)),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
copyClosure(*fromStore, *state.store, RealisedPath::Set { *fromPath });
|
||||||
|
toPath = fromPath;
|
||||||
|
}
|
||||||
|
|
||||||
|
/* In pure mode, require a CA path. */
|
||||||
|
if (evalSettings.pureEval) {
|
||||||
|
auto info = state.store->queryPathInfo(*toPath);
|
||||||
|
if (!info->isContentAddressed(*state.store))
|
||||||
|
throw Error({
|
||||||
|
.msg = hintfmt("in pure mode, 'fetchClosure' requires a content-addressed path, which '%s' isn't",
|
||||||
|
state.store->printStorePath(*toPath)),
|
||||||
|
.errPos = pos
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
auto toPathS = state.store->printStorePath(*toPath);
|
||||||
|
v.mkString(toPathS, {toPathS});
|
||||||
|
}
|
||||||
|
|
||||||
|
static RegisterPrimOp primop_fetchClosure({
|
||||||
|
.name = "__fetchClosure",
|
||||||
|
.args = {"args"},
|
||||||
|
.doc = R"(
|
||||||
|
Fetch a Nix store closure from a binary cache, rewriting it into
|
||||||
|
content-addressed form. For example,
|
||||||
|
|
||||||
|
```nix
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = "https://cache.nixos.org";
|
||||||
|
fromPath = /nix/store/r2jd6ygnmirm2g803mksqqjm4y39yi6i-git-2.33.1;
|
||||||
|
toPath = /nix/store/ldbhlwhh39wha58rm61bkiiwm6j7211j-git-2.33.1;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
fetches `/nix/store/r2jd...` from the specified binary cache,
|
||||||
|
and rewrites it into the content-addressed store path
|
||||||
|
`/nix/store/ldbh...`.
|
||||||
|
|
||||||
|
If `fromPath` is already content-addressed, or if you are
|
||||||
|
allowing impure evaluation (`--impure`), then `toPath` may be
|
||||||
|
omitted.
|
||||||
|
|
||||||
|
To find out the correct value for `toPath` given a `fromPath`,
|
||||||
|
you can use `nix store make-content-addressed`:
|
||||||
|
|
||||||
|
```console
|
||||||
|
# nix store make-content-addressed --from https://cache.nixos.org /nix/store/r2jd6ygnmirm2g803mksqqjm4y39yi6i-git-2.33.1
|
||||||
|
rewrote '/nix/store/r2jd6ygnmirm2g803mksqqjm4y39yi6i-git-2.33.1' to '/nix/store/ldbhlwhh39wha58rm61bkiiwm6j7211j-git-2.33.1'
|
||||||
|
```
|
||||||
|
|
||||||
|
This function is similar to `builtins.storePath` in that it
|
||||||
|
allows you to use a previously built store path in a Nix
|
||||||
|
expression. However, it is more reproducible because it requires
|
||||||
|
specifying a binary cache from which the path can be fetched.
|
||||||
|
Also, requiring a content-addressed final store path avoids the
|
||||||
|
need for users to configure binary cache public keys.
|
||||||
|
|
||||||
|
This function is only available if you enable the experimental
|
||||||
|
feature `fetch-closure`.
|
||||||
|
)",
|
||||||
|
.fun = prim_fetchClosure,
|
||||||
|
.experimentalFeature = Xp::FetchClosure,
|
||||||
|
});
|
||||||
|
|
||||||
|
}
|
|
@ -145,7 +145,7 @@ static void fetchTree(
|
||||||
if (!params.allowNameArgument)
|
if (!params.allowNameArgument)
|
||||||
if (auto nameIter = attrs.find("name"); nameIter != attrs.end())
|
if (auto nameIter = attrs.find("name"); nameIter != attrs.end())
|
||||||
throw Error({
|
throw Error({
|
||||||
.msg = hintfmt("attribute 'name' isn’t supported in call to 'fetchTree'"),
|
.msg = hintfmt("attribute 'name' isn't supported in call to 'fetchTree'"),
|
||||||
.errPos = pos
|
.errPos = pos
|
||||||
});
|
});
|
||||||
|
|
||||||
|
@ -334,7 +334,7 @@ static RegisterPrimOp primop_fetchTarball({
|
||||||
.fun = prim_fetchTarball,
|
.fun = prim_fetchTarball,
|
||||||
});
|
});
|
||||||
|
|
||||||
static void prim_fetchGit(EvalState &state, const Pos &pos, Value **args, Value &v)
|
static void prim_fetchGit(EvalState & state, const Pos & pos, Value * * args, Value & v)
|
||||||
{
|
{
|
||||||
fetchTree(state, pos, args, v, "git", FetchTreeParams { .emptyRevFallback = true, .allowNameArgument = true });
|
fetchTree(state, pos, args, v, "git", FetchTreeParams { .emptyRevFallback = true, .allowNameArgument = true });
|
||||||
}
|
}
|
||||||
|
|
|
@ -57,6 +57,8 @@ struct ExprLambda;
|
||||||
struct PrimOp;
|
struct PrimOp;
|
||||||
class Symbol;
|
class Symbol;
|
||||||
struct Pos;
|
struct Pos;
|
||||||
|
class StorePath;
|
||||||
|
class Store;
|
||||||
class EvalState;
|
class EvalState;
|
||||||
class XMLWriter;
|
class XMLWriter;
|
||||||
class JSONPlaceholder;
|
class JSONPlaceholder;
|
||||||
|
@ -64,6 +66,8 @@ class JSONPlaceholder;
|
||||||
|
|
||||||
typedef int64_t NixInt;
|
typedef int64_t NixInt;
|
||||||
typedef double NixFloat;
|
typedef double NixFloat;
|
||||||
|
typedef std::pair<StorePath, std::string> NixStringContextElem;
|
||||||
|
typedef std::vector<NixStringContextElem> NixStringContext;
|
||||||
|
|
||||||
/* External values must descend from ExternalValueBase, so that
|
/* External values must descend from ExternalValueBase, so that
|
||||||
* type-agnostic nix functions (e.g. showType) can be implemented
|
* type-agnostic nix functions (e.g. showType) can be implemented
|
||||||
|
@ -115,10 +119,13 @@ private:
|
||||||
InternalType internalType;
|
InternalType internalType;
|
||||||
|
|
||||||
friend std::string showType(const Value & v);
|
friend std::string showType(const Value & v);
|
||||||
friend void printValue(std::ostream & str, std::set<const void *> & seen, const Value & v);
|
|
||||||
|
void print(std::ostream & str, std::set<const void *> * seen) const;
|
||||||
|
|
||||||
public:
|
public:
|
||||||
|
|
||||||
|
void print(std::ostream & str, bool showRepeated = false) const;
|
||||||
|
|
||||||
// Functions needed to distinguish the type
|
// Functions needed to distinguish the type
|
||||||
// These should be removed eventually, by putting the functionality that's
|
// These should be removed eventually, by putting the functionality that's
|
||||||
// needed by callers into methods of this type
|
// needed by callers into methods of this type
|
||||||
|
@ -368,7 +375,7 @@ public:
|
||||||
non-trivial. */
|
non-trivial. */
|
||||||
bool isTrivial() const;
|
bool isTrivial() const;
|
||||||
|
|
||||||
std::vector<std::pair<Path, std::string>> getContext();
|
NixStringContext getContext(const Store &);
|
||||||
|
|
||||||
auto listItems()
|
auto listItems()
|
||||||
{
|
{
|
||||||
|
|
|
@ -222,22 +222,46 @@ struct GitInputScheme : InputScheme
|
||||||
if (!input.getRef() && !input.getRev() && isLocal) {
|
if (!input.getRef() && !input.getRev() && isLocal) {
|
||||||
bool clean = false;
|
bool clean = false;
|
||||||
|
|
||||||
/* Check whether this repo has any commits. There are
|
auto env = getEnv();
|
||||||
probably better ways to do this. */
|
// Set LC_ALL to C: because we rely on the error messages from git rev-parse to determine what went wrong
|
||||||
auto gitDir = actualUrl + "/.git";
|
// that way unknown errors can lead to a failure instead of continuing through the wrong code path
|
||||||
auto commonGitDir = chomp(runProgram(
|
env["LC_ALL"] = "C";
|
||||||
"git",
|
|
||||||
true,
|
|
||||||
{ "-C", actualUrl, "rev-parse", "--git-common-dir" }
|
|
||||||
));
|
|
||||||
if (commonGitDir != ".git")
|
|
||||||
gitDir = commonGitDir;
|
|
||||||
|
|
||||||
bool haveCommits = !readDirectory(gitDir + "/refs/heads").empty();
|
/* Check whether HEAD points to something that looks like a commit,
|
||||||
|
since that is the refrence we want to use later on. */
|
||||||
|
auto result = runProgram(RunOptions {
|
||||||
|
.program = "git",
|
||||||
|
.args = { "-C", actualUrl, "--git-dir=.git", "rev-parse", "--verify", "--no-revs", "HEAD^{commit}" },
|
||||||
|
.environment = env,
|
||||||
|
.mergeStderrToStdout = true
|
||||||
|
});
|
||||||
|
auto exitCode = WEXITSTATUS(result.first);
|
||||||
|
auto errorMessage = result.second;
|
||||||
|
|
||||||
|
if (errorMessage.find("fatal: not a git repository") != std::string::npos) {
|
||||||
|
throw Error("'%s' is not a Git repository", actualUrl);
|
||||||
|
} else if (errorMessage.find("fatal: Needed a single revision") != std::string::npos) {
|
||||||
|
// indicates that the repo does not have any commits
|
||||||
|
// we want to proceed and will consider it dirty later
|
||||||
|
} else if (exitCode != 0) {
|
||||||
|
// any other errors should lead to a failure
|
||||||
|
throw Error("getting the HEAD of the Git tree '%s' failed with exit code %d:\n%s", actualUrl, exitCode, errorMessage);
|
||||||
|
}
|
||||||
|
|
||||||
|
bool hasHead = exitCode == 0;
|
||||||
try {
|
try {
|
||||||
if (haveCommits) {
|
if (hasHead) {
|
||||||
runProgram("git", true, { "-C", actualUrl, "diff-index", "--quiet", "HEAD", "--" });
|
// Using git diff is preferrable over lower-level operations here,
|
||||||
|
// because its conceptually simpler and we only need the exit code anyways.
|
||||||
|
auto gitDiffOpts = Strings({ "-C", actualUrl, "diff", "HEAD", "--quiet"});
|
||||||
|
if (!submodules) {
|
||||||
|
// Changes in submodules should only make the tree dirty
|
||||||
|
// when those submodules will be copied as well.
|
||||||
|
gitDiffOpts.emplace_back("--ignore-submodules");
|
||||||
|
}
|
||||||
|
gitDiffOpts.emplace_back("--");
|
||||||
|
runProgram("git", true, gitDiffOpts);
|
||||||
|
|
||||||
clean = true;
|
clean = true;
|
||||||
}
|
}
|
||||||
} catch (ExecError & e) {
|
} catch (ExecError & e) {
|
||||||
|
@ -282,7 +306,7 @@ struct GitInputScheme : InputScheme
|
||||||
// modified dirty file?
|
// modified dirty file?
|
||||||
input.attrs.insert_or_assign(
|
input.attrs.insert_or_assign(
|
||||||
"lastModified",
|
"lastModified",
|
||||||
haveCommits ? std::stoull(runProgram("git", true, { "-C", actualUrl, "log", "-1", "--format=%ct", "--no-show-signature", "HEAD" })) : 0);
|
hasHead ? std::stoull(runProgram("git", true, { "-C", actualUrl, "log", "-1", "--format=%ct", "--no-show-signature", "HEAD" })) : 0);
|
||||||
|
|
||||||
return {std::move(storePath), input};
|
return {std::move(storePath), input};
|
||||||
}
|
}
|
||||||
|
|
|
@ -390,7 +390,7 @@ struct SourceHutInputScheme : GitArchiveInputScheme
|
||||||
|
|
||||||
ref_uri = line.substr(ref_index+5, line.length()-1);
|
ref_uri = line.substr(ref_index+5, line.length()-1);
|
||||||
} else
|
} else
|
||||||
ref_uri = fmt("refs/heads/%s", ref);
|
ref_uri = fmt("refs/(heads|tags)/%s", ref);
|
||||||
|
|
||||||
auto file = store->toRealPath(
|
auto file = store->toRealPath(
|
||||||
downloadFile(store, fmt("%s/info/refs", base_url), "source", false, headers).storePath);
|
downloadFile(store, fmt("%s/info/refs", base_url), "source", false, headers).storePath);
|
||||||
|
@ -399,9 +399,11 @@ struct SourceHutInputScheme : GitArchiveInputScheme
|
||||||
std::string line;
|
std::string line;
|
||||||
std::string id;
|
std::string id;
|
||||||
while(getline(is, line)) {
|
while(getline(is, line)) {
|
||||||
auto index = line.find(ref_uri);
|
// Append $ to avoid partial name matches
|
||||||
if (index != std::string::npos) {
|
std::regex pattern(fmt("%s$", ref_uri));
|
||||||
id = line.substr(0, index-1);
|
|
||||||
|
if (std::regex_search(line, pattern)) {
|
||||||
|
id = line.substr(0, line.find('\t'));
|
||||||
break;
|
break;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
|
@ -1,5 +1,6 @@
|
||||||
#include "fetchers.hh"
|
#include "fetchers.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
#include "archive.hh"
|
||||||
|
|
||||||
namespace nix::fetchers {
|
namespace nix::fetchers {
|
||||||
|
|
||||||
|
@ -80,8 +81,9 @@ struct PathInputScheme : InputScheme
|
||||||
// nothing to do
|
// nothing to do
|
||||||
}
|
}
|
||||||
|
|
||||||
std::pair<StorePath, Input> fetch(ref<Store> store, const Input & input) override
|
std::pair<StorePath, Input> fetch(ref<Store> store, const Input & _input) override
|
||||||
{
|
{
|
||||||
|
Input input(_input);
|
||||||
std::string absPath;
|
std::string absPath;
|
||||||
auto path = getStrAttr(input.attrs, "path");
|
auto path = getStrAttr(input.attrs, "path");
|
||||||
|
|
||||||
|
@ -111,9 +113,15 @@ struct PathInputScheme : InputScheme
|
||||||
if (storePath)
|
if (storePath)
|
||||||
store->addTempRoot(*storePath);
|
store->addTempRoot(*storePath);
|
||||||
|
|
||||||
if (!storePath || storePath->name() != "source" || !store->isValidPath(*storePath))
|
time_t mtime = 0;
|
||||||
|
if (!storePath || storePath->name() != "source" || !store->isValidPath(*storePath)) {
|
||||||
// FIXME: try to substitute storePath.
|
// FIXME: try to substitute storePath.
|
||||||
storePath = store->addToStore("source", absPath);
|
auto src = sinkToSource([&](Sink & sink) {
|
||||||
|
mtime = dumpPathAndGetMtime(absPath, sink, defaultPathFilter);
|
||||||
|
});
|
||||||
|
storePath = store->addToStoreFromDump(*src, "source");
|
||||||
|
}
|
||||||
|
input.attrs.insert_or_assign("lastModified", uint64_t(mtime));
|
||||||
|
|
||||||
return {std::move(*storePath), input};
|
return {std::move(*storePath), input};
|
||||||
}
|
}
|
||||||
|
|
|
@ -2,6 +2,7 @@
|
||||||
|
|
||||||
#include "crypto.hh"
|
#include "crypto.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
#include "log-store.hh"
|
||||||
|
|
||||||
#include "pool.hh"
|
#include "pool.hh"
|
||||||
|
|
||||||
|
@ -28,7 +29,9 @@ struct BinaryCacheStoreConfig : virtual StoreConfig
|
||||||
"other than -1 which we reserve to indicate Nix defaults should be used"};
|
"other than -1 which we reserve to indicate Nix defaults should be used"};
|
||||||
};
|
};
|
||||||
|
|
||||||
class BinaryCacheStore : public virtual BinaryCacheStoreConfig, public virtual Store
|
class BinaryCacheStore : public virtual BinaryCacheStoreConfig,
|
||||||
|
public virtual Store,
|
||||||
|
public virtual LogStore
|
||||||
{
|
{
|
||||||
|
|
||||||
private:
|
private:
|
||||||
|
|
|
@ -204,11 +204,9 @@ void DerivationGoal::haveDerivation()
|
||||||
{
|
{
|
||||||
trace("have derivation");
|
trace("have derivation");
|
||||||
|
|
||||||
if (drv->type() == DerivationType::CAFloating)
|
if (!drv->type().hasKnownOutputPaths())
|
||||||
settings.requireExperimentalFeature(Xp::CaDerivations);
|
settings.requireExperimentalFeature(Xp::CaDerivations);
|
||||||
|
|
||||||
retrySubstitution = false;
|
|
||||||
|
|
||||||
for (auto & i : drv->outputsAndOptPaths(worker.store))
|
for (auto & i : drv->outputsAndOptPaths(worker.store))
|
||||||
if (i.second.second)
|
if (i.second.second)
|
||||||
worker.store.addTempRoot(*i.second.second);
|
worker.store.addTempRoot(*i.second.second);
|
||||||
|
@ -311,14 +309,11 @@ void DerivationGoal::outputsSubstitutionTried()
|
||||||
gaveUpOnSubstitution();
|
gaveUpOnSubstitution();
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
/* At least one of the output paths could not be
|
/* At least one of the output paths could not be
|
||||||
produced using a substitute. So we have to build instead. */
|
produced using a substitute. So we have to build instead. */
|
||||||
void DerivationGoal::gaveUpOnSubstitution()
|
void DerivationGoal::gaveUpOnSubstitution()
|
||||||
{
|
{
|
||||||
/* Make sure checkPathValidity() from now on checks all
|
|
||||||
outputs. */
|
|
||||||
wantedOutputs.clear();
|
|
||||||
|
|
||||||
/* The inputs must be built before we can build this goal. */
|
/* The inputs must be built before we can build this goal. */
|
||||||
if (useDerivation)
|
if (useDerivation)
|
||||||
for (auto & i : dynamic_cast<Derivation *>(drv.get())->inputDrvs)
|
for (auto & i : dynamic_cast<Derivation *>(drv.get())->inputDrvs)
|
||||||
|
@ -426,7 +421,8 @@ void DerivationGoal::inputsRealised()
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (retrySubstitution) {
|
if (retrySubstitution && !retriedSubstitution) {
|
||||||
|
retriedSubstitution = true;
|
||||||
haveDerivation();
|
haveDerivation();
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
@ -440,9 +436,28 @@ void DerivationGoal::inputsRealised()
|
||||||
if (useDerivation) {
|
if (useDerivation) {
|
||||||
auto & fullDrv = *dynamic_cast<Derivation *>(drv.get());
|
auto & fullDrv = *dynamic_cast<Derivation *>(drv.get());
|
||||||
|
|
||||||
if (settings.isExperimentalFeatureEnabled(Xp::CaDerivations) &&
|
auto drvType = fullDrv.type();
|
||||||
((!fullDrv.inputDrvs.empty() && derivationIsCA(fullDrv.type()))
|
bool resolveDrv = std::visit(overloaded {
|
||||||
|| fullDrv.type() == DerivationType::DeferredInputAddressed)) {
|
[&](const DerivationType::InputAddressed & ia) {
|
||||||
|
/* must resolve if deferred. */
|
||||||
|
return ia.deferred;
|
||||||
|
},
|
||||||
|
[&](const DerivationType::ContentAddressed & ca) {
|
||||||
|
return !fullDrv.inputDrvs.empty() && (
|
||||||
|
ca.fixed
|
||||||
|
/* Can optionally resolve if fixed, which is good
|
||||||
|
for avoiding unnecessary rebuilds. */
|
||||||
|
? settings.isExperimentalFeatureEnabled(Xp::CaDerivations)
|
||||||
|
/* Must resolve if floating and there are any inputs
|
||||||
|
drvs. */
|
||||||
|
: true);
|
||||||
|
},
|
||||||
|
}, drvType.raw());
|
||||||
|
|
||||||
|
if (resolveDrv)
|
||||||
|
{
|
||||||
|
settings.requireExperimentalFeature(Xp::CaDerivations);
|
||||||
|
|
||||||
/* We are be able to resolve this derivation based on the
|
/* We are be able to resolve this derivation based on the
|
||||||
now-known results of dependencies. If so, we become a stub goal
|
now-known results of dependencies. If so, we become a stub goal
|
||||||
aliasing that resolved derivation goal */
|
aliasing that resolved derivation goal */
|
||||||
|
@ -501,7 +516,7 @@ void DerivationGoal::inputsRealised()
|
||||||
|
|
||||||
/* Don't repeat fixed-output derivations since they're already
|
/* Don't repeat fixed-output derivations since they're already
|
||||||
verified by their output hash.*/
|
verified by their output hash.*/
|
||||||
nrRounds = derivationIsFixed(derivationType) ? 1 : settings.buildRepeat + 1;
|
nrRounds = derivationType.isFixed() ? 1 : settings.buildRepeat + 1;
|
||||||
|
|
||||||
/* Okay, try to build. Note that here we don't wait for a build
|
/* Okay, try to build. Note that here we don't wait for a build
|
||||||
slot to become available, since we don't need one if there is a
|
slot to become available, since we don't need one if there is a
|
||||||
|
@ -908,7 +923,7 @@ void DerivationGoal::buildDone()
|
||||||
st =
|
st =
|
||||||
dynamic_cast<NotDeterministic*>(&e) ? BuildResult::NotDeterministic :
|
dynamic_cast<NotDeterministic*>(&e) ? BuildResult::NotDeterministic :
|
||||||
statusOk(status) ? BuildResult::OutputRejected :
|
statusOk(status) ? BuildResult::OutputRejected :
|
||||||
derivationIsImpure(derivationType) || diskFull ? BuildResult::TransientFailure :
|
derivationType.isImpure() || diskFull ? BuildResult::TransientFailure :
|
||||||
BuildResult::PermanentFailure;
|
BuildResult::PermanentFailure;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -1221,7 +1236,7 @@ void DerivationGoal::flushLine()
|
||||||
|
|
||||||
std::map<std::string, std::optional<StorePath>> DerivationGoal::queryPartialDerivationOutputMap()
|
std::map<std::string, std::optional<StorePath>> DerivationGoal::queryPartialDerivationOutputMap()
|
||||||
{
|
{
|
||||||
if (!useDerivation || drv->type() != DerivationType::CAFloating) {
|
if (!useDerivation || drv->type().hasKnownOutputPaths()) {
|
||||||
std::map<std::string, std::optional<StorePath>> res;
|
std::map<std::string, std::optional<StorePath>> res;
|
||||||
for (auto & [name, output] : drv->outputs)
|
for (auto & [name, output] : drv->outputs)
|
||||||
res.insert_or_assign(name, output.path(worker.store, drv->name, name));
|
res.insert_or_assign(name, output.path(worker.store, drv->name, name));
|
||||||
|
@ -1233,7 +1248,7 @@ std::map<std::string, std::optional<StorePath>> DerivationGoal::queryPartialDeri
|
||||||
|
|
||||||
OutputPathMap DerivationGoal::queryDerivationOutputMap()
|
OutputPathMap DerivationGoal::queryDerivationOutputMap()
|
||||||
{
|
{
|
||||||
if (!useDerivation || drv->type() != DerivationType::CAFloating) {
|
if (!useDerivation || drv->type().hasKnownOutputPaths()) {
|
||||||
OutputPathMap res;
|
OutputPathMap res;
|
||||||
for (auto & [name, output] : drv->outputsAndOptPaths(worker.store))
|
for (auto & [name, output] : drv->outputsAndOptPaths(worker.store))
|
||||||
res.insert_or_assign(name, *output.second);
|
res.insert_or_assign(name, *output.second);
|
||||||
|
|
|
@ -61,8 +61,12 @@ struct DerivationGoal : public Goal
|
||||||
bool needRestart = false;
|
bool needRestart = false;
|
||||||
|
|
||||||
/* Whether to retry substituting the outputs after building the
|
/* Whether to retry substituting the outputs after building the
|
||||||
inputs. */
|
inputs. This is done in case of an incomplete closure. */
|
||||||
bool retrySubstitution;
|
bool retrySubstitution = false;
|
||||||
|
|
||||||
|
/* Whether we've retried substitution, in which case we won't try
|
||||||
|
again. */
|
||||||
|
bool retriedSubstitution = false;
|
||||||
|
|
||||||
/* The derivation stored at drvPath. */
|
/* The derivation stored at drvPath. */
|
||||||
std::unique_ptr<Derivation> drv;
|
std::unique_ptr<Derivation> drv;
|
||||||
|
|
|
@ -28,7 +28,7 @@ void Goal::addWaitee(GoalPtr waitee)
|
||||||
|
|
||||||
void Goal::waiteeDone(GoalPtr waitee, ExitCode result)
|
void Goal::waiteeDone(GoalPtr waitee, ExitCode result)
|
||||||
{
|
{
|
||||||
assert(waitees.find(waitee) != waitees.end());
|
assert(waitees.count(waitee));
|
||||||
waitees.erase(waitee);
|
waitees.erase(waitee);
|
||||||
|
|
||||||
trace(fmt("waitee '%s' done; %d left", waitee->name, waitees.size()));
|
trace(fmt("waitee '%s' done; %d left", waitee->name, waitees.size()));
|
||||||
|
|
|
@ -40,21 +40,21 @@ struct Goal : public std::enable_shared_from_this<Goal>
|
||||||
WeakGoals waiters;
|
WeakGoals waiters;
|
||||||
|
|
||||||
/* Number of goals we are/were waiting for that have failed. */
|
/* Number of goals we are/were waiting for that have failed. */
|
||||||
unsigned int nrFailed;
|
size_t nrFailed = 0;
|
||||||
|
|
||||||
/* Number of substitution goals we are/were waiting for that
|
/* Number of substitution goals we are/were waiting for that
|
||||||
failed because there are no substituters. */
|
failed because there are no substituters. */
|
||||||
unsigned int nrNoSubstituters;
|
size_t nrNoSubstituters = 0;
|
||||||
|
|
||||||
/* Number of substitution goals we are/were waiting for that
|
/* Number of substitution goals we are/were waiting for that
|
||||||
failed because they had unsubstitutable references. */
|
failed because they had unsubstitutable references. */
|
||||||
unsigned int nrIncompleteClosure;
|
size_t nrIncompleteClosure = 0;
|
||||||
|
|
||||||
/* Name of this goal for debugging purposes. */
|
/* Name of this goal for debugging purposes. */
|
||||||
std::string name;
|
std::string name;
|
||||||
|
|
||||||
/* Whether the goal is finished. */
|
/* Whether the goal is finished. */
|
||||||
ExitCode exitCode;
|
ExitCode exitCode = ecBusy;
|
||||||
|
|
||||||
/* Build result. */
|
/* Build result. */
|
||||||
BuildResult buildResult;
|
BuildResult buildResult;
|
||||||
|
@ -65,10 +65,7 @@ struct Goal : public std::enable_shared_from_this<Goal>
|
||||||
Goal(Worker & worker, DerivedPath path)
|
Goal(Worker & worker, DerivedPath path)
|
||||||
: worker(worker)
|
: worker(worker)
|
||||||
, buildResult { .path = std::move(path) }
|
, buildResult { .path = std::move(path) }
|
||||||
{
|
{ }
|
||||||
nrFailed = nrNoSubstituters = nrIncompleteClosure = 0;
|
|
||||||
exitCode = ecBusy;
|
|
||||||
}
|
|
||||||
|
|
||||||
virtual ~Goal()
|
virtual ~Goal()
|
||||||
{
|
{
|
||||||
|
|
|
@ -395,7 +395,7 @@ void LocalDerivationGoal::startBuilder()
|
||||||
else if (settings.sandboxMode == smDisabled)
|
else if (settings.sandboxMode == smDisabled)
|
||||||
useChroot = false;
|
useChroot = false;
|
||||||
else if (settings.sandboxMode == smRelaxed)
|
else if (settings.sandboxMode == smRelaxed)
|
||||||
useChroot = !(derivationIsImpure(derivationType)) && !noChroot;
|
useChroot = !(derivationType.isImpure()) && !noChroot;
|
||||||
}
|
}
|
||||||
|
|
||||||
auto & localStore = getLocalStore();
|
auto & localStore = getLocalStore();
|
||||||
|
@ -608,7 +608,7 @@ void LocalDerivationGoal::startBuilder()
|
||||||
"nogroup:x:65534:\n", sandboxGid()));
|
"nogroup:x:65534:\n", sandboxGid()));
|
||||||
|
|
||||||
/* Create /etc/hosts with localhost entry. */
|
/* Create /etc/hosts with localhost entry. */
|
||||||
if (!(derivationIsImpure(derivationType)))
|
if (!(derivationType.isImpure()))
|
||||||
writeFile(chrootRootDir + "/etc/hosts", "127.0.0.1 localhost\n::1 localhost\n");
|
writeFile(chrootRootDir + "/etc/hosts", "127.0.0.1 localhost\n::1 localhost\n");
|
||||||
|
|
||||||
/* Make the closure of the inputs available in the chroot,
|
/* Make the closure of the inputs available in the chroot,
|
||||||
|
@ -796,7 +796,7 @@ void LocalDerivationGoal::startBuilder()
|
||||||
us.
|
us.
|
||||||
*/
|
*/
|
||||||
|
|
||||||
if (!(derivationIsImpure(derivationType)))
|
if (!(derivationType.isImpure()))
|
||||||
privateNetwork = true;
|
privateNetwork = true;
|
||||||
|
|
||||||
userNamespaceSync.create();
|
userNamespaceSync.create();
|
||||||
|
@ -1049,7 +1049,7 @@ void LocalDerivationGoal::initEnv()
|
||||||
derivation, tell the builder, so that for instance `fetchurl'
|
derivation, tell the builder, so that for instance `fetchurl'
|
||||||
can skip checking the output. On older Nixes, this environment
|
can skip checking the output. On older Nixes, this environment
|
||||||
variable won't be set, so `fetchurl' will do the check. */
|
variable won't be set, so `fetchurl' will do the check. */
|
||||||
if (derivationIsFixed(derivationType)) env["NIX_OUTPUT_CHECKED"] = "1";
|
if (derivationType.isFixed()) env["NIX_OUTPUT_CHECKED"] = "1";
|
||||||
|
|
||||||
/* *Only* if this is a fixed-output derivation, propagate the
|
/* *Only* if this is a fixed-output derivation, propagate the
|
||||||
values of the environment variables specified in the
|
values of the environment variables specified in the
|
||||||
|
@ -1060,7 +1060,7 @@ void LocalDerivationGoal::initEnv()
|
||||||
to the builder is generally impure, but the output of
|
to the builder is generally impure, but the output of
|
||||||
fixed-output derivations is by definition pure (since we
|
fixed-output derivations is by definition pure (since we
|
||||||
already know the cryptographic hash of the output). */
|
already know the cryptographic hash of the output). */
|
||||||
if (derivationIsImpure(derivationType)) {
|
if (derivationType.isImpure()) {
|
||||||
for (auto & i : parsedDrv->getStringsAttr("impureEnvVars").value_or(Strings()))
|
for (auto & i : parsedDrv->getStringsAttr("impureEnvVars").value_or(Strings()))
|
||||||
env[i] = getEnv(i).value_or("");
|
env[i] = getEnv(i).value_or("");
|
||||||
}
|
}
|
||||||
|
@ -1340,6 +1340,12 @@ struct RestrictedStore : public virtual RestrictedStoreConfig, public virtual Lo
|
||||||
next->queryMissing(allowed, willBuild, willSubstitute,
|
next->queryMissing(allowed, willBuild, willSubstitute,
|
||||||
unknown, downloadSize, narSize);
|
unknown, downloadSize, narSize);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
virtual std::optional<std::string> getBuildLog(const StorePath & path) override
|
||||||
|
{ return std::nullopt; }
|
||||||
|
|
||||||
|
virtual void addBuildLog(const StorePath & path, std::string_view log) override
|
||||||
|
{ unsupported("addBuildLog"); }
|
||||||
};
|
};
|
||||||
|
|
||||||
|
|
||||||
|
@ -1668,7 +1674,7 @@ void LocalDerivationGoal::runChild()
|
||||||
/* Fixed-output derivations typically need to access the
|
/* Fixed-output derivations typically need to access the
|
||||||
network, so give them access to /etc/resolv.conf and so
|
network, so give them access to /etc/resolv.conf and so
|
||||||
on. */
|
on. */
|
||||||
if (derivationIsImpure(derivationType)) {
|
if (derivationType.isImpure()) {
|
||||||
// Only use nss functions to resolve hosts and
|
// Only use nss functions to resolve hosts and
|
||||||
// services. Don’t use it for anything else that may
|
// services. Don’t use it for anything else that may
|
||||||
// be configured for this system. This limits the
|
// be configured for this system. This limits the
|
||||||
|
@ -1912,7 +1918,7 @@ void LocalDerivationGoal::runChild()
|
||||||
|
|
||||||
sandboxProfile += "(import \"sandbox-defaults.sb\")\n";
|
sandboxProfile += "(import \"sandbox-defaults.sb\")\n";
|
||||||
|
|
||||||
if (derivationIsImpure(derivationType))
|
if (derivationType.isImpure())
|
||||||
sandboxProfile += "(import \"sandbox-network.sb\")\n";
|
sandboxProfile += "(import \"sandbox-network.sb\")\n";
|
||||||
|
|
||||||
/* Add the output paths we'll use at build-time to the chroot */
|
/* Add the output paths we'll use at build-time to the chroot */
|
||||||
|
@ -2272,7 +2278,7 @@ DrvOutputs LocalDerivationGoal::registerOutputs()
|
||||||
return res;
|
return res;
|
||||||
};
|
};
|
||||||
|
|
||||||
auto newInfoFromCA = [&](const DerivationOutputCAFloating outputHash) -> ValidPathInfo {
|
auto newInfoFromCA = [&](const DerivationOutput::CAFloating outputHash) -> ValidPathInfo {
|
||||||
auto & st = outputStats.at(outputName);
|
auto & st = outputStats.at(outputName);
|
||||||
if (outputHash.method == ContentAddressMethod { FileIngestionMethod::Flat } ||
|
if (outputHash.method == ContentAddressMethod { FileIngestionMethod::Flat } ||
|
||||||
outputHash.method == ContentAddressMethod { TextHashMethod {} })
|
outputHash.method == ContentAddressMethod { TextHashMethod {} })
|
||||||
|
@ -2340,7 +2346,7 @@ DrvOutputs LocalDerivationGoal::registerOutputs()
|
||||||
|
|
||||||
ValidPathInfo newInfo = std::visit(overloaded {
|
ValidPathInfo newInfo = std::visit(overloaded {
|
||||||
|
|
||||||
[&](const DerivationOutputInputAddressed & output) {
|
[&](const DerivationOutput::InputAddressed & output) {
|
||||||
/* input-addressed case */
|
/* input-addressed case */
|
||||||
auto requiredFinalPath = output.path;
|
auto requiredFinalPath = output.path;
|
||||||
/* Preemptively add rewrite rule for final hash, as that is
|
/* Preemptively add rewrite rule for final hash, as that is
|
||||||
|
@ -2357,7 +2363,7 @@ DrvOutputs LocalDerivationGoal::registerOutputs()
|
||||||
return newInfo0;
|
return newInfo0;
|
||||||
},
|
},
|
||||||
|
|
||||||
[&](const DerivationOutputCAFixed & dof) {
|
[&](const DerivationOutput::CAFixed & dof) {
|
||||||
auto wanted = getContentAddressHash(dof.ca);
|
auto wanted = getContentAddressHash(dof.ca);
|
||||||
|
|
||||||
auto newInfo0 = newInfoFromCA(DerivationOutputCAFloating {
|
auto newInfo0 = newInfoFromCA(DerivationOutputCAFloating {
|
||||||
|
@ -2386,17 +2392,17 @@ DrvOutputs LocalDerivationGoal::registerOutputs()
|
||||||
return newInfo0;
|
return newInfo0;
|
||||||
},
|
},
|
||||||
|
|
||||||
[&](DerivationOutputCAFloating & dof) {
|
[&](const DerivationOutput::CAFloating & dof) {
|
||||||
return newInfoFromCA(dof);
|
return newInfoFromCA(dof);
|
||||||
},
|
},
|
||||||
|
|
||||||
[&](DerivationOutputDeferred) -> ValidPathInfo {
|
[&](const DerivationOutput::Deferred &) -> ValidPathInfo {
|
||||||
// No derivation should reach that point without having been
|
// No derivation should reach that point without having been
|
||||||
// rewritten first
|
// rewritten first
|
||||||
assert(false);
|
assert(false);
|
||||||
},
|
},
|
||||||
|
|
||||||
}, output.output);
|
}, output.raw());
|
||||||
|
|
||||||
/* FIXME: set proper permissions in restorePath() so
|
/* FIXME: set proper permissions in restorePath() so
|
||||||
we don't have to do another traversal. */
|
we don't have to do another traversal. */
|
||||||
|
@ -2610,7 +2616,8 @@ DrvOutputs LocalDerivationGoal::registerOutputs()
|
||||||
signRealisation(thisRealisation);
|
signRealisation(thisRealisation);
|
||||||
worker.store.registerDrvOutput(thisRealisation);
|
worker.store.registerDrvOutput(thisRealisation);
|
||||||
}
|
}
|
||||||
builtOutputs.emplace(thisRealisation.id, thisRealisation);
|
if (wantOutput(outputName, wantedOutputs))
|
||||||
|
builtOutputs.emplace(thisRealisation.id, thisRealisation);
|
||||||
}
|
}
|
||||||
|
|
||||||
return builtOutputs;
|
return builtOutputs;
|
||||||
|
|
|
@ -47,9 +47,9 @@ static void createLinks(State & state, const Path & srcDir, const Path & dstDir,
|
||||||
throw;
|
throw;
|
||||||
}
|
}
|
||||||
|
|
||||||
/* The files below are special-cased to that they don't show up
|
/* The files below are special-cased to that they don't show
|
||||||
* in user profiles, either because they are useless, or
|
* up in user profiles, either because they are useless, or
|
||||||
* because they would cauase pointless collisions (e.g., each
|
* because they would cause pointless collisions (e.g., each
|
||||||
* Python package brings its own
|
* Python package brings its own
|
||||||
* `$out/lib/pythonX.Y/site-packages/easy-install.pth'.)
|
* `$out/lib/pythonX.Y/site-packages/easy-install.pth'.)
|
||||||
*/
|
*/
|
||||||
|
@ -57,7 +57,9 @@ static void createLinks(State & state, const Path & srcDir, const Path & dstDir,
|
||||||
hasSuffix(srcFile, "/nix-support") ||
|
hasSuffix(srcFile, "/nix-support") ||
|
||||||
hasSuffix(srcFile, "/perllocal.pod") ||
|
hasSuffix(srcFile, "/perllocal.pod") ||
|
||||||
hasSuffix(srcFile, "/info/dir") ||
|
hasSuffix(srcFile, "/info/dir") ||
|
||||||
hasSuffix(srcFile, "/log"))
|
hasSuffix(srcFile, "/log") ||
|
||||||
|
hasSuffix(srcFile, "/manifest.nix") ||
|
||||||
|
hasSuffix(srcFile, "/manifest.json"))
|
||||||
continue;
|
continue;
|
||||||
|
|
||||||
else if (S_ISDIR(srcSt.st_mode)) {
|
else if (S_ISDIR(srcSt.st_mode)) {
|
||||||
|
|
|
@ -3,7 +3,9 @@
|
||||||
#include "worker-protocol.hh"
|
#include "worker-protocol.hh"
|
||||||
#include "build-result.hh"
|
#include "build-result.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
#include "store-cast.hh"
|
||||||
#include "gc-store.hh"
|
#include "gc-store.hh"
|
||||||
|
#include "log-store.hh"
|
||||||
#include "path-with-outputs.hh"
|
#include "path-with-outputs.hh"
|
||||||
#include "finally.hh"
|
#include "finally.hh"
|
||||||
#include "archive.hh"
|
#include "archive.hh"
|
||||||
|
@ -562,6 +564,8 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
BuildMode buildMode = (BuildMode) readInt(from);
|
BuildMode buildMode = (BuildMode) readInt(from);
|
||||||
logger->startWork();
|
logger->startWork();
|
||||||
|
|
||||||
|
auto drvType = drv.type();
|
||||||
|
|
||||||
/* Content-addressed derivations are trustless because their output paths
|
/* Content-addressed derivations are trustless because their output paths
|
||||||
are verified by their content alone, so any derivation is free to
|
are verified by their content alone, so any derivation is free to
|
||||||
try to produce such a path.
|
try to produce such a path.
|
||||||
|
@ -594,12 +598,12 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
derivations, we throw out the precomputed output paths and just
|
derivations, we throw out the precomputed output paths and just
|
||||||
store the hashes, so there aren't two competing sources of truth an
|
store the hashes, so there aren't two competing sources of truth an
|
||||||
attacker could exploit. */
|
attacker could exploit. */
|
||||||
if (drv.type() == DerivationType::InputAddressed && !trusted)
|
if (!(drvType.isCA() || trusted))
|
||||||
throw Error("you are not privileged to build input-addressed derivations");
|
throw Error("you are not privileged to build input-addressed derivations");
|
||||||
|
|
||||||
/* Make sure that the non-input-addressed derivations that got this far
|
/* Make sure that the non-input-addressed derivations that got this far
|
||||||
are in fact content-addressed if we don't trust them. */
|
are in fact content-addressed if we don't trust them. */
|
||||||
assert(derivationIsCA(drv.type()) || trusted);
|
assert(drvType.isCA() || trusted);
|
||||||
|
|
||||||
/* Recompute the derivation path when we cannot trust the original. */
|
/* Recompute the derivation path when we cannot trust the original. */
|
||||||
if (!trusted) {
|
if (!trusted) {
|
||||||
|
@ -608,7 +612,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
original not-necessarily-resolved derivation to verify the drv
|
original not-necessarily-resolved derivation to verify the drv
|
||||||
derivation as adequate claim to the input-addressed output
|
derivation as adequate claim to the input-addressed output
|
||||||
paths. */
|
paths. */
|
||||||
assert(derivationIsCA(drv.type()));
|
assert(drvType.isCA());
|
||||||
|
|
||||||
Derivation drv2;
|
Derivation drv2;
|
||||||
static_cast<BasicDerivation &>(drv2) = drv;
|
static_cast<BasicDerivation &>(drv2) = drv;
|
||||||
|
@ -649,7 +653,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
Path path = absPath(readString(from));
|
Path path = absPath(readString(from));
|
||||||
|
|
||||||
logger->startWork();
|
logger->startWork();
|
||||||
auto & gcStore = requireGcStore(*store);
|
auto & gcStore = require<GcStore>(*store);
|
||||||
gcStore.addIndirectRoot(path);
|
gcStore.addIndirectRoot(path);
|
||||||
logger->stopWork();
|
logger->stopWork();
|
||||||
|
|
||||||
|
@ -667,7 +671,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
|
|
||||||
case wopFindRoots: {
|
case wopFindRoots: {
|
||||||
logger->startWork();
|
logger->startWork();
|
||||||
auto & gcStore = requireGcStore(*store);
|
auto & gcStore = require<GcStore>(*store);
|
||||||
Roots roots = gcStore.findRoots(!trusted);
|
Roots roots = gcStore.findRoots(!trusted);
|
||||||
logger->stopWork();
|
logger->stopWork();
|
||||||
|
|
||||||
|
@ -699,7 +703,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
logger->startWork();
|
logger->startWork();
|
||||||
if (options.ignoreLiveness)
|
if (options.ignoreLiveness)
|
||||||
throw Error("you are not allowed to ignore liveness");
|
throw Error("you are not allowed to ignore liveness");
|
||||||
auto & gcStore = requireGcStore(*store);
|
auto & gcStore = require<GcStore>(*store);
|
||||||
gcStore.collectGarbage(options, results);
|
gcStore.collectGarbage(options, results);
|
||||||
logger->stopWork();
|
logger->stopWork();
|
||||||
|
|
||||||
|
@ -957,11 +961,12 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
|
||||||
logger->startWork();
|
logger->startWork();
|
||||||
if (!trusted)
|
if (!trusted)
|
||||||
throw Error("you are not privileged to add logs");
|
throw Error("you are not privileged to add logs");
|
||||||
|
auto & logStore = require<LogStore>(*store);
|
||||||
{
|
{
|
||||||
FramedSource source(from);
|
FramedSource source(from);
|
||||||
StringSink sink;
|
StringSink sink;
|
||||||
source.drainInto(sink);
|
source.drainInto(sink);
|
||||||
store->addBuildLog(path, sink.s);
|
logStore.addBuildLog(path, sink.s);
|
||||||
}
|
}
|
||||||
logger->stopWork();
|
logger->stopWork();
|
||||||
to << 1;
|
to << 1;
|
||||||
|
|
|
@ -12,25 +12,25 @@ namespace nix {
|
||||||
std::optional<StorePath> DerivationOutput::path(const Store & store, std::string_view drvName, std::string_view outputName) const
|
std::optional<StorePath> DerivationOutput::path(const Store & store, std::string_view drvName, std::string_view outputName) const
|
||||||
{
|
{
|
||||||
return std::visit(overloaded {
|
return std::visit(overloaded {
|
||||||
[](const DerivationOutputInputAddressed & doi) -> std::optional<StorePath> {
|
[](const DerivationOutput::InputAddressed & doi) -> std::optional<StorePath> {
|
||||||
return { doi.path };
|
return { doi.path };
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFixed & dof) -> std::optional<StorePath> {
|
[&](const DerivationOutput::CAFixed & dof) -> std::optional<StorePath> {
|
||||||
return {
|
return {
|
||||||
dof.path(store, drvName, outputName)
|
dof.path(store, drvName, outputName)
|
||||||
};
|
};
|
||||||
},
|
},
|
||||||
[](const DerivationOutputCAFloating & dof) -> std::optional<StorePath> {
|
[](const DerivationOutput::CAFloating & dof) -> std::optional<StorePath> {
|
||||||
return std::nullopt;
|
return std::nullopt;
|
||||||
},
|
},
|
||||||
[](const DerivationOutputDeferred &) -> std::optional<StorePath> {
|
[](const DerivationOutput::Deferred &) -> std::optional<StorePath> {
|
||||||
return std::nullopt;
|
return std::nullopt;
|
||||||
},
|
},
|
||||||
}, output);
|
}, raw());
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
StorePath DerivationOutputCAFixed::path(const Store & store, std::string_view drvName, std::string_view outputName) const {
|
StorePath DerivationOutput::CAFixed::path(const Store & store, std::string_view drvName, std::string_view outputName) const {
|
||||||
return store.makeFixedOutputPathFromCA(StorePathDescriptor {
|
return store.makeFixedOutputPathFromCA(StorePathDescriptor {
|
||||||
.name = outputPathName(drvName, outputName),
|
.name = outputPathName(drvName, outputName),
|
||||||
.info = ca,
|
.info = ca,
|
||||||
|
@ -38,47 +38,46 @@ StorePath DerivationOutputCAFixed::path(const Store & store, std::string_view dr
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
bool derivationIsCA(DerivationType dt) {
|
bool DerivationType::isCA() const {
|
||||||
switch (dt) {
|
/* Normally we do the full `std::visit` to make sure we have
|
||||||
case DerivationType::InputAddressed: return false;
|
exhaustively handled all variants, but so long as there is a
|
||||||
case DerivationType::CAFixed: return true;
|
variant called `ContentAddressed`, it must be the only one for
|
||||||
case DerivationType::CAFloating: return true;
|
which `isCA` is true for this to make sense!. */
|
||||||
case DerivationType::DeferredInputAddressed: return false;
|
return std::holds_alternative<ContentAddressed>(raw());
|
||||||
};
|
|
||||||
// Since enums can have non-variant values, but making a `default:` would
|
|
||||||
// disable exhaustiveness warnings.
|
|
||||||
assert(false);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
bool derivationIsFixed(DerivationType dt) {
|
bool DerivationType::isFixed() const {
|
||||||
switch (dt) {
|
return std::visit(overloaded {
|
||||||
case DerivationType::InputAddressed: return false;
|
[](const InputAddressed & ia) {
|
||||||
case DerivationType::CAFixed: return true;
|
return false;
|
||||||
case DerivationType::CAFloating: return false;
|
},
|
||||||
case DerivationType::DeferredInputAddressed: return false;
|
[](const ContentAddressed & ca) {
|
||||||
};
|
return ca.fixed;
|
||||||
assert(false);
|
},
|
||||||
|
}, raw());
|
||||||
}
|
}
|
||||||
|
|
||||||
bool derivationHasKnownOutputPaths(DerivationType dt) {
|
bool DerivationType::hasKnownOutputPaths() const {
|
||||||
switch (dt) {
|
return std::visit(overloaded {
|
||||||
case DerivationType::InputAddressed: return true;
|
[](const InputAddressed & ia) {
|
||||||
case DerivationType::CAFixed: return true;
|
return !ia.deferred;
|
||||||
case DerivationType::CAFloating: return false;
|
},
|
||||||
case DerivationType::DeferredInputAddressed: return false;
|
[](const ContentAddressed & ca) {
|
||||||
};
|
return ca.fixed;
|
||||||
assert(false);
|
},
|
||||||
|
}, raw());
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
bool derivationIsImpure(DerivationType dt) {
|
bool DerivationType::isImpure() const {
|
||||||
switch (dt) {
|
return std::visit(overloaded {
|
||||||
case DerivationType::InputAddressed: return false;
|
[](const InputAddressed & ia) {
|
||||||
case DerivationType::CAFixed: return true;
|
return false;
|
||||||
case DerivationType::CAFloating: return false;
|
},
|
||||||
case DerivationType::DeferredInputAddressed: return false;
|
[](const ContentAddressed & ca) {
|
||||||
};
|
return !ca.pure;
|
||||||
assert(false);
|
},
|
||||||
|
}, raw());
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@ -178,34 +177,26 @@ static DerivationOutput parseDerivationOutput(const Store & store,
|
||||||
if (hashS != "") {
|
if (hashS != "") {
|
||||||
validatePath(pathS);
|
validatePath(pathS);
|
||||||
auto hash = Hash::parseNonSRIUnprefixed(hashS, hashType);
|
auto hash = Hash::parseNonSRIUnprefixed(hashS, hashType);
|
||||||
return DerivationOutput {
|
return DerivationOutput::CAFixed {
|
||||||
.output = DerivationOutputCAFixed {
|
// FIXME non-trivial fixed refs set
|
||||||
// FIXME non-trivial fixed refs set
|
.ca = contentAddressFromMethodHashAndRefs(
|
||||||
.ca = contentAddressFromMethodHashAndRefs(
|
method, std::move(hash), {}),
|
||||||
method, std::move(hash), {}),
|
|
||||||
},
|
|
||||||
};
|
};
|
||||||
} else {
|
} else {
|
||||||
settings.requireExperimentalFeature(Xp::CaDerivations);
|
settings.requireExperimentalFeature(Xp::CaDerivations);
|
||||||
assert(pathS == "");
|
assert(pathS == "");
|
||||||
return DerivationOutput {
|
return DerivationOutput::CAFloating {
|
||||||
.output = DerivationOutputCAFloating {
|
.method = std::move(method),
|
||||||
.method = std::move(method),
|
.hashType = std::move(hashType),
|
||||||
.hashType = std::move(hashType),
|
|
||||||
},
|
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
} else {
|
} else {
|
||||||
if (pathS == "") {
|
if (pathS == "") {
|
||||||
return DerivationOutput {
|
return DerivationOutput::Deferred { };
|
||||||
.output = DerivationOutputDeferred { }
|
|
||||||
};
|
|
||||||
}
|
}
|
||||||
validatePath(pathS);
|
validatePath(pathS);
|
||||||
return DerivationOutput {
|
return DerivationOutput::InputAddressed {
|
||||||
.output = DerivationOutputInputAddressed {
|
.path = store.parseStorePath(pathS),
|
||||||
.path = store.parseStorePath(pathS),
|
|
||||||
}
|
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
@ -333,27 +324,27 @@ std::string Derivation::unparse(const Store & store, bool maskOutputs,
|
||||||
if (first) first = false; else s += ',';
|
if (first) first = false; else s += ',';
|
||||||
s += '('; printUnquotedString(s, i.first);
|
s += '('; printUnquotedString(s, i.first);
|
||||||
std::visit(overloaded {
|
std::visit(overloaded {
|
||||||
[&](const DerivationOutputInputAddressed & doi) {
|
[&](const DerivationOutput::InputAddressed & doi) {
|
||||||
s += ','; printUnquotedString(s, maskOutputs ? "" : store.printStorePath(doi.path));
|
s += ','; printUnquotedString(s, maskOutputs ? "" : store.printStorePath(doi.path));
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFixed & dof) {
|
[&](const DerivationOutput::CAFixed & dof) {
|
||||||
s += ','; printUnquotedString(s, maskOutputs ? "" : store.printStorePath(dof.path(store, name, i.first)));
|
s += ','; printUnquotedString(s, maskOutputs ? "" : store.printStorePath(dof.path(store, name, i.first)));
|
||||||
s += ','; printUnquotedString(s, printMethodAlgo(dof.ca));
|
s += ','; printUnquotedString(s, printMethodAlgo(dof.ca));
|
||||||
s += ','; printUnquotedString(s, getContentAddressHash(dof.ca).to_string(Base16, false));
|
s += ','; printUnquotedString(s, getContentAddressHash(dof.ca).to_string(Base16, false));
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFloating & dof) {
|
[&](const DerivationOutput::CAFloating & dof) {
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
s += ','; printUnquotedString(s, makeContentAddressingPrefix(dof.method) + printHashType(dof.hashType));
|
s += ','; printUnquotedString(s, makeContentAddressingPrefix(dof.method) + printHashType(dof.hashType));
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputDeferred &) {
|
[&](const DerivationOutput::Deferred &) {
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
s += ','; printUnquotedString(s, "");
|
s += ','; printUnquotedString(s, "");
|
||||||
}
|
}
|
||||||
}, i.second.output);
|
}, i.second.raw());
|
||||||
s += ')';
|
s += ')';
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -421,13 +412,13 @@ DerivationType BasicDerivation::type() const
|
||||||
std::optional<HashType> floatingHashType;
|
std::optional<HashType> floatingHashType;
|
||||||
for (auto & i : outputs) {
|
for (auto & i : outputs) {
|
||||||
std::visit(overloaded {
|
std::visit(overloaded {
|
||||||
[&](const DerivationOutputInputAddressed &) {
|
[&](const DerivationOutput::InputAddressed &) {
|
||||||
inputAddressedOutputs.insert(i.first);
|
inputAddressedOutputs.insert(i.first);
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFixed &) {
|
[&](const DerivationOutput::CAFixed &) {
|
||||||
fixedCAOutputs.insert(i.first);
|
fixedCAOutputs.insert(i.first);
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFloating & dof) {
|
[&](const DerivationOutput::CAFloating & dof) {
|
||||||
floatingCAOutputs.insert(i.first);
|
floatingCAOutputs.insert(i.first);
|
||||||
if (!floatingHashType) {
|
if (!floatingHashType) {
|
||||||
floatingHashType = dof.hashType;
|
floatingHashType = dof.hashType;
|
||||||
|
@@ -436,27 +427,37 @@ DerivationType BasicDerivation::type() const
                     throw Error("All floating outputs must use the same hash type");
                 }
             },
-            [&](const DerivationOutputDeferred &) {
+            [&](const DerivationOutput::Deferred &) {
                 deferredIAOutputs.insert(i.first);
             },
-        }, i.second.output);
+        }, i.second.raw());
     }

     if (inputAddressedOutputs.empty() && fixedCAOutputs.empty() && floatingCAOutputs.empty() && deferredIAOutputs.empty()) {
         throw Error("Must have at least one output");
     } else if (! inputAddressedOutputs.empty() && fixedCAOutputs.empty() && floatingCAOutputs.empty() && deferredIAOutputs.empty()) {
-        return DerivationType::InputAddressed;
+        return DerivationType::InputAddressed {
+            .deferred = false,
+        };
     } else if (inputAddressedOutputs.empty() && ! fixedCAOutputs.empty() && floatingCAOutputs.empty() && deferredIAOutputs.empty()) {
         if (fixedCAOutputs.size() > 1)
             // FIXME: Experimental feature?
             throw Error("Only one fixed output is allowed for now");
         if (*fixedCAOutputs.begin() != "out")
             throw Error("Single fixed output must be named \"out\"");
-        return DerivationType::CAFixed;
+        return DerivationType::ContentAddressed {
+            .pure = false,
+            .fixed = true,
+        };
     } else if (inputAddressedOutputs.empty() && fixedCAOutputs.empty() && ! floatingCAOutputs.empty() && deferredIAOutputs.empty()) {
-        return DerivationType::CAFloating;
+        return DerivationType::ContentAddressed {
+            .pure = true,
+            .fixed = false,
+        };
     } else if (inputAddressedOutputs.empty() && fixedCAOutputs.empty() && floatingCAOutputs.empty() && !deferredIAOutputs.empty()) {
-        return DerivationType::DeferredInputAddressed;
+        return DerivationType::InputAddressed {
+            .deferred = true,
+        };
     } else {
         throw Error("Can't mix derivation output types");
     }
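For orientation, here is a minimal standalone sketch of the variant shape that `type()` now returns. The struct and field names follow the hunk above (the real definitions live in `derivations.hh`, later in this diff); this is an illustration of the idea, not the library code.

```cpp
#include <cassert>
#include <variant>

// Illustration only: two small structs plus a variant replace the old flat enum.
struct InputAddressed { bool deferred; };
struct ContentAddressed { bool pure; bool fixed; };
using DrvType = std::variant<InputAddressed, ContentAddressed>;

bool isFixed(const DrvType & t)
{
    // Only content-addressed derivations can be fixed-output.
    if (auto * ca = std::get_if<ContentAddressed>(&t)) return ca->fixed;
    return false;
}

int main()
{
    DrvType fixedOutput = ContentAddressed { .pure = false, .fixed = true };
    DrvType deferredIA  = InputAddressed { .deferred = true };
    assert(isFixed(fixedOutput));
    assert(!isFixed(deferredIA));
}
```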
@@ -508,13 +509,13 @@ static const DrvHashModulo pathDerivationModulo(Store & store, const StorePath &
 */
 DrvHashModulo hashDerivationModulo(Store & store, const Derivation & drv, bool maskOutputs)
 {
-    bool isDeferred = false;
+    auto type = drv.type();

     /* Return a fixed hash for fixed-output derivations. */
-    switch (drv.type()) {
-    case DerivationType::CAFixed: {
+    if (type.isFixed()) {
         std::map<std::string, Hash> outputHashes;
         for (const auto & i : drv.outputs) {
-            auto & dof = std::get<DerivationOutputCAFixed>(i.second.output);
+            auto & dof = std::get<DerivationOutput::CAFixed>(i.second.raw());
             auto hash = hashString(htSHA256, "fixed:out:"
                 + printMethodAlgo(dof.ca) + ":"
                 + getContentAddressHash(dof.ca).to_string(Base16, false) + ":"
@@ -523,33 +524,37 @@ DrvHashModulo hashDerivationModulo(Store & store, const Derivation & drv, bool m
         }
         return outputHashes;
     }
-    case DerivationType::CAFloating:
-        isDeferred = true;
-        break;
-    case DerivationType::InputAddressed:
-        break;
-    case DerivationType::DeferredInputAddressed:
-        break;
-    }
+
+    auto kind = std::visit(overloaded {
+        [](const DerivationType::InputAddressed & ia) {
+            /* This might be a "pesimistically" deferred output, so we don't
+               "taint" the kind yet. */
+            return DrvHash::Kind::Regular;
+        },
+        [](const DerivationType::ContentAddressed & ca) {
+            return ca.fixed
+                ? DrvHash::Kind::Regular
+                : DrvHash::Kind::Deferred;
+        },
+    }, drv.type().raw());

     /* For other derivations, replace the inputs paths with recursive
        calls to this function. */
     std::map<std::string, StringSet> inputs2;
-    for (auto & i : drv.inputDrvs) {
-        const auto & res = pathDerivationModulo(store, i.first);
+    for (auto & [drvPath, inputOutputs0] : drv.inputDrvs) {
+        // Avoid lambda capture restriction with standard / Clang
+        auto & inputOutputs = inputOutputs0;
+        const auto & res = pathDerivationModulo(store, drvPath);
         std::visit(overloaded {
             // Regular non-CA derivation, replace derivation
-            [&](const Hash & drvHash) {
-                inputs2.insert_or_assign(drvHash.to_string(Base16, false), i.second);
-            },
-            [&](const DeferredHash & deferredHash) {
-                isDeferred = true;
-                inputs2.insert_or_assign(deferredHash.hash.to_string(Base16, false), i.second);
+            [&](const DrvHash & drvHash) {
+                kind |= drvHash.kind;
+                inputs2.insert_or_assign(drvHash.hash.to_string(Base16, false), inputOutputs);
             },
             // CA derivation's output hashes
             [&](const CaOutputHashes & outputHashes) {
                 std::set<std::string> justOut = { "out" };
-                for (auto & output : i.second) {
+                for (auto & output : inputOutputs) {
                     /* Put each one in with a single "out" output.. */
                     const auto h = outputHashes.at(output);
                     inputs2.insert_or_assign(
@@ -557,15 +562,24 @@ DrvHashModulo hashDerivationModulo(Store & store, const Derivation & drv, bool m
                         justOut);
                 }
             },
-        }, res);
+        }, res.raw());
     }

     auto hash = hashString(htSHA256, drv.unparse(store, maskOutputs, &inputs2));

-    if (isDeferred)
-        return DeferredHash { hash };
-    else
-        return hash;
+    return DrvHash { .hash = hash, .kind = kind };
+}
+
+
+void operator |= (DrvHash::Kind & self, const DrvHash::Kind & other) noexcept
+{
+    switch (other) {
+    case DrvHash::Kind::Regular:
+        break;
+    case DrvHash::Kind::Deferred:
+        self = other;
+        break;
+    }
 }

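The `|=` operator introduced above makes `Deferred` absorbing: a derivation whose own hash could be computed eagerly still ends up deferred as soon as any of its inputs is. A minimal standalone sketch of that semantics (names copied from the hunk, but this is an illustration, not the library code):

```cpp
#include <cassert>

// Mirror of DrvHash::Kind from the hunk above (the real enum is nested in DrvHash).
enum struct Kind : bool { Regular, Deferred };

// Deferred is "sticky": once seen, the combined kind stays Deferred.
void operator |= (Kind & self, const Kind & other) noexcept
{
    if (other == Kind::Deferred) self = other;
}

int main()
{
    Kind kind = Kind::Regular;
    kind |= Kind::Regular;   // still Regular
    kind |= Kind::Deferred;  // flips to Deferred
    kind |= Kind::Regular;   // stays Deferred
    assert(kind == Kind::Deferred);
}
```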
@@ -573,20 +587,15 @@ std::map<std::string, Hash> staticOutputHashes(Store & store, const Derivation &
 {
     std::map<std::string, Hash> res;
     std::visit(overloaded {
-        [&](const Hash & drvHash) {
+        [&](const DrvHash & drvHash) {
             for (auto & outputName : drv.outputNames()) {
-                res.insert({outputName, drvHash});
-            }
-        },
-        [&](const DeferredHash & deferredHash) {
-            for (auto & outputName : drv.outputNames()) {
-                res.insert({outputName, deferredHash.hash});
+                res.insert({outputName, drvHash.hash});
             }
         },
         [&](const CaOutputHashes & outputHashes) {
             res = outputHashes;
         },
-    }, hashDerivationModulo(store, drv, true));
+    }, hashDerivationModulo(store, drv, true).raw());
     return res;
 }

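A hypothetical call site for `staticOutputHashes`, assuming the libstore headers from this tree are on the include path; the helper name and output format are made up for illustration:

```cpp
#include <cstdio>

#include "derivations.hh"
#include "store-api.hh"

void printStaticOutputHashes(nix::Store & store, const nix::Derivation & drv)
{
    // One entry per output name: the per-output CA hashes for fixed-output
    // derivations, otherwise the (possibly deferred) derivation hash modulo
    // fixed-output inputs.
    for (auto & [outputName, hash] : nix::staticOutputHashes(store, drv))
        std::printf("%s\t%s\n", outputName.c_str(), hash.to_string(nix::Base32, true).c_str());
}
```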
@@ -667,27 +676,27 @@ void writeDerivation(Sink & out, const Store & store, const BasicDerivation & dr
     for (auto & i : drv.outputs) {
         out << i.first;
         std::visit(overloaded {
-            [&](const DerivationOutputInputAddressed & doi) {
+            [&](const DerivationOutput::InputAddressed & doi) {
                 out << store.printStorePath(doi.path)
                     << ""
                     << "";
             },
-            [&](const DerivationOutputCAFixed & dof) {
+            [&](const DerivationOutput::CAFixed & dof) {
                 out << store.printStorePath(dof.path(store, drv.name, i.first))
                     << printMethodAlgo(dof.ca)
                     << getContentAddressHash(dof.ca).to_string(Base16, false);
             },
-            [&](const DerivationOutputCAFloating & dof) {
+            [&](const DerivationOutput::CAFloating & dof) {
                 out << ""
                     << (makeContentAddressingPrefix(dof.method) + printHashType(dof.hashType))
                     << "";
             },
-            [&](const DerivationOutputDeferred &) {
+            [&](const DerivationOutput::Deferred &) {
                 out << ""
                     << ""
                     << "";
             },
-        }, i.second.output);
+        }, i.second.raw());
     }
     worker_proto::write(store, out, drv.inputSrcs);
     out << drv.platform << drv.builder << drv.args;
@@ -735,45 +744,63 @@ static void rewriteDerivation(Store & store, BasicDerivation & drv, const String

     auto hashModulo = hashDerivationModulo(store, Derivation(drv), true);
     for (auto & [outputName, output] : drv.outputs) {
-        if (std::holds_alternative<DerivationOutputDeferred>(output.output)) {
-            Hash h = std::get<Hash>(hashModulo);
+        if (std::holds_alternative<DerivationOutput::Deferred>(output.raw())) {
+            auto & h = hashModulo.requireNoFixedNonDeferred();
             auto outPath = store.makeOutputPath(outputName, h, drv.name);
             drv.env[outputName] = store.printStorePath(outPath);
-            output = DerivationOutput {
-                .output = DerivationOutputInputAddressed {
-                    .path = std::move(outPath),
-                },
+            output = DerivationOutput::InputAddressed {
+                .path = std::move(outPath),
             };
         }
     }

 }

+const Hash & DrvHashModulo::requireNoFixedNonDeferred() const {
+    auto * drvHashOpt = std::get_if<DrvHash>(&raw());
+    assert(drvHashOpt);
+    assert(drvHashOpt->kind == DrvHash::Kind::Regular);
+    return drvHashOpt->hash;
+}
+
+static bool tryResolveInput(
+    Store & store, StorePathSet & inputSrcs, StringMap & inputRewrites,
+    const StorePath & inputDrv, const StringSet & inputOutputs)
+{
+    auto inputDrvOutputs = store.queryPartialDerivationOutputMap(inputDrv);
+
+    auto getOutput = [&](const std::string & outputName) {
+        auto & actualPathOpt = inputDrvOutputs.at(outputName);
+        if (!actualPathOpt)
+            warn("output %s of input %s missing, aborting the resolving",
+                outputName,
+                store.printStorePath(inputDrv)
+            );
+        return actualPathOpt;
+    };
+
+    for (auto & outputName : inputOutputs) {
+        auto actualPathOpt = getOutput(outputName);
+        if (!actualPathOpt) return false;
+        auto actualPath = *actualPathOpt;
+        inputRewrites.emplace(
+            downstreamPlaceholder(store, inputDrv, outputName),
+            store.printStorePath(actualPath));
+        inputSrcs.insert(std::move(actualPath));
+    }
+
+    return true;
+}
+
 std::optional<BasicDerivation> Derivation::tryResolve(Store & store) {
     BasicDerivation resolved { *this };

     // Input paths that we'll want to rewrite in the derivation
     StringMap inputRewrites;

-    for (auto & input : inputDrvs) {
-        auto inputDrvOutputs = store.queryPartialDerivationOutputMap(input.first);
-        StringSet newOutputNames;
-        for (auto & outputName : input.second) {
-            auto actualPathOpt = inputDrvOutputs.at(outputName);
-            if (!actualPathOpt) {
-                warn("output %s of input %s missing, aborting the resolving",
-                    outputName,
-                    store.printStorePath(input.first)
-                );
-                return std::nullopt;
-            }
-            auto actualPath = *actualPathOpt;
-            inputRewrites.emplace(
-                downstreamPlaceholder(store, input.first, outputName),
-                store.printStorePath(actualPath));
-            resolved.inputSrcs.insert(std::move(actualPath));
-        }
-    }
+    for (auto & [inputDrv, inputOutputs] : inputDrvs)
+        if (!tryResolveInput(store, resolved.inputSrcs, inputRewrites, inputDrv, inputOutputs))
+            return std::nullopt;

     rewriteDerivation(store, resolved, inputRewrites);

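A sketch of how a caller might use the refactored `Derivation::tryResolve`; the wrapper below is hypothetical and only assumes the libstore headers shown in this diff:

```cpp
#include <optional>

#include "derivations.hh"
#include "store-api.hh"

std::optional<nix::BasicDerivation> resolveIfPossible(nix::Store & store, nix::Derivation drv)
{
    // tryResolve substitutes the downstream placeholders of already-built
    // inputs with their realised store paths; it returns std::nullopt as soon
    // as a required input output is missing (see tryResolveInput above).
    return drv.tryResolve(store);
}
```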
@@ -4,6 +4,7 @@
 #include "types.hh"
 #include "hash.hh"
 #include "content-address.hh"
+#include "repair-flag.hh"
 #include "sync.hh"

 #include <map>
@@ -44,19 +45,31 @@ struct DerivationOutputCAFloating
 */
 struct DerivationOutputDeferred {};

-struct DerivationOutput
+typedef std::variant<
+    DerivationOutputInputAddressed,
+    DerivationOutputCAFixed,
+    DerivationOutputCAFloating,
+    DerivationOutputDeferred
+> _DerivationOutputRaw;
+
+struct DerivationOutput : _DerivationOutputRaw
 {
-    std::variant<
-        DerivationOutputInputAddressed,
-        DerivationOutputCAFixed,
-        DerivationOutputCAFloating,
-        DerivationOutputDeferred
-    > output;
+    using Raw = _DerivationOutputRaw;
+    using Raw::Raw;
+
+    using InputAddressed = DerivationOutputInputAddressed;
+    using CAFixed = DerivationOutputCAFixed;
+    using CAFloating = DerivationOutputCAFloating;
+    using Deferred = DerivationOutputDeferred;

     /* Note, when you use this function you should make sure that you're passing
        the right derivation name. When in doubt, you should use the safer
        interface provided by BasicDerivation::outputsAndOptPaths */
     std::optional<StorePath> path(const Store & store, std::string_view drvName, std::string_view outputName) const;
+
+    inline const Raw & raw() const {
+        return static_cast<const Raw &>(*this);
+    }
 };

 typedef std::map<std::string, DerivationOutput> DerivationOutputs;
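The same wrapper-around-`std::variant` pattern recurs throughout this commit (`DerivationOutput`, `DerivationType`, `DrvHashModulo`). A minimal self-contained sketch of the idiom, with shortened names; `raw()` exposes the variant base so `std::visit` keeps working on the wrapper type:

```cpp
#include <iostream>
#include <variant>

struct A { int x; };
struct B { };
using RawOut = std::variant<A, B>;

// Wrapper inherits the variant's constructors and adds named accessors.
struct Out : RawOut
{
    using Raw = RawOut;
    using Raw::Raw;
    const Raw & raw() const { return static_cast<const Raw &>(*this); }
};

// The "overloaded" visitor helper used throughout this diff.
template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

int main()
{
    Out o = A { 42 };
    std::visit(overloaded {
        [](const A & a) { std::cout << "A: " << a.x << "\n"; },
        [](const B &)   { std::cout << "B\n"; },
    }, o.raw());
}
```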
@@ -72,30 +85,50 @@ typedef std::map<std::string, std::pair<DerivationOutput, std::optional<StorePat
    output IDs we are interested in. */
 typedef std::map<StorePath, StringSet> DerivationInputs;

-enum struct DerivationType : uint8_t {
-    InputAddressed,
-    DeferredInputAddressed,
-    CAFixed,
-    CAFloating,
+struct DerivationType_InputAddressed {
+    bool deferred;
 };

-/* Do the outputs of the derivation have paths calculated from their content,
-   or from the derivation itself? */
-bool derivationIsCA(DerivationType);
+struct DerivationType_ContentAddressed {
+    bool pure;
+    bool fixed;
+};

-/* Is the content of the outputs fixed a-priori via a hash? Never true for
-   non-CA derivations. */
-bool derivationIsFixed(DerivationType);
+typedef std::variant<
+    DerivationType_InputAddressed,
+    DerivationType_ContentAddressed
+> _DerivationTypeRaw;

-/* Is the derivation impure and needs to access non-deterministic resources, or
-   pure and can be sandboxed? Note that whether or not we actually sandbox the
-   derivation is controlled separately. Never true for non-CA derivations. */
-bool derivationIsImpure(DerivationType);
+struct DerivationType : _DerivationTypeRaw {
+    using Raw = _DerivationTypeRaw;
+    using Raw::Raw;
+    using InputAddressed = DerivationType_InputAddressed;
+    using ContentAddressed = DerivationType_ContentAddressed;

-/* Does the derivation knows its own output paths?
- * Only true when there's no floating-ca derivation involved in the closure.
- */
-bool derivationHasKnownOutputPaths(DerivationType);
+    /* Do the outputs of the derivation have paths calculated from their content,
+       or from the derivation itself? */
+    bool isCA() const;
+
+    /* Is the content of the outputs fixed a-priori via a hash? Never true for
+       non-CA derivations. */
+    bool isFixed() const;
+
+    /* Is the derivation impure and needs to access non-deterministic resources, or
+       pure and can be sandboxed? Note that whether or not we actually sandbox the
+       derivation is controlled separately. Never true for non-CA derivations. */
+    bool isImpure() const;
+
+    /* Does the derivation knows its own output paths?
+       Only true when there's no floating-ca derivation involved in the
+       closure, or if fixed output.
+     */
+    bool hasKnownOutputPaths() const;
+
+    inline const Raw & raw() const {
+        return static_cast<const Raw &>(*this);
+    }
+};

 struct BasicDerivation
 {

@@ -150,8 +183,6 @@ struct Derivation : BasicDerivation

 class Store;

-enum RepairFlag : bool { NoRepair = false, Repair = true };
-
 /* Write a derivation to the Nix store, and return its path. */
 StorePath writeDerivation(Store & store,
     const Derivation & drv,

@@ -175,13 +206,43 @@ std::string outputPathName(std::string_view drvName, std::string_view outputName
 // whose output hashes are always known since they are fixed up-front.
 typedef std::map<std::string, Hash> CaOutputHashes;

-struct DeferredHash { Hash hash; };
+struct DrvHash {
+    Hash hash;
+
+    enum struct Kind: bool {
+        // Statically determined derivations.
+        // This hash will be directly used to compute the output paths
+        Regular,
+        // Floating-output derivations (and their reverse dependencies).
+        Deferred,
+    };
+
+    Kind kind;
+};
+
+void operator |= (DrvHash::Kind & self, const DrvHash::Kind & other) noexcept;

 typedef std::variant<
-    Hash, // regular DRV normalized hash
-    CaOutputHashes, // Fixed-output derivation hashes
-    DeferredHash // Deferred hashes for floating outputs drvs and their dependencies
-> DrvHashModulo;
+    // Regular normalized derivation hash, and whether it was deferred (because
+    // an ancestor derivation is a floating content addressed derivation).
+    DrvHash,
+    // Fixed-output derivation hashes
+    CaOutputHashes
+> _DrvHashModuloRaw;
+
+struct DrvHashModulo : _DrvHashModuloRaw {
+    using Raw = _DrvHashModuloRaw;
+    using Raw::Raw;
+
+    /* Get hash, throwing if it is per-output CA hashes or a
+       deferred Drv hash.
+     */
+    const Hash & requireNoFixedNonDeferred() const;
+
+    inline const Raw & raw() const {
+        return static_cast<const Raw &>(*this);
+    }
+};

 /* Returns hashes with the details of fixed-output subderivations
    expunged.
|
@ -1,4 +1,5 @@
|
||||||
#include "derived-path.hh"
|
#include "derived-path.hh"
|
||||||
|
#include "derivations.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
|
||||||
#include <nlohmann/json.hpp>
|
#include <nlohmann/json.hpp>
|
||||||
|
@ -11,6 +12,21 @@ nlohmann::json DerivedPath::Opaque::toJSON(ref<Store> store) const {
|
||||||
return res;
|
return res;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
nlohmann::json DerivedPath::Built::toJSON(ref<Store> store) const {
|
||||||
|
nlohmann::json res;
|
||||||
|
res["drvPath"] = store->printStorePath(drvPath);
|
||||||
|
// Fallback for the input-addressed derivation case: We expect to always be
|
||||||
|
// able to print the output paths, so let’s do it
|
||||||
|
auto knownOutputs = store->queryPartialDerivationOutputMap(drvPath);
|
||||||
|
for (const auto& output : outputs) {
|
||||||
|
if (knownOutputs.at(output))
|
||||||
|
res["outputs"][output] = store->printStorePath(knownOutputs.at(output).value());
|
||||||
|
else
|
||||||
|
res["outputs"][output] = nullptr;
|
||||||
|
}
|
||||||
|
return res;
|
||||||
|
}
|
||||||
|
|
||||||
nlohmann::json BuiltPath::Built::toJSON(ref<Store> store) const {
|
nlohmann::json BuiltPath::Built::toJSON(ref<Store> store) const {
|
||||||
nlohmann::json res;
|
nlohmann::json res;
|
||||||
res["drvPath"] = store->printStorePath(drvPath);
|
res["drvPath"] = store->printStorePath(drvPath);
|
||||||
|
@ -35,16 +51,22 @@ StorePathSet BuiltPath::outPaths() const
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
nlohmann::json derivedPathsWithHintsToJSON(const BuiltPaths & buildables, ref<Store> store) {
|
template<typename T>
|
||||||
|
nlohmann::json stuffToJSON(const std::vector<T> & ts, ref<Store> store) {
|
||||||
auto res = nlohmann::json::array();
|
auto res = nlohmann::json::array();
|
||||||
for (const BuiltPath & buildable : buildables) {
|
for (const T & t : ts) {
|
||||||
std::visit([&res, store](const auto & buildable) {
|
std::visit([&res, store](const auto & t) {
|
||||||
res.push_back(buildable.toJSON(store));
|
res.push_back(t.toJSON(store));
|
||||||
}, buildable.raw());
|
}, t.raw());
|
||||||
}
|
}
|
||||||
return res;
|
return res;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
nlohmann::json derivedPathsWithHintsToJSON(const BuiltPaths & buildables, ref<Store> store)
|
||||||
|
{ return stuffToJSON<BuiltPath>(buildables, store); }
|
||||||
|
nlohmann::json derivedPathsToJSON(const DerivedPaths & paths, ref<Store> store)
|
||||||
|
{ return stuffToJSON<DerivedPath>(paths, store); }
|
||||||
|
|
||||||
|
|
||||||
std::string DerivedPath::Opaque::to_string(const Store & store) const {
|
std::string DerivedPath::Opaque::to_string(const Store & store) const {
|
||||||
return store.printStorePath(path);
|
return store.printStorePath(path);
|
||||||
|
|
|
@ -46,6 +46,7 @@ struct DerivedPathBuilt {
|
||||||
|
|
||||||
std::string to_string(const Store & store) const;
|
std::string to_string(const Store & store) const;
|
||||||
static DerivedPathBuilt parse(const Store & store, std::string_view);
|
static DerivedPathBuilt parse(const Store & store, std::string_view);
|
||||||
|
nlohmann::json toJSON(ref<Store> store) const;
|
||||||
};
|
};
|
||||||
|
|
||||||
using _DerivedPathRaw = std::variant<
|
using _DerivedPathRaw = std::variant<
|
||||||
|
@ -120,5 +121,6 @@ typedef std::vector<DerivedPath> DerivedPaths;
|
||||||
typedef std::vector<BuiltPath> BuiltPaths;
|
typedef std::vector<BuiltPath> BuiltPaths;
|
||||||
|
|
||||||
nlohmann::json derivedPathsWithHintsToJSON(const BuiltPaths & buildables, ref<Store> store);
|
nlohmann::json derivedPathsWithHintsToJSON(const BuiltPaths & buildables, ref<Store> store);
|
||||||
|
nlohmann::json derivedPathsToJSON(const DerivedPaths & , ref<Store> store);
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|
|
@ -1,13 +0,0 @@
|
||||||
#include "gc-store.hh"
|
|
||||||
|
|
||||||
namespace nix {
|
|
||||||
|
|
||||||
GcStore & requireGcStore(Store & store)
|
|
||||||
{
|
|
||||||
auto * gcStore = dynamic_cast<GcStore *>(&store);
|
|
||||||
if (!gcStore)
|
|
||||||
throw UsageError("Garbage collection not supported by this store");
|
|
||||||
return *gcStore;
|
|
||||||
}
|
|
||||||
|
|
||||||
}
|
|
|
@ -61,6 +61,8 @@ struct GCResults
|
||||||
|
|
||||||
struct GcStore : public virtual Store
|
struct GcStore : public virtual Store
|
||||||
{
|
{
|
||||||
|
inline static std::string operationName = "Garbage collection";
|
||||||
|
|
||||||
/* Add an indirect root, which is merely a symlink to `path' from
|
/* Add an indirect root, which is merely a symlink to `path' from
|
||||||
/nix/var/nix/gcroots/auto/<hash of `path'>. `path' is supposed
|
/nix/var/nix/gcroots/auto/<hash of `path'>. `path' is supposed
|
||||||
to be a symlink to a store path. The garbage collector will
|
to be a symlink to a store path. The garbage collector will
|
||||||
|
@ -79,6 +81,4 @@ struct GcStore : public virtual Store
|
||||||
virtual void collectGarbage(const GCOptions & options, GCResults & results) = 0;
|
virtual void collectGarbage(const GCOptions & options, GCResults & results) = 0;
|
||||||
};
|
};
|
||||||
|
|
||||||
GcStore & requireGcStore(Store & store);
|
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|
|
@ -678,7 +678,8 @@ void LocalStore::collectGarbage(const GCOptions & options, GCResults & results)
|
||||||
alive.insert(start);
|
alive.insert(start);
|
||||||
try {
|
try {
|
||||||
StorePathSet closure;
|
StorePathSet closure;
|
||||||
computeFSClosure(*path, closure);
|
computeFSClosure(*path, closure,
|
||||||
|
/* flipDirection */ false, gcKeepOutputs, gcKeepDerivations);
|
||||||
for (auto & p : closure)
|
for (auto & p : closure)
|
||||||
alive.insert(p);
|
alive.insert(p);
|
||||||
} catch (InvalidPath &) { }
|
} catch (InvalidPath &) { }
|
||||||
|
@ -841,7 +842,8 @@ void LocalStore::collectGarbage(const GCOptions & options, GCResults & results)
|
||||||
if (unlink(path.c_str()) == -1)
|
if (unlink(path.c_str()) == -1)
|
||||||
throw SysError("deleting '%1%'", path);
|
throw SysError("deleting '%1%'", path);
|
||||||
|
|
||||||
results.bytesFreed += st.st_size;
|
/* Do not accound for deleted file here. Rely on deletePath()
|
||||||
|
accounting. */
|
||||||
}
|
}
|
||||||
|
|
||||||
struct stat st;
|
struct stat st;
|
||||||
|
|
|
@ -2,6 +2,7 @@
|
||||||
|
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
#include "gc-store.hh"
|
#include "gc-store.hh"
|
||||||
|
#include "log-store.hh"
|
||||||
|
|
||||||
namespace nix {
|
namespace nix {
|
||||||
|
|
||||||
|
@ -24,7 +25,10 @@ struct LocalFSStoreConfig : virtual StoreConfig
|
||||||
"physical path to the Nix store"};
|
"physical path to the Nix store"};
|
||||||
};
|
};
|
||||||
|
|
||||||
class LocalFSStore : public virtual LocalFSStoreConfig, public virtual Store, virtual GcStore
|
class LocalFSStore : public virtual LocalFSStoreConfig,
|
||||||
|
public virtual Store,
|
||||||
|
public virtual GcStore,
|
||||||
|
public virtual LogStore
|
||||||
{
|
{
|
||||||
public:
|
public:
|
||||||
|
|
||||||
|
|
|
@ -698,11 +698,11 @@ void LocalStore::checkDerivationOutputs(const StorePath & drvPath, const Derivat
|
||||||
std::optional<Hash> h;
|
std::optional<Hash> h;
|
||||||
for (auto & i : drv.outputs) {
|
for (auto & i : drv.outputs) {
|
||||||
std::visit(overloaded {
|
std::visit(overloaded {
|
||||||
[&](const DerivationOutputInputAddressed & doia) {
|
[&](const DerivationOutput::InputAddressed & doia) {
|
||||||
if (!h) {
|
if (!h) {
|
||||||
// somewhat expensive so we do lazily
|
// somewhat expensive so we do lazily
|
||||||
auto temp = hashDerivationModulo(*this, drv, true);
|
auto h0 = hashDerivationModulo(*this, drv, true);
|
||||||
h = std::get<Hash>(temp);
|
h = h0.requireNoFixedNonDeferred();
|
||||||
}
|
}
|
||||||
StorePath recomputed = makeOutputPath(i.first, *h, drvName);
|
StorePath recomputed = makeOutputPath(i.first, *h, drvName);
|
||||||
if (doia.path != recomputed)
|
if (doia.path != recomputed)
|
||||||
|
@ -710,16 +710,17 @@ void LocalStore::checkDerivationOutputs(const StorePath & drvPath, const Derivat
|
||||||
printStorePath(drvPath), printStorePath(doia.path), printStorePath(recomputed));
|
printStorePath(drvPath), printStorePath(doia.path), printStorePath(recomputed));
|
||||||
envHasRightPath(doia.path, i.first);
|
envHasRightPath(doia.path, i.first);
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFixed & dof) {
|
[&](const DerivationOutput::CAFixed & dof) {
|
||||||
auto path = dof.path(*this, drvName, i.first);
|
auto path = dof.path(*this, drvName, i.first);
|
||||||
envHasRightPath(path, i.first);
|
envHasRightPath(path, i.first);
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputCAFloating &) {
|
[&](const DerivationOutput::CAFloating &) {
|
||||||
/* Nothing to check */
|
/* Nothing to check */
|
||||||
},
|
},
|
||||||
[&](const DerivationOutputDeferred &) {
|
[&](const DerivationOutput::Deferred &) {
|
||||||
|
/* Nothing to check */
|
||||||
},
|
},
|
||||||
}, i.second.output);
|
}, i.second.raw());
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
21
src/libstore/log-store.hh
Normal file
21
src/libstore/log-store.hh
Normal file
|
@ -0,0 +1,21 @@
|
||||||
|
#pragma once
|
||||||
|
|
||||||
|
#include "store-api.hh"
|
||||||
|
|
||||||
|
|
||||||
|
namespace nix {
|
||||||
|
|
||||||
|
struct LogStore : public virtual Store
|
||||||
|
{
|
||||||
|
inline static std::string operationName = "Build log storage and retrieval";
|
||||||
|
|
||||||
|
/* Return the build log of the specified store path, if available,
|
||||||
|
or null otherwise. */
|
||||||
|
virtual std::optional<std::string> getBuildLog(const StorePath & path) = 0;
|
||||||
|
|
||||||
|
virtual void addBuildLog(const StorePath & path, std::string_view log) = 0;
|
||||||
|
|
||||||
|
static LogStore & require(Store & store);
|
||||||
|
};
|
||||||
|
|
||||||
|
}
|
83
src/libstore/make-content-addressed.cc
Normal file
83
src/libstore/make-content-addressed.cc
Normal file
|
@ -0,0 +1,83 @@
|
||||||
|
#include "make-content-addressed.hh"
|
||||||
|
#include "references.hh"
|
||||||
|
|
||||||
|
namespace nix {
|
||||||
|
|
||||||
|
std::map<StorePath, StorePath> makeContentAddressed(
|
||||||
|
Store & srcStore,
|
||||||
|
Store & dstStore,
|
||||||
|
const StorePathSet & storePaths)
|
||||||
|
{
|
||||||
|
StorePathSet closure;
|
||||||
|
srcStore.computeFSClosure(storePaths, closure);
|
||||||
|
|
||||||
|
auto paths = srcStore.topoSortPaths(closure);
|
||||||
|
|
||||||
|
std::reverse(paths.begin(), paths.end());
|
||||||
|
|
||||||
|
std::map<StorePath, StorePath> remappings;
|
||||||
|
|
||||||
|
for (auto & path : paths) {
|
||||||
|
auto pathS = srcStore.printStorePath(path);
|
||||||
|
auto oldInfo = srcStore.queryPathInfo(path);
|
||||||
|
std::string oldHashPart(path.hashPart());
|
||||||
|
|
||||||
|
StringSink sink;
|
||||||
|
srcStore.narFromPath(path, sink);
|
||||||
|
|
||||||
|
StringMap rewrites;
|
||||||
|
|
||||||
|
PathReferences<StorePath> refs;
|
||||||
|
refs.hasSelfReference = oldInfo->hasSelfReference;
|
||||||
|
for (auto & ref : oldInfo->references) {
|
||||||
|
auto i = remappings.find(ref);
|
||||||
|
auto replacement = i != remappings.end() ? i->second : ref;
|
||||||
|
// FIXME: warn about unremapped paths?
|
||||||
|
if (replacement != ref) {
|
||||||
|
rewrites.insert_or_assign(srcStore.printStorePath(ref), srcStore.printStorePath(replacement));
|
||||||
|
refs.references.insert(std::move(replacement));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
sink.s = rewriteStrings(sink.s, rewrites);
|
||||||
|
|
||||||
|
HashModuloSink hashModuloSink(htSHA256, oldHashPart);
|
||||||
|
hashModuloSink(sink.s);
|
||||||
|
|
||||||
|
auto narModuloHash = hashModuloSink.finish().first;
|
||||||
|
|
||||||
|
ValidPathInfo info {
|
||||||
|
dstStore,
|
||||||
|
StorePathDescriptor {
|
||||||
|
.name = std::string { path.name() },
|
||||||
|
.info = FixedOutputInfo {
|
||||||
|
{
|
||||||
|
.method = FileIngestionMethod::Recursive,
|
||||||
|
.hash = narModuloHash,
|
||||||
|
},
|
||||||
|
std::move(refs),
|
||||||
|
},
|
||||||
|
},
|
||||||
|
Hash::dummy,
|
||||||
|
};
|
||||||
|
|
||||||
|
printInfo("rewriting '%s' to '%s'", pathS, dstStore.printStorePath(info.path));
|
||||||
|
|
||||||
|
StringSink sink2;
|
||||||
|
RewritingSink rsink2(oldHashPart, std::string(info.path.hashPart()), sink2);
|
||||||
|
rsink2(sink.s);
|
||||||
|
rsink2.flush();
|
||||||
|
|
||||||
|
info.narHash = hashString(htSHA256, sink2.s);
|
||||||
|
info.narSize = sink.s.size();
|
||||||
|
|
||||||
|
StringSource source(sink2.s);
|
||||||
|
dstStore.addToStore(info, source);
|
||||||
|
|
||||||
|
remappings.insert_or_assign(std::move(path), std::move(info.path));
|
||||||
|
}
|
||||||
|
|
||||||
|
return remappings;
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
12
src/libstore/make-content-addressed.hh
Normal file
12
src/libstore/make-content-addressed.hh
Normal file
|
@ -0,0 +1,12 @@
|
||||||
|
#pragma once
|
||||||
|
|
||||||
|
#include "store-api.hh"
|
||||||
|
|
||||||
|
namespace nix {
|
||||||
|
|
||||||
|
std::map<StorePath, StorePath> makeContentAddressed(
|
||||||
|
Store & srcStore,
|
||||||
|
Store & dstStore,
|
||||||
|
const StorePathSet & storePaths);
|
||||||
|
|
||||||
|
}
|
|
@ -87,7 +87,7 @@ std::optional<ContentAddress> getDerivationCA(const BasicDerivation & drv)
|
||||||
auto out = drv.outputs.find("out");
|
auto out = drv.outputs.find("out");
|
||||||
if (out == drv.outputs.end())
|
if (out == drv.outputs.end())
|
||||||
return std::nullopt;
|
return std::nullopt;
|
||||||
if (auto dof = std::get_if<DerivationOutputCAFixed>(&out->second.output)) {
|
if (auto dof = std::get_if<DerivationOutput::CAFixed>(&out->second)) {
|
||||||
return std::visit(overloaded {
|
return std::visit(overloaded {
|
||||||
[&](const TextInfo & ti) -> std::optional<ContentAddress> {
|
[&](const TextInfo & ti) -> std::optional<ContentAddress> {
|
||||||
if (!ti.references.empty())
|
if (!ti.references.empty())
|
||||||
|
|
|
@ -93,7 +93,7 @@ StringSet ParsedDerivation::getRequiredSystemFeatures() const
|
||||||
StringSet res;
|
StringSet res;
|
||||||
for (auto & i : getStringsAttr("requiredSystemFeatures").value_or(Strings()))
|
for (auto & i : getStringsAttr("requiredSystemFeatures").value_or(Strings()))
|
||||||
res.insert(i);
|
res.insert(i);
|
||||||
if (!derivationHasKnownOutputPaths(drv.type()))
|
if (!drv.type().hasKnownOutputPaths())
|
||||||
res.insert("ca-derivations");
|
res.insert("ca-derivations");
|
||||||
return res;
|
return res;
|
||||||
}
|
}
|
||||||
|
|
|
@ -1,5 +1,6 @@
|
||||||
#pragma once
|
#pragma once
|
||||||
|
|
||||||
|
#include "derivations.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
|
||||||
#include <nlohmann/json_fwd.hpp>
|
#include <nlohmann/json_fwd.hpp>
|
||||||
|
|
|
@ -5,6 +5,7 @@
|
||||||
|
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
#include "gc-store.hh"
|
#include "gc-store.hh"
|
||||||
|
#include "log-store.hh"
|
||||||
|
|
||||||
|
|
||||||
namespace nix {
|
namespace nix {
|
||||||
|
@ -30,7 +31,10 @@ struct RemoteStoreConfig : virtual StoreConfig
|
||||||
|
|
||||||
/* FIXME: RemoteStore is a misnomer - should be something like
|
/* FIXME: RemoteStore is a misnomer - should be something like
|
||||||
DaemonStore. */
|
DaemonStore. */
|
||||||
class RemoteStore : public virtual RemoteStoreConfig, public virtual Store, public virtual GcStore
|
class RemoteStore : public virtual RemoteStoreConfig,
|
||||||
|
public virtual Store,
|
||||||
|
public virtual GcStore,
|
||||||
|
public virtual LogStore
|
||||||
{
|
{
|
||||||
public:
|
public:
|
||||||
|
|
||||||
|
|
7
src/libstore/repair-flag.hh
Normal file
7
src/libstore/repair-flag.hh
Normal file
|
@ -0,0 +1,7 @@
|
||||||
|
#pragma once
|
||||||
|
|
||||||
|
namespace nix {
|
||||||
|
|
||||||
|
enum RepairFlag : bool { NoRepair = false, Repair = true };
|
||||||
|
|
||||||
|
}
|
|
@ -52,6 +52,10 @@ public:
|
||||||
bool sameMachine() override
|
bool sameMachine() override
|
||||||
{ return false; }
|
{ return false; }
|
||||||
|
|
||||||
|
// FIXME extend daemon protocol, move implementation to RemoteStore
|
||||||
|
std::optional<std::string> getBuildLog(const StorePath & path) override
|
||||||
|
{ unsupported("getBuildLog"); }
|
||||||
|
|
||||||
private:
|
private:
|
||||||
|
|
||||||
struct Connection : RemoteStore::Connection
|
struct Connection : RemoteStore::Connection
|
||||||
|
|
|
@ -1,6 +1,7 @@
|
||||||
#include "crypto.hh"
|
#include "crypto.hh"
|
||||||
#include "fs-accessor.hh"
|
#include "fs-accessor.hh"
|
||||||
#include "globals.hh"
|
#include "globals.hh"
|
||||||
|
#include "derivations.hh"
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
#include "util.hh"
|
#include "util.hh"
|
||||||
#include "nar-info-disk-cache.hh"
|
#include "nar-info-disk-cache.hh"
|
||||||
|
|
|
@ -10,8 +10,8 @@
|
||||||
#include "sync.hh"
|
#include "sync.hh"
|
||||||
#include "globals.hh"
|
#include "globals.hh"
|
||||||
#include "config.hh"
|
#include "config.hh"
|
||||||
#include "derivations.hh"
|
|
||||||
#include "path-info.hh"
|
#include "path-info.hh"
|
||||||
|
#include "repair-flag.hh"
|
||||||
|
|
||||||
#include <atomic>
|
#include <atomic>
|
||||||
#include <limits>
|
#include <limits>
|
||||||
|
@ -62,6 +62,8 @@ MakeError(BadStorePath, Error);
|
||||||
|
|
||||||
MakeError(InvalidStoreURI, Error);
|
MakeError(InvalidStoreURI, Error);
|
||||||
|
|
||||||
|
struct BasicDerivation;
|
||||||
|
struct Derivation;
|
||||||
class FSAccessor;
|
class FSAccessor;
|
||||||
class NarInfoDiskCache;
|
class NarInfoDiskCache;
|
||||||
class Store;
|
class Store;
|
||||||
|
@ -601,14 +603,6 @@ public:
|
||||||
*/
|
*/
|
||||||
StorePathSet exportReferences(const StorePathSet & storePaths, const StorePathSet & inputPaths);
|
StorePathSet exportReferences(const StorePathSet & storePaths, const StorePathSet & inputPaths);
|
||||||
|
|
||||||
/* Return the build log of the specified store path, if available,
|
|
||||||
or null otherwise. */
|
|
||||||
virtual std::optional<std::string> getBuildLog(const StorePath & path)
|
|
||||||
{ return std::nullopt; }
|
|
||||||
|
|
||||||
virtual void addBuildLog(const StorePath & path, std::string_view log)
|
|
||||||
{ unsupported("addBuildLog"); }
|
|
||||||
|
|
||||||
/* Hack to allow long-running processes like hydra-queue-runner to
|
/* Hack to allow long-running processes like hydra-queue-runner to
|
||||||
occasionally flush their path info cache. */
|
occasionally flush their path info cache. */
|
||||||
void clearPathInfoCache()
|
void clearPathInfoCache()
|
||||||
|
|
16
src/libstore/store-cast.hh
Normal file
16
src/libstore/store-cast.hh
Normal file
|
@ -0,0 +1,16 @@
|
||||||
|
#pragma once
|
||||||
|
|
||||||
|
#include "store-api.hh"
|
||||||
|
|
||||||
|
namespace nix {
|
||||||
|
|
||||||
|
template<typename T>
|
||||||
|
T & require(Store & store)
|
||||||
|
{
|
||||||
|
auto * castedStore = dynamic_cast<T *>(&store);
|
||||||
|
if (!castedStore)
|
||||||
|
throw UsageError("%s not supported by store '%s'", T::operationName, store.getUri());
|
||||||
|
return *castedStore;
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
|
@ -64,11 +64,12 @@ static void dumpContents(const Path & path, off_t size,
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
static void dump(const Path & path, Sink & sink, PathFilter & filter)
|
static time_t dump(const Path & path, Sink & sink, PathFilter & filter)
|
||||||
{
|
{
|
||||||
checkInterrupt();
|
checkInterrupt();
|
||||||
|
|
||||||
auto st = lstat(path);
|
auto st = lstat(path);
|
||||||
|
time_t result = st.st_mtime;
|
||||||
|
|
||||||
sink << "(";
|
sink << "(";
|
||||||
|
|
||||||
|
@ -103,7 +104,10 @@ static void dump(const Path & path, Sink & sink, PathFilter & filter)
|
||||||
for (auto & i : unhacked)
|
for (auto & i : unhacked)
|
||||||
if (filter(path + "/" + i.first)) {
|
if (filter(path + "/" + i.first)) {
|
||||||
sink << "entry" << "(" << "name" << i.first << "node";
|
sink << "entry" << "(" << "name" << i.first << "node";
|
||||||
dump(path + "/" + i.second, sink, filter);
|
auto tmp_mtime = dump(path + "/" + i.second, sink, filter);
|
||||||
|
if (tmp_mtime > result) {
|
||||||
|
result = tmp_mtime;
|
||||||
|
}
|
||||||
sink << ")";
|
sink << ")";
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
@ -114,13 +118,20 @@ static void dump(const Path & path, Sink & sink, PathFilter & filter)
|
||||||
else throw Error("file '%1%' has an unsupported type", path);
|
else throw Error("file '%1%' has an unsupported type", path);
|
||||||
|
|
||||||
sink << ")";
|
sink << ")";
|
||||||
|
|
||||||
|
return result;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
void dumpPath(const Path & path, Sink & sink, PathFilter & filter)
|
time_t dumpPathAndGetMtime(const Path & path, Sink & sink, PathFilter & filter)
|
||||||
{
|
{
|
||||||
sink << narVersionMagic1;
|
sink << narVersionMagic1;
|
||||||
dump(path, sink, filter);
|
return dump(path, sink, filter);
|
||||||
|
}
|
||||||
|
|
||||||
|
void dumpPath(const Path & path, Sink & sink, PathFilter & filter)
|
||||||
|
{
|
||||||
|
dumpPathAndGetMtime(path, sink, filter);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
|
|
|
@ -48,6 +48,10 @@ namespace nix {
|
||||||
void dumpPath(const Path & path, Sink & sink,
|
void dumpPath(const Path & path, Sink & sink,
|
||||||
PathFilter & filter = defaultPathFilter);
|
PathFilter & filter = defaultPathFilter);
|
||||||
|
|
||||||
|
/* Same as `void dumpPath()`, but returns the last modified date of the path */
|
||||||
|
time_t dumpPathAndGetMtime(const Path & path, Sink & sink,
|
||||||
|
PathFilter & filter = defaultPathFilter);
|
||||||
|
|
||||||
void dumpString(std::string_view s, Sink & sink);
|
void dumpString(std::string_view s, Sink & sink);
|
||||||
|
|
||||||
/* FIXME: fix this API, it sucks. */
|
/* FIXME: fix this API, it sucks. */
|
||||||
|
|
|
@ -54,6 +54,7 @@ typedef enum {
|
||||||
lvlVomit
|
lvlVomit
|
||||||
} Verbosity;
|
} Verbosity;
|
||||||
|
|
||||||
|
/* adjust Pos::origin bit width when adding stuff here */
|
||||||
typedef enum {
|
typedef enum {
|
||||||
foFile,
|
foFile,
|
||||||
foStdin,
|
foStdin,
|
||||||
|
|
|
@ -11,6 +11,7 @@ std::map<ExperimentalFeature, std::string> stringifiedXpFeatures = {
|
||||||
{ Xp::NixCommand, "nix-command" },
|
{ Xp::NixCommand, "nix-command" },
|
||||||
{ Xp::RecursiveNix, "recursive-nix" },
|
{ Xp::RecursiveNix, "recursive-nix" },
|
||||||
{ Xp::NoUrlLiterals, "no-url-literals" },
|
{ Xp::NoUrlLiterals, "no-url-literals" },
|
||||||
|
{ Xp::FetchClosure, "fetch-closure" },
|
||||||
};
|
};
|
||||||
|
|
||||||
const std::optional<ExperimentalFeature> parseExperimentalFeature(const std::string_view & name)
|
const std::optional<ExperimentalFeature> parseExperimentalFeature(const std::string_view & name)
|
||||||
|
|
|
@ -19,7 +19,8 @@ enum struct ExperimentalFeature
|
||||||
Flakes,
|
Flakes,
|
||||||
NixCommand,
|
NixCommand,
|
||||||
RecursiveNix,
|
RecursiveNix,
|
||||||
NoUrlLiterals
|
NoUrlLiterals,
|
||||||
|
FetchClosure,
|
||||||
};
|
};
|
||||||
|
|
||||||
/**
|
/**
|
||||||
|
|
|
@ -1,14 +1,13 @@
|
||||||
#pragma once
|
#pragma once
|
||||||
|
|
||||||
#include <functional>
|
|
||||||
|
|
||||||
/* A trivial class to run a function at the end of a scope. */
|
/* A trivial class to run a function at the end of a scope. */
|
||||||
|
template<typename Fn>
|
||||||
class Finally
|
class Finally
|
||||||
{
|
{
|
||||||
private:
|
private:
|
||||||
std::function<void()> fun;
|
Fn fun;
|
||||||
|
|
||||||
public:
|
public:
|
||||||
Finally(std::function<void()> fun) : fun(fun) { }
|
Finally(Fn fun) : fun(std::move(fun)) { }
|
||||||
~Finally() { fun(); }
|
~Finally() { fun(); }
|
||||||
};
|
};
|
||||||
|
|
|
@ -39,30 +39,32 @@ void TarArchive::check(int err, const std::string & reason)
|
||||||
throw Error(reason, archive_error_string(this->archive));
|
throw Error(reason, archive_error_string(this->archive));
|
||||||
}
|
}
|
||||||
|
|
||||||
TarArchive::TarArchive(Source & source, bool raw)
|
TarArchive::TarArchive(Source & source, bool raw) : buffer(4096)
|
||||||
: source(&source), buffer(4096)
|
|
||||||
{
|
{
|
||||||
init();
|
this->archive = archive_read_new();
|
||||||
if (!raw)
|
this->source = &source;
|
||||||
|
|
||||||
|
if (!raw) {
|
||||||
|
archive_read_support_filter_all(archive);
|
||||||
archive_read_support_format_all(archive);
|
archive_read_support_format_all(archive);
|
||||||
else
|
} else {
|
||||||
|
archive_read_support_filter_all(archive);
|
||||||
archive_read_support_format_raw(archive);
|
archive_read_support_format_raw(archive);
|
||||||
|
archive_read_support_format_empty(archive);
|
||||||
|
}
|
||||||
check(archive_read_open(archive, (void *)this, callback_open, callback_read, callback_close), "Failed to open archive (%s)");
|
check(archive_read_open(archive, (void *)this, callback_open, callback_read, callback_close), "Failed to open archive (%s)");
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
TarArchive::TarArchive(const Path & path)
|
TarArchive::TarArchive(const Path & path)
|
||||||
{
|
{
|
||||||
init();
|
this->archive = archive_read_new();
|
||||||
|
|
||||||
|
archive_read_support_filter_all(archive);
|
||||||
archive_read_support_format_all(archive);
|
archive_read_support_format_all(archive);
|
||||||
check(archive_read_open_filename(archive, path.c_str(), 16384), "failed to open archive: %s");
|
check(archive_read_open_filename(archive, path.c_str(), 16384), "failed to open archive: %s");
|
||||||
}
|
}
|
||||||
|
|
||||||
void TarArchive::init()
|
|
||||||
{
|
|
||||||
archive = archive_read_new();
|
|
||||||
archive_read_support_filter_all(archive);
|
|
||||||
}
|
|
||||||
|
|
||||||
void TarArchive::close()
|
void TarArchive::close()
|
||||||
{
|
{
|
||||||
check(archive_read_close(this->archive), "Failed to close archive (%s)");
|
check(archive_read_close(this->archive), "Failed to close archive (%s)");
|
||||||
|
|
|
@ -17,13 +17,10 @@ struct TarArchive {
|
||||||
// disable copy constructor
|
// disable copy constructor
|
||||||
TarArchive(const TarArchive &) = delete;
|
TarArchive(const TarArchive &) = delete;
|
||||||
|
|
||||||
void init();
|
|
||||||
|
|
||||||
void close();
|
void close();
|
||||||
|
|
||||||
~TarArchive();
|
~TarArchive();
|
||||||
};
|
};
|
||||||
|
|
||||||
void unpackTarfile(Source & source, const Path & destDir);
|
void unpackTarfile(Source & source, const Path & destDir);
|
||||||
|
|
||||||
void unpackTarfile(const Path & tarFile, const Path & destDir);
|
void unpackTarfile(const Path & tarFile, const Path & destDir);
|
||||||
|
|
|
@ -406,8 +406,29 @@ static void _deletePath(int parentfd, const Path & path, uint64_t & bytesFreed)
|
||||||
throw SysError("getting status of '%1%'", path);
|
throw SysError("getting status of '%1%'", path);
|
||||||
}
|
}
|
||||||
|
|
||||||
if (!S_ISDIR(st.st_mode) && st.st_nlink == 1)
|
if (!S_ISDIR(st.st_mode)) {
|
||||||
bytesFreed += st.st_size;
|
/* We are about to delete a file. Will it likely free space? */
|
||||||
|
|
||||||
|
switch (st.st_nlink) {
|
||||||
|
/* Yes: last link. */
|
||||||
|
case 1:
|
||||||
|
bytesFreed += st.st_size;
|
||||||
|
break;
|
||||||
|
/* Maybe: yes, if 'auto-optimise-store' or manual optimisation
|
||||||
|
was performed. Instead of checking for real let's assume
|
||||||
|
it's an optimised file and space will be freed.
|
||||||
|
|
||||||
|
In worst case we will double count on freed space for files
|
||||||
|
with exactly two hardlinks for unoptimised packages.
|
||||||
|
*/
|
||||||
|
case 2:
|
||||||
|
bytesFreed += st.st_size;
|
||||||
|
break;
|
||||||
|
/* No: 3+ links. */
|
||||||
|
default:
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
if (S_ISDIR(st.st_mode)) {
|
if (S_ISDIR(st.st_mode)) {
|
||||||
/* Make the directory accessible. */
|
/* Make the directory accessible. */
|
||||||
|
@ -682,7 +703,14 @@ std::string drainFD(int fd, bool block, const size_t reserveSize)
|
||||||
|
|
||||||
void drainFD(int fd, Sink & sink, bool block)
|
void drainFD(int fd, Sink & sink, bool block)
|
||||||
{
|
{
|
||||||
int saved;
|
// silence GCC maybe-uninitialized warning in finally
|
||||||
|
int saved = 0;
|
||||||
|
|
||||||
|
if (!block) {
|
||||||
|
saved = fcntl(fd, F_GETFL);
|
||||||
|
if (fcntl(fd, F_SETFL, saved | O_NONBLOCK) == -1)
|
||||||
|
throw SysError("making file descriptor non-blocking");
|
||||||
|
}
|
||||||
|
|
||||||
Finally finally([&]() {
|
Finally finally([&]() {
|
||||||
if (!block) {
|
if (!block) {
|
||||||
|
@ -691,12 +719,6 @@ void drainFD(int fd, Sink & sink, bool block)
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
if (!block) {
|
|
||||||
saved = fcntl(fd, F_GETFL);
|
|
||||||
if (fcntl(fd, F_SETFL, saved | O_NONBLOCK) == -1)
|
|
||||||
throw SysError("making file descriptor non-blocking");
|
|
||||||
}
|
|
||||||
|
|
||||||
std::vector<unsigned char> buf(64 * 1024);
|
std::vector<unsigned char> buf(64 * 1024);
|
||||||
while (1) {
|
while (1) {
|
||||||
checkInterrupt();
|
checkInterrupt();
|
||||||
|
|
|
@ -325,8 +325,7 @@ static void main_nix_build(int argc, char * * argv)
|
||||||
|
|
||||||
state->printStats();
|
state->printStats();
|
||||||
|
|
||||||
auto buildPaths = [&](const std::vector<StorePathWithOutputs> & paths0) {
|
auto buildPaths = [&](const std::vector<DerivedPath> & paths) {
|
||||||
auto paths = toDerivedPaths(paths0);
|
|
||||||
/* Note: we do this even when !printMissing to efficiently
|
/* Note: we do this even when !printMissing to efficiently
|
||||||
fetch binary cache data. */
|
fetch binary cache data. */
|
||||||
uint64_t downloadSize, narSize;
|
uint64_t downloadSize, narSize;
|
||||||
|
@ -348,7 +347,7 @@ static void main_nix_build(int argc, char * * argv)
|
||||||
auto & drvInfo = drvs.front();
|
auto & drvInfo = drvs.front();
|
||||||
auto drv = evalStore->derivationFromPath(drvInfo.requireDrvPath());
|
auto drv = evalStore->derivationFromPath(drvInfo.requireDrvPath());
|
||||||
|
|
||||||
std::vector<StorePathWithOutputs> pathsToBuild;
|
std::vector<DerivedPath> pathsToBuild;
|
||||||
RealisedPath::Set pathsToCopy;
|
RealisedPath::Set pathsToCopy;
|
||||||
|
|
||||||
/* Figure out what bash shell to use. If $NIX_BUILD_SHELL
|
/* Figure out what bash shell to use. If $NIX_BUILD_SHELL
|
||||||
|
@ -370,7 +369,10 @@ static void main_nix_build(int argc, char * * argv)
|
||||||
throw Error("the 'bashInteractive' attribute in <nixpkgs> did not evaluate to a derivation");
|
throw Error("the 'bashInteractive' attribute in <nixpkgs> did not evaluate to a derivation");
|
||||||
|
|
||||||
auto bashDrv = drv->requireDrvPath();
|
auto bashDrv = drv->requireDrvPath();
|
||||||
pathsToBuild.push_back({bashDrv});
|
pathsToBuild.push_back(DerivedPath::Built {
|
||||||
|
.drvPath = bashDrv,
|
||||||
|
.outputs = {},
|
||||||
|
});
|
||||||
pathsToCopy.insert(bashDrv);
|
pathsToCopy.insert(bashDrv);
|
||||||
shellDrv = bashDrv;
|
shellDrv = bashDrv;
|
||||||
|
|
||||||
|
@ -382,17 +384,24 @@ static void main_nix_build(int argc, char * * argv)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Build or fetch all dependencies of the derivation.
|
// Build or fetch all dependencies of the derivation.
|
||||||
for (const auto & input : drv.inputDrvs)
|
for (const auto & [inputDrv0, inputOutputs] : drv.inputDrvs) {
|
||||||
|
// To get around lambda capturing restrictions in the
|
||||||
|
// standard.
|
||||||
|
const auto & inputDrv = inputDrv0;
|
||||||
if (std::all_of(envExclude.cbegin(), envExclude.cend(),
|
if (std::all_of(envExclude.cbegin(), envExclude.cend(),
|
||||||
[&](const std::string & exclude) {
|
[&](const std::string & exclude) {
|
||||||
return !std::regex_search(store->printStorePath(input.first), std::regex(exclude));
|
return !std::regex_search(store->printStorePath(inputDrv), std::regex(exclude));
|
||||||
}))
|
}))
|
||||||
{
|
{
|
||||||
pathsToBuild.push_back({input.first, input.second});
|
pathsToBuild.push_back(DerivedPath::Built {
|
||||||
pathsToCopy.insert(input.first);
|
.drvPath = inputDrv,
|
||||||
|
.outputs = inputOutputs
|
||||||
|
});
|
||||||
|
pathsToCopy.insert(inputDrv);
|
||||||
}
|
}
|
||||||
|
}
|
||||||
for (const auto & src : drv.inputSrcs) {
|
for (const auto & src : drv.inputSrcs) {
|
||||||
pathsToBuild.push_back({src});
|
pathsToBuild.push_back(DerivedPath::Opaque{src});
|
||||||
pathsToCopy.insert(src);
|
pathsToCopy.insert(src);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -543,7 +552,7 @@ static void main_nix_build(int argc, char * * argv)
|
||||||
|
|
||||||
else {
|
else {
|
||||||
|
|
||||||
std::vector<StorePathWithOutputs> pathsToBuild;
|
std::vector<DerivedPath> pathsToBuild;
|
||||||
std::vector<std::pair<StorePath, std::string>> pathsToBuildOrdered;
|
std::vector<std::pair<StorePath, std::string>> pathsToBuildOrdered;
|
||||||
RealisedPath::Set drvsToCopy;
|
RealisedPath::Set drvsToCopy;
|
||||||
|
|
||||||
|
@ -556,7 +565,7 @@ static void main_nix_build(int argc, char * * argv)
|
||||||
if (outputName == "")
|
if (outputName == "")
|
||||||
throw Error("derivation '%s' lacks an 'outputName' attribute", store->printStorePath(drvPath));
|
throw Error("derivation '%s' lacks an 'outputName' attribute", store->printStorePath(drvPath));
|
||||||
|
|
||||||
pathsToBuild.push_back({drvPath, {outputName}});
|
pathsToBuild.push_back(DerivedPath::Built{drvPath, {outputName}});
|
||||||
pathsToBuildOrdered.push_back({drvPath, {outputName}});
|
pathsToBuildOrdered.push_back({drvPath, {outputName}});
|
||||||
drvsToCopy.insert(drvPath);
|
drvsToCopy.insert(drvPath);
|
||||||
|
|
||||||
|
|
|
@ -1,4 +1,5 @@
|
||||||
#include "store-api.hh"
|
#include "store-api.hh"
|
||||||
|
#include "store-cast.hh"
|
||||||
#include "gc-store.hh"
|
#include "gc-store.hh"
|
||||||
#include "profiles.hh"
|
#include "profiles.hh"
|
||||||
#include "shared.hh"
|
#include "shared.hh"
|
||||||
|
@ -81,7 +82,7 @@ static int main_nix_collect_garbage(int argc, char * * argv)
|
||||||
// Run the actual garbage collector.
|
// Run the actual garbage collector.
|
||||||
if (!dryRun) {
|
if (!dryRun) {
|
||||||
auto store = openStore();
|
auto store = openStore();
|
||||||
auto & gcStore = requireGcStore(*store);
|
auto & gcStore = require<GcStore>(*store);
|
||||||
options.action = GCOptions::gcDeleteDead;
|
options.action = GCOptions::gcDeleteDead;
|
||||||
GCResults results;
|
GCResults results;
|
||||||
PrintFreed freed(true, results);
|
PrintFreed freed(true, results);
|
||||||
|
|
|
@ -128,7 +128,12 @@ static void getAllExprs(EvalState & state,
|
||||||
if (hasSuffix(attrName, ".nix"))
|
if (hasSuffix(attrName, ".nix"))
|
||||||
attrName = std::string(attrName, 0, attrName.size() - 4);
|
attrName = std::string(attrName, 0, attrName.size() - 4);
|
||||||
if (!seen.insert(attrName).second) {
|
if (!seen.insert(attrName).second) {
|
||||||
printError("warning: name collision in input Nix expressions, skipping '%1%'", path2);
|
std::string suggestionMessage = "";
|
||||||
|
if (path2.find("channels") != std::string::npos && path.find("channels") != std::string::npos) {
|
||||||
|
suggestionMessage = fmt("\nsuggestion: remove '%s' from either the root channels or the user channels", attrName);
|
||||||
|
}
|
||||||
|
printError("warning: name collision in input Nix expressions, skipping '%1%'"
|
||||||
|
"%2%", path2, suggestionMessage);
|
||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
/* Load the expression on demand. */
|
/* Load the expression on demand. */
|
||||||
|
@@ -918,12 +923,17 @@ static void queryJSON(Globals & globals, std::vector<DrvInfo> & elems, bool prin
        pkgObj.attr("pname", drvName.name);
        pkgObj.attr("version", drvName.version);
        pkgObj.attr("system", i.querySystem());
+       pkgObj.attr("outputName", i.queryOutputName());

-       if (printOutPath) {
-           DrvInfo::Outputs outputs = i.queryOutputs();
+       {
+           DrvInfo::Outputs outputs = i.queryOutputs(printOutPath);
            JSONObject outputObj = pkgObj.object("outputs");
-           for (auto & j : outputs)
-               outputObj.attr(j.first, globals.state->store->printStorePath(j.second));
+           for (auto & j : outputs) {
+               if (j.second)
+                   outputObj.attr(j.first, globals.state->store->printStorePath(*j.second));
+               else
+                   outputObj.attr(j.first, nullptr);
+           }
        }

        if (printMeta) {
@@ -1052,6 +1062,7 @@ static void opQuery(Globals & globals, Strings opFlags, Strings opArgs)
    /* Print the desired columns, or XML output. */
    if (jsonOutput) {
        queryJSON(globals, elems, printOutPath, printMeta);
+       cout << '\n';
        return;
    }
@@ -1154,13 +1165,16 @@ static void opQuery(Globals & globals, Strings opFlags, Strings opArgs)
                columns.push_back(drvPath ? store.printStorePath(*drvPath) : "-");
            }

+           if (xmlOutput)
+               attrs["outputName"] = i.queryOutputName();
+
            if (printOutPath && !xmlOutput) {
                DrvInfo::Outputs outputs = i.queryOutputs();
                std::string s;
                for (auto & j : outputs) {
                    if (!s.empty()) s += ';';
                    if (j.first != "out") { s += j.first; s += "="; }
-                   s += store.printStorePath(j.second);
+                   s += store.printStorePath(*j.second);
                }
                columns.push_back(s);
            }
@@ -1174,71 +1188,67 @@ static void opQuery(Globals & globals, Strings opFlags, Strings opArgs)
            }

            if (xmlOutput) {
-               if (printOutPath || printMeta) {
-                   XMLOpenElement item(xml, "item", attrs);
-                   if (printOutPath) {
-                       DrvInfo::Outputs outputs = i.queryOutputs();
-                       for (auto & j : outputs) {
-                           XMLAttrs attrs2;
-                           attrs2["name"] = j.first;
-                           attrs2["path"] = store.printStorePath(j.second);
-                           xml.writeEmptyElement("output", attrs2);
-                       }
-                   }
+               XMLOpenElement item(xml, "item", attrs);
+               DrvInfo::Outputs outputs = i.queryOutputs(printOutPath);
+               for (auto & j : outputs) {
+                   XMLAttrs attrs2;
+                   attrs2["name"] = j.first;
+                   if (j.second)
+                       attrs2["path"] = store.printStorePath(*j.second);
+                   xml.writeEmptyElement("output", attrs2);
+               }
                if (printMeta) {
                    StringSet metaNames = i.queryMetaNames();
                    for (auto & j : metaNames) {
                        XMLAttrs attrs2;
                        attrs2["name"] = j;
                        Value * v = i.queryMeta(j);
                        if (!v)
                            printError(
                                "derivation '%s' has invalid meta attribute '%s'",
                                i.queryName(), j);
                        else {
                            if (v->type() == nString) {
                                attrs2["type"] = "string";
                                attrs2["value"] = v->string.s;
                                xml.writeEmptyElement("meta", attrs2);
                            } else if (v->type() == nInt) {
                                attrs2["type"] = "int";
                                attrs2["value"] = (format("%1%") % v->integer).str();
                                xml.writeEmptyElement("meta", attrs2);
                            } else if (v->type() == nFloat) {
                                attrs2["type"] = "float";
                                attrs2["value"] = (format("%1%") % v->fpoint).str();
                                xml.writeEmptyElement("meta", attrs2);
                            } else if (v->type() == nBool) {
                                attrs2["type"] = "bool";
                                attrs2["value"] = v->boolean ? "true" : "false";
                                xml.writeEmptyElement("meta", attrs2);
                            } else if (v->type() == nList) {
                                attrs2["type"] = "strings";
                                XMLOpenElement m(xml, "meta", attrs2);
                                for (auto elem : v->listItems()) {
                                    if (elem->type() != nString) continue;
                                    XMLAttrs attrs3;
                                    attrs3["value"] = elem->string.s;
                                    xml.writeEmptyElement("string", attrs3);
                                }
                            } else if (v->type() == nAttrs) {
                                attrs2["type"] = "strings";
                                XMLOpenElement m(xml, "meta", attrs2);
                                Bindings & attrs = *v->attrs;
                                for (auto &i : attrs) {
                                    Attr & a(*attrs.find(i.name));
                                    if(a.value->type() != nString) continue;
                                    XMLAttrs attrs3;
                                    attrs3["type"] = i.name;
                                    attrs3["value"] = a.value->string.s;
                                    xml.writeEmptyElement("string", attrs3);
                                }
                            }
                        }
                    }
                }
-               } else
-                   xml.writeEmptyElement("item", attrs);
            } else
                table.push_back(columns);
@@ -56,7 +56,7 @@ bool createUserEnv(EvalState & state, DrvInfos & elems,
           output paths, and optionally the derivation path, as well
           as the meta attributes. */
        std::optional<StorePath> drvPath = keepDerivations ? i.queryDrvPath() : std::nullopt;
-       DrvInfo::Outputs outputs = i.queryOutputs(true);
+       DrvInfo::Outputs outputs = i.queryOutputs(true, true);
        StringSet metaNames = i.queryMetaNames();

        auto attrs = state.buildBindings(7 + outputs.size());
@@ -76,15 +76,15 @@ bool createUserEnv(EvalState & state, DrvInfos & elems,
        for (const auto & [m, j] : enumerate(outputs)) {
            (vOutputs.listElems()[m] = state.allocValue())->mkString(j.first);
            auto outputAttrs = state.buildBindings(2);
-           outputAttrs.alloc(state.sOutPath).mkString(state.store->printStorePath(j.second));
+           outputAttrs.alloc(state.sOutPath).mkString(state.store->printStorePath(*j.second));
            attrs.alloc(j.first).mkAttrs(outputAttrs);

            /* This is only necessary when installing store paths, e.g.,
               `nix-env -i /nix/store/abcd...-foo'. */
-           state.store->addTempRoot(j.second);
-           state.store->ensurePath(j.second);
+           state.store->addTempRoot(*j.second);
+           state.store->ensurePath(*j.second);

-           references.insert(j.second);
+           references.insert(*j.second);
        }

        // Copy the meta attributes.
@@ -105,8 +105,10 @@ bool createUserEnv(EvalState & state, DrvInfos & elems,
    /* Also write a copy of the list of user environment elements to
       the store; we need it for future modifications of the
       environment. */
+   std::ostringstream str;
+   manifest.print(str, true);
    auto manifestFile = state.store->addTextToStore("env-manifest.nix",
-       fmt("%s", manifest), references);
+       str.str(), references);

    /* Get the environment builder expression. */
    Value envBuilder;
@@ -3,7 +3,9 @@
 #include "dotgraph.hh"
 #include "globals.hh"
 #include "build-result.hh"
+#include "store-cast.hh"
 #include "gc-store.hh"
+#include "log-store.hh"
 #include "local-store.hh"
 #include "monitor-fd.hh"
 #include "serve-protocol.hh"
@@ -435,7 +437,7 @@ static void opQuery(Strings opFlags, Strings opArgs)
                store->computeFSClosure(
                    args, referrers, true, settings.gcKeepOutputs, settings.gcKeepDerivations);

-               auto & gcStore = requireGcStore(*store);
+               auto & gcStore = require<GcStore>(*store);
                Roots roots = gcStore.findRoots(false);
                for (auto & [target, links] : roots)
                    if (referrers.find(target) != referrers.end())
@@ -480,13 +482,15 @@ static void opReadLog(Strings opFlags, Strings opArgs)
 {
    if (!opFlags.empty()) throw UsageError("unknown flag");

+   auto & logStore = require<LogStore>(*store);
+
    RunPager pager;

    for (auto & i : opArgs) {
-       auto path = store->followLinksToStorePath(i);
-       auto log = store->getBuildLog(path);
+       auto path = logStore.followLinksToStorePath(i);
+       auto log = logStore.getBuildLog(path);
        if (!log)
-           throw Error("build log of derivation '%s' is not available", store->printStorePath(path));
+           throw Error("build log of derivation '%s' is not available", logStore.printStorePath(path));
        std::cout << *log;
    }
 }
@@ -596,7 +600,7 @@ static void opGC(Strings opFlags, Strings opArgs)

    if (!opArgs.empty()) throw UsageError("no arguments expected");

-   auto & gcStore = requireGcStore(*store);
+   auto & gcStore = require<GcStore>(*store);

    if (printRoots) {
        Roots roots = gcStore.findRoots(false);
@@ -635,7 +639,7 @@ static void opDelete(Strings opFlags, Strings opArgs)
    for (auto & i : opArgs)
        options.pathsToDelete.insert(store->followLinksToStorePath(i));

-   auto & gcStore = requireGcStore(*store);
+   auto & gcStore = require<GcStore>(*store);

    GCResults results;
    PrintFreed freed(true, results);
@@ -4,6 +4,7 @@
 #include "eval-cache.hh"
 #include "names.hh"
 #include "command.hh"
+#include "derivations.hh"

 namespace nix {
@@ -70,7 +71,7 @@ UnresolvedApp Installable::toApp(EvalState & state)

        std::vector<StorePathWithOutputs> context2;
        for (auto & [path, name] : context)
-           context2.push_back({state.store->parseStorePath(path), {name}});
+           context2.push_back({path, {name}});

        return UnresolvedApp{App {
            .context = std::move(context2),
@@ -52,15 +52,26 @@ struct CmdBuild : InstallablesCommand, MixDryRun, MixJSON, MixProfile

    void run(ref<Store> store) override
    {
+       if (dryRun) {
+           std::vector<DerivedPath> pathsToBuild;
+
+           for (auto & i : installables) {
+               auto b = i->toDerivedPaths();
+               pathsToBuild.insert(pathsToBuild.end(), b.begin(), b.end());
+           }
+           printMissing(store, pathsToBuild, lvlError);
+           if (json)
+               logger->cout("%s", derivedPathsToJSON(pathsToBuild, store).dump());
+           return;
+       }
+
        auto buildables = Installable::build(
            getEvalStore(), store,
-           dryRun ? Realise::Derivation : Realise::Outputs,
+           Realise::Outputs,
            installables, buildMode);

        if (json) logger->cout("%s", derivedPathsWithHintsToJSON(buildables, store).dump());

-       if (dryRun) return;
-
        if (outLink != "")
            if (auto store2 = store.dynamic_pointer_cast<LocalFSStore>())
                for (const auto & [_i, buildable] : enumerate(buildables)) {
@@ -196,21 +196,22 @@ static StorePath getDerivationEnvironment(ref<Store> store, ref<Store> evalStore
    drv.inputSrcs.insert(std::move(getEnvShPath));
    if (settings.isExperimentalFeatureEnabled(Xp::CaDerivations)) {
        for (auto & output : drv.outputs) {
-           output.second = {
-               .output = DerivationOutputDeferred{},
-           };
+           output.second = DerivationOutput::Deferred {},
            drv.env[output.first] = hashPlaceholder(output.first);
        }
    } else {
        for (auto & output : drv.outputs) {
-           output.second = { .output = DerivationOutputInputAddressed { .path = StorePath::dummy } };
+           output.second = DerivationOutput::Deferred { };
            drv.env[output.first] = "";
        }
-       Hash h = std::get<0>(hashDerivationModulo(*evalStore, drv, true));
+       auto h0 = hashDerivationModulo(*evalStore, drv, true);
+       const Hash & h = h0.requireNoFixedNonDeferred();

        for (auto & output : drv.outputs) {
            auto outPath = store->makeOutputPath(output.first, h, drv.name);
-           output.second = { .output = DerivationOutputInputAddressed { .path = outPath } };
+           output.second = DerivationOutput::InputAddressed {
+               .path = outPath,
+           };
            drv.env[output.first] = store->printStorePath(outPath);
        }
    }
@@ -24,8 +24,8 @@ this attribute to the location of the definition of the
 `meta.description`, `version` or `name` derivation attributes.

 The editor to invoke is specified by the `EDITOR` environment
-variable. It defaults to `cat`. If the editor is `emacs`, `nano` or
-`vim`, it is passed the line number of the derivation using the
-argument `+<lineno>`.
+variable. It defaults to `cat`. If the editor is `emacs`, `nano`,
+`vim` or `kak`, it is passed the line number of the derivation using
+the argument `+<lineno>`.

 )""
@@ -2,6 +2,7 @@
 #include "common-args.hh"
 #include "shared.hh"
 #include "store-api.hh"
+#include "log-store.hh"
 #include "progress-bar.hh"

 using namespace nix;
@@ -34,17 +35,24 @@ struct CmdLog : InstallableCommand

        RunPager pager;
        for (auto & sub : subs) {
+           auto * logSubP = dynamic_cast<LogStore *>(&*sub);
+           if (!logSubP) {
+               printInfo("Skipped '%s' which does not support retrieving build logs", sub->getUri());
+               continue;
+           }
+           auto & logSub = *logSubP;
+
            auto log = std::visit(overloaded {
                [&](const DerivedPath::Opaque & bo) {
-                   return sub->getBuildLog(bo.path);
+                   return logSub.getBuildLog(bo.path);
                },
                [&](const DerivedPath::Built & bfd) {
-                   return sub->getBuildLog(bfd.drvPath);
+                   return logSub.getBuildLog(bfd.drvPath);
                },
            }, b.raw());
            if (!log) continue;
            stopProgressBar();
-           printInfo("got build log for '%s' from '%s'", installable->what(), sub->getUri());
+           printInfo("got build log for '%s' from '%s'", installable->what(), logSub.getUri());
            std::cout << *log;
            return;
        }
@@ -117,7 +117,7 @@ struct NixArgs : virtual MultiCommand, virtual MixCommonArgs
        {"hash-path", {"hash", "path"}},
        {"ls-nar", {"nar", "ls"}},
        {"ls-store", {"store", "ls"}},
-       {"make-content-addressable", {"store", "make-content-addressable"}},
+       {"make-content-addressable", {"store", "make-content-addressed"}},
        {"optimise-store", {"store", "optimise"}},
        {"ping-store", {"store", "ping"}},
        {"sign-paths", {"store", "sign"}},
@@ -289,6 +289,7 @@ void mainWrapped(int argc, char * * argv)
    }

    if (argc == 2 && std::string(argv[1]) == "__dump-builtins") {
+       settings.experimentalFeatures = {Xp::Flakes, Xp::FetchClosure};
        evalSettings.pureEval = false;
        EvalState state({}, openStore("dummy://"));
        auto res = nlohmann::json::object();
@ -1,102 +0,0 @@
|
||||||
#include "command.hh"
|
|
||||||
#include "store-api.hh"
|
|
||||||
#include "references.hh"
|
|
||||||
#include "common-args.hh"
|
|
||||||
#include "json.hh"
|
|
||||||
|
|
||||||
using namespace nix;
|
|
||||||
|
|
||||||
struct CmdMakeContentAddressable : StorePathsCommand, MixJSON
|
|
||||||
{
|
|
||||||
CmdMakeContentAddressable()
|
|
||||||
{
|
|
||||||
realiseMode = Realise::Outputs;
|
|
||||||
}
|
|
||||||
|
|
||||||
std::string description() override
|
|
||||||
{
|
|
||||||
return "rewrite a path or closure to content-addressed form";
|
|
||||||
}
|
|
||||||
|
|
||||||
std::string doc() override
|
|
||||||
{
|
|
||||||
return
|
|
||||||
#include "make-content-addressable.md"
|
|
||||||
;
|
|
||||||
}
|
|
||||||
|
|
||||||
void run(ref<Store> store, StorePaths && storePaths) override
|
|
||||||
{
|
|
||||||
auto paths = store->topoSortPaths(StorePathSet(storePaths.begin(), storePaths.end()));
|
|
||||||
|
|
||||||
std::reverse(paths.begin(), paths.end());
|
|
||||||
|
|
||||||
std::map<StorePath, StorePath> remappings;
|
|
||||||
|
|
||||||
auto jsonRoot = json ? std::make_unique<JSONObject>(std::cout) : nullptr;
|
|
||||||
auto jsonRewrites = json ? std::make_unique<JSONObject>(jsonRoot->object("rewrites")) : nullptr;
|
|
||||||
|
|
||||||
for (auto & path : paths) {
|
|
||||||
auto pathS = store->printStorePath(path);
|
|
||||||
auto oldInfo = store->queryPathInfo(path);
|
|
||||||
std::string oldHashPart(path.hashPart());
|
|
||||||
|
|
||||||
StringSink sink;
|
|
||||||
store->narFromPath(path, sink);
|
|
||||||
|
|
||||||
StringMap rewrites;
|
|
||||||
|
|
||||||
PathReferences<StorePath> refs;
|
|
||||||
refs.hasSelfReference = oldInfo->hasSelfReference;
|
|
||||||
for (auto & ref : oldInfo->references) {
|
|
||||||
auto i = remappings.find(ref);
|
|
||||||
auto replacement = i != remappings.end() ? i->second : ref;
|
|
||||||
// FIXME: warn about unremapped paths?
|
|
||||||
if (replacement != ref)
|
|
||||||
rewrites.insert_or_assign(store->printStorePath(ref), store->printStorePath(replacement));
|
|
||||||
refs.references.insert(std::move(replacement));
|
|
||||||
}
|
|
||||||
|
|
||||||
sink.s = rewriteStrings(sink.s, rewrites);
|
|
||||||
|
|
||||||
HashModuloSink hashModuloSink(htSHA256, oldHashPart);
|
|
||||||
hashModuloSink(sink.s);
|
|
||||||
|
|
||||||
auto narHash = hashModuloSink.finish().first;
|
|
||||||
|
|
||||||
ValidPathInfo info {
|
|
||||||
*store,
|
|
||||||
StorePathDescriptor {
|
|
||||||
.name = std::string { path.name() },
|
|
||||||
.info = FixedOutputInfo {
|
|
||||||
{
|
|
||||||
.method = FileIngestionMethod::Recursive,
|
|
||||||
.hash = narHash,
|
|
||||||
},
|
|
||||||
std::move(refs),
|
|
||||||
},
|
|
||||||
},
|
|
||||||
narHash,
|
|
||||||
};
|
|
||||||
info.narSize = sink.s.size();
|
|
||||||
|
|
||||||
if (!json)
|
|
||||||
notice("rewrote '%s' to '%s'", pathS, store->printStorePath(info.path));
|
|
||||||
|
|
||||||
auto source = sinkToSource([&](Sink & nextSink) {
|
|
||||||
RewritingSink rsink2(oldHashPart, std::string(info.path.hashPart()), nextSink);
|
|
||||||
rsink2(sink.s);
|
|
||||||
rsink2.flush();
|
|
||||||
});
|
|
||||||
|
|
||||||
store->addToStore(info, *source);
|
|
||||||
|
|
||||||
if (json)
|
|
||||||
jsonRewrites->attr(store->printStorePath(path), store->printStorePath(info.path));
|
|
||||||
|
|
||||||
remappings.insert_or_assign(std::move(path), std::move(info.path));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
static auto rCmdMakeContentAddressable = registerCommand2<CmdMakeContentAddressable>({"store", "make-content-addressable"});
|
|
src/nix/make-content-addressed.cc (new file, 55 lines)
@@ -0,0 +1,55 @@
+#include "command.hh"
+#include "store-api.hh"
+#include "make-content-addressed.hh"
+#include "common-args.hh"
+#include "json.hh"
+
+using namespace nix;
+
+struct CmdMakeContentAddressed : virtual CopyCommand, virtual StorePathsCommand, MixJSON
+{
+    CmdMakeContentAddressed()
+    {
+        realiseMode = Realise::Outputs;
+    }
+
+    std::string description() override
+    {
+        return "rewrite a path or closure to content-addressed form";
+    }
+
+    std::string doc() override
+    {
+        return
+          #include "make-content-addressed.md"
+          ;
+    }
+
+    void run(ref<Store> srcStore, StorePaths && storePaths) override
+    {
+        auto dstStore = dstUri.empty() ? openStore() : openStore(dstUri);
+
+        auto remappings = makeContentAddressed(*srcStore, *dstStore,
+            StorePathSet(storePaths.begin(), storePaths.end()));
+
+        if (json) {
+            JSONObject jsonRoot(std::cout);
+            JSONObject jsonRewrites(jsonRoot.object("rewrites"));
+            for (auto & path : storePaths) {
+                auto i = remappings.find(path);
+                assert(i != remappings.end());
+                jsonRewrites.attr(srcStore->printStorePath(path), srcStore->printStorePath(i->second));
+            }
+        } else {
+            for (auto & path : storePaths) {
+                auto i = remappings.find(path);
+                assert(i != remappings.end());
+                notice("rewrote '%s' to '%s'",
+                    srcStore->printStorePath(path),
+                    srcStore->printStorePath(i->second));
+            }
+        }
+    }
+};
+
+static auto rCmdMakeContentAddressed = registerCommand2<CmdMakeContentAddressed>({"store", "make-content-addressed"});
@@ -5,7 +5,7 @@ R""(
 * Create a content-addressed representation of the closure of GNU Hello:

   ```console
-  # nix store make-content-addressable -r nixpkgs#hello
+  # nix store make-content-addressed nixpkgs#hello
   …
   rewrote '/nix/store/v5sv61sszx301i0x6xysaqzla09nksnd-hello-2.10' to '/nix/store/5skmmcb9svys5lj3kbsrjg7vf2irid63-hello-2.10'
   ```
@@ -29,7 +29,7 @@ R""(
 system closure:

 ```console
-# nix store make-content-addressable -r /run/current-system
+# nix store make-content-addressed /run/current-system
 ```

 # Description
@@ -107,8 +107,9 @@ struct ProfileManifest
                    element.storePaths.insert(state.store->parseStorePath((std::string) p));
                element.active = e["active"];
                if (e.value("uri", "") != "") {
+                   auto originalUrl = e.value("originalUrl", e["originalUri"]);
                    element.source = ProfileElementSource{
-                       parseFlakeRef(e["originalUri"]),
+                       parseFlakeRef(originalUrl),
                        parseFlakeRef(e["uri"]),
                        e["attrPath"]
                    };
|
||||||
obj["storePaths"] = paths;
|
obj["storePaths"] = paths;
|
||||||
obj["active"] = element.active;
|
obj["active"] = element.active;
|
||||||
if (element.source) {
|
if (element.source) {
|
||||||
obj["originalUri"] = element.source->originalRef.to_string();
|
obj["originalUrl"] = element.source->originalRef.to_string();
|
||||||
obj["uri"] = element.source->resolvedRef.to_string();
|
obj["uri"] = element.source->resolvedRef.to_string();
|
||||||
obj["attrPath"] = element.source->attrPath;
|
obj["attrPath"] = element.source->attrPath;
|
||||||
}
|
}
|
||||||
|
|
|
@@ -70,7 +70,7 @@ are installed in this version of the profile. It looks like this:
     {
       "active": true,
       "attrPath": "legacyPackages.x86_64-linux.zoom-us",
-      "originalUri": "flake:nixpkgs",
+      "originalUrl": "flake:nixpkgs",
       "storePaths": [
         "/nix/store/wbhg2ga8f3h87s9h5k0slxk0m81m4cxl-zoom-us-5.3.469451.0927"
       ],
|
||||||
Each object in the array `elements` denotes an installed package and
|
Each object in the array `elements` denotes an installed package and
|
||||||
has the following fields:
|
has the following fields:
|
||||||
|
|
||||||
* `originalUri`: The [flake reference](./nix3-flake.md) specified by
|
* `originalUrl`: The [flake reference](./nix3-flake.md) specified by
|
||||||
the user at the time of installation (e.g. `nixpkgs`). This is also
|
the user at the time of installation (e.g. `nixpkgs`). This is also
|
||||||
the flake reference that will be used by `nix profile upgrade`.
|
the flake reference that will be used by `nix profile upgrade`.
|
||||||
|
|
||||||
* `uri`: The immutable flake reference to which `originalUri`
|
* `uri`: The immutable flake reference to which `originalUrl`
|
||||||
resolved.
|
resolved.
|
||||||
|
|
||||||
* `attrPath`: The flake output attribute that provided this
|
* `attrPath`: The flake output attribute that provided this
|
||||||
|
|
|
@@ -25,6 +25,7 @@ extern "C" {
 #include "eval-inline.hh"
 #include "attr-path.hh"
 #include "store-api.hh"
+#include "log-store.hh"
 #include "common-eval-args.hh"
 #include "get-drvs.hh"
 #include "derivations.hh"
|
||||||
|
|
||||||
bool NixRepl::processLine(std::string line)
|
bool NixRepl::processLine(std::string line)
|
||||||
{
|
{
|
||||||
|
line = trim(line);
|
||||||
if (line == "") return true;
|
if (line == "") return true;
|
||||||
|
|
||||||
_isInterrupted = false;
|
_isInterrupted = false;
|
||||||
|
@ -526,9 +528,16 @@ bool NixRepl::processLine(std::string line)
|
||||||
bool foundLog = false;
|
bool foundLog = false;
|
||||||
RunPager pager;
|
RunPager pager;
|
||||||
for (auto & sub : subs) {
|
for (auto & sub : subs) {
|
||||||
auto log = sub->getBuildLog(drvPath);
|
auto * logSubP = dynamic_cast<LogStore *>(&*sub);
|
||||||
|
if (!logSubP) {
|
||||||
|
printInfo("Skipped '%s' which does not support retrieving build logs", sub->getUri());
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
auto & logSub = *logSubP;
|
||||||
|
|
||||||
|
auto log = logSub.getBuildLog(drvPath);
|
||||||
if (log) {
|
if (log) {
|
||||||
printInfo("got build log for '%s' from '%s'", drvPathRaw, sub->getUri());
|
printInfo("got build log for '%s' from '%s'", drvPathRaw, logSub.getUri());
|
||||||
logger->writeToStdout(*log);
|
logger->writeToStdout(*log);
|
||||||
foundLog = true;
|
foundLog = true;
|
||||||
break;
|
break;
|
||||||
|
|
|
@@ -65,20 +65,20 @@ struct CmdShowDerivation : InstallablesCommand
            auto & outputName = _outputName; // work around clang bug
            auto outputObj { outputsObj.object(outputName) };
            std::visit(overloaded {
-               [&](const DerivationOutputInputAddressed & doi) {
+               [&](const DerivationOutput::InputAddressed & doi) {
                    outputObj.attr("path", store->printStorePath(doi.path));
                },
-               [&](const DerivationOutputCAFixed & dof) {
+               [&](const DerivationOutput::CAFixed & dof) {
                    outputObj.attr("path", store->printStorePath(dof.path(*store, drv.name, outputName)));
                    outputObj.attr("hashAlgo", printMethodAlgo(dof.ca));
                    outputObj.attr("hash", getContentAddressHash(dof.ca).to_string(Base16, false));
                    // FIXME print refs?
                },
-               [&](const DerivationOutputCAFloating & dof) {
+               [&](const DerivationOutput::CAFloating & dof) {
                    outputObj.attr("hashAlgo", makeContentAddressingPrefix(dof.method) + printHashType(dof.hashType));
                },
-               [&](const DerivationOutputDeferred &) {},
-           }, output.output);
+               [&](const DerivationOutput::Deferred &) {},
+           }, output.raw());
        }
    }
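This hunk relies on the usual `std::visit` plus `overloaded` idiom for visiting a variant-backed sum type. A small, self-contained sketch of the pattern follows; the output structs are illustrative placeholders, not the real `DerivationOutput` members:

```cpp
// Hedged sketch of the std::visit + "overloaded" idiom used above.
#include <iostream>
#include <string>
#include <variant>

template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;   // C++17 deduction guide

struct InputAddressedSketch { std::string path; };
struct CAFloatingSketch     { std::string hashType; };
struct DeferredSketch       { };

using OutputSketch = std::variant<InputAddressedSketch, CAFloatingSketch, DeferredSketch>;

int main()
{
    OutputSketch output = CAFloatingSketch{"sha256"};
    std::visit(overloaded {
        [](const InputAddressedSketch & o) { std::cout << "path: " << o.path << "\n"; },
        [](const CAFloatingSketch & o)     { std::cout << "floating CA output (" << o.hashType << ")\n"; },
        [](const DeferredSketch &)         { std::cout << "deferred output\n"; },
    }, output);
}
```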
@@ -1,6 +1,8 @@
 #include "command.hh"
 #include "shared.hh"
 #include "store-api.hh"
+#include "store-cast.hh"
+#include "log-store.hh"
 #include "sync.hh"
 #include "thread-pool.hh"
@@ -26,7 +28,10 @@ struct CmdCopyLog : virtual CopyCommand, virtual InstallablesCommand

    void run(ref<Store> srcStore) override
    {
+       auto & srcLogStore = require<LogStore>(*srcStore);
+
        auto dstStore = getDstStore();
+       auto & dstLogStore = require<LogStore>(*dstStore);

        StorePathSet drvPaths;
@@ -35,8 +40,8 @@ struct CmdCopyLog : virtual CopyCommand, virtual InstallablesCommand
                drvPaths.insert(drvPath);

        for (auto & drvPath : drvPaths) {
-           if (auto log = srcStore->getBuildLog(drvPath))
-               dstStore->addBuildLog(drvPath, *log);
+           if (auto log = srcLogStore.getBuildLog(drvPath))
+               dstLogStore.addBuildLog(drvPath, *log);
            else
                throw Error("build log for '%s' is not available", srcStore->printStorePath(drvPath));
        }
@@ -2,6 +2,7 @@
 #include "common-args.hh"
 #include "shared.hh"
 #include "store-api.hh"
+#include "store-cast.hh"
 #include "gc-store.hh"

 using namespace nix;
@@ -33,7 +34,7 @@ struct CmdStoreDelete : StorePathsCommand

    void run(ref<Store> store, std::vector<StorePath> && storePaths) override
    {
-       auto & gcStore = requireGcStore(*store);
+       auto & gcStore = require<GcStore>(*store);

        for (auto & path : storePaths)
            options.pathsToDelete.insert(path);
@@ -2,6 +2,7 @@
 #include "common-args.hh"
 #include "shared.hh"
 #include "store-api.hh"
+#include "store-cast.hh"
 #include "gc-store.hh"

 using namespace nix;
@@ -34,7 +35,7 @@ struct CmdStoreGC : StoreCommand, MixDryRun

    void run(ref<Store> store) override
    {
-       auto & gcStore = requireGcStore(*store);
+       auto & gcStore = require<GcStore>(*store);

        options.action = dryRun ? GCOptions::gcReturnDead : GCOptions::gcDeleteDead;
        GCResults results;
@@ -1,6 +1,6 @@
 source common.sh

-needLocalStore "“--no-require-sigs” can’t be used with the daemon"
+needLocalStore "'--no-require-sigs' can’t be used with the daemon"

 # We can produce drvs directly into the binary cache
 clearStore
@@ -50,3 +50,22 @@ nix build -f dependencies.nix -o $RESULT --dry-run
 nix build -f dependencies.nix -o $RESULT

 [[ -h $RESULT ]]
+
+###################################################
+# Check the JSON output
+clearStore
+clearCache
+
+RES=$(nix build -f dependencies.nix --dry-run --json)
+
+if [[ -z "$NIX_TESTS_CA_BY_DEFAULT" ]]; then
+    echo "$RES" | jq '.[0] | [
+        (.drvPath | test("'$NIX_STORE_DIR'.*\\.drv")),
+        (.outputs.out | test("'$NIX_STORE_DIR'"))
+        ] | all'
+else
+    echo "$RES" | jq '.[0] | [
+        (.drvPath | test("'$NIX_STORE_DIR'.*\\.drv")),
+        .outputs.out == null
+        ] | all'
+fi
@@ -1,15 +1,27 @@
 source common.sh

-expectedJSONRegex='\[\{"drvPath":".*multiple-outputs-a.drv","outputs":\{"first":".*multiple-outputs-a-first","second":".*multiple-outputs-a-second"}},\{"drvPath":".*multiple-outputs-b.drv","outputs":\{"out":".*multiple-outputs-b"}}]'
+clearStore
+
+# Make sure that 'nix build' only returns the outputs we asked for.
+nix build -f multiple-outputs.nix --json a --no-link | jq --exit-status '
+  (.[0] |
+    (.drvPath | match(".*multiple-outputs-a.drv")) and
+    (.outputs | keys | length == 1) and
+    (.outputs.first | match(".*multiple-outputs-a-first")))
+'
+
 nix build -f multiple-outputs.nix --json a.all b.all --no-link | jq --exit-status '
   (.[0] |
     (.drvPath | match(".*multiple-outputs-a.drv")) and
+    (.outputs | keys | length == 2) and
     (.outputs.first | match(".*multiple-outputs-a-first")) and
     (.outputs.second | match(".*multiple-outputs-a-second")))
   and (.[1] |
     (.drvPath | match(".*multiple-outputs-b.drv")) and
+    (.outputs | keys | length == 1) and
     (.outputs.out | match(".*multiple-outputs-b")))
 '

 testNormalization () {
     clearStore
     outPath=$(nix-build ./simple.nix --no-out-link)
tests/ca/build-dry.sh (new file, 6 lines)
@@ -0,0 +1,6 @@
+source common.sh
+
+export NIX_TESTS_CA_BY_DEFAULT=1
+
+cd .. && source build-dry.sh
tests/eval.nix (new file, 5 lines)

@@ -0,0 +1,5 @@
+{
+  int = 123;
+  str = "foo";
+  attr.foo = "bar";
+}
tests/eval.sh (new file, 29 lines)

@@ -0,0 +1,29 @@
+source common.sh
+
+clearStore
+
+testStdinHeredoc=$(nix eval -f - <<EOF
+{
+  bar = 3 + 1;
+  foo = 2 + 2;
+}
+EOF
+)
+[[ $testStdinHeredoc == '{ bar = 4; foo = 4; }' ]]
+
+nix eval --expr 'assert 1 + 2 == 3; true'
+
+[[ $(nix eval int -f "./eval.nix") == 123 ]]
+[[ $(nix eval str -f "./eval.nix") == '"foo"' ]]
+[[ $(nix eval str --raw -f "./eval.nix") == 'foo' ]]
+[[ $(nix eval attr -f "./eval.nix") == '{ foo = "bar"; }' ]]
+[[ $(nix eval attr --json -f "./eval.nix") == '{"foo":"bar"}' ]]
+[[ $(nix eval int -f - < "./eval.nix") == 123 ]]
+
+
+nix-instantiate --eval -E 'assert 1 + 2 == 3; true'
+[[ $(nix-instantiate -A int --eval "./eval.nix") == 123 ]]
+[[ $(nix-instantiate -A str --eval "./eval.nix") == '"foo"' ]]
+[[ $(nix-instantiate -A attr --eval "./eval.nix") == '{ foo = "bar"; }' ]]
+[[ $(nix-instantiate -A attr --eval --json "./eval.nix") == '{"foo":"bar"}' ]]
+[[ $(nix-instantiate -A int --eval - < "./eval.nix") == 123 ]]
tests/fetchClosure.sh (new file, 58 lines)
|
@ -0,0 +1,58 @@
|
||||||
|
source common.sh
|
||||||
|
|
||||||
|
enableFeatures "fetch-closure"
|
||||||
|
needLocalStore "'--no-require-sigs' can’t be used with the daemon"
|
||||||
|
|
||||||
|
clearStore
|
||||||
|
clearCacheCache
|
||||||
|
|
||||||
|
# Initialize binary cache.
|
||||||
|
nonCaPath=$(nix build --json --file ./dependencies.nix | jq -r .[].outputs.out)
|
||||||
|
caPath=$(nix store make-content-addressed --json $nonCaPath | jq -r '.rewrites | map(.) | .[]')
|
||||||
|
nix copy --to file://$cacheDir $nonCaPath
|
||||||
|
|
||||||
|
# Test basic fetchClosure rewriting from non-CA to CA.
|
||||||
|
clearStore
|
||||||
|
|
||||||
|
[ ! -e $nonCaPath ]
|
||||||
|
[ ! -e $caPath ]
|
||||||
|
|
||||||
|
[[ $(nix eval -v --raw --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $nonCaPath;
|
||||||
|
toPath = $caPath;
|
||||||
|
}
|
||||||
|
") = $caPath ]]
|
||||||
|
|
||||||
|
[ ! -e $nonCaPath ]
|
||||||
|
[ -e $caPath ]
|
||||||
|
|
||||||
|
# In impure mode, we can use non-CA paths.
|
||||||
|
[[ $(nix eval --raw --no-require-sigs --impure --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $nonCaPath;
|
||||||
|
}
|
||||||
|
") = $nonCaPath ]]
|
||||||
|
|
||||||
|
[ -e $nonCaPath ]
|
||||||
|
|
||||||
|
# 'toPath' set to empty string should fail but print the expected path.
|
||||||
|
nix eval -v --json --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $nonCaPath;
|
||||||
|
toPath = \"\";
|
||||||
|
}
|
||||||
|
" 2>&1 | grep "error: rewriting.*$nonCaPath.*yielded.*$caPath"
|
||||||
|
|
||||||
|
# If fromPath is CA, then toPath isn't needed.
|
||||||
|
nix copy --to file://$cacheDir $caPath
|
||||||
|
|
||||||
|
[[ $(nix eval -v --raw --expr "
|
||||||
|
builtins.fetchClosure {
|
||||||
|
fromStore = \"file://$cacheDir\";
|
||||||
|
fromPath = $caPath;
|
||||||
|
}
|
||||||
|
") = $caPath ]]
|
|
@ -11,7 +11,7 @@ repo=$TEST_ROOT/git
|
||||||
|
|
||||||
export _NIX_FORCE_HTTP=1
|
export _NIX_FORCE_HTTP=1
|
||||||
|
|
||||||
rm -rf $repo ${repo}-tmp $TEST_HOME/.cache/nix $TEST_ROOT/worktree $TEST_ROOT/shallow
|
rm -rf $repo ${repo}-tmp $TEST_HOME/.cache/nix $TEST_ROOT/worktree $TEST_ROOT/shallow $TEST_ROOT/minimal
|
||||||
|
|
||||||
git init $repo
|
git init $repo
|
||||||
git -C $repo config user.email "foobar@example.com"
|
git -C $repo config user.email "foobar@example.com"
|
||||||
|
@ -147,8 +147,13 @@ path3=$(nix eval --impure --raw --expr "(builtins.fetchGit $repo).outPath")
|
||||||
# (check dirty-tree handling was used)
|
# (check dirty-tree handling was used)
|
||||||
[[ $(nix eval --impure --raw --expr "(builtins.fetchGit $repo).rev") = 0000000000000000000000000000000000000000 ]]
|
[[ $(nix eval --impure --raw --expr "(builtins.fetchGit $repo).rev") = 0000000000000000000000000000000000000000 ]]
|
||||||
[[ $(nix eval --impure --raw --expr "(builtins.fetchGit $repo).shortRev") = 0000000 ]]
|
[[ $(nix eval --impure --raw --expr "(builtins.fetchGit $repo).shortRev") = 0000000 ]]
|
||||||
|
# Making a dirty tree clean again and fetching it should
|
||||||
|
# record correct revision information. See: #4140
|
||||||
|
echo world > $repo/hello
|
||||||
|
[[ $(nix eval --impure --raw --expr "(builtins.fetchGit $repo).rev") = $rev2 ]]
|
||||||
|
|
||||||
# Committing shouldn't change store path, or switch to using 'master'
|
# Committing shouldn't change store path, or switch to using 'master'
|
||||||
|
echo dev > $repo/hello
|
||||||
git -C $repo commit -m 'Bla5' -a
|
git -C $repo commit -m 'Bla5' -a
|
||||||
path4=$(nix eval --impure --raw --expr "(builtins.fetchGit $repo).outPath")
|
path4=$(nix eval --impure --raw --expr "(builtins.fetchGit $repo).outPath")
|
||||||
[[ $(cat $path4/hello) = dev ]]
|
[[ $(cat $path4/hello) = dev ]]
|
||||||
|
@ -170,6 +175,14 @@ NIX=$(command -v nix)
|
||||||
path5=$(nix eval --impure --raw --expr "(builtins.fetchGit { url = $repo; ref = \"dev\"; }).outPath")
|
path5=$(nix eval --impure --raw --expr "(builtins.fetchGit { url = $repo; ref = \"dev\"; }).outPath")
|
||||||
[[ $path3 = $path5 ]]
|
[[ $path3 = $path5 ]]
|
||||||
|
|
||||||
|
# Fetching from a repo with only a specific revision and no branches should
|
||||||
|
# not fall back to copying files and record correct revision information. See: #5302
|
||||||
|
mkdir $TEST_ROOT/minimal
|
||||||
|
git -C $TEST_ROOT/minimal init
|
||||||
|
git -C $TEST_ROOT/minimal fetch $repo $rev2
|
||||||
|
git -C $TEST_ROOT/minimal checkout $rev2
|
||||||
|
[[ $(nix eval --impure --raw --expr "(builtins.fetchGit { url = $TEST_ROOT/minimal; }).rev") = $rev2 ]]
|
||||||
|
|
||||||
# Fetching a shallow repo shouldn't work by default, because we can't
|
# Fetching a shallow repo shouldn't work by default, because we can't
|
||||||
# return a revCount.
|
# return a revCount.
|
||||||
git clone --depth 1 file://$repo $TEST_ROOT/shallow
|
git clone --depth 1 file://$repo $TEST_ROOT/shallow
|
||||||
|
@ -193,3 +206,11 @@ rev4_nix=$(nix eval --impure --raw --expr "(builtins.fetchGit { url = \"file://$
|
||||||
# The name argument should be handled
|
# The name argument should be handled
|
||||||
path9=$(nix eval --impure --raw --expr "(builtins.fetchGit { url = \"file://$repo\"; ref = \"HEAD\"; name = \"foo\"; }).outPath")
|
path9=$(nix eval --impure --raw --expr "(builtins.fetchGit { url = \"file://$repo\"; ref = \"HEAD\"; name = \"foo\"; }).outPath")
|
||||||
[[ $path9 =~ -foo$ ]]
|
[[ $path9 =~ -foo$ ]]
|
||||||
|
|
||||||
|
# should fail if there is no repo
|
||||||
|
rm -rf $repo/.git
|
||||||
|
(! nix eval --impure --raw --expr "(builtins.fetchGit \"file://$repo\").outPath")
|
||||||
|
|
||||||
|
# should succeed for a repo without commits
|
||||||
|
git init $repo
|
||||||
|
path10=$(nix eval --impure --raw --expr "(builtins.fetchGit \"file://$repo\").outPath")
|
||||||
|
|
tests/fetchPath.sh (new file, 6 lines)

@@ -0,0 +1,6 @@
+source common.sh
+
+touch foo -t 202211111111
+# We only check whether 2022-11-1* **:**:** is the last modified date since
+# `lastModified` is transformed into UTC in `builtins.fetchTarball`.
+[[ "$(nix eval --impure --raw --expr "(builtins.fetchTree \"path://$PWD/foo\").lastModifiedDate")" =~ 2022111.* ]]
@@ -21,6 +21,7 @@ nix_tests = \
   tarball.sh \
   fetchGit.sh \
   fetchurl.sh \
+  fetchPath.sh \
   simple.sh \
   referrers.sh \
   optimise-store.sh \

@@ -52,6 +53,7 @@ nix_tests = \
   build-remote-content-addressed-floating.sh \
   nar-access.sh \
   pure-eval.sh \
+  eval.sh \
   ca/post-hook.sh \
   repl.sh \
   ca/repl.sh \

@@ -95,7 +97,8 @@ nix_tests = \
   describe-stores.sh \
   nix-profile.sh \
   suggestions.sh \
-  store-ping.sh
+  store-ping.sh \
+  fetchClosure.sh

 ifeq ($(HAVE_LIBCPUID), 1)
   nix_tests += compute-levels.sh
|
@ -13,3 +13,14 @@ rm -rf $NIX_LOG_DIR
|
||||||
(! nix-store -l $path)
|
(! nix-store -l $path)
|
||||||
nix-build dependencies.nix --no-out-link --compress-build-log
|
nix-build dependencies.nix --no-out-link --compress-build-log
|
||||||
[ "$(nix-store -l $path)" = FOO ]
|
[ "$(nix-store -l $path)" = FOO ]
|
||||||
|
|
||||||
|
# test whether empty logs work fine with `nix log`.
|
||||||
|
builder="$(mktemp)"
|
||||||
|
echo -e "#!/bin/sh\nmkdir \$out" > "$builder"
|
||||||
|
outp="$(nix-build -E \
|
||||||
|
'with import ./config.nix; mkDerivation { name = "fnord"; builder = '"$builder"'; }' \
|
||||||
|
--out-link "$(mktemp -d)/result")"
|
||||||
|
|
||||||
|
test -d "$outp"
|
||||||
|
|
||||||
|
nix log "$outp"
|
||||||
|
|