Merge remote-tracking branch 'upstream/master' into buildable-variant

This commit is contained in:
John Ericson 2020-08-05 15:41:57 +00:00
commit d89472a912
92 changed files with 1861 additions and 1055 deletions

View file

@ -1 +1 @@
2.4
3.0

View file

@ -370,34 +370,6 @@ false</literal>.</para>
</varlistentry>
<varlistentry xml:id="conf-hashed-mirrors"><term><literal>hashed-mirrors</literal></term>
<listitem><para>A list of web servers used by
<function>builtins.fetchurl</function> to obtain files by
hash. The default is
<literal>http://tarballs.nixos.org/</literal>. Given a hash type
<replaceable>ht</replaceable> and a base-16 hash
<replaceable>h</replaceable>, Nix will try to download the file
from
<literal>hashed-mirror/<replaceable>ht</replaceable>/<replaceable>h</replaceable></literal>.
This allows files to be downloaded even if they have disappeared
from their original URI. For example, given the default mirror
<literal>http://tarballs.nixos.org/</literal>, when building the derivation
<programlisting>
builtins.fetchurl {
url = "https://example.org/foo-1.2.3.tar.xz";
sha256 = "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae";
}
</programlisting>
Nix will attempt to download this file from
<literal>http://tarballs.nixos.org/sha256/2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae</literal>
first. If it is not available there, it will try the original URI.</para></listitem>
</varlistentry>
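As a hedged illustration only (this is not the real builtins.fetchurl implementation), the URL construction described above amounts to joining the mirror, the hash type, and the base-16 hash:

// Minimal illustrative sketch of the hashed-mirror URL described above.
#include <iostream>
#include <string>

int main()
{
    std::string mirror = "http://tarballs.nixos.org/";   // default mirror, trailing slash included
    std::string hashType = "sha256";
    std::string hash = "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae";
    std::cout << mirror + hashType + "/" + hash << "\n"; // matches the URL shown in the text
    return 0;
}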
<varlistentry xml:id="conf-http-connections"><term><literal>http-connections</literal></term>
<listitem><para>The maximum number of parallel TCP connections

View file

@ -1,119 +0,0 @@
<section xmlns="http://docbook.org/ns/docbook"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:xi="http://www.w3.org/2001/XInclude"
version="5.0"
xml:id='sec-builder-syntax'>
<title>Builder Syntax</title>
<example xml:id='ex-hello-builder'><title>Build script for GNU Hello
(<filename>builder.sh</filename>)</title>
<programlisting>
source $stdenv/setup <co xml:id='ex-hello-builder-co-1' />
PATH=$perl/bin:$PATH <co xml:id='ex-hello-builder-co-2' />
tar xvfz $src <co xml:id='ex-hello-builder-co-3' />
cd hello-*
./configure --prefix=$out <co xml:id='ex-hello-builder-co-4' />
make <co xml:id='ex-hello-builder-co-5' />
make install</programlisting>
</example>
<para><xref linkend='ex-hello-builder' /> shows the builder referenced
from Hello's Nix expression (stored in
<filename>pkgs/applications/misc/hello/ex-1/builder.sh</filename>).
The builder can actually be made a lot shorter by using the
<emphasis>generic builder</emphasis> functions provided by
<varname>stdenv</varname>, but here we write out the build steps to
elucidate what a builder does. It performs the following
steps:</para>
<calloutlist>
<callout arearefs='ex-hello-builder-co-1'>
<para>When Nix runs a builder, it initially completely clears the
environment (except for the attributes declared in the
derivation). For instance, the <envar>PATH</envar> variable is
empty<footnote><para>Actually, it's initialised to
<filename>/path-not-set</filename> to prevent Bash from setting it
to a default value.</para></footnote>. This is done to prevent
undeclared inputs from being used in the build process. If for
example the <envar>PATH</envar> contained
<filename>/usr/bin</filename>, then you might accidentally use
<filename>/usr/bin/gcc</filename>.</para>
<para>So the first step is to set up the environment. This is
done by calling the <filename>setup</filename> script of the
standard environment. The environment variable
<envar>stdenv</envar> points to the location of the standard
environment being used. (It wasn't specified explicitly as an
attribute in <xref linkend='ex-hello-nix' />, but
<varname>mkDerivation</varname> adds it automatically.)</para>
</callout>
<callout arearefs='ex-hello-builder-co-2'>
<para>Since Hello needs Perl, we have to make sure that Perl is in
the <envar>PATH</envar>. The <envar>perl</envar> environment
variable points to the location of the Perl package (since it
was passed in as an attribute to the derivation), so
<filename><replaceable>$perl</replaceable>/bin</filename> is the
directory containing the Perl interpreter.</para>
</callout>
<callout arearefs='ex-hello-builder-co-3'>
<para>Now we have to unpack the sources. The
<varname>src</varname> attribute was bound to the result of
fetching the Hello source tarball from the network, so the
<envar>src</envar> environment variable points to the location in
the Nix store to which the tarball was downloaded. After
unpacking, we <command>cd</command> to the resulting source
directory.</para>
<para>The whole build is performed in a temporary directory
created in <varname>/tmp</varname>, by the way. This directory is
removed after the builder finishes, so there is no need to clean
up the sources afterwards. Also, the temporary directory is
always newly created, so you don't have to worry about files from
previous builds interfering with the current build.</para>
</callout>
<callout arearefs='ex-hello-builder-co-4'>
<para>GNU Hello is a typical Autoconf-based package, so we first
have to run its <filename>configure</filename> script. In Nix
every package is stored in a separate location in the Nix store,
for instance
<filename>/nix/store/9a54ba97fb71b65fda531012d0443ce2-hello-2.1.1</filename>.
Nix computes this path by cryptographically hashing all attributes
of the derivation. The path is passed to the builder through the
<envar>out</envar> environment variable. So here we give
<filename>configure</filename> the parameter
<literal>--prefix=$out</literal> to cause Hello to be installed in
the expected location.</para>
</callout>
<callout arearefs='ex-hello-builder-co-5'>
<para>Finally we build Hello (<literal>make</literal>) and install
it into the location specified by <envar>out</envar>
(<literal>make install</literal>).</para>
</callout>
</calloutlist>
<para>If you are wondering about the absence of error checking on the
result of various commands called in the builder: this is because the
shell script is evaluated with Bash's <option>-e</option> option,
which causes the script to be aborted if any command fails without an
error check.</para>
</section>
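As a rough sketch only (the real launch code in Nix's build machinery is far more involved), the environment clearing described in the first callout above amounts to starting the builder with an envp that contains nothing but the derivation's declared attributes:

// Illustrative sketch, not the actual Nix implementation: run a builder with
// an environment restricted to the derivation's declared attributes.
#include <unistd.h>
#include <string>
#include <vector>

void runBuilder(const std::string & builder, const std::vector<std::string> & attrs)
{
    std::vector<char *> envp;
    for (auto & kv : attrs)                      // entries like "src=...", "out=..." (hypothetical)
        envp.push_back(const_cast<char *>(kv.c_str()));
    envp.push_back(nullptr);                     // nothing else is inherited; PATH stays effectively unset
    char * argv[] = { const_cast<char *>(builder.c_str()), nullptr };
    execve(builder.c_str(), argv, envp.data());  // undeclared tools like /usr/bin/gcc are simply not visible
}

int main()
{
    // hypothetical attribute values, for illustration only
    runBuilder("/bin/sh", { "stdenv=/nix/store/example-stdenv", "out=/nix/store/example-hello" });
}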

View file

@ -21,13 +21,13 @@ clean-files += $(GCH) $(PCH)
ifeq ($(PRECOMPILE_HEADERS), 1)
ifeq ($(CXX), g++)
ifeq ($(findstring g++,$(CXX)), g++)
GLOBAL_CXXFLAGS_PCH += -include $(buildprefix)precompiled-headers.h -Winvalid-pch
GLOBAL_ORDER_AFTER += $(GCH)
else ifeq ($(CXX), clang++)
else ifeq ($(findstring clang++,$(CXX)), clang++)
GLOBAL_CXXFLAGS_PCH += -include-pch $(PCH) -Winvalid-pch

View file

@ -80,7 +80,7 @@ SV * queryReferences(char * path)
SV * queryPathHash(char * path)
PPCODE:
try {
auto s = store()->queryPathInfo(store()->parseStorePath(path))->narHash.to_string(Base32, true);
auto s = store()->queryPathInfo(store()->parseStorePath(path))->narHash->to_string(Base32, true);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
croak("%s", e.what());
@ -106,7 +106,7 @@ SV * queryPathInfo(char * path, int base32)
XPUSHs(&PL_sv_undef);
else
XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(*info->deriver).c_str(), 0)));
auto s = info->narHash.to_string(base32 ? Base32 : Base16, true);
auto s = info->narHash->to_string(base32 ? Base32 : Base16, true);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
mXPUSHi(info->registrationTime);
mXPUSHi(info->narSize);
@ -224,7 +224,7 @@ SV * hashString(char * algo, int base32, char * s)
SV * convertHash(char * algo, char * s, int toBase32)
PPCODE:
try {
Hash h(s, parseHashType(algo));
auto h = Hash::parseAny(s, parseHashType(algo));
string s = h.to_string(toBase32 ? Base32 : Base16, false);
XPUSHs(sv_2mortal(newSVpv(s.c_str(), 0)));
} catch (Error & e) {
@ -285,7 +285,7 @@ SV * addToStore(char * srcPath, int recursive, char * algo)
SV * makeFixedOutputPath(int recursive, char * algo, char * hash, char * name)
PPCODE:
try {
Hash h(hash, parseHashType(algo));
auto h = Hash::parseAny(hash, parseHashType(algo));
auto method = recursive ? FileIngestionMethod::Recursive : FileIngestionMethod::Flat;
auto path = store()->makeFixedOutputPath(method, h, name);
XPUSHs(sv_2mortal(newSVpv(store()->printStorePath(path).c_str(), 0)));
@ -304,7 +304,10 @@ SV * derivationFromPath(char * drvPath)
HV * outputs = newHV();
for (auto & i : drv.outputs)
hv_store(outputs, i.first.c_str(), i.first.size(), newSVpv(store()->printStorePath(i.second.path).c_str(), 0), 0);
hv_store(
outputs, i.first.c_str(), i.first.size(),
newSVpv(store()->printStorePath(i.second.path(*store(), drv.name)).c_str(), 0),
0);
hv_stores(hash, "outputs", newRV((SV *) outputs));
AV * inputDrvs = newAV();
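A self-contained sketch (using a stand-in type, not the real nix::Hash) of the two API shifts this hunk tracks: a named parse function replaces the throwing string constructor, and narHash behaves as an optional value, hence the change from "." to "->":

// Stand-in types only; the real Hash/ValidPathInfo live in the Nix headers.
#include <iostream>
#include <optional>
#include <stdexcept>
#include <string>

struct MiniHash {
    std::string hex;
    static MiniHash parseAny(const std::string & s) {        // analogous to Hash::parseAny
        if (s.empty()) throw std::runtime_error("empty hash");
        return MiniHash{s};
    }
    std::string to_string() const { return "sha256:" + hex; }
};

int main()
{
    auto h = MiniHash::parseAny("2c26b46b");                  // was: MiniHash h("2c26b46b");
    std::optional<MiniHash> narHash = h;                      // narHash is now optional
    if (narHash)
        std::cout << narHash->to_string() << "\n";            // hence narHash->to_string(...)
}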

View file

@ -37,6 +37,8 @@ readonly PROFILE_NIX_FILE="$NIX_ROOT/var/nix/profiles/default/etc/profile.d/nix-
readonly NIX_INSTALLED_NIX="@nix@"
readonly NIX_INSTALLED_CACERT="@cacert@"
#readonly NIX_INSTALLED_NIX="/nix/store/j8dbv5w6jl34caywh2ygdy88knx1mdf7-nix-2.3.6"
#readonly NIX_INSTALLED_CACERT="/nix/store/7dxhzymvy330i28ii676fl1pqwcahv2f-nss-cacert-3.49.2"
readonly EXTRACTED_NIX_PATH="$(dirname "$0")"
readonly ROOT_HOME=$(echo ~root)
@ -69,9 +71,11 @@ uninstall_directions() {
subheader "Uninstalling nix:"
local step=0
if poly_service_installed_check; then
if [ -e /run/systemd/system ] && poly_service_installed_check; then
step=$((step + 1))
poly_service_uninstall_directions "$step"
else
step=$((step + 1))
fi
for profile_target in "${PROFILE_TARGETS[@]}"; do
@ -250,6 +254,8 @@ function finish_success {
echo "But fetching the nixpkgs channel failed. (Are you offline?)"
echo "To try again later, run \"sudo -i nix-channel --update nixpkgs\"."
fi
if [ -e /run/systemd/system ]; then
cat <<EOF
Before Nix will work in your existing shells, you'll need to close
@ -264,6 +270,26 @@ hesitate:
$(contactme)
EOF
else
cat <<EOF
Before Nix will work in your existing shells, you'll need to close
them and open them again. Other than that, you should be ready to go.
Try it! Open a new terminal, and type:
$ sudo nix-daemon
$ nix-shell -p nix-info --run "nix-info -m"
Additionally, you may want to add nix-daemon to your init-system.
Thank you for using this installer. If you have any feedback, don't
hesitate:
$(contactme)
EOF
fi
}
@ -664,12 +690,8 @@ main() {
# shellcheck source=./install-darwin-multi-user.sh
. "$EXTRACTED_NIX_PATH/install-darwin-multi-user.sh"
elif [ "$(uname -s)" = "Linux" ]; then
if [ -e /run/systemd/system ]; then
# shellcheck source=./install-systemd-multi-user.sh
. "$EXTRACTED_NIX_PATH/install-systemd-multi-user.sh"
else
failure "Sorry, the multi-user installation requires systemd on Linux (detected using /run/systemd/system)"
fi
. "$EXTRACTED_NIX_PATH/install-systemd-multi-user.sh" # most of this works on non-systemd distros also
else
failure "Sorry, I don't know what to do on $(uname)"
fi
@ -702,7 +724,10 @@ main() {
setup_default_profile
place_nix_configuration
if [ -e /run/systemd/system ]; then
poly_configure_nix_daemon_service
fi
trap finish_success EXIT
}

View file

@ -35,7 +35,7 @@ fi
# Determine if we could use the multi-user installer or not
if [ "$(uname -s)" = "Darwin" ]; then
echo "Note: a multi-user installation is possible. See https://nixos.org/nix/manual/#sect-multi-user-installation" >&2
elif [ "$(uname -s)" = "Linux" ] && [ -e /run/systemd/system ]; then
elif [ "$(uname -s)" = "Linux" ]; then
echo "Note: a multi-user installation is possible. See https://nixos.org/nix/manual/#sect-multi-user-installation" >&2
fi
@ -122,7 +122,7 @@ if [ "$(uname -s)" = "Darwin" ]; then
fi
if [ "$INSTALL_MODE" = "daemon" ]; then
printf '\e[1;31mSwitching to the Daemon-based Installer\e[0m\n'
printf '\e[1;31mSwitching to the Multi-user Installer\e[0m\n'
exec "$self/install-multi-user"
exit 0
fi

View file

@ -33,7 +33,7 @@ std::string escapeUri(std::string uri)
static string currentLoad;
static AutoCloseFD openSlotLock(const Machine & m, unsigned long long slot)
static AutoCloseFD openSlotLock(const Machine & m, uint64_t slot)
{
return openLockFile(fmt("%s/%s-%d", currentLoad, escapeUri(m.storeUri), slot), true);
}
@ -119,7 +119,7 @@ static int _main(int argc, char * * argv)
bool rightType = false;
Machine * bestMachine = nullptr;
unsigned long long bestLoad = 0;
uint64_t bestLoad = 0;
for (auto & m : machines) {
debug("considering building on remote machine '%s'", m.storeUri);
@ -130,8 +130,8 @@ static int _main(int argc, char * * argv)
m.mandatoryMet(requiredFeatures)) {
rightType = true;
AutoCloseFD free;
unsigned long long load = 0;
for (unsigned long long slot = 0; slot < m.maxJobs; ++slot) {
uint64_t load = 0;
for (uint64_t slot = 0; slot < m.maxJobs; ++slot) {
auto slotLock = openSlotLock(m, slot);
if (lockFile(slotLock.get(), ltWrite, false)) {
if (!free) {

View file

@ -285,11 +285,10 @@ static std::shared_ptr<AttrDb> makeAttrDb(const Hash & fingerprint)
}
EvalCache::EvalCache(
bool useCache,
const Hash & fingerprint,
std::optional<std::reference_wrapper<const Hash>> useCache,
EvalState & state,
RootLoader rootLoader)
: db(useCache ? makeAttrDb(fingerprint) : nullptr)
: db(useCache ? makeAttrDb(*useCache) : nullptr)
, state(state)
, rootLoader(rootLoader)
{

View file

@ -4,6 +4,7 @@
#include "hash.hh"
#include "eval.hh"
#include <functional>
#include <variant>
namespace nix::eval_cache {
@ -26,8 +27,7 @@ class EvalCache : public std::enable_shared_from_this<EvalCache>
public:
EvalCache(
bool useCache,
const Hash & fingerprint,
std::optional<std::reference_wrapper<const Hash>> useCache,
EvalState & state,
RootLoader rootLoader);
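A minimal sketch (stand-in types, not the real EvalCache) of the signature change above: the separate bool useCache flag and const Hash & fingerprint collapse into one optional reference, so "cache enabled" and "fingerprint present" can no longer disagree:

// Stand-in for the pattern only; nix::Hash and EvalCache are not redefined here.
#include <functional>
#include <iostream>
#include <optional>
#include <string>

struct MiniHash { std::string hex; };

void openEvalCache(std::optional<std::reference_wrapper<const MiniHash>> useCache)
{
    if (useCache)
        std::cout << "caching under fingerprint " << useCache->get().hex << "\n";
    else
        std::cout << "cache disabled\n";
}

int main()
{
    MiniHash fingerprint{"deadbeef"};
    openEvalCache(std::cref(fingerprint));   // enabled: pass the fingerprint by reference
    openEvalCache(std::nullopt);             // disabled: no flag/fingerprint mismatch possible
}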

View file

@ -345,6 +345,7 @@ EvalState::EvalState(const Strings & _searchPath, ref<Store> store)
, sStructuredAttrs(symbols.create("__structuredAttrs"))
, sBuilder(symbols.create("builder"))
, sArgs(symbols.create("args"))
, sContentAddressed(symbols.create("__contentAddressed"))
, sOutputHash(symbols.create("outputHash"))
, sOutputHashAlgo(symbols.create("outputHashAlgo"))
, sOutputHashMode(symbols.create("outputHashMode"))
@ -1259,7 +1260,7 @@ void EvalState::callFunction(Value & fun, Value & arg, Value & v, const Pos & po
addErrorTrace(e, lambda.pos, "while evaluating %s",
(lambda.name.set()
? "'" + (string) lambda.name + "'"
: "anonymous lambdaction"));
: "anonymous lambda"));
addErrorTrace(e, pos, "from call site%s", "");
throw;
}

View file

@ -74,6 +74,7 @@ public:
sSystem, sOverrides, sOutputs, sOutputName, sIgnoreNulls,
sFile, sLine, sColumn, sFunctor, sToString,
sRight, sWrong, sStructuredAttrs, sBuilder, sArgs,
sContentAddressed,
sOutputHash, sOutputHashAlgo, sOutputHashMode,
sRecurseForDerivations,
sDescription, sSelf, sEpsilon;

View file

@ -106,6 +106,6 @@ void emitTreeAttrs(
EvalState & state,
const fetchers::Tree & tree,
const fetchers::Input & input,
Value & v);
Value & v, bool emptyRevFallback = false);
}

View file

@ -39,7 +39,7 @@ DrvInfo::DrvInfo(EvalState & state, ref<Store> store, const std::string & drvPat
if (i == drv.outputs.end())
throw Error("derivation '%s' does not have output '%s'", store->printStorePath(drvPath), outputName);
outPath = store->printStorePath(i->second.path);
outPath = store->printStorePath(i->second.path(*store, drv.name));
}

View file

@ -52,7 +52,7 @@ void EvalState::realiseContext(const PathSet & context)
DerivationOutputs::iterator i = drv.outputs.find(outputName);
if (i == drv.outputs.end())
throw Error("derivation '%s' does not have an output named '%s'", ctxS, outputName);
allowedPaths->insert(store->printStorePath(i->second.path));
allowedPaths->insert(store->printStorePath(i->second.path(*store, drv.name)));
}
}
}
@ -65,7 +65,7 @@ void EvalState::realiseContext(const PathSet & context)
/* For performance, prefetch all substitute info. */
StorePathSet willBuild, willSubstitute, unknown;
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
store->queryMissing(drvs, willBuild, willSubstitute, unknown, downloadSize, narSize);
store->buildPaths(drvs);
@ -91,8 +91,17 @@ static void prim_scopedImport(EvalState & state, const Pos & pos, Value * * args
Path realPath = state.checkSourcePath(state.toRealPath(path, context));
// FIXME
if (state.store->isStorePath(path) && state.store->isValidPath(state.store->parseStorePath(path)) && isDerivation(path)) {
Derivation drv = readDerivation(*state.store, realPath);
auto isValidDerivationInStore = [&]() -> std::optional<StorePath> {
if (!state.store->isStorePath(path))
return std::nullopt;
auto storePath = state.store->parseStorePath(path);
if (!(state.store->isValidPath(storePath) && isDerivation(path)))
return std::nullopt;
return storePath;
};
if (auto optStorePath = isValidDerivationInStore()) {
auto storePath = *optStorePath;
Derivation drv = readDerivation(*state.store, realPath, Derivation::nameFromPath(storePath));
Value & w = *state.allocValue();
state.mkAttrs(w, 3 + drv.outputs.size());
Value * v2 = state.allocAttr(w, state.sDrvPath);
@ -106,7 +115,7 @@ static void prim_scopedImport(EvalState & state, const Pos & pos, Value * * args
for (const auto & o : drv.outputs) {
v2 = state.allocAttr(w, state.symbols.create(o.first));
mkString(*v2, state.store->printStorePath(o.second.path), {"!" + o.first + "!" + path});
mkString(*v2, state.store->printStorePath(o.second.path(*state.store, drv.name)), {"!" + o.first + "!" + path});
outputsVal->listElems()[outputs_index] = state.allocValue();
mkString(*(outputsVal->listElems()[outputs_index++]), o.first);
}
@ -570,9 +579,11 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
/* Build the derivation expression by processing the attributes. */
Derivation drv;
drv.name = drvName;
PathSet context;
bool contentAddressed = false;
std::optional<std::string> outputHash;
std::string outputHashAlgo;
auto ingestionMethod = FileIngestionMethod::Flat;
@ -629,9 +640,14 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
if (i->value->type == tNull) continue;
}
if (i->name == state.sContentAddressed) {
settings.requireExperimentalFeature("ca-derivations");
contentAddressed = state.forceBool(*i->value, pos);
}
/* The `args' attribute is special: it supplies the
command-line arguments to the builder. */
if (i->name == state.sArgs) {
else if (i->name == state.sArgs) {
state.forceList(*i->value, pos);
for (unsigned int n = 0; n < i->value->listSize(); ++n) {
string s = state.coerceToString(posDrvName, *i->value->listElems()[n], context, true);
@ -751,7 +767,10 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
});
if (outputHash) {
/* Handle fixed-output derivations. */
/* Handle fixed-output derivations.
Ignore `__contentAddressed` because fixed output derivations are
already content addressed. */
if (outputs.size() != 1 || *(outputs.begin()) != "out")
throw Error({
.hint = hintfmt("multiple outputs are not supported in fixed-output derivations"),
@ -764,14 +783,28 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
auto outPath = state.store->makeFixedOutputPath(ingestionMethod, h, drvName);
if (!jsonObject) drv.env["out"] = state.store->printStorePath(outPath);
drv.outputs.insert_or_assign("out", DerivationOutput {
.path = std::move(outPath),
.output = DerivationOutputCAFixed {
.hash = FixedOutputHash {
.method = ingestionMethod,
.hash = std::move(h),
},
},
});
}
else if (contentAddressed) {
HashType ht = parseHashType(outputHashAlgo);
for (auto & i : outputs) {
if (!jsonObject) drv.env[i] = hashPlaceholder(i);
drv.outputs.insert_or_assign(i, DerivationOutput {
.output = DerivationOutputCAFloating {
.method = ingestionMethod,
.hashType = std::move(ht),
},
});
}
}
else {
/* Compute a hash over the "masked" store derivation, which is
the final one except that in the list of outputs, the
@ -783,20 +816,24 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
if (!jsonObject) drv.env[i] = "";
drv.outputs.insert_or_assign(i,
DerivationOutput {
.output = DerivationOutputInputAddressed {
.path = StorePath::dummy,
.hash = std::optional<FixedOutputHash> {},
},
});
}
Hash h = hashDerivationModulo(*state.store, Derivation(drv), true);
// Regular, non-CA derivation should always return a single hash and not
// hash per output.
Hash h = std::get<0>(hashDerivationModulo(*state.store, Derivation(drv), true));
for (auto & i : outputs) {
auto outPath = state.store->makeOutputPath(i, h, drvName);
if (!jsonObject) drv.env[i] = state.store->printStorePath(outPath);
drv.outputs.insert_or_assign(i,
DerivationOutput {
.output = DerivationOutputInputAddressed {
.path = std::move(outPath),
.hash = std::optional<FixedOutputHash>(),
},
});
}
}
@ -817,7 +854,7 @@ static void prim_derivationStrict(EvalState & state, const Pos & pos, Value * *
mkString(*state.allocAttr(v, state.sDrvPath), drvPathS, {"=" + drvPathS});
for (auto & i : drv.outputs) {
mkString(*state.allocAttr(v, state.symbols.create(i.first)),
state.store->printStorePath(i.second.path), {"!" + i.first + "!" + drvPathS});
state.store->printStorePath(i.second.path(*state.store, drv.name)), {"!" + i.first + "!" + drvPathS});
}
v.attrs->sort();
}
@ -1111,7 +1148,7 @@ static void prim_toFile(EvalState & state, const Pos & pos, Value * * args, Valu
static void addPath(EvalState & state, const Pos & pos, const string & name, const Path & path_,
Value * filterFun, FileIngestionMethod method, const Hash & expectedHash, Value & v)
Value * filterFun, FileIngestionMethod method, const std::optional<Hash> expectedHash, Value & v)
{
const auto path = evalSettings.pureEval && expectedHash ?
path_ :
@ -1142,7 +1179,7 @@ static void addPath(EvalState & state, const Pos & pos, const string & name, con
std::optional<StorePath> expectedStorePath;
if (expectedHash)
expectedStorePath = state.store->makeFixedOutputPath(method, expectedHash, name);
expectedStorePath = state.store->makeFixedOutputPath(method, *expectedHash, name);
Path dstPath;
if (!expectedHash || !state.store->isValidPath(*expectedStorePath)) {
dstPath = state.store->printStorePath(settings.readOnlyMode
@ -1176,7 +1213,7 @@ static void prim_filterSource(EvalState & state, const Pos & pos, Value * * args
.errPos = pos
});
addPath(state, pos, std::string(baseNameOf(path)), path, args[0], FileIngestionMethod::Recursive, Hash(), v);
addPath(state, pos, std::string(baseNameOf(path)), path, args[0], FileIngestionMethod::Recursive, std::nullopt, v);
}
static void prim_path(EvalState & state, const Pos & pos, Value * * args, Value & v)
@ -1186,7 +1223,7 @@ static void prim_path(EvalState & state, const Pos & pos, Value * * args, Value
string name;
Value * filterFun = nullptr;
auto method = FileIngestionMethod::Recursive;
Hash expectedHash;
std::optional<Hash> expectedHash;
for (auto & attr : *args[0]->attrs) {
const string & n(attr.name);
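A self-contained sketch (stand-in structs, not the real Nix types) of the pattern this file adopts: a derivation output is now one of input-addressed, fixed content-addressed, or floating content-addressed, dispatched with std::visit, and output paths are computed on demand (path(store, drvName) in the surrounding hunks) rather than stored directly:

// Stand-in types; the real DerivationOutput* structs live in the Nix headers.
#include <iostream>
#include <string>
#include <variant>

struct InputAddressed { std::string path; };                 // path fixed at instantiation time
struct CAFixed        { std::string method, hash; };         // fixed-output: hash known up front
struct CAFloating     { std::string method, hashType; };     // floating CA: only the hash type is known

using Output = std::variant<InputAddressed, CAFixed, CAFloating>;

// the `overloaded` visitor helper, as used in the build.cc hunk further down
template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

int main()
{
    Output out = CAFloating{"recursive", "sha256"};
    std::visit(overloaded {
        [](const InputAddressed & o) { std::cout << "input-addressed at " << o.path << "\n"; },
        [](const CAFixed & o)        { std::cout << "fixed CA, hash " << o.hash << "\n"; },
        [](const CAFloating & o)     { std::cout << "floating CA, hash type " << o.hashType << "\n"; },
    }, out);
}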

View file

@ -1,91 +0,0 @@
#include "primops.hh"
#include "eval-inline.hh"
#include "store-api.hh"
#include "hash.hh"
#include "fetchers.hh"
#include "url.hh"
namespace nix {
static void prim_fetchGit(EvalState & state, const Pos & pos, Value * * args, Value & v)
{
std::string url;
std::optional<std::string> ref;
std::optional<Hash> rev;
std::string name = "source";
bool fetchSubmodules = false;
PathSet context;
state.forceValue(*args[0]);
if (args[0]->type == tAttrs) {
state.forceAttrs(*args[0], pos);
for (auto & attr : *args[0]->attrs) {
string n(attr.name);
if (n == "url")
url = state.coerceToString(*attr.pos, *attr.value, context, false, false);
else if (n == "ref")
ref = state.forceStringNoCtx(*attr.value, *attr.pos);
else if (n == "rev")
rev = Hash(state.forceStringNoCtx(*attr.value, *attr.pos), htSHA1);
else if (n == "name")
name = state.forceStringNoCtx(*attr.value, *attr.pos);
else if (n == "submodules")
fetchSubmodules = state.forceBool(*attr.value, *attr.pos);
else
throw EvalError({
.hint = hintfmt("unsupported argument '%s' to 'fetchGit'", attr.name),
.errPos = *attr.pos
});
}
if (url.empty())
throw EvalError({
.hint = hintfmt("'url' argument required"),
.errPos = pos
});
} else
url = state.coerceToString(pos, *args[0], context, false, false);
// FIXME: git externals probably can be used to bypass the URI
// whitelist. Ah well.
state.checkURI(url);
if (evalSettings.pureEval && !rev)
throw Error("in pure evaluation mode, 'fetchGit' requires a Git revision");
fetchers::Attrs attrs;
attrs.insert_or_assign("type", "git");
attrs.insert_or_assign("url", url.find("://") != std::string::npos ? url : "file://" + url);
if (ref) attrs.insert_or_assign("ref", *ref);
if (rev) attrs.insert_or_assign("rev", rev->gitRev());
if (fetchSubmodules) attrs.insert_or_assign("submodules", fetchers::Explicit<bool>{true});
auto input = fetchers::Input::fromAttrs(std::move(attrs));
// FIXME: use name?
auto [tree, input2] = input.fetch(state.store);
state.mkAttrs(v, 8);
auto storePath = state.store->printStorePath(tree.storePath);
mkString(*state.allocAttr(v, state.sOutPath), storePath, PathSet({storePath}));
// Backward compatibility: set 'rev' to
// 0000000000000000000000000000000000000000 for a dirty tree.
auto rev2 = input2.getRev().value_or(Hash(htSHA1));
mkString(*state.allocAttr(v, state.symbols.create("rev")), rev2.gitRev());
mkString(*state.allocAttr(v, state.symbols.create("shortRev")), rev2.gitShortRev());
// Backward compatibility: set 'revCount' to 0 for a dirty tree.
mkInt(*state.allocAttr(v, state.symbols.create("revCount")),
input2.getRevCount().value_or(0));
mkBool(*state.allocAttr(v, state.symbols.create("submodules")), fetchSubmodules);
v.attrs->sort();
if (state.allowedPaths)
state.allowedPaths->insert(tree.actualPath);
}
static RegisterPrimOp r("fetchGit", 1, prim_fetchGit);
}

View file

@ -31,7 +31,7 @@ static void prim_fetchMercurial(EvalState & state, const Pos & pos, Value * * ar
// be both a revision or a branch/tag name.
auto value = state.forceStringNoCtx(*attr.value, *attr.pos);
if (std::regex_match(value, revRegex))
rev = Hash(value, htSHA1);
rev = Hash::parseAny(value, htSHA1);
else
ref = value;
}

View file

@ -14,7 +14,8 @@ void emitTreeAttrs(
EvalState & state,
const fetchers::Tree & tree,
const fetchers::Input & input,
Value & v)
Value & v,
bool emptyRevFallback)
{
assert(input.isImmutable());
@ -34,10 +35,20 @@ void emitTreeAttrs(
if (auto rev = input.getRev()) {
mkString(*state.allocAttr(v, state.symbols.create("rev")), rev->gitRev());
mkString(*state.allocAttr(v, state.symbols.create("shortRev")), rev->gitShortRev());
} else if (emptyRevFallback) {
// Backwards compat for `builtins.fetchGit`: dirty repos return an empty sha1 as rev
auto emptyHash = Hash(htSHA1);
mkString(*state.allocAttr(v, state.symbols.create("rev")), emptyHash.gitRev());
mkString(*state.allocAttr(v, state.symbols.create("shortRev")), emptyHash.gitRev());
}
if (input.getType() == "git")
mkBool(*state.allocAttr(v, state.symbols.create("submodules")), maybeGetBoolAttr(input.attrs, "submodules").value_or(false));
if (auto revCount = input.getRevCount())
mkInt(*state.allocAttr(v, state.symbols.create("revCount")), *revCount);
else if (emptyRevFallback)
mkInt(*state.allocAttr(v, state.symbols.create("revCount")), 0);
if (auto lastModified = input.getLastModified()) {
mkInt(*state.allocAttr(v, state.symbols.create("lastModified")), *lastModified);
@ -48,10 +59,26 @@ void emitTreeAttrs(
v.attrs->sort();
}
static void prim_fetchTree(EvalState & state, const Pos & pos, Value * * args, Value & v)
std::string fixURI(std::string uri, EvalState &state)
{
settings.requireExperimentalFeature("flakes");
state.checkURI(uri);
return uri.find("://") != std::string::npos ? uri : "file://" + uri;
}
void addURI(EvalState &state, fetchers::Attrs &attrs, Symbol name, std::string v)
{
string n(name);
attrs.emplace(name, n == "url" ? fixURI(v, state) : v);
}
static void fetchTree(
EvalState &state,
const Pos &pos,
Value **args,
Value &v,
const std::optional<std::string> type,
bool emptyRevFallback = false
) {
fetchers::Input input;
PathSet context;
@ -64,8 +91,15 @@ static void prim_fetchTree(EvalState & state, const Pos & pos, Value * * args, V
for (auto & attr : *args[0]->attrs) {
state.forceValue(*attr.value);
if (attr.value->type == tString)
attrs.emplace(attr.name, attr.value->string.s);
if (attr.value->type == tPath || attr.value->type == tString)
addURI(
state,
attrs,
attr.name,
state.coerceToString(*attr.pos, *attr.value, context, false, false)
);
else if (attr.value->type == tString)
addURI(state, attrs, attr.name, attr.value->string.s);
else if (attr.value->type == tBool)
attrs.emplace(attr.name, fetchers::Explicit<bool>{attr.value->boolean});
else if (attr.value->type == tInt)
@ -75,6 +109,9 @@ static void prim_fetchTree(EvalState & state, const Pos & pos, Value * * args, V
attr.name, showType(*attr.value));
}
if (type)
attrs.emplace("type", type.value());
if (!attrs.count("type"))
throw Error({
.hint = hintfmt("attribute 'type' is missing in call to 'fetchTree'"),
@ -82,8 +119,18 @@ static void prim_fetchTree(EvalState & state, const Pos & pos, Value * * args, V
});
input = fetchers::Input::fromAttrs(std::move(attrs));
} else
input = fetchers::Input::fromURL(state.coerceToString(pos, *args[0], context, false, false));
} else {
auto url = fixURI(state.coerceToString(pos, *args[0], context, false, false), state);
if (type == "git") {
fetchers::Attrs attrs;
attrs.emplace("type", "git");
attrs.emplace("url", url);
input = fetchers::Input::fromAttrs(std::move(attrs));
} else {
input = fetchers::Input::fromURL(url);
}
}
if (!evalSettings.pureEval && !input.isDirect())
input = lookupInRegistries(state.store, input).first;
@ -96,7 +143,13 @@ static void prim_fetchTree(EvalState & state, const Pos & pos, Value * * args, V
if (state.allowedPaths)
state.allowedPaths->insert(tree.actualPath);
emitTreeAttrs(state, tree, input2, v);
emitTreeAttrs(state, tree, input2, v, emptyRevFallback);
}
static void prim_fetchTree(EvalState & state, const Pos & pos, Value * * args, Value & v)
{
settings.requireExperimentalFeature("flakes");
fetchTree(state, pos, args, v, std::nullopt);
}
static RegisterPrimOp r("fetchTree", 1, prim_fetchTree);
@ -159,7 +212,7 @@ static void fetch(EvalState & state, const Pos & pos, Value * * args, Value & v,
: hashFile(htSHA256, path);
if (hash != *expectedHash)
throw Error((unsigned int) 102, "hash mismatch in file downloaded from '%s':\n wanted: %s\n got: %s",
*url, expectedHash->to_string(Base32, true), hash.to_string(Base32, true));
*url, expectedHash->to_string(Base32, true), hash->to_string(Base32, true));
}
if (state.allowedPaths)
@ -178,7 +231,13 @@ static void prim_fetchTarball(EvalState & state, const Pos & pos, Value * * args
fetch(state, pos, args, v, "fetchTarball", true, "source");
}
static void prim_fetchGit(EvalState &state, const Pos &pos, Value **args, Value &v)
{
fetchTree(state, pos, args, v, "git", true);
}
static RegisterPrimOp r2("__fetchurl", 1, prim_fetchurl);
static RegisterPrimOp r3("fetchTarball", 1, prim_fetchTarball);
static RegisterPrimOp r4("fetchGit", 1, prim_fetchGit);
}
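A hedged sketch of the refactor in this file (names and signatures are illustrative; the real primops take EvalState, Pos, and Value**): one shared routine takes an optional forced fetcher type plus an emptyRevFallback flag, and fetchTree/fetchGit become thin wrappers around it:

// Illustrative stand-ins only, not the real primop signatures.
#include <iostream>
#include <optional>
#include <string>

static void fetchTreeCommon(const std::optional<std::string> & forcedType, bool emptyRevFallback)
{
    std::cout << "type=" << forcedType.value_or("<from attrs or URL>")
              << " emptyRevFallback=" << std::boolalpha << emptyRevFallback << "\n";
}

static void primFetchTree() { fetchTreeCommon(std::nullopt, false); }  // 'flakes' feature gate omitted here
static void primFetchGit()  { fetchTreeCommon("git", true); }          // old fetchGit, now a thin wrapper

int main()
{
    primFetchTree();
    primFetchGit();
}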

View file

@ -130,12 +130,12 @@ std::pair<Tree, Input> Input::fetch(ref<Store> store) const
tree.actualPath = store->toRealPath(tree.storePath);
auto narHash = store->queryPathInfo(tree.storePath)->narHash;
input.attrs.insert_or_assign("narHash", narHash.to_string(SRI, true));
input.attrs.insert_or_assign("narHash", narHash->to_string(SRI, true));
if (auto prevNarHash = getNarHash()) {
if (narHash != *prevNarHash)
throw Error("NAR hash mismatch in input '%s' (%s), expected '%s', got '%s'",
to_string(), tree.actualPath, prevNarHash->to_string(SRI, true), narHash.to_string(SRI, true));
throw Error((unsigned int) 102, "NAR hash mismatch in input '%s' (%s), expected '%s', got '%s'",
to_string(), tree.actualPath, prevNarHash->to_string(SRI, true), narHash->to_string(SRI, true));
}
if (auto prevLastModified = getLastModified()) {
@ -200,9 +200,12 @@ std::string Input::getType() const
std::optional<Hash> Input::getNarHash() const
{
if (auto s = maybeGetStrAttr(attrs, "narHash"))
// FIXME: require SRI hash.
return newHashAllowEmpty(*s, htSHA256);
if (auto s = maybeGetStrAttr(attrs, "narHash")) {
auto hash = s->empty() ? Hash(htSHA256) : Hash::parseSRI(*s);
if (hash.type != htSHA256)
throw UsageError("narHash must use SHA-256");
return hash;
}
return {};
}
@ -216,7 +219,7 @@ std::optional<std::string> Input::getRef() const
std::optional<Hash> Input::getRev() const
{
if (auto s = maybeGetStrAttr(attrs, "rev"))
return Hash(*s, htSHA1);
return Hash::parseAny(*s, htSHA1);
return {};
}

View file

@ -121,7 +121,7 @@ struct GitInputScheme : InputScheme
args.push_back(*ref);
}
if (input.getRev()) throw Error("cloning a specific revision is not implemented");
if (input.getRev()) throw UnimplementedError("cloning a specific revision is not implemented");
args.push_back(destDir);
@ -293,14 +293,14 @@ struct GitInputScheme : InputScheme
if (!input.getRev())
input.attrs.insert_or_assign("rev",
Hash(chomp(runProgram("git", true, { "-C", actualUrl, "rev-parse", *input.getRef() })), htSHA1).gitRev());
Hash::parseAny(chomp(runProgram("git", true, { "-C", actualUrl, "rev-parse", *input.getRef() })), htSHA1).gitRev());
repoDir = actualUrl;
} else {
if (auto res = getCache()->lookup(store, mutableAttrs)) {
auto rev2 = Hash(getStrAttr(res->first, "rev"), htSHA1);
auto rev2 = Hash::parseAny(getStrAttr(res->first, "rev"), htSHA1);
if (!input.getRev() || input.getRev() == rev2) {
input.attrs.insert_or_assign("rev", rev2.gitRev());
return makeResult(res->first, std::move(res->second));
@ -370,7 +370,7 @@ struct GitInputScheme : InputScheme
}
if (!input.getRev())
input.attrs.insert_or_assign("rev", Hash(chomp(readFile(localRefFile)), htSHA1).gitRev());
input.attrs.insert_or_assign("rev", Hash::parseAny(chomp(readFile(localRefFile)), htSHA1).gitRev());
}
bool isShallow = chomp(runProgram("git", true, { "-C", repoDir, "rev-parse", "--is-shallow-repository" })) == "true";

View file

@ -29,7 +29,7 @@ struct GitArchiveInputScheme : InputScheme
if (path.size() == 2) {
} else if (path.size() == 3) {
if (std::regex_match(path[2], revRegex))
rev = Hash(path[2], htSHA1);
rev = Hash::parseAny(path[2], htSHA1);
else if (std::regex_match(path[2], refRegex))
ref = path[2];
else
@ -41,7 +41,7 @@ struct GitArchiveInputScheme : InputScheme
if (name == "rev") {
if (rev)
throw BadURL("URL '%s' contains multiple commit hashes", url.url);
rev = Hash(value, htSHA1);
rev = Hash::parseAny(value, htSHA1);
}
else if (name == "ref") {
if (!std::regex_match(value, refRegex))
@ -191,7 +191,7 @@ struct GitHubInputScheme : GitArchiveInputScheme
readFile(
store->toRealPath(
downloadFile(store, url, "source", false).storePath)));
auto rev = Hash(std::string { json["sha"] }, htSHA1);
auto rev = Hash::parseAny(std::string { json["sha"] }, htSHA1);
debug("HEAD revision for '%s' is %s", url, rev.gitRev());
return rev;
}
@ -235,7 +235,7 @@ struct GitLabInputScheme : GitArchiveInputScheme
readFile(
store->toRealPath(
downloadFile(store, url, "source", false).storePath)));
auto rev = Hash(std::string(json[0]["id"]), htSHA1);
auto rev = Hash::parseAny(std::string(json[0]["id"]), htSHA1);
debug("HEAD revision for '%s' is %s", url, rev.gitRev());
return rev;
}

View file

@ -18,7 +18,7 @@ struct IndirectInputScheme : InputScheme
if (path.size() == 1) {
} else if (path.size() == 2) {
if (std::regex_match(path[1], revRegex))
rev = Hash(path[1], htSHA1);
rev = Hash::parseAny(path[1], htSHA1);
else if (std::regex_match(path[1], refRegex))
ref = path[1];
else
@ -29,7 +29,7 @@ struct IndirectInputScheme : InputScheme
ref = path[1];
if (!std::regex_match(path[2], revRegex))
throw BadURL("in flake URL '%s', '%s' is not a commit hash", url.url, path[2]);
rev = Hash(path[2], htSHA1);
rev = Hash::parseAny(path[2], htSHA1);
} else
throw BadURL("GitHub URL '%s' is invalid", url.url);

View file

@ -209,7 +209,7 @@ struct MercurialInputScheme : InputScheme
});
if (auto res = getCache()->lookup(store, mutableAttrs)) {
auto rev2 = Hash(getStrAttr(res->first, "rev"), htSHA1);
auto rev2 = Hash::parseAny(getStrAttr(res->first, "rev"), htSHA1);
if (!input.getRev() || input.getRev() == rev2) {
input.attrs.insert_or_assign("rev", rev2.gitRev());
return makeResult(res->first, std::move(res->second));
@ -252,7 +252,7 @@ struct MercurialInputScheme : InputScheme
runProgram("hg", true, { "log", "-R", cacheDir, "-r", revOrRef, "--template", "{node} {rev} {branch}" }));
assert(tokens.size() == 3);
input.attrs.insert_or_assign("rev", Hash(tokens[0], htSHA1).gitRev());
input.attrs.insert_or_assign("rev", Hash::parseAny(tokens[0], htSHA1).gitRev());
auto revCount = std::stoull(tokens[1]);
input.attrs.insert_or_assign("ref", tokens[2]);

View file

@ -36,7 +36,7 @@ void printGCWarning()
void printMissing(ref<Store> store, const std::vector<StorePathWithOutputs> & paths, Verbosity lvl)
{
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
StorePathSet willBuild, willSubstitute, unknown;
store->queryMissing(paths, willBuild, willSubstitute, unknown, downloadSize, narSize);
printMissing(store, willBuild, willSubstitute, unknown, downloadSize, narSize, lvl);
@ -45,7 +45,7 @@ void printMissing(ref<Store> store, const std::vector<StorePathWithOutputs> & pa
void printMissing(ref<Store> store, const StorePathSet & willBuild,
const StorePathSet & willSubstitute, const StorePathSet & unknown,
unsigned long long downloadSize, unsigned long long narSize, Verbosity lvl)
uint64_t downloadSize, uint64_t narSize, Verbosity lvl)
{
if (!willBuild.empty()) {
if (willBuild.size() == 1)
@ -384,7 +384,7 @@ RunPager::~RunPager()
}
string showBytes(unsigned long long bytes)
string showBytes(uint64_t bytes)
{
return (format("%.2f MiB") % (bytes / (1024.0 * 1024.0))).str();
}

View file

@ -47,7 +47,7 @@ void printMissing(
void printMissing(ref<Store> store, const StorePathSet & willBuild,
const StorePathSet & willSubstitute, const StorePathSet & unknown,
unsigned long long downloadSize, unsigned long long narSize, Verbosity lvl = lvlInfo);
uint64_t downloadSize, uint64_t narSize, Verbosity lvl = lvlInfo);
string getArg(const string & opt,
Strings::iterator & i, const Strings::iterator & end);
@ -110,7 +110,7 @@ extern volatile ::sig_atomic_t blockInt;
/* GC helpers. */
string showBytes(unsigned long long bytes);
string showBytes(uint64_t bytes);
struct GCResults;

View file

@ -153,6 +153,8 @@ void BinaryCacheStore::addToStore(const ValidPathInfo & info, Source & narSource
auto [fdTemp, fnTemp] = createTempFile();
AutoDelete autoDelete(fnTemp);
auto now1 = std::chrono::steady_clock::now();
/* Read the NAR simultaneously into a CompressionSink+FileSink (to
@ -167,6 +169,7 @@ void BinaryCacheStore::addToStore(const ValidPathInfo & info, Source & narSource
TeeSource teeSource(narSource, *compressionSink);
narAccessor = makeNarAccessor(teeSource);
compressionSink->finish();
fileSink.flush();
}
auto now2 = std::chrono::steady_clock::now();
@ -178,7 +181,7 @@ void BinaryCacheStore::addToStore(const ValidPathInfo & info, Source & narSource
auto [fileHash, fileSize] = fileHashSink.finish();
narInfo->fileHash = fileHash;
narInfo->fileSize = fileSize;
narInfo->url = "nar/" + narInfo->fileHash.to_string(Base32, false) + ".nar"
narInfo->url = "nar/" + narInfo->fileHash->to_string(Base32, false) + ".nar"
+ (compression == "xz" ? ".xz" :
compression == "bzip2" ? ".bz2" :
compression == "br" ? ".br" :
@ -280,7 +283,7 @@ void BinaryCacheStore::addToStore(const ValidPathInfo & info, Source & narSource
if (repair || !fileExists(narInfo->url)) {
stats.narWrite++;
upsertFile(narInfo->url,
std::make_shared<std::fstream>(fnTemp, std::ios_base::in),
std::make_shared<std::fstream>(fnTemp, std::ios_base::in | std::ios_base::binary),
"application/x-nix-nar");
} else
stats.narWriteAverted++;
@ -372,7 +375,7 @@ StorePath BinaryCacheStore::addToStore(const string & name, const Path & srcPath
method for very large paths, but `copyPath' is mainly used for
small files. */
StringSink sink;
Hash h;
std::optional<Hash> h;
if (method == FileIngestionMethod::Recursive) {
dumpPath(srcPath, sink, filter);
h = hashString(hashAlgo, *sink.s);
@ -382,7 +385,7 @@ StorePath BinaryCacheStore::addToStore(const string & name, const Path & srcPath
h = hashString(hashAlgo, s);
}
ValidPathInfo info(makeFixedOutputPath(method, h, name));
ValidPathInfo info(makeFixedOutputPath(method, *h, name));
auto source = StringSource { *sink.s };
addToStore(info, source, repair, CheckSigs);
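A small self-contained sketch of the fix in the upsertFile hunk above: the temporary NAR is reopened with std::ios_base::binary so no byte translation can occur on any platform (the file name here is hypothetical):

#include <fstream>
#include <memory>
#include <string>

int main()
{
    std::string fnTemp = "/tmp/example.nar";   // hypothetical temp file name
    auto nar = std::make_shared<std::fstream>(
        fnTemp, std::ios_base::in | std::ios_base::binary);   // binary mode: bytes pass through untouched
    return nar->good() ? 0 : 1;
}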

View file

@ -297,7 +297,7 @@ public:
GoalPtr makeDerivationGoal(const StorePath & drvPath, const StringSet & wantedOutputs, BuildMode buildMode = bmNormal);
std::shared_ptr<DerivationGoal> makeBasicDerivationGoal(const StorePath & drvPath,
const BasicDerivation & drv, BuildMode buildMode = bmNormal);
GoalPtr makeSubstitutionGoal(const StorePath & storePath, RepairFlag repair = NoRepair);
GoalPtr makeSubstitutionGoal(const StorePath & storePath, RepairFlag repair = NoRepair, std::optional<ContentAddress> ca = std::nullopt);
/* Remove a dead goal. */
void removeGoal(GoalPtr goal);
@ -806,8 +806,8 @@ private:
/* RAII object to delete the chroot directory. */
std::shared_ptr<AutoDelete> autoDelChroot;
/* Whether this is a fixed-output derivation. */
bool fixedOutput;
/* The sort of derivation we are building. */
DerivationType derivationType;
/* Whether to run the build in a private network namespace. */
bool privateNetwork = false;
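A self-contained sketch (stand-in enum and predicates, not the real definitions in derivations.hh) of the replacement above: the single fixedOutput flag becomes a DerivationType queried through small predicates such as derivationIsFixed and derivationIsImpure, as used throughout the hunks below:

// Stand-in definitions for illustration only.
#include <iostream>

enum class DerivationType { InputAddressed, CAFixed, CAFloating };

static bool derivationIsFixed(DerivationType t)   // the output hash is known in advance
{
    return t == DerivationType::CAFixed;
}

static bool derivationIsImpure(DerivationType t)  // may access the network during the build
{
    return t == DerivationType::CAFixed;          // assumption: only fixed-output counts as impure here
}

int main()
{
    auto t = DerivationType::CAFixed;
    std::cout << std::boolalpha
              << "fixed=" << derivationIsFixed(t)
              << " impure=" << derivationIsImpure(t) << "\n";
}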
@ -1047,7 +1047,7 @@ DerivationGoal::DerivationGoal(const StorePath & drvPath, const BasicDerivation
{
this->drv = std::make_unique<BasicDerivation>(BasicDerivation(drv));
state = &DerivationGoal::haveDerivation;
name = fmt("building of %s", worker.store.showPaths(drv.outputPaths()));
name = fmt("building of %s", worker.store.showPaths(drv.outputPaths(worker.store)));
trace("created");
mcExpectedBuilds = std::make_unique<MaintainCount<uint64_t>>(worker.expectedBuilds);
@ -1182,7 +1182,7 @@ void DerivationGoal::haveDerivation()
retrySubstitution = false;
for (auto & i : drv->outputs)
worker.store.addTempRoot(i.second.path);
worker.store.addTempRoot(i.second.path(worker.store, drv->name));
/* Check what outputs paths are not already valid. */
auto invalidOutputs = checkPathValidity(false, buildMode == bmRepair);
@ -1195,9 +1195,9 @@ void DerivationGoal::haveDerivation()
parsedDrv = std::make_unique<ParsedDerivation>(drvPath, *drv);
if (parsedDrv->contentAddressed()) {
if (drv->type() == DerivationType::CAFloating) {
settings.requireExperimentalFeature("ca-derivations");
throw Error("ca-derivations isn't implemented yet");
throw UnimplementedError("ca-derivations isn't implemented yet");
}
@ -1206,7 +1206,7 @@ void DerivationGoal::haveDerivation()
them. */
if (settings.useSubstitutes && parsedDrv->substitutesAllowed())
for (auto & i : invalidOutputs)
addWaitee(worker.makeSubstitutionGoal(i, buildMode == bmRepair ? Repair : NoRepair));
addWaitee(worker.makeSubstitutionGoal(i, buildMode == bmRepair ? Repair : NoRepair, getDerivationCA(*drv)));
if (waitees.empty()) /* to prevent hang (no wake-up event) */
outputsSubstituted();
@ -1290,12 +1290,12 @@ void DerivationGoal::repairClosure()
StorePathSet outputClosure;
for (auto & i : drv->outputs) {
if (!wantOutput(i.first, wantedOutputs)) continue;
worker.store.computeFSClosure(i.second.path, outputClosure);
worker.store.computeFSClosure(i.second.path(worker.store, drv->name), outputClosure);
}
/* Filter out our own outputs (which we have already checked). */
for (auto & i : drv->outputs)
outputClosure.erase(i.second.path);
outputClosure.erase(i.second.path(worker.store, drv->name));
/* Get all dependencies of this derivation so that we know which
derivation is responsible for which path in the output
@ -1307,7 +1307,7 @@ void DerivationGoal::repairClosure()
if (i.isDerivation()) {
Derivation drv = worker.store.derivationFromPath(i);
for (auto & j : drv.outputs)
outputsToDrv.insert_or_assign(j.second.path, i);
outputsToDrv.insert_or_assign(j.second.path(worker.store, drv.name), i);
}
/* Check each path (slow!). */
@ -1379,7 +1379,7 @@ void DerivationGoal::inputsRealised()
for (auto & j : i.second) {
auto k = inDrv.outputs.find(j);
if (k != inDrv.outputs.end())
worker.store.computeFSClosure(k->second.path, inputPaths);
worker.store.computeFSClosure(k->second.path(worker.store, inDrv.name), inputPaths);
else
throw Error(
"derivation '%s' requires non-existent output '%s' from input derivation '%s'",
@ -1392,12 +1392,12 @@ void DerivationGoal::inputsRealised()
debug("added input paths %s", worker.store.showPaths(inputPaths));
/* Is this a fixed-output derivation? */
fixedOutput = drv->isFixedOutput();
/* What type of derivation are we building? */
derivationType = drv->type();
/* Don't repeat fixed-output derivations since they're already
verified by their output hash.*/
nrRounds = fixedOutput ? 1 : settings.buildRepeat + 1;
nrRounds = derivationIsFixed(derivationType) ? 1 : settings.buildRepeat + 1;
/* Okay, try to build. Note that here we don't wait for a build
slot to become available, since we don't need one if there is a
@ -1432,7 +1432,7 @@ void DerivationGoal::tryToBuild()
goal can start a build, and if not, the main loop will sleep a
few seconds and then retry this goal. */
PathSet lockFiles;
for (auto & outPath : drv->outputPaths())
for (auto & outPath : drv->outputPaths(worker.store))
lockFiles.insert(worker.store.Store::toRealPath(outPath));
if (!outputLocks.lockPaths(lockFiles, "", false)) {
@ -1460,16 +1460,16 @@ void DerivationGoal::tryToBuild()
return;
}
missingPaths = drv->outputPaths();
missingPaths = drv->outputPaths(worker.store);
if (buildMode != bmCheck)
for (auto & i : validPaths) missingPaths.erase(i);
/* If any of the outputs already exist but are not valid, delete
them. */
for (auto & i : drv->outputs) {
if (worker.store.isValidPath(i.second.path)) continue;
debug("removing invalid path '%s'", worker.store.printStorePath(i.second.path));
deletePath(worker.store.Store::toRealPath(i.second.path));
if (worker.store.isValidPath(i.second.path(worker.store, drv->name))) continue;
debug("removing invalid path '%s'", worker.store.printStorePath(i.second.path(worker.store, drv->name)));
deletePath(worker.store.Store::toRealPath(i.second.path(worker.store, drv->name)));
}
/* Don't do a remote build if the derivation has the attribute
@ -1646,13 +1646,13 @@ void DerivationGoal::buildDone()
So instead, check if the disk is (nearly) full now. If
so, we don't mark this build as a permanent failure. */
#if HAVE_STATVFS
unsigned long long required = 8ULL * 1024 * 1024; // FIXME: make configurable
uint64_t required = 8ULL * 1024 * 1024; // FIXME: make configurable
struct statvfs st;
if (statvfs(worker.store.realStoreDir.c_str(), &st) == 0 &&
(unsigned long long) st.f_bavail * st.f_bsize < required)
(uint64_t) st.f_bavail * st.f_bsize < required)
diskFull = true;
if (statvfs(tmpDir.c_str(), &st) == 0 &&
(unsigned long long) st.f_bavail * st.f_bsize < required)
(uint64_t) st.f_bavail * st.f_bsize < required)
diskFull = true;
#endif
@ -1692,7 +1692,7 @@ void DerivationGoal::buildDone()
fmt("running post-build-hook '%s'", settings.postBuildHook),
Logger::Fields{worker.store.printStorePath(drvPath)});
PushActivity pact(act.id);
auto outputPaths = drv->outputPaths();
auto outputPaths = drv->outputPaths(worker.store);
std::map<std::string, std::string> hookEnvironment = getEnv();
hookEnvironment.emplace("DRV_PATH", worker.store.printStorePath(drvPath));
@ -1783,7 +1783,7 @@ void DerivationGoal::buildDone()
st =
dynamic_cast<NotDeterministic*>(&e) ? BuildResult::NotDeterministic :
statusOk(status) ? BuildResult::OutputRejected :
fixedOutput || diskFull ? BuildResult::TransientFailure :
derivationIsImpure(derivationType) || diskFull ? BuildResult::TransientFailure :
BuildResult::PermanentFailure;
}
@ -1920,7 +1920,7 @@ StorePathSet DerivationGoal::exportReferences(const StorePathSet & storePaths)
if (j.isDerivation()) {
Derivation drv = worker.store.derivationFromPath(j);
for (auto & k : drv.outputs)
worker.store.computeFSClosure(k.second.path, paths);
worker.store.computeFSClosure(k.second.path(worker.store, drv.name), paths);
}
}
@ -1996,7 +1996,7 @@ void DerivationGoal::startBuilder()
else if (settings.sandboxMode == smDisabled)
useChroot = false;
else if (settings.sandboxMode == smRelaxed)
useChroot = !fixedOutput && !noChroot;
useChroot = !(derivationIsImpure(derivationType)) && !noChroot;
}
if (worker.store.storeDir != worker.store.realStoreDir) {
@ -2015,7 +2015,7 @@ void DerivationGoal::startBuilder()
/* Substitute output placeholders with the actual output paths. */
for (auto & output : drv->outputs)
inputRewrites[hashPlaceholder(output.first)] = worker.store.printStorePath(output.second.path);
inputRewrites[hashPlaceholder(output.first)] = worker.store.printStorePath(output.second.path(worker.store, drv->name));
/* Construct the environment passed to the builder. */
initEnv();
@ -2165,7 +2165,7 @@ void DerivationGoal::startBuilder()
"nogroup:x:65534:\n") % sandboxGid).str());
/* Create /etc/hosts with localhost entry. */
if (!fixedOutput)
if (!(derivationIsImpure(derivationType)))
writeFile(chrootRootDir + "/etc/hosts", "127.0.0.1 localhost\n::1 localhost\n");
/* Make the closure of the inputs available in the chroot,
@ -2200,7 +2200,7 @@ void DerivationGoal::startBuilder()
(typically the dependencies of /bin/sh). Throw them
out. */
for (auto & i : drv->outputs)
dirsInChroot.erase(worker.store.printStorePath(i.second.path));
dirsInChroot.erase(worker.store.printStorePath(i.second.path(worker.store, drv->name)));
#elif __APPLE__
/* We don't really have any parent prep work to do (yet?)
@ -2373,7 +2373,7 @@ void DerivationGoal::startBuilder()
us.
*/
if (!fixedOutput)
if (!(derivationIsImpure(derivationType)))
privateNetwork = true;
userNamespaceSync.create();
@ -2574,7 +2574,7 @@ void DerivationGoal::initEnv()
derivation, tell the builder, so that for instance `fetchurl'
can skip checking the output. On older Nixes, this environment
variable won't be set, so `fetchurl' will do the check. */
if (fixedOutput) env["NIX_OUTPUT_CHECKED"] = "1";
if (derivationIsFixed(derivationType)) env["NIX_OUTPUT_CHECKED"] = "1";
/* *Only* if this is a fixed-output derivation, propagate the
values of the environment variables specified in the
@ -2585,7 +2585,7 @@ void DerivationGoal::initEnv()
to the builder is generally impure, but the output of
fixed-output derivations is by definition pure (since we
already know the cryptographic hash of the output). */
if (fixedOutput) {
if (derivationIsImpure(derivationType)) {
for (auto & i : parsedDrv->getStringsAttr("impureEnvVars").value_or(Strings()))
env[i] = getEnv(i).value_or("");
}
@ -2613,7 +2613,7 @@ void DerivationGoal::writeStructuredAttrs()
/* Add an "outputs" object containing the output paths. */
nlohmann::json outputs;
for (auto & i : drv->outputs)
outputs[i.first] = rewriteStrings(worker.store.printStorePath(i.second.path), inputRewrites);
outputs[i.first] = rewriteStrings(worker.store.printStorePath(i.second.path(worker.store, drv->name)), inputRewrites);
json["outputs"] = outputs;
/* Handle exportReferencesGraph. */
@ -2817,7 +2817,7 @@ struct RestrictedStore : public LocalFSStore
auto drv = derivationFromPath(path.path);
for (auto & output : drv.outputs)
if (wantOutput(output.first, path.outputs))
newPaths.insert(output.second.path);
newPaths.insert(output.second.path(*this, drv.name));
} else if (!goal.isAllowed(path.path))
throw InvalidPath("cannot build unknown path '%s' in recursive Nix", printStorePath(path.path));
}
@ -2851,7 +2851,7 @@ struct RestrictedStore : public LocalFSStore
void queryMissing(const std::vector<StorePathWithOutputs> & targets,
StorePathSet & willBuild, StorePathSet & willSubstitute, StorePathSet & unknown,
unsigned long long & downloadSize, unsigned long long & narSize) override
uint64_t & downloadSize, uint64_t & narSize) override
{
/* This is slightly impure since it leaks information to the
client about what paths will be built/substituted or are
@ -3195,7 +3195,7 @@ void DerivationGoal::runChild()
/* Fixed-output derivations typically need to access the
network, so give them access to /etc/resolv.conf and so
on. */
if (fixedOutput) {
if (derivationIsImpure(derivationType)) {
ss.push_back("/etc/resolv.conf");
// Only use nss functions to resolve hosts and
@ -3436,7 +3436,7 @@ void DerivationGoal::runChild()
sandboxProfile += "(import \"sandbox-defaults.sb\")\n";
if (fixedOutput)
if (derivationIsImpure(derivationType))
sandboxProfile += "(import \"sandbox-network.sb\")\n";
/* Our rwx outputs */
@ -3579,7 +3579,7 @@ StorePathSet parseReferenceSpecifiers(Store & store, const BasicDerivation & drv
if (store.isStorePath(i))
result.insert(store.parseStorePath(i));
else if (drv.outputs.count(i))
result.insert(drv.outputs.find(i)->second.path);
result.insert(drv.outputs.find(i)->second.path(store, drv.name));
else throw BuildError("derivation contains an illegal reference specifier '%s'", i);
}
return result;
@ -3617,7 +3617,7 @@ void DerivationGoal::registerOutputs()
if (hook) {
bool allValid = true;
for (auto & i : drv->outputs)
if (!worker.store.isValidPath(i.second.path)) allValid = false;
if (!worker.store.isValidPath(i.second.path(worker.store, drv->name))) allValid = false;
if (allValid) return;
}
@ -3638,23 +3638,23 @@ void DerivationGoal::registerOutputs()
Nix calls. */
StorePathSet referenceablePaths;
for (auto & p : inputPaths) referenceablePaths.insert(p);
for (auto & i : drv->outputs) referenceablePaths.insert(i.second.path);
for (auto & i : drv->outputs) referenceablePaths.insert(i.second.path(worker.store, drv->name));
for (auto & p : addedPaths) referenceablePaths.insert(p);
/* Check whether the output paths were created, and grep each
output path to determine what other paths it references. Also make all
output paths read-only. */
for (auto & i : drv->outputs) {
auto path = worker.store.printStorePath(i.second.path);
if (!missingPaths.count(i.second.path)) continue;
auto path = worker.store.printStorePath(i.second.path(worker.store, drv->name));
if (!missingPaths.count(i.second.path(worker.store, drv->name))) continue;
Path actualPath = path;
if (needsHashRewrite()) {
auto r = redirectedOutputs.find(i.second.path);
auto r = redirectedOutputs.find(i.second.path(worker.store, drv->name));
if (r != redirectedOutputs.end()) {
auto redirected = worker.store.Store::toRealPath(r->second);
if (buildMode == bmRepair
&& redirectedBadOutputs.count(i.second.path)
&& redirectedBadOutputs.count(i.second.path(worker.store, drv->name))
&& pathExists(redirected))
replaceValidPath(path, redirected);
if (buildMode == bmCheck)
@ -3721,9 +3721,24 @@ void DerivationGoal::registerOutputs()
hash). */
std::optional<ContentAddress> ca;
if (fixedOutput) {
if (! std::holds_alternative<DerivationOutputInputAddressed>(i.second.output)) {
DerivationOutputCAFloating outputHash;
std::visit(overloaded {
[&](DerivationOutputInputAddressed doi) {
assert(false); // Enclosing `if` handles this case in other branch
},
[&](DerivationOutputCAFixed dof) {
outputHash = DerivationOutputCAFloating {
.method = dof.hash.method,
.hashType = dof.hash.hash.type,
};
},
[&](DerivationOutputCAFloating dof) {
outputHash = dof;
},
}, i.second.output);
if (i.second.hash->method == FileIngestionMethod::Flat) {
if (outputHash.method == FileIngestionMethod::Flat) {
/* The output path should be a regular file without execute permission. */
if (!S_ISREG(st.st_mode) || (st.st_mode & S_IXUSR) != 0)
throw BuildError(
@ -3734,13 +3749,18 @@ void DerivationGoal::registerOutputs()
/* Check the hash. In hash mode, move the path produced by
the derivation to its content-addressed location. */
Hash h2 = i.second.hash->method == FileIngestionMethod::Recursive
? hashPath(*i.second.hash->hash.type, actualPath).first
: hashFile(*i.second.hash->hash.type, actualPath);
Hash h2 = outputHash.method == FileIngestionMethod::Recursive
? hashPath(outputHash.hashType, actualPath).first
: hashFile(outputHash.hashType, actualPath);
auto dest = worker.store.makeFixedOutputPath(i.second.hash->method, h2, i.second.path.name());
auto dest = worker.store.makeFixedOutputPath(outputHash.method, h2, i.second.path(worker.store, drv->name).name());
if (i.second.hash->hash != h2) {
// true if either floating CA, or incorrect fixed hash.
bool needsMove = true;
if (auto p = std::get_if<DerivationOutputCAFixed>(& i.second.output)) {
Hash & h = p->hash.hash;
if (h != h2) {
/* Throw an error after registering the path as
valid. */
@ -3748,9 +3768,15 @@ void DerivationGoal::registerOutputs()
delayedException = std::make_exception_ptr(
BuildError("hash mismatch in fixed-output derivation '%s':\n wanted: %s\n got: %s",
worker.store.printStorePath(dest),
i.second.hash->hash.to_string(SRI, true),
h.to_string(SRI, true),
h2.to_string(SRI, true)));
} else {
// matched the fixed hash, so no move needed.
needsMove = false;
}
}
if (needsMove) {
Path actualDest = worker.store.Store::toRealPath(dest);
if (worker.store.isValidPath(dest))
@ -3770,7 +3796,7 @@ void DerivationGoal::registerOutputs()
assert(worker.store.parseStorePath(path) == dest);
ca = FixedOutputHash {
.method = i.second.hash->method,
.method = outputHash.method,
.hash = h2,
};
}
@ -3785,8 +3811,10 @@ void DerivationGoal::registerOutputs()
time. The hash is stored in the database so that we can
verify later on whether nobody has messed with the store. */
debug("scanning for references inside '%1%'", path);
HashResult hash;
auto references = worker.store.parseStorePathSet(scanForReferences(actualPath, worker.store.printStorePathSet(referenceablePaths), hash));
// HashResult hash;
auto pathSetAndHash = scanForReferences(actualPath, worker.store.printStorePathSet(referenceablePaths));
auto references = worker.store.parseStorePathSet(pathSetAndHash.first);
HashResult hash = pathSetAndHash.second;
if (buildMode == bmCheck) {
if (!worker.store.isValidPath(worker.store.parseStorePath(path))) continue;
@ -3894,7 +3922,7 @@ void DerivationGoal::registerOutputs()
/* If this is the first round of several, then move the output out of the way. */
if (nrRounds > 1 && curRound == 1 && curRound < nrRounds && keepPreviousRound) {
for (auto & i : drv->outputs) {
auto path = worker.store.printStorePath(i.second.path);
auto path = worker.store.printStorePath(i.second.path(worker.store, drv->name));
Path prev = path + checkSuffix;
deletePath(prev);
Path dst = path + checkSuffix;
@ -3912,7 +3940,7 @@ void DerivationGoal::registerOutputs()
if the result was not deterministic? */
if (curRound == nrRounds) {
for (auto & i : drv->outputs) {
Path prev = worker.store.printStorePath(i.second.path) + checkSuffix;
Path prev = worker.store.printStorePath(i.second.path(worker.store, drv->name)) + checkSuffix;
deletePath(prev);
}
}
@ -4213,9 +4241,9 @@ StorePathSet DerivationGoal::checkPathValidity(bool returnValid, bool checkHash)
for (auto & i : drv->outputs) {
if (!wantOutput(i.first, wantedOutputs)) continue;
bool good =
worker.store.isValidPath(i.second.path) &&
(!checkHash || worker.pathContentsGood(i.second.path));
if (good == returnValid) result.insert(i.second.path);
worker.store.isValidPath(i.second.path(worker.store, drv->name)) &&
(!checkHash || worker.pathContentsGood(i.second.path(worker.store, drv->name)));
if (good == returnValid) result.insert(i.second.path(worker.store, drv->name));
}
return result;
}
@ -4272,6 +4300,10 @@ private:
/* The store path that should be realised through a substitute. */
StorePath storePath;
/* The path the substituter refers to the path as. This will be
* different when the stores have different names. */
std::optional<StorePath> subPath;
/* The remaining substituters. */
std::list<ref<Store>> subs;
@ -4305,8 +4337,11 @@ private:
typedef void (SubstitutionGoal::*GoalState)();
GoalState state;
/* Content address for recomputing store path */
std::optional<ContentAddress> ca;
public:
SubstitutionGoal(const StorePath & storePath, Worker & worker, RepairFlag repair = NoRepair);
SubstitutionGoal(const StorePath & storePath, Worker & worker, RepairFlag repair = NoRepair, std::optional<ContentAddress> ca = std::nullopt);
~SubstitutionGoal();
void timedOut(Error && ex) override { abort(); };
@ -4336,10 +4371,11 @@ public:
};
SubstitutionGoal::SubstitutionGoal(const StorePath & storePath, Worker & worker, RepairFlag repair)
SubstitutionGoal::SubstitutionGoal(const StorePath & storePath, Worker & worker, RepairFlag repair, std::optional<ContentAddress> ca)
: Goal(worker)
, storePath(storePath)
, repair(repair)
, ca(ca)
{
state = &SubstitutionGoal::init;
name = fmt("substitution of '%s'", worker.store.printStorePath(this->storePath));
@ -4414,14 +4450,18 @@ void SubstitutionGoal::tryNext()
sub = subs.front();
subs.pop_front();
if (sub->storeDir != worker.store.storeDir) {
if (ca) {
subPath = sub->makeFixedOutputPathFromCA(storePath.name(), *ca);
if (sub->storeDir == worker.store.storeDir)
assert(subPath == storePath);
} else if (sub->storeDir != worker.store.storeDir) {
tryNext();
return;
}
try {
// FIXME: make async
info = sub->queryPathInfo(storePath);
info = sub->queryPathInfo(subPath ? *subPath : storePath);
} catch (InvalidPath &) {
tryNext();
return;
@ -4440,6 +4480,19 @@ void SubstitutionGoal::tryNext()
throw;
}
if (info->path != storePath) {
if (info->isContentAddressed(*sub) && info->references.empty()) {
auto info2 = std::make_shared<ValidPathInfo>(*info);
info2->path = storePath;
info = info2;
} else {
printError("asked '%s' for '%s' but got '%s'",
sub->getUri(), worker.store.printStorePath(storePath), sub->printStorePath(info->path));
tryNext();
return;
}
}
/* Update the total expected download size. */
auto narInfo = std::dynamic_pointer_cast<const NarInfo>(info);
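The hunk above lets a substitution goal query a substituter rooted at a different store prefix, provided the content address of the wanted path is known: the goal recomputes the path the substituter would use (subPath) instead of reusing the local one. A minimal stand-alone toy of that decision, using plain strings instead of Nix's Store/StorePath types (all names below are illustrative, not Nix APIs):

    #include <iostream>
    #include <optional>
    #include <string>

    // Toy model: a "store path" is just "<storeDir>/<digest>-<name>".
    static std::string makePath(const std::string & storeDir,
        const std::string & digest, const std::string & name)
    {
        return storeDir + "/" + digest + "-" + name;
    }

    // Which path should we ask the substituter about?  If the content address
    // is known we can recompute the path for *its* store prefix; otherwise we
    // can only use substituters that share our prefix.
    // NOTE: in real Nix the digest itself also depends on the store prefix,
    // which is exactly why makeFixedOutputPathFromCA must recompute it rather
    // than reuse the local digest as this toy does.
    static std::optional<std::string> pathToQuery(
        const std::string & localDir, const std::string & subDir,
        const std::string & localPath, const std::string & name,
        const std::optional<std::string> & contentDigest)
    {
        if (contentDigest)
            return makePath(subDir, *contentDigest, name);
        if (subDir == localDir)
            return localPath;
        return std::nullopt;    // skip this substituter, as tryNext() does
    }

    int main()
    {
        auto p = pathToQuery("/nix/store", "/gnu/store",
            "/nix/store/abc123-hello", "hello", std::string("abc123"));
        std::cout << (p ? *p : "<skip>") << "\n";   // /gnu/store/abc123-hello
    }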
@ -4529,7 +4582,7 @@ void SubstitutionGoal::tryToRun()
PushActivity pact(act.id);
copyStorePath(ref<Store>(sub), ref<Store>(worker.store.shared_from_this()),
storePath, repair, sub->isTrusted ? NoCheckSigs : CheckSigs);
subPath ? *subPath : storePath, repair, sub->isTrusted ? NoCheckSigs : CheckSigs);
promise.set_value();
} catch (...) {
@ -4662,11 +4715,11 @@ std::shared_ptr<DerivationGoal> Worker::makeBasicDerivationGoal(const StorePath
}
GoalPtr Worker::makeSubstitutionGoal(const StorePath & path, RepairFlag repair)
GoalPtr Worker::makeSubstitutionGoal(const StorePath & path, RepairFlag repair, std::optional<ContentAddress> ca)
{
GoalPtr goal = substitutionGoals[path].lock(); // FIXME
if (!goal) {
goal = std::make_shared<SubstitutionGoal>(path, *this, repair);
goal = std::make_shared<SubstitutionGoal>(path, *this, repair, ca);
substitutionGoals.insert_or_assign(path, goal);
wakeUp(goal);
}
@ -5008,7 +5061,7 @@ bool Worker::pathContentsGood(const StorePath & path)
if (!pathExists(store.printStorePath(path)))
res = false;
else {
HashResult current = hashPath(*info->narHash.type, store.printStorePath(path));
HashResult current = hashPath(info->narHash->type, store.printStorePath(path));
Hash nullHash(htSHA256);
res = info->narHash == nullHash || info->narHash == current.first;
}
@ -5034,7 +5087,7 @@ void Worker::markContentsGood(const StorePath & path)
static void primeCache(Store & store, const std::vector<StorePathWithOutputs> & paths)
{
StorePathSet willBuild, willSubstitute, unknown;
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
store.queryMissing(paths, willBuild, willSubstitute, unknown, downloadSize, narSize);
if (!willBuild.empty() && 0 == settings.maxBuildJobs && getMachines().empty())


@ -58,23 +58,6 @@ void builtinFetchurl(const BasicDerivation & drv, const std::string & netrcData)
}
};
/* We always have one output, and if it's a fixed-output derivation (as
checked below) it must be the only output */
auto & output = drv.outputs.begin()->second;
/* Try the hashed mirrors first. */
if (output.hash && output.hash->method == FileIngestionMethod::Flat)
for (auto hashedMirror : settings.hashedMirrors.get())
try {
if (!hasSuffix(hashedMirror, "/")) hashedMirror += '/';
auto & h = output.hash->hash;
fetch(hashedMirror + printHashType(*h.type) + "/" + h.to_string(Base16, false));
return;
} catch (Error & e) {
debug(e.what());
}
/* Otherwise try the specified URL. */
fetch(mainUrl);
}


@ -1,9 +1,11 @@
#include "args.hh"
#include "content-address.hh"
#include "split.hh"
namespace nix {
std::string FixedOutputHash::printMethodAlgo() const {
return makeFileIngestionPrefix(method) + printHashType(*hash.type);
return makeFileIngestionPrefix(method) + printHashType(hash.type);
}
std::string makeFileIngestionPrefix(const FileIngestionMethod m) {
@ -36,38 +38,46 @@ std::string renderContentAddress(ContentAddress ca) {
}
ContentAddress parseContentAddress(std::string_view rawCa) {
auto prefixSeparator = rawCa.find(':');
if (prefixSeparator != string::npos) {
auto prefix = string(rawCa, 0, prefixSeparator);
auto rest = rawCa;
std::string_view prefix;
{
auto optPrefix = splitPrefixTo(rest, ':');
if (!optPrefix)
throw UsageError("not a content address because it is not in the form '<prefix>:<rest>': %s", rawCa);
prefix = *optPrefix;
}
auto parseHashType_ = [&](){
auto hashTypeRaw = splitPrefixTo(rest, ':');
if (!hashTypeRaw)
throw UsageError("content address hash must be in form '<algo>:<hash>', but found: %s", rawCa);
HashType hashType = parseHashType(*hashTypeRaw);
return std::move(hashType);
};
// Switch on prefix
if (prefix == "text") {
auto hashTypeAndHash = rawCa.substr(prefixSeparator+1, string::npos);
Hash hash = Hash(string(hashTypeAndHash));
if (*hash.type != htSHA256) {
throw Error("parseContentAddress: the text hash should have type SHA256");
}
return TextHash { hash };
// No parsing of the method: "text" only supports flat.
HashType hashType = parseHashType_();
if (hashType != htSHA256)
throw Error("text content address hash should use %s, but instead uses %s",
printHashType(htSHA256), printHashType(hashType));
return TextHash {
.hash = Hash::parseNonSRIUnprefixed(rest, std::move(hashType)),
};
} else if (prefix == "fixed") {
// This has to be an inverse of makeFixedOutputCA
auto methodAndHash = rawCa.substr(prefixSeparator+1, string::npos);
if (methodAndHash.substr(0,2) == "r:") {
std::string_view hashRaw = methodAndHash.substr(2,string::npos);
// Parse method
auto method = FileIngestionMethod::Flat;
if (splitPrefix(rest, "r:"))
method = FileIngestionMethod::Recursive;
HashType hashType = parseHashType_();
return FixedOutputHash {
.method = FileIngestionMethod::Recursive,
.hash = Hash(string(hashRaw)),
.method = method,
.hash = Hash::parseNonSRIUnprefixed(rest, std::move(hashType)),
};
} else {
std::string_view hashRaw = methodAndHash;
return FixedOutputHash {
.method = FileIngestionMethod::Flat,
.hash = Hash(string(hashRaw)),
};
}
} else {
throw Error("parseContentAddress: format not recognized; has to be text or fixed");
}
} else {
throw Error("Not a content address because it lacks an appropriate prefix");
}
} else
throw UsageError("content address prefix '%s' is unrecognized. Recogonized prefixes are 'text' or 'fixed'", prefix);
};
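For reference, the two textual content-address forms accepted above are "text:<algo>:<hash>" and "fixed:[r:]<algo>:<hash>". A stand-alone toy parser showing just the prefix/method/algorithm splitting; the splitPrefixTo here is a local stand-in for the helper from split.hh, and real hash parsing is omitted:

    #include <iostream>
    #include <optional>
    #include <string>
    #include <string_view>

    // Split off everything before the first `sep`, advancing `s` past it.
    static std::optional<std::string_view> splitPrefixTo(std::string_view & s, char sep)
    {
        auto i = s.find(sep);
        if (i == std::string_view::npos) return std::nullopt;
        auto prefix = s.substr(0, i);
        s.remove_prefix(i + 1);
        return prefix;
    }

    int main()
    {
        for (std::string_view ca : {
                std::string_view("text:sha256:1xyz"),
                std::string_view("fixed:r:sha256:1xyz"),
                std::string_view("fixed:sha256:1xyz") })
        {
            std::string_view rest = ca;
            auto prefix = splitPrefixTo(rest, ':').value();        // "text" or "fixed"
            bool recursive = false;
            if (prefix == "fixed" && rest.substr(0, 2) == "r:") {  // optional "r:" method
                recursive = true;
                rest.remove_prefix(2);
            }
            auto algo = splitPrefixTo(rest, ':').value();          // hash algorithm
            std::cout << prefix << " | " << (recursive ? "recursive" : "flat")
                      << " | " << algo << " | hash=" << rest << "\n";
        }
    }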
std::optional<ContentAddress> parseContentAddressOpt(std::string_view rawCaOpt) {


@ -289,7 +289,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
logger->startWork();
auto hash = store->queryPathInfo(path)->narHash;
logger->stopWork();
to << hash.to_string(Base16, false);
to << hash->to_string(Base16, false);
break;
}
@ -451,7 +451,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
case wopBuildDerivation: {
auto drvPath = store->parseStorePath(readString(from));
BasicDerivation drv;
readDerivation(from, *store, drv);
readDerivation(from, *store, drv, Derivation::nameFromPath(drvPath));
BuildMode buildMode = (BuildMode) readInt(from);
logger->startWork();
if (!trusted)
@ -579,7 +579,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
auto path = store->parseStorePath(readString(from));
logger->startWork();
SubstitutablePathInfos infos;
store->querySubstitutablePathInfos({path}, infos);
store->querySubstitutablePathInfos({{path, std::nullopt}}, infos);
logger->stopWork();
auto i = infos.find(path);
if (i == infos.end())
@ -595,10 +595,16 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
}
case wopQuerySubstitutablePathInfos: {
auto paths = readStorePaths<StorePathSet>(*store, from);
logger->startWork();
SubstitutablePathInfos infos;
store->querySubstitutablePathInfos(paths, infos);
StorePathCAMap pathsMap = {};
if (GET_PROTOCOL_MINOR(clientVersion) < 22) {
auto paths = readStorePaths<StorePathSet>(*store, from);
for (auto & path : paths)
pathsMap.emplace(path, std::nullopt);
} else
pathsMap = readStorePathCAMap(*store, from);
logger->startWork();
store->querySubstitutablePathInfos(pathsMap, infos);
logger->stopWork();
to << infos.size();
for (auto & i : infos) {
@ -632,7 +638,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
if (GET_PROTOCOL_MINOR(clientVersion) >= 17)
to << 1;
to << (info->deriver ? store->printStorePath(*info->deriver) : "")
<< info->narHash.to_string(Base16, false);
<< info->narHash->to_string(Base16, false);
writeStorePaths(*store, to, info->references);
to << info->registrationTime << info->narSize;
if (GET_PROTOCOL_MINOR(clientVersion) >= 16) {
@ -692,7 +698,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
auto deriver = readString(from);
if (deriver != "")
info.deriver = store->parseStorePath(deriver);
info.narHash = Hash(readString(from), htSHA256);
info.narHash = Hash::parseAny(readString(from), htSHA256);
info.references = readStorePaths<StorePathSet>(*store, from);
from >> info.registrationTime >> info.narSize >> info.ultimate;
info.sigs = readStrings<StringSet>(from);
@ -703,6 +709,64 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
if (!trusted)
info.ultimate = false;
if (GET_PROTOCOL_MINOR(clientVersion) >= 23) {
struct FramedSource : Source
{
Source & from;
bool eof = false;
std::vector<unsigned char> pending;
size_t pos = 0;
FramedSource(Source & from) : from(from)
{ }
~FramedSource()
{
if (!eof) {
while (true) {
auto n = readInt(from);
if (!n) break;
std::vector<unsigned char> data(n);
from(data.data(), n);
}
}
}
size_t read(unsigned char * data, size_t len) override
{
if (eof) throw EndOfFile("reached end of FramedSource");
if (pos >= pending.size()) {
size_t len = readInt(from);
if (!len) {
eof = true;
return 0;
}
pending = std::vector<unsigned char>(len);
pos = 0;
from(pending.data(), len);
}
auto n = std::min(len, pending.size() - pos);
memcpy(data, pending.data() + pos, n);
pos += n;
return n;
}
};
logger->startWork();
{
FramedSource source(from);
store->addToStore(info, source, (RepairFlag) repair,
dontCheckSigs ? NoCheckSigs : CheckSigs);
}
logger->stopWork();
}
else {
std::unique_ptr<Source> source;
if (GET_PROTOCOL_MINOR(clientVersion) >= 21)
source = std::make_unique<TunnelSource>(from, to);
@ -721,6 +785,8 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
dontCheckSigs ? NoCheckSigs : CheckSigs);
logger->stopWork();
}
break;
}
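The FramedSource introduced above consumes a stream of length-prefixed chunks terminated by a zero-length chunk, so the daemon can stream the NAR body into addToStore without knowing its size in advance, and can drain the remaining frames on error. A self-contained sketch of that framing idea; the host-order uint64_t length used here is a stand-in for Nix's actual wire serialisation, not the real protocol encoding:

    #include <cstdint>
    #include <cstring>
    #include <iostream>
    #include <string>
    #include <vector>

    using Bytes = std::vector<unsigned char>;

    // Sender side: each chunk is preceded by its length; an empty chunk ends the stream.
    static void putFrame(Bytes & wire, const std::string & chunk)
    {
        uint64_t n = chunk.size();
        wire.insert(wire.end(), (unsigned char *) &n, (unsigned char *) &n + sizeof n);
        wire.insert(wire.end(), chunk.begin(), chunk.end());
    }

    // Receiver side: read frames until the zero-length terminator.
    static std::string readFrames(const Bytes & wire)
    {
        std::string result;
        size_t pos = 0;
        while (true) {
            uint64_t n;
            std::memcpy(&n, wire.data() + pos, sizeof n);
            pos += sizeof n;
            if (n == 0) break;                     // end-of-stream marker
            result.append((const char *) wire.data() + pos, n);
            pos += n;
        }
        return result;
    }

    int main()
    {
        Bytes wire;
        putFrame(wire, "hello ");
        putFrame(wire, "world");
        putFrame(wire, "");                        // terminator
        std::cout << readFrames(wire) << "\n";     // prints "hello world"
    }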
@ -730,7 +796,7 @@ static void performOp(TunnelLogger * logger, ref<Store> store,
targets.push_back(store->parsePathWithOutputs(s));
logger->startWork();
StorePathSet willBuild, willSubstitute, unknown;
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
store->queryMissing(targets, willBuild, willSubstitute, unknown, downloadSize, narSize);
logger->stopWork();
writeStorePaths(*store, to, willBuild);


@ -7,6 +7,53 @@
namespace nix {
std::optional<StorePath> DerivationOutput::pathOpt(const Store & store, std::string_view drvName) const
{
return std::visit(overloaded {
[](DerivationOutputInputAddressed doi) -> std::optional<StorePath> {
return { doi.path };
},
[&](DerivationOutputCAFixed dof) -> std::optional<StorePath> {
return {
store.makeFixedOutputPath(dof.hash.method, dof.hash.hash, drvName)
};
},
[](DerivationOutputCAFloating dof) -> std::optional<StorePath> {
return std::nullopt;
},
}, output);
}
bool derivationIsCA(DerivationType dt) {
switch (dt) {
case DerivationType::InputAddressed: return false;
case DerivationType::CAFixed: return true;
case DerivationType::CAFloating: return true;
};
// An enum can hold values outside its declared enumerators, so control may
// fall out of the switch above; adding a `default:` case, however, would
// disable the compiler's exhaustiveness warnings, so we assert instead.
assert(false);
}
bool derivationIsFixed(DerivationType dt) {
switch (dt) {
case DerivationType::InputAddressed: return false;
case DerivationType::CAFixed: return true;
case DerivationType::CAFloating: return false;
};
assert(false);
}
bool derivationIsImpure(DerivationType dt) {
switch (dt) {
case DerivationType::InputAddressed: return false;
case DerivationType::CAFixed: return true;
case DerivationType::CAFloating: return false;
};
assert(false);
}
bool BasicDerivation::isBuiltin() const
{
@ -99,7 +146,6 @@ static DerivationOutput parseDerivationOutput(const Store & store, std::istrings
expect(str, ","); const auto hash = parseString(str);
expect(str, ")");
std::optional<FixedOutputHash> fsh;
if (hashAlgo != "") {
auto method = FileIngestionMethod::Flat;
if (string(hashAlgo, 0, 2) == "r:") {
@ -107,22 +153,37 @@ static DerivationOutput parseDerivationOutput(const Store & store, std::istrings
hashAlgo = string(hashAlgo, 2);
}
const HashType hashType = parseHashType(hashAlgo);
fsh = FixedOutputHash {
.method = std::move(method),
.hash = Hash(hash, hashType),
};
}
return hash != ""
? DerivationOutput {
.output = DerivationOutputCAFixed {
.hash = FixedOutputHash {
.method = std::move(method),
.hash = Hash::parseNonSRIUnprefixed(hash, hashType),
},
}
}
: (settings.requireExperimentalFeature("ca-derivations"),
DerivationOutput {
.output = DerivationOutputCAFloating {
.method = std::move(method),
.hashType = std::move(hashType),
},
});
} else
return DerivationOutput {
.output = DerivationOutputInputAddressed {
.path = std::move(path),
.hash = std::move(fsh),
}
};
}
static Derivation parseDerivation(const Store & store, std::string && s)
static Derivation parseDerivation(const Store & store, std::string && s, std::string_view name)
{
Derivation drv;
drv.name = name;
std::istringstream str(std::move(s));
expect(str, "Derive([");
@ -166,10 +227,10 @@ static Derivation parseDerivation(const Store & store, std::string && s)
}
Derivation readDerivation(const Store & store, const Path & drvPath)
Derivation readDerivation(const Store & store, const Path & drvPath, std::string_view name)
{
try {
return parseDerivation(store, readFile(drvPath));
return parseDerivation(store, readFile(drvPath), name);
} catch (FormatError & e) {
throw Error("error parsing derivation '%1%': %2%", drvPath, e.msg());
}
@ -187,7 +248,7 @@ Derivation Store::readDerivation(const StorePath & drvPath)
{
auto accessor = getFSAccessor();
try {
return parseDerivation(*this, accessor->readFile(printStorePath(drvPath)));
return parseDerivation(*this, accessor->readFile(printStorePath(drvPath)), Derivation::nameFromPath(drvPath));
} catch (FormatError & e) {
throw Error("error parsing derivation '%s': %s", printStorePath(drvPath), e.msg());
}
@ -255,10 +316,21 @@ string Derivation::unparse(const Store & store, bool maskOutputs,
for (auto & i : outputs) {
if (first) first = false; else s += ',';
s += '('; printUnquotedString(s, i.first);
s += ','; printUnquotedString(s, maskOutputs ? "" : store.printStorePath(i.second.path));
s += ','; printUnquotedString(s, i.second.hash ? i.second.hash->printMethodAlgo() : "");
s += ','; printUnquotedString(s,
i.second.hash ? i.second.hash->hash.to_string(Base16, false) : "");
s += ','; printUnquotedString(s, maskOutputs ? "" : store.printStorePath(i.second.path(store, name)));
std::visit(overloaded {
[&](DerivationOutputInputAddressed doi) {
s += ','; printUnquotedString(s, "");
s += ','; printUnquotedString(s, "");
},
[&](DerivationOutputCAFixed dof) {
s += ','; printUnquotedString(s, dof.hash.printMethodAlgo());
s += ','; printUnquotedString(s, dof.hash.hash.to_string(Base16, false));
},
[&](DerivationOutputCAFloating dof) {
s += ','; printUnquotedString(s, makeFileIngestionPrefix(dof.method) + printHashType(dof.hashType));
s += ','; printUnquotedString(s, "");
},
}, i.second.output);
s += ')';
}
@ -310,59 +382,134 @@ bool isDerivation(const string & fileName)
}
bool BasicDerivation::isFixedOutput() const
DerivationType BasicDerivation::type() const
{
return outputs.size() == 1 &&
outputs.begin()->first == "out" &&
outputs.begin()->second.hash;
std::set<std::string_view> inputAddressedOutputs, fixedCAOutputs, floatingCAOutputs;
std::optional<HashType> floatingHashType;
for (auto & i : outputs) {
std::visit(overloaded {
[&](DerivationOutputInputAddressed _) {
inputAddressedOutputs.insert(i.first);
},
[&](DerivationOutputCAFixed _) {
fixedCAOutputs.insert(i.first);
},
[&](DerivationOutputCAFloating dof) {
floatingCAOutputs.insert(i.first);
if (!floatingHashType) {
floatingHashType = dof.hashType;
} else {
if (*floatingHashType != dof.hashType)
throw Error("All floating outputs must use the same hash type");
}
},
}, i.second.output);
}
if (inputAddressedOutputs.empty() && fixedCAOutputs.empty() && floatingCAOutputs.empty()) {
throw Error("Must have at least one output");
} else if (! inputAddressedOutputs.empty() && fixedCAOutputs.empty() && floatingCAOutputs.empty()) {
return DerivationType::InputAddressed;
} else if (inputAddressedOutputs.empty() && ! fixedCAOutputs.empty() && floatingCAOutputs.empty()) {
if (fixedCAOutputs.size() > 1)
// FIXME: Experimental feature?
throw Error("Only one fixed output is allowed for now");
if (*fixedCAOutputs.begin() != "out")
throw Error("Single fixed output must be named \"out\"");
return DerivationType::CAFixed;
} else if (inputAddressedOutputs.empty() && fixedCAOutputs.empty() && ! floatingCAOutputs.empty()) {
return DerivationType::CAFloating;
} else {
throw Error("Can't mix derivation output types");
}
}
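A compact stand-alone restatement of the classification rules implemented by type(): every output must be of the same kind, and fixed-output derivations are (for now) limited to a single output. The requirement that the single fixed output be named "out" and the check that all floating outputs share one hash type are omitted here:

    #include <cstddef>
    #include <iostream>
    #include <stdexcept>
    #include <string>

    static std::string classify(size_t nInputAddressed, size_t nFixed, size_t nFloating)
    {
        if (nInputAddressed + nFixed + nFloating == 0)
            throw std::runtime_error("must have at least one output");
        if (nFixed == 0 && nFloating == 0) return "InputAddressed";
        if (nInputAddressed == 0 && nFloating == 0) {
            if (nFixed > 1) throw std::runtime_error("only one fixed output allowed for now");
            return "CAFixed";
        }
        if (nInputAddressed == 0 && nFixed == 0) return "CAFloating";
        throw std::runtime_error("can't mix derivation output types");
    }

    int main()
    {
        std::cout << classify(2, 0, 0) << "\n";   // InputAddressed
        std::cout << classify(0, 1, 0) << "\n";   // CAFixed
        std::cout << classify(0, 0, 3) << "\n";   // CAFloating
    }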
DrvHashes drvHashes;
/* pathDerivationModulo and hashDerivationModulo are mutually recursive
*/
/* Returns the hash of a derivation modulo fixed-output
subderivations. A fixed-output derivation is a derivation with one
output (`out') for which an expected hash and hash algorithm are
specified (using the `outputHash' and `outputHashAlgo'
attributes). We don't want changes to such derivations to
propagate upwards through the dependency graph, changing output
paths everywhere.
/* Look up the derivation by value and memoize the
`hashDerivationModulo` call.
*/
static const DrvHashModulo & pathDerivationModulo(Store & store, const StorePath & drvPath)
{
auto h = drvHashes.find(drvPath);
if (h == drvHashes.end()) {
assert(store.isValidPath(drvPath));
// Cache it
h = drvHashes.insert_or_assign(
drvPath,
hashDerivationModulo(
store,
store.readDerivation(drvPath),
false)).first;
}
return h->second;
}
For instance, if we change the url in a call to the `fetchurl'
function, we do not want to rebuild everything depending on it
(after all, (the hash of) the file being downloaded is unchanged).
So the *output paths* should not change. On the other hand, the
*derivation paths* should change to reflect the new dependency
graph.
/* See the header for interface details. These are the implementation details.
That's what this function does: it returns a hash which is just the
hash of the derivation ATerm, except that any input derivation
paths have been replaced by the result of a recursive call to this
function, and that for fixed-output derivations we return a hash of
its output path. */
Hash hashDerivationModulo(Store & store, const Derivation & drv, bool maskOutputs)
For fixed-output derivations, each hash in the map is not the
corresponding output's content hash, but a hash of that hash along
with other constant data. The key point is that the value is a pure
function of the output's contents, and there are no preimage attacks
either spoofing an output's contents for a derivation, or
spoofing a derivation for an output's contents.
For regular derivations, it looks up each subderivation from its hash
and recurs. If the subderivation is also regular, it simply
substitutes the derivation path with its hash. If the subderivation
is fixed-output, however, it takes each output hash and pretends it
is a derivation hash producing a single "out" output. This is so we
don't leak the provenance of fixed outputs, reducing pointless cache
misses as the build itself won't know this.
*/
DrvHashModulo hashDerivationModulo(Store & store, const Derivation & drv, bool maskOutputs)
{
/* Return a fixed hash for fixed-output derivations. */
if (drv.isFixedOutput()) {
DerivationOutputs::const_iterator i = drv.outputs.begin();
return hashString(htSHA256, "fixed:out:"
+ i->second.hash->printMethodAlgo() + ":"
+ i->second.hash->hash.to_string(Base16, false) + ":"
+ store.printStorePath(i->second.path));
switch (drv.type()) {
case DerivationType::CAFloating:
throw Error("Regular input-addressed derivations are not yet allowed to depend on CA derivations");
case DerivationType::CAFixed: {
std::map<std::string, Hash> outputHashes;
for (const auto & i : drv.outputs) {
auto & dof = std::get<DerivationOutputCAFixed>(i.second.output);
auto hash = hashString(htSHA256, "fixed:out:"
+ dof.hash.printMethodAlgo() + ":"
+ dof.hash.hash.to_string(Base16, false) + ":"
+ store.printStorePath(i.second.path(store, drv.name)));
outputHashes.insert_or_assign(i.first, std::move(hash));
}
return outputHashes;
}
case DerivationType::InputAddressed:
break;
}
/* For other derivations, replace the inputs paths with recursive
calls to this function.*/
calls to this function. */
std::map<std::string, StringSet> inputs2;
for (auto & i : drv.inputDrvs) {
auto h = drvHashes.find(i.first);
if (h == drvHashes.end()) {
assert(store.isValidPath(i.first));
h = drvHashes.insert_or_assign(i.first, hashDerivationModulo(store,
store.readDerivation(i.first), false)).first;
const auto & res = pathDerivationModulo(store, i.first);
std::visit(overloaded {
// Regular non-CA derivation, replace derivation
[&](Hash drvHash) {
inputs2.insert_or_assign(drvHash.to_string(Base16, false), i.second);
},
// CA derivation's output hashes
[&](CaOutputHashes outputHashes) {
std::set<std::string> justOut = { "out" };
for (auto & output : i.second) {
/* Put each one in with a single "out" output. */
const auto h = outputHashes.at(output);
inputs2.insert_or_assign(
h.to_string(Base16, false),
justOut);
}
inputs2.insert_or_assign(h->second.to_string(Base16, false), i.second);
},
}, res);
}
return hashString(htSHA256, drv.unparse(store, maskOutputs, &inputs2));
@ -383,11 +530,11 @@ bool wantOutput(const string & output, const std::set<string> & wanted)
}
StorePathSet BasicDerivation::outputPaths() const
StorePathSet BasicDerivation::outputPaths(const Store & store) const
{
StorePathSet paths;
for (auto & i : outputs)
paths.insert(i.second.path);
paths.insert(i.second.path(store, name));
return paths;
}
@ -397,7 +544,6 @@ static DerivationOutput readDerivationOutput(Source & in, const Store & store)
auto hashAlgo = readString(in);
auto hash = readString(in);
std::optional<FixedOutputHash> fsh;
if (hashAlgo != "") {
auto method = FileIngestionMethod::Flat;
if (string(hashAlgo, 0, 2) == "r:") {
@ -405,15 +551,27 @@ static DerivationOutput readDerivationOutput(Source & in, const Store & store)
hashAlgo = string(hashAlgo, 2);
}
auto hashType = parseHashType(hashAlgo);
fsh = FixedOutputHash {
return hash != ""
? DerivationOutput {
.output = DerivationOutputCAFixed {
.hash = FixedOutputHash {
.method = std::move(method),
.hash = Hash(hash, hashType),
};
.hash = Hash::parseNonSRIUnprefixed(hash, hashType),
},
}
}
: (settings.requireExperimentalFeature("ca-derivations"),
DerivationOutput {
.output = DerivationOutputCAFloating {
.method = std::move(method),
.hashType = std::move(hashType),
},
});
} else
return DerivationOutput {
.output = DerivationOutputInputAddressed {
.path = std::move(path),
.hash = std::move(fsh),
}
};
}
@ -426,8 +584,19 @@ StringSet BasicDerivation::outputNames() const
}
Source & readDerivation(Source & in, const Store & store, BasicDerivation & drv)
std::string_view BasicDerivation::nameFromPath(const StorePath & drvPath) {
auto nameWithSuffix = drvPath.name();
constexpr std::string_view extension = ".drv";
assert(hasSuffix(nameWithSuffix, extension));
nameWithSuffix.remove_suffix(extension.size());
return nameWithSuffix;
}
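nameFromPath simply strips the ".drv" extension from the store path's name component; note that the returned std::string_view aliases the path's own storage, so the StorePath must outlive the view. A stand-alone illustration of the stripping step (the helper name below is illustrative):

    #include <cassert>
    #include <iostream>
    #include <string_view>

    static std::string_view drvNameFromPathName(std::string_view nameWithSuffix)
    {
        constexpr std::string_view extension = ".drv";
        assert(nameWithSuffix.size() >= extension.size()
            && nameWithSuffix.substr(nameWithSuffix.size() - extension.size()) == extension);
        nameWithSuffix.remove_suffix(extension.size());
        return nameWithSuffix;
    }

    int main()
    {
        std::cout << drvNameFromPathName("hello-2.10.drv") << "\n";   // hello-2.10
    }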
Source & readDerivation(Source & in, const Store & store, BasicDerivation & drv, std::string_view name)
{
drv.name = name;
drv.outputs.clear();
auto nr = readNum<size_t>(in);
for (size_t n = 0; n < nr; n++) {
@ -456,13 +625,20 @@ void writeDerivation(Sink & out, const Store & store, const BasicDerivation & dr
out << drv.outputs.size();
for (auto & i : drv.outputs) {
out << i.first
<< store.printStorePath(i.second.path);
if (i.second.hash) {
out << i.second.hash->printMethodAlgo()
<< i.second.hash->hash.to_string(Base16, false);
} else {
<< store.printStorePath(i.second.path(store, drv.name));
std::visit(overloaded {
[&](DerivationOutputInputAddressed doi) {
out << "" << "";
}
},
[&](DerivationOutputCAFixed dof) {
out << dof.hash.printMethodAlgo()
<< dof.hash.hash.to_string(Base16, false);
},
[&](DerivationOutputCAFloating dof) {
out << (makeFileIngestionPrefix(dof.method) + printHashType(dof.hashType))
<< "";
},
}, i.second.output);
}
writeStorePaths(store, out, drv.inputSrcs);
out << drv.platform << drv.builder << drv.args;


@ -6,6 +6,7 @@
#include "content-address.hh"
#include <map>
#include <variant>
namespace nix {
@ -13,10 +14,46 @@ namespace nix {
/* Abstract syntax of derivations. */
/* The traditional non-fixed-output derivation type. */
struct DerivationOutputInputAddressed
{
/* Will need to become `std::optional<StorePath>` once input-addressed
derivations are allowed to depend on content-addressed derivations */
StorePath path;
};
/* Fixed-output derivations, whose output paths are content addressed
according to that fixed output. */
struct DerivationOutputCAFixed
{
FixedOutputHash hash; /* hash used for expected hash computation */
};
/* Floating-output derivations, whose output paths are content addressed, but
not fixed, and so are dynamically calculated from whatever the output ends
up being. */
struct DerivationOutputCAFloating
{
/* information used for expected hash computation */
FileIngestionMethod method;
HashType hashType;
};
struct DerivationOutput
{
StorePath path;
std::optional<FixedOutputHash> hash; /* hash used for expected hash computation */
std::variant<
DerivationOutputInputAddressed,
DerivationOutputCAFixed,
DerivationOutputCAFloating
> output;
std::optional<HashType> hashAlgoOpt(const Store & store) const;
std::optional<StorePath> pathOpt(const Store & store, std::string_view drvName) const;
/* DEPRECATED: Remove after CA drvs are fully implemented */
StorePath path(const Store & store, std::string_view drvName) const {
auto p = pathOpt(store, drvName);
if (!p) throw UnimplementedError("floating content-addressed derivations are not yet implemented");
return *p;
}
};
typedef std::map<string, DerivationOutput> DerivationOutputs;
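DerivationOutput is now a sum type, and throughout this diff it is consumed with std::visit plus the usual C++17 "overloaded" lambda-aggregation helper. A self-contained sketch of that pattern, with toy stand-ins for the three output kinds (not the real Nix types):

    #include <iostream>
    #include <string>
    #include <variant>

    struct InputAddressed { std::string path; };
    struct CAFixed       { std::string methodAlgo, hashBase16; };
    struct CAFloating    { std::string methodAlgo; };

    using Output = std::variant<InputAddressed, CAFixed, CAFloating>;

    // Build an ad-hoc visitor out of several lambdas.
    template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
    template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

    int main()
    {
        Output outputs[] = {
            InputAddressed { "/nix/store/aaaa-hello-2.10" },
            CAFixed { "r:sha256", "1b8b8c1f" },
            CAFloating { "r:sha256" },
        };

        for (auto & o : outputs)
            std::visit(overloaded {
                [](const InputAddressed & v) { std::cout << "input-addressed: " << v.path << "\n"; },
                [](const CAFixed & v)        { std::cout << "fixed CA (" << v.methodAlgo << ")\n"; },
                [](const CAFloating & v)     { std::cout << "floating CA (" << v.methodAlgo << ")\n"; },
            }, o);
    }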
@ -27,6 +64,25 @@ typedef std::map<StorePath, StringSet> DerivationInputs;
typedef std::map<string, string> StringPairs;
enum struct DerivationType : uint8_t {
InputAddressed,
CAFixed,
CAFloating,
};
/* Do the outputs of the derivation have paths calculated from their content,
or from the derivation itself? */
bool derivationIsCA(DerivationType);
/* Is the content of the outputs fixed a-priori via a hash? Never true for
non-CA derivations. */
bool derivationIsFixed(DerivationType);
/* Is the derivation impure and needs to access non-deterministic resources, or
pure and can be sandboxed? Note that whether or not we actually sandbox the
derivation is controlled separately. Never true for non-CA derivations. */
bool derivationIsImpure(DerivationType);
struct BasicDerivation
{
DerivationOutputs outputs; /* keyed on symbolic IDs */
@ -35,6 +91,7 @@ struct BasicDerivation
Path builder;
Strings args;
StringPairs env;
std::string name;
BasicDerivation() { }
virtual ~BasicDerivation() { };
@ -42,13 +99,15 @@ struct BasicDerivation
bool isBuiltin() const;
/* Return true iff this is a fixed-output derivation. */
bool isFixedOutput() const;
DerivationType type() const;
/* Return the output paths of a derivation. */
StorePathSet outputPaths() const;
StorePathSet outputPaths(const Store & store) const;
/* Return the output names of a derivation. */
StringSet outputNames() const;
static std::string_view nameFromPath(const StorePath & storePath);
};
struct Derivation : BasicDerivation
@ -72,15 +131,47 @@ StorePath writeDerivation(ref<Store> store,
const Derivation & drv, std::string_view name, RepairFlag repair = NoRepair);
/* Read a derivation from a file. */
Derivation readDerivation(const Store & store, const Path & drvPath);
Derivation readDerivation(const Store & store, const Path & drvPath, std::string_view name);
// FIXME: remove
bool isDerivation(const string & fileName);
Hash hashDerivationModulo(Store & store, const Derivation & drv, bool maskOutputs);
// Known CA derivation output hashes; currently just for fixed-output derivations,
// whose output hashes are always known since they are fixed up-front.
typedef std::map<std::string, Hash> CaOutputHashes;
typedef std::variant<
Hash, // regular DRV normalized hash
CaOutputHashes
> DrvHashModulo;
/* Returns hashes with the details of fixed-output subderivations
expunged.
A fixed-output derivation is a derivation whose outputs have a
specified content hash and hash algorithm. (Currently they must have
exactly one output (`out'), which is specified using the `outputHash'
and `outputHashAlgo' attributes, but the algorithm doesn't assume
this.) We don't want changes to such derivations to propagate upwards
through the dependency graph, changing output paths everywhere.
For instance, if we change the url in a call to the `fetchurl'
function, we do not want to rebuild everything depending on it---after
all, (the hash of) the file being downloaded is unchanged. So the
*output paths* should not change. On the other hand, the *derivation
paths* should change to reflect the new dependency graph.
For fixed-output derivations, this returns a map from the name of
each output to its hash, unique up to the output's contents.
For regular derivations, it returns a single hash of the derivation
ATerm, after subderivations have been likewise expunged from that
derivation.
*/
DrvHashModulo hashDerivationModulo(Store & store, const Derivation & drv, bool maskOutputs);
/* Memoisation of hashDerivationModulo(). */
typedef std::map<StorePath, Hash> DrvHashes;
typedef std::map<StorePath, DrvHashModulo> DrvHashes;
extern DrvHashes drvHashes; // FIXME: global, not thread-safe
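The comment above states the contract; the following deliberately tiny toy model shows the "hash modulo fixed-output inputs" idea in isolation. std::hash stands in for SHA-256 over the ATerm, and memoisation plus the per-output CaOutputHashes map are omitted; the point is only that changing how a fixed-output input is fetched does not change the hash of derivations that depend on it:

    #include <functional>
    #include <iostream>
    #include <map>
    #include <optional>
    #include <set>
    #include <string>

    struct ToyDrv {
        std::set<std::string> inputDrvs;              // names of input derivations
        std::optional<std::string> fixedOutputHash;   // set => fixed-output derivation
        std::string body;                             // everything else (builder, URL, ...)
    };

    static std::string h(const std::string & s)
    {
        return std::to_string(std::hash<std::string>{}(s));
    }

    static std::string hashModulo(const std::map<std::string, ToyDrv> & drvs,
        const std::string & name)
    {
        const ToyDrv & drv = drvs.at(name);
        // Fixed-output: the result depends only on the declared output hash,
        // never on how the derivation says it will produce it.
        if (drv.fixedOutputHash)
            return h("fixed:out:" + *drv.fixedOutputHash);
        // Otherwise replace every input derivation by its own hash-modulo.
        std::string s = drv.body;
        for (auto & i : drv.inputDrvs)
            s += "|" + hashModulo(drvs, i);
        return h(s);
    }

    int main()
    {
        std::map<std::string, ToyDrv> drvs;
        drvs["src"] = { {}, std::string("sha256:aaaa"), "fetchurl http://a.example/src.tar" };
        drvs["pkg"] = { { "src" }, std::nullopt, "build the package" };
        auto before = hashModulo(drvs, "pkg");
        // Changing the URL of the fixed-output input leaves pkg's hash unchanged:
        drvs["src"].body = "fetchurl http://b.example/src.tar";
        std::cout << (hashModulo(drvs, "pkg") == before) << "\n";   // 1
    }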
@ -89,7 +180,7 @@ bool wantOutput(const string & output, const std::set<string> & wanted);
struct Source;
struct Sink;
Source & readDerivation(Source & in, const Store & store, BasicDerivation & drv);
Source & readDerivation(Source & in, const Store & store, BasicDerivation & drv, std::string_view name);
void writeDerivation(Sink & out, const Store & store, const BasicDerivation & drv);
std::string hashPlaceholder(const std::string & outputName);


@ -38,9 +38,9 @@ void Store::exportPath(const StorePath & path, Sink & sink)
filesystem corruption from spreading to other machines.
Don't complain if the stored hash is zero (unknown). */
Hash hash = hashSink.currentHash().first;
if (hash != info->narHash && info->narHash != Hash(*info->narHash.type))
if (hash != info->narHash && info->narHash != Hash(info->narHash->type))
throw Error("hash of path '%s' has changed from '%s' to '%s'!",
printStorePath(path), info->narHash.to_string(Base32, true), hash.to_string(Base32, true));
printStorePath(path), info->narHash->to_string(Base32, true), hash.to_string(Base32, true));
teeSink
<< exportMagic


@ -500,7 +500,7 @@ struct LocalStore::GCState
StorePathSet alive;
bool gcKeepOutputs;
bool gcKeepDerivations;
unsigned long long bytesInvalidated;
uint64_t bytesInvalidated;
bool moveToTrash = true;
bool shouldDelete;
GCState(const GCOptions & options, GCResults & results)
@ -518,7 +518,7 @@ bool LocalStore::isActiveTempFile(const GCState & state,
void LocalStore::deleteGarbage(GCState & state, const Path & path)
{
unsigned long long bytesFreed;
uint64_t bytesFreed;
deletePath(path, bytesFreed);
state.results.bytesFreed += bytesFreed;
}
@ -528,7 +528,7 @@ void LocalStore::deletePathRecursive(GCState & state, const Path & path)
{
checkInterrupt();
unsigned long long size = 0;
uint64_t size = 0;
auto storePath = maybeParseStorePath(path);
if (storePath && isValidPath(*storePath)) {
@ -687,7 +687,7 @@ void LocalStore::removeUnusedLinks(const GCState & state)
AutoCloseDir dir(opendir(linksDir.c_str()));
if (!dir) throw SysError("opening directory '%1%'", linksDir);
long long actualSize = 0, unsharedSize = 0;
int64_t actualSize = 0, unsharedSize = 0;
struct dirent * dirent;
while (errno = 0, dirent = readdir(dir.get())) {
@ -717,10 +717,10 @@ void LocalStore::removeUnusedLinks(const GCState & state)
struct stat st;
if (stat(linksDir.c_str(), &st) == -1)
throw SysError("statting '%1%'", linksDir);
long long overhead = st.st_blocks * 512ULL;
auto overhead = st.st_blocks * 512ULL;
printInfo(format("note: currently hard linking saves %.2f MiB")
% ((unsharedSize - actualSize - overhead) / (1024.0 * 1024.0)));
printInfo("note: currently hard linking saves %.2f MiB",
((unsharedSize - actualSize - overhead) / (1024.0 * 1024.0)));
}


@ -335,9 +335,6 @@ public:
"setuid/setgid bits or with file capabilities."};
#endif
Setting<Strings> hashedMirrors{this, {"http://tarballs.nixos.org/"}, "hashed-mirrors",
"A list of servers used by builtins.fetchurl to fetch files by hash."};
Setting<uint64_t> minFree{this, 0, "min-free",
"Automatically run the garbage collector when free disk space drops below the specified amount."};


@ -113,7 +113,7 @@ struct LegacySSHStore : public Store
if (GET_PROTOCOL_MINOR(conn->remoteVersion) >= 4) {
auto s = readString(conn->from);
info->narHash = s.empty() ? Hash() : Hash(s);
info->narHash = s.empty() ? std::optional<Hash>{} : Hash::parseAnyPrefixed(s);
info->ca = parseContentAddressOpt(readString(conn->from));
info->sigs = readStrings<StringSet>(conn->from);
}
@ -138,7 +138,7 @@ struct LegacySSHStore : public Store
<< cmdAddToStoreNar
<< printStorePath(info.path)
<< (info.deriver ? printStorePath(*info.deriver) : "")
<< info.narHash.to_string(Base16, false);
<< info.narHash->to_string(Base16, false);
writeStorePaths(*this, conn->to, info.references);
conn->to
<< info.registrationTime


@ -544,11 +544,8 @@ void LocalStore::checkDerivationOutputs(const StorePath & drvPath, const Derivat
std::string drvName(drvPath.name());
drvName = string(drvName, 0, drvName.size() - drvExtension.size());
auto check = [&](const StorePath & expected, const StorePath & actual, const std::string & varName)
auto envHasRightPath = [&](const StorePath & actual, const std::string & varName)
{
if (actual != expected)
throw Error("derivation '%s' has incorrect output '%s', should be '%s'",
printStorePath(drvPath), printStorePath(actual), printStorePath(expected));
auto j = drv.env.find(varName);
if (j == drv.env.end() || parseStorePath(j->second) != actual)
throw Error("derivation '%s' has incorrect environment variable '%s', should be '%s'",
@ -556,23 +553,34 @@ void LocalStore::checkDerivationOutputs(const StorePath & drvPath, const Derivat
};
if (drv.isFixedOutput()) {
DerivationOutputs::const_iterator out = drv.outputs.find("out");
if (out == drv.outputs.end())
throw Error("derivation '%s' does not have an output named 'out'", printStorePath(drvPath));
// We don't need the answer, but calling this anyway asserts that the
// output combination is valid. The code below is more general and
// naturally allows combinations that are currently prohibited.
drv.type();
check(
makeFixedOutputPath(
out->second.hash->method,
out->second.hash->hash,
drvName),
out->second.path, "out");
std::optional<Hash> h;
for (auto & i : drv.outputs) {
std::visit(overloaded {
[&](DerivationOutputInputAddressed doia) {
if (!h) {
// somewhat expensive, so we compute it lazily
auto temp = hashDerivationModulo(*this, drv, true);
h = std::get<Hash>(temp);
}
else {
Hash h = hashDerivationModulo(*this, drv, true);
for (auto & i : drv.outputs)
check(makeOutputPath(i.first, h, drvName), i.second.path, i.first);
StorePath recomputed = makeOutputPath(i.first, *h, drvName);
if (doia.path != recomputed)
throw Error("derivation '%s' has incorrect output '%s', should be '%s'",
printStorePath(drvPath), printStorePath(doia.path), printStorePath(recomputed));
envHasRightPath(doia.path, i.first);
},
[&](DerivationOutputCAFixed dof) {
StorePath path = makeFixedOutputPath(dof.hash.method, dof.hash.hash, drvName);
envHasRightPath(path, i.first);
},
[&](DerivationOutputCAFloating _) {
throw UnimplementedError("floating CA output derivations are not yet implemented");
},
}, i.second.output);
}
}
@ -586,7 +594,7 @@ uint64_t LocalStore::addValidPath(State & state,
state.stmtRegisterValidPath.use()
(printStorePath(info.path))
(info.narHash.to_string(Base16, true))
(info.narHash->to_string(Base16, true))
(info.registrationTime == 0 ? time(0) : info.registrationTime)
(info.deriver ? printStorePath(*info.deriver) : "", (bool) info.deriver)
(info.narSize, info.narSize != 0)
@ -614,7 +622,7 @@ uint64_t LocalStore::addValidPath(State & state,
state.stmtAddDerivationOutput.use()
(id)
(i.first)
(printStorePath(i.second.path))
(printStorePath(i.second.path(*this, drv.name)))
.exec();
}
}
@ -647,7 +655,7 @@ void LocalStore::queryPathInfoUncached(const StorePath & path,
info->id = useQueryPathInfo.getInt(0);
try {
info->narHash = Hash(useQueryPathInfo.getStr(1));
info->narHash = Hash::parseAnyPrefixed(useQueryPathInfo.getStr(1));
} catch (BadHash & e) {
throw Error("in valid-path entry for '%s': %s", printStorePath(path), e.what());
}
@ -686,7 +694,7 @@ void LocalStore::updatePathInfo(State & state, const ValidPathInfo & info)
{
state.stmtUpdatePathInfo.use()
(info.narSize, info.narSize != 0)
(info.narHash.to_string(Base16, true))
(info.narHash->to_string(Base16, true))
(info.ultimate ? 1 : 0, info.ultimate)
(concatStringsSep(" ", info.sigs), !info.sigs.empty())
(renderContentAddress(info.ca), (bool) info.ca)
@ -846,20 +854,32 @@ StorePathSet LocalStore::querySubstitutablePaths(const StorePathSet & paths)
}
void LocalStore::querySubstitutablePathInfos(const StorePathSet & paths,
SubstitutablePathInfos & infos)
void LocalStore::querySubstitutablePathInfos(const StorePathCAMap & paths, SubstitutablePathInfos & infos)
{
if (!settings.useSubstitutes) return;
for (auto & sub : getDefaultSubstituters()) {
if (sub->storeDir != storeDir) continue;
for (auto & path : paths) {
if (infos.count(path)) continue;
debug("checking substituter '%s' for path '%s'", sub->getUri(), printStorePath(path));
auto subPath(path.first);
// recompute store path so that we can use a different store root
if (path.second) {
subPath = makeFixedOutputPathFromCA(path.first.name(), *path.second);
if (sub->storeDir == storeDir)
assert(subPath == path.first);
if (subPath != path.first)
debug("replaced path '%s' with '%s' for substituter '%s'", printStorePath(path.first), sub->printStorePath(subPath), sub->getUri());
} else if (sub->storeDir != storeDir) continue;
debug("checking substituter '%s' for path '%s'", sub->getUri(), sub->printStorePath(subPath));
try {
auto info = sub->queryPathInfo(path);
auto info = sub->queryPathInfo(subPath);
if (sub->storeDir != storeDir && !(info->isContentAddressed(*sub) && info->references.empty()))
continue;
auto narInfo = std::dynamic_pointer_cast<const NarInfo>(
std::shared_ptr<const ValidPathInfo>(info));
infos.insert_or_assign(path, SubstitutablePathInfo{
infos.insert_or_assign(path.first, SubstitutablePathInfo{
info->deriver,
info->references,
narInfo ? narInfo->fileSize : 0,
@ -900,7 +920,7 @@ void LocalStore::registerValidPaths(const ValidPathInfos & infos)
StorePathSet paths;
for (auto & i : infos) {
assert(i.narHash.type == htSHA256);
assert(i.narHash && i.narHash->type == htSHA256);
if (isValidPath_(*state, i.path))
updatePathInfo(*state, i);
else
@ -1013,7 +1033,7 @@ void LocalStore::addToStore(const ValidPathInfo & info, Source & source,
if (hashResult.first != info.narHash)
throw Error("hash mismatch importing path '%s';\n wanted: %s\n got: %s",
printStorePath(info.path), info.narHash.to_string(Base32, true), hashResult.first.to_string(Base32, true));
printStorePath(info.path), info.narHash->to_string(Base32, true), hashResult.first.to_string(Base32, true));
if (hashResult.second != info.narSize)
throw Error("size mismatch importing path '%s';\n wanted: %s\n got: %s",
@ -1033,20 +1053,6 @@ void LocalStore::addToStore(const ValidPathInfo & info, Source & source,
}
StorePath LocalStore::addToStore(const string & name, const Path & _srcPath,
FileIngestionMethod method, HashType hashAlgo, PathFilter & filter, RepairFlag repair)
{
Path srcPath(absPath(_srcPath));
auto source = sinkToSource([&](Sink & sink) {
if (method == FileIngestionMethod::Recursive)
dumpPath(srcPath, sink, filter);
else
readFile(srcPath, sink);
});
return addToStoreFromDump(*source, name, method, hashAlgo, repair);
}
StorePath LocalStore::addToStoreFromDump(Source & source0, const string & name,
FileIngestionMethod method, HashType hashAlgo, RepairFlag repair)
{
@ -1309,9 +1315,9 @@ bool LocalStore::verifyStore(bool checkContents, RepairFlag repair)
std::unique_ptr<AbstractHashSink> hashSink;
if (!info->ca || !info->references.count(info->path))
hashSink = std::make_unique<HashSink>(*info->narHash.type);
hashSink = std::make_unique<HashSink>(info->narHash->type);
else
hashSink = std::make_unique<HashModuloSink>(*info->narHash.type, std::string(info->path.hashPart()));
hashSink = std::make_unique<HashModuloSink>(info->narHash->type, std::string(info->path.hashPart()));
dumpPath(Store::toRealPath(i), *hashSink);
auto current = hashSink->finish();
@ -1320,7 +1326,7 @@ bool LocalStore::verifyStore(bool checkContents, RepairFlag repair)
logError({
.name = "Invalid hash - path modified",
.hint = hintfmt("path '%s' was modified! expected hash '%s', got '%s'",
printStorePath(i), info->narHash.to_string(Base32, true), current.first.to_string(Base32, true))
printStorePath(i), info->narHash->to_string(Base32, true), current.first.to_string(Base32, true))
});
if (repair) repairPath(i); else errors = true;
} else {


@ -29,8 +29,8 @@ struct Derivation;
struct OptimiseStats
{
unsigned long filesLinked = 0;
unsigned long long bytesFreed = 0;
unsigned long long blocksFreed = 0;
uint64_t bytesFreed = 0;
uint64_t blocksFreed = 0;
};
@ -139,22 +139,14 @@ public:
StorePathSet querySubstitutablePaths(const StorePathSet & paths) override;
void querySubstitutablePathInfos(const StorePathSet & paths,
void querySubstitutablePathInfos(const StorePathCAMap & paths,
SubstitutablePathInfos & infos) override;
void addToStore(const ValidPathInfo & info, Source & source,
RepairFlag repair, CheckSigsFlag checkSigs) override;
StorePath addToStore(const string & name, const Path & srcPath,
FileIngestionMethod method, HashType hashAlgo,
PathFilter & filter, RepairFlag repair) override;
/* Like addToStore(), but the contents of the path are contained
in `dump', which is either a NAR serialisation (if recursive ==
true) or simply the contents of a regular file (if recursive ==
false). */
StorePath addToStoreFromDump(Source & dump, const string & name,
FileIngestionMethod method = FileIngestionMethod::Recursive, HashType hashAlgo = htSHA256, RepairFlag repair = NoRepair) override;
FileIngestionMethod method, HashType hashAlgo, RepairFlag repair) override;
StorePath addTextToStore(const string & name, const string & s,
const StorePathSet & references, RepairFlag repair) override;


@ -4,6 +4,7 @@
#include "local-store.hh"
#include "store-api.hh"
#include "thread-pool.hh"
#include "topo-sort.hh"
namespace nix {
@ -108,9 +109,19 @@ void Store::computeFSClosure(const StorePath & startPath,
}
std::optional<ContentAddress> getDerivationCA(const BasicDerivation & drv)
{
auto out = drv.outputs.find("out");
if (out != drv.outputs.end()) {
if (auto v = std::get_if<DerivationOutputCAFixed>(&out->second.output))
return v->hash;
}
return std::nullopt;
}
void Store::queryMissing(const std::vector<StorePathWithOutputs> & targets,
StorePathSet & willBuild_, StorePathSet & willSubstitute_, StorePathSet & unknown_,
unsigned long long & downloadSize_, unsigned long long & narSize_)
uint64_t & downloadSize_, uint64_t & narSize_)
{
Activity act(*logger, lvlDebug, actUnknown, "querying info about missing paths");
@ -122,8 +133,8 @@ void Store::queryMissing(const std::vector<StorePathWithOutputs> & targets,
{
std::unordered_set<std::string> done;
StorePathSet & unknown, & willSubstitute, & willBuild;
unsigned long long & downloadSize;
unsigned long long & narSize;
uint64_t & downloadSize;
uint64_t & narSize;
};
struct DrvState
@ -157,7 +168,7 @@ void Store::queryMissing(const std::vector<StorePathWithOutputs> & targets,
auto outPath = parseStorePath(outPathS);
SubstitutablePathInfos infos;
querySubstitutablePathInfos({outPath}, infos);
querySubstitutablePathInfos({{outPath, getDerivationCA(*drv)}}, infos);
if (infos.empty()) {
drvState_->lock()->done = true;
@ -198,8 +209,8 @@ void Store::queryMissing(const std::vector<StorePathWithOutputs> & targets,
PathSet invalid;
for (auto & j : drv->outputs)
if (wantOutput(j.first, path.outputs)
&& !isValidPath(j.second.path))
invalid.insert(printStorePath(j.second.path));
&& !isValidPath(j.second.path(*this, drv->name)))
invalid.insert(printStorePath(j.second.path(*this, drv->name)));
if (invalid.empty()) return;
if (settings.useSubstitutes && parsedDrv.substitutesAllowed()) {
@ -214,7 +225,7 @@ void Store::queryMissing(const std::vector<StorePathWithOutputs> & targets,
if (isValidPath(path.path)) return;
SubstitutablePathInfos infos;
querySubstitutablePathInfos({path.path}, infos);
querySubstitutablePathInfos({{path.path, std::nullopt}}, infos);
if (infos.empty()) {
auto state(state_.lock());
@ -246,41 +257,21 @@ void Store::queryMissing(const std::vector<StorePathWithOutputs> & targets,
StorePaths Store::topoSortPaths(const StorePathSet & paths)
{
StorePaths sorted;
StorePathSet visited, parents;
std::function<void(const StorePath & path, const StorePath * parent)> dfsVisit;
dfsVisit = [&](const StorePath & path, const StorePath * parent) {
if (parents.count(path))
throw BuildError("cycle detected in the references of '%s' from '%s'",
printStorePath(path), printStorePath(*parent));
if (!visited.insert(path).second) return;
parents.insert(path);
return topoSort(paths,
{[&](const StorePath & path) {
StorePathSet references;
try {
references = queryPathInfo(path)->references;
} catch (InvalidPath &) {
}
for (auto & i : references)
/* Don't traverse into paths that don't exist. That can
happen due to substitutes for non-existent paths. */
if (i != path && paths.count(i))
dfsVisit(i, &path);
sorted.push_back(path);
parents.erase(path);
};
for (auto & i : paths)
dfsVisit(i, nullptr);
std::reverse(sorted.begin(), sorted.end());
return sorted;
return references;
}},
{[&](const StorePath & path, const StorePath & parent) {
return BuildError(
"cycle detected in the references of '%s' from '%s'",
printStorePath(path),
printStorePath(parent));
}});
}
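topoSortPaths now delegates to a generic topoSort helper parameterised by two callbacks: one that enumerates a node's references and one that builds the cycle error. A self-contained sketch of the underlying DFS with cycle detection, matching the logic that used to be inlined here:

    #include <algorithm>
    #include <functional>
    #include <iostream>
    #include <map>
    #include <set>
    #include <stdexcept>
    #include <string>
    #include <vector>

    using Node = std::string;

    static std::vector<Node> topoSort(
        const std::set<Node> & nodes,
        const std::map<Node, std::set<Node>> & edges)   // node -> its references
    {
        std::vector<Node> sorted;
        std::set<Node> visited, parents;

        std::function<void(const Node &)> dfsVisit = [&](const Node & n) {
            if (parents.count(n))
                throw std::runtime_error("cycle detected at '" + n + "'");
            if (!visited.insert(n).second) return;
            parents.insert(n);
            auto i = edges.find(n);
            if (i != edges.end())
                for (auto & ref : i->second)
                    if (ref != n && nodes.count(ref))   // skip self and unknown refs
                        dfsVisit(ref);
            sorted.push_back(n);                        // dependencies first...
            parents.erase(n);
        };

        for (auto & n : nodes) dfsVisit(n);
        std::reverse(sorted.begin(), sorted.end());     // ...then reversed: referrers first
        return sorted;
    }

    int main()
    {
        std::set<Node> nodes { "a", "b", "c" };
        std::map<Node, std::set<Node>> edges { { "a", { "b" } }, { "b", { "c" } } };
        for (auto & n : topoSort(nodes, edges)) std::cout << n << " ";   // a b c
        std::cout << "\n";
    }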

View file

@ -79,14 +79,14 @@ struct NarAccessor : public FSAccessor
parents.top()->isExecutable = true;
}
void preallocateContents(unsigned long long size) override
void preallocateContents(uint64_t size) override
{
assert(size <= std::numeric_limits<uint64_t>::max());
parents.top()->size = (uint64_t) size;
parents.top()->start = pos;
}
void receiveContents(unsigned char * data, unsigned int len) override
void receiveContents(unsigned char * data, size_t len) override
{ }
void createSymlink(const Path & path, const string & target) override

View file

@ -193,9 +193,9 @@ public:
narInfo->url = queryNAR.getStr(2);
narInfo->compression = queryNAR.getStr(3);
if (!queryNAR.isNull(4))
narInfo->fileHash = Hash(queryNAR.getStr(4));
narInfo->fileHash = Hash::parseAnyPrefixed(queryNAR.getStr(4));
narInfo->fileSize = queryNAR.getInt(5);
narInfo->narHash = Hash(queryNAR.getStr(6));
narInfo->narHash = Hash::parseAnyPrefixed(queryNAR.getStr(6));
narInfo->narSize = queryNAR.getInt(7);
for (auto & r : tokenizeString<Strings>(queryNAR.getStr(8), " "))
narInfo->references.insert(StorePath(r));
@ -230,9 +230,9 @@ public:
(std::string(info->path.name()))
(narInfo ? narInfo->url : "", narInfo != 0)
(narInfo ? narInfo->compression : "", narInfo != 0)
(narInfo && narInfo->fileHash ? narInfo->fileHash.to_string(Base32, true) : "", narInfo && narInfo->fileHash)
(narInfo && narInfo->fileHash ? narInfo->fileHash->to_string(Base32, true) : "", narInfo && narInfo->fileHash)
(narInfo ? narInfo->fileSize : 0, narInfo != 0 && narInfo->fileSize)
(info->narHash.to_string(Base32, true))
(info->narHash->to_string(Base32, true))
(info->narSize)
(concatStringsSep(" ", info->shortRefs()))
(info->deriver ? std::string(info->deriver->to_string()) : "", (bool) info->deriver)


@ -7,15 +7,14 @@ NarInfo::NarInfo(const Store & store, const std::string & s, const std::string &
: ValidPathInfo(StorePath(StorePath::dummy)) // FIXME: hack
{
auto corrupt = [&]() {
throw Error("NAR info file '%1%' is corrupt", whence);
return Error("NAR info file '%1%' is corrupt", whence);
};
auto parseHashField = [&](const string & s) {
try {
return Hash(s);
return Hash::parseAnyPrefixed(s);
} catch (BadHash &) {
corrupt();
return Hash(); // never reached
throw corrupt();
}
};
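The corrupt lambda now returns the exception object and every caller writes `throw corrupt();`. Compared with a lambda that throws internally, this keeps the control flow visible at the call site, so parseHashField no longer needs the unreachable dummy `return Hash()`. A stand-alone sketch of the pattern (the parsing function below is illustrative, not Nix code):

    #include <iostream>
    #include <stdexcept>
    #include <string>

    static int parseDigit(const std::string & s, const std::string & whence)
    {
        auto corrupt = [&]() {
            return std::runtime_error("file '" + whence + "' is corrupt");
        };
        if (s.size() != 1 || s[0] < '0' || s[0] > '9') throw corrupt();
        return s[0] - '0';
    }

    int main()
    {
        std::cout << parseDigit("7", "example") << "\n";   // 7
        try { parseDigit("xy", "example"); }
        catch (std::exception & e) { std::cout << e.what() << "\n"; }
    }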
@ -25,12 +24,12 @@ NarInfo::NarInfo(const Store & store, const std::string & s, const std::string &
while (pos < s.size()) {
size_t colon = s.find(':', pos);
if (colon == std::string::npos) corrupt();
if (colon == std::string::npos) throw corrupt();
std::string name(s, pos, colon - pos);
size_t eol = s.find('\n', colon + 2);
if (eol == std::string::npos) corrupt();
if (eol == std::string::npos) throw corrupt();
std::string value(s, colon + 2, eol - colon - 2);
@ -45,16 +44,16 @@ NarInfo::NarInfo(const Store & store, const std::string & s, const std::string &
else if (name == "FileHash")
fileHash = parseHashField(value);
else if (name == "FileSize") {
if (!string2Int(value, fileSize)) corrupt();
if (!string2Int(value, fileSize)) throw corrupt();
}
else if (name == "NarHash")
narHash = parseHashField(value);
else if (name == "NarSize") {
if (!string2Int(value, narSize)) corrupt();
if (!string2Int(value, narSize)) throw corrupt();
}
else if (name == "References") {
auto refs = tokenizeString<Strings>(value, " ");
if (!references.empty()) corrupt();
if (!references.empty()) throw corrupt();
for (auto & r : refs)
references.insert(StorePath(r));
}
@ -67,7 +66,7 @@ NarInfo::NarInfo(const Store & store, const std::string & s, const std::string &
else if (name == "Sig")
sigs.insert(value);
else if (name == "CA") {
if (ca) corrupt();
if (ca) throw corrupt();
// FIXME: allow blank ca or require skipping field?
ca = parseContentAddressOpt(value);
}
@ -77,7 +76,7 @@ NarInfo::NarInfo(const Store & store, const std::string & s, const std::string &
if (compression == "") compression = "bzip2";
if (!havePath || url.empty() || narSize == 0 || !narHash) corrupt();
if (!havePath || url.empty() || narSize == 0 || !narHash) throw corrupt();
}
std::string NarInfo::to_string(const Store & store) const
@ -87,11 +86,11 @@ std::string NarInfo::to_string(const Store & store) const
res += "URL: " + url + "\n";
assert(compression != "");
res += "Compression: " + compression + "\n";
assert(fileHash.type == htSHA256);
res += "FileHash: " + fileHash.to_string(Base32, true) + "\n";
assert(fileHash && fileHash->type == htSHA256);
res += "FileHash: " + fileHash->to_string(Base32, true) + "\n";
res += "FileSize: " + std::to_string(fileSize) + "\n";
assert(narHash.type == htSHA256);
res += "NarHash: " + narHash.to_string(Base32, true) + "\n";
assert(narHash && narHash->type == htSHA256);
res += "NarHash: " + narHash->to_string(Base32, true) + "\n";
res += "NarSize: " + std::to_string(narSize) + "\n";
res += "References: " + concatStringsSep(" ", shortRefs()) + "\n";


@ -10,7 +10,7 @@ struct NarInfo : ValidPathInfo
{
std::string url;
std::string compression;
Hash fileHash;
std::optional<Hash> fileHash;
uint64_t fileSize = 0;
std::string system;


@ -282,7 +282,7 @@ void LocalStore::optimiseStore(OptimiseStats & stats)
}
}
static string showBytes(unsigned long long bytes)
static string showBytes(uint64_t bytes)
{
return (format("%.2f MiB") % (bytes / (1024.0 * 1024.0))).str();
}


@ -117,9 +117,4 @@ bool ParsedDerivation::substitutesAllowed() const
return getBoolAttr("allowSubstitutes", true);
}
bool ParsedDerivation::contentAddressed() const
{
return getBoolAttr("__contentAddressed", false);
}
}


@ -34,8 +34,6 @@ public:
bool willBuildLocally() const;
bool substitutesAllowed() const;
bool contentAddressed() const;
};
}

src/libstore/path-info.hh Normal file

@ -0,0 +1,111 @@
#pragma once
#include "path.hh"
#include "hash.hh"
#include "content-address.hh"
#include <string>
#include <optional>
namespace nix {
class Store;
struct SubstitutablePathInfo
{
std::optional<StorePath> deriver;
StorePathSet references;
uint64_t downloadSize; /* 0 = unknown or inapplicable */
uint64_t narSize; /* 0 = unknown */
};
typedef std::map<StorePath, SubstitutablePathInfo> SubstitutablePathInfos;
struct ValidPathInfo
{
StorePath path;
std::optional<StorePath> deriver;
// TODO document this
std::optional<Hash> narHash;
StorePathSet references;
time_t registrationTime = 0;
uint64_t narSize = 0; // 0 = unknown
uint64_t id; // internal use only
/* Whether the path is ultimately trusted, that is, it's a
derivation output that was built locally. */
bool ultimate = false;
StringSet sigs; // note: not necessarily verified
/* If non-empty, an assertion that the path is content-addressed,
i.e., that the store path is computed from a cryptographic hash
of the contents of the path, plus some other bits of data like
the "name" part of the path. Such a path doesn't need
signatures, since we don't have to trust anybody's claim that
the path is the output of a particular derivation. (In the
extensional store model, we have to trust that the *contents*
of an output path of a derivation were actually produced by
that derivation. In the intensional model, we have to trust
that a particular output path was produced by a derivation; the
path then implies the contents.)
Ideally, the content-addressability assertion would just be a Boolean,
and the store path would be computed from the name component, narHash
and references. However, we support many types of content addresses.
*/
std::optional<ContentAddress> ca;
bool operator == (const ValidPathInfo & i) const
{
return
path == i.path
&& narHash == i.narHash
&& references == i.references;
}
/* Return a fingerprint of the store path to be used in binary
cache signatures. It contains the store path, the base-32
SHA-256 hash of the NAR serialisation of the path, the size of
the NAR, and the sorted references. The size field is strictly
speaking superfluous, but might prevent endless/excessive data
attacks. */
std::string fingerprint(const Store & store) const;
void sign(const Store & store, const SecretKey & secretKey);
/* Return true iff the path is verifiably content-addressed. */
bool isContentAddressed(const Store & store) const;
/* Functions to view references + hasSelfReference as one set, mainly for
compatibility's sake. */
StorePathSet referencesPossiblyToSelf() const;
void insertReferencePossiblyToSelf(StorePath && ref);
void setReferencesPossiblyToSelf(StorePathSet && refs);
static const size_t maxSigs = std::numeric_limits<size_t>::max();
/* Return the number of signatures on this .narinfo that were
produced by one of the specified keys, or maxSigs if the path
is content-addressed. */
size_t checkSignatures(const Store & store, const PublicKeys & publicKeys) const;
/* Verify a single signature. */
bool checkSignature(const Store & store, const PublicKeys & publicKeys, const std::string & sig) const;
Strings shortRefs() const;
ValidPathInfo(const ValidPathInfo & other) = default;
ValidPathInfo(StorePath && path) : path(std::move(path)) { };
ValidPathInfo(const StorePath & path) : path(path) { };
virtual ~ValidPathInfo() { }
};
typedef list<ValidPathInfo> ValidPathInfos;
}
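
A minimal sketch of how the signing pieces above fit together; the Store, the populated ValidPathInfo and the SecretKey are assumed to come from elsewhere, and signPathInfo is an illustrative helper, not part of this change:

#include "store-api.hh"
#include "crypto.hh"

// Hypothetical helper: sign a path's metadata with a binary-cache secret key.
void signPathInfo(const nix::Store & store, nix::ValidPathInfo & info,
    const nix::SecretKey & secretKey)
{
    // fingerprint() is the string that actually gets signed:
    //   "1;<store path>;<base-32 NAR hash>;<NAR size>;<comma-separated references>"
    // and sign() appends a signature over it to info.sigs.
    info.sign(store, secretKey);
}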

View file

@ -64,6 +64,8 @@ typedef std::set<StorePath> StorePathSet;
typedef std::vector<StorePath> StorePaths;
typedef std::map<string, StorePath> OutputPathMap;
typedef std::map<StorePath, std::optional<ContentAddress>> StorePathCAMap;
/* Extension of derivations in the Nix store. */
const std::string drvExtension = ".drv";

View file

@ -76,8 +76,8 @@ void RefScanSink::operator () (const unsigned char * data, size_t len)
}
PathSet scanForReferences(const string & path,
const PathSet & refs, HashResult & hash)
std::pair<PathSet, HashResult> scanForReferences(const string & path,
const PathSet & refs)
{
RefScanSink refsSink;
HashSink hashSink { htSHA256 };
@ -111,9 +111,9 @@ PathSet scanForReferences(const string & path,
found.insert(j->second);
}
hash = hashSink.finish();
auto hash = hashSink.finish();
return found;
return std::pair<PathSet, HashResult>(found, hash);
}

View file

@ -5,8 +5,7 @@
namespace nix {
PathSet scanForReferences(const Path & path, const PathSet & refs,
HashResult & hash);
std::pair<PathSet, HashResult> scanForReferences(const Path & path, const PathSet & refs);
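
A hedged sketch of the new call shape (the output path and candidate references are placeholders):

#include "references.hh"

void scanExample(const nix::Path & outputPath, const nix::PathSet & candidateRefs)
{
    // One pass now yields both the references that were found and the NAR
    // hash/size, instead of filling a HashResult out-parameter.
    auto [found, hashResult] = nix::scanForReferences(outputPath, candidateRefs);
    auto & [narHash, narSize] = hashResult;
    (void) found; (void) narHash; (void) narSize;
}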
struct RewritingSink : Sink
{

View file

@ -39,6 +39,24 @@ void writeStorePaths(const Store & store, Sink & out, const StorePathSet & paths
out << store.printStorePath(i);
}
StorePathCAMap readStorePathCAMap(const Store & store, Source & from)
{
StorePathCAMap paths;
auto count = readNum<size_t>(from);
while (count--)
paths.insert_or_assign(store.parseStorePath(readString(from)), parseContentAddressOpt(readString(from)));
return paths;
}
void writeStorePathCAMap(const Store & store, Sink & out, const StorePathCAMap & paths)
{
out << paths.size();
for (auto & i : paths) {
out << store.printStorePath(i.first);
out << renderContentAddress(i.second);
}
}
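
The wire format above is a count followed by, for each entry, the printed store path and renderContentAddress() of its optional CA (the empty string when there is none). A small hypothetical round-trip through a string buffer, assuming StringSink/StringSource from serialise.hh:

#include <cassert>
#include "serialise.hh"
#include "store-api.hh"
#include "worker-protocol.hh"

void roundTripStorePathCAMap(const nix::Store & store, const nix::StorePathCAMap & in)
{
    nix::StringSink to;
    nix::writeStorePathCAMap(store, to, in);

    nix::StringSource from(*to.s);
    auto out = nix::readStorePathCAMap(store, from);
    assert(out.size() == in.size()); // the same entries come back out
}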
std::map<string, StorePath> readOutputPathMap(const Store & store, Source & from)
{
std::map<string, StorePath> pathMap;
@ -332,18 +350,17 @@ StorePathSet RemoteStore::querySubstitutablePaths(const StorePathSet & paths)
}
void RemoteStore::querySubstitutablePathInfos(const StorePathSet & paths,
SubstitutablePathInfos & infos)
void RemoteStore::querySubstitutablePathInfos(const StorePathCAMap & pathsMap, SubstitutablePathInfos & infos)
{
if (paths.empty()) return;
if (pathsMap.empty()) return;
auto conn(getConnection());
if (GET_PROTOCOL_MINOR(conn->daemonVersion) < 12) {
for (auto & i : paths) {
for (auto & i : pathsMap) {
SubstitutablePathInfo info;
conn->to << wopQuerySubstitutablePathInfo << printStorePath(i);
conn->to << wopQuerySubstitutablePathInfo << printStorePath(i.first);
conn.processStderr();
unsigned int reply = readInt(conn->from);
if (reply == 0) continue;
@ -353,13 +370,19 @@ void RemoteStore::querySubstitutablePathInfos(const StorePathSet & paths,
info.references = readStorePaths<StorePathSet>(*this, conn->from);
info.downloadSize = readLongLong(conn->from);
info.narSize = readLongLong(conn->from);
infos.insert_or_assign(i, std::move(info));
infos.insert_or_assign(i.first, std::move(info));
}
} else {
conn->to << wopQuerySubstitutablePathInfos;
if (GET_PROTOCOL_MINOR(conn->daemonVersion) < 22) {
StorePathSet paths;
for (auto & path : pathsMap)
paths.insert(path.first);
writeStorePaths(*this, conn->to, paths);
} else
writeStorePathCAMap(*this, conn->to, pathsMap);
conn.processStderr();
size_t count = readNum<size_t>(conn->from);
for (size_t n = 0; n < count; n++) {
@ -399,7 +422,7 @@ void RemoteStore::queryPathInfoUncached(const StorePath & path,
info = std::make_shared<ValidPathInfo>(StorePath(path));
auto deriver = readString(conn->from);
if (deriver != "") info->deriver = parseStorePath(deriver);
info->narHash = Hash(readString(conn->from), htSHA256);
info->narHash = Hash::parseAny(readString(conn->from), htSHA256);
info->references = readStorePaths<StorePathSet>(*this, conn->from);
conn->from >> info->registrationTime >> info->narSize;
if (GET_PROTOCOL_MINOR(conn->daemonVersion) >= 16) {
@ -498,14 +521,89 @@ void RemoteStore::addToStore(const ValidPathInfo & info, Source & source,
conn->to << wopAddToStoreNar
<< printStorePath(info.path)
<< (info.deriver ? printStorePath(*info.deriver) : "")
<< info.narHash.to_string(Base16, false);
<< info.narHash->to_string(Base16, false);
writeStorePaths(*this, conn->to, info.references);
conn->to << info.registrationTime << info.narSize
<< info.ultimate << info.sigs << renderContentAddress(info.ca)
<< repair << !checkSigs;
bool tunnel = GET_PROTOCOL_MINOR(conn->daemonVersion) >= 21;
if (!tunnel) copyNAR(source, conn->to);
conn.processStderr(0, tunnel ? &source : nullptr);
if (GET_PROTOCOL_MINOR(conn->daemonVersion) >= 23) {
std::exception_ptr ex;
struct FramedSink : BufferedSink
{
ConnectionHandle & conn;
std::exception_ptr & ex;
FramedSink(ConnectionHandle & conn, std::exception_ptr & ex) : conn(conn), ex(ex)
{ }
~FramedSink()
{
try {
conn->to << 0;
conn->to.flush();
} catch (...) {
ignoreException();
}
}
void write(const unsigned char * data, size_t len) override
{
/* Don't send more data if the remote has
encountered an error. */
if (ex) {
auto ex2 = ex;
ex = nullptr;
std::rethrow_exception(ex2);
}
conn->to << len;
conn->to(data, len);
};
};
/* Handle log messages / exceptions from the remote on a
separate thread. */
std::thread stderrThread([&]()
{
try {
conn.processStderr();
} catch (...) {
ex = std::current_exception();
}
});
Finally joinStderrThread([&]()
{
if (stderrThread.joinable()) {
stderrThread.join();
if (ex) {
try {
std::rethrow_exception(ex);
} catch (...) {
ignoreException();
}
}
}
});
{
FramedSink sink(conn, ex);
copyNAR(source, sink);
sink.flush();
}
stderrThread.join();
if (ex)
std::rethrow_exception(ex);
} else if (GET_PROTOCOL_MINOR(conn->daemonVersion) >= 21) {
conn.processStderr(0, &source);
} else {
copyNAR(source, conn->to);
conn.processStderr(0, nullptr);
}
}
}
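
The framing used for protocol minor version >= 23 is a sequence of length-prefixed chunks terminated by a zero length. A rough reader-side sketch, for illustration only and not the daemon's actual implementation (drainFramedStream is a made-up name):

#include <algorithm>
#include <vector>
#include "serialise.hh"

static void drainFramedStream(nix::Source & from, nix::Sink & to)
{
    while (true) {
        auto len = nix::readNum<uint64_t>(from);
        if (!len) break;                     // a zero length ends the stream
        std::vector<unsigned char> buf(65536);
        while (len) {
            auto n = std::min<uint64_t>(len, buf.size());
            from(buf.data(), n);             // read exactly n bytes of this chunk
            to(buf.data(), n);
            len -= n;
        }
    }
}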
@ -707,7 +805,7 @@ void RemoteStore::addSignatures(const StorePath & storePath, const StringSet & s
void RemoteStore::queryMissing(const std::vector<StorePathWithOutputs> & targets,
StorePathSet & willBuild, StorePathSet & willSubstitute, StorePathSet & unknown,
unsigned long long & downloadSize, unsigned long long & narSize)
uint64_t & downloadSize, uint64_t & narSize)
{
{
auto conn(getConnection());

View file

@ -56,7 +56,7 @@ public:
StorePathSet querySubstitutablePaths(const StorePathSet & paths) override;
void querySubstitutablePathInfos(const StorePathSet & paths,
void querySubstitutablePathInfos(const StorePathCAMap & paths,
SubstitutablePathInfos & infos) override;
void addToStore(const ValidPathInfo & info, Source & nar,
@ -94,7 +94,7 @@ public:
void queryMissing(const std::vector<StorePathWithOutputs> & targets,
StorePathSet & willBuild, StorePathSet & willSubstitute, StorePathSet & unknown,
unsigned long long & downloadSize, unsigned long long & narSize) override;
uint64_t & downloadSize, uint64_t & narSize) override;
void connect() override;

View file

@ -266,6 +266,10 @@ struct S3BinaryCacheStoreImpl : public S3BinaryCacheStore
const std::string & mimeType,
const std::string & contentEncoding)
{
istream->seekg(0, istream->end);
auto size = istream->tellg();
istream->seekg(0, istream->beg);
auto maxThreads = std::thread::hardware_concurrency();
static std::shared_ptr<Aws::Utils::Threading::PooledThreadExecutor>
@ -343,13 +347,11 @@ struct S3BinaryCacheStoreImpl : public S3BinaryCacheStore
std::chrono::duration_cast<std::chrono::milliseconds>(now2 - now1)
.count();
auto size = istream->tellg();
printInfo("uploaded 's3://%s/%s' (%d bytes) in %d ms",
bucketName, path, size, duration);
stats.putTimeMs += duration;
stats.putBytes += size;
stats.putBytes += std::max(size, (decltype(size)) 0);
stats.put++;
}

View file

@ -193,6 +193,19 @@ StorePath Store::makeFixedOutputPath(
}
}
StorePath Store::makeFixedOutputPathFromCA(std::string_view name, ContentAddress ca,
const StorePathSet & references, bool hasSelfReference) const
{
// Dispatch on the kind of content address: text hashes and fixed-output hashes yield different store path computations.
return std::visit(overloaded {
[&](TextHash th) {
return makeTextPath(name, th.hash, references);
},
[&](FixedOutputHash fsh) {
return makeFixedOutputPath(fsh.method, fsh.hash, name, references, hasSelfReference);
}
}, ca);
}
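
A hypothetical call shape for the helper above, assuming the NAR hash of a recursively ingested path is already known (recomputePath is an illustrative name):

#include <string_view>
#include "store-api.hh"

nix::StorePath recomputePath(const nix::Store & store, std::string_view name,
    const nix::Hash & narHash)
{
    // A FixedOutputHash converts implicitly into the ContentAddress variant;
    // references default to {} and hasSelfReference to false.
    return store.makeFixedOutputPathFromCA(name,
        nix::FixedOutputHash {
            .method = nix::FileIngestionMethod::Recursive,
            .hash = narHash,
        });
}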
StorePath Store::makeTextPath(std::string_view name, const Hash & hash,
const StorePathSet & references) const
@ -222,6 +235,20 @@ StorePath Store::computeStorePathForText(const string & name, const string & s,
}
StorePath Store::addToStore(const string & name, const Path & _srcPath,
FileIngestionMethod method, HashType hashAlgo, PathFilter & filter, RepairFlag repair)
{
Path srcPath(absPath(_srcPath));
auto source = sinkToSource([&](Sink & sink) {
if (method == FileIngestionMethod::Recursive)
dumpPath(srcPath, sink, filter);
else
readFile(srcPath, sink);
});
return addToStoreFromDump(*source, name, method, hashAlgo, repair);
}
/*
The aim of this function is to compute in one pass the correct ValidPathInfo for
the files that we are trying to add to the store. To accomplish that in one
@ -538,7 +565,7 @@ string Store::makeValidityRegistration(const StorePathSet & paths,
auto info = queryPathInfo(i);
if (showHash) {
s += info->narHash.to_string(Base16, false) + "\n";
s += info->narHash->to_string(Base16, false) + "\n";
s += (format("%1%\n") % info->narSize).str();
}
@ -570,7 +597,7 @@ void Store::pathInfoToJSON(JSONPlaceholder & jsonOut, const StorePathSet & store
auto info = queryPathInfo(storePath);
jsonPath
.attr("narHash", info->narHash.to_string(hashBase, true))
.attr("narHash", info->narHash->to_string(hashBase, true))
.attr("narSize", info->narSize);
{
@ -613,7 +640,7 @@ void Store::pathInfoToJSON(JSONPlaceholder & jsonOut, const StorePathSet & store
if (!narInfo->url.empty())
jsonPath.attr("url", narInfo->url);
if (narInfo->fileHash)
jsonPath.attr("downloadHash", narInfo->fileHash.to_string(hashBase, true));
jsonPath.attr("downloadHash", narInfo->fileHash->to_string(hashBase, true));
if (narInfo->fileSize)
jsonPath.attr("downloadSize", narInfo->fileSize);
if (showClosureSize)
@ -689,6 +716,15 @@ void copyStorePath(ref<Store> srcStore, ref<Store> dstStore,
uint64_t total = 0;
// recompute store path on the chance dstStore does it differently
if (info->ca && info->references.empty()) {
auto info2 = make_ref<ValidPathInfo>(*info);
info2->path = dstStore->makeFixedOutputPathFromCA(info->path.name(), *info->ca);
if (dstStore->storeDir == srcStore->storeDir)
assert(info->path == info2->path);
info = info2;
}
if (!info->narHash) {
StringSink sink;
srcStore->narFromPath({storePath}, sink);
@ -724,16 +760,20 @@ void copyStorePath(ref<Store> srcStore, ref<Store> dstStore,
}
void copyPaths(ref<Store> srcStore, ref<Store> dstStore, const StorePathSet & storePaths,
std::map<StorePath, StorePath> copyPaths(ref<Store> srcStore, ref<Store> dstStore, const StorePathSet & storePaths,
RepairFlag repair, CheckSigsFlag checkSigs, SubstituteFlag substitute)
{
auto valid = dstStore->queryValidPaths(storePaths, substitute);
PathSet missing;
StorePathSet missing;
for (auto & path : storePaths)
if (!valid.count(path)) missing.insert(srcStore->printStorePath(path));
if (!valid.count(path)) missing.insert(path);
if (missing.empty()) return;
std::map<StorePath, StorePath> pathsMap;
for (auto & path : storePaths)
pathsMap.insert_or_assign(path, path);
if (missing.empty()) return pathsMap;
Activity act(*logger, lvlInfo, actCopyPaths, fmt("copying %d paths", missing.size()));
@ -748,30 +788,49 @@ void copyPaths(ref<Store> srcStore, ref<Store> dstStore, const StorePathSet & st
ThreadPool pool;
processGraph<Path>(pool,
PathSet(missing.begin(), missing.end()),
processGraph<StorePath>(pool,
StorePathSet(missing.begin(), missing.end()),
[&](const Path & storePath) {
if (dstStore->isValidPath(dstStore->parseStorePath(storePath))) {
[&](const StorePath & storePath) {
auto info = srcStore->queryPathInfo(storePath);
auto storePathForDst = storePath;
if (info->ca && info->references.empty()) {
storePathForDst = dstStore->makeFixedOutputPathFromCA(storePath.name(), *info->ca);
if (dstStore->storeDir == srcStore->storeDir)
assert(storePathForDst == storePath);
if (storePathForDst != storePath)
debug("replaced path '%s' to '%s' for substituter '%s'", srcStore->printStorePath(storePath), dstStore->printStorePath(storePathForDst), dstStore->getUri());
}
pathsMap.insert_or_assign(storePath, storePathForDst);
if (dstStore->isValidPath(storePath)) {
nrDone++;
showProgress();
return PathSet();
return StorePathSet();
}
auto info = srcStore->queryPathInfo(srcStore->parseStorePath(storePath));
bytesExpected += info->narSize;
act.setExpected(actCopyPath, bytesExpected);
return srcStore->printStorePathSet(info->references);
return info->references;
},
[&](const Path & storePathS) {
[&](const StorePath & storePath) {
checkInterrupt();
auto storePath = dstStore->parseStorePath(storePathS);
auto info = srcStore->queryPathInfo(storePath);
if (!dstStore->isValidPath(storePath)) {
auto storePathForDst = storePath;
if (info->ca && info->references.empty()) {
storePathForDst = dstStore->makeFixedOutputPathFromCA(storePath.name(), *info->ca);
if (dstStore->storeDir == srcStore->storeDir)
assert(storePathForDst == storePath);
if (storePathForDst != storePath)
debug("replaced path '%s' to '%s' for substituter '%s'", srcStore->printStorePath(storePath), dstStore->printStorePath(storePathForDst), dstStore->getUri());
}
pathsMap.insert_or_assign(storePath, storePathForDst);
if (!dstStore->isValidPath(storePathForDst)) {
MaintainCount<decltype(nrRunning)> mc(nrRunning);
showProgress();
try {
@ -780,7 +839,7 @@ void copyPaths(ref<Store> srcStore, ref<Store> dstStore, const StorePathSet & st
nrFailed++;
if (!settings.keepGoing)
throw e;
logger->log(lvlError, fmt("could not copy %s: %s", storePathS, e.what()));
logger->log(lvlError, fmt("could not copy %s: %s", dstStore->printStorePath(storePath), e.what()));
showProgress();
return;
}
@ -789,6 +848,8 @@ void copyPaths(ref<Store> srcStore, ref<Store> dstStore, const StorePathSet & st
nrDone++;
showProgress();
});
return pathsMap;
}
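
A caller-side sketch of the new return value: the map records where each requested path ended up in dstStore, which can differ from the source path when a content-addressed path is recomputed as above (copyOne is an illustrative name):

#include "store-api.hh"

nix::StorePath copyOne(nix::ref<nix::Store> srcStore, nix::ref<nix::Store> dstStore,
    const nix::StorePath & path)
{
    auto pathsMap = nix::copyPaths(srcStore, dstStore, {path});
    return pathsMap.at(path);
}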
@ -811,7 +872,7 @@ std::optional<ValidPathInfo> decodeValidPathInfo(const Store & store, std::istre
if (hashGiven) {
string s;
getline(str, s);
info.narHash = Hash(s, htSHA256);
info.narHash = Hash::parseAny(s, htSHA256);
getline(str, s);
if (!string2Int(s, info.narSize)) throw Error("number expected");
}
@ -854,7 +915,7 @@ std::string ValidPathInfo::fingerprint(const Store & store) const
store.printStorePath(path));
return
"1;" + store.printStorePath(path) + ";"
+ narHash.to_string(Base32, true) + ";"
+ narHash->to_string(Base32, true) + ";"
+ std::to_string(narSize) + ";"
+ concatStringsSep(",", store.printStorePathSet(references));
}

View file

@ -10,6 +10,7 @@
#include "globals.hh"
#include "config.hh"
#include "derivations.hh"
#include "path-info.hh"
#include <atomic>
#include <limits>
@ -85,7 +86,7 @@ struct GCOptions
StorePathSet pathsToDelete;
/* Stop after at least `maxFreed' bytes have been freed. */
unsigned long long maxFreed{std::numeric_limits<unsigned long long>::max()};
uint64_t maxFreed{std::numeric_limits<uint64_t>::max()};
};
@ -97,98 +98,10 @@ struct GCResults
/* For `gcReturnDead', `gcDeleteDead' and `gcDeleteSpecific', the
number of bytes that would be or was freed. */
unsigned long long bytesFreed = 0;
uint64_t bytesFreed = 0;
};
struct SubstitutablePathInfo
{
std::optional<StorePath> deriver;
StorePathSet references;
unsigned long long downloadSize; /* 0 = unknown or inapplicable */
unsigned long long narSize; /* 0 = unknown */
};
typedef std::map<StorePath, SubstitutablePathInfo> SubstitutablePathInfos;
struct ValidPathInfo
{
StorePath path;
std::optional<StorePath> deriver;
Hash narHash;
StorePathSet references;
time_t registrationTime = 0;
uint64_t narSize = 0; // 0 = unknown
uint64_t id; // internal use only
/* Whether the path is ultimately trusted, that is, it's a
derivation output that was built locally. */
bool ultimate = false;
StringSet sigs; // note: not necessarily verified
/* If non-empty, an assertion that the path is content-addressed,
i.e., that the store path is computed from a cryptographic hash
of the contents of the path, plus some other bits of data like
the "name" part of the path. Such a path doesn't need
signatures, since we don't have to trust anybody's claim that
the path is the output of a particular derivation. (In the
extensional store model, we have to trust that the *contents*
of an output path of a derivation were actually produced by
that derivation. In the intensional model, we have to trust
that a particular output path was produced by a derivation; the
path then implies the contents.)
Ideally, the content-addressability assertion would just be a Boolean,
and the store path would be computed from the name component, narHash
and references. However, we support many types of content addresses.
*/
std::optional<ContentAddress> ca;
bool operator == (const ValidPathInfo & i) const
{
return
path == i.path
&& narHash == i.narHash
&& references == i.references;
}
/* Return a fingerprint of the store path to be used in binary
cache signatures. It contains the store path, the base-32
SHA-256 hash of the NAR serialisation of the path, the size of
the NAR, and the sorted references. The size field is strictly
speaking superfluous, but might prevent endless/excessive data
attacks. */
std::string fingerprint(const Store & store) const;
void sign(const Store & store, const SecretKey & secretKey);
/* Return true iff the path is verifiably content-addressed. */
bool isContentAddressed(const Store & store) const;
static const size_t maxSigs = std::numeric_limits<size_t>::max();
/* Return the number of signatures on this .narinfo that were
produced by one of the specified keys, or maxSigs if the path
is content-addressed. */
size_t checkSignatures(const Store & store, const PublicKeys & publicKeys) const;
/* Verify a single signature. */
bool checkSignature(const Store & store, const PublicKeys & publicKeys, const std::string & sig) const;
Strings shortRefs() const;
ValidPathInfo(const ValidPathInfo & other) = default;
ValidPathInfo(StorePath && path) : path(std::move(path)) { };
ValidPathInfo(const StorePath & path) : path(path) { };
virtual ~ValidPathInfo() { }
};
typedef list<ValidPathInfo> ValidPathInfos;
enum BuildMode { bmNormal, bmRepair, bmCheck };
@ -343,7 +256,11 @@ public:
bool hasSelfReference = false) const;
StorePath makeTextPath(std::string_view name, const Hash & hash,
const StorePathSet & references) const;
const StorePathSet & references = {}) const;
StorePath makeFixedOutputPathFromCA(std::string_view name, ContentAddress ca,
const StorePathSet & references = {},
bool hasSelfReference = false) const;
/* This is the preparatory part of addToStore(); it computes the
store path to which srcPath is to be copied. Returns the store
@ -435,9 +352,10 @@ public:
virtual StorePathSet querySubstitutablePaths(const StorePathSet & paths) { return {}; };
/* Query substitute info (i.e. references, derivers and download
sizes) of a set of paths. If a path does not have substitute
info, it's omitted from the resulting infos map. */
virtual void querySubstitutablePathInfos(const StorePathSet & paths,
sizes) of a map of paths to their optional ca values. If a path
does not have substitute info, it's omitted from the resulting
infos map. */
virtual void querySubstitutablePathInfos(const StorePathCAMap & paths,
SubstitutablePathInfos & infos) { return; };
/* Import a path into the store. */
@ -450,7 +368,7 @@ public:
libutil/archive.hh). */
virtual StorePath addToStore(const string & name, const Path & srcPath,
FileIngestionMethod method = FileIngestionMethod::Recursive, HashType hashAlgo = htSHA256,
PathFilter & filter = defaultPathFilter, RepairFlag repair = NoRepair) = 0;
PathFilter & filter = defaultPathFilter, RepairFlag repair = NoRepair);
/* Copy the contents of a path to the store and register the
validity of the resulting path, using a constant amount of
@ -459,6 +377,10 @@ public:
FileIngestionMethod method = FileIngestionMethod::Recursive, HashType hashAlgo = htSHA256,
std::optional<Hash> expectedCAHash = {});
/* Like addToStore(), but the contents of the path are contained
in `dump', which is either a NAR serialisation (if method ==
FileIngestionMethod::Recursive) or simply the contents of a
regular file (if method == FileIngestionMethod::Flat). */
// FIXME: remove?
virtual StorePath addToStoreFromDump(Source & dump, const string & name,
FileIngestionMethod method = FileIngestionMethod::Recursive, HashType hashAlgo = htSHA256, RepairFlag repair = NoRepair)
@ -609,7 +531,7 @@ public:
that will be substituted. */
virtual void queryMissing(const std::vector<StorePathWithOutputs> & targets,
StorePathSet & willBuild, StorePathSet & willSubstitute, StorePathSet & unknown,
unsigned long long & downloadSize, unsigned long long & narSize);
uint64_t & downloadSize, uint64_t & narSize);
/* Sort a set of paths topologically under the references
relation. If p refers to q, then p precedes q in this list. */
@ -739,11 +661,13 @@ void copyStorePath(ref<Store> srcStore, ref<Store> dstStore,
/* Copy store paths from one store to another. The paths may be copied
in parallel. They are copied in a topologically sorted order
(i.e. if A is a reference of B, then A is copied before B), but
the set of store paths is not automatically closed; use
copyClosure() for that. */
void copyPaths(ref<Store> srcStore, ref<Store> dstStore, const StorePathSet & storePaths,
in parallel. They are copied in a topologically sorted order (i.e.
if A is a reference of B, then A is copied before B), but the set
of store paths is not automatically closed; use copyClosure() for
that. Returns a map from each requested store path to the path it
was copied to in dstStore. */
std::map<StorePath, StorePath> copyPaths(ref<Store> srcStore, ref<Store> dstStore,
const StorePathSet & storePaths,
RepairFlag repair = NoRepair,
CheckSigsFlag checkSigs = CheckSigs,
SubstituteFlag substitute = NoSubstitute);
@ -842,4 +766,6 @@ std::optional<ValidPathInfo> decodeValidPathInfo(
/* Split URI into protocol+hierarchy part and its parameter set. */
std::pair<std::string, Store::Params> splitUriAndParams(const std::string & uri);
std::optional<ContentAddress> getDerivationCA(const BasicDerivation & drv);
}

View file

@ -6,7 +6,7 @@ namespace nix {
#define WORKER_MAGIC_1 0x6e697863
#define WORKER_MAGIC_2 0x6478696f
#define PROTOCOL_VERSION 0x116
#define PROTOCOL_VERSION 0x117
#define GET_PROTOCOL_MAJOR(x) ((x) & 0xff00)
#define GET_PROTOCOL_MINOR(x) ((x) & 0x00ff)
@ -70,6 +70,10 @@ template<class T> T readStorePaths(const Store & store, Source & from);
void writeStorePaths(const Store & store, Sink & out, const StorePathSet & paths);
StorePathCAMap readStorePathCAMap(const Store & store, Source & from);
void writeStorePathCAMap(const Store & store, Sink & out, const StorePathCAMap & paths);
void writeOutputPathMap(const Store & store, Sink & out, const OutputPathMap & paths);
}

View file

@ -150,17 +150,17 @@ static void skipGeneric(Source & source)
static void parseContents(ParseSink & sink, Source & source, const Path & path)
{
unsigned long long size = readLongLong(source);
uint64_t size = readLongLong(source);
sink.preallocateContents(size);
unsigned long long left = size;
uint64_t left = size;
std::vector<unsigned char> buf(65536);
while (left) {
checkInterrupt();
auto n = buf.size();
if ((unsigned long long)n > left) n = left;
if ((uint64_t)n > left) n = left;
source(buf.data(), n);
sink.receiveContents(buf.data(), n);
left -= n;
@ -323,7 +323,7 @@ struct RestoreSink : ParseSink
throw SysError("fchmod");
}
void preallocateContents(unsigned long long len)
void preallocateContents(uint64_t len)
{
#if HAVE_POSIX_FALLOCATE
if (len) {
@ -338,7 +338,7 @@ struct RestoreSink : ParseSink
#endif
}
void receiveContents(unsigned char * data, unsigned int len)
void receiveContents(unsigned char * data, size_t len)
{
writeFull(fd.get(), data, len);
}

View file

@ -57,8 +57,8 @@ struct ParseSink
virtual void createRegularFile(const Path & path) { };
virtual void isExecutable() { };
virtual void preallocateContents(unsigned long long size) { };
virtual void receiveContents(unsigned char * data, unsigned int len) { };
virtual void preallocateContents(uint64_t size) { };
virtual void receiveContents(unsigned char * data, size_t len) { };
virtual void createSymlink(const Path & path, const string & target) { };
};
@ -77,7 +77,7 @@ struct RetrieveRegularNARSink : ParseSink
regular = false;
}
void receiveContents(unsigned char * data, unsigned int len)
void receiveContents(unsigned char * data, size_t len)
{
sink(data, len);
}

View file

@ -192,6 +192,7 @@ public:
MakeError(Error, BaseError);
MakeError(UsageError, Error);
MakeError(UnimplementedError, Error);
class SysError : public Error
{

View file

@ -7,6 +7,7 @@
#include "args.hh"
#include "hash.hh"
#include "archive.hh"
#include "split.hh"
#include "util.hh"
#include <sys/types.h>
@ -16,18 +17,23 @@
namespace nix {
static size_t regularHashSize(HashType type) {
switch (type) {
case htMD5: return md5HashSize;
case htSHA1: return sha1HashSize;
case htSHA256: return sha256HashSize;
case htSHA512: return sha512HashSize;
}
abort();
}
std::set<std::string> hashTypes = { "md5", "sha1", "sha256", "sha512" };
void Hash::init()
Hash::Hash(HashType type) : type(type)
{
assert(type);
switch (*type) {
case htMD5: hashSize = md5HashSize; break;
case htSHA1: hashSize = sha1HashSize; break;
case htSHA256: hashSize = sha256HashSize; break;
case htSHA512: hashSize = sha512HashSize; break;
}
hashSize = regularHashSize(type);
assert(hashSize <= maxHashSize);
memset(hash, 0, maxHashSize);
}
@ -108,17 +114,11 @@ string printHash16or32(const Hash & hash)
}
HashType assertInitHashType(const Hash & h)
{
assert(h.type);
return *h.type;
}
std::string Hash::to_string(Base base, bool includeType) const
{
std::string s;
if (base == SRI || includeType) {
s += printHashType(assertInitHashType(*this));
s += printHashType(type);
s += base == SRI ? '-' : ':';
}
switch (base) {
@ -136,63 +136,101 @@ std::string Hash::to_string(Base base, bool includeType) const
return s;
}
Hash::Hash(std::string_view s, HashType type) : Hash(s, std::optional { type }) { }
Hash::Hash(std::string_view s) : Hash(s, std::optional<HashType>{}) { }
Hash Hash::parseSRI(std::string_view original) {
auto rest = original;
Hash::Hash(std::string_view s, std::optional<HashType> type)
: type(type)
{
size_t pos = 0;
// Parse the hash type before the separator, if there was one.
auto hashRaw = splitPrefixTo(rest, '-');
if (!hashRaw)
throw BadHash("hash '%s' is not SRI", original);
HashType parsedType = parseHashType(*hashRaw);
return Hash(rest, parsedType, true);
}
// Mutates the string to eliminate the prefix, when found
static std::pair<std::optional<HashType>, bool> getParsedTypeAndSRI(std::string_view & rest) {
bool isSRI = false;
auto sep = s.find(':');
if (sep == string::npos) {
sep = s.find('-');
if (sep != string::npos) {
// Parse the hash type before the separator, if there was one.
std::optional<HashType> optParsedType;
{
auto hashRaw = splitPrefixTo(rest, ':');
if (!hashRaw) {
hashRaw = splitPrefixTo(rest, '-');
if (hashRaw)
isSRI = true;
} else if (! type)
throw BadHash("hash '%s' does not include a type", s);
}
if (hashRaw)
optParsedType = parseHashType(*hashRaw);
}
if (sep != string::npos) {
string hts = string(s, 0, sep);
this->type = parseHashType(hts);
if (!this->type)
throw BadHash("unknown hash type '%s'", hts);
if (type && type != this->type)
throw BadHash("hash '%s' should have type '%s'", s, printHashType(*type));
pos = sep + 1;
}
return {optParsedType, isSRI};
}
init();
Hash Hash::parseAnyPrefixed(std::string_view original)
{
auto rest = original;
auto [optParsedType, isSRI] = getParsedTypeAndSRI(rest);
size_t size = s.size() - pos;
// Either the string or user must provide the type, if they both do they
// must agree.
if (!optParsedType)
throw BadHash("hash '%s' does not include a type", rest);
if (!isSRI && size == base16Len()) {
return Hash(rest, *optParsedType, isSRI);
}
Hash Hash::parseAny(std::string_view original, std::optional<HashType> optType)
{
auto rest = original;
auto [optParsedType, isSRI] = getParsedTypeAndSRI(rest);
// Either the string or user must provide the type, if they both do they
// must agree.
if (!optParsedType && !optType)
throw BadHash("hash '%s' does not include a type, nor is the type otherwise known from context.", rest);
else if (optParsedType && optType && *optParsedType != *optType)
throw BadHash("hash '%s' should have type '%s'", original, printHashType(*optType));
HashType hashType = optParsedType ? *optParsedType : *optType;
return Hash(rest, hashType, isSRI);
}
Hash Hash::parseNonSRIUnprefixed(std::string_view s, HashType type)
{
return Hash(s, type, false);
}
Hash::Hash(std::string_view rest, HashType type, bool isSRI)
: Hash(type)
{
if (!isSRI && rest.size() == base16Len()) {
auto parseHexDigit = [&](char c) {
if (c >= '0' && c <= '9') return c - '0';
if (c >= 'A' && c <= 'F') return c - 'A' + 10;
if (c >= 'a' && c <= 'f') return c - 'a' + 10;
throw BadHash("invalid base-16 hash '%s'", s);
throw BadHash("invalid base-16 hash '%s'", rest);
};
for (unsigned int i = 0; i < hashSize; i++) {
hash[i] =
parseHexDigit(s[pos + i * 2]) << 4
| parseHexDigit(s[pos + i * 2 + 1]);
parseHexDigit(rest[i * 2]) << 4
| parseHexDigit(rest[i * 2 + 1]);
}
}
else if (!isSRI && size == base32Len()) {
else if (!isSRI && rest.size() == base32Len()) {
for (unsigned int n = 0; n < size; ++n) {
char c = s[pos + size - n - 1];
for (unsigned int n = 0; n < rest.size(); ++n) {
char c = rest[rest.size() - n - 1];
unsigned char digit;
for (digit = 0; digit < base32Chars.size(); ++digit) /* !!! slow */
if (base32Chars[digit] == c) break;
if (digit >= 32)
throw BadHash("invalid base-32 hash '%s'", s);
throw BadHash("invalid base-32 hash '%s'", rest);
unsigned int b = n * 5;
unsigned int i = b / 8;
unsigned int j = b % 8;
@ -202,21 +240,21 @@ Hash::Hash(std::string_view s, std::optional<HashType> type)
hash[i + 1] |= digit >> (8 - j);
} else {
if (digit >> (8 - j))
throw BadHash("invalid base-32 hash '%s'", s);
throw BadHash("invalid base-32 hash '%s'", rest);
}
}
}
else if (isSRI || size == base64Len()) {
auto d = base64Decode(s.substr(pos));
else if (isSRI || rest.size() == base64Len()) {
auto d = base64Decode(rest);
if (d.size() != hashSize)
throw BadHash("invalid %s hash '%s'", isSRI ? "SRI" : "base-64", s);
throw BadHash("invalid %s hash '%s'", isSRI ? "SRI" : "base-64", rest);
assert(hashSize);
memcpy(hash, d.data(), hashSize);
}
else
throw BadHash("hash '%s' has wrong length for hash type '%s'", s, printHashType(*type));
throw BadHash("hash '%s' has wrong length for hash type '%s'", rest, printHashType(this->type));
}
Hash newHashAllowEmpty(std::string hashStr, std::optional<HashType> ht)
@ -228,7 +266,7 @@ Hash newHashAllowEmpty(std::string hashStr, std::optional<HashType> ht)
warn("found empty hash, assuming '%s'", h.to_string(SRI, true));
return h;
} else
return Hash(hashStr, ht);
return Hash::parseAny(hashStr, ht);
}
@ -269,7 +307,7 @@ static void finish(HashType ht, Ctx & ctx, unsigned char * hash)
}
Hash hashString(HashType ht, const string & s)
Hash hashString(HashType ht, std::string_view s)
{
Ctx ctx;
Hash hash(ht);
@ -336,7 +374,7 @@ HashResult hashPath(
Hash compressHash(const Hash & hash, unsigned int newSize)
{
Hash h;
Hash h(hash.type);
h.hashSize = newSize;
for (unsigned int i = 0; i < hash.hashSize; ++i)
h.hash[i % newSize] ^= hash.hash[i];
@ -344,7 +382,7 @@ Hash compressHash(const Hash & hash, unsigned int newSize)
}
std::optional<HashType> parseHashTypeOpt(const string & s)
std::optional<HashType> parseHashTypeOpt(std::string_view s)
{
if (s == "md5") return htMD5;
else if (s == "sha1") return htSHA1;
@ -353,7 +391,7 @@ std::optional<HashType> parseHashTypeOpt(const string & s)
else return std::optional<HashType> {};
}
HashType parseHashType(const string & s)
HashType parseHashType(std::string_view s)
{
auto opt_h = parseHashTypeOpt(s);
if (opt_h)

View file

@ -27,31 +27,38 @@ enum Base : int { Base64, Base32, Base16, SRI };
struct Hash
{
static const unsigned int maxHashSize = 64;
unsigned int hashSize = 0;
unsigned char hash[maxHashSize] = {};
constexpr static size_t maxHashSize = 64;
size_t hashSize = 0;
uint8_t hash[maxHashSize] = {};
std::optional<HashType> type = {};
/* Create an unset hash object. */
Hash() { };
HashType type;
/* Create a zero-filled hash object. */
Hash(HashType type) : type(type) { init(); };
Hash(HashType type);
/* Initialize the hash from a string representation, in the format
/* Parse the hash from a string representation in the format
"[<type>:]<base16|base32|base64>" or "<type>-<base64>" (a
Subresource Integrity hash expression). If the 'type' argument
is not present, then the hash type must be specified in the
string. */
Hash(std::string_view s, std::optional<HashType> type);
// type must be provided
Hash(std::string_view s, HashType type);
// hash type must be part of string
Hash(std::string_view s);
static Hash parseAny(std::string_view s, std::optional<HashType> type);
void init();
/* Parse a hash from a string representation like the above, except the
type prefix is mandatory, as there is no separate type argument. */
static Hash parseAnyPrefixed(std::string_view s);
/* Parse a plain hash that must not have any prefix indicating the type.
The type is passed in to disambiguate. */
static Hash parseNonSRIUnprefixed(std::string_view s, HashType type);
static Hash parseSRI(std::string_view original);
private:
/* The type must be provided, the string view must not include <type>
prefix. `isSRI` helps disambiguate the various base-* encodings. */
Hash(std::string_view s, HashType type, bool isSRI);
public:
/* Check whether a hash is set. */
operator bool () const { return (bool) type; }
@ -107,14 +114,14 @@ Hash newHashAllowEmpty(std::string hashStr, std::optional<HashType> ht);
string printHash16or32(const Hash & hash);
/* Compute the hash of the given string. */
Hash hashString(HashType ht, const string & s);
Hash hashString(HashType ht, std::string_view s);
/* Compute the hash of the given file. */
Hash hashFile(HashType ht, const Path & path);
/* Compute the hash of the given path. The hash is defined as
(essentially) hashString(ht, dumpPath(path)). */
typedef std::pair<Hash, unsigned long long> HashResult;
typedef std::pair<Hash, uint64_t> HashResult;
HashResult hashPath(HashType ht, const Path & path,
PathFilter & filter = defaultPathFilter);
@ -123,10 +130,10 @@ HashResult hashPath(HashType ht, const Path & path,
Hash compressHash(const Hash & hash, unsigned int newSize);
/* Parse a string representing a hash type. */
HashType parseHashType(const string & s);
HashType parseHashType(std::string_view s);
/* Will return nothing on parse error */
std::optional<HashType> parseHashTypeOpt(const string & s);
std::optional<HashType> parseHashTypeOpt(std::string_view s);
/* And the reverse. */
string printHashType(HashType ht);
@ -144,7 +151,7 @@ class HashSink : public BufferedSink, public AbstractHashSink
private:
HashType ht;
Ctx * ctx;
unsigned long long bytes;
uint64_t bytes;
public:
HashSink(HashType ht);
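
For reference, a hedged sketch of the new parsing entry points, using the SHA-256 of the empty string as the example value:

#include <cassert>
#include "hash.hh"

using namespace nix;

void hashParseExamples()
{
    // SHA-256 of the empty string, base-16 form.
    std::string_view hex = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855";

    // Type prefix mandatory, since there is no separate type argument:
    Hash a = Hash::parseAnyPrefixed("sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855");

    // Type known from context; a prefix is optional but must agree if present:
    Hash b = Hash::parseAny(hex, htSHA256);

    // Plain hash: no prefix allowed, the type is passed in:
    Hash c = Hash::parseNonSRIUnprefixed(hex, htSHA256);

    // Subresource Integrity form, "<type>-<base64>":
    Hash d = Hash::parseSRI("sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=");

    assert(a == b && b == c && c == d);
}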

View file

@ -312,14 +312,14 @@ T readNum(Source & source)
source(buf, sizeof(buf));
uint64_t n =
((unsigned long long) buf[0]) |
((unsigned long long) buf[1] << 8) |
((unsigned long long) buf[2] << 16) |
((unsigned long long) buf[3] << 24) |
((unsigned long long) buf[4] << 32) |
((unsigned long long) buf[5] << 40) |
((unsigned long long) buf[6] << 48) |
((unsigned long long) buf[7] << 56);
((uint64_t) buf[0]) |
((uint64_t) buf[1] << 8) |
((uint64_t) buf[2] << 16) |
((uint64_t) buf[3] << 24) |
((uint64_t) buf[4] << 32) |
((uint64_t) buf[5] << 40) |
((uint64_t) buf[6] << 48) |
((uint64_t) buf[7] << 56);
if (n > std::numeric_limits<T>::max())
throw SerialisationError("serialised integer %d is too large for type '%s'", n, typeid(T).name());

src/libutil/split.hh Normal file
View file

@ -0,0 +1,33 @@
#pragma once
#include <optional>
#include <string_view>
#include "util.hh"
namespace nix {
// If `separator` is found, we return the portion of the string before the
// separator, and modify the string argument to contain only the part after the
// separator. Otherwise, we return `std::nullopt`, and we leave the argument
// string alone.
static inline std::optional<std::string_view> splitPrefixTo(std::string_view & string, char separator) {
auto sepInstance = string.find(separator);
if (sepInstance != std::string_view::npos) {
auto prefix = string.substr(0, sepInstance);
string.remove_prefix(sepInstance+1);
return prefix;
}
return std::nullopt;
}
static inline bool splitPrefix(std::string_view & string, std::string_view prefix) {
bool res = hasPrefix(string, prefix);
if (res)
string.remove_prefix(prefix.length());
return res;
}
}
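
A small usage sketch of the helpers above (the values are arbitrary):

#include <cassert>
#include "split.hh"

void splitExamples()
{
    std::string_view s = "sha256:e3b0c442";
    auto prefix = nix::splitPrefixTo(s, ':');
    assert(prefix && *prefix == "sha256");   // the part before the separator
    assert(s == "e3b0c442");                 // the argument now holds the rest

    std::string_view t = "no separator here";
    assert(!nix::splitPrefixTo(t, ':'));     // nullopt...
    assert(t == "no separator here");        // ...and the argument is untouched

    std::string_view u = "nix-cache-info";
    assert(nix::splitPrefix(u, "nix-"));     // prefix found and removed
    assert(u == "cache-info");
}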

src/libutil/topo-sort.hh Normal file
View file

@ -0,0 +1,40 @@
#include "error.hh"
namespace nix {
template<typename T>
std::vector<T> topoSort(std::set<T> items,
std::function<std::set<T>(const T &)> getChildren,
std::function<Error(const T &, const T &)> makeCycleError)
{
std::vector<T> sorted;
std::set<T> visited, parents;
std::function<void(const T & path, const T * parent)> dfsVisit;
dfsVisit = [&](const T & path, const T * parent) {
if (parents.count(path)) throw makeCycleError(path, *parent);
if (!visited.insert(path).second) return;
parents.insert(path);
std::set<T> references = getChildren(path);
for (auto & i : references)
/* Don't traverse into items that don't exist in our starting set. */
if (i != path && items.count(i))
dfsVisit(i, &path);
sorted.push_back(path);
parents.erase(path);
};
for (auto & i : items)
dfsVisit(i, nullptr);
std::reverse(sorted.begin(), sorted.end());
return sorted;
}
}
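
An illustrative use of the generic topoSort above, with plain integers standing in for store paths (1 depends on 2 and 3, 2 depends on 3):

#include <cassert>
#include <map>
#include <set>
#include <vector>
#include "topo-sort.hh"

void topoSortExample()
{
    std::map<int, std::set<int>> deps = {{1, {2, 3}}, {2, {3}}, {3, {}}};

    auto sorted = nix::topoSort<int>(
        {1, 2, 3},
        [&](const int & i) { return deps[i]; },
        [](const int &, const int &) { return nix::Error("cycle detected"); });

    // A referrer precedes everything it refers to: 1, 2, 3.
    assert((sorted == std::vector<int>{1, 2, 3}));
}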

View file

@ -374,7 +374,7 @@ void writeLine(int fd, string s)
}
static void _deletePath(int parentfd, const Path & path, unsigned long long & bytesFreed)
static void _deletePath(int parentfd, const Path & path, uint64_t & bytesFreed)
{
checkInterrupt();
@ -414,7 +414,7 @@ static void _deletePath(int parentfd, const Path & path, unsigned long long & by
}
}
static void _deletePath(const Path & path, unsigned long long & bytesFreed)
static void _deletePath(const Path & path, uint64_t & bytesFreed)
{
Path dir = dirOf(path);
if (dir == "")
@ -435,12 +435,12 @@ static void _deletePath(const Path & path, unsigned long long & bytesFreed)
void deletePath(const Path & path)
{
unsigned long long dummy;
uint64_t dummy;
deletePath(path, dummy);
}
void deletePath(const Path & path, unsigned long long & bytesFreed)
void deletePath(const Path & path, uint64_t & bytesFreed)
{
//Activity act(*logger, lvlDebug, format("recursively deleting path '%1%'") % path);
bytesFreed = 0;
@ -494,6 +494,7 @@ std::pair<AutoCloseFD, Path> createTempFile(const Path & prefix)
{
Path tmpl(getEnv("TMPDIR").value_or("/tmp") + "/" + prefix + ".XXXXXX");
// Strictly speaking, this is UB, but who cares...
// FIXME: use O_TMPFILE.
AutoCloseFD fd(mkstemp((char *) tmpl.c_str()));
if (!fd)
throw SysError("creating temporary file '%s'", tmpl);
@ -1449,7 +1450,7 @@ string base64Decode(std::string_view s)
char digit = decode[(unsigned char) c];
if (digit == -1)
throw Error("invalid character in Base64 string");
throw Error("invalid character in Base64 string: '%c'", c);
bits += 6;
d = d << 6 | digit;
@ -1581,7 +1582,7 @@ AutoCloseFD createUnixDomainSocket(const Path & path, mode_t mode)
struct sockaddr_un addr;
addr.sun_family = AF_UNIX;
if (path.size() >= sizeof(addr.sun_path))
if (path.size() + 1 >= sizeof(addr.sun_path))
throw Error("socket path '%1%' is too long", path);
strcpy(addr.sun_path, path.c_str());

View file

@ -125,7 +125,7 @@ void writeLine(int fd, string s);
second variant returns the number of bytes and blocks freed. */
void deletePath(const Path & path);
void deletePath(const Path & path, unsigned long long & bytesFreed);
void deletePath(const Path & path, uint64_t & bytesFreed);
std::string getUserName();

View file

@ -174,7 +174,7 @@ static void _main(int argc, char * * argv)
else if (*arg == "--run-env") // obsolete
runEnv = true;
else if (*arg == "--command" || *arg == "--run") {
else if (runEnv && (*arg == "--command" || *arg == "--run")) {
if (*arg == "--run")
interactive = false;
envCommand = getArg(*arg, arg, end) + "\nexit";
@ -192,7 +192,7 @@ static void _main(int argc, char * * argv)
else if (*arg == "--pure") pure = true;
else if (*arg == "--impure") pure = false;
else if (*arg == "--packages" || *arg == "-p")
else if (runEnv && (*arg == "--packages" || *arg == "-p"))
packages = true;
else if (inShebang && *arg == "-i") {
@ -325,7 +325,7 @@ static void _main(int argc, char * * argv)
auto buildPaths = [&](const std::vector<StorePathWithOutputs> & paths) {
/* Note: we do this even when !printMissing to efficiently
fetch binary cache data. */
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
StorePathSet willBuild, willSubstitute, unknown;
store->queryMissing(paths,
willBuild, willSubstitute, unknown, downloadSize, narSize);

View file

@ -67,10 +67,8 @@ static int _main(int argc, char * * argv)
deleteOlderThan = getArg(*arg, arg, end);
}
else if (*arg == "--dry-run") dryRun = true;
else if (*arg == "--max-freed") {
long long maxFreed = getIntArg<long long>(*arg, arg, end, true);
options.maxFreed = maxFreed >= 0 ? maxFreed : 0;
}
else if (*arg == "--max-freed")
options.maxFreed = std::max(getIntArg<int64_t>(*arg, arg, end, true), (int64_t) 0);
else
return false;
return true;

View file

@ -154,10 +154,10 @@ static int _main(int argc, char * * argv)
/* If an expected hash is given, the file may already exist in
the store. */
std::optional<Hash> expectedHash;
Hash hash;
Hash hash(ht);
std::optional<StorePath> storePath;
if (args.size() == 2) {
expectedHash = Hash(args[1], ht);
expectedHash = Hash::parseAny(args[1], ht);
const auto recursive = unpack ? FileIngestionMethod::Recursive : FileIngestionMethod::Flat;
storePath = store->makeFixedOutputPath(recursive, *expectedHash, name);
if (store->isValidPath(*storePath))

View file

@ -77,7 +77,7 @@ static PathSet realisePath(StorePathWithOutputs path, bool build = true)
if (i == drv.outputs.end())
throw Error("derivation '%s' does not have an output named '%s'",
store2->printStorePath(path.path), j);
auto outPath = store2->printStorePath(i->second.path);
auto outPath = store2->printStorePath(i->second.path(*store, drv.name));
if (store2) {
if (gcRoot == "")
printGCWarning();
@ -130,7 +130,7 @@ static void opRealise(Strings opFlags, Strings opArgs)
for (auto & i : opArgs)
paths.push_back(store->followLinksToStorePathWithOutputs(i));
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
StorePathSet willBuild, willSubstitute, unknown;
store->queryMissing(paths, willBuild, willSubstitute, unknown, downloadSize, narSize);
@ -208,7 +208,7 @@ static void opPrintFixedPath(Strings opFlags, Strings opArgs)
string hash = *i++;
string name = *i++;
cout << fmt("%s\n", store->printStorePath(store->makeFixedOutputPath(recursive, Hash(hash, hashAlgo), name)));
cout << fmt("%s\n", store->printStorePath(store->makeFixedOutputPath(recursive, Hash::parseAny(hash, hashAlgo), name)));
}
@ -219,7 +219,7 @@ static StorePathSet maybeUseOutputs(const StorePath & storePath, bool useOutput,
auto drv = store->derivationFromPath(storePath);
StorePathSet outputs;
for (auto & i : drv.outputs)
outputs.insert(i.second.path);
outputs.insert(i.second.path(*store, drv.name));
return outputs;
}
else return {storePath};
@ -313,7 +313,7 @@ static void opQuery(Strings opFlags, Strings opArgs)
if (forceRealise) realisePath({i2});
Derivation drv = store->derivationFromPath(i2);
for (auto & j : drv.outputs)
cout << fmt("%1%\n", store->printStorePath(j.second.path));
cout << fmt("%1%\n", store->printStorePath(j.second.path(*store, drv.name)));
}
break;
}
@ -372,8 +372,8 @@ static void opQuery(Strings opFlags, Strings opArgs)
for (auto & j : maybeUseOutputs(store->followLinksToStorePath(i), useOutput, forceRealise)) {
auto info = store->queryPathInfo(j);
if (query == qHash) {
assert(info->narHash.type == htSHA256);
cout << fmt("%s\n", info->narHash.to_string(Base32, true));
assert(info->narHash && info->narHash->type == htSHA256);
cout << fmt("%s\n", info->narHash->to_string(Base32, true));
} else if (query == qSize)
cout << fmt("%d\n", info->narSize);
}
@ -572,10 +572,8 @@ static void opGC(Strings opFlags, Strings opArgs)
if (*i == "--print-roots") printRoots = true;
else if (*i == "--print-live") options.action = GCOptions::gcReturnLive;
else if (*i == "--print-dead") options.action = GCOptions::gcReturnDead;
else if (*i == "--max-freed") {
long long maxFreed = getIntArg<long long>(*i, i, opFlags.end(), true);
options.maxFreed = maxFreed >= 0 ? maxFreed : 0;
}
else if (*i == "--max-freed")
options.maxFreed = std::max(getIntArg<int64_t>(*i, i, opFlags.end(), true), (int64_t) 0);
else throw UsageError("bad sub-operation '%1%' in GC", *i);
if (!opArgs.empty()) throw UsageError("no arguments expected");
@ -725,7 +723,7 @@ static void opVerifyPath(Strings opFlags, Strings opArgs)
auto path = store->followLinksToStorePath(i);
printMsg(lvlTalkative, "checking path '%s'...", store->printStorePath(path));
auto info = store->queryPathInfo(path);
HashSink sink(*info->narHash.type);
HashSink sink(info->narHash->type);
store->narFromPath(path, sink);
auto current = sink.finish();
if (current.first != info->narHash) {
@ -734,7 +732,7 @@ static void opVerifyPath(Strings opFlags, Strings opArgs)
.hint = hintfmt(
"path '%s' was modified! expected hash '%s', got '%s'",
store->printStorePath(path),
info->narHash.to_string(Base32, true),
info->narHash->to_string(Base32, true),
current.first.to_string(Base32, true))
});
status = 1;
@ -831,7 +829,7 @@ static void opServe(Strings opFlags, Strings opArgs)
for (auto & path : paths)
if (!path.isDerivation())
paths2.push_back({path});
unsigned long long downloadSize, narSize;
uint64_t downloadSize, narSize;
StorePathSet willBuild, willSubstitute, unknown;
store->queryMissing(paths2,
willBuild, willSubstitute, unknown, downloadSize, narSize);
@ -864,7 +862,9 @@ static void opServe(Strings opFlags, Strings opArgs)
out << info->narSize // downloadSize
<< info->narSize;
if (GET_PROTOCOL_MINOR(clientVersion) >= 4)
out << (info->narHash ? info->narHash.to_string(Base32, true) : "") << renderContentAddress(info->ca) << info->sigs;
out << (info->narHash ? info->narHash->to_string(Base32, true) : "")
<< renderContentAddress(info->ca)
<< info->sigs;
} catch (InvalidPath &) {
}
}
@ -914,9 +914,9 @@ static void opServe(Strings opFlags, Strings opArgs)
if (!writeAllowed) throw Error("building paths is not allowed");
auto drvPath = store->parseStorePath(readString(in)); // informational only
auto drvPath = store->parseStorePath(readString(in));
BasicDerivation drv;
readDerivation(in, *store, drv);
readDerivation(in, *store, drv, Derivation::nameFromPath(drvPath));
getBuildSettings();
@ -948,7 +948,7 @@ static void opServe(Strings opFlags, Strings opArgs)
auto deriver = readString(in);
if (deriver != "")
info.deriver = store->parseStorePath(deriver);
info.narHash = Hash(readString(in), htSHA256);
info.narHash = Hash::parseAny(readString(in), htSHA256);
info.references = readStorePaths<StorePathSet>(*store, in);
in >> info.registrationTime >> info.narSize >> info.ultimate;
info.sigs = readStrings<StringSet>(in);

View file

@ -9,6 +9,7 @@ struct CmdAddToStore : MixDryRun, StoreCommand
{
Path path;
std::optional<std::string> namePart;
FileIngestionMethod ingestionMethod = FileIngestionMethod::Recursive;
CmdAddToStore()
{
@ -21,6 +22,13 @@ struct CmdAddToStore : MixDryRun, StoreCommand
.labels = {"name"},
.handler = {&namePart},
});
addFlag({
.longName = "flat",
.shortName = 0,
.description = "add flat file to the Nix store",
.handler = {&ingestionMethod, FileIngestionMethod::Flat},
});
}
std::string description() override
@ -45,12 +53,19 @@ struct CmdAddToStore : MixDryRun, StoreCommand
auto narHash = hashString(htSHA256, *sink.s);
ValidPathInfo info(store->makeFixedOutputPath(FileIngestionMethod::Recursive, narHash, *namePart));
Hash hash = narHash;
if (ingestionMethod == FileIngestionMethod::Flat) {
HashSink hsink(htSHA256);
readFile(path, hsink);
hash = hsink.finish().first;
}
ValidPathInfo info(store->makeFixedOutputPath(ingestionMethod, hash, *namePart));
info.narHash = narHash;
info.narSize = sink.s->size();
info.ca = std::optional { FixedOutputHash {
.method = FileIngestionMethod::Recursive,
.hash = info.narHash,
.method = ingestionMethod,
.hash = hash,
} };
if (!dryRun) {
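
For context, the flat hash computed above for --flat is simply the hash of the file's contents, whereas the recursive hash is the hash of the path's NAR serialisation; a hedged equivalence sketch (flatHash and recursiveHash are illustrative names):

#include "hash.hh"

nix::Hash flatHash(const nix::Path & path)
{
    return nix::hashFile(nix::htSHA256, path);       // hash of the raw contents
}

nix::Hash recursiveHash(const nix::Path & path)
{
    return nix::hashPath(nix::htSHA256, path).first; // hash of dumpPath(path)
}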

View file

@ -9,6 +9,7 @@ using namespace nix;
struct CmdBuild : InstallablesCommand, MixDryRun, MixProfile
{
Path outLink = "result";
BuildMode buildMode = bmNormal;
CmdBuild()
{
@ -26,6 +27,12 @@ struct CmdBuild : InstallablesCommand, MixDryRun, MixProfile
.description = "do not create a symlink to the build result",
.handler = {&outLink, Path("")},
});
addFlag({
.longName = "rebuild",
.description = "rebuild an already built package and compare the result to the existing store paths",
.handler = {&buildMode, bmCheck},
});
}
std::string description() override
@ -53,7 +60,7 @@ struct CmdBuild : InstallablesCommand, MixDryRun, MixProfile
void run(ref<Store> store) override
{
auto buildables = build(store, dryRun ? Realise::Nothing : Realise::Outputs, installables);
auto buildables = build(store, dryRun ? Realise::Nothing : Realise::Outputs, installables, buildMode);
if (dryRun) return;

src/nix/bundle.cc Normal file
View file

@ -0,0 +1,129 @@
#include "command.hh"
#include "common-args.hh"
#include "shared.hh"
#include "store-api.hh"
#include "fs-accessor.hh"
using namespace nix;
struct CmdBundle : InstallableCommand
{
std::string bundler = "github:matthewbauer/nix-bundle";
std::optional<Path> outLink;
CmdBundle()
{
addFlag({
.longName = "bundler",
.description = "use custom bundler",
.labels = {"flake-url"},
.handler = {&bundler},
.completer = {[&](size_t, std::string_view prefix) {
completeFlakeRef(getStore(), prefix);
}}
});
addFlag({
.longName = "out-link",
.shortName = 'o',
.description = "path of the symlink to the build result",
.labels = {"path"},
.handler = {&outLink},
.completer = completePath
});
}
std::string description() override
{
return "bundle an application so that it works outside of the Nix store";
}
Examples examples() override
{
return {
Example{
"To bundle Hello:",
"nix bundle hello"
},
};
}
Category category() override { return catSecondary; }
Strings getDefaultFlakeAttrPaths() override
{
Strings res{"defaultApp." + settings.thisSystem.get()};
for (auto & s : SourceExprCommand::getDefaultFlakeAttrPaths())
res.push_back(s);
return res;
}
Strings getDefaultFlakeAttrPathPrefixes() override
{
Strings res{"apps." + settings.thisSystem.get() + ".", "packages"};
for (auto & s : SourceExprCommand::getDefaultFlakeAttrPathPrefixes())
res.push_back(s);
return res;
}
void run(ref<Store> store) override
{
auto evalState = getEvalState();
auto app = installable->toApp(*evalState);
store->buildPaths(app.context);
auto [bundlerFlakeRef, bundlerName] = parseFlakeRefWithFragment(bundler, absPath("."));
const flake::LockFlags lockFlags{ .writeLockFile = false };
auto bundler = InstallableFlake(
evalState, std::move(bundlerFlakeRef),
Strings{bundlerName == "" ? "defaultBundler" : bundlerName},
Strings({"bundlers."}), lockFlags);
Value * arg = evalState->allocValue();
evalState->mkAttrs(*arg, 2);
PathSet context;
for (auto & i : app.context)
context.insert("=" + store->printStorePath(i.path));
mkString(*evalState->allocAttr(*arg, evalState->symbols.create("program")), app.program, context);
mkString(*evalState->allocAttr(*arg, evalState->symbols.create("system")), settings.thisSystem.get());
arg->attrs->sort();
auto vRes = evalState->allocValue();
evalState->callFunction(*bundler.toValue(*evalState).first, *arg, *vRes, noPos);
if (!evalState->isDerivation(*vRes))
throw Error("the bundler '%s' does not produce a derivation", bundler.what());
auto attr1 = vRes->attrs->find(evalState->sDrvPath);
if (!attr1)
throw Error("the bundler '%s' does not produce a derivation", bundler.what());
PathSet context2;
StorePath drvPath = store->parseStorePath(evalState->coerceToPath(*attr1->pos, *attr1->value, context2));
auto attr2 = vRes->attrs->find(evalState->sOutPath);
if (!attr2)
throw Error("the bundler '%s' does not produce a derivation", bundler.what());
StorePath outPath = store->parseStorePath(evalState->coerceToPath(*attr2->pos, *attr2->value, context2));
store->buildPaths({{drvPath}});
auto outPathS = store->printStorePath(outPath);
auto info = store->queryPathInfo(outPath);
if (!info->references.empty())
throw Error("'%s' has references; a bundler must not leave any references", outPathS);
if (!outLink)
outLink = baseNameOf(app.program);
store.dynamic_pointer_cast<LocalFSStore>()->addPermRoot(outPath, absPath(*outLink), true);
}
};
static auto r2 = registerCommand<CmdBundle>("bundle");

View file

@ -5,6 +5,7 @@
#include "common-eval-args.hh"
#include "path.hh"
#include "flake/lockfile.hh"
#include "store-api.hh"
#include <optional>
@ -185,7 +186,7 @@ static RegisterCommand registerCommand(const std::string & name)
}
Buildables build(ref<Store> store, Realise mode,
std::vector<std::shared_ptr<Installable>> installables);
std::vector<std::shared_ptr<Installable>> installables, BuildMode bMode = bmNormal);
std::set<StorePath> toStorePaths(ref<Store> store,
Realise mode, OperateOn operateOn,

View file

@ -68,22 +68,22 @@ BuildEnvironment readEnvironment(const Path & path)
std::smatch match;
if (std::regex_search(pos, file.cend(), match, declareRegex)) {
if (std::regex_search(pos, file.cend(), match, declareRegex, std::regex_constants::match_continuous)) {
pos = match[0].second;
exported.insert(match[1]);
}
else if (std::regex_search(pos, file.cend(), match, varRegex)) {
else if (std::regex_search(pos, file.cend(), match, varRegex, std::regex_constants::match_continuous)) {
pos = match[0].second;
res.env.insert({match[1], Var { .exported = exported.count(match[1]) > 0, .value = match[2] }});
}
else if (std::regex_search(pos, file.cend(), match, assocArrayRegex)) {
else if (std::regex_search(pos, file.cend(), match, assocArrayRegex, std::regex_constants::match_continuous)) {
pos = match[0].second;
res.env.insert({match[1], Var { .associative = true, .value = match[2] }});
}
else if (std::regex_search(pos, file.cend(), match, functionRegex)) {
else if (std::regex_search(pos, file.cend(), match, functionRegex, std::regex_constants::match_continuous)) {
res.bashFunctions = std::string(pos, file.cend());
break;
}
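
The match_continuous flag is what makes the parse stop at the current position instead of silently skipping unrecognized input. A small standalone illustration with a simplified pattern (not the actual declareRegex used here):

#include <cassert>
#include <regex>
#include <string>

void matchContinuousExample()
{
    std::string s = "junk declare -x FOO";
    std::smatch m;
    std::regex declareLike("declare -x (\\w+)");

    // An unanchored search skips over "junk " and still matches...
    assert(std::regex_search(s, m, declareLike));

    // ...while match_continuous requires the match to start at the current
    // position, so the same input is rejected.
    assert(!std::regex_search(s.cbegin(), s.cend(), m, declareLike,
        std::regex_constants::match_continuous));
}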
@ -130,14 +130,16 @@ StorePath getDerivationEnvironment(ref<Store> store, const StorePath & drvPath)
drvName += "-env";
for (auto & output : drv.outputs)
drv.env.erase(output.first);
drv.outputs = {{"out", DerivationOutput { .path = StorePath::dummy }}};
drv.outputs = {{"out", DerivationOutput { .output = DerivationOutputInputAddressed { .path = StorePath::dummy }}}};
drv.env["out"] = "";
drv.env["_outputs_saved"] = drv.env["outputs"];
drv.env["outputs"] = "out";
drv.inputSrcs.insert(std::move(getEnvShPath));
Hash h = hashDerivationModulo(*store, drv, true);
Hash h = std::get<0>(hashDerivationModulo(*store, drv, true));
auto shellOutPath = store->makeOutputPath("out", h, drvName);
drv.outputs.insert_or_assign("out", DerivationOutput { .path = shellOutPath });
drv.outputs.insert_or_assign("out", DerivationOutput { .output = DerivationOutputInputAddressed {
.path = shellOutPath
} });
drv.env["out"] = store->printStorePath(shellOutPath);
auto shellDrvPath2 = writeDerivation(store, drv, drvName);
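In this hunk the output's single `path` field is replaced by an `output` member initialized from one concrete alternative (here DerivationOutputInputAddressed), and the result of hashDerivationModulo is now unpacked with std::get<0>. A toy sketch of that shape, assuming the `output` member is variant-like; the types below are simplified stand-ins, not Nix's definitions:

    #include <cassert>
    #include <map>
    #include <string>
    #include <variant>

    // Simplified stand-ins, not Nix's real output types.
    struct OutputInputAddressed { std::string path; };
    struct OutputCAFloating    { std::string hashAlgo; };

    struct Output {
        std::variant<OutputInputAddressed, OutputCAFloating> output;
    };

    int main()
    {
        // Nested designated initializers (C++20, also accepted as an extension
        // by GCC/Clang): the inner braces build one concrete alternative, which
        // then initializes the variant member 'output'.
        std::map<std::string, Output> outputs =
            {{"out", Output { .output = OutputInputAddressed { .path = "/nix/store/dummy" } }}};

        // Reading it back, either by alternative type or by index.
        assert(std::get<OutputInputAddressed>(outputs.at("out").output).path == "/nix/store/dummy");
        assert(std::get<0>(outputs.at("out").output).path == "/nix/store/dummy");
    }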

View file

@@ -368,6 +368,21 @@ struct CmdFlakeCheck : FlakeCommand
}
};
auto checkBundler = [&](const std::string & attrPath, Value & v, const Pos & pos) {
try {
state->forceValue(v, pos);
if (v.type != tLambda)
throw Error("bundler must be a function");
if (!v.lambda.fun->formals ||
v.lambda.fun->formals->argNames.find(state->symbols.create("program")) == v.lambda.fun->formals->argNames.end() ||
v.lambda.fun->formals->argNames.find(state->symbols.create("system")) == v.lambda.fun->formals->argNames.end())
throw Error("bundler must take formal arguments 'program' and 'system'");
} catch (Error & e) {
e.addTrace(pos, hintfmt("while checking the bundler '%s'", attrPath));
throw;
}
};
{
Activity act(*logger, lvlInfo, actUnknown, "evaluating flake");
@@ -490,6 +505,16 @@ struct CmdFlakeCheck : FlakeCommand
*attr.value, *attr.pos);
}
else if (name == "defaultBundler")
checkBundler(name, vOutput, pos);
else if (name == "bundlers") {
state->forceAttrs(vOutput, pos);
for (auto & attr : *vOutput.attrs)
checkBundler(fmt("%s.%s", name, attr.name),
*attr.value, *attr.pos);
}
else
warn("unknown flake output '%s'", name);

View file

@@ -107,7 +107,7 @@ struct CmdToBase : Command
void run() override
{
for (auto s : args)
logger->stdout(Hash(s, ht).to_string(base, base == SRI));
logger->stdout(Hash::parseAny(s, ht).to_string(base, base == SRI));
}
};
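Replacing the parsing constructor `Hash(s, ht)` with `Hash::parseAny(s, ht)` follows the named-constructor (static factory) idiom: the call site now states explicitly that a string is being parsed and that parsing may fail. A generic sketch of the idiom, using an invented Digest type rather than Nix's Hash:

    #include <stdexcept>
    #include <string>
    #include <utility>

    // Illustrative type only.
    class Digest {
        std::string rep;
        explicit Digest(std::string s) : rep(std::move(s)) {}
    public:
        // Named constructor: Digest::parseAny(...) reads as "parse this",
        // which is clearer than an overloaded constructor call.
        static Digest parseAny(const std::string & s)
        {
            if (s.empty()) throw std::invalid_argument("empty digest");
            return Digest(s);
        }
        const std::string & toString() const { return rep; }
    };

    int main()
    {
        auto d = Digest::parseAny("sha256-abc");
        return d.toString().empty();
    }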

View file

@@ -276,7 +276,7 @@ std::vector<std::pair<std::shared_ptr<eval_cache::AttrCursor>, std::string>>
Installable::getCursors(EvalState & state, bool useEvalCache)
{
auto evalCache =
std::make_shared<nix::eval_cache::EvalCache>(false, Hash(), state,
std::make_shared<nix::eval_cache::EvalCache>(std::nullopt, state,
[&]() { return toValue(state).first; });
return {{evalCache->getRoot(), ""}};
}
@@ -304,8 +304,9 @@ struct InstallableStorePath : Installable
{
if (storePath.isDerivation()) {
std::map<std::string, StorePath> outputs;
for (auto & [name, output] : store->readDerivation(storePath).outputs)
outputs.emplace(name, output.path);
auto drv = store->readDerivation(storePath);
for (auto & [name, output] : drv.outputs)
outputs.emplace(name, output.path(*store, drv.name));
return {
BuildableFromDrv {
.drvPath = storePath,
@@ -422,9 +423,11 @@ ref<eval_cache::EvalCache> openEvalCache(
std::shared_ptr<flake::LockedFlake> lockedFlake,
bool useEvalCache)
{
return ref(std::make_shared<nix::eval_cache::EvalCache>(
useEvalCache && evalSettings.pureEval,
lockedFlake->getFingerprint(),
auto fingerprint = lockedFlake->getFingerprint();
return make_ref<nix::eval_cache::EvalCache>(
useEvalCache && evalSettings.pureEval
? std::optional { std::cref(fingerprint) }
: std::nullopt,
state,
[&state, lockedFlake]()
{
@@ -442,7 +445,7 @@ ref<eval_cache::EvalCache> openEvalCache(
assert(aOutputs);
return aOutputs->value;
}));
});
}
static std::string showAttrPaths(const std::vector<std::string> & paths)
@@ -628,7 +631,7 @@ std::shared_ptr<Installable> SourceExprCommand::parseInstallable(
}
Buildables build(ref<Store> store, Realise mode,
std::vector<std::shared_ptr<Installable>> installables)
std::vector<std::shared_ptr<Installable>> installables, BuildMode bMode)
{
if (mode == Realise::Nothing)
settings.readOnlyMode = true;
@@ -657,7 +660,7 @@ Buildables build(ref<Store> store, Realise mode,
if (mode == Realise::Nothing)
printMissing(store, pathsToBuild, lvlError);
else if (mode == Realise::Outputs)
store->buildPaths(pathsToBuild);
store->buildPaths(pathsToBuild, bMode);
return buildables;
}
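openEvalCache now passes the fingerprint as either `std::optional { std::cref(fingerprint) }` or `std::nullopt`, i.e. an optional reference expressed as `std::optional<std::reference_wrapper<const ...>>`, since std::optional cannot hold a plain reference. A small self-contained sketch of the pattern, with an invented string fingerprint:

    #include <cassert>
    #include <functional>
    #include <optional>
    #include <string>

    // "Maybe a reference" expressed as an optional reference_wrapper.
    using OptFingerprint = std::optional<std::reference_wrapper<const std::string>>;

    static std::string describe(OptFingerprint fp)
    {
        return fp ? "keyed by " + fp->get() : "unkeyed";
    }

    int main()
    {
        std::string fingerprint = "sha256-abc";

        // std::optional{std::cref(x)} deduces optional<reference_wrapper<const T>>.
        assert(describe(std::optional { std::cref(fingerprint) }) == "keyed by sha256-abc");
        assert(describe(std::nullopt) == "unkeyed");
    }

The usual caveat applies: the wrapper only borrows the referenced object, so whatever stores it must not outlive the fingerprint it refers to.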

View file

@@ -84,7 +84,7 @@ struct CmdMakeContentAddressable : StorePathsCommand, MixJSON
info.narSize = sink.s->size();
info.ca = FixedOutputHash {
.method = FileIngestionMethod::Recursive,
.hash = info.narHash,
.hash = *info.narHash,
};
if (!json)

View file

@@ -61,7 +61,7 @@ struct CmdPathInfo : StorePathsCommand, MixJSON
};
}
void printSize(unsigned long long value)
void printSize(uint64_t value)
{
if (!humanReadable) {
std::cout << fmt("\t%11d", value);

View file

@@ -133,7 +133,7 @@ struct ProfileManifest
info.references = std::move(references);
info.narHash = narHash;
info.narSize = sink.s->size();
info.ca = FixedOutputHash { .method = FileIngestionMethod::Recursive, .hash = info.narHash };
info.ca = FixedOutputHash { .method = FileIngestionMethod::Recursive, .hash = *info.narHash };
auto source = StringSource { *sink.s };
store->addToStore(info, source);

View file

@@ -483,10 +483,10 @@ bool NixRepl::processLine(string line)
but doing it in a child makes it easier to recover from
problems / SIGINT. */
if (runProgram(settings.nixBinDir + "/nix", Strings{"build", "--no-link", drvPath}) == 0) {
auto drv = readDerivation(*state->store, drvPath);
auto drv = readDerivation(*state->store, drvPath, Derivation::nameFromPath(state->store->parseStorePath(drvPath)));
std::cout << std::endl << "this derivation produced the following outputs:" << std::endl;
for (auto & i : drv.outputs)
std::cout << fmt(" %s -> %s\n", i.first, state->store->printStorePath(i.second.path));
std::cout << fmt(" %s -> %s\n", i.first, state->store->printStorePath(i.second.path(*state->store, drv.name)));
}
} else if (command == ":i") {
runProgram(settings.nixBinDir + "/nix-env", Strings{"-i", drvPath});

View file

@@ -69,11 +69,19 @@ struct CmdShowDerivation : InstallablesCommand
auto outputsObj(drvObj.object("outputs"));
for (auto & output : drv.outputs) {
auto outputObj(outputsObj.object(output.first));
outputObj.attr("path", store->printStorePath(output.second.path));
if (output.second.hash) {
outputObj.attr("hashAlgo", output.second.hash->printMethodAlgo());
outputObj.attr("hash", output.second.hash->hash.to_string(Base16, false));
}
outputObj.attr("path", store->printStorePath(output.second.path(*store, drv.name)));
std::visit(overloaded {
[&](DerivationOutputInputAddressed doi) {
},
[&](DerivationOutputCAFixed dof) {
outputObj.attr("hashAlgo", dof.hash.printMethodAlgo());
outputObj.attr("hash", dof.hash.hash.to_string(Base16, false));
},
[&](DerivationOutputCAFloating dof) {
outputObj.attr("hashAlgo", makeFileIngestionPrefix(dof.method) + printHashType(dof.hashType));
},
}, output.second.output);
}
}
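The JSON output now dispatches on the derivation-output kind with std::visit over an `overloaded` set of lambdas. The helper itself is the standard C++17 idiom; a self-contained sketch (the three output structs below are simplified stand-ins, not Nix's types):

    #include <iostream>
    #include <string>
    #include <variant>

    // The classic C++17 'overloaded' helper: inherit the call operators of
    // each lambda and pull them into one overload set.
    template<class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
    template<class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

    // Simplified stand-ins for the three output kinds.
    struct InputAddressed { std::string path; };
    struct CAFixed       { std::string hash; };
    struct CAFloating    { std::string hashAlgo; };

    int main()
    {
        std::variant<InputAddressed, CAFixed, CAFloating> output = CAFixed{"sha256:..."};

        std::visit(overloaded {
            [](const InputAddressed &) { /* nothing extra to print */ },
            [](const CAFixed & o)      { std::cout << "hash: " << o.hash << "\n"; },
            [](const CAFloating & o)   { std::cout << "hashAlgo: " << o.hashAlgo << "\n"; },
        }, output);
    }

Compared to a chain of if/else on a tag field, the visitor fails to compile if a new alternative is added without handling it.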

View file

@@ -91,15 +91,15 @@ struct CmdVerify : StorePathsCommand
std::unique_ptr<AbstractHashSink> hashSink;
if (!info->ca)
hashSink = std::make_unique<HashSink>(*info->narHash.type);
hashSink = std::make_unique<HashSink>(info->narHash->type);
else
hashSink = std::make_unique<HashModuloSink>(*info->narHash.type, std::string(info->path.hashPart()));
hashSink = std::make_unique<HashModuloSink>(info->narHash->type, std::string(info->path.hashPart()));
store->narFromPath(info->path, *hashSink);
auto hash = hashSink->finish();
if (hash.first != info->narHash) {
if (hash.first != *info->narHash) {
corrupted++;
act2.result(resCorruptedPath, store->printStorePath(info->path));
logError({
@@ -107,7 +107,7 @@ struct CmdVerify : StorePathsCommand
.hint = hintfmt(
"path '%s' was modified! expected hash '%s', got '%s'",
store->printStorePath(info->path),
info->narHash.to_string(Base32, true),
info->narHash->to_string(Base32, true),
hash.first.to_string(Base32, true))
});
}
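The verify hunks are consistent with the NAR hash moving from a Hash whose type was optional to an optional Hash on the path info itself, hence the switch to `info->narHash->type` and `*info->narHash`. A small sketch of reading such an optional member (the types are invented for the example):

    #include <cassert>
    #include <optional>
    #include <string>

    // Illustrative stand-ins only.
    struct Digest {
        int type = 0;
        std::string to_string() const { return "digest"; }
    };

    struct PathInfo {
        std::optional<Digest> narHash;   // may be unknown for some stores
    };

    int main()
    {
        PathInfo info;
        info.narHash = Digest{2};

        // operator-> and operator* on std::optional do NOT check for a value;
        // dereferencing an empty optional is undefined behaviour, so callers
        // either guarantee presence or test has_value() first.
        if (info.narHash) {
            assert(info.narHash->type == 2);
            assert((*info.narHash).to_string() == "digest");
        }
    }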

View file

@@ -218,7 +218,9 @@ outPath=$(nix-build --no-out-link -E '
nix copy --to file://$cacheDir?write-nar-listing=1 $outPath
[[ $(cat $cacheDir/$(basename $outPath).ls) = '{"version":1,"root":{"type":"directory","entries":{"bar":{"type":"regular","size":4,"narOffset":232},"link":{"type":"symlink","target":"xyzzy"}}}}' ]]
diff -u \
<(jq -S < $cacheDir/$(basename $outPath).ls) \
<(echo '{"version":1,"root":{"type":"directory","entries":{"bar":{"type":"regular","size":4,"narOffset":232},"link":{"type":"symlink","target":"xyzzy"}}}}' | jq -S)
# Test debug info index generation.
@@ -234,4 +236,6 @@ outPath=$(nix-build --no-out-link -E '
nix copy --to "file://$cacheDir?index-debug-info=1&compression=none" $outPath
[[ $(cat $cacheDir/debuginfo/02623eda209c26a59b1a8638ff7752f6b945c26b.debug) = '{"archive":"../nar/100vxs724qr46phz8m24iswmg9p3785hsyagz0kchf6q6gf06sw6.nar","member":"lib/debug/.build-id/02/623eda209c26a59b1a8638ff7752f6b945c26b.debug"}' ]]
diff -u \
<(cat $cacheDir/debuginfo/02623eda209c26a59b1a8638ff7752f6b945c26b.debug | jq -S) \
<(echo '{"archive":"../nar/100vxs724qr46phz8m24iswmg9p3785hsyagz0kchf6q6gf06sw6.nar","member":"lib/debug/.build-id/02/623eda209c26a59b1a8638ff7752f6b945c26b.debug"}' | jq -S)

View file

@@ -61,30 +61,30 @@ nix-build check.nix -A nondeterministic --no-out-link --repeat 1 2> $TEST_ROOT/l
[ "$status" = "1" ]
grep 'differs from previous round' $TEST_ROOT/log
path=$(nix-build check.nix -A fetchurl --no-out-link --hashed-mirrors '')
path=$(nix-build check.nix -A fetchurl --no-out-link)
chmod +w $path
echo foo > $path
chmod -w $path
nix-build check.nix -A fetchurl --no-out-link --check --hashed-mirrors ''
nix-build check.nix -A fetchurl --no-out-link --check
# Note: "check" doesn't repair anything, it just compares to the hash stored in the database.
[[ $(cat $path) = foo ]]
nix-build check.nix -A fetchurl --no-out-link --repair --hashed-mirrors ''
nix-build check.nix -A fetchurl --no-out-link --repair
[[ $(cat $path) != foo ]]
nix-build check.nix -A hashmismatch --no-out-link --hashed-mirrors '' || status=$?
nix-build check.nix -A hashmismatch --no-out-link || status=$?
[ "$status" = "102" ]
echo -n > ./dummy
nix-build check.nix -A hashmismatch --no-out-link --hashed-mirrors ''
nix-build check.nix -A hashmismatch --no-out-link
echo 'Hello World' > ./dummy
nix-build check.nix -A hashmismatch --no-out-link --check --hashed-mirrors '' || status=$?
nix-build check.nix -A hashmismatch --no-out-link --check || status=$?
[ "$status" = "102" ]
# Multiple failures with --keep-going
nix-build check.nix -A nondeterministic --no-out-link
nix-build check.nix -A nondeterministic -A hashmismatch --no-out-link --check --keep-going --hashed-mirrors '' || status=$?
nix-build check.nix -A nondeterministic -A hashmismatch --no-out-link --check --keep-going || status=$?
[ "$status" = "110" ]

View file

@@ -32,6 +32,8 @@ rev2=$(git -C $repo rev-parse HEAD)
# Fetch a worktree
unset _NIX_FORCE_HTTP
path0=$(nix eval --impure --raw --expr "(builtins.fetchGit file://$TEST_ROOT/worktree).outPath")
path0_=$(nix eval --impure --raw --expr "(builtins.fetchTree { type = \"git\"; url = file://$TEST_ROOT/worktree; }).outPath")
[[ $path0 = $path0_ ]]
export _NIX_FORCE_HTTP=1
[[ $(tail -n 1 $path0/hello) = "hello" ]]
@@ -102,6 +104,12 @@ git -C $repo commit -m 'Bla3' -a
path4=$(nix eval --impure --refresh --raw --expr "(builtins.fetchGit file://$repo).outPath")
[[ $path2 = $path4 ]]
nix eval --impure --raw --expr "(builtins.fetchGit { url = $repo; rev = \"$rev2\"; narHash = \"sha256-B5yIPHhEm0eysJKEsO7nqxprh9vcblFxpJG11gXJus1=\"; }).outPath" || status=$?
[[ "$status" = "102" ]]
path5=$(nix eval --impure --raw --expr "(builtins.fetchGit { url = $repo; rev = \"$rev2\"; narHash = \"sha256-Hr8g6AqANb3xqX28eu1XnjK/3ab8Gv6TJSnkb1LezG9=\"; }).outPath")
[[ $path = $path5 ]]
# tarball-ttl should be ignored if we specify a rev
echo delft > $repo/hello
git -C $repo add hello

View file

@@ -5,7 +5,7 @@ clearStore
# Test fetching a flat file.
hash=$(nix-hash --flat --type sha256 ./fetchurl.sh)
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file://$(pwd)/fetchurl.sh --argstr sha256 $hash --no-out-link --hashed-mirrors '')
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file://$(pwd)/fetchurl.sh --argstr sha256 $hash --no-out-link)
cmp $outPath fetchurl.sh
@@ -14,7 +14,7 @@ clearStore
hash=$(nix hash-file --type sha512 --base64 ./fetchurl.sh)
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file://$(pwd)/fetchurl.sh --argstr sha512 $hash --no-out-link --hashed-mirrors '')
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file://$(pwd)/fetchurl.sh --argstr sha512 $hash --no-out-link)
cmp $outPath fetchurl.sh
@@ -25,26 +25,24 @@ hash=$(nix hash-file ./fetchurl.sh)
[[ $hash =~ ^sha256- ]]
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file://$(pwd)/fetchurl.sh --argstr hash $hash --no-out-link --hashed-mirrors '')
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file://$(pwd)/fetchurl.sh --argstr hash $hash --no-out-link)
cmp $outPath fetchurl.sh
# Test the hashed mirror feature.
# Test that we can substitute from a different store dir.
clearStore
hash=$(nix hash-file --type sha512 --base64 ./fetchurl.sh)
hash32=$(nix hash-file --type sha512 --base16 ./fetchurl.sh)
other_store=file://$TEST_ROOT/other_store?store=/fnord/store
mirror=$TEST_ROOT/hashed-mirror
rm -rf $mirror
mkdir -p $mirror/sha512
ln -s $(pwd)/fetchurl.sh $mirror/sha512/$hash32
hash=$(nix hash-file --type sha256 --base16 ./fetchurl.sh)
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file:///no-such-dir/fetchurl.sh --argstr sha512 $hash --no-out-link --hashed-mirrors "file://$mirror")
storePath=$(nix --store $other_store add-to-store --flat ./fetchurl.sh)
outPath=$(nix-build '<nix/fetchurl.nix>' --argstr url file:///no-such-dir/fetchurl.sh --argstr sha256 $hash --no-out-link --substituters $other_store)
# Test hashed mirrors with an SRI hash.
nix-build '<nix/fetchurl.nix>' --argstr url file:///no-such-dir/fetchurl.sh --argstr hash $(nix to-sri --type sha512 $hash) \
--argstr name bla --no-out-link --hashed-mirrors "file://$mirror"
nix-build '<nix/fetchurl.nix>' --argstr url file:///no-such-dir/fetchurl.sh --argstr hash $(nix to-sri --type sha256 $hash) \
--no-out-link --substituters $other_store
# Test unpacking a NAR.
rm -rf $TEST_ROOT/archive

View file

@@ -10,10 +10,16 @@ touch $TEST_ROOT/filterin/bak
touch $TEST_ROOT/filterin/bla.c.bak
ln -s xyzzy $TEST_ROOT/filterin/link
nix-build ./filter-source.nix -o $TEST_ROOT/filterout
checkFilter() {
test ! -e $1/foo/bar
test -e $1/xyzzy
test -e $1/bak
test ! -e $1/bla.c.bak
test ! -L $1/link
}
test ! -e $TEST_ROOT/filterout/foo/bar
test -e $TEST_ROOT/filterout/xyzzy
test -e $TEST_ROOT/filterout/bak
test ! -e $TEST_ROOT/filterout/bla.c.bak
test ! -L $TEST_ROOT/filterout/link
nix-build ./filter-source.nix -o $TEST_ROOT/filterout1
checkFilter $TEST_ROOT/filterout1
nix-build ./path.nix -o $TEST_ROOT/filterout2
checkFilter $TEST_ROOT/filterout2

View file

@@ -26,12 +26,24 @@ nix cat-store $storePath/foo/baz > baz.cat-nar
diff -u baz.cat-nar $storePath/foo/baz
# Test --json.
[[ $(nix ls-nar --json $narFile /) = '{"type":"directory","entries":{"foo":{},"foo-x":{},"qux":{},"zyx":{}}}' ]]
[[ $(nix ls-nar --json -R $narFile /foo) = '{"type":"directory","entries":{"bar":{"type":"regular","size":0,"narOffset":368},"baz":{"type":"regular","size":0,"narOffset":552},"data":{"type":"regular","size":58,"narOffset":736}}}' ]]
[[ $(nix ls-nar --json -R $narFile /foo/bar) = '{"type":"regular","size":0,"narOffset":368}' ]]
[[ $(nix ls-store --json $storePath) = '{"type":"directory","entries":{"foo":{},"foo-x":{},"qux":{},"zyx":{}}}' ]]
[[ $(nix ls-store --json -R $storePath/foo) = '{"type":"directory","entries":{"bar":{"type":"regular","size":0},"baz":{"type":"regular","size":0},"data":{"type":"regular","size":58}}}' ]]
[[ $(nix ls-store --json -R $storePath/foo/bar) = '{"type":"regular","size":0}' ]]
diff -u \
<(nix ls-nar --json $narFile / | jq -S) \
<(echo '{"type":"directory","entries":{"foo":{},"foo-x":{},"qux":{},"zyx":{}}}' | jq -S)
diff -u \
<(nix ls-nar --json -R $narFile /foo | jq -S) \
<(echo '{"type":"directory","entries":{"bar":{"type":"regular","size":0,"narOffset":368},"baz":{"type":"regular","size":0,"narOffset":552},"data":{"type":"regular","size":58,"narOffset":736}}}' | jq -S)
diff -u \
<(nix ls-nar --json -R $narFile /foo/bar | jq -S) \
<(echo '{"type":"regular","size":0,"narOffset":368}' | jq -S)
diff -u \
<(nix ls-store --json $storePath | jq -S) \
<(echo '{"type":"directory","entries":{"foo":{},"foo-x":{},"qux":{},"zyx":{}}}' | jq -S)
diff -u \
<(nix ls-store --json -R $storePath/foo | jq -S) \
<(echo '{"type":"directory","entries":{"bar":{"type":"regular","size":0},"baz":{"type":"regular","size":0},"data":{"type":"regular","size":58}}}' | jq -S)
diff -u \
<(nix ls-store --json -R $storePath/foo/bar| jq -S) \
<(echo '{"type":"regular","size":0}' | jq -S)
# Test missing files.
nix ls-store --json -R $storePath/xyzzy 2>&1 | grep 'does not exist in NAR'

tests/path.nix (new file, +14 lines)
View file

@@ -0,0 +1,14 @@
with import ./config.nix;
mkDerivation {
name = "filter";
builder = builtins.toFile "builder" "ln -s $input $out";
input =
builtins.path {
path = ((builtins.getEnv "TEST_ROOT") + "/filterin");
filter = path: type:
type != "symlink"
&& baseNameOf path != "foo"
&& !((import ./lang/lib.nix).hasSuffix ".bak" (baseNameOf path));
};
}

View file

@@ -27,10 +27,13 @@ test_tarball() {
nix-build -o $TEST_ROOT/result -E "import (fetchTarball file://$tarball)"
nix-build --experimental-features flakes -o $TEST_ROOT/result -E "import (fetchTree file://$tarball)"
nix-build --experimental-features flakes -o $TEST_ROOT/result -E "import (fetchTree { type = \"tarball\"; url = file://$tarball; })"
nix-build --experimental-features flakes -o $TEST_ROOT/result -E "import (fetchTree { type = \"tarball\"; url = file://$tarball; narHash = \"$hash\"; })"
nix-build --experimental-features flakes -o $TEST_ROOT/result -E "import (fetchTree { type = \"tarball\"; url = file://$tarball; narHash = \"sha256-xdKv2pq/IiwLSnBBJXW8hNowI4MrdZfW+SYqDQs7Tzc=\"; })" 2>&1 | grep 'NAR hash mismatch in input'
nix-build -o $TEST_ROOT/result -E "import (fetchTree file://$tarball)"
nix-build -o $TEST_ROOT/result -E "import (fetchTree { type = \"tarball\"; url = file://$tarball; })"
nix-build -o $TEST_ROOT/result -E "import (fetchTree { type = \"tarball\"; url = file://$tarball; narHash = \"$hash\"; })"
nix-build -o $TEST_ROOT/result -E "import (fetchTree { type = \"tarball\"; url = file://$tarball; narHash = \"sha256-xdKv2pq/IiwLSnBBJXW8hNowI4MrdZfW+SYqDQs7Tzc=\"; })" 2>&1 | grep 'NAR hash mismatch in input'
nix-instantiate --strict --eval -E "!((import (fetchTree { type = \"tarball\"; url = file://$tarball; narHash = \"$hash\"; })) ? submodules)" >&2
nix-instantiate --strict --eval -E "!((import (fetchTree { type = \"tarball\"; url = file://$tarball; narHash = \"$hash\"; })) ? submodules)" 2>&1 | grep 'true'
nix-instantiate --eval -E '1 + 2' -I fnord=file://no-such-tarball.tar$ext
nix-instantiate --eval -E 'with <fnord/xyzzy>; 1 + 2' -I fnord=file://no-such-tarball$ext