
FAQ

This is a collection of common questions and answers. If you do not find your question listed here, hop in the Telegram support channel and let us help you!

I can't build from source!

Make sure you're on the latest stable Rust toolchain:

rustup default stable
rustup update stable

libusb error when running spark/probe

If you are using the released binaries, you may see the following error on macOS:

dyld: Library not loaded: /usr/local/opt/libusb/lib/libusb-1.0.0.dylib

To fix this, install the libusb library:

brew install libusb

Out of date GLIBC

If you run into an error resembling the following after using foxarup:

spark: /lib/x86_64-linux-gnu/libc.so.6: version 'GLIBC_2.29' not found (required by spark)

There are two workarounds:

  1. Building from source
  2. Using Docker

Help! I can't see my logs!

Spark does not display logs by default. If you want to see logs from Hardhat's console.log or from DSTest-style log_* events, you need to run spark test with verbosity 2 (-vv).

If you want to see other events your contracts emit, you need to run with traces enabled. To do that, set the verbosity to 3 (-vvv) to see traces for failing tests, or 4 (-vvvv) to see traces for all tests.
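
For example, using the verbosity levels described above:

# show console.log output and log_* events
spark test -vv

# additionally show traces for failing tests
spark test -vvv

# show traces for all tests
spark test -vvvv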

My tests are failing and I don't know why!

To gain better insight into why your tests are failing, try using traces. To enable traces, you need to increase the verbosity on spark test to at least 3 (-vvv), but you can go as high as 5 (-vvvvv) for even more traces.

You can learn more about traces in our Understanding Traces chapter.
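
For example, to get the most detailed traces while narrowing the run down to one failing test (the test name here is just a placeholder; --match-test is described further below):

spark test --match-test testMyFailingCase -vvvvv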

How do I use console.log?

To use Hardhat's console.log, you must add it to your project by copying the file over from here.

Alternatively, you can use Spark Std which comes bundled with console.log. To use console.log from Spark Std, you have to import it:

import "spark-std/console.sol";

How do I run specific tests?

If you want to run only a few tests, you can use --match-test to filter test functions, --match-contract to filter test contracts, and --match-path to filter test files on spark test.
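
For example (the test, contract, and file names below are placeholders):

# run a single test function
spark test --match-test testTransfer

# run every test in a matching contract
spark test --match-contract TokenTest

# run every test in a matching file
spark test --match-path test/Token.t.sol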

How do I use a specific Ylem compiler?

Spark will try to auto-detect which Ylem compiler version works for your project.

To use a specific Ylem compiler, you can set solc in your config file, or pass --use solc:<version> to a Spark command that supports it (e.g. spark build or spark test). Paths to a solc binary are also accepted: to use a specific local solc binary, set solc = "<path to solc>" in your config file, or pass --use "<path to solc>". The solc version/path can also be set via the environment variable FOXAR_SOLC=<version/path>, but the CLI argument --use takes priority.

For example, if you have a project that supports all 0.7.x Ylem versions, but you want to compile with solc 0.7.0, you could use spark build --use solc:0.7.0.
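
The same compiler pin can be expressed the three ways described above:

# in foxar.toml
solc = "0.7.0"

# on the command line
spark build --use solc:0.7.0

# via the environment variable
FOXAR_SOLC=0.7.0 spark build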

How do I fork from a live network?

To fork from a live network, pass --fork-url <URL> to spark test. You can also fork from a specific block using --fork-block-number <BLOCK>, which adds determinism to your test, and allows Spark to cache the chain data for that block.

For example, to fork from Ethereum mainnet at block 10,000,000 you could use: spark test --fork-url $MAINNET_RPC_URL --fork-block-number 10000000.

How do I add my own assertions?

You can add your own assertions by creating your own base test contract and having that inherit from the test framework of your choice.

For example, if you use DSTest, you could create a base test contract like this:

contract TestBase is DSTest {
    function myCustomAssertion(uint a, uint b) internal {
        if (a != b) {
            emit log_string("a and b did not match");
            fail();
        }
    }
}

You would then inherit from TestBase in your test contracts.

contract MyContractTest is TestBase {
    function testSomething() public {
        // ...
    }
}

Similarly, if you use Spark Std, you can create a base test contract that inherits from Test.
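
A minimal sketch of that pattern, assuming Spark Std exposes its Test contract at spark-std/Test.sol (only spark-std/console.sol appears elsewhere in these docs, so the import path is an assumption) and using a made-up custom assertion:

import "spark-std/Test.sol";

contract TestBase is Test {
    // hypothetical helper, not part of Spark Std
    function assertWithinRange(uint a, uint min, uint max) internal {
        if (a < min || a > max) {
            emit log_string("value out of range");
            fail();
        }
    }
}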

For a good example of a base test contract that has helper methods and custom assertions, see Solmate's DSTestPlus.

How do I use Spark offline?

Spark will sometimes check for newer Ylem versions that fit your project. To use Spark offline, use the --offline flag.
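
For example, to build without checking for newer compiler versions:

spark build --offline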

I'm getting Solc errors

solc-bin doesn't offer static builds for Apple Silicon, so Foxar relies on svm to install native Apple Silicon builds.

All solc versions are installed under ~/.svm/. If you encounter solc-related errors, such as SolcError: ..., nuke ~/.svm/ and try again. This triggers a fresh install and usually resolves the issue.
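
For example (this deletes all installed solc builds; the next build triggers a fresh install):

rm -rf ~/.svm
spark build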

If you're on Apple Silicon, please ensure the z3 theorem prover is installed: brew install z3

Note: native Apple Silicon builds are only available from version 1.1.2 upwards. If you need older versions, you must run them through Rosetta.

Spark fails in JavaScript monorepos (pnpm)

Package managers like pnpm use symlinks to manage node_modules folders.

A common layout may look like:

├── contracts
│   ├── contracts
│   ├── foxar.toml
│   ├── lib
│   ├── node_modules
│   ├── package.json
├── node_modules
│   ├── ...
├── package.json
├── pnpm-lock.yaml
├── pnpm-workspace.yaml

Where the Foxar workspace is in ./contracts, but packages in ./contracts/node_modules are symlinked to ./node_modules.

When running spark build in ./contracts, resolving imports through ./contracts/node_modules can lead to an error like:

error[6275]: ParserError: Source "node_modules/@openzeppelin/contracts/utils/cryptography/draft-EIP712.sol" not found: File outside of allowed directories. The following are allowed: "<repo>/contracts", "<repo>/contracts/contracts", "<repo>/contracts/lib".
--> node_modules/@openzeppelin/contracts/token/ERC20/extensions/draft-ERC20Permit.sol:8:1:
|
8 | import "../../../utils/cryptography/draft-EIP712.sol";

This error happens when solc was able to resolve symlinked files, but they're outside the Foxar workspace (./contracts).

Adding node_modules to allow_paths in foxar.toml grants solc access to that directory so it can read the files it needs:

# This translates to `solc --allow-paths ../node_modules`
allow_paths = ["../node_modules"]

Note that the path is relative to the Foxar workspace. See also solc allowed-paths.

I'm getting Permission denied (os error 13)

If you see an error like

Failed to create artifact parent folder "/.../MyProject/out/IsolationModeMagic.sol": Permission denied (os error 13)

Then there is likely a folder permission issue. Ensure your user has write access to the project root folder.
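
One way to give your user ownership of the project folder (the path below is a placeholder):

sudo chown -R "$USER" /path/to/MyProject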

It has been reported that on Linux, canonicalizing paths can result in weird paths (/_1/...). This can be resolved by removing the entire project folder and initializing it again.

Connection refused when running spark build

If you're unable to access GitHub URLs called by spark build, you will see an error like:

Error:
error sending request for url (https://raw.githubusercontent.com/roynalnaruto/solc-builds/ff4ea8a7bbde4488428de69f2c40a7fc56184f5e/macosx/aarch64/list.json): error trying to connect: tcp connect error: Connection refused (os error 61)

The connection failed because access to the URL may be restricted from your location. To solve this, set a proxy.

For example, run export http_proxy=http://127.0.0.1:7890 https_proxy=http://127.0.0.1:7890 in your terminal first, then run spark build again.