This repository describes how to run a local instance of Subtensor with a live network state using Baedeker.
Baedeker is a chain-spec modification, key generation, and chain deployment configurator based on jsonnet and UniqueNetwork's chainql.
Download the prebuilt binary:

```shell
curl -L https://github.com/UniqueNetwork/baedeker/releases/download/v0.1.7/baedeker -o baedeker && chmod +x baedeker
```

After that, add baedeker to your PATH or copy it to /usr/local/bin:

```shell
sudo mv ./baedeker /usr/local/bin
```

Alternatively, build it from source:

```shell
git clone https://github.com/UniqueNetwork/baedeker && cd baedeker
cargo build --release
sudo mv ./target/release/baedeker /usr/local/bin
sudo chmod +x /usr/local/bin/baedeker
```

To begin, remove the previous spec if you have already run baedeker:
```shell
rm ./.bdk-env/specs/subtensor.json
```

After that, you can run baedeker. It will download the state and prepare the spec for running the nodes:
```shell
./up.sh ./forkless-data.jsonnet --tla-str=forked_spec=subtensor --tla-str=fork_source=wss://entrypoint-finney.opentensor.ai
```

For graceful handling of broken state, add this flag:
```shell
--corrupted-storage-strategy use-default
```

If you want to start with the testnet state, use wss://test.finney.opentensor.ai instead of wss://entrypoint-finney.opentensor.ai.
To run the nodes, clone subtensor into the ../subtensor directory and build the node. It is important that the checked-out commit matches the current Finney runtime version as closely as possible.
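The clone-and-build step can be sketched as below; the repository URL and the cargo invocation are assumptions based on the standard Substrate workflow, and the commit to check out depends on the current Finney runtime version:

```shell
# Hypothetical helper for the step above: clone subtensor next to this
# repository and build the node. Assumes the opentensor/subtensor repo
# and a working Rust toolchain; the checkout commit is a placeholder.
build_subtensor() {
  git clone https://github.com/opentensor/subtensor ../subtensor &&
  cd ../subtensor &&
  # git checkout <commit close to the current Finney runtime version>
  cargo build --release
}
```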
Then you can start:
```shell
./localnet-baedeker.sh
```

If you are downloading the state very intensively, the connection may drop. To prevent this, set the CHAINQL_WORKERS environment variable to change the number of concurrent requests. By default, it is set to 2. To speed up the download, set it to 8 or 16; to slow it down, set it to 1.
If this does not help, try changing the number of keys requested at once by setting the CHAINQL_KEYS_CHUNK_SIZE variable. The default is 20000; for more stability, you can set it to 5000.
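As a sketch, the two variables can be combined when launching the fork script; the values here are illustrative, not recommendations for every setup:

```shell
# Illustrative tuning: more workers for download speed, smaller key
# chunks for connection stability.
export CHAINQL_WORKERS=8
export CHAINQL_KEYS_CHUNK_SIZE=5000
# ./up.sh ./forkless-data.jsonnet --tla-str=forked_spec=subtensor --tla-str=fork_source=wss://entrypoint-finney.opentensor.ai
echo "workers=$CHAINQL_WORKERS chunk=$CHAINQL_KEYS_CHUNK_SIZE"
# prints: workers=8 chunk=5000
```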
Important note: If you set these values too low, the state may not finish downloading within an hour, and the node will return a "State discarded for..." error.