Server backend for indexed ETH IPLD objects

Vulcanize DB

Join the chat at https://gitter.im/vulcanizeio/VulcanizeDB


About

Vulcanize DB is a set of tools that make it easier for developers to write application-specific indexes and caches for dapps built on Ethereum.

Dependencies

Installation

go get github.com/vulcanize/vulcanizedb
go get gopkg.in/DataDog/dd-trace-go.v1/ddtrace

Setting up the Database

  1. Install Postgres

  2. Create a superuser for yourself and make sure psql --list works without prompting for a password.

  3. createdb vulcanize_public

  4. cd $GOPATH/src/github.com/vulcanize/vulcanizedb

  5. Run the migrations: make migrate HOST_NAME=localhost NAME=vulcanize_public PORT=5432

    • See below for configuring additional environments
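The five setup steps above can be sketched as one shell session. This is a non-authoritative sketch: it assumes a local Postgres superuser (step 2) and the repo checked out under GOPATH (step 4), and it no-ops when either is missing.

```shell
#!/bin/sh
# Sketch of the database setup steps above; assumes a local Postgres
# superuser role and the repo checked out under GOPATH.
DB_NAME=vulcanize_public
REPO="$GOPATH/src/github.com/vulcanize/vulcanizedb"

if command -v createdb >/dev/null 2>&1 && [ -d "$REPO" ]; then
  createdb "$DB_NAME" 2>/dev/null || true   # no-op if the db already exists
  cd "$REPO" && make migrate HOST_NAME=localhost NAME="$DB_NAME" PORT=5432
else
  echo "skipping: Postgres tools or repo checkout not found"
fi
```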

Create a migration file (up and down)

  1. ./scripts/create_migrate create_bite_table

Configuration

  • To use a local Ethereum node, copy environments/public.toml.example to environments/public.toml and update the ipcPath and levelDbPath.

    • ipcPath should match the local node's IPC filepath:

      • when using geth:

        • The IPC file is called geth.ipc.
        • The geth IPC file path is printed to the console when you start geth.
        • The default location is:
          • Mac: $HOME/Library/Ethereum
          • Linux: $HOME/.ethereum
      • when using parity:

        • The IPC file is called jsonrpc.ipc.
        • The default location is:
          • Mac: $HOME/Library/Application\ Support/io.parity.ethereum/
          • Linux: $HOME/.local/share/io.parity.ethereum/
    • levelDbPath should match Geth's chaindata directory path.

      • The geth LevelDB chaindata path is printed to the console when you start geth.
      • The default location is:
        • Mac: $HOME/Library/Ethereum/geth/chaindata
        • Linux: $HOME/.ethereum/geth/chaindata
      • levelDbPath is irrelevant (and coldImport is currently unavailable) if only running parity.
  • See environments/infura.toml to configure commands to run against infura, if a local node is unavailable.

  • Copy environments/local.toml.example to environments/local.toml to configure commands to run against a local node such as Ganache or ganache-cli.
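Pulling the settings above together, a minimal environments/public.toml might look like the following sketch. ipcPath and levelDbPath are the fields described above; the [database] keys and the example values are assumptions based on the defaults used elsewhere in this README.

```toml
[database]
name     = "vulcanize_public"
hostname = "localhost"
port     = 5432

[client]
# geth defaults on Linux; see the per-client paths listed above
ipcPath     = "/home/user/.ethereum/geth.ipc"
levelDbPath = "/home/user/.ethereum/geth/chaindata"
```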

Start syncing with postgres

Syncs VulcanizeDB with the configured Ethereum node, populating blocks, transactions, receipts, and logs. This command is useful when you want to maintain a broad cache of what's happening on the blockchain.

  1. Start Ethereum node (if fast syncing your Ethereum node, wait for initial sync to finish)
  2. In a separate terminal start VulcanizeDB:
    • ./vulcanizedb sync --config <config.toml> --starting-block-number <block-number>

Alternatively, sync from Geth's underlying LevelDB

Sync VulcanizeDB from the LevelDB underlying a Geth node.

  1. Ensure the Geth node is not running and that it has synced to the desired block height.
  2. Start VulcanizeDB:
    • ./vulcanizedb coldImport --config <config.toml> --starting-block-number <block-number> --ending-block-number <block-number>
  3. Optional flags:
    • --starting-block-number/-s: block number to start syncing from
    • --ending-block-number/-e: block number to sync to
    • --all/-a: sync all missing blocks

Alternatively, sync in "light" mode

Syncs VulcanizeDB with the configured Ethereum node, populating only block headers. This command is useful when you want a minimal baseline from which to track targeted data on the blockchain (e.g. individual smart contract storage values).

  1. Start Ethereum node
  2. In a separate terminal start VulcanizeDB:
    • ./vulcanizedb lightSync --config <config.toml> --starting-block-number <block-number>

Continuously sync Maker event logs from light sync

Continuously syncs Maker event logs from the configured Ethereum node based on the populated block headers. This includes logs related to auctions, multi-collateral dai, and price feeds. This command requires that the lightSync process is also running, so that logs can be synced in real time.

  1. Start Ethereum node (or plan to configure the commands to point to a remote IPC path).
  2. In a separate terminal run the lightSync command (see above).
  3. In another terminal window run the continuousLogSync command:
  • ./vulcanizedb continuousLogSync --config <config.toml>
  • An optional --transformers flag may be passed to specify which transformers to execute; if the flag is omitted, all transformers run.
    • ./vulcanizedb continuousLogSync --config environments/private.toml --transformers="priceFeed"
    • see the buildTransformerInitializerMap method in cmd/continuousLogSync.go for available transformers

Backfill Maker event logs from light sync

Backfills Maker event logs from the configured Ethereum node based on the populated block headers. This includes logs related to auctions, multi-collateral dai, and price feeds. This command requires that a light sync (see command above) has previously been run.

Since the auction/mcd contracts have not yet been deployed, this command currently needs to be run against a local blockchain. As such, a new environment file needs to be added; see environments/local.toml.example.

  1. Start Ethereum node
  2. In a separate terminal run the backfill command:
  • ./vulcanizedb backfillMakerLogs --config <config.toml>

Start full environment in docker by single command

Geth Rinkeby

make command          description
rinkeby_env_up        start geth, postgres and rolling migrations; once migrations are done, start the vulcanizedb container
rinkeby_env_deploy    build and run the vulcanizedb container in the rinkeby environment
rinkeby_env_migrate   build and run the rinkeby env migrations
rinkeby_env_down      stop and remove all rinkeby env containers

A successful run of the VulcanizeDB container requires a fully synced Geth state. Attach to the geth console and check the sync status:

$ docker exec -it rinkeby_vulcanizedb_geth geth --rinkeby attach
...
> eth.syncing
false

If you already have full rinkeby chaindata, you can move it into the rinkeby_vulcanizedb_geth_data docker volume to skip the long sync.
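Seeding that volume might look like the following sketch. The /path/to/chaindata source directory is a placeholder, and the sketch skips itself when docker or the source directory is unavailable.

```shell
#!/bin/sh
# Hypothetical sketch: copy existing rinkeby chaindata into the named volume.
# SRC is a placeholder for your local chaindata directory.
SRC=/path/to/chaindata

if command -v docker >/dev/null 2>&1 && [ -d "$SRC" ]; then
  docker volume create rinkeby_vulcanizedb_geth_data
  docker run --rm \
    -v rinkeby_vulcanizedb_geth_data:/dest \
    -v "$SRC":/src:ro \
    alpine sh -c 'cp -a /src/. /dest/'
else
  echo "skipping: docker or chaindata directory not found"
fi
```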

Running the Tests

  • make test will run the unit tests and skip the integration tests
  • make integrationtest will run just the integration tests
  • Note: requires Ganache chain setup and seeded with flip-kick.js and frob.js (in that order)
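Under those assumptions, the integration-test setup might be sketched as follows. The paths to the two seed scripts and the Ganache port are assumptions; only the seeding order (flip-kick.js before frob.js) comes from the note above.

```shell
#!/bin/sh
# Sketch of the integration-test setup: a Ganache chain seeded with
# flip-kick.js, then frob.js, in that order. Script paths are assumptions.
if command -v ganache-cli >/dev/null 2>&1 && command -v node >/dev/null 2>&1; then
  ganache-cli --port 8545 >/dev/null &
  GANACHE_PID=$!
  sleep 2
  node flip-kick.js   # seed first
  node frob.js        # seed second
  make integrationtest
  kill "$GANACHE_PID"
  seeded=attempted
else
  echo "skipping: ganache-cli or node not installed"
  seeded=skipped
fi
```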

Deploying

  1. Make sure you have an ssh-agent running and your SSH key added to it.
  2. go get -u github.com/pressly/sup/cmd/sup
  3. sup staging deploy