# Vulcanize DB
Vulcanize DB is a set of tools that make it easier for developers to write application-specific indexes and caches for dapps built on Ethereum.
## Table of Contents
- [Background](#background)
- [Dependencies](#dependencies)
- [Install](#install)
- [Usage](#usage)
- [Tests](#tests)
- [API](#api)
- [Contributing](#contributing)
- [License](#license)
## Background
The same data structures and encodings that make Ethereum an effective and trust-less distributed virtual machine complicate data accessibility and usability for dApp developers.
## Dependencies
- Go 1.11+
- Postgres 10.6
- Ethereum Node
  - Go Ethereum (1.8.23+)
  - Parity 1.8.11+
## Install

### Building the project
Download the codebase to your local `GOPATH` via:

```
go get github.com/vulcanize/vulcanizedb
```
Move to the project directory and use golang/dep to install the dependencies:

```
cd $GOPATH/src/github.com/vulcanize/vulcanizedb
dep ensure
```
Once the dependencies have been successfully installed, build the executable with:

```
make build
```
If you are running into issues at this stage, ensure that `GOPATH` is defined in your shell. If necessary, `GOPATH` can be set in `~/.bashrc` or `~/.bash_profile`, depending upon your system. It can be additionally helpful to add `$GOPATH/bin` to your shell's `$PATH`.
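For example, a minimal sketch of the relevant exports (the workspace path here is an assumption; adjust it to your own setup):

```
# in ~/.bashrc or ~/.bash_profile
export GOPATH="$HOME/go"          # assumed workspace location
export PATH="$PATH:$GOPATH/bin"   # so installed Go binaries are available on your PATH
```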
### Setting up the database
1. Install Postgres
2. Create a superuser for yourself and make sure `psql --list` works without prompting for a password (see the sketch below for one way to do this).
3. `createdb vulcanize_public`
4. `cd $GOPATH/src/github.com/vulcanize/vulcanizedb`
5. Run the migrations: `make migrate HOST_NAME=localhost NAME=vulcanize_public PORT=5432`
   - To rollback a single step: `make rollback NAME=vulcanize_public`
   - To rollback to a certain migration: `make rollback_to MIGRATION=n NAME=vulcanize_public`
   - To see the status of migrations: `make migration_status NAME=vulcanize_public`
   - See below for configuring additional environments
In some cases (such as recent Ubuntu systems), it may be necessary to overcome failures of password authentication from localhost. To allow access on Ubuntu, set localhost connections via hostname, ipv4, and ipv6 from peer/md5 to trust in `/etc/postgresql/<version>/pg_hba.conf`.

(It should be noted that trusted auth should only be enabled on systems without sensitive data in them: development and local test databases.)
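A minimal end-to-end sketch of the database setup above, assuming a local Postgres install and a superuser named after your shell user:

```
createuser --superuser $USER     # create a superuser matching your OS user
psql --list                      # should succeed without prompting for a password
createdb vulcanize_public
cd $GOPATH/src/github.com/vulcanize/vulcanizedb
make migrate HOST_NAME=localhost NAME=vulcanize_public PORT=5432
```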
### Configuring a synced Ethereum node
- To use a local Ethereum node, copy `environments/public.toml.example` to `environments/public.toml` and update the `ipcPath` and `levelDbPath` (a sketch of this step follows the list).
  - `ipcPath` should match the local node's IPC filepath:
    - For Geth:
      - The IPC file is called `geth.ipc`.
      - The geth IPC file path is printed to the console when you start geth.
      - The default location is:
        - Mac: `<full home path>/Library/Ethereum`
        - Linux: `<full home path>/ethereum/geth.ipc`
    - For Parity:
      - The IPC file is called `jsonrpc.ipc`.
      - The default location is:
        - Mac: `<full home path>/Library/Application\ Support/io.parity.ethereum/`
        - Linux: `<full home path>/local/share/io.parity.ethereum/`
  - `levelDbPath` should match Geth's chaindata directory path.
    - The geth LevelDB chaindata path is printed to the console when you start geth.
    - The default location is:
      - Mac: `<full home path>/Library/Ethereum/geth/chaindata`
      - Linux: `<full home path>/ethereum/geth/chaindata`
    - `levelDbPath` is irrelevant (and `coldImport` is currently unavailable) if only running parity.
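As a rough shell sketch of that copy-and-edit step (the Linux Geth defaults above are used for illustration; adjust the paths to your own node):

```
cp environments/public.toml.example environments/public.toml
# then edit environments/public.toml so that, for example on Linux with Geth:
#   ipcPath     = "<full home path>/ethereum/geth.ipc"
#   levelDbPath = "<full home path>/ethereum/geth/chaindata"
```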
## Usage

Usage is broken up into two processes:

### Data syncing
To provide data for transformations, raw Ethereum data must first be synced into vDB. This is accomplished through the use of the `lightSync`, `sync`, or `coldImport` commands. These commands are described in detail here.
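As a rough sketch of what an invocation can look like (the flag names and values here are assumptions; see the linked command documentation for the authoritative options):

```
# sync block data into the vulcanize_public database using the config created above
./vulcanizedb lightSync --config ./environments/public.toml --starting-block-number 0
```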
### Data transformation
Contract watchers use the raw data that has been synced into Postgres to filter out and apply transformations to specific data of interest.

There is a built-in `contractWatcher` command which provides generic transformation of most contract data. This command is described in detail here.

In many cases a custom transformer or set of transformers will need to be written to provide complete or more comprehensive coverage, or to optimize other aspects of the output for a specific end use. In this case we have provided the `compose`, `execute`, and `composeAndExecute` commands for running custom transformers from external repositories. This is described in detail here.
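For instance, a hedged sketch of running the built-in watcher (the flag names and the example config path are assumptions; consult the linked documentation for the actual interface):

```
# watch a contract described in a config file, reading from lightSynced data
./vulcanizedb contractWatcher --config ./environments/example.toml --mode light
```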
## Tests

- Replace the empty `ipcPath` in `environments/infura.toml` with a path to a full node's eth_jsonrpc endpoint (e.g. local geth node ipc path or infura url).
  - Note: integration tests require configuration with an archival node.
- `createdb vulcanize_private` will create the test db.
- `make migrate NAME=vulcanize_private` will run the db migrations.
- `make test` will run the unit tests and skip the integration tests.
- `make integrationtest` will run just the integration tests.
## API

Postgraphile is used to expose GraphQL endpoints for our database schemas; this is described in detail here.
## Contributing
Contributions are welcome! For more on this, please see here.
Small note: If editing the Readme, please conform to the standard-readme specification.
## License
AGPL-3.0 © Vulcanize Inc