Implement SSZ union type (#2579)

## Issue Addressed

NA

## Proposed Changes

Implements the "union" type from the SSZ spec for `ssz`, `ssz_derive`, `tree_hash` and `tree_hash_derive` so that it can be derived for `enum`s:

https://github.com/ethereum/consensus-specs/blob/v1.1.0-beta.3/ssz/simple-serialize.md#union

The union type is required for the merge, since the `Transaction` type is defined as a single-variant union `Union[OpaqueTransaction]`.
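
For illustration only, here is a hedged sketch of how such a single-variant union might be expressed with the new derive behaviour. The enum name, variant and length limit below are stand-ins chosen for this example, not the definitions this PR adds to the `types` crate:

```rust
use ssz::Encode;
use ssz_derive::{Decode, Encode};
use ssz_types::VariableList;
use tree_hash_derive::TreeHash;
use typenum::U1024;

// Stand-in for the spec's `Transaction = Union[OpaqueTransaction]`; the
// 1024-byte limit is arbitrary and purely illustrative.
#[derive(Encode, Decode, TreeHash)]
#[ssz(enum_behaviour = "union")]
#[tree_hash(enum_behaviour = "union")]
pub enum Transaction {
    // Selector 0: an opaque, length-limited byte list.
    OpaqueTransaction(VariableList<u8, U1024>),
}

fn main() {
    let tx = Transaction::OpaqueTransaction(VariableList::new(vec![0xde, 0xad]).unwrap());
    // Spec union encoding: a 1-byte selector followed by the encoding of the
    // selected variant's value.
    assert_eq!(tx.as_ssz_bytes(), vec![0, 0xde, 0xad]);
}
```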

### Crate Updates

This PR will (hopefully) cause CI to publish new versions for the following crates:

- `eth2_ssz_derive`: `0.2.1` -> `0.3.0`
- `eth2_ssz`: `0.3.0` -> `0.4.0`
- `eth2_ssz_types`: `0.2.0` -> `0.2.1`
- `tree_hash`: `0.3.0` -> `0.4.0`
- `tree_hash_derive`: `0.3.0` -> `0.4.0`

Since these crates depend on each other, I've had to add a workspace-level `[patch]` section for them. A follow-up PR will need to remove this patch once the new versions are published.

### Union Behaviours

We already had SSZ `Encode` and `TreeHash` derives for enums, but they just did a "transparent" pass-through of the inner value. Since the spec's "union" behaviour conflicts with this transparent method, I've required that every `enum` has exactly one of the following enum-level attributes:

#### SSZ

-  `#[ssz(enum_behaviour = "union")]`
    - matches the spec used for the merge
-  `#[ssz(enum_behaviour = "transparent")]`
    - maintains existing functionality
    - not supported for `Decode` (never was)
    
#### TreeHash

-  `#[tree_hash(enum_behaviour = "union")]`
    - matches the spec used for the merge
-  `#[tree_hash(enum_behaviour = "transparent")]`
    - maintains existing functionality

This means that we can maintain the existing transparent behaviour, but all existing users will get a compile-time error until they explicitly opt in to being transparent.
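
For example, an enum that previously relied on the implicit transparent pass-through now has to declare it. This is a sketch with hypothetical variant types; the real call-sites can be seen further down in this diff (e.g. `MetaData` and `PartialBeaconState`):

```rust
use ssz_derive::Encode;
use tree_hash_derive::TreeHash;

// Omitting the `enum_behaviour` attributes is now a compile-time error, so
// existing users must choose explicitly. "transparent" keeps the old
// behaviour: the enum encodes/hashes exactly like the value of the active
// variant, with no selector. `Decode` still cannot be derived in this mode.
#[derive(Encode, TreeHash)]
#[ssz(enum_behaviour = "transparent")]
#[tree_hash(enum_behaviour = "transparent")]
pub enum Response {
    Base(u64),
    Altair(u8),
}
```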

### Legacy Option Encoding

Before this PR, we already had a union-esque encoding for `Option<T>`. However, it followed the *old* SSZ spec, where the union selector was 4 bytes. During the merge specification process, the spec was changed to use a 1-byte selector.

Whilst the 4-byte `Option` encoding was never used in the spec, we used it in our database. Writing a migration script for all occurrences of `Option` in the database would be painful, especially since it's used in the `CommitteeCache`. To avoid that migration, I added a serde-esque `#[ssz(with = "module")]` field-level attribute to `ssz_derive` so that we can opt into the 4-byte encoding on a field-by-field basis.

The `four_byte_option_impl!` macro (defined in `ssz::legacy`) provides a one-liner to define the module required by `#[ssz(with = "module")]` for any `Option<T>` where `T: Encode + Decode`.
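
As a sketch of the intended usage, mirroring the doc example and unit tests in the new `consensus/ssz/src/legacy.rs` (the struct and field names here are arbitrary):

```rust
use ssz::four_byte_option_impl;
use ssz::{Decode, Encode};
use ssz_derive::{Decode, Encode};

// Generates a `four_byte_option_u64` module whose `encode`/`decode`
// sub-modules implement the legacy 4-byte-selector encoding for `Option<u64>`.
four_byte_option_impl!(four_byte_option_u64, u64);

#[derive(Debug, PartialEq, Encode, Decode)]
struct LegacyRecord {
    // Stored with the old 4-byte union selector, so bytes already in the
    // database keep decoding without a migration.
    #[ssz(with = "four_byte_option_u64")]
    last_processed_block: Option<u64>,
}

fn main() {
    // As in the `legacy.rs` tests: a 4-byte little-endian selector of `1`
    // followed by the value bytes (`None` encodes as `[0, 0, 0, 0]`).
    assert_eq!(
        four_byte_option_u64::encode::as_ssz_bytes(&Some(3)),
        vec![1, 0, 0, 0, 3, 0, 0, 0, 0, 0, 0, 0]
    );
    let record = LegacyRecord { last_processed_block: Some(3) };
    let decoded = LegacyRecord::from_ssz_bytes(&record.as_ssz_bytes()).unwrap();
    assert_eq!(decoded, record);
}
```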

Notably, **I have removed `Encode` and `Decode` impls for `Option`**. I've done this to force a break on downstream users. Like I mentioned, `Option` isn't used in the spec so I don't think it'll be *that* annoying. I think it's nicer than quietly having two different union implementations or quietly breaking the existing `Option` impl.

### Crate Publish Ordering

I've modified the order in which CI publishes crates to ensure that we don't publish a crate without ensuring we already published a crate that it depends upon.

## TODO

- [ ] Queue a follow-up `[patch]`-removing PR.
Paul Hauner committed 2021-09-25 05:58:36 +00:00
commit fe52322088 (parent a844ce5ba9)
63 changed files with 1515 additions and 571 deletions


@ -36,22 +36,26 @@ jobs:
    - name: Cargo login
      run: |
        echo "${CARGO_API_TOKEN}" | cargo login
-   - name: publish tree hash
-     if: startsWith(env.TAG, 'tree-hash-v')
-     run: |
-       ./scripts/ci/publish.sh consensus/tree_hash tree_hash "$TAG"
-   - name: publish tree hash derive
-     if: startsWith(env.TAG, 'tree-hash-derive-v')
-     run: |
-       ./scripts/ci/publish.sh consensus/tree_hash_derive tree_hash_derive "$TAG"
-   - name: publish eth2 ssz
-     if: startsWith(env.TAG, 'eth2-ssz-v')
-     run: |
-       ./scripts/ci/publish.sh consensus/ssz eth2_ssz "$TAG"
    - name: publish eth2 ssz derive
      if: startsWith(env.TAG, 'eth2-ssz-derive-v')
      run: |
        ./scripts/ci/publish.sh consensus/ssz_derive eth2_ssz_derive "$TAG"
+   - name: publish eth2 ssz
+     if: startsWith(env.TAG, 'eth2-ssz-v')
+     run: |
+       ./scripts/ci/publish.sh consensus/ssz eth2_ssz "$TAG"
+   - name: publish eth2 hashing
+     if: startsWith(env.TAG, 'eth2-hashing-v')
+     run: |
+       ./scripts/ci/publish.sh crypto/eth2_hashing eth2_hashing "$TAG"
+   - name: publish tree hash derive
+     if: startsWith(env.TAG, 'tree-hash-derive-v')
+     run: |
+       ./scripts/ci/publish.sh consensus/tree_hash_derive tree_hash_derive "$TAG"
+   - name: publish tree hash
+     if: startsWith(env.TAG, 'tree-hash-v')
+     run: |
+       ./scripts/ci/publish.sh consensus/tree_hash tree_hash "$TAG"
    - name: publish ssz types
      if: startsWith(env.TAG, 'eth2-ssz-types-v')
      run: |
@ -60,7 +64,3 @@ jobs:
      if: startsWith(env.TAG, 'eth2-serde-util-v')
      run: |
        ./scripts/ci/publish.sh consensus/serde_utils eth2_serde_utils "$TAG"
-   - name: publish eth2 hashing
-     if: startsWith(env.TAG, 'eth2-hashing-v')
-     run: |
-       ./scripts/ci/publish.sh crypto/eth2_hashing eth2_hashing "$TAG"

Cargo.lock (generated, 266 changed lines)

@ -17,8 +17,8 @@ dependencies = [
"eth2", "eth2",
"eth2_keystore", "eth2_keystore",
"eth2_network_config", "eth2_network_config",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"eth2_wallet", "eth2_wallet",
"eth2_wallet_manager", "eth2_wallet_manager",
"filesystem", "filesystem",
@ -454,9 +454,9 @@ dependencies = [
"eth2", "eth2",
"eth2_config", "eth2_config",
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"eth2_ssz_types 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_types",
"exit-future", "exit-future",
"fork_choice", "fork_choice",
"futures", "futures",
@ -493,7 +493,7 @@ dependencies = [
"task_executor", "task_executor",
"tempfile", "tempfile",
"tokio", "tokio",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -511,7 +511,7 @@ dependencies = [
"eth2_config", "eth2_config",
"eth2_libp2p", "eth2_libp2p",
"eth2_network_config", "eth2_network_config",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"exit-future", "exit-future",
"futures", "futures",
"genesis", "genesis",
@ -632,14 +632,14 @@ dependencies = [
"blst", "blst",
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"ethereum-types 0.11.0", "ethereum-types 0.11.0",
"hex", "hex",
"milagro_bls", "milagro_bls",
"rand 0.7.3", "rand 0.7.3",
"serde", "serde",
"serde_derive", "serde_derive",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"zeroize", "zeroize",
] ]
@ -663,7 +663,7 @@ dependencies = [
"clap", "clap",
"eth2_libp2p", "eth2_libp2p",
"eth2_network_config", "eth2_network_config",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"futures", "futures",
"hex", "hex",
"log", "log",
@ -774,14 +774,14 @@ name = "cached_tree_hash"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"eth2_ssz_types 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_types",
"ethereum-types 0.11.0", "ethereum-types 0.11.0",
"quickcheck", "quickcheck",
"quickcheck_macros", "quickcheck_macros",
"smallvec", "smallvec",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
] ]
[[package]] [[package]]
@ -880,7 +880,7 @@ dependencies = [
"clap", "clap",
"dirs", "dirs",
"eth2_network_config", "eth2_network_config",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"hex", "hex",
"types", "types",
] ]
@ -898,7 +898,7 @@ dependencies = [
"eth2", "eth2",
"eth2_config", "eth2_config",
"eth2_libp2p", "eth2_libp2p",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"futures", "futures",
"genesis", "genesis",
"http_api", "http_api",
@ -926,7 +926,7 @@ dependencies = [
"timer", "timer",
"tokio", "tokio",
"toml", "toml",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
"url", "url",
] ]
@ -1222,8 +1222,18 @@ version = "0.12.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5f2c43f534ea4b0b049015d00269734195e6d3f0f6635cb692251aca6f9f8b3c" checksum = "5f2c43f534ea4b0b049015d00269734195e6d3f0f6635cb692251aca6f9f8b3c"
dependencies = [ dependencies = [
"darling_core", "darling_core 0.12.4",
"darling_macro", "darling_macro 0.12.4",
]
[[package]]
name = "darling"
version = "0.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "757c0ded2af11d8e739c4daea1ac623dd1624b06c844cf3f5a39f1bdbd99bb12"
dependencies = [
"darling_core 0.13.0",
"darling_macro 0.13.0",
] ]
[[package]] [[package]]
@ -1240,13 +1250,38 @@ dependencies = [
"syn", "syn",
] ]
[[package]]
name = "darling_core"
version = "0.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2c34d8efb62d0c2d7f60ece80f75e5c63c1588ba68032740494b0b9a996466e3"
dependencies = [
"fnv",
"ident_case",
"proc-macro2",
"quote",
"strsim 0.10.0",
"syn",
]
[[package]] [[package]]
name = "darling_macro" name = "darling_macro"
version = "0.12.4" version = "0.12.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "29b5acf0dea37a7f66f7b25d2c5e93fd46f8f6968b1a5d7a3e02e97768afc95a" checksum = "29b5acf0dea37a7f66f7b25d2c5e93fd46f8f6968b1a5d7a3e02e97768afc95a"
dependencies = [ dependencies = [
"darling_core", "darling_core 0.12.4",
"quote",
"syn",
]
[[package]]
name = "darling_macro"
version = "0.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ade7bff147130fe5e6d39f089c6bd49ec0250f35d70b2eebf72afdfc919f15cc"
dependencies = [
"darling_core 0.13.0",
"quote", "quote",
"syn", "syn",
] ]
@ -1287,13 +1322,13 @@ checksum = "b72465f46d518f6015d9cf07f7f3013a95dd6b9c2747c3d65ae0cce43929d14f"
name = "deposit_contract" name = "deposit_contract"
version = "0.2.0" version = "0.2.0"
dependencies = [ dependencies = [
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"ethabi 12.0.0", "ethabi 12.0.0",
"hex", "hex",
"reqwest", "reqwest",
"serde_json", "serde_json",
"sha2", "sha2",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -1500,8 +1535,8 @@ dependencies = [
"compare_fields", "compare_fields",
"compare_fields_derive", "compare_fields_derive",
"derivative", "derivative",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"ethereum-types 0.11.0", "ethereum-types 0.11.0",
"fs2", "fs2",
"hex", "hex",
@ -1514,8 +1549,8 @@ dependencies = [
"snap", "snap",
"state_processing", "state_processing",
"swap_or_not_shuffle", "swap_or_not_shuffle",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"tree_hash_derive 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash_derive",
"types", "types",
] ]
@ -1645,8 +1680,8 @@ dependencies = [
"eth1_test_rig", "eth1_test_rig",
"eth2", "eth2",
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"fallback", "fallback",
"futures", "futures",
"hex", "hex",
@ -1665,7 +1700,7 @@ dependencies = [
"task_executor", "task_executor",
"tokio", "tokio",
"toml", "toml",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
"web3", "web3",
] ]
@ -1691,8 +1726,8 @@ dependencies = [
"eth2_keystore", "eth2_keystore",
"eth2_libp2p", "eth2_libp2p",
"eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"futures", "futures",
"futures-util", "futures-util",
"hex", "hex",
@ -1778,7 +1813,7 @@ dependencies = [
"aes", "aes",
"bls", "bls",
"eth2_key_derivation", "eth2_key_derivation",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"hex", "hex",
"hmac 0.11.0", "hmac 0.11.0",
"pbkdf2 0.8.0", "pbkdf2 0.8.0",
@ -1803,9 +1838,9 @@ dependencies = [
"dirs", "dirs",
"discv5", "discv5",
"error-chain", "error-chain",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"eth2_ssz_types 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_types",
"exit-future", "exit-future",
"fnv", "fnv",
"futures", "futures",
@ -1847,7 +1882,7 @@ version = "0.2.0"
dependencies = [ dependencies = [
"enr", "enr",
"eth2_config", "eth2_config",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"serde", "serde",
"serde_yaml", "serde_yaml",
"tempfile", "tempfile",
@ -1878,68 +1913,35 @@ dependencies = [
[[package]] [[package]]
name = "eth2_ssz" name = "eth2_ssz"
version = "0.3.0" version = "0.4.0"
dependencies = [
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)",
"ethereum-types 0.11.0",
"smallvec",
]
[[package]]
name = "eth2_ssz"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "76a5dc942eddedd41e4591bab17bece2b00eb9eb153b8ea683c5bba682dbd41d"
dependencies = [ dependencies = [
"eth2_ssz_derive",
"ethereum-types 0.11.0", "ethereum-types 0.11.0",
"smallvec", "smallvec",
] ]
[[package]] [[package]]
name = "eth2_ssz_derive" name = "eth2_ssz_derive"
version = "0.2.1" version = "0.3.0"
dependencies = [
"quote",
"syn",
]
[[package]]
name = "eth2_ssz_derive"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "12812b9ebe7b7246ab2ddf526cca7c6b1652b8f6a189450291eae702cf34808d"
dependencies = [ dependencies = [
"darling 0.13.0",
"proc-macro2",
"quote", "quote",
"syn", "syn",
] ]
[[package]] [[package]]
name = "eth2_ssz_types" name = "eth2_ssz_types"
version = "0.2.0" version = "0.2.1"
dependencies = [ dependencies = [
"arbitrary 0.4.7", "arbitrary 0.4.7",
"eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"serde", "serde",
"serde_derive", "serde_derive",
"serde_json", "serde_json",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"tree_hash_derive 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash_derive",
"typenum",
]
[[package]]
name = "eth2_ssz_types"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bdc06d98dfc53d15835d75e4506643b7f9c64132878a11a3269ab8549ae06e68"
dependencies = [
"arbitrary 0.4.7",
"eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
"serde",
"serde_derive",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
"typenum", "typenum",
] ]
@ -2201,14 +2203,14 @@ name = "fork_choice"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"beacon_chain", "beacon_chain",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"hex", "hex",
"proto_array", "proto_array",
"slot_clock", "slot_clock",
"state_processing", "state_processing",
"store", "store",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -2383,7 +2385,7 @@ dependencies = [
"eth1", "eth1",
"eth1_test_rig", "eth1_test_rig",
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"exit-future", "exit-future",
"futures", "futures",
"int_to_bytes", "int_to_bytes",
@ -2396,7 +2398,7 @@ dependencies = [
"slog", "slog",
"state_processing", "state_processing",
"tokio", "tokio",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -2681,7 +2683,7 @@ dependencies = [
"eth1", "eth1",
"eth2", "eth2",
"eth2_libp2p", "eth2_libp2p",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"fork_choice", "fork_choice",
"futures", "futures",
"hex", "hex",
@ -2699,7 +2701,7 @@ dependencies = [
"tokio", "tokio",
"tokio-stream", "tokio-stream",
"tokio-util", "tokio-util",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
"warp", "warp",
"warp_utils", "warp_utils",
@ -3063,7 +3065,7 @@ dependencies = [
"eth2_keystore", "eth2_keystore",
"eth2_libp2p", "eth2_libp2p",
"eth2_network_config", "eth2_network_config",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_wallet", "eth2_wallet",
"futures", "futures",
"genesis", "genesis",
@ -3078,7 +3080,7 @@ dependencies = [
"serde_yaml", "serde_yaml",
"state_processing", "state_processing",
"tokio", "tokio",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
"validator_dir", "validator_dir",
"web3", "web3",
@ -4116,8 +4118,8 @@ dependencies = [
"environment", "environment",
"error-chain", "error-chain",
"eth2_libp2p", "eth2_libp2p",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_types 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_types",
"exit-future", "exit-future",
"fnv", "fnv",
"futures", "futures",
@ -4149,7 +4151,7 @@ dependencies = [
"tokio", "tokio",
"tokio-stream", "tokio-stream",
"tokio-util", "tokio-util",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -4369,8 +4371,8 @@ version = "0.2.0"
dependencies = [ dependencies = [
"beacon_chain", "beacon_chain",
"derivative", "derivative",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"int_to_bytes", "int_to_bytes",
"itertools", "itertools",
"lazy_static", "lazy_static",
@ -4823,8 +4825,8 @@ dependencies = [
name = "proto_array" name = "proto_array"
version = "0.2.0" version = "0.2.0"
dependencies = [ dependencies = [
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"serde", "serde",
"serde_derive", "serde_derive",
"serde_yaml", "serde_yaml",
@ -5651,8 +5653,8 @@ version = "0.1.0"
dependencies = [ dependencies = [
"bincode", "bincode",
"byteorder", "byteorder",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"filesystem", "filesystem",
"flate2", "flate2",
"lazy_static", "lazy_static",
@ -5669,8 +5671,8 @@ dependencies = [
"slog", "slog",
"sloggers", "sloggers",
"tempfile", "tempfile",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"tree_hash_derive 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash_derive",
"types", "types",
] ]
@ -5708,7 +5710,7 @@ dependencies = [
"serde_derive", "serde_derive",
"serde_json", "serde_json",
"tempfile", "tempfile",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -5933,8 +5935,8 @@ dependencies = [
"bls", "bls",
"env_logger 0.9.0", "env_logger 0.9.0",
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_types 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_types",
"int_to_bytes", "int_to_bytes",
"integer-sqrt", "integer-sqrt",
"itertools", "itertools",
@ -5948,8 +5950,8 @@ dependencies = [
"serde_derive", "serde_derive",
"serde_yaml", "serde_yaml",
"smallvec", "smallvec",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"tree_hash_derive 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash_derive",
"types", "types",
] ]
@ -5958,7 +5960,7 @@ name = "state_transition_vectors"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"beacon_chain", "beacon_chain",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"lazy_static", "lazy_static",
"state_processing", "state_processing",
"types", "types",
@ -6026,8 +6028,8 @@ dependencies = [
"beacon_chain", "beacon_chain",
"db-key", "db-key",
"directory", "directory",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"itertools", "itertools",
"lazy_static", "lazy_static",
"leveldb", "leveldb",
@ -6040,7 +6042,7 @@ dependencies = [
"sloggers", "sloggers",
"state_processing", "state_processing",
"tempfile", "tempfile",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]
@ -6089,7 +6091,7 @@ version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8bf7f6700d7c135cf4e4900c2cfba9a12ecad1fdc45594aad48f6b344b2589a0" checksum = "8bf7f6700d7c135cf4e4900c2cfba9a12ecad1fdc45594aad48f6b344b2589a0"
dependencies = [ dependencies = [
"darling", "darling 0.12.4",
"itertools", "itertools",
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -6591,43 +6593,25 @@ dependencies = [
[[package]] [[package]]
name = "tree_hash" name = "tree_hash"
version = "0.3.0" version = "0.4.0"
dependencies = [ dependencies = [
"beacon_chain", "beacon_chain",
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz",
"eth2_ssz_derive",
"ethereum-types 0.11.0", "ethereum-types 0.11.0",
"lazy_static", "lazy_static",
"rand 0.7.3", "rand 0.7.3",
"smallvec", "smallvec",
"tree_hash_derive 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash_derive",
"types", "types",
] ]
[[package]]
name = "tree_hash"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0092991b5664c725f0fbf30ed7eba2163e36cb22a789e1e371e9575eaff580e0"
dependencies = [
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"ethereum-types 0.11.0",
"smallvec",
]
[[package]] [[package]]
name = "tree_hash_derive" name = "tree_hash_derive"
version = "0.3.1" version = "0.4.0"
dependencies = [
"quote",
"syn",
]
[[package]]
name = "tree_hash_derive"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cbadff2b79dbe7e28bf382fc51c511552d3b8054b200a2e8cd973f61b3ae7603"
dependencies = [ dependencies = [
"darling 0.13.0",
"quote", "quote",
"syn", "syn",
] ]
@ -6732,9 +6716,9 @@ dependencies = [
"eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_hashing 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_interop_keypairs", "eth2_interop_keypairs",
"eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"eth2_ssz_types 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_types",
"ethereum-types 0.11.0", "ethereum-types 0.11.0",
"hex", "hex",
"int_to_bytes", "int_to_bytes",
@ -6759,8 +6743,8 @@ dependencies = [
"swap_or_not_shuffle", "swap_or_not_shuffle",
"tempfile", "tempfile",
"test_random_derive", "test_random_derive",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"tree_hash_derive 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash_derive",
] ]
[[package]] [[package]]
@ -6926,8 +6910,8 @@ dependencies = [
"eth2_interop_keypairs", "eth2_interop_keypairs",
"eth2_keystore", "eth2_keystore",
"eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_serde_utils 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"eth2_ssz 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz",
"eth2_ssz_derive 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "eth2_ssz_derive",
"exit-future", "exit-future",
"fallback", "fallback",
"filesystem", "filesystem",
@ -6963,7 +6947,7 @@ dependencies = [
"task_executor", "task_executor",
"tempfile", "tempfile",
"tokio", "tokio",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
"url", "url",
"validator_dir", "validator_dir",
@ -6986,7 +6970,7 @@ dependencies = [
"rayon", "rayon",
"slog", "slog",
"tempfile", "tempfile",
"tree_hash 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", "tree_hash",
"types", "types",
] ]


@ -81,3 +81,11 @@ members = [
"validator_client", "validator_client",
"validator_client/slashing_protection", "validator_client/slashing_protection",
] ]
[patch]
[patch.crates-io]
eth2_ssz = { path = "consensus/ssz" }
eth2_ssz_types = { path = "consensus/ssz_types" }
eth2_ssz_derive = { path = "consensus/ssz_derive" }
tree_hash = { path = "consensus/tree_hash" }
tree_hash_derive = { path = "consensus/tree_hash_derive" }


@ -15,8 +15,8 @@ dirs = "3.0.1"
environment = { path = "../lighthouse/environment" } environment = { path = "../lighthouse/environment" }
deposit_contract = { path = "../common/deposit_contract" } deposit_contract = { path = "../common/deposit_contract" }
libc = "0.2.79" libc = "0.2.79"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
hex = "0.4.2" hex = "0.4.2"
rayon = "1.4.1" rayon = "1.4.1"
eth2_network_config = { path = "../common/eth2_network_config" } eth2_network_config = { path = "../common/eth2_network_config" }


@ -36,7 +36,7 @@ task_executor = { path = "../common/task_executor" }
genesis = { path = "genesis" } genesis = { path = "genesis" }
eth2_network_config = { path = "../common/eth2_network_config" } eth2_network_config = { path = "../common/eth2_network_config" }
eth2_libp2p = { path = "./eth2_libp2p" } eth2_libp2p = { path = "./eth2_libp2p" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
serde = "1.0.116" serde = "1.0.116"
clap_utils = { path = "../common/clap_utils" } clap_utils = { path = "../common/clap_utils" }
hyper = "0.14.4" hyper = "0.14.4"


@ -34,11 +34,11 @@ slog = { version = "2.5.2", features = ["max_level_trace"] }
sloggers = "2.0.2" sloggers = "2.0.2"
slot_clock = { path = "../../common/slot_clock" } slot_clock = { path = "../../common/slot_clock" }
eth2_hashing = "0.2.0" eth2_hashing = "0.2.0"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_types = "0.2.0" eth2_ssz_types = "0.2.1"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
state_processing = { path = "../../consensus/state_processing" } state_processing = { path = "../../consensus/state_processing" }
tree_hash = "0.3.0" tree_hash = "0.4.0"
types = { path = "../../consensus/types" } types = { path = "../../consensus/types" }
tokio = "1.10.0" tokio = "1.10.0"
eth1 = { path = "../eth1" } eth1 = { path = "../eth1" }


@ -17,7 +17,7 @@ eth2_libp2p = { path = "../eth2_libp2p" }
parking_lot = "0.11.0" parking_lot = "0.11.0"
prometheus = "0.11.0" prometheus = "0.11.0"
types = { path = "../../consensus/types" } types = { path = "../../consensus/types" }
tree_hash = "0.3.0" tree_hash = "0.4.0"
eth2_config = { path = "../../common/eth2_config" } eth2_config = { path = "../../common/eth2_config" }
slot_clock = { path = "../../common/slot_clock" } slot_clock = { path = "../../common/slot_clock" }
serde = "1.0.116" serde = "1.0.116"
@ -37,7 +37,7 @@ sensitive_url = { path = "../../common/sensitive_url" }
genesis = { path = "../genesis" } genesis = { path = "../genesis" }
task_executor = { path = "../../common/task_executor" } task_executor = { path = "../../common/task_executor" }
environment = { path = "../../lighthouse/environment" } environment = { path = "../../lighthouse/environment" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
lazy_static = "1.4.0" lazy_static = "1.4.0"
lighthouse_metrics = { path = "../../common/lighthouse_metrics" } lighthouse_metrics = { path = "../../common/lighthouse_metrics" }
time = "0.2.22" time = "0.2.22"


@ -19,9 +19,9 @@ serde = { version = "1.0.116", features = ["derive"] }
hex = "0.4.2" hex = "0.4.2"
types = { path = "../../consensus/types"} types = { path = "../../consensus/types"}
merkle_proof = { path = "../../consensus/merkle_proof"} merkle_proof = { path = "../../consensus/merkle_proof"}
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
tree_hash = "0.3.0" tree_hash = "0.4.0"
eth2_hashing = "0.2.0" eth2_hashing = "0.2.0"
parking_lot = "0.11.0" parking_lot = "0.11.0"
slog = "2.5.2" slog = "2.5.2"


@ -5,11 +5,16 @@ use crate::{
service::EndpointsCache, service::EndpointsCache,
}; };
use parking_lot::RwLock; use parking_lot::RwLock;
use ssz::four_byte_option_impl;
use ssz::{Decode, Encode}; use ssz::{Decode, Encode};
use ssz_derive::{Decode, Encode}; use ssz_derive::{Decode, Encode};
use std::sync::Arc; use std::sync::Arc;
use types::ChainSpec; use types::ChainSpec;
// Define "legacy" implementations of `Option<u64>` which use four bytes for encoding the union
// selector.
four_byte_option_impl!(four_byte_option_u64, u64);
#[derive(Default)] #[derive(Default)]
pub struct DepositUpdater { pub struct DepositUpdater {
pub cache: DepositCache, pub cache: DepositCache,
@ -69,6 +74,7 @@ impl Inner {
pub struct SszEth1Cache { pub struct SszEth1Cache {
block_cache: BlockCache, block_cache: BlockCache,
deposit_cache: SszDepositCache, deposit_cache: SszDepositCache,
#[ssz(with = "four_byte_option_u64")]
last_processed_block: Option<u64>, last_processed_block: Option<u64>,
} }


@ -10,11 +10,11 @@ discv5 = { git = "https://github.com/sigp/discv5", rev="10247bbd299227fef20233f2
unsigned-varint = { version = "0.6.0", features = ["codec"] } unsigned-varint = { version = "0.6.0", features = ["codec"] }
types = { path = "../../consensus/types" } types = { path = "../../consensus/types" }
hashset_delay = { path = "../../common/hashset_delay" } hashset_delay = { path = "../../common/hashset_delay" }
eth2_ssz_types = "0.2.0" eth2_ssz_types = "0.2.1"
serde = { version = "1.0.116", features = ["derive"] } serde = { version = "1.0.116", features = ["derive"] }
serde_derive = "1.0.116" serde_derive = "1.0.116"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
slog = { version = "2.5.2", features = ["max_level_trace"] } slog = { version = "2.5.2", features = ["max_level_trace"] }
lighthouse_version = { path = "../../common/lighthouse_version" } lighthouse_version = { path = "../../common/lighthouse_version" }
tokio = { version = "1.10.0", features = ["time", "macros"] } tokio = { version = "1.10.0", features = ["time", "macros"] }


@ -103,6 +103,7 @@ pub struct Ping {
)] )]
#[derive(Clone, Debug, PartialEq, Serialize, Encode)] #[derive(Clone, Debug, PartialEq, Serialize, Encode)]
#[serde(bound = "T: EthSpec")] #[serde(bound = "T: EthSpec")]
#[ssz(enum_behaviour = "transparent")]
pub struct MetaData<T: EthSpec> { pub struct MetaData<T: EthSpec> {
/// A sequential counter indicating when data gets modified. /// A sequential counter indicating when data gets modified.
pub seq_number: u64, pub seq_number: u64,


@ -16,9 +16,9 @@ eth1 = { path = "../eth1"}
rayon = "1.4.1" rayon = "1.4.1"
state_processing = { path = "../../consensus/state_processing" } state_processing = { path = "../../consensus/state_processing" }
merkle_proof = { path = "../../consensus/merkle_proof" } merkle_proof = { path = "../../consensus/merkle_proof" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_hashing = "0.2.0" eth2_hashing = "0.2.0"
tree_hash = "0.3.0" tree_hash = "0.4.0"
tokio = { version = "1.10.0", features = ["full"] } tokio = { version = "1.10.0", features = ["full"] }
parking_lot = "0.11.0" parking_lot = "0.11.0"
slog = "2.5.2" slog = "2.5.2"


@ -27,14 +27,14 @@ lighthouse_metrics = { path = "../../common/lighthouse_metrics" }
lazy_static = "1.4.0" lazy_static = "1.4.0"
warp_utils = { path = "../../common/warp_utils" } warp_utils = { path = "../../common/warp_utils" }
slot_clock = { path = "../../common/slot_clock" } slot_clock = { path = "../../common/slot_clock" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
bs58 = "0.4.0" bs58 = "0.4.0"
futures = "0.3.8" futures = "0.3.8"
[dev-dependencies] [dev-dependencies]
store = { path = "../store" } store = { path = "../store" }
environment = { path = "../../lighthouse/environment" } environment = { path = "../../lighthouse/environment" }
tree_hash = "0.3.0" tree_hash = "0.4.0"
sensitive_url = { path = "../../common/sensitive_url" } sensitive_url = { path = "../../common/sensitive_url" }
[[test]] [[test]]


@ -25,9 +25,9 @@ state_processing = { path = "../../consensus/state_processing" }
slot_clock = { path = "../../common/slot_clock" } slot_clock = { path = "../../common/slot_clock" }
slog = { version = "2.5.2", features = ["max_level_trace"] } slog = { version = "2.5.2", features = ["max_level_trace"] }
hex = "0.4.2" hex = "0.4.2"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_types = "0.2.0" eth2_ssz_types = "0.2.1"
tree_hash = "0.3.0" tree_hash = "0.4.0"
futures = "0.3.7" futures = "0.3.7"
error-chain = "0.12.4" error-chain = "0.12.4"
tokio = { version = "1.10.0", features = ["full"] } tokio = { version = "1.10.0", features = ["full"] }


@ -13,8 +13,8 @@ lighthouse_metrics = { path = "../../common/lighthouse_metrics" }
parking_lot = "0.11.0" parking_lot = "0.11.0"
types = { path = "../../consensus/types" } types = { path = "../../consensus/types" }
state_processing = { path = "../../consensus/state_processing" } state_processing = { path = "../../consensus/state_processing" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
rayon = "1.5.0" rayon = "1.5.0"
serde = "1.0.116" serde = "1.0.116"
serde_derive = "1.0.116" serde_derive = "1.0.116"


@ -28,6 +28,7 @@ type PersistedSyncContributions<T> = Vec<(SyncAggregateId, Vec<SyncCommitteeCont
#[derive(PartialEq, Debug, Serialize, Deserialize, Encode)] #[derive(PartialEq, Debug, Serialize, Deserialize, Encode)]
#[serde(untagged)] #[serde(untagged)]
#[serde(bound = "T: EthSpec")] #[serde(bound = "T: EthSpec")]
#[ssz(enum_behaviour = "transparent")]
pub struct PersistedOperationPool<T: EthSpec> { pub struct PersistedOperationPool<T: EthSpec> {
/// Mapping from attestation ID to attestation mappings. /// Mapping from attestation ID to attestation mappings.
// We could save space by not storing the attestation ID, but it might // We could save space by not storing the attestation ID, but it might


@ -13,9 +13,9 @@ db-key = "0.0.5"
leveldb = { version = "0.8.6", default-features = false } leveldb = { version = "0.8.6", default-features = false }
parking_lot = "0.11.0" parking_lot = "0.11.0"
itertools = "0.10.0" itertools = "0.10.0"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
tree_hash = "0.3.0" tree_hash = "0.4.0"
types = { path = "../../consensus/types" } types = { path = "../../consensus/types" }
state_processing = { path = "../../consensus/state_processing" } state_processing = { path = "../../consensus/state_processing" }
slog = "2.5.2" slog = "2.5.2"


@ -15,9 +15,10 @@ use types::*;
/// Utilises lazy-loading from separate storage for its vector fields. /// Utilises lazy-loading from separate storage for its vector fields.
#[superstruct( #[superstruct(
variants(Base, Altair), variants(Base, Altair),
variant_attributes(derive(Debug, PartialEq, Clone, Encode, Decode)) variant_attributes(derive(Debug, PartialEq, Clone, Encode, Decode),)
)] )]
#[derive(Debug, PartialEq, Clone, Encode)] #[derive(Debug, PartialEq, Clone, Encode)]
#[ssz(enum_behaviour = "transparent")]
pub struct PartialBeaconState<T> pub struct PartialBeaconState<T>
where where
T: EthSpec, T: EthSpec,
@ -32,15 +33,12 @@ where
// History // History
pub latest_block_header: BeaconBlockHeader, pub latest_block_header: BeaconBlockHeader,
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
pub block_roots: Option<FixedVector<Hash256, T::SlotsPerHistoricalRoot>>, pub block_roots: Option<FixedVector<Hash256, T::SlotsPerHistoricalRoot>>,
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
pub state_roots: Option<FixedVector<Hash256, T::SlotsPerHistoricalRoot>>, pub state_roots: Option<FixedVector<Hash256, T::SlotsPerHistoricalRoot>>,
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
pub historical_roots: Option<VariableList<Hash256, T::HistoricalRootsLimit>>, pub historical_roots: Option<VariableList<Hash256, T::HistoricalRootsLimit>>,
// Ethereum 1.0 chain data // Ethereum 1.0 chain data
@ -55,8 +53,7 @@ where
// Shuffling // Shuffling
/// Randao value from the current slot, for patching into the per-epoch randao vector. /// Randao value from the current slot, for patching into the per-epoch randao vector.
pub latest_randao_value: Hash256, pub latest_randao_value: Hash256,
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
pub randao_mixes: Option<FixedVector<Hash256, T::EpochsPerHistoricalVector>>, pub randao_mixes: Option<FixedVector<Hash256, T::EpochsPerHistoricalVector>>,
// Slashings // Slashings


@ -10,7 +10,7 @@ clap = "2.33.3"
eth2_libp2p = { path = "../beacon_node/eth2_libp2p" } eth2_libp2p = { path = "../beacon_node/eth2_libp2p" }
types = { path = "../consensus/types" } types = { path = "../consensus/types" }
eth2_network_config = { path = "../common/eth2_network_config" } eth2_network_config = { path = "../common/eth2_network_config" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
slog = "2.5.2" slog = "2.5.2"
sloggers = "2.0.2" sloggers = "2.0.2"
tokio = "1.10.0" tokio = "1.10.0"


@ -12,4 +12,4 @@ hex = "0.4.2"
dirs = "3.0.1" dirs = "3.0.1"
types = { path = "../../consensus/types" } types = { path = "../../consensus/types" }
eth2_network_config = { path = "../eth2_network_config" } eth2_network_config = { path = "../eth2_network_config" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"


@ -14,6 +14,6 @@ hex = "0.4.2"
[dependencies] [dependencies]
types = { path = "../../consensus/types"} types = { path = "../../consensus/types"}
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
tree_hash = "0.3.0" tree_hash = "0.4.0"
ethabi = "12.0.0" ethabi = "12.0.0"


@ -22,8 +22,8 @@ ring = "0.16.19"
bytes = "1.0.1" bytes = "1.0.1"
account_utils = { path = "../../common/account_utils" } account_utils = { path = "../../common/account_utils" }
sensitive_url = { path = "../../common/sensitive_url" } sensitive_url = { path = "../../common/sensitive_url" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
futures-util = "0.3.8" futures-util = "0.3.8"
futures = "0.3.8" futures = "0.3.8"
store = { path = "../../beacon_node/store", optional = true } store = { path = "../../beacon_node/store", optional = true }


@ -8,11 +8,17 @@ use crate::{
use proto_array::core::ProtoArray; use proto_array::core::ProtoArray;
use reqwest::IntoUrl; use reqwest::IntoUrl;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use ssz::four_byte_option_impl;
use ssz_derive::{Decode, Encode}; use ssz_derive::{Decode, Encode};
use store::{AnchorInfo, Split}; use store::{AnchorInfo, Split};
pub use eth2_libp2p::{types::SyncState, PeerInfo}; pub use eth2_libp2p::{types::SyncState, PeerInfo};
// Define "legacy" implementations of `Option<T>` which use four bytes for encoding the union
// selector.
four_byte_option_impl!(four_byte_option_u64, u64);
four_byte_option_impl!(four_byte_option_hash256, Hash256);
/// Information returned by `peers` and `connected_peers`. /// Information returned by `peers` and `connected_peers`.
// TODO: this should be deserializable.. // TODO: this should be deserializable..
#[derive(Debug, Clone, Serialize)] #[derive(Debug, Clone, Serialize)]
@ -298,7 +304,9 @@ pub struct Eth1Block {
pub hash: Hash256, pub hash: Hash256,
pub timestamp: u64, pub timestamp: u64,
pub number: u64, pub number: u64,
#[ssz(with = "four_byte_option_hash256")]
pub deposit_root: Option<Hash256>, pub deposit_root: Option<Hash256>,
#[ssz(with = "four_byte_option_u64")]
pub deposit_count: Option<u64>, pub deposit_count: Option<u64>,
} }


@ -17,6 +17,6 @@ tempfile = "3.1.0"
serde = "1.0.116" serde = "1.0.116"
serde_yaml = "0.8.13" serde_yaml = "0.8.13"
types = { path = "../../consensus/types"} types = { path = "../../consensus/types"}
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_config = { path = "../eth2_config"} eth2_config = { path = "../eth2_config"}
enr = { version = "0.5.1", features = ["ed25519", "k256"] } enr = { version = "0.5.1", features = ["ed25519", "k256"] }


@ -250,6 +250,12 @@ mod tests {
assert_eq!(spec, config.chain_spec::<E>().unwrap()); assert_eq!(spec, config.chain_spec::<E>().unwrap());
} }
#[test]
fn mainnet_genesis_state() {
let config = Eth2NetworkConfig::from_hardcoded_net(&MAINNET).unwrap();
config.beacon_state::<E>().expect("beacon state can decode");
}
#[test] #[test]
fn hard_coded_nets_work() { fn hard_coded_nets_work() {
for net in HARDCODED_NETS { for net in HARDCODED_NETS {


@ -17,7 +17,7 @@ types = { path = "../../consensus/types" }
rand = "0.7.3" rand = "0.7.3"
deposit_contract = { path = "../deposit_contract" } deposit_contract = { path = "../deposit_contract" }
rayon = "1.4.1" rayon = "1.4.1"
tree_hash = "0.3.0" tree_hash = "0.4.0"
slog = { version = "2.5.2", features = ["max_level_trace", "release_max_level_trace"] } slog = { version = "2.5.2", features = ["max_level_trace", "release_max_level_trace"] }
hex = "0.4.2" hex = "0.4.2"
derivative = "2.1.1" derivative = "2.1.1"


@ -6,11 +6,11 @@ edition = "2018"
[dependencies] [dependencies]
ethereum-types = "0.11.0" ethereum-types = "0.11.0"
eth2_ssz_types = "0.2.0" eth2_ssz_types = "0.2.1"
eth2_hashing = "0.2.0" eth2_hashing = "0.2.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
tree_hash = "0.3.0" tree_hash = "0.4.0"
smallvec = "1.6.1" smallvec = "1.6.1"
[dev-dependencies] [dev-dependencies]


@ -202,8 +202,7 @@ impl<T: Encode + Decode> CacheArena<T> {
#[derive(Debug, PartialEq, Clone, Default, Encode, Decode)] #[derive(Debug, PartialEq, Clone, Default, Encode, Decode)]
pub struct CacheArenaAllocation<T> { pub struct CacheArenaAllocation<T> {
alloc_id: usize, alloc_id: usize,
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
_phantom: PhantomData<T>, _phantom: PhantomData<T>,
} }


@ -9,13 +9,13 @@ edition = "2018"
[dependencies] [dependencies]
types = { path = "../types" } types = { path = "../types" }
proto_array = { path = "../proto_array" } proto_array = { path = "../proto_array" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
[dev-dependencies] [dev-dependencies]
state_processing = { path = "../../consensus/state_processing" } state_processing = { path = "../../consensus/state_processing" }
beacon_chain = { path = "../../beacon_node/beacon_chain" } beacon_chain = { path = "../../beacon_node/beacon_chain" }
store = { path = "../../beacon_node/store" } store = { path = "../../beacon_node/store" }
tree_hash = "0.3.0" tree_hash = "0.4.0"
slot_clock = { path = "../../common/slot_clock" } slot_clock = { path = "../../common/slot_clock" }
hex = "0.4.2" hex = "0.4.2"


@ -10,8 +10,8 @@ path = "src/bin.rs"
[dependencies] [dependencies]
types = { path = "../types" } types = { path = "../types" }
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
serde = "1.0.116" serde = "1.0.116"
serde_derive = "1.0.116" serde_derive = "1.0.116"
serde_yaml = "0.8.13" serde_yaml = "0.8.13"


@ -1,9 +1,14 @@
use crate::{error::Error, Block}; use crate::{error::Error, Block};
use serde_derive::{Deserialize, Serialize}; use serde_derive::{Deserialize, Serialize};
use ssz::four_byte_option_impl;
use ssz_derive::{Decode, Encode}; use ssz_derive::{Decode, Encode};
use std::collections::HashMap; use std::collections::HashMap;
use types::{AttestationShufflingId, Epoch, Hash256, Slot}; use types::{AttestationShufflingId, Epoch, Hash256, Slot};
// Define a "legacy" implementation of `Option<usize>` which uses four bytes for encoding the union
// selector.
four_byte_option_impl!(four_byte_option_usize, usize);
#[derive(Clone, PartialEq, Debug, Encode, Decode, Serialize, Deserialize)] #[derive(Clone, PartialEq, Debug, Encode, Decode, Serialize, Deserialize)]
pub struct ProtoNode { pub struct ProtoNode {
/// The `slot` is not necessary for `ProtoArray`, it just exists so external components can /// The `slot` is not necessary for `ProtoArray`, it just exists so external components can
@ -21,11 +26,14 @@ pub struct ProtoNode {
pub current_epoch_shuffling_id: AttestationShufflingId, pub current_epoch_shuffling_id: AttestationShufflingId,
pub next_epoch_shuffling_id: AttestationShufflingId, pub next_epoch_shuffling_id: AttestationShufflingId,
pub root: Hash256, pub root: Hash256,
#[ssz(with = "four_byte_option_usize")]
pub parent: Option<usize>, pub parent: Option<usize>,
pub justified_epoch: Epoch, pub justified_epoch: Epoch,
pub finalized_epoch: Epoch, pub finalized_epoch: Epoch,
weight: u64, weight: u64,
#[ssz(with = "four_byte_option_usize")]
best_child: Option<usize>, best_child: Option<usize>,
#[ssz(with = "four_byte_option_usize")]
best_descendant: Option<usize>, best_descendant: Option<usize>,
} }


@ -1,6 +1,6 @@
[package] [package]
name = "eth2_ssz" name = "eth2_ssz"
version = "0.3.0" version = "0.4.0"
authors = ["Paul Hauner <paul@sigmaprime.io>"] authors = ["Paul Hauner <paul@sigmaprime.io>"]
edition = "2018" edition = "2018"
description = "SimpleSerialize (SSZ) as used in Ethereum 2.0" description = "SimpleSerialize (SSZ) as used in Ethereum 2.0"
@ -10,7 +10,7 @@ license = "Apache-2.0"
name = "ssz" name = "ssz"
[dev-dependencies] [dev-dependencies]
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
[dependencies] [dependencies]
ethereum-types = "0.11.0" ethereum-types = "0.11.0"


@ -48,6 +48,8 @@ pub enum DecodeError {
ZeroLengthItem, ZeroLengthItem,
/// The given bytes were invalid for some application-level reason. /// The given bytes were invalid for some application-level reason.
BytesInvalid(String), BytesInvalid(String),
/// The given union selector is out of bounds.
UnionSelectorInvalid(u8),
} }
/// Performs checks on the `offset` based upon the other parameters provided. /// Performs checks on the `offset` based upon the other parameters provided.
@ -172,9 +174,18 @@ impl<'a> SszDecoderBuilder<'a> {
/// Declares that some type `T` is the next item in `bytes`. /// Declares that some type `T` is the next item in `bytes`.
pub fn register_type<T: Decode>(&mut self) -> Result<(), DecodeError> { pub fn register_type<T: Decode>(&mut self) -> Result<(), DecodeError> {
if T::is_ssz_fixed_len() { self.register_type_parameterized(T::is_ssz_fixed_len(), T::ssz_fixed_len())
}
/// Declares that a type with the given parameters is the next item in `bytes`.
pub fn register_type_parameterized(
&mut self,
is_ssz_fixed_len: bool,
ssz_fixed_len: usize,
) -> Result<(), DecodeError> {
if is_ssz_fixed_len {
let start = self.items_index; let start = self.items_index;
self.items_index += T::ssz_fixed_len(); self.items_index += ssz_fixed_len;
let slice = self.bytes.get(start..self.items_index).ok_or_else(|| { let slice = self.bytes.get(start..self.items_index).ok_or_else(|| {
DecodeError::InvalidByteLength { DecodeError::InvalidByteLength {
@ -300,7 +311,7 @@ impl<'a> SszDecoder<'a> {
/// ///
/// Panics when attempting to decode more items than actually exist. /// Panics when attempting to decode more items than actually exist.
pub fn decode_next<T: Decode>(&mut self) -> Result<T, DecodeError> { pub fn decode_next<T: Decode>(&mut self) -> Result<T, DecodeError> {
T::from_ssz_bytes(self.items.remove(0)) self.decode_next_with(|slice| T::from_ssz_bytes(slice))
} }
/// Decodes the next item using the provided function. /// Decodes the next item using the provided function.
@ -312,15 +323,30 @@ impl<'a> SszDecoder<'a> {
} }
} }
/// Reads a `BYTES_PER_LENGTH_OFFSET`-byte union index from `bytes`, where `bytes.len() >= /// Takes `bytes`, assuming it is the encoding for a SSZ union, and returns the union-selector and
/// BYTES_PER_LENGTH_OFFSET`. /// the body (trailing bytes).
pub fn read_union_index(bytes: &[u8]) -> Result<usize, DecodeError> { ///
read_offset(bytes) /// ## Errors
///
/// Returns an error if:
///
/// - `bytes` is empty.
/// - the union selector is not a valid value (i.e., larger than the maximum number of variants.
pub fn split_union_bytes(bytes: &[u8]) -> Result<(UnionSelector, &[u8]), DecodeError> {
let selector = bytes
.first()
.copied()
.ok_or(DecodeError::OutOfBoundsByte { i: 0 })
.and_then(UnionSelector::new)?;
let body = bytes
.get(1..)
.ok_or(DecodeError::OutOfBoundsByte { i: 1 })?;
Ok((selector, body))
} }
/// Reads a `BYTES_PER_LENGTH_OFFSET`-byte length from `bytes`, where `bytes.len() >= /// Reads a `BYTES_PER_LENGTH_OFFSET`-byte length from `bytes`, where `bytes.len() >=
/// BYTES_PER_LENGTH_OFFSET`. /// BYTES_PER_LENGTH_OFFSET`.
fn read_offset(bytes: &[u8]) -> Result<usize, DecodeError> { pub fn read_offset(bytes: &[u8]) -> Result<usize, DecodeError> {
decode_offset(bytes.get(0..BYTES_PER_LENGTH_OFFSET).ok_or_else(|| { decode_offset(bytes.get(0..BYTES_PER_LENGTH_OFFSET).ok_or_else(|| {
DecodeError::InvalidLengthPrefix { DecodeError::InvalidLengthPrefix {
len: bytes.len(), len: bytes.len(),


@ -242,36 +242,6 @@ impl Decode for NonZeroUsize {
} }
} }
/// The SSZ union type.
impl<T: Decode> Decode for Option<T> {
fn is_ssz_fixed_len() -> bool {
false
}
fn from_ssz_bytes(bytes: &[u8]) -> Result<Self, DecodeError> {
if bytes.len() < BYTES_PER_LENGTH_OFFSET {
return Err(DecodeError::InvalidByteLength {
len: bytes.len(),
expected: BYTES_PER_LENGTH_OFFSET,
});
}
let (index_bytes, value_bytes) = bytes.split_at(BYTES_PER_LENGTH_OFFSET);
let index = read_union_index(index_bytes)?;
if index == 0 {
Ok(None)
} else if index == 1 {
Ok(Some(T::from_ssz_bytes(value_bytes)?))
} else {
Err(DecodeError::BytesInvalid(format!(
"{} is not a valid union index for Option<T>",
index
)))
}
}
}
impl<T: Decode> Decode for Arc<T> { impl<T: Decode> Decode for Arc<T> {
fn is_ssz_fixed_len() -> bool { fn is_ssz_fixed_len() -> bool {
T::is_ssz_fixed_len() T::is_ssz_fixed_len()


@ -104,13 +104,21 @@ impl<'a> SszEncoder<'a> {
/// Append some `item` to the SSZ bytes. /// Append some `item` to the SSZ bytes.
pub fn append<T: Encode>(&mut self, item: &T) { pub fn append<T: Encode>(&mut self, item: &T) {
if T::is_ssz_fixed_len() { self.append_parameterized(T::is_ssz_fixed_len(), |buf| item.ssz_append(buf))
item.ssz_append(&mut self.buf); }
/// Uses `ssz_append` to append the encoding of some item to the SSZ bytes.
pub fn append_parameterized<F>(&mut self, is_ssz_fixed_len: bool, ssz_append: F)
where
F: Fn(&mut Vec<u8>),
{
if is_ssz_fixed_len {
ssz_append(&mut self.buf);
} else { } else {
self.buf self.buf
.extend_from_slice(&encode_length(self.offset + self.variable_bytes.len())); .extend_from_slice(&encode_length(self.offset + self.variable_bytes.len()));
item.ssz_append(&mut self.variable_bytes); ssz_append(&mut self.variable_bytes);
} }
} }
@ -125,13 +133,6 @@ impl<'a> SszEncoder<'a> {
} }
} }
/// Encode `index` as a little-endian byte array of `BYTES_PER_LENGTH_OFFSET` length.
///
/// If `len` is larger than `2 ^ BYTES_PER_LENGTH_OFFSET`, a `debug_assert` is raised.
pub fn encode_union_index(index: usize) -> [u8; BYTES_PER_LENGTH_OFFSET] {
encode_length(index)
}
/// Encode `len` as a little-endian byte array of `BYTES_PER_LENGTH_OFFSET` length. /// Encode `len` as a little-endian byte array of `BYTES_PER_LENGTH_OFFSET` length.
/// ///
/// If `len` is larger than `2 ^ BYTES_PER_LENGTH_OFFSET`, a `debug_assert` is raised. /// If `len` is larger than `2 ^ BYTES_PER_LENGTH_OFFSET`, a `debug_assert` is raised.


@ -202,36 +202,6 @@ impl_encode_for_tuples! {
} }
} }
/// The SSZ "union" type.
impl<T: Encode> Encode for Option<T> {
fn is_ssz_fixed_len() -> bool {
false
}
fn ssz_bytes_len(&self) -> usize {
if let Some(some) = self {
let len = if <T as Encode>::is_ssz_fixed_len() {
<T as Encode>::ssz_fixed_len()
} else {
some.ssz_bytes_len()
};
len + BYTES_PER_LENGTH_OFFSET
} else {
BYTES_PER_LENGTH_OFFSET
}
}
fn ssz_append(&self, buf: &mut Vec<u8>) {
match self {
None => buf.extend_from_slice(&encode_union_index(0)),
Some(t) => {
buf.extend_from_slice(&encode_union_index(1));
t.ssz_append(buf);
}
}
}
}
impl<T: Encode> Encode for Arc<T> { impl<T: Encode> Encode for Arc<T> {
fn is_ssz_fixed_len() -> bool { fn is_ssz_fixed_len() -> bool {
T::is_ssz_fixed_len() T::is_ssz_fixed_len()
@ -456,25 +426,6 @@ mod tests {
); );
} }
#[test]
fn ssz_encode_option_u16() {
assert_eq!(Some(65535_u16).as_ssz_bytes(), vec![1, 0, 0, 0, 255, 255]);
let none: Option<u16> = None;
assert_eq!(none.as_ssz_bytes(), vec![0, 0, 0, 0]);
}
#[test]
fn ssz_encode_option_vec_u16() {
assert_eq!(
Some(vec![0_u16, 1]).as_ssz_bytes(),
vec![1, 0, 0, 0, 0, 0, 1, 0]
);
let none: Option<Vec<u16>> = None;
assert_eq!(none.as_ssz_bytes(), vec![0, 0, 0, 0]);
}
#[test] #[test]
fn ssz_encode_u8() { fn ssz_encode_u8() {
assert_eq!(0_u8.as_ssz_bytes(), vec![0]); assert_eq!(0_u8.as_ssz_bytes(), vec![0]);

consensus/ssz/src/legacy.rs (new file, 265 lines)

@ -0,0 +1,265 @@
//! Provides a "legacy" version of SSZ encoding for `Option<T> where T: Encode + Decode`.
//!
//! The SSZ specification changed in 2021 to use a 1-byte union selector, instead of a 4-byte one
//! which was used in the Lighthouse database.
//!
//! Users can use the `four_byte_option_impl` macro to define a module that can be used with the
//! `#[ssz(with = "module")]` field attribute.
//!
//! ## Example
//!
//! ```rust
//! use ssz_derive::{Encode, Decode};
//! use ssz::four_byte_option_impl;
//!
//! four_byte_option_impl!(impl_for_u64, u64);
//!
//! #[derive(Encode, Decode)]
//! struct Foo {
//! #[ssz(with = "impl_for_u64")]
//! a: Option<u64>,
//! }
//! ```
use crate::*;
#[macro_export]
macro_rules! four_byte_option_impl {
($mod_name: ident, $type: ty) => {
#[allow(dead_code)]
mod $mod_name {
use super::*;
pub mod encode {
use super::*;
#[allow(unused_imports)]
use ssz::*;
pub fn is_ssz_fixed_len() -> bool {
false
}
pub fn ssz_fixed_len() -> usize {
BYTES_PER_LENGTH_OFFSET
}
pub fn ssz_bytes_len(opt: &Option<$type>) -> usize {
if let Some(some) = opt {
let len = if <$type as Encode>::is_ssz_fixed_len() {
<$type as Encode>::ssz_fixed_len()
} else {
<$type as Encode>::ssz_bytes_len(some)
};
len + BYTES_PER_LENGTH_OFFSET
} else {
BYTES_PER_LENGTH_OFFSET
}
}
pub fn ssz_append(opt: &Option<$type>, buf: &mut Vec<u8>) {
match opt {
None => buf.extend_from_slice(&legacy::encode_four_byte_union_selector(0)),
Some(t) => {
buf.extend_from_slice(&legacy::encode_four_byte_union_selector(1));
t.ssz_append(buf);
}
}
}
pub fn as_ssz_bytes(opt: &Option<$type>) -> Vec<u8> {
let mut buf = vec![];
ssz_append(opt, &mut buf);
buf
}
}
pub mod decode {
use super::*;
#[allow(unused_imports)]
use ssz::*;
pub fn is_ssz_fixed_len() -> bool {
false
}
pub fn ssz_fixed_len() -> usize {
BYTES_PER_LENGTH_OFFSET
}
pub fn from_ssz_bytes(bytes: &[u8]) -> Result<Option<$type>, DecodeError> {
if bytes.len() < BYTES_PER_LENGTH_OFFSET {
return Err(DecodeError::InvalidByteLength {
len: bytes.len(),
expected: BYTES_PER_LENGTH_OFFSET,
});
}
let (index_bytes, value_bytes) = bytes.split_at(BYTES_PER_LENGTH_OFFSET);
let index = legacy::read_four_byte_union_selector(index_bytes)?;
if index == 0 {
Ok(None)
} else if index == 1 {
Ok(Some(<$type as ssz::Decode>::from_ssz_bytes(value_bytes)?))
} else {
Err(DecodeError::BytesInvalid(format!(
"{} is not a valid union index for Option<T>",
index
)))
}
}
}
}
};
}
pub fn encode_four_byte_union_selector(selector: usize) -> [u8; BYTES_PER_LENGTH_OFFSET] {
encode_length(selector)
}
pub fn read_four_byte_union_selector(bytes: &[u8]) -> Result<usize, DecodeError> {
read_offset(bytes)
}
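As a quick aside (not part of the diff), the legacy selector reuses the existing 4-byte little-endian offset encoding, so its behaviour can be sketched as:

```rust
// Selector `1` (i.e., `Some`) serializes to four little-endian bytes and
// round-trips through `read_four_byte_union_selector`.
fn legacy_selector_round_trip() -> Result<(), ssz::DecodeError> {
    let bytes = ssz::legacy::encode_four_byte_union_selector(1);
    assert_eq!(bytes, [1, 0, 0, 0]);
    assert_eq!(ssz::legacy::read_four_byte_union_selector(&bytes)?, 1);
    Ok(())
}
```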
#[cfg(test)]
mod test {
use super::*;
use crate as ssz;
use ssz_derive::{Decode, Encode};
type VecU16 = Vec<u16>;
four_byte_option_impl!(impl_u16, u16);
four_byte_option_impl!(impl_vec_u16, VecU16);
#[test]
fn ssz_encode_option_u16() {
let item = Some(65535_u16);
let bytes = vec![1, 0, 0, 0, 255, 255];
assert_eq!(impl_u16::encode::as_ssz_bytes(&item), bytes);
assert_eq!(impl_u16::decode::from_ssz_bytes(&bytes).unwrap(), item);
let item = None;
let bytes = vec![0, 0, 0, 0];
assert_eq!(impl_u16::encode::as_ssz_bytes(&item), bytes);
assert_eq!(impl_u16::decode::from_ssz_bytes(&bytes).unwrap(), None);
}
#[test]
fn ssz_encode_option_vec_u16() {
let item = Some(vec![0_u16, 1]);
let bytes = vec![1, 0, 0, 0, 0, 0, 1, 0];
assert_eq!(impl_vec_u16::encode::as_ssz_bytes(&item), bytes);
assert_eq!(impl_vec_u16::decode::from_ssz_bytes(&bytes).unwrap(), item);
let item = None;
let bytes = vec![0, 0, 0, 0];
assert_eq!(impl_vec_u16::encode::as_ssz_bytes(&item), bytes);
assert_eq!(impl_vec_u16::decode::from_ssz_bytes(&bytes).unwrap(), item);
}
fn round_trip<T: Encode + Decode + std::fmt::Debug + PartialEq>(items: Vec<T>) {
for item in items {
let encoded = &item.as_ssz_bytes();
assert_eq!(item.ssz_bytes_len(), encoded.len());
assert_eq!(T::from_ssz_bytes(encoded), Ok(item));
}
}
#[derive(Debug, PartialEq, Encode, Decode)]
struct TwoVariableLenOptions {
a: u16,
#[ssz(with = "impl_u16")]
b: Option<u16>,
#[ssz(with = "impl_vec_u16")]
c: Option<Vec<u16>>,
#[ssz(with = "impl_vec_u16")]
d: Option<Vec<u16>>,
}
#[test]
#[allow(clippy::zero_prefixed_literal)]
fn two_variable_len_options_encoding() {
let s = TwoVariableLenOptions {
a: 42,
b: None,
c: Some(vec![0]),
d: None,
};
let bytes = vec![
// 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21
// | option<u16> | offset | offset | option<u16 | 1st list
42, 00, 14, 00, 00, 00, 18, 00, 00, 00, 24, 00, 00, 00, 00, 00, 00, 00, 01, 00, 00, 00,
// 23 24 25 26 27
// | 2nd list
00, 00, 00, 00, 00, 00,
];
assert_eq!(s.as_ssz_bytes(), bytes);
}
#[test]
fn two_variable_len_options_round_trip() {
let vec: Vec<TwoVariableLenOptions> = vec![
TwoVariableLenOptions {
a: 42,
b: Some(12),
c: Some(vec![0]),
d: Some(vec![1]),
},
TwoVariableLenOptions {
a: 42,
b: Some(12),
c: Some(vec![0]),
d: None,
},
TwoVariableLenOptions {
a: 42,
b: None,
c: Some(vec![0]),
d: None,
},
TwoVariableLenOptions {
a: 42,
b: None,
c: None,
d: None,
},
];
round_trip(vec);
}
#[test]
fn tuple_u8_u16() {
let vec: Vec<(u8, u16)> = vec![
(0, 0),
(0, 1),
(1, 0),
(u8::max_value(), u16::max_value()),
(0, u16::max_value()),
(u8::max_value(), 0),
(42, 12301),
];
round_trip(vec);
}
#[test]
fn tuple_vec_vec() {
let vec: Vec<(u64, Vec<u8>, Vec<Vec<u16>>)> = vec![
(0, vec![], vec![vec![]]),
(99, vec![101], vec![vec![], vec![]]),
(
42,
vec![12, 13, 14],
vec![vec![99, 98, 97, 96], vec![42, 44, 46, 48, 50]],
),
];
round_trip(vec);
}
}


@ -36,11 +36,15 @@
mod decode; mod decode;
mod encode; mod encode;
pub mod legacy;
mod union_selector;
pub use decode::{ pub use decode::{
impls::decode_list_of_variable_length_items, Decode, DecodeError, SszDecoder, SszDecoderBuilder, impls::decode_list_of_variable_length_items, read_offset, split_union_bytes, Decode,
DecodeError, SszDecoder, SszDecoderBuilder,
}; };
pub use encode::{Encode, SszEncoder}; pub use encode::{encode_length, Encode, SszEncoder};
pub use union_selector::UnionSelector;
/// The number of bytes used to represent an offset. /// The number of bytes used to represent an offset.
pub const BYTES_PER_LENGTH_OFFSET: usize = 4; pub const BYTES_PER_LENGTH_OFFSET: usize = 4;
@ -50,6 +54,12 @@ pub const MAX_LENGTH_VALUE: usize = (std::u32::MAX >> (8 * (4 - BYTES_PER_LENGTH
#[cfg(target_pointer_width = "64")] #[cfg(target_pointer_width = "64")]
pub const MAX_LENGTH_VALUE: usize = (std::u64::MAX >> (8 * (8 - BYTES_PER_LENGTH_OFFSET))) as usize; pub const MAX_LENGTH_VALUE: usize = (std::u64::MAX >> (8 * (8 - BYTES_PER_LENGTH_OFFSET))) as usize;
/// The number of bytes used to indicate the variant of a union.
pub const BYTES_PER_UNION_SELECTOR: usize = 1;
/// The highest possible union selector value (higher values are reserved for backwards compatible
/// extensions).
pub const MAX_UNION_SELECTOR: u8 = 127;
/// Convenience function to SSZ encode an object supporting ssz::Encode. /// Convenience function to SSZ encode an object supporting ssz::Encode.
/// ///
/// Equivalent to `val.as_ssz_bytes()`. /// Equivalent to `val.as_ssz_bytes()`.


@ -0,0 +1,29 @@
use crate::*;
/// Provides the one-byte "selector" from the SSZ union specification:
///
/// https://github.com/ethereum/consensus-specs/blob/v1.1.0-beta.3/ssz/simple-serialize.md#union
#[derive(Copy, Clone)]
pub struct UnionSelector(u8);
impl From<UnionSelector> for u8 {
fn from(union_selector: UnionSelector) -> u8 {
union_selector.0
}
}
impl PartialEq<u8> for UnionSelector {
fn eq(&self, other: &u8) -> bool {
self.0 == *other
}
}
impl UnionSelector {
/// Instantiate `self`, returning an error if `selector > MAX_UNION_SELECTOR`.
pub fn new(selector: u8) -> Result<Self, DecodeError> {
Some(selector)
.filter(|_| selector <= MAX_UNION_SELECTOR)
.map(Self)
.ok_or(DecodeError::UnionSelectorInvalid(selector))
}
}
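For illustration only (not part of the diff), a sketch of reading the new one-byte selector by hand, using the `split_union_bytes` helper and the `UnionSelector` bound exported from this crate:

```rust
use ssz::{split_union_bytes, DecodeError, UnionSelector};

fn read_selector(bytes: &[u8]) -> Result<(u8, &[u8]), DecodeError> {
    // `split_union_bytes` peels off the one-byte selector and returns the remaining body.
    let (selector, body) = split_union_bytes(bytes)?;
    Ok((selector.into(), body))
}

fn demo() -> Result<(), DecodeError> {
    // Selector 1 followed by a little-endian `u16` body.
    let (selector, body) = read_selector(&[1, 255, 255])?;
    assert_eq!(selector, 1);
    assert_eq!(body, &[255, 255]);
    // Selectors above 127 are reserved for future extensions and are rejected.
    assert!(UnionSelector::new(200).is_err());
    Ok(())
}
```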


@ -292,68 +292,6 @@ mod round_trip {
); );
} }
#[derive(Debug, PartialEq, Encode, Decode)]
struct TwoVariableLenOptions {
a: u16,
b: Option<u16>,
c: Option<Vec<u16>>,
d: Option<Vec<u16>>,
}
#[test]
#[allow(clippy::zero_prefixed_literal)]
fn two_variable_len_options_encoding() {
let s = TwoVariableLenOptions {
a: 42,
b: None,
c: Some(vec![0]),
d: None,
};
let bytes = vec![
// 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21
// | option<u16> | offset | offset | option<u16 | 1st list
42, 00, 14, 00, 00, 00, 18, 00, 00, 00, 24, 00, 00, 00, 00, 00, 00, 00, 01, 00, 00, 00,
// 23 24 25 26 27
// | 2nd list
00, 00, 00, 00, 00, 00,
];
assert_eq!(s.as_ssz_bytes(), bytes);
}
#[test]
fn two_variable_len_options_round_trip() {
let vec: Vec<TwoVariableLenOptions> = vec![
TwoVariableLenOptions {
a: 42,
b: Some(12),
c: Some(vec![0]),
d: Some(vec![1]),
},
TwoVariableLenOptions {
a: 42,
b: Some(12),
c: Some(vec![0]),
d: None,
},
TwoVariableLenOptions {
a: 42,
b: None,
c: Some(vec![0]),
d: None,
},
TwoVariableLenOptions {
a: 42,
b: None,
c: None,
d: None,
},
];
round_trip(vec);
}
#[test]
fn tuple_u8_u16() {
let vec: Vec<(u8, u16)> = vec![
@ -384,3 +322,145 @@ mod round_trip {
round_trip(vec);
}
}
mod derive_macro {
use ssz::{Decode, Encode};
use ssz_derive::{Decode, Encode};
use std::fmt::Debug;
fn assert_encode<T: Encode>(item: &T, bytes: &[u8]) {
assert_eq!(item.as_ssz_bytes(), bytes);
}
fn assert_encode_decode<T: Encode + Decode + PartialEq + Debug>(item: &T, bytes: &[u8]) {
assert_encode(item, bytes);
assert_eq!(T::from_ssz_bytes(bytes).unwrap(), *item);
}
#[derive(PartialEq, Debug, Encode, Decode)]
#[ssz(enum_behaviour = "union")]
enum TwoFixedUnion {
U8(u8),
U16(u16),
}
#[derive(PartialEq, Debug, Encode, Decode)]
struct TwoFixedUnionStruct {
a: TwoFixedUnion,
}
#[test]
fn two_fixed_union() {
let eight = TwoFixedUnion::U8(1);
let sixteen = TwoFixedUnion::U16(1);
assert_encode_decode(&eight, &[0, 1]);
assert_encode_decode(&sixteen, &[1, 1, 0]);
assert_encode_decode(&TwoFixedUnionStruct { a: eight }, &[4, 0, 0, 0, 0, 1]);
assert_encode_decode(&TwoFixedUnionStruct { a: sixteen }, &[4, 0, 0, 0, 1, 1, 0]);
}
#[derive(PartialEq, Debug, Encode, Decode)]
struct VariableA {
a: u8,
b: Vec<u8>,
}
#[derive(PartialEq, Debug, Encode, Decode)]
struct VariableB {
a: Vec<u8>,
b: u8,
}
#[derive(PartialEq, Debug, Encode)]
#[ssz(enum_behaviour = "transparent")]
enum TwoVariableTrans {
A(VariableA),
B(VariableB),
}
#[derive(PartialEq, Debug, Encode)]
struct TwoVariableTransStruct {
a: TwoVariableTrans,
}
#[derive(PartialEq, Debug, Encode, Decode)]
#[ssz(enum_behaviour = "union")]
enum TwoVariableUnion {
A(VariableA),
B(VariableB),
}
#[derive(PartialEq, Debug, Encode, Decode)]
struct TwoVariableUnionStruct {
a: TwoVariableUnion,
}
#[test]
fn two_variable_trans() {
let trans_a = TwoVariableTrans::A(VariableA {
a: 1,
b: vec![2, 3],
});
let trans_b = TwoVariableTrans::B(VariableB {
a: vec![1, 2],
b: 3,
});
assert_encode(&trans_a, &[1, 5, 0, 0, 0, 2, 3]);
assert_encode(&trans_b, &[5, 0, 0, 0, 3, 1, 2]);
assert_encode(
&TwoVariableTransStruct { a: trans_a },
&[4, 0, 0, 0, 1, 5, 0, 0, 0, 2, 3],
);
assert_encode(
&TwoVariableTransStruct { a: trans_b },
&[4, 0, 0, 0, 5, 0, 0, 0, 3, 1, 2],
);
}
#[test]
fn two_variable_union() {
let union_a = TwoVariableUnion::A(VariableA {
a: 1,
b: vec![2, 3],
});
let union_b = TwoVariableUnion::B(VariableB {
a: vec![1, 2],
b: 3,
});
assert_encode_decode(&union_a, &[0, 1, 5, 0, 0, 0, 2, 3]);
assert_encode_decode(&union_b, &[1, 5, 0, 0, 0, 3, 1, 2]);
assert_encode_decode(
&TwoVariableUnionStruct { a: union_a },
&[4, 0, 0, 0, 0, 1, 5, 0, 0, 0, 2, 3],
);
assert_encode_decode(
&TwoVariableUnionStruct { a: union_b },
&[4, 0, 0, 0, 1, 5, 0, 0, 0, 3, 1, 2],
);
}
#[derive(PartialEq, Debug, Encode, Decode)]
#[ssz(enum_behaviour = "union")]
enum TwoVecUnion {
A(Vec<u8>),
B(Vec<u8>),
}
#[test]
fn two_vec_union() {
assert_encode_decode(&TwoVecUnion::A(vec![]), &[0]);
assert_encode_decode(&TwoVecUnion::B(vec![]), &[1]);
assert_encode_decode(&TwoVecUnion::A(vec![0]), &[0, 0]);
assert_encode_decode(&TwoVecUnion::B(vec![0]), &[1, 0]);
assert_encode_decode(&TwoVecUnion::A(vec![0, 1]), &[0, 0, 1]);
assert_encode_decode(&TwoVecUnion::B(vec![0, 1]), &[1, 0, 1]);
}
}


@ -1,6 +1,6 @@
[package] [package]
name = "eth2_ssz_derive" name = "eth2_ssz_derive"
version = "0.2.1" version = "0.3.0"
authors = ["Paul Hauner <paul@sigmaprime.io>"] authors = ["Paul Hauner <paul@sigmaprime.io>"]
edition = "2018" edition = "2018"
description = "Procedural derive macros to accompany the eth2_ssz crate." description = "Procedural derive macros to accompany the eth2_ssz crate."
@ -12,4 +12,6 @@ proc-macro = true
[dependencies] [dependencies]
syn = "1.0.42" syn = "1.0.42"
proc-macro2 = "1.0.23"
quote = "1.0.7" quote = "1.0.7"
darling = "0.13.0"


@ -3,93 +3,159 @@
//! //!
//! Supports field attributes, see each derive macro for more information. //! Supports field attributes, see each derive macro for more information.
use darling::{FromDeriveInput, FromMeta};
use proc_macro::TokenStream; use proc_macro::TokenStream;
use quote::quote; use quote::quote;
use syn::{parse_macro_input, DataEnum, DataStruct, DeriveInput}; use std::convert::TryInto;
use syn::{parse_macro_input, DataEnum, DataStruct, DeriveInput, Ident};
/// Returns a Vec of `syn::Ident` for each named field in the struct, whilst filtering out fields /// The highest possible union selector value (higher values are reserved for backwards compatible
/// that should not be serialized. /// extensions).
/// const MAX_UNION_SELECTOR: u8 = 127;
/// # Panics
/// Any unnamed struct field (like in a tuple struct) will raise a panic at compile time. #[derive(Debug, FromDeriveInput)]
fn get_serializable_named_field_idents(struct_data: &syn::DataStruct) -> Vec<&syn::Ident> { #[darling(attributes(ssz))]
struct StructOpts {
#[darling(default)]
enum_behaviour: Option<String>,
}
/// Field-level configuration.
#[derive(Debug, Default, FromMeta)]
struct FieldOpts {
#[darling(default)]
with: Option<Ident>,
#[darling(default)]
skip_serializing: bool,
#[darling(default)]
skip_deserializing: bool,
}
const ENUM_TRANSPARENT: &str = "transparent";
const ENUM_UNION: &str = "union";
const ENUM_VARIANTS: &[&str] = &[ENUM_TRANSPARENT, ENUM_UNION];
const NO_ENUM_BEHAVIOUR_ERROR: &str = "enums require an \"enum_behaviour\" attribute, \
e.g., #[ssz(enum_behaviour = \"transparent\")]";
enum EnumBehaviour {
Transparent,
Union,
}
impl EnumBehaviour {
pub fn new(s: Option<String>) -> Option<Self> {
s.map(|s| match s.as_ref() {
ENUM_TRANSPARENT => EnumBehaviour::Transparent,
ENUM_UNION => EnumBehaviour::Union,
other => panic!(
"{} is an invalid enum_behaviour, use either {:?}",
other, ENUM_VARIANTS
),
})
}
}
fn parse_ssz_fields(struct_data: &syn::DataStruct) -> Vec<(&syn::Type, &syn::Ident, FieldOpts)> {
struct_data struct_data
.fields .fields
.iter() .iter()
.filter_map(|f| { .map(|field| {
if should_skip_serializing(f) { let ty = &field.ty;
None let ident = match &field.ident {
} else {
Some(match &f.ident {
Some(ref ident) => ident, Some(ref ident) => ident,
_ => panic!("ssz_derive only supports named struct fields."), _ => panic!("ssz_derive only supports named struct fields."),
}) };
}
})
.collect()
}
/// Returns a Vec of `syn::Type` for each named field in the struct, whilst filtering out fields let field_opts_candidates = field
/// that should not be serialized. .attrs
fn get_serializable_field_types(struct_data: &syn::DataStruct) -> Vec<&syn::Type> {
struct_data
.fields
.iter() .iter()
.filter_map(|f| { .filter(|attr| attr.path.get_ident().map_or(false, |ident| *ident == "ssz"))
if should_skip_serializing(f) { .collect::<Vec<_>>();
None
} else { if field_opts_candidates.len() > 1 {
Some(&f.ty) panic!("more than one field-level \"ssz\" attribute provided")
}
})
.collect()
} }
/// Returns true if some field has an attribute declaring it should not be serialized. let field_opts = field_opts_candidates
/// .first()
/// The field attribute is: `#[ssz(skip_serializing)]` .map(|attr| {
fn should_skip_serializing(field: &syn::Field) -> bool { let meta = attr.parse_meta().unwrap();
field.attrs.iter().any(|attr| { FieldOpts::from_meta(&meta).unwrap()
attr.path.is_ident("ssz")
&& attr.tokens.to_string().replace(" ", "") == "(skip_serializing)"
}) })
.unwrap_or_default();
(ty, ident, field_opts)
})
.collect()
} }
/// Implements `ssz::Encode` for some `struct` or `enum`. /// Implements `ssz::Encode` for some `struct` or `enum`.
#[proc_macro_derive(Encode, attributes(ssz))]
pub fn ssz_encode_derive(input: TokenStream) -> TokenStream {
let item = parse_macro_input!(input as DeriveInput);
let opts = StructOpts::from_derive_input(&item).unwrap();
let enum_opt = EnumBehaviour::new(opts.enum_behaviour);
match &item.data {
syn::Data::Struct(s) => {
if enum_opt.is_some() {
panic!("enum_behaviour is invalid for structs");
}
ssz_encode_derive_struct(&item, s)
}
syn::Data::Enum(s) => match enum_opt.expect(NO_ENUM_BEHAVIOUR_ERROR) {
EnumBehaviour::Transparent => ssz_encode_derive_enum_transparent(&item, s),
EnumBehaviour::Union => ssz_encode_derive_enum_union(&item, s),
},
_ => panic!("ssz_derive only supports structs and enums"),
}
}
/// Derive `ssz::Encode` for a struct.
/// ///
/// Fields are encoded in the order they are defined. /// Fields are encoded in the order they are defined.
/// ///
/// ## Field attributes /// ## Field attributes
/// ///
/// - `#[ssz(skip_serializing)]`: the field will not be serialized. /// - `#[ssz(skip_serializing)]`: the field will not be serialized.
#[proc_macro_derive(Encode, attributes(ssz))]
pub fn ssz_encode_derive(input: TokenStream) -> TokenStream {
let item = parse_macro_input!(input as DeriveInput);
match &item.data {
syn::Data::Struct(s) => ssz_encode_derive_struct(&item, s),
syn::Data::Enum(s) => ssz_encode_derive_enum(&item, s),
_ => panic!("ssz_derive only supports structs and enums"),
}
}
fn ssz_encode_derive_struct(derive_input: &DeriveInput, struct_data: &DataStruct) -> TokenStream { fn ssz_encode_derive_struct(derive_input: &DeriveInput, struct_data: &DataStruct) -> TokenStream {
let name = &derive_input.ident; let name = &derive_input.ident;
let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl(); let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl();
let field_idents = get_serializable_named_field_idents(struct_data); let field_is_ssz_fixed_len = &mut vec![];
let field_idents_a = get_serializable_named_field_idents(struct_data); let field_fixed_len = &mut vec![];
let field_types_a = get_serializable_field_types(struct_data); let field_ssz_bytes_len = &mut vec![];
let field_types_b = field_types_a.clone(); let field_encoder_append = &mut vec![];
let field_types_d = field_types_a.clone();
let field_types_e = field_types_a.clone(); for (ty, ident, field_opts) in parse_ssz_fields(struct_data) {
let field_types_f = field_types_a.clone(); if field_opts.skip_serializing {
continue;
}
if let Some(module) = field_opts.with {
let module = quote! { #module::encode };
field_is_ssz_fixed_len.push(quote! { #module::is_ssz_fixed_len() });
field_fixed_len.push(quote! { #module::ssz_fixed_len() });
field_ssz_bytes_len.push(quote! { #module::ssz_bytes_len(&self.#ident) });
field_encoder_append.push(quote! {
encoder.append_parameterized(
#module::is_ssz_fixed_len(),
|buf| #module::ssz_append(&self.#ident, buf)
)
});
} else {
field_is_ssz_fixed_len.push(quote! { <#ty as ssz::Encode>::is_ssz_fixed_len() });
field_fixed_len.push(quote! { <#ty as ssz::Encode>::ssz_fixed_len() });
field_ssz_bytes_len.push(quote! { self.#ident.ssz_bytes_len() });
field_encoder_append.push(quote! { encoder.append(&self.#ident) });
}
}
let output = quote! { let output = quote! {
impl #impl_generics ssz::Encode for #name #ty_generics #where_clause { impl #impl_generics ssz::Encode for #name #ty_generics #where_clause {
fn is_ssz_fixed_len() -> bool { fn is_ssz_fixed_len() -> bool {
#( #(
<#field_types_a as ssz::Encode>::is_ssz_fixed_len() && #field_is_ssz_fixed_len &&
)* )*
true true
} }
@ -99,7 +165,7 @@ fn ssz_encode_derive_struct(derive_input: &DeriveInput, struct_data: &DataStruct
let mut len: usize = 0; let mut len: usize = 0;
#( #(
len = len len = len
.checked_add(<#field_types_b as ssz::Encode>::ssz_fixed_len()) .checked_add(#field_fixed_len)
.expect("encode ssz_fixed_len length overflow"); .expect("encode ssz_fixed_len length overflow");
)* )*
len len
@ -114,16 +180,16 @@ fn ssz_encode_derive_struct(derive_input: &DeriveInput, struct_data: &DataStruct
} else { } else {
let mut len: usize = 0; let mut len: usize = 0;
#( #(
if <#field_types_d as ssz::Encode>::is_ssz_fixed_len() { if #field_is_ssz_fixed_len {
len = len len = len
.checked_add(<#field_types_e as ssz::Encode>::ssz_fixed_len()) .checked_add(#field_fixed_len)
.expect("encode ssz_bytes_len length overflow"); .expect("encode ssz_bytes_len length overflow");
} else { } else {
len = len len = len
.checked_add(ssz::BYTES_PER_LENGTH_OFFSET) .checked_add(ssz::BYTES_PER_LENGTH_OFFSET)
.expect("encode ssz_bytes_len length overflow for offset"); .expect("encode ssz_bytes_len length overflow for offset");
len = len len = len
.checked_add(self.#field_idents_a.ssz_bytes_len()) .checked_add(#field_ssz_bytes_len)
.expect("encode ssz_bytes_len length overflow for bytes"); .expect("encode ssz_bytes_len length overflow for bytes");
} }
)* )*
@ -136,14 +202,14 @@ fn ssz_encode_derive_struct(derive_input: &DeriveInput, struct_data: &DataStruct
let mut offset: usize = 0; let mut offset: usize = 0;
#( #(
offset = offset offset = offset
.checked_add(<#field_types_f as ssz::Encode>::ssz_fixed_len()) .checked_add(#field_fixed_len)
.expect("encode ssz_append offset overflow"); .expect("encode ssz_append offset overflow");
)* )*
let mut encoder = ssz::SszEncoder::container(buf, offset); let mut encoder = ssz::SszEncoder::container(buf, offset);
#( #(
encoder.append(&self.#field_idents); #field_encoder_append;
)* )*
encoder.finalize(); encoder.finalize();
@ -153,15 +219,27 @@ fn ssz_encode_derive_struct(derive_input: &DeriveInput, struct_data: &DataStruct
output.into() output.into()
} }
/// Derive `Encode` for a restricted subset of all possible enum types. /// Derive `ssz::Encode` for an enum using the "transparent" method.
///
/// The "transparent" method is distinct from the "union" method specified in the SSZ specification.
/// When using "transparent", the enum will be ignored and the contained field will be serialized as
/// if the enum does not exist. Since an union variant "selector" is not serialized, it is not
/// possible to reliably decode an enum that is serialized transparently.
///
/// ## Limitations
/// ///
/// Only supports: /// Only supports:
/// - Enums with a single field per variant, where /// - Enums with a single field per variant, where
/// - All fields are variably sized from an SSZ-perspective (not fixed size). /// - All fields are variably sized from an SSZ-perspective (not fixed size).
/// ///
/// ## Panics
///
/// Will panic at compile-time if the single field requirement isn't met, but will panic *at run /// Will panic at compile-time if the single field requirement isn't met, but will panic *at run
/// time* if the variable-size requirement isn't met. /// time* if the variable-size requirement isn't met.
fn ssz_encode_derive_enum(derive_input: &DeriveInput, enum_data: &DataEnum) -> TokenStream { fn ssz_encode_derive_enum_transparent(
derive_input: &DeriveInput,
enum_data: &DataEnum,
) -> TokenStream {
let name = &derive_input.ident; let name = &derive_input.ident;
let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl(); let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl();
@ -219,14 +297,95 @@ fn ssz_encode_derive_enum(derive_input: &DeriveInput, enum_data: &DataEnum) -> T
output.into() output.into()
} }
/// Returns true if some field has an attribute declaring it should not be deserialized. /// Derive `ssz::Encode` for an `enum` following the "union" SSZ spec.
/// ///
/// The field attribute is: `#[ssz(skip_deserializing)]` /// The union selector will be determined based upon the order in which the enum variants are
fn should_skip_deserializing(field: &syn::Field) -> bool { /// defined. E.g., the top-most variant in the enum will have a selector of `0`, the variant
field.attrs.iter().any(|attr| { /// beneath it will have a selector of `1` and so on.
attr.path.is_ident("ssz") ///
&& attr.tokens.to_string().replace(" ", "") == "(skip_deserializing)" /// # Limitations
///
/// Only supports enums where each variant has a single field.
fn ssz_encode_derive_enum_union(derive_input: &DeriveInput, enum_data: &DataEnum) -> TokenStream {
let name = &derive_input.ident;
let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl();
let patterns: Vec<_> = enum_data
.variants
.iter()
.map(|variant| {
let variant_name = &variant.ident;
if variant.fields.len() != 1 {
panic!("ssz::Encode can only be derived for enums with 1 field per variant");
}
let pattern = quote! {
#name::#variant_name(ref inner)
};
pattern
}) })
.collect();
let union_selectors = compute_union_selectors(patterns.len());
let output = quote! {
impl #impl_generics ssz::Encode for #name #ty_generics #where_clause {
fn is_ssz_fixed_len() -> bool {
false
}
fn ssz_bytes_len(&self) -> usize {
match self {
#(
#patterns => inner
.ssz_bytes_len()
.checked_add(1)
.expect("encoded length must be less than usize::max_value"),
)*
}
}
fn ssz_append(&self, buf: &mut Vec<u8>) {
match self {
#(
#patterns => {
let union_selector: u8 = #union_selectors;
debug_assert!(union_selector <= ssz::MAX_UNION_SELECTOR);
buf.push(union_selector);
inner.ssz_append(buf)
},
)*
}
}
}
};
output.into()
}
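Roughly speaking, for a hypothetical two-variant enum the generated `Encode` impl behaves like the hand-written sketch below (illustrative only, not part of the diff):

```rust
use ssz::Encode;

enum Example {
    A(u8),
    B(u16),
}

impl Encode for Example {
    fn is_ssz_fixed_len() -> bool {
        false
    }

    fn ssz_bytes_len(&self) -> usize {
        // One byte for the selector plus the inner value's encoding.
        match self {
            Example::A(inner) => inner.ssz_bytes_len() + 1,
            Example::B(inner) => inner.ssz_bytes_len() + 1,
        }
    }

    fn ssz_append(&self, buf: &mut Vec<u8>) {
        // The selector is the zero-based position of the variant.
        match self {
            Example::A(inner) => {
                buf.push(0);
                inner.ssz_append(buf);
            }
            Example::B(inner) => {
                buf.push(1);
                inner.ssz_append(buf);
            }
        }
    }
}

// E.g., `Example::B(1)` encodes as `[1, 1, 0]`: selector `1`, then the `u16` body.
```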
/// Derive `ssz::Decode` for a struct or enum.
#[proc_macro_derive(Decode, attributes(ssz))]
pub fn ssz_decode_derive(input: TokenStream) -> TokenStream {
let item = parse_macro_input!(input as DeriveInput);
let opts = StructOpts::from_derive_input(&item).unwrap();
let enum_opt = EnumBehaviour::new(opts.enum_behaviour);
match &item.data {
syn::Data::Struct(s) => {
if enum_opt.is_some() {
panic!("enum_behaviour is invalid for structs");
}
ssz_decode_derive_struct(&item, s)
}
syn::Data::Enum(s) => match enum_opt.expect(NO_ENUM_BEHAVIOUR_ERROR) {
EnumBehaviour::Transparent => panic!(
"Decode cannot be derived for enum_behaviour \"{}\", only \"{}\" is valid.",
ENUM_TRANSPARENT, ENUM_UNION
),
EnumBehaviour::Union => ssz_decode_derive_enum_union(&item, s),
},
_ => panic!("ssz_derive only supports structs and enums"),
}
} }
/// Implements `ssz::Decode` for some `struct`. /// Implements `ssz::Decode` for some `struct`.
@ -238,18 +397,10 @@ fn should_skip_deserializing(field: &syn::Field) -> bool {
/// - `#[ssz(skip_deserializing)]`: during de-serialization the field will be instantiated from a /// - `#[ssz(skip_deserializing)]`: during de-serialization the field will be instantiated from a
/// `Default` implementation. The decoder will assume that the field was not serialized at all /// `Default` implementation. The decoder will assume that the field was not serialized at all
/// (e.g., if it has been serialized, an error will be raised instead of `Default` overriding it). /// (e.g., if it has been serialized, an error will be raised instead of `Default` overriding it).
#[proc_macro_derive(Decode)] fn ssz_decode_derive_struct(item: &DeriveInput, struct_data: &DataStruct) -> TokenStream {
pub fn ssz_decode_derive(input: TokenStream) -> TokenStream {
let item = parse_macro_input!(input as DeriveInput);
let name = &item.ident; let name = &item.ident;
let (impl_generics, ty_generics, where_clause) = &item.generics.split_for_impl(); let (impl_generics, ty_generics, where_clause) = &item.generics.split_for_impl();
let struct_data = match &item.data {
syn::Data::Struct(s) => s,
_ => panic!("ssz_derive only supports structs."),
};
let mut register_types = vec![]; let mut register_types = vec![];
let mut field_names = vec![]; let mut field_names = vec![];
let mut fixed_decodes = vec![]; let mut fixed_decodes = vec![];
@ -257,17 +408,13 @@ pub fn ssz_decode_derive(input: TokenStream) -> TokenStream {
let mut is_fixed_lens = vec![]; let mut is_fixed_lens = vec![];
let mut fixed_lens = vec![]; let mut fixed_lens = vec![];
// Build quotes for fields that should be deserialized and those that should be built from for (ty, ident, field_opts) in parse_ssz_fields(struct_data) {
// `Default`.
for field in &struct_data.fields {
match &field.ident {
Some(ref ident) => {
field_names.push(quote! { field_names.push(quote! {
#ident #ident
}); });
if should_skip_deserializing(field) {
// Field should not be deserialized; use a `Default` impl to instantiate. // Field should not be deserialized; use a `Default` impl to instantiate.
if field_opts.skip_deserializing {
decodes.push(quote! { decodes.push(quote! {
let #ident = <_>::default(); let #ident = <_>::default();
}); });
@ -275,32 +422,57 @@ pub fn ssz_decode_derive(input: TokenStream) -> TokenStream {
fixed_decodes.push(quote! { fixed_decodes.push(quote! {
let #ident = <_>::default(); let #ident = <_>::default();
}); });
continue;
}
let is_ssz_fixed_len;
let ssz_fixed_len;
let from_ssz_bytes;
if let Some(module) = field_opts.with {
let module = quote! { #module::decode };
is_ssz_fixed_len = quote! { #module::is_ssz_fixed_len() };
ssz_fixed_len = quote! { #module::ssz_fixed_len() };
from_ssz_bytes = quote! { #module::from_ssz_bytes(slice) };
register_types.push(quote! {
builder.register_type_parameterized(#is_ssz_fixed_len, #ssz_fixed_len)?;
});
decodes.push(quote! {
let #ident = decoder.decode_next_with(|slice| #module::from_ssz_bytes(slice))?;
});
} else { } else {
let ty = &field.ty; is_ssz_fixed_len = quote! { <#ty as ssz::Decode>::is_ssz_fixed_len() };
ssz_fixed_len = quote! { <#ty as ssz::Decode>::ssz_fixed_len() };
from_ssz_bytes = quote! { <#ty as ssz::Decode>::from_ssz_bytes(slice) };
register_types.push(quote! { register_types.push(quote! {
builder.register_type::<#ty>()?; builder.register_type::<#ty>()?;
}); });
decodes.push(quote! { decodes.push(quote! {
let #ident = decoder.decode_next()?; let #ident = decoder.decode_next()?;
}); });
}
fixed_decodes.push(quote! { fixed_decodes.push(quote! {
let #ident = decode_field!(#ty); let #ident = {
}); start = end;
end = end
is_fixed_lens.push(quote! { .checked_add(#ssz_fixed_len)
<#ty as ssz::Decode>::is_ssz_fixed_len() .ok_or_else(|| ssz::DecodeError::OutOfBoundsByte {
}); i: usize::max_value()
})?;
fixed_lens.push(quote! { let slice = bytes.get(start..end)
<#ty as ssz::Decode>::ssz_fixed_len() .ok_or_else(|| ssz::DecodeError::InvalidByteLength {
}); len: bytes.len(),
} expected: end
} })?;
_ => panic!("ssz_derive only supports named struct fields."), #from_ssz_bytes?
}; };
});
is_fixed_lens.push(is_ssz_fixed_len);
fixed_lens.push(ssz_fixed_len);
} }
let output = quote! { let output = quote! {
@ -338,23 +510,6 @@ pub fn ssz_decode_derive(input: TokenStream) -> TokenStream {
let mut start: usize = 0; let mut start: usize = 0;
let mut end = start; let mut end = start;
macro_rules! decode_field {
($type: ty) => {{
start = end;
end = end
.checked_add(<$type as ssz::Decode>::ssz_fixed_len())
.ok_or_else(|| ssz::DecodeError::OutOfBoundsByte {
i: usize::max_value()
})?;
let slice = bytes.get(start..end)
.ok_or_else(|| ssz::DecodeError::InvalidByteLength {
len: bytes.len(),
expected: end
})?;
<$type as ssz::Decode>::from_ssz_bytes(slice)?
}};
}
#( #(
#fixed_decodes #fixed_decodes
)* )*
@ -389,3 +544,79 @@ pub fn ssz_decode_derive(input: TokenStream) -> TokenStream {
}; };
output.into() output.into()
} }
/// Derive `ssz::Decode` for an `enum` following the "union" SSZ spec.
fn ssz_decode_derive_enum_union(derive_input: &DeriveInput, enum_data: &DataEnum) -> TokenStream {
let name = &derive_input.ident;
let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl();
let (constructors, var_types): (Vec<_>, Vec<_>) = enum_data
.variants
.iter()
.map(|variant| {
let variant_name = &variant.ident;
if variant.fields.len() != 1 {
panic!("ssz::Decode can only be derived for enums with 1 field per variant");
}
let constructor = quote! {
#name::#variant_name
};
let ty = &(&variant.fields).into_iter().next().unwrap().ty;
(constructor, ty)
})
.unzip();
let union_selectors = compute_union_selectors(constructors.len());
let output = quote! {
impl #impl_generics ssz::Decode for #name #ty_generics #where_clause {
fn is_ssz_fixed_len() -> bool {
false
}
fn from_ssz_bytes(bytes: &[u8]) -> Result<Self, ssz::DecodeError> {
// Sanity check to ensure the definition here does not drift from the one defined in
// `ssz`.
debug_assert_eq!(#MAX_UNION_SELECTOR, ssz::MAX_UNION_SELECTOR);
let (selector, body) = ssz::split_union_bytes(bytes)?;
match selector.into() {
#(
#union_selectors => {
<#var_types as ssz::Decode>::from_ssz_bytes(body).map(#constructors)
},
)*
other => Err(ssz::DecodeError::UnionSelectorInvalid(other))
}
}
}
};
output.into()
}
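And the matching `Decode` behaviour, sketched by hand for the same hypothetical `Example` enum from the encoding sketch above (illustrative only):

```rust
use ssz::{Decode, DecodeError};

impl Decode for Example {
    fn is_ssz_fixed_len() -> bool {
        false
    }

    fn from_ssz_bytes(bytes: &[u8]) -> Result<Self, DecodeError> {
        // Peel off the one-byte selector, then decode the body as the selected variant.
        let (selector, body) = ssz::split_union_bytes(bytes)?;
        match selector.into() {
            0u8 => <u8 as Decode>::from_ssz_bytes(body).map(Example::A),
            1u8 => <u16 as Decode>::from_ssz_bytes(body).map(Example::B),
            other => Err(DecodeError::UnionSelectorInvalid(other)),
        }
    }
}
```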
fn compute_union_selectors(num_variants: usize) -> Vec<u8> {
let union_selectors = (0..num_variants)
.map(|i| {
i.try_into()
.expect("union selector exceeds u8::max_value, union has too many variants")
})
.collect::<Vec<u8>>();
let highest_selector = union_selectors
.last()
.copied()
.expect("0-variant union is not permitted");
assert!(
highest_selector <= MAX_UNION_SELECTOR,
"union selector {} exceeds limit of {}, enum has too many variants",
highest_selector,
MAX_UNION_SELECTOR
);
union_selectors
}


@ -1,6 +1,6 @@
[package] [package]
name = "eth2_ssz_types" name = "eth2_ssz_types"
version = "0.2.0" version = "0.2.1"
authors = ["Paul Hauner <paul@paulhauner.com>"] authors = ["Paul Hauner <paul@paulhauner.com>"]
edition = "2018" edition = "2018"
description = "Provides types with unique properties required for SSZ serialization and Merklization." description = "Provides types with unique properties required for SSZ serialization and Merklization."
@ -10,14 +10,14 @@ license = "Apache-2.0"
name = "ssz_types" name = "ssz_types"
[dependencies] [dependencies]
tree_hash = "0.3.0" tree_hash = "0.4.0"
serde = "1.0.116" serde = "1.0.116"
serde_derive = "1.0.116" serde_derive = "1.0.116"
eth2_serde_utils = "0.1.0" eth2_serde_utils = "0.1.0"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
typenum = "1.12.0" typenum = "1.12.0"
arbitrary = { version = "0.4.6", features = ["derive"], optional = true } arbitrary = { version = "0.4.6", features = ["derive"], optional = true }
[dev-dependencies] [dev-dependencies]
serde_json = "1.0.58" serde_json = "1.0.58"
tree_hash_derive = "0.3.1" tree_hash_derive = "0.4.0"


@ -16,13 +16,13 @@ beacon_chain = { path = "../../beacon_node/beacon_chain" }
bls = { path = "../../crypto/bls" } bls = { path = "../../crypto/bls" }
integer-sqrt = "0.1.5" integer-sqrt = "0.1.5"
itertools = "0.10.0" itertools = "0.10.0"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_types = "0.2.0" eth2_ssz_types = "0.2.1"
merkle_proof = { path = "../merkle_proof" } merkle_proof = { path = "../merkle_proof" }
log = "0.4.11" log = "0.4.11"
safe_arith = { path = "../safe_arith" } safe_arith = { path = "../safe_arith" }
tree_hash = "0.3.0" tree_hash = "0.4.0"
tree_hash_derive = "0.3.1" tree_hash_derive = "0.4.0"
types = { path = "../types", default-features = false } types = { path = "../types", default-features = false }
rayon = "1.4.1" rayon = "1.4.1"
eth2_hashing = "0.2.0" eth2_hashing = "0.2.0"


@ -1,6 +1,6 @@
[package] [package]
name = "tree_hash" name = "tree_hash"
version = "0.3.0" version = "0.4.0"
authors = ["Paul Hauner <paul@paulhauner.com>"] authors = ["Paul Hauner <paul@paulhauner.com>"]
edition = "2018" edition = "2018"
license = "Apache-2.0" license = "Apache-2.0"
@ -8,10 +8,12 @@ description = "Efficient Merkle-hashing as used in Ethereum 2.0"
[dev-dependencies] [dev-dependencies]
rand = "0.7.3" rand = "0.7.3"
tree_hash_derive = "0.3.1" tree_hash_derive = "0.4.0"
types = { path = "../types" } types = { path = "../types" }
lazy_static = "1.4.0" lazy_static = "1.4.0"
beacon_chain = { path = "../../beacon_node/beacon_chain" } beacon_chain = { path = "../../beacon_node/beacon_chain" }
eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.3.0"
[dependencies] [dependencies]
ethereum-types = "0.11.0" ethereum-types = "0.11.0"


@ -12,6 +12,7 @@ use eth2_hashing::{hash_fixed, ZERO_HASHES, ZERO_HASHES_MAX_INDEX};
pub const BYTES_PER_CHUNK: usize = 32;
pub const HASHSIZE: usize = 32;
pub const MERKLE_HASH_CHUNK: usize = 2 * BYTES_PER_CHUNK;
pub const MAX_UNION_SELECTOR: u8 = 127;
pub type Hash256 = ethereum_types::H256;
@ -63,6 +64,31 @@ pub fn mix_in_length(root: &Hash256, length: usize) -> Hash256 {
Hash256::from_slice(&eth2_hashing::hash32_concat(root.as_bytes(), &length_bytes)[..]) Hash256::from_slice(&eth2_hashing::hash32_concat(root.as_bytes(), &length_bytes)[..])
} }
/// Returns `Some(root)` created by hashing `root` and `selector`, if `selector <=
/// MAX_UNION_SELECTOR`. Otherwise, returns `None`.
///
/// Used in `TreeHash` for the "union" type.
///
/// ## Specification
///
/// ```ignore,text
/// mix_in_selector: Given a Merkle root root and a type selector selector ("uint256" little-endian
/// serialization) return hash(root + selector).
/// ```
///
/// https://github.com/ethereum/consensus-specs/blob/v1.1.0-beta.3/ssz/simple-serialize.md#union
pub fn mix_in_selector(root: &Hash256, selector: u8) -> Option<Hash256> {
if selector > MAX_UNION_SELECTOR {
return None;
}
let mut chunk = [0; BYTES_PER_CHUNK];
chunk[0] = selector;
let root = eth2_hashing::hash32_concat(root.as_bytes(), &chunk);
Some(Hash256::from_slice(&root))
}
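A small usage sketch of the selector bound (not part of the diff):

```rust
use tree_hash::{mix_in_selector, Hash256};

fn selector_bounds_demo() {
    let root = Hash256::zero();
    // Selectors up to 127 produce a mixed-in root...
    assert!(mix_in_selector(&root, 127).is_some());
    // ...while reserved values (> 127) return `None`.
    assert!(mix_in_selector(&root, 128).is_none());
}
```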
/// Returns a cached padding node for a given height.
fn get_zero_hash(height: usize) -> &'static [u8] {
if height <= ZERO_HASHES_MAX_INDEX {


@ -0,0 +1,128 @@
use ssz_derive::Encode;
use tree_hash::{Hash256, MerkleHasher, TreeHash, BYTES_PER_CHUNK};
use tree_hash_derive::TreeHash;
#[derive(Encode)]
struct HashVec {
vec: Vec<u8>,
}
impl From<Vec<u8>> for HashVec {
fn from(vec: Vec<u8>) -> Self {
Self { vec }
}
}
impl tree_hash::TreeHash for HashVec {
fn tree_hash_type() -> tree_hash::TreeHashType {
tree_hash::TreeHashType::List
}
fn tree_hash_packed_encoding(&self) -> Vec<u8> {
unreachable!("List should never be packed.")
}
fn tree_hash_packing_factor() -> usize {
unreachable!("List should never be packed.")
}
fn tree_hash_root(&self) -> Hash256 {
let mut hasher =
MerkleHasher::with_leaves((self.vec.len() + BYTES_PER_CHUNK - 1) / BYTES_PER_CHUNK);
for item in &self.vec {
hasher.write(&item.tree_hash_packed_encoding()).unwrap()
}
let root = hasher.finish().unwrap();
tree_hash::mix_in_length(&root, self.vec.len())
}
}
fn mix_in_selector(a: Hash256, selector: u8) -> Hash256 {
let mut b = [0; 32];
b[0] = selector;
Hash256::from_slice(&eth2_hashing::hash32_concat(a.as_bytes(), &b))
}
fn u8_hash_concat(v1: u8, v2: u8) -> Hash256 {
let mut a = [0; 32];
let mut b = [0; 32];
a[0] = v1;
b[0] = v2;
Hash256::from_slice(&eth2_hashing::hash32_concat(&a, &b))
}
fn u8_hash(x: u8) -> Hash256 {
let mut a = [0; 32];
a[0] = x;
Hash256::from_slice(&a)
}
#[derive(TreeHash)]
#[tree_hash(enum_behaviour = "transparent")]
enum FixedTrans {
A(u8),
B(u8),
}
#[test]
fn fixed_trans() {
assert_eq!(FixedTrans::A(2).tree_hash_root(), u8_hash(2));
assert_eq!(FixedTrans::B(2).tree_hash_root(), u8_hash(2));
}
#[derive(TreeHash)]
#[tree_hash(enum_behaviour = "union")]
enum FixedUnion {
A(u8),
B(u8),
}
#[test]
fn fixed_union() {
assert_eq!(FixedUnion::A(2).tree_hash_root(), u8_hash_concat(2, 0));
assert_eq!(FixedUnion::B(2).tree_hash_root(), u8_hash_concat(2, 1));
}
#[derive(TreeHash)]
#[tree_hash(enum_behaviour = "transparent")]
enum VariableTrans {
A(HashVec),
B(HashVec),
}
#[test]
fn variable_trans() {
assert_eq!(
VariableTrans::A(HashVec::from(vec![2])).tree_hash_root(),
u8_hash_concat(2, 1)
);
assert_eq!(
VariableTrans::B(HashVec::from(vec![2])).tree_hash_root(),
u8_hash_concat(2, 1)
);
}
#[derive(TreeHash)]
#[tree_hash(enum_behaviour = "union")]
enum VariableUnion {
A(HashVec),
B(HashVec),
}
#[test]
fn variable_union() {
assert_eq!(
VariableUnion::A(HashVec::from(vec![2])).tree_hash_root(),
mix_in_selector(u8_hash_concat(2, 1), 0)
);
assert_eq!(
VariableUnion::B(HashVec::from(vec![2])).tree_hash_root(),
mix_in_selector(u8_hash_concat(2, 1), 1)
);
}


@ -1,6 +1,6 @@
[package] [package]
name = "tree_hash_derive" name = "tree_hash_derive"
version = "0.3.1" version = "0.4.0"
authors = ["Paul Hauner <paul@paulhauner.com>"] authors = ["Paul Hauner <paul@paulhauner.com>"]
edition = "2018" edition = "2018"
description = "Procedural derive macros to accompany the tree_hash crate." description = "Procedural derive macros to accompany the tree_hash crate."
@ -12,3 +12,4 @@ proc-macro = true
[dependencies] [dependencies]
syn = "1.0.42" syn = "1.0.42"
quote = "1.0.7" quote = "1.0.7"
darling = "0.13.0"


@ -1,8 +1,45 @@
#![recursion_limit = "256"] #![recursion_limit = "256"]
use darling::FromDeriveInput;
use proc_macro::TokenStream; use proc_macro::TokenStream;
use quote::quote; use quote::quote;
use std::convert::TryInto;
use syn::{parse_macro_input, Attribute, DataEnum, DataStruct, DeriveInput, Meta}; use syn::{parse_macro_input, Attribute, DataEnum, DataStruct, DeriveInput, Meta};
/// The highest possible union selector value (higher values are reserved for backwards compatible
/// extensions).
const MAX_UNION_SELECTOR: u8 = 127;
#[derive(Debug, FromDeriveInput)]
#[darling(attributes(tree_hash))]
struct StructOpts {
#[darling(default)]
enum_behaviour: Option<String>,
}
const ENUM_TRANSPARENT: &str = "transparent";
const ENUM_UNION: &str = "union";
const ENUM_VARIANTS: &[&str] = &[ENUM_TRANSPARENT, ENUM_UNION];
const NO_ENUM_BEHAVIOUR_ERROR: &str = "enums require an \"enum_behaviour\" attribute, \
e.g., #[tree_hash(enum_behaviour = \"transparent\")]";
enum EnumBehaviour {
Transparent,
Union,
}
impl EnumBehaviour {
pub fn new(s: Option<String>) -> Option<Self> {
s.map(|s| match s.as_ref() {
ENUM_TRANSPARENT => EnumBehaviour::Transparent,
ENUM_UNION => EnumBehaviour::Union,
other => panic!(
"{} is an invalid enum_behaviour, use either {:?}",
other, ENUM_VARIANTS
),
})
}
}
/// Return a Vec of `syn::Ident` for each named field in the struct, whilst filtering out fields /// Return a Vec of `syn::Ident` for each named field in the struct, whilst filtering out fields
/// that should not be hashed. /// that should not be hashed.
/// ///
@ -82,11 +119,21 @@ fn should_skip_hashing(field: &syn::Field) -> bool {
#[proc_macro_derive(TreeHash, attributes(tree_hash))] #[proc_macro_derive(TreeHash, attributes(tree_hash))]
pub fn tree_hash_derive(input: TokenStream) -> TokenStream { pub fn tree_hash_derive(input: TokenStream) -> TokenStream {
let item = parse_macro_input!(input as DeriveInput); let item = parse_macro_input!(input as DeriveInput);
let opts = StructOpts::from_derive_input(&item).unwrap();
let enum_opt = EnumBehaviour::new(opts.enum_behaviour);
match &item.data { match &item.data {
syn::Data::Struct(s) => tree_hash_derive_struct(&item, s), syn::Data::Struct(s) => {
syn::Data::Enum(e) => tree_hash_derive_enum(&item, e), if enum_opt.is_some() {
_ => panic!("tree_hash_derive only supports structs."), panic!("enum_behaviour is invalid for structs");
}
tree_hash_derive_struct(&item, s)
}
syn::Data::Enum(s) => match enum_opt.expect(NO_ENUM_BEHAVIOUR_ERROR) {
EnumBehaviour::Transparent => tree_hash_derive_enum_transparent(&item, s),
EnumBehaviour::Union => tree_hash_derive_enum_union(&item, s),
},
_ => panic!("tree_hash_derive only supports structs and enums."),
} }
} }
@ -126,15 +173,26 @@ fn tree_hash_derive_struct(item: &DeriveInput, struct_data: &DataStruct) -> Toke
output.into() output.into()
} }
/// Derive `TreeHash` for a restricted subset of all possible enum types. /// Derive `TreeHash` for an enum using the "transparent" method.
///
/// The "transparent" method is distinct from the "union" method specified in the SSZ specification.
/// When using "transparent", the enum will be ignored and the contained field will be hashed as if
/// the enum does not exist.
///
/// ## Limitations
/// ///
/// Only supports: /// Only supports:
/// - Enums with a single field per variant, where /// - Enums with a single field per variant, where
/// - All fields are "container" types. /// - All fields are "container" types.
/// ///
/// ## Panics
///
/// Will panic at compile-time if the single field requirement isn't met, but will panic *at run /// Will panic at compile-time if the single field requirement isn't met, but will panic *at run
/// time* if the container type requirement isn't met. /// time* if the container type requirement isn't met.
fn tree_hash_derive_enum(derive_input: &DeriveInput, enum_data: &DataEnum) -> TokenStream { fn tree_hash_derive_enum_transparent(
derive_input: &DeriveInput,
enum_data: &DataEnum,
) -> TokenStream {
let name = &derive_input.ident; let name = &derive_input.ident;
let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl(); let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl();
@ -181,7 +239,7 @@ fn tree_hash_derive_enum(derive_input: &DeriveInput, enum_data: &DataEnum) -> To
unreachable!("Enum should never be packed") unreachable!("Enum should never be packed")
} }
fn tree_hash_root(&self) -> Hash256 { fn tree_hash_root(&self) -> tree_hash::Hash256 {
match self { match self {
#( #(
#patterns => inner.tree_hash_root(), #patterns => inner.tree_hash_root(),
@ -192,3 +250,88 @@ fn tree_hash_derive_enum(derive_input: &DeriveInput, enum_data: &DataEnum) -> To
}; };
output.into() output.into()
} }
/// Derive `TreeHash` for an `enum` following the "union" SSZ spec.
///
/// The union selector will be determined based upon the order in which the enum variants are
/// defined. E.g., the top-most variant in the enum will have a selector of `0`, the variant
/// beneath it will have a selector of `1` and so on.
///
/// # Limitations
///
/// Only supports enums where each variant has a single field.
fn tree_hash_derive_enum_union(derive_input: &DeriveInput, enum_data: &DataEnum) -> TokenStream {
let name = &derive_input.ident;
let (impl_generics, ty_generics, where_clause) = &derive_input.generics.split_for_impl();
let patterns: Vec<_> = enum_data
.variants
.iter()
.map(|variant| {
let variant_name = &variant.ident;
if variant.fields.len() != 1 {
panic!("TreeHash can only be derived for enums with 1 field per variant");
}
quote! {
#name::#variant_name(ref inner)
}
})
.collect();
let union_selectors = compute_union_selectors(patterns.len());
let output = quote! {
impl #impl_generics tree_hash::TreeHash for #name #ty_generics #where_clause {
fn tree_hash_type() -> tree_hash::TreeHashType {
tree_hash::TreeHashType::Container
}
fn tree_hash_packed_encoding(&self) -> Vec<u8> {
unreachable!("Enum should never be packed")
}
fn tree_hash_packing_factor() -> usize {
unreachable!("Enum should never be packed")
}
fn tree_hash_root(&self) -> tree_hash::Hash256 {
match self {
#(
#patterns => {
let root = inner.tree_hash_root();
let selector = #union_selectors;
tree_hash::mix_in_selector(&root, selector)
.expect("derive macro should prevent out-of-bounds selectors")
},
)*
}
}
}
};
output.into()
}
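Put differently, the derived union root is the inner value's root mixed with the variant's position. A hypothetical sketch, assuming `tree_hash` and `tree_hash_derive` as dependencies:

```rust
use tree_hash::TreeHash;

#[derive(tree_hash_derive::TreeHash)]
#[tree_hash(enum_behaviour = "union")]
enum Example {
    A(u8),
    B(u8),
}

fn union_root_demo() {
    // The root of `B(1)` is the root of `1u8` mixed with selector `1`.
    let expected = tree_hash::mix_in_selector(&1u8.tree_hash_root(), 1).unwrap();
    assert_eq!(Example::B(1).tree_hash_root(), expected);
}
```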
fn compute_union_selectors(num_variants: usize) -> Vec<u8> {
let union_selectors = (0..num_variants)
.map(|i| {
i.try_into()
.expect("union selector exceeds u8::max_value, union has too many variants")
})
.collect::<Vec<u8>>();
let highest_selector = union_selectors
.last()
.copied()
.expect("0-variant union is not permitted");
assert!(
highest_selector <= MAX_UNION_SELECTOR,
"union selector {} exceeds limit of {}, enum has too many variants",
highest_selector,
MAX_UNION_SELECTOR
);
union_selectors
}


@ -25,13 +25,13 @@ safe_arith = { path = "../safe_arith" }
serde = {version = "1.0.116" , features = ["rc"] } serde = {version = "1.0.116" , features = ["rc"] }
serde_derive = "1.0.116" serde_derive = "1.0.116"
slog = "2.5.2" slog = "2.5.2"
eth2_ssz = "0.3.0" eth2_ssz = "0.4.0"
eth2_ssz_derive = "0.2.1" eth2_ssz_derive = "0.3.0"
eth2_ssz_types = "0.2.0" eth2_ssz_types = "0.2.1"
swap_or_not_shuffle = { path = "../swap_or_not_shuffle" } swap_or_not_shuffle = { path = "../swap_or_not_shuffle" }
test_random_derive = { path = "../../common/test_random_derive" } test_random_derive = { path = "../../common/test_random_derive" }
tree_hash = "0.3.0" tree_hash = "0.4.0"
tree_hash_derive = "0.3.1" tree_hash_derive = "0.4.0"
rand_xorshift = "0.2.0" rand_xorshift = "0.2.0"
cached_tree_hash = { path = "../cached_tree_hash" } cached_tree_hash = { path = "../cached_tree_hash" }
serde_yaml = "0.8.13" serde_yaml = "0.8.13"


@ -28,14 +28,19 @@ use tree_hash_derive::TreeHash;
TestRandom TestRandom
), ),
serde(bound = "T: EthSpec", deny_unknown_fields), serde(bound = "T: EthSpec", deny_unknown_fields),
cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary)) cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary)),
), ),
ref_attributes(derive(Debug, PartialEq, TreeHash)) ref_attributes(
derive(Debug, PartialEq, TreeHash),
tree_hash(enum_behaviour = "transparent")
)
)] )]
#[derive(Debug, PartialEq, Clone, Serialize, Deserialize, Encode, TreeHash)] #[derive(Debug, PartialEq, Clone, Serialize, Deserialize, Encode, TreeHash)]
#[serde(untagged)] #[serde(untagged)]
#[serde(bound = "T: EthSpec")] #[serde(bound = "T: EthSpec")]
#[cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary))] #[cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary))]
#[tree_hash(enum_behaviour = "transparent")]
#[ssz(enum_behaviour = "transparent")]
pub struct BeaconBlock<T: EthSpec> { pub struct BeaconBlock<T: EthSpec> {
#[superstruct(getter(copy))] #[superstruct(getter(copy))]
pub slot: Slot, pub slot: Slot,


@ -197,6 +197,8 @@ impl From<BeaconStateHash> for Hash256 {
#[serde(untagged)] #[serde(untagged)]
#[serde(bound = "T: EthSpec")] #[serde(bound = "T: EthSpec")]
#[cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary))] #[cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary))]
#[tree_hash(enum_behaviour = "transparent")]
#[ssz(enum_behaviour = "transparent")]
pub struct BeaconState<T> pub struct BeaconState<T>
where where
T: EthSpec, T: EthSpec,
@ -275,36 +277,31 @@ where
// Caching (not in the spec) // Caching (not in the spec)
#[serde(skip_serializing, skip_deserializing)] #[serde(skip_serializing, skip_deserializing)]
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
#[tree_hash(skip_hashing)] #[tree_hash(skip_hashing)]
#[test_random(default)] #[test_random(default)]
#[derivative(Clone(clone_with = "clone_default"))] #[derivative(Clone(clone_with = "clone_default"))]
pub total_active_balance: Option<(Epoch, u64)>, pub total_active_balance: Option<(Epoch, u64)>,
#[serde(skip_serializing, skip_deserializing)] #[serde(skip_serializing, skip_deserializing)]
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
#[tree_hash(skip_hashing)] #[tree_hash(skip_hashing)]
#[test_random(default)] #[test_random(default)]
#[derivative(Clone(clone_with = "clone_default"))] #[derivative(Clone(clone_with = "clone_default"))]
pub committee_caches: [CommitteeCache; CACHED_EPOCHS], pub committee_caches: [CommitteeCache; CACHED_EPOCHS],
#[serde(skip_serializing, skip_deserializing)] #[serde(skip_serializing, skip_deserializing)]
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
#[tree_hash(skip_hashing)] #[tree_hash(skip_hashing)]
#[test_random(default)] #[test_random(default)]
#[derivative(Clone(clone_with = "clone_default"))] #[derivative(Clone(clone_with = "clone_default"))]
pub pubkey_cache: PubkeyCache, pub pubkey_cache: PubkeyCache,
#[serde(skip_serializing, skip_deserializing)] #[serde(skip_serializing, skip_deserializing)]
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
#[tree_hash(skip_hashing)] #[tree_hash(skip_hashing)]
#[test_random(default)] #[test_random(default)]
#[derivative(Clone(clone_with = "clone_default"))] #[derivative(Clone(clone_with = "clone_default"))]
pub exit_cache: ExitCache, pub exit_cache: ExitCache,
#[serde(skip_serializing, skip_deserializing)] #[serde(skip_serializing, skip_deserializing)]
#[ssz(skip_serializing)] #[ssz(skip_serializing, skip_deserializing)]
#[ssz(skip_deserializing)]
#[tree_hash(skip_hashing)] #[tree_hash(skip_hashing)]
#[test_random(default)] #[test_random(default)]
#[derivative(Clone(clone_with = "clone_default"))] #[derivative(Clone(clone_with = "clone_default"))]


@ -5,19 +5,26 @@ use crate::*;
use core::num::NonZeroUsize; use core::num::NonZeroUsize;
use safe_arith::SafeArith; use safe_arith::SafeArith;
use serde_derive::{Deserialize, Serialize}; use serde_derive::{Deserialize, Serialize};
use ssz::{four_byte_option_impl, Decode, DecodeError, Encode};
use ssz_derive::{Decode, Encode}; use ssz_derive::{Decode, Encode};
use std::ops::Range; use std::ops::Range;
use swap_or_not_shuffle::shuffle_list; use swap_or_not_shuffle::shuffle_list;
mod tests; mod tests;
// Define "legacy" implementations of `Option<Epoch>`, `Option<NonZeroUsize>` which use four bytes
// for encoding the union selector.
four_byte_option_impl!(four_byte_option_epoch, Epoch);
four_byte_option_impl!(four_byte_option_non_zero_usize, NonZeroUsize);
/// Computes and stores the shuffling for an epoch. Provides various getters to allow callers to /// Computes and stores the shuffling for an epoch. Provides various getters to allow callers to
/// read the committees for the given epoch. /// read the committees for the given epoch.
#[derive(Debug, Default, PartialEq, Clone, Serialize, Deserialize, Encode, Decode)] #[derive(Debug, Default, PartialEq, Clone, Serialize, Deserialize, Encode, Decode)]
pub struct CommitteeCache { pub struct CommitteeCache {
#[ssz(with = "four_byte_option_epoch")]
initialized_epoch: Option<Epoch>, initialized_epoch: Option<Epoch>,
shuffling: Vec<usize>, shuffling: Vec<usize>,
shuffling_positions: Vec<Option<NonZeroUsize>>, shuffling_positions: Vec<NonZeroUsizeOption>,
committees_per_slot: u64, committees_per_slot: u64,
slots_per_epoch: u64, slots_per_epoch: u64,
} }
@ -63,11 +70,11 @@ impl CommitteeCache {
return Err(Error::TooManyValidators); return Err(Error::TooManyValidators);
} }
let mut shuffling_positions = vec![None; state.validators().len()]; let mut shuffling_positions = vec![<_>::default(); state.validators().len()];
for (i, &v) in shuffling.iter().enumerate() { for (i, &v) in shuffling.iter().enumerate() {
*shuffling_positions *shuffling_positions
.get_mut(v) .get_mut(v)
.ok_or(Error::ShuffleIndexOutOfBounds(v))? = NonZeroUsize::new(i + 1); .ok_or(Error::ShuffleIndexOutOfBounds(v))? = NonZeroUsize::new(i + 1).into();
} }
Ok(CommitteeCache { Ok(CommitteeCache {
@ -258,7 +265,8 @@ impl CommitteeCache {
pub fn shuffled_position(&self, validator_index: usize) -> Option<usize> { pub fn shuffled_position(&self, validator_index: usize) -> Option<usize> {
self.shuffling_positions self.shuffling_positions
.get(validator_index)? .get(validator_index)?
.and_then(|p| Some(p.get() - 1)) .0
.map(|p| p.get() - 1)
} }
} }
@ -324,3 +332,52 @@ impl arbitrary::Arbitrary for CommitteeCache {
Ok(Self::default()) Ok(Self::default())
} }
} }
/// This is a shim struct to ensure that we can encode a `Vec<Option<NonZeroUsize>>` as an SSZ union
/// with a four-byte selector. The SSZ specification changed from four bytes to one byte during 2021
/// and we use this shim to avoid breaking the Lighthouse database.
#[derive(Debug, Default, PartialEq, Clone, Serialize, Deserialize)]
#[serde(transparent)]
struct NonZeroUsizeOption(Option<NonZeroUsize>);
impl From<Option<NonZeroUsize>> for NonZeroUsizeOption {
fn from(opt: Option<NonZeroUsize>) -> Self {
Self(opt)
}
}
impl Encode for NonZeroUsizeOption {
fn is_ssz_fixed_len() -> bool {
four_byte_option_non_zero_usize::encode::is_ssz_fixed_len()
}
fn ssz_fixed_len() -> usize {
four_byte_option_non_zero_usize::encode::ssz_fixed_len()
}
fn ssz_bytes_len(&self) -> usize {
four_byte_option_non_zero_usize::encode::ssz_bytes_len(&self.0)
}
fn ssz_append(&self, buf: &mut Vec<u8>) {
four_byte_option_non_zero_usize::encode::ssz_append(&self.0, buf)
}
fn as_ssz_bytes(&self) -> Vec<u8> {
four_byte_option_non_zero_usize::encode::as_ssz_bytes(&self.0)
}
}
impl Decode for NonZeroUsizeOption {
fn is_ssz_fixed_len() -> bool {
four_byte_option_non_zero_usize::decode::is_ssz_fixed_len()
}
fn ssz_fixed_len() -> usize {
four_byte_option_non_zero_usize::decode::ssz_fixed_len()
}
fn from_ssz_bytes(bytes: &[u8]) -> Result<Self, DecodeError> {
four_byte_option_non_zero_usize::decode::from_ssz_bytes(bytes).map(Self)
}
}
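The wrapper type is needed because `#[ssz(with = "...")]` applies to a whole field, so the `Option`s inside the `Vec` need an element type with its own `Encode`/`Decode` impls. A minimal round-trip sketch (a hypothetical test, not part of the commit), assuming only the forwarding impls above and the standard `ssz` impls for `Vec<T>`:

```rust
#[cfg(test)]
mod non_zero_usize_option_sketch {
    use super::*;
    use ssz::{Decode, Encode};

    #[test]
    fn round_trip() {
        let positions: Vec<NonZeroUsizeOption> = vec![
            NonZeroUsize::new(3).into(), // stored with the legacy `Some` selector
            NonZeroUsizeOption(None),    // stored with the legacy `None` selector
        ];
        let bytes = positions.as_ssz_bytes();
        let decoded = Vec::<NonZeroUsizeOption>::from_ssz_bytes(&bytes).unwrap();
        assert_eq!(decoded, positions);
    }
}
```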
@@ -57,6 +57,8 @@ impl From<SignedBeaconBlockHash> for Hash256 {
 #[serde(untagged)]
 #[serde(bound = "E: EthSpec")]
 #[cfg_attr(feature = "arbitrary-fuzz", derive(arbitrary::Arbitrary))]
+#[tree_hash(enum_behaviour = "transparent")]
+#[ssz(enum_behaviour = "transparent")]
 pub struct SignedBeaconBlock<E: EthSpec> {
     #[superstruct(only(Base), partial_getter(rename = "message_base"))]
     pub message: BeaconBlockBase<E>,
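For context, a hypothetical sketch (not from this file) of how the two `enum_behaviour` modes differ, assuming the derive is imported from `ssz_derive`, that union selectors are assigned in declaration order starting at zero, and the attribute names used above; `SignedBeaconBlock` opts into `"transparent"` to keep its existing, selector-free encoding:

```rust
use ssz::Encode;
use ssz_derive::Encode;

// Hypothetical enums for illustration only.
#[derive(Encode)]
#[ssz(enum_behaviour = "union")]
enum AsUnion {
    A(u8),
    B(u16),
}

#[derive(Encode)]
#[ssz(enum_behaviour = "transparent")]
enum AsTransparent {
    A(u8),
    B(u16),
}

fn main() {
    // Union: a one-byte selector (assumed to be the variant index) then the inner value's bytes.
    assert_eq!(AsUnion::A(1).as_ssz_bytes(), vec![0, 1]);
    // Transparent: just the inner value's bytes, with no selector.
    assert_eq!(AsTransparent::A(1).as_ssz_bytes(), vec![1]);
}
```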
@@ -5,8 +5,8 @@ authors = ["Paul Hauner <paul@paulhauner.com>"]
 edition = "2018"

 [dependencies]
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
-tree_hash = "0.3.0"
+tree_hash = "0.4.0"
 milagro_bls = { git = "https://github.com/sigp/milagro_bls", tag = "v1.4.2", optional = true }
 rand = "0.7.3"
 serde = "1.0.116"
@@ -18,7 +18,7 @@ serde = "1.0.116"
 serde_repr = "0.1.6"
 hex = "0.4.2"
 bls = { path = "../bls" }
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
 serde_json = "1.0.58"
 eth2_key_derivation = { path = "../eth2_key_derivation" }
 unicode-normalization = "0.1.16"
@@ -19,7 +19,7 @@ serde_json = "1.0.66"
 env_logger = "0.9.0"
 types = { path = "../consensus/types" }
 state_processing = { path = "../consensus/state_processing" }
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
 regex = "1.3.9"
 futures = "0.3.7"
 environment = { path = "../lighthouse/environment" }
@@ -27,7 +27,7 @@ eth2_network_config = { path = "../common/eth2_network_config" }
 dirs = "3.0.1"
 genesis = { path = "../beacon_node/genesis" }
 deposit_contract = { path = "../common/deposit_contract" }
-tree_hash = "0.3.0"
+tree_hash = "0.4.0"
 tokio = { version = "1.10.0", features = ["full"] }
 clap_utils = { path = "../common/clap_utils" }
 eth2_libp2p = { path = "../beacon_node/eth2_libp2p" }
@@ -7,8 +7,8 @@ edition = "2018"
 [dependencies]
 bincode = "1.3.1"
 byteorder = "1.3.4"
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
-eth2_ssz_derive = "0.2.1"
+eth2_ssz_derive = "0.3.0"
 flate2 = { version = "1.0.14", features = ["zlib"], default-features = false }
 lazy_static = "1.4.0"
 lighthouse_metrics = { path = "../common/lighthouse_metrics" }
@@ -22,8 +22,8 @@ serde = "1.0"
 serde_derive = "1.0"
 slog = "2.5.2"
 sloggers = "2.0.2"
-tree_hash = "0.3.0"
+tree_hash = "0.4.0"
-tree_hash_derive = "0.3.1"
+tree_hash_derive = "0.4.0"
 types = { path = "../consensus/types" }

 [dev-dependencies]
@@ -22,10 +22,10 @@ serde = "1.0.116"
 serde_derive = "1.0.116"
 serde_repr = "0.1.6"
 serde_yaml = "0.8.13"
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
-eth2_ssz_derive = "0.2.1"
+eth2_ssz_derive = "0.3.0"
-tree_hash = "0.3.0"
+tree_hash = "0.4.0"
-tree_hash_derive = "0.3.1"
+tree_hash_derive = "0.4.0"
 cached_tree_hash = { path = "../../consensus/cached_tree_hash" }
 state_processing = { path = "../../consensus/state_processing" }
 swap_or_not_shuffle = { path = "../../consensus/swap_or_not_shuffle" }
@@ -26,7 +26,7 @@ pub struct Deltas {
     penalties: Vec<u64>,
 }

-#[derive(Debug, Clone, PartialEq, Decode, Encode, CompareFields)]
+#[derive(Debug, Clone, PartialEq, CompareFields)]
 pub struct AllDeltas {
     source_deltas: Deltas,
     target_deltas: Deltas,
@@ -9,6 +9,6 @@ edition = "2018"
 [dependencies]
 state_processing = { path = "../../consensus/state_processing" }
 types = { path = "../../consensus/types" }
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
 beacon_chain = { path = "../../beacon_node/beacon_chain" }
 lazy_static = "1.4.0"
@@ -13,9 +13,9 @@ tokio = { version = "1.10.0", features = ["time", "rt-multi-thread", "macros"] }
 deposit_contract = { path = "../common/deposit_contract" }

 [dependencies]
-eth2_ssz = "0.3.0"
+eth2_ssz = "0.4.0"
 eth2_config = { path = "../common/eth2_config" }
-tree_hash = "0.3.0"
+tree_hash = "0.4.0"
 clap = "2.33.3"
 eth2_interop_keypairs = { path = "../common/eth2_interop_keypairs" }
 slashing_protection = { path = "./slashing_protection" }
@@ -41,7 +41,7 @@ parking_lot = "0.11.0"
 exit-future = "0.2.0"
 filesystem = { path = "../common/filesystem" }
 libc = "0.2.79"
-eth2_ssz_derive = "0.2.1"
+eth2_ssz_derive = "0.3.0"
 hex = "0.4.2"
 deposit_contract = { path = "../common/deposit_contract" }
 bls = { path = "../crypto/bls" }
@@ -7,7 +7,7 @@ edition = "2018"
 [dependencies]
 tempfile = "3.1.0"
 types = { path = "../../consensus/types" }
-tree_hash = "0.3.0"
+tree_hash = "0.4.0"
 rusqlite = { version = "0.25.3", features = ["bundled"] }
 r2d2 = "0.8.9"
 r2d2_sqlite = "0.18.0"