Prevent adding duplicate validators to validator_definitions.yml (#2166)

## Issue Addressed

N/A

## Proposed Changes

This is mostly a UX improvement.

Currently, when recursively finding keystores, we only ignore keystores with the same path. This leads to potential issues when copying datadirs (e.g. copying a datadir to a new SSD with more storage). After copying the datadir and starting the VC, we will discover the copied keystores as new keystores and add them to the definitions file, leading to duplicate entries.

This PR avoids duplicate keystores being discovered as new keystores by checking for duplicate pubkeys as well.
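The idea can be sketched in isolation: collect the pubkeys already listed in `validator_definitions.yml` into a `HashSet`, then skip any discovered keystore whose pubkey is already known, regardless of its path. This is a minimal, simplified sketch, not the actual Lighthouse code: `String`s stand in for the real `PublicKey` type, and `(path, pubkey)` pairs stand in for parsed keystores.

```rust
use std::collections::HashSet;

// Given pubkeys already present in the definitions file (`known`) and
// keystores found on disk (`found`, as (path, pubkey) pairs), return only
// the paths of keystores whose pubkey is not already known.
fn discover_new(known: &[&str], found: &[(&str, &str)]) -> Vec<String> {
    // Pubkeys already recorded in validator_definitions.yml.
    let known_pubkeys: HashSet<&str> = known.iter().copied().collect();
    found
        .iter()
        .filter_map(|(path, pubkey)| {
            // Skip keystores whose pubkey is already known, even if the
            // path differs (e.g. after copying a datadir to a new disk).
            if known_pubkeys.contains(pubkey) {
                None
            } else {
                Some(path.to_string())
            }
        })
        .collect()
}

fn main() {
    let known = ["0xaa"];
    let found = [
        ("/new/ssd/keys/a.json", "0xaa"), // duplicate pubkey, new path
        ("/new/ssd/keys/b.json", "0xbb"), // genuinely new
    ];
    let new_paths = discover_new(&known, &found);
    assert_eq!(new_paths, vec!["/new/ssd/keys/b.json".to_string()]);
    println!("{:?}", new_paths);
}
```

Keying the dedup on the pubkey rather than the path is what makes copied datadirs safe: the path changes when files are copied, but the voting pubkey is stable.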
Pawan Dhananjay 2021-02-15 06:09:51 +00:00
parent 8e5c20b6d1
commit 6e6e9104f5


@@ -164,6 +164,12 @@ impl ValidatorDefinitions {
             })
             .collect();
+        let known_pubkeys: HashSet<PublicKey> = self
+            .0
+            .iter()
+            .map(|def| def.voting_public_key.clone())
+            .collect();
         let mut new_defs = keystore_paths
             .into_iter()
             .filter_map(|voting_keystore_path| {
@@ -200,7 +206,13 @@ impl ValidatorDefinitions {
                 .filter(|path| path.exists());
             let voting_public_key = match keystore.public_key() {
-                Some(pubkey) => pubkey,
+                Some(pubkey) => {
+                    if known_pubkeys.contains(&pubkey) {
+                        return None;
+                    } else {
+                        pubkey
+                    }
+                }
                 None => {
                     error!(
                         log,