Compare commits

4 commits: main ... nv-nitro-n

| Author | SHA1 | Date |
| --- | --- | --- |
| | 18056fd877 | |
| | 8c672c0966 | |
| | 98e882bf28 | |
| | 5ce343566e | |

.gitignore (vendored, new file, 2 lines)

@@ -0,0 +1,2 @@
*-deployment
*-spec.yml

@@ -99,7 +99,7 @@
cat <<EOF > stage0-deployment/config.env
# Set to true to enable adding participants functionality of the onboarding module
ONBOARDING_ENABLED=true

# A custom human readable name for this node
MONIKER=LaconicStage0
EOF

@@ -625,7 +625,7 @@
```bash
network:
  ports:
    laconic-console:
      console:
        - '127.0.0.1:4001:80'
```

ops/nitro-node.md (new file, 85 lines)

@@ -0,0 +1,85 @@
# Run nitro-nodes

## Setup

- Follow the [installation guide](https://github.com/deep-stack/ops/blob/ag-run-l2/README.md#installation) to set up Ansible on your machine; a generic install is sketched below
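
  If you just need a quick setup, Ansible can generally be installed via pip (an assumption; the linked guide is the authoritative method for this repo):

  ```bash
  # Generic pip-based Ansible install (assumed; defer to the installation guide)
  python3 -m pip install --user ansible

  # Confirm the install is on PATH
  ansible --version
  ```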

- Ensure laconic-so is installed

  ```bash
  laconic-so version
  ```

- Clone the ops repo

  ```bash
  git clone git@github.com:deep-stack/ops.git
  cd ops
  git checkout ag-run-l2
  ```

## Run l1, l2 nitro nodes

- Navigate to the `vulcanize/nitro-node-setup` directory

  ```bash
  cd vulcanize/nitro-node-setup
  ```

- Copy the `nitro-vars-example.yml` vars file

  ```bash
  cp nitro-vars-example.yml nitro-vars.yml
  ```

<!-- TODO: Provide nitro environment variables to user -->
- Edit [`nitro-vars.yml`](./nitro-vars.yml) and fill in the following values

  ```yaml
  # URL endpoint of the L1 chain
  l1_nitro_chain_url: ""

  # URL endpoint of the L2 chain
  l2_nitro_chain_url: ""

  # Private key for your nitro address
  nitro_sc_pk: ""

  # Private key of the account on chain that is used for funding channels in Nitro node
  nitro_chain_pk: ""

  # Contract address of NitroAdjudicator
  na_address: ""

  # Contract address of VirtualPaymentApp
  vpa_address: ""

  # Contract address of ConsensusApp
  ca_address: ""

  # Contract address of bridge
  bridge_address: ""

  # IP address of the bridge node
  nitro_bridge_ip: ""

  # Publicly accessible IP address of your nitro node
  nitro_node_ip: ""
  ```
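
  For orientation, a hypothetically filled-in `nitro-vars.yml` is sketched below; every value is an invented placeholder (documentation IPs and example hostnames, not real endpoints, keys, or contract addresses), and the real values must come from your own deployment:

  ```yaml
  # All values below are hypothetical placeholders
  l1_nitro_chain_url: "wss://l1-host.example.com:8546"
  l2_nitro_chain_url: "wss://l2-host.example.com:8546"
  nitro_sc_pk: "<hex-encoded private key>"
  nitro_chain_pk: "<hex-encoded private key>"
  na_address: "0x0000000000000000000000000000000000000001"
  vpa_address: "0x0000000000000000000000000000000000000002"
  ca_address: "0x0000000000000000000000000000000000000003"
  bridge_address: "0x0000000000000000000000000000000000000004"
  nitro_bridge_ip: "203.0.113.10"
  nitro_node_ip: "203.0.113.11"
  ```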

- To run the nitro nodes, execute the `run-nitro-node.yml` Ansible playbook with the command below. The trailing comma in `-i localhost,` tells Ansible to treat the value as a literal inventory list rather than an inventory file.

  NOTE: By default, deployments are created in the `nitro-node-setup/out` directory. If you need to change this location, update the `nitro_directory` variable in the [setup-vars.yml](./setup-vars.yml) file.

  ```bash
  LANG=en_IN.utf8 ansible-playbook -i localhost, --connection=local run-nitro-node.yml --extra-vars='{ "target_host": "localhost"}' --user $USER
  ```

- If you want to skip building the containers, set `"skip_container_build": true` in the `--extra-vars` parameter:

  ```bash
  LANG=en_IN.utf8 ansible-playbook -i localhost, --connection=local run-nitro-node.yml --extra-vars='{ "target_host": "localhost", "skip_container_build": true }' --user $USER
  ```

- Follow the steps in [Demo](https://git.vdb.to/cerc-io/nitro-stack/src/branch/main/nitro-bridge-demo.md#demo) to create mirror channels on L2, create a virtual channel, and make payments

@@ -22,12 +22,6 @@ Once all the participants have completed their onboarding, stage0 laconicd chain
laconic-so deployment --dir stage0-deployment logs laconicd -f --tail 30
```

* List the participants on stage0:

  ```bash
  laconic-so deployment --dir stage0-deployment exec laconicd "laconicd query onboarding list"
  ```

* Stop the stage0 deployment:

  ```bash

@@ -36,37 +30,29 @@ Once all the participants have completed their onboarding, stage0 laconicd chain

## Start stage1

-* Use the scripts in fixturenet-laconicd stack to generate genesis file for stage1 using onboarding participants from stage0 chain with token allocations:
+* Rebuild laconicd container with `>=v0.1.7` to enable `slashing` module:

  ```bash
  # laconicd source
  cd ~/cerc/laconicd

  # Pull latest changes
  git pull

  # Confirm the latest commit hash
  git log

  # Rebuild the containers
  cd /srv/laconicd

  # Set current working dir path in a variable
  DEPLOYMENTS_DIR=$(pwd)

  cd ~/cerc/fixturenet-laconicd-stack/stack-orchestrator/stacks/fixturenet-laconicd

  # Generate the genesis file
  # Participant allocation: 1000000000000 (10^12)
  # Validator allocation: 2000000000000000 (10^15)
  ./scripts/generate-stage1-genesis-using-allocations.sh $DEPLOYMENTS_DIR/stage0-deployment 1000000000000 2000000000000000

  # Expected output:
  # Genesis file for stage1 written to output/genesis.json

  # Remove the temporary data directory
  sudo rm -rf stage1-genesis

  # Go back to the directory where deployments are created
  cd $DEPLOYMENTS_DIR
  laconic-so --stack ~/cerc/fixturenet-laconicd-stack/stack-orchestrator/stacks/fixturenet-laconicd build-containers --force-rebuild
  ```

-* Copy over the generated genesis file (`.json`) containing the onboarding module state with funded participants to data directory in stage1 deployment (`stage1-deployment/data/genesis-config`):
+* Fetch the generated genesis file with stage1 participants and token allocations:

  ```bash
  cd /srv/laconicd

-  cp ~/cerc/fixturenet-laconicd-stack/stack-orchestrator/stacks/fixturenet-laconicd/output/genesis.json stage1-deployment/data/genesis-config/genesis.json
+  # Place in stage1 deployment directory
+  wget -O /srv/laconicd/stage1-deployment/data/genesis-config/genesis.json https://git.vdb.to/cerc-io/testnet-laconicd-stack/raw/branch/main/ops/stage1/genesis-accounts.json
  ```

* Start the stage1 deployment:

@@ -108,3 +94,78 @@ Once all the participants have completed their onboarding, stage0 laconicd chain
```

* Now users can follow the steps to [Join as a validator on stage1](https://git.vdb.to/cerc-io/testnet-laconicd-stack/src/branch/main/testnet-onboarding-validator.md#join-as-a-validator-on-stage1)

## Bank Transfer

* Transfer tokens to an address:

  ```bash
  cd /srv/laconicd

  RECEIVER_ADDRESS=
  AMOUNT=

  laconic-so deployment --dir stage1-deployment exec laconicd "laconicd tx bank send alice ${RECEIVER_ADDRESS} ${AMOUNT}alnt --from alice --fees 1000000alnt"
  ```

* Check balance:

  ```bash
  laconic-so deployment --dir stage1-deployment exec laconicd "laconicd query bank balances ${RECEIVER_ADDRESS}"
  ```

---

## Generating stage1 genesis

* The following steps are to be run on a local machine

* Clone repos:

  ```bash
  git clone git@git.vdb.to:cerc-io/testnet-laconicd-stack.git

  git clone git@git.vdb.to:cerc-io/fixturenet-laconicd-stack.git
  ```

* Create stage1 participants and allocations using the provided validators list:

  * Prerequisite: a `validators.csv` file with a list of laconic addresses, for example:

    ```csv
    laconic13ftz0c6cg6ttfda7ct4r6pf2j976zsey7l4wmj
    laconic1he4wjpfm5atwfvqurpg57ctp8chmxt9swf02dx
    laconic1wpsdkwz0t4ejdm7gcl7kn8989z88dd6wwy04np
    ...
    ```

  * Build:

    ```bash
    # Change to scripts dir
    cd testnet-laconicd-stack/scripts

    # Install dependencies and build
    yarn && yarn build
    ```

  * Run the script:

    ```bash
    yarn participants-with-filtered-validators --validators-csv ./validators.csv --participant-alloc 200000000000 --validator-alloc 1000200000000000 --output stage1-participants-$(date +"%Y-%m-%dT%H%M%S").json --output-allocs stage1-allocs-$(date +"%Y-%m-%dT%H%M%S").json

    # This should create two json files with stage1 participants and allocations
    ```

* Create stage1 genesis file:

  ```bash
  # Change to fixturenet-laconicd stack dir
  cd fixturenet-laconicd-stack/stack-orchestrator/stacks/fixturenet-laconicd

  # Generate genesis file
  # Provide absolute paths to generated stage1-participants and stage1-allocs files
  ./scripts/generate-stage1-genesis-from-json.sh /path/to/testnet-laconicd-stack/scripts/stage1-participants.json /path/to/testnet-laconicd-stack/scripts/stage1-allocs.json

  # This should generate the required genesis file at output/genesis.json
  ```

ops/stage1/genesis-accounts.json (new file, 117947 lines)

File diff suppressed because it is too large

scripts/.env.example (new file, 8 lines)

@@ -0,0 +1,8 @@
# Default: https://laconicd.laconic.com/api
LACONICD_GQL_ENDPOINT=

# Default: https://laconicd.laconic.com
LACONICD_RPC_ENDPOINT=

# Default: laconic_9000-1
LACONICD_CHAIN_ID=
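
These variables are consumed via dotenv by the scripts added later in this diff, which fall back to the documented defaults when a variable is unset; excerpted from `scripts/src/participants-with-filtered-validators.ts`:

```ts
import dotenv from 'dotenv';

dotenv.config();

// Fall back to the live testnet endpoints when a variable is unset
const LACONICD_GQL_ENDPOINT = process.env.LACONICD_GQL_ENDPOINT || 'https://laconicd.laconic.com/api';
const LACONICD_RPC_ENDPOINT = process.env.LACONICD_RPC_ENDPOINT || 'https://laconicd.laconic.com';
const LACONICD_CHAIN_ID = process.env.LACONICD_CHAIN_ID || 'laconic_9000-1';
```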

scripts/.gitignore (vendored, new file, 3 lines)

@@ -0,0 +1,3 @@
node_modules
dist
.env

scripts/.npmrc (new file, 1 line)

@@ -0,0 +1 @@
@cerc-io:registry=https://git.vdb.to/api/packages/cerc-io/npm/

scripts/README.md (new file, 45 lines)

@@ -0,0 +1,45 @@
# scripts

## Prerequisites

- NodeJS >= `v18.17.x`
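
  A quick way to confirm the installed version (assumes `node` is on PATH):

  ```bash
  node --version
  ```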

## Instructions

- Change to the scripts dir:

  ```bash
  cd scripts
  ```

- Install dependencies and build:

  ```bash
  yarn && yarn build
  ```

- Create the required env configuration:

  ```bash
  # Update the values as required
  # By default, the live laconicd testnet (laconicd.laconic.com) endpoints are configured
  cp .env.example .env
  ```

- Generate a list of onboarded participants and allocations with a given list of validators:

  ```bash
  yarn participants-with-filtered-validators --validators-csv <validators-csv-file> --participant-alloc <participant-alloc-amount> --validator-alloc <validator-alloc-amount> --output <output-json-file> --output-allocs <output-allocs-json-file>

  # Example:
  # yarn participants-with-filtered-validators --validators-csv ./validators.csv --participant-alloc 200000000000 --validator-alloc 1000200000000000 --output stage1-participants-$(date +"%Y-%m-%dT%H%M%S").json --output-allocs stage1-allocs-$(date +"%Y-%m-%dT%H%M%S").json
  ```

- Map subscribers to onboarded participants:

  ```bash
  yarn map-subscribers-to-participants --subscribers-csv <subscribers-csv-file> --output <output-csv-file>

  # Example:
  # yarn map-subscribers-to-participants --subscribers-csv subscribers.csv --output result-$(date +"%Y-%m-%dT%H%M%S").csv
  ```

scripts/package.json (new file, 29 lines)

@@ -0,0 +1,29 @@
{
  "name": "testnet-laconicd-stack",
  "version": "0.1.0",
  "main": "index.js",
  "repository": "git@git.vdb.to:cerc-io/testnet-laconicd-stack.git",
  "license": "UNLICENSED",
  "private": true,
  "devDependencies": {
    "@types/cli-progress": "^3.11.6",
    "@types/node": "^22.2.0",
    "@types/yargs": "^17.0.33",
    "typescript": "^5.5.4"
  },
  "dependencies": {
    "@cerc-io/registry-sdk": "^0.2.6",
    "@cosmjs/stargate": "^0.32.4",
    "csv-parse": "^5.5.6",
    "csv-parser": "^3.0.0",
    "csv-writer": "^1.6.0",
    "dotenv": "^16.4.5",
    "yargs": "^17.7.2"
  },
  "scripts": {
    "build": "tsc",
    "participants-with-filtered-validators": "node dist/participants-with-filtered-validators.js",
    "map-subscribers-to-participants": "node dist/map-subscribers-to-participants.js"
  },
  "packageManager": "yarn@1.22.19+sha1.4ba7fc5c6e704fce2066ecbfb0b0d8976fe62447"
}

scripts/src/map-subscribers-to-participants.ts (new file, 171 lines)

@@ -0,0 +1,171 @@
import * as fs from 'fs';
import * as crypto from 'crypto';
import * as path from 'path';
import yargs from 'yargs';
import { hideBin } from 'yargs/helpers';
import { parse as csvParse } from 'csv-parse';
import * as csvWriter from 'csv-writer';
import dotenv from 'dotenv';

import { StargateClient } from '@cosmjs/stargate';
import { Registry } from '@cerc-io/registry-sdk';
import { decodeTxRaw, decodePubkey } from '@cosmjs/proto-signing';

dotenv.config();

const LACONICD_GQL_ENDPOINT = process.env.LACONICD_GQL_ENDPOINT || 'https://laconicd.laconic.com/api';
const LACONICD_RPC_ENDPOINT = process.env.LACONICD_RPC_ENDPOINT || 'https://laconicd.laconic.com';
const LACONICD_CHAIN_ID = process.env.LACONICD_CHAIN_ID || 'laconic_9000-1';

async function main(): Promise<void> {
  const argv = _getArgv();

  const registry = new Registry(LACONICD_GQL_ENDPOINT, LACONICD_RPC_ENDPOINT, LACONICD_CHAIN_ID);
  const client = await StargateClient.connect(LACONICD_RPC_ENDPOINT);

  console.time('time_taken_getParticipants');
  const participants = await registry.getParticipants();
  console.timeEnd('time_taken_getParticipants');

  const subscribers = await readSubscribers(argv.subscribersCsv);
  console.log('Read subscribers, count:', subscribers.length);

  await processSubscribers(client, participants, subscribers, argv.output);
}

async function readSubscribers(subscribersCsvPath: string): Promise<any> {
  const fileContent = fs.readFileSync(path.resolve(subscribersCsvPath), { encoding: 'utf-8' });
  const headers = ['subscriber_id', 'email', 'status', 'premium?', 'created_at', 'api_subscription_id'];

  return csvParse(fileContent, { delimiter: ',', columns: headers }).toArray();
}
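
// The kycId stored on chain is the 0x-prefixed hex SHA-256 digest of the raw
// subscriber id, so hashing lets us match CSV subscribers to participants.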
function hashSubscriberId(subscriberId: string): string {
  return '0x' + crypto.createHash('sha256').update(subscriberId).digest('hex');
}
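
// Join CSV subscribers to on-chain participants (via the hashed subscriber id)
// and to their onboarding txs (via the tx signer), then write the result as CSV.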
async function processSubscribers(client: StargateClient, participants: any[], subscribers: any[], outputPath: string) {
  // Map kyc_id to participant data
  const kycMap: Record<string, any> = {};
  participants.forEach((participant: any) => {
    kycMap[participant.kycId] = participant;
  });
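
  // Scan the chain once for all onboarding txs; for each sender, record the
  // signer pubkey and the height of their latest onboarding tx.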
  const onboardingTxsHeightMap: Record<string, { txHeight: number, pubkey: string }> = {};
  console.time('time_taken_searchTx');
  const onboardingTxs = await client.searchTx(`message.action='/cerc.onboarding.v1.MsgOnboardParticipant'`);
  console.timeEnd('time_taken_searchTx');

  console.log('Fetched onboardingTxs, count:', onboardingTxs.length);

  console.time('time_taken_decodingTxs');
  onboardingTxs.forEach(onboardingTx => {
    const rawPubkey = decodeTxRaw(onboardingTx.tx).authInfo.signerInfos[0].publicKey;
    if (!rawPubkey) {
      console.error('pubkey not found in tx', onboardingTx.hash);
      return;
    }

    const pubkey = decodePubkey(rawPubkey).value;

    // Determine sender
    const onboardParticipantEvent = onboardingTx.events.find(event => event.type === 'onboard_participant');
    if (!onboardParticipantEvent) {
      console.error('onboard_participant event not found in tx', onboardingTx.hash);
      return;
    }

    const sender = onboardParticipantEvent.attributes.find(attr => attr.key === 'signer')?.value;
    if (!sender) {
      console.error('sender not found in onboard_participant event for tx', onboardingTx.hash);
      return;
    }

    // Keep the greater height if this sender already has an entry
    let latestTxHeight = onboardingTx.height;
    if (onboardingTxsHeightMap[sender]) {
      latestTxHeight = latestTxHeight > onboardingTxsHeightMap[sender].txHeight ? latestTxHeight : onboardingTxsHeightMap[sender].txHeight;
    }

    onboardingTxsHeightMap[sender] = { txHeight: latestTxHeight, pubkey };
  });
  console.timeEnd('time_taken_decodingTxs');
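
  // Walk the subscriber list, keeping only subscribers whose hashed id matches
  // an onboarded participant that also has an onboarding tx on chain.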
  const onboardedSubscribers: any[] = [];
  for (let i = 0; i < subscribers.length; i++) {
    const subscriber = subscribers[i];

    const hashedSubscriberId = hashSubscriberId(subscriber['subscriber_id']);
    const participant = kycMap[hashedSubscriberId];
    if (!participant) {
      continue;
    }

    const participantAddress = participant['cosmosAddress'];

    // Skip this participant if an onboarding tx was not found
    if (!onboardingTxsHeightMap[participantAddress]) {
      continue;
    }

    const onboardedSubscriber = {
      subscriber_id: subscriber['subscriber_id'],
      email: subscriber['email'],
      status: subscriber['status'],
      'premium?': subscriber['premium?'],
      created_at: subscriber['created_at'],
      laconic_address: participantAddress,
      nitro_address: participant['nitroAddress'],
      role: participant['role'],
      hashed_subscriber_id: participant['kycId'],
      laconic_pubkey: onboardingTxsHeightMap[participantAddress].pubkey,
      onboarding_height: onboardingTxsHeightMap[participantAddress].txHeight
    };

    onboardedSubscribers.push(onboardedSubscriber);
  }

  const writer = csvWriter.createObjectCsvWriter({
    path: path.resolve(outputPath),
    header: [
      { id: 'subscriber_id', title: 'subscriber_id' },
      { id: 'email', title: 'email' },
      { id: 'status', title: 'status' },
      { id: 'premium?', title: 'premium?' },
      { id: 'created_at', title: 'created_at' },
      { id: 'laconic_address', title: 'laconic_address' },
      { id: 'nitro_address', title: 'nitro_address' },
      { id: 'role', title: 'role' },
      { id: 'hashed_subscriber_id', title: 'hashed_subscriber_id' },
      { id: 'laconic_pubkey', title: 'laconic_pubkey' },
      { id: 'onboarding_height', title: 'onboarding_height' },
    ],
    alwaysQuote: true
  });

  await writer.writeRecords(onboardedSubscribers);

  console.log(`Data has been written to ${path.resolve(outputPath)}`);
}

function _getArgv (): any {
  return yargs(hideBin(process.argv))
    .option('subscribersCsv', {
      alias: 's',
      type: 'string',
      demandOption: true,
      describe: 'Path to the subscribers CSV file',
    })
    .option('output', {
      alias: 'o',
      type: 'string',
      demandOption: true,
      describe: 'Path to the output CSV file',
    })
    .help()
    .argv;
}

main().catch(err => {
  console.log(err);
});

scripts/src/participants-with-filtered-validators.ts (new file, 114 lines)

@@ -0,0 +1,114 @@
import * as fs from 'fs';
import * as path from 'path';
import yargs from 'yargs';
import { hideBin } from 'yargs/helpers';
import dotenv from 'dotenv';

import { Registry } from '@cerc-io/registry-sdk';

dotenv.config();

const LACONICD_GQL_ENDPOINT = process.env.LACONICD_GQL_ENDPOINT || 'https://laconicd.laconic.com/api';
const LACONICD_RPC_ENDPOINT = process.env.LACONICD_RPC_ENDPOINT || 'https://laconicd.laconic.com';
const LACONICD_CHAIN_ID = process.env.LACONICD_CHAIN_ID || 'laconic_9000-1';

async function main(): Promise<void> {
  const argv = _getArgv();

  const registry = new Registry(LACONICD_GQL_ENDPOINT, LACONICD_RPC_ENDPOINT, LACONICD_CHAIN_ID);

  console.time('time_taken_getParticipants');
  const participants = await registry.getParticipants();
  console.log('Fetched participants, count:', participants.length);
  console.timeEnd('time_taken_getParticipants');

  let validators: Array<string> = await readValidators(argv.validatorsCsv);
  console.log('Read validators, count:', validators.length);

  let stage1Allocations: Array<{ 'cosmos_address': string, balance: string }> = [];
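
  // Tag each on-chain participant as 'validator' or 'participant' and record
  // the matching stage1 balance allocation for it.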
  const stage1Participants = participants.map((participant: any) => {
    const outputParticipant: any = {
      'cosmos_address': participant.cosmosAddress,
      'nitro_address': participant.nitroAddress,
      'kyc_id': participant.kycId
    };

    if (validators.includes(participant.cosmosAddress)) {
      outputParticipant.role = 'validator';

      stage1Allocations.push({
        cosmos_address: participant.cosmosAddress,
        balance: argv.validatorAlloc
      });

      // Remove processed participant from validators list
      validators = validators.filter(val => val !== participant.cosmosAddress);
    } else {
      outputParticipant.role = 'participant';

      stage1Allocations.push({
        cosmos_address: participant.cosmosAddress,
        balance: argv.participantAlloc
      });
    }

    return outputParticipant;
  });

  // Provide allocs for remaining validators
  validators.forEach(val => {
    stage1Allocations.push({
      cosmos_address: val,
      balance: argv.validatorAlloc
    });
  });

  const participantsOutputFilePath = path.resolve(argv.output);
  fs.writeFileSync(participantsOutputFilePath, JSON.stringify(stage1Participants, null, 2));
  console.log(`Onboarded participants with filtered validators written to ${participantsOutputFilePath}`);

  const allocsOutputFilePath = path.resolve(argv.outputAllocs);
  fs.writeFileSync(allocsOutputFilePath, JSON.stringify(stage1Allocations, null, 2));
  console.log(`Stage1 allocations written to ${allocsOutputFilePath}`);
}

async function readValidators(validatorsCsvPath: string): Promise<any> {
  const fileContent = fs.readFileSync(path.resolve(validatorsCsvPath), { encoding: 'utf-8' });

  // Split on CRLF or LF and drop empty lines so a trailing newline does not
  // produce an empty address entry
  return fileContent.split(/\r?\n/).map(address => address.trim()).filter(address => address.length > 0);
}

function _getArgv (): any {
  return yargs(hideBin(process.argv))
    .option('validatorsCsv', {
      type: 'string',
      demandOption: true,
      describe: 'Path to a CSV file with validators list',
    })
    .option('participantAlloc', {
      type: 'string',
      demandOption: true,
      describe: 'Participant stage1 balance allocation',
    })
    .option('validatorAlloc', {
      type: 'string',
      demandOption: true,
      describe: 'Validator stage1 balance allocation',
    })
    .option('output', {
      type: 'string',
      demandOption: true,
      describe: 'Path to the output JSON file',
    })
    .option('outputAllocs', {
      type: 'string',
      demandOption: true,
      describe: 'Path to the output JSON file with allocs',
    })
    .help()
    .argv;
}

main().catch(err => {
  console.log(err);
});

scripts/tsconfig.json (new file, 110 lines)

@@ -0,0 +1,110 @@
{
  "compilerOptions": {
    /* Visit https://aka.ms/tsconfig to read more about this file */

    /* Projects */
    // "incremental": true,                              /* Save .tsbuildinfo files to allow for incremental compilation of projects. */
    // "composite": true,                                /* Enable constraints that allow a TypeScript project to be used with project references. */
    // "tsBuildInfoFile": "./.tsbuildinfo",              /* Specify the path to .tsbuildinfo incremental compilation file. */
    // "disableSourceOfProjectReferenceRedirect": true,  /* Disable preferring source files instead of declaration files when referencing composite projects. */
    // "disableSolutionSearching": true,                 /* Opt a project out of multi-project reference checking when editing. */
    // "disableReferencedProjectLoad": true,             /* Reduce the number of projects loaded automatically by TypeScript. */

    /* Language and Environment */
    "target": "es2016",                                  /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
    // "lib": [],                                        /* Specify a set of bundled library declaration files that describe the target runtime environment. */
    // "jsx": "preserve",                                /* Specify what JSX code is generated. */
    // "experimentalDecorators": true,                   /* Enable experimental support for legacy experimental decorators. */
    // "emitDecoratorMetadata": true,                    /* Emit design-type metadata for decorated declarations in source files. */
    // "jsxFactory": "",                                 /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h'. */
    // "jsxFragmentFactory": "",                         /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
    // "jsxImportSource": "",                            /* Specify module specifier used to import the JSX factory functions when using 'jsx: react-jsx*'. */
    // "reactNamespace": "",                             /* Specify the object invoked for 'createElement'. This only applies when targeting 'react' JSX emit. */
    // "noLib": true,                                    /* Disable including any library files, including the default lib.d.ts. */
    // "useDefineForClassFields": true,                  /* Emit ECMAScript-standard-compliant class fields. */
    // "moduleDetection": "auto",                        /* Control what method is used to detect module-format JS files. */

    /* Modules */
    "module": "commonjs",                                /* Specify what module code is generated. */
    // "rootDir": "./",                                  /* Specify the root folder within your source files. */
    // "moduleResolution": "node10",                     /* Specify how TypeScript looks up a file from a given module specifier. */
    // "baseUrl": "./",                                  /* Specify the base directory to resolve non-relative module names. */
    // "paths": {},                                      /* Specify a set of entries that re-map imports to additional lookup locations. */
    // "rootDirs": [],                                   /* Allow multiple folders to be treated as one when resolving modules. */
    // "typeRoots": [],                                  /* Specify multiple folders that act like './node_modules/@types'. */
    // "types": [],                                      /* Specify type package names to be included without being referenced in a source file. */
    // "allowUmdGlobalAccess": true,                     /* Allow accessing UMD globals from modules. */
    // "moduleSuffixes": [],                             /* List of file name suffixes to search when resolving a module. */
    // "allowImportingTsExtensions": true,               /* Allow imports to include TypeScript file extensions. Requires '--moduleResolution bundler' and either '--noEmit' or '--emitDeclarationOnly' to be set. */
    // "resolvePackageJsonExports": true,                /* Use the package.json 'exports' field when resolving package imports. */
    // "resolvePackageJsonImports": true,                /* Use the package.json 'imports' field when resolving imports. */
    // "customConditions": [],                           /* Conditions to set in addition to the resolver-specific defaults when resolving imports. */
    "resolveJsonModule": true,                           /* Enable importing .json files. */
    // "allowArbitraryExtensions": true,                 /* Enable importing files with any extension, provided a declaration file is present. */
    // "noResolve": true,                                /* Disallow 'import's, 'require's or '<reference>'s from expanding the number of files TypeScript should add to a project. */

    /* JavaScript Support */
    // "allowJs": true,                                  /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */
    // "checkJs": true,                                  /* Enable error reporting in type-checked JavaScript files. */
    // "maxNodeModuleJsDepth": 1,                        /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */

    /* Emit */
    "declaration": true,                                 /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
    // "declarationMap": true,                           /* Create sourcemaps for d.ts files. */
    // "emitDeclarationOnly": true,                      /* Only output d.ts files and not JavaScript files. */
    "sourceMap": true,                                   /* Create source map files for emitted JavaScript files. */
    // "inlineSourceMap": true,                          /* Include sourcemap files inside the emitted JavaScript. */
    // "outFile": "./",                                  /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */
    "outDir": "dist",                                    /* Specify an output folder for all emitted files. */
    // "removeComments": true,                           /* Disable emitting comments. */
    // "noEmit": true,                                   /* Disable emitting files from a compilation. */
    // "importHelpers": true,                            /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
    // "downlevelIteration": true,                       /* Emit more compliant, but verbose and less performant JavaScript for iteration. */
    // "sourceRoot": "",                                 /* Specify the root path for debuggers to find the reference source code. */
    // "mapRoot": "",                                    /* Specify the location where debugger should locate map files instead of generated locations. */
    // "inlineSources": true,                            /* Include source code in the sourcemaps inside the emitted JavaScript. */
    // "emitBOM": true,                                  /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */
    // "newLine": "crlf",                                /* Set the newline character for emitting files. */
    // "stripInternal": true,                            /* Disable emitting declarations that have '@internal' in their JSDoc comments. */
    // "noEmitHelpers": true,                            /* Disable generating custom helper functions like '__extends' in compiled output. */
    // "noEmitOnError": true,                            /* Disable emitting files if any type checking errors are reported. */
    // "preserveConstEnums": true,                       /* Disable erasing 'const enum' declarations in generated code. */
    // "declarationDir": "./",                           /* Specify the output directory for generated declaration files. */

    /* Interop Constraints */
    // "isolatedModules": true,                          /* Ensure that each file can be safely transpiled without relying on other imports. */
    // "verbatimModuleSyntax": true,                     /* Do not transform or elide any imports or exports not marked as type-only, ensuring they are written in the output file's format based on the 'module' setting. */
    // "isolatedDeclarations": true,                     /* Require sufficient annotation on exports so other tools can trivially generate declaration files. */
    // "allowSyntheticDefaultImports": true,             /* Allow 'import x from y' when a module doesn't have a default export. */
    "esModuleInterop": true,                             /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */
    // "preserveSymlinks": true,                         /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */
    "forceConsistentCasingInFileNames": true,            /* Ensure that casing is correct in imports. */

    /* Type Checking */
    "strict": true,                                      /* Enable all strict type-checking options. */
    // "noImplicitAny": true,                            /* Enable error reporting for expressions and declarations with an implied 'any' type. */
    // "strictNullChecks": true,                         /* When type checking, take into account 'null' and 'undefined'. */
    // "strictFunctionTypes": true,                      /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
    // "strictBindCallApply": true,                      /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */
    // "strictPropertyInitialization": true,             /* Check for class properties that are declared but not set in the constructor. */
    // "noImplicitThis": true,                           /* Enable error reporting when 'this' is given the type 'any'. */
    // "useUnknownInCatchVariables": true,               /* Default catch clause variables as 'unknown' instead of 'any'. */
    // "alwaysStrict": true,                             /* Ensure 'use strict' is always emitted. */
    // "noUnusedLocals": true,                           /* Enable error reporting when local variables aren't read. */
    // "noUnusedParameters": true,                       /* Raise an error when a function parameter isn't read. */
    // "exactOptionalPropertyTypes": true,               /* Interpret optional property types as written, rather than adding 'undefined'. */
    // "noImplicitReturns": true,                        /* Enable error reporting for codepaths that do not explicitly return in a function. */
    // "noFallthroughCasesInSwitch": true,               /* Enable error reporting for fallthrough cases in switch statements. */
    // "noUncheckedIndexedAccess": true,                 /* Add 'undefined' to a type when accessed using an index. */
    // "noImplicitOverride": true,                       /* Ensure overriding members in derived classes are marked with an override modifier. */
    // "noPropertyAccessFromIndexSignature": true,       /* Enforces using indexed accessors for keys declared using an indexed type. */
    // "allowUnusedLabels": true,                        /* Disable error reporting for unused labels. */
    // "allowUnreachableCode": true,                     /* Disable error reporting for unreachable code. */

    /* Completeness */
    // "skipDefaultLibCheck": true,                      /* Skip type checking .d.ts files that are included with TypeScript. */
    "skipLibCheck": true                                 /* Skip type checking all .d.ts files. */
  },
  "include": ["src"],
  "exclude": ["dist"]
}

scripts/yarn.lock (new file, 1999 lines)

File diff suppressed because it is too large

@@ -13,7 +13,7 @@ services:
   registry:
     rpcEndpoint: ${CERC_LACONICD_RPC_ENDPOINT}
     gqlEndpoint: ${CERC_LACONICD_GQL_ENDPOINT}
-    userKey: ${CERC_LACONICD_USER_KEY}
+    userKey: "${CERC_LACONICD_USER_KEY}"
     bondId: ${CERC_LACONICD_BOND_ID}
     chainId: ${CERC_LACONICD_CHAIN_ID}
     gas: ${CERC_LACONICD_GAS}

@@ -266,7 +266,7 @@ laconic-so deployment --dir laconic-console-deployment start
 # services:
 #   registry:
 #     ...
-#     userKey: <your-private-key>
+#     userKey: "<your-private-key>"
 #     ...

 # Note: any changes made to the config will be lost when the cli Docker container is brought down

@@ -277,7 +277,7 @@ laconic-so deployment --dir laconic-console-deployment start

 ```bash
 # Example
-laconic-so deployment --dir laconic-console-deployment exec cli "laconic registry bond create --type photon --quantity 1000000000000"
+laconic-so deployment --dir laconic-console-deployment exec cli "laconic registry bond create --type alnt --quantity 1000000000000"
 ```

 ## Clean up