Create lockup module JSON from LPS distribution excel file (#10)

Part of https://www.notion.so/Lockup-LPS-tokens-into-a-module-account-1f2a6b22d472802cbb35e8ce052d22ca?pvs=23

Co-authored-by: IshaVenikar <ishavenikar7@gmail.com>
Reviewed-on: #10
Co-authored-by: ishavenikar <ishavenikar@noreply.git.vdb.to>
Co-committed-by: ishavenikar <ishavenikar@noreply.git.vdb.to>
ishavenikar 2025-06-04 10:55:10 +00:00 committed by nabarun
parent b87609436a
commit 48939ffbd7
4 changed files with 205 additions and 3 deletions


@@ -10,11 +10,11 @@
 cargo install tmkms --features=softsign --version=0.14.0
 ```
 - testnet-state.json ([exported testnet state](./run-first-validator.md#export-testnet-state))
-- distribution.json (JSON containing the `lps_lockup` distribution)
+- LPS distribution Google spreadsheet URL or CSV file path

 ## Steps

-- In current working directory demo, keep exported `testnet-state.json` and `distribution.json` file from prerequisites
+- In current working directory demo, keep exported `testnet-state.json` file from prerequisites
 - Fetch stack:
@@ -22,6 +22,14 @@
 laconic-so fetch-stack git.vdb.to/cerc-io/laconicd-stack --git-ssh --pull
 ```
+- Generate LPS lockup distribution JSON file (leave the `~` unquoted so the shell expands it)
+```bash
+~/cerc/laconicd-stack/scripts/generate-lps-lock.sh -i "<lps-distribution-spreadsheet-url-or-file-path>" -d ~/cerc/laconicd-stack/data
+```
+- This will generate the `distribution.json` file
 - Export current working directory
 ```bash
@@ -32,7 +40,7 @@
 ```bash
 export EXPORTED_STATE_PATH=$CWD/testnet-state.json
-export LPS_DISTRIBUTION_PATH=$CWD/distribution.json
+export LPS_DISTRIBUTION_PATH=~/cerc/laconicd-stack/data/distribution.json
 # Test address that does not exist on testnet chain
 export EARLY_SUPPORTS_ACC_ADDR=laconic1gwytamfk3m5n0gsawh5vpwxkwd3vapmvzpp6nz


@@ -4,6 +4,7 @@
 - [ansible](playbooks/README.md#ansible-installation)
 - [laconic-so](https://github.com/cerc-io/stack-orchestrator/?tab=readme-ov-file#install)
+- LPS distribution Google spreadsheet URL or CSV file path

 ## Export testnet state
@@ -45,6 +46,14 @@
 - Copy over the exported `testnet-state.json` file to target machine
+- Generate LPS lockup distribution JSON file
+```bash
+~/cerc/laconicd-stack/scripts/generate-lps-lock.sh -i "<lps-distribution-spreadsheet-url-or-file-path>" -d "<destination-folder-for-json-file>"
+```
+- This will generate the `distribution.json` file
 - Copy over the LPS lockup distribution `distribution.json` file to target machine
 - Set envs:
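For reference, the generated `distribution.json` maps each placeholder (or the Laconic address, when no placeholder is set) to an allocation entry with the fields the generator script reads from the spreadsheet; the key and values below are made up for illustration:

```json
{
  "alice": {
    "total_lps_allocation": 1000.0,
    "lock_months": 6,
    "vest_months": 12,
    "laconic_address": "laconic1gwytamfk3m5n0gsawh5vpwxkwd3vapmvzpp6nz"
  }
}
```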

scripts/generate-lps-distribution-json.py New file

@@ -0,0 +1,128 @@
import requests
import pandas as pd
import json
import re
import argparse
from bech32 import bech32_decode

# Column names in the input CSV
PLACEHOLDER_COLUMN = 'Placeholder'
LACONIC_ADDRESS_COLUMN = 'Laconic Address'
TOTAL_LPS_ALLOCATION_COLUMN = 'Total LPS Allocation'
LOCK_MONTHS_COLUMN = 'Lock (months)'
VEST_MONTHS_COLUMN = 'Vest (months)'

# Required columns in the input CSV
REQUIRED_COLUMNS = [
    PLACEHOLDER_COLUMN,
    LACONIC_ADDRESS_COLUMN,
    TOTAL_LPS_ALLOCATION_COLUMN,
    LOCK_MONTHS_COLUMN,
    VEST_MONTHS_COLUMN
]


def to_number(val):
    """
    Convert a value to a number, handling empty values and invalid inputs.
    Returns None for empty or invalid values.
    """
    if pd.isna(val) or str(val).strip() == '':
        return None
    try:
        return float(val)
    except (ValueError, TypeError):
        return None


def get_csv_download_url(google_sheet_url):
    """
    Convert a full Google Sheets URL to a CSV export URL using the `gid` in the query string.
    """
    # Extract the sheet ID
    match = re.search(r'/d/([a-zA-Z0-9-_]+)', google_sheet_url)
    if not match:
        raise ValueError('Invalid Google Sheets URL')
    sheet_id = match.group(1)

    # Extract gid from query params
    gid_match = re.search(r'[?&]gid=([0-9]+)', google_sheet_url)
    if not gid_match:
        raise ValueError('Missing gid in Google Sheets URL')
    gid = gid_match.group(1)

    # Build export URL
    return f'https://docs.google.com/spreadsheets/d/{sheet_id}/export?format=csv&gid={gid}'


def download_csv(url, output_path):
    """
    Download the CSV file from the given URL.
    """
    response = requests.get(url, timeout=60)
    if response.status_code != 200:
        raise Exception(f'Failed to download file: {response.status_code}')
    with open(output_path, 'wb') as f:
        f.write(response.content)


def convert_csv_to_json(csv_path, json_path):
    """
    Read the CSV file, extract columns, and save as JSON.
    """
    df = pd.read_csv(csv_path)
    for col in REQUIRED_COLUMNS:
        if col not in df.columns:
            raise Exception(f'Missing required column: {col}')

    result = {}
    for _, row in df.iterrows():
        placeholder = str(row[PLACEHOLDER_COLUMN]) if not pd.isna(row[PLACEHOLDER_COLUMN]) else ''
        laconic_address = str(row[LACONIC_ADDRESS_COLUMN]) if not pd.isna(row[LACONIC_ADDRESS_COLUMN]) else ''

        # Use laconic_address as key if placeholder is missing or empty
        key = placeholder if placeholder and placeholder.lower() != 'nan' else laconic_address

        # Skip the row if both 'Placeholder' and 'Laconic Address' are missing or invalid
        if not key or key.lower() == 'nan':
            continue

        # If key is the laconic address, validate that it's a valid bech32 address
        if key == laconic_address:
            hrp, data = bech32_decode(laconic_address)
            if hrp is None or data is None or not hrp.startswith("laconic"):
                print(f"Skipping invalid Laconic address: {laconic_address}")
                continue

        entry = {
            'total_lps_allocation': to_number(row[TOTAL_LPS_ALLOCATION_COLUMN]),
            'lock_months': row[LOCK_MONTHS_COLUMN] if not pd.isna(row[LOCK_MONTHS_COLUMN]) else None,
            'vest_months': row[VEST_MONTHS_COLUMN] if not pd.isna(row[VEST_MONTHS_COLUMN]) else None,
            'laconic_address': row[LACONIC_ADDRESS_COLUMN] if not pd.isna(row[LACONIC_ADDRESS_COLUMN]) else None
        }
        result[key] = entry

    with open(json_path, 'w') as f:
        json.dump(result, f, indent=2)


def main():
    parser = argparse.ArgumentParser(description='Generate LPS distribution JSON from CSV or Google Sheet')
    parser.add_argument('--input', '-i', required=True, help='Input: Google Sheet URL or local CSV file path')
    parser.add_argument('--output', '-o', default='distribution.json', help='Output JSON file path (default: distribution.json)')
    args = parser.parse_args()

    if args.input.startswith('https://'):
        csv_url = get_csv_download_url(args.input)
        csv_path = 'sheet.csv'
        print(f'Downloading CSV file from: {csv_url}')
        download_csv(csv_url, csv_path)
    else:
        csv_path = args.input
        print(f'Using CSV file at path: {csv_path}')

    print('Converting CSV to JSON...')
    convert_csv_to_json(csv_path, args.output)
    print(f'JSON saved to {args.output}')


if __name__ == '__main__':
    main()
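The row-to-entry mapping above can be sketched with the standard library alone. This is a simplified illustration, not the script itself: it omits the pandas NaN handling and the bech32 address validation, keeps the month columns as raw strings, and the function name `rows_to_distribution` and the sample CSV text are hypothetical.

```python
import csv
import io
import json

def rows_to_distribution(csv_text):
    """Map CSV rows to distribution entries, keyed by 'Placeholder'
    with 'Laconic Address' as the fallback key."""
    result = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        placeholder = (row.get('Placeholder') or '').strip()
        address = (row.get('Laconic Address') or '').strip()
        key = placeholder or address
        if not key:
            continue  # skip rows with neither identifier
        result[key] = {
            'total_lps_allocation': float(row['Total LPS Allocation']) if row.get('Total LPS Allocation') else None,
            'lock_months': row.get('Lock (months)') or None,
            'vest_months': row.get('Vest (months)') or None,
            'laconic_address': address or None,
        }
    return result

sample = (
    'Placeholder,Laconic Address,Total LPS Allocation,Lock (months),Vest (months)\n'
    'alice,laconic1xyz,1000,6,12\n'
    ',laconic1abc,500,0,6\n'
)
print(json.dumps(rows_to_distribution(sample), indent=2))
```

The second sample row has no placeholder, so it is keyed by its address, mirroring the fallback logic in the script.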

scripts/generate-lps-lock.sh Executable file

@@ -0,0 +1,57 @@
#!/bin/bash
set -e

# Default values
INPUT=""
OUTPUT_DIR="."

# Parse command line arguments
while [[ $# -gt 0 ]]; do
  case $1 in
    -i|--input)
      INPUT="$2"
      shift 2
      ;;
    -d|--dir)
      OUTPUT_DIR="$2"
      shift 2
      ;;
    *)
      echo "Unknown option: $1"
      echo "Usage: $0 -i|--input <input_url_or_path> [-d|--dir <output_directory>]"
      exit 1
      ;;
  esac
done

# Check if input is provided
if [ -z "$INPUT" ]; then
  echo "Error: Input URL or path is required"
  echo "Usage: $0 -i|--input <input_url_or_path> [-d|--dir <output_directory>]"
  exit 1
fi

# Create output directory if it doesn't exist
mkdir -p "$OUTPUT_DIR"

venv_dir="$PWD/venv-lps-lock"
script_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# Create venv if it doesn't exist
if [ ! -d "$venv_dir" ]; then
  python3 -m venv "$venv_dir"
fi

# Activate venv and install dependencies
"$venv_dir/bin/pip" install --upgrade pip
"$venv_dir/bin/pip" install requests pandas openpyxl bech32

echo "Running LPS lock generation script..."
"$venv_dir/bin/python" "$script_dir/generate-lps-distribution-json.py" \
  --input "$INPUT" \
  --output "$OUTPUT_DIR/distribution.json"

# Clean up venv
echo "Cleaning up..."
rm -rf "$venv_dir"
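The `-i` argument accepts either a local CSV path or a Google Sheets URL; in the URL case, the Python script first rewrites the URL into a CSV export URL. A standalone sketch of that rewrite (the function name `csv_export_url` and the sample URL are hypothetical; note the sheet's `gid` must appear as a query parameter, `?gid=` or `&gid=`, not only as a `#gid=` fragment, or the script rejects the URL):

```python
import re

def csv_export_url(sheet_url):
    # Extract the document ID (/d/<sheet-id>) and the gid query parameter,
    # then build the corresponding CSV export URL.
    doc = re.search(r'/d/([a-zA-Z0-9-_]+)', sheet_url)
    gid = re.search(r'[?&]gid=([0-9]+)', sheet_url)
    if not doc or not gid:
        raise ValueError('URL must contain /d/<sheet-id> and a gid query parameter')
    return (f'https://docs.google.com/spreadsheets/d/{doc.group(1)}'
            f'/export?format=csv&gid={gid.group(1)}')

# Hypothetical sheet URL
print(csv_export_url('https://docs.google.com/spreadsheets/d/abc123/edit?gid=42'))
```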