initial commit (1130af7377)

README.md (new file, 306 lines)
# lockdrop-simulation

## Approach

The lockdrop simulation validates the Zenith Network's token distribution mechanism by creating a realistic test environment without requiring real Ethereum data or live participants.

- **Generate mock participants** with proper Ethereum/Zenith addresses and distribute Urbit points (galaxies/stars) based on configurable participation rates
- **Simulate the TGE event** to process participants and generate a genesis file with treasury allocations and unlock schedules as per [distribution-simulate-lockdrop.json](./distribution-simulate-lockdrop.json)
- **Deploy a test zenithd validator** using the simulated genesis file to validate the actual token distribution and accrual implementation
- **Compare expected calculations** (via a Jupyter notebook) against the live node's API responses to ensure mathematical correctness
- **Provide end-to-end verification** that the entire flow from participant generation to live blockchain queries works correctly in a controlled, reproducible environment

### Simulation Features

- **Urbit Point Allocation**: Uses real Urbit naming conventions and point hierarchy
- **Valid Ethereum Addresses**: Generates Ethereum key pairs for all participants
- **Valid Zenith Addresses**: Generates Zenith addresses for all participants
- **Realistic Attestations**: Creates attestations following the expected format
- **Token Calculations**: Computes treasury allocations for lockdrop participants based on total supply and participant count
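To make the token calculation concrete, here is a minimal sketch of an even split of the lockdrop pool across participants. The total supply figure is a placeholder, and the even split is a simplification: the actual implementation weights allocations by point type (galaxy/star) and lock period, as the test suite's per-lock-period comparisons show. The 30% lockdrop share comes from `distribution-simulate-lockdrop.json`.

```python
# Simplified sketch only; the real allocation logic weights by point type and lock period.
TOTAL_SUPPLY = 1_000_000_000     # placeholder figure, not the real $Z supply
LOCKDROP_SHARE_PERCENT = 30      # "lockdrop" total_percent_share in distribution-simulate-lockdrop.json

def even_split_allocation(participant_count: int) -> int:
    """Even (integer) split of the lockdrop treasury pool across all participants."""
    lockdrop_pool = TOTAL_SUPPLY * LOCKDROP_SHARE_PERCENT // 100
    return lockdrop_pool // participant_count

print(even_split_allocation(100))  # → 3000000
```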
## Setup

1. **Install Prerequisites**

   Clone the zenith-stack repository, which contains the Ansible playbooks:

   ```bash
   git clone git@git.vdb.to:LaconicNetwork/zenith-stack.git
   ```

   **Note**: Replace `<path/to/zenith-stack>` in the commands below with the actual path where you cloned the zenith-stack repository.

   Go to the directory where the playbooks are located:

   ```bash
   cd <path/to/zenith-stack>/ansible
   ```

   Install `zenith-ansible`:

   ```bash
   # Download the binary from the generic package registry
   curl -OJ https://git.vdb.to/api/packages/LaconicNetwork/generic/zenith-stack/v0.2.4/zenith-ansible
   ```

   Make it executable:

   ```bash
   chmod +x ./zenith-ansible
   ```

   For more details about `zenith-ansible`, check this [doc](./ansible/zenith-ansible-shiv/README.md).
2. **Configure Variables**

   Configure the variables required for this simulation by following [this guide](../ansible/zenith-config-cli/docs/stage1-lockdrop-simulation.md).

   The configuration tool allows flexible simulation parameters:

   - **Participant count**: Configures the total number of mock participants
   - **Galaxy allocation**: Determines how many participants will be validators
   - **Star distribution**: Controls the total star pool available for allocation
3. **Setup Deployment Directories**

   Create the data directory required for the simulation (it must be the same as the path configured for `lockdrop simulation data directory` in the previous step):

   ```bash
   # For example:
   mkdir /home/$USER/stage1-lockdrop-simulation
   ```

   Make sure you are in the `ansible` directory:

   ```bash
   cd <path/to/zenith-stack>/ansible
   ```

   Set up the deployment directories and pull the required Docker images to generate the base genesis file along with other artifacts:

   ```bash
   ./zenith-ansible -i ./inventories/development/hosts.yml tge-site.yml -e "mode=setup"
   ```

   Set up the deployment directories and pull the required Docker images to sign the gentx and set up the stage 1 validator node:

   ```bash
   ./zenith-ansible -i ./inventories/development/hosts.yml stage1-site.yml -e "mode=setup" --skip-tags onboarding
   ```
## Run Simulation

### Step 1: Simulated Token Genesis Event

The following command creates the base genesis file while simulating lockdrop participants:

```bash
./zenith-ansible -i ./inventories/development/hosts.yml tge-site.yml -e "mode=simulate-lockdrop"
```

This will generate the base genesis file at `<path/to/zenith-stack>/base-genesis-file/genesis.json`.

It will also generate the following files in `<path/to/zenith-stack>/lockdrop-simulation/generated`:

```bash
lockdrop-simulation/generated/
├── generated-participants.json   # Mock participant data with attestations
├── generated-accounts.json       # Ethereum and Zenith account pairs
├── point-allocation-stats.json   # Statistics about galaxy/star allocation
└── watcher-events.json           # Simulated lockdrop contract events
```

[distribution-simulate-lockdrop.json](./distribution-simulate-lockdrop.json) is used for the category-wise allocation of `$Z` with the respective vesting/unlock schedules (the unlock frequency is reduced to 60 seconds, i.e. 30 blocks, for lockdrop participants for demo purposes).
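The parenthetical above (60 seconds, 30 blocks) follows from the ~2-second block duration assumed in the test suite (`BLOCK_DURATION_SECONDS` in `tests/base_test.py`):

```python
BLOCK_DURATION_SECONDS = 2     # same constant the test suite uses

unlock_frequency_seconds = 60  # lockdrop "unlock_frequency" in distribution-simulate-lockdrop.json
unlock_frequency_blocks = unlock_frequency_seconds // BLOCK_DURATION_SECONDS
print(unlock_frequency_blocks)  # → 30
```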
### Step 2: Genesis Transaction (Gentx) Signing

Since we have generated dummy accounts, we can access their private keys in `generated-accounts.json`.

Get the private key of the first account in this file:

```bash
# Working directory: <path/to/zenith-stack>/ansible
jq -r '.[0].zenithPrivateKey' ../lockdrop-simulation/generated/generated-accounts.json
```

Note this private key down, as it will be required in the next step.
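If you prefer Python over jq, the same field can be read with a short helper (a sketch; the `zenithPrivateKey` key matches the jq filter above):

```python
import json

def first_zenith_private_key(path: str) -> str:
    """Return the first account's zenithPrivateKey from generated-accounts.json."""
    with open(path) as f:
        accounts = json.load(f)
    return accounts[0]["zenithPrivateKey"]

# Example, run from the ansible directory after Step 1:
# first_zenith_private_key("../lockdrop-simulation/generated/generated-accounts.json")
```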
Now run the playbook to sign the gentx and generate the final genesis file:

```bash
./zenith-ansible -i ./inventories/development/hosts.yml stage1-site.yml -e "mode=sign"
```

Use the private key noted above when prompted.

This will:

- Automatically extract the pubkey of your validator node
- Create a genesis transaction (gentx) using the validator public key and private key
- Combine the base genesis file with the bootstrap validator gentx
- Generate the final genesis file
- Copy the final genesis to `<path/to/zenith-stack>/genesis-file/genesis.json`
### Step 3: Start Bootstrap Validator

Now, we can use this genesis file to run the stage 1 validator node:

```bash
./zenith-ansible -i ./inventories/development/hosts.yml stage1-site.yml -e "mode=start" --skip-tags onboarding
```

After starting the node, verify that it's running correctly:

```bash
# Set the data directory (should match the configuration)
export DATA_DIRECTORY=/absolute/path/to/data/directory

# Check validator logs
laconic-so deployment --dir $DATA_DIRECTORY/mainnet-zenithd-deployment logs zenithd -f
```
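Beyond the logs, the node's RPC can be polled for the latest block height; a steadily growing height means blocks are being produced. The `/status` route and response shape below match what `tests/base_test.py` queries; the helper itself is a sketch:

```python
import json
from urllib.request import urlopen

def parse_latest_height(status_json: dict) -> int:
    """Extract the latest block height from a /status response."""
    return int(status_json["result"]["sync_info"]["latest_block_height"])

def latest_block_height(rpc_endpoint: str = "http://localhost:26657") -> int:
    """Query the node's RPC /status route for the latest block height."""
    with urlopen(f"{rpc_endpoint}/status", timeout=30) as resp:
        return parse_latest_height(json.load(resp))
```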
### Step 4: Run Lockdrop Distribution Notebook

Execute the Jupyter notebook to perform the lockdrop allocation calculations and generate analysis outputs:

1. **Create Virtual Environment and Install Dependencies**

   Navigate to the lockdrop-simulation directory:

   ```bash
   cd <path/to/zenith-stack>/lockdrop-simulation
   ```

   Create and activate a Python virtual environment:

   ```bash
   python3 -m venv venv
   source venv/bin/activate
   ```

   Install the required Python packages:

   ```bash
   pip install -r requirements.txt
   ```

2. **Execute the Notebook**

   Run the notebook to generate the allocation calculations:

   ```bash
   jupyter nbconvert --to notebook --execute --inplace --log-level WARN lockdrop-calculations-simulated.ipynb
   ```

   This will:

   - Process the generated lockdrop participant data
   - Calculate allocation amounts for the different lock periods
   - Generate artifacts (`lockdrop_allocations_notebook.json`) for comparison with the data from the zenithd node
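As consumed by the test suite, `lockdrop_allocations_notebook.json` has top-level `stars`, `galaxies`, and `total` keys, with per-lock-period entries keyed like `2_years`. A small sketch for inspecting the artifact:

```python
import json

def summarize_notebook_allocations(path: str = "lockdrop_allocations_notebook.json") -> dict:
    """Summarize the notebook artifact using the keys the test suite reads."""
    with open(path) as f:
        allocations = json.load(f)
    return {
        "star_lock_periods": sorted(allocations.get("stars", {})),
        "galaxy_lock_periods": sorted(allocations.get("galaxies", {})),
        "total": allocations.get("total", 0),
    }
```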
3. **View Notebook Results (Optional)**

   To view the analysis of the generated data, open the notebook in your browser:

   ```bash
   jupyter notebook lockdrop-calculations-simulated.ipynb
   ```

   The notebook contains useful visualizations, including allocation distributions, lock period analysis, and participant statistics.
### Step 5: Run Simulation Tests

Run comprehensive tests to validate that the zenithd node's TGE allocations and run-time accruals match the notebook results:

1. **Set Environment Variables**

   Configure the API endpoints for the running zenithd node:

   ```bash
   export REST_API_ENDPOINT="http://localhost:1317"
   export RPC_API_ENDPOINT="http://localhost:26657"
   ```

2. **Run All Tests**

   Navigate to the lockdrop-simulation directory (if not already there):

   ```bash
   cd <path/to/zenith-stack>/lockdrop-simulation
   ```

   Activate `venv`:

   ```bash
   source venv/bin/activate
   ```

   Execute the complete test suite:

   ```bash
   python3 tests/run_all_tests.py
   ```

   This will run the tests in the following order:

   - **Allocation Tests**: Compare star, galaxy, and total allocations between the notebook and zenithd
   - **Unlock Schedule Tests**: Validate unlock block calculations (considering each point's locking time) and initial unlock amounts
   - **Accrual State Tests**: Verify accrual state calculations at the current block height
3. **Run Individual Test Modules** (Optional)

   You can also run specific test categories:

   ```bash
   # Test only allocations
   python3 tests/test_allocations.py

   # Test only unlock schedules
   python3 tests/test_unlock_schedule.py

   # Test only accrual states
   python3 tests/test_accrual_state.py
   ```

4. **Test Output**

   The tests provide detailed tabular output showing:

   - Comparisons between the notebook calculations and zenithd responses
   - Any differences or mismatches
   - Comprehensive validation of the lockdrop implementation
## Cleanup

### Validator Deployment Cleanup

Go to the `ansible` directory:

```bash
cd <path/to/zenith-stack>/ansible
```

Stop the validator deployment:

```bash
./zenith-ansible -i ./inventories/development/hosts.yml stage1-site.yml -e "mode=stop" --skip-tags onboarding
```

Clean up the validator deployment:

```bash
./zenith-ansible -i ./inventories/development/hosts.yml stage1-site.yml -e "mode=cleanup" --skip-tags onboarding -K
```

### Python Virtual Environment Cleanup

Go to the `lockdrop-simulation` directory:

```bash
cd <path/to/zenith-stack>/lockdrop-simulation
```

Clean up the Python virtual environment:

```bash
# Deactivate the virtual environment (if currently active)
deactivate

# Remove the virtual environment directory
rm -rf venv
```
distribution-simulate-lockdrop.json (new file, 94 lines)
[
  {
    "category": "lockdrop",
    "total_percent_share": "30",
    "unlock_params": {
      "unlock_frequency": 60,
      "initial_unlock_percent": "0"
    },
    "recipients": []
  },
  {
    "category": "block-rewards",
    "total_percent_share": "30",
    "recipients": []
  },
  {
    "category": "team",
    "total_percent_share": "12.5",
    "vesting_params": {
      "cliff_duration": 0,
      "vesting_duration": 157788000,
      "vesting_frequency": 86400,
      "start_time": "genesis"
    },
    "recipients": [
      {
        "address": "zenith186vlkszzatke9842xg7aal5dmwz4mzgr9ls52q",
        "percent_share": "40"
      },
      {
        "address": "zenith1qahj75sg5ug4w8zzthtxf43z9sp2ngl09wkpze",
        "percent_share": "30"
      },
      {
        "address": "zenith1umahm4h67wj7sq9a03nm40rv83903d4ra2pzeg",
        "percent_share": "30"
      }
    ]
  },
  {
    "category": "zenith-foundation",
    "total_percent_share": "7.5",
    "unlock_params": {
      "unlock_duration": -1,
      "unlock_frequency": 0,
      "start_time": "genesis",
      "initial_unlock_percent": "33.33"
    },
    "recipients": [
      {
        "address": "zenith1f0qvketzh3rszryar5jdpx2qrhuzaxrry925qp",
        "percent_share": "100"
      }
    ]
  },
  {
    "category": "seed-investors",
    "total_percent_share": "15",
    "vesting_params": {
      "cliff_duration": 0,
      "vesting_duration": 157788000,
      "vesting_frequency": 86400,
      "start_time": "genesis"
    },
    "recipients": [
      {
        "address": "zenith1wqkhkwtwxg5taejcxgfvw9ad3psqflu8ae7h60",
        "percent_share": "50"
      },
      {
        "address": "zenith1mc0zv9spcd8qhrwwwlw2852yt39w96an775n3g",
        "percent_share": "25"
      },
      {
        "address": "zenith12q7spfzf2xpmrex6qf842wa52lchzdryqzwt8h",
        "percent_share": "25"
      }
    ]
  },
  {
    "category": "market-makers",
    "total_percent_share": "5",
    "recipients": [
      {
        "address": "zenith1y3s9592xkrykl403yhrd0rsv7cpl07qv9ue8fa",
        "percent_share": "60"
      },
      {
        "address": "zenith1vuhlpx76v60d5ldh0ytlkvr0lhuykqrzwz7m5s",
        "percent_share": "40"
      }
    ]
  }
]
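A quick sanity check on the file above: the six category shares sum to exactly 100% (using `Decimal` to avoid float artifacts with the fractional shares):

```python
from decimal import Decimal

# total_percent_share values from distribution-simulate-lockdrop.json above
shares = ["30", "30", "12.5", "7.5", "15", "5"]
total = sum(Decimal(s) for s in shares)
print(total)  # → 100.0
```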
lockdrop-calculations-simulated.ipynb (new file, 1175 lines)
File diff suppressed because one or more lines are too long

requirements.txt (new file, 8 lines)
pandas
matplotlib
jupyter
seaborn
numpy
urbitob
tabulate
requests
tests/__init__.py (new file, 1 line)

# Test package for lockdrop allocation comparisons

tests/base_test.py (new file, 175 lines)
import json
import os
import unittest
import requests
from collections import defaultdict
from datetime import datetime
import urbitob

SECONDS_PER_YEAR = int(365.25 * 24 * 60 * 60)
BLOCK_DURATION_SECONDS = 2


class BaseAllocationTest(unittest.TestCase):
    """Base test class with shared setup and helper methods"""

    @classmethod
    def setUpClass(cls):
        """Load data once for all tests"""
        cls.rest_api_endpoint = os.getenv('REST_API_ENDPOINT')
        cls.rpc_api_endpoint = os.getenv('RPC_API_ENDPOINT')

        if not cls.rest_api_endpoint:
            raise unittest.SkipTest("REST_API_ENDPOINT environment variable not set")
        if not cls.rpc_api_endpoint:
            raise unittest.SkipTest("RPC_API_ENDPOINT environment variable not set")

        # Load data files
        with open('./generated/watcher-events.json', 'r') as f:
            cls.watcher_events = json.load(f)
        with open('./generated/generated-participants.json', 'r') as f:
            cls.participants = json.load(f)
        with open('lockdrop_allocations_notebook.json', 'r') as f:
            cls.notebook_allocations = json.load(f)

        cls.points_by_duration = cls._get_first_points()

        # Load distribution config for unlock frequency
        with open('distribution-simulate-lockdrop.json', 'r') as f:
            distribution_config = json.load(f)
        for category in distribution_config:
            if category['category'] == 'lockdrop':
                cls.unlock_frequency_blocks = category['unlock_params']['unlock_frequency'] // BLOCK_DURATION_SECONDS
                break

    @classmethod
    def _get_first_points(cls):
        """Extract first star and galaxy for each lock duration"""
        points = defaultdict(lambda: {'star': None, 'galaxy': None})

        for event_data in cls.watcher_events['data']['eventsInRange']:
            if event_data['event']['__typename'] == 'PointLockedEvent':
                point = event_data['event']['point']
                lock_period = event_data['event']['lock_period']
                azimuth_id = event_data['event']['azimuth_id']

                point_num = urbitob.patp_to_num(point)
                point_type = "galaxy" if point_num < 256 else "star"

                if points[lock_period][point_type] is None:
                    # Find zenith address for this point
                    zenith_address = cls._find_zenith_address(azimuth_id)

                    points[lock_period][point_type] = {
                        'point': point,
                        'azimuth_id': azimuth_id,
                        'zenith_address': zenith_address,
                        'lock_period': lock_period,
                        'block_timestamp': event_data['block']['timestamp']
                    }

        return dict(points)

    @classmethod
    def _find_zenith_address(cls, azimuth_id):
        """Find zenith address for given azimuth_id"""
        for p in cls.participants:
            if p['attestation']['payload']['address'] == azimuth_id:
                return p['attestation']['payload']['payload']['address']
        return None

    def _get_point_allocation_amount_from_api(self, zenith_address, point):
        """Query API endpoint for allocation amount"""
        point_num = urbitob.patp_to_num(point)
        url = f"{self.rest_api_endpoint}/laconic/immutabletreasury/v1/allocations/{zenith_address}/{point_num}"

        try:
            response = requests.get(url, timeout=30)
            if response.status_code == 200:
                data = response.json()
                if 'allocations' in data and len(data['allocations']) > 0:
                    return int(data['allocations'][0]['allocated_amount']['amount'])
        except Exception as e:
            self.fail(f"zenithd request failed for {point}: {e}")
        return None

    def _get_total_address_allocation_amount_from_api(self, zenith_address):
        """Get total allocation for an address from API"""
        url = f"{self.rest_api_endpoint}/laconic/immutabletreasury/v1/allocations/{zenith_address}"

        try:
            response = requests.get(url, timeout=30)
            if response.status_code == 200:
                data = response.json()
                if 'allocations' in data:
                    total = sum(int(alloc['allocated_amount']['amount']) for alloc in data['allocations'])
                    return total
        except Exception as e:
            self.fail(f"zenithd request failed for address {zenith_address}: {e}")
        return 0

    def _get_point_allocation_from_api(self, zenith_address, point):
        """Get allocation with unlock schedule from API"""
        point_num = urbitob.patp_to_num(point)
        url = f"{self.rest_api_endpoint}/laconic/immutabletreasury/v1/allocations/{zenith_address}/{point_num}"

        try:
            response = requests.get(url, timeout=30)
            if response.status_code == 200:
                data = response.json()
                if 'allocations' in data and len(data['allocations']) > 0:
                    return data['allocations'][0]
        except Exception as e:
            self.fail(f"zenithd request failed for {point}: {e}")
        return None

    def _get_genesis_time_from_api(self):
        """Get genesis time from node"""
        url = f"{self.rpc_api_endpoint}/block?height=1"

        try:
            response = requests.get(url, timeout=30)
            if response.status_code == 200:
                data = response.json()
                genesis_time_str = data['result']['block']['header']['time']

                # Remove fractional seconds if present
                if '.' in genesis_time_str:
                    genesis_time_str = genesis_time_str.split('.')[0] + 'Z'

                # Parse the time string and return timestamp
                genesis_time = datetime.fromisoformat(genesis_time_str.replace('Z', '+00:00'))
                return int(genesis_time.timestamp())
        except Exception as e:
            self.fail(f"Failed to get genesis time: {e}")
        return None

    def _get_point_accrual_state_from_api(self, zenith_address, point, block_height):
        """Get accrual state for a point at specific block height"""
        point_num = urbitob.patp_to_num(point)
        url = f"{self.rest_api_endpoint}/laconic/immutabletreasury/v1/accrual_state/{zenith_address}/{point_num}"

        headers = {"x-cosmos-block-height": str(block_height)}

        try:
            response = requests.get(url, headers=headers, timeout=30)
            if response.status_code == 200:
                data = response.json()
                if 'accrual_state' in data:
                    return data['accrual_state']
        except Exception as e:
            self.fail(f"zenithd request failed for accrual state {point} at block {block_height}: {e}")
        return None

    def _get_latest_block_height_from_api(self):
        """Get latest block height from node"""
        url = f"{self.rpc_api_endpoint}/status"

        try:
            response = requests.get(url, timeout=30)
            if response.status_code == 200:
                data = response.json()
                return int(data['result']['sync_info']['latest_block_height'])
        except Exception as e:
            self.fail(f"Failed to get latest block height: {e}")
        return None
tests/run_all_tests.py (new executable file, 70 lines)
"""
|
||||
Test runner for all lockdrop allocation comparison tests.
|
||||
|
||||
This script runs all test modules in the correct order:
|
||||
1. Allocation tests (star, galaxy, total)
|
||||
2. Unlock schedule tests
|
||||
3. Accrual state tests
|
||||
|
||||
Usage:
|
||||
python run_all_tests.py
|
||||
|
||||
Environment variables required:
|
||||
REST_API_ENDPOINT - REST API endpoint for zenithd
|
||||
RPC_API_ENDPOINT - RPC API endpoint for zenithd
|
||||
"""
|
||||
|
||||
import sys
|
||||
import unittest
|
||||
import os
|
||||
|
||||
# Add the tests directory to the path so we can import the test modules
|
||||
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
|
||||
|
||||
# Import test classes
|
||||
from test_allocations import AllocationTest
|
||||
from test_unlock_schedule import UnlockScheduleTest
|
||||
from test_accrual_state import AccrualStateTest
|
||||
|
||||
|
||||
def create_test_suite():
|
||||
"""Create a test suite with all tests in the desired order"""
|
||||
suite = unittest.TestSuite()
|
||||
|
||||
# Add allocation tests
|
||||
suite.addTest(AllocationTest('test_0_star_allocations'))
|
||||
suite.addTest(AllocationTest('test_1_galaxy_allocations'))
|
||||
suite.addTest(AllocationTest('test_2_total_allocations'))
|
||||
|
||||
# Add unlock schedule tests
|
||||
suite.addTest(UnlockScheduleTest('test_unlock_schedule_calculation'))
|
||||
|
||||
# Add accrual state tests
|
||||
suite.addTest(AccrualStateTest('test_accrual_state_calculation'))
|
||||
|
||||
return suite
|
||||
|
||||
|
||||
def main():
|
||||
"""Run all tests with detailed output"""
|
||||
print("="*80)
|
||||
print("LOCKDROP ALLOCATION COMPARISON TESTS")
|
||||
print("="*80)
|
||||
|
||||
# Check environment variables
|
||||
if not os.getenv('REST_API_ENDPOINT'):
|
||||
print("ERROR: REST_API_ENDPOINT environment variable not set")
|
||||
sys.exit(1)
|
||||
|
||||
if not os.getenv('RPC_API_ENDPOINT'):
|
||||
print("ERROR: RPC_API_ENDPOINT environment variable not set")
|
||||
sys.exit(1)
|
||||
|
||||
# Create and run test suite
|
||||
suite = create_test_suite()
|
||||
runner = unittest.TextTestRunner(verbosity=2, stream=sys.stdout)
|
||||
runner.run(suite)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
tests/test_accrual_state.py (new file, 99 lines)
import unittest
from tabulate import tabulate
from base_test import BaseAllocationTest


class AccrualStateTest(BaseAllocationTest):
    """Test accrual state calculations"""

    def test_accrual_state_calculation(self):
        """Test accrual state calculations after some blocks"""
        print("\nACCRUAL STATE CALCULATIONS")

        # Get latest block height for testing
        test_block_height = self._get_latest_block_height_from_api()
        if not test_block_height:
            self.skipTest("Could not retrieve latest block height")

        # Test first star and galaxy from each lock period (limit to avoid too many API calls)
        test_points = []
        for lock_period in sorted(self.points_by_duration.keys()):
            duration_data = self.points_by_duration[lock_period]

            if duration_data['star']:
                star_data = duration_data['star']
                zenith_addr = star_data['zenith_address']
                if zenith_addr:
                    test_points.append((star_data['point'], zenith_addr, lock_period))

            if duration_data['galaxy']:
                galaxy_data = duration_data['galaxy']
                zenith_addr = galaxy_data['zenith_address']
                if zenith_addr:
                    test_points.append((galaxy_data['point'], zenith_addr, lock_period))

        # Collect data for table
        accrual_data = []

        for point, zenith_addr, lock_period in test_points:
            with self.subTest(point=point, block_height=test_block_height):
                # Get allocation data first
                allocation_data = self._get_point_allocation_from_api(zenith_addr, point)
                self.assertIsNotNone(allocation_data, f"No allocation data for {point}")

                unlock_schedule = allocation_data.get('unlock_schedule')
                self.assertIsNotNone(unlock_schedule, f"No unlock_schedule for {point}")

                # Get accrual state at test block height
                accrual_state = self._get_point_accrual_state_from_api(zenith_addr, point, test_block_height)
                self.assertIsNotNone(accrual_state, f"No accrual state for {point} at block {test_block_height}")

                # Extract values for calculation
                total_allocation = int(allocation_data['allocated_amount']['amount'])
                initial_unlock_amount = int(unlock_schedule['initial_unlock_amount']['amount'])
                unlock_blocks = int(unlock_schedule['unlock_blocks'])

                api_total_unlocked = int(accrual_state['total_unlocked']['amount'])
                last_unlock_block = int(accrual_state['last_unlock_block'])

                # Calculate expected last unlock block using unlock_frequency
                expected_last_unlock_block = (test_block_height // self.unlock_frequency_blocks) * self.unlock_frequency_blocks

                # Assert on last unlock block
                self.assertEqual(expected_last_unlock_block, last_unlock_block,
                                 f"Last unlock block mismatch for {point} at block {test_block_height}: "
                                 f"Expected={expected_last_unlock_block}, "
                                 f"zenithd={last_unlock_block}, "
                                 f"Diff={last_unlock_block - expected_last_unlock_block}")

                # Calculate expected total_unlocked using expected last unlock block
                # Formula: initial_unlock_amount + (expected_last_unlock_block * remaining_amount / unlock_blocks)
                remaining_amount = total_allocation - initial_unlock_amount
                expected_total_unlocked = initial_unlock_amount + (expected_last_unlock_block * remaining_amount // unlock_blocks)

                difference = api_total_unlocked - expected_total_unlocked

                accrual_data.append([
                    point,
                    f"{lock_period} years",
                    f"Block {test_block_height}",
                    f"Block {last_unlock_block}",
                    f"{expected_total_unlocked:,}",
                    f"{api_total_unlocked:,}",
                    f"{difference:+,}" if difference != 0 else "0"
                ])

                self.assertEqual(expected_total_unlocked, api_total_unlocked,
                                 f"Total unlocked mismatch for {point} at block {test_block_height}: "
                                 f"Expected={expected_total_unlocked:,} $sZ, "
                                 f"zenithd={api_total_unlocked:,} $sZ, "
                                 f"Diff={difference:+,} $sZ")

        # Print table
        print(f"\nTotal Unlocked at Block {test_block_height}:")
        headers = ["Point", "Lock Period", "Block Height", "Last Unlocked At", "Expected ($sZ)", "zenithd ($sZ)", "Difference"]
        print(tabulate(accrual_data, headers=headers, tablefmt="grid"))


if __name__ == "__main__":
    unittest.main(verbosity=2)
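The linear-unlock formula the test above exercises can be checked standalone (same integer arithmetic as in the test, with made-up numbers):

```python
def expected_total_unlocked(total_allocation: int, initial_unlock_amount: int,
                            unlock_blocks: int, last_unlock_block: int) -> int:
    """Initial unlock plus the pro-rata (integer) share of the remaining amount."""
    remaining_amount = total_allocation - initial_unlock_amount
    return initial_unlock_amount + (last_unlock_block * remaining_amount // unlock_blocks)

# Hypothetical numbers: 1,000,000 allocated, nothing unlocked at genesis,
# full unlock over 100 blocks, currently at the 30th unlock block:
print(expected_total_unlocked(1_000_000, 0, 100, 30))  # → 300000
```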
tests/test_allocations.py (new file, 140 lines)
import unittest
|
||||
from tabulate import tabulate
|
||||
from base_test import BaseAllocationTest
|
||||
|
||||
|
||||
class AllocationTest(BaseAllocationTest):
|
||||
"""Test allocation comparisons between notebook and zenithd"""
|
||||
|
||||
def test_0_star_allocations(self):
|
||||
"""Test star allocations for all lock periods"""
|
||||
print("\nSTAR ALLOCATIONS COMPARISON")
|
||||
|
||||
table_data = []
|
||||
headers = ["Lock Period", "Point", "Zenith Address", "Notebook ($sZ)", "zenithd ($sZ)", "Difference"]
|
||||
|
||||
for lock_period in sorted(self.points_by_duration.keys()):
|
||||
with self.subTest(lock_period=lock_period):
|
||||
duration_data = self.points_by_duration[lock_period]
|
||||
|
||||
star_data = duration_data['star']
|
||||
if star_data == None:
|
||||
continue
|
||||
|
||||
zenith_addr = star_data['zenith_address']
|
||||
self.assertIsNotNone(zenith_addr,
|
||||
f"Could not find zenith address for star {star_data['point']}")
|
||||
|
||||
allocation_data = self._get_point_allocation_from_api(zenith_addr, star_data['point'])
|
||||
self.assertIsNotNone(allocation_data,
|
||||
f"zenithd returned no allocation for star {star_data['point']}")
|
||||
|
||||
api_allocation = int(allocation_data['allocated_amount']['amount'])
|
||||
|
||||
notebook_key = f'{lock_period}_years'
|
||||
notebook_allocation = self.notebook_allocations['stars'].get(notebook_key, 0)
|
||||
|
||||
difference = api_allocation - notebook_allocation
|
||||
|
||||
table_data.append([
|
||||
f"{lock_period} years",
|
||||
star_data['point'],
|
||||
zenith_addr,
|
||||
f"{notebook_allocation:,}",
|
||||
f"{api_allocation:,}",
|
||||
f"{difference:+,}" if difference != 0 else "0"
|
||||
])
|
||||
|
||||
self.assertEqual(notebook_allocation, api_allocation,
|
||||
f"Star {star_data['point']} ({lock_period}Y): "
|
||||
f"Notebook={notebook_allocation:,} $sZ, "
|
||||
f"zenithd={api_allocation:,} $sZ, "
|
||||
f"Diff={difference:+,} $sZ")
|
||||
|
||||
print(tabulate(table_data, headers=headers, tablefmt="grid"))
|
||||
|
||||
    def test_1_galaxy_allocations(self):
        """Test galaxy allocations for all lock periods"""
        print("\nGALAXY ALLOCATIONS COMPARISON")

        table_data = []
        headers = ["Lock Period", "Point", "Zenith Address", "Notebook ($sZ)", "zenithd ($sZ)", "Difference"]

        for lock_period in sorted(self.points_by_duration.keys()):
            with self.subTest(lock_period=lock_period):
                duration_data = self.points_by_duration[lock_period]

                galaxy_data = duration_data['galaxy']
                if galaxy_data is None:
                    continue

                zenith_addr = galaxy_data['zenith_address']
                self.assertIsNotNone(zenith_addr,
                                     f"Could not find zenith address for galaxy {galaxy_data['point']}")

                allocation_data = self._get_point_allocation_from_api(zenith_addr, galaxy_data['point'])
                self.assertIsNotNone(allocation_data,
                                     f"zenithd returned no allocation for galaxy {galaxy_data['point']}")

                api_allocation = int(allocation_data['allocated_amount']['amount'])

                notebook_key = f'{lock_period}_years'
                notebook_allocation = self.notebook_allocations['galaxies'].get(notebook_key, 0)

                difference = api_allocation - notebook_allocation

                table_data.append([
                    f"{lock_period} years",
                    galaxy_data['point'],
                    zenith_addr,
                    f"{notebook_allocation:,}",
                    f"{api_allocation:,}",
                    f"{difference:+,}" if difference != 0 else "0"
                ])

                self.assertEqual(notebook_allocation, api_allocation,
                                 f"Galaxy {galaxy_data['point']} ({lock_period}Y): "
                                 f"Notebook={notebook_allocation:,} $sZ, "
                                 f"zenithd={api_allocation:,} $sZ, "
                                 f"Diff={difference:+,} $sZ")

        print(tabulate(table_data, headers=headers, tablefmt="grid"))

    def test_2_total_allocations(self):
        """Test total allocations for all participants"""
        print("\nTOTAL ALLOCATIONS COMPARISON")

        notebook_total = self.notebook_allocations.get('total', 0)

        if notebook_total == 0:
            self.skipTest("No total_allocation found in notebook data")

        api_total = 0

        for i, participant in enumerate(self.participants):
            zenith_address = participant['attestation']['payload']['payload']['address']

            with self.subTest(participant=i, address=zenith_address):
                participant_total = self._get_total_address_allocation_amount_from_api(zenith_address)
                api_total += participant_total

        # Summary table
        difference = api_total - notebook_total
        table_data = [
            ["Notebook", f"{notebook_total:,}"],
            ["zenithd", f"{api_total:,}"],
            ["Difference", f"{difference:+,}" if difference != 0 else "0"]
        ]
        headers = ["Source", "Total Allocation ($sZ)"]

        print(tabulate(table_data, headers=headers, tablefmt="grid"))

        self.assertEqual(notebook_total, api_total,
                         f"Total allocation mismatch: "
                         f"Notebook={notebook_total:,} $sZ, "
                         f"zenithd={api_total:,} $sZ, "
                         f"Diff={difference:+,} $sZ")


if __name__ == "__main__":
    unittest.main(verbosity=2)
111
tests/test_unlock_schedule.py
Normal file
@ -0,0 +1,111 @@
import unittest
from datetime import datetime

from tabulate import tabulate

from base_test import BaseAllocationTest, SECONDS_PER_YEAR, BLOCK_DURATION_SECONDS


class UnlockScheduleTest(BaseAllocationTest):
    """Test unlock schedule calculations"""

    def test_unlock_schedule_calculation(self):
        """Test unlock schedule calculations for all lock periods"""
        print("\nUNLOCK SCHEDULE CALCULATIONS")

        genesis_timestamp = self._get_genesis_time_from_api()
        if not genesis_timestamp:
            self.skipTest("Could not retrieve genesis time")

        # Test first star and galaxy from each lock period (limit to avoid too many API calls)
        test_points = []
        for lock_period in sorted(self.points_by_duration.keys()):
            duration_data = self.points_by_duration[lock_period]

            if duration_data['star']:
                star_data = duration_data['star']
                zenith_addr = star_data['zenith_address']
                if zenith_addr:
                    test_points.append((star_data['point'], zenith_addr, lock_period, star_data['block_timestamp']))

            if duration_data['galaxy']:
                galaxy_data = duration_data['galaxy']
                zenith_addr = galaxy_data['zenith_address']
                if zenith_addr:
                    test_points.append((galaxy_data['point'], zenith_addr, lock_period, galaxy_data['block_timestamp']))

        # Collect data for tables
        unlock_blocks_data = []
        initial_unlock_data = []

        for point, zenith_addr, lock_period, start_timestamp in test_points:
            with self.subTest(point=point):
                # Get allocation with unlock schedule from zenithd
                allocation_data = self._get_point_allocation_from_api(zenith_addr, point)
                self.assertIsNotNone(allocation_data, f"No allocation data for {point}")

                unlock_schedule = allocation_data.get('unlock_schedule')
                self.assertIsNotNone(unlock_schedule, f"No unlock_schedule for {point}")

                # Use the start timestamp from watcher events to
                # calculate the time difference and pregenesis blocks
                time_diff_seconds = genesis_timestamp - start_timestamp
                pregenesis_blocks = time_diff_seconds // BLOCK_DURATION_SECONDS

                # Calculate total lock duration in blocks (years * seconds_per_year / block_time)
                total_lock_blocks = lock_period * SECONDS_PER_YEAR // BLOCK_DURATION_SECONDS

                # Calculate expected remaining blocks
                expected_unlock_blocks = total_lock_blocks - pregenesis_blocks
                api_unlock_blocks = int(unlock_schedule['unlock_blocks'])
                blocks_diff = api_unlock_blocks - expected_unlock_blocks

                # Convert start timestamp to ISO format
                start_time_iso = datetime.fromtimestamp(start_timestamp).isoformat()

                unlock_blocks_data.append([
                    point,
                    f"{lock_period} years",
                    start_time_iso,
                    f"{expected_unlock_blocks:,}",
                    f"{api_unlock_blocks:,}",
                    f"{blocks_diff:+,}" if blocks_diff != 0 else "0"
                ])

                self.assertEqual(expected_unlock_blocks, api_unlock_blocks,
                                 f"Unlock blocks mismatch for {point}: "
                                 f"Expected={expected_unlock_blocks}, "
                                 f"zenithd={api_unlock_blocks}, "
                                 f"Diff={blocks_diff}")

                # Calculate expected initial unlock amount
                total_allocation = int(allocation_data['allocated_amount']['amount'])
                expected_initial_unlock = (pregenesis_blocks * total_allocation) // total_lock_blocks
                api_initial_unlock = int(unlock_schedule['initial_unlock_amount']['amount'])
                unlock_diff = api_initial_unlock - expected_initial_unlock

                initial_unlock_data.append([
                    point,
                    f"{lock_period} years",
                    start_time_iso,
                    f"{expected_initial_unlock:,}",
                    f"{api_initial_unlock:,}",
                    f"{unlock_diff:+,}" if unlock_diff != 0 else "0"
                ])

                self.assertEqual(expected_initial_unlock, api_initial_unlock,
                                 f"Initial unlock amount mismatch for {point}: "
                                 f"Expected={expected_initial_unlock:,} $sZ, "
                                 f"zenithd={api_initial_unlock:,} $sZ, "
                                 f"Diff={unlock_diff:+,} $sZ")

        # Print tables
        print("\nUnlock Blocks Comparison:")
        unlock_blocks_headers = ["Point", "Lock Period", "Start Time", "Expected Blocks", "zenithd Blocks", "Difference"]
        print(tabulate(unlock_blocks_data, headers=unlock_blocks_headers, tablefmt="grid"))

        print("\nInitial Unlock Amounts Comparison:")
        initial_unlock_headers = ["Point", "Lock Period", "Start Time", "Expected ($sZ)", "zenithd ($sZ)", "Difference"]
        print(tabulate(initial_unlock_data, headers=initial_unlock_headers, tablefmt="grid"))


if __name__ == "__main__":
    unittest.main(verbosity=2)
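The integer arithmetic exercised by `test_unlock_schedule_calculation` can be checked by hand. A minimal standalone sketch follows; the year length, block time, and sample inputs below are illustrative assumptions, not values taken from `base_test.py`:

```python
# Standalone sketch of the unlock-schedule math used in the test above.
# SECONDS_PER_YEAR, BLOCK_DURATION_SECONDS, and all sample inputs are
# assumed values for illustration only.
SECONDS_PER_YEAR = 365 * 24 * 3600   # assumed: 31,536,000 s
BLOCK_DURATION_SECONDS = 6           # assumed block time

def unlock_schedule(genesis_ts, start_ts, lock_years, total_allocation):
    """Return (remaining unlock blocks, initial unlock amount) with floor division."""
    # Blocks that elapsed between the lock event and chain genesis
    pregenesis_blocks = (genesis_ts - start_ts) // BLOCK_DURATION_SECONDS
    # Total lock duration expressed in blocks
    total_lock_blocks = lock_years * SECONDS_PER_YEAR // BLOCK_DURATION_SECONDS
    # Blocks still locked at genesis
    unlock_blocks = total_lock_blocks - pregenesis_blocks
    # Tokens already unlocked at genesis, pro-rated by elapsed blocks
    initial_unlock = (pregenesis_blocks * total_allocation) // total_lock_blocks
    return unlock_blocks, initial_unlock

# Example: lock started 30 days before genesis, 2-year lock, 1,000,000 tokens
blocks, initial = unlock_schedule(1_700_000_000, 1_700_000_000 - 30 * 86400, 2, 1_000_000)
```

Because both quantities use floor division, a node that rounds differently (or measures elapsed time from a different event) will show a nonzero "Difference" column even when the allocation itself matches.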