Compare commits

...

45 Commits

Author SHA1 Message Date
18ca4e1a08
Handle user defined types and unnamed function arguments when parsing contract in codegen (#545)
* Handle user defined types in visitor methods when continuing on error

* Handle unnamed arguments in solidity methods

* Update graph-cli package version

---------

Co-authored-by: Shreerang Kale <shreerangkale@gmail.com>
2025-08-13 12:41:16 +05:30
b572eb9032
Update Lotus EVM null block error message (#543)
* Update null block error message

* Update package versions
2025-03-19 20:13:05 +05:30
e0e2efd571
Set block totalDifficulty to zero if not available in JSON-RPC response (#540)
* Set total difficulty to zero if not available

* Upgrade package version
2025-03-10 19:27:25 +05:30
prathamesh0
944db0eaf7
Decrement package version (#539) 2024-10-14 12:25:49 +05:30
prathamesh0
ac74da6ea6
Add a CLI to backfill watcher event data (#538)
* Add a CLI to backfill watcher event data

* Add required codegen template

* Increment package version
2024-10-14 12:07:56 +05:30
5d7b7fe5b4
Handle template create in subgraph watcher during reorgs (#533)
* Handle template create events processing order during reorgs

* Add removeWatcher method in graph-node dummy indexer for test

* Apply GQL and RPC server middlewares ordered on requested paths

* Increment package version

* Remove console logs

---------

Co-authored-by: Prathamesh Musale <prathamesh.musale0@gmail.com>
2024-10-11 14:50:52 +05:30
prathamesh0
a585500012
Support topics filtering in getLogs ETH RPC API (#537)
* Store event topics in separate columns in db

* Store event data in a separate column in db

* Support topics filter in eth_getLogs RPC API

* Make RPC server path configurable

* Sort logs result by log index
2024-09-18 15:37:13 +05:30
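The topics filter added in #537 follows the standard eth_getLogs JSON-RPC semantics: the filter is a positional array matched against a log's topics, where `null` matches anything at that position and a nested array means "any of these". A minimal sketch of that matching rule (illustrative only, not the watcher's actual implementation):

```typescript
// Positional topic matching as specified for eth_getLogs.
type TopicFilter = Array<string | string[] | null>;

function logMatchesTopics (logTopics: string[], filter: TopicFilter): boolean {
  return filter.every((entry, i) => {
    // null is a wildcard for this topic position.
    if (entry === null) return true;
    // A single topic is shorthand for a one-element OR list.
    const alternatives = Array.isArray(entry) ? entry : [entry];
    return alternatives.some(topic => topic.toLowerCase() === (logTopics[i] ?? '').toLowerCase());
  });
}
```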
prathamesh0
d413d724c7
Add ETH RPC API to get logs (#536)
* Add eth_getLogs API handler

* Transform events into logs

* Update codegen templates

* Allow GET requests

* Increment package versions

* Remove unnecessary todo

* Add limit on getLogs results size

* Fix config template
2024-09-16 19:05:45 +05:30
prathamesh0
b46d8816b5
Add ETH RPC API to watcher server (#535)
* Add ETH RPC API to watcher server

* Add eth_call API handler

* Add error handling to eth_call handler

* Parse block tag in eth_call handler

* Add a flag to enable ETH RPC server

* Fix lint errors

* Update block tag parsing
2024-09-13 12:44:00 +05:30
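The "Parse block tag in eth_call handler" step above refers to the standard JSON-RPC block parameter, which may be a named tag or a hex-encoded block number. A sketch of that parsing (assumed behaviour based on the JSON-RPC convention, not the watcher's exact code):

```typescript
// Parse an eth_call style block tag: named tags pass through,
// hex strings like '0x10' become numbers.
function parseBlockTag (blockTag: string): number | 'latest' | 'earliest' {
  if (blockTag === 'latest' || blockTag === 'earliest') {
    return blockTag;
  }

  const blockNumber = Number.parseInt(blockTag, 16);
  if (Number.isNaN(blockNumber)) {
    throw new Error(`Invalid block tag: ${blockTag}`);
  }

  return blockNumber;
}
```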
prathamesh0
ea5ff93e21
Add a script to analyze given eth getLogs requests (#534)
* Add a script to perform configured eth getLogs calls

* Add get logs requests with blockhash near head

* Add get logs requests with blockrange near head

* Add get logs requests with older blockrange

* Update get logs requests

* Export curl requests for given params to a file

* Support making requests in parallel

* Refactor duplicate code and rename files
2024-08-29 17:16:52 +05:30
prathamesh0
d0f88756c3
Handle object and list filters on nested GQL selections and update codegen (#532)
* Handle object type for where clause on nested GQL selections

* Handle list type for where clause on nested GQL selections

* Generate GQL schema types with arguments on plural fields
2024-08-05 16:32:40 +05:30
prathamesh0
05fdf85af8
Handle where clause on nested GQL query selections (#531)
* Handle where clause on nested GQL query selections

* Handle variables for arguments on nested selections

* Handle args on nested GQL query selections for plural queries

* Update package versions
2024-08-02 13:42:47 +05:30
prathamesh0
d53fbcf731
Add support for args on nested GQL query selections in subgraph watchers (#530)
* Add support for arguments on nested GQL query selections

* Update package versions
2024-08-01 19:01:54 +05:30
37885c64eb
Watch contract for multiple data sources from subgraph config (#529)
* Watch contract for multiple data sources from subgraph config

* Upgrade package versions
2024-07-19 17:58:20 +05:30
42cb688921
Export metric for total ETH RPC count (#528)
* Export metric for total ETH RPC count by methods

* Fix endpoint switch on max retries of new block

* Upgrade package versions
2024-07-12 16:59:23 +05:30
2217cd3ffb
Support events handlers in multiple data sources for a contract address (#526)
* Support processing events in multiple subgraph datasources for a single contract address

* Fix parsing event topic in graph-node watcher

* Update codegen templates

* Fix dummy indexer method in graph-node test

* Upgrade package versions to 0.2.102
2024-06-26 17:56:37 +05:30
b9a899aec1
Implement switching endpoints after slow eth_getLogs RPC requests (#525)
* Switch upstream endpoint if getLogs requests are too slow

* Refactor methods for switching client to indexer

* Update codegen indexer template

* Add dummy methods in graph-node test Indexer

* Upgrade package versions to 0.2.101

---------

Co-authored-by: Prathamesh Musale <prathamesh.musale0@gmail.com>
2024-06-20 17:57:01 +05:30
ff471da287
If subgraph block handlers exist force realtime processing in watcher (#524)
* Avoid historical sync for block handlers and eth_getLogs without event handlers

* Stop processing after endBlock in watcher fill CLI

* Codegen changes for avoiding eth_getLogs when no subgraph event handlers exist

* Upgrade package version to 0.2.100

* Check for blockHandlerExists first before block processing

* Fix comments for historical block processing checks
2024-06-20 10:05:53 +05:30
981f70ec9b
Replace null values when converting bytes to string in graph-node host API (#523)
* Replace null value in strings for postgres text data type

* Upgrade package versions to 0.2.99
2024-06-14 13:47:54 +05:30
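The fix in #523 exists because Postgres `text` columns reject the NUL character (`\u0000`), so strings decoded from raw bytes must have it replaced before persisting. A minimal sketch of that sanitization (the choice of replacement character here is an assumption):

```typescript
// Strip NUL characters that Postgres text columns cannot store,
// substituting the Unicode replacement character.
function sanitizeForPostgres (value: string): string {
  return value.replace(/\u0000/g, '\ufffd');
}
```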
prathamesh0
9c4b06652e
Export codegen config in a generated watcher (#522)
* Export codegen config in a generated watcher

* Update instructions with build step
2024-06-13 16:41:49 +05:30
467c173a0d
Use blockHash filter in eth_getLogs query for FEVM (#521)
* Fetch logs by blockHash at head for FEVM

* Upgrade watcher package versions to 0.2.98
2024-06-11 19:07:55 +05:30
7884941e75
Update codegen to add subgraph source to watcher readme (#514)
* Update codegen to add subgraph source to watcher readme

* Update exported metrics for watcher config

* Add steps to update package json and config to codegen instructions

---------

Co-authored-by: Prathamesh Musale <prathamesh.musale0@gmail.com>
2024-06-11 14:42:54 +05:30
prathamesh0
acf69dd554
Add a config option for block processing offset (#520)
* Add config option for block processing offset

* Upgrade package versions
2024-06-10 15:09:33 +05:30
52082fc874
Process block with MAX_REORG_DEPTH delay from head (#519)
* Process block behind latest head by reorg height

* Upgrade package versions
2024-06-07 21:12:03 +05:30
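The rule in #519 — process blocks only once they are `MAX_REORG_DEPTH` behind the chain head, so processed blocks are unlikely to be reorged out — can be sketched as a pure function (the depth value here is illustrative):

```typescript
// Only blocks at least `reorgDepth` behind the head are safe to process.
const MAX_REORG_DEPTH = 16;

function latestProcessableBlock (headBlockNumber: number, reorgDepth = MAX_REORG_DEPTH): number {
  return Math.max(0, headBlockNumber - reorgDepth);
}
```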
4d81c14597
Fix order of events processing for FEVM (#518)
* Fix events processing order for FEVM

* Upgrade watcher package version

* Fix getResultEvent for eventsInRange GQL query

* Fix eventFields parsing
2024-06-07 17:23:55 +05:30
8d052add2d
Log GQL requests in watcher (#517)
* Update config for server GQL

* Add winston logger for GQL requests

* Fix codegen templates for resolver and package

* Update package versions

---------

Co-authored-by: Prathamesh Musale <prathamesh.musale0@gmail.com>
2024-06-06 16:54:49 +05:30
836fe45aa5
Add metrics for GQL query duration (#516)
* Record GQL query durations by operation name

* Use try finally for timer metric

* Export watcher repo URL in metrics

* Remove unnecessary prefix from repo link

* Update repository label name

---------

Co-authored-by: Prathamesh Musale <prathamesh.musale0@gmail.com>
2024-06-06 13:57:33 +05:30
7cc61579d8
Use ethers.js replace util for errors in bytesToString host API (#515)
* Use replacement strategy for ethers.utils.toUtf8String in graph-node host API

* Upgrade package versions
2024-06-04 11:41:54 +05:30
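ethers.js exposes a replacement strategy (`Utf8ErrorFuncs.replace`) so that `toUtf8String` decodes invalid byte sequences without throwing. The same behaviour can be illustrated with the standard `TextDecoder`, which substitutes U+FFFD for invalid sequences by default (a stand-in for the ethers call, not the watcher's actual code):

```typescript
// Lossy UTF-8 decode: invalid sequences become U+FFFD instead of throwing.
function bytesToStringLossy (bytes: Uint8Array): string {
  return new TextDecoder('utf-8').decode(bytes);
}
```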
3b67fa1827
Use module read-pkg for reading package.json (#513)
* Use module read-pkg for reading package.json

* Upgrade package versions
2024-05-30 15:42:57 +05:30
f00f6d2998
Update chain head exporter CLI to support running multiple instances (#512)
* Update chain-head-exporter CLI for running multiple instances

* Update upstream RPC for metrics with active endpoint

* Export metrics for watcher package version and commitHash

* Upgrade package versions
2024-05-30 14:37:35 +05:30
4fd6bf4ad7
[WIP] Handle GQL query fragment for single entity query (#511)
* Handle GQL query fragment for single entity query

* Upgrade package versions
2024-05-17 18:47:15 +05:30
b57aa76d9f
Support fragments in GQL queries for subgraph watchers (#510)
* Avoid updating latest block metrics on RPC errors

* Handle fragments in subgraph GQL queries

* Upgrade package versions

* Move private method in util graph database

---------

Co-authored-by: Prathamesh Musale <prathamesh.musale0@gmail.com>
2024-05-17 17:20:29 +05:30
prathamesh0
1ca74548ff
Fetch job queue counts for metrics on scraping (#509) 2024-05-16 16:34:07 +05:30
20fa6ceaa6
Remove check for isFEVM flag when filtering event logs by topics (#508)
* Remove check for FEVM flag when filtering logs by topics

* Update package versions
2024-05-16 11:24:41 +05:30
prathamesh0
c7e6baa263
Add metrics to monitor errors and duration for ETH RPC requests (#507)
* Add metrics to monitor errors and duration for ETH RPC requests

* Check for server error

* Add a metric with configured upstream ETH RPC endpoints

* Use Gauge for RPC requests duration metric

* Filter out unknown events while loading event count on start

* Update package versions

* Rethrow errors in overridden send provider method
2024-05-15 19:11:22 +05:30
c9696c3d9f
Implement failover for RPC endpoints in watcher (#506)
* Handle RPC endpoint server errors and switch failover endpoints

* Add config maxNewBlockRetries for switching to failover endpoint

* Upgrade package versions

* Move unknown events removal after event processing for historical sync

* Rename doFailOverEndpoints to switchClients
2024-05-09 16:03:06 +05:30
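The failover idea in #506 is to keep a list of upstream RPC endpoints and rotate to the next one when the current endpoint errors or stalls past the retry limit. A sketch of that rotation (names are illustrative, not the watcher's actual API, beyond the `switchClients` rename mentioned above):

```typescript
// Round-robin rotation over configured upstream endpoints.
class EndpointSwitcher {
  private index = 0;

  constructor (private endpoints: string[]) {
    if (!endpoints.length) {
      throw new Error('No endpoints configured');
    }
  }

  get current (): string {
    return this.endpoints[this.index];
  }

  // Called after repeated failures on the current endpoint; wraps around.
  switchClients (): string {
    this.index = (this.index + 1) % this.endpoints.length;
    return this.current;
  }
}
```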
6d837dc824
Fix address filter param in eth_getLogs request (#505)
* Fix eth_getLogs address filter param

* Upgrade package versions
2024-05-03 14:24:05 +05:30
edaec5a028
Fetch logs filtered by block number instead of block hash for FEVM (#504)
* Fetch logs by block number for FEVM and use node-fetch for eth_getLogs

* Upgrade package versions
2024-05-02 16:04:31 +05:30
prathamesh0
67425690e9
[WIP] Skip contract events if no match found in ABI (#503)
* Skip contract events if no match found in ABI

* Update package versions
2024-03-19 17:52:00 +05:30
prathamesh0
59edc178c9
Fix for generating nested filters and loading subgraph entities from db (#502)
* Handle ThunkComposer field types when generating nested filters

* Fix type setting for relational fields when loading a subgraph entity from db

* Fix loading nullable relational fields in GQL queries

* Update package versions
2024-03-14 11:21:52 +05:30
prathamesh0
ff44fd8717
Fix event processing and entities query for filecoin watchers (#501)
* Filter out pruned subgraph entities while fetching from db

* Sort events by tx index first in case upstream is FEVM

* Upgrade package versions

---------

Co-authored-by: Nabarun <nabarun@deepstacksoft.com>
2024-03-06 11:10:41 +05:30
1cf1a3baa2
Upgrade package versions to 0.2.80 (#500) 2023-12-28 15:17:06 +05:30
33e4455f92
Fix pruning of canonical block if null block is encountered (#499) 2023-12-28 15:07:12 +05:30
a6deed9c27
Fix block processing during chain reorg (#498)
* Fix block processing during chain reorg

* Add new method in test dummy indexer

* Add missing semicolon
2023-12-28 15:06:47 +05:30
prathamesh0
78e43bc088
Export metrics for watcher config and upstream and external chain heads (#497)
* Add a cli to export chain head block numbers

* Use ETH RPC endpoint and allow env overrides

* Use ethers provider

* Export upstream chain head block number in watcher metrics

* Remove unnecessary exports

* Upgrade package versions

* Fix defaults usage

* Export watcher config in metrics

* Add metric for watcher sync mode

* Remove cache flag from watcher config metrics

* Update watcher config field names
2023-12-19 15:23:25 +05:30
68 changed files with 3439 additions and 783 deletions

View File

@@ -2,7 +2,7 @@
"packages": [
"packages/*"
],
"version": "0.2.78",
"version": "0.2.110",
"npmClient": "yarn",
"useWorkspaces": true,
"command": {

View File

@@ -1,6 +1,6 @@
{
"name": "@cerc-io/cache",
"version": "0.2.78",
"version": "0.2.110",
"description": "Generic object cache",
"main": "dist/index.js",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@cerc-io/cli",
"version": "0.2.78",
"version": "0.2.110",
"main": "dist/index.js",
"license": "AGPL-3.0",
"scripts": {
@@ -10,17 +10,18 @@
"copy-assets": "copyfiles -u 1 src/**/*.gql dist/",
"chat": "DEBUG='vulcanize:*, laconic:*' node dist/chat.js",
"compare-gql": "DEBUG='vulcanize:*' node dist/compare-gql.js",
"proxy": "DEBUG='laconic:*' node dist/proxy.js"
"proxy": "DEBUG='laconic:*' node dist/proxy.js",
"export-metrics:chain-heads": "DEBUG='laconic:*' node dist/chain-head-exporter.js"
},
"dependencies": {
"@apollo/client": "^3.7.1",
"@cerc-io/cache": "^0.2.78",
"@cerc-io/ipld-eth-client": "^0.2.78",
"@cerc-io/cache": "^0.2.110",
"@cerc-io/ipld-eth-client": "^0.2.110",
"@cerc-io/libp2p": "^0.42.2-laconic-0.1.4",
"@cerc-io/nitro-node": "^0.1.15",
"@cerc-io/peer": "^0.2.78",
"@cerc-io/rpc-eth-client": "^0.2.78",
"@cerc-io/util": "^0.2.78",
"@cerc-io/peer": "^0.2.110",
"@cerc-io/rpc-eth-client": "^0.2.110",
"@cerc-io/util": "^0.2.110",
"@ethersproject/providers": "^5.4.4",
"@graphql-tools/utils": "^9.1.1",
"@ipld/dag-cbor": "^8.0.0",

View File

@@ -0,0 +1,127 @@
//
// Copyright 2024 Vulcanize, Inc.
//
import yargs from 'yargs';
import { hideBin } from 'yargs/helpers';
import assert from 'assert';
import { ConnectionOptions, Repository } from 'typeorm';
import debug from 'debug';
import { DEFAULT_CONFIG_PATH, JSONbigNative, DatabaseInterface, Config, EventInterface } from '@cerc-io/util';
import { BaseCmd } from './base';
const log = debug('vulcanize:backfill-events-data');
interface Arguments {
configFile: string;
batchSize: number;
}
export class BackfillEventsDataCmd {
_argv?: Arguments;
_baseCmd: BaseCmd;
constructor () {
this._baseCmd = new BaseCmd();
}
get config (): Config {
return this._baseCmd.config;
}
get database (): DatabaseInterface {
return this._baseCmd.database;
}
async initConfig<ConfigType> (): Promise<ConfigType> {
this._argv = this._getArgv();
assert(this._argv);
return this._baseCmd.initConfig(this._argv.configFile);
}
async init (
Database: new (
config: ConnectionOptions
) => DatabaseInterface
): Promise<void> {
await this.initConfig();
this._baseCmd._database = new Database(this.config.database);
await this.database.init();
}
async exec (eventEntity: new () => EventInterface): Promise<void> {
assert(this._argv);
const eventRepository: Repository<EventInterface> = this.database._conn.getRepository(eventEntity);
// Get the total count of events
const totalEvents = await eventRepository.count();
const batchSize = Number(this._argv.batchSize);
let page = 0;
let processedCount = 0;
let eventsWithNullData: EventInterface[];
while (processedCount < totalEvents) {
// Fetch events in batches with pagination
eventsWithNullData = await eventRepository.find({
order: { id: 'ASC' },
skip: page * batchSize,
take: batchSize
});
for (const event of eventsWithNullData) {
// Parse extra info and check if data field is present
const parsedExtraInfo = JSON.parse(event.extraInfo);
// Derive data and topics
if (parsedExtraInfo.data) {
event.data = parsedExtraInfo.data;
[event.topic0, event.topic1, event.topic2, event.topic3] = parsedExtraInfo.topics;
// Update extraInfo
delete parsedExtraInfo.data;
delete parsedExtraInfo.topics;
event.extraInfo = JSONbigNative.stringify(parsedExtraInfo);
}
}
// Save updated events
await eventRepository.save(eventsWithNullData);
// Update the processed count and progress
processedCount += eventsWithNullData.length;
const progress = ((processedCount / totalEvents) * 100).toFixed(2);
log(`Processed ${processedCount}/${totalEvents} events (${progress}% complete)`);
// Move to the next batch
eventsWithNullData = [];
page++;
}
log('Done.');
await this.database.close();
}
_getArgv (): any {
return yargs(hideBin(process.argv))
.option('configFile', {
alias: 'f',
describe: 'configuration file path (toml)',
type: 'string',
default: DEFAULT_CONFIG_PATH
})
.option('b', {
alias: 'batch-size',
describe: 'batch size to process events in',
type: 'number',
default: 1000
})
.argv;
}
}
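The per-event transformation in the backfill loop above — moving `data` and `topics` out of the serialized `extraInfo` JSON into dedicated columns — can be factored as a pure function for clarity (a sketch; the `EventColumns` shape is illustrative, not the watcher's actual entity type):

```typescript
// Derive the dedicated event columns from serialized extraInfo JSON,
// removing the migrated fields from extraInfo itself.
interface EventColumns {
  data?: string;
  topic0?: string;
  topic1?: string;
  topic2?: string;
  topic3?: string;
  extraInfo: string;
}

function backfillEventColumns (extraInfo: string): EventColumns {
  const parsed = JSON.parse(extraInfo);

  // Nothing to migrate if data was never stored in extraInfo.
  if (!parsed.data) {
    return { extraInfo };
  }

  const [topic0, topic1, topic2, topic3] = parsed.topics ?? [];
  const result: EventColumns = { data: parsed.data, topic0, topic1, topic2, topic3, extraInfo };

  delete parsed.data;
  delete parsed.topics;
  result.extraInfo = JSON.stringify(parsed);

  return result;
}
```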

View File

@@ -96,7 +96,7 @@ export class BaseCmd {
this._jobQueue = new JobQueue({ dbConnectionString, maxCompletionLag: maxCompletionLagInSecs });
await this._jobQueue.start();
- const { ethClient, ethProvider } = await initClients(this._config);
+ const { ethClient, ethProvider } = await initClients(this._config.upstream);
this._ethProvider = ethProvider;
this._clients = { ethClient, ...clients };
}

View File

@@ -0,0 +1,72 @@
//
// Copyright 2023 Vulcanize, Inc.
//
import express from 'express';
import * as promClient from 'prom-client';
import debug from 'debug';
import { ethers } from 'ethers';
import JsonRpcProvider = ethers.providers.JsonRpcProvider;
const log = debug('laconic:chain-head-exporter');
// Env overrides:
// ETH_RPC_ENDPOINT - Ethereum RPC API endpoint
// ETH_RPC_API_KEY - Ethereum RPC API endpoint key
// PORT - Metrics server listening port
// Defaults
const DEFAULT_ETH_RPC_ENDPOINT = 'https://mainnet.infura.io/v3';
const DEFAULT_PORT = 5000;
async function main (): Promise<void> {
const app = express();
const metricsRegister = new promClient.Registry();
const ethRpcBaseUrl = process.env.ETH_RPC_ENDPOINT || DEFAULT_ETH_RPC_ENDPOINT;
const ethRpcApiKey = process.env.ETH_RPC_API_KEY;
if (!ethRpcApiKey) {
log('WARNING: ETH_RPC_API_KEY not set');
}
const ethUrlSuffix = ethRpcApiKey ? `/${ethRpcApiKey}` : '';
const ethRpcUrl = `${ethRpcBaseUrl}${ethUrlSuffix}`;
let ethProvider: JsonRpcProvider;
try {
ethProvider = new JsonRpcProvider(ethRpcUrl);
} catch (err) {
log(`Error creating ETH RPC provider from URL ${ethRpcBaseUrl}`, err);
}
// eslint-disable-next-line no-new
new promClient.Gauge({
name: 'latest_block_number',
help: 'Latest block number / height from various block chains',
registers: [metricsRegister],
labelNames: ['chain'] as const,
async collect () {
try {
const latestEthBlockNumber = await ethProvider.getBlockNumber();
this.set(latestEthBlockNumber);
} catch (err) {
log('Error fetching latest block number', err);
}
}
});
app.get('/metrics', async (req, res) => {
res.set('Content-Type', metricsRegister.contentType);
const metrics = await metricsRegister.metrics();
res.send(metrics);
});
const port = Number(process.env.PORT) || DEFAULT_PORT;
app.listen(port, () => {
log(`Server running on port ${port}`);
});
}
main().catch(err => {
log(err);
});

View File

@@ -17,4 +17,4 @@ export * from './fill';
export * from './create-state-gql';
export * from './peer';
export * from './utils';
- export * from './proxy';
+ export * from './backfill-events-data';

View File

@@ -110,7 +110,11 @@ export class JobRunnerCmd {
await indexer.addContracts();
}
- const jobRunner = new JobRunner(config.jobQueue, indexer, jobQueue);
+ const jobRunner = new JobRunner(
+   config.jobQueue,
+   indexer,
+   jobQueue
+ );
// Delete all active and pending (before completed) jobs to start job-runner without old queued jobs
await jobRunner.jobQueue.deleteAllJobs('completed');
@@ -121,7 +125,7 @@
await startJobRunner(jobRunner);
jobRunner.handleShutdown();
- await startMetricsServer(config, indexer);
+ await startMetricsServer(config, jobQueue, indexer);
}
_getArgv (): any {

View File

@@ -11,6 +11,7 @@ import assert from 'assert';
import { ConnectionOptions } from 'typeorm';
import express, { Application } from 'express';
import { ApolloServer } from 'apollo-server-express';
+ import winston from 'winston';
import { JsonRpcProvider } from '@ethersproject/providers';
import {
@@ -30,7 +31,9 @@
Consensus,
readParty,
UpstreamConfig,
- fillBlocks
+ fillBlocks,
+ createGQLLogger,
+ createEthRPCHandlers
} from '@cerc-io/util';
import { TypeSource } from '@graphql-tools/utils';
import type {
@@ -268,7 +271,11 @@ export class ServerCmd {
}
async exec (
- createResolvers: (indexer: IndexerInterface, eventWatcher: EventWatcher) => Promise<any>,
+ createResolvers: (
+   indexer: IndexerInterface,
+   eventWatcher: EventWatcher,
+   gqlLogger: winston.Logger
+ ) => Promise<any>,
typeDefs: TypeSource,
paymentsManager?: PaymentsManager
): Promise<{
@@ -279,6 +286,7 @@
const jobQueue = this._baseCmd.jobQueue;
const indexer = this._baseCmd.indexer;
const eventWatcher = this._baseCmd.eventWatcher;
+ const ethProvider = this._baseCmd.ethProvider;
assert(config);
assert(jobQueue);
@@ -308,11 +316,21 @@
await eventWatcher.start();
}
- const resolvers = await createResolvers(indexer, eventWatcher);
+ const gqlLogger = createGQLLogger(config.server.gql.logDir);
+ const resolvers = await createResolvers(indexer, eventWatcher, gqlLogger);
+ const ethRPCHandlers = await createEthRPCHandlers(indexer, ethProvider);
// Create an Express app
const app: Application = express();
- const server = await createAndStartServer(app, typeDefs, resolvers, config.server, paymentsManager);
+ const server = await createAndStartServer(
+   app,
+   typeDefs,
+   resolvers,
+   ethRPCHandlers,
+   config.server,
+   paymentsManager
+ );
await startGQLMetricsServer(config);

View File

@@ -9,7 +9,7 @@ import { providers } from 'ethers';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721#issuecomment-1319854183
import { PeerIdObj } from '@cerc-io/peer';
- import { Config, EthClient, getCustomProvider } from '@cerc-io/util';
+ import { EthClient, UpstreamConfig, getCustomProvider } from '@cerc-io/util';
import { getCache } from '@cerc-io/cache';
import { EthClient as GqlEthClient } from '@cerc-io/ipld-eth-client';
import { EthClient as RpcEthClient } from '@cerc-io/rpc-eth-client';
@@ -22,19 +22,14 @@ export function readPeerId (filePath: string): PeerIdObj {
return JSON.parse(peerIdJson);
}
- export const initClients = async (config: Config): Promise<{
+ export const initClients = async (upstreamConfig: UpstreamConfig, endpointIndexes = { rpcProviderEndpoint: 0 }): Promise<{
ethClient: EthClient,
ethProvider: providers.JsonRpcProvider
}> => {
- const { database: dbConfig, upstream: upstreamConfig, server: serverConfig } = config;
- assert(serverConfig, 'Missing server config');
- assert(dbConfig, 'Missing database config');
- assert(upstreamConfig, 'Missing upstream config');
- const { ethServer: { gqlApiEndpoint, rpcProviderEndpoint, rpcClient = false }, cache: cacheConfig } = upstreamConfig;
- assert(rpcProviderEndpoint, 'Missing upstream ethServer.rpcProviderEndpoint');
+ const { ethServer: { gqlApiEndpoint, rpcProviderEndpoints, rpcClient = false }, cache: cacheConfig } = upstreamConfig;
+ assert(rpcProviderEndpoints, 'Missing upstream ethServer.rpcProviderEndpoints');
+ assert(rpcProviderEndpoints.length, 'No endpoints configured in ethServer.rpcProviderEndpoints');
const cache = await getCache(cacheConfig);
@@ -42,12 +37,13 @@ export const initClients = async (config: Config): Promise<{
if (rpcClient) {
ethClient = new RpcEthClient({
- rpcEndpoint: rpcProviderEndpoint,
+ rpcEndpoint: rpcProviderEndpoints[endpointIndexes.rpcProviderEndpoint],
cache
});
} else {
assert(gqlApiEndpoint, 'Missing upstream ethServer.gqlApiEndpoint');
+ // TODO: Implement failover for GQL endpoint
ethClient = new GqlEthClient({
gqlEndpoint: gqlApiEndpoint,
cache
@@ -55,7 +51,7 @@
}
const ethProvider = getCustomProvider({
- url: rpcProviderEndpoint,
+ url: rpcProviderEndpoints[endpointIndexes.rpcProviderEndpoint],
allowGzip: true
});

View File

@@ -114,6 +114,12 @@ Steps:
This will create a folder containing the generated code at the path provided in config. Follow the steps in [Run Generated Watcher](#run-generated-watcher) to setup and run the generated watcher.
* Update generated watcher's `package.json` with desired `version`, `description`, `repository` URL, etc.
* Update generated watcher's config (`environments/local.toml`) as required
* Update generated codegen config (`codegen-config.yml`) to remove / replace your system's absolute paths
## Development
* `lint`
@@ -154,6 +160,12 @@ Steps:
yarn
```
* Run build:
```bash
yarn build
```
* In the config file (`environments/local.toml`):
* Update the state checkpoint settings.

View File

@@ -1,6 +1,6 @@
{
"name": "@cerc-io/codegen",
"version": "0.2.78",
"version": "0.2.110",
"description": "Code generator",
"private": true,
"main": "index.js",
@@ -20,7 +20,7 @@
},
"homepage": "https://github.com/cerc-io/watcher-ts#readme",
"dependencies": {
"@cerc-io/util": "^0.2.78",
"@cerc-io/util": "^0.2.110",
"@graphql-tools/load-files": "^6.5.2",
"@npmcli/package-json": "^5.0.0",
"@poanet/solidity-flattener": "https://github.com/vulcanize/solidity-flattener.git",

View File

@@ -4,3 +4,5 @@ out/
.vscode
.idea
+ gql-logs/

View File

@@ -0,0 +1,21 @@
//
// Copyright 2024 Vulcanize, Inc.
//
import fs from 'fs';
import path from 'path';
import Handlebars from 'handlebars';
import { Writable } from 'stream';
const TEMPLATE_FILE = './templates/backfill-events-data-template.handlebars';
/**
* Writes the backfill-events-data file generated from a template to a stream.
* @param outStream A writable output stream to write the backfill-events-data file to.
*/
export function exportBackfillEventsData (outStream: Writable): void {
const templateString = fs.readFileSync(path.resolve(__dirname, TEMPLATE_FILE)).toString();
const template = Handlebars.compile(templateString);
const content = template({});
outStream.write(content);
}

View File

@@ -2,6 +2,7 @@ className: Contract
indexOn:
- columns:
- address
+ - kind
unique: true
columns:
- name: id

View File

@@ -44,6 +44,44 @@ columns:
columnOptions:
- option: length
value: 256
- name: topic0
pgType: varchar
tsType: string
columnType: Column
columnOptions:
- option: length
value: 66
- name: topic1
pgType: varchar
tsType: string | null
columnType: Column
columnOptions:
- option: length
value: 66
- option: nullable
value: true
- name: topic2
pgType: varchar
tsType: string | null
columnType: Column
columnOptions:
- option: length
value: 66
- option: nullable
value: true
- name: topic3
pgType: varchar
tsType: string | null
columnType: Column
columnOptions:
- option: length
value: 66
- option: nullable
value: true
- name: data
pgType: varchar
tsType: string
columnType: Column
- name: eventInfo
pgType: text
tsType: string

View File

@@ -40,6 +40,7 @@ import { exportIndexBlock } from './index-block';
import { exportSubscriber } from './subscriber';
import { exportReset } from './reset';
import { filterInheritedContractNodes, writeFileToStream } from './utils/helpers';
+ import { exportBackfillEventsData } from './backfill-events-data';
const main = async (): Promise<void> => {
const argv = await yargs(hideBin(process.argv))
@@ -65,7 +66,8 @@
})
.argv;
- const config = await getConfig(path.resolve(argv['config-file']));
+ const configFile = path.resolve(argv['config-file']);
+ const config = await getConfig(configFile);
// Create an array of flattened contract strings.
const contracts: any[] = [];
@@ -120,7 +122,7 @@
parseAndVisit(visitor, contracts, config.mode);
- generateWatcher(visitor, contracts, config, overwriteExisting);
+ generateWatcher(visitor, contracts, configFile, config, overwriteExisting);
};
function parseAndVisit (visitor: Visitor, contracts: any[], mode: string) {
@@ -162,7 +164,7 @@
}
}
- function generateWatcher (visitor: Visitor, contracts: any[], config: any, overWriteExisting = false) {
+ function generateWatcher (visitor: Visitor, contracts: any[], configFile: string, config: any, overWriteExisting = false) {
// Prepare directory structure for the watcher.
let outputDir = '';
@@ -198,6 +200,13 @@ function generateWatcher (visitor: Visitor, contracts: any[], config: any, overW
let outStream: Writable;
+ // Export the codegen config file
+ const configFileContent = fs.readFileSync(configFile, 'utf8');
+ outStream = outputDir
+   ? fs.createWriteStream(path.join(outputDir, 'codegen-config.yml'))
+   : process.stdout;
+ outStream.write(configFileContent);
// Export artifacts for the contracts.
contracts.forEach((contract: any) => {
outStream = outputDir
@@ -265,7 +274,7 @@ function generateWatcher (visitor: Visitor, contracts: any[], config: any, overW
outStream = outputDir
? fs.createWriteStream(path.join(outputDir, 'README.md'))
: process.stdout;
- exportReadme(path.basename(outputDir), config.port, outStream);
+ exportReadme(path.basename(outputDir), config, outStream);
outStream = outputDir
? fs.createWriteStream(path.join(outputDir, 'LICENSE'))
@@ -381,6 +390,11 @@ function generateWatcher (visitor: Visitor, contracts: any[], config: any, overW
: process.stdout;
exportIndexBlock(outStream);
+ outStream = outputDir
+   ? fs.createWriteStream(path.join(outputDir, 'src/cli/backfill-events-data.ts'))
+   : process.stdout;
+ exportBackfillEventsData(outStream);
if (config.subgraphPath) {
outStream = outputDir
? fs.createWriteStream(path.join(outputDir, 'src/entity/Subscriber.ts'))

View File

@@ -15,12 +15,25 @@ const TEMPLATE_FILE = './templates/readme-template.handlebars';
* @param port Watcher server port.
* @param outStream A writable output stream to write the README.md file to.
*/
- export function exportReadme (folderName: string, port: number, outStream: Writable): void {
+ export function exportReadme (
+   folderName: string,
+   config: { port: number, subgraphPath?: string },
+   outStream: Writable
+ ): void {
+ const { port, subgraphPath } = config;
const templateString = fs.readFileSync(path.resolve(__dirname, TEMPLATE_FILE)).toString();
const template = Handlebars.compile(templateString);
+ let subgraphRepoName;
+ if (subgraphPath) {
+   const subgraphRepoDir = path.dirname(subgraphPath);
+   subgraphRepoName = path.basename(subgraphRepoDir);
+ }
const readmeString = template({
folderName,
- port
+ port,
+ subgraphRepoName
});
outStream.write(readmeString);
}

View File

@@ -4,7 +4,7 @@
import assert from 'assert';
import { GraphQLSchema, parse, printSchema, print, GraphQLDirective, GraphQLInt, GraphQLBoolean, GraphQLEnumType, DefinitionNode, GraphQLString, GraphQLNonNull } from 'graphql';
- import { ObjectTypeComposer, NonNullComposer, ObjectTypeComposerDefinition, ObjectTypeComposerFieldConfigMapDefinition, SchemaComposer, ListComposer, ComposeOutputType } from 'graphql-compose';
+ import { ObjectTypeComposer, NonNullComposer, ObjectTypeComposerDefinition, ObjectTypeComposerFieldConfigMapDefinition, SchemaComposer, ListComposer, ComposeOutputType, ThunkComposer } from 'graphql-compose';
import { Writable } from 'stream';
import { utils } from 'ethers';
import { VariableDeclaration } from '@solidity-parser/parser/dist/src/ast-types';
@@ -189,6 +189,8 @@ export class Schema {
}
_addSubgraphSchemaQueries (subgraphTypeDefs: ReadonlyArray<DefinitionNode>): void {
+ const subgraphTypeArgsMap = new Map<string, { [key: string]: any }>();
for (const subgraphTypeDef of subgraphTypeDefs) {
// Filtering out enums.
if (subgraphTypeDef.kind !== 'ObjectTypeDefinition') {
@@ -303,21 +305,30 @@
let pluralQueryName = pluralize(queryName);
pluralQueryName = (pluralQueryName === queryName) ? `${pluralQueryName}s` : pluralQueryName;
+ const queryArgs = {
+   where: `${subgraphType}_filter`,
+   orderBy: subgraphTypeOrderByEnum,
+   orderDirection: ORDER_DIRECTION,
+   first: { type: GraphQLInt, defaultValue: 100 },
+   skip: { type: GraphQLInt, defaultValue: 0 }
+ };
queryObject[pluralQueryName] = {
// Get type composer object for return type from the schema composer.
type: this._composer.getAnyTC(subgraphType).NonNull.List.NonNull,
args: {
block: BLOCK_HEIGHT,
- where: `${subgraphType}_filter`,
- orderBy: subgraphTypeOrderByEnum,
- orderDirection: ORDER_DIRECTION,
- first: { type: GraphQLInt, defaultValue: 100 },
- skip: { type: GraphQLInt, defaultValue: 0 }
+ ...queryArgs
}
};
this._composer.Query.addFields(queryObject);
// Save the args for this type in a map (type -> args) for further usage.
subgraphTypeArgsMap.set(subgraphType, queryArgs);
}
// Add args on plural fields for subgraph types.
this._addSubgraphPluralFieldArgs(subgraphTypeDefs, subgraphTypeArgsMap);
}
_getDetailsForSubgraphField (fieldType: ComposeOutputType<any>): {
@ -348,6 +359,11 @@ export class Schema {
isArray = true;
}
if (fieldType instanceof ThunkComposer) {
const unwrappedFieldType = fieldType.getUnwrappedTC() as ObjectTypeComposer;
({ type, isRelation, entityType } = this._getDetailsForSubgraphField(unwrappedFieldType));
}
if (fieldType instanceof ObjectTypeComposer) {
type = 'String';
isRelation = true;
@ -376,6 +392,28 @@ export class Schema {
});
}
_addSubgraphPluralFieldArgs (subgraphTypeDefs: ReadonlyArray<DefinitionNode>, subgraphTypeArgsMap: Map<string, { [key: string]: any }>): void {
for (const subgraphTypeDef of subgraphTypeDefs) {
// Filtering out enums.
if (subgraphTypeDef.kind !== 'ObjectTypeDefinition') {
continue;
}
const subgraphType = subgraphTypeDef.name.value;
const subgraphTypeComposer = this._composer.getOTC(subgraphType);
// Process each field on the type.
Object.entries(subgraphTypeComposer.getFields()).forEach(([fieldName, field]) => {
const { isArray, entityType } = this._getDetailsForSubgraphField(field.type);
// Set args if it's a plural field of some entity type.
if (entityType && isArray) {
subgraphTypeComposer.setFieldArgs(fieldName, subgraphTypeArgsMap.get(entityType) || {});
}
});
}
}
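The arg-propagation idea above (reuse the args built for top-level plural queries on plural relation fields of each entity type) can be sketched in plain TypeScript, without graphql-compose; names here are illustrative, not the actual codegen API:

```typescript
// Minimal model of _addSubgraphPluralFieldArgs: a type -> args map built while
// adding top-level queries is reused for plural entity-valued fields.
type Args = { [key: string]: any };

interface FieldDef {
  entityType?: string; // set when the field resolves to another entity
  isArray: boolean;
}

function addPluralFieldArgs (
  typeFields: Map<string, Record<string, FieldDef>>,
  typeArgsMap: Map<string, Args>
): Map<string, Record<string, Args>> {
  const result = new Map<string, Record<string, Args>>();

  for (const [typeName, fields] of typeFields) {
    const fieldArgs: Record<string, Args> = {};

    for (const [fieldName, field] of Object.entries(fields)) {
      // Only plural fields of some entity type get the filter/pagination args.
      if (field.entityType && field.isArray) {
        fieldArgs[fieldName] = typeArgsMap.get(field.entityType) ?? {};
      }
    }

    result.set(typeName, fieldArgs);
  }

  return result;
}

// Hypothetical schema: Pool.tokens is a plural relation to Token.
const typeArgsMap = new Map<string, Args>([
  ['Token', { where: 'Token_filter', first: 100, skip: 0 }]
]);

const typeFields = new Map<string, Record<string, FieldDef>>([
  ['Pool', {
    tokens: { entityType: 'Token', isArray: true }, // plural relation: gets args
    name: { isArray: false } // scalar: left untouched
  }]
]);

const args = addPluralFieldArgs(typeFields, typeArgsMap);
console.log(JSON.stringify(args.get('Pool')));
// → {"tokens":{"where":"Token_filter","first":100,"skip":0}}
```

In the real change, `setFieldArgs` on the type composer plays the role of the `fieldArgs` assignment here.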
/**
* Adds basic types to the schema and typemapping.
*/

View File

@ -0,0 +1,26 @@
//
// Copyright 2024 Vulcanize, Inc.
//
import 'reflect-metadata';
import debug from 'debug';
import { BackfillEventsDataCmd } from '@cerc-io/cli';
import { Database } from '../database';
import { Event } from '../entity/Event';
const log = debug('vulcanize:backfill-events-data');
const main = async (): Promise<void> => {
const backFillCmd = new BackfillEventsDataCmd();
await backFillCmd.init(Database);
await backFillCmd.exec(Event);
};
main().catch(err => {
log(err);
}).finally(() => {
process.exit(0);
});

View File

@ -2,7 +2,6 @@
host = "127.0.0.1"
port = {{port}}
kind = "{{watcherKind}}"
gqlPath = "/graphql"
# Checkpointing state.
checkpointing = true
@ -11,8 +10,7 @@
checkpointInterval = 2000
# Enable state creation
# CAUTION: Disable only if state creation is not desired or can be filled subsequently
-enableState = true
+enableState = false
{{#if (subgraphPath)}}
subgraphPath = "./subgraph-build"
@ -24,25 +22,40 @@
clearEntitiesCacheInterval = 1000
{{/if}}
-# Max block range for which to return events in eventsInRange GQL query.
-# Use -1 for skipping check on block range.
-maxEventsBlockRange = 1000
# Flag to specify whether RPC endpoint supports block hash as block tag parameter
rpcSupportsBlockHashParam = true
-# GQL cache settings
-[server.gqlCache]
# GQL server config
[server.gql]
path = "/graphql"
# Max block range for which to return events in eventsInRange GQL query.
# Use -1 for skipping check on block range.
maxEventsBlockRange = 1000
# Log directory for GQL requests
logDir = "./gql-logs"
# GQL cache settings
[server.gql.cache]
enabled = true
# Max in-memory cache size (in bytes) (default 8 MB)
# maxCacheSize
# GQL cache-control max-age settings (in seconds)
maxAge = 15
{{#if (subgraphPath)}}
timeTravelMaxAge = 86400 # 1 day
{{/if}}
# ETH RPC server config
[server.ethRPC]
enabled = true
path = "/rpc"
-# Max in-memory cache size (in bytes) (default 8 MB)
-# maxCacheSize
-# GQL cache-control max-age settings (in seconds)
-maxAge = 15
-{{#if (subgraphPath)}}
-timeTravelMaxAge = 86400 # 1 day
-{{/if}}
# Max number of logs that can be returned in a single getLogs request (default: 10000)
getLogsResultLimit = 10000
[metrics]
host = "127.0.0.1"
@ -63,7 +76,9 @@
[upstream]
[upstream.ethServer]
gqlApiEndpoint = "http://127.0.0.1:8082/graphql"
-rpcProviderEndpoint = "http://127.0.0.1:8081"
+rpcProviderEndpoints = [
+  "http://127.0.0.1:8081"
+]
# Boolean flag to specify if rpc-eth-client should be used for RPC endpoint instead of ipld-eth-client (ipld-eth-server GQL client)
rpcClient = false
@ -76,6 +91,10 @@
# Boolean flag to filter event logs by topics
filterLogsByTopics = true
# Switch clients if eth_getLogs call takes more than threshold (in secs)
# Set to 0 for disabling switching
getLogsClientSwitchThresholdInSecs = 0
[upstream.cache]
name = "requests"
enabled = false
@ -89,6 +108,9 @@
subgraphEventsOrder = true
blockDelayInMilliSecs = 2000
# Number of blocks by which block processing lags behind head
blockProcessingOffset = 0
# Boolean to switch between modes of processing events when starting the server.
# Setting to true will fetch filtered events and required blocks in a range of blocks and then process them.
# Setting to false will fetch blocks consecutively with its events and then process them (Behaviour is followed in realtime processing near head).
@ -100,3 +122,6 @@
# Max block range of historical processing after which it waits for completion of events processing
# If set to -1 historical processing does not wait for events processing and completes till latest canonical block
historicalMaxFetchAhead = 10000
# Max number of retries to fetch new block after which watcher will failover to other RPC endpoints
maxNewBlockRetries = 3

View File

@ -199,6 +199,12 @@ export class Database implements DatabaseInterface {
return this._baseDatabase.getEventsInRange(repo, fromBlockNumber, toBlockNumber);
}
async getEvents (options: FindManyOptions<Event>): Promise<Array<Event>> {
const repo = this._conn.getRepository(Event);
return this._baseDatabase.getEvents(repo, options);
}
async saveEventEntity (queryRunner: QueryRunner, entity: Event): Promise<Event> {
const repo = queryRunner.manager.getRepository(Event);
return this._baseDatabase.saveEventEntity(repo, entity);
@ -324,8 +330,8 @@ export class Database implements DatabaseInterface {
await this._baseDatabase.deleteEntitiesByConditions(queryRunner, entity, findConditions);
}
-async getAncestorAtDepth (blockHash: string, depth: number): Promise<string> {
-return this._baseDatabase.getAncestorAtDepth(blockHash, depth);
+async getAncestorAtHeight (blockHash: string, height: number): Promise<string> {
+return this._baseDatabase.getAncestorAtHeight(blockHash, height);
}
_getPropertyColumnMapForEntity (entityName: string): Map<string, string> {

View File

@ -8,13 +8,12 @@ import debug from 'debug';
{{#if queries}}
import JSONbig from 'json-bigint';
{{/if}}
-import { ethers, constants } from 'ethers';
+import { ethers, constants, providers } from 'ethers';
{{#if (subgraphPath)}}
-import { SelectionNode } from 'graphql';
+import { GraphQLResolveInfo } from 'graphql';
{{/if}}
import { JsonFragment } from '@ethersproject/abi';
-import { BaseProvider } from '@ethersproject/providers';
import { MappingKey, StorageLayout } from '@cerc-io/solidity-mapper';
import {
Indexer as BaseIndexer,
@ -49,6 +48,7 @@ import {
EthFullTransaction,
ExtraEventData
} from '@cerc-io/util';
import { initClients } from '@cerc-io/cli';
{{#if (subgraphPath)}}
import { GraphWatcher } from '@cerc-io/graph-node';
{{/if}}
@ -91,7 +91,7 @@ const {{capitalize event}}_EVENT = '{{event}}';
export class Indexer implements IndexerInterface {
_db: Database;
_ethClient: EthClient;
-_ethProvider: BaseProvider;
+_ethProvider: providers.JsonRpcProvider;
_baseIndexer: BaseIndexer;
_serverConfig: ServerConfig;
_upstreamConfig: UpstreamConfig;
@ -118,7 +118,7 @@ export class Indexer implements IndexerInterface {
},
db: DatabaseInterface,
clients: Clients,
-ethProvider: BaseProvider,
+ethProvider: providers.JsonRpcProvider,
jobQueue: JobQueue{{#if (subgraphPath)}},{{/if}}
{{#if (subgraphPath)}}
graphWatcher?: GraphWatcherInterface
@ -188,6 +188,10 @@ export class Indexer implements IndexerInterface {
return this._storageLayoutMap;
}
get contractMap (): Map<string, ethers.utils.Interface> {
return this._contractMap;
}
{{#if (subgraphPath)}}
get graphWatcher (): GraphWatcher {
return this._graphWatcher;
@ -199,6 +203,19 @@ export class Indexer implements IndexerInterface {
await this._baseIndexer.fetchStateStatus();
}
async switchClients (): Promise<void> {
const { ethClient, ethProvider } = await this._baseIndexer.switchClients(initClients);
this._ethClient = ethClient;
this._ethProvider = ethProvider;
{{#if (subgraphPath)}}
this._graphWatcher.switchClients({ ethClient, ethProvider });
{{/if}}
}
async isGetLogsRequestsSlow (): Promise<boolean> {
return this._baseIndexer.isGetLogsRequestsSlow();
}
{{#if (subgraphPath)}}
async getMetaData (block: BlockHeight): Promise<ResultMeta | null> {
return this._baseIndexer.getMetaData(block);
@ -236,7 +253,7 @@ export class Indexer implements IndexerInterface {
};
}
-const { block: { number } } = await this._ethClient.getBlockByHash(blockHash);
+const { block: { number } } = await this.getBlockByHash(blockHash);
const blockNumber = ethers.BigNumber.from(number).toNumber();
log('{{query.name}}: db miss, fetching from upstream server');
@ -449,9 +466,9 @@ export class Indexer implements IndexerInterface {
entity: new () => Entity,
id: string,
block: BlockHeight,
-selections: ReadonlyArray<SelectionNode> = []
+queryInfo: GraphQLResolveInfo
): Promise<any> {
-const data = await this._graphWatcher.getEntity(entity, id, this._relationsMap, block, selections);
+const data = await this._graphWatcher.getEntity(entity, id, this._relationsMap, block, queryInfo);
return data;
}
@ -461,9 +478,9 @@ export class Indexer implements IndexerInterface {
block: BlockHeight,
where: { [key: string]: any } = {},
queryOptions: QueryOptions = {},
-selections: ReadonlyArray<SelectionNode> = []
+queryInfo: GraphQLResolveInfo
): Promise<any[]> {
-return this._graphWatcher.getEntities(entity, this._relationsMap, block, where, queryOptions, selections);
+return this._graphWatcher.getEntities(entity, this._relationsMap, block, where, queryOptions, queryInfo);
}
{{/if}}
@ -512,20 +529,38 @@ export class Indexer implements IndexerInterface {
}
{{/if}}
-parseEventNameAndArgs (kind: string, logObj: any): any {
+parseEventNameAndArgs (watchedContracts: Contract[], logObj: any): { eventParsed: boolean, eventDetails: any } {
const { topics, data } = logObj;
let logDescription: ethers.utils.LogDescription | undefined;
-const contract = this._contractMap.get(kind);
-assert(contract);
for (const watchedContract of watchedContracts) {
const contract = this._contractMap.get(watchedContract.kind);
assert(contract);
-const logDescription = contract.parseLog({ data, topics });
try {
logDescription = contract.parseLog({ data, topics });
break;
} catch (err) {
// Continue loop only if no matching event found
if (!((err as Error).message.includes('no matching event'))) {
throw err;
}
}
}
if (!logDescription) {
return { eventParsed: false, eventDetails: {} };
}
const { eventName, eventInfo, eventSignature } = this._baseIndexer.parseEvent(logDescription);
return {
-eventName,
-eventInfo,
-eventSignature
+eventParsed: true,
+eventDetails: {
+eventName,
+eventInfo,
+eventSignature
+}
};
}
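Since one address can now be watched under several contract kinds, the parser tries each candidate interface in turn, treating only "no matching event" as a soft failure. A self-contained sketch of that flow, with ethers' `Interface.parseLog` stubbed out as plain functions (all names here are illustrative):

```typescript
// Each candidate stands in for a contract interface's parseLog: it either
// returns a parsed event or throws an Error containing 'no matching event'.
type ParseResult = { eventParsed: boolean, eventDetails: { eventName?: string } };
type TryParse = (log: string) => { eventName: string };

function parseWithCandidates (candidates: TryParse[], log: string): ParseResult {
  for (const tryParse of candidates) {
    try {
      const { eventName } = tryParse(log);
      return { eventParsed: true, eventDetails: { eventName } };
    } catch (err) {
      // Only swallow "no matching event"; rethrow anything else.
      if (!(err as Error).message.includes('no matching event')) {
        throw err;
      }
    }
  }

  // No candidate interface recognized the log.
  return { eventParsed: false, eventDetails: {} };
}

// Hypothetical interfaces for two kinds watched at the same address.
const erc20: TryParse = (log) => {
  if (log !== 'transfer-log') throw new Error('no matching event');
  return { eventName: 'Transfer' };
};
const erc721: TryParse = (log) => {
  if (log !== 'mint-log') throw new Error('no matching event');
  return { eventName: 'Mint' };
};

console.log(parseWithCandidates([erc20, erc721], 'mint-log').eventDetails.eventName); // prints "Mint"
```

The `eventParsed: false` result lets the caller skip logs that none of the watched kinds can decode, instead of aborting event fetching.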
@ -600,6 +635,10 @@ export class Indexer implements IndexerInterface {
return this._baseIndexer.watchContract(address, kind, checkpoint, startingBlock, context);
}
async removeContract (address: string, kind: string): Promise<void> {
return this._baseIndexer.removeContract(address, kind);
}
updateStateStatusMap (address: string, stateStatus: StateStatus): void {
this._baseIndexer.updateStateStatusMap(address, stateStatus);
}
@ -620,8 +659,8 @@ export class Indexer implements IndexerInterface {
return this._baseIndexer.getEventsByFilter(blockHash, contract, name);
}
-isWatchedContract (address : string): Contract | undefined {
-return this._baseIndexer.isWatchedContract(address);
+isContractAddressWatched (address : string): Contract[] | undefined {
+return this._baseIndexer.isContractAddressWatched(address);
}
getWatchedContracts (): Contract[] {
@ -637,7 +676,11 @@ export class Indexer implements IndexerInterface {
}
async getEventsInRange (fromBlockNumber: number, toBlockNumber: number): Promise<Array<Event>> {
-return this._baseIndexer.getEventsInRange(fromBlockNumber, toBlockNumber, this._serverConfig.maxEventsBlockRange);
+return this._baseIndexer.getEventsInRange(fromBlockNumber, toBlockNumber, this._serverConfig.gql.maxEventsBlockRange);
}
async getEvents (options: FindManyOptions<Event>): Promise<Array<Event>> {
return this._db.getEvents(options);
}
async getSyncStatus (): Promise<SyncStatus | undefined> {
@ -648,6 +691,10 @@ export class Indexer implements IndexerInterface {
return this._baseIndexer.getBlocks(blockFilter);
}
async getBlockByHash (blockHash?: string): Promise<{ block: any }> {
return this._baseIndexer.getBlockByHash(blockHash);
}
async updateSyncStatusIndexedBlock (blockHash: string, blockNumber: number, force = false): Promise<SyncStatus> {
return this._baseIndexer.updateSyncStatusIndexedBlock(blockHash, blockNumber, force);
}
@ -744,8 +791,8 @@ export class Indexer implements IndexerInterface {
return this._baseIndexer.updateBlockProgress(block, lastProcessedEventIndex);
}
-async getAncestorAtDepth (blockHash: string, depth: number): Promise<string> {
-return this._baseIndexer.getAncestorAtDepth(blockHash, depth);
+async getAncestorAtHeight (blockHash: string, height: number): Promise<string> {
+return this._baseIndexer.getAncestorAtHeight(blockHash, height);
}
async resetWatcherToBlock (blockNumber: number): Promise<void> {
@ -806,6 +853,7 @@ export class Indexer implements IndexerInterface {
{{/each}}
}
// eslint-disable-next-line @typescript-eslint/no-empty-function
_populateRelationsMap (): void {
{{#each subgraphEntities as | subgraphEntity |}}
{{#if subgraphEntity.relations}}
@ -844,7 +892,27 @@ export class Indexer implements IndexerInterface {
assert(blockHash);
assert(blockNumber);
-const { events: dbEvents, transactions } = await this._baseIndexer.fetchEvents(blockHash, blockNumber, this.eventSignaturesMap, this.parseEventNameAndArgs.bind(this));
{{#if (subgraphPath)}}
let dbEvents: DeepPartial<Event>[] = [];
let transactions: EthFullTransaction[] = [];
// Fetch events and txs only if subgraph config has any event handlers
if (this._graphWatcher.eventHandlerExists) {
({ events: dbEvents, transactions } = await this._baseIndexer.fetchEvents(
blockHash,
blockNumber,
this.eventSignaturesMap,
this.parseEventNameAndArgs.bind(this)
));
}
{{else~}}
const { events: dbEvents, transactions } = await this._baseIndexer.fetchEvents(
blockHash,
blockNumber,
this.eventSignaturesMap,
this.parseEventNameAndArgs.bind(this)
);
{{/if}}
const dbTx = await this._db.createTransactionRunner();
try {
@ -869,4 +937,8 @@ export class Indexer implements IndexerInterface {
await dbTx.release();
}
}
async getFullTransactions (txHashList: string[]): Promise<EthFullTransaction[]> {
return this._baseIndexer.getFullTransactions(txHashList);
}
}

View File

@ -31,7 +31,7 @@
},
"repository": {
"type": "git",
"url": "git+https://github.com/cerc-io/watcher-ts.git"
"url": "https://github.com/cerc-io/watcher-ts.git"
},
"author": "",
"license": "AGPL-3.0",
@ -41,12 +41,12 @@
"homepage": "https://github.com/cerc-io/watcher-ts#readme",
"dependencies": {
"@apollo/client": "^3.3.19",
"@cerc-io/cli": "^0.2.78",
"@cerc-io/ipld-eth-client": "^0.2.78",
"@cerc-io/solidity-mapper": "^0.2.78",
"@cerc-io/util": "^0.2.78",
"@cerc-io/cli": "^v0.2.110",
"@cerc-io/ipld-eth-client": "^v0.2.110",
"@cerc-io/solidity-mapper": "^v0.2.110",
"@cerc-io/util": "^v0.2.110",
{{#if (subgraphPath)}}
"@cerc-io/graph-node": "^0.2.78",
"@cerc-io/graph-node": "^v0.2.110",
{{/if}}
"@ethersproject/providers": "^5.4.4",
"debug": "^4.3.1",
@ -75,6 +75,7 @@
"eslint-plugin-standard": "^5.0.0",
"husky": "^7.0.2",
"ts-node": "^10.2.1",
"typescript": "^5.0.2"
"typescript": "^5.0.2",
"winston": "^3.13.0"
}
}

View File

@ -1,5 +1,12 @@
# {{folderName}}
{{#if (subgraphPath)}}
## Source
<!-- TODO: Update with published subgraph release version -->
* Subgraph: [{{subgraphRepoName}} v0.1.0](https://github.com/cerc-io/{{subgraphRepoName}}/releases/tag/v0.1.0)
{{/if}}
## Setup
* Run the following command to install required packages:
@ -8,6 +15,12 @@
yarn
```
* Run build:
```bash
yarn build
```
* Create a postgres12 database for the watcher:
```bash
@ -63,7 +76,7 @@
To enable GQL requests caching:
-* Update the `server.gqlCache` config with required settings.
+* Update the `server.gql.cache` config with required settings.
* In the GQL [schema file](./src/schema.gql), use the `cacheControl` directive to apply cache hints at schema level.
@ -91,14 +104,6 @@ To enable GQL requests caching:
yarn job-runner
```
-* Run the server:
-```bash
-yarn server
-```
-GQL console: http://localhost:{{port}}/graphql
* To watch a contract:
```bash
@ -124,6 +129,14 @@ To enable GQL requests caching:
yarn watch:contract --address MyProtocol --kind protocol --checkpoint true
```
* Run the server:
```bash
yarn server
```
GQL console: http://localhost:{{port}}/graphql
* To fill a block range:
```bash

View File

@ -5,6 +5,8 @@
import assert from 'assert';
import debug from 'debug';
import { GraphQLResolveInfo } from 'graphql';
import { ExpressContext } from 'apollo-server-express';
import winston from 'winston';
import {
{{#if queries}}
@ -12,6 +14,7 @@ import {
{{/if}}
gqlTotalQueryCount,
gqlQueryCount,
gqlQueryDuration,
getResultState,
IndexerInterface,
GraphQLBigInt,
@ -36,11 +39,59 @@ import { {{query.entityName}} } from './entity/{{query.entityName}}';
const log = debug('vulcanize:resolver');
-export const createResolvers = async (indexerArg: IndexerInterface, eventWatcher: EventWatcher): Promise<any> => {
const executeAndRecordMetrics = async (
indexer: Indexer,
gqlLogger: winston.Logger,
opName: string,
expressContext: ExpressContext,
operation: () => Promise<any>
) => {
gqlTotalQueryCount.inc(1);
gqlQueryCount.labels(opName).inc(1);
const endTimer = gqlQueryDuration.labels(opName).startTimer();
try {
const [result, syncStatus] = await Promise.all([
operation(),
indexer.getSyncStatus()
]);
gqlLogger.info({
opName,
query: expressContext.req.body.query,
variables: expressContext.req.body.variables,
latestIndexedBlockNumber: syncStatus?.latestIndexedBlockNumber,
urlPath: expressContext.req.path,
apiKey: expressContext.req.header('x-api-key'),
origin: expressContext.req.headers.origin
});
return result;
} catch (error) {
gqlLogger.error({
opName,
error,
query: expressContext.req.body.query,
variables: expressContext.req.body.variables,
urlPath: expressContext.req.path,
apiKey: expressContext.req.header('x-api-key'),
origin: expressContext.req.headers.origin
});
throw error;
} finally {
endTimer();
}
};
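The helper above follows a standard wrap-and-measure pattern: bump counters, start a timer, log success or failure, rethrow errors, and always stop the timer. A bare-bones version with prom-client and winston replaced by plain objects (names are illustrative):

```typescript
// Minimal sketch of the executeAndRecordMetrics pattern, with in-memory
// stand-ins for the metrics registry and the GQL request logger.
const counts: Record<string, number> = {};
const logs: string[] = [];

async function withMetrics<T> (opName: string, operation: () => Promise<T>): Promise<T> {
  counts[opName] = (counts[opName] ?? 0) + 1;
  const start = Date.now();

  try {
    const result = await operation();
    logs.push(`${opName} ok`);
    return result;
  } catch (error) {
    logs.push(`${opName} error`);
    throw error; // the wrapper must not swallow resolver errors
  } finally {
    // In the real code this stops a prom-client duration histogram timer.
    logs.push(`${opName} took ${Date.now() - start}ms`);
  }
}

withMetrics('getSyncStatus', async () => 42).then(result => {
  console.log(result, counts.getSyncStatus);
});
```

Running every resolver body through one wrapper keeps the counting, timing, and request logging in a single place instead of repeated per resolver.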
export const createResolvers = async (
indexerArg: IndexerInterface,
eventWatcher: EventWatcher,
gqlLogger: winston.Logger
): Promise<any> => {
const indexer = indexerArg as Indexer;
// eslint-disable-next-line @typescript-eslint/no-unused-vars
-const gqlCacheConfig = indexer.serverConfig.gqlCache;
+const gqlCacheConfig = indexer.serverConfig.gql.cache;
return {
BigInt: GraphQLBigInt,
@ -78,20 +129,24 @@ export const createResolvers = async (indexerArg: IndexerInterface, eventWatcher
{{~#each this.params}}, {{this.name~}} {{/each}} }: { blockHash: string, contractAddress: string
{{~#each this.params}}, {{this.name}}: {{this.type~}} {{/each}} },
// eslint-disable-next-line @typescript-eslint/no-unused-vars
-__: any,
+expressContext: ExpressContext,
// eslint-disable-next-line @typescript-eslint/no-unused-vars
info: GraphQLResolveInfo
): Promise<ValueResult> => {
log('{{this.name}}', blockHash, contractAddress
{{~#each this.params}}, {{this.name~}} {{/each}});
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('{{this.name}}').inc(1);
// Set cache-control hints
// setGQLCacheHints(info, {}, gqlCacheConfig);
-return indexer.{{this.name}}(blockHash, contractAddress
-{{~#each this.params}}, {{this.name~}} {{/each}});
return executeAndRecordMetrics(
indexer,
gqlLogger,
'{{this.name}}',
expressContext,
async () => indexer.{{this.name}}(blockHash, contractAddress
{{~#each this.params}}, {{this.name~}} {{/each}})
);
},
{{/each}}
@ -100,116 +155,175 @@ export const createResolvers = async (indexerArg: IndexerInterface, eventWatcher
{{this.queryName}}: async (
_: any,
{ id, block = {} }: { id: string, block: BlockHeight },
-__: any,
+expressContext: ExpressContext,
info: GraphQLResolveInfo
) => {
log('{{this.queryName}}', id, JSON.stringify(block, jsonBigIntStringReplacer));
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('{{this.queryName}}').inc(1);
-assert(info.fieldNodes[0].selectionSet);
// Set cache-control hints
// setGQLCacheHints(info, block, gqlCacheConfig);
-return indexer.getSubgraphEntity({{this.entityName}}, id, block, info.fieldNodes[0].selectionSet.selections);
return executeAndRecordMetrics(
indexer,
gqlLogger,
'{{this.queryName}}',
expressContext,
async () => indexer.getSubgraphEntity({{this.entityName}}, id, block, info)
);
},
{{this.pluralQueryName}}: async (
_: any,
{ block = {}, where, first, skip, orderBy, orderDirection }: { block: BlockHeight, where: { [key: string]: any }, first: number, skip: number, orderBy: string, orderDirection: OrderDirection },
-__: any,
+expressContext: ExpressContext,
info: GraphQLResolveInfo
) => {
log('{{this.pluralQueryName}}', JSON.stringify(block, jsonBigIntStringReplacer), JSON.stringify(where, jsonBigIntStringReplacer), first, skip, orderBy, orderDirection);
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('{{this.pluralQueryName}}').inc(1);
-assert(info.fieldNodes[0].selectionSet);
// Set cache-control hints
// setGQLCacheHints(info, block, gqlCacheConfig);
-return indexer.getSubgraphEntities(
-{{this.entityName}},
-block,
-where,
-{ limit: first, skip, orderBy, orderDirection },
-info.fieldNodes[0].selectionSet.selections
return executeAndRecordMetrics(
indexer,
gqlLogger,
'{{this.pluralQueryName}}',
expressContext,
async () => indexer.getSubgraphEntities(
{{this.entityName}},
block,
where,
{ limit: first, skip, orderBy, orderDirection },
info
)
);
},
{{/each}}
-events: async (_: any, { blockHash, contractAddress, name }: { blockHash: string, contractAddress: string, name?: string }) => {
events: async (
_: any,
{ blockHash, contractAddress, name }: { blockHash: string, contractAddress: string, name?: string },
expressContext: ExpressContext
) => {
log('events', blockHash, contractAddress, name);
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('events').inc(1);
-const block = await indexer.getBlockProgress(blockHash);
-if (!block || !block.isComplete) {
-throw new Error(`Block hash ${blockHash} number ${block?.blockNumber} not processed yet`);
-}
return executeAndRecordMetrics(
indexer,
gqlLogger,
'events',
expressContext,
async () => {
const block = await indexer.getBlockProgress(blockHash);
if (!block || !block.isComplete) {
throw new Error(`Block hash ${blockHash} number ${block?.blockNumber} not processed yet`);
}
-const events = await indexer.getEventsByFilter(blockHash, contractAddress, name);
-return events.map(event => indexer.getResultEvent(event));
const events = await indexer.getEventsByFilter(blockHash, contractAddress, name);
return events.map(event => indexer.getResultEvent(event));
}
);
},
-eventsInRange: async (_: any, { fromBlockNumber, toBlockNumber }: { fromBlockNumber: number, toBlockNumber: number }) => {
eventsInRange: async (
_: any,
{ fromBlockNumber, toBlockNumber }: { fromBlockNumber: number, toBlockNumber: number },
expressContext: ExpressContext
) => {
log('eventsInRange', fromBlockNumber, toBlockNumber);
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('eventsInRange').inc(1);
-const syncStatus = await indexer.getSyncStatus();
return executeAndRecordMetrics(
indexer,
gqlLogger,
'eventsInRange',
expressContext,
async () => {
const syncStatus = await indexer.getSyncStatus();
if (!syncStatus) {
throw new Error('No blocks processed yet');
}
-if (!syncStatus) {
-throw new Error('No blocks processed yet');
-}
if ((fromBlockNumber < syncStatus.initialIndexedBlockNumber) || (toBlockNumber > syncStatus.latestProcessedBlockNumber)) {
throw new Error(`Block range should be between ${syncStatus.initialIndexedBlockNumber} and ${syncStatus.latestProcessedBlockNumber}`);
}
-if ((fromBlockNumber < syncStatus.initialIndexedBlockNumber) || (toBlockNumber > syncStatus.latestProcessedBlockNumber)) {
-throw new Error(`Block range should be between ${syncStatus.initialIndexedBlockNumber} and ${syncStatus.latestProcessedBlockNumber}`);
-}
-const events = await indexer.getEventsInRange(fromBlockNumber, toBlockNumber);
-return events.map(event => indexer.getResultEvent(event));
const events = await indexer.getEventsInRange(fromBlockNumber, toBlockNumber);
return events.map(event => indexer.getResultEvent(event));
}
);
},
-getStateByCID: async (_: any, { cid }: { cid: string }) => {
getStateByCID: async (
_: any,
{ cid }: { cid: string },
expressContext: ExpressContext
) => {
log('getStateByCID', cid);
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('getStateByCID').inc(1);
-const state = await indexer.getStateByCID(cid);
return executeAndRecordMetrics(
indexer,
gqlLogger,
'getStateByCID',
expressContext,
async () => {
const state = await indexer.getStateByCID(cid);
return state && state.block.isComplete ? getResultState(state) : undefined;
-return state && state.block.isComplete ? getResultState(state) : undefined;
}
);
},
-getState: async (_: any, { blockHash, contractAddress, kind }: { blockHash: string, contractAddress: string, kind: string }) => {
getState: async (
_: any,
{ blockHash, contractAddress, kind }: { blockHash: string, contractAddress: string, kind: string },
expressContext: ExpressContext
) => {
log('getState', blockHash, contractAddress, kind);
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('getState').inc(1);
-const state = await indexer.getPrevState(blockHash, contractAddress, kind);
return executeAndRecordMetrics(
indexer,
gqlLogger,
'getState',
expressContext,
async () => {
const state = await indexer.getPrevState(blockHash, contractAddress, kind);
return state && state.block.isComplete ? getResultState(state) : undefined;
-return state && state.block.isComplete ? getResultState(state) : undefined;
}
);
},
{{#if (subgraphPath)}}
_meta: async (
_: any,
-{ block = {} }: { block: BlockHeight }
+{ block = {} }: { block: BlockHeight },
+expressContext: ExpressContext
) => {
log('_meta');
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('_meta').inc(1);
-return indexer.getMetaData(block);
return executeAndRecordMetrics(
indexer,
gqlLogger,
'_meta',
expressContext,
async () => indexer.getMetaData(block)
);
},
{{/if}}
-getSyncStatus: async () => {
getSyncStatus: async (
_: any,
__: Record<string, never>,
expressContext: ExpressContext
) => {
log('getSyncStatus');
-gqlTotalQueryCount.inc(1);
-gqlQueryCount.labels('getSyncStatus').inc(1);
-return indexer.getSyncStatus();
return executeAndRecordMetrics(
indexer,
gqlLogger,
'getSyncStatus',
expressContext,
async () => indexer.getSyncStatus()
);
}
}
};

View File

@ -10,7 +10,7 @@ import { loadFilesSync } from '@graphql-tools/load-files';
import { ASSET_DIR } from './constants';
const GRAPH_TS_VERSION = '0.27.0-watcher-ts-0.1.3';
-const GRAPH_CLI_VERSION = '0.32.0-watcher-ts-0.1.3';
+const GRAPH_CLI_VERSION = '0.32.0-watcher-ts-0.1.4';
export function parseSubgraphSchema (subgraphPath: string, subgraphConfig: any): any {
const subgraphSchemaPath = path.join(path.resolve(subgraphPath), subgraphConfig.schema?.file ?? './schema.graphql');

View File

@ -65,20 +65,23 @@ export class Visitor {
const name = node.name;
assert(name);
-const params = node.parameters.map((item: any) => {
-return { name: item.name, type: item.typeName.name };
+const params = node.parameters.map((item: any, index: number) => {
+const itemName = item.name ?? `arg${index}`;
+return { name: itemName, type: item.typeName.name };
});
let errorMessage = '';
// Check for unhandled return type params
node.returnParameters.forEach(returnParameter => {
assert(returnParameter.typeName);
const isTypeHandled = ['ElementaryTypeName', 'ArrayTypeName'].includes(returnParameter.typeName.type);
if (!isTypeHandled) {
-const errorMessage = `No support in codegen for type ${returnParameter.typeName.type} from method "${node.name}"`;
+errorMessage = `No support in codegen for type ${returnParameter.typeName.type} from method "${node.name}"`;
if (this._continueOnError) {
console.log(errorMessage);
return;
}
@ -86,6 +89,11 @@ export class Visitor {
}
});
if (this._continueOnError && errorMessage !== '') {
console.log(errorMessage);
return;
}
this._schema.addQuery(name, params, node.returnParameters);
this._resolvers.addQuery(name, params);
assert(this._contract);
@ -125,12 +133,24 @@ export class Visitor {
// If the variable type is mapping, extract key as a param:
// Eg. mapping(address => mapping(address => uint256)) private _allowances;
while (typeName.type === 'Mapping') {
-assert(typeName.keyType.type === 'ElementaryTypeName', 'UserDefinedTypeName map keys like enum type not handled');
+if (typeName.keyType.type === 'UserDefinedTypeName') {
+errorMessage = 'No support in codegen for user defined type map keys';
+break;
+}
params.push({ name: `key${numParams.toString()}`, type: typeName.keyType.name });
typeName = typeName.valueType;
numParams++;
}
if (typeName.type === 'UserDefinedTypeName') {
errorMessage = 'No support in codegen for user defined type map values';
}
if (errorMessage !== '') {
break;
}
// falls through
}
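The mapping walk above can be modeled in isolation: unwrap nested mappings, collect one key param per level, and record an error message for user-defined key or value types instead of crashing. The type shapes below are illustrative, not the actual solidity-parser AST:

```typescript
// Simplified model of the continue-on-error mapping unwrap in the Visitor.
type TypeName =
  | { type: 'ElementaryTypeName', name: string }
  | { type: 'UserDefinedTypeName', name: string }
  | { type: 'Mapping', keyType: TypeName, valueType: TypeName };

function unwrapMapping (typeName: TypeName): { params: string[], errorMessage: string } {
  const params: string[] = [];
  let errorMessage = '';
  let numParams = 0;

  while (typeName.type === 'Mapping') {
    // User-defined keys (e.g. enums) are reported, not asserted on.
    if (typeName.keyType.type === 'UserDefinedTypeName') {
      errorMessage = 'No support in codegen for user defined type map keys';
      break;
    }
    params.push(`key${numParams}`);
    typeName = typeName.valueType;
    numParams++;
  }

  // If the innermost value is itself user-defined, report that too.
  if (typeName.type === 'UserDefinedTypeName') {
    errorMessage = 'No support in codegen for user defined type map values';
  }

  return { params, errorMessage };
}

// mapping(address => mapping(address => uint256)): two key params, no error.
const allowances: TypeName = {
  type: 'Mapping',
  keyType: { type: 'ElementaryTypeName', name: 'address' },
  valueType: {
    type: 'Mapping',
    keyType: { type: 'ElementaryTypeName', name: 'address' },
    valueType: { type: 'ElementaryTypeName', name: 'uint256' }
  }
};

console.log(unwrapMapping(allowances).params.join(',')); // prints "key0,key1"
```

With `--continue-on-error`, codegen logs the collected message and skips the state variable query rather than aborting the whole run.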

View File

@ -1,10 +1,10 @@
{
"name": "@cerc-io/graph-node",
"version": "0.2.78",
"version": "0.2.110",
"main": "dist/index.js",
"license": "AGPL-3.0",
"devDependencies": {
"@cerc-io/solidity-mapper": "^0.2.78",
"@cerc-io/solidity-mapper": "^0.2.110",
"@ethersproject/providers": "^5.4.4",
"@graphprotocol/graph-ts": "^0.22.0",
"@nomiclabs/hardhat-ethers": "^2.0.2",
@ -51,9 +51,9 @@
"dependencies": {
"@apollo/client": "^3.3.19",
"@cerc-io/assemblyscript": "0.19.10-watcher-ts-0.1.2",
"@cerc-io/cache": "^0.2.78",
"@cerc-io/ipld-eth-client": "^0.2.78",
"@cerc-io/util": "^0.2.78",
"@cerc-io/cache": "^0.2.110",
"@cerc-io/ipld-eth-client": "^0.2.110",
"@cerc-io/util": "^0.2.110",
"@types/json-diff": "^0.5.2",
"@types/yargs": "^17.0.0",
"bn.js": "^4.11.9",

View File

@ -50,6 +50,7 @@ export interface Context {
rpcSupportsBlockHashParam: boolean;
block?: Block;
contractAddress?: string;
dataSourceName?: string;
}
const log = debug('vulcanize:graph-node');
@ -86,10 +87,7 @@ export const instantiate = async (
assert(indexer.getEntityTypesMap);
const entityTypesMap = indexer.getEntityTypesMap();
-const entityTypes = entityTypesMap.get(entityName);
-assert(entityTypes);
-return database.toGraphEntity(instanceExports, entityName, entityData, entityTypes);
+return database.toGraphEntity(instanceExports, entityName, entityData, entityTypesMap);
},
'store.set': async (entity: number, id: number, data: number) => {
const entityName = __getString(entity);
@ -343,7 +341,11 @@ export const instantiate = async (
},
'typeConversion.bytesToString': async (bytes: number) => {
const byteArray = __getArray(bytes);
-const string = utils.toUtf8String(byteArray);
+let string = utils.toUtf8String(byteArray, utils.Utf8ErrorFuncs.replace);
// Replace \x00 with empty string as Postgres DB text data type does not support it
string = string.replaceAll('\x00', '');
const ptr = await __newString(string);
return ptr;
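The change above makes decoding lenient (invalid UTF-8 becomes U+FFFD via `Utf8ErrorFuncs.replace` instead of throwing) and strips NUL characters, which Postgres text columns reject. A standalone illustration using the built-in `TextDecoder` in place of ethers (the function name is hypothetical):

```typescript
// Decode bytes to a string safe for a Postgres text column.
function bytesToPgSafeString (bytes: Uint8Array): string {
  // TextDecoder is non-fatal by default: invalid sequences become '\uFFFD',
  // mirroring ethers' Utf8ErrorFuncs.replace behaviour.
  const decoded = new TextDecoder('utf-8').decode(bytes);

  // Postgres text/varchar cannot store \x00, so drop it entirely.
  return decoded.replace(/\x00/g, '');
}

// 'a', NUL, 'b', then a lone 0xff (invalid UTF-8).
const input = new Uint8Array([0x61, 0x00, 0x62, 0xff]);
console.log(bytesToPgSafeString(input)); // NUL dropped; 0xff decodes to U+FFFD
```

Without both steps, a single malformed subgraph string value could either crash the mapping handler (decode error) or fail the entity insert (NUL in a text column).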
@ -718,38 +720,27 @@ export const instantiate = async (
},
'dataSource.context': async () => {
assert(context.contractAddress);
-const contract = indexer.isWatchedContract(context.contractAddress);
+const watchedContracts = indexer.isContractAddressWatched(context.contractAddress);
+const dataSourceContract = watchedContracts?.find(contract => contract.kind === context.dataSourceName);
-if (!contract) {
+if (!dataSourceContract) {
return null;
}
-return database.toGraphContext(instanceExports, contract.context);
+return database.toGraphContext(instanceExports, dataSourceContract.context);
},
'dataSource.network': async () => {
assert(dataSource);
return __newString(dataSource.network);
},
'dataSource.create': async (name: number, params: number) => {
const [addressStringPtr] = __getArray(params);
const addressString = __getString(addressStringPtr);
const contractKind = __getString(name);
assert(indexer.watchContract);
assert(context.block);
await indexer.watchContract(utils.getAddress(addressString), contractKind, true, Number(context.block.blockNumber));
await handleDataSourceCreate(name, params);
},
'dataSource.createWithContext': async (name: number, params: number, dataSourceContext: number) => {
const [addressStringPtr] = __getArray(params);
const addressString = __getString(addressStringPtr);
const contractKind = __getString(name);
const contextInstance = await Entity.wrap(dataSourceContext);
const dbData = await database.fromGraphContext(instanceExports, contextInstance);
assert(indexer.watchContract);
assert(context.block);
await indexer.watchContract(utils.getAddress(addressString), contractKind, true, Number(context.block.blockNumber), dbData);
await handleDataSourceCreate(name, params, dbData);
}
},
json: {
@@ -783,6 +774,34 @@ export const instantiate = async (
}
};
const handleDataSourceCreate = async (name: number, params: number, dbData?: {[key: string]: any}) => {
const [addressStringPtr] = __getArray(params);
const addressString = __getString(addressStringPtr);
const contractKind = __getString(name);
assert(context.block);
const contractAddress = utils.getAddress(addressString);
const watchedContracts = indexer.isContractAddressWatched(contractAddress);
// If the template contract is already watched (in case of reorgs),
// remove it from watched contracts and throw an error to reprocess the block with the correct order of template contract events
if (
watchedContracts &&
watchedContracts.some(watchedContract => watchedContract.kind === contractKind)
) {
await indexer.removeContract(contractAddress, contractKind);
throw new Error(`Template contract ${contractAddress} of kind ${contractKind} already exists; removed from watched contracts`);
}
await indexer.watchContract(
contractAddress,
contractKind,
true,
Number(context.block.blockNumber),
dbData
);
};
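The reorg guard in `handleDataSourceCreate` above can be sketched with a hypothetical in-memory store standing in for the indexer's watched-contract table:

```typescript
// Sketch of the reorg guard above, assuming a hypothetical in-memory store in
// place of the indexer's watched-contract table.
interface WatchedContract { address: string; kind: string; }

class WatchedContractStore {
  private contracts: WatchedContract[] = [];

  watch (address: string, kind: string): void {
    const exists = this.contracts.some(c => c.address === address && c.kind === kind);

    if (exists) {
      // Already watched (a reorg replayed the template create): remove it and
      // throw so the block is reprocessed with template events in the correct order.
      this.contracts = this.contracts.filter(c => !(c.address === address && c.kind === kind));
      throw new Error(`Template contract ${address} of kind ${kind} already exists; removed from watched contracts`);
    }

    this.contracts.push({ address, kind });
  }
}
```

On the retry after the throw, the contract is no longer watched, so the same `watch` call succeeds and the template contract's events replay in order.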
const instance = await loader.instantiate(source, imports);
const { exports: instanceExports } = instance;

View File

@@ -8,7 +8,7 @@ import debug from 'debug';
import path from 'path';
import fs from 'fs';
import { ContractInterface, utils, providers } from 'ethers';
import { SelectionNode } from 'graphql';
import { GraphQLResolveInfo, SelectionNode } from 'graphql';
import { ResultObject } from '@cerc-io/assemblyscript/lib/loader';
import {
@@ -28,10 +28,6 @@ import {
Transaction,
EthClient,
DEFAULT_LIMIT,
FILTER_CHANGE_BLOCK,
Where,
Filter,
OPERATOR_MAP,
ExtraEventData,
EthFullTransaction
} from '@cerc-io/util';
@@ -60,6 +56,9 @@ export class GraphWatcher {
_context: Context;
_blockHandlerExists = false;
_eventHandlerExists = false;
constructor (database: GraphDatabase, ethClient: EthClient, ethProvider: providers.BaseProvider, serverConfig: ServerConfig) {
this._database = database;
this._ethClient = ethClient;
@@ -110,6 +109,10 @@ export class GraphWatcher {
};
}, {});
// Check whether block and event handlers exist, to decide watcher behaviour
this._blockHandlerExists = this._dataSources.some(dataSource => Boolean(dataSource.mapping.blockHandlers));
this._eventHandlerExists = this._dataSources.some(dataSource => Boolean(dataSource.mapping.eventHandlers));
const data = await Promise.all(dataPromises);
// Create a map from dataSource contract address to instance and contract interface.
@@ -129,6 +132,11 @@ export class GraphWatcher {
this.fillEventSignatureMap();
}
async switchClients ({ ethClient, ethProvider }: { ethClient: EthClient, ethProvider: providers.BaseProvider }) {
this._ethClient = ethClient;
this._ethProvider = ethProvider;
}
fillEventSignatureMap () {
this._dataSources.forEach(contract => {
if (contract.kind === 'ethereum/contract' && contract.mapping.kind === 'ethereum/events') {
@@ -146,10 +154,17 @@
return this._dataSources;
}
get blockHandlerExists (): boolean {
return this._blockHandlerExists;
}
get eventHandlerExists (): boolean {
return this._eventHandlerExists;
}
async addContracts () {
assert(this._indexer);
assert(this._indexer.watchContract);
assert(this._indexer.isWatchedContract);
// Watching the contract(s) if not watched already.
for (const dataSource of this._dataSources) {
@@ -157,11 +172,7 @@ export class GraphWatcher {
// Skip for templates as they are added dynamically.
if (address) {
const watchedContract = await this._indexer.isWatchedContract(address);
if (!watchedContract) {
await this._indexer.watchContract(address, name, true, startBlock);
}
await this._indexer.watchContract(address, name, true, startBlock);
}
}
}
@@ -177,64 +188,79 @@ export class GraphWatcher {
const blockData = this._context.block;
assert(blockData);
assert(this._indexer && this._indexer.isWatchedContract);
const watchedContract = this._indexer.isWatchedContract(contract);
assert(watchedContract);
assert(this._indexer);
const watchedContracts = this._indexer.isContractAddressWatched(contract);
assert(watchedContracts);
// Get dataSource in subgraph yaml based on contract address.
const dataSource = this._dataSources.find(dataSource => dataSource.name === watchedContract.kind);
// Get dataSources in subgraph yaml based on contract kind (same as dataSource.name)
const dataSources = this._dataSources
.filter(dataSource => watchedContracts.some(contract => contract.kind === dataSource.name));
if (!dataSource) {
if (!dataSources.length) {
log(`Subgraph doesn't have configuration for contract ${contract}`);
return;
}
this._context.contractAddress = contract;
for (const dataSource of dataSources) {
this._context.contractAddress = contract;
this._context.dataSourceName = dataSource.name;
const { instance, contractInterface } = this._dataSourceMap[watchedContract.kind];
assert(instance);
const { exports: instanceExports } = instance;
const { instance, contractInterface } = this._dataSourceMap[dataSource.name];
assert(instance);
const { exports: instanceExports } = instance;
let eventTopic: string;
// Get event handler based on event topic (from event signature).
const eventTopic = contractInterface.getEventTopic(eventSignature);
const eventHandler = dataSource.mapping.eventHandlers.find((eventHandler: any) => {
// The event signature we get from logDescription is different than that given in the subgraph yaml file.
// E.g. event in subgraph.yaml: Stake(indexed address,uint256); from logDescription: Stake(address,uint256)
// ethers.js doesn't recognize the subgraph event signature with indexed keyword before param type.
// Match event topics from cleaned subgraph event signature (Stake(indexed address,uint256) -> Stake(address,uint256)).
const subgraphEventTopic = contractInterface.getEventTopic(eventHandler.event.replace(/indexed /g, ''));
try {
eventTopic = contractInterface.getEventTopic(eventSignature);
} catch (err) {
// Continue the loop only if no matching event is found
if (!((err as Error).message.includes('no matching event'))) {
throw err;
}
return subgraphEventTopic === eventTopic;
});
continue;
}
if (!eventHandler) {
log(`No handler configured in subgraph for event ${eventSignature}`);
return;
}
// Get event handler based on event topic (from event signature).
const eventHandler = dataSource.mapping.eventHandlers.find((eventHandler: any) => {
// The event signature we get from logDescription is different than that given in the subgraph yaml file.
// E.g. event in subgraph.yaml: Stake(indexed address,uint256); from logDescription: Stake(address,uint256)
// ethers.js doesn't recognize the subgraph event signature with indexed keyword before param type.
// Match event topics from cleaned subgraph event signature (Stake(indexed address,uint256) -> Stake(address,uint256)).
const subgraphEventTopic = contractInterface.getEventTopic(eventHandler.event.replace(/indexed /g, ''));
const eventFragment = contractInterface.getEvent(eventSignature);
return subgraphEventTopic === eventTopic;
});
const tx = this._getTransactionData(txHash, extraData.ethFullTransactions);
if (!eventHandler) {
log(`No handler configured in subgraph for event ${eventSignature}`);
return;
}
const data = {
block: blockData,
inputs: eventFragment.inputs,
event,
tx,
eventIndex
};
const eventFragment = contractInterface.getEvent(eventSignature);
// Create ethereum event to be passed to the wasm event handler.
console.time(`time:graph-watcher#handleEvent-createEvent-block-${block.number}-event-${eventSignature}`);
const ethereumEvent = await createEvent(instanceExports, contract, data);
console.timeEnd(`time:graph-watcher#handleEvent-createEvent-block-${block.number}-event-${eventSignature}`);
try {
console.time(`time:graph-watcher#handleEvent-exec-${dataSource.name}-event-handler-${eventSignature}`);
await this._handleMemoryError(instanceExports[eventHandler.handler](ethereumEvent), dataSource.name);
console.timeEnd(`time:graph-watcher#handleEvent-exec-${dataSource.name}-event-handler-${eventSignature}`);
} catch (error) {
this._clearCachedEntities();
throw error;
const tx = this._getTransactionData(txHash, extraData.ethFullTransactions);
const data = {
block: blockData,
inputs: eventFragment.inputs,
event,
tx,
eventIndex
};
// Create ethereum event to be passed to the wasm event handler.
console.time(`time:graph-watcher#handleEvent-createEvent-block-${block.number}-event-${eventSignature}`);
const ethereumEvent = await createEvent(instanceExports, contract, data);
console.timeEnd(`time:graph-watcher#handleEvent-createEvent-block-${block.number}-event-${eventSignature}`);
try {
console.time(`time:graph-watcher#handleEvent-exec-${dataSource.name}-event-handler-${eventSignature}`);
await this._handleMemoryError(instanceExports[eventHandler.handler](ethereumEvent), dataSource.name);
console.timeEnd(`time:graph-watcher#handleEvent-exec-${dataSource.name}-event-handler-${eventSignature}`);
} catch (error) {
this._clearCachedEntities();
throw error;
}
}
}
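The handler-matching comments above note that subgraph yaml signatures carry an `indexed` keyword that ethers.js does not accept in a bare signature. The normalization can be illustrated standalone:

```typescript
// Sketch of the event-signature normalization described above: stripping the
// `indexed` keyword makes the subgraph yaml signature hash to the same event
// topic that ethers.js derives from the log description.
function normalizeSubgraphEventSignature (signature: string): string {
  return signature.replace(/indexed /g, '');
}

console.log(normalizeSubgraphEventSignature('Stake(indexed address,uint256)'));
// 'Stake(address,uint256)'
```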
@@ -291,6 +317,7 @@ export class GraphWatcher {
for (const contractAddress of contractAddressList) {
this._context.contractAddress = contractAddress;
this._context.dataSourceName = dataSource.name;
// Call all the block handlers one after another for a contract.
const blockHandlerPromises = dataSource.mapping.blockHandlers.map(async (blockHandler: any): Promise<void> => {
@@ -316,13 +343,15 @@
id: string,
relationsMap: Map<any, { [key: string]: any }>,
block: BlockHeight,
selections: ReadonlyArray<SelectionNode> = []
queryInfo: GraphQLResolveInfo
): Promise<any> {
const dbTx = await this._database.createTransactionRunner();
try {
const selections = this._getSelectionsFromGQLInfo(queryInfo);
// Get entity from the database.
const result = await this._database.getEntityWithRelations(dbTx, entity, id, relationsMap, block, selections);
const result = await this._database.getEntityWithRelations(dbTx, entity, id, relationsMap, block, selections, queryInfo);
await dbTx.commitTransaction();
// Resolve any field name conflicts in the entity result.
@@ -341,19 +370,21 @@
block: BlockHeight,
where: { [key: string]: any } = {},
queryOptions: QueryOptions,
selections: ReadonlyArray<SelectionNode> = []
queryInfo: GraphQLResolveInfo
): Promise<any> {
const dbTx = await this._database.createTransactionRunner();
try {
where = this._buildFilter(where);
where = this._database.buildFilter(where);
if (!queryOptions.limit) {
queryOptions.limit = DEFAULT_LIMIT;
}
const selections = this._getSelectionsFromGQLInfo(queryInfo);
// Get entities from the database.
const entities = await this._database.getEntities(dbTx, entity, relationsMap, block, where, queryOptions, selections);
const entities = await this._database.getEntities(dbTx, entity, relationsMap, block, where, queryOptions, selections, queryInfo);
await dbTx.commitTransaction();
return entities;
@@ -479,74 +510,12 @@ export class GraphWatcher {
return transaction;
}
_buildFilter (where: { [key: string]: any } = {}): Where {
return Object.entries(where).reduce((acc: Where, [fieldWithSuffix, value]) => {
if (fieldWithSuffix === FILTER_CHANGE_BLOCK) {
assert(value.number_gte && typeof value.number_gte === 'number');
_getSelectionsFromGQLInfo (queryInfo: GraphQLResolveInfo): readonly SelectionNode[] {
const [fieldNode] = queryInfo.fieldNodes;
const selectionSet = fieldNode.selectionSet;
assert(selectionSet, `selectionSet not present in GQL fieldNode ${fieldNode.name}`);
acc[FILTER_CHANGE_BLOCK] = [{
value: value.number_gte,
not: false
}];
return acc;
}
if (['and', 'or'].includes(fieldWithSuffix)) {
assert(Array.isArray(value));
// Parse all the combinations given in the array
acc[fieldWithSuffix] = value.map(w => {
return this._buildFilter(w);
});
return acc;
}
const [field, ...suffix] = fieldWithSuffix.split('_');
if (!acc[field]) {
acc[field] = [];
}
let op = suffix.shift();
// If op is "" (different from undefined), it means it's a nested filter on a relation field
if (op === '') {
(acc[field] as Filter[]).push({
// Parse nested filter value
value: this._buildFilter(value),
not: false,
operator: 'nested'
});
return acc;
}
const filter: Filter = {
value,
not: false,
operator: 'equals'
};
if (op === 'not') {
filter.not = true;
op = suffix.shift();
}
if (op) {
filter.operator = op as keyof typeof OPERATOR_MAP;
}
// If the filter field ends with "nocase", use the case-insensitive version of the operator
if (suffix[suffix.length - 1] === 'nocase') {
filter.operator = `${op}_nocase` as keyof typeof OPERATOR_MAP;
}
(acc[field] as Filter[]).push(filter);
return acc;
}, {});
return selectionSet.selections;
}
}
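The `_buildFilter` logic removed above (now handled by `database.buildFilter`) parses operator suffixes out of GraphQL where-filter keys. A self-contained sketch of just the key parsing, omitting nested relation filters, and/or combinations, and the change-block filter:

```typescript
// Sketch of the where-filter key parsing from the removed _buildFilter above:
// a key like "amount_not_gt" splits into field "amount", a negation flag, and
// an operator. The operator names are assumed to match OPERATOR_MAP.
function parseFilterKey (fieldWithSuffix: string): { field: string; not: boolean; operator: string } {
  const [field, ...suffix] = fieldWithSuffix.split('_');
  let not = false;
  let op = suffix.shift();

  if (op === 'not') {
    not = true;
    op = suffix.shift();
  }

  let operator = op ?? 'equals';

  // Keys ending in "nocase" select the case-insensitive operator variant
  if (suffix[suffix.length - 1] === 'nocase') {
    operator = `${op}_nocase`;
  }

  return { field, not, operator };
}

console.log(parseFilterKey('amount_not_gt')); // { field: 'amount', not: true, operator: 'gt' }
console.log(parseFilterKey('name_contains_nocase')); // { field: 'name', not: false, operator: 'contains_nocase' }
```

A bare field name with no suffix (e.g. `id`) falls through to the default `equals` operator.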

View File

@@ -11,6 +11,6 @@
},
"dependencies": {
"@graphprotocol/graph-ts": "npm:@cerc-io/graph-ts@0.27.0-watcher-ts-0.1.3",
"@cerc-io/graph-cli": "0.32.0-watcher-ts-0.1.3"
"@cerc-io/graph-cli": "0.32.0-watcher-ts-0.1.4"
}
}

View File

@@ -30,10 +30,10 @@
dependencies:
regenerator-runtime "^0.14.0"
"@cerc-io/graph-cli@0.32.0-watcher-ts-0.1.3":
version "0.32.0-watcher-ts-0.1.3"
resolved "https://git.vdb.to/api/packages/cerc-io/npm/%40cerc-io%2Fgraph-cli/-/0.32.0-watcher-ts-0.1.3/graph-cli-0.32.0-watcher-ts-0.1.3.tgz#6dd4b3b84f17f3defe274d35a5075173431864d8"
integrity sha512-21W1qGIn8DWKovmQRSs4zPYQJ4gOiAZ4tY4UnCJrOExdZuYgt0ts/gV3RBI+K8L876iDpzgMsW9Get0IhPSxfg==
"@cerc-io/graph-cli@0.32.0-watcher-ts-0.1.4":
version "0.32.0-watcher-ts-0.1.4"
resolved "https://git.vdb.to/api/packages/cerc-io/npm/%40cerc-io%2Fgraph-cli/-/0.32.0-watcher-ts-0.1.4/graph-cli-0.32.0-watcher-ts-0.1.4.tgz#7634f79a58f733055409888b27e34baa0aef209e"
integrity sha512-KNd2YLeLeEhxRFhahqWPTzH9ig8dkANASa6/fI2+KJVnKJQFpdo1qN2VWJEHG1tSZ6itUIMU9xXpJvys77JpBg==
dependencies:
assemblyscript "0.19.10"
binary-install-raw "0.0.13"
@@ -44,7 +44,7 @@
dockerode "2.5.8"
fs-extra "9.0.0"
glob "7.1.6"
gluegun "https://github.com/edgeandnode/gluegun#v4.3.1-pin-colors-dep"
gluegun "5.2.0"
graphql "15.5.0"
immutable "3.8.2"
ipfs-http-client "40.0.0"
@@ -381,15 +381,15 @@ ajv@^6.12.3:
json-schema-traverse "^0.4.1"
uri-js "^4.2.2"
ansi-colors@^3.2.1:
version "3.2.4"
resolved "https://registry.yarnpkg.com/ansi-colors/-/ansi-colors-3.2.4.tgz#e3a3da4bfbae6c86a9c285625de124a234026fbf"
integrity sha512-hHUXGagefjN2iRrID63xckIvotOXOojhQKWIPUZ4mNUZ9nLZW+7FMNoE1lOkEhNWYsx/7ysGIuJYCiMAA9FnrA==
ansi-colors@^4.1.1:
version "4.1.3"
resolved "https://registry.yarnpkg.com/ansi-colors/-/ansi-colors-4.1.3.tgz#37611340eb2243e70cc604cad35d63270d48781b"
integrity sha512-/6w/C21Pm1A7aZitlI5Ni/2J6FFQN8i1Cvz3kHABAAbw93v/NlvKdVOqz7CCWz/3iv/JplRSEEZ83XION15ovw==
ansi-regex@^3.0.0:
version "3.0.1"
resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-3.0.1.tgz#123d6479e92ad45ad897d4054e3c7ca7db4944e1"
integrity sha512-+O9Jct8wf++lXxxFc4hc8LsjaSq0HFzzL7cVsw8pRDIPdjKD2mT4ytDZlLuSBZ4cLKZFXIrMGO7DbQCtMJJMKw==
ansi-regex@^4.1.0:
version "4.1.1"
resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-4.1.1.tgz#164daac87ab2d6f6db3a29875e2d1766582dabed"
integrity sha512-ILlv4k/3f6vfQ4OoP2AGvirOktlQ98ZEL1k9FaQjxa3L1abBgbuTDAdPOpvbGncC0BTVQrl+OM8xZGK6tWXt7g==
ansi-regex@^5.0.1:
version "5.0.1"
@@ -418,13 +418,12 @@ anymatch@~3.1.1:
normalize-path "^3.0.0"
picomatch "^2.0.4"
apisauce@^1.0.1:
version "1.1.5"
resolved "https://registry.yarnpkg.com/apisauce/-/apisauce-1.1.5.tgz#31d41a5cf805e401266cec67faf1a50f4aeae234"
integrity sha512-gKC8qb/bDJsPsnEXLZnXJ7gVx7dh87CEVNeIwv1dvaffnXoh5GHwac5pWR1P2broLiVj/fqFMQvLDDt/RhjiqA==
apisauce@^2.1.5:
version "2.1.6"
resolved "https://registry.yarnpkg.com/apisauce/-/apisauce-2.1.6.tgz#94887f335bf3d735305fc895c8a191c9c2608a7f"
integrity sha512-MdxR391op/FucS2YQRfB/NMRyCnHEPDd4h17LRIuVYi0BpGmMhpxc0shbOpfs5ahABuBEffNCGal5EcsydbBWg==
dependencies:
axios "^0.21.2"
ramda "^0.25.0"
axios "^0.21.4"
app-module-path@^2.2.0:
version "2.2.0"
@@ -498,6 +497,11 @@ async@^2.6.1, async@^2.6.2, async@^2.6.3:
dependencies:
lodash "^4.17.14"
async@^3.2.6:
version "3.2.6"
resolved "https://registry.yarnpkg.com/async/-/async-3.2.6.tgz#1b0728e14929d51b85b449b7f06e27c1145e38ce"
integrity sha512-htCUDlxyyCLMgaM3xXg0C0LW2xqfuQ6p05pCEIsXuyQ+a1koYKTuBMzRNwmybfLgvJDMd0r1LTn4+E0Ti6C2AA==
asynckit@^0.4.0:
version "0.4.0"
resolved "https://registry.yarnpkg.com/asynckit/-/asynckit-0.4.0.tgz#c79ed97f7f34cb8f2ba1bc9790bcc366474b4b79"
@@ -518,7 +522,7 @@ aws4@^1.8.0:
resolved "https://registry.yarnpkg.com/aws4/-/aws4-1.11.0.tgz#d61f46d83b2519250e2784daf5b09479a8b41c59"
integrity sha512-xh1Rl34h6Fi1DC2WWKfxUTVqRsNnr6LsKz2+hfwDxQJWmrx8+c7ylaqBMcHfl1U1r2dsifOvKX3LQuLNZ+XSvA==
axios@^0.21.1, axios@^0.21.2:
axios@^0.21.1, axios@^0.21.4:
version "0.21.4"
resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.4.tgz#c67b90dc0568e5c1cf2b0b858c43ba28e2eda575"
integrity sha512-ut5vewkiu8jjGBdqpM44XxjuCjq9LAKeHVmoVfHVzy8eHgxxq8SbAVQNovDA8mVi05kP0Ea/n/UzcSHcTJQfNg==
@@ -645,6 +649,13 @@ brace-expansion@^1.1.7:
balanced-match "^1.0.0"
concat-map "0.0.1"
brace-expansion@^2.0.1:
version "2.0.2"
resolved "https://registry.yarnpkg.com/brace-expansion/-/brace-expansion-2.0.2.tgz#54fc53237a613d854c7bd37463aad17df87214e7"
integrity sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ==
dependencies:
balanced-match "^1.0.0"
braces@~3.0.2:
version "3.0.2"
resolved "https://registry.yarnpkg.com/braces/-/braces-3.0.2.tgz#3454e1a462ee8d599e236df336cd9ea4f8afe107"
@@ -748,17 +759,12 @@ callsites@^3.0.0:
resolved "https://registry.yarnpkg.com/callsites/-/callsites-3.1.0.tgz#b3630abd8943432f54b3f0519238e33cd7df2f73"
integrity sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==
camelcase@^5.0.0:
version "5.3.1"
resolved "https://registry.yarnpkg.com/camelcase/-/camelcase-5.3.1.tgz#e3c9b31569e106811df242f715725a1f4c494320"
integrity sha512-L28STB170nwWS63UjtlEOE3dldQApaJXZkOI1uMFfzf3rRuPegHaHesyee+YxQ+W6SvRDQV6UrdOdRiR153wJg==
caseless@^0.12.0, caseless@~0.12.0:
version "0.12.0"
resolved "https://registry.yarnpkg.com/caseless/-/caseless-0.12.0.tgz#1b681c21ff84033c826543090689420d187151dc"
integrity sha1-G2gcIf+EAzyCZUMJBolCDRhxUdw=
chalk@3.0.0, chalk@^3.0.0:
chalk@3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/chalk/-/chalk-3.0.0.tgz#3f73c2bf526591f574cc492c51e2456349f844e4"
integrity sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==
@@ -847,13 +853,13 @@ cli-spinners@^2.2.0:
resolved "https://registry.yarnpkg.com/cli-spinners/-/cli-spinners-2.6.1.tgz#adc954ebe281c37a6319bfa401e6dd2488ffb70d"
integrity sha512-x/5fWmGMnbKQAaNwN+UZlV79qBLM9JFnJuJ03gIi5whrob0xV0ofNVHy9DhwGdsMJQc2OKv0oGmLzvaqvAVv+g==
cli-table3@~0.5.0:
version "0.5.1"
resolved "https://registry.yarnpkg.com/cli-table3/-/cli-table3-0.5.1.tgz#0252372d94dfc40dbd8df06005f48f31f656f202"
integrity sha512-7Qg2Jrep1S/+Q3EceiZtQcDPWxhAvBw+ERf1162v4sikJrvojMHFqXt8QIVha8UlH9rgU0BeWPytZ9/TzYqlUw==
cli-table3@0.6.0:
version "0.6.0"
resolved "https://registry.yarnpkg.com/cli-table3/-/cli-table3-0.6.0.tgz#b7b1bc65ca8e7b5cef9124e13dc2b21e2ce4faee"
integrity sha512-gnB85c3MGC7Nm9I/FkiasNBOKjOiO1RNuXXarQms37q4QMpWdlbBgD/VnOStA2faG1dpXMv31RFApjX1/QdgWQ==
dependencies:
object-assign "^4.1.0"
string-width "^2.1.1"
string-width "^4.2.0"
optionalDependencies:
colors "^1.1.2"
@@ -886,12 +892,7 @@ color-name@~1.1.4:
resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.4.tgz#c2a09a87acbde69543de6f63fa3995c826c536a2"
integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==
colors@1.3.3:
version "1.3.3"
resolved "https://registry.yarnpkg.com/colors/-/colors-1.3.3.tgz#39e005d546afe01e01f9c4ca8fa50f686a01205d"
integrity sha512-mmGt/1pZqYRjMxB1axhTo16/snVZ5krrKkcmMeVKxzECMMXoCgnvTPp10QgHfcbQZw8Dq2jMNG6je4JlWU0gWg==
colors@^1.1.2:
colors@1.4.0, colors@^1.1.2:
version "1.4.0"
resolved "https://registry.yarnpkg.com/colors/-/colors-1.4.0.tgz#c50491479d4c1bdaed2c9ced32cf7c7dc2360f78"
integrity sha512-a+UqTh4kgZg/SlGvfbzDHpgRu7AAQOmmqRHJnxhRZICKFUT91brVhNNt58CMWU9PsBbv3PDCZUHbVxuDiH2mtA==
@@ -933,16 +934,16 @@ core-util-is@~1.0.0:
resolved "https://registry.yarnpkg.com/core-util-is/-/core-util-is-1.0.3.tgz#a6042d3634c2b27e9328f837b965fac83808db85"
integrity sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==
cosmiconfig@6.0.0:
version "6.0.0"
resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-6.0.0.tgz#da4fee853c52f6b1e6935f41c1a2fc50bd4a9982"
integrity sha512-xb3ZL6+L8b9JLLCx3ZdoZy4+2ECphCMo2PwqgP1tlfVq6M6YReyzBJtvWWtbDSpNr9hn96pkCiZqUcFEc+54Qg==
cosmiconfig@7.0.1:
version "7.0.1"
resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-7.0.1.tgz#714d756522cace867867ccb4474c5d01bbae5d6d"
integrity sha512-a1YWNUV2HwGimB7dU2s1wUMurNKjpx60HxBB6xUM8Re+2s1g1IIfJvFR0/iCF+XHdE0GMTKTuLR32UQff4TEyQ==
dependencies:
"@types/parse-json" "^4.0.0"
import-fresh "^3.1.0"
import-fresh "^3.2.1"
parse-json "^5.0.0"
path-type "^4.0.0"
yaml "^1.7.2"
yaml "^1.10.0"
create-hash@^1.1.0, create-hash@^1.1.2, create-hash@^1.2.0:
version "1.2.0"
@@ -967,7 +968,7 @@ create-hmac@^1.1.4, create-hmac@^1.1.7:
safe-buffer "^5.0.1"
sha.js "^2.4.8"
cross-spawn@^7.0.0:
cross-spawn@7.0.3:
version "7.0.3"
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.3.tgz#f73a85b9d5d41d045551c177e2882d4ac85728a6"
integrity sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==
@@ -976,6 +977,15 @@ cross-spawn@^7.0.0:
shebang-command "^2.0.0"
which "^2.0.1"
cross-spawn@^7.0.3:
version "7.0.6"
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.6.tgz#8a58fe78f00dcd70c370451759dfbfaf03e8ee9f"
integrity sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==
dependencies:
path-key "^3.1.0"
shebang-command "^2.0.0"
which "^2.0.1"
dashdash@^1.12.0:
version "1.14.1"
resolved "https://registry.yarnpkg.com/dashdash/-/dashdash-1.14.1.tgz#853cfa0f7cbe2fed5de20326b8dd581035f6e2f0"
@@ -1004,11 +1014,6 @@ debug@^4.1.0:
dependencies:
ms "2.1.2"
decamelize@^1.2.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/decamelize/-/decamelize-1.2.0.tgz#f6534d15148269b20352e7bee26f501f9a191290"
integrity sha1-9lNNFRSCabIDUue+4m9QH5oZEpA=
defaults@^1.0.3:
version "1.0.3"
resolved "https://registry.yarnpkg.com/defaults/-/defaults-1.0.3.tgz#c656051e9817d9ff08ed881477f3fe4019f3ef7d"
@@ -1081,10 +1086,12 @@ ecc-jsbn@~0.1.1:
jsbn "~0.1.0"
safer-buffer "^2.1.0"
ejs@^2.6.1:
version "2.7.4"
resolved "https://registry.yarnpkg.com/ejs/-/ejs-2.7.4.tgz#48661287573dcc53e366c7a1ae52c3a120eec9ba"
integrity sha512-7vmuyh5+kuUyJKePhQfRQBhXV5Ce+RnaeeQArKu1EAMpL3WbgMt5WG6uQZpEVvYSSsxMXRKOewtDk9RaTKXRlA==
ejs@3.1.8:
version "3.1.8"
resolved "https://registry.yarnpkg.com/ejs/-/ejs-3.1.8.tgz#758d32910c78047585c7ef1f92f9ee041c1c190b"
integrity sha512-/sXZeMlhS0ArkfX2Aw780gJzXSMPnKjtspYZv+f3NiKLlubezAHDU5+9xz6gd3/NhG3txQCo6xlglmTS+oTGEQ==
dependencies:
jake "^10.8.5"
elliptic@6.5.4, elliptic@^6.5.2, elliptic@^6.5.4:
version "6.5.4"
@@ -1099,6 +1106,11 @@ elliptic@6.5.4, elliptic@^6.5.2, elliptic@^6.5.4:
minimalistic-assert "^1.0.1"
minimalistic-crypto-utils "^1.0.1"
emoji-regex@^8.0.0:
version "8.0.0"
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-8.0.0.tgz#e818fd69ce5ccfcb404594f842963bf53164cc37"
integrity sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==
end-of-stream@^1.0.0, end-of-stream@^1.1.0:
version "1.4.4"
resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.4.tgz#5ae64a5f45057baf3626ec14da0ca5e4b2431eb0"
@@ -1106,12 +1118,12 @@ end-of-stream@^1.0.0, end-of-stream@^1.1.0:
dependencies:
once "^1.4.0"
enquirer@2.3.4:
version "2.3.4"
resolved "https://registry.yarnpkg.com/enquirer/-/enquirer-2.3.4.tgz#c608f2e1134c7f68c1c9ee056de13f9b31076de9"
integrity sha512-pkYrrDZumL2VS6VBGDhqbajCM2xpkUNLuKfGPjfKaSIBKYopQbqEFyrOkRMIb2HDR/rO1kGhEt/5twBwtzKBXw==
enquirer@2.3.6:
version "2.3.6"
resolved "https://registry.yarnpkg.com/enquirer/-/enquirer-2.3.6.tgz#2a7fe5dd634a1e4125a975ec994ff5456dc3734d"
integrity sha512-yjNnPr315/FjS4zIsUxYguYUPP2e1NK4d7E7ZOLiyYCcbFBiTMyID+2wvm2w6+pZ/odMA7cRkjhsPbltwBOrLg==
dependencies:
ansi-colors "^3.2.1"
ansi-colors "^4.1.1"
err-code@^2.0.0:
version "2.0.3"
@@ -1207,20 +1219,19 @@ evp_bytestokey@^1.0.3:
md5.js "^1.3.4"
safe-buffer "^5.1.1"
execa@^3.0.0:
version "3.4.0"
resolved "https://registry.yarnpkg.com/execa/-/execa-3.4.0.tgz#c08ed4550ef65d858fac269ffc8572446f37eb89"
integrity sha512-r9vdGQk4bmCuK1yKQu1KTwcT2zwfWdbdaXfCtAh+5nU/4fSX+JAb7vZGvI5naJrQlvONrEB20jeruESI69530g==
execa@5.1.1:
version "5.1.1"
resolved "https://registry.yarnpkg.com/execa/-/execa-5.1.1.tgz#f80ad9cbf4298f7bd1d4c9555c21e93741c411dd"
integrity sha512-8uSpZZocAZRBAPIEINJj3Lo9HyGitllczc27Eh5YYojjMFMn8yHMDMaUHE2Jqfq05D/wucwI4JGURyXt1vchyg==
dependencies:
cross-spawn "^7.0.0"
get-stream "^5.0.0"
human-signals "^1.1.1"
cross-spawn "^7.0.3"
get-stream "^6.0.0"
human-signals "^2.1.0"
is-stream "^2.0.0"
merge-stream "^2.0.0"
npm-run-path "^4.0.0"
onetime "^5.1.0"
p-finally "^2.0.0"
signal-exit "^3.0.2"
npm-run-path "^4.0.1"
onetime "^5.1.2"
signal-exit "^3.0.3"
strip-final-newline "^2.0.0"
explain-error@^1.0.4:
@@ -1268,6 +1279,13 @@ file-uri-to-path@1.0.0:
resolved "https://registry.yarnpkg.com/file-uri-to-path/-/file-uri-to-path-1.0.0.tgz#553a7b8446ff6f684359c445f1e37a05dacc33dd"
integrity sha512-0Zt+s3L7Vf1biwWZ29aARiVYLx7iMGnEUl9x33fbB/j3jR81u/O2LbqK+Bm1CDSNDKVtJ/YjwY7TUd5SkeLQLw==
filelist@^1.0.4:
version "1.0.4"
resolved "https://registry.yarnpkg.com/filelist/-/filelist-1.0.4.tgz#f78978a1e944775ff9e62e744424f215e58352b5"
integrity sha512-w1cEuf3S+DrLCQL7ET6kz+gmlJdbq9J7yXCSjK/OZCPA+qEN1WyF4ZAf0YYJa4/shHJra2t/d/r8SV4Ji+x+8Q==
dependencies:
minimatch "^5.0.1"
fill-range@^7.0.1:
version "7.0.1"
resolved "https://registry.yarnpkg.com/fill-range/-/fill-range-7.0.1.tgz#1919a6a7c75fe38b2c7c77e5198535da9acdda40"
@@ -1336,10 +1354,10 @@ fs-extra@^8.1.0:
jsonfile "^4.0.0"
universalify "^0.1.0"
fs-jetpack@^2.2.2:
version "2.4.0"
resolved "https://registry.yarnpkg.com/fs-jetpack/-/fs-jetpack-2.4.0.tgz#6080c4ab464a019d37a404baeb47f32af8835026"
integrity sha512-S/o9Dd7K9A7gicVU32eT8G0kHcmSu0rCVdP79P0MWInKFb8XpTc8Syhoo66k9no+HDshtlh4pUJTws8X+8fdFQ==
fs-jetpack@4.3.1:
version "4.3.1"
resolved "https://registry.yarnpkg.com/fs-jetpack/-/fs-jetpack-4.3.1.tgz#cdfd4b64e6bfdec7c7dc55c76b39efaa7853bb20"
integrity sha512-dbeOK84F6BiQzk2yqqCVwCPWTxAvVGJ3fMQc6E2wuEohS28mR6yHngbrKuVCK1KHRx/ccByDylqu4H5PCP2urQ==
dependencies:
minimatch "^3.0.2"
rimraf "^2.6.3"
@@ -1386,12 +1404,10 @@ get-port@^3.1.0:
resolved "https://registry.yarnpkg.com/get-port/-/get-port-3.2.0.tgz#dd7ce7de187c06c8bf353796ac71e099f0980ebc"
integrity sha512-x5UJKlgeUiNT8nyo/AcnwLnZuZNcSjSw0kogRB+Whd1fjjFq4B1hySFxSFWWSn4mIBzg3sRNUDFYc4g5gjPoLg==
get-stream@^5.0.0:
version "5.2.0"
resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-5.2.0.tgz#4966a1795ee5ace65e706c4b7beb71257d6e22d3"
integrity sha512-nBF+F1rAZVCu/p7rjzgA+Yb4lfYXrpl7a6VmJrU8wF9I1CKvP/QwPNZHnOlwbTkY6dvtFIzFMSyQXbLoTQPRpA==
dependencies:
pump "^3.0.0"
get-stream@^6.0.0:
version "6.0.1"
resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-6.0.1.tgz#a262d8eef67aced57c2852ad6167526a43cbf7b7"
integrity sha512-ts6Wi+2j3jQjqi70w5AlN8DFnkSwC+MqmxEzdEALB2qXZYV3X/b1CTfgPLGJNMeAWxdPfU8FO1ms3NUfaHCPYg==
getpass@^0.1.1:
version "0.1.7"
@@ -1431,20 +1447,21 @@ glob@^7.1.3:
once "^1.3.0"
path-is-absolute "^1.0.0"
"gluegun@git+https://github.com/edgeandnode/gluegun.git#v4.3.1-pin-colors-dep":
version "4.3.1"
resolved "git+https://github.com/edgeandnode/gluegun.git#b34b9003d7bf556836da41b57ef36eb21570620a"
gluegun@5.2.0:
version "5.2.0"
resolved "https://registry.yarnpkg.com/gluegun/-/gluegun-5.2.0.tgz#88ba1f76f20e68a135557a4a4c8ea283291a7491"
integrity sha512-jSUM5xUy2ztYFQANne17OUm/oAd7qSX7EBksS9bQDt9UvLPqcEkeWUebmaposb8Tx7eTTD8uJVWGRe6PYSsYkg==
dependencies:
apisauce "^1.0.1"
apisauce "^2.1.5"
app-module-path "^2.2.0"
cli-table3 "~0.5.0"
colors "1.3.3"
cosmiconfig "6.0.0"
cross-spawn "^7.0.0"
ejs "^2.6.1"
enquirer "2.3.4"
execa "^3.0.0"
fs-jetpack "^2.2.2"
cli-table3 "0.6.0"
colors "1.4.0"
cosmiconfig "7.0.1"
cross-spawn "7.0.3"
ejs "3.1.8"
enquirer "2.3.6"
execa "5.1.1"
fs-jetpack "4.3.1"
lodash.camelcase "^4.3.0"
lodash.kebabcase "^4.1.1"
lodash.lowercase "^4.3.0"
@@ -1460,12 +1477,11 @@ glob@^7.1.3:
lodash.trimstart "^4.5.1"
lodash.uppercase "^4.3.0"
lodash.upperfirst "^4.3.1"
ora "^4.0.0"
ora "4.0.2"
pluralize "^8.0.0"
ramdasauce "^2.1.0"
semver "^7.0.0"
which "^2.0.0"
yargs-parser "^16.1.0"
semver "7.3.5"
which "2.0.2"
yargs-parser "^21.0.0"
gopd@^1.0.1:
version "1.0.1"
@@ -1588,10 +1604,10 @@ http-signature@~1.2.0:
jsprim "^1.2.2"
sshpk "^1.7.0"
human-signals@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/human-signals/-/human-signals-1.1.1.tgz#c5b1cd14f50aeae09ab6c59fe63ba3395fe4dfa3"
integrity sha512-SEQu7vl8KjNL2eoGBLF3+wAjpsNfA9XMlXAYj/3EdaNfAlxKthD1xjEQfGOUhllCGGJVNY34bRr6lPINhNjyZw==
human-signals@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/human-signals/-/human-signals-2.1.0.tgz#dc91fcba42e4d06e4abaed33b3e7a3c02f514ea0"
integrity sha512-B4FFZ6q/T2jhhksgkbEW3HBvWIfDW85snkQgawt07S7J5QXTk6BkNV+0yAeZrM5QpMAdYlocGoljn0sJ/WQkFw==
ieee754@^1.1.13, ieee754@^1.2.1:
version "1.2.1"
@ -1603,10 +1619,10 @@ immutable@3.8.2:
resolved "https://registry.yarnpkg.com/immutable/-/immutable-3.8.2.tgz#c2439951455bb39913daf281376f1530e104adf3"
integrity sha512-15gZoQ38eYjEjxkorfbcgBKBL6R7T459OuK+CpcWt7O3KF4uPCx2tD0uFETlUDIyo+1789crbMhTvQBSR5yBMg==
import-fresh@^3.1.0:
version "3.3.0"
resolved "https://registry.yarnpkg.com/import-fresh/-/import-fresh-3.3.0.tgz#37162c25fcb9ebaa2e6e53d5b4d88ce17d9e0c2b"
integrity sha512-veYYhQa+D1QBKznvhUHxb8faxlrwUnxseDAbAp457E0wLNio2bOSKnjYDhMj+YiAq61xrMGhQk9iXVk5FzgQMw==
import-fresh@^3.2.1:
version "3.3.1"
resolved "https://registry.yarnpkg.com/import-fresh/-/import-fresh-3.3.1.tgz#9cecb56503c0ada1f2741dbbd6546e4b13b57ccf"
integrity sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==
dependencies:
parent-module "^1.0.0"
resolve-from "^4.0.0"
@@ -1771,10 +1787,10 @@ is-extglob@^2.1.1:
resolved "https://registry.yarnpkg.com/is-extglob/-/is-extglob-2.1.1.tgz#a88c02535791f02ed37c76a1b9ea9773c833f8c2"
integrity sha1-qIwCU1eR8C7TfHahueqXc8gz+MI=
is-fullwidth-code-point@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/is-fullwidth-code-point/-/is-fullwidth-code-point-2.0.0.tgz#a3b30a5c4f199183167aaab93beefae3ddfb654f"
integrity sha1-o7MKXE8ZkYMWeqq5O+764937ZU8=
is-fullwidth-code-point@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz#f116f8064fe90b3f7844a38997c0b75051269f1d"
integrity sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==
is-glob@^4.0.1, is-glob@~4.0.1:
version "4.0.3"
@@ -1953,6 +1969,15 @@ iterable-ndjson@^1.1.0:
dependencies:
string_decoder "^1.2.0"
jake@^10.8.5:
version "10.9.4"
resolved "https://registry.yarnpkg.com/jake/-/jake-10.9.4.tgz#d626da108c63d5cfb00ab5c25fadc7e0084af8e6"
integrity sha512-wpHYzhxiVQL+IV05BLE2Xn34zW1S223hvjtqk0+gsPrwd/8JNLXJgZZM/iPFsYc1xyphF+6M6EvdE5E9MBGkDA==
dependencies:
async "^3.2.6"
filelist "^1.0.4"
picocolors "^1.1.1"
jayson@3.6.6:
version "3.6.6"
resolved "https://registry.yarnpkg.com/jayson/-/jayson-3.6.6.tgz#189984f624e398f831bd2be8e8c80eb3abf764a1"
@@ -2294,6 +2319,13 @@ minimatch@^3.0.2, minimatch@^3.0.4:
dependencies:
brace-expansion "^1.1.7"
minimatch@^5.0.1:
version "5.1.6"
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-5.1.6.tgz#1cfcb8cf5522ea69952cd2af95ae09477f122a96"
integrity sha512-lKwV/1brpG6mBUFHtb7NUmtABCb2WZZmm2wNiOA5hAb8VdCS4B3dtMWyvcoViccwAW/COERjXLt0zP1zXUN26g==
dependencies:
brace-expansion "^2.0.1"
minimist@^1.2.6:
version "1.2.6"
resolved "https://registry.yarnpkg.com/minimist/-/minimist-1.2.6.tgz#8637a5b759ea0d6e98702cfb3a9283323c93af44"
@@ -2468,11 +2500,6 @@ murmurhash3js@^3.0.1:
resolved "https://registry.yarnpkg.com/murmurhash3js/-/murmurhash3js-3.0.1.tgz#3e983e5b47c2a06f43a713174e7e435ca044b998"
integrity sha1-Ppg+W0fCoG9DpxMXTn5DXKBEuZg=
mute-stream@0.0.8:
version "0.0.8"
resolved "https://registry.yarnpkg.com/mute-stream/-/mute-stream-0.0.8.tgz#1630c42b2251ff81e2a283de96a5497ea92e5e0d"
integrity sha512-nnbWWOkoWyUsTjKrhgD0dcz22mdkSnpYqbEjIm2nhwhuxlSkpywJmBo8h0ZqJdkp73mb90SssHkN4rsRaBAfAA==
nan@^2.14.0, nan@^2.14.2:
version "2.15.0"
resolved "https://registry.yarnpkg.com/nan/-/nan-2.15.0.tgz#3f34a473ff18e15c1b5626b62903b5ad6e665fee"
@@ -2518,7 +2545,7 @@ normalize-path@^3.0.0, normalize-path@~3.0.0:
resolved "https://registry.yarnpkg.com/normalize-path/-/normalize-path-3.0.0.tgz#0dcd69ff23a1c9b11fd0978316644a0388216a65"
integrity sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==
npm-run-path@^4.0.0:
npm-run-path@^4.0.1:
version "4.0.1"
resolved "https://registry.yarnpkg.com/npm-run-path/-/npm-run-path-4.0.1.tgz#b7ecd1e5ed53da8e37a55e1c2269e0b97ed748ea"
integrity sha512-S48WzZW777zhNIrn7gxOlISNAqi9ZC/uQFnRdbeIHhZhCA6UqpkOT8T1G7BvfdgP4Er8gF4sUbaS0i7QvIfCWw==
@@ -2560,7 +2587,7 @@ once@^1.3.0, once@^1.3.1, once@^1.4.0:
dependencies:
wrappy "1"
onetime@^5.1.0:
onetime@^5.1.0, onetime@^5.1.2:
version "5.1.2"
resolved "https://registry.yarnpkg.com/onetime/-/onetime-5.1.2.tgz#d0e96ebb56b07476df1dd9c4806e5237985ca45e"
integrity sha512-kbpaSSGJTWdAY5KPVeMOKXSrPtr8C8C7wodJbcsd51jRnmD+GZu8Y0VoU6Dm5Z4vWr0Ig/1NKuWRKf7j5aaYSg==
@@ -2574,18 +2601,17 @@ optimist@~0.3.5:
dependencies:
wordwrap "~0.0.2"
ora@^4.0.0:
version "4.1.1"
resolved "https://registry.yarnpkg.com/ora/-/ora-4.1.1.tgz#566cc0348a15c36f5f0e979612842e02ba9dddbc"
integrity sha512-sjYP8QyVWBpBZWD6Vr1M/KwknSw6kJOz41tvGMlwWeClHBtYKTbHMki1PsLZnxKpXMPbTKv9b3pjQu3REib96A==
ora@4.0.2:
version "4.0.2"
resolved "https://registry.yarnpkg.com/ora/-/ora-4.0.2.tgz#0e1e68fd45b135d28648b27cf08081fa6e8a297d"
integrity sha512-YUOZbamht5mfLxPmk4M35CD/5DuOkAacxlEUbStVXpBAt4fyhBf+vZHI/HRkI++QUp3sNoeA2Gw4C+hi4eGSig==
dependencies:
chalk "^3.0.0"
chalk "^2.4.2"
cli-cursor "^3.1.0"
cli-spinners "^2.2.0"
is-interactive "^1.0.0"
log-symbols "^3.0.0"
mute-stream "0.0.8"
strip-ansi "^6.0.0"
strip-ansi "^5.2.0"
wcwidth "^1.0.1"
p-defer@^3.0.0:
@@ -2601,11 +2627,6 @@ p-fifo@^1.0.0:
fast-fifo "^1.0.0"
p-defer "^3.0.0"
p-finally@^2.0.0:
version "2.0.1"
resolved "https://registry.yarnpkg.com/p-finally/-/p-finally-2.0.1.tgz#bd6fcaa9c559a096b680806f4d657b3f0f240561"
integrity sha512-vpm09aKwq6H9phqRQzecoDpD8TmVyGw70qmWlyq5onxY7tqyTTFVvxMykxQSQKILBSFlbXpypIw2T1Ml7+DDtw==
parent-module@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/parent-module/-/parent-module-1.0.1.tgz#691d2709e78c79fae3a156622452d00762caaaa2"
@@ -2691,6 +2712,11 @@ performance-now@^2.1.0:
resolved "https://registry.yarnpkg.com/performance-now/-/performance-now-2.1.0.tgz#6309f4e0e5fa913ec1c69307ae364b4b377c9e7b"
integrity sha1-Ywn04OX6kT7BxpMHrjZLSzd8nns=
picocolors@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/picocolors/-/picocolors-1.1.1.tgz#3d321af3eab939b083c8f929a1d12cda81c26b6b"
integrity sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==
picomatch@^2.0.4, picomatch@^2.2.1:
version "2.3.1"
resolved "https://registry.yarnpkg.com/picomatch/-/picomatch-2.3.1.tgz#3ba3833733646d9d3e4995946c1365a67fb07a42"
@@ -2775,14 +2801,6 @@ pump@^1.0.0:
end-of-stream "^1.1.0"
once "^1.3.1"
pump@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.0.tgz#b4a2116815bde2f4e1ea602354e8c75565107a64"
integrity sha512-LwZy+p3SFs1Pytd/jYct4wpv49HiYCqd9Rlc5ZVdk0V+8Yzv6jR5Blk3TRmPL1ft69TxP0IMZGJ+WPFU2BFhww==
dependencies:
end-of-stream "^1.1.0"
once "^1.3.1"
punycode@^2.1.0, punycode@^2.1.1:
version "2.1.1"
resolved "https://registry.yarnpkg.com/punycode/-/punycode-2.1.1.tgz#b58b010ac40c22c5657616c8d2c2c02c7bf479ec"
@@ -2800,23 +2818,6 @@ qs@~6.5.2:
resolved "https://registry.yarnpkg.com/qs/-/qs-6.5.3.tgz#3aeeffc91967ef6e35c0e488ef46fb296ab76aad"
integrity sha512-qxXIEh4pCGfHICj1mAJQ2/2XVZkjCDTcEgfoSQxc/fYivUZxTkk7L3bDBJSoNrEzXI17oUO5Dp07ktqE5KzczA==
ramda@^0.24.1:
version "0.24.1"
resolved "https://registry.yarnpkg.com/ramda/-/ramda-0.24.1.tgz#c3b7755197f35b8dc3502228262c4c91ddb6b857"
integrity sha512-HEm619G8PaZMfkqCa23qiOe7r3R0brPu7ZgOsgKUsnvLhd0qhc/vTjkUovomgPWa5ECBa08fJZixth9LaoBo5w==
ramda@^0.25.0:
version "0.25.0"
resolved "https://registry.yarnpkg.com/ramda/-/ramda-0.25.0.tgz#8fdf68231cffa90bc2f9460390a0cb74a29b29a9"
integrity sha512-GXpfrYVPwx3K7RQ6aYT8KPS8XViSXUVJT1ONhoKPE9VAleW42YE+U+8VEyGWt41EnEQW7gwecYJriTI0pKoecQ==
ramdasauce@^2.1.0:
version "2.1.3"
resolved "https://registry.yarnpkg.com/ramdasauce/-/ramdasauce-2.1.3.tgz#acb45ecc7e4fc4d6f39e19989b4a16dff383e9c2"
integrity sha512-Ml3CPim4SKwmg5g9UI77lnRSeKr/kQw7YhQ6rfdMcBYy6DMlwmkEwQqjygJ3OhxPR+NfFfpjKl3Tf8GXckaqqg==
dependencies:
ramda "^0.24.1"
randombytes@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/randombytes/-/randombytes-2.1.0.tgz#df6f84372f0270dc65cdf6291349ab7a473d4f2a"
@@ -3001,13 +3002,6 @@ semver@7.3.5:
dependencies:
lru-cache "^6.0.0"
semver@^7.0.0:
version "7.3.7"
resolved "https://registry.yarnpkg.com/semver/-/semver-7.3.7.tgz#12c5b649afdbf9049707796e22a4028814ce523f"
integrity sha512-QlYTucUYOews+WeEujDoEGziz4K6c47V/Bd+LjSSYcA94p+DmINdf7ncaUinThfvZyu13lN9OY1XDxt8C0Tw0g==
dependencies:
lru-cache "^6.0.0"
set-function-length@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/set-function-length/-/set-function-length-1.1.1.tgz#4bc39fafb0307224a33e106a7d35ca1218d659ed"
@@ -3052,7 +3046,7 @@ side-channel@^1.0.4:
get-intrinsic "^1.0.2"
object-inspect "^1.9.0"
signal-exit@^3.0.2:
signal-exit@^3.0.2, signal-exit@^3.0.3:
version "3.0.7"
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.7.tgz#a9a1767f8af84155114eaabd73f99273c8f59ad9"
integrity sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==
@@ -3094,13 +3088,14 @@ stable@^0.1.8:
resolved "https://registry.yarnpkg.com/stable/-/stable-0.1.8.tgz#836eb3c8382fe2936feaf544631017ce7d47a3cf"
integrity sha512-ji9qxRnOVfcuLDySj9qzhGSEFVobyt1kIOSkj1qZzYLzq7Tos/oUUWvotUPQLlrsidqsK6tBH89Bc9kL5zHA6w==
string-width@^2.1.1:
version "2.1.1"
resolved "https://registry.yarnpkg.com/string-width/-/string-width-2.1.1.tgz#ab93f27a8dc13d28cac815c462143a6d9012ae9e"
integrity sha512-nOqH59deCq9SRHlxq1Aw85Jnt4w6KvLKqWVik6oA9ZklXLNIOlqg4F2yrT1MVaTjAqvVwdfeZ7w7aCvJD7ugkw==
string-width@^4.2.0:
version "4.2.3"
resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
dependencies:
is-fullwidth-code-point "^2.0.0"
strip-ansi "^4.0.0"
emoji-regex "^8.0.0"
is-fullwidth-code-point "^3.0.0"
strip-ansi "^6.0.1"
string_decoder@^1.1.1, string_decoder@^1.2.0:
version "1.3.0"
@@ -3121,14 +3116,14 @@ string_decoder@~1.1.1:
dependencies:
safe-buffer "~5.1.0"
strip-ansi@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-4.0.0.tgz#a8479022eb1ac368a871389b635262c505ee368f"
integrity sha1-qEeQIusaw2iocTibY1JixQXuNo8=
strip-ansi@^5.2.0:
version "5.2.0"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-5.2.0.tgz#8c9a536feb6afc962bdfa5b104a5091c1ad9c0ae"
integrity sha512-DuRs1gKbBqsMKIZlrffwlug8MHkcnpjs5VPmL1PAh+mA30U0DTotfDZ0d2UUsXpPmPmMMJ6W773MaA3J+lbiWA==
dependencies:
ansi-regex "^3.0.0"
ansi-regex "^4.1.0"
strip-ansi@^6.0.0:
strip-ansi@^6.0.1:
version "6.0.1"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
@@ -3405,7 +3400,7 @@ whatwg-url@^5.0.0:
tr46 "~0.0.3"
webidl-conversions "^3.0.0"
which@2.0.2, which@^2.0.0, which@^2.0.1:
which@2.0.2, which@^2.0.1:
version "2.0.2"
resolved "https://registry.yarnpkg.com/which/-/which-2.0.2.tgz#7c6a8dd0a636a0327e10b59c9286eee93f3f51b1"
integrity sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==
@@ -3444,15 +3439,12 @@ yaml@1.9.2:
dependencies:
"@babel/runtime" "^7.9.2"
yaml@^1.7.2:
yaml@^1.10.0:
version "1.10.2"
resolved "https://registry.yarnpkg.com/yaml/-/yaml-1.10.2.tgz#2301c5ffbf12b467de8da2333a459e29e7920e4b"
integrity sha512-r3vXyErRCYJ7wg28yvBY5VSoAF8ZvlcW9/BwUzEtUsjvX/DKs24dIkuwjtuprwJJHsbyUbLApepYTR1BN4uHrg==
yargs-parser@^16.1.0:
version "16.1.0"
resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-16.1.0.tgz#73747d53ae187e7b8dbe333f95714c76ea00ecf1"
integrity sha512-H/V41UNZQPkUMIT5h5hiwg4QKIY1RPvoBV4XcjUbRM8Bk2oKqqyZ0DIEbTFZB0XjbtSPG8SAa/0DxCQmiRgzKg==
dependencies:
camelcase "^5.0.0"
decamelize "^1.2.0"
yargs-parser@^21.0.0:
version "21.1.1"
resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-21.1.1.tgz#9096bceebf990d21bb31fa9516e0ede294a77d35"
integrity sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==


@@ -2,6 +2,7 @@
import assert from 'assert';
import { DeepPartial, FindConditions, FindManyOptions } from 'typeorm';
import { ethers } from 'ethers';
import {
IndexerInterface,
@@ -27,6 +28,8 @@ import { GetStorageAt, getStorageValue, MappingKey, StorageLayout } from '@cerc-
export class Indexer implements IndexerInterface {
_getStorageAt: GetStorageAt;
_storageLayoutMap: Map<string, StorageLayout> = new Map();
_contractMap: Map<string, ethers.utils.Interface> = new Map();
eventSignaturesMap: Map<string, string[]> = new Map();
constructor (ethClient: EthClient, storageLayoutMap?: Map<string, StorageLayout>) {
@@ -49,6 +52,10 @@ export class Indexer implements IndexerInterface {
return this._storageLayoutMap;
}
get contractMap (): Map<string, ethers.utils.Interface> {
return this._contractMap;
}
async init (): Promise<void> {
return undefined;
}
@@ -84,6 +91,12 @@ export class Indexer implements IndexerInterface {
return undefined;
}
async getEvents (options: FindManyOptions<EventInterface>): Promise<Array<EventInterface>> {
assert(options);
return [];
}
async getSyncStatus (): Promise<SyncStatusInterface | undefined> {
return undefined;
}
@@ -94,6 +107,10 @@ export class Indexer implements IndexerInterface {
return undefined;
}
async getBlockByHash (blockHash?: string): Promise<{ block: any }> {
return { block: undefined };
}
async getBlocksAtHeight (height: number, isPruned: boolean): Promise<BlockProgressInterface[]> {
assert(height);
assert(isPruned);
@@ -107,9 +124,9 @@ export class Indexer implements IndexerInterface {
return [];
}
async getAncestorAtDepth (blockHash: string, depth: number): Promise<string> {
async getAncestorAtHeight (blockHash: string, height: number): Promise<string> {
assert(blockHash);
assert(depth);
assert(height);
return '';
}
@@ -247,7 +264,7 @@ export class Indexer implements IndexerInterface {
return undefined;
}
isWatchedContract (address : string): ContractInterface | undefined {
isContractAddressWatched (address : string): ContractInterface[] | undefined {
return undefined;
}
@@ -259,6 +276,10 @@ export class Indexer implements IndexerInterface {
return undefined;
}
async removeContract (address: string, kind: string): Promise<void> {
return undefined;
}
async processBlock (blockProgress: BlockProgressInterface): Promise<void> {
return undefined;
}
@@ -334,4 +355,16 @@ export class Indexer implements IndexerInterface {
async processStateCheckpoint (contractAddress: string, blockHash: string): Promise<boolean> {
return false;
}
async getFullTransactions (txHashList: string[]): Promise<EthFullTransaction[]> {
return [];
}
async switchClients (): Promise<void> {
return undefined;
}
async isGetLogsRequestsSlow (): Promise<boolean> {
return false;
}
}


@@ -12,7 +12,7 @@
/* Language and Environment */
"target": "ES2019", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
// "lib": [], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
"lib": ["ES2021"], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
// "jsx": "preserve", /* Specify what JSX code is generated. */
// "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
// "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */


@@ -1,6 +1,6 @@
{
"name": "@cerc-io/ipld-eth-client",
"version": "0.2.78",
"version": "0.2.110",
"description": "IPLD ETH Client",
"main": "dist/index.js",
"scripts": {
@@ -20,8 +20,8 @@
"homepage": "https://github.com/cerc-io/watcher-ts#readme",
"dependencies": {
"@apollo/client": "^3.7.1",
"@cerc-io/cache": "^0.2.78",
"@cerc-io/util": "^0.2.78",
"@cerc-io/cache": "^0.2.110",
"@cerc-io/util": "^0.2.110",
"cross-fetch": "^3.1.4",
"debug": "^4.3.1",
"ethers": "^5.4.4",


@@ -1,6 +1,6 @@
{
"name": "@cerc-io/peer",
"version": "0.2.78",
"version": "0.2.110",
"description": "libp2p module",
"main": "dist/index.js",
"exports": "./dist/index.js",


@@ -1,6 +1,6 @@
{
"name": "@cerc-io/rpc-eth-client",
"version": "0.2.78",
"version": "0.2.110",
"description": "RPC ETH Client",
"main": "dist/index.js",
"scripts": {
@@ -19,9 +19,9 @@
},
"homepage": "https://github.com/cerc-io/watcher-ts#readme",
"dependencies": {
"@cerc-io/cache": "^0.2.78",
"@cerc-io/ipld-eth-client": "^0.2.78",
"@cerc-io/util": "^0.2.78",
"@cerc-io/cache": "^0.2.110",
"@cerc-io/ipld-eth-client": "^0.2.110",
"@cerc-io/util": "^0.2.110",
"chai": "^4.3.4",
"ethers": "^5.4.4",
"left-pad": "^1.3.0",


@@ -6,12 +6,13 @@ import assert from 'assert';
import { errors, providers, utils } from 'ethers';
import { Cache } from '@cerc-io/cache';
import { encodeHeader, escapeHexString, EthClient as EthClientInterface, EthFullBlock, EthFullTransaction } from '@cerc-io/util';
import {
encodeHeader, escapeHexString,
EthClient as EthClientInterface, EthFullBlock, EthFullTransaction,
MonitoredStaticJsonRpcProvider, FUTURE_BLOCK_ERROR, NULL_BLOCK_ERROR
} from '@cerc-io/util';
import { padKey } from '@cerc-io/ipld-eth-client';
const FUTURE_BLOCK_ERROR = "requested a future epoch (beyond 'latest')";
const NULL_BLOCK_ERROR = 'requested epoch was a null round';
export interface Config {
cache: Cache | undefined;
rpcEndpoint: string;
@@ -35,7 +36,7 @@ export class EthClient implements EthClientInterface {
constructor (config: Config) {
const { rpcEndpoint, cache } = config;
assert(rpcEndpoint, 'Missing RPC endpoint');
this._provider = new providers.StaticJsonRpcProvider({
this._provider = new MonitoredStaticJsonRpcProvider({
url: rpcEndpoint,
allowGzip: true
});
@@ -129,7 +130,7 @@ export class EthClient implements EthClientInterface {
parentHash: block.parentHash,
timestamp: block.timestamp.toString(),
stateRoot: this._provider.formatter.hash(rawBlock.stateRoot),
td: this._provider.formatter.bigNumber(rawBlock.totalDifficulty).toString(),
td: this._provider.formatter.bigNumber(rawBlock.totalDifficulty ?? 0).toString(),
txRoot: this._provider.formatter.hash(rawBlock.transactionsRoot),
receiptRoot: this._provider.formatter.hash(rawBlock.receiptsRoot)
}
@@ -190,7 +191,7 @@ export class EthClient implements EthClientInterface {
parentHash: this._provider.formatter.hash(rawBlock.parentHash),
timestamp: this._provider.formatter.number(rawBlock.timestamp).toString(),
stateRoot: this._provider.formatter.hash(rawBlock.stateRoot),
td: this._provider.formatter.bigNumber(rawBlock.totalDifficulty).toString(),
td: this._provider.formatter.bigNumber(rawBlock.totalDifficulty ?? 0).toString(),
txRoot: this._provider.formatter.hash(rawBlock.transactionsRoot),
receiptRoot: this._provider.formatter.hash(rawBlock.receiptsRoot),
uncleRoot: this._provider.formatter.hash(rawBlock.sha3Uncles),
@@ -375,7 +376,7 @@ export class EthClient implements EthClientInterface {
_handleGetBlockErrors (err: any): Array<null> {
if (err.code === errors.SERVER_ERROR && err.error) {
// Check null block error and return null array
if (err.error.message === NULL_BLOCK_ERROR) {
if (err.error.message.startsWith(NULL_BLOCK_ERROR)) {
return [null];
}


@@ -1,6 +1,6 @@
{
"name": "@cerc-io/solidity-mapper",
"version": "0.2.78",
"version": "0.2.110",
"main": "dist/index.js",
"license": "AGPL-3.0",
"devDependencies": {


@@ -74,3 +74,11 @@
```bash
yarn get-storage-at -e http://127.0.0.1:8545 -c 0x5C69bEe701ef814a2B6a3EDD4B1652CB9cc5aA6f -s 0x1 -b 0xB5FFFF
```
## Get Logs Requests
* Run:
```bash
yarn eth-get-logs -i <input-requests-json-file> -o <output-results-json-file> -c <output-curl-requests-file> -e http://127.0.0.1:1234/rpc/v1 --parallel <true | false>
```
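
The input file passed via `-i` is a JSON array of request parameters; each entry carries an `address` list, nested `topics` arrays, and either a `blockHash` or a `fromBlock`/`toBlock` range (matching the `LogParams` interface in the test script). A minimal example, with illustrative values:
```json
[
  {
    "fromBlock": "0x401CD0",
    "toBlock": "0x401D34",
    "address": ["0x60e1773636cf5e4a227d9ac24f20feca034ee25a"],
    "topics": [["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]]
  }
]
```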


@@ -0,0 +1,66 @@
[
{
"blockHash": "0xfcac7d16db53c4a3dedd4f6cad3c2c5c74311b602c9c812672bca1f0cad3b207",
"address": [],
"topics": [[]]
},
{
"blockHash": "0xfcac7d16db53c4a3dedd4f6cad3c2c5c74311b602c9c812672bca1f0cad3b207",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a",
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb",
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9",
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics": [[]]
},
{
"blockHash": "0xfcac7d16db53c4a3dedd4f6cad3c2c5c74311b602c9c812672bca1f0cad3b207",
"address": [],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67",
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb",
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]]
},
{
"blockHash": "0xfcac7d16db53c4a3dedd4f6cad3c2c5c74311b602c9c812672bca1f0cad3b207",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a",
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb",
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9",
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67",
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb",
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]]
},
{
"blockHash": "0xfcac7d16db53c4a3dedd4f6cad3c2c5c74311b602c9c812672bca1f0cad3b207",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a"
],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
]]
},
{
"blockHash": "0xfcac7d16db53c4a3dedd4f6cad3c2c5c74311b602c9c812672bca1f0cad3b207",
"address": [
"0x497f5f88e0bad1a184e110514142dd9d94728ed5"
],
"topics":[[
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925"
]]
}
]


@@ -0,0 +1,72 @@
[
{
"fromBlock": "0x401CD0",
"toBlock": "0x401D34",
"address": [],
"topics": [[]]
},
{
"fromBlock": "0x401CD0",
"toBlock": "0x401D34",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a",
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb",
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9",
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics": [[]]
},
{
"fromBlock": "0x401CD0",
"toBlock": "0x401D34",
"address": [],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67",
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb",
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]]
},
{
"fromBlock": "0x401CD0",
"toBlock": "0x401D34",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a",
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb",
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9",
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67",
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb",
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]]
},
{
"fromBlock": "0x401CD0",
"toBlock": "0x401D34",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a"
],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
]]
},
{
"fromBlock": "0x401CD0",
"toBlock": "0x401D34",
"address": [
"0x497f5f88e0bad1a184e110514142dd9d94728ed5"
],
"topics":[[
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925"
]]
}
]


@@ -0,0 +1,122 @@
[
{
"address": [
"0x497f5f88e0bad1a184e110514142dd9d94728ed5"
],
"topics": [
[
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a"
],
"topics": [
[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a"
],
"topics": [
[
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a"
],
"topics": [
[
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb"
],
"topics": [
[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9"
],
"topics": [
[
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics": [
[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics": [
[
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0x763b29b97e75fb54923325d46bab2807ac8c43c5"
],
"topics": [
[
"0xf154a899b3b867021c992026539485521c86f83735a958729ab118b9ce7a6407"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
},
{
"address": [
"0xd51cb0fa9a91f156a80188a18f039140704b8df7"
],
"topics": [
[
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]
],
"fromBlock": "0x405C02",
"toBlock": "0x405C66"
}
]


@@ -0,0 +1,72 @@
[
{
"fromBlock": "0x405C02",
"toBlock": "0x405C66",
"address": [],
"topics": [[]]
},
{
"fromBlock": "0x405C02",
"toBlock": "0x405C66",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a",
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb",
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9",
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics": [[]]
},
{
"fromBlock": "0x405C02",
"toBlock": "0x405C66",
"address": [],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67",
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb",
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]]
},
{
"fromBlock": "0x405C02",
"toBlock": "0x405C66",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a",
"0x57e3bb9f790185cfe70cc2c15ed5d6b84dcf4adb",
"0x443a6243a36ef0ae1c46523d563c15abd787f4e9",
"0xaaa93ac72becfbbc9149f293466bbdaa4b5ef68c"
],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xe1fffcc4923d04b559f4d29a8bfc6cda04eb5b0d3c460751c2402c5c5cc9109c",
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0xc42079f94a6350d7e6235f29174924f928cc2ac818eb64fed8004e115fbcca67",
"0x9d9c909296d9c674451c0c24f02cb64981eb3b727f99865939192f880a755dcb",
"0xa4b3513a5f822f3e098a6a12338b3f07613cb130b75c90a250ab181402f4bb87"
]]
},
{
"fromBlock": "0x405C02",
"toBlock": "0x405C66",
"address": [
"0x60e1773636cf5e4a227d9ac24f20feca034ee25a"
],
"topics":[[
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
]]
},
{
"fromBlock": "0x405C02",
"toBlock": "0x405C66",
"address": [
"0x497f5f88e0bad1a184e110514142dd9d94728ed5"
],
"topics":[[
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925"
]]
}
]


@@ -1,6 +1,6 @@
{
"name": "@cerc-io/test",
"version": "0.2.78",
"version": "0.2.110",
"main": "dist/index.js",
"license": "AGPL-3.0",
"private": true,
@@ -8,6 +8,7 @@
"lint": "eslint .",
"build": "tsc",
"eth-call": "DEBUG=vulcanize:* ts-node src/eth-call.ts",
"eth-get-logs": "DEBUG=vulcanize:* ts-node src/eth-get-logs.ts",
"get-storage-at": "DEBUG=vulcanize:* ts-node src/get-storage-at.ts",
"test:snapshot": "DEBUG=vulcanize:* mocha src/snapshot.test.ts"
},


@@ -0,0 +1,256 @@
//
// Copyright 2024 Vulcanize, Inc.
//
import { providers } from 'ethers';
import * as fs from 'fs';
import * as path from 'path';
import debug from 'debug';
import yargs from 'yargs';
import assert from 'assert';
const log = debug('vulcanize:test');
interface LogParams {
address: string[];
topics: string[][];
fromBlock?: string;
toBlock?: string;
blockHash?: string;
}
// Format time in milliseconds into minutes and seconds
function formatTime(ms: number): string {
const minutes = Math.floor(ms / 60000);
const seconds = ((ms % 60000) / 1000).toFixed(0);
return `${minutes}m${seconds}s`;
}
async function generateCurlCommand(rpcEndpoint: string, params: LogParams): Promise<string> {
const curlParams: any = {
address: params.address,
topics: params.topics
};
if (params.blockHash) {
curlParams.blockHash = params.blockHash;
} else {
curlParams.fromBlock = params.fromBlock;
curlParams.toBlock = params.toBlock;
}
const requestBody = {
jsonrpc: "2.0",
method: "eth_getLogs",
params: [curlParams],
id: 1
};
const curlCommand = `time curl -X POST -H "Content-Type: application/json" \\\n-d '${JSON.stringify(requestBody, null, 2)}' \\\n${rpcEndpoint}`;
return curlCommand;
}
async function getLogs(provider: providers.JsonRpcProvider, logParams: LogParams[], outputFilePath: string, curlRequestsOutputFilePath: string) {
for (const params of logParams) {
const { filter, result, blockNumber } = await buildFilter(provider, params);
const latestBlockNumber = await provider.getBlockNumber();
result.blocksBehindHead = latestBlockNumber - blockNumber;
// Generate the curl command and write it to a file
const curlCommand = await generateCurlCommand('http://localhost:1234/rpc/v1', params);
fs.appendFileSync(curlRequestsOutputFilePath, curlCommand + '\n\n');
try {
// Record the start time
const startTime = Date.now();
// Fetch logs using the filter
const ethLogs = await provider.send(
'eth_getLogs',
[filter]
);
// Format raw eth_getLogs response
const logs: providers.Log[] = providers.Formatter.arrayOf(
provider.formatter.filterLog.bind(provider.formatter)
)(ethLogs);
// Record the end time and calculate the time taken
const endTime = Date.now();
const timeTakenMs = endTime - startTime;
// Store the result
result.numEvents = logs.length;
result.timeTaken = formatTime(timeTakenMs);
} catch (error) {
console.error(`Error fetching logs for params ${JSON.stringify(params)}:`, error);
} finally {
exportResult(outputFilePath, [result]);
}
}
}
async function getLogsParallel(provider: providers.JsonRpcProvider, logParams: LogParams[], outputFilePath: string, curlRequestsOutputFilePath: string) {
const filters: any[] = [];
const results: any[] = [];
const latestBlockNumber = await provider.getBlockNumber();
for (const params of logParams) {
const { filter, result, blockNumber } = await buildFilter(provider, params);
result.blocksBehindHead = latestBlockNumber - blockNumber;
filters.push(filter);
results.push(result);
// Generate the curl command and write it to a file
const curlCommand = await generateCurlCommand('http://localhost:1234/rpc/v1', params);
fs.appendFileSync(curlRequestsOutputFilePath, curlCommand + '\n\n');
}
try {
// Record the start time
const startTime = Date.now();
await Promise.all(filters.map(async (filter, index) => {
// Fetch logs using the filter
const ethLogs = await provider.send(
'eth_getLogs',
[filter]
);
// Format raw eth_getLogs response
const logs: providers.Log[] = providers.Formatter.arrayOf(
provider.formatter.filterLog.bind(provider.formatter)
)(ethLogs);
// Store the result
results[index].numEvents = logs.length;
}));
// Record the end time and calculate the time taken
const endTime = Date.now();
const timeTakenMs = endTime - startTime;
const formattedTime = formatTime(timeTakenMs);
results.forEach(result => result.timeTaken = formattedTime);
} catch (error) {
console.error(`Error fetching logs:`, error);
} finally {
exportResult(outputFilePath, results);
}
}
async function buildFilter (provider: providers.JsonRpcProvider, params: LogParams): Promise<{ filter: any, result: any, blockNumber: number }> {
// Build the filter object
const filter: any = {
address: params.address.map(address => address.toLowerCase()),
topics: params.topics,
};
const result = {
...filter,
address: params.address
};
let blockNumber: number;
if (params.blockHash) {
filter.blockHash = params.blockHash;
result.blockHash = params.blockHash;
const block = await provider.getBlock(params.blockHash);
blockNumber = block.number;
result.blockNumber = blockNumber;
} else {
assert(params.toBlock && params.fromBlock, 'fromBlock or toBlock not found');
filter.fromBlock = params.fromBlock;
filter.toBlock = params.toBlock;
result.fromBlock = params.fromBlock;
result.toBlock = params.toBlock;
blockNumber = parseInt(params.toBlock, 16);
result.blocksRange = parseInt(params.toBlock, 16) - parseInt(params.fromBlock, 16);
}
return { filter, result, blockNumber };
}
function exportResult (outputFilePath: string, results: any[]): void {
let existingData = [];
// Read the existing output file if present
if (fs.existsSync(outputFilePath)) {
const data = fs.readFileSync(outputFilePath, 'utf-8');
existingData = JSON.parse(data || '[]');
}
// Append new result to existing data
existingData.push(...results);
// Write the updated data back to the JSON file
fs.writeFileSync(outputFilePath, JSON.stringify(existingData, null, 2));
}
async function main() {
const argv = await yargs.parserConfiguration({
'parse-numbers': false
}).options({
endpoint: {
alias: 'e',
demandOption: true,
describe: 'Endpoint to perform eth-get-logs calls against',
type: 'string'
},
input: {
alias: 'i',
demandOption: true,
describe: 'Input file path',
type: 'string'
},
output: {
alias: 'o',
demandOption: true,
describe: 'Output file path',
type: 'string'
},
curlRequestsOutput: {
alias: 'c',
demandOption: true,
describe: 'Output file path for curl requests',
type: 'string'
},
parallel: {
alias: 'p',
default: false,
describe: 'Make requests in parallel',
type: 'boolean'
},
}).argv;
const outputFilePath = path.resolve(argv.output);
const curlRequestsOutputFilePath = path.resolve(argv.curlRequestsOutput);
// Read the input json file
const logParams: LogParams[] = JSON.parse(fs.readFileSync(path.resolve(argv.input), 'utf-8'));
// Create a provider with sufficient timeout
const timeout = 10 * 60 * 1000; // 10mins
const provider = new providers.JsonRpcProvider({ url: argv.endpoint, timeout });
// Get logs and measure performance
if (argv.parallel) {
log('Making parallel requests');
await getLogsParallel(provider, logParams, outputFilePath, curlRequestsOutputFilePath);
} else {
log('Making serial requests');
await getLogs(provider, logParams, outputFilePath, curlRequestsOutputFilePath);
}
log(`Results written to ${outputFilePath}`);
log(`CURL requests written to ${curlRequestsOutputFilePath}`);
}
main().catch(err => {
log(err);
});
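For reference, the input file read by this script is an array of LogParams objects; a hypothetical input (addresses, topics and block values are illustrative only) might look like:

```json
[
  {
    "address": ["0x1111111111111111111111111111111111111111"],
    "topics": [["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]],
    "fromBlock": "0x10",
    "toBlock": "0x20"
  },
  {
    "address": ["0x2222222222222222222222222222222222222222"],
    "topics": [],
    "blockHash": "0x3333333333333333333333333333333333333333333333333333333333333333"
  }
]
```

Each entry maps directly onto the filter built in buildFilter above: blockHash, when present, takes precedence over the fromBlock/toBlock range.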

View File

@ -1,6 +1,6 @@
{
"name": "@cerc-io/tracing-client",
"version": "0.2.78",
"version": "0.2.110",
"description": "ETH VM tracing client",
"main": "dist/index.js",
"scripts": {

View File

@ -1,13 +1,13 @@
{
"name": "@cerc-io/util",
"version": "0.2.78",
"version": "0.2.110",
"main": "dist/index.js",
"license": "AGPL-3.0",
"dependencies": {
"@apollo/utils.keyvaluecache": "^1.0.1",
"@cerc-io/nitro-node": "^0.1.15",
"@cerc-io/peer": "^0.2.78",
"@cerc-io/solidity-mapper": "^0.2.78",
"@cerc-io/peer": "^0.2.110",
"@cerc-io/solidity-mapper": "^0.2.110",
"@cerc-io/ts-channel": "1.0.3-ts-nitro-0.1.1",
"@ethersproject/properties": "^5.7.0",
"@ethersproject/providers": "^5.4.4",
@ -36,6 +36,7 @@
"it-length-prefixed": "^8.0.4",
"it-pipe": "^2.0.5",
"it-pushable": "^3.1.2",
"jayson": "^4.1.2",
"js-yaml": "^4.1.0",
"json-bigint": "^1.0.0",
"lodash": "^4.17.21",
@ -45,14 +46,16 @@
"pg": "^8.5.1",
"pg-boss": "^6.1.0",
"prom-client": "^14.0.1",
"read-pkg": "^9.0.1",
"toml": "^3.0.0",
"typeorm": "0.2.37",
"typeorm-naming-strategies": "^2.0.0",
"winston": "^3.13.0",
"ws": "^8.11.0",
"yargs": "^17.0.1"
},
"devDependencies": {
"@cerc-io/cache": "^0.2.78",
"@cerc-io/cache": "^0.2.110",
"@nomiclabs/hardhat-waffle": "^2.0.1",
"@types/bunyan": "^1.8.8",
"@types/express": "^4.17.14",

View File

@ -12,7 +12,7 @@ import {
UNKNOWN_EVENT_NAME
} from './constants';
import { JobQueue } from './job-queue';
import { BlockProgressInterface, IndexerInterface, EventInterface, EthFullTransaction, EthFullBlock } from './types';
import { BlockProgressInterface, IndexerInterface, EventInterface, EthFullTransaction, EthFullBlock, ContractInterface } from './types';
import { wait } from './misc';
import { OrderDirection } from './database';
import { JobQueueConfig } from './config';
@ -22,8 +22,10 @@ const DEFAULT_EVENTS_IN_BATCH = 50;
const log = debug('vulcanize:common');
const JSONbigNative = JSONbig({ useNativeBigInt: true });
export const NEW_BLOCK_MAX_RETRIES_ERROR = 'Reached max retries for fetching new block';
export interface PrefetchedBlock {
block: BlockProgressInterface;
block?: BlockProgressInterface;
events: DeepPartial<EventInterface>[];
ethFullBlock: EthFullBlock;
ethFullTransactions: EthFullTransaction[];
@ -64,50 +66,55 @@ export const fetchBlocksAtHeight = async (
jobQueueConfig: JobQueueConfig,
blockAndEventsMap: Map<string, PrefetchedBlock>
): Promise<DeepPartial<BlockProgressInterface>[]> => {
let blocks = [];
// Try fetching blocks from the db.
const blockProgressEntities = await indexer.getBlocksAtHeight(blockNumber, false);
blocks = blockProgressEntities.map((block: any) => {
block.timestamp = block.blockTimestamp;
return block;
});
let blocks: EthFullBlock[] = [];
let newBlockRetries = 0;
// Try fetching blocks from eth-server until found.
while (!blocks.length) {
const { block: latestBlock } = await indexer.getBlockByHash();
const blockProcessingOffset = jobQueueConfig.blockProcessingOffset ?? 0;
// Process block if it is blockProcessingOffset blocks behind latest block
if (latestBlock.number < blockNumber + blockProcessingOffset) {
// Check number of retries for fetching new block
if (jobQueueConfig.maxNewBlockRetries && newBlockRetries > jobQueueConfig.maxNewBlockRetries) {
throw new Error(NEW_BLOCK_MAX_RETRIES_ERROR);
}
newBlockRetries++;
log(`Latest block: ${latestBlock.number}, blockProcessingOffset: ${blockProcessingOffset}; retry block to process: ${blockNumber} after ${jobQueueConfig.blockDelayInMilliSecs}ms`);
await wait(jobQueueConfig.blockDelayInMilliSecs);
continue;
}
console.time(`time:common#_fetchBlocks-eth-server-${blockNumber}`);
const ethFullBlocks = await indexer.getBlocks({ blockNumber });
console.timeEnd(`time:common#_fetchBlocks-eth-server-${blockNumber}`);
// Check if all blocks are null and increment blockNumber to index next block number
if (ethFullBlocks.length > 0 && ethFullBlocks.every(block => block === null)) {
blockNumber++;
log(`Block ${blockNumber} requested was null (FEVM); Fetching next block`);
blockNumber++;
continue;
}
// Filter out null blocks
blocks = ethFullBlocks.filter(block => Boolean(block)) as EthFullBlock[];
assert(blocks.length, `Blocks at ${blockNumber} should exist as latest block is ${latestBlock}`);
if (!blocks.length) {
log(`No blocks fetched for block number ${blockNumber}, retrying after ${jobQueueConfig.blockDelayInMilliSecs} ms delay.`);
await wait(jobQueueConfig.blockDelayInMilliSecs);
} else {
blocks.forEach(block => {
blockAndEventsMap.set(
block.blockHash,
{
// Block is set later in job-runner when saving to database
block: {} as BlockProgressInterface,
events: [],
ethFullBlock: block,
// Transactions are set later in job-runner when fetching events
ethFullTransactions: []
}
);
});
}
blocks.forEach(block => {
blockAndEventsMap.set(
block.blockHash,
{
// Block is set later in job-runner when saving to database
block: {} as BlockProgressInterface,
events: [],
ethFullBlock: block,
// Transactions are set later in job-runner when fetching events
ethFullTransactions: []
}
);
});
}
assert(blocks.length, 'Blocks not fetched');
@ -125,7 +132,7 @@ export const fetchBlocksAtHeight = async (
});
}
await indexer.updateSyncStatusChainHead(blocks[0].blockHash, blocks[0].blockNumber);
await indexer.updateSyncStatusChainHead(blocks[0].blockHash, Number(blocks[0].blockNumber));
return blocksToBeIndexed;
};
@ -233,22 +240,25 @@ const _processEvents = async (
console.time('time:common#processEvents-processing_events_batch');
// Process events in loop
for (let event of events) {
for (const event of events) {
// Skipping check for order of events processing since logIndex in FEVM is not index of log in block
// The check was introduced to avoid reprocessing block events in case of restarts. But currently on restarts, the unprocessed block is removed and reprocessed from the first event log
// if (event.index <= block.lastProcessedEventIndex) {
// throw new Error(`Events received out of order for block number ${block.blockNumber} hash ${block.blockHash}, got event index ${eventIndex} and lastProcessedEventIndex ${block.lastProcessedEventIndex}, aborting`);
// }
const watchedContract = indexer.isWatchedContract(event.contract);
const watchedContracts = indexer.isContractAddressWatched(event.contract);
if (watchedContract) {
if (watchedContracts) {
// We might not have parsed this event yet. This can happen if the contract was added
// as a result of a previous event in the same block.
if (event.eventName === UNKNOWN_EVENT_NAME) {
// Parse the unknown event and save updated event to the db
event = _parseUnknownEvent(indexer, event, watchedContract.kind);
updatedDbEvents.push(event);
const { eventParsed, event: parsedEvent } = _parseUnknownEvent(indexer, event, watchedContracts);
if (eventParsed) {
updatedDbEvents.push(parsedEvent);
}
}
await indexer.processEvent(event, { ethFullBlock, ethFullTransactions });
@ -347,16 +357,19 @@ const _processEventsInSubgraphOrder = async (
}
// Parse events of initially unwatched contracts
for (let event of unwatchedContractEvents) {
const watchedContract = indexer.isWatchedContract(event.contract);
for (const event of unwatchedContractEvents) {
const watchedContracts = indexer.isContractAddressWatched(event.contract);
if (watchedContract) {
if (watchedContracts) {
// We might not have parsed this event yet. This can happen if the contract was added
// as a result of a previous event in the same block.
if (event.eventName === UNKNOWN_EVENT_NAME) {
// Parse the unknown event and save updated event to the db
event = _parseUnknownEvent(indexer, event, watchedContract.kind);
updatedDbEvents.push(event);
const { eventParsed, event: parsedEvent } = _parseUnknownEvent(indexer, event, watchedContracts);
if (eventParsed) {
updatedDbEvents.push(parsedEvent);
}
}
}
}
@ -389,11 +402,16 @@ const _getEventsBatch = async (indexer: IndexerInterface, blockHash: string, eve
);
};
const _parseUnknownEvent = (indexer: IndexerInterface, event: EventInterface, contractKind: string): EventInterface => {
const _parseUnknownEvent = (indexer: IndexerInterface, event: EventInterface, watchedContracts: ContractInterface[]): { eventParsed: boolean, event: EventInterface } => {
const logObj = JSONbigNative.parse(event.extraInfo);
assert(indexer.parseEventNameAndArgs);
const { eventName, eventInfo, eventSignature } = indexer.parseEventNameAndArgs(contractKind, logObj);
const { eventParsed, eventDetails: { eventName, eventInfo, eventSignature } } = indexer.parseEventNameAndArgs(watchedContracts, logObj);
if (!eventParsed) {
// Skip unparsable events
log(`WARNING: Skipping event for contract ${event.contract} as no matching event found in the ABI`);
return { eventParsed: false, event };
}
event.eventName = eventName;
event.eventInfo = JSONbigNative.stringify(eventInfo);
@ -402,7 +420,7 @@ const _parseUnknownEvent = (indexer: IndexerInterface, event: EventInterface, co
eventSignature
});
return event;
return { eventParsed: true, event };
};
/**

View File

@ -22,15 +22,25 @@ export interface JobQueueConfig {
lazyUpdateBlockProgress?: boolean;
subgraphEventsOrder: boolean;
blockDelayInMilliSecs: number;
// Number of blocks by which block processing lags behind head (default: 0)
blockProcessingOffset?: number;
// Block range in which logs are fetched during historical blocks processing
historicalLogsBlockRange?: number;
// Max block range of historical processing after which it waits for completion of events processing
// If set to -1, historical processing does not wait for events processing and runs up to the latest canonical block
historicalMaxFetchAhead?: number;
// Boolean to switch between modes of processing events when starting the server
// Setting to true will fetch filtered events and required blocks in a range of blocks and then process them
// Setting to false will fetch blocks consecutively with their events and then process them (this behaviour is used in realtime processing near the head)
useBlockRanges: boolean;
// Max number of retries to fetch new block after which watcher will failover to other RPC endpoints
// Infinitely retry if not set
maxNewBlockRetries?: number;
}
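The new jobQueue options above might appear in a watcher's TOML config roughly as follows (a hedged sketch: the section name and all values are illustrative; the field names are taken from the interface above):

```toml
[jobQueue]
  blockDelayInMilliSecs = 2000
  subgraphEventsOrder = true
  # Fetch filtered events and required blocks in ranges during historical processing
  useBlockRanges = true
  # Illustrative values; the following fields are optional
  blockProcessingOffset = 2
  historicalLogsBlockRange = 2000
  historicalMaxFetchAhead = 10000
  maxNewBlockRetries = 5
```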
export interface GQLCacheConfig {
@ -196,23 +206,16 @@ export interface P2PConfig {
consensus: ConsensusConfig;
}
export interface ServerConfig {
host: string;
port: number;
mode: string;
gqlPath: string;
kind: string;
enableConfigValidation: boolean;
checkpointing: boolean;
checkpointInterval: number;
subgraphPath: string;
enableState: boolean;
wasmRestartBlocksInterval: number;
// GQL config
export interface GQLConfig {
path: string;
maxEventsBlockRange: number;
clearEntitiesCacheInterval: number;
// Boolean to skip updating entity fields required in state creation and not required in the frontend
skipStateFieldsUpdate: boolean;
// GQL cache-control max-age settings (in seconds)
cache: GQLCacheConfig;
// Boolean to load GQL query nested entity relations sequentially
loadRelationsSequential: boolean;
// Max GQL API requests to process simultaneously (defaults to 1)
maxSimultaneousRequests?: number;
@ -220,11 +223,40 @@ export interface ServerConfig {
// Max GQL API requests in queue until reject (defaults to -1, means do not reject)
maxRequestQueueLimit?: number;
// Boolean to load GQL query nested entity relations sequentially
loadRelationsSequential: boolean;
// Log directory for GQL requests
logDir?: string;
}
// GQL cache-control max-age settings (in seconds)
gqlCache: GQLCacheConfig;
// ETH RPC server config
export interface EthRPCConfig {
// Enable ETH JSON RPC server
enabled: boolean;
// Path to expose the RPC server at
path?: string;
// Max number of logs that can be returned in a single getLogs request
getLogsResultLimit?: number;
}
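Correspondingly, enabling the ETH RPC server could look like this in TOML (a sketch; the exact section path depends on how ServerConfig is mapped onto the config file):

```toml
[server.ethRPC]
  enabled = true
  # Path to expose the JSON-RPC server at
  path = "/rpc"
  # Cap on logs returned by a single eth_getLogs call
  getLogsResultLimit = 10000
```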
export interface ServerConfig {
host: string;
port: number;
mode: string;
kind: string;
enableConfigValidation: boolean;
checkpointing: boolean;
checkpointInterval: number;
subgraphPath: string;
enableState: boolean;
wasmRestartBlocksInterval: number;
clearEntitiesCacheInterval: number;
// Boolean to skip updating entity fields required in state creation and not required in the frontend
skipStateFieldsUpdate: boolean;
// GQL config for server
gql: GQLConfig
p2p: P2PConfig;
@ -232,6 +264,9 @@ export interface ServerConfig {
// Flag to specify whether RPC endpoint supports block hash as block tag parameter
// https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block
rpcSupportsBlockHashParam: boolean;
// ETH JSON RPC server config
ethRPC: EthRPCConfig;
}
export interface FundingAmountsConfig {
@ -254,16 +289,24 @@ export interface UpstreamConfig {
cache: CacheConfig;
ethServer: {
gqlApiEndpoint: string;
rpcProviderEndpoint: string;
rpcProviderEndpoints: string[];
rpcProviderMutationEndpoint: string;
// Boolean flag to specify if rpc-eth-client should be used for RPC endpoint instead of ipld-eth-client (ipld-eth-server GQL client)
rpcClient: boolean;
// Boolean flag to specify if rpcProviderEndpoints are FEVM RPC endpoints
isFEVM: boolean;
// Boolean flag to filter event logs by contracts
filterLogsByAddresses: boolean;
// Boolean flag to filter event logs by topics
filterLogsByTopics: boolean;
// Switch clients if eth_getLogs call takes more than threshold (in secs)
getLogsClientSwitchThresholdInSecs?: number;
payments: EthServerPaymentsConfig;
}
traceProviderEndpoint: string;

View File

@ -30,3 +30,5 @@ export const DEFAULT_PREFETCH_BATCH_SIZE = 10;
export const DEFAULT_MAX_GQL_CACHE_SIZE = Math.pow(2, 20) * 8; // 8 MB
export const SUPPORTED_PAID_RPC_METHODS = ['eth_getBlockByHash', 'eth_getStorageAt', 'eth_getBlockByNumber'];
export const DEFAULT_ETH_GET_LOGS_RESULT_LIMIT = 10000;

View File

@ -18,7 +18,8 @@ import {
QueryRunner,
Repository,
SelectQueryBuilder,
WhereExpressionBuilder
WhereExpressionBuilder,
Not
} from 'typeorm';
import { SnakeNamingStrategy } from 'typeorm-naming-strategies';
import _ from 'lodash';
@ -457,15 +458,14 @@ export class Database {
await repo.delete(findConditions);
}
async getAncestorAtDepth (blockHash: string, depth: number): Promise<string> {
async getAncestorAtHeight (blockHash: string, height: number): Promise<string> {
const heirerchicalQuery = `
WITH RECURSIVE cte_query AS
(
SELECT
block_hash,
block_number,
parent_hash,
0 as depth
parent_hash
FROM
block_progress
WHERE
@ -474,14 +474,13 @@ export class Database {
SELECT
b.block_hash,
b.block_number,
b.parent_hash,
c.depth + 1
b.parent_hash
FROM
block_progress b
INNER JOIN
cte_query c ON c.parent_hash = b.block_hash
WHERE
c.depth < $2
b.block_number >= $2
)
SELECT
block_hash, block_number
@ -492,7 +491,7 @@ export class Database {
`;
// Get ancestor block hash using hierarchical query.
const [{ block_hash: ancestorBlockHash }] = await this._conn.query(heirerchicalQuery, [blockHash, depth]);
const [{ block_hash: ancestorBlockHash }] = await this._conn.query(heirerchicalQuery, [blockHash, height]);
return ancestorBlockHash;
}
@ -524,6 +523,12 @@ export class Database {
return events;
}
async getEvents (repo: Repository<EventInterface>, options: FindManyOptions<EventInterface>): Promise<Array<EventInterface>> {
const events = repo.find(options);
return events;
}
async saveEventEntity (repo: Repository<EventInterface>, entity: EventInterface): Promise<EventInterface> {
const event = await repo.save(entity);
eventCount.inc(1);
@ -666,7 +671,7 @@ export class Database {
async saveContract (repo: Repository<ContractInterface>, address: string, kind: string, checkpoint: boolean, startingBlock: number, context?: any): Promise<ContractInterface> {
const contract = await repo
.createQueryBuilder()
.where('address = :address', { address })
.where('address = :address AND kind = :kind', { address, kind })
.getOne();
const entity = repo.create({ address, kind, checkpoint, startingBlock, context });
@ -1320,7 +1325,7 @@ export class Database {
async _fetchEventCount (): Promise<void> {
const res = await this._conn.getRepository('event')
.count();
.count({ where: { eventName: Not(UNKNOWN_EVENT_NAME) } });
eventCount.set(res);
}

View File

@ -0,0 +1,321 @@
import { utils } from 'ethers';
import { Between, Equal, FindConditions, In, LessThanOrEqual, MoreThanOrEqual } from 'typeorm';
import { JsonRpcProvider } from '@ethersproject/providers';
import { EventInterface, IndexerInterface } from './types';
import { DEFAULT_ETH_GET_LOGS_RESULT_LIMIT } from './constants';
const CODE_INVALID_PARAMS = -32602;
const CODE_INTERNAL_ERROR = -32603;
const CODE_SERVER_ERROR = -32000;
const ERROR_CONTRACT_MAP_NOT_SET = 'Contract map not set';
const ERROR_CONTRACT_ABI_NOT_FOUND = 'Contract ABI not found';
const ERROR_CONTRACT_INSUFFICIENT_PARAMS = 'Insufficient params';
const ERROR_CONTRACT_NOT_RECOGNIZED = 'Contract not recognized';
const ERROR_CONTRACT_METHOD_NOT_FOUND = 'Contract method not found';
const ERROR_METHOD_NOT_IMPLEMENTED = 'Method not implemented';
const ERROR_INVALID_BLOCK_TAG = 'Invalid block tag';
const ERROR_INVALID_BLOCK_HASH = 'Invalid block hash';
const ERROR_INVALID_CONTRACT_ADDRESS = 'Invalid contract address';
const ERROR_INVALID_TOPICS = 'Invalid topics';
const ERROR_BLOCK_NOT_FOUND = 'Block not found';
const ERROR_LIMIT_EXCEEDED = 'Query results exceed limit';
const DEFAULT_BLOCK_TAG = 'latest';
class ErrorWithCode extends Error {
code: number;
constructor (code: number, message: string) {
super(message);
this.code = code;
}
}
export const createEthRPCHandlers = async (
indexer: IndexerInterface,
ethProvider: JsonRpcProvider
): Promise<any> => {
return {
eth_blockNumber: async (args: any, callback: any) => {
const syncStatus = await indexer.getSyncStatus();
const result = syncStatus ? `0x${syncStatus.latestProcessedBlockNumber.toString(16)}` : '0x';
callback(null, result);
},
eth_call: async (args: any, callback: any) => {
try {
if (!indexer.contractMap) {
throw new ErrorWithCode(CODE_INTERNAL_ERROR, ERROR_CONTRACT_MAP_NOT_SET);
}
if (args.length === 0) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_CONTRACT_INSUFFICIENT_PARAMS);
}
const { to, data } = args[0];
const blockTag = args.length > 1 ? args[1] : DEFAULT_BLOCK_TAG;
const blockHash = await parseEthCallBlockTag(indexer, ethProvider, blockTag);
const watchedContract = indexer.getWatchedContracts().find(contract => contract.address === to);
if (!watchedContract) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_CONTRACT_NOT_RECOGNIZED);
}
const contractInterface = indexer.contractMap.get(watchedContract.kind);
if (!contractInterface) {
throw new ErrorWithCode(CODE_INTERNAL_ERROR, ERROR_CONTRACT_ABI_NOT_FOUND);
}
// Slice out method signature from data
const functionSelector = data.slice(0, 10);
// Find the matching function from the ABI
const functionFragment = contractInterface.getFunction(functionSelector);
if (!functionFragment) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_CONTRACT_METHOD_NOT_FOUND);
}
// Decode the data based on the matched function
const decodedData = contractInterface.decodeFunctionData(functionFragment, data);
const functionName = functionFragment.name;
const indexerMethod = (indexer as any)[functionName];
if (!indexerMethod) {
throw new ErrorWithCode(CODE_SERVER_ERROR, ERROR_METHOD_NOT_IMPLEMENTED);
}
const result = await indexerMethod.call(indexer, blockHash, to, ...decodedData);
const encodedResult = contractInterface.encodeFunctionResult(functionFragment, Array.isArray(result.value) ? result.value : [result.value]);
callback(null, encodedResult);
} catch (error: any) {
let callBackError;
if (error instanceof ErrorWithCode) {
callBackError = { code: error.code, message: error.message };
} else {
callBackError = { code: CODE_SERVER_ERROR, message: error.message };
}
callback(callBackError);
}
},
eth_getLogs: async (args: any, callback: any) => {
try {
if (args.length === 0) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_CONTRACT_INSUFFICIENT_PARAMS);
}
const params = args[0];
// Parse arg params into where options
const where: FindConditions<EventInterface> = {};
// Address filter, address or a list of addresses
if (params.address) {
buildAddressFilter(params.address, where);
}
// Topics filter
if (params.topics) {
buildTopicsFilter(params.topics, where);
}
// Block hash takes precedence over fromBlock / toBlock if provided
if (params.blockHash) {
// Validate input block hash
if (!utils.isHexString(params.blockHash, 32)) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_INVALID_BLOCK_HASH);
}
where.block = {
blockHash: params.blockHash
};
} else if (params.fromBlock || params.toBlock) {
const fromBlockNumber = params.fromBlock ? await parseEthGetLogsBlockTag(indexer, params.fromBlock) : null;
const toBlockNumber = params.toBlock ? await parseEthGetLogsBlockTag(indexer, params.toBlock) : null;
if (fromBlockNumber && toBlockNumber) {
// Both fromBlock and toBlock set
where.block = { blockNumber: Between(fromBlockNumber, toBlockNumber) };
} else if (fromBlockNumber) {
// Only fromBlock set
where.block = { blockNumber: MoreThanOrEqual(fromBlockNumber) };
} else if (toBlockNumber) {
// Only toBlock set
where.block = { blockNumber: LessThanOrEqual(toBlockNumber) };
}
}
// Fetch events from the db
// Load block relation
const resultLimit = indexer.serverConfig.ethRPC.getLogsResultLimit || DEFAULT_ETH_GET_LOGS_RESULT_LIMIT;
const events = await indexer.getEvents({
where,
relations: ['block'],
// TODO: Use querybuilder to order by block number
order: { block: 'ASC', index: 'ASC' },
take: resultLimit + 1
});
// Limit the number of results that can be returned by a single query
if (events.length > resultLimit) {
throw new ErrorWithCode(CODE_SERVER_ERROR, `${ERROR_LIMIT_EXCEEDED}: ${resultLimit}`);
}
// Transform events into result logs
const result = await transformEventsToLogs(events);
callback(null, result);
} catch (error: any) {
let callBackError;
if (error instanceof ErrorWithCode) {
callBackError = { code: error.code, message: error.message };
} else {
callBackError = { code: CODE_SERVER_ERROR, message: error.message };
}
callback(callBackError);
}
}
};
};
const parseEthCallBlockTag = async (indexer: IndexerInterface, ethProvider: JsonRpcProvider, blockTag: string): Promise<string> => {
if (utils.isHexString(blockTag)) {
// Return value if hex string is of block hash length
if (utils.hexDataLength(blockTag) === 32) {
return blockTag;
}
// Treat hex value as a block number
const block = await ethProvider.getBlock(blockTag);
if (block === null) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_BLOCK_NOT_FOUND);
}
return block.hash;
}
if (blockTag === DEFAULT_BLOCK_TAG) {
const syncStatus = await indexer.getSyncStatus();
if (!syncStatus) {
throw new ErrorWithCode(CODE_INTERNAL_ERROR, 'SyncStatus not found');
}
return syncStatus.latestProcessedBlockHash;
}
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_INVALID_BLOCK_TAG);
};
const parseEthGetLogsBlockTag = async (indexer: IndexerInterface, blockTag: string): Promise<number> => {
if (utils.isHexString(blockTag)) {
return Number(blockTag);
}
if (blockTag === DEFAULT_BLOCK_TAG) {
const syncStatus = await indexer.getSyncStatus();
if (!syncStatus) {
throw new ErrorWithCode(CODE_INTERNAL_ERROR, 'SyncStatus not found');
}
return syncStatus.latestProcessedBlockNumber;
}
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_INVALID_BLOCK_TAG);
};
const buildAddressFilter = (address: any, where: FindConditions<EventInterface>): void => {
if (Array.isArray(address)) {
// Validate input addresses
address.forEach((add: string) => {
if (!utils.isHexString(add, 20)) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, `${ERROR_INVALID_CONTRACT_ADDRESS}: expected hex string of size 20`);
}
});
if (address.length > 0) {
where.contract = In(address);
}
} else {
// Validate input address
if (!utils.isHexString(address, 20)) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, `${ERROR_INVALID_CONTRACT_ADDRESS}: expected hex string of size 20`);
}
where.contract = Equal(address);
}
};
type TopicColumn = 'topic0' | 'topic1' | 'topic2' | 'topic3';
const buildTopicsFilter = (topics: any, where: FindConditions<EventInterface>): void => {
// Check that topics is an array of size <= 4
if (!Array.isArray(topics)) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, ERROR_INVALID_TOPICS);
}
if (topics.length > 4) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, `${ERROR_INVALID_TOPICS}: exceeds max topics`);
}
for (let i = 0; i < topics.length; i++) {
addTopicCondition(topics[i], `topic${i}` as TopicColumn, where);
}
};
const addTopicCondition = (
topicFilter: string[] | string,
topicIndex: TopicColumn,
where: FindConditions<EventInterface>
): any => {
if (Array.isArray(topicFilter)) {
// Validate input topics
topicFilter.forEach((topic: string) => {
if (!utils.isHexString(topic, 32)) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, `${ERROR_INVALID_TOPICS}: expected hex string of size 32 for ${topicIndex}`);
}
});
if (topicFilter.length > 0) {
where[topicIndex] = In(topicFilter);
}
} else {
// Validate input topic
if (!utils.isHexString(topicFilter, 32)) {
throw new ErrorWithCode(CODE_INVALID_PARAMS, `${ERROR_INVALID_TOPICS}: expected hex string of size 32 for ${topicIndex}`);
}
where[topicIndex] = Equal(topicFilter);
}
};
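The per-position topic columns above implement the standard eth_getLogs topic semantics: filters are AND-ed across positions, an array within a position is OR-ed, and a null position matches anything. A minimal standalone sketch of that matching logic (not the watcher code itself, which expresses these rules as DB conditions):

```typescript
// Standalone sketch of eth_getLogs topic matching semantics:
// positions are AND-ed, arrays within a position are OR-ed, null is a wildcard.
type TopicFilter = (string | string[] | null)[];

function matchesTopics (logTopics: string[], filter: TopicFilter): boolean {
  return filter.every((entry, i) => {
    if (entry === null) return true; // wildcard position
    const candidates = Array.isArray(entry) ? entry : [entry];
    return candidates.includes(logTopics[i]);
  });
}
```

For example, a filter of `[null, '0xbb']` matches any log whose second topic is `0xbb`, regardless of its first topic.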
const transformEventsToLogs = async (events: Array<EventInterface>): Promise<any[]> => {
return events.map(event => {
const parsedExtraInfo = JSON.parse(event.extraInfo);
const topics: string[] = [];
[event.topic0, event.topic1, event.topic2, event.topic3].forEach(topic => {
if (topic) {
topics.push(topic);
}
});
return {
address: event.contract.toLowerCase(),
blockHash: event.block.blockHash,
blockNumber: `0x${event.block.blockNumber.toString(16)}`,
transactionHash: event.txHash,
transactionIndex: `0x${parsedExtraInfo.tx.index.toString(16)}`,
logIndex: `0x${parsedExtraInfo.logIndex.toString(16)}`,
data: event.data,
topics,
removed: event.block.isPruned
};
});
};
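The log objects produced above are served over standard JSON-RPC; a request hitting this handler would be shaped like the following (address is illustrative; the topic shown is the well-known ERC-20 Transfer event signature hash):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "eth_getLogs",
  "params": [
    {
      "fromBlock": "0x10",
      "toBlock": "latest",
      "address": "0x1111111111111111111111111111111111111111",
      "topics": [["0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"]]
    }
  ]
}
```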

View File

@ -102,9 +102,9 @@ export class EventWatcher {
await this._jobQueue.waitForEmptyQueue(QUEUE_EVENT_PROCESSING);
// Get latest block in chain and sync status from DB
// Also get historical-processing queu size
// Also get historical-processing queue size
const [{ block: latestBlock }, syncStatus, historicalProcessingQueueSize] = await Promise.all([
this._ethClient.getBlockByHash(),
this._indexer.getBlockByHash(),
this._indexer.getSyncStatus(),
this._jobQueue.getQueueSize(QUEUE_HISTORICAL_PROCESSING, 'completed')
]);
@ -122,9 +122,15 @@ export class EventWatcher {
startBlockNumber = syncStatus.chainHeadBlockNumber + 1;
}
// Check if filter for logs is enabled
// Check if starting block for watcher is before latest canonical block
if (this._config.jobQueue.useBlockRanges && startBlockNumber < latestCanonicalBlockNumber) {
// Perform checks before starting historical block processing
if (
// Skip historical block processing if any block handler exists
!this._indexer.graphWatcher?.blockHandlerExists &&
// Run historical block processing if useBlockRanges is enabled
this._config.jobQueue.useBlockRanges &&
// Only run historical block processing if we are below the frothy region
startBlockNumber < latestCanonicalBlockNumber
) {
await this.startHistoricalBlockProcessing(startBlockNumber, latestCanonicalBlockNumber);
return;

View File

@ -80,12 +80,12 @@ export const fillBlocks = async (
const completePercentage = Math.round(blocksProcessed / numberOfBlocks * 100);
log(`Processed ${blocksProcessed} of ${numberOfBlocks} blocks (${completePercentage}%)`);
await processBlockByNumber(jobQueue, blockNumber + 1);
if (blockNumber + 1 >= endBlock) {
// Break the async loop when blockProgress event is for the endBlock and processing is complete.
if (blockNumber + 1 > endBlock) {
// Break the async loop when next block to be processed is more than endBlock.
break;
}
await processBlockByNumber(jobQueue, blockNumber + 1);
}
}

View File

@ -27,6 +27,13 @@ export const gqlQueryCount = new client.Counter({
registers: [gqlRegistry]
});
export const gqlQueryDuration = new client.Gauge({
name: 'gql_query_duration_seconds',
help: 'Duration of GQL queries',
labelNames: ['name'] as const,
registers: [gqlRegistry]
});
// Export metrics on a server
const app: Application = express();

View File

@ -17,11 +17,11 @@ import {
} from 'typeorm';
import { ColumnMetadata } from 'typeorm/metadata/ColumnMetadata';
import { RawSqlResultsToEntityTransformer } from 'typeorm/query-builder/transformer/RawSqlResultsToEntityTransformer';
import { SelectionNode } from 'graphql';
import { ArgumentNode, FieldNode, GraphQLResolveInfo, SelectionNode, IntValueNode, EnumValueNode, ObjectValueNode, ObjectFieldNode, ValueNode } from 'graphql';
import _ from 'lodash';
import debug from 'debug';
import { Database as BaseDatabase, QueryOptions, Where, CanonicalBlockHeight } from '../database';
import { Database as BaseDatabase, QueryOptions, Where, CanonicalBlockHeight, Filter, OPERATOR_MAP, OrderDirection } from '../database';
import { BlockProgressInterface } from '../types';
import { cachePrunedEntitiesCount, eventProcessingLoadEntityCacheHitCount, eventProcessingLoadEntityCount, eventProcessingLoadEntityDBQueryDuration } from '../metrics';
import { ServerConfig } from '../config';
@ -221,7 +221,8 @@ export class GraphDatabase {
id: string,
relationsMap: Map<any, { [key: string]: any }>,
block: CanonicalBlockHeight = {},
selections: ReadonlyArray<SelectionNode> = []
selections: ReadonlyArray<SelectionNode> = [],
queryInfo: GraphQLResolveInfo
): Promise<Entity | undefined> {
const { hash: blockHash, number: blockNumber } = block;
const repo = queryRunner.manager.getRepository<Entity>(entityType);
@ -240,7 +241,7 @@ export class GraphDatabase {
}
};
let entityData: any = await repo.findOne(findOptions as FindOneOptions<Entity>);
let entityData = await repo.findOne(findOptions as FindOneOptions<Entity>);
if (!entityData && findOptions.where.blockHash) {
entityData = await this._baseDatabase.getPrevEntityVersion(queryRunner, repo, findOptions);
@ -248,7 +249,8 @@ export class GraphDatabase {
// Get relational fields
if (entityData) {
entityData = await this.loadEntityRelations(queryRunner, block, relationsMap, entityType, entityData, selections);
const defragmentedSelections = this._defragmentGQLQuerySelections(selections, queryInfo);
entityData = await this.loadEntityRelations(queryRunner, block, relationsMap, entityType, entityData, defragmentedSelections, queryInfo);
}
return entityData;
@ -259,7 +261,8 @@ export class GraphDatabase {
block: CanonicalBlockHeight,
relationsMap: Map<any, { [key: string]: any }>,
entityType: new () => Entity, entityData: any,
selections: ReadonlyArray<SelectionNode> = []
selections: ReadonlyArray<SelectionNode> = [],
queryInfo: GraphQLResolveInfo
): Promise<Entity> {
const relations = relationsMap.get(entityType);
if (relations === undefined) {
@@ -269,6 +272,7 @@
const relationPromises = selections.filter((selection) => selection.kind === 'Field' && Boolean(relations[selection.name.value]))
.map(async (selection) => {
assert(selection.kind === 'Field');
const field = selection.name.value;
const { entity: relationEntity, isArray, isDerived, field: foreignKey } = relations[field];
let childSelections = selection.selectionSet?.selections || [];
@@ -276,6 +280,9 @@
// Filter out __typename field in GQL for loading relations.
childSelections = childSelections.filter(selection => !(selection.kind === 'Field' && selection.name.value === '__typename'));
// Parse selection's arguments
let { where: relationWhere, queryOptions: relationQueryOptions } = this._getGQLSelectionFieldArguments(selection, queryInfo);
if (isDerived) {
const where: Where = {
[foreignKey]: [{
@@ -284,15 +291,22 @@
operator: 'equals'
}]
};
relationWhere = _.mergeWith(relationWhere, where, (objValue: any, srcValue: any) => {
if (Array.isArray(objValue)) {
// Overwrite the array in the target object with the source array
return srcValue;
}
});
const relatedEntities = await this.getEntities(
queryRunner,
relationEntity,
relationsMap,
block,
where,
{ limit: DEFAULT_LIMIT },
childSelections
relationWhere,
relationQueryOptions,
childSelections,
queryInfo
);
entityData[field] = relatedEntities;
@@ -308,15 +322,22 @@
operator: 'in'
}]
};
relationWhere = _.mergeWith(relationWhere, where, (objValue: any, srcValue: any) => {
if (Array.isArray(objValue)) {
// Overwrite the array in the target object with the source array
return srcValue;
}
});
const relatedEntities = await this.getEntities(
queryRunner,
relationEntity,
relationsMap,
block,
where,
{ limit: DEFAULT_LIMIT },
childSelections
relationWhere,
relationQueryOptions,
childSelections,
queryInfo
);
entityData[field] = relatedEntities;
@@ -331,7 +352,8 @@
entityData[field],
relationsMap,
block,
childSelections
childSelections,
queryInfo
);
entityData[field] = relatedEntity;
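The `_.mergeWith` customizer repeated in the relation-loading paths above follows one rule: when both sides have an array for the same field, the derived-relation filter (source) replaces the user-supplied array instead of being merged element-wise (lodash's default). A standalone sketch, with a hypothetical simplified `Where` shape:

```typescript
// Hypothetical simplified shape: each field maps to an array of filter objects
type Where = { [field: string]: any[] };

// Merge two Where objects; arrays from the source overwrite arrays in the target,
// mirroring the _.mergeWith customizer that returns srcValue for arrays
function mergeWhere (target: Where, source: Where): Where {
  const result: Where = { ...target };
  for (const [field, filters] of Object.entries(source)) {
    // The source array replaces any user-supplied array for the same field
    result[field] = filters;
  }
  return result;
}
```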
@@ -349,11 +371,14 @@
block: CanonicalBlockHeight = {},
where: Where = {},
queryOptions: QueryOptions = {},
selections: ReadonlyArray<SelectionNode> = []
selections: ReadonlyArray<SelectionNode> = [],
queryInfo: GraphQLResolveInfo
): Promise<Entity[]> {
let entities: Entity[] = [];
const latestEntityType = this._entityToLatestEntityMap.get(entityType);
const defragmentedSelections = this._defragmentGQLQuerySelections(selections, queryInfo);
if (latestEntityType) {
if (Object.keys(block).length) {
// Use lateral query for entities with latest entity table.
@@ -375,7 +400,7 @@
relationsMap,
where,
queryOptions,
selections
defragmentedSelections
);
}
} else {
@@ -405,7 +430,7 @@
return [];
}
entities = await this.loadEntitiesRelations(queryRunner, block, relationsMap, entityType, entities, selections);
entities = await this.loadEntitiesRelations(queryRunner, block, relationsMap, entityType, entities, defragmentedSelections, queryInfo);
// Resolve any field name conflicts in the entity result.
entities = entities.map(entity => resolveEntityFieldConflicts(entity));
@@ -447,7 +472,8 @@
'latestEntities',
`${tableName}.id = "latestEntities"."id" AND ${tableName}.block_number = "latestEntities"."block_number"`
)
.setParameters(subQuery.getParameters());
.setParameters(subQuery.getParameters())
.where(`${tableName}.is_pruned = :isPruned`, { isPruned: false });
selectQueryBuilder = this._baseDatabase.buildQuery(repo, selectQueryBuilder, where, relationsMap.get(entityType), block);
@@ -743,7 +769,8 @@
relationsMap: Map<any, { [key: string]: any }>,
entity: new () => Entity,
entities: Entity[],
selections: ReadonlyArray<SelectionNode> = []
selections: ReadonlyArray<SelectionNode> = [],
queryInfo: GraphQLResolveInfo
): Promise<Entity[]> {
const relations = relationsMap.get(entity);
if (relations === undefined) {
@@ -752,13 +779,13 @@
const relationSelections = selections.filter((selection) => selection.kind === 'Field' && Boolean(relations[selection.name.value]));
if (this._serverConfig.loadRelationsSequential) {
if (this._serverConfig.gql.loadRelationsSequential) {
for (const selection of relationSelections) {
await this.loadRelation(queryRunner, block, relationsMap, relations, entities, selection);
await this.loadRelation(queryRunner, block, relationsMap, relations, entities, selection, queryInfo);
}
} else {
const loadRelationPromises = relationSelections.map(async selection => {
await this.loadRelation(queryRunner, block, relationsMap, relations, entities, selection);
await this.loadRelation(queryRunner, block, relationsMap, relations, entities, selection, queryInfo);
});
await Promise.all(loadRelationPromises);
@@ -773,9 +800,11 @@
relationsMap: Map<any, { [key: string]: any }>,
relations: { [key: string]: any },
entities: Entity[],
selection: SelectionNode
selection: SelectionNode,
queryInfo: GraphQLResolveInfo
): Promise<void> {
assert(selection.kind === 'Field');
const field = selection.name.value;
const { entity: relationEntity, isArray, isDerived, field: foreignKey } = relations[field];
let childSelections = selection.selectionSet?.selections || [];
@@ -783,6 +812,9 @@
// Filter out __typename field in GQL for loading relations.
childSelections = childSelections.filter(selection => !(selection.kind === 'Field' && selection.name.value === '__typename'));
// Parse selection's arguments
let { where: relationWhere, queryOptions: relationQueryOptions } = this._getGQLSelectionFieldArguments(selection, queryInfo);
if (isDerived) {
const where: Where = {
[foreignKey]: [{
@@ -791,15 +823,22 @@
operator: 'in'
}]
};
relationWhere = _.mergeWith(relationWhere, where, (objValue: any, srcValue: any) => {
if (Array.isArray(objValue)) {
// Overwrite the array in the target object with the source array
return srcValue;
}
});
const relatedEntities = await this.getEntities(
queryRunner,
relationEntity,
relationsMap,
block,
where,
{},
childSelections
relationWhere,
relationQueryOptions,
childSelections,
queryInfo
);
const relatedEntitiesMap = relatedEntities.reduce((acc: {[key:string]: any[]}, entity: any) => {
@@ -842,15 +881,22 @@
operator: 'in'
}]
};
relationWhere = _.mergeWith(relationWhere, where, (objValue: any, srcValue: any) => {
if (Array.isArray(objValue)) {
// Overwrite the array in the target object with the source array
return srcValue;
}
});
const relatedEntities = await this.getEntities(
queryRunner,
relationEntity,
relationsMap,
block,
where,
{},
childSelections
relationWhere,
relationQueryOptions,
childSelections,
queryInfo
);
entities.forEach((entity: any) => {
@@ -877,7 +923,10 @@
// Avoid loading relation if selections only has id field.
if (childSelections.length === 1 && childSelections[0].kind === 'Field' && childSelections[0].name.value === 'id') {
entities.forEach((entity: any) => {
entity[field] = { id: entity[field] };
// Set only if field value is not null
if (entity[field]) {
entity[field] = { id: entity[field] };
}
});
return;
@@ -890,15 +939,22 @@
operator: 'in'
}]
};
relationWhere = _.mergeWith(relationWhere, where, (objValue: any, srcValue: any) => {
if (Array.isArray(objValue)) {
// Overwrite the array in the target object with the source array
return srcValue;
}
});
const relatedEntities = await this.getEntities(
queryRunner,
relationEntity,
relationsMap,
block,
where,
{},
childSelections
relationWhere,
relationQueryOptions,
childSelections,
queryInfo
);
const relatedEntitiesMap = relatedEntities.reduce((acc: {[key:string]: any}, entity: any) => {
@@ -921,7 +977,11 @@
return repo.save(dbEntity);
}
async toGraphEntity (instanceExports: any, entityName: string, data: any, entityTypes: { [key: string]: string }): Promise<any> {
async toGraphEntity (instanceExports: any, entityName: string, data: any, entityTypesMap: Map<string, {[key: string]: string; }>): Promise<any> {
// Get field types for this entity
const entityTypes = entityTypesMap.get(entityName);
assert(entityTypes);
// TODO: Cache schema/columns.
const repo = this._conn.getRepository(entityName);
const entityFields = repo.metadata.columns;
@@ -944,7 +1004,16 @@
field.propertyName = field.propertyName.slice(1);
}
const gqlType = entityTypes[field.propertyName];
let gqlType = entityTypes[field.propertyName];
// If the mapped type is present in entityTypesMap, it's a relational field
// Get the type for id field in that case
if (entityTypesMap.has(gqlType)) {
const relatedEntityTypes = entityTypesMap.get(gqlType);
assert(relatedEntityTypes);
gqlType = relatedEntityTypes.id;
}
return toEntityValue(instanceExports, entityInstance, data, field, gqlType);
}, {});
@@ -1297,6 +1366,76 @@ export class GraphDatabase {
);
}
buildFilter (where: { [key: string]: any } = {}): Where {
return Object.entries(where).reduce((acc: Where, [fieldWithSuffix, value]) => {
if (fieldWithSuffix === FILTER_CHANGE_BLOCK) {
assert(value.number_gte && typeof value.number_gte === 'number');
acc[FILTER_CHANGE_BLOCK] = [{
value: value.number_gte,
not: false
}];
return acc;
}
if (['and', 'or'].includes(fieldWithSuffix)) {
assert(Array.isArray(value));
// Parse all the combinations given in the array
acc[fieldWithSuffix] = value.map(w => {
return this.buildFilter(w);
});
return acc;
}
const [field, ...suffix] = fieldWithSuffix.split('_');
if (!acc[field]) {
acc[field] = [];
}
let op = suffix.shift();
// If op is "" (different from undefined), it means it's a nested filter on a relation field
if (op === '') {
(acc[field] as Filter[]).push({
// Parse nested filter value
value: this.buildFilter(value),
not: false,
operator: 'nested'
});
return acc;
}
const filter: Filter = {
value,
not: false,
operator: 'equals'
};
if (op === 'not') {
filter.not = true;
op = suffix.shift();
}
if (op) {
filter.operator = op as keyof typeof OPERATOR_MAP;
}
// If the filter field ends with "nocase", use the case-insensitive version of the operator
if (suffix[suffix.length - 1] === 'nocase') {
filter.operator = `${op}_nocase` as keyof typeof OPERATOR_MAP;
}
(acc[field] as Filter[]).push(filter);
return acc;
}, {});
}
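The suffix-parsing convention `buildFilter` implements can be sketched standalone (hypothetical, simplified: a key like `name_not_contains_nocase` splits into field, optional `not` negation, operator, and a trailing case-insensitivity marker):

```typescript
// Hypothetical simplified filter shape
type Filter = { value: any; not: boolean; operator: string };

// Parse a suffixed where-key into a field name and a filter object,
// following the same splitting order as buildFilter above
function parseFilterKey (fieldWithSuffix: string, value: any): { field: string; filter: Filter } {
  const [field, ...suffix] = fieldWithSuffix.split('_');
  const filter: Filter = { value, not: false, operator: 'equals' };

  let op = suffix.shift();
  if (op === 'not') {
    filter.not = true;
    op = suffix.shift();
  }
  if (op) {
    filter.operator = op;
  }
  // A trailing "nocase" selects the case-insensitive variant of the operator
  if (suffix[suffix.length - 1] === 'nocase') {
    filter.operator = `${op}_nocase`;
  }

  return { field, filter };
}
```

So `name_not_contains_nocase` yields field `name` with a negated `contains_nocase` filter, while a bare `id` key falls through to the default `equals`.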
_measureCachedPrunedEntities (): void {
const totalEntities = Array.from(this.cachedEntities.latestPrunedEntities.values())
.reduce((acc, idEntitiesMap) => acc + idEntitiesMap.size, 0);
@@ -1304,4 +1443,90 @@ export class GraphDatabase {
log(`Total entities in cachedEntities.latestPrunedEntities map: ${totalEntities}`);
cachePrunedEntitiesCount.set(totalEntities);
}
_defragmentGQLQuerySelections (selections: ReadonlyArray<SelectionNode>, queryInfo: GraphQLResolveInfo): SelectionNode[] {
return selections.reduce((acc: SelectionNode[], selection) => {
if (selection.kind === 'FragmentSpread') {
const fragmentSelections = queryInfo.fragments[selection.name.value].selectionSet.selections;
return [...acc, ...fragmentSelections];
}
return [...acc, selection];
}, []);
}
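The fragment flattening above can be illustrated with hypothetical simplified node shapes (the real method operates on `graphql` AST `SelectionNode`s and `queryInfo.fragments`): a `FragmentSpread` is replaced inline with the fragment's own selections, so downstream relation loading only ever sees field nodes.

```typescript
// Hypothetical simplified selection nodes
type Sel = { kind: 'Field'; name: string } | { kind: 'FragmentSpread'; name: string };
type Fragments = { [name: string]: Sel[] };

// Replace each FragmentSpread with the selections of the named fragment
function defragment (selections: Sel[], fragments: Fragments): Sel[] {
  return selections.reduce((acc: Sel[], sel) => {
    if (sel.kind === 'FragmentSpread') {
      return [...acc, ...fragments[sel.name]];
    }
    return [...acc, sel];
  }, []);
}
```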
_getGQLSelectionFieldArguments (fieldNode: FieldNode, queryInfo: GraphQLResolveInfo): { where: Where, queryOptions: QueryOptions } {
let where: Where = {};
const queryOptions: QueryOptions = {};
fieldNode.arguments?.forEach((arg: ArgumentNode) => {
switch (arg.name.value) {
case 'where':
where = this.buildFilter(this._buildWhereFromGQLArgValue((arg.value as ObjectValueNode), queryInfo));
break;
case 'first': {
const argValue = (arg.value.kind === 'Variable') ? queryInfo.variableValues[arg.value.name.value] : (arg.value as IntValueNode).value;
queryOptions.limit = Number(argValue);
break;
}
case 'skip': {
const argValue = (arg.value.kind === 'Variable') ? queryInfo.variableValues[arg.value.name.value] : (arg.value as IntValueNode).value;
queryOptions.skip = Number(argValue);
break;
}
case 'orderBy': {
const argValue = (arg.value.kind === 'Variable') ? queryInfo.variableValues[arg.value.name.value] : (arg.value as EnumValueNode).value;
queryOptions.orderBy = String(argValue);
break;
}
case 'orderDirection': {
const argValue = (arg.value.kind === 'Variable') ? queryInfo.variableValues[arg.value.name.value] : (arg.value as EnumValueNode).value;
queryOptions.orderDirection = argValue as OrderDirection;
break;
}
default:
throw new Error('Unrecognized query argument');
}
});
queryOptions.limit = queryOptions.limit || DEFAULT_LIMIT;
return { where, queryOptions };
}
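Each `case` above repeats one resolution pattern: an argument value is either a literal AST node or a `Variable` to be looked up in the query's variable values. A sketch with hypothetical simplified node types (the real code additionally coerces the result with `Number`/`String`):

```typescript
// Hypothetical simplified argument value nodes
type ArgValue = { kind: 'IntValue'; value: string } | { kind: 'Variable'; name: string };

// Resolve an argument: variables are looked up, literals are used as-is
function resolveArg (value: ArgValue, variables: { [name: string]: any }): any {
  return value.kind === 'Variable' ? variables[value.name] : value.value;
}
```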
_buildWhereFromGQLArgValue (whereArgValue: ObjectValueNode, queryInfo: GraphQLResolveInfo): { [key: string]: any } {
return whereArgValue.fields.reduce((acc: { [key: string]: any }, fieldNode: ObjectFieldNode) => {
acc[fieldNode.name.value] = this._parseGQLFieldValue(fieldNode.value, queryInfo);
return acc;
}, {});
}
_parseGQLFieldValue (value: ValueNode, queryInfo: GraphQLResolveInfo): any {
switch (value.kind) {
case 'BooleanValue':
case 'EnumValue':
case 'FloatValue':
case 'IntValue':
case 'StringValue':
return value.value;
case 'NullValue':
return null;
case 'Variable':
return queryInfo.variableValues[value.name.value];
case 'ListValue':
return value.values.map((valueNode) => this._parseGQLFieldValue(valueNode, queryInfo));
case 'ObjectValue':
return this._buildWhereFromGQLArgValue(value, queryInfo);
}
}
}


@@ -27,3 +27,5 @@ export * from './payments';
export * from './eth';
export * from './consensus';
export * from './validate-config';
export * from './logger';
export * from './eth-rpc-handlers';


@@ -33,6 +33,7 @@ import { JobQueue } from './job-queue';
import { Where, QueryOptions, BlockHeight } from './database';
import { ServerConfig, UpstreamConfig } from './config';
import { createOrUpdateStateData, StateDataMeta } from './state-helper';
import { ethRpcRequestDuration, setActiveUpstreamEndpointMetric } from './metrics';
const DEFAULT_MAX_EVENTS_BLOCK_RANGE = 1000;
@@ -113,12 +114,16 @@ export class Indexer {
_db: DatabaseInterface;
_ethClient: EthClient;
_getStorageAt: GetStorageAt;
_ethProvider: ethers.providers.BaseProvider;
_ethProvider: ethers.providers.JsonRpcProvider;
_jobQueue: JobQueue;
_watchedContracts: { [key: string]: ContractInterface } = {};
_watchedContractsByAddressMap: { [key: string]: ContractInterface[] } = {};
_stateStatusMap: { [key: string]: StateStatus } = {};
_currentEndpointIndex = {
rpcProviderEndpoint: 0
};
constructor (
config: {
server: ServerConfig;
@@ -126,7 +131,7 @@
},
db: DatabaseInterface,
ethClient: EthClient,
ethProvider: ethers.providers.BaseProvider,
ethProvider: ethers.providers.JsonRpcProvider,
jobQueue: JobQueue
) {
this._serverConfig = config.server;
@@ -136,6 +141,61 @@
this._ethProvider = ethProvider;
this._jobQueue = jobQueue;
this._getStorageAt = this._ethClient.getStorageAt.bind(this._ethClient);
setActiveUpstreamEndpointMetric(
this._upstreamConfig,
this._currentEndpointIndex.rpcProviderEndpoint
);
}
async switchClients (
initClients: (upstreamConfig: UpstreamConfig, endpointIndexes: typeof this._currentEndpointIndex) => Promise<{
ethClient: EthClient,
ethProvider: ethers.providers.JsonRpcProvider
}>
): Promise<{ ethClient: EthClient, ethProvider: ethers.providers.JsonRpcProvider }> {
const oldRpcEndpoint = this._upstreamConfig.ethServer.rpcProviderEndpoints[this._currentEndpointIndex.rpcProviderEndpoint];
++this._currentEndpointIndex.rpcProviderEndpoint;
if (this._currentEndpointIndex.rpcProviderEndpoint === this._upstreamConfig.ethServer.rpcProviderEndpoints.length) {
this._currentEndpointIndex.rpcProviderEndpoint = 0;
}
const { ethClient, ethProvider } = await initClients(this._upstreamConfig, this._currentEndpointIndex);
setActiveUpstreamEndpointMetric(
this._upstreamConfig,
this._currentEndpointIndex.rpcProviderEndpoint
);
const newRpcEndpoint = ethProvider.connection.url;
log(`Switching RPC endpoint from ${oldRpcEndpoint} to endpoint ${newRpcEndpoint}`);
this._ethClient = ethClient;
this._ethProvider = ethProvider;
return { ethClient, ethProvider };
}
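The wrap-around index arithmetic in `switchClients` amounts to a simple round-robin step, sketched here as a pure helper (hypothetical name):

```typescript
// Advance to the next endpoint index, wrapping to zero when the list is exhausted
function nextEndpointIndex (current: number, endpointCount: number): number {
  const next = current + 1;
  return next === endpointCount ? 0 : next;
}
```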
async isGetLogsRequestsSlow (): Promise<boolean> {
const threshold = this._upstreamConfig.ethServer.getLogsClientSwitchThresholdInSecs;
if (threshold) {
const getLogsLabels = {
method: 'eth_getLogs',
provider: this._ethProvider.connection.url
};
const ethRpcRequestDurationMetrics = await ethRpcRequestDuration.get();
const currentProviderDuration = ethRpcRequestDurationMetrics.values.find(
val => val.labels.method === getLogsLabels.method && val.labels.provider === getLogsLabels.provider
);
if (currentProviderDuration) {
return currentProviderDuration.value > threshold;
}
}
return false;
}
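Once the metric value for the active provider has been located, the decision in `isGetLogsRequestsSlow` reduces to a guarded threshold comparison, sketched here with hypothetical names:

```typescript
// Compare the recorded eth_getLogs duration for the active provider against a
// configured threshold (seconds); with no threshold or no samples, never switch
function isGetLogsSlow (avgDurationSecs: number | undefined, thresholdSecs?: number): boolean {
  if (thresholdSecs === undefined || avgDurationSecs === undefined) {
    return false;
  }
  return avgDurationSecs > thresholdSecs;
}
```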
async fetchContracts (): Promise<void> {
@@ -143,8 +203,12 @@
const contracts = await this._db.getContracts();
this._watchedContracts = contracts.reduce((acc: { [key: string]: ContractInterface }, contract) => {
acc[contract.address] = contract;
this._watchedContractsByAddressMap = contracts.reduce((acc: { [key: string]: ContractInterface[] }, contract) => {
if (!acc[contract.address]) {
acc[contract.address] = [];
}
acc[contract.address].push(contract);
return acc;
}, {});
@@ -330,6 +394,10 @@
return blocks;
}
async getBlockByHash (blockHash?: string): Promise<{ block: any }> {
return this._ethClient.getBlockByHash(blockHash);
}
async getBlockProgress (blockHash: string): Promise<BlockProgressInterface | undefined> {
return this._db.getBlockProgress(blockHash);
}
@@ -381,9 +449,9 @@
toBlock: number,
eventSignaturesMap: Map<string, string[]>,
parseEventNameAndArgs: (
kind: string,
watchedContracts: ContractInterface[],
logObj: { topics: string[]; data: string }
) => { eventName: string; eventInfo: {[key: string]: any}; eventSignature: string }
) => { eventParsed: boolean, eventDetails: any }
): Promise<{
blockProgress: BlockProgressInterface,
events: DeepPartial<EventInterface>[],
@@ -490,7 +558,11 @@
}
// Fetch events (to be saved to db) for a particular block
async fetchEvents (blockHash: string, blockNumber: number, eventSignaturesMap: Map<string, string[]>, parseEventNameAndArgs: (kind: string, logObj: any) => any): Promise<{ events: DeepPartial<EventInterface>[], transactions: EthFullTransaction[]}> {
async fetchEvents (
blockHash: string, blockNumber: number,
eventSignaturesMap: Map<string, string[]>,
parseEventNameAndArgs: (watchedContracts: ContractInterface[], logObj: any) => { eventParsed: boolean, eventDetails: any }
): Promise<{ events: DeepPartial<EventInterface>[], transactions: EthFullTransaction[]}> {
const { addresses, topics } = this._createLogsFilters(eventSignaturesMap);
const { logs, transactions } = await this._fetchLogsAndTransactions(blockHash, blockNumber, addresses, topics);
@@ -504,7 +576,12 @@
return { events, transactions };
}
async fetchEventsForContracts (blockHash: string, blockNumber: number, addresses: string[], eventSignaturesMap: Map<string, string[]>, parseEventNameAndArgs: (kind: string, logObj: any) => any): Promise<DeepPartial<EventInterface>[]> {
async fetchEventsForContracts (
blockHash: string, blockNumber: number,
addresses: string[],
eventSignaturesMap: Map<string, string[]>,
parseEventNameAndArgs: (watchedContracts: ContractInterface[], logObj: any) => { eventParsed: boolean, eventDetails: any }
): Promise<DeepPartial<EventInterface>[]> {
const { topics } = this._createLogsFilters(eventSignaturesMap);
const { logs, transactions } = await this._fetchLogsAndTransactions(blockHash, blockNumber, addresses, topics);
@@ -530,11 +607,15 @@
}
async _fetchTxsFromLogs (logs: any[]): Promise<EthFullTransaction[]> {
const txHashes = Array.from([
const txHashList = Array.from([
...new Set<string>(logs.map((log) => log.transaction.hash))
]);
const ethFullTxPromises = txHashes.map(async txHash => {
return this.getFullTransactions(txHashList);
}
async getFullTransactions (txHashList: string[]): Promise<EthFullTransaction[]> {
const ethFullTxPromises = txHashList.map(async txHash => {
return this._ethClient.getFullTransaction(txHash);
});
@@ -542,14 +623,31 @@
}
// Create events to be saved to db for a block given blockHash, logs, transactions and a parser function
createDbEventsFromLogsAndTxs (blockHash: string, logs: any, transactions: any, parseEventNameAndArgs: (kind: string, logObj: any) => any): DeepPartial<EventInterface>[] {
const transactionMap = transactions.reduce((acc: {[key: string]: any}, transaction: {[key: string]: any}) => {
createDbEventsFromLogsAndTxs (
blockHash: string,
logs: any, transactions: any,
parseEventNameAndArgs: (watchedContracts: ContractInterface[], logObj: any) => { eventParsed: boolean, eventDetails: any }
): DeepPartial<EventInterface>[] {
const transactionMap: {[key: string]: any} = transactions.reduce((acc: {[key: string]: any}, transaction: {[key: string]: any}) => {
acc[transaction.txHash] = transaction;
return acc;
}, {});
const dbEvents: Array<DeepPartial<EventInterface>> = [];
// Check if upstream is FEVM and sort logs by tx and log index
if (this._upstreamConfig.ethServer.isFEVM) {
// Sort the logs array first by tx index
// If two logs have the same tx index, sort them by log index
logs = logs.sort((a: any, b: any) => {
if (a.transaction.hash !== b.transaction.hash) {
return transactionMap[a.transaction.hash].index - transactionMap[b.transaction.hash].index;
} else {
return a.index - b.index;
}
});
}
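The two-level ordering above can be sketched standalone with hypothetical simplified log shapes: sort by the containing transaction's index first, then by log index within the transaction.

```typescript
// Hypothetical simplified log shape
type Log = { txHash: string; index: number };

// Order logs by tx index (looked up per hash), then by log index within the tx
function sortLogs (logs: Log[], txIndexByHash: { [hash: string]: number }): Log[] {
  return [...logs].sort((a, b) => {
    if (a.txHash !== b.txHash) {
      return txIndexByHash[a.txHash] - txIndexByHash[b.txHash];
    }
    return a.index - b.index;
  });
}
```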
for (let li = 0; li < logs.length; li++) {
const logObj = logs[li];
const {
@@ -572,23 +670,37 @@
let eventName = UNKNOWN_EVENT_NAME;
let eventInfo = {};
const tx = transactionMap[txHash];
const extraInfo: { [key: string]: any } = { topics, data, tx };
const extraInfo: { [key: string]: any } = { tx, logIndex };
const [topic0, topic1, topic2, topic3] = topics as string[];
const contract = ethers.utils.getAddress(address);
const watchedContract = this.isWatchedContract(contract);
const watchedContracts = this.isContractAddressWatched(contract);
if (watchedContracts) {
const { eventParsed, eventDetails } = parseEventNameAndArgs(watchedContracts, logObj);
if (!eventParsed) {
// Skip unparsable events
log(`WARNING: Skipping event for contract ${contract} as no matching event found in ABI`);
continue;
}
if (watchedContract) {
const eventDetails = parseEventNameAndArgs(watchedContract.kind, logObj);
eventName = eventDetails.eventName;
eventInfo = eventDetails.eventInfo;
extraInfo.eventSignature = eventDetails.eventSignature;
}
dbEvents.push({
index: logIndex,
// Use the loop index in case of FEVM as logIndex is not the actual index of the log in the block
index: this._upstreamConfig.ethServer.isFEVM ? li : logIndex,
txHash,
contract,
eventName,
topic0,
topic1,
topic2,
topic3,
data,
eventInfo: JSONbigNative.stringify(eventInfo),
extraInfo: JSONbigNative.stringify(extraInfo),
proof: JSONbigNative.stringify({
@@ -709,8 +821,8 @@
}
}
async getAncestorAtDepth (blockHash: string, depth: number): Promise<string> {
return this._db.getAncestorAtDepth(blockHash, depth);
async getAncestorAtHeight (blockHash: string, height: number): Promise<string> {
return this._db.getAncestorAtHeight(blockHash, height);
}
async saveEventEntity (dbEvent: EventInterface): Promise<EventInterface> {
@@ -760,19 +872,22 @@
return this._db.getEventsInRange(fromBlockNumber, toBlockNumber);
}
isWatchedContract (address : string): ContractInterface | undefined {
return this._watchedContracts[address];
isContractAddressWatched (address : string): ContractInterface[] | undefined {
return this._watchedContractsByAddressMap[address];
}
getContractsByKind (kind: string): ContractInterface[] {
const watchedContracts = Object.values(this._watchedContracts)
.filter(contract => contract.kind === kind);
const watchedContracts = Object.values(this._watchedContractsByAddressMap)
.reduce(
(acc, contracts) => acc.concat(contracts.filter(contract => contract.kind === kind)),
[]
);
return watchedContracts;
}
getWatchedContracts (): ContractInterface[] {
return Object.values(this._watchedContracts);
return Object.values(this._watchedContractsByAddressMap).flat();
}
async watchContract (address: string, kind: string, checkpoint: boolean, startingBlock: number, context?: any): Promise<void> {
@@ -805,8 +920,36 @@
}
}
async removeContract (address: string, kind: string): Promise<void> {
const dbTx = await this._db.createTransactionRunner();
try {
await this._db.deleteEntitiesByConditions(dbTx, 'contract', { kind, address });
this._clearWatchedContracts(
watchedContract => watchedContract.kind === kind && watchedContract.address === address
);
} catch (error) {
await dbTx.rollbackTransaction();
throw error;
} finally {
await dbTx.release();
}
}
cacheContract (contract: ContractInterface): void {
this._watchedContracts[contract.address] = contract;
if (!this._watchedContractsByAddressMap[contract.address]) {
this._watchedContractsByAddressMap[contract.address] = [];
}
// Check if contract with kind is already cached and skip
const isAlreadyCached = this._watchedContractsByAddressMap[contract.address]
.some(watchedContract => contract.id === watchedContract.id);
if (isAlreadyCached) {
return;
}
this._watchedContractsByAddressMap[contract.address].push(contract);
}
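The dedup-on-cache behavior of `cacheContract`, as a standalone sketch with a hypothetical `Contract` shape: an entry is appended to the per-address list only if no cached entry shares its id.

```typescript
// Hypothetical simplified contract shape
type Contract = { id: number; address: string; kind: string };

// Append a contract to the per-address list, skipping duplicates by id
function cacheContractEntry (map: { [address: string]: Contract[] }, contract: Contract): void {
  if (!map[contract.address]) {
    map[contract.address] = [];
  }
  // Skip if an entry with the same id is already cached for this address
  if (map[contract.address].some(cached => cached.id === contract.id)) {
    return;
  }
  map[contract.address].push(contract);
}
```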
async getStorageValue (storageLayout: StorageLayout, blockHash: string, token: string, variable: string, ...mappingKeys: any[]): Promise<ValueResult> {
@@ -844,7 +987,7 @@
}
// Get all the contracts.
const contracts = Object.values(this._watchedContracts);
const [contracts] = Object.values(this._watchedContractsByAddressMap);
// Getting the block for checkpoint.
const block = await this.getBlockProgress(blockHash);
@@ -898,10 +1041,11 @@
}
// Get the contract.
const contract = this._watchedContracts[contractAddress];
assert(contract, `Contract ${contractAddress} not watched`);
const watchedContracts = this._watchedContractsByAddressMap[contractAddress];
assert(watchedContracts, `Contract ${contractAddress} not watched`);
const [firstWatchedContract] = watchedContracts.sort((a, b) => a.startingBlock - b.startingBlock);
if (block.blockNumber < contract.startingBlock) {
if (block.blockNumber < firstWatchedContract.startingBlock) {
return;
}
@@ -920,10 +1064,13 @@
}
// Get all the contracts.
const contracts = Object.values(this._watchedContracts);
const watchedContractsByAddress = Object.values(this._watchedContractsByAddressMap);
// Create an initial state for each contract.
for (const contract of contracts) {
for (const watchedContracts of watchedContractsByAddress) {
// Get the first watched contract
const [contract] = watchedContracts.sort((a, b) => a.startingBlock - b.startingBlock);
// Check if contract has checkpointing on.
if (contract.checkpoint) {
// Check if starting block not reached yet.
@@ -968,8 +1115,9 @@
assert(block);
// Get the contract.
const contract = this._watchedContracts[contractAddress];
assert(contract, `Contract ${contractAddress} not watched`);
const watchedContracts = this._watchedContractsByAddressMap[contractAddress];
assert(watchedContracts, `Contract ${contractAddress} not watched`);
const [contract] = watchedContracts.sort((a, b) => a.startingBlock - b.startingBlock);
if (block.blockNumber < contract.startingBlock) {
return;
@@ -1004,8 +1152,9 @@
}
// Get the contract.
const contract = this._watchedContracts[contractAddress];
assert(contract, `Contract ${contractAddress} not watched`);
const watchedContracts = this._watchedContractsByAddressMap[contractAddress];
assert(watchedContracts, `Contract ${contractAddress} not watched`);
const [contract] = watchedContracts.sort((a, b) => a.startingBlock - b.startingBlock);
if (block.blockNumber < contract.startingBlock) {
return;
@@ -1042,8 +1191,9 @@
}
// Get the contract.
const contract = this._watchedContracts[contractAddress];
assert(contract, `Contract ${contractAddress} not watched`);
const watchedContracts = this._watchedContractsByAddressMap[contractAddress];
assert(watchedContracts, `Contract ${contractAddress} not watched`);
const [contract] = watchedContracts.sort((a, b) => a.startingBlock - b.startingBlock);
if (currentBlock.blockNumber < contract.startingBlock) {
return;
@@ -1245,16 +1395,16 @@
return;
}
const contracts = Object.values(this._watchedContracts);
const contractAddresses = Object.keys(this._watchedContractsByAddressMap);
// TODO: Fire a single query for all contracts.
for (const contract of contracts) {
const initState = await this._db.getLatestState(contract.address, StateKind.Init);
const diffState = await this._db.getLatestState(contract.address, StateKind.Diff);
const diffStagedState = await this._db.getLatestState(contract.address, StateKind.DiffStaged);
const checkpointState = await this._db.getLatestState(contract.address, StateKind.Checkpoint);
for (const contractAddress of contractAddresses) {
const initState = await this._db.getLatestState(contractAddress, StateKind.Init);
const diffState = await this._db.getLatestState(contractAddress, StateKind.Diff);
const diffStagedState = await this._db.getLatestState(contractAddress, StateKind.DiffStaged);
const checkpointState = await this._db.getLatestState(contractAddress, StateKind.Checkpoint);
this._stateStatusMap[contract.address] = {
this._stateStatusMap[contractAddress] = {
init: initState?.block.blockNumber,
diff: diffState?.block.blockNumber,
diff_staged: diffStagedState?.block.blockNumber,
@@ -1276,7 +1426,7 @@
}
await this._db.deleteEntitiesByConditions(dbTx, 'contract', { startingBlock: MoreThan(blockNumber) });
this._clearWatchedContracts((watchedContracts) => watchedContracts.startingBlock > blockNumber);
this._clearWatchedContracts((watchedContract) => watchedContract.startingBlock > blockNumber);
await this._db.deleteEntitiesByConditions(dbTx, 'block_progress', { blockNumber: MoreThan(blockNumber) });
@@ -1318,7 +1468,7 @@
}
}
async clearProcessedBlockData (block: BlockProgressInterface, entities: EntityTarget<{ blockNumber: number }>[]): Promise<void> {
async clearProcessedBlockData (block: BlockProgressInterface, entities: EntityTarget<{ blockHash: string }>[]): Promise<void> {
const dbTx = await this._db.createTransactionRunner();
try {
@@ -1338,11 +1488,15 @@
}
}
_clearWatchedContracts (removFilter: (watchedContract: ContractInterface) => boolean): void {
this._watchedContracts = Object.values(this._watchedContracts)
.filter(watchedContract => !removFilter(watchedContract))
.reduce((acc: {[key: string]: ContractInterface}, watchedContract) => {
acc[watchedContract.address] = watchedContract;
_clearWatchedContracts (removeFilter: (watchedContract: ContractInterface) => boolean): void {
this._watchedContractsByAddressMap = Object.entries(this._watchedContractsByAddressMap)
.map(([address, watchedContracts]): [string, ContractInterface[]] => [
address,
watchedContracts.filter(watchedContract => !removeFilter(watchedContract))
])
.filter(([, watchedContracts]) => watchedContracts.length)
.reduce((acc: {[key: string]: ContractInterface[]}, [address, watchedContracts]) => {
acc[address] = watchedContracts;
return acc;
}, {});
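The rewritten `_clearWatchedContracts` pipeline can be sketched standalone (hypothetical shapes): filter each address's list with the predicate and drop addresses whose lists become empty.

```typescript
// Hypothetical simplified contract shape
type Contract = { address: string; startingBlock: number };

// Remove contracts matching the predicate; addresses left empty are dropped
function clearWatched (
  map: { [address: string]: Contract[] },
  removeFilter: (c: Contract) => boolean
): { [address: string]: Contract[] } {
  return Object.entries(map)
    .map(([address, contracts]): [string, Contract[]] => [
      address,
      contracts.filter(contract => !removeFilter(contract))
    ])
    .filter(([, contracts]) => contracts.length)
    .reduce((acc: { [address: string]: Contract[] }, [address, contracts]) => {
      acc[address] = contracts;
      return acc;
    }, {});
}
```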
@@ -1365,7 +1519,7 @@
});
}
if (this._upstreamConfig.ethServer.filterLogsByTopics && !this._upstreamConfig.ethServer.isFEVM) {
if (this._upstreamConfig.ethServer.filterLogsByTopics) {
const eventSignaturesSet = new Set<string>();
eventSignaturesMap.forEach(sigs => sigs.forEach(sig => {
eventSignaturesSet.add(sig);

View File

@@ -4,9 +4,9 @@
import assert from 'assert';
import debug from 'debug';
import PgBoss from 'pg-boss';
import PgBoss, { MonitorStates } from 'pg-boss';
import { jobCount, lastJobCompletedOn } from './metrics';
import { lastJobCompletedOn } from './metrics';
import { wait } from './misc';
interface Config {
@@ -48,35 +48,10 @@ export class JobQueue {
deleteAfterHours: 1, // 1 hour
newJobCheckInterval: 100,
// Time interval for firing monitor-states event.
monitorStateIntervalSeconds: 10
newJobCheckInterval: 100
});
this._boss.on('error', error => log(error));
this._boss.on('monitor-states', monitorStates => {
jobCount.set({ state: 'all' }, monitorStates.all);
jobCount.set({ state: 'created' }, monitorStates.created);
jobCount.set({ state: 'retry' }, monitorStates.retry);
jobCount.set({ state: 'active' }, monitorStates.active);
jobCount.set({ state: 'completed' }, monitorStates.completed);
jobCount.set({ state: 'expired' }, monitorStates.expired);
jobCount.set({ state: 'cancelled' }, monitorStates.cancelled);
jobCount.set({ state: 'failed' }, monitorStates.failed);
Object.entries(monitorStates.queues).forEach(([name, counts]) => {
jobCount.set({ state: 'all', name }, counts.all);
jobCount.set({ state: 'created', name }, counts.created);
jobCount.set({ state: 'retry', name }, counts.retry);
jobCount.set({ state: 'active', name }, counts.active);
jobCount.set({ state: 'completed', name }, counts.completed);
jobCount.set({ state: 'expired', name }, counts.expired);
jobCount.set({ state: 'cancelled', name }, counts.cancelled);
jobCount.set({ state: 'failed', name }, counts.failed);
});
});
}
get maxCompletionLag (): number {
@@ -178,4 +153,10 @@ export class JobQueue {
await wait(EMPTY_QUEUE_CHECK_INTERVAL);
}
}
async getJobCounts (): Promise<MonitorStates> {
// Use any as the countStates() method is not present in the type definitions
const monitorStates = await (this._boss as any).countStates();
return monitorStates;
}
}
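The new `getJobCounts` above goes through `any` because pg-boss exposes `countStates()` at runtime but omits it from the package typings; keeping the cast at a single documented call site preserves a typed public API. A sketch of the pattern, with `MonitorStates` reduced to a few representative fields and the pg-boss instance stubbed:

```typescript
// Sketch of wrapping an untyped third-party method behind a typed API,
// mirroring getJobCounts(). MonitorStates is reduced to the fields used here.
interface MonitorStates { all: number; created: number; active: number; failed: number; }

class JobQueueStats {
  constructor (private _boss: unknown) {}

  async getJobCounts (): Promise<MonitorStates> {
    // countStates() exists at runtime but is missing from the typings,
    // so the cast to any is confined to this one call site
    const monitorStates = await (this._boss as any).countStates();
    return monitorStates;
  }
}
```

Callers then consume a fully typed result, and a future pg-boss release that adds the typing only requires deleting the cast here.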

View File

@@ -4,7 +4,7 @@
import assert from 'assert';
import debug from 'debug';
import { constants, ethers } from 'ethers';
import { constants, ethers, errors as ethersErrors } from 'ethers';
import { DeepPartial, In } from 'typeorm';
import PgBoss from 'pg-boss';
@@ -29,9 +29,10 @@ import {
processBatchEvents,
PrefetchedBlock,
fetchBlocksAtHeight,
fetchAndSaveFilteredLogsAndBlocks
fetchAndSaveFilteredLogsAndBlocks,
NEW_BLOCK_MAX_RETRIES_ERROR
} from './common';
import { lastBlockNumEvents, lastBlockProcessDuration, lastProcessedBlockNumber } from './metrics';
import { isSyncingHistoricalBlocks, lastBlockNumEvents, lastBlockProcessDuration, lastProcessedBlockNumber } from './metrics';
const log = debug('vulcanize:job-runner');
@@ -63,7 +64,11 @@ export class JobRunner {
_signalCount = 0;
_errorInEventsProcessing = false;
constructor (jobQueueConfig: JobQueueConfig, indexer: IndexerInterface, jobQueue: JobQueue) {
constructor (
jobQueueConfig: JobQueueConfig,
indexer: IndexerInterface,
jobQueue: JobQueue
) {
this._indexer = indexer;
this.jobQueue = jobQueue;
this._jobQueueConfig = jobQueueConfig;
@@ -72,14 +77,28 @@
async subscribeBlockProcessingQueue (): Promise<void> {
await this.jobQueue.subscribe(
QUEUE_BLOCK_PROCESSING,
async (job) => this.processBlock(job)
async (job) => {
try {
await this.processBlock(job);
} catch (error) {
this._jobErrorHandler(error as Error);
throw error;
}
}
);
}
async subscribeHistoricalProcessingQueue (): Promise<void> {
await this.jobQueue.subscribe(
QUEUE_HISTORICAL_PROCESSING,
async (job) => this.processHistoricalBlocks(job),
async (job) => {
try {
await this.processHistoricalBlocks(job);
} catch (error) {
this._jobErrorHandler(error as Error);
throw error;
}
},
{
teamSize: 1
}
@@ -89,7 +108,14 @@
async subscribeEventProcessingQueue (): Promise<void> {
await this.jobQueue.subscribe(
QUEUE_EVENT_PROCESSING,
async (job) => this.processEvent(job as PgBoss.JobWithMetadataDoneCallback<EventsJobData | ContractJobData, object>),
async (job) => {
try {
await this.processEvent(job as PgBoss.JobWithMetadataDoneCallback<EventsJobData | ContractJobData, object>);
} catch (error) {
this._jobErrorHandler(error as Error);
throw error;
}
},
{
teamSize: 1,
includeMetadata: true
@@ -100,14 +126,28 @@
async subscribeHooksQueue (): Promise<void> {
await this.jobQueue.subscribe(
QUEUE_HOOKS,
async (job) => this.processHooks(job)
async (job) => {
try {
await this.processHooks(job);
} catch (error) {
this._jobErrorHandler(error as Error);
throw error;
}
}
);
}
async subscribeBlockCheckpointQueue (): Promise<void> {
await this.jobQueue.subscribe(
QUEUE_BLOCK_CHECKPOINT,
async (job) => this.processCheckpoint(job)
async (job) => {
try {
this.processCheckpoint(job);
} catch (error) {
this._jobErrorHandler(error as Error);
throw error;
}
}
);
}
@@ -116,6 +156,9 @@
switch (kind) {
case JOB_KIND_INDEX: {
// Update metrics
isSyncingHistoricalBlocks.set(0);
const { data: { cid, blockHash, blockNumber, parentHash, timestamp } } = job;
// Check if blockHash present in job.
@@ -140,6 +183,11 @@
await Promise.all(indexBlockPromises);
}
// Switch clients if getLogs requests are too slow
if (await this._indexer.isGetLogsRequestsSlow()) {
await this._indexer.switchClients();
}
break;
}
@@ -166,6 +214,9 @@
}
async processHistoricalBlocks (job: PgBoss.JobWithDoneCallback<HistoricalJobData, HistoricalJobResponseData>): Promise<void> {
// Update metrics
isSyncingHistoricalBlocks.set(1);
const { data: { blockNumber: startBlock, processingEndBlockNumber } } = job;
if (this._historicalProcessingCompletedUpto) {
@@ -242,6 +293,11 @@
this._historicalProcessingCompletedUpto = endBlock;
// Switch clients if getLogs requests are too slow
if (await this._indexer.isGetLogsRequestsSlow()) {
await this._indexer.switchClients();
}
if (endBlock < processingEndBlockNumber) {
// If endBlock is less than processingEndBlockNumber, push a new historical job
await this.jobQueue.pushJob(
@@ -439,10 +495,18 @@
// We have more than one node at this height, so prune all nodes not reachable from indexed block at max reorg depth from prune height.
// This will lead to orphaned nodes, which will get pruned at the next height.
if (blocksAtHeight.length > 1) {
const [indexedBlock] = await this._indexer.getBlocksAtHeight(pruneBlockHeight + MAX_REORG_DEPTH, false);
let indexedBlock: BlockProgressInterface | undefined;
let indexedBlockHeight = pruneBlockHeight + MAX_REORG_DEPTH;
// Loop to find the latest indexed block in case a null block is encountered
while (!indexedBlock) {
[indexedBlock] = await this._indexer.getBlocksAtHeight(indexedBlockHeight, false);
--indexedBlockHeight;
assert(indexedBlockHeight > pruneBlockHeight, `No blocks found above pruneBlockHeight ${pruneBlockHeight}`);
}
// Get ancestor blockHash from indexed block at prune height.
const ancestorBlockHash = await this._indexer.getAncestorAtDepth(indexedBlock.blockHash, MAX_REORG_DEPTH);
const ancestorBlockHash = await this._indexer.getAncestorAtHeight(indexedBlock.blockHash, pruneBlockHeight);
newCanonicalBlockHash = ancestorBlockHash;
const blocksToBePruned = blocksAtHeight.filter(block => ancestorBlockHash !== block.blockHash);
@@ -573,14 +637,12 @@
// Do not throw error and complete the job as block will be processed after parent block processing.
return;
} else {
// Remove the unknown events of the parent block if it is marked complete.
console.time('time:job-runner#_indexBlock-remove-unknown-events');
await this._indexer.removeUnknownEvents(parentBlock);
console.timeEnd('time:job-runner#_indexBlock-remove-unknown-events');
}
}
const data = this._blockAndEventsMap.get(blockHash);
assert(data);
if (!blockProgress) {
// Delay required to process block.
const { jobDelayInMilliSecs = 0 } = this._jobQueueConfig;
@@ -592,9 +654,24 @@
[blockProgress, , ethFullTransactions] = await this._indexer.saveBlockAndFetchEvents({ cid, blockHash, blockNumber, parentHash, blockTimestamp });
log(`_indexBlock#saveBlockAndFetchEvents: fetched for block: ${blockProgress.blockHash} num events: ${blockProgress.numEvents}`);
console.timeEnd('time:job-runner#_indexBlock-saveBlockAndFetchEvents');
const data = this._blockAndEventsMap.get(blockHash);
assert(data);
this._blockAndEventsMap.set(
blockHash,
{
...data,
block: blockProgress,
ethFullTransactions
});
} else {
const events = await this._indexer.getBlockEvents(blockHash, {}, {});
const txHashList = Array.from([
...new Set<string>(events.map((event) => event.txHash))
]);
const ethFullTransactions = await this._indexer.getFullTransactions(txHashList);
// const ethFullTransactions =
this._blockAndEventsMap.set(
blockHash,
{
@@ -621,6 +698,7 @@
const prefetchedBlock = this._blockAndEventsMap.get(blockHash);
assert(prefetchedBlock);
const { block, ethFullBlock, ethFullTransactions } = prefetchedBlock;
assert(block, 'BlockProgress not set in blockAndEvents map');
try {
log(`Processing events for block ${block.blockNumber}`);
@@ -640,10 +718,6 @@
);
console.timeEnd(`time:job-runner#_processEvents-events-${block.blockNumber}`);
// Update metrics
lastProcessedBlockNumber.set(block.blockNumber);
lastBlockNumEvents.set(block.numEvents);
this._blockAndEventsMap.delete(block.blockHash);
// Check if new contract was added and filterLogsByAddresses is set to true
@@ -671,12 +745,23 @@
await this.jobQueue.deleteJobs(QUEUE_EVENT_PROCESSING);
}
// Update metrics
if (this._endBlockProcessTimer) {
this._endBlockProcessTimer();
}
lastProcessedBlockNumber.set(block.blockNumber);
lastBlockNumEvents.set(block.numEvents);
this._endBlockProcessTimer = lastBlockProcessDuration.startTimer();
await this._indexer.updateSyncStatusProcessedBlock(block.blockHash, block.blockNumber);
console.time('time:job-runner#_processEvents-update-status-and-remove-unknown-events');
await Promise.all([
// Update latest processed block in SyncStatus
this._indexer.updateSyncStatusProcessedBlock(block.blockHash, block.blockNumber),
// Remove the unknown events from processed block
this._indexer.removeUnknownEvents(block)
]);
console.timeEnd('time:job-runner#_processEvents-update-status-and-remove-unknown-events');
if (retryCount > 0) {
await Promise.all([
@@ -710,6 +795,22 @@
}
}
async _jobErrorHandler (error: any): Promise<void> {
if (
// Switch client if it is a server error from ethers.js
// https://docs.ethers.org/v5/api/utils/logger/#errors--server-error
error.code === ethersErrors.SERVER_ERROR ||
// Switch client if it is a timeout error from ethers.js
// https://docs.ethers.org/v5/api/utils/logger/#errors--timeout
error.code === ethersErrors.TIMEOUT ||
// Switch client if error is for max retries to get new block at head
error.message === NEW_BLOCK_MAX_RETRIES_ERROR
) {
log('RPC endpoint is not working; failing over to another one');
await this._indexer.switchClients();
}
}
_updateWatchedContracts (job: any): void {
const { data: { contract } } = job;
this._indexer.cacheContract(contract);
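The failover condition in `_jobErrorHandler` above can be isolated as a pure predicate. In ethers v5 the `errors` codes are plain strings (`errors.SERVER_ERROR === 'SERVER_ERROR'`, `errors.TIMEOUT === 'TIMEOUT'`), which the sketch below mirrors directly; the exact text of `NEW_BLOCK_MAX_RETRIES_ERROR` (exported from `common` in the diff) is an assumption here:

```typescript
// Pure sketch of the _jobErrorHandler failover check. The string constants
// mirror ethers v5 error codes; NEW_BLOCK_MAX_RETRIES_ERROR stands in for the
// constant exported from common (its exact wording is assumed).
const SERVER_ERROR = 'SERVER_ERROR';
const TIMEOUT = 'TIMEOUT';
const NEW_BLOCK_MAX_RETRIES_ERROR = 'Max retries to fetch new block';

function shouldSwitchClients (error: { code?: string; message?: string }): boolean {
  return (
    // Server error from ethers.js
    error.code === SERVER_ERROR ||
    // Timeout error from ethers.js
    error.code === TIMEOUT ||
    // Max retries reached while waiting for a new block at head
    error.message === NEW_BLOCK_MAX_RETRIES_ERROR
  );
}
```

Factoring the decision out of the handler keeps the side effect (`switchClients()`) in one place while the condition itself stays trivially testable.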

View File

@@ -0,0 +1,18 @@
import winston from 'winston';
import path from 'path';
export const createGQLLogger = (logsDir = ''): winston.Logger => {
return winston.createLogger({
level: 'info',
format: winston.format.combine(
winston.format.timestamp(),
winston.format.json()
),
transports: [
// Write all logs with importance level of `error` or less to `watcher-gql-error.log`
new winston.transports.File({ filename: path.resolve(logsDir, 'watcher-gql-error.log'), level: 'error' }),
// Write all logs with importance level of `info` or less to `watcher-gql.log`
new winston.transports.File({ filename: path.resolve(logsDir, 'watcher-gql.log') })
]
});
};

View File

@@ -7,20 +7,18 @@ import express, { Application } from 'express';
import { createConnection } from 'typeorm';
import debug from 'debug';
import assert from 'assert';
import { ethers } from 'ethers';
import JsonRpcProvider = ethers.providers.JsonRpcProvider;
import { Config } from './config';
import { Config, UpstreamConfig } from './config';
import { IndexerInterface } from './types';
import { JobQueue } from './job-queue';
const DB_SIZE_QUERY = 'SELECT pg_database_size(current_database())';
const log = debug('vulcanize:metrics');
// Create custom metrics
export const jobCount = new client.Gauge({
name: 'pgboss_jobs_total',
help: 'Total entries in job table',
labelNames: ['state', 'name'] as const
});
export const lastJobCompletedOn = new client.Gauge({
name: 'pgboss_last_job_completed_timestamp_seconds',
@@ -78,10 +76,40 @@ export const eventProcessingEthCallDuration = new client.Histogram({
help: 'Duration of eth_calls made in event processing'
});
export const isSyncingHistoricalBlocks = new client.Gauge({
name: 'is_syncing_historical_blocks',
help: 'Whether the watcher is syncing in historical mode'
});
isSyncingHistoricalBlocks.set(Number(undefined));
export const ethRpcCount = new client.Counter({
name: 'watcher_eth_rpc_total',
help: 'Total number of ETH RPC requests',
labelNames: ['method', 'provider']
});
export const ethRpcErrors = new client.Counter({
name: 'watcher_eth_rpc_errors',
help: 'Number of ETH RPC request errors',
labelNames: ['method', 'provider']
});
export const ethRpcRequestDuration = new client.Gauge({
name: 'watcher_eth_rpc_request_duration',
help: 'ETH RPC request duration (in seconds)',
labelNames: ['method', 'provider']
});
const upstreamEndpointsMetric = new client.Gauge({
name: 'watcher_config_upstream_endpoints',
help: 'Configured upstream ETH RPC endpoints',
labelNames: ['provider']
});
// Export metrics on a server
const app: Application = express();
export const startMetricsServer = async (config: Config, indexer: IndexerInterface): Promise<void> => {
export const startMetricsServer = async (config: Config, jobQueue: JobQueue, indexer: IndexerInterface): Promise<void> => {
if (!config.metrics) {
log('Metrics is disabled. To enable add metrics host and port.');
return;
@@ -103,12 +131,21 @@ export const startMetricsServer = async (config: Config, indexer: IndexerInterfa
this.set({ kind: 'latest_canonical' }, syncStatus.latestCanonicalBlockNumber);
this.set({ kind: 'chain_head' }, syncStatus.chainHeadBlockNumber);
this.set({ kind: 'intial_indexed' }, syncStatus.initialIndexedBlockNumber);
this.set({ kind: 'latest_processed' }, syncStatus.latestProcessedBlockNumber);
}
}
});
await registerJobQueueMetrics(jobQueue);
await registerWatcherConfigMetrics(config);
await registerDBSizeMetrics(config);
await registerUpstreamChainHeadMetrics();
await registerWatcherInfoMetrics();
// Collect default metrics
client.collectDefaultMetrics();
@@ -123,6 +160,19 @@ export const startMetricsServer = async (config: Config, indexer: IndexerInterfa
});
};
// ETH RPC provider used for upstream chain head metrics
let ethRpcProvider: JsonRpcProvider | undefined;
export const setActiveUpstreamEndpointMetric = (upstreamConfig: UpstreamConfig, currentEndpointIndex: number): void => {
const endpoints = upstreamConfig.ethServer.rpcProviderEndpoints;
endpoints.forEach((endpoint, index) => {
upstreamEndpointsMetric.set({ provider: endpoint }, Number(index === currentEndpointIndex));
});
ethRpcProvider = new JsonRpcProvider(upstreamConfig.ethServer.rpcProviderEndpoints[currentEndpointIndex]);
};
const registerDBSizeMetrics = async ({ database, jobQueue }: Config): Promise<void> => {
const [watcherConn, jobQueueConn] = await Promise.all([
createConnection({
@@ -157,3 +207,97 @@ const registerDBSizeMetrics = async ({ database, jobQueue }: Config): Promise<vo
}
});
};
const registerUpstreamChainHeadMetrics = async (): Promise<void> => {
// eslint-disable-next-line no-new
new client.Gauge({
name: 'latest_upstream_block_number',
help: 'Latest upstream block number',
async collect () {
try {
assert(ethRpcProvider, 'ethRpcProvider is not set');
const blockNumber = await ethRpcProvider.getBlockNumber();
this.set(blockNumber);
} catch (err) {
log('Error fetching latest block number', err);
}
}
});
};
const registerWatcherConfigMetrics = async ({ server, upstream, jobQueue }: Config): Promise<void> => {
const watcherConfigMetric = new client.Gauge({
name: 'watcher_config_info',
help: 'Watcher configuration info (static)',
labelNames: ['category', 'field']
});
watcherConfigMetric.set({ category: 'server', field: 'is_active' }, Number(server.kind === 'active'));
watcherConfigMetric.set({ category: 'server', field: 'is_subgraph_watcher' }, Number(server.subgraphPath?.length > 0));
watcherConfigMetric.set({ category: 'server', field: 'max_events_block_range' }, Number(server.gql.maxEventsBlockRange));
watcherConfigMetric.set({ category: 'server', field: 'clear_entities_cache_interval' }, Number(server.clearEntitiesCacheInterval));
watcherConfigMetric.set({ category: 'server', field: 'max_simultaneous_requests' }, Number(server.gql.maxSimultaneousRequests));
watcherConfigMetric.set({ category: 'server', field: 'max_request_queue_limit' }, Number(server.gql.maxRequestQueueLimit));
watcherConfigMetric.set({ category: 'upstream', field: 'is_using_rpc_client' }, Number(upstream.ethServer.rpcClient));
watcherConfigMetric.set({ category: 'upstream', field: 'is_fevm' }, Number(upstream.ethServer.isFEVM));
watcherConfigMetric.set({ category: 'server', field: 'rpc_supports_block_hash' }, Number(server.rpcSupportsBlockHashParam));
watcherConfigMetric.set({ category: 'upstream', field: 'filter_logs_by_addresses' }, Number(upstream.ethServer.filterLogsByAddresses));
watcherConfigMetric.set({ category: 'upstream', field: 'filter_logs_by_topics' }, Number(upstream.ethServer.filterLogsByTopics));
watcherConfigMetric.set({ category: 'jobqueue', field: 'num_events_in_batch' }, Number(jobQueue.eventsInBatch));
watcherConfigMetric.set({ category: 'jobqueue', field: 'block_delay_seconds' }, (Number(jobQueue.blockDelayInMilliSecs) || 0) / 1000);
watcherConfigMetric.set({ category: 'jobqueue', field: 'block_processing_offset' }, Number(jobQueue.blockProcessingOffset) ?? 0);
watcherConfigMetric.set({ category: 'jobqueue', field: 'use_block_ranges' }, Number(jobQueue.useBlockRanges));
watcherConfigMetric.set({ category: 'jobqueue', field: 'historical_logs_block_range' }, Number(jobQueue.historicalLogsBlockRange));
watcherConfigMetric.set({ category: 'jobqueue', field: 'historical_max_fetch_ahead' }, Number(jobQueue.historicalMaxFetchAhead));
};
const registerJobQueueMetrics = async (jobQueue: JobQueue): Promise<void> => {
// eslint-disable-next-line no-new
new client.Gauge({
name: 'pgboss_jobs_total',
help: 'Total entries in job table',
labelNames: ['state', 'name'] as const,
async collect () {
const jobCounts = await jobQueue.getJobCounts();
this.set({ state: 'all' }, jobCounts.all);
this.set({ state: 'created' }, jobCounts.created);
this.set({ state: 'retry' }, jobCounts.retry);
this.set({ state: 'active' }, jobCounts.active);
this.set({ state: 'completed' }, jobCounts.completed);
this.set({ state: 'expired' }, jobCounts.expired);
this.set({ state: 'cancelled' }, jobCounts.cancelled);
this.set({ state: 'failed' }, jobCounts.failed);
Object.entries(jobCounts.queues as Array<any>).forEach(([name, counts]) => {
this.set({ state: 'all', name }, counts.all);
this.set({ state: 'created', name }, counts.created);
this.set({ state: 'retry', name }, counts.retry);
this.set({ state: 'active', name }, counts.active);
this.set({ state: 'completed', name }, counts.completed);
this.set({ state: 'expired', name }, counts.expired);
this.set({ state: 'cancelled', name }, counts.cancelled);
this.set({ state: 'failed', name }, counts.failed);
});
}
});
};
const registerWatcherInfoMetrics = async (): Promise<void> => {
const { readPackage } = await import('read-pkg');
const pkgJson = await readPackage();
const watcherInfoMetric = new client.Gauge({
name: 'watcher_info',
help: 'Watcher info (static)',
labelNames: ['repository', 'version', 'commitHash']
});
watcherInfoMetric.set({
repository: pkgJson.repository && pkgJson.repository.url.replace(/^git\+/, ''),
version: pkgJson.version,
commitHash: pkgJson.commitHash
}, 1);
};
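`setActiveUpstreamEndpointMetric` above emits one gauge sample per configured endpoint: 1 for the endpoint currently in use, 0 for the rest, so a dashboard can show which upstream is live. The label/value computation, separated from prom-client as a pure function:

```typescript
// Pure sketch of the sample computation in setActiveUpstreamEndpointMetric,
// with prom-client factored out so the logic is testable in isolation.
function activeEndpointSamples (
  endpoints: string[],
  currentEndpointIndex: number
): { [endpoint: string]: number } {
  return endpoints.reduce((acc: { [endpoint: string]: number }, endpoint, index) => {
    // 1 for the endpoint currently in use, 0 for every other configured endpoint
    acc[endpoint] = Number(index === currentEndpointIndex);
    return acc;
  }, {});
}
```

Emitting 0 for inactive endpoints (rather than omitting them) keeps every configured endpoint visible in the metric's label set even when it is not selected.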

View File

@@ -6,7 +6,7 @@ import assert from 'assert';
import { ValueTransformer } from 'typeorm';
import yargs from 'yargs';
import { hideBin } from 'yargs/helpers';
import { utils, providers } from 'ethers';
import { utils, providers, errors as ethersErrors } from 'ethers';
import JSONbig from 'json-bigint';
import Decimal from 'decimal.js';
import ApolloBigInt from 'apollo-type-bigint';
@@ -22,9 +22,13 @@ import { ResultEvent } from './indexer';
import { EventInterface, EthFullBlock, EthFullTransaction } from './types';
import { BlockHeight } from './database';
import { Transaction } from './graph/utils';
import { ethRpcCount, ethRpcErrors, ethRpcRequestDuration } from './metrics';
const JSONbigNative = JSONbig({ useNativeBigInt: true });
export const FUTURE_BLOCK_ERROR = "requested a future epoch (beyond 'latest')";
export const NULL_BLOCK_ERROR = 'requested epoch was a null round';
/**
* Method to wait for specified time.
* @param time Time to wait in milliseconds
@@ -154,7 +158,7 @@ export const getResetYargs = (): yargs.Argv => {
};
export const getCustomProvider = (url?: utils.ConnectionInfo | string, network?: providers.Networkish): providers.JsonRpcProvider => {
const provider = new providers.StaticJsonRpcProvider(url, network);
const provider = new MonitoredStaticJsonRpcProvider(url, network);
provider.formatter = new CustomFormatter();
return provider;
};
@@ -251,7 +255,7 @@ export const jsonBigIntStringReplacer = (_: string, value: any): any => {
export const getResultEvent = (event: EventInterface): ResultEvent => {
const block = event.block;
const eventFields = JSONbigNative.parse(event.eventInfo);
const { tx, eventSignature } = JSONbigNative.parse(event.extraInfo);
const { tx, eventSignature, logIndex } = JSONbigNative.parse(event.extraInfo);
return {
block: {
@@ -271,7 +275,7 @@ export const getResultEvent = (event: EventInterface): ResultEvent => {
contract: event.contract,
eventIndex: event.index,
eventIndex: logIndex ?? event.index,
eventSignature,
event: {
__typename: `${event.eventName}Event`,
@@ -351,3 +355,32 @@ export const GraphQLBigDecimal = new GraphQLScalarType({
return value.toFixed();
}
});
export class MonitoredStaticJsonRpcProvider extends providers.StaticJsonRpcProvider {
// Override the send method
async send (method: string, params: Array<any>): Promise<any> {
// Register time taken for this request in the metrics
const endTimer = ethRpcRequestDuration.startTimer({ method, provider: this.connection.url });
try {
const result = await super.send(method, params);
return result;
} catch (err: any) {
// Ignore errors on fetching future blocks and if block is null (in case of filecoin)
if (err.code === ethersErrors.SERVER_ERROR && err.error) {
if (err.error.message === FUTURE_BLOCK_ERROR || err.error.message.startsWith(NULL_BLOCK_ERROR)) {
throw err;
}
}
// Register the error in metrics
ethRpcErrors.inc({ method, provider: this.connection.url }, 1);
// Rethrow the error
throw err;
} finally {
ethRpcCount.inc({ method, provider: this.connection.url }, 1);
endTimer();
}
}
}

View File

@@ -1,5 +1,5 @@
import { Application } from 'express';
import { ApolloServer } from 'apollo-server-express';
import { ApolloServer, ExpressContext } from 'apollo-server-express';
import { createServer } from 'http';
import { WebSocketServer } from 'ws';
import { useServer } from 'graphql-ws/lib/use/ws';
@@ -11,6 +11,8 @@ import debug from 'debug';
import responseCachePlugin from 'apollo-server-plugin-response-cache';
import { InMemoryLRUCache } from '@apollo/utils.keyvaluecache';
import queue from 'express-queue';
import jayson from 'jayson';
import { json as jsonParser } from 'body-parser';
import { TypeSource } from '@graphql-tools/utils';
import { makeExecutableSchema } from '@graphql-tools/schema';
@@ -22,21 +24,25 @@ import { PaymentsManager, paymentsPlugin } from './payments';
const log = debug('vulcanize:server');
const DEFAULT_GQL_PATH = '/graphql';
const DEFAULT_ETH_RPC_PATH = '/rpc';
export const createAndStartServer = async (
app: Application,
typeDefs: TypeSource,
resolvers: any,
ethRPCHandlers: any,
serverConfig: ServerConfig,
paymentsManager?: PaymentsManager
): Promise<ApolloServer> => {
const {
host,
port,
gqlCache: gqlCacheConfig,
maxSimultaneousRequests,
maxRequestQueueLimit,
gqlPath = DEFAULT_GQL_PATH
gql: {
cache: gqlCacheConfig,
maxSimultaneousRequests,
maxRequestQueueLimit,
path: gqlPath = DEFAULT_GQL_PATH
}
} = serverConfig;
app.use(queue({ activeLimit: maxSimultaneousRequests || 1, queuedLimit: maxRequestQueueLimit || -1 }));
@@ -62,6 +68,9 @@ export const createAndStartServer = async (
}
const server = new ApolloServer({
context: (expressContext: ExpressContext) => {
return expressContext;
},
schema,
csrfPrevention: true,
cache: gqlCache,
@@ -88,14 +97,65 @@
await server.start();
server.applyMiddleware({
app,
path: gqlPath
});
const rpcPath = serverConfig.ethRPC?.path ?? DEFAULT_ETH_RPC_PATH;
const rpcEnabled = serverConfig.ethRPC?.enabled;
// Apply GraphQL middleware
const applyGraphQLMiddleware = () => {
server.applyMiddleware({
app,
path: gqlPath
});
};
// Apply RPC middleware
const applyRPCMiddleware = () => {
if (!rpcEnabled) {
return;
}
// Create a JSON-RPC server to handle ETH RPC calls
const rpcServer = jayson.Server(ethRPCHandlers);
// Mount the JSON-RPC server to rpcPath
app.use(
rpcPath,
jsonParser(),
(req: any, res: any, next: () => void) => {
// Convert all GET requests to POST to avoid getting rejected by jayson server middleware
if (jayson.Utils.isMethod(req, 'GET')) {
req.method = 'POST';
}
next();
},
rpcServer.middleware()
);
};
// Apply middlewares based on path specificity
if (isPathMoreSpecific(rpcPath, gqlPath)) {
applyRPCMiddleware();
applyGraphQLMiddleware();
} else {
applyGraphQLMiddleware();
applyRPCMiddleware();
}
httpServer.listen(port, host, () => {
log(`Server is listening on ${host}:${port}${server.graphqlPath}`);
log(`GQL server is listening on http://${host}:${port}${gqlPath}`);
if (rpcEnabled) {
log(`ETH JSON RPC server is listening on http://${host}:${port}${rpcPath}`);
}
});
return server;
};
// Determine which path is more specific (more segments)
function isPathMoreSpecific (path1: string, path2: string) {
const path1Segments = path1.split('/').filter(segment => segment !== '');
const path2Segments = path2.split('/').filter(segment => segment !== '');
return path1Segments.length > path2Segments.length;
}
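Mounting order matters here because Express matches middleware by path prefix: if the RPC path were nested under the GQL path (or vice versa), mounting the shallower path first would shadow the deeper one. The segment-count check from the diff, reproduced standalone:

```typescript
// Standalone copy of isPathMoreSpecific from the server changes above: the
// path with more non-empty segments is considered more specific, and the
// more specific middleware is mounted first.
function isPathMoreSpecific (path1: string, path2: string): boolean {
  const path1Segments = path1.split('/').filter(segment => segment !== '');
  const path2Segments = path2.split('/').filter(segment => segment !== '');
  return path1Segments.length > path2Segments.length;
}
```

With the defaults (`/rpc` and `/graphql`) both paths have one segment, so the check returns false and the GraphQL middleware is applied first; a configured path like `/graphql/rpc` would flip the order.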

View File

@@ -3,7 +3,7 @@
//
import { Connection, DeepPartial, EntityTarget, FindConditions, FindManyOptions, ObjectLiteral, QueryRunner } from 'typeorm';
import { Transaction } from 'ethers';
import { ethers, Transaction } from 'ethers';
import { MappingKey, StorageLayout } from '@cerc-io/solidity-mapper';
@@ -62,6 +62,11 @@ export interface EventInterface {
index: number;
contract: string;
eventName: string;
topic0: string;
topic1: string | null;
topic2: string | null;
topic3: string | null;
data: string;
eventInfo: string;
extraInfo: string;
proof: string;
@@ -161,19 +166,24 @@ export interface IndexerInterface {
readonly serverConfig: ServerConfig
readonly upstreamConfig: UpstreamConfig
readonly storageLayoutMap: Map<string, StorageLayout>
readonly contractMap: Map<string, ethers.utils.Interface>
// eslint-disable-next-line no-use-before-define
readonly graphWatcher?: GraphWatcherInterface
init (): Promise<void>
getBlockProgress (blockHash: string): Promise<BlockProgressInterface | undefined>
getBlockProgressEntities (where: FindConditions<BlockProgressInterface>, options: FindManyOptions<BlockProgressInterface>): Promise<BlockProgressInterface[]>
getEntitiesForBlock (blockHash: string, tableName: string): Promise<any[]>
getEvent (id: string): Promise<EventInterface | undefined>
getEvents (options: FindManyOptions<EventInterface>): Promise<Array<EventInterface>>
getSyncStatus (): Promise<SyncStatusInterface | undefined>
getStateSyncStatus (): Promise<StateSyncStatusInterface | undefined>
getBlocks (blockFilter: { blockHash?: string, blockNumber?: number }): Promise<Array<EthFullBlock | null>>
getBlockByHash (blockHash?: string): Promise<{ block: any }>
getBlocksAtHeight (height: number, isPruned: boolean): Promise<BlockProgressInterface[]>
getLatestCanonicalBlock (): Promise<BlockProgressInterface | undefined>
getLatestStateIndexedBlock (): Promise<BlockProgressInterface>
getBlockEvents (blockHash: string, where: Where, queryOptions: QueryOptions): Promise<Array<EventInterface>>
getAncestorAtDepth (blockHash: string, depth: number): Promise<string>
getAncestorAtHeight (blockHash: string, height: number): Promise<string>
saveBlockAndFetchEvents (block: DeepPartial<BlockProgressInterface>): Promise<[
BlockProgressInterface,
DeepPartial<EventInterface>[],
@@ -200,13 +210,14 @@ export interface IndexerInterface {
saveEventEntity (dbEvent: EventInterface): Promise<EventInterface>
saveEvents (dbEvents: DeepPartial<EventInterface>[]): Promise<void>
processEvent (event: EventInterface, extraData: ExtraEventData): Promise<void>
parseEventNameAndArgs?: (kind: string, logObj: any) => any
isWatchedContract: (address: string) => ContractInterface | undefined;
parseEventNameAndArgs?: (watchedContracts: ContractInterface[], logObj: any) => { eventParsed: boolean, eventDetails: any }
isContractAddressWatched: (address: string) => ContractInterface[] | undefined;
getWatchedContracts: () => ContractInterface[]
getContractsByKind?: (kind: string) => ContractInterface[]
addContracts?: () => Promise<void>
cacheContract: (contract: ContractInterface) => void;
watchContract: (address: string, kind: string, checkpoint: boolean, startingBlock: number, context?: any) => Promise<void>
removeContract: (address: string, kind: string) => Promise<void>;
getEntityTypesMap?: () => Map<string, { [key: string]: string }>
getRelationsMap?: () => Map<any, { [key: string]: any }>
processInitialState: (contractAddress: string, blockHash: string) => Promise<any>
@@ -233,6 +244,9 @@ export interface IndexerInterface {
resetWatcherToBlock (blockNumber: number): Promise<void>
clearProcessedBlockData (block: BlockProgressInterface): Promise<void>
getResultEvent (event: EventInterface): any
getFullTransactions (txHashList: string[]): Promise<EthFullTransaction[]>
isGetLogsRequestsSlow(): Promise<boolean>
switchClients(): Promise<void>
}
export interface DatabaseInterface {
@@ -248,7 +262,7 @@ export interface DatabaseInterface {
getBlockEvents (blockHash: string, where?: Where, queryOptions?: QueryOptions): Promise<EventInterface[]>;
getEvent (id: string): Promise<EventInterface | undefined>
getSyncStatus (queryRunner: QueryRunner): Promise<SyncStatusInterface | undefined>
getAncestorAtDepth (blockHash: string, depth: number): Promise<string>
getAncestorAtHeight (blockHash: string, height: number): Promise<string>
getProcessedBlockCountForRange (fromBlockNumber: number, toBlockNumber: number): Promise<{ expected: number, actual: number }>;
getEventsInRange (fromBlockNumber: number, toBlockNumber: number): Promise<Array<EventInterface>>;
markBlocksAsPruned (queryRunner: QueryRunner, blocks: BlockProgressInterface[]): Promise<void>;
@@ -284,6 +298,8 @@ export interface GraphDatabaseInterface {
}
export interface GraphWatcherInterface {
readonly blockHandlerExists: boolean;
readonly eventHandlerExists: boolean;
init (): Promise<void>;
setIndexer (indexer: IndexerInterface): void;
}

yarn.lock (278 changed lines)
View File

@@ -169,6 +169,14 @@
dependencies:
"@babel/highlight" "^7.18.6"
"@babel/code-frame@^7.22.13":
version "7.24.6"
resolved "https://registry.yarnpkg.com/@babel/code-frame/-/code-frame-7.24.6.tgz#ab88da19344445c3d8889af2216606d3329f3ef2"
integrity sha512-ZJhac6FkEd1yhG2AHOmfcXG4ceoLltoCVJjN5XsWN9BifBQr+cHJbWi0h68HZuSORq+3WtJ2z0hwF2NG1b5kcA==
dependencies:
"@babel/highlight" "^7.24.6"
picocolors "^1.0.0"
"@babel/generator@^7.21.3":
version "7.21.3"
resolved "https://registry.yarnpkg.com/@babel/generator/-/generator-7.21.3.tgz#232359d0874b392df04045d72ce2fd9bb5045fce"
@@ -216,6 +224,11 @@
resolved "https://registry.yarnpkg.com/@babel/helper-validator-identifier/-/helper-validator-identifier-7.19.1.tgz#7eea834cf32901ffdc1a7ee555e2f9c27e249ca2"
integrity sha512-awrNfaMtnHUr653GgGEs++LlAvW6w+DcPrOliSMXWCKo597CwL5Acf/wWdNkf/tfEQE3mjkeD1YOVZOUV/od1w==
"@babel/helper-validator-identifier@^7.24.6":
version "7.24.6"
resolved "https://registry.yarnpkg.com/@babel/helper-validator-identifier/-/helper-validator-identifier-7.24.6.tgz#08bb6612b11bdec78f3feed3db196da682454a5e"
integrity sha512-4yA7s865JHaqUdRbnaxarZREuPTHrjpDT+pXoAZ1yhyo6uFnIEpS8VMu16siFOHDpZNKYv5BObhsB//ycbICyw==
"@babel/highlight@^7.18.6":
version "7.18.6"
resolved "https://registry.yarnpkg.com/@babel/highlight/-/highlight-7.18.6.tgz#81158601e93e2563795adcbfbdf5d64be3f2ecdf"
@@ -225,6 +238,16 @@
chalk "^2.0.0"
js-tokens "^4.0.0"
"@babel/highlight@^7.24.6":
version "7.24.6"
resolved "https://registry.yarnpkg.com/@babel/highlight/-/highlight-7.24.6.tgz#6d610c1ebd2c6e061cade0153bf69b0590b7b3df"
integrity sha512-2YnuOp4HAk2BsBrJJvYCbItHx0zWscI1C3zgWkz+wDyD9I7GIVrfnLyrR4Y1VR+7p+chAEcrgRQYZAGIKMV7vQ==
dependencies:
"@babel/helper-validator-identifier" "^7.24.6"
chalk "^2.4.2"
js-tokens "^4.0.0"
picocolors "^1.0.0"
"@babel/parser@7.16.4":
version "7.16.4"
resolved "https://registry.yarnpkg.com/@babel/parser/-/parser-7.16.4.tgz#d5f92f57cf2c74ffe9b37981c0e72fee7311372e"
@@ -483,6 +506,11 @@
it-pushable "^3.1.0"
uint8arraylist "^2.3.2"
"@colors/colors@1.6.0", "@colors/colors@^1.6.0":
version "1.6.0"
resolved "https://registry.yarnpkg.com/@colors/colors/-/colors-1.6.0.tgz#ec6cd237440700bc23ca23087f513c75508958b0"
integrity sha512-Ir+AOibqzrIsL6ajt3Rz3LskB7OiMVHqltZmspbW/TJuTVuyOMirVqAkjfY6JISiLHgyNqicAC8AyHHGzNd/dA==
"@cspotcode/source-map-support@^0.8.0":
version "0.8.1"
resolved "https://registry.yarnpkg.com/@cspotcode/source-map-support/-/source-map-support-0.8.1.tgz#00629c35a688e05a88b1cda684fb9d5e73f000a1"
@@ -490,6 +518,15 @@
dependencies:
"@jridgewell/trace-mapping" "0.3.9"
"@dabh/diagnostics@^2.0.2":
version "2.0.3"
resolved "https://registry.yarnpkg.com/@dabh/diagnostics/-/diagnostics-2.0.3.tgz#7f7e97ee9a725dffc7808d93668cc984e1dc477a"
integrity sha512-hrlQOIi7hAfzsMqlGSFyVucrx38O+j6wiGOf//H2ecvIEqYN4ADBSS2iLMh5UFyDunCNniUIPk/q3riFv45xRA==
dependencies:
colorspace "1.1.x"
enabled "2.0.x"
kuler "^2.0.0"
"@ensdomains/ens@^0.4.4":
version "0.4.5"
resolved "https://registry.npmjs.org/@ensdomains/ens/-/ens-0.4.5.tgz"
@@ -3598,6 +3635,13 @@
dependencies:
"@types/node" "*"
"@types/connect@^3.4.33":
version "3.4.38"
resolved "https://registry.yarnpkg.com/@types/connect/-/connect-3.4.38.tgz#5ba7f3bc4fbbdeaff8dded952e5ff2cc53f8d858"
integrity sha512-K6uROf1LD88uDQqJCktA4yzL1YYAK6NgfsI0v/mTgyPKWsX1CnJ0XPSDhViejru1GcRkLWb8RlzFYJRqGUbaug==
dependencies:
"@types/node" "*"
"@types/cors@2.8.12":
version "2.8.12"
resolved "https://registry.yarnpkg.com/@types/cors/-/cors-2.8.12.tgz#6b2c510a7ad7039e98e7b8d3d6598f4359e5c080"
@@ -3811,6 +3855,11 @@
resolved "https://registry.yarnpkg.com/@types/node/-/node-10.17.60.tgz#35f3d6213daed95da7f0f73e75bcc6980e90597b"
integrity sha512-F0KIgDJfy2nA3zMLmWGKxcH2ZVEtCZXHHdOQs2gSaQ27+lNeEfGxzkIw90aXswATX7AZ33tahPbzy6KAfUreVw==
"@types/node@^12.12.54":
version "12.20.55"
resolved "https://registry.yarnpkg.com/@types/node/-/node-12.20.55.tgz#c329cbd434c42164f846b909bd6f85b5537f6240"
integrity sha512-J8xLz7q2OFulZ2cyGTLE1TbbZcjpno7FaN6zdJNrgAdrJ+DZzh/uFR6YrTb4C+nXakvud8Q4+rbhoIWlYQbUFQ==
"@types/node@^12.12.6":
version "12.20.13"
resolved "https://registry.npmjs.org/@types/node/-/node-12.20.13.tgz"
@@ -3826,6 +3875,11 @@
resolved "https://registry.npmjs.org/@types/normalize-package-data/-/normalize-package-data-2.4.0.tgz"
integrity sha512-f5j5b/Gf71L+dbqxIpQ4Z2WlmI/mPJ0fOkGGmFgtb6sAu97EPczzbS3/tJKxmcYDj55OX6ssqwDAWOHIYDRDGA==
"@types/normalize-package-data@^2.4.3":
version "2.4.4"
resolved "https://registry.yarnpkg.com/@types/normalize-package-data/-/normalize-package-data-2.4.4.tgz#56e2cc26c397c038fab0e3a917a12d5c5909e901"
integrity sha512-37i+OaWTh9qeK4LSHPsyRC7NahnGotNuZvjLSgcPzblpHB3rrCJxAOgI5gCdKm7coonsaX1Of0ILiTcnZjbfxA==
"@types/npmcli__package-json@^4.0.3":
version "4.0.3"
resolved "https://registry.yarnpkg.com/@types/npmcli__package-json/-/npmcli__package-json-4.0.3.tgz#b11073b86139070823881d4cae252617d013a940"
@@ -3907,6 +3961,18 @@
"@types/glob" "~7.2.0"
"@types/node" "*"
"@types/triple-beam@^1.3.2":
version "1.3.5"
resolved "https://registry.yarnpkg.com/@types/triple-beam/-/triple-beam-1.3.5.tgz#74fef9ffbaa198eb8b588be029f38b00299caa2c"
integrity sha512-6WaYesThRMCl19iryMYP7/x2OVgCtbIVflDGFpWnb9irXI3UjYE4AzmYuiUKY1AJstGijoY+MgUszMgRxIYTYw==
"@types/ws@^7.4.4":
version "7.4.7"
resolved "https://registry.yarnpkg.com/@types/ws/-/ws-7.4.7.tgz#f7c390a36f7a0679aa69de2d501319f4f8d9b702"
integrity sha512-JQbbmxZTZehdc2iszGKs5oC3NFnjeay7mtAWrdt7qNtAVK0g19muApzAy4bm9byz79xa2ZnO/BOBC2R8RC5Lww==
dependencies:
"@types/node" "*"
"@types/ws@^8.5.3":
version "8.5.4"
resolved "https://registry.yarnpkg.com/@types/ws/-/ws-8.5.4.tgz#bb10e36116d6e570dd943735f86c933c1587b8a5"
@@ -4104,7 +4170,7 @@
resolved "https://registry.npmjs.org/@yarnpkg/lockfile/-/lockfile-1.1.0.tgz"
integrity sha512-GpSwvyXOcOOlV70vbnzjj4fW5xW/FdUF6nQEt1ENy7m4ZCczi1+/buVUPAqmGfqznsORNFzUMjctTIp8a9tuCQ==
JSONStream@^1.0.4:
JSONStream@^1.0.4, JSONStream@^1.3.5:
version "1.3.5"
resolved "https://registry.npmjs.org/JSONStream/-/JSONStream-1.3.5.tgz"
integrity sha512-E+iruNOY8VV9s4JEbe1aNEm6MiszPRr/UfcHMz0TQh1BXSxHK+ASV1R6W4HpjBhSeS+54PIsAMCBmwD06LLsqQ==
@@ -4722,6 +4788,11 @@ async@^2.4.0:
dependencies:
lodash "^4.17.14"
async@^3.2.3:
version "3.2.5"
resolved "https://registry.yarnpkg.com/async/-/async-3.2.5.tgz#ebd52a8fdaf7a2289a24df399f8d8485c8a46b66"
integrity sha512-baNZyqaaLhyLVKm/DlvdW051MSgO6b8eVfIezl9E5PqWxFgzLm/wQntEW4zOytVburDEr0JlALEpdOFwvErLsg==
asyncify-wasm@^1.2.1:
version "1.2.1"
resolved "https://registry.yarnpkg.com/asyncify-wasm/-/asyncify-wasm-1.2.1.tgz#a15c0480e858619a4f971e44e6fc05c49015d9e8"
@@ -6140,7 +6211,7 @@ collection-visit@^1.0.0:
map-visit "^1.0.0"
object-visit "^1.0.0"
color-convert@^1.9.0:
color-convert@^1.9.0, color-convert@^1.9.3:
version "1.9.3"
resolved "https://registry.yarnpkg.com/color-convert/-/color-convert-1.9.3.tgz#bb71850690e1f136567de629d2d5471deda4c1e8"
integrity sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==
@@ -6159,11 +6230,35 @@ color-name@1.1.3:
resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.3.tgz#a7d0558bd89c42f795dd42328f740831ca53bc25"
integrity sha512-72fSenhMw2HZMTVHeCA9KCmpEIbzWiQsjN+BHcBbS9vr1mtt+vJjPdksIBNUmKAW8TFUDPJK5SUU3QhE9NEXDw==
color-name@~1.1.4:
color-name@^1.0.0, color-name@~1.1.4:
version "1.1.4"
resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.4.tgz#c2a09a87acbde69543de6f63fa3995c826c536a2"
integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==
color-string@^1.6.0:
version "1.9.1"
resolved "https://registry.yarnpkg.com/color-string/-/color-string-1.9.1.tgz#4467f9146f036f855b764dfb5bf8582bf342c7a4"
integrity sha512-shrVawQFojnZv6xM40anx4CkoDP+fZsw/ZerEMsW/pyzsRbElpsL/DBVW7q3ExxwusdNXI3lXpuhEZkzs8p5Eg==
dependencies:
color-name "^1.0.0"
simple-swizzle "^0.2.2"
color@^3.1.3:
version "3.2.1"
resolved "https://registry.yarnpkg.com/color/-/color-3.2.1.tgz#3544dc198caf4490c3ecc9a790b54fe9ff45e164"
integrity sha512-aBl7dZI9ENN6fUGC7mWpMTPNHmWUSNan9tuWN6ahh5ZLNk9baLJOnSMlrQkHcrfFgz2/RigjUVAjdx36VcemKA==
dependencies:
color-convert "^1.9.3"
color-string "^1.6.0"
colorspace@1.1.x:
version "1.1.4"
resolved "https://registry.yarnpkg.com/colorspace/-/colorspace-1.1.4.tgz#8d442d1186152f60453bf8070cd66eb364e59243"
integrity sha512-BgvKJiuVu1igBUF2kEjRCZXol6wiiGbY5ipL/oVPwm0BL9sIpMIzM8IK7vwuxIIzOXMV3Ey5w+vxhm0rR/TN8w==
dependencies:
color "^3.1.3"
text-hex "1.0.x"
columnify@^1.5.4:
version "1.5.4"
resolved "https://registry.npmjs.org/columnify/-/columnify-1.5.4.tgz"
@@ -7122,6 +7217,11 @@ emoji-regex@^9.2.2:
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-9.2.2.tgz#840c8803b0d8047f4ff0cf963176b32d4ef3ed72"
integrity sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==
enabled@2.0.x:
version "2.0.0"
resolved "https://registry.yarnpkg.com/enabled/-/enabled-2.0.0.tgz#f9dd92ec2d6f4bbc0d5d1e64e21d61cd4665e7c2"
integrity sha512-AKrN98kuwOzMIdAizXGI86UFBoo26CL21UM763y1h/GMSJ4/OHU9k2YlsmBpyScFo/wbLzWQJBMCW4+IO3/+OQ==
encodeurl@~1.0.2:
version "1.0.2"
resolved "https://registry.npmjs.org/encodeurl/-/encodeurl-1.0.2.tgz"
@@ -7317,6 +7417,18 @@ es6-object-assign@^1.1.0:
resolved "https://registry.yarnpkg.com/es6-object-assign/-/es6-object-assign-1.1.0.tgz#c2c3582656247c39ea107cb1e6652b6f9f24523c"
integrity sha512-MEl9uirslVwqQU369iHNWZXsI8yaZYGg/D65aOgZkeyFJwHYSxilf7rQzXKI7DdDuBPrBXbfk3sl9hJhmd5AUw==
es6-promise@^4.0.3:
version "4.2.8"
resolved "https://registry.yarnpkg.com/es6-promise/-/es6-promise-4.2.8.tgz#4eb21594c972bc40553d276e510539143db53e0a"
integrity sha512-HJDGx5daxeIvxdBxvG2cb9g4tEvwIk3i8+nhX0yGrYmZUzbkdg8QbDevheDB8gd0//uPj4c1EQua8Q+MViT0/w==
es6-promisify@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/es6-promisify/-/es6-promisify-5.0.0.tgz#5109d62f3e56ea967c4b63505aef08291c8a5203"
integrity sha512-C+d6UdsYDk0lMebHNR4S2NybQMMngAOnOwYBQjTOiv0MkoJMP0Myw2mgpDLBcpfCmRLxyFqYhS/CfOENq4SJhQ==
dependencies:
es6-promise "^4.0.3"
es6-symbol@^3.1.1, es6-symbol@~3.1.3:
version "3.1.3"
resolved "https://registry.npmjs.org/es6-symbol/-/es6-symbol-3.1.3.tgz"
@@ -8212,6 +8324,11 @@ extsprintf@^1.2.0:
resolved "https://registry.npmjs.org/extsprintf/-/extsprintf-1.4.0.tgz"
integrity sha1-4mifjzVvrWLMplo6kcXfX5VRaS8=
eyes@^0.1.8:
version "0.1.8"
resolved "https://registry.yarnpkg.com/eyes/-/eyes-0.1.8.tgz#62cf120234c683785d902348a800ef3e0cc20bc0"
integrity sha512-GipyPsXO1anza0AOZdy69Im7hGFCNB7Y/NGjDlZGJ3GJJLtwNSb2vrzYrTYJRrRloVx7pl+bhUaTB8yiccPvFQ==
fake-merkle-patricia-tree@^1.0.1:
version "1.0.1"
resolved "https://registry.npmjs.org/fake-merkle-patricia-tree/-/fake-merkle-patricia-tree-1.0.1.tgz"
@@ -8257,6 +8374,11 @@ fastq@^1.6.0:
dependencies:
reusify "^1.0.4"
fecha@^4.2.0:
version "4.2.3"
resolved "https://registry.yarnpkg.com/fecha/-/fecha-4.2.3.tgz#4d9ccdbc61e8629b259fdca67e65891448d569fd"
integrity sha512-OP2IUU6HeYKJi3i0z4A19kHMQoLVs4Hc+DPqqxI2h/DPZHTm/vjsfC6P0b4jCMy14XizLBqvndQ+UilD7707Jw==
fetch-ponyfill@^4.0.0:
version "4.1.0"
resolved "https://registry.npmjs.org/fetch-ponyfill/-/fetch-ponyfill-4.1.0.tgz"
@@ -8408,6 +8530,11 @@ flow-stoplight@^1.0.0:
resolved "https://registry.npmjs.org/flow-stoplight/-/flow-stoplight-1.0.0.tgz"
integrity sha1-SiksW8/4s5+mzAyxqFPYbyfu/3s=
fn.name@1.x.x:
version "1.1.0"
resolved "https://registry.yarnpkg.com/fn.name/-/fn.name-1.1.0.tgz#26cad8017967aea8731bc42961d04a3d5988accc"
integrity sha512-GRnmB5gPyJpAhTQdSZTSp9uaPSvl09KoYcMQtsB9rQoOmzs9dH6ffeccH+Z+cv6P68Hu5bC6JjRh4Ah/mHSNRw==
follow-redirects@^1.0.0:
version "1.15.3"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.15.3.tgz#fe2f3ef2690afce7e82ed0b44db08165b207123a"
@@ -9580,6 +9707,11 @@ indent-string@^4.0.0:
resolved "https://registry.yarnpkg.com/indent-string/-/indent-string-4.0.0.tgz#624f8f4497d619b2d9768531d58f4122854d7251"
integrity sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg==
index-to-position@^0.1.2:
version "0.1.2"
resolved "https://registry.yarnpkg.com/index-to-position/-/index-to-position-0.1.2.tgz#e11bfe995ca4d8eddb1ec43274488f3c201a7f09"
integrity sha512-MWDKS3AS1bGCHLBA2VLImJz42f7bJh8wQsTGCzI3j519/CASStoDONUBVz2I/VID0MpiX3SGSnbOD2xUalbE5g==
infer-owner@^1.0.4:
version "1.0.4"
resolved "https://registry.npmjs.org/infer-owner/-/infer-owner-1.0.4.tgz"
@@ -9866,6 +9998,11 @@ is-arrayish@^0.2.1:
resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.2.1.tgz#77c99840527aa8ecb1a8ba697b80645a7a926a9d"
integrity sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==
is-arrayish@^0.3.1:
version "0.3.2"
resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.3.2.tgz#4574a2ae56f7ab206896fb431eaeed066fdf8f03"
integrity sha512-eVRqCvVlZbuw3GrM63ovNSNAeA1K16kaR/LRY/92w0zxQ5/1YzwblUX652i4Xs9RwAGjW9d9y6X88t8OaAJfWQ==
is-bigint@^1.0.1:
version "1.0.2"
resolved "https://registry.npmjs.org/is-bigint/-/is-bigint-1.0.2.tgz"
@@ -10321,6 +10458,11 @@ isobject@^3.0.0, isobject@^3.0.1:
resolved "https://registry.npmjs.org/isobject/-/isobject-3.0.1.tgz"
integrity sha1-TkMekrEalzFjaqH5yNHMvP2reN8=
isomorphic-ws@^4.0.1:
version "4.0.1"
resolved "https://registry.yarnpkg.com/isomorphic-ws/-/isomorphic-ws-4.0.1.tgz#55fd4cd6c5e6491e76dc125938dd863f5cd4f2dc"
integrity sha512-BhBvN2MBpWTaSHdWRb/bwdZJ1WaehQ2L1KngkCkfLUGF0mAWAT1sQUQacEmQ0jXkFw/czDXPNQSL5u2/Krsz1w==
isstream@~0.1.2:
version "0.1.2"
resolved "https://registry.npmjs.org/isstream/-/isstream-0.1.2.tgz"
@@ -10550,6 +10692,24 @@ jackspeak@^2.3.5:
optionalDependencies:
"@pkgjs/parseargs" "^0.11.0"
jayson@^4.1.2:
version "4.1.2"
resolved "https://registry.yarnpkg.com/jayson/-/jayson-4.1.2.tgz#443c26a8658703e0b2e881117b09395d88b6982e"
integrity sha512-5nzMWDHy6f+koZOuYsArh2AXs73NfWYVlFyJJuCedr93GpY+Ku8qq10ropSXVfHK+H0T6paA88ww+/dV+1fBNA==
dependencies:
"@types/connect" "^3.4.33"
"@types/node" "^12.12.54"
"@types/ws" "^7.4.4"
JSONStream "^1.3.5"
commander "^2.20.3"
delay "^5.0.0"
es6-promisify "^5.0.0"
eyes "^0.1.8"
isomorphic-ws "^4.0.1"
json-stringify-safe "^5.0.1"
uuid "^8.3.2"
ws "^7.5.10"
js-sdsl@^4.1.4:
version "4.4.0"
resolved "https://registry.yarnpkg.com/js-sdsl/-/js-sdsl-4.4.0.tgz#8b437dbe642daa95760400b602378ed8ffea8430"
@@ -10831,6 +10991,11 @@ klaw@^1.0.0:
optionalDependencies:
graceful-fs "^4.1.9"
kuler@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/kuler/-/kuler-2.0.0.tgz#e2c570a3800388fb44407e851531c1d670b061b3"
integrity sha512-Xq9nH7KlWZmXAtodXDDRE7vs6DU1gTU8zYDHDiWLSip45Egwq3plLHzPn27NgvzL2r1LMPC1vdqh98sQxtqj4A==
lcid@^1.0.0:
version "1.0.0"
resolved "https://registry.npmjs.org/lcid/-/lcid-1.0.0.tgz"
@@ -11266,6 +11431,18 @@ log-symbols@4.1.0:
chalk "^4.1.0"
is-unicode-supported "^0.1.0"
logform@^2.3.2, logform@^2.4.0:
version "2.6.0"
resolved "https://registry.yarnpkg.com/logform/-/logform-2.6.0.tgz#8c82a983f05d6eaeb2d75e3decae7a768b2bf9b5"
integrity sha512-1ulHeNPp6k/LD8H91o7VYFBng5i1BDE7HoKxVbZiGFidS1Rj65qcywLxX+pVfAPoQJEjRdvKcusKwOupHCVOVQ==
dependencies:
"@colors/colors" "1.6.0"
"@types/triple-beam" "^1.3.2"
fecha "^4.2.0"
ms "^2.1.1"
safe-stable-stringify "^2.3.1"
triple-beam "^1.3.0"
loglevel@^1.6.8:
version "1.8.1"
resolved "https://registry.yarnpkg.com/loglevel/-/loglevel-1.8.1.tgz#5c621f83d5b48c54ae93b6156353f555963377b4"
@@ -12731,6 +12908,13 @@ once@^1.3.0, once@^1.3.1, once@^1.4.0:
dependencies:
wrappy "1"
one-time@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/one-time/-/one-time-1.0.0.tgz#e06bc174aed214ed58edede573b433bbf827cb45"
integrity sha512-5DXOiRKwuSEcQ/l0kGCF6Q3jcADFv5tSmRaJck/OqkVFcOzutB134KRSfF0xDrL39MNnqxbHBbUUcjZIhTgb2g==
dependencies:
fn.name "1.x.x"
onetime@^5.1.0, onetime@^5.1.2:
version "5.1.2"
resolved "https://registry.npmjs.org/onetime/-/onetime-5.1.2.tgz"
@@ -13070,6 +13254,15 @@ parse-json@^5.0.0:
json-parse-even-better-errors "^2.3.0"
lines-and-columns "^1.1.6"
parse-json@^8.0.0:
version "8.1.0"
resolved "https://registry.yarnpkg.com/parse-json/-/parse-json-8.1.0.tgz#91cdc7728004e955af9cb734de5684733b24a717"
integrity sha512-rum1bPifK5SSar35Z6EKZuYPJx85pkNaFrxBK3mwdfSJ1/WKbYrjoW/zTPSjRRamfmVX1ACBIdFAO0VRErW/EA==
dependencies:
"@babel/code-frame" "^7.22.13"
index-to-position "^0.1.2"
type-fest "^4.7.1"
parse-path@^4.0.0:
version "4.0.3"
resolved "https://registry.npmjs.org/parse-path/-/parse-path-4.0.3.tgz"
@@ -13984,6 +14177,17 @@ read-pkg@^5.2.0:
parse-json "^5.0.0"
type-fest "^0.6.0"
read-pkg@^9.0.1:
version "9.0.1"
resolved "https://registry.yarnpkg.com/read-pkg/-/read-pkg-9.0.1.tgz#b1b81fb15104f5dbb121b6bbdee9bbc9739f569b"
integrity sha512-9viLL4/n1BJUCT1NXVTdS1jtm80yDEgR5T4yCelII49Mbj0v1rZdKqj7zCiYdbB0CuCgdrvHcNogAKTFPBocFA==
dependencies:
"@types/normalize-package-data" "^2.4.3"
normalize-package-data "^6.0.0"
parse-json "^8.0.0"
type-fest "^4.6.0"
unicorn-magic "^0.1.0"
read@1, read@~1.0.1:
version "1.0.7"
resolved "https://registry.npmjs.org/read/-/read-1.0.7.tgz"
@@ -14459,6 +14663,11 @@ safe-regex@^1.1.0:
dependencies:
ret "~0.1.10"
safe-stable-stringify@^2.3.1:
version "2.4.3"
resolved "https://registry.yarnpkg.com/safe-stable-stringify/-/safe-stable-stringify-2.4.3.tgz#138c84b6f6edb3db5f8ef3ef7115b8f55ccbf886"
integrity sha512-e2bDA2WJT0wxseVd4lsDP4+3ONX6HpMXQa1ZhFQ7SU+GjvORCmShbCMltrtIDfkYhVHrOcPtj+KhmDBdPdZD1g==
"safer-buffer@>= 2.1.2 < 3", "safer-buffer@>= 2.1.2 < 3.0.0", safer-buffer@^2.0.2, safer-buffer@^2.1.0, safer-buffer@~2.1.0:
version "2.1.2"
resolved "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz"
@@ -14788,6 +14997,13 @@ simple-get@^2.7.0:
once "^1.3.1"
simple-concat "^1.0.0"
simple-swizzle@^0.2.2:
version "0.2.2"
resolved "https://registry.yarnpkg.com/simple-swizzle/-/simple-swizzle-0.2.2.tgz#a4da6b635ffcccca33f70d17cb92592de95e557a"
integrity sha512-JA//kQgZtbuY83m+xT+tXJkmJncGMTFT+C+g2h2R9uxkYIrE2yy9sgmcLhCnw57/WSD+Eh3J97FPEDFnbXnDUg==
dependencies:
is-arrayish "^0.3.1"
simple-update-notifier@^1.0.7:
version "1.1.0"
resolved "https://registry.yarnpkg.com/simple-update-notifier/-/simple-update-notifier-1.1.0.tgz#67694c121de354af592b347cdba798463ed49c82"
@@ -15082,6 +15298,11 @@ ssri@^8.0.0, ssri@^8.0.1:
dependencies:
minipass "^3.1.1"
stack-trace@0.0.x:
version "0.0.10"
resolved "https://registry.yarnpkg.com/stack-trace/-/stack-trace-0.0.10.tgz#547c70b347e8d32b4e108ea1a2a159e5fdde19c0"
integrity sha512-KGzahc7puUKkzyMt+IqAep+TVNbKP+k2Lmwhub39m1AsTSkaDutx56aDCo+HLDzf/D26BIHTJWNiTG1KAJiQCg==
stacktrace-parser@^0.1.10:
version "0.1.10"
resolved "https://registry.yarnpkg.com/stacktrace-parser/-/stacktrace-parser-0.1.10.tgz#29fb0cae4e0d0b85155879402857a1639eb6051a"
@@ -15459,6 +15680,11 @@ text-extensions@^1.0.0:
resolved "https://registry.npmjs.org/text-extensions/-/text-extensions-1.9.0.tgz"
integrity sha512-wiBrwC1EhBelW12Zy26JeOUkQ5mRu+5o8rpsJk5+2t+Y5vE7e842qtZDQ2g1NpX/29HdyFeJ4nSIhI47ENSxlQ==
text-hex@1.0.x:
version "1.0.0"
resolved "https://registry.yarnpkg.com/text-hex/-/text-hex-1.0.0.tgz#69dc9c1b17446ee79a92bf5b884bb4b9127506f5"
integrity sha512-uuVGNWzgJ4yhRaNSiubPY7OjISw4sw4E5Uv0wbjp+OzcbmVU/rsT8ujgcXJhn9ypzsgr5vlzpPqP+MBBKcGvbg==
text-table@^0.2.0:
version "0.2.0"
resolved "https://registry.yarnpkg.com/text-table/-/text-table-0.2.0.tgz#7f5ee823ae805207c00af2df4a84ec3fcfa570b4"
@@ -15633,6 +15859,11 @@ trim-right@^1.0.1:
resolved "https://registry.npmjs.org/trim-right/-/trim-right-1.0.1.tgz"
integrity sha1-yy4SAwZ+DI3h9hQJS5/kVwTqYAM=
triple-beam@^1.3.0:
version "1.4.1"
resolved "https://registry.yarnpkg.com/triple-beam/-/triple-beam-1.4.1.tgz#6fde70271dc6e5d73ca0c3b24e2d92afb7441984"
integrity sha512-aZbgViZrg1QNcG+LULa7nhZpJTZSLm/mXnHXnbAbjmN5aSa0y7V+wvv6+4WaBtpISJzThKy+PIPxc1Nq1EJ9mg==
truncate-utf8-bytes@^1.0.0:
version "1.0.2"
resolved "https://registry.yarnpkg.com/truncate-utf8-bytes/-/truncate-utf8-bytes-1.0.2.tgz#405923909592d56f78a5818434b0b78489ca5f2b"
@@ -15797,6 +16028,11 @@ type-fest@^0.8.1:
resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.8.1.tgz#09e249ebde851d3b1e48d27c105444667f17b83d"
integrity sha512-4dbzIzqvjtgiM5rw1k5rEHtBANKmdudhGyBEajN01fEyhaAIhsoKNy6y7+IN93IfpFtwY9iqi7kD+xwKhQsNJA==
type-fest@^4.6.0, type-fest@^4.7.1:
version "4.18.3"
resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-4.18.3.tgz#5249f96e7c2c3f0f1561625f54050e343f1c8f68"
integrity sha512-Q08/0IrpvM+NMY9PA2rti9Jb+JejTddwmwmVQGskAlhtcrw1wsRzoR6ode6mR+OAabNa75w/dxedSUY2mlphaQ==
type-is@~1.6.17, type-is@~1.6.18:
version "1.6.18"
resolved "https://registry.npmjs.org/type-is/-/type-is-1.6.18.tgz"
@@ -15997,6 +16233,11 @@ undici@^5.12.0, undici@^5.14.0:
dependencies:
busboy "^1.6.0"
unicorn-magic@^0.1.0:
version "0.1.0"
resolved "https://registry.yarnpkg.com/unicorn-magic/-/unicorn-magic-0.1.0.tgz#1bb9a51c823aaf9d73a8bfcd3d1a23dde94b0ce4"
integrity sha512-lRfVq8fE8gz6QMBuDM6a+LO3IAzTi05H6gCVaUpir2E1Rwpo4ZUog45KpNXKC/Mn3Yb9UDuHumeFTo9iV/D9FQ==
union-value@^1.0.0:
version "1.0.1"
resolved "https://registry.npmjs.org/union-value/-/union-value-1.0.1.tgz"
@@ -16692,6 +16933,32 @@ window-size@^0.2.0:
resolved "https://registry.npmjs.org/window-size/-/window-size-0.2.0.tgz"
integrity sha1-tDFbtCFKPXBY6+7okuE/ok2YsHU=
winston-transport@^4.7.0:
version "4.7.0"
resolved "https://registry.yarnpkg.com/winston-transport/-/winston-transport-4.7.0.tgz#e302e6889e6ccb7f383b926df6936a5b781bd1f0"
integrity sha512-ajBj65K5I7denzer2IYW6+2bNIVqLGDHqDw3Ow8Ohh+vdW+rv4MZ6eiDvHoKhfJFZ2auyN8byXieDDJ96ViONg==
dependencies:
logform "^2.3.2"
readable-stream "^3.6.0"
triple-beam "^1.3.0"
winston@^3.13.0:
version "3.13.0"
resolved "https://registry.yarnpkg.com/winston/-/winston-3.13.0.tgz#e76c0d722f78e04838158c61adc1287201de7ce3"
integrity sha512-rwidmA1w3SE4j0E5MuIufFhyJPBDG7Nu71RkZor1p2+qHvJSZ9GYDA81AyleQcZbh/+V6HjeBdfnTZJm9rSeQQ==
dependencies:
"@colors/colors" "^1.6.0"
"@dabh/diagnostics" "^2.0.2"
async "^3.2.3"
is-stream "^2.0.0"
logform "^2.4.0"
one-time "^1.0.0"
readable-stream "^3.4.0"
safe-stable-stringify "^2.3.1"
stack-trace "0.0.x"
triple-beam "^1.3.0"
winston-transport "^4.7.0"
word-wrap@^1.2.3:
version "1.2.3"
resolved "https://registry.yarnpkg.com/word-wrap/-/word-wrap-1.2.3.tgz#610636f6b1f703891bd34771ccb17fb93b47079c"
@@ -16821,6 +17088,11 @@ ws@^7.4.6:
resolved "https://registry.yarnpkg.com/ws/-/ws-7.5.9.tgz#54fa7db29f4c7cec68b1ddd3a89de099942bb591"
integrity sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==
ws@^7.5.10:
version "7.5.10"
resolved "https://registry.yarnpkg.com/ws/-/ws-7.5.10.tgz#58b5c20dc281633f6c19113f39b349bd8bd558d9"
integrity sha512-+dbF1tHwZpXcbOJdVOkzLDxZP1ailvSxM6ZweXTegylPny803bFhA+vqBYw4s31NSAk4S2Qz+AKXK9a4wkdjcQ==
ws@^8.11.0, ws@^8.12.1, ws@^8.4.0:
version "8.13.0"
resolved "https://registry.yarnpkg.com/ws/-/ws-8.13.0.tgz#9a9fb92f93cf41512a0735c8f4dd09b8a1211cd0"