diff --git a/packages/graph-test-watcher/README.md b/packages/graph-test-watcher/README.md index 7513e16a..f725b0e3 100644 --- a/packages/graph-test-watcher/README.md +++ b/packages/graph-test-watcher/README.md @@ -8,6 +8,12 @@ yarn ``` +* Run the IPFS (go-ipfs version 0.9.0) daemon: + + ```bash + ipfs daemon + ``` + * Create a postgres12 database for the watcher: ```bash @@ -35,9 +41,13 @@ graph-test-watcher-job-queue=# exit ``` -* Update the [config](./environments/local.toml) with database connection settings. +* In the [config file](./environments/local.toml): -* Update the `upstream` config in the [config file](./environments/local.toml) and provide the `ipld-eth-server` GQL API and the `indexer-db` postgraphile endpoints. + * Update the database connection settings. + + * Update the `upstream` config and provide the `ipld-eth-server` GQL API and the `indexer-db` postgraphile endpoints. + + * Update the `server` config with state checkpoint settings and provide the IPFS API address. ## Customize @@ -45,7 +55,17 @@ * Edit the custom hook function `handleEvent` (triggered on an event) in [hooks.ts](./src/hooks.ts) to perform corresponding indexing using the `Indexer` object. - * Refer to [hooks.example.ts](./src/hooks.example.ts) for an example hook function for events in an ERC20 contract. + * While using the indexer storage methods for indexing, pass `diff` as true if default state is desired to be generated using the state variables being indexed. + +* Generating state: + + * Edit the custom hook function `createInitialCheckpoint` (triggered on watch-contract, checkpoint: `true`) in [hooks.ts](./src/hooks.ts) to save an initial checkpoint `IPLDBlock` using the `Indexer` object. + + * Edit the custom hook function `createStateDiff` (triggered on a block) in [hooks.ts](./src/hooks.ts) to save the state in a `diff` `IPLDBlock` using the `Indexer` object. The default state (if exists) is updated. + + * Edit the custom hook function `createStateCheckpoint` (triggered just before default and CLI checkpoint) in [hooks.ts](./src/hooks.ts) to save the state in a `checkpoint` `IPLDBlock` using the `Indexer` object. + +* The existing example hooks in [hooks.ts](./src/hooks.ts) are for an `ERC20` contract. ## Run @@ -68,11 +88,104 @@ GQL console: http://localhost:3008/graphql * To watch a contract: ```bash - yarn watch:contract --address --kind Example --starting-block [block-number] + yarn watch:contract --address --kind --checkpoint --starting-block [block-number] + ``` + + * `address`: Address or identifier of the contract to be watched. + * `kind`: Kind of the contract. + * `checkpoint`: Turn checkpointing on (`true` | `false`). + * `starting-block`: Starting block for the contract (default: `1`). + + Examples: + + Watch a contract with its address and checkpointing on: + + ```bash + yarn watch:contract --address 0x1F78641644feB8b64642e833cE4AFE93DD6e7833 --kind ERC20 --checkpoint true + ``` + + Watch a contract with its identifier and checkpointing on: + + ```bash + yarn watch:contract --address MyProtocol --kind protocol --checkpoint true ``` * To fill a block range: ```bash - yarn fill --startBlock --endBlock + yarn fill --start-block --end-block ``` + + * `start-block`: Block number to start filling from. + * `end-block`: Block number till which to fill. + + * To create a checkpoint for a contract: + + ```bash + yarn checkpoint --address --block-hash [block-hash] + ``` + + * `address`: Address or identifier of the contract for which to create a checkpoint. 
+ * `block-hash`: Hash of a block (in the pruned region) at which to create the checkpoint (default: latest canonical block hash). + + * To reset the watcher to a previous block number: + + * Reset state: + + ```bash + yarn reset state --block-number + ``` + + * Reset job-queue: + + ```bash + yarn reset job-queue --block-number + ``` + + * `block-number`: Block number to which to reset the watcher. + + * To export and import the watcher state: + + * In source watcher, export watcher state: + + ```bash + yarn export-state --export-file [export-file-path] + ``` + + * `export-file`: Path of JSON file to which to export the watcher data. + + * In target watcher, run job-runner: + + ```bash + yarn job-runner + ``` + + * Import watcher state: + + ```bash + yarn import-state --import-file + ``` + + * `import-file`: Path of JSON file from which to import the watcher data. + + * Run fill: + + ```bash + yarn fill --start-block --end-block + ``` + + * `snapshot-block`: Block number at which the watcher state was exported. + + * Run server: + + ```bash + yarn server + ``` + + * To inspect a CID: + + ```bash + yarn inspect-cid --cid + ``` + + * `cid`: CID to be inspected. diff --git a/packages/graph-test-watcher/environments/local.toml b/packages/graph-test-watcher/environments/local.toml index 005d94b7..276befa0 100644 --- a/packages/graph-test-watcher/environments/local.toml +++ b/packages/graph-test-watcher/environments/local.toml @@ -2,6 +2,16 @@ host = "127.0.0.1" port = 3008 kind = "active" + + # Checkpointing state. + checkpointing = true + + # Checkpoint interval in number of blocks. + checkpointInterval = 2000 + + # IPFS API address (can be taken from the output on running the IPFS daemon). + ipfsApiAddr = "/ip4/127.0.0.1/tcp/5001" + subgraphPath = "../graph-node/test/subgraph/example1/build" [database] diff --git a/packages/graph-test-watcher/package.json b/packages/graph-test-watcher/package.json index 935aa62d..99d9e381 100644 --- a/packages/graph-test-watcher/package.json +++ b/packages/graph-test-watcher/package.json @@ -10,7 +10,12 @@ "server": "DEBUG=vulcanize:* ts-node src/server.ts", "job-runner": "DEBUG=vulcanize:* ts-node src/job-runner.ts", "watch:contract": "DEBUG=vulcanize:* ts-node src/cli/watch-contract.ts", - "fill": "DEBUG=vulcanize:* ts-node src/fill.ts" + "fill": "DEBUG=vulcanize:* ts-node src/fill.ts", + "reset": "DEBUG=vulcanize:* ts-node src/cli/reset.ts", + "checkpoint": "DEBUG=vulcanize:* ts-node src/cli/checkpoint.ts", + "export-state": "DEBUG=vulcanize:* ts-node src/cli/export-state.ts", + "import-state": "DEBUG=vulcanize:* ts-node src/cli/import-state.ts", + "inspect-cid": "DEBUG=vulcanize:* ts-node src/cli/inspect-cid.ts" }, "repository": { "type": "git", @@ -23,9 +28,10 @@ }, "homepage": "https://github.com/vulcanize/watcher-ts#readme", "dependencies": { + "@apollo/client": "^3.3.19", "@ethersproject/providers": "5.3.0", + "@ipld/dag-cbor": "^6.0.12", "@vulcanize/cache": "^0.1.0", - "@vulcanize/graph-node": "^0.1.0", "@vulcanize/ipld-eth-client": "^0.1.0", "@vulcanize/solidity-mapper": "^0.1.0", "@vulcanize/util": "^0.1.0", @@ -36,7 +42,10 @@ "express": "^4.17.1", "graphql": "^15.5.0", "graphql-import-node": "^0.0.4", + "ipfs-http-client": "^53.0.1", "json-bigint": "^1.0.0", + "lodash": "^4.17.21", + "multiformats": "^9.4.8", "reflect-metadata": "^0.1.13", "typeorm": "^0.2.32", "yargs": "^17.0.1" diff --git a/packages/graph-test-watcher/src/cli/checkpoint.ts b/packages/graph-test-watcher/src/cli/checkpoint.ts new file mode 100644 index 00000000..5d48dacd --- 
/dev/null +++ b/packages/graph-test-watcher/src/cli/checkpoint.ts @@ -0,0 +1,69 @@ +// +// Copyright 2021 Vulcanize, Inc. +// + +import path from 'path'; +import yargs from 'yargs'; +import 'reflect-metadata'; +import debug from 'debug'; + +import { Config, DEFAULT_CONFIG_PATH, getConfig, initClients } from '@vulcanize/util'; +import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; + +import { Database } from '../database'; +import { Indexer } from '../indexer'; + +const log = debug('vulcanize:checkpoint'); + +const main = async (): Promise => { + const argv = await yargs.parserConfiguration({ + 'parse-numbers': false + }).options({ + configFile: { + alias: 'f', + type: 'string', + require: true, + demandOption: true, + describe: 'Configuration file path (toml)', + default: DEFAULT_CONFIG_PATH + }, + address: { + type: 'string', + require: true, + demandOption: true, + describe: 'Contract address to create the checkpoint for.' + }, + blockHash: { + type: 'string', + describe: 'Blockhash at which to create the checkpoint.' + } + }).argv; + + const config: Config = await getConfig(argv.configFile); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); + + const db = new Database(config.database); + await db.init(); + + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); + await graphDb.init(); + + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const blockHash = await indexer.processCLICheckpoint(argv.address, argv.blockHash); + + log(`Created a checkpoint for contract ${argv.address} at block-hash ${blockHash}`); + + await db.close(); +}; + +main().catch(err => { + log(err); +}).finally(() => { + process.exit(0); +}); diff --git a/packages/graph-test-watcher/src/cli/export-state.ts b/packages/graph-test-watcher/src/cli/export-state.ts new file mode 100644 index 00000000..73b6e624 --- /dev/null +++ b/packages/graph-test-watcher/src/cli/export-state.ts @@ -0,0 +1,119 @@ +// +// Copyright 2021 Vulcanize, Inc. 
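The checkpoint CLI above, like the export-state, import-state and inspect-cid CLIs that follow, repeats the same bootstrap: open the watcher and graph databases, build a `GraphWatcher`, and hand everything to the `Indexer`. A hypothetical shared helper (not part of this diff) would look like the sketch below; every call mirrors code that appears in these CLIs.

```ts
import path from 'path';

import { Config, getConfig, initClients } from '@vulcanize/util';
import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node';

import { Database } from '../database';
import { Indexer } from '../indexer';

// Hypothetical helper factoring out the bootstrap repeated by each CLI in this diff.
export const initIndexer = async (configFile: string): Promise<{ db: Database, indexer: Indexer }> => {
  const config: Config = await getConfig(configFile);
  const { ethClient, postgraphileClient, ethProvider } = await initClients(config);

  // Watcher database (entities like IPLDBlock, HookStatus) and graph-node database.
  const db = new Database(config.database);
  await db.init();

  const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*'));
  await graphDb.init();

  // GraphWatcher and Indexer reference each other, so wire them up before use.
  const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath);
  const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher);

  graphWatcher.setIndexer(indexer);
  await graphWatcher.init();

  return { db, indexer };
};
```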
+// + +import assert from 'assert'; +import yargs from 'yargs'; +import 'reflect-metadata'; +import debug from 'debug'; +import fs from 'fs'; +import path from 'path'; + +import { Config, DEFAULT_CONFIG_PATH, getConfig, initClients } from '@vulcanize/util'; +import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; +import * as codec from '@ipld/dag-cbor'; + +import { Database } from '../database'; +import { Indexer } from '../indexer'; + +const log = debug('vulcanize:export-state'); + +const main = async (): Promise => { + const argv = await yargs.parserConfiguration({ + 'parse-numbers': false + }).options({ + configFile: { + alias: 'f', + type: 'string', + require: true, + demandOption: true, + describe: 'Configuration file path (toml)', + default: DEFAULT_CONFIG_PATH + }, + exportFile: { + alias: 'o', + type: 'string', + describe: 'Export file path' + } + }).argv; + + const config: Config = await getConfig(argv.configFile); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); + + const db = new Database(config.database); + await db.init(); + + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); + await graphDb.init(); + + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const exportData: any = { + snapshotBlock: {}, + contracts: [], + ipldCheckpoints: [] + }; + + const contracts = await db.getContracts({}); + + // Get latest canonical block. + const block = await indexer.getLatestCanonicalBlock(); + assert(block); + + // Export snapshot block. + exportData.snapshotBlock = { + blockNumber: block.blockNumber, + blockHash: block.blockHash + }; + + // Export contracts and checkpoints. + for (const contract of contracts) { + exportData.contracts.push({ + address: contract.address, + kind: contract.kind, + checkpoint: contract.checkpoint, + startingBlock: block.blockNumber + }); + + // Create and export checkpoint if checkpointing is on for the contract. + if (contract.checkpoint) { + await indexer.createCheckpoint(contract.address, block.blockHash); + + const ipldBlock = await indexer.getLatestIPLDBlock(contract.address, 'checkpoint', block.blockNumber); + assert(ipldBlock); + + const data = codec.decode(Buffer.from(ipldBlock.data)) as any; + + exportData.ipldCheckpoints.push({ + contractAddress: ipldBlock.contractAddress, + cid: ipldBlock.cid, + kind: ipldBlock.kind, + data + }); + } + } + + if (argv.exportFile) { + const encodedExportData = codec.encode(exportData); + + const filePath = path.resolve(argv.exportFile); + const fileDir = path.dirname(filePath); + + if (!fs.existsSync(fileDir)) fs.mkdirSync(fileDir, { recursive: true }); + + fs.writeFileSync(filePath, encodedExportData); + } else { + log(exportData); + } +}; + +main().catch(err => { + log(err); +}).finally(() => { + process.exit(0); +}); diff --git a/packages/graph-test-watcher/src/cli/import-state.ts b/packages/graph-test-watcher/src/cli/import-state.ts new file mode 100644 index 00000000..b98e5997 --- /dev/null +++ b/packages/graph-test-watcher/src/cli/import-state.ts @@ -0,0 +1,118 @@ +// +// Copyright 2021 Vulcanize, Inc. 
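`export-state` above serializes the whole export object to raw `@ipld/dag-cbor` bytes, and `import-state` (next) decodes the same bytes back. A minimal sketch of that round trip, with placeholder values:

```ts
import fs from 'fs';

import * as codec from '@ipld/dag-cbor';

// Shape of the export file written by export-state (placeholder values).
const exportData = {
  snapshotBlock: { blockNumber: 100, blockHash: '0xabc' },
  contracts: [],
  ipldCheckpoints: []
};

// Encode to bytes and write, as export-state does for --export-file.
const bytes = codec.encode(exportData);
fs.writeFileSync('state-export.bin', bytes);

// Decode the bytes back, as import-state does for --import-file.
const decoded = codec.decode(fs.readFileSync('state-export.bin')) as any;
console.log(decoded.snapshotBlock.blockNumber); // 100
```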
+// + +import assert from 'assert'; +import 'reflect-metadata'; +import yargs from 'yargs'; +import { hideBin } from 'yargs/helpers'; +import debug from 'debug'; +import { PubSub } from 'apollo-server-express'; +import fs from 'fs'; +import path from 'path'; + +import { getConfig, fillBlocks, JobQueue, DEFAULT_CONFIG_PATH, Config, initClients } from '@vulcanize/util'; +import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; +import * as codec from '@ipld/dag-cbor'; + +import { Database } from '../database'; +import { Indexer } from '../indexer'; +import { EventWatcher } from '../events'; +import { IPLDBlock } from '../entity/IPLDBlock'; + +const log = debug('vulcanize:import-state'); + +export const main = async (): Promise => { + const argv = await yargs(hideBin(process.argv)).parserConfiguration({ + 'parse-numbers': false + }).options({ + configFile: { + alias: 'f', + type: 'string', + demandOption: true, + describe: 'configuration file path (toml)', + default: DEFAULT_CONFIG_PATH + }, + importFile: { + alias: 'i', + type: 'string', + demandOption: true, + describe: 'Import file path (JSON)' + } + }).argv; + + const config: Config = await getConfig(argv.configFile); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); + + const db = new Database(config.database); + await db.init(); + + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); + await graphDb.init(); + + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + // Note: In-memory pubsub works fine for now, as each watcher is a single process anyway. + // Later: https://www.apollographql.com/docs/apollo-server/data/subscriptions/#production-pubsub-libraries + const pubsub = new PubSub(); + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const jobQueueConfig = config.jobQueue; + assert(jobQueueConfig, 'Missing job queue config'); + + const { dbConnectionString, maxCompletionLagInSecs } = jobQueueConfig; + assert(dbConnectionString, 'Missing job queue db connection string'); + + const jobQueue = new JobQueue({ dbConnectionString, maxCompletionLag: maxCompletionLagInSecs }); + await jobQueue.start(); + + const eventWatcher = new EventWatcher(config.upstream, ethClient, postgraphileClient, indexer, pubsub, jobQueue); + + // Import data. + const importFilePath = path.resolve(argv.importFile); + const encodedImportData = fs.readFileSync(importFilePath); + const importData = codec.decode(Buffer.from(encodedImportData)) as any; + + // Fill the snapshot block. + await fillBlocks( + jobQueue, + indexer, + postgraphileClient, + eventWatcher, + config.upstream.ethServer.blockDelayInMilliSecs, + { + startBlock: importData.snapshotBlock.blockNumber, + endBlock: importData.snapshotBlock.blockNumber + } + ); + + // Fill the Contracts. + for (const contract of importData.contracts) { + await db.saveContract(contract.address, contract.kind, contract.checkpoint, contract.startingBlock); + } + + // Get the snapshot block. + const block = await indexer.getBlockProgress(importData.snapshotBlock.blockHash); + assert(block); + + // Fill the IPLDBlocks. 
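+  // Each imported checkpoint is re-encoded to dag-cbor bytes and attached to the snapshot block fetched above.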
+ for (const checkpoint of importData.ipldCheckpoints) { + let ipldBlock = new IPLDBlock(); + + ipldBlock = Object.assign(ipldBlock, checkpoint); + ipldBlock.block = block; + + ipldBlock.data = Buffer.from(codec.encode(ipldBlock.data)); + + await db.saveOrUpdateIPLDBlock(ipldBlock); + } +}; + +main().catch(err => { + log(err); +}).finally(() => { + process.exit(0); +}); diff --git a/packages/graph-test-watcher/src/cli/inspect-cid.ts b/packages/graph-test-watcher/src/cli/inspect-cid.ts new file mode 100644 index 00000000..7c91e59d --- /dev/null +++ b/packages/graph-test-watcher/src/cli/inspect-cid.ts @@ -0,0 +1,68 @@ +// +// Copyright 2021 Vulcanize, Inc. +// + +import path from 'path'; +import assert from 'assert'; +import yargs from 'yargs'; +import 'reflect-metadata'; +import debug from 'debug'; +import util from 'util'; + +import { Config, DEFAULT_CONFIG_PATH, getConfig, initClients } from '@vulcanize/util'; +import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; + +import { Database } from '../database'; +import { Indexer } from '../indexer'; + +const log = debug('vulcanize:inspect-cid'); + +const main = async (): Promise => { + const argv = await yargs.parserConfiguration({ + 'parse-numbers': false + }).options({ + configFile: { + alias: 'f', + type: 'string', + require: true, + demandOption: true, + describe: 'Configuration file path (toml)', + default: DEFAULT_CONFIG_PATH + }, + cid: { + alias: 'c', + type: 'string', + demandOption: true, + describe: 'CID to be inspected' + } + }).argv; + + const config: Config = await getConfig(argv.configFile); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); + + const db = new Database(config.database); + await db.init(); + + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); + await graphDb.init(); + + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const ipldBlock = await indexer.getIPLDBlockByCid(argv.cid); + assert(ipldBlock, 'IPLDBlock for the provided CID doesn\'t exist.'); + + const ipldData = await indexer.getIPLDData(ipldBlock); + + log(util.inspect(ipldData, false, null)); +}; + +main().catch(err => { + log(err); +}).finally(() => { + process.exit(0); +}); diff --git a/packages/graph-test-watcher/src/cli/reset-cmds/job-queue.ts b/packages/graph-test-watcher/src/cli/reset-cmds/job-queue.ts new file mode 100644 index 00000000..a8766bcf --- /dev/null +++ b/packages/graph-test-watcher/src/cli/reset-cmds/job-queue.ts @@ -0,0 +1,22 @@ +// +// Copyright 2021 Vulcanize, Inc. +// + +import debug from 'debug'; + +import { getConfig, resetJobs } from '@vulcanize/util'; + +const log = debug('vulcanize:reset-job-queue'); + +export const command = 'job-queue'; + +export const desc = 'Reset job queue'; + +export const builder = {}; + +export const handler = async (argv: any): Promise => { + const config = await getConfig(argv.configFile); + await resetJobs(config); + + log('Job queue reset successfully'); +}; diff --git a/packages/graph-test-watcher/src/cli/reset-cmds/state.ts b/packages/graph-test-watcher/src/cli/reset-cmds/state.ts new file mode 100644 index 00000000..39efc03e --- /dev/null +++ b/packages/graph-test-watcher/src/cli/reset-cmds/state.ts @@ -0,0 +1,94 @@ +// +// Copyright 2021 Vulcanize, Inc. 
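The inspect-cid CLI above resolves a CID to its stored `IPLDBlock` and prints the decoded data. These are dag-cbor CIDs; the sketch below shows how such a CID is typically derived with `multiformats`. The watcher's own `prepareIPLDBlock` is not shown in this diff, so treat the exact construction as an assumption.

```ts
import { CID } from 'multiformats/cid';
import { sha256 } from 'multiformats/hashes/sha2';
import * as codec from '@ipld/dag-cbor';

// Encode a state object to dag-cbor and derive a CIDv1 for it.
const makeStateCid = async (state: any): Promise<{ cid: CID, data: Uint8Array }> => {
  const data = codec.encode(state);             // bytes of the kind stored in ipld_block.data
  const hash = await sha256.digest(data);       // multihash over the encoded bytes
  const cid = CID.create(1, codec.code, hash);  // CIDv1 with the dag-cbor codec (0x71)

  return { cid, data };
};

// A CID produced this way is the kind of value passed to `yarn inspect-cid --cid`.
makeStateCid({ state: {} }).then(({ cid }) => console.log(cid.toString()));
```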
+// + +import path from 'path'; +import debug from 'debug'; +import { MoreThan } from 'typeorm'; +import assert from 'assert'; + +import { getConfig, initClients, resetJobs } from '@vulcanize/util'; +import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; + +import { Database } from '../../database'; +import { Indexer } from '../../indexer'; +import { BlockProgress } from '../../entity/BlockProgress'; + +import { GetMethod } from '../../entity/GetMethod'; +import { _Test } from '../../entity/_Test'; + +const log = debug('vulcanize:reset-state'); + +export const command = 'state'; + +export const desc = 'Reset state to block number'; + +export const builder = { + blockNumber: { + type: 'number' + } +}; + +export const handler = async (argv: any): Promise => { + const config = await getConfig(argv.configFile); + await resetJobs(config); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); + + // Initialize database. + const db = new Database(config.database); + await db.init(); + + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); + await graphDb.init(); + + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const syncStatus = await indexer.getSyncStatus(); + assert(syncStatus, 'Missing syncStatus'); + + const hooksStatus = await indexer.getHookStatus(); + assert(hooksStatus, 'Missing hooksStatus'); + + const blockProgresses = await indexer.getBlocksAtHeight(argv.blockNumber, false); + assert(blockProgresses.length, `No blocks at specified block number ${argv.blockNumber}`); + assert(!blockProgresses.some(block => !block.isComplete), `Incomplete block at block number ${argv.blockNumber} with unprocessed events`); + const [blockProgress] = blockProgresses; + + const dbTx = await db.createTransactionRunner(); + + try { + const entities = [BlockProgress, GetMethod, _Test]; + + const removeEntitiesPromise = entities.map(async entityClass => { + return db.removeEntities(dbTx, entityClass, { blockNumber: MoreThan(argv.blockNumber) }); + }); + + await Promise.all(removeEntitiesPromise); + + if (syncStatus.latestIndexedBlockNumber > blockProgress.blockNumber) { + await indexer.updateSyncStatusIndexedBlock(blockProgress.blockHash, blockProgress.blockNumber, true); + } + + if (syncStatus.latestCanonicalBlockNumber > blockProgress.blockNumber) { + await indexer.updateSyncStatusCanonicalBlock(blockProgress.blockHash, blockProgress.blockNumber, true); + } + + if (hooksStatus.latestProcessedBlockNumber > blockProgress.blockNumber) { + await indexer.updateHookStatusProcessedBlock(blockProgress.blockNumber, true); + } + + dbTx.commitTransaction(); + } catch (error) { + await dbTx.rollbackTransaction(); + throw error; + } finally { + await dbTx.release(); + } + + log('Reset state successfully'); +}; diff --git a/packages/graph-test-watcher/src/cli/reset.ts b/packages/graph-test-watcher/src/cli/reset.ts new file mode 100644 index 00000000..2ddebf10 --- /dev/null +++ b/packages/graph-test-watcher/src/cli/reset.ts @@ -0,0 +1,24 @@ +// +// Copyright 2021 Vulcanize, Inc. 
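The reset `state` command above deletes all watcher rows above the target block inside a single transaction and then winds the sync and hook status back. A condensed sketch of the per-entity delete it performs (via `db.removeEntities`), shown here for `BlockProgress`:

```ts
import { MoreThan, QueryRunner } from 'typeorm';

import { BlockProgress } from '../../entity/BlockProgress';

// Remove every BlockProgress row recorded above the reset target; GetMethod and
// _Test rows are removed with the same blockNumber condition.
async function removeAboveBlock (dbTx: QueryRunner, blockNumber: number): Promise<void> {
  await dbTx.manager.delete(BlockProgress, { blockNumber: MoreThan(blockNumber) });
}
```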
+// + +import 'reflect-metadata'; +import debug from 'debug'; + +import { getResetYargs } from '@vulcanize/util'; + +const log = debug('vulcanize:reset'); + +const main = async () => { + return getResetYargs() + .commandDir('reset-cmds', { extensions: ['ts', 'js'], exclude: /([a-zA-Z0-9\s_\\.\-:])+(.d.ts)$/ }) + .demandCommand(1) + .help() + .argv; +}; + +main().then(() => { + process.exit(); +}).catch(err => { + log(err); +}); diff --git a/packages/graph-test-watcher/src/cli/watch-contract.ts b/packages/graph-test-watcher/src/cli/watch-contract.ts index a9b9640e..8b80c586 100644 --- a/packages/graph-test-watcher/src/cli/watch-contract.ts +++ b/packages/graph-test-watcher/src/cli/watch-contract.ts @@ -2,15 +2,20 @@ // Copyright 2021 Vulcanize, Inc. // -import assert from 'assert'; +import path from 'path'; import yargs from 'yargs'; import 'reflect-metadata'; +import debug from 'debug'; -import { Config, DEFAULT_CONFIG_PATH, getConfig } from '@vulcanize/util'; +import { Config, DEFAULT_CONFIG_PATH, getConfig, initClients } from '@vulcanize/util'; +import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; import { Database } from '../database'; +import { Indexer } from '../indexer'; -(async () => { +const log = debug('vulcanize:watch-contract'); + +const main = async (): Promise => { const argv = await yargs.parserConfiguration({ 'parse-numbers': false }).options({ @@ -19,7 +24,7 @@ import { Database } from '../database'; type: 'string', require: true, demandOption: true, - describe: 'configuration file path (toml)', + describe: 'Configuration file path (toml)', default: DEFAULT_CONFIG_PATH }, address: { @@ -34,22 +39,41 @@ import { Database } from '../database'; demandOption: true, describe: 'Kind of contract' }, + checkpoint: { + type: 'boolean', + require: true, + demandOption: true, + describe: 'Turn checkpointing on' + }, startingBlock: { type: 'number', - default: 1, describe: 'Starting block' } }).argv; const config: Config = await getConfig(argv.configFile); - const { database: dbConfig } = config; + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); - assert(dbConfig); - - const db = new Database(dbConfig); + const db = new Database(config.database); await db.init(); - await db.saveContract(argv.address, argv.kind, argv.startingBlock); + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); + await graphDb.init(); + + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + await indexer.watchContract(argv.address, argv.kind, argv.checkpoint, argv.startingBlock); await db.close(); -})(); +}; + +main().catch(err => { + log(err); +}).finally(() => { + process.exit(0); +}); diff --git a/packages/graph-test-watcher/src/client.ts b/packages/graph-test-watcher/src/client.ts index aedd9c7a..877bc2c7 100644 --- a/packages/graph-test-watcher/src/client.ts +++ b/packages/graph-test-watcher/src/client.ts @@ -3,7 +3,6 @@ // import { gql } from '@apollo/client/core'; - import { GraphQLClient, GraphQLConfig } from '@vulcanize/ipld-eth-client'; import { queries, mutations, subscriptions } from './gql'; @@ -18,7 +17,6 @@ export class Client { this._client = new GraphQLClient(config); } - // eslint-disable-next-line camelcase async getGetMethod (blockHash: string, contractAddress: string): 
Promise { const { getMethod } = await this._client.query( gql(queries.getMethod), @@ -28,8 +26,7 @@ export class Client { return getMethod; } - // eslint-disable-next-line camelcase - async get_test (blockHash: string, contractAddress: string): Promise { + async _getTest (blockHash: string, contractAddress: string): Promise { const { _test } = await this._client.query( gql(queries._test), { blockHash, contractAddress } diff --git a/packages/graph-test-watcher/src/database.ts b/packages/graph-test-watcher/src/database.ts index 1869151d..1fd4e106 100644 --- a/packages/graph-test-watcher/src/database.ts +++ b/packages/graph-test-watcher/src/database.ts @@ -3,20 +3,22 @@ // import assert from 'assert'; -import { Connection, ConnectionOptions, DeepPartial, FindConditions, QueryRunner, FindManyOptions } from 'typeorm'; +import { Connection, ConnectionOptions, DeepPartial, FindConditions, QueryRunner, FindManyOptions, MoreThan } from 'typeorm'; import path from 'path'; -import { Database as BaseDatabase } from '@vulcanize/util'; +import { Database as BaseDatabase, MAX_REORG_DEPTH, DatabaseInterface } from '@vulcanize/util'; import { Contract } from './entity/Contract'; import { Event } from './entity/Event'; import { SyncStatus } from './entity/SyncStatus'; +import { HookStatus } from './entity/HookStatus'; import { BlockProgress } from './entity/BlockProgress'; +import { IPLDBlock } from './entity/IPLDBlock'; import { GetMethod } from './entity/GetMethod'; import { _Test } from './entity/_Test'; -export class Database { +export class Database implements DatabaseInterface { _config: ConnectionOptions; _conn!: Connection; _baseDatabase: BaseDatabase; @@ -43,7 +45,6 @@ export class Database { return this._baseDatabase.close(); } - // eslint-disable-next-line camelcase async getGetMethod ({ blockHash, contractAddress }: { blockHash: string, contractAddress: string }): Promise { return this._conn.getRepository(GetMethod) .findOne({ @@ -52,8 +53,7 @@ export class Database { }); } - // eslint-disable-next-line camelcase - async get_test ({ blockHash, contractAddress }: { blockHash: string, contractAddress: string }): Promise<_Test | undefined> { + async _getTest ({ blockHash, contractAddress }: { blockHash: string, contractAddress: string }): Promise<_Test | undefined> { return this._conn.getRepository(_Test) .findOne({ blockHash, @@ -61,20 +61,175 @@ export class Database { }); } - // eslint-disable-next-line camelcase - async saveGetMethod ({ blockHash, contractAddress, value, proof }: DeepPartial): Promise { + async saveGetMethod ({ blockHash, blockNumber, contractAddress, value, proof }: DeepPartial): Promise { const repo = this._conn.getRepository(GetMethod); - const entity = repo.create({ blockHash, contractAddress, value, proof }); + const entity = repo.create({ blockHash, blockNumber, contractAddress, value, proof }); return repo.save(entity); } - // eslint-disable-next-line camelcase - async save_test ({ blockHash, contractAddress, value, proof }: DeepPartial<_Test>): Promise<_Test> { + async _saveTest ({ blockHash, blockNumber, contractAddress, value, proof }: DeepPartial<_Test>): Promise<_Test> { const repo = this._conn.getRepository(_Test); - const entity = repo.create({ blockHash, contractAddress, value, proof }); + const entity = repo.create({ blockHash, blockNumber, contractAddress, value, proof }); return repo.save(entity); } + async getIPLDBlocks (where: FindConditions): Promise { + const repo = this._conn.getRepository(IPLDBlock); + return repo.find({ where, relations: ['block'] }); + 
} + + async getLatestIPLDBlock (contractAddress: string, kind: string | null, blockNumber?: number): Promise { + const repo = this._conn.getRepository(IPLDBlock); + + let queryBuilder = repo.createQueryBuilder('ipld_block') + .leftJoinAndSelect('ipld_block.block', 'block') + .where('block.is_pruned = false') + .andWhere('ipld_block.contract_address = :contractAddress', { contractAddress }) + .orderBy('block.block_number', 'DESC'); + + // Filter out blocks after the provided block number. + if (blockNumber) { + queryBuilder.andWhere('block.block_number <= :blockNumber', { blockNumber }); + } + + // Filter using kind if specified else order by id to give preference to checkpoint. + queryBuilder = kind + ? queryBuilder.andWhere('ipld_block.kind = :kind', { kind }) + : queryBuilder.andWhere('ipld_block.kind != :kind', { kind: 'diff_staged' }) + .addOrderBy('ipld_block.id', 'DESC'); + + return queryBuilder.getOne(); + } + + async getPrevIPLDBlock (queryRunner: QueryRunner, blockHash: string, contractAddress: string, kind?: string): Promise { + const heirerchicalQuery = ` + WITH RECURSIVE cte_query AS + ( + SELECT + b.block_hash, + b.block_number, + b.parent_hash, + 1 as depth, + i.id, + i.kind + FROM + block_progress b + LEFT JOIN + ipld_block i ON i.block_id = b.id + AND i.contract_address = $2 + WHERE + b.block_hash = $1 + UNION ALL + SELECT + b.block_hash, + b.block_number, + b.parent_hash, + c.depth + 1, + i.id, + i.kind + FROM + block_progress b + LEFT JOIN + ipld_block i + ON i.block_id = b.id + AND i.contract_address = $2 + INNER JOIN + cte_query c ON c.parent_hash = b.block_hash + WHERE + c.depth < $3 + ) + SELECT + block_number, id, kind + FROM + cte_query + ORDER BY block_number DESC, id DESC + `; + + // Fetching block and id for previous IPLDBlock in frothy region. + const queryResult = await queryRunner.query(heirerchicalQuery, [blockHash, contractAddress, MAX_REORG_DEPTH]); + const latestRequiredResult = kind + ? queryResult.find((obj: any) => obj.kind === kind) + : queryResult.find((obj: any) => obj.id); + + let result: IPLDBlock | undefined; + if (latestRequiredResult) { + result = await queryRunner.manager.findOne(IPLDBlock, { id: latestRequiredResult.id }, { relations: ['block'] }); + } else { + // If IPLDBlock not found in frothy region get latest IPLDBlock in the pruned region. + // Filter out IPLDBlocks from pruned blocks. + const canonicalBlockNumber = queryResult.pop().block_number + 1; + + let queryBuilder = queryRunner.manager.createQueryBuilder(IPLDBlock, 'ipld_block') + .leftJoinAndSelect('ipld_block.block', 'block') + .where('block.is_pruned = false') + .andWhere('ipld_block.contract_address = :contractAddress', { contractAddress }) + .andWhere('block.block_number <= :canonicalBlockNumber', { canonicalBlockNumber }) + .orderBy('block.block_number', 'DESC'); + + // Filter using kind if specified else order by id to give preference to checkpoint. + queryBuilder = kind + ? queryBuilder.andWhere('ipld_block.kind = :kind', { kind }) + : queryBuilder.addOrderBy('ipld_block.id', 'DESC'); + + result = await queryBuilder.getOne(); + } + + return result; + } + + // Fetch all diff IPLDBlocks after the specified checkpoint. 
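+  // Results are ordered by block number so callers can apply the diffs in sequence.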
+ async getDiffIPLDBlocksByCheckpoint (contractAddress: string, checkpointBlockNumber: number): Promise { + const repo = this._conn.getRepository(IPLDBlock); + + return repo.find({ + relations: ['block'], + where: { + contractAddress, + kind: 'diff', + block: { + isPruned: false, + blockNumber: MoreThan(checkpointBlockNumber) + } + }, + order: { + block: 'ASC' + } + }); + } + + async saveOrUpdateIPLDBlock (ipldBlock: IPLDBlock): Promise { + const repo = this._conn.getRepository(IPLDBlock); + return repo.save(ipldBlock); + } + + async getHookStatus (queryRunner: QueryRunner): Promise { + const repo = queryRunner.manager.getRepository(HookStatus); + + return repo.findOne(); + } + + async updateHookStatusProcessedBlock (queryRunner: QueryRunner, blockNumber: number, force?: boolean): Promise { + const repo = queryRunner.manager.getRepository(HookStatus); + let entity = await repo.findOne(); + + if (!entity) { + entity = repo.create({ + latestProcessedBlockNumber: blockNumber + }); + } + + if (force || blockNumber > entity.latestProcessedBlockNumber) { + entity.latestProcessedBlockNumber = blockNumber; + } + + return repo.save(entity); + } + + async getContracts (where: FindConditions): Promise { + const repo = this._conn.getRepository(Contract); + return repo.find({ where }); + } + async getContract (address: string): Promise { const repo = this._conn.getRepository(Contract); @@ -115,24 +270,24 @@ export class Database { return this._baseDatabase.saveEvents(blockRepo, eventRepo, block, events); } - async saveContract (address: string, kind: string, startingBlock: number): Promise { + async saveContract (address: string, kind: string, checkpoint: boolean, startingBlock: number): Promise { await this._conn.transaction(async (tx) => { const repo = tx.getRepository(Contract); - return this._baseDatabase.saveContract(repo, address, startingBlock, kind); + return this._baseDatabase.saveContract(repo, address, kind, checkpoint, startingBlock); }); } - async updateSyncStatusIndexedBlock (queryRunner: QueryRunner, blockHash: string, blockNumber: number): Promise { + async updateSyncStatusIndexedBlock (queryRunner: QueryRunner, blockHash: string, blockNumber: number, force = false): Promise { const repo = queryRunner.manager.getRepository(SyncStatus); - return this._baseDatabase.updateSyncStatusIndexedBlock(repo, blockHash, blockNumber); + return this._baseDatabase.updateSyncStatusIndexedBlock(repo, blockHash, blockNumber, force); } - async updateSyncStatusCanonicalBlock (queryRunner: QueryRunner, blockHash: string, blockNumber: number): Promise { + async updateSyncStatusCanonicalBlock (queryRunner: QueryRunner, blockHash: string, blockNumber: number, force = false): Promise { const repo = queryRunner.manager.getRepository(SyncStatus); - return this._baseDatabase.updateSyncStatusCanonicalBlock(repo, blockHash, blockNumber); + return this._baseDatabase.updateSyncStatusCanonicalBlock(repo, blockHash, blockNumber, force); } async updateSyncStatusChainHead (queryRunner: QueryRunner, blockHash: string, blockNumber: number): Promise { diff --git a/packages/graph-test-watcher/src/entity/BlockProgress.ts b/packages/graph-test-watcher/src/entity/BlockProgress.ts index 727d90bf..9fe65afd 100644 --- a/packages/graph-test-watcher/src/entity/BlockProgress.ts +++ b/packages/graph-test-watcher/src/entity/BlockProgress.ts @@ -13,6 +13,9 @@ export class BlockProgress implements BlockProgressInterface { @PrimaryGeneratedColumn() id!: number; + @Column('varchar') + cid!: string; + @Column('varchar', { length: 66 }) 
blockHash!: string; diff --git a/packages/graph-test-watcher/src/entity/Contract.ts b/packages/graph-test-watcher/src/entity/Contract.ts index 83c99dcb..0727c538 100644 --- a/packages/graph-test-watcher/src/entity/Contract.ts +++ b/packages/graph-test-watcher/src/entity/Contract.ts @@ -13,9 +13,12 @@ export class Contract { @Column('varchar', { length: 42 }) address!: string; - @Column('varchar', { length: 8 }) + @Column('varchar') kind!: string; + @Column('boolean') + checkpoint!: boolean; + @Column('integer') startingBlock!: number; } diff --git a/packages/graph-test-watcher/src/entity/Event.ts b/packages/graph-test-watcher/src/entity/Event.ts index 57ec5659..c7c09d6b 100644 --- a/packages/graph-test-watcher/src/entity/Event.ts +++ b/packages/graph-test-watcher/src/entity/Event.ts @@ -12,7 +12,7 @@ export class Event { @PrimaryGeneratedColumn() id!: number; - @ManyToOne(() => BlockProgress) + @ManyToOne(() => BlockProgress, { onDelete: 'CASCADE' }) block!: BlockProgress; @Column('varchar', { length: 66 }) diff --git a/packages/graph-test-watcher/src/entity/ExampleEntity.ts b/packages/graph-test-watcher/src/entity/ExampleEntity.ts index 80a7ce78..ee53a63e 100644 --- a/packages/graph-test-watcher/src/entity/ExampleEntity.ts +++ b/packages/graph-test-watcher/src/entity/ExampleEntity.ts @@ -3,25 +3,25 @@ // import { Entity, PrimaryColumn, Column } from 'typeorm'; +import { bigintTransformer } from '@vulcanize/util'; @Entity() export class ExampleEntity { @PrimaryColumn('varchar') id!: string; - // https://typeorm.io/#/entities/primary-columns @PrimaryColumn('varchar', { length: 66 }) - blockHash!: string + blockHash!: string; @Column('integer') blockNumber!: number; - @Column('bigint') - count!: bigint + @Column('bigint', { transformer: bigintTransformer }) + count!: bigint; @Column('varchar') - param1!: string + param1!: string; @Column('integer') - param2!: number + param2!: number; } diff --git a/packages/graph-test-watcher/src/entity/GetMethod.ts b/packages/graph-test-watcher/src/entity/GetMethod.ts index 7e633a98..79a8d801 100644 --- a/packages/graph-test-watcher/src/entity/GetMethod.ts +++ b/packages/graph-test-watcher/src/entity/GetMethod.ts @@ -13,6 +13,9 @@ export class GetMethod { @Column('varchar', { length: 66 }) blockHash!: string; + @Column('integer') + blockNumber!: number; + @Column('varchar', { length: 42 }) contractAddress!: string; diff --git a/packages/graph-test-watcher/src/entity/HookStatus.ts b/packages/graph-test-watcher/src/entity/HookStatus.ts new file mode 100644 index 00000000..7e67d2bb --- /dev/null +++ b/packages/graph-test-watcher/src/entity/HookStatus.ts @@ -0,0 +1,14 @@ +// +// Copyright 2021 Vulcanize, Inc. +// + +import { Entity, PrimaryGeneratedColumn, Column } from 'typeorm'; + +@Entity() +export class HookStatus { + @PrimaryGeneratedColumn() + id!: number; + + @Column('integer') + latestProcessedBlockNumber!: number; +} diff --git a/packages/graph-test-watcher/src/entity/IPLDBlock.ts b/packages/graph-test-watcher/src/entity/IPLDBlock.ts new file mode 100644 index 00000000..60a1c5ec --- /dev/null +++ b/packages/graph-test-watcher/src/entity/IPLDBlock.ts @@ -0,0 +1,30 @@ +// +// Copyright 2021 Vulcanize, Inc. 
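`ExampleEntity.count` and `_Test.value` above now go through `bigintTransformer` from `@vulcanize/util`, so `bigint` values survive the round trip through a `bigint` column. The transformer's implementation is not part of this diff; a typical transformer of that kind (hypothetical name) looks like:

```ts
import { ValueTransformer } from 'typeorm';

// Sketch of a bigint column transformer; the actual bigintTransformer in
// @vulcanize/util may differ in detail.
export const exampleBigintTransformer: ValueTransformer = {
  // Entity -> database: store the bigint as a string.
  to: (value?: bigint): string | undefined => value?.toString(),
  // Database -> entity: parse the stored string back into a bigint.
  from: (value?: string | null): bigint | undefined => (value == null ? undefined : BigInt(value))
};
```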
+// + +import { Entity, PrimaryGeneratedColumn, Column, Index, ManyToOne } from 'typeorm'; +import { BlockProgress } from './BlockProgress'; + +@Entity() +@Index(['cid'], { unique: true }) +@Index(['block', 'contractAddress']) +@Index(['block', 'contractAddress', 'kind'], { unique: true }) +export class IPLDBlock { + @PrimaryGeneratedColumn() + id!: number; + + @ManyToOne(() => BlockProgress, { onDelete: 'CASCADE' }) + block!: BlockProgress; + + @Column('varchar', { length: 42 }) + contractAddress!: string; + + @Column('varchar') + cid!: string; + + @Column('varchar') + kind!: string; + + @Column('bytea') + data!: Buffer; +} diff --git a/packages/graph-test-watcher/src/entity/_Test.ts b/packages/graph-test-watcher/src/entity/_Test.ts index 56c9f010..d0faf620 100644 --- a/packages/graph-test-watcher/src/entity/_Test.ts +++ b/packages/graph-test-watcher/src/entity/_Test.ts @@ -14,10 +14,13 @@ export class _Test { @Column('varchar', { length: 66 }) blockHash!: string; + @Column('integer') + blockNumber!: number; + @Column('varchar', { length: 42 }) contractAddress!: string; - @Column('numeric', { transformer: bigintTransformer }) + @Column('bigint', { transformer: bigintTransformer }) value!: bigint; @Column('text', { nullable: true }) diff --git a/packages/graph-test-watcher/src/events.ts b/packages/graph-test-watcher/src/events.ts index 03b00d53..700aee60 100644 --- a/packages/graph-test-watcher/src/events.ts +++ b/packages/graph-test-watcher/src/events.ts @@ -10,10 +10,15 @@ import { EthClient } from '@vulcanize/ipld-eth-client'; import { JobQueue, EventWatcher as BaseEventWatcher, + EventWatcherInterface, QUEUE_BLOCK_PROCESSING, QUEUE_EVENT_PROCESSING, + QUEUE_BLOCK_CHECKPOINT, + QUEUE_HOOKS, + QUEUE_IPFS, UNKNOWN_EVENT_NAME, - UpstreamConfig + UpstreamConfig, + JOB_KIND_PRUNE } from '@vulcanize/util'; import { Indexer } from './indexer'; @@ -23,7 +28,7 @@ const EVENT = 'event'; const log = debug('vulcanize:events'); -export class EventWatcher { +export class EventWatcher implements EventWatcherInterface { _ethClient: EthClient _indexer: Indexer _subscription: ZenObservable.Subscription | undefined @@ -55,6 +60,8 @@ export class EventWatcher { await this.initBlockProcessingOnCompleteHandler(); await this.initEventProcessingOnCompleteHandler(); + await this.initHooksOnCompleteHandler(); + await this.initBlockCheckpointOnCompleteHandler(); this._baseEventWatcher.startBlockProcessing(); } @@ -64,7 +71,7 @@ export class EventWatcher { async initBlockProcessingOnCompleteHandler (): Promise { this._jobQueue.onComplete(QUEUE_BLOCK_PROCESSING, async (job) => { - const { id, data: { failed } } = job; + const { id, data: { failed, request: { data: { kind } } } } = job; if (failed) { log(`Job ${id} for queue ${QUEUE_BLOCK_PROCESSING} failed`); @@ -72,6 +79,8 @@ export class EventWatcher { } await this._baseEventWatcher.blockProcessingCompleteHandler(job); + + await this.createHooksJob(kind); }); } @@ -99,6 +108,27 @@ export class EventWatcher { }); } + async initHooksOnCompleteHandler (): Promise { + this._jobQueue.onComplete(QUEUE_HOOKS, async (job) => { + const { data: { request: { data: { blockNumber, blockHash } } } } = job; + + await this._indexer.updateHookStatusProcessedBlock(blockNumber); + + // Create a checkpoint job after completion of a hook job. 
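+      // Completion of that checkpoint job in turn queues IPFS put jobs when IPFS is configured (see initBlockCheckpointOnCompleteHandler below).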
+ await this.createCheckpointJob(blockHash, blockNumber); + }); + } + + async initBlockCheckpointOnCompleteHandler (): Promise { + this._jobQueue.onComplete(QUEUE_BLOCK_CHECKPOINT, async (job) => { + const { data: { request: { data: { blockHash } } } } = job; + + if (this._indexer.isIPFSConfigured()) { + await this.createIPFSPutJob(blockHash); + } + }); + } + async publishEventToSubscribers (dbEvent: Event, timeElapsedInSeconds: number): Promise { if (dbEvent && dbEvent.eventName !== UNKNOWN_EVENT_NAME) { const resultEvent = this._indexer.getResultEvent(dbEvent); @@ -111,4 +141,40 @@ export class EventWatcher { }); } } + + async createHooksJob (kind: string): Promise { + // If it's a pruning job: Create a hook job for the latest canonical block. + if (kind === JOB_KIND_PRUNE) { + const latestCanonicalBlock = await this._indexer.getLatestCanonicalBlock(); + assert(latestCanonicalBlock); + + await this._jobQueue.pushJob( + QUEUE_HOOKS, + { + blockHash: latestCanonicalBlock.blockHash, + blockNumber: latestCanonicalBlock.blockNumber + } + ); + } + } + + async createCheckpointJob (blockHash: string, blockNumber: number): Promise { + await this._jobQueue.pushJob( + QUEUE_BLOCK_CHECKPOINT, + { + blockHash, + blockNumber + } + ); + } + + async createIPFSPutJob (blockHash: string): Promise { + const ipldBlocks = await this._indexer.getIPLDBlocksByHash(blockHash); + + for (const ipldBlock of ipldBlocks) { + const data = this._indexer.getIPLDData(ipldBlock); + + await this._jobQueue.pushJob(QUEUE_IPFS, { data }); + } + } } diff --git a/packages/graph-test-watcher/src/fill.ts b/packages/graph-test-watcher/src/fill.ts index 2c67bd5f..0c3a5408 100644 --- a/packages/graph-test-watcher/src/fill.ts +++ b/packages/graph-test-watcher/src/fill.ts @@ -2,17 +2,15 @@ // Copyright 2021 Vulcanize, Inc. 
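The handlers added to events.ts above chain the per-block work: block processing completion queues a hooks job, hook completion queues a checkpoint job, and checkpoint completion queues IPFS put jobs when IPFS is configured. A compact sketch of those hand-offs, with payload shapes copied from the handlers above:

```ts
import { JobQueue, QUEUE_HOOKS, QUEUE_BLOCK_CHECKPOINT, QUEUE_IPFS } from '@vulcanize/util';

// Payload shape used by the hooks and checkpoint jobs above.
interface BlockJobData {
  blockHash: string;
  blockNumber: number;
}

// Block processing complete -> hooks job.
const queueHooksJob = async (jobQueue: JobQueue, data: BlockJobData): Promise<void> => {
  await jobQueue.pushJob(QUEUE_HOOKS, data);
};

// Hooks job complete -> checkpoint job.
const queueCheckpointJob = async (jobQueue: JobQueue, data: BlockJobData): Promise<void> => {
  await jobQueue.pushJob(QUEUE_BLOCK_CHECKPOINT, data);
};

// Checkpoint job complete -> one IPFS put job per IPLDBlock of the block.
const queueIpfsPutJob = async (jobQueue: JobQueue, ipldData: any): Promise<void> => {
  await jobQueue.pushJob(QUEUE_IPFS, { data: ipldData });
};
```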
// +import path from 'path'; import assert from 'assert'; import 'reflect-metadata'; import yargs from 'yargs'; import { hideBin } from 'yargs/helpers'; import debug from 'debug'; import { PubSub } from 'apollo-server-express'; -import path from 'path'; -import { getCache } from '@vulcanize/cache'; -import { EthClient } from '@vulcanize/ipld-eth-client'; -import { getConfig, fillBlocks, JobQueue, DEFAULT_CONFIG_PATH, getCustomProvider } from '@vulcanize/util'; +import { Config, getConfig, fillBlocks, JobQueue, DEFAULT_CONFIG_PATH, initClients } from '@vulcanize/util'; import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; import { Database } from './database'; @@ -44,46 +42,27 @@ export const main = async (): Promise => { } }).argv; - const config = await getConfig(argv.configFile); + const config: Config = await getConfig(argv.configFile); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); - assert(config.server, 'Missing server config'); - - const { upstream, database: dbConfig, jobQueue: jobQueueConfig, server: { subgraphPath } } = config; - - assert(dbConfig, 'Missing database config'); - - const db = new Database(dbConfig); + const db = new Database(config.database); await db.init(); - assert(upstream, 'Missing upstream config'); - const { ethServer: { gqlApiEndpoint, gqlPostgraphileEndpoint, rpcProviderEndpoint, blockDelayInMilliSecs }, cache: cacheConfig } = upstream; - assert(gqlPostgraphileEndpoint, 'Missing upstream ethServer.gqlPostgraphileEndpoint'); - - const cache = await getCache(cacheConfig); - - const ethClient = new EthClient({ - gqlEndpoint: gqlApiEndpoint, - gqlSubscriptionEndpoint: gqlPostgraphileEndpoint, - cache - }); - - const postgraphileClient = new EthClient({ - gqlEndpoint: gqlPostgraphileEndpoint, - cache - }); - - const ethProvider = getCustomProvider(rpcProviderEndpoint); - - const graphDb = new GraphDatabase(dbConfig, path.resolve(__dirname, 'entity/*')); + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); await graphDb.init(); - const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, subgraphPath); - await graphWatcher.init(); + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); // Note: In-memory pubsub works fine for now, as each watcher is a single process anyway. 
// Later: https://www.apollographql.com/docs/apollo-server/data/subscriptions/#production-pubsub-libraries const pubsub = new PubSub(); - const indexer = new Indexer(db, ethClient, postgraphileClient, ethProvider, graphWatcher); + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const jobQueueConfig = config.jobQueue; + assert(jobQueueConfig, 'Missing job queue config'); const { dbConnectionString, maxCompletionLagInSecs } = jobQueueConfig; assert(dbConnectionString, 'Missing job queue db connection string'); @@ -91,11 +70,9 @@ export const main = async (): Promise => { const jobQueue = new JobQueue({ dbConnectionString, maxCompletionLag: maxCompletionLagInSecs }); await jobQueue.start(); - const eventWatcher = new EventWatcher(upstream, ethClient, postgraphileClient, indexer, pubsub, jobQueue); + const eventWatcher = new EventWatcher(config.upstream, ethClient, postgraphileClient, indexer, pubsub, jobQueue); - assert(jobQueueConfig, 'Missing job queue config'); - - await fillBlocks(jobQueue, indexer, postgraphileClient, eventWatcher, blockDelayInMilliSecs, argv); + await fillBlocks(jobQueue, indexer, postgraphileClient, eventWatcher, config.upstream.ethServer.blockDelayInMilliSecs, argv); }; main().catch(err => { diff --git a/packages/graph-test-watcher/src/gql/mutations/watchContract.gql b/packages/graph-test-watcher/src/gql/mutations/watchContract.gql index 8b272e0b..2ecc74f7 100644 --- a/packages/graph-test-watcher/src/gql/mutations/watchContract.gql +++ b/packages/graph-test-watcher/src/gql/mutations/watchContract.gql @@ -1,3 +1,3 @@ -mutation watchContract($contractAddress: String!, $startingBlock: Int){ - watchContract(contractAddress: $contractAddress, startingBlock: $startingBlock) +mutation watchContract($address: String!, $kind: String!, $checkpoint: Boolean!, $startingBlock: Int){ + watchContract(address: $address, kind: $kind, checkpoint: $checkpoint, startingBlock: $startingBlock) } \ No newline at end of file diff --git a/packages/graph-test-watcher/src/gql/queries/events.gql b/packages/graph-test-watcher/src/gql/queries/events.gql index 920a8783..de52299c 100644 --- a/packages/graph-test-watcher/src/gql/queries/events.gql +++ b/packages/graph-test-watcher/src/gql/queries/events.gql @@ -1,6 +1,7 @@ query events($blockHash: String!, $contractAddress: String!, $name: String){ events(blockHash: $blockHash, contractAddress: $contractAddress, name: $name){ block{ + cid hash number timestamp diff --git a/packages/graph-test-watcher/src/gql/queries/eventsInRange.gql b/packages/graph-test-watcher/src/gql/queries/eventsInRange.gql index a757c245..ebfa9572 100644 --- a/packages/graph-test-watcher/src/gql/queries/eventsInRange.gql +++ b/packages/graph-test-watcher/src/gql/queries/eventsInRange.gql @@ -1,6 +1,7 @@ query eventsInRange($fromBlockNumber: Int!, $toBlockNumber: Int!){ eventsInRange(fromBlockNumber: $fromBlockNumber, toBlockNumber: $toBlockNumber){ block{ + cid hash number timestamp diff --git a/packages/graph-test-watcher/src/gql/queries/exampleEntity.gql b/packages/graph-test-watcher/src/gql/queries/exampleEntity.gql new file mode 100644 index 00000000..cab9ae7b --- /dev/null +++ b/packages/graph-test-watcher/src/gql/queries/exampleEntity.gql @@ -0,0 +1,8 @@ +query exampleEntity($id: String!, $blockHash: String!){ + exampleEntity(id: $id, blockHash: $blockHash){ + id + count + param1 + param2 + } +} \ No newline at end of file diff --git 
a/packages/graph-test-watcher/src/gql/queries/getState.gql b/packages/graph-test-watcher/src/gql/queries/getState.gql new file mode 100644 index 00000000..3b8f6050 --- /dev/null +++ b/packages/graph-test-watcher/src/gql/queries/getState.gql @@ -0,0 +1,15 @@ +query getState($blockHash: String!, $contractAddress: String!, $kind: String){ + getState(blockHash: $blockHash, contractAddress: $contractAddress, kind: $kind){ + block{ + cid + hash + number + timestamp + parentHash + } + contractAddress + cid + kind + data + } +} \ No newline at end of file diff --git a/packages/graph-test-watcher/src/gql/queries/getStateByCID.gql b/packages/graph-test-watcher/src/gql/queries/getStateByCID.gql new file mode 100644 index 00000000..6c3c4fd8 --- /dev/null +++ b/packages/graph-test-watcher/src/gql/queries/getStateByCID.gql @@ -0,0 +1,15 @@ +query getStateByCID($cid: String!){ + getStateByCID(cid: $cid){ + block{ + cid + hash + number + timestamp + parentHash + } + contractAddress + cid + kind + data + } +} \ No newline at end of file diff --git a/packages/graph-test-watcher/src/gql/queries/index.ts b/packages/graph-test-watcher/src/gql/queries/index.ts index cb746635..e5838f0e 100644 --- a/packages/graph-test-watcher/src/gql/queries/index.ts +++ b/packages/graph-test-watcher/src/gql/queries/index.ts @@ -5,3 +5,6 @@ export const events = fs.readFileSync(path.join(__dirname, 'events.gql'), 'utf8' export const eventsInRange = fs.readFileSync(path.join(__dirname, 'eventsInRange.gql'), 'utf8'); export const getMethod = fs.readFileSync(path.join(__dirname, 'getMethod.gql'), 'utf8'); export const _test = fs.readFileSync(path.join(__dirname, '_test.gql'), 'utf8'); +export const exampleEntity = fs.readFileSync(path.join(__dirname, 'exampleEntity.gql'), 'utf8'); +export const getStateByCID = fs.readFileSync(path.join(__dirname, 'getStateByCID.gql'), 'utf8'); +export const getState = fs.readFileSync(path.join(__dirname, 'getState.gql'), 'utf8'); diff --git a/packages/graph-test-watcher/src/gql/subscriptions/onEvent.gql b/packages/graph-test-watcher/src/gql/subscriptions/onEvent.gql index 7eea1913..57f3b509 100644 --- a/packages/graph-test-watcher/src/gql/subscriptions/onEvent.gql +++ b/packages/graph-test-watcher/src/gql/subscriptions/onEvent.gql @@ -1,6 +1,7 @@ subscription onEvent{ onEvent{ block{ + cid hash number timestamp diff --git a/packages/graph-test-watcher/src/hooks.example.ts b/packages/graph-test-watcher/src/hooks.example.ts deleted file mode 100644 index 04b5ebf2..00000000 --- a/packages/graph-test-watcher/src/hooks.example.ts +++ /dev/null @@ -1,51 +0,0 @@ -// -// Copyright 2021 Vulcanize, Inc. -// - -import assert from 'assert'; - -import { Indexer, ResultEvent } from './indexer'; - -/** - * Event hook function. - * @param indexer Indexer instance that contains methods to fetch and update the contract values in the database. - * @param eventData ResultEvent object containing necessary information. - */ -export async function handleEvent (indexer: Indexer, eventData: ResultEvent): Promise { - assert(indexer); - assert(eventData); - - // The following code is for ERC20 contract implementation. - - // Perform indexing based on the type of event. - switch (eventData.event.__typename) { - // In case of ERC20 'Transfer' event. - case 'TransferEvent': { - // On a transfer, balances for both parties change. - // Therefore, trigger indexing for both sender and receiver. - - // Get event fields from eventData. - // const { from, to } = eventData.event; - - // Update balance entry for sender in the database. 
- // await indexer.balanceOf(eventData.block.hash, eventData.contract, from); - - // Update balance entry for receiver in the database. - // await indexer.balanceOf(eventData.block.hash, eventData.contract, to); - - break; - } - // In case of ERC20 'Approval' event. - case 'ApprovalEvent': { - // On an approval, allowance for (owner, spender) combination changes. - - // Get event fields from eventData. - // const { owner, spender } = eventData.event; - - // Update allowance entry for (owner, spender) combination in the database. - // await indexer.allowance(eventData.block.hash, eventData.contract, owner, spender); - - break; - } - } -} diff --git a/packages/graph-test-watcher/src/hooks.ts b/packages/graph-test-watcher/src/hooks.ts index 27e856ae..14211ffc 100644 --- a/packages/graph-test-watcher/src/hooks.ts +++ b/packages/graph-test-watcher/src/hooks.ts @@ -6,14 +6,33 @@ import assert from 'assert'; import { Indexer, ResultEvent } from './indexer'; -/** - * Event hook function. - * @param indexer Indexer instance that contains methods to fetch and update the contract values in the database. - * @param eventData ResultEvent object containing necessary information. - */ +export async function createInitialCheckpoint (indexer: Indexer, contractAddress: string, blockHash: string): Promise { + assert(indexer); + assert(blockHash); + assert(contractAddress); + + // Store an empty state in an IPLDBlock. + const ipldBlockData: any = { + state: {} + }; + + await indexer.createCheckpoint(contractAddress, blockHash, ipldBlockData); +} + +export async function createStateDiff (indexer: Indexer, blockHash: string): Promise { + assert(indexer); + assert(blockHash); +} + +export async function createStateCheckpoint (indexer: Indexer, contractAddress: string, blockHash: string): Promise { + assert(indexer); + assert(blockHash); + assert(contractAddress); + + return false; +} + export async function handleEvent (indexer: Indexer, eventData: ResultEvent): Promise { assert(indexer); assert(eventData); - - // Perform indexing based on the type of event. 
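+  // Hypothetical example (not part of this diff): index a storage value when the
+  // Example contract emits its Test event, staging a state diff for it.
+  // switch (eventData.event.__typename) {
+  //   case 'TestEvent':
+  //     await indexer._test(eventData.block.hash, eventData.contract, true);
+  //     break;
+  // }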
} diff --git a/packages/graph-test-watcher/src/indexer.ts b/packages/graph-test-watcher/src/indexer.ts index 05f8a6b9..a960c04e 100644 --- a/packages/graph-test-watcher/src/indexer.ts +++ b/packages/graph-test-watcher/src/indexer.ts @@ -7,21 +7,28 @@ import debug from 'debug'; import { DeepPartial } from 'typeorm'; import JSONbig from 'json-bigint'; import { ethers } from 'ethers'; +import { sha256 } from 'multiformats/hashes/sha2'; +import { CID } from 'multiformats/cid'; +import _ from 'lodash'; import { JsonFragment } from '@ethersproject/abi'; import { BaseProvider } from '@ethersproject/providers'; +import * as codec from '@ipld/dag-cbor'; import { EthClient } from '@vulcanize/ipld-eth-client'; import { StorageLayout } from '@vulcanize/solidity-mapper'; -import { EventInterface, Indexer as BaseIndexer, ValueResult, UNKNOWN_EVENT_NAME } from '@vulcanize/util'; +import { EventInterface, Indexer as BaseIndexer, IndexerInterface, ValueResult, UNKNOWN_EVENT_NAME, ServerConfig, updateStateForElementaryType } from '@vulcanize/util'; import { GraphWatcher } from '@vulcanize/graph-node'; import { Database } from './database'; import { Contract } from './entity/Contract'; import { Event } from './entity/Event'; import { SyncStatus } from './entity/SyncStatus'; +import { HookStatus } from './entity/HookStatus'; import { BlockProgress } from './entity/BlockProgress'; +import { IPLDBlock } from './entity/IPLDBlock'; import artifacts from './artifacts/Example.json'; -import { handleEvent } from './hooks'; +import { createInitialCheckpoint, handleEvent, createStateDiff, createStateCheckpoint } from './hooks'; +import { IPFSClient } from './ipfs'; const log = debug('vulcanize:indexer'); @@ -29,6 +36,7 @@ const TEST_EVENT = 'Test'; export type ResultEvent = { block: { + cid: string; hash: string; number: number; timestamp: number; @@ -48,30 +56,49 @@ export type ResultEvent = { event: any; proof: string; -} +}; -export class Indexer { +export type ResultIPLDBlock = { + block: { + cid: string; + hash: string; + number: number; + timestamp: number; + parentHash: string; + }; + contractAddress: string; + cid: string; + kind: string; + data: string; +}; + +export class Indexer implements IndexerInterface { _db: Database _ethClient: EthClient _ethProvider: BaseProvider - _postgraphileClient: EthClient; + _postgraphileClient: EthClient + _baseIndexer: BaseIndexer + _serverConfig: ServerConfig _graphWatcher: GraphWatcher; - _baseIndexer: BaseIndexer; _abi: JsonFragment[] _storageLayout: StorageLayout _contract: ethers.utils.Interface - constructor (db: Database, ethClient: EthClient, postgraphileClient: EthClient, ethProvider: BaseProvider, graphWatcher: GraphWatcher) { + _ipfsClient: IPFSClient + + constructor (serverConfig: ServerConfig, db: Database, ethClient: EthClient, postgraphileClient: EthClient, ethProvider: BaseProvider, graphWatcher: GraphWatcher) { assert(db); assert(ethClient); + assert(postgraphileClient); this._db = db; this._ethClient = ethClient; this._postgraphileClient = postgraphileClient; this._ethProvider = ethProvider; + this._serverConfig = serverConfig; + this._baseIndexer = new BaseIndexer(this._db, this._ethClient, this._postgraphileClient, this._ethProvider); this._graphWatcher = graphWatcher; - this._baseIndexer = new BaseIndexer(this._db, this._ethClient, this._ethProvider); const { abi, storageLayout } = artifacts; @@ -82,6 +109,8 @@ export class Indexer { this._storageLayout = storageLayout; this._contract = new ethers.utils.Interface(this._abi); + + this._ipfsClient = new 
IPFSClient(this._serverConfig.ipfsApiAddr); } getResultEvent (event: Event): ResultEvent { @@ -91,6 +120,7 @@ export class Indexer { return { block: { + cid: block.cid, hash: block.blockHash, number: block.blockNumber, timestamp: block.blockTimestamp, @@ -118,6 +148,26 @@ export class Indexer { }; } + getResultIPLDBlock (ipldBlock: IPLDBlock): ResultIPLDBlock { + const block = ipldBlock.block; + + const data = codec.decode(Buffer.from(ipldBlock.data)) as any; + + return { + block: { + cid: block.cid, + hash: block.blockHash, + number: block.blockNumber, + timestamp: block.blockTimestamp, + parentHash: block.parentHash + }, + contractAddress: ipldBlock.contractAddress, + cid: ipldBlock.cid, + kind: ipldBlock.kind, + data: JSON.stringify(data) + }; + } + async getMethod (blockHash: string, contractAddress: string): Promise { const entity = await this._db.getGetMethod({ blockHash, contractAddress }); if (entity) { @@ -131,19 +181,21 @@ export class Indexer { log('getMethod: db miss, fetching from upstream server'); - const contract = new ethers.Contract(contractAddress, this._abi, this._ethProvider); + const { block: { number } } = await this._ethClient.getBlockByHash(blockHash); + const blockNumber = ethers.BigNumber.from(number).toNumber(); + const contract = new ethers.Contract(contractAddress, this._abi, this._ethProvider); const value = await contract.getMethod({ blockTag: blockHash }); const result: ValueResult = { value }; - await this._db.saveGetMethod({ blockHash, contractAddress, value: result.value, proof: JSONbig.stringify(result.proof) }); + await this._db.saveGetMethod({ blockHash, blockNumber, contractAddress, value: result.value, proof: JSONbig.stringify(result.proof) }); return result; } - async _test (blockHash: string, contractAddress: string): Promise { - const entity = await this._db.get_test({ blockHash, contractAddress }); + async _test (blockHash: string, contractAddress: string, diff = false): Promise { + const entity = await this._db._getTest({ blockHash, contractAddress }); if (entity) { log('_test: db hit.'); @@ -155,6 +207,9 @@ export class Indexer { log('_test: db miss, fetching from upstream server'); + const { block: { number } } = await this._ethClient.getBlockByHash(blockHash); + const blockNumber = ethers.BigNumber.from(number).toNumber(); + const result = await this._baseIndexer.getStorageValue( this._storageLayout, blockHash, @@ -162,13 +217,325 @@ export class Indexer { '_test' ); - await this._db.save_test({ blockHash, contractAddress, value: result.value, proof: JSONbig.stringify(result.proof) }); + await this._db._saveTest({ blockHash, blockNumber, contractAddress, value: result.value, proof: JSONbig.stringify(result.proof) }); + + if (diff) { + const stateUpdate = updateStateForElementaryType({}, '_test', result.value.toString()); + await this.createDiffStaged(contractAddress, blockHash, stateUpdate); + } return result; } - async getExampleEntity (blockHash: string, id: string): Promise { - return this._graphWatcher.getEntity(blockHash, 'ExampleEntity', id); + async processCanonicalBlock (job: any): Promise { + const { data: { blockHash } } = job; + + // Finalize staged diff blocks if any. + await this.finalizeDiffStaged(blockHash); + + // Call custom stateDiff hook. + await createStateDiff(this, blockHash); + } + + async createDiffStaged (contractAddress: string, blockHash: string, data: any): Promise { + const block = await this.getBlockProgress(blockHash); + assert(block); + + // Create a staged diff block. 
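As a minimal sketch of how the `diff` flag on the storage methods above is meant to be used, a custom event hook could pass `diff` as true so the fetched value is also written into a staged state diff for the block; the hook name below is hypothetical and illustrative only:

```ts
// Illustrative sketch: using the diff flag from an event hook.
// `handleTestEvent` is a hypothetical name, not part of this change.
import { Indexer, ResultEvent } from './indexer';

export async function handleTestEvent (indexer: Indexer, eventData: ResultEvent): Promise<void> {
  // Passing true as the third argument makes the indexer also call
  // createDiffStaged internally (see Indexer._test above).
  await indexer._test(eventData.block.hash, eventData.contract, true);
}
```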
+ const ipldBlock = await this.prepareIPLDBlock(block, contractAddress, data, 'diff_staged'); + await this.saveOrUpdateIPLDBlock(ipldBlock); + } + + async finalizeDiffStaged (blockHash: string): Promise { + const block = await this.getBlockProgress(blockHash); + assert(block); + + // Get all the staged diff blocks for the given blockHash. + const stagedBlocks = await this._db.getIPLDBlocks({ block, kind: 'diff_staged' }); + + // For each staged block, create a diff block. + for (const stagedBlock of stagedBlocks) { + const data = codec.decode(Buffer.from(stagedBlock.data)); + await this.createDiff(stagedBlock.contractAddress, stagedBlock.block.blockHash, data); + } + + // Remove all the staged diff blocks for current blockNumber. + await this.removeStagedIPLDBlocks(block.blockNumber); + } + + async createDiff (contractAddress: string, blockHash: string, data: any): Promise { + const block = await this.getBlockProgress(blockHash); + assert(block); + + // Fetch the latest checkpoint for the contract. + const checkpoint = await this.getLatestIPLDBlock(contractAddress, 'checkpoint'); + + // There should be an initial checkpoint at least. + // Assumption: There should be no events for the contract at the starting block. + assert(checkpoint, 'Initial checkpoint doesn\'t exist'); + + // Check if the latest checkpoint is in the same block. + assert(checkpoint.block.blockHash !== block.blockHash, 'Checkpoint already created for the block hash.'); + + const ipldBlock = await this.prepareIPLDBlock(block, contractAddress, data, 'diff'); + await this.saveOrUpdateIPLDBlock(ipldBlock); + } + + async processCheckpoint (job: any): Promise { + // Return if checkpointInterval is <= 0. + const checkpointInterval = this._serverConfig.checkpointInterval; + if (checkpointInterval <= 0) return; + + const { data: { blockHash, blockNumber } } = job; + + // Get all the contracts. + const contracts = await this._db.getContracts({}); + + // For each contract, merge the diff till now to create a checkpoint. + for (const contract of contracts) { + // Check if contract has checkpointing on. + if (contract.checkpoint) { + // If a checkpoint doesn't already exist and blockNumber is equal to startingBlock, create an initial checkpoint. + const checkpointBlock = await this.getLatestIPLDBlock(contract.address, 'checkpoint'); + + if (!checkpointBlock) { + if (blockNumber === contract.startingBlock) { + // Call initial checkpoint hook. + await createInitialCheckpoint(this, contract.address, blockHash); + } + } else { + await this.createCheckpoint(contract.address, blockHash, null, checkpointInterval); + } + } + } + } + + async processCLICheckpoint (contractAddress: string, blockHash?: string): Promise { + const checkpointBlockHash = await this.createCheckpoint(contractAddress, blockHash); + assert(checkpointBlockHash); + + const block = await this.getBlockProgress(checkpointBlockHash); + const checkpointIPLDBlocks = await this._db.getIPLDBlocks({ block, contractAddress, kind: 'checkpoint' }); + + // There can be at most one IPLDBlock for a (block, contractAddress, kind) combination. 
+ assert(checkpointIPLDBlocks.length <= 1); + const checkpointIPLDBlock = checkpointIPLDBlocks[0]; + + const checkpointData = this.getIPLDData(checkpointIPLDBlock); + + await this.pushToIPFS(checkpointData); + + return checkpointBlockHash; + } + + async createCheckpoint (contractAddress: string, blockHash?: string, data?: any, checkpointInterval?: number): Promise { + const syncStatus = await this.getSyncStatus(); + assert(syncStatus); + + // Getting the current block. + let currentBlock; + + if (blockHash) { + currentBlock = await this.getBlockProgress(blockHash); + } else { + // In case of empty blockHash from checkpoint CLI, get the latest canonical block for the checkpoint. + currentBlock = await this.getBlockProgress(syncStatus.latestCanonicalBlockHash); + } + + assert(currentBlock); + + // Data is passed in case of initial checkpoint and checkpoint hook. + // Assumption: There should be no events for the contract at the starting block. + if (data) { + const ipldBlock = await this.prepareIPLDBlock(currentBlock, contractAddress, data, 'checkpoint'); + await this.saveOrUpdateIPLDBlock(ipldBlock); + + return; + } + + // If data is not passed, create from previous checkpoint and diffs after that. + + // Make sure the block is marked complete. + assert(currentBlock.isComplete, 'Block for a checkpoint should be marked as complete'); + + // Make sure the block is in the pruned region. + assert(currentBlock.blockNumber <= syncStatus.latestCanonicalBlockNumber, 'Block for a checkpoint should be in the pruned region'); + + // Fetch the latest checkpoint for the contract. + const checkpointBlock = await this.getLatestIPLDBlock(contractAddress, 'checkpoint', currentBlock.blockNumber); + assert(checkpointBlock); + + // Check (only if checkpointInterval is passed) if it is time for a new checkpoint. + if (checkpointInterval && checkpointBlock.block.blockNumber > (currentBlock.blockNumber - checkpointInterval)) { + return; + } + + // Call state checkpoint hook and check if default checkpoint is disabled. + const disableDefaultCheckpoint = await createStateCheckpoint(this, contractAddress, currentBlock.blockHash); + + if (disableDefaultCheckpoint) { + // Return if default checkpoint is disabled. + // Return block hash for checkpoint CLI. + return currentBlock.blockHash; + } + + const { block: { blockNumber: checkpointBlockNumber } } = checkpointBlock; + + // Fetching all diff blocks after checkpoint. + const diffBlocks = await this.getDiffIPLDBlocksByCheckpoint(contractAddress, checkpointBlockNumber); + + const checkpointBlockData = codec.decode(Buffer.from(checkpointBlock.data)) as any; + data = { + state: checkpointBlockData.state + }; + + for (const diffBlock of diffBlocks) { + const diff = codec.decode(Buffer.from(diffBlock.data)) as any; + data.state = _.merge(data.state, diff.state); + } + + const ipldBlock = await this.prepareIPLDBlock(currentBlock, contractAddress, data, 'checkpoint'); + await this.saveOrUpdateIPLDBlock(ipldBlock); + + return currentBlock.blockHash; + } + + getIPLDData (ipldBlock: IPLDBlock): any { + return codec.decode(Buffer.from(ipldBlock.data)); + } + + async getIPLDBlocksByHash (blockHash: string): Promise { + const block = await this.getBlockProgress(blockHash); + assert(block); + + return this._db.getIPLDBlocks({ block }); + } + + async getIPLDBlockByCid (cid: string): Promise { + const ipldBlocks = await this._db.getIPLDBlocks({ cid }); + + // There can be only one IPLDBlock with a particular cid. 
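A standalone sketch of the state assembly performed in `createCheckpoint` above, where the previous checkpoint state is merged with every later diff; the helper name and the raw `Buffer` inputs are assumptions made for illustration:

```ts
// Illustrative sketch: merging a checkpoint with subsequent diffs.
import * as codec from '@ipld/dag-cbor';
import _ from 'lodash';

function mergeCheckpointState (checkpointData: Buffer, diffDatas: Buffer[]): any {
  // Decode the previous checkpoint and start from its state.
  const checkpoint = codec.decode(checkpointData) as any;
  const data = { state: checkpoint.state };

  // Apply each diff in order on top of the checkpoint state.
  for (const diffData of diffDatas) {
    const diff = codec.decode(diffData) as any;
    data.state = _.merge(data.state, diff.state);
  }

  return data;
}
```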
+ assert(ipldBlocks.length <= 1); + + return ipldBlocks[0]; + } + + async getLatestIPLDBlock (contractAddress: string, kind: string | null, blockNumber?: number): Promise { + return this._db.getLatestIPLDBlock(contractAddress, kind, blockNumber); + } + + async getPrevIPLDBlock (blockHash: string, contractAddress: string, kind?: string): Promise { + const dbTx = await this._db.createTransactionRunner(); + let res; + + try { + res = await this._db.getPrevIPLDBlock(dbTx, blockHash, contractAddress, kind); + await dbTx.commitTransaction(); + } catch (error) { + await dbTx.rollbackTransaction(); + throw error; + } finally { + await dbTx.release(); + } + return res; + } + + async getDiffIPLDBlocksByCheckpoint (contractAddress: string, checkpointBlockNumber: number): Promise { + return this._db.getDiffIPLDBlocksByCheckpoint(contractAddress, checkpointBlockNumber); + } + + async prepareIPLDBlock (block: BlockProgress, contractAddress: string, data: any, kind: string):Promise { + assert(_.includes(['diff', 'checkpoint', 'diff_staged'], kind)); + + // Get an existing 'diff' | 'diff_staged' | 'checkpoint' IPLDBlock for current block, contractAddress. + const currentIPLDBlocks = await this._db.getIPLDBlocks({ block, contractAddress, kind }); + + // There can be at most one IPLDBlock for a (block, contractAddress, kind) combination. + assert(currentIPLDBlocks.length <= 1); + const currentIPLDBlock = currentIPLDBlocks[0]; + + // Update currentIPLDBlock if it exists and is of same kind. + let ipldBlock; + if (currentIPLDBlock) { + ipldBlock = currentIPLDBlock; + + // Update the data field. + const oldData = codec.decode(Buffer.from(currentIPLDBlock.data)); + data = _.merge(oldData, data); + } else { + ipldBlock = new IPLDBlock(); + + // Fetch the parent IPLDBlock. + const parentIPLDBlock = await this.getLatestIPLDBlock(contractAddress, null, block.blockNumber); + + // Setting the meta-data for an IPLDBlock (done only once per block). + data.meta = { + id: contractAddress, + kind, + parent: { + '/': parentIPLDBlock ? parentIPLDBlock.cid : null + }, + ethBlock: { + cid: { + '/': block.cid + }, + num: block.blockNumber + } + }; + } + + // Encoding the data using dag-cbor codec. + const bytes = codec.encode(data); + + // Calculating sha256 (multi)hash of the encoded data. + const hash = await sha256.digest(bytes); + + // Calculating the CID: v1, code: dag-cbor, hash. + const cid = CID.create(1, codec.code, hash); + + // Update ipldBlock with new data. + ipldBlock = Object.assign(ipldBlock, { + block, + contractAddress, + cid: cid.toString(), + kind: data.meta.kind, + data: Buffer.from(bytes) + }); + + return ipldBlock; + } + + async saveOrUpdateIPLDBlock (ipldBlock: IPLDBlock): Promise { + return this._db.saveOrUpdateIPLDBlock(ipldBlock); + } + + async removeStagedIPLDBlocks (blockNumber: number): Promise { + const dbTx = await this._db.createTransactionRunner(); + + try { + await this._db.removeEntities(dbTx, IPLDBlock, { relations: ['block'], where: { block: { blockNumber }, kind: 'diff_staged' } }); + await dbTx.commitTransaction(); + } catch (error) { + await dbTx.rollbackTransaction(); + throw error; + } finally { + await dbTx.release(); + } + } + + async pushToIPFS (data: any): Promise { + await this._ipfsClient.push(data); + } + + isIPFSConfigured (): boolean { + const ipfsAddr = this._serverConfig.ipfsApiAddr; + + // Return false if ipfsAddr is undefined | null | empty string. 
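A small self-contained sketch of the CID computation used in `prepareIPLDBlock` above (CIDv1, dag-cbor codec, sha2-256), assuming the `multiformats` and `@ipld/dag-cbor` packages added as dependencies:

```ts
// Illustrative sketch: deriving the CID for IPLD state data.
import { CID } from 'multiformats/cid';
import { sha256 } from 'multiformats/hashes/sha2';
import * as codec from '@ipld/dag-cbor';

async function cidForStateData (data: any): Promise<{ cid: string, bytes: Uint8Array }> {
  // Encode the state object with the dag-cbor codec.
  const bytes = codec.encode(data);

  // Hash the encoded bytes with sha2-256.
  const hash = await sha256.digest(bytes);

  // Build a CIDv1 carrying the dag-cbor codec code and the multihash.
  const cid = CID.create(1, codec.code, hash);

  return { cid: cid.toString(), bytes };
}
```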
+ return (ipfsAddr !== undefined && ipfsAddr !== null && ipfsAddr !== ''); + } + + async getSubgraphEntity (entity: new () => Entity, id: string, blockHash: string): Promise { + return this._graphWatcher.getEntity(entity, id, blockHash); } async triggerIndexingOnEvent (event: Event): Promise { @@ -181,11 +548,6 @@ export class Indexer { await handleEvent(this, resultEvent); } - async processBlock (blockHash: string): Promise { - // Call subgraph handler for block. - await this._graphWatcher.handleBlock(blockHash); - } - async processEvent (event: Event): Promise { // Trigger indexing of data based on the event. await this.triggerIndexingOnEvent(event); @@ -218,14 +580,66 @@ export class Indexer { }; } - async watchContract (address: string, startingBlock: number): Promise { - // Always use the checksum address (https://docs.ethers.io/v5/api/utils/address/#utils-getAddress). - await this._db.saveContract(ethers.utils.getAddress(address), 'Example', startingBlock); + async watchContract (address: string, kind: string, checkpoint: boolean, startingBlock?: number): Promise { + // Use the checksum address (https://docs.ethers.io/v5/api/utils/address/#utils-getAddress) if input to address is a contract address. + // If a contract identifier is passed as address instead, no need to convert to checksum address. + // Customize: use the kind input to filter out non-contract-address input to address. + const formattedAddress = (kind === '__protocol__') ? address : ethers.utils.getAddress(address); + + if (!startingBlock) { + const syncStatus = await this.getSyncStatus(); + assert(syncStatus); + + startingBlock = syncStatus.latestIndexedBlockNumber; + } + + await this._db.saveContract(formattedAddress, kind, checkpoint, startingBlock); return true; } - async getEventsByFilter (blockHash: string, contract: string, name: string | null): Promise> { + async getHookStatus (): Promise { + const dbTx = await this._db.createTransactionRunner(); + let res; + + try { + res = await this._db.getHookStatus(dbTx); + await dbTx.commitTransaction(); + } catch (error) { + await dbTx.rollbackTransaction(); + throw error; + } finally { + await dbTx.release(); + } + + return res; + } + + async updateHookStatusProcessedBlock (blockNumber: number, force?: boolean): Promise { + const dbTx = await this._db.createTransactionRunner(); + let res; + + try { + res = await this._db.updateHookStatusProcessedBlock(dbTx, blockNumber, force); + await dbTx.commitTransaction(); + } catch (error) { + await dbTx.rollbackTransaction(); + throw error; + } finally { + await dbTx.release(); + } + + return res; + } + + async getLatestCanonicalBlock (): Promise { + const syncStatus = await this.getSyncStatus(); + assert(syncStatus); + + return this.getBlockProgress(syncStatus.latestCanonicalBlockHash); + } + + async getEventsByFilter (blockHash: string, contract?: string, name?: string): Promise> { return this._baseIndexer.getEventsByFilter(blockHash, contract, name); } @@ -245,16 +659,16 @@ export class Indexer { return this._baseIndexer.getSyncStatus(); } - async updateSyncStatusIndexedBlock (blockHash: string, blockNumber: number): Promise { - return this._baseIndexer.updateSyncStatusIndexedBlock(blockHash, blockNumber); + async updateSyncStatusIndexedBlock (blockHash: string, blockNumber: number, force = false): Promise { + return this._baseIndexer.updateSyncStatusIndexedBlock(blockHash, blockNumber, force); } async updateSyncStatusChainHead (blockHash: string, blockNumber: number): Promise { return 
this._baseIndexer.updateSyncStatusChainHead(blockHash, blockNumber); } - async updateSyncStatusCanonicalBlock (blockHash: string, blockNumber: number): Promise { - return this._baseIndexer.updateSyncStatusCanonicalBlock(blockHash, blockNumber); + async updateSyncStatusCanonicalBlock (blockHash: string, blockNumber: number, force = false): Promise { + return this._baseIndexer.updateSyncStatusCanonicalBlock(blockHash, blockNumber, force); } async getBlock (blockHash: string): Promise { @@ -297,7 +711,7 @@ export class Indexer { return this._baseIndexer.getAncestorAtDepth(blockHash, depth); } - async _fetchAndSaveEvents ({ blockHash }: DeepPartial): Promise { + async _fetchAndSaveEvents ({ cid: blockCid, blockHash }: DeepPartial): Promise { assert(blockHash); let { block, logs } = await this._ethClient.getLogs({ blockHash }); @@ -381,6 +795,7 @@ export class Indexer { try { block = { + cid: blockCid, blockHash, blockNumber: block.number, blockTimestamp: block.timestamp, diff --git a/packages/graph-test-watcher/src/ipfs.ts b/packages/graph-test-watcher/src/ipfs.ts new file mode 100644 index 00000000..3c92443d --- /dev/null +++ b/packages/graph-test-watcher/src/ipfs.ts @@ -0,0 +1,17 @@ +// +// Copyright 2021 Vulcanize, Inc. +// + +import { create, IPFSHTTPClient } from 'ipfs-http-client'; + +export class IPFSClient { + _client: IPFSHTTPClient + + constructor (url: string) { + this._client = create({ url }); + } + + async push (data: any): Promise { + await this._client.dag.put(data, { format: 'dag-cbor', hashAlg: 'sha2-256' }); + } +} diff --git a/packages/graph-test-watcher/src/job-runner.ts b/packages/graph-test-watcher/src/job-runner.ts index 366501b3..9f7cbea3 100644 --- a/packages/graph-test-watcher/src/job-runner.ts +++ b/packages/graph-test-watcher/src/job-runner.ts @@ -2,25 +2,26 @@ // Copyright 2021 Vulcanize, Inc. // +import path from 'path'; import assert from 'assert'; import 'reflect-metadata'; import yargs from 'yargs'; import { hideBin } from 'yargs/helpers'; import debug from 'debug'; -import path from 'path'; -import { getCache } from '@vulcanize/cache'; -import { EthClient } from '@vulcanize/ipld-eth-client'; import { getConfig, + Config, JobQueue, JobRunner as BaseJobRunner, QUEUE_BLOCK_PROCESSING, QUEUE_EVENT_PROCESSING, + QUEUE_BLOCK_CHECKPOINT, + QUEUE_HOOKS, + QUEUE_IPFS, JobQueueConfig, DEFAULT_CONFIG_PATH, - getCustomProvider, - JOB_KIND_INDEX + initClients } from '@vulcanize/util'; import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; @@ -45,18 +46,17 @@ export class JobRunner { async start (): Promise { await this.subscribeBlockProcessingQueue(); await this.subscribeEventProcessingQueue(); + await this.subscribeBlockCheckpointQueue(); + await this.subscribeHooksQueue(); + await this.subscribeIPFSQueue(); } async subscribeBlockProcessingQueue (): Promise { await this._jobQueue.subscribe(QUEUE_BLOCK_PROCESSING, async (job) => { + // TODO Call pre-block hook here (Directly or indirectly (Like done through indexer.processEvent for events)). 
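A short usage sketch for the `IPFSClient` added in `src/ipfs.ts` above; the function name is hypothetical and the API address is an assumption based on the watcher's IPFS config:

```ts
// Illustrative sketch: pushing checkpoint data to the local IPFS node.
import { IPFSClient } from './ipfs';

async function publishCheckpoint (checkpointData: any): Promise<void> {
  // Assumed local IPFS API address; use the ipfsApiAddr from the config in practice.
  const ipfsClient = new IPFSClient('/ip4/127.0.0.1/tcp/5001');

  // Stores the data as a dag-cbor node hashed with sha2-256, so the same
  // bytes should be addressable by the CID computed in prepareIPLDBlock.
  await ipfsClient.push(checkpointData);
}
```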
+ await this._baseJobRunner.processBlock(job); - const { data: { kind, blockHash } } = job; - - if (kind === JOB_KIND_INDEX) { - await this._indexer.processBlock(blockHash); - } - await this._jobQueue.markComplete(job); }); } @@ -73,6 +73,43 @@ export class JobRunner { await this._jobQueue.markComplete(job); }); } + + async subscribeHooksQueue (): Promise { + await this._jobQueue.subscribe(QUEUE_HOOKS, async (job) => { + const { data: { blockNumber } } = job; + + const hookStatus = await this._indexer.getHookStatus(); + + if (hookStatus && hookStatus.latestProcessedBlockNumber < (blockNumber - 1)) { + const message = `Hooks for blockNumber ${blockNumber - 1} not processed yet, aborting`; + log(message); + + throw new Error(message); + } + + await this._indexer.processCanonicalBlock(job); + + await this._jobQueue.markComplete(job); + }); + } + + async subscribeBlockCheckpointQueue (): Promise { + await this._jobQueue.subscribe(QUEUE_BLOCK_CHECKPOINT, async (job) => { + await this._indexer.processCheckpoint(job); + + await this._jobQueue.markComplete(job); + }); + } + + async subscribeIPFSQueue (): Promise { + await this._jobQueue.subscribe(QUEUE_IPFS, async (job) => { + const { data: { data } } = job; + + await this._indexer.pushToIPFS(data); + + await this._jobQueue.markComplete(job); + }); + } } export const main = async (): Promise => { @@ -86,44 +123,23 @@ export const main = async (): Promise => { }) .argv; - const config = await getConfig(argv.f); + const config: Config = await getConfig(argv.f); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); - assert(config.server, 'Missing server config'); - - const { upstream, database: dbConfig, jobQueue: jobQueueConfig, server: { subgraphPath } } = config; - - assert(dbConfig, 'Missing database config'); - - const db = new Database(dbConfig); + const db = new Database(config.database); await db.init(); - assert(upstream, 'Missing upstream config'); - const { ethServer: { gqlApiEndpoint, gqlPostgraphileEndpoint, rpcProviderEndpoint }, cache: cacheConfig } = upstream; - assert(gqlApiEndpoint, 'Missing upstream ethServer.gqlApiEndpoint'); - assert(gqlPostgraphileEndpoint, 'Missing upstream ethServer.gqlPostgraphileEndpoint'); - - const cache = await getCache(cacheConfig); - - const ethClient = new EthClient({ - gqlEndpoint: gqlApiEndpoint, - gqlSubscriptionEndpoint: gqlPostgraphileEndpoint, - cache - }); - - const postgraphileClient = new EthClient({ - gqlEndpoint: gqlPostgraphileEndpoint, - cache - }); - - const graphDb = new GraphDatabase(dbConfig, path.resolve(__dirname, 'entity/*')); + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); await graphDb.init(); - const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, subgraphPath); + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); + + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + + graphWatcher.setIndexer(indexer); await graphWatcher.init(); - const ethProvider = getCustomProvider(rpcProviderEndpoint); - const indexer = new Indexer(db, ethClient, postgraphileClient, ethProvider, graphWatcher); - + const jobQueueConfig = config.jobQueue; assert(jobQueueConfig, 'Missing job queue config'); const { dbConnectionString, maxCompletionLagInSecs } = jobQueueConfig; diff --git a/packages/graph-test-watcher/src/resolvers.ts b/packages/graph-test-watcher/src/resolvers.ts index 4346d724..93f38417 100644 --- 
a/packages/graph-test-watcher/src/resolvers.ts +++ b/packages/graph-test-watcher/src/resolvers.ts @@ -11,6 +11,8 @@ import { ValueResult } from '@vulcanize/util'; import { Indexer } from './indexer'; import { EventWatcher } from './events'; +import { ExampleEntity } from './entity/ExampleEntity'; + const log = debug('vulcanize:resolver'); export const createResolvers = async (indexer: Indexer, eventWatcher: EventWatcher): Promise => { @@ -34,9 +36,10 @@ export const createResolvers = async (indexer: Indexer, eventWatcher: EventWatch }, Mutation: { - watchContract: (_: any, { contractAddress, startingBlock = 1 }: { contractAddress: string, startingBlock: number }): Promise => { - log('watchContract', contractAddress, startingBlock); - return indexer.watchContract(contractAddress, startingBlock); + watchContract: (_: any, { address, kind, checkpoint, startingBlock }: { address: string, kind: string, checkpoint: boolean, startingBlock: number }): Promise => { + log('watchContract', address, kind, checkpoint, startingBlock); + + return indexer.watchContract(address, kind, checkpoint, startingBlock); } }, @@ -51,8 +54,14 @@ export const createResolvers = async (indexer: Indexer, eventWatcher: EventWatch return indexer._test(blockHash, contractAddress); }, - events: async (_: any, { blockHash, contractAddress, name }: { blockHash: string, contractAddress: string, name: string }) => { - log('events', blockHash, contractAddress, name || ''); + exampleEntity: async (_: any, { id, blockHash }: { id: string, blockHash: string }): Promise => { + log('exampleEntity', id, blockHash); + + return indexer.getSubgraphEntity(ExampleEntity, id, blockHash); + }, + + events: async (_: any, { blockHash, contractAddress, name }: { blockHash: string, contractAddress: string, name?: string }) => { + log('events', blockHash, contractAddress, name); const block = await indexer.getBlockProgress(blockHash); if (!block || !block.isComplete) { @@ -75,12 +84,20 @@ export const createResolvers = async (indexer: Indexer, eventWatcher: EventWatch return events.map(event => indexer.getResultEvent(event)); }, - exampleEntity: async (_: any, { blockHash, id }: { blockHash: string, id: string }) => { - log('exampleEntity', blockHash, id); + getStateByCID: async (_: any, { cid }: { cid: string }) => { + log('getStateByCID', cid); - const exampleEntity = await indexer.getExampleEntity(blockHash, id); + const ipldBlock = await indexer.getIPLDBlockByCid(cid); - return JSON.stringify(exampleEntity, undefined, 2); + return ipldBlock && ipldBlock.block.isComplete ? indexer.getResultIPLDBlock(ipldBlock) : undefined; + }, + + getState: async (_: any, { blockHash, contractAddress, kind = 'diff' }: { blockHash: string, contractAddress: string, kind: string }) => { + log('getState', blockHash, contractAddress, kind); + + const ipldBlock = await indexer.getPrevIPLDBlock(blockHash, contractAddress, kind); + + return ipldBlock && ipldBlock.block.isComplete ? indexer.getResultIPLDBlock(ipldBlock) : undefined; } } }; diff --git a/packages/graph-test-watcher/src/schema.gql b/packages/graph-test-watcher/src/schema.gql index b1839558..7e9fa64c 100644 --- a/packages/graph-test-watcher/src/schema.gql +++ b/packages/graph-test-watcher/src/schema.gql @@ -1,22 +1,31 @@ -type Query { - events(blockHash: String!, contractAddress: String!, name: String): [ResultEvent!] - eventsInRange(fromBlockNumber: Int!, toBlockNumber: Int!): [ResultEvent!] - getMethod(blockHash: String!, contractAddress: String!): ResultString! 
- _test(blockHash: String!, contractAddress: String!): ResultBigInt! - exampleEntity(blockHash: String!, id: String!): String! +scalar BigInt + +type Proof { + data: String! } -type ResultEvent { - block: Block! - tx: Transaction! - contract: String! - eventIndex: Int! - eventSignature: String! - event: Event! +type ResultBoolean { + value: Boolean! + proof: Proof +} + +type ResultString { + value: String! + proof: Proof +} + +type ResultInt { + value: Int! + proof: Proof +} + +type ResultBigInt { + value: BigInt! proof: Proof } type Block { + cid: String! hash: String! number: Int! timestamp: Int! @@ -30,6 +39,15 @@ type Transaction { to: String! } +type ResultEvent { + block: Block! + tx: Transaction! + contract: String! + eventIndex: Int! + event: Event! + proof: Proof +} + union Event = TestEvent type TestEvent { @@ -37,24 +55,33 @@ type TestEvent { param2: Int! } -type Proof { +type ResultIPLDBlock { + block: Block! + contractAddress: String! + cid: String! + kind: String! data: String! } -type ResultString { - value: String! - proof: Proof +type Query { + events(blockHash: String!, contractAddress: String!, name: String): [ResultEvent!] + eventsInRange(fromBlockNumber: Int!, toBlockNumber: Int!): [ResultEvent!] + getMethod(blockHash: String!, contractAddress: String!): ResultString! + _test(blockHash: String!, contractAddress: String!): ResultBigInt! + exampleEntity(id: String!, blockHash: String!): ExampleEntity! + getStateByCID(cid: String!): ResultIPLDBlock + getState(blockHash: String!, contractAddress: String!, kind: String): ResultIPLDBlock } -type ResultBigInt { - value: BigInt! - proof: Proof +type ExampleEntity { + id: ID! + count: BigInt! + param1: String! + param2: Int! } -scalar BigInt - type Mutation { - watchContract(contractAddress: String!, startingBlock: Int): Boolean! + watchContract(address: String!, kind: String!, checkpoint: Boolean!, startingBlock: Int): Boolean! 
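A hedged example of querying the new `getState` field defined in the schema above; the endpoint assumes the default server config and the arguments are placeholders:

```ts
// Illustrative sketch: fetching contract state over the watcher's GQL API.
const query = `
  query {
    getState(blockHash: "0x...", contractAddress: "0x...", kind: "checkpoint") {
      block { cid hash number }
      contractAddress
      cid
      kind
      data
    }
  }
`;

async function fetchState (): Promise<void> {
  // Node 18+ provides a global fetch; any GraphQL client works equally well.
  const res = await fetch('http://localhost:3008/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query })
  });

  console.log(await res.json());
}
```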
} type Subscription { diff --git a/packages/graph-test-watcher/src/server.ts b/packages/graph-test-watcher/src/server.ts index 60aefcf2..aec430ec 100644 --- a/packages/graph-test-watcher/src/server.ts +++ b/packages/graph-test-watcher/src/server.ts @@ -14,9 +14,7 @@ import debug from 'debug'; import 'graphql-import-node'; import { createServer } from 'http'; -import { getCache } from '@vulcanize/cache'; -import { EthClient } from '@vulcanize/ipld-eth-client'; -import { DEFAULT_CONFIG_PATH, getConfig, JobQueue, KIND_ACTIVE, getCustomProvider } from '@vulcanize/util'; +import { DEFAULT_CONFIG_PATH, getConfig, Config, JobQueue, KIND_ACTIVE, initClients } from '@vulcanize/util'; import { GraphWatcher, Database as GraphDatabase } from '@vulcanize/graph-node'; import { createResolvers } from './resolvers'; @@ -37,58 +35,36 @@ export const main = async (): Promise => { }) .argv; - const config = await getConfig(argv.f); + const config: Config = await getConfig(argv.f); + const { ethClient, postgraphileClient, ethProvider } = await initClients(config); - assert(config.server, 'Missing server config'); + const { host, port, kind: watcherKind } = config.server; - const { host, port, kind: watcherKind, subgraphPath } = config.server; - - const { upstream, database: dbConfig, jobQueue: jobQueueConfig } = config; - - assert(dbConfig, 'Missing database config'); - - const db = new Database(dbConfig); + const db = new Database(config.database); await db.init(); - assert(upstream, 'Missing upstream config'); - const { ethServer: { gqlApiEndpoint, gqlPostgraphileEndpoint, rpcProviderEndpoint }, cache: cacheConfig } = upstream; - assert(gqlApiEndpoint, 'Missing upstream ethServer.gqlApiEndpoint'); - assert(gqlPostgraphileEndpoint, 'Missing upstream ethServer.gqlPostgraphileEndpoint'); - - const cache = await getCache(cacheConfig); - - const ethClient = new EthClient({ - gqlEndpoint: gqlApiEndpoint, - gqlSubscriptionEndpoint: gqlPostgraphileEndpoint, - cache - }); - - const postgraphileClient = new EthClient({ - gqlEndpoint: gqlPostgraphileEndpoint, - cache - }); - - const ethProvider = getCustomProvider(rpcProviderEndpoint); - - const graphDb = new GraphDatabase(dbConfig, path.resolve(__dirname, 'entity/*')); + const graphDb = new GraphDatabase(config.database, path.resolve(__dirname, 'entity/*')); await graphDb.init(); - const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, subgraphPath); - await graphWatcher.init(); - - const indexer = new Indexer(db, ethClient, postgraphileClient, ethProvider, graphWatcher); + const graphWatcher = new GraphWatcher(graphDb, postgraphileClient, config.server.subgraphPath); // Note: In-memory pubsub works fine for now, as each watcher is a single process anyway. 
// Later: https://www.apollographql.com/docs/apollo-server/data/subscriptions/#production-pubsub-libraries const pubsub = new PubSub(); + const indexer = new Indexer(config.server, db, ethClient, postgraphileClient, ethProvider, graphWatcher); + graphWatcher.setIndexer(indexer); + await graphWatcher.init(); + + const jobQueueConfig = config.jobQueue; assert(jobQueueConfig, 'Missing job queue config'); + const { dbConnectionString, maxCompletionLagInSecs } = jobQueueConfig; assert(dbConnectionString, 'Missing job queue db connection string'); const jobQueue = new JobQueue({ dbConnectionString, maxCompletionLag: maxCompletionLagInSecs }); - const eventWatcher = new EventWatcher(upstream, ethClient, postgraphileClient, indexer, pubsub, jobQueue); + const eventWatcher = new EventWatcher(config.upstream, ethClient, postgraphileClient, indexer, pubsub, jobQueue); if (watcherKind === KIND_ACTIVE) { await jobQueue.start(); diff --git a/yarn.lock b/yarn.lock index 81eef1ec..b0c6e346 100644 --- a/yarn.lock +++ b/yarn.lock @@ -1116,6 +1116,21 @@ resolved "https://registry.yarnpkg.com/@graphql-typed-document-node/core/-/core-3.1.0.tgz#0eee6373e11418bfe0b5638f654df7a4ca6a3950" integrity sha512-wYn6r8zVZyQJ6rQaALBEln5B1pzxb9shV5Ef97kTvn6yVGrqyXVnDqnU24MXnFubR+rZjBY9NWuxX3FB2sTsjg== +"@ipld/dag-cbor@^6.0.12", "@ipld/dag-cbor@^6.0.5": + version "6.0.13" + resolved "https://registry.yarnpkg.com/@ipld/dag-cbor/-/dag-cbor-6.0.13.tgz#b68265a3fbb808f2ea3fa247a107399dbc71d5a7" + integrity sha512-tqh1VLjiHi6+i0gDEnoBrQcRvgnUwQV413uTyFnXQMOjJs8px95/TBVMBpQBkhQoeCL4oQlPAr4qIrelKEkQHw== + dependencies: + cborg "^1.2.1" + multiformats "^9.0.0" + +"@ipld/dag-pb@^2.1.3": + version "2.1.13" + resolved "https://registry.yarnpkg.com/@ipld/dag-pb/-/dag-pb-2.1.13.tgz#174779914cf2bffca9e5be96e5b7f06821402333" + integrity sha512-d5GGZEQ3bbiWAafTVu8rZL1WOX3iH0c9+MAWDcpJqqevxpsRiAEiwmBJhDstE7gqMp3elARl4wNOGFZSd7lOYg== + dependencies: + multiformats "^9.0.0" + "@josephg/resolvable@^1.0.0": version "1.0.1" resolved "https://registry.yarnpkg.com/@josephg/resolvable/-/resolvable-1.0.1.tgz#69bc4db754d79e1a2f17a650d3466e038d94a5eb" @@ -2521,7 +2536,7 @@ resolved "https://registry.yarnpkg.com/@types/mime/-/mime-1.3.2.tgz#93e25bf9ee75fe0fd80b594bc4feb0e862111b5a" integrity sha512-YATxVxgRqNH6nHEIsvg6k2Boc1JHI9ZbH5iWFFv/MTkchz3b1ieGDa5T0a9RznNdI0KhVbdbWSN+KWWrQZRxTw== -"@types/minimatch@*": +"@types/minimatch@*", "@types/minimatch@^3.0.4": version "3.0.5" resolved "https://registry.yarnpkg.com/@types/minimatch/-/minimatch-3.0.5.tgz#1001cc5e6a3704b83c236027e77f2f58ea010f40" integrity sha512-Klz949h02Gz2uZCMGwDUSDS1YBlTdDDgbWHi+81l29tQALUtvz4rAYi5uoVhE5Lagoq6DeqAUlbrHvW/mXDgdQ== @@ -3158,6 +3173,14 @@ any-promise@^1.0.0: resolved "https://registry.yarnpkg.com/any-promise/-/any-promise-1.3.0.tgz#abc6afeedcea52e809cdc0376aed3ce39635d17f" integrity sha1-q8av7tzqUugJzcA3au0845Y10X8= +any-signal@^2.1.0, any-signal@^2.1.2: + version "2.1.2" + resolved "https://registry.yarnpkg.com/any-signal/-/any-signal-2.1.2.tgz#8d48270de0605f8b218cf9abe8e9c6a0e7418102" + integrity sha512-B+rDnWasMi/eWcajPcCWSlYc7muXOrcYrqgyzcdKisl2H/WTlQ0gip1KyQfr0ZlxJdsuWCj/LWwQm7fhyhRfIQ== + dependencies: + abort-controller "^3.0.0" + native-abort-controller "^1.0.3" + anymatch@~3.1.1, anymatch@~3.1.2: version "3.1.2" resolved "https://registry.yarnpkg.com/anymatch/-/anymatch-3.1.2.tgz#c0557c096af32f106198f4f4e2a383537e378716" @@ -4214,6 +4237,13 @@ blakejs@^1.1.0: resolved "https://registry.yarnpkg.com/blakejs/-/blakejs-1.1.0.tgz#69df92ef953aa88ca51a32df6ab1c54a155fc7a5" 
integrity sha1-ad+S75U6qIylGjLfarHFShVfx6U= +blob-to-it@^1.0.1: + version "1.0.4" + resolved "https://registry.yarnpkg.com/blob-to-it/-/blob-to-it-1.0.4.tgz#f6caf7a4e90b7bb9215fa6a318ed6bd8ad9898cb" + integrity sha512-iCmk0W4NdbrWgRRuxOriU8aM5ijeVLI61Zulsmg/lUHNr7pYjoj+U77opLefNagevtrrbMt3JQ5Qip7ar178kA== + dependencies: + browser-readablestream-to-it "^1.0.3" + bluebird@^3.5.0, bluebird@^3.5.2, bluebird@^3.7.2: version "3.7.2" resolved "https://registry.yarnpkg.com/bluebird/-/bluebird-3.7.2.tgz#9f229c15be272454ffa973ace0dbee79a1b0c36f" @@ -4300,6 +4330,11 @@ brorand@^1.0.1, brorand@^1.1.0: resolved "https://registry.yarnpkg.com/brorand/-/brorand-1.1.0.tgz#12c25efe40a45e3c323eb8675a0a0ce57b22371f" integrity sha1-EsJe/kCkXjwyPrhnWgoM5XsiNx8= +browser-readablestream-to-it@^1.0.1, browser-readablestream-to-it@^1.0.3: + version "1.0.3" + resolved "https://registry.yarnpkg.com/browser-readablestream-to-it/-/browser-readablestream-to-it-1.0.3.tgz#ac3e406c7ee6cdf0a502dd55db33bab97f7fba76" + integrity sha512-+12sHB+Br8HIh6VAMVEG5r3UXCyESIgDW7kzk3BjIXa43DVqVwL7GC5TW3jeh+72dtcH99pPVpw0X8i0jt+/kw== + browser-stdout@1.3.1: version "1.3.1" resolved "https://registry.yarnpkg.com/browser-stdout/-/browser-stdout-1.3.1.tgz#baa559ee14ced73452229bad7326467c61fabd60" @@ -4418,7 +4453,7 @@ buffer@^5.0.5, buffer@^5.2.1, buffer@^5.5.0, buffer@^5.6.0: base64-js "^1.3.1" ieee754 "^1.1.13" -buffer@^6.0.3: +buffer@^6.0.1, buffer@^6.0.3: version "6.0.3" resolved "https://registry.yarnpkg.com/buffer/-/buffer-6.0.3.tgz#2ace578459cc8fbe2a70aaa8f52ee63b6a74c6c6" integrity sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA== @@ -4633,6 +4668,11 @@ catering@^2.0.0: resolved "https://registry.yarnpkg.com/catering/-/catering-2.0.0.tgz#15ce31bcbffafbf62855ea7677b0e5d23581233d" integrity sha512-aD/WmxhGwUGsVPrj8C80vH7C7GphJilYVSdudoV4u16XdrLF7CVyfBmENsc4tLTVsJJzCRid8GbwJ7mcPLee6Q== +cborg@^1.2.1: + version "1.5.3" + resolved "https://registry.yarnpkg.com/cborg/-/cborg-1.5.3.tgz#94cd037a50cde007397ca872259457d2f8dd0846" + integrity sha512-iUweYinQpR48eXxcmEoZlixY2eo+vDCGc5utNwXV0oYhmowHU/2DwcSiJ4xU1l7niwXOR91pcE3zEgZl4VoFAQ== + chai-spies@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/chai-spies/-/chai-spies-1.0.0.tgz#d16b39336fb316d03abf8c375feb23c0c8bb163d" @@ -5737,6 +5777,15 @@ dir-glob@^3.0.1: dependencies: path-type "^4.0.0" +dns-over-http-resolver@^1.2.3: + version "1.2.3" + resolved "https://registry.yarnpkg.com/dns-over-http-resolver/-/dns-over-http-resolver-1.2.3.tgz#194d5e140a42153f55bb79ac5a64dd2768c36af9" + integrity sha512-miDiVSI6KSNbi4SVifzO/reD8rMnxgrlnkrlkugOLQpWQTe2qMdHsZp5DmfKjxNE+/T3VAAYLQUZMv9SMr6+AA== + dependencies: + debug "^4.3.1" + native-fetch "^3.0.0" + receptacle "^1.3.2" + doctrine@^2.1.0: version "2.1.0" resolved "https://registry.yarnpkg.com/doctrine/-/doctrine-2.1.0.tgz#5cd01fc101621b42c4cd7f5d1a66243716d3f39d" @@ -5817,6 +5866,13 @@ ee-first@1.1.1: resolved "https://registry.yarnpkg.com/ee-first/-/ee-first-1.1.1.tgz#590c61156b0ae2f4f0255732a158b266bc56b21d" integrity sha1-WQxhFWsK4vTwJVcyoViyZrxWsh0= +electron-fetch@^1.7.2: + version "1.7.4" + resolved "https://registry.yarnpkg.com/electron-fetch/-/electron-fetch-1.7.4.tgz#af975ab92a14798bfaa025f88dcd2e54a7b0b769" + integrity sha512-+fBLXEy4CJWQ5bz8dyaeSG1hD6JJ15kBZyj3eh24pIVrd3hLM47H/umffrdQfS6GZ0falF0g9JT9f3Rs6AVUhw== + dependencies: + encoding "^0.1.13" + electron-to-chromium@^1.3.47: version "1.3.741" resolved 
"https://registry.yarnpkg.com/electron-to-chromium/-/electron-to-chromium-1.3.741.tgz#dc1024b19b31e27fb2c8c0a1f120cb05fc6ca695" @@ -5886,7 +5942,7 @@ encoding-down@^7.0.0: level-codec "^10.0.0" level-errors "^3.0.0" -encoding@^0.1.11, encoding@^0.1.12: +encoding@^0.1.11, encoding@^0.1.12, encoding@^0.1.13: version "0.1.13" resolved "https://registry.yarnpkg.com/encoding/-/encoding-0.1.13.tgz#56574afdd791f54a8e9b2785c0582a2d26210fa9" integrity sha512-ETBauow1T35Y/WZMkio9jiM0Z5xjHHmJ4XmjZOq1l/dXz3lr2sRn87nJy20RupqSh1F2m3HHPSp8ShIPQJrJ3A== @@ -5922,6 +5978,11 @@ err-code@^2.0.2: resolved "https://registry.yarnpkg.com/err-code/-/err-code-2.0.3.tgz#23c2f3b756ffdfc608d30e27c9a941024807e7f9" integrity sha512-2bmlRpNKBxT/CRmPOlyISQpNj+qSeYvcym/uT0Jx2bMOlKLtSy1ZmLuVxSEKKyor/N5yhvp/ZiG1oE3DEYMSFA== +err-code@^3.0.1: + version "3.0.1" + resolved "https://registry.yarnpkg.com/err-code/-/err-code-3.0.1.tgz#a444c7b992705f2b120ee320b09972eef331c920" + integrity sha512-GiaH0KJUewYok+eeY05IIgjtAe4Yltygk9Wqp1V5yVWLdhf0hYZchRjNIT9bb0mSwRcIusT3cx7PJUf3zEIfUA== + errno@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/errno/-/errno-1.0.0.tgz#0ea47d701864accf996412f09e29b4dc2cf3856d" @@ -6919,6 +6980,11 @@ fast-deep-equal@^3.1.1, fast-deep-equal@^3.1.3: resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz#3a7d56b559d6cbc3eb512325244e619a65c6c525" integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q== +fast-fifo@^1.0.0: + version "1.0.0" + resolved "https://registry.yarnpkg.com/fast-fifo/-/fast-fifo-1.0.0.tgz#9bc72e6860347bb045a876d1c5c0af11e9b984e7" + integrity sha512-4VEXmjxLj7sbs8J//cn2qhRap50dGzF5n8fjay8mau+Jn4hxSeR3xPFwxMaQq/pDaq7+KQk0PAbC2+nWDkJrmQ== + fast-glob@^3.1.1: version "3.2.5" resolved "https://registry.yarnpkg.com/fast-glob/-/fast-glob-3.2.5.tgz#7939af2a656de79a4f1901903ee8adcaa7cb9661" @@ -7376,6 +7442,11 @@ get-intrinsic@^1.0.2, get-intrinsic@^1.1.1: has "^1.0.3" has-symbols "^1.0.1" +get-iterator@^1.0.2: + version "1.0.2" + resolved "https://registry.yarnpkg.com/get-iterator/-/get-iterator-1.0.2.tgz#cd747c02b4c084461fac14f48f6b45a80ed25c82" + integrity sha512-v+dm9bNVfOYsY1OrhaCrmyOcYoSeVvbt+hHZ0Au+T+p1y+0Uyj9aMaGIeUTT6xdpRbWzDeYKvfOslPhggQMcsg== + get-pkg-repo@^1.0.0: version "1.4.0" resolved "https://registry.yarnpkg.com/get-pkg-repo/-/get-pkg-repo-1.4.0.tgz#c73b489c06d80cc5536c2c853f9e05232056972d" @@ -8302,6 +8373,20 @@ inquirer@^7.3.3: strip-ansi "^6.0.0" through "^2.3.6" +interface-datastore@^6.0.2: + version "6.0.3" + resolved "https://registry.yarnpkg.com/interface-datastore/-/interface-datastore-6.0.3.tgz#f42163e4bfaea9e2fcde82e45d4bf2849e1fbbf5" + integrity sha512-61eOyzh7zH1ks/56hPudW6pbqsOdoHSYMVjuqlIlZGjyg0svR6DHlCcaeSJfWW8t6dsPl1n7qKBdk8ZqPzXuLA== + dependencies: + interface-store "^2.0.1" + nanoid "^3.0.2" + uint8arrays "^3.0.0" + +interface-store@^2.0.1: + version "2.0.1" + resolved "https://registry.yarnpkg.com/interface-store/-/interface-store-2.0.1.tgz#b944573b9d27190bca9be576c97bfde907931dee" + integrity sha512-TfjYMdk4RlaGPA0VGk8fVPM+xhFbjiA2mTv1AqhiFh3N+ZEwoJnmDu/EBdKXzl80nyd0pvKui3RTC3zFgHMjTA== + interpret@^1.0.0: version "1.4.0" resolved "https://registry.yarnpkg.com/interpret/-/interpret-1.4.0.tgz#665ab8bc4da27a774a40584e812e3e0fa45b1a1e" @@ -8326,6 +8411,11 @@ io-ts@1.10.4: dependencies: fp-ts "^1.0.0" +ip-regex@^4.0.0: + version "4.3.0" + resolved "https://registry.yarnpkg.com/ip-regex/-/ip-regex-4.3.0.tgz#687275ab0f57fa76978ff8f4dddc8a23d5990db5" + integrity 
sha512-B9ZWJxHHOHUhUjCPrMpLD4xEq35bUTClHM1S6CBU5ixQnkZmwipwgc96vAd7AAGM9TGHvJR+Uss+/Ak6UphK+Q== + ip@^1.1.5: version "1.1.5" resolved "https://registry.yarnpkg.com/ip/-/ip-1.1.5.tgz#bdded70114290828c0a039e72ef25f5aaec4354a" @@ -8336,6 +8426,95 @@ ipaddr.js@1.9.1: resolved "https://registry.yarnpkg.com/ipaddr.js/-/ipaddr.js-1.9.1.tgz#bff38543eeb8984825079ff3a2a8e6cbd46781b3" integrity sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g== +ipfs-core-types@^0.8.1: + version "0.8.2" + resolved "https://registry.yarnpkg.com/ipfs-core-types/-/ipfs-core-types-0.8.2.tgz#6923dd1a5fda9ea26d7b03e83f96a16a9e8cfc24" + integrity sha512-qgKn+hHAdRRyP4eX7ygmNSPucI75KUMxtoxgYaJGPcew4rdvCA1VCD/Rc0W+dHlRbTRM4CYIE10VS3qjpXSgDQ== + dependencies: + interface-datastore "^6.0.2" + multiaddr "^10.0.0" + multiformats "^9.4.1" + +ipfs-core-utils@^0.11.1: + version "0.11.1" + resolved "https://registry.yarnpkg.com/ipfs-core-utils/-/ipfs-core-utils-0.11.1.tgz#4ac2143f2d89917f0823c9b8104e5ed489383ecb" + integrity sha512-SYBTeESuLgjYeDh8meCgttHV/LlES8FbljIDCySp+DpgPxYJA4EyC4GhywBaneQc/X3GWvHEzvW5b7ADluFcAw== + dependencies: + any-signal "^2.1.2" + blob-to-it "^1.0.1" + browser-readablestream-to-it "^1.0.1" + debug "^4.1.1" + err-code "^3.0.1" + ipfs-core-types "^0.8.1" + ipfs-unixfs "^6.0.3" + ipfs-utils "^9.0.2" + it-all "^1.0.4" + it-map "^1.0.4" + it-peekable "^1.0.2" + it-to-stream "^1.0.0" + merge-options "^3.0.4" + multiaddr "^10.0.0" + multiaddr-to-uri "^8.0.0" + multiformats "^9.4.1" + nanoid "^3.1.23" + parse-duration "^1.0.0" + timeout-abort-controller "^1.1.1" + uint8arrays "^3.0.0" + +ipfs-http-client@^53.0.1: + version "53.0.1" + resolved "https://registry.yarnpkg.com/ipfs-http-client/-/ipfs-http-client-53.0.1.tgz#1ae5afce607c87cec285429c7de665e341a3ec7d" + integrity sha512-0hmm5esSxoArEtVE9jeLwLw3pJm6rJA1kWKW+3Nqs2O8TQVSot8u1nzopF/yJ2IJGd5PHJc2JxqtEdVzV+p7nQ== + dependencies: + "@ipld/dag-cbor" "^6.0.5" + "@ipld/dag-pb" "^2.1.3" + abort-controller "^3.0.0" + any-signal "^2.1.2" + debug "^4.1.1" + err-code "^3.0.1" + ipfs-core-types "^0.8.1" + ipfs-core-utils "^0.11.1" + ipfs-utils "^9.0.2" + it-first "^1.0.6" + it-last "^1.0.4" + merge-options "^3.0.4" + multiaddr "^10.0.0" + multiformats "^9.4.1" + native-abort-controller "^1.0.3" + parse-duration "^1.0.0" + stream-to-it "^0.2.2" + uint8arrays "^3.0.0" + +ipfs-unixfs@^6.0.3: + version "6.0.6" + resolved "https://registry.yarnpkg.com/ipfs-unixfs/-/ipfs-unixfs-6.0.6.tgz#c44881c1bcd6a474c665e67108cbf31e54c63eec" + integrity sha512-gTkjYKXuHnqIf6EFfS+ESaYEl3I3aaQQ0UX8MhpNzreMLEuMnuqpoI/uLLllTZa31WRplKixabbpRTSmTYRNwA== + dependencies: + err-code "^3.0.1" + protobufjs "^6.10.2" + +ipfs-utils@^9.0.2: + version "9.0.2" + resolved "https://registry.yarnpkg.com/ipfs-utils/-/ipfs-utils-9.0.2.tgz#7e816a863753c5af1187464839a6b46aa8764e5b" + integrity sha512-o0DjVfd1kcr09fAYMkSnZ56ZkfoAzZhFWkizG3/tL7svukZpqyGyRxNlF58F+hsrn/oL8ouAP9x+4Hdf8XM+hg== + dependencies: + abort-controller "^3.0.0" + any-signal "^2.1.0" + buffer "^6.0.1" + electron-fetch "^1.7.2" + err-code "^3.0.1" + is-electron "^2.2.0" + iso-url "^1.1.5" + it-glob "^1.0.1" + it-to-stream "^1.0.0" + merge-options "^3.0.4" + nanoid "^3.1.20" + native-abort-controller "^1.0.3" + native-fetch "^3.0.0" + node-fetch "https://registry.npmjs.org/@achingbrain/node-fetch/-/node-fetch-2.6.7.tgz" + react-native-fetch-api "^2.0.0" + stream-to-it "^0.2.2" + is-accessor-descriptor@^0.1.6: version "0.1.6" resolved 
"https://registry.yarnpkg.com/is-accessor-descriptor/-/is-accessor-descriptor-0.1.6.tgz#a9e12cb3ae8d876727eeef3843f8a0897b5c98d6" @@ -8457,6 +8636,11 @@ is-docker@^2.0.0: resolved "https://registry.yarnpkg.com/is-docker/-/is-docker-2.2.1.tgz#33eeabe23cfe86f14bde4408a02c0cfb853acdaa" integrity sha512-F+i2BKsFrH66iaUFc0woD8sLy8getkwTwtOBjvs56Cx4CgJDeKQeqfz8wAYiSb8JOprWhHH5p77PbmYCvvUuXQ== +is-electron@^2.2.0: + version "2.2.0" + resolved "https://registry.yarnpkg.com/is-electron/-/is-electron-2.2.0.tgz#8943084f09e8b731b3a7a0298a7b5d56f6b7eef0" + integrity sha512-SpMppC2XR3YdxSzczXReBjqs2zGscWQpBIKqwXYBFic0ERaxNVgwLCHwOLZeESfdJQjX0RDvrJ1lBXX2ij+G1Q== + is-extendable@^0.1.0, is-extendable@^0.1.1: version "0.1.1" resolved "https://registry.yarnpkg.com/is-extendable/-/is-extendable-0.1.1.tgz#62b110e289a471418e3ec36a617d472e301dfc89" @@ -8526,6 +8710,13 @@ is-installed-globally@^0.3.1: global-dirs "^2.0.1" is-path-inside "^3.0.1" +is-ip@^3.1.0: + version "3.1.0" + resolved "https://registry.yarnpkg.com/is-ip/-/is-ip-3.1.0.tgz#2ae5ddfafaf05cb8008a62093cf29734f657c5d8" + integrity sha512-35vd5necO7IitFPjd/YBeqwWnyDWbuLH9ZXQdMfDA8TEo7pv5X8yfrvVO3xbJbLUlERCMvf6X0hTUamQxCYJ9Q== + dependencies: + ip-regex "^4.0.0" + is-lambda@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/is-lambda/-/is-lambda-1.0.1.tgz#3d9877899e6a53efc0160504cde15f82e6f061d5" @@ -8725,6 +8916,11 @@ isexe@^2.0.0: resolved "https://registry.yarnpkg.com/isexe/-/isexe-2.0.0.tgz#e8fbf374dc556ff8947a10dcb0572d633f2cfa10" integrity sha1-6PvzdNxVb/iUehDcsFctYz8s+hA= +iso-url@^1.1.5: + version "1.2.1" + resolved "https://registry.yarnpkg.com/iso-url/-/iso-url-1.2.1.tgz#db96a49d8d9a64a1c889fc07cc525d093afb1811" + integrity sha512-9JPDgCN4B7QPkLtYAAOrEuAWvP9rWvR5offAr0/SeF046wIkglqH3VXgYYP6NcsKslH80UIVgmPqNe3j7tG2ng== + isobject@^2.0.0: version "2.1.0" resolved "https://registry.yarnpkg.com/isobject/-/isobject-2.1.0.tgz#f065561096a3f1da2ef46272f815c840d87e0c89" @@ -8750,6 +8946,51 @@ isurl@^1.0.0-alpha5: has-to-string-tag-x "^1.2.0" is-object "^1.0.1" +it-all@^1.0.4: + version "1.0.6" + resolved "https://registry.yarnpkg.com/it-all/-/it-all-1.0.6.tgz#852557355367606295c4c3b7eff0136f07749335" + integrity sha512-3cmCc6Heqe3uWi3CVM/k51fa/XbMFpQVzFoDsV0IZNHSQDyAXl3c4MjHkFX5kF3922OGj7Myv1nSEUgRtcuM1A== + +it-first@^1.0.6: + version "1.0.7" + resolved "https://registry.yarnpkg.com/it-first/-/it-first-1.0.7.tgz#a4bef40da8be21667f7d23e44dae652f5ccd7ab1" + integrity sha512-nvJKZoBpZD/6Rtde6FXqwDqDZGF1sCADmr2Zoc0hZsIvnE449gRFnGctxDf09Bzc/FWnHXAdaHVIetY6lrE0/g== + +it-glob@^1.0.1: + version "1.0.2" + resolved "https://registry.yarnpkg.com/it-glob/-/it-glob-1.0.2.tgz#bab9b04d6aaac42884502f3a0bfee84c7a29e15e" + integrity sha512-Ch2Dzhw4URfB9L/0ZHyY+uqOnKvBNeS/SMcRiPmJfpHiM0TsUZn+GkpcZxAoF3dJVdPm/PuIk3A4wlV7SUo23Q== + dependencies: + "@types/minimatch" "^3.0.4" + minimatch "^3.0.4" + +it-last@^1.0.4: + version "1.0.6" + resolved "https://registry.yarnpkg.com/it-last/-/it-last-1.0.6.tgz#4106232e5905ec11e16de15a0e9f7037eaecfc45" + integrity sha512-aFGeibeiX/lM4bX3JY0OkVCFkAw8+n9lkukkLNivbJRvNz8lI3YXv5xcqhFUV2lDJiraEK3OXRDbGuevnnR67Q== + +it-map@^1.0.4: + version "1.0.6" + resolved "https://registry.yarnpkg.com/it-map/-/it-map-1.0.6.tgz#6aa547e363eedcf8d4f69d8484b450bc13c9882c" + integrity sha512-XT4/RM6UHIFG9IobGlQPFQUrlEKkU4eBUFG3qhWhfAdh1JfF2x11ShCrKCdmZ0OiZppPfoLuzcfA4cey6q3UAQ== + +it-peekable@^1.0.2: + version "1.0.3" + resolved "https://registry.yarnpkg.com/it-peekable/-/it-peekable-1.0.3.tgz#8ebe933767d9c5aa0ae4ef8e9cb3a47389bced8c" + 
integrity sha512-5+8zemFS+wSfIkSZyf0Zh5kNN+iGyccN02914BY4w/Dj+uoFEoPSvj5vaWn8pNZJNSxzjW0zHRxC3LUb2KWJTQ== + +it-to-stream@^1.0.0: + version "1.0.0" + resolved "https://registry.yarnpkg.com/it-to-stream/-/it-to-stream-1.0.0.tgz#6c47f91d5b5df28bda9334c52782ef8e97fe3a4a" + integrity sha512-pLULMZMAB/+vbdvbZtebC0nWBTbG581lk6w8P7DfIIIKUfa8FbY7Oi0FxZcFPbxvISs7A9E+cMpLDBc1XhpAOA== + dependencies: + buffer "^6.0.3" + fast-fifo "^1.0.0" + get-iterator "^1.0.2" + p-defer "^3.0.0" + p-fifo "^1.0.0" + readable-stream "^3.6.0" + iterall@^1.1.3, iterall@^1.2.1, iterall@^1.2.2, iterall@^1.3.0: version "1.3.0" resolved "https://registry.yarnpkg.com/iterall/-/iterall-1.3.0.tgz#afcb08492e2915cbd8a0884eb93a8c94d0d72fea" @@ -9802,6 +10043,13 @@ merge-descriptors@1.0.1: resolved "https://registry.yarnpkg.com/merge-descriptors/-/merge-descriptors-1.0.1.tgz#b00aaa556dd8b44568150ec9d1b953f3f90cbb61" integrity sha1-sAqqVW3YtEVoFQ7J0blT8/kMu2E= +merge-options@^3.0.4: + version "3.0.4" + resolved "https://registry.yarnpkg.com/merge-options/-/merge-options-3.0.4.tgz#84709c2aa2a4b24c1981f66c179fe5565cc6dbb7" + integrity sha512-2Sug1+knBjkaMsMgf1ctR1Ujx+Ayku4EdJN4Z+C2+JzoeF7A3OZ9KM2GY0CpQS51NR61LTurMJrRKPhSs3ZRTQ== + dependencies: + is-plain-obj "^2.1.0" + merge-source-map@^1.1.0: version "1.1.0" resolved "https://registry.yarnpkg.com/merge-source-map/-/merge-source-map-1.1.0.tgz#2fdde7e6020939f70906a68f2d7ae685e4c8c646" @@ -10185,6 +10433,25 @@ ms@2.1.3, ms@^2.0.0, ms@^2.1.1: resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2" integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA== +multiaddr-to-uri@^8.0.0: + version "8.0.0" + resolved "https://registry.yarnpkg.com/multiaddr-to-uri/-/multiaddr-to-uri-8.0.0.tgz#65efe4b1f9de5f6b681aa42ff36a7c8db7625e58" + integrity sha512-dq4p/vsOOUdVEd1J1gl+R2GFrXJQH8yjLtz4hodqdVbieg39LvBOdMQRdQnfbg5LSM/q1BYNVf5CBbwZFFqBgA== + dependencies: + multiaddr "^10.0.0" + +multiaddr@^10.0.0: + version "10.0.1" + resolved "https://registry.yarnpkg.com/multiaddr/-/multiaddr-10.0.1.tgz#0d15848871370860a4d266bb44d93b3dac5d90ef" + integrity sha512-G5upNcGzEGuTHkzxezPrrD6CaIHR9uo+7MwqhNVcXTs33IInon4y7nMiGxl2CY5hG7chvYQUQhz5V52/Qe3cbg== + dependencies: + dns-over-http-resolver "^1.2.3" + err-code "^3.0.1" + is-ip "^3.1.0" + multiformats "^9.4.5" + uint8arrays "^3.0.0" + varint "^6.0.0" + multibase@^0.7.0: version "0.7.0" resolved "https://registry.yarnpkg.com/multibase/-/multibase-0.7.0.tgz#1adfc1c50abe05eefeb5091ac0c2728d6b84581b" @@ -10216,6 +10483,11 @@ multicodec@^1.0.0: buffer "^5.6.0" varint "^5.0.0" +multiformats@^9.0.0, multiformats@^9.4.1, multiformats@^9.4.2, multiformats@^9.4.5, multiformats@^9.4.8: + version "9.4.10" + resolved "https://registry.yarnpkg.com/multiformats/-/multiformats-9.4.10.tgz#d654d06b28cc066506e4e59b246d65267fb6b93b" + integrity sha512-BwWGvgqB/5J/cnWaOA0sXzJ+UGl+kyFAw3Sw1L6TN4oad34C9OpW+GCpYTYPDp4pUaXDC1EjvB3yv9Iodo1EhA== + multihashes@^0.4.15, multihashes@~0.4.15: version "0.4.21" resolved "https://registry.yarnpkg.com/multihashes/-/multihashes-0.4.21.tgz#dc02d525579f334a7909ade8a122dabb58ccfcb5" @@ -10274,6 +10546,11 @@ nanoid@3.1.20: resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.20.tgz#badc263c6b1dcf14b71efaa85f6ab4c1d6cfc788" integrity sha512-a1cQNyczgKbLX9jwbS/+d7W8fX/RfgYR7lVWwWOGIPNgK2m0MWvrGF6/m4kk6U3QcFMnZf3RIhL0v2Jgh/0Uxw== +nanoid@^3.0.2, nanoid@^3.1.20: + version "3.1.30" + resolved 
"https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.30.tgz#63f93cc548d2a113dc5dfbc63bfa09e2b9b64362" + integrity sha512-zJpuPDwOv8D2zq2WRoMe1HsfZthVewpel9CAvTfc/2mBD1uUT/agc5f7GHGWXlYkFvi1mVxe4IjvP2HNrop7nQ== + nanoid@^3.1.23: version "3.1.25" resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.25.tgz#09ca32747c0e543f0e1814b7d3793477f9c8e152" @@ -10301,6 +10578,16 @@ napi-macros@~2.0.0: resolved "https://registry.yarnpkg.com/napi-macros/-/napi-macros-2.0.0.tgz#2b6bae421e7b96eb687aa6c77a7858640670001b" integrity sha512-A0xLykHtARfueITVDernsAWdtIMbOJgKgcluwENp3AlsKN/PloyO10HtmoqnFAQAcxPkgZN7wdfPfEd0zNGxbg== +native-abort-controller@^1.0.3: + version "1.0.4" + resolved "https://registry.yarnpkg.com/native-abort-controller/-/native-abort-controller-1.0.4.tgz#39920155cc0c18209ff93af5bc90be856143f251" + integrity sha512-zp8yev7nxczDJMoP6pDxyD20IU0T22eX8VwN2ztDccKvSZhRaV33yP1BGwKSZfXuqWUzsXopVFjBdau9OOAwMQ== + +native-fetch@^3.0.0: + version "3.0.0" + resolved "https://registry.yarnpkg.com/native-fetch/-/native-fetch-3.0.0.tgz#06ccdd70e79e171c365c75117959cf4fe14a09bb" + integrity sha512-G3Z7vx0IFb/FQ4JxvtqGABsOTIqRWvgQz6e+erkB+JJD6LrszQtMozEHI4EkmgZQvnGHrpLVzUWk7t4sJCIkVw== + natural-compare@^1.4.0: version "1.4.0" resolved "https://registry.yarnpkg.com/natural-compare/-/natural-compare-1.4.0.tgz#4abebfeed7541f2c27acfb29bdbbd15c8d5ba4f7" @@ -10359,12 +10646,9 @@ node-fetch@^2: resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.2.tgz#986996818b73785e47b1965cc34eb093a1d464d0" integrity sha512-aLoxToI6RfZ+0NOjmWAgn9+LEd30YCkJKFSyWacNZdEKTit/ZMcKjGkTRo8uWEsnIb/hfKecNPEbln02PdWbcA== -node-fetch@^2.6.5: - version "2.6.6" - resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.6.tgz#1751a7c01834e8e1697758732e9efb6eeadfaf89" - integrity sha512-Z8/6vRlTUChSdIgMa51jxQ4lrw/Jy5SOW10ObaA47/RElsAN2c5Pn8bTgFGWn/ibwzXTE8qwr1Yzx28vsecXEA== - dependencies: - whatwg-url "^5.0.0" +"node-fetch@https://registry.npmjs.org/@achingbrain/node-fetch/-/node-fetch-2.6.7.tgz": + version "2.6.7" + resolved "https://registry.npmjs.org/@achingbrain/node-fetch/-/node-fetch-2.6.7.tgz#1b5d62978f2ed07b99444f64f0df39f960a6d34d" node-fetch@~1.7.1: version "1.7.3" @@ -10825,6 +11109,19 @@ p-cancelable@^1.0.0: resolved "https://registry.yarnpkg.com/p-cancelable/-/p-cancelable-1.1.0.tgz#d078d15a3af409220c886f1d9a0ca2e441ab26cc" integrity sha512-s73XxOZ4zpt1edZYZzvhqFa6uvQc1vwUa0K0BdtIZgQMAJj9IbebH+JkgKZc9h+B05PKHLOTl4ajG1BmNrVZlw== +p-defer@^3.0.0: + version "3.0.0" + resolved "https://registry.yarnpkg.com/p-defer/-/p-defer-3.0.0.tgz#d1dceb4ee9b2b604b1d94ffec83760175d4e6f83" + integrity sha512-ugZxsxmtTln604yeYd29EGrNhazN2lywetzpKhfmQjW/VJmhpDmWbiX+h0zL8V91R0UXkhb3KtPmyq9PZw3aYw== + +p-fifo@^1.0.0: + version "1.0.0" + resolved "https://registry.yarnpkg.com/p-fifo/-/p-fifo-1.0.0.tgz#e29d5cf17c239ba87f51dde98c1d26a9cfe20a63" + integrity sha512-IjoCxXW48tqdtDFz6fqo5q1UfFVjjVZe8TC1QRflvNUJtNfCUhxOUw6MOVZhDPjqhSzc26xKdugsO17gmzd5+A== + dependencies: + fast-fifo "^1.0.0" + p-defer "^3.0.0" + p-finally@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/p-finally/-/p-finally-1.0.0.tgz#3fbcfb15b899a44123b34b6dcc18b724336a2cae" @@ -11008,6 +11305,11 @@ parse-asn1@^5.0.0, parse-asn1@^5.1.5: pbkdf2 "^3.0.3" safe-buffer "^5.1.1" +parse-duration@^1.0.0: + version "1.0.2" + resolved "https://registry.yarnpkg.com/parse-duration/-/parse-duration-1.0.2.tgz#b9aa7d3a1363cc7e8845bea8fd3baf8a11df5805" + integrity sha512-Dg27N6mfok+ow1a2rj/nRjtCfaKrHUZV2SJpEn/s8GaVUSlf4GGRCRP1c13Hj+wfPKVMrFDqLMLITkYKgKxyyg== + 
parse-github-repo-url@^1.3.0: version "1.4.1" resolved "https://registry.yarnpkg.com/parse-github-repo-url/-/parse-github-repo-url-1.4.1.tgz#9e7d8bb252a6cb6ba42595060b7bf6df3dbc1f50" @@ -11561,7 +11863,7 @@ proto-list@~1.2.1: resolved "https://registry.yarnpkg.com/proto-list/-/proto-list-1.2.4.tgz#212d5bfe1318306a420f6402b8e26ff39647a849" integrity sha1-IS1b/hMYMGpCD2QCuOJv85ZHqEk= -protobufjs@~6.11.0: +protobufjs@^6.10.2: version "6.11.2" resolved "https://registry.yarnpkg.com/protobufjs/-/protobufjs-6.11.2.tgz#de39fabd4ed32beaa08e9bb1e30d08544c1edf8b" integrity sha512-4BQJoPooKJl2G9j3XftkIXjoC9C0Av2NOrWmbLWT1vH32GcSUHjM0Arra6UfTsVyfMAuFzaLucXn1sadxJydAw== @@ -11821,6 +12123,13 @@ react-is@^16.7.0, react-is@^16.8.1: resolved "https://registry.yarnpkg.com/react-is/-/react-is-16.13.1.tgz#789729a4dc36de2999dc156dd6c1d9c18cea56a4" integrity sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ== +react-native-fetch-api@^2.0.0: + version "2.0.0" + resolved "https://registry.yarnpkg.com/react-native-fetch-api/-/react-native-fetch-api-2.0.0.tgz#c4af188b4fce3f3eaf1f1ff4e61dae1a00d4ffa0" + integrity sha512-GOA8tc1EVYLnHvma/TU9VTgLOyralO7eATRuCDchQveXW9Fr9vXygyq9iwqmM7YRZ8qRJfEt9xOS7OYMdJvRFw== + dependencies: + p-defer "^3.0.0" + read-cmd-shim@^2.0.0: version "2.0.0" resolved "https://registry.yarnpkg.com/read-cmd-shim/-/read-cmd-shim-2.0.0.tgz#4a50a71d6f0965364938e9038476f7eede3928d9" @@ -11996,6 +12305,13 @@ readdirp@~3.5.0: dependencies: picomatch "^2.2.1" +receptacle@^1.3.2: + version "1.3.2" + resolved "https://registry.yarnpkg.com/receptacle/-/receptacle-1.3.2.tgz#a7994c7efafc7a01d0e2041839dab6c4951360d2" + integrity sha512-HrsFvqZZheusncQRiEE7GatOAETrARKV/lnfYicIm8lbvp/JQOdADOfhjBd2DajvoszEyxSM6RlAAIZgEoeu/A== + dependencies: + ms "^2.1.1" + rechoir@^0.6.2: version "0.6.2" resolved "https://registry.yarnpkg.com/rechoir/-/rechoir-0.6.2.tgz#85204b54dba82d5742e28c96756ef43af50e3384" @@ -12246,6 +12562,11 @@ ret@~0.1.10: resolved "https://registry.yarnpkg.com/ret/-/ret-0.1.15.tgz#b8a4825d5bdb1fc3f6f53c2bc33f81388681c7bc" integrity sha512-TTlYpa+OL+vMMNG24xSlQGEJ3B/RzEfUlLct7b5G/ytav+wPrplCpVMFuwzXbkecJrb6IYo1iFb0S9v37754mg== +retimer@^2.0.0: + version "2.0.0" + resolved "https://registry.yarnpkg.com/retimer/-/retimer-2.0.0.tgz#e8bd68c5e5a8ec2f49ccb5c636db84c04063bbca" + integrity sha512-KLXY85WkEq2V2bKex/LOO1ViXVn2KGYe4PYysAdYdjmraYIUsVkXu8O4am+8+5UbaaGl1qho4aqAAPHNQ4GSbg== + retry@0.12.0, retry@^0.12.0: version "0.12.0" resolved "https://registry.yarnpkg.com/retry/-/retry-0.12.0.tgz#1b42a6266a21f07421d1b0b54b7dc167b01c013b" @@ -12910,6 +13231,13 @@ static-extend@^0.1.1: resolved "https://registry.yarnpkg.com/statuses/-/statuses-1.5.0.tgz#161c7dac177659fd9811f43771fa99381478628c" integrity sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow= +stream-to-it@^0.2.2: + version "0.2.4" + resolved "https://registry.yarnpkg.com/stream-to-it/-/stream-to-it-0.2.4.tgz#d2fd7bfbd4a899b4c0d6a7e6a533723af5749bd0" + integrity sha512-4vEbkSs83OahpmBybNJXlJd7d6/RxzkkSdT3I0mnGt79Xd2Kk+e1JqbvAvsQfCeKj3aKb0QIWkyK3/n0j506vQ== + dependencies: + get-iterator "^1.0.2" + stream-to-pull-stream@^1.7.1: version "1.7.3" resolved "https://registry.yarnpkg.com/stream-to-pull-stream/-/stream-to-pull-stream-1.7.3.tgz#4161aa2d2eb9964de60bfa1af7feaf917e874ece" @@ -13347,6 +13675,14 @@ timed-out@^4.0.0, timed-out@^4.0.1: resolved "https://registry.yarnpkg.com/timed-out/-/timed-out-4.0.1.tgz#f32eacac5a175bea25d7fab565ab3ed8741ef56f" integrity sha1-8y6srFoXW+ol1/q1Zas+2HQe9W8= +timeout-abort-controller@^1.1.1: 
+ version "1.1.1" + resolved "https://registry.yarnpkg.com/timeout-abort-controller/-/timeout-abort-controller-1.1.1.tgz#2c3c3c66f13c783237987673c276cbd7a9762f29" + integrity sha512-BsF9i3NAJag6T0ZEjki9j654zoafI2X6ayuNd6Tp8+Ul6Tr5s4jo973qFeiWrRSweqvskC+AHDKUmIW4b7pdhQ== + dependencies: + abort-controller "^3.0.0" + retimer "^2.0.0" + tmp@0.0.33, tmp@^0.0.33: version "0.0.33" resolved "https://registry.yarnpkg.com/tmp/-/tmp-0.0.33.tgz#6d34335889768d21b2bcda0aa277ced3b1bfadf9" @@ -13752,6 +14088,13 @@ uid-number@0.0.6: resolved "https://registry.yarnpkg.com/uid-number/-/uid-number-0.0.6.tgz#0ea10e8035e8eb5b8e4449f06da1c730663baa81" integrity sha1-DqEOgDXo61uOREnwbaHHMGY7qoE= +uint8arrays@^3.0.0: + version "3.0.0" + resolved "https://registry.yarnpkg.com/uint8arrays/-/uint8arrays-3.0.0.tgz#260869efb8422418b6f04e3fac73a3908175c63b" + integrity sha512-HRCx0q6O9Bfbp+HHSfQQKD7wU70+lydKVt4EghkdOvlK/NlrF90z+eXV34mUd48rNvVJXwkrMSPpCATkct8fJA== + dependencies: + multiformats "^9.4.2" + ultron@~1.1.0: version "1.1.1" resolved "https://registry.yarnpkg.com/ultron/-/ultron-1.1.1.tgz#9fe1536a10a664a65266a1e3ccf85fd36302bc9c" @@ -14020,6 +14363,11 @@ varint@^5.0.0: resolved "https://registry.yarnpkg.com/varint/-/varint-5.0.2.tgz#5b47f8a947eb668b848e034dcfa87d0ff8a7f7a4" integrity sha512-lKxKYG6H03yCZUpAGOPOsMcGxd1RHCu1iKvEHYDPmTyq2HueGhD73ssNBqqQWfvYs04G9iUFRvmAVLW20Jw6ow== +varint@^6.0.0: + version "6.0.0" + resolved "https://registry.yarnpkg.com/varint/-/varint-6.0.0.tgz#9881eb0ce8feaea6512439d19ddf84bf551661d0" + integrity sha512-cXEIW6cfr15lFv563k4GuVuW/fiwjknytD37jIOLSdSWuOI6WnO/oKwmP2FQTU2l01LP8/M5TSAJpzUaGe3uWg== + vary@^1, vary@~1.1.2: version "1.1.2" resolved "https://registry.yarnpkg.com/vary/-/vary-1.1.2.tgz#2299f02c6ded30d4a5961b0b9f74524a18f634fc"