chore(trading): maintenance on ipfs build process (#4736)
parent db19ee80ac
commit f810a3a0b1
Makefile (2 changes)

@@ -11,7 +11,7 @@ recalculate-ipfs:
 	echo "ipfs hash inside the image"
 	docker run --rm ${TAG} cat /ipfs-hash
 	echo "recalculating ipfs hash"
-	docker run --rm ${TAG} ipfs add -r /usr/share/nginx/html
+	docker run --rm ${TAG} ipfs add -rQ /usr/share/nginx/html
 
 .PHONY: eject-ipfs-hash
 unpack:
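A note on the flag change above: the ipfs CLI's `-Q` (`--quieter`) option makes `ipfs add -r` print only the final root CID instead of one line per file, so the output can be captured straight into `/ipfs-hash`. A minimal sketch of that capture, using a hypothetical stand-in function rather than a real ipfs binary:

```shell
# Sketch: with -Q, `ipfs add -r` emits a single line (the root CID),
# which can be redirected directly into a file. fake_ipfs_add is a
# hypothetical stand-in for `ipfs add -rQ <dir>`.
fake_ipfs_add() { echo "QmExampleRootCid"; }

HASH="$(fake_ipfs_add)"
echo "${HASH}" > /tmp/ipfs-hash
cat /tmp/ipfs-hash
```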
README.md (48 changes)

@@ -115,11 +115,31 @@ docker run -p 3000:80 [TAG]
 
 ## Build instructions
 
-The [`docker`](./docker) subfolder has some docker configurations for easily setting up your own hosted version of console either for the web, or ready for pinning on IPFS
+The [`docker`](./docker) subfolder has some docker configurations for easily setting up your own hosted version of Console, either for the web or ready for pinning on IPFS.
+
+### nx build inside the docker
+
+Using a multistage Dockerfile, `dist` is compiled with the [node](https://hub.docker.com/_/node) image and later packed into nginx as in the [dist build](#dist-build). The multistage build ensures a consistent CPU architecture and build toolchain are used, so the result is identical.
+
+```bash
+docker build --build-arg APP=[YOUR APP] --build-arg NODE_VERSION=16.5.1 --build-arg ENV_NAME=mainnet -t [TAG] -f docker/node-inside-docker.Dockerfile .
+```
+
+### Computing ipfs-hash of the build
+
+At the moment this feature is important only for Console releases.
+
+Each docker build finishes with a hash calculation for the `dist` directory. The resulting hash is written to a file named `/ipfs-hash`. Once the docker image is produced, you can run the following command to display the ipfs-hash:
+
+```bash
+make recalculate-ipfs TAG=vegaprotocol/trading:{YOUR_VERSION}
+```
+
+**Updating hash:** recompiling the `dist` directory (even if there are no changes to the source code) results in a different hash computed by the ipfs command.
+
 ### nx build outside the docker
 
-Packaging prepared dist into [`nginx`](https://hub.docker.com/_/nginx)([server configuration](./nginx/nginx.conf)) docker image involves building the application on docker host machine from source.
+This Docker image packages a pre-built `dist` folder into an [`nginx`](https://hub.docker.com/_/nginx) ([server configuration](./nginx/nginx.conf)) docker image. In this case, the application is built on the docker host machine from source.
 
 As a prerequisite you need to build the `dist` directory and move its content for the specific application to the `dist-result` directory. Use the following script to do it with a single command:
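The "Updating hash" note in the hunk above follows from IPFS being content-addressed: any byte-level difference in `dist` (such as an embedded build timestamp) yields a new CID. A small illustration of the principle, using sha256 as a stand-in for the real CID computation:

```shell
# Illustration only: two "builds" that differ by a timestamp hash
# differently, even though the source was unchanged. sha256sum stands
# in for the IPFS CID computation.
printf 'bundle built-at 10:00' > /tmp/dist_a
printf 'bundle built-at 10:05' > /tmp/dist_b
A="$(sha256sum /tmp/dist_a | cut -d' ' -f1)"
B="$(sha256sum /tmp/dist_b | cut -d' ' -f1)"
if [ "$A" != "$B" ]; then echo "hashes differ"; fi
```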
@@ -133,38 +153,18 @@ You can build any of the containers locally with the following command:
 docker build --dockerfile docker/node-outside-docker.Dockerfile . --tag=[TAG]
 ```
 
-### nx build inside the docker
-
-Using multistage dockerfile dist is compiled using [node](https://hub.docker.com/_/node) image and later packed to nginx as in [dist build](#dist-build) example.
-
-```bash
-docker build --build-arg APP=[YOUR APP] --build-arg NODE_VERSION=$(cat .nvmrc) --build-arg ENV_NAME=mainnet -t [TAG] -f docker/node-inside-docker.Dockerfile .
-```
-
-### Computing ipfs-hash of the build
-
-At the moment this feature is important only for `trading` (console) releases.
-
-Each docker build finishes with hash calculation for dist directory. Resulting hash is added to file named as `/ipfs-hash`. Once docker image is produced you can run following commad to display ipfs-hash:
-
-```bash
-make recalculate-ipfs TAG=vegaprotocol/trading:{YOUR_VERSION}
-```
-
-**updating hash:** recompiling dist directory (even if there are no changed to source code) results in different hash computed by ipfs command.
-
 ### Verifying ipfs-hash of existing current application version
 
 An IPFS CID will be attached to every [release](https://github.com/vegaprotocol/frontend-monorepo/releases). If you intend to pin an application on IPFS, you can check that your build matches by running the following steps:
 
-1. Show latest release by runnning: `make latest-release`. You need to configure [`gh`](https://cli.github.com/) for this step to work, otherwise please provide release manually from [github](https://github.com/vegaprotocol/frontend-monorepo/releases) or [dockerhub](https://hub.docker.com/r/vegaprotocol/trading)
+1. Show the latest release by running: `make latest-release`. You need to configure [`gh`](https://cli.github.com/) for this step to work; otherwise, provide the release manually from [github](https://github.com/vegaprotocol/frontend-monorepo/releases) or [dockerhub](https://hub.docker.com/r/vegaprotocol/trading)
 2. Set the RELEASE environment variable to the value that you want to validate: `export RELEASE=$(make latest-release)` or `export RELEASE=vXX.XX.XX`
 3. Set the TAG environment variable to the image that you want to validate: `export TAG=vegaprotocol/trading:$RELEASE`
 4. Download the docker image with the desired release: `docker pull $TAG`.
 5. Recalculate the hash: `make recalculate-ipfs`
 6. You should see exactly the same hash produced by the ipfs command as the one placed with the release notes: `make show-latest-release`
 7. If you want to extract `dist` from the docker image to your local filesystem, you can run the following command: `make unpack`
-8. Now `dist` directory contains valid application build. **it is not possible to calculate same ipfs hash on files that are result of copy operation**
+8. Now the `dist` directory contains a valid application build
 
 ## Config
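The verification steps in the README hunk above can be assembled into a single sketch. The release value here is a hypothetical placeholder (normally it would come from `make latest-release`), and nothing is actually pulled; the script only constructs the commands:

```shell
# Sketch of README verification steps 2-6: derive TAG from RELEASE,
# then echo the commands you would run against the registry.
RELEASE="v1.2.3"   # hypothetical placeholder; normally $(make latest-release)
TAG="vegaprotocol/trading:${RELEASE}"
echo "docker pull ${TAG}"
echo "make recalculate-ipfs TAG=${TAG}"
echo "make show-latest-release"
```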
@@ -7,7 +7,7 @@ envCmd=""
 
 if [[ ! -z "${ENV_NAME}" ]]; then
   if [[ "${ENV_NAME}" != "ops-vega" ]]; then
-    envCmd="envCmd="yarn env-cmd -f ./apps/${APP}/.env.${ENV_NAME}"
+    envCmd="yarn env-cmd -f ./apps/${APP}/.env.${ENV_NAME}"
   fi
 fi
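The fix above repairs a doubled assignment (`envCmd="envCmd="…`), which would have produced a malformed, unbalanced command string. A standalone sketch of the intended logic, with hypothetical app and environment names (the original uses bash `[[ ]]`; POSIX `[ ]` is used here for portability):

```shell
# Build the env-cmd prefix only when ENV_NAME is set and is not the
# special "ops-vega" environment. APP and ENV_NAME are placeholders.
APP="trading"        # hypothetical
ENV_NAME="mainnet"   # hypothetical
envCmd=""
if [ -n "${ENV_NAME}" ]; then
  if [ "${ENV_NAME}" != "ops-vega" ]; then
    envCmd="yarn env-cmd -f ./apps/${APP}/.env.${ENV_NAME}"
  fi
fi
echo "${envCmd}"
```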
@@ -6,12 +6,11 @@ WORKDIR /app
 ARG APP
 ARG ENV_NAME=""
 RUN apk add --update --no-cache \
     python3==3.10.11-r0 \
     make==4.3-r0 \
     gcc==11.2.1_git20220219-r2 \
     g++==11.2.1_git20220219-r2
 COPY . ./
-RUN yarn --network-timeout 100000 --pure-lockfile
+RUN yarn --pure-lockfile
 # work around for different build process in trading
 RUN sh docker/docker-build.sh
@@ -1,146 +0,0 @@
-#!/usr/bin/env node
-
-/**
- * Run on Github by commits to master, this script:
- * 1. Gets all projects with a fleek configuration file
- * 2. Gets the last commit for the relevant site id
- * 3. Runs nx:affected for that commit ID and checks if the site has changed
- * 4. Calls deploy if it has
- *
- * This would probably be best as an NX task, but the circular nature of getting
- * the fleek ID for the project, then checking if it was affected didn't fit within the
- * build-only-affected cycle, and as each fleek deploy will have been triggered by
- * a different commit, it seemed best to do this outwith nx. Feel free to re-implement
- * this if the assumptions are wrong.
- *
- * It has also been written to skip external dependencies.
- */
-const { existsSync, readdirSync, readFileSync } = require('fs');
-const { execSync } = require('child_process');
-
-/**
- * Parses the last commit hash out of the Fleek API response
- * @param {String} siteId
- * @returns string Last commit that triggered a build
- */
-function getFleekLastBuildCommit(siteId) {
-  const curl = `curl 'https://api.fleek.co/graphql' --silent -X POST -H 'Accept: */*' -H 'Accept-Encoding: gzip, deflate, br' -H 'Content-Type: application/json' -H 'Authorization: ${process.env['FLEEK_API_KEY']}' --data-raw '{"query":"{getSiteById(siteId: \\"${siteId}\\"){publishedDeploy{repository{commit}}}}","variables":null}'`;
-  const fleekRes = execSync(curl);
-
-  const res = JSON.parse(fleekRes.toString());
-  let commit = res.data.getSiteById.publishedDeploy.repository.commit;
-
-  return commit;
-}
-
-/**
- * Triggers a Fleek build of the latest code via GraphQL
- * @param {String} siteId
- * @returns
- */
-function triggerDeploy(siteId) {
-  const curl = `curl 'https://api.fleek.co/graphql' --silent -X POST -H 'Accept: */*' -H 'Accept-Encoding: gzip, deflate, br' -H 'Content-Type: application/json' -H 'Authorization: ${process.env['FLEEK_API_KEY']}' --data-raw '{"query":"mutation {triggerDeploy(commit: \\"HEAD\\", siteId: \\"${siteId}\\"){status}}","variables":null}'`;
-  const fleekRes = execSync(curl);
-
-  JSON.parse(fleekRes.toString());
-
-  // Will have thrown if it failed
-  return true;
-}
-
-// The folder containing NX projects
-const projectPath = './apps/';
-// The Fleek project file, the existence indicates a deployed app
-const fleekFile = '.fleek.json';
-// Some simple stats for the end
-let fleekProjects = 0;
-let deployedProjects = 0;
-
-// Fleek CLI requires this variable to be set
-if (!process.env['FLEEK_API_KEY']) {
-  console.error('Error: FLEEK_API_KEY must be set');
-  process.exit(1);
-}
-
-readdirSync(projectPath).forEach((proj) => {
-  try {
-    const config = `${projectPath}${proj}/${fleekFile}`;
-    if (!existsSync(config)) {
-      // No fleek file, skip it
-      return;
-    }
-
-    fleekProjects++;
-
-    console.group(proj);
-
-    // The UID for the site according to the config
-    let siteId;
-
-    try {
-      const fleekConfig = JSON.parse(readFileSync(config));
-      siteId = fleekConfig.site.id;
-
-      console.log(`Fleek site ID: ${siteId}`);
-    } catch (e) {
-      console.error(`Failed to read Fleek site id for ${proj}`);
-      return;
-    }
-
-    // The last commit that triggered a build
-    let baseCommit;
-
-    try {
-      baseCommit = getFleekLastBuildCommit(siteId);
-
-      console.log(`Last deploy: ${baseCommit}`);
-    } catch (e) {
-      console.error(`Failed to fetch last deploy for ${proj}`);
-      return;
-    }
-
-    // Now run nx affected
-    let isAffected;
-
-    try {
-      const affectedSinceCommit = execSync(
-        `yarn nx print-affected --base=${baseCommit} --head=HEAD --select=projects`
-      );
-
-      // Detect if this project name is in output, taking care not to match names that are
-      // included in other projects - `trading`, `trading-e2e` is a current example
-      isAffected = affectedSinceCommit
-        .toString()
-        .split(',')
-        .map((v) => v.trim())
-        .includes(proj);
-    } catch (e) {
-      console.error(`Failed to run nx:affected for ${baseCommit}:master`);
-      return;
-    }
-
-    if (isAffected) {
-      console.log(`Triggering deploy for: ${siteId}`);
-      deployedProjects++;
-
-      try {
-        triggerDeploy(siteId);
-      } catch (e) {
-        console.error(`Failed to trigger deploy for ${proj}`);
-        process.exit(1);
-      }
-    } else {
-      console.log(`Has not changed since last build, skipping...`);
-    }
-
-    console.groupEnd();
-  } catch (e) {
-    console.log(e);
-    process.exit(1);
-  }
-});
-
-console.log(`Fleek projects: ${fleekProjects}`);
-console.log(`Deploys triggered: ${deployedProjects}`);
-
-process.exit(0);
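The trickiest detail in the deleted script was matching a project name exactly in the `nx print-affected --select=projects` output, so that `trading` does not accidentally match `trading-e2e` (per its own comment). A small shell sketch of that exact-match check, using a hypothetical affected list:

```shell
# Hypothetical `nx print-affected --select=projects` output
affected="explorer, trading-e2e, trading"
proj="trading"
is_affected=false
# Split on commas; word-splitting also strips the surrounding spaces,
# mirroring the old .split(',').map(v => v.trim()).includes(proj) logic.
for p in $(echo "${affected}" | tr ',' ' '); do
  if [ "${p}" = "${proj}" ]; then is_affected=true; fi
done
echo "${is_affected}"
```

Note the exact string comparison: a substring test such as `grep trading` would wrongly flag `trading-e2e` as a match.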