Command Line Interface
Each geoprocessing project provides a number of commands to get work done. They are accessible via your project's package.json scripts and are run using `npm run <command>`.
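For example, running `npm run` with no arguments lists the scripts available in your project, and any of them can then be invoked by name (the exact set of scripts depends on your project and geoprocessing library version):

```sh
# List all scripts defined in package.json
npm run

# Invoke a specific command, e.g. bundle the project for deployment
npm run build
```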
Adding Building Blocks
- `create:report` - stubs out a new report component and geoprocessing function
- `create:client` - stubs out a new report client
- `create:function` - stubs out a new geoprocessing function
- `add:template` - adds add-on templates to your project
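As a sketch of how these are used, each command is run through npm and typically walks you through interactive prompts for details such as the name of the new function, report, or client (the order shown here is only illustrative):

```sh
# Stub out a new geoprocessing function, then a report component that displays its results
npm run create:function
npm run create:report

# Add an add-on template to the project
npm run add:template
```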
Datasource management
- `import:data` - imports a new vector or raster datasource to the `data/dist` directory, for publishing and use in preprocessing and geoprocessing functions, making any necessary transformations and precalculations.
- `reimport:data` - reimports an existing datasource. Use when a new version of the data becomes available.
- `publish:data` - publishes imported datasources from `data/dist` to the project's `datasets` S3 bucket, for use by preprocessing and geoprocessing functions.
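As a sketch, the lifecycle of a datasource usually runs through these commands in order:

```sh
# Import a new vector or raster datasource into data/dist,
# applying any needed transformations and precalculations
npm run import:data

# Later, when a new version of the source data becomes available
npm run reimport:data

# Publish the contents of data/dist to the project's datasets S3 bucket
npm run publish:data
```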
Testing
Testing uses Storybook, Jest, and React Testing Library.
- `storybook` - loads stories for your reports and other UI components in your default web browser using a local storybook dev server.
  - Story files must be named `*.stories.tsx` to be picked up.
  - Storybook updates automatically as you make and save changes to your components.
- `start:data` - runs a local file server, serving up the cloud-optimized datasources in `data/dist`.
- `test` - executes all unit and smoke tests for the project.
- `test:unit:matching` - executes unit tests with a name matching the given substring.
  - You will need to run the `start:data` command manually before running this command if your functions access datasources published by this project (not global datasources); see the example following this list.
  - See Vitest `-t`.
  - e.g. `npm run test:matching boundaryAreaOverlapSmoke`, where the smoke test is coded as follows:

    ```ts
    test("boundaryAreaOverlapSmoke - tests run against all examples", async () => {
      ...
    })
    ```
- `test:matching` - executes tests with a name matching the given substring.
  - You will need to run the `start:data` command manually before running this command if your functions access datasources published by this project (not global datasources).
  - See Jest `--testNamePattern`.
  - e.g. `npm run test:matching boundaryAreaOverlapSmoke`, where the smoke test is coded as follows:

    ```ts
    test("boundaryAreaOverlapSmoke - tests run against all examples", async () => {
      ...
    })
    ```
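For example, when your functions read datasources published by this project, a local test session might look like the following sketch, with `start:data` left running in a separate terminal:

```sh
# Terminal 1: serve the cloud-optimized datasources in data/dist locally
npm run start:data

# Terminal 2: run the full suite, or only tests whose names match a substring
npm run test
npm run test:matching boundaryAreaOverlapSmoke
```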
Build and deploy
- `build` - bundles geoprocessing functions into a `.build` directory and report clients into a `.build-web` directory, ready to be deployed.
- `build:client` - sub-command for building just your report clients.
- `build:lambda` - sub-command for building just your geoprocessing functions.
- `deploy` - deploys your built functions and clients to an AWS CloudFormation stack. The name of the stack to deploy to is based on the name of your project in package.json. After the initial deploy, use this same command to re-deploy (see the sketch after this list).
- `synth` - translates your project resources into an AWS CloudFormation template. This is done automatically as part of the deploy, so you should not need to run it yourself.
- `destroy` - destroys your current project's CloudFormation stack in AWS. Useful if a rollback fails and your stack is left in an inconsistent state; after destroying you should be able to re-deploy.
- `bootstrap` - command to run `cdk bootstrap`. Usually only needed when deploying with CDK for the first time to a region with your account. Run it if your deploy fails and suggests you need to bootstrap.
- `url` - returns the root URL of the REST API for your deployment, which returns the project manifest.
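A typical deploy cycle might look like the following sketch, assuming your AWS credentials are already configured for the target account and region:

```sh
# Bundle geoprocessing functions and report clients
npm run build

# Create or update the project's CloudFormation stack
npm run deploy

# Print the root URL of the deployed REST API
npm run url
```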
Upgrade scripts
- `install:scripts` - installs scripts from the geoprocessing library to the `scripts` and `data/scripts` folders, overwriting existing files. Use it to manually upgrade your scripts to the latest versions after upgrading the geoprocessing library. If you've modified these scripts locally you will need to merge the changes manually.
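For example, after upgrading the geoprocessing library you might refresh your local scripts as follows (a sketch; the package name assumes the library is installed as `@seasketch/geoprocessing`):

```sh
# Upgrade the library, then overwrite local scripts with the library's latest versions
npm install @seasketch/geoprocessing@latest
npm run install:scripts
```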
Language Translation
- `extract:translation` - extracts all translations from your project's source code using babel and babel-plugin-i18next-extract. It also runs an additional script (`src/i18n/bin/extractExtraTerms.ts`) to extract strings from your project config (metrics.json, objectives.json) that are commonly displayed in reports, for translation as `extraTerms`.
- `publish:translation` - posts translations for all languages to POEditor. Behavior is pre-configured via `src/i18n/config.ts`; do not edit this file unless you need to.
  - Translations with the namespace specified by `localNamespace` are written to POEditor with the context value specified by `remoteContext`.
  - All English translations are published, overwriting any in POEditor, since the code is their source of truth.
  - For non-English languages, POEditor is the source of truth, so if a translation is not defined in POEditor, then a local project translation is published if available, otherwise a base translation is published as a fallback. Running `import:translation` after that will then import those base translations back and seed the local project translations.
- `import:translation` - fetches translations from POEditor for all non-English languages having the context value specified by the `remoteContext` property in `src/i18n/config.ts`. Any existing translation values will be overwritten. Translations are saved to the namespace specified by the `localNamespace` property in `project/i18n.json`.
- `sync:translation` - a convenience command to keep the code, local translations, and remote translations in sync. Simply runs `extract`, `publish`, then `import` in succession.
- `install:translation` - use to manually upgrade your project's base translations from the installed geoprocessing library to the project's `src/i18n/baseLang` directory, overwriting any previous version. You should not normally need to run this, because it is already run every time you run `npm install`, so if you upgrade your geoprocessing library version it will be done automatically.
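Putting these together, a typical translation update might look like the following sketch, assuming your POEditor credentials are already configured for the project; run the steps individually or use the combined command:

```sh
# Step by step: extract strings from code, push to POEditor, then pull translations back
npm run extract:translation
npm run publish:translation
npm run import:translation

# Or do all three in one shot
npm run sync:translation
```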