Usage
To use SICOR in a project:
import sicor
Quickstart
For processing multispectral data, the atmospheric fields from ECMWF for your time of interest should be downloaded:
sicor_ecmwf.py --help
If this data is missing, SICOR can fall back to default values, but this approach is not recommended.
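For example, the fields for a specific acquisition period can be fetched with an explicit date range (the dates and database path below are placeholders; all available flags are listed under Command line utilities):
sicor_ecmwf.py -p ./db_ECMWF/ -f 2018/06/01 -t 2018/06/02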
Using SICOR from Python:
# for multispectral data
from sicor import AC
AC()
# for hyperspectral data
from sicor.sicor_enmap import sicor_ac_enmap
enmap_l2a_vnir, enmap_l2a_swir, res = sicor_ac_enmap(data_l1b, options, logger)
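The following minimal sketch shows how the hyperspectral call above can be embedded in a script. It assumes that the processing options are a dictionary read from a SICOR settings JSON file (the file name sicor_enmap_options.json is a placeholder) and that data_l1b is an EnMAP Level-1B object loaded beforehand:
# minimal sketch, assuming options come from a settings JSON file and
# data_l1b is an already loaded EnMAP Level-1B object
import json
import logging
from sicor.sicor_enmap import sicor_ac_enmap
# logger used by SICOR to report progress
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("sicor")
# processing options (placeholder file name)
with open("sicor_enmap_options.json") as fp:
    options = json.load(fp)
# placeholder for the EnMAP Level-1B product object, loaded elsewhere
data_l1b = ...
enmap_l2a_vnir, enmap_l2a_swir, res = sicor_ac_enmap(data_l1b, options, logger)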
From the command line (currently only applicable to the multispectral case):
sicor_ecmwf.py --help
sicor_ac.py --help
Command line utilities
At the command line, SICOR provides the sicor_ac command for multispectral datasets; an example invocation follows the argument list:
usage: sicor_ac [-h] [-g GRANULE_PATH] [-o OUT_DIR] [-l LOG_DIR] [-s SETTINGS]
[-e EXPORT_OPTIONS_TO] [-v]
Named Arguments
- -g, --granule_path
Path to S2 granule product folder.
- -o, --out_dir
Path to output directory.
- -l, --log_dir
Path to directory where log can be written.
- -s, --settings
Path to settings json file.
- -e, --export_options_to
Export options json file to given path. Other options are ignored.
- -v, --version
Print version.
Default: False
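For example, the default options file can first be exported and then used to run the atmospheric correction on a granule (all paths below are placeholders):
sicor_ac -e ./sicor_options.json
sicor_ac -g /path/to/S2_granule -o /path/to/output -l /path/to/logs -s ./sicor_options.json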
The command line utility for hyperspectral datasets is currently under development.
The program sicor_ecmwf can be used to download the products from the ECMWF servers that are needed for processing multispectral datasets; an example invocation follows the argument list:
usage: sicor_ecmwf [-h] [--test] [-p DB_PATH] [-f DATE_FROM] [-t DATE_TO]
[-d DAYS] [-r REPEAT] [-i PROCESSES] [-v VARIABLES] [-F]
[-m MAX_STEP]
Named Arguments
- --test
Run Test downloads.
Default: False
- -p, --db_path
Root path to the database which will be created.
Default: “./db_ECMWF/”
- -f, --date_from
Start date in format: YYYY/MM/DD
Default: “2016/01/01”
- -t, --date_to
End date in format: YYYY/MM/DD
Default: “today”
- -d, --days
Download data for the last [days] days, ignoring -f and -t.
Default: 0
- -r, --repeat
Repeat pattern in the form [repeat]/[repeat interval as an hour fraction]/[print status every N seconds], e.g. 2/0.5/30; [repeat]=0 repeats forever.
Default: “1/0/0”
- -i, --processes
Number of processes to use for downloading.
Default: 0
- -v, --variables
Comma-separated list of variables to be retrieved.
Default: “fc_T2M,fc_O3,fc_SLP,fc_TCWV,fc_GMES_ozone,fc_total_AOT_550nm,fc_sulphate_AOT_550nm,fc_black_carbon_AOT_550nm,fc_dust_AOT_550nm,fc_organic_matter_AOT_550nm,fc_sea_salt_AOT_550nm”
- -F, --force
Force downloads.
Default: False
- -m, --max_step
Maximum forecast step to download.
Default: 120
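For example, the forecast fields of the last seven days can be downloaded into a local database directory (the path and day count are illustrative):
sicor_ecmwf -p ./db_ECMWF/ -d 7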
Makefile
SICOR operations can be started using make; the available targets are listed by running make without arguments:
$ make
make options (run make [option] to perform the action):
- clean: Remove all build, test, coverage and Python artifacts.
- clean-build: Remove build artifacts, including the build/, dist/ and .eggs/ folders.
- clean-pyc: Remove Python file artifacts, e.g. pyc files.
- clean-test: Remove test and coverage artifacts.
- convert_examples_to_doc: Use nbconvert to convert the jupyter notebooks in examples to doc/examples. Links to internal images are adjusted so that the Sphinx documentation can be built.
- coverage: Use coverage to run the tests and produce a coverage report.
- coverage_view: Open the default browser to check the coverage report.
- docs: Generate the HTML documentation using Sphinx. If the example jupyter notebooks should be updated, run the target 'convert_examples_to_doc' first.
- download-tables (currently only needed for the multispectral case): Download the tables for atmospheric correction and scene classification from Google Drive if they are not found locally (anywhere in $PATH). Google Drive might be unreliable and fail; just try again later. Files are checked against their hash before continuing.
- download-tables-all (currently only needed for the multispectral case): Download ALL tables for atmospheric correction and scene classification. This includes additional data, e.g. for methane retrievals or further development.
- examples_notebooks: Start a jupyter notebook server in the examples directory and open the browser.
- gitlab_CI_docker: Build a docker image for CI use within gitlab. This is based on docker and requires sudo access to docker. Multiple images are built; 'sicor:latest' includes a working environment for SICOR and is used to run the tests. SICOR itself is not included in this image and is cloned and installed for each test run.
- install: Install the package to the active Python site-packages.
- lint: Check style and PEP 8 conformity using multiple style checkers. Flake8 and pycodestyle need to complete without error for this target to pass. For now, pylint and pydocstyle are also run, but their results are ignored. The target 'test' depends on 'lint', which means that testing can only succeed when linting ran without errors. Run this before any commit!
- requirements: Install the requirements defined in requirements.txt using pip.
- test: Run the tests quickly with the default Python interpreter and without coverage.
- test_single: Run a single test quickly with the default Python interpreter and without coverage. This is useful for debugging errors; feel free to change the considered test case to your liking.
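For example, a typical pre-commit check runs the lint and test targets described above (make test alone is sufficient, since test depends on lint):
make lint
make test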