enpt.options package
Submodules
enpt.options.config module
EnPT configuration module.
Provides the configuration that is later passed to individual submodules.
- class enpt.options.config.EnPTConfig(json_config='', **user_opts)[source]
Bases:
object
Create a job configuration.
- Parameters:
json_config – path to JSON file containing configuration parameters or a string in JSON format
- Key CPUs:
number of CPU cores to be used for processing (default: “None” -> use all available)
- Key path_l1b_enmap_image:
input path of the EnMAP L1B image to be processed (zip-archive or root directory; must be given if not contained in --json-config)
- Key path_l1b_enmap_image_gapfill:
input path of an adjacent EnMAP L1B image to be used for gap-filling (zip-archive or root directory)
- Key path_dem:
input path of the digital elevation model in map or sensor geometry; GDAL compatible file format (must completely cover the EnMAP L1B data if given in map geometry, or must have the same pixel dimensions as the EnMAP L1B data if given in sensor geometry)
- Key average_elevation:
average elevation in meters above sea level; may be provided if no DEM is available; ignored if DEM is given
- Key output_dir:
output directory where processed data and log files are saved
- Key output_format:
file format of all raster output files (‘GTiff’: GeoTIFF, ‘ENVI’: ENVI BSQ; default: ‘ENVI’)
- Key output_interleave:
raster data interleaving type (default: ‘pixel’)
‘band’: band-sequential (BSQ),
‘line’: data interleaved-by-line (BIL; only usable for ENVI output format),
‘pixel’: data interleaved-by-pixel (BIP)
- Key output_nodata_value:
output no-data/background value (should be within the integer 16-bit range, default: -32768)
- Key working_dir:
directory to be used for temporary files
- Key n_lines_to_append:
number of lines to be appended to the main image [if None, the whole gap-fill image is used]. Requires ‘path_l1b_enmap_image_gapfill’ to be set.
- Key drop_bad_bands:
if set to True (default), the water absorption bands between 1358 and 1453 nm as well as between 1814 and 1961 nm are excluded from processing and will not be contained in the L2A product
- Key disable_progress_bars:
whether to disable all progress bars during processing
- Key path_earthSunDist:
input path of the Earth-Sun distance model
- Key path_solar_irr:
input path of the solar irradiance model
- Key scale_factor_toa_ref:
scale factor to be applied to TOA reflectance result
- Key enable_keystone_correction:
Enable keystone correction
- Key enable_vnir_swir_coreg:
Enable VNIR/SWIR co-registration
- Key enable_absolute_coreg:
Enable the co-registration of the EnMAP image to the reference image given with ‘path_reference_image’
- Key path_reference_image:
Reference image for co-registration.
- Key polymer_root:
Polymer root directory (that contains the subdirectory for ancillary data).
- Key enable_ac:
Enable atmospheric correction using SICOR algorithm (default: True). If False, the L2A output contains top-of-atmosphere reflectance.
- Key mode_ac:
3 modes to determine which atmospheric correction is applied to which surfaces (default: ‘land’):
‘land’: SICOR (developed for land surfaces) is applied to land AND water surfaces
‘water’: POLYMER (developed for water surfaces) is applied to water surfaces only (land surfaces are not included in the L2A product)
‘combined’: SICOR is applied to land and POLYMER is applied to water surfaces; NOTE that this may result in edge effects, e.g., at coastlines
- Key polymer_additional_results:
Enable the generation of additional results when running ACwater/POLYMER (default: True)
- Key auto_download_ecmwf:
Automatically download ECMWF AUX data when running Polymer atmospheric correction for water surfaces
- Key scale_factor_boa_ref:
Scale factor to be applied to BOA reflectance result
- Key threads:
number of threads for multiprocessing of blocks (see below):
‘threads = 0’: for single thread
‘threads < 0’: for as many threads as there are CPUs
‘threads > 0’: gives the number of threads
- Key blocksize:
block size for multiprocessing
- Key run_smile_P:
Enable extra smile detection and correction (provider smile coefficients are ignored)
- Key run_deadpix_P:
Enable dead pixel correction
- Key deadpix_P_algorithm:
Algorithm for dead pixel correction (‘spectral’ or ‘spatial’)
- Key deadpix_P_interp_spectral:
Spectral interpolation algorithm to be used during dead pixel correction (‘linear’, ‘quadratic’, ‘cubic’)
- Key deadpix_P_interp_spatial:
Spatial interpolation algorithm to be used during dead pixel correction (‘linear’, ‘nearest’, ‘zero’, ‘slinear’, ‘quadratic’, ‘cubic’)
- Key ortho_resampAlg:
Ortho-rectification resampling algorithm (‘nearest’, ‘bilinear’, ‘gauss’, ‘cubic’, ‘cubic_spline’, ‘lanczos’, ‘average’, ‘mode’, ‘max’, ‘min’, ‘med’, ‘q1’, ‘q3’)
- Key target_projection_type:
Projection type of the raster output files (‘UTM’, ‘Geographic’) (default: ‘UTM’)
- Key target_epsg:
Custom EPSG code of the target projection (overrides target_projection_type)
- Key target_coord_grid:
Custom target coordinate grid where the output is resampled to ([x0, x1, y0, y1], e.g., [0, 30, 0, 30])
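A minimal configuration as accepted via the json_config parameter might look like the fragment below. Paths and values are illustrative, only a subset of the keys above is shown, and the real options_default.json may nest these keys differently; the flat key names shown here match the keyword options documented above.

```json
{
    "path_l1b_enmap_image": "/data/ENMAP_L1B_scene.zip",
    "path_dem": "/data/dem_utm32.tif",
    "output_dir": "/output/",
    "output_format": "ENVI",
    "CPUs": 8,
    "enable_ac": true,
    "mode_ac": "land",
    "target_projection_type": "UTM"
}
```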
- get_json_opts(validate=True)[source]
Get a dictionary of EnPT config parameters.
NOTE: Reads the default options from options_default.json and updates the values with those from the database.
- save(path_outfile)[source]
Save the JobConfig instance to a JSON file with the same structure as options_default.json.
- Parameters:
path_outfile – path of the output JSON file
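The defaults-plus-overrides pattern that get_json_opts() describes (read options_default.json, then update values with user-supplied ones) can be sketched in plain Python. The dictionaries and key names below are illustrative stand-ins, not the actual contents of options_default.json:

```python
import json


def deep_update(defaults, overrides):
    """Recursively merge override values into a copy of the defaults."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_update(merged[key], value)
        else:
            merged[key] = value
    return merged


# stand-in for the defaults read from options_default.json (illustrative)
defaults = {"general_opts": {"CPUs": None, "output_dir": ""},
            "processors": {"atmospheric_correction": {"enable_ac": True}}}

# a user-supplied JSON string, as accepted by EnPTConfig(json_config=...)
user_json = '{"general_opts": {"CPUs": 4}}'

opts = deep_update(defaults, json.loads(user_json))
# opts now contains all defaults, with CPUs overridden to 4
```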
- class enpt.options.config.EnPTValidator(*args, **kwargs)[source]
Bases:
Validator
Validator class. Normalizes and/or validates any mapping against a validation schema which is provided as an argument at class instantiation or upon calling the validate(), validated() or normalized() method. An instance itself is callable and executes a validation. All instantiation parameters are optional.
There are the introspective properties types, validators, coercers, default_setters, rules, normalization_rules and validation_rules. The attributes reflecting the available rules are assembled considering constraints that are defined in the docstrings of the rules’ methods and are effectively used as validation schema for schema.
- Parameters:
schema (any mapping) – See schema. Defaults to None.
ignore_none_values (bool) – See ignore_none_values. Defaults to False.
allow_unknown (bool or any mapping) – See allow_unknown. Defaults to False.
require_all (bool) – See require_all. Defaults to False.
purge_unknown (bool) – See purge_unknown. Defaults to False.
purge_readonly (bool) – Removes all fields that are defined as readonly in the normalization phase.
error_handler (class or instance based on BaseErrorHandler or tuple) – The error handler that formats the result of errors. When given as a two-value tuple with an error-handler class and a dictionary, the latter is passed to the initialization of the error handler. Default: BasicErrorHandler.
Get an instance of EnPTValidator.
- Parameters:
args – Arguments to be passed to cerberus.Validator
kwargs – Keyword arguments to be passed to cerberus.Validator
- _types_from_methods = ()
- checkers = ()
- coercers = ()
- default_setters = ()
- normalization_rules = {'coerce': {'oneof': [{'type': 'callable'}, {'schema': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'type': 'list'}, {'allowed': (), 'type': 'string'}]}, 'default': {'nullable': True}, 'default_setter': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'purge_unknown': {'type': 'boolean'}, 'rename': {'type': 'hashable'}, 'rename_handler': {'oneof': [{'type': 'callable'}, {'schema': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'type': 'list'}, {'allowed': (), 'type': 'string'}]}}
- rules = {'allof': {'logical': 'allof', 'type': 'list'}, 'allow_unknown': {'oneof': [{'type': 'boolean'}, {'check_with': 'bulk_schema', 'type': ['dict', 'string']}]}, 'allowed': {'type': 'container'}, 'anyof': {'logical': 'anyof', 'type': 'list'}, 'check_with': {'oneof': [{'type': 'callable'}, {'schema': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'type': 'list'}, {'allowed': (), 'type': 'string'}]}, 'coerce': {'oneof': [{'type': 'callable'}, {'schema': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'type': 'list'}, {'allowed': (), 'type': 'string'}]}, 'contains': {'empty': False}, 'default': {'nullable': True}, 'default_setter': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'dependencies': {'check_with': 'dependencies', 'type': ('dict', 'hashable', 'list')}, 'empty': {'type': 'boolean'}, 'excludes': {'schema': {'type': 'hashable'}, 'type': ('hashable', 'list')}, 'forbidden': {'type': 'list'}, 'items': {'check_with': 'items', 'type': 'list'}, 'keysrules': {'check_with': 'bulk_schema', 'forbidden': ['rename', 'rename_handler'], 'type': ['dict', 'string']}, 'max': {'nullable': False}, 'maxlength': {'type': 'integer'}, 'meta': {}, 'min': {'nullable': False}, 'minlength': {'type': 'integer'}, 'noneof': {'logical': 'noneof', 'type': 'list'}, 'nullable': {'type': 'boolean'}, 'oneof': {'logical': 'oneof', 'type': 'list'}, 'purge_unknown': {'type': 'boolean'}, 'readonly': {'type': 'boolean'}, 'regex': {'type': 'string'}, 'rename': {'type': 'hashable'}, 'rename_handler': {'oneof': [{'type': 'callable'}, {'schema': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'type': 'list'}, {'allowed': (), 'type': 'string'}]}, 'require_all': {'type': 'boolean'}, 'required': {'type': 'boolean'}, 'schema': {'anyof': [{'check_with': 'schema'}, {'check_with': 'bulk_schema'}], 'type': ['dict', 'string']}, 'type': {'check_with': 'type', 'type': ['string', 'list']}, 'valuesrules': {'check_with': 
'bulk_schema', 'forbidden': ['rename', 'rename_handler'], 'type': ['dict', 'string']}}
- validate(document2validate, **kwargs)[source]
Normalizes and validates a mapping against a validation schema of defined rules.
- Parameters:
document (any mapping) – The document to normalize.
schema (any mapping) – The validation schema. Defaults to None. If not provided here, the schema must have been provided at class instantiation.
update (bool) – If True, required fields won’t be checked.
normalize (bool) – If True, normalize the document before validation.
- Returns:
True if validation succeeds, otherwise False. Check the errors property for a list of processing errors.
- Return type:
bool
- validation_rules = {'allof': {'logical': 'allof', 'type': 'list'}, 'allow_unknown': {'oneof': [{'type': 'boolean'}, {'check_with': 'bulk_schema', 'type': ['dict', 'string']}]}, 'allowed': {'type': 'container'}, 'anyof': {'logical': 'anyof', 'type': 'list'}, 'check_with': {'oneof': [{'type': 'callable'}, {'schema': {'oneof': [{'type': 'callable'}, {'allowed': (), 'type': 'string'}]}, 'type': 'list'}, {'allowed': (), 'type': 'string'}]}, 'contains': {'empty': False}, 'dependencies': {'check_with': 'dependencies', 'type': ('dict', 'hashable', 'list')}, 'empty': {'type': 'boolean'}, 'excludes': {'schema': {'type': 'hashable'}, 'type': ('hashable', 'list')}, 'forbidden': {'type': 'list'}, 'items': {'check_with': 'items', 'type': 'list'}, 'keysrules': {'check_with': 'bulk_schema', 'forbidden': ['rename', 'rename_handler'], 'type': ['dict', 'string']}, 'max': {'nullable': False}, 'maxlength': {'type': 'integer'}, 'meta': {}, 'min': {'nullable': False}, 'minlength': {'type': 'integer'}, 'noneof': {'logical': 'noneof', 'type': 'list'}, 'nullable': {'type': 'boolean'}, 'oneof': {'logical': 'oneof', 'type': 'list'}, 'readonly': {'type': 'boolean'}, 'regex': {'type': 'string'}, 'require_all': {'type': 'boolean'}, 'required': {'type': 'boolean'}, 'schema': {'anyof': [{'check_with': 'schema'}, {'check_with': 'bulk_schema'}], 'type': ['dict', 'string']}, 'type': {'check_with': 'type', 'type': ['string', 'list']}, 'valuesrules': {'check_with': 'bulk_schema', 'forbidden': ['rename', 'rename_handler'], 'type': ['dict', 'string']}}
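The schema-driven validation that EnPTValidator performs (via the cerberus library) can be sketched without any dependencies. The function below is a minimal illustration supporting only the ‘type’ and ‘allowed’ rules; schema keys are illustrative and not taken from the real EnPT options schema:

```python
def validate_mapping(document, schema):
    """Minimal illustration of cerberus-style validation: checks a document
    against a schema supporting only the 'type' and 'allowed' rules."""
    type_map = {"string": str, "integer": int, "boolean": bool,
                "dict": dict, "list": list}
    errors = {}
    for field, rules in schema.items():
        if field not in document:
            continue  # 'required' handling is omitted in this sketch
        value = document[field]
        expected = rules.get("type")
        if expected and not isinstance(value, type_map[expected]):
            errors[field] = "must be of type '%s'" % expected
        elif "allowed" in rules and value not in rules["allowed"]:
            errors[field] = "unallowed value %r" % value
    return len(errors) == 0, errors


# schema entries mimic the shape of a cerberus schema (keys illustrative)
schema = {"output_format": {"type": "string", "allowed": ["GTiff", "ENVI"]},
          "CPUs": {"type": "integer"}}

ok, errs = validate_mapping({"output_format": "ENVI", "CPUs": 8}, schema)
bad_ok, bad_errs = validate_mapping({"output_format": "JPEG"}, schema)
```

As with cerberus, the call returns a success flag plus a per-field error mapping rather than raising on the first failure.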
- enpt.options.config.get_options(target: str, validation: bool = True)[source]
Return dictionary with all options.
- Parameters:
target – path to a JSON file from which the options are loaded; otherwise the default options template is used
validation – whether to validate the options read from file (True / False)
- Returns:
dictionary with options
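The documented behaviour of get_options() can be sketched in plain Python. The default template below is a stand-in for options_default.json, and real validation would go through EnPTValidator; this is an illustration of the load-or-fall-back logic only:

```python
import json
import os


def get_options_sketch(target, validation=True):
    """Sketch of the documented get_options() behaviour: if 'target' points
    to an existing file, load the options from that JSON file; otherwise
    fall back to a built-in default template."""
    default_template = {"general_opts": {"CPUs": None}}  # stand-in for options_default.json
    if os.path.isfile(target):
        with open(target) as fp:
            opts = json.load(fp)
    else:
        opts = default_template
    if validation:
        # the real implementation validates against the options schema;
        # this sketch only checks that the result is a mapping
        assert isinstance(opts, dict), "options must be a mapping"
    return opts
```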
enpt.options.options_schema module
Definition of EnPT options schema (as used by cerberus library).
Module contents
EnPT module ‘options’, containing all configuration related functions and parameter definitions.