sicor.Tools.cB package

Submodules

sicor.Tools.cB.CloudMask module

class sicor.Tools.cB.CloudMask.CloudMask(persistence_file=None, processing_tiles=10, novelty_detector=None, logger=None)[source]

Bases: S2cB

Cloud detection based on a classical Bayesian approach

Parameters:
  • persistence_file – if None, the internal default file is used; otherwise the file name of a persistence file

  • processing_tiles – in order to save memory, the processing can be done in tiles

  • logger – None or logger instance

Returns:

CloudMask instance

sicor.Tools.cB.cB module

class sicor.Tools.cB.cB.S2cB(cb_clf, mask_legend, clf_to_col, processing_tiles=11, logger=None)[source]

Bases: object

sicor.Tools.cB.classical_bayesian module

class sicor.Tools.cB.classical_bayesian.ClassicalBayesian(mk_clf, bns, hh_full, hh, hh_n, n_bins, classes, n_classes, bb_full, logger=None)[source]

Bases: object

Parameters:
  • mk_clf

  • bns

  • hh_full

  • hh

  • hh_n

  • n_bins

  • classes

  • n_classes

  • bb_full

Returns:

conf(xx)[source]
predict(xx)[source]
predict_and_conf(xx, bad_data_value=255)[source]
predict_proba(xx)[source]
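The ClassicalBayesian methods above predict from precomputed per-class histograms. The following is a minimal sketch of that idea in plain numpy (illustrative only; the function name, signature, and data here are made up and are not sicor's actual internals): each sample is digitized into a bin, and the class whose normalized histogram assigns that bin the highest frequency wins.

```python
import numpy as np

# Hypothetical sketch of histogram-based Bayesian classification
# (not sicor's actual implementation).
def predict_from_histograms(xx, bins, class_hists):
    """Assign each 1-D sample in `xx` to the class whose normalized
    histogram gives it the highest likelihood."""
    # digitize feature values into bin indices (clipped to valid range)
    idx = np.clip(np.digitize(xx, bins) - 1, 0, len(bins) - 2)
    # class-conditional likelihoods: shape (n_classes, n_samples)
    likelihoods = np.array([h[idx] for h in class_hists])
    return np.argmax(likelihoods, axis=0)

bins = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
# per-class bin frequencies (already normalized per class)
hist_clear = np.array([0.7, 0.2, 0.08, 0.02])
hist_cloud = np.array([0.05, 0.1, 0.25, 0.6])
labels = predict_from_histograms(np.array([0.1, 0.9]), bins, [hist_clear, hist_cloud])
# -> sample 0.1 is assigned class 0 (clear), sample 0.9 class 1 (cloud)
```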
class sicor.Tools.cB.classical_bayesian.ClassicalBayesianFit(mk_clf, smooth_min=0.0, smooth_max=2.0, n_bins_min=5, n_bins_max=20, n_runs=10000, smooth_dd=10, max_mb=100.0, norm='right_class', ff=None, xx=None, yy=None, max_run_time=60, sample_weight=None, use_tqdm=False, sufficient_norm=None, dtype=<class 'float'>, fit_method='random', smooth=None, logger=None)[source]

Bases: ClassicalBayesian

ff: function of two variables that combines ff_train and ff_test into the score which is maximised during the fit. The default

function is: lambda ff_train, ff_test: 0.5 * (ff_train + ff_test) - 0.4 * np.abs(ff_train - ff_test), which includes a penalty term against overfitting

max_mb: maximum size of a histogram array in MB; if the number of bins or features would require more

space, the number of bins is reduced to satisfy max_mb
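The default fit objective quoted above rewards models that score well on both the training and the test split, while the absolute-difference term penalizes a large train/test gap. A small worked evaluation of that exact lambda:

```python
import numpy as np

# Default fit objective from the docstring: mean of train/test scores
# minus a penalty proportional to their gap (discourages overfitting).
ff = lambda ff_train, ff_test: 0.5 * (ff_train + ff_test) - 0.4 * np.abs(ff_train - ff_test)

# A model that scores well on both splits beats one that only
# fits the training set, even though the latter has a higher mean:
balanced = ff(0.90, 0.88)  # small train/test gap
overfit = ff(0.99, 0.80)   # large gap is penalized
```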

static bins4histeq(inn, nbins_ou=10, nbins_in=1000)[source]

Returns the bin edges (not the counts!) for an equalized histogram. Assumes numpy arrays.
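The idea behind equalized-histogram bin edges is to place the edges at quantiles of the data, so every bin holds roughly the same number of samples. A sketch under that assumption (the function and data below are illustrative, not sicor's exact code):

```python
import numpy as np

# Sketch of histogram-equalized bin edges: edges at data quantiles
# give near-uniform bin occupancy even for skewed inputs.
def histeq_bin_edges(data, n_bins=10):
    return np.quantile(data, np.linspace(0.0, 1.0, n_bins + 1))

rng = np.random.default_rng(0)
data = rng.exponential(size=10_000)  # heavily skewed distribution
edges = histeq_bin_edges(data, n_bins=10)
counts, _ = np.histogram(data, bins=edges)
# counts are near-uniform (~1000 per bin) despite the skewed input
```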

fit(xx, yy, sample_weight=None, **kwargs)[source]
fit_chosen_one(xx, yy, sample_weight=None, test_size=None, sufficient_norm=None)[source]
fit_random(xx, yy, sample_weight=None, test_size=None, sufficient_norm=None, return_opts=False)[source]
fit_random_bootstrap(xx, yy, sample_weight=None, test_size=None, sufficient_norm=None)[source]
get_params(deep=True)[source]
mk_hist(xx)[source]
set(xx, yy, smooth=None, n_bins=5, sample_weight=None)[source]
set_params(**params)[source]
class sicor.Tools.cB.classical_bayesian.ToClassifierDef(classifiers_id, classifiers_fk, clf_functions, id2name=None, logger=None)[source]

Bases: _ToClassifierBase

Simplest case of a usable ToClassifier instance; everything is fixed

classifiers_id: list of lists/np.arrays with indices which are the inputs for the classifier functions

classifiers_fk: list of names of the functions to be used

clf_functions: dictionary of key, value pairs for the function names as used in classifiers_fk
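To illustrate how these three arguments fit together, here is a hypothetical sketch (the function names, signatures, band indices, and data are all made up for illustration; the real callables returned by get_clf_functions may have a different calling convention):

```python
import numpy as np

# Illustrative shapes of the ToClassifierDef-style arguments.
clf_functions = {                              # name -> callable
    "ratio": lambda d: d[0] / d[1],            # band ratio
    "index": lambda d: (d[0] - d[1]) / (d[0] + d[1]),  # normalized index
}
classifiers_fk = ["ratio", "index"]            # which function per feature
classifiers_id = [np.array([0, 3]), np.array([1, 3])]  # band indices per feature

# Applying them to a (n_bands, n_samples) array yields one derived
# feature per (function, index-set) pair:
data = np.array([[0.2, 0.4],
                 [0.3, 0.1],
                 [0.5, 0.5],
                 [0.1, 0.2]])
features = [clf_functions[fk](data[ids])
            for fk, ids in zip(classifiers_fk, classifiers_id)]
```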

class sicor.Tools.cB.classical_bayesian._ToClassifierBase(logger=None)[source]

Bases: object

Internal base class for the generation of classifiers; exists only to share the common __call__.

Dummy __init__ which sets all basic needed attributes to None; derived classes need to implement a proper __init__.

adjust_classifier_ids(full_bands, band_lists)[source]
static list_np(arr)[source]

Fixes a numpy annoyance where scalar arrays and vector arrays are treated differently, namely that one cannot iterate over a scalar array. This function returns a python list for both scalar and vector arrays.

Parameters:

arr – numpy array or numpy scalar

Returns:

list with the values of arr
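A minimal sketch of the normalization this describes (the implementation below is an assumption, not sicor's actual code): numpy 0-d arrays raise a TypeError on iteration, so both the 0-d and the 1-d case are converted to a plain list.

```python
import numpy as np

# Sketch: normalize numpy scalars (0-d arrays) and vectors to lists,
# since 0-d arrays cannot be iterated over.
def list_np(arr):
    arr = np.asarray(arr)
    return [arr.item()] if arr.ndim == 0 else list(arr)

scalar_as_list = list_np(np.array(3.0))    # 0-d array -> one-element list
vector_as_list = list_np(np.array([1, 2]))  # 1-d array -> plain list
```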

sicor.Tools.cB.classical_bayesian.dict_conv(inp)[source]
sicor.Tools.cB.classical_bayesian.digitize(data, bins, max_bins=2000)[source]

Replacement for np.digitize, sped up with numba.
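Since the function is documented as a drop-in replacement, its semantics are those of np.digitize: each value maps to the index of the bin it falls into. A scalar equivalent using binary search (illustrative; sicor's numba version is not reproduced here) agrees with numpy on the boundary cases:

```python
import bisect
import numpy as np

# np.digitize with increasing bins and right=False returns i such that
# bins[i-1] <= x < bins[i]; bisect_right has the same semantics.
def digitize_one(value, bins):
    return bisect.bisect_right(bins, value)

bins = [0.0, 1.0, 2.0, 3.0]
values = [-0.5, 0.5, 1.0, 2.7, 3.5]
ours = [digitize_one(v, bins) for v in values]
numpys = list(np.digitize(values, bins))
```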

sicor.Tools.cB.classical_bayesian.get_clf_functions()[source]

This is just an example of how one could define classification functions; such a dictionary is passed as an argument to the ToClassifier classes.

sicor.Tools.cB.classical_bayesian.histogramdd_fast(data, bins)[source]
sicor.Tools.cB.classical_bayesian.read_classical_bayesian_from_hdf5_file(filename)[source]

Loads persistence data for a classical Bayesian classifier from an hdf5 file.

Parameters:

filename

Returns:

dictionary with the needed data

sicor.Tools.cB.classical_bayesian.save_divide(d1, d2, mx=100.0)[source]

Safe division without introducing NaNs.

Parameters:
  • d1 – numerator

  • d2 – denominator

  • mx – absolute maximum allowed value; results beyond it are clipped

Returns:

d1 / d2
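One way to implement NaN-free division as documented is to divide only where the denominator is non-zero and clip the result to ±mx. This is a sketch of that behaviour (the function below is an assumption mirroring the docstring, not sicor's exact code):

```python
import numpy as np

# Sketch: divide only where the denominator is non-zero (avoiding
# inf/NaN) and clip the quotient to the +-mx range.
def safe_divide(d1, d2, mx=100.0):
    out = np.full_like(np.asarray(d1, dtype=float), mx)
    np.divide(d1, d2, out=out, where=np.asarray(d2) != 0)
    return np.clip(out, -mx, mx)

res = safe_divide(np.array([1.0, 5.0, 1.0]),
                  np.array([2.0, 0.0, 1e-6]))
# zero denominator and huge quotients both end up at mx, never NaN
```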

sicor.Tools.cB.classical_bayesian.to_clf(inp)[source]

Helper function which sets the type of the features and assures numeric values.

Parameters:

inp

Returns:

float array without NaNs or INFs

sicor.Tools.cB.classical_bayesian.write_classical_bayesian_to_hdf5_file(clf, filename, class_names, mask_legend, clf_to_col, band_names)[source]

Module contents

class sicor.Tools.cB.CloudMask(persistence_file=None, processing_tiles=10, novelty_detector=None, logger=None)[source]

Bases: S2cB

Cloud detection based on a classical Bayesian approach

Parameters:
  • persistence_file – if None, the internal default file is used; otherwise the file name of a persistence file

  • processing_tiles – in order to save memory, the processing can be done in tiles

  • logger – None or logger instance

Returns:

CloudMask instance