
Parameters

Saman .E edited this page Feb 28, 2023 · 3 revisions

Specification of the GB-CNN and GB-DNN parameters.

Both GB-CNN and GB-DNN share the same set of parameters.

The parameters are categorized into three groups: general parameters, gradient boosting parameters, and additive model parameters.

General parameters

  • seed : int, default = 111, help = Seed for the pseudo-random number generator; fixing it makes runs reproducible.

Gradient Boosting parameters

  • boosting_epoch : int, default = 40, help = Number of boosting iterations, i.e., additive models trained sequentially in a gradient-boosting manner.

  • boosting_eta : float, default = 1e-1, help = Shrinkage rate (learning rate) applied to each step of the gradient boosting framework.

  • save_records : bool, default = False, help = Save the trained models and additional training metrics (enabling save_records consumes more memory).
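The interplay between boosting_epoch and boosting_eta can be illustrated with a minimal sketch. This is not the GB-CNN/GB-DNN implementation: the "weak learner" below is a trivial stand-in (it simply reproduces the residual), whereas the actual models fit a neural network at each boosting step. The sketch only shows how the shrinkage rate scales each step's contribution to the running prediction.

```python
def boosted_fit(y, boosting_epoch=40, boosting_eta=0.1):
    """Return the ensemble prediction after boosting_epoch shrunken steps.

    Illustrative only: the step here is a perfect weak learner, so with
    squared loss each iteration closes a boosting_eta fraction of the
    remaining gap to the target y.
    """
    prediction = 0.0
    for _ in range(boosting_epoch):
        residual = y - prediction          # pseudo-residual of the squared loss
        step = residual                    # stand-in for a fitted additive model
        prediction += boosting_eta * step  # shrinkage: keep only eta of each step
    return prediction
```

With the defaults above (40 iterations, eta = 0.1), the remaining error shrinks by a factor of 0.9 per iteration, which is why a smaller boosting_eta typically needs a larger boosting_epoch.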

Additive model parameters

  • additive_epoch : int, default = 200, help = The number of training epochs for each additive model.

  • batch : int, default = 128, help = Batch size for training the additive model.

  • units : int, default = 20, help = The number of hidden neurons in the dense layers.

  • additive_eta : float, default = 1e-3, help = The learning rate for the Adam optimizer.

  • patience : int, default = 3, help = Number of epochs with no improvement in the validation score (MSE) before early stopping halts training of the additive model.
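For reference, the defaults listed above can be collected into a single configuration. Note this is only a sketch: the actual GB-CNN/GB-DNN constructor signature may differ, and the dict below merely mirrors the parameter names and defaults documented on this page.

```python
# All parameters from this page with their documented defaults.
DEFAULT_PARAMS = {
    # General parameters
    "seed": 111,
    # Gradient boosting parameters
    "boosting_epoch": 40,
    "boosting_eta": 1e-1,
    "save_records": False,
    # Additive model parameters
    "additive_epoch": 200,
    "batch": 128,
    "units": 20,
    "additive_eta": 1e-3,
    "patience": 3,
}

# Hypothetical usage (model class name assumed, not confirmed by this page):
# model = GBCNN(**DEFAULT_PARAMS)
```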
