# LgbTrainConfig
Configuration for LightGBM training parameters.

This class encapsulates all parameters that can be passed to `lgb.train()`, except for validation data-related parameters. Validation data handling is managed separately: `SingleModelContainer` decides whether to use validation data in its `train` method, while `CvModelContainer` handles it automatically during cross-validation.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `params` | `dict[str, Any]` | Parameters for training. | *required* |
| `num_boost_round` | `int` | Number of boosting iterations. | `100` |
| `valid_names` | `list[str] \| None` | Names of valid_sets. | `None` |
| `feval` | `callable` or `list of callable` | Customized evaluation function. Each evaluation function should accept two parameters, `preds` and `eval_data`, and return `(eval_name, eval_result, is_higher_better)` or a list of such tuples. | *required* |
| `init_model` | `str`, `Path`, `Booster` or `None` | Filename of a LightGBM model, or a `Booster` instance, used to continue training. | *required* |
| `keep_training_booster` | `bool` | Whether the returned `Booster` will be used to keep training. If `False`, the returned value will be converted into an `_InnerPredictor` before returning. | `False` |
| `callbacks` | `list of callable` | List of callback functions that are applied at each iteration. | *required* |
See Also
lightgbm.train : The underlying LightGBM training function.
Examples:
>>> import lightgbm as lgb
>>> from factrainer.lightgbm import LgbTrainConfig
>>> config = LgbTrainConfig(
... params={
... "objective": "regression",
... "metric": "rmse",
... "boosting_type": "gbdt",
... "num_leaves": 31,
... "learning_rate": 0.05,
... },
... num_boost_round=100,
... callbacks=[lgb.early_stopping(10), lgb.log_evaluation(50)],
... )
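To illustrate the `feval` contract described in the parameters table, here is a minimal sketch of a custom evaluation function. The name `median_absolute_error` is chosen for this example, and it assumes `eval_data` behaves like LightGBM's `Dataset` (i.e. exposes `get_label()`):

```python
from statistics import median

def median_absolute_error(preds, eval_data):
    """Sketch of a custom feval: accepts (preds, eval_data) and returns a
    (eval_name, eval_result, is_higher_better) tuple, per the contract above."""
    y_true = eval_data.get_label()
    err = float(median(abs(t - p) for t, p in zip(y_true, preds)))
    # Lower is better for an error metric, so is_higher_better is False.
    return "median_ae", err, False
```

Such a function could then be supplied as `feval=[median_absolute_error]` when constructing `LgbTrainConfig`.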