HaskellGBM-0.1.0.0

Safe Haskell: None
Language: Haskell2010

LightGBM.Parameters

Contents

Description

Parameter types for LightGBM.

Parameter details are documented in the LightGBM documentation.

Note that some of the parameters listed in the documentation are not exposed here since they're set implicitly through other parts of the API. For instance, the task param is set in the LightGBM.Model API, and the header param is set in the LightGBM.DataSet API.
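As an illustrative sketch (not from the package docs), a training configuration is just a list of Param values built from the constructors documented below. Note that fields typed PositiveInt, Natural, and similar are refined numeric types, so plain literals may need the package's smart constructors in real code:

```haskell
-- Hypothetical training configuration assembled from Param constructors.
-- Literal values for refined types (PositiveInt etc.) are shown for
-- readability; the actual library may require explicit refinement.
trainingParams :: [Param]
trainingParams =
  [ BoostingType GBDT          -- the default booster, stated explicitly
  , Iterations 200             -- boosting iterations (default is 100)
  , MaxDepth 8                 -- limit tree depth
  , EarlyStoppingRounds 10     -- stop if validation metric stalls
  ]
```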

Synopsis

Parameters

data Param Source #

Parameters control the behavior of LightGBM.

Constructors

Objective Application

Regression, binary classification, etc.

BoostingType Booster

Booster to apply - GBDT by default

TrainingData FilePath

Path to training data

ValidationData [FilePath]

Paths to validation data files (supports multi-validation)

PredictionData FilePath

Path to data to use for a prediction

Iterations Natural

Number of boosting iterations - default is 100

LearningRate PositiveDouble

Scale how quickly parameters change in training

NumLeaves PositiveInt

Maximum number of leaves in one tree

Parallelism ParallelismStyle

Called tree_learner in the LightGBM docs

NumThreads Natural

Number of threads for LightGBM to use

Device Device

GPU or CPU

RandomSeed Int

A random seed used to generate other random seeds

MaxDepth Natural

Limit the depth of the tree model

MinDataInLeaf Natural

Minimum data count in a leaf

MinSumHessianInLeaf NonNegativeDouble

Minimal sum of the Hessian in one leaf

BaggingFraction LeftOpenProperFraction 
BaggingFreq PositiveInt 
BaggingFractionSeed Int 
FeatureFraction LeftOpenProperFraction 
FeatureFractionSeed Int 
EarlyStoppingRounds PositiveInt

Stop training if a validation metric doesn't improve in the last n rounds

Regularization_L1 NonNegativeDouble 
Regularization_L2 NonNegativeDouble 
MaxDeltaStep PositiveDouble 
MinSplitGain NonNegativeDouble 
MinDataPerGroup PositiveInt

Minimum number of data points per categorical group

MaxCatThreshold PositiveInt 
CatSmooth NonNegativeDouble 
CatL2 NonNegativeDouble

L2 regularization in categorical split

MaxCatToOneHot PositiveInt 
TopK PositiveInt

Used only with voting parallelism (VotingPar)

MonotoneConstraint [Direction]

Length of directions = number of features

MaxBin IntGreaterThanOne 
MinDataInBin PositiveInt 
DataRandomSeed Int 
OutputModel FilePath

Where to persist the model after training

InputModel FilePath

Filepath to a persisted model to use for prediction or additional training

OutputResult FilePath

Where to persist the output of a prediction task

PrePartition Bool 
IsSparse Bool 
TwoRoundLoading Bool 
SaveBinary Bool 
Verbosity VerbosityLevel 
LabelColumn (ColumnSelector Natural)

Which column has the labels

WeightColumn (ColumnSelector Natural)

Which column has the weights

QueryColumn (ColumnSelector Natural) 
IgnoreColumns [ColumnSelector Natural]

Select columns to ignore in training

CategoricalFeatures [ColumnSelector Int32]

Select columns to treat as categorical features

BinConstructSampleCount PositiveInt 
UseMissing Bool 
ZeroAsMissing Bool 
InitScoreFile FilePath 
ValidInitScoreFile [FilePath] 
ForcedSplits FilePath 
Sigmoid PositiveDouble

Used in Binary classification and LambdaRank

Alpha OpenProperFraction

Used in Huber loss and Quantile regression

BoostFromAverage Bool

Used only in RegressionL2 task

RegSqrt Bool

Only used in RegressionL2

Metric [Metric]

Loss Metric

MetricFreq PositiveInt 
TrainingMetric Bool 

Instances

Eq Param Source # 

Methods

(==) :: Param -> Param -> Bool #

(/=) :: Param -> Param -> Bool #

Show Param Source # 

Methods

showsPrec :: Int -> Param -> ShowS #

show :: Param -> String #

showList :: [Param] -> ShowS #

data PredictionParam Source #

Constructors

PredictRawScore Bool

True = raw scores only, False = transformed scores

PredictLeafIndex Bool

True = predict with leaf index

PredictContrib Bool

True = estimate how each feature contributes to the prediction

NumIterationsPredict Natural

How many trained iterations are used in prediction

PredEarlyStop Bool

True = use early stopping on the prediction (may degrade accuracy)

PredEarlyStopFreq Natural 
PredEarlyStopMargin Double 
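A hypothetical prediction configuration using these constructors might look like the following sketch (values are illustrative, not defaults):

```haskell
-- Illustrative prediction settings: raw scores with early stopping.
predParams :: [PredictionParam]
predParams =
  [ PredictRawScore True       -- emit raw scores, not transformed ones
  , PredictLeafIndex False
  , NumIterationsPredict 100   -- use only the first 100 trained iterations
  , PredEarlyStop True         -- may degrade accuracy, per the note above
  ]
```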

data Application Source #

LightGBM can be used for a variety of applications

Instances

Eq Application Source # 
Show Application Source # 
Generic Application Source # 

Associated Types

type Rep Application :: * -> * #

Hashable Application Source # 
type Rep Application Source # 

data Booster Source #

Different types of Boosting approaches

Constructors

GBDT

Gradient Boosting Decision Tree

RandomForest 
DART [DARTParam]

Dropouts meet Multiple Additive Regression Trees

GOSS [GOSSParam]

Gradient-based One-Sided Sampling
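The four booster variants above can be constructed directly; DART and GOSS carry their own parameter lists. A minimal sketch:

```haskell
-- GBDT is a bare constructor; DART takes a (possibly empty) list of
-- DARTParam overrides, leaving unspecified settings at library defaults.
defaultBooster, dartBooster :: Booster
defaultBooster = GBDT
dartBooster    = DART []
```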

Instances

Eq Booster Source # 

Methods

(==) :: Booster -> Booster -> Bool #

(/=) :: Booster -> Booster -> Bool #

Show Booster Source # 
Generic Booster Source # 

Associated Types

type Rep Booster :: * -> * #

Methods

from :: Booster -> Rep Booster x #

to :: Rep Booster x -> Booster #

Hashable Booster Source # 

Methods

hashWithSalt :: Int -> Booster -> Int #

hash :: Booster -> Int #

type Rep Booster Source # 
type Rep Booster = D1 * (MetaData "Booster" "LightGBM.Parameters" "HaskellGBM-0.1.0.0-JRqo4bDeYXSFlBt0g8iOcz" False) ((:+:) * ((:+:) * (C1 * (MetaCons "GBDT" PrefixI False) (U1 *)) (C1 * (MetaCons "RandomForest" PrefixI False) (U1 *))) ((:+:) * (C1 * (MetaCons "DART" PrefixI False) (S1 * (MetaSel (Nothing Symbol) NoSourceUnpackedness NoSourceStrictness DecidedLazy) (Rec0 * [DARTParam]))) (C1 * (MetaCons "GOSS" PrefixI False) (S1 * (MetaSel (Nothing Symbol) NoSourceUnpackedness NoSourceStrictness DecidedLazy) (Rec0 * [GOSSParam])))))

data DARTParam Source #

Parameters exclusively for the DART booster

Constructors

DropRate ProperFraction

Dropout rate

SkipDrop ProperFraction

Probability of skipping a drop

MaxDrop PositiveInt

Max number of dropped trees on one iteration

UniformDrop Bool 
XGBoostDARTMode Bool 
DropSeed Int 
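A sketch of overriding a few DART settings (restricted here to the plain Bool and Int fields; DropRate and SkipDrop take refined ProperFraction values that need the package's smart constructors):

```haskell
-- Illustrative DART overrides passed to the DART booster constructor.
dartParams :: [DARTParam]
dartParams =
  [ UniformDrop True       -- drop trees uniformly at random
  , XGBoostDARTMode False  -- stay with LightGBM's native DART behavior
  , DropSeed 42            -- make dropping reproducible
  ]
```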

Instances

Eq DARTParam Source # 
Show DARTParam Source # 
Generic DARTParam Source # 

Associated Types

type Rep DARTParam :: * -> * #

Hashable DARTParam Source # 
type Rep DARTParam Source # 

data Device Source #

Constructors

CPU 
GPU [GPUParam] 

Instances

Eq Device Source # 

Methods

(==) :: Device -> Device -> Bool #

(/=) :: Device -> Device -> Bool #

Show Device Source # 
Generic Device Source # 

Associated Types

type Rep Device :: * -> * #

Methods

from :: Device -> Rep Device x #

to :: Rep Device x -> Device #

Hashable Device Source # 

Methods

hashWithSalt :: Int -> Device -> Int #

hash :: Device -> Int #

type Rep Device Source # 
type Rep Device = D1 * (MetaData "Device" "LightGBM.Parameters" "HaskellGBM-0.1.0.0-JRqo4bDeYXSFlBt0g8iOcz" False) ((:+:) * (C1 * (MetaCons "CPU" PrefixI False) (U1 *)) (C1 * (MetaCons "GPU" PrefixI False) (S1 * (MetaSel (Nothing Symbol) NoSourceUnpackedness NoSourceStrictness DecidedLazy) (Rec0 * [GPUParam]))))

data Direction Source #

Instances

Eq Direction Source # 
Show Direction Source # 
Generic Direction Source # 

Associated Types

type Rep Direction :: * -> * #

Hashable Direction Source # 
type Rep Direction Source # 
type Rep Direction = D1 * (MetaData "Direction" "LightGBM.Parameters" "HaskellGBM-0.1.0.0-JRqo4bDeYXSFlBt0g8iOcz" False) ((:+:) * (C1 * (MetaCons "Increasing" PrefixI False) (U1 *)) ((:+:) * (C1 * (MetaCons "Decreasing" PrefixI False) (U1 *)) (C1 * (MetaCons "NoConstraint" PrefixI False) (U1 *))))
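The Direction constructors visible in the Rep instance above (Increasing, Decreasing, NoConstraint) feed the MonotoneConstraint parameter, which expects one direction per feature column. A sketch for a three-feature dataset:

```haskell
-- Hypothetical monotonicity constraints for a dataset with three features:
-- the first must increase with the prediction, the second is unconstrained,
-- the third must decrease.
monotonicity :: Param
monotonicity = MonotoneConstraint [Increasing, NoConstraint, Decreasing]
```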

data Metric Source #

Instances

Eq Metric Source # 

Methods

(==) :: Metric -> Metric -> Bool #

(/=) :: Metric -> Metric -> Bool #

Show Metric Source # 
Generic Metric Source # 

Associated Types

type Rep Metric :: * -> * #

Methods

from :: Metric -> Rep Metric x #

to :: Rep Metric x -> Metric #

Hashable Metric Source # 

Methods

hashWithSalt :: Int -> Metric -> Int #

hash :: Metric -> Int #

type Rep Metric Source # 
type Rep Metric = D1 * (MetaData "Metric" "LightGBM.Parameters" "HaskellGBM-0.1.0.0-JRqo4bDeYXSFlBt0g8iOcz" False) ((:+:) * ((:+:) * ((:+:) * ((:+:) * (C1 * (MetaCons "MeanAbsoluteError" PrefixI False) (U1 *)) (C1 * (MetaCons "MeanSquareError" PrefixI False) (U1 *))) ((:+:) * (C1 * (MetaCons "L2_root" PrefixI False) (U1 *)) ((:+:) * (C1 * (MetaCons "QuantileRegression" PrefixI False) (U1 *)) (C1 * (MetaCons "MAPELoss" PrefixI False) (U1 *))))) ((:+:) * ((:+:) * (C1 * (MetaCons "HuberLoss" PrefixI False) (U1 *)) (C1 * (MetaCons "FairLoss" PrefixI False) (U1 *))) ((:+:) * (C1 * (MetaCons "PoissonNegLogLikelihood" PrefixI False) (U1 *)) ((:+:) * (C1 * (MetaCons "GammaNegLogLikelihood" PrefixI False) (U1 *)) (C1 * (MetaCons "GammaDeviance" PrefixI False) (U1 *)))))) ((:+:) * ((:+:) * ((:+:) * (C1 * (MetaCons "TweedieNegLogLiklihood" PrefixI False) (U1 *)) (C1 * (MetaCons "NDCG" PrefixI False) (S1 * (MetaSel (Nothing Symbol) NoSourceUnpackedness NoSourceStrictness DecidedLazy) (Rec0 * (Maybe NDCGEvalPositions))))) ((:+:) * (C1 * (MetaCons "MAP" PrefixI False) (U1 *)) ((:+:) * (C1 * (MetaCons "AUC" PrefixI False) (U1 *)) (C1 * (MetaCons "BinaryLogloss" PrefixI False) (U1 *))))) ((:+:) * ((:+:) * (C1 * (MetaCons "BinaryError" PrefixI False) (U1 *)) ((:+:) * (C1 * (MetaCons "MultiLogloss" PrefixI False) (U1 *)) (C1 * (MetaCons "MultiError" PrefixI False) (U1 *)))) ((:+:) * (C1 * (MetaCons "Xentropy" PrefixI False) (U1 *)) ((:+:) * (C1 * (MetaCons "XentLambda" PrefixI False) (U1 *)) (C1 * (MetaCons "KullbackLeibler" PrefixI False) (U1 *)))))))

data MultiClassStyle Source #

Multi-classification styles

Instances

Eq MultiClassStyle Source # 
Show MultiClassStyle Source # 
Generic MultiClassStyle Source # 
Hashable MultiClassStyle Source # 
type Rep MultiClassStyle Source # 
type Rep MultiClassStyle = D1 * (MetaData "MultiClassStyle" "LightGBM.Parameters" "HaskellGBM-0.1.0.0-JRqo4bDeYXSFlBt0g8iOcz" False) ((:+:) * (C1 * (MetaCons "MultiClassSimple" PrefixI False) (U1 *)) (C1 * (MetaCons "MultiClassOneVsAll" PrefixI False) (U1 *)))

data ParallelismParams Source #

Instances

Eq ParallelismParams Source # 
Show ParallelismParams Source # 
Generic ParallelismParams Source # 
Hashable ParallelismParams Source # 
type Rep ParallelismParams Source # 

data RegressionApp Source #

Different types of regression metrics

Constructors

L1

Absolute error metric

L2

RMS error metric

Huber 
Fair [FairRegressionParam] 
Poisson [PoissonRegressionParam] 
Quantile 
MAPE 
Gamma 
Tweedie [TweedieRegressionParam] 
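The regression flavours above are plain values except for Fair, Poisson, and Tweedie, which carry their own parameter lists. A minimal sketch:

```haskell
-- L1 is a bare constructor; Tweedie takes a (possibly empty) list of
-- TweedieRegressionParam overrides, leaving the rest at library defaults.
absoluteError, tweedieDefault :: RegressionApp
absoluteError  = L1
tweedieDefault = Tweedie []
```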

Instances

Eq RegressionApp Source # 
Show RegressionApp Source # 
Generic RegressionApp Source # 

Associated Types

type Rep RegressionApp :: * -> * #

Hashable RegressionApp Source # 
type Rep RegressionApp Source # 

data VerbosityLevel Source #

Constructors

Fatal 
Warn 
Info 

Instances

data XEApp Source #

Constructors

XEntropy 
XEntropyLambda 

Instances

Eq XEApp Source # 

Methods

(==) :: XEApp -> XEApp -> Bool #

(/=) :: XEApp -> XEApp -> Bool #

Show XEApp Source # 

Methods

showsPrec :: Int -> XEApp -> ShowS #

show :: XEApp -> String #

showList :: [XEApp] -> ShowS #

Generic XEApp Source # 

Associated Types

type Rep XEApp :: * -> * #

Methods

from :: XEApp -> Rep XEApp x #

to :: Rep XEApp x -> XEApp #

Hashable XEApp Source # 

Methods

hashWithSalt :: Int -> XEApp -> Int #

hash :: XEApp -> Int #

type Rep XEApp Source # 
type Rep XEApp = D1 * (MetaData "XEApp" "LightGBM.Parameters" "HaskellGBM-0.1.0.0-JRqo4bDeYXSFlBt0g8iOcz" False) ((:+:) * (C1 * (MetaCons "XEntropy" PrefixI False) (U1 *)) (C1 * (MetaCons "XEntropyLambda" PrefixI False) (U1 *)))

Utilities

data ColumnSelector a Source #

Some parameters are based on column selection either by index or by name. A ColumnSelector encapsulates this flexibility.

Constructors

Index a 
ColName String
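For example, the label column can be selected either way (the column name "price" here is purely illustrative):

```haskell
-- Selecting a column by zero-based index versus by header name.
byIndex, byName :: ColumnSelector Natural
byIndex = Index 0
byName  = ColName "price"
```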