class GBMRegressionModel extends RegressionModel[Vector, GBMRegressionModel] with GBMRegressorParams with MLWritable
- Source
- GBMRegressor.scala
Linear Supertypes
- GBMRegressionModel
- MLWritable
- GBMRegressorParams
- GBMParams
- HasSubBag
- HasSeed
- BoostingParams
- HasAggregationDepth
- HasCheckpointInterval
- HasBaseLearner
- HasWeightCol
- HasNumBaseLearners
- HasValidationIndicatorCol
- HasTol
- HasMaxIter
- RegressionModel
- PredictionModel
- PredictorParams
- HasPredictionCol
- HasFeaturesCol
- HasLabelCol
- Model
- Transformer
- PipelineStage
- Logging
- Params
- Serializable
- Identifiable
- AnyRef
- Any
Instance Constructors
- new GBMRegressionModel(weights: Array[Double], subspaces: Array[Array[Int]], models: Array[EnsemblePredictionModelType], init: EnsemblePredictionModelType)
- new GBMRegressionModel(uid: String, weights: Array[Double], subspaces: Array[Array[Int]], models: Array[EnsemblePredictionModelType], init: EnsemblePredictionModelType)
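A GBMRegressionModel is normally obtained by fitting a GBMRegressor rather than by calling these constructors directly. The sketch below is illustrative only: the package and the estimator's setter names (setBaseLearner, setNumBaseLearners, setLearningRate) are assumed from the params documented on this page, not taken verbatim from the library.

```scala
// Illustrative sketch; package and setter names are assumed, not verified.
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.ml.regression.{DecisionTreeRegressor, GBMRegressor, GBMRegressionModel}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("gbm-sketch").getOrCreate()
import spark.implicits._

// Toy training data: a Double label column and a Vector features column.
val train = Seq(
  (1.0, Vectors.dense(0.0, 1.1)),
  (2.0, Vectors.dense(1.0, 0.5)),
  (3.0, Vectors.dense(2.0, 0.1))
).toDF("label", "features")

val gbm = new GBMRegressor()
  .setBaseLearner(new DecisionTreeRegressor()) // baseLearner (HasBaseLearner)
  .setNumBaseLearners(10)                      // numBaseLearners
  .setLearningRate(0.1)                        // learningRate

// fit produces the GBMRegressionModel documented on this page.
val model: GBMRegressionModel = gbm.fit(train)
```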
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def $[T](param: Param[T]): T
- Attributes
- protected
- Definition Classes
- Params
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final val aggregationDepth: IntParam
- Definition Classes
- HasAggregationDepth
- val alpha: Param[Double]
The alpha-quantile of the huber loss function and the quantile loss function. Only used when loss = "huber" or loss = "quantile". (default = 0.9)
- Definition Classes
- GBMRegressorParams
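As a sketch of where alpha applies, the snippet below configures the huber loss on the estimator; the setLoss and setAlpha setter names are assumed from the param names on this page.

```scala
// Sketch: setLoss / setAlpha are assumed setter names for the loss and alpha params.
val huberGbm = new GBMRegressor()
  .setLoss("huber")
  .setAlpha(0.9) // alpha is only used when loss is "huber" or "quantile"
```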
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- val baseLearner: Param[EnsembleRegressorType]
param for the estimator that will be used by the ensemble learner as a base learner
- Definition Classes
- HasBaseLearner
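The sketch below swaps in a LinearRegression base learner; it assumes a setBaseLearner setter on GBMRegressor and that LinearRegression satisfies the EnsembleRegressorType alias.

```scala
// Sketch: assumes LinearRegression is a valid EnsembleRegressorType and setBaseLearner exists.
import org.apache.spark.ml.regression.LinearRegression

val linearBoostedGbm = new GBMRegressor()
  .setBaseLearner(new LinearRegression().setMaxIter(10))
```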
- final val checkpointInterval: IntParam
- Definition Classes
- HasCheckpointInterval
- final def clear(param: Param[_]): GBMRegressionModel.this.type
- Definition Classes
- Params
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def copy(extra: ParamMap): GBMRegressionModel
- Definition Classes
- GBMRegressionModel → Model → Transformer → PipelineStage → Params
- def copyValues[T <: Params](to: T, extra: ParamMap): T
- Attributes
- protected
- Definition Classes
- Params
- final def defaultCopy[T <: Params](extra: ParamMap): T
- Attributes
- protected
- Definition Classes
- Params
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def explainParam(param: Param[_]): String
- Definition Classes
- Params
- def explainParams(): String
- Definition Classes
- Params
- def extractInstances(dataset: Dataset[_], validateInstance: (Instance) => Unit): RDD[Instance]
- Attributes
- protected
- Definition Classes
- PredictorParams
- def extractInstances(dataset: Dataset[_]): RDD[Instance]
- Attributes
- protected
- Definition Classes
- PredictorParams
- final def extractParamMap(): ParamMap
- Definition Classes
- Params
- final def extractParamMap(extra: ParamMap): ParamMap
- Definition Classes
- Params
- final val featuresCol: Param[String]
- Definition Classes
- HasFeaturesCol
- def featuresDataType: DataType
- Attributes
- protected
- Definition Classes
- PredictionModel
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- def fitBaseLearner(baseLearner: EnsembleRegressorType, labelColName: String, featuresColName: String, predictionColName: String, weightColName: Option[String])(df: DataFrame): EnsemblePredictionModelType
- Attributes
- protected
- Definition Classes
- HasBaseLearner
- final def get[T](param: Param[T]): Option[T]
- Definition Classes
- Params
- final def getAggregationDepth: Int
- Definition Classes
- HasAggregationDepth
- def getAlpha: Double
- Definition Classes
- GBMRegressorParams
- def getBaseLearner: EnsembleRegressorType
- Definition Classes
- HasBaseLearner
- final def getCheckpointInterval: Int
- Definition Classes
- HasCheckpointInterval
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def getDefault[T](param: Param[T]): Option[T]
- Definition Classes
- Params
- final def getFeaturesCol: String
- Definition Classes
- HasFeaturesCol
- def getInitStrategy: String
- Definition Classes
- GBMRegressorParams
- final def getLabelCol: String
- Definition Classes
- HasLabelCol
- def getLearningRate: Double
- Definition Classes
- GBMParams
- def getLoss: String
- Definition Classes
- GBMRegressorParams
- final def getMaxIter: Int
- Definition Classes
- HasMaxIter
- def getNumBaseLearners: Int
- Definition Classes
- HasNumBaseLearners
- def getNumRounds: Int
- Definition Classes
- GBMParams
- def getOptimizedWeights: Boolean
- Definition Classes
- GBMParams
- final def getOrDefault[T](param: Param[T]): T
- Definition Classes
- Params
- def getParam(paramName: String): Param[Any]
- Definition Classes
- Params
- final def getPredictionCol: String
- Definition Classes
- HasPredictionCol
- def getReplacement: Boolean
- Definition Classes
- HasSubBag
- final def getSeed: Long
- Definition Classes
- HasSeed
- def getSubsampleRatio: Double
- Definition Classes
- HasSubBag
- def getSubspaceRatio: Double
- Definition Classes
- HasSubBag
- final def getTol: Double
- Definition Classes
- HasTol
- def getUpdates: String
- Definition Classes
- GBMParams
- final def getValidationIndicatorCol: String
- Definition Classes
- HasValidationIndicatorCol
- final def getValidationTol: Double
- Definition Classes
- GBMParams
- final def getWeightCol: String
- Definition Classes
- HasWeightCol
- final def hasDefault[T](param: Param[T]): Boolean
- Definition Classes
- Params
- def hasParam(paramName: String): Boolean
- Definition Classes
- Params
- def hasParent: Boolean
- Definition Classes
- Model
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- val init: EnsemblePredictionModelType
- val initStrategy: Param[String]
Strategy for the initial predictions: a constant optimized for the loss being minimized, zero, or a base learner fit on the labels. (case-insensitive) Supported: "constant", "zero", "base". (default = constant)
- Definition Classes
- GBMRegressorParams
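For example, starting the boosting from zero predictions instead of the optimized constant; the setInitStrategy setter name is assumed.

```scala
// Sketch: "zero" starts boosting from 0.0 predictions; setter name assumed.
val zeroInitGbm = new GBMRegressor().setInitStrategy("zero")
```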
- def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
- Attributes
- protected
- Definition Classes
- Logging
- def initializeLogIfNecessary(isInterpreter: Boolean): Unit
- Attributes
- protected
- Definition Classes
- Logging
- final def isDefined(param: Param[_]): Boolean
- Definition Classes
- Params
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def isSet(param: Param[_]): Boolean
- Definition Classes
- Params
- def isTraceEnabled(): Boolean
- Attributes
- protected
- Definition Classes
- Logging
- final val labelCol: Param[String]
- Definition Classes
- HasLabelCol
- val learningRate: Param[Double]
param for the learning rate of the algorithm
- Definition Classes
- GBMParams
- def log: Logger
- Attributes
- protected
- Definition Classes
- Logging
- def logDebug(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logDebug(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logError(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logError(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logInfo(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logInfo(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logName: String
- Attributes
- protected
- Definition Classes
- Logging
- def logTrace(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logTrace(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logWarning(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logWarning(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- val loss: Param[String]
Loss function which GBM tries to minimize. (case-insensitive) Supported: "squared", "absolute", "huber", "quantile". (default = squared)
- Definition Classes
- GBMRegressorParams
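Combining the quantile loss with alpha gives quantile regression; the snippet below is a sketch that assumes setLoss and setAlpha setters on GBMRegressor.

```scala
// Sketch: median (0.5-quantile) regression via the quantile loss; setters assumed.
val medianGbm = new GBMRegressor()
  .setLoss("quantile")
  .setAlpha(0.5)
```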
- final val maxIter: IntParam
- Definition Classes
- HasMaxIter
- val models: Array[EnsemblePredictionModelType]
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- val numBaseLearners: Param[Int]
param for the number of base learners of the algorithm
- Definition Classes
- HasNumBaseLearners
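numBaseLearners and learningRate are usually tuned together: more base learners with a smaller learning rate trades training time for smoother convergence. A sketch with assumed setter names:

```scala
// Sketch: a longer, more conservative boosting run; setter names assumed.
val slowGbm = new GBMRegressor()
  .setNumBaseLearners(200)
  .setLearningRate(0.05)
```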
- def numFeatures: Int
- Definition Classes
- PredictionModel
- Annotations
- @Since("1.6.0")
- val numModels: Int
- val numRounds: Param[Int]
param for the number of rounds to wait for the next decrease of the error on the validation set
- Definition Classes
- GBMParams
- val optimizedWeights: Param[Boolean]
param for using optimized weights in GBM
- Definition Classes
- GBMParams
- lazy val params: Array[Param[_]]
- Definition Classes
- Params
- var parent: Estimator[GBMRegressionModel]
- Definition Classes
- Model
- def predict(features: Vector): Double
- Definition Classes
- GBMRegressionModel → PredictionModel
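predict scores a single feature vector directly, bypassing the DataFrame API; for example, reusing the model fitted in the constructor-section sketch:

```scala
// Single-vector prediction on a fitted model (from the earlier sketch).
import org.apache.spark.ml.linalg.Vectors

val yHat: Double = model.predict(Vectors.dense(1.0, 0.5))
```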
- final val predictionCol: Param[String]
- Definition Classes
- HasPredictionCol
- val replacement: Param[Boolean]
param for whether samples are drawn with replacement
- Definition Classes
- HasSubBag
- def save(path: String): Unit
- Definition Classes
- MLWritable
- Annotations
- @Since("1.6.0") @throws("If the input path already exists but overwrite is not enabled.")
- final val seed: LongParam
- Definition Classes
- HasSeed
- final def set(paramPair: ParamPair[_]): GBMRegressionModel.this.type
- Attributes
- protected
- Definition Classes
- Params
- final def set(param: String, value: Any): GBMRegressionModel.this.type
- Attributes
- protected
- Definition Classes
- Params
- final def set[T](param: Param[T], value: T): GBMRegressionModel.this.type
- Definition Classes
- Params
- final def setDefault(paramPairs: ParamPair[_]*): GBMRegressionModel.this.type
- Attributes
- protected
- Definition Classes
- Params
- final def setDefault[T](param: Param[T], value: T): GBMRegressionModel.this.type
- Attributes
- protected
- Definition Classes
- Params
- def setFeaturesCol(value: String): GBMRegressionModel
- Definition Classes
- PredictionModel
- def setParent(parent: Estimator[GBMRegressionModel]): GBMRegressionModel
- Definition Classes
- Model
- def setPredictionCol(value: String): GBMRegressionModel
- Definition Classes
- PredictionModel
- def slice(indices: Array[Int]): (Vector) => Vector
- Attributes
- protected
- Definition Classes
- HasSubBag
- val subsampleRatio: Param[Double]
param for the ratio of rows sampled out of the dataset
- Definition Classes
- HasSubBag
- def subspace(subspaceRatio: Double, numFeatures: Int, seed: Long): Array[Int]
- Attributes
- protected
- Definition Classes
- HasSubBag
- val subspaceRatio: Param[Double]
param for the ratio of features (columns) sampled into each base learner's subspace
- Definition Classes
- HasSubBag
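Row subsampling (subsampleRatio, replacement) and per-learner feature subspacing (subspaceRatio) can be combined on the estimator; a sketch with assumed setter names:

```scala
// Sketch: each base learner sees 80% of rows (without replacement) and 70% of features.
val baggedGbm = new GBMRegressor()
  .setSubsampleRatio(0.8)
  .setReplacement(false)
  .setSubspaceRatio(0.7)
```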
- val subspaces: Array[Array[Int]]
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- Identifiable → AnyRef → Any
- final val tol: DoubleParam
- Definition Classes
- HasTol
- def transform(dataset: Dataset[_]): DataFrame
- Definition Classes
- PredictionModel → Transformer
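transform appends the prediction column (named by predictionCol, default "prediction") to the input; for example, with the model and training DataFrame from the earlier sketch:

```scala
// Batch scoring: adds the "prediction" column to the input DataFrame.
val scored = model.transform(train)
scored.select("features", "label", "prediction").show()
```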
- def transform(dataset: Dataset[_], paramMap: ParamMap): DataFrame
- Definition Classes
- Transformer
- Annotations
- @Since("2.0.0")
- def transform(dataset: Dataset[_], firstParamPair: ParamPair[_], otherParamPairs: ParamPair[_]*): DataFrame
- Definition Classes
- Transformer
- Annotations
- @Since("2.0.0") @varargs()
- def transformImpl(dataset: Dataset[_]): DataFrame
- Attributes
- protected
- Definition Classes
- PredictionModel
- def transformSchema(schema: StructType): StructType
- Definition Classes
- PredictionModel → PipelineStage
- def transformSchema(schema: StructType, logging: Boolean): StructType
- Attributes
- protected
- Definition Classes
- PipelineStage
- Annotations
- @DeveloperApi()
- val uid: String
- Definition Classes
- GBMRegressionModel → Identifiable
- val updates: Param[String]
Newton (using the Hessian) or gradient updates. (case-insensitive) Supported: "gradient", "newton". (default = gradient)
- Definition Classes
- GBMParams
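For example, switching to second-order updates that use the Hessian of the loss; the setUpdates setter name is assumed.

```scala
// Sketch: Newton-style updates; setUpdates is an assumed setter for the updates param.
val newtonGbm = new GBMRegressor().setUpdates("newton")
```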
- def validateAndTransformSchema(schema: StructType, fitting: Boolean, featuresDataType: DataType): StructType
- Attributes
- protected
- Definition Classes
- PredictorParams
- final val validationIndicatorCol: Param[String]
- Definition Classes
- HasValidationIndicatorCol
- final val validationTol: DoubleParam
Threshold for stopping early when fit with validation is used. (This parameter is ignored when fit without validation is used.) Early stopping is decided as follows: if the current loss on the validation set is greater than 0.01, the change in validation error is compared to a relative tolerance of validationTol * (current loss on the validation set); if the current loss on the validation set is less than or equal to 0.01, the change in validation error is compared to an absolute tolerance of validationTol * 0.01.
- Definition Classes
- GBMParams
- See also
validationIndicatorCol
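A sketch of early stopping with a validation split: the boolean column named by validationIndicatorCol marks validation rows, while numRounds and validationTol control when boosting stops. Setter names are assumed.

```scala
// Sketch: early stopping on a validation split; setter names assumed.
val earlyStopGbm = new GBMRegressor()
  .setValidationIndicatorCol("isValidation") // boolean column: true = validation row
  .setNumRounds(5)                           // rounds to wait for improvement
  .setValidationTol(0.01)
```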
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final val weightCol: Param[String]
- Definition Classes
- HasWeightCol
- val weights: Array[Double]
- def write: MLWriter
- Definition Classes
- GBMRegressionModel → MLWritable