class GBMRegressor extends Regressor[Vector, GBMRegressor, GBMRegressionModel] with GBMRegressorParams with MLWritable
- Source
- GBMRegressor.scala
- By Inheritance
- GBMRegressor
- MLWritable
- GBMRegressorParams
- GBMParams
- HasSubBag
- HasSeed
- BoostingParams
- HasAggregationDepth
- HasCheckpointInterval
- HasBaseLearner
- HasWeightCol
- HasNumBaseLearners
- HasValidationIndicatorCol
- HasTol
- HasMaxIter
- Regressor
- Predictor
- PredictorParams
- HasPredictionCol
- HasFeaturesCol
- HasLabelCol
- Estimator
- PipelineStage
- Logging
- Params
- Serializable
- Identifiable
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def $[T](param: Param[T]): T
- Attributes
- protected
- Definition Classes
- Params
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final val aggregationDepth: IntParam
- Definition Classes
- HasAggregationDepth
- val alpha: Param[Double]
The alpha-quantile of the Huber loss function and the quantile loss function. Only used when loss is "huber" or "quantile". (default = 0.9)
- Definition Classes
- GBMRegressorParams
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- val baseLearner: Param[EnsembleRegressorType]
Param for the estimator that the ensemble uses as its base learner.
- Definition Classes
- HasBaseLearner
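Using the setters listed on this page, a minimal configuration sketch might look like the following (assuming Spark's DecisionTreeRegressor satisfies EnsembleRegressorType; the depth, learner count, and the commented-out DataFrame are illustrative placeholders):

```scala
import org.apache.spark.ml.regression.DecisionTreeRegressor

// Shallow trees are the usual weak learner for gradient boosting.
val gbm = new GBMRegressor()
  .setBaseLearner(new DecisionTreeRegressor().setMaxDepth(3))
  .setNumBaseLearners(50) // number of boosting stages
  .setLearningRate(0.1)   // shrinkage applied to each stage

// val model = gbm.fit(trainingDf) // trainingDf: DataFrame with label/features columns
```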
- final val checkpointInterval: IntParam
- Definition Classes
- HasCheckpointInterval
- final def clear(param: Param[_]): GBMRegressor.this.type
- Definition Classes
- Params
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def copy(extra: ParamMap): GBMRegressor
- Definition Classes
- GBMRegressor → Predictor → Estimator → PipelineStage → Params
- def copyValues[T <: Params](to: T, extra: ParamMap): T
- Attributes
- protected
- Definition Classes
- Params
- final def defaultCopy[T <: Params](extra: ParamMap): T
- Attributes
- protected
- Definition Classes
- Params
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def explainParam(param: Param[_]): String
- Definition Classes
- Params
- def explainParams(): String
- Definition Classes
- Params
- def extractInstances(dataset: Dataset[_], validateInstance: (Instance) => Unit): RDD[Instance]
- Attributes
- protected
- Definition Classes
- PredictorParams
- def extractInstances(dataset: Dataset[_]): RDD[Instance]
- Attributes
- protected
- Definition Classes
- PredictorParams
- def extractLabeledPoints(dataset: Dataset[_]): RDD[LabeledPoint]
- Attributes
- protected
- Definition Classes
- Predictor
- final def extractParamMap(): ParamMap
- Definition Classes
- Params
- final def extractParamMap(extra: ParamMap): ParamMap
- Definition Classes
- Params
- final val featuresCol: Param[String]
- Definition Classes
- HasFeaturesCol
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- def fit(dataset: Dataset[_]): GBMRegressionModel
- Definition Classes
- Predictor → Estimator
- def fit(dataset: Dataset[_], paramMaps: Seq[ParamMap]): Seq[GBMRegressionModel]
- Definition Classes
- Estimator
- Annotations
- @Since("2.0.0")
- def fit(dataset: Dataset[_], paramMap: ParamMap): GBMRegressionModel
- Definition Classes
- Estimator
- Annotations
- @Since("2.0.0")
- def fit(dataset: Dataset[_], firstParamPair: ParamPair[_], otherParamPairs: ParamPair[_]*): GBMRegressionModel
- Definition Classes
- Estimator
- Annotations
- @Since("2.0.0") @varargs()
- def fitBaseLearner(baseLearner: EnsembleRegressorType, labelColName: String, featuresColName: String, predictionColName: String, weightColName: Option[String])(df: DataFrame): EnsemblePredictionModelType
- Attributes
- protected
- Definition Classes
- HasBaseLearner
- final def get[T](param: Param[T]): Option[T]
- Definition Classes
- Params
- final def getAggregationDepth: Int
- Definition Classes
- HasAggregationDepth
- def getAlpha: Double
- Definition Classes
- GBMRegressorParams
- def getBaseLearner: EnsembleRegressorType
- Definition Classes
- HasBaseLearner
- final def getCheckpointInterval: Int
- Definition Classes
- HasCheckpointInterval
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def getDefault[T](param: Param[T]): Option[T]
- Definition Classes
- Params
- final def getFeaturesCol: String
- Definition Classes
- HasFeaturesCol
- def getInitStrategy: String
- Definition Classes
- GBMRegressorParams
- final def getLabelCol: String
- Definition Classes
- HasLabelCol
- def getLearningRate: Double
- Definition Classes
- GBMParams
- def getLoss: String
- Definition Classes
- GBMRegressorParams
- final def getMaxIter: Int
- Definition Classes
- HasMaxIter
- def getNumBaseLearners: Int
- Definition Classes
- HasNumBaseLearners
- def getNumRounds: Int
- Definition Classes
- GBMParams
- def getOptimizedWeights: Boolean
- Definition Classes
- GBMParams
- final def getOrDefault[T](param: Param[T]): T
- Definition Classes
- Params
- def getParam(paramName: String): Param[Any]
- Definition Classes
- Params
- final def getPredictionCol: String
- Definition Classes
- HasPredictionCol
- def getReplacement: Boolean
- Definition Classes
- HasSubBag
- final def getSeed: Long
- Definition Classes
- HasSeed
- def getSubsampleRatio: Double
- Definition Classes
- HasSubBag
- def getSubspaceRatio: Double
- Definition Classes
- HasSubBag
- final def getTol: Double
- Definition Classes
- HasTol
- def getUpdates: String
- Definition Classes
- GBMParams
- final def getValidationIndicatorCol: String
- Definition Classes
- HasValidationIndicatorCol
- final def getValidationTol: Double
- Definition Classes
- GBMParams
- final def getWeightCol: String
- Definition Classes
- HasWeightCol
- final def hasDefault[T](param: Param[T]): Boolean
- Definition Classes
- Params
- def hasParam(paramName: String): Boolean
- Definition Classes
- Params
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- val initStrategy: Param[String]
Strategy for the initial predictions: a constant optimized for the minimized loss, zero, or a base learner fit on the labels. (case-insensitive) Supported: "constant", "zero", "base". (default = "constant")
- Definition Classes
- GBMRegressorParams
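For illustration, the non-default strategies can be selected with a single setter call (a sketch; which strategy helps depends on the data):

```scala
// "constant" (default): start from a constant minimizing the loss;
// "zero": start from 0; "base": fit one base learner on the raw labels.
val gbmZeroInit = new GBMRegressor().setInitStrategy("zero")
val gbmBaseInit = new GBMRegressor().setInitStrategy("base")
```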
- def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
- Attributes
- protected
- Definition Classes
- Logging
- def initializeLogIfNecessary(isInterpreter: Boolean): Unit
- Attributes
- protected
- Definition Classes
- Logging
- final def isDefined(param: Param[_]): Boolean
- Definition Classes
- Params
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def isSet(param: Param[_]): Boolean
- Definition Classes
- Params
- def isTraceEnabled(): Boolean
- Attributes
- protected
- Definition Classes
- Logging
- final val labelCol: Param[String]
- Definition Classes
- HasLabelCol
- val learningRate: Param[Double]
Param for the learning rate (shrinkage) applied to each base learner's contribution.
- Definition Classes
- GBMParams
- def log: Logger
- Attributes
- protected
- Definition Classes
- Logging
- def logDebug(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logDebug(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logError(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logError(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logInfo(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logInfo(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logName: String
- Attributes
- protected
- Definition Classes
- Logging
- def logTrace(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logTrace(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logWarning(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logWarning(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- val loss: Param[String]
Loss function that GBM tries to minimize. (case-insensitive) Supported: "squared", "absolute", "huber", "quantile". (default = "squared")
- Definition Classes
- GBMRegressorParams
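As a sketch of how loss and alpha interact, quantile regression for the 90th percentile might be configured like this (per the alpha description above, alpha is only read for "huber" and "quantile"):

```scala
// Predict the conditional 90th percentile instead of the mean.
val quantileGbm = new GBMRegressor()
  .setLoss("quantile")
  .setAlpha(0.9)

// Robust regression: the Huber loss also reads alpha as its quantile.
val huberGbm = new GBMRegressor()
  .setLoss("huber")
  .setAlpha(0.9)
```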
- final val maxIter: IntParam
- Definition Classes
- HasMaxIter
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- val numBaseLearners: Param[Int]
Param for the number of base learners in the ensemble.
- Definition Classes
- HasNumBaseLearners
- val numRounds: Param[Int]
Param for the number of rounds to wait for the next decrease in validation error before stopping early.
- Definition Classes
- GBMParams
- val optimizedWeights: Param[Boolean]
Param for whether the weights of the base learners are optimized in GBM.
- Definition Classes
- GBMParams
- lazy val params: Array[Param[_]]
- Definition Classes
- Params
- final val predictionCol: Param[String]
- Definition Classes
- HasPredictionCol
- val replacement: Param[Boolean]
Param for whether samples are drawn with replacement.
- Definition Classes
- HasSubBag
- def save(path: String): Unit
- Definition Classes
- MLWritable
- Annotations
- @Since("1.6.0") @throws("If the input path already exists but overwrite is not enabled.")
- final val seed: LongParam
- Definition Classes
- HasSeed
- final def set(paramPair: ParamPair[_]): GBMRegressor.this.type
- Attributes
- protected
- Definition Classes
- Params
- final def set(param: String, value: Any): GBMRegressor.this.type
- Attributes
- protected
- Definition Classes
- Params
- final def set[T](param: Param[T], value: T): GBMRegressor.this.type
- Definition Classes
- Params
- def setAggregationDepth(value: Int): GBMRegressor.this.type
- def setAlpha(value: Double): GBMRegressor.this.type
- def setBaseLearner(value: EnsembleRegressorType): GBMRegressor.this.type
- def setCheckpointInterval(value: Int): GBMRegressor.this.type
- final def setDefault(paramPairs: ParamPair[_]*): GBMRegressor.this.type
- Attributes
- protected
- Definition Classes
- Params
- final def setDefault[T](param: Param[T], value: T): GBMRegressor.this.type
- Attributes
- protected
- Definition Classes
- Params
- def setFeaturesCol(value: String): GBMRegressor
- Definition Classes
- Predictor
- def setInitStrategy(value: String): GBMRegressor.this.type
- def setLabelCol(value: String): GBMRegressor
- Definition Classes
- Predictor
- def setLearningRate(value: Double): GBMRegressor.this.type
- def setLoss(value: String): GBMRegressor.this.type
- def setMaxIter(value: Int): GBMRegressor.this.type
- def setNumBaseLearners(value: Int): GBMRegressor.this.type
- def setNumRounds(value: Int): GBMRegressor.this.type
- def setOptimizedWeights(value: Boolean): GBMRegressor.this.type
- def setPredictionCol(value: String): GBMRegressor
- Definition Classes
- Predictor
- def setReplacement(value: Boolean): GBMRegressor.this.type
- def setSeed(value: Long): GBMRegressor.this.type
- def setSubsampleRatio(value: Double): GBMRegressor.this.type
- def setSubspaceRatio(value: Double): GBMRegressor.this.type
- def setTol(value: Double): GBMRegressor.this.type
- def setUpdates(value: String): GBMRegressor.this.type
- def setValidationIndicatorCol(value: String): GBMRegressor.this.type
- def setValidationTol(value: Double): GBMRegressor.this.type
- def setWeightCol(value: String): GBMRegressor.this.type
- def slice(indices: Array[Int]): (Vector) => Vector
- Attributes
- protected
- Definition Classes
- HasSubBag
- val subsampleRatio: Param[Double]
Param for the ratio of rows sampled from the dataset.
- Definition Classes
- HasSubBag
- def subspace(subspaceRatio: Double, numFeatures: Int, seed: Long): Array[Int]
- Attributes
- protected
- Definition Classes
- HasSubBag
- val subspaceRatio: Param[Double]
Param for the ratio of features (columns) sampled from the dataset.
- Definition Classes
- HasSubBag
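The two ratios above control row and feature subsampling independently; a stochastic-boosting sketch (the values are illustrative only):

```scala
// Each base learner trains on ~80% of the rows (drawn without
// replacement) and sees a random ~50% subset of the feature columns.
val stochasticGbm = new GBMRegressor()
  .setSubsampleRatio(0.8)
  .setReplacement(false)
  .setSubspaceRatio(0.5)
  .setSeed(42L) // fixes the sampling for reproducibility
```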
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- Identifiable → AnyRef → Any
- final val tol: DoubleParam
- Definition Classes
- HasTol
- def train(dataset: Dataset[_]): GBMRegressionModel
- Attributes
- protected
- Definition Classes
- GBMRegressor → Predictor
- def transformSchema(schema: StructType): StructType
- Definition Classes
- Predictor → PipelineStage
- def transformSchema(schema: StructType, logging: Boolean): StructType
- Attributes
- protected
- Definition Classes
- PipelineStage
- Annotations
- @DeveloperApi()
- val uid: String
- Definition Classes
- GBMRegressor → Identifiable
- val updates: Param[String]
Newton (using the Hessian) or gradient updates. (case-insensitive) Supported: "gradient", "newton". (default = "gradient")
- Definition Classes
- GBMParams
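Switching between update rules is a single setter call; a sketch (second-order "newton" updates use the Hessian of the chosen loss, while the default "gradient" updates are first-order):

```scala
// Default is first-order ("gradient") updates.
val newtonGbm = new GBMRegressor().setUpdates("newton")
```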
- def validateAndTransformSchema(schema: StructType, fitting: Boolean, featuresDataType: DataType): StructType
- Attributes
- protected
- Definition Classes
- PredictorParams
- final val validationIndicatorCol: Param[String]
- Definition Classes
- HasValidationIndicatorCol
- final val validationTol: DoubleParam
Threshold for stopping early when fitting with validation. (This parameter is ignored when fitting without validation.) The decision to stop early is made as follows: if the current loss on the validation set is greater than 0.01, the difference in validation error is compared to the relative tolerance validationTol * (current validation loss); if the current validation loss is less than or equal to 0.01, the difference is compared to the absolute tolerance validationTol * 0.01.
- Definition Classes
- GBMParams
- See also
validationIndicatorCol
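Putting the early-stopping pieces together, a configuration sketch might look like this (the column name "isVal" is a placeholder; rows where it is true are held out as the validation set):

```scala
// Stop once `numRounds` consecutive rounds fail to improve the
// validation loss by more than the tolerance described above.
val earlyStopGbm = new GBMRegressor()
  .setValidationIndicatorCol("isVal")
  .setValidationTol(0.01)
  .setNumRounds(5)
```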
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final val weightCol: Param[String]
- Definition Classes
- HasWeightCol
- def write: MLWriter
- Definition Classes
- GBMRegressor → MLWritable