Model#
- class pyspark.ml.Model#
Abstract class for models that are fitted by estimators.
New in version 1.4.0.
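For orientation, here is a minimal sketch of how a Model comes into being: an Estimator's fit() produces a Model, whose transform() then scores data. It uses LogisticRegression as one concrete example and assumes a running SparkSession.
```python
from pyspark.sql import SparkSession
from pyspark.ml import Model
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.getOrCreate()

# Tiny training set using the default "label"/"features" column names.
train = spark.createDataFrame(
    [(0.0, Vectors.dense(0.0, 1.0)),
     (1.0, Vectors.dense(1.0, 0.0))],
    ["label", "features"],
)

lr = LogisticRegression(maxIter=10)   # an Estimator
model = lr.fit(train)                 # fit() returns a Model subclass

print(isinstance(model, Model))       # True
model.transform(train).select("label", "prediction").show()
```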
Methods
- clear(param): Clears a param from the param map if it has been explicitly set.
- copy([extra]): Creates a copy of this instance with the same uid and some extra params.
- explainParam(param): Explains a single param and returns its name, doc, and optional default value and user-supplied value in a string.
- explainParams(): Returns the documentation of all params with their optional default values and user-supplied values.
- extractParamMap([extra]): Extracts the embedded default param values and user-supplied values, then merges them with the extra values from input into a flat param map, where the latter value is used when conflicts exist, i.e., with ordering: default param values < user-supplied values < extra.
- getOrDefault(param): Gets the value of a param in the user-supplied param map or its default value.
- getParam(paramName): Gets a param by its name.
- hasDefault(param): Checks whether a param has a default value.
- hasParam(paramName): Tests whether this instance contains a param with a given (string) name.
- isDefined(param): Checks whether a param is explicitly set by the user or has a default value.
- isSet(param): Checks whether a param is explicitly set by the user.
- set(param, value): Sets a parameter in the embedded param map.
- transform(dataset[, params]): Transforms the input dataset with optional parameters.
Attributes
- params: Returns all params ordered by name.
- uid: A unique id for the object.
Methods Documentation
- clear(param)#
Clears a param from the param map if it has been explicitly set.
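A short sketch of clear() together with the related state checks, using LogisticRegression (these param helpers are inherited and behave the same on estimators and models); assumes a running SparkSession:
```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()

lr = LogisticRegression(maxIter=5)
print(lr.isSet(lr.maxIter))         # True: maxIter was explicitly set
lr.clear(lr.maxIter)                # drop the user-supplied value
print(lr.isSet(lr.maxIter))         # False
print(lr.getOrDefault(lr.maxIter))  # 100, the built-in default
```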
- copy(extra=None)#
Creates a copy of this instance with the same uid and some extra params. The default implementation creates a shallow copy using copy.copy(), and then copies the embedded and extra parameters over and returns the copy. Subclasses should override this method if the default approach is not sufficient.
- Parameters
- extra : dict, optional
Extra parameters to copy to the new instance.
- Returns
- Params
Copy of this instance.
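A sketch of the copy semantics (same uid, extra params folded into the copy), assuming a running SparkSession:
```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()

lr = LogisticRegression(maxIter=5)
lr2 = lr.copy({lr.maxIter: 20})       # extra params applied to the copy

print(lr.uid == lr2.uid)              # True: the copy keeps the uid
print(lr.getOrDefault(lr.maxIter))    # 5: the original is untouched
print(lr2.getOrDefault(lr2.maxIter))  # 20
```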
- explainParam(param)#
Explains a single param and returns its name, doc, and optional default value and user-supplied value in a string.
- explainParams()#
Returns the documentation of all params with their optional default values and user-supplied values.
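To see both in action (the output shown is approximate; the exact wording comes from each param's doc string), assuming a running SparkSession:
```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()
lr = LogisticRegression(maxIter=5)

print(lr.explainParam(lr.maxIter))
# e.g. "maxIter: max number of iterations (>= 0). (default: 100, current: 5)"

print(lr.explainParams())  # one such line per param, newline-separated
```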
- extractParamMap(extra=None)#
Extracts the embedded default param values and user-supplied values, then merges them with the extra values from input into a flat param map, where the latter value is used when conflicts exist, i.e., with ordering: default param values < user-supplied values < extra.
- Parameters
- extra : dict, optional
extra param values
- Returns
- dict
merged param map
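A sketch of the merge ordering (default < user-supplied < extra), assuming a running SparkSession; regParam appears only to show a default surviving the merge:
```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()
lr = LogisticRegression(maxIter=5)          # user-supplied value

pm = lr.extractParamMap({lr.maxIter: 30})   # extra wins over user-supplied
print(pm[lr.maxIter])                       # 30
print(pm[lr.regParam])                      # 0.0, the untouched default
```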
- getOrDefault(param)#
Gets the value of a param in the user-supplied param map or its default value. Raises an error if neither is set.
- getParam(paramName)#
Gets a param by its name.
- hasDefault(param)#
Checks whether a param has a default value.
- hasParam(paramName)#
Tests whether this instance contains a param with a given (string) name.
- isDefined(param)#
Checks whether a param is explicitly set by the user or has a default value.
- isSet(param)#
Checks whether a param is explicitly set by the user.
- set(param, value)#
Sets a parameter in the embedded param map.
- transform(dataset, params=None)#
Transforms the input dataset with optional parameters.
New in version 1.3.0.
- Parameters
- dataset : pyspark.sql.DataFrame
input dataset
- params : dict, optional
an optional param map that overrides embedded params.
- Returns
- pyspark.sql.DataFrame
transformed dataset
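A minimal sketch of a call-time param override, assuming a running SparkSession; threshold is a LogisticRegressionModel param used here purely for illustration:
```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.getOrCreate()
train = spark.createDataFrame(
    [(0.0, Vectors.dense(0.0, 1.0)),
     (1.0, Vectors.dense(1.0, 0.0))],
    ["label", "features"],
)
model = LogisticRegression(maxIter=10).fit(train)

# Uses the params embedded in the model.
model.transform(train).select("features", "prediction").show()

# The param map overrides the embedded threshold for this call only.
model.transform(train, {model.threshold: 0.9}).select("prediction").show()
```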
Attributes Documentation
- params#
Returns all params ordered by name. The default implementation uses dir() to get all attributes of type Param.
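A quick way to inspect them, assuming a running SparkSession:
```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()
lr = LogisticRegression()

# Each entry is a Param object with .name, .doc, and a .parent uid.
for p in lr.params[:3]:
    print(p.name, "-", p.doc)
```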
- uid#
A unique id for the object.