Visibility: public. Uploaded 13-09-2022 by Perry.
sklearn==1.0.2
numpy>=1.14.6
scipy>=1.1.0
joblib>=0.11
threadpoolctl>=2.0.0
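The parameters listed below match the signature of ``sklearn.linear_model.Perceptron``; the underlying class name is not shown on this page, so treating the flow as a Perceptron is an assumption. A minimal sketch instantiating the estimator with the defaults listed below::

    from sklearn.linear_model import Perceptron

    # Assumed estimator; the values are the defaults documented on this page
    clf = Perceptron(
        penalty=None, alpha=0.0001, l1_ratio=0.15, fit_intercept=True,
        max_iter=1000, tol=1e-3, shuffle=True, verbose=0, eta0=1.0,
        n_jobs=None, random_state=0, early_stopping=False,
        validation_fraction=0.1, n_iter_no_change=5,
        class_weight=None, warm_start=False,
    )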
alpha
Constant that multiplies the regularization term if regularization is
used
default: 0.0001
class_weight
Preset for the class_weight fit parameter
Weights associated with classes. If not given, all classes
are supposed to have weight one
The "balanced" mode uses the values of y to automatically adjust
weights inversely proportional to class frequencies in the input data
as ``n_samples / (n_classes * np.bincount(y))``
default: null
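How the ``"balanced"`` weights work out in practice; a minimal sketch with a hypothetical label vector::

    import numpy as np

    # "balanced": n_samples / (n_classes * np.bincount(y))
    y = np.array([0, 0, 0, 1])                         # hypothetical imbalanced labels
    n_samples, n_classes = len(y), len(np.unique(y))
    weights = n_samples / (n_classes * np.bincount(y))
    print(dict(zip(np.unique(y), weights)))            # {0: 0.666..., 1: 2.0} -- minority class upweighted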
early_stopping
Whether to use early stopping to terminate training when validation
score is not improving. If set to True, it will automatically set aside
a stratified fraction of training data as validation and terminate
training when validation score is not improving by at least tol for
n_iter_no_change consecutive epochs
.. versionadded:: 0.20
default: false
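A minimal sketch of early stopping together with the related parameters below (``validation_fraction``, ``n_iter_no_change``, ``tol``), assuming the estimator is ``sklearn.linear_model.Perceptron`` and using synthetic data::

    from sklearn.datasets import make_classification
    from sklearn.linear_model import Perceptron

    X, y = make_classification(n_samples=500, random_state=0)
    clf = Perceptron(early_stopping=True, validation_fraction=0.1,
                     n_iter_no_change=5, tol=1e-3, random_state=0)
    clf.fit(X, y)
    print(clf.n_iter_)   # typically stops well before max_iter once the validation score plateaus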
eta0
Constant by which the updates are multiplied
default: 1.0
fit_intercept
Whether the intercept should be estimated or not. If False, the
data is assumed to be already centered
default: true
l1_ratio
The Elastic Net mixing parameter, with `0 <= l1_ratio <= 1`
`l1_ratio=0` corresponds to L2 penalty, `l1_ratio=1` to L1
Only used if `penalty='elasticnet'`
.. versionadded:: 0.24
default: 0.15
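A minimal sketch combining ``penalty='elasticnet'`` with ``l1_ratio`` (same Perceptron assumption, synthetic data)::

    from sklearn.datasets import make_classification
    from sklearn.linear_model import Perceptron

    X, y = make_classification(n_samples=300, n_features=20, random_state=0)
    # Elastic Net: a 15% L1 / 85% L2 mix, scaled by alpha
    clf = Perceptron(penalty='elasticnet', l1_ratio=0.15, alpha=1e-4, random_state=0)
    clf.fit(X, y)
    print((clf.coef_ == 0).sum())   # number of coefficients the L1 part pushed to exactly zero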
max_iter
The maximum number of passes over the training data (aka epochs)
It only impacts the behavior in the ``fit`` method, and not the
:meth:`partial_fit` method
.. versionadded:: 0.19
default: 1000
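The ``fit`` / ``partial_fit`` distinction mentioned above, sketched on synthetic data (Perceptron assumed)::

    import numpy as np
    from sklearn.linear_model import Perceptron

    rng = np.random.RandomState(0)
    X = rng.randn(200, 5)
    y = (X[:, 0] > 0).astype(int)

    clf = Perceptron(max_iter=1000, tol=1e-3)
    clf.fit(X, y)                                     # up to max_iter passes over the data

    clf_pf = Perceptron()
    clf_pf.partial_fit(X, y, classes=np.unique(y))    # exactly one pass; max_iter is ignored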
n_iter_no_change
Number of iterations with no improvement to wait before early stopping
.. versionadded:: 0.20
default: 5
n_jobs
The number of CPUs to use to do the OVA (One Versus All, for
multi-class problems) computation
``None`` means 1 unless in a :obj:`joblib.parallel_backend` context
``-1`` means using all processors. See :term:`Glossary <n_jobs>`
for more details
default: null
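``n_jobs`` only matters for multi-class problems, where one binary problem is fit per class; a minimal sketch on a three-class dataset (Perceptron assumed)::

    from sklearn.datasets import load_iris
    from sklearn.linear_model import Perceptron

    X, y = load_iris(return_X_y=True)            # 3 classes -> 3 one-vs-all problems
    clf = Perceptron(n_jobs=-1, random_state=0)  # fit the per-class problems in parallel
    clf.fit(X, y)
    print(clf.coef_.shape)                       # (3, 4): one weight vector per class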
penalty
The penalty (aka regularization term) to be used; one of ``'l2'``, ``'l1'``
or ``'elasticnet'``
default: null
random_state
Used to shuffle the training data, when ``shuffle`` is set to
``True``. Pass an int for reproducible output across multiple
function calls
See :term:`Glossary <random_state>`
default: 0
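A minimal reproducibility sketch: the same ``random_state`` gives the same shuffling order and therefore the same fitted weights (Perceptron assumed)::

    from sklearn.datasets import make_classification
    from sklearn.linear_model import Perceptron

    X, y = make_classification(random_state=0)
    a = Perceptron(shuffle=True, random_state=42).fit(X, y)
    b = Perceptron(shuffle=True, random_state=42).fit(X, y)
    assert (a.coef_ == b.coef_).all()   # identical shuffling -> identical model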
shuffle
Whether or not the training data should be shuffled after each epoch
default: true
tol
The stopping criterion. If it is not None, the iterations will stop
when (loss > previous_loss - tol)
.. versionadded:: 0.19
default: 0.001
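The stopping rule in plain numbers; the loss values below are hypothetical::

    tol = 1e-3
    previous_loss, loss = 0.5000, 0.4995      # improvement of only 0.0005, less than tol
    stop = loss > previous_loss - tol         # 0.4995 > 0.4990
    print(stop)                               # True -> iterations would stop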
validation_fraction
The proportion of training data to set aside as validation set for
early stopping. Must be between 0 and 1
Only used if early_stopping is True
.. versionadded:: 0.20
default: 0.1
verbose
The verbosity level
default: 0
warm_start
When set to True, reuse the solution of the previous call to fit as
initialization, otherwise, just erase the previous solution. See
:term:`the Glossary <warm_start>`.
default: false
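A minimal ``warm_start`` sketch (Perceptron assumed): the second ``fit`` call starts from the coefficients left by the first instead of re-initialising them::

    from sklearn.datasets import make_classification
    from sklearn.linear_model import Perceptron

    X, y = make_classification(n_samples=300, random_state=0)
    clf = Perceptron(warm_start=True, max_iter=5, tol=None, random_state=0)
    clf.fit(X, y)   # first call: starts from zero weights (may warn about convergence)
    clf.fit(X, y)   # second call: continues from the previous coef_ / intercept_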