Model description

This model is a scikit-learn Pipeline that one-hot encodes columns selected by a make_column_selector (remaining columns are passed through) and fits a GradientBoostingRegressor with random_state=42. It was trained on a fish dataset.

Intended uses & limitations

This model is intended for educational purposes.

Training Procedure

[More Information Needed]

Hyperparameters

| Hyperparameter | Value |
|----------------|-------|
| memory | |
| steps | [('columntransformer', ColumnTransformer(remainder='passthrough', transformers=[('onehotencoder', OneHotEncoder(handle_unknown='ignore', sparse=False), <sklearn.compose._column_transformer.make_column_selector object at 0x7c049c39ec20>)])), ('gradientboostingregressor', GradientBoostingRegressor(random_state=42))] |
| verbose | False |
| columntransformer | ColumnTransformer(remainder='passthrough', transformers=[('onehotencoder', OneHotEncoder(handle_unknown='ignore', sparse=False), <sklearn.compose._column_transformer.make_column_selector object at 0x7c049c39ec20>)]) |
| gradientboostingregressor | GradientBoostingRegressor(random_state=42) |
| columntransformer__n_jobs | |
| columntransformer__remainder | passthrough |
| columntransformer__sparse_threshold | 0.3 |
| columntransformer__transformer_weights | |
| columntransformer__transformers | [('onehotencoder', OneHotEncoder(handle_unknown='ignore', sparse=False), <sklearn.compose._column_transformer.make_column_selector object at 0x7c049c39ec20>)] |
| columntransformer__verbose | False |
| columntransformer__verbose_feature_names_out | True |
| columntransformer__onehotencoder | OneHotEncoder(handle_unknown='ignore', sparse=False) |
| columntransformer__onehotencoder__categories | auto |
| columntransformer__onehotencoder__drop | |
| columntransformer__onehotencoder__dtype | <class 'numpy.float64'> |
| columntransformer__onehotencoder__feature_name_combiner | concat |
| columntransformer__onehotencoder__handle_unknown | ignore |
| columntransformer__onehotencoder__max_categories | |
| columntransformer__onehotencoder__min_frequency | |
| columntransformer__onehotencoder__sparse | False |
| columntransformer__onehotencoder__sparse_output | True |
| gradientboostingregressor__alpha | 0.9 |
| gradientboostingregressor__ccp_alpha | 0.0 |
| gradientboostingregressor__criterion | friedman_mse |
| gradientboostingregressor__init | |
| gradientboostingregressor__learning_rate | 0.1 |
| gradientboostingregressor__loss | squared_error |
| gradientboostingregressor__max_depth | 3 |
| gradientboostingregressor__max_features | |
| gradientboostingregressor__max_leaf_nodes | |
| gradientboostingregressor__min_impurity_decrease | 0.0 |
| gradientboostingregressor__min_samples_leaf | 1 |
| gradientboostingregressor__min_samples_split | 2 |
| gradientboostingregressor__min_weight_fraction_leaf | 0.0 |
| gradientboostingregressor__n_estimators | 100 |
| gradientboostingregressor__n_iter_no_change | |
| gradientboostingregressor__random_state | 42 |
| gradientboostingregressor__subsample | 1.0 |
| gradientboostingregressor__tol | 0.0001 |
| gradientboostingregressor__validation_fraction | 0.1 |
| gradientboostingregressor__verbose | 0 |
| gradientboostingregressor__warm_start | False |
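
The same pipeline can be reconstructed from these hyperparameters. Below is a minimal sketch, assuming scikit-learn >= 1.2 (where `sparse_output` replaces the deprecated `sparse` flag shown above) and that the `make_column_selector` picks object-dtype (categorical) columns, which the repr in the table does not reveal; all variable names are illustrative:

```python
from sklearn.compose import ColumnTransformer, make_column_selector
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

# One-hot encode the selected (assumed: object-dtype) columns, pass the
# remaining columns through, then fit a seeded gradient-boosted regressor,
# mirroring the hyperparameters listed above.
pipeline = make_pipeline(
    ColumnTransformer(
        transformers=[(
            "onehotencoder",
            OneHotEncoder(handle_unknown="ignore", sparse_output=False),
            make_column_selector(dtype_include=object),  # selection criterion assumed
        )],
        remainder="passthrough",
    ),
    GradientBoostingRegressor(random_state=42),
)

# pipeline.fit(X, y)  # X: feature DataFrame, y: numeric target (both assumed)
```

Note that `make_pipeline` auto-names the steps `columntransformer` and `gradientboostingregressor`, matching the parameter prefixes in the table above.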

Model Plot

    Pipeline(steps=[('columntransformer',
                     ColumnTransformer(remainder='passthrough',
                                       transformers=[('onehotencoder',
                                                      OneHotEncoder(handle_unknown='ignore',
                                                                    sparse=False),
                                                      <sklearn.compose._column_transformer.make_column_selector object at 0x7c049c39ec20>)])),
                    ('gradientboostingregressor',
                     GradientBoostingRegressor(random_state=42))])

Evaluation Results

[More Information Needed]
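
As a placeholder, here is a minimal sketch of how such a regressor could be scored on a held-out split, reusing the `pipeline` and the assumed `X`/`y` from the sketch above; the split ratio and metrics are illustrative choices, not the card authors' protocol:

```python
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

# Hold out 20% of the rows, fit on the rest, and report standard
# regression metrics (illustrative, not reported results).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
pipeline.fit(X_train, y_train)
y_pred = pipeline.predict(X_test)
print("R^2:", r2_score(y_test, y_pred))
print("MAE:", mean_absolute_error(y_test, y_pred))
```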

How to Get Started with the Model

[More Information Needed]
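
A minimal loading sketch, assuming the trained pipeline was serialized with joblib and published as `model.pkl`; the artifact name, format, and the feature columns in the example row are assumptions, not stated in this card (a `.skops` artifact would be loaded with `skops.io.load` instead):

```python
import joblib
import pandas as pd

# Load the serialized pipeline (file name assumed).
pipeline = joblib.load("model.pkl")

# Hypothetical feature row; the real column names of the fish dataset
# used for training are not documented in this card.
sample = pd.DataFrame(
    [{"Species": "Bream", "Length": 30.0, "Height": 11.5, "Width": 4.3}]
)
print(pipeline.predict(sample))
```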

Model Card Authors

This model card is written by the following authors:

JP

Model Card Contact

You can contact the model card authors through the following channels: [More Information Needed]

Citation

Below you can find information related to citation.

BibTeX:

[More Information Needed]
