Welcome to Boost Loss documentation!
- Changelog
- v0.5.5 (2024-01-26)
- v0.5.4 (2023-12-31)
- v0.5.3 (2023-12-22)
- v0.5.2 (2023-12-20)
- v0.5.1 (2023-11-13)
- v0.5.0 (2023-11-07)
- v0.4.2 (2023-11-05)
- v0.4.1 (2023-11-05)
- v0.4.0 (2023-11-04)
- v0.3.6 (2023-10-24)
- v0.3.5 (2023-10-24)
- v0.3.4 (2023-10-23)
- v0.3.3 (2023-10-15)
- v0.3.2 (2023-10-11)
- v0.3.1 (2023-10-11)
- v0.3.0 (2023-10-11)
- v0.2.2 (2023-09-18)
- v0.2.1 (2023-09-18)
- v0.2.0 (2023-06-17)
- v0.1.1 (2023-06-16)
- v0.1.0 (2023-05-28)
- Contributing
Boost Loss
Utilities for easily using custom losses in CatBoost, LightGBM, and XGBoost. This sounds simple, but in practice it takes a surprising amount of work.
Installation
Install this via pip (or your favourite package manager):
```shell
pip install boost-loss
```
Usage
Basic Usage
```python
import numpy as np
from numpy.typing import NDArray

from boost_loss import LossBase


class L2Loss(LossBase):
    def loss(self, y_true: NDArray, y_pred: NDArray) -> NDArray:
        return (y_true - y_pred) ** 2 / 2

    def grad(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # dL/dy_pred
        return -(y_true - y_pred)

    def hess(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # d^2L/dy_pred^2
        return np.ones_like(y_true)
```
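As a standalone sanity check (not part of boost-loss), the `grad` and `hess` formulas for the L2 loss can be verified against central finite differences of `loss`:

```python
import numpy as np

# Verify grad = -(y_true - y_pred) and hess = 1 for the L2 loss
# using central finite differences of the loss itself.
rng = np.random.default_rng(0)
y_true = rng.normal(size=5)
y_pred = rng.normal(size=5)
eps = 1e-4

loss = lambda p: (y_true - p) ** 2 / 2
grad_fd = (loss(y_pred + eps) - loss(y_pred - eps)) / (2 * eps)
hess_fd = (loss(y_pred + eps) - 2 * loss(y_pred) + loss(y_pred - eps)) / eps**2

assert np.allclose(grad_fd, -(y_true - y_pred), atol=1e-6)
assert np.allclose(hess_fd, np.ones_like(y_true), atol=1e-4)
```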
```python
import lightgbm as lgb
from sklearn.datasets import load_diabetes  # load_boston was removed in scikit-learn 1.2

from boost_loss import apply_custom_loss

X, y = load_diabetes(return_X_y=True)
apply_custom_loss(lgb.LGBMRegressor(), L2Loss()).fit(X, y)
```
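Why does defining only `grad` and `hess` suffice? Gradient boosters fit each learner to the gradient and Hessian of the loss (Newton boosting) rather than to the loss itself. A toy illustration of that mechanism (not the boost-loss implementation) with a single-leaf learner:

```python
import numpy as np

# Illustrative toy: boosting consumes only grad and hess of the loss.
# With the L2 loss, repeated single-leaf Newton steps drive every
# prediction toward the mean of y_true.
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.zeros_like(y_true)

grad = lambda yt, yp: -(yt - yp)
hess = lambda yt, yp: np.ones_like(yt)

for _ in range(50):
    g, h = grad(y_true, y_pred), hess(y_true, y_pred)
    y_pred -= 0.3 * g.sum() / h.sum()  # learning rate 0.3, single-leaf Newton step

print(y_pred)  # every entry approaches the mean of y_true (2.5)
```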
Built-in losses are available. [1]
```python
from boost_loss.regression import LogCoshLoss
```
`torch.autograd` Loss [2]
```python
import torch

from boost_loss.torch import TorchLossBase


class L2LossTorch(TorchLossBase):
    def loss_torch(self, y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
        return (y_true - y_pred) ** 2 / 2
```
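Here only `loss_torch` is written by hand; the gradient and Hessian are presumably derived automatically via `torch.autograd`. The underlying mechanism can be sketched standalone (no boost-loss required):

```python
import torch

# Sketch: derive grad and the Hessian diagonal of the L2 loss with
# torch.autograd instead of writing them by hand.
y_true = torch.tensor([1.0, 2.0, 3.0])
y_pred = torch.tensor([1.5, 1.0, 2.0], requires_grad=True)

loss = ((y_true - y_pred) ** 2 / 2).sum()
(grad,) = torch.autograd.grad(loss, y_pred, create_graph=True)
hess = torch.stack([
    torch.autograd.grad(g, y_pred, retain_graph=True)[0][i]
    for i, g in enumerate(grad)
])

print(grad)  # equals -(y_true - y_pred)
print(hess)  # all ones, matching the hand-written hess above
```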
Contributors ✨
Thanks goes to these wonderful people (emoji key):
34j 💻 🤔 📖
This project follows the all-contributors specification. Contributions of any kind welcome!