Boost Loss

Utilities for easy use of custom losses in CatBoost, LightGBM, and XGBoost. This sounds simple, but in practice it takes a surprising amount of work.

Installation

Install this via pip (or your favourite package manager):

```shell
pip install boost-loss
```

Usage

Basic Usage

```python
import numpy as np
from numpy.typing import NDArray

from boost_loss import LossBase


class L2Loss(LossBase):
    def loss(self, y_true: NDArray, y_pred: NDArray) -> NDArray:
        return (y_true - y_pred) ** 2 / 2

    def grad(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # dL/dy_pred
        return -(y_true - y_pred)

    def hess(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # d^2L/dy_pred^2
        return np.ones_like(y_true)
```
```python
import lightgbm as lgb
from sklearn.datasets import load_diabetes

from boost_loss import apply_custom_loss

# load_boston was removed from scikit-learn (1.2+); use another regression dataset
X, y = load_diabetes(return_X_y=True)
apply_custom_loss(lgb.LGBMRegressor(), L2Loss()).fit(X, y)
```
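
Getting the sign convention right (`grad` is dL/dy_pred, i.e. `y_pred - y_true` for L2, not the reverse) is a common source of silently bad models. The snippet below is an illustrative sanity check in plain NumPy, not part of boost-loss: it compares `grad` and `hess` of the `L2Loss` defined above against central finite differences.

```python
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.normal(size=100)
y_pred = rng.normal(size=100)
loss, eps = L2Loss(), 1e-6

# central finite differences of loss w.r.t. y_pred, and of grad for the Hessian
grad_fd = (loss.loss(y_true, y_pred + eps) - loss.loss(y_true, y_pred - eps)) / (2 * eps)
hess_fd = (loss.grad(y_true, y_pred + eps) - loss.grad(y_true, y_pred - eps)) / (2 * eps)

np.testing.assert_allclose(loss.grad(y_true, y_pred), grad_fd, atol=1e-4)
np.testing.assert_allclose(loss.hess(y_true, y_pred), hess_fd, atol=1e-4)
```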

Built-in losses are also available.[^1]

```python
from boost_loss.regression import LogCoshLoss
```
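
These should plug into `apply_custom_loss` just like a hand-written loss; a minimal sketch reusing the estimator and data from above (assuming `LogCoshLoss` takes no constructor arguments):

```python
apply_custom_loss(lgb.LGBMRegressor(), LogCoshLoss()).fit(X, y)
```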
Custom losses can also be written in PyTorch with `TorchLossBase`: define only `loss_torch`, and the gradient and Hessian are derived for you via autograd.[^2]

```python
import torch

from boost_loss.torch import TorchLossBase


class L2LossTorch(TorchLossBase):
    def loss_torch(self, y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
        return (y_true - y_pred) ** 2 / 2
```
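
How can `grad` and `hess` fall out of `loss_torch` alone? One plausible mechanism (a sketch under that assumption, not necessarily boost-loss's actual implementation) is double backpropagation: because the loss is element-wise, differentiating the summed gradient a second time recovers the Hessian diagonal.

```python
import numpy as np
import torch


def grad_hess(loss_fn, y_true: np.ndarray, y_pred: np.ndarray):
    """Illustrative only: per-sample grad/hess of an element-wise torch loss."""
    y_true_t = torch.as_tensor(y_true)
    y_pred_t = torch.as_tensor(y_pred).requires_grad_(True)
    loss = loss_fn(y_true_t, y_pred_t).sum()
    (grad,) = torch.autograd.grad(loss, y_pred_t, create_graph=True)
    # For an element-wise loss, grad[i] depends only on y_pred[i],
    # so this second backward pass yields the Hessian diagonal.
    (hess,) = torch.autograd.grad(grad.sum(), y_pred_t)
    return grad.detach().numpy(), hess.numpy()


g, h = grad_hess(lambda t, p: (t - p) ** 2 / 2, np.zeros(3), np.ones(3))
print(g, h)  # [1. 1. 1.] [1. 1. 1.]
```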

Contributors ✨

Thanks goes to these wonderful people (emoji key):

34j

💻 🤔 📖

This project follows the all-contributors specification. Contributions of any kind welcome!

Footnotes

[^1]: Inspired by orchardbirds/bokbokbok

[^2]: Inspired by TomerRonen34/treeboost_autograd