Reproduces with 1.0.0 as well. Classifying as bug.
This behavior is not present in the other multiclass trainers (SdcaMaximumEntropy, LbfgsMaximumEntropy, and OneVersusAll). It could be related to calibration not being enabled in the TrainerInfo for LightGbmTrainerBase.
wschin added the P1 label (Priority of the issue for triage purposes: needs to be fixed soon) on May 21, 2019.
Is there a workaround for this? I'd like to combine multiple models, all trained with LightGbm, but I currently can't combine their predictions because each model's min/max score range is different. If both models output the 0-1 range, I could aggregate the probabilities more easily.
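One possible workaround, sketched below, is to re-apply softmax to the raw per-class scores yourself before aggregating, so outputs from differently scaled models all land in the 0-1 range. This is a hedged suggestion based on standard softmax math, not an API or fix confirmed in this thread:

```csharp
using System;
using System.Linq;

static class ScoreNormalization
{
    // Workaround sketch, not part of ML.NET: map raw per-class scores to
    // probabilities that sum to 1, making scores comparable across models.
    public static float[] Softmax(float[] rawScores)
    {
        // Subtract the max score for numerical stability before exponentiating.
        float max = rawScores.Max();
        float[] exps = rawScores.Select(s => MathF.Exp(s - max)).ToArray();
        float sum = exps.Sum();
        return exps.Select(e => e / sum).ToArray();
    }
}
```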
version: 0.11
Related: #1424
When training a multiclass classifier with LightGBM using softmax, a save/load round trip seems to lose the softmax. Inspecting the model, we can also see that it uses ImplRaw rather than ImplSoftmax.

Reproduce:
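A minimal sketch of such a repro, assuming a small synthetic three-class dataset (the column names and data here are illustrative, not from the original report):

```csharp
using System;
using System.Linq;
using Microsoft.ML;

class Row
{
    public float Feature1;
    public float Feature2;
    public string Label;
}

class Program
{
    static void Main()
    {
        var ml = new MLContext(seed: 0);

        // Synthetic three-class dataset, large enough for LightGBM's defaults.
        var rnd = new Random(0);
        var rows = Enumerable.Range(0, 300).Select(i => new Row
        {
            Feature1 = i % 3 + (float)rnd.NextDouble(),
            Feature2 = i % 3 - (float)rnd.NextDouble(),
            Label = new[] { "A", "B", "C" }[i % 3],
        }).ToList();
        var data = ml.Data.LoadFromEnumerable(rows);

        var pipeline = ml.Transforms.Conversion.MapValueToKey("Label")
            .Append(ml.Transforms.Concatenate("Features", "Feature1", "Feature2"))
            .Append(ml.MulticlassClassification.Trainers.LightGbm());

        var model = pipeline.Fit(data);

        // Scores from the freshly trained model are softmax probabilities.
        PrintFirstScores(model.Transform(data), "before save/load");

        // Round-trip the model through disk.
        ml.Model.Save(model, data.Schema, "model.zip");
        var loaded = ml.Model.Load("model.zip", out _);

        // Scores from the reloaded model come back as raw margins instead.
        PrintFirstScores(loaded.Transform(data), "after save/load");
    }

    static void PrintFirstScores(IDataView scored, string tag)
    {
        var first = scored.GetColumn<float[]>("Score").First();
        Console.WriteLine($"{tag}: {string.Join(", ", first)}");
    }
}
```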
Output: