
We are not getting submodel for LinearBinaryModelParameters after loading model #3967


Closed
nighotatul opened this issue Jul 4, 2019 · 3 comments

Comments

@nighotatul

nighotatul commented Jul 4, 2019

Hi,

After loading a model for binary classification, we are trying to get the sub-model and the calibrator.

After loading the model, we write the code below:

((Microsoft.ML.Calibrators.CalibratedModelParametersBase)((Microsoft.ML.Data.PredictionTransformerBase<Microsoft.ML.IPredictorProducing<float>>)((Microsoft.ML.Data.TransformerChain<Microsoft.ML.ITransformer>)mlModel).LastTransformer).Model).SubModel

but we fail to get the sub-model, which we need as a
LinearBinaryModelParameters.

In the CreateModel method, we can easily get the SubModel with the code below:

LinearBinaryModelParameters linearBinaryModelParameters = ((Microsoft.ML.Data.TransformerChain<Microsoft.ML.Data.BinaryPredictionTransformer<Microsoft.ML.Calibrators.CalibratedModelParametersBase<Microsoft.ML.Trainers.LinearBinaryModelParameters, Microsoft.ML.Calibrators.PlattCalibrator>>>)mlModel).LastTransformer.Model.SubModel;

but if we write the same code after loading the model, we get a casting exception.

Please guide us.

mlApp_ex.zip

@nighotatul
Author

IPredictorProducing is not accessible due to its protection level,
so how can we cast it?

@wschin
Member

wschin commented Jul 8, 2019

I don't have your data file, so I can't really run your code. However, with the latest ML.NET version 1.2, I can successfully compile the cast

            // Casting is done with ML.NET 1.2.
            LinearBinaryModelParameters linearBinaryModelParameters = ((Microsoft.ML.Data.TransformerChain<Microsoft.ML.Data.BinaryPredictionTransformer<Microsoft.ML.Calibrators.CalibratedModelParametersBase<Microsoft.ML.Trainers.LinearBinaryModelParameters, Microsoft.ML.Calibrators.PlattCalibrator>>>)mlModel).LastTransformer.Model.SubModel;

in

        static void Main(string[] args)
        {
            MLContext mlContext = new MLContext();

            // Training code used by ML.NET CLI and AutoML to generate the model
            ModelBuilder.CreateModel();

            ITransformer mlModel = mlContext.Model.Load(GetAbsolutePath(MODEL_FILEPATH), out DataViewSchema inputSchema);

            // Casting is done with ML.NET 1.2.
            LinearBinaryModelParameters linearBinaryModelParameters = ((Microsoft.ML.Data.TransformerChain<Microsoft.ML.Data.BinaryPredictionTransformer<Microsoft.ML.Calibrators.CalibratedModelParametersBase<Microsoft.ML.Trainers.LinearBinaryModelParameters, Microsoft.ML.Calibrators.PlattCalibrator>>>)mlModel).LastTransformer.Model.SubModel;


            var predEngine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(mlModel);

            // Create sample data to do a single prediction with it 
            ModelInput sampleData = CreateSingleDataSample(mlContext, DATA_FILEPATH);

            // Try a single prediction
            ModelOutput predictionResult = predEngine.Predict(sampleData);

            Console.WriteLine($"Single Prediction --> Actual value: {sampleData.PurchasedBike} | Predicted value: {predictionResult.Prediction}");

            Console.WriteLine("=============== End of process, hit any key to finish ===============");
            Console.ReadKey();
        }

@antoniovs1029
Member

I believe this is related to the problem explained here, which was fixed a couple of months ago in PRs #4262 and #4306 and made available in ML.NET 1.4.

Basically, the problem was that PredictionTransformers' parameter types weren't being set correctly when loading from disk. So what I suspect happened is that the last transformer in your model was a BinaryPredictionTransformer&lt;CalibratedModelParametersBase&lt;LinearBinaryModelParameters, PlattCalibrator&gt;&gt;, but after saving it, the loaded model would have a last transformer of BinaryPredictionTransformer&lt;IPredictorProducing&lt;float&gt;&gt;, throwing an exception when you tried the cast you mentioned. This was a bug, as you've described, since users should be able to use the same casts on models loaded from disk; this bug has been fixed in the PRs I've pointed to.
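For reference, with that fix in place the loaded chain should keep its concrete generic types, so the same CreateModel-style cast should succeed after loading. Here is a sketch of what that looks like on ML.NET 1.4+ (untested on my end, since I don't have your dataset; it reuses GetAbsolutePath and MODEL_FILEPATH from your sample):

            // Sketch only (assumes ML.NET 1.4+): after the fix, the loaded
            // chain keeps its concrete generic types, so the same cast that
            // works inside CreateModel also works on a model loaded from disk.
            ITransformer mlModel = mlContext.Model.Load(GetAbsolutePath(MODEL_FILEPATH), out DataViewSchema inputSchema);

            var chain = (Microsoft.ML.Data.TransformerChain<Microsoft.ML.Data.BinaryPredictionTransformer<Microsoft.ML.Calibrators.CalibratedModelParametersBase<Microsoft.ML.Trainers.LinearBinaryModelParameters, Microsoft.ML.Calibrators.PlattCalibrator>>>)mlModel;

            // Both the linear sub-model and the Platt calibrator are now reachable.
            Microsoft.ML.Trainers.LinearBinaryModelParameters subModel = chain.LastTransformer.Model.SubModel;
            Microsoft.ML.Calibrators.PlattCalibrator calibrator = chain.LastTransformer.Model.Calibrator;

            // The linear sub-model exposes the learned weights and bias.
            Console.WriteLine($"Bias: {subModel.Bias}, weight count: {subModel.Weights.Count}");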


I will close this issue, since I believe it's been fixed; without your input dataset I couldn't really test it. Please feel free to reopen the issue if you think there's still work to be done. Thanks!

@ghost ghost locked as resolved and limited conversation to collaborators Mar 21, 2022