Prediction engine options #5964

Conversation
@eerhardt I think this is a good change, but I would like your opinion as well. Besides the linked issue, there are also some email threads if you would like more information.
💡 The non-breaking way to approach this is to add a constructor (or similar) with a boolean flag. There are examples of both directions of this pattern (true means keep open on dispose, or true means close on dispose). @stephentoub or @terrajobst are best positioned to state the current recommended semantics.
That's good to know. @stephentoub or @terrajobst, do you have a preference?
Looks like in our code base we have a couple of places where we are already doing it the same way as the XML. I'll convert it to that way, but if we want it the other way, it's fast to change.
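For context, here is a minimal sketch of the constructor-flag pattern under discussion; the class and parameter names are illustrative only, not the actual ML.NET API. SafeHandle's ownsHandle parameter is an existing example of the "true means dispose" direction, while StreamReader's leaveOpen is the "true means keep open" direction.

using System;

// Illustrative only: a wrapper that may or may not own the disposable it wraps.
public sealed class ModelScorer : IDisposable
{
    private readonly IDisposable _model;
    private readonly bool _ownsModel;

    // ownsModel = true means Dispose() also disposes the underlying model
    // (the SafeHandle "ownsHandle" direction); the opposite convention is a
    // "leaveOpen" flag, as on StreamReader, where true means keep it alive.
    public ModelScorer(IDisposable model, bool ownsModel = false)
    {
        _model = model ?? throw new ArgumentNullException(nameof(model));
        _ownsModel = ownsModel;
    }

    public void Dispose()
    {
        if (_ownsModel)
        {
            _model.Dispose();
        }
    }
}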
Codecov Report
@@            Coverage Diff             @@
##             main    #5964      +/-  ##
==========================================
- Coverage   68.29%   68.23%   -0.06%
==========================================
  Files        1142     1142
  Lines      242803   242821      +18
  Branches    25385    25386       +1
==========================================
- Hits       165825   165695     -130
- Misses      70296    70433     +137
- Partials     6682     6693      +11
Flags with carried forward coverage won't be shown.
{
/// <summary>
/// Options for the <see cref="PredictionEngine{TSrc, TDst}"/> as used in
/// [RandomizedPca(Options)](xref:Microsoft.ML.PcaCatalog.RandomizedPca(Microsoft.ML.AnomalyDetectionCatalog.AnomalyDetectionTrainers,Microsoft.ML.Trainers.RandomizedPcaTrainer.Options)).
Is this a copy-paste error? #Resolved
Yes, resolved.
Looks good to me
Co-authored-by: Eric Erhardt <[email protected]>
Fixes #5945
Currently, disposing a prediction engine also disposes the underlying model, which means you cannot cache a model for reuse if you create and dispose prediction engines around it. This change makes the prediction engine no longer dispose the model automatically, so you can cache the model without it being disposed out from under you. You are now responsible for disposing the model yourself.
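A rough usage sketch of the pattern this enables, assuming the model is loaded from a file; ModelInput, ModelOutput, and the file name are placeholders, and the cast to IDisposable at the end is just one way to release a transformer that holds native resources, not an API introduced by this PR.

using System;
using Microsoft.ML;

// Illustrative only: cache one loaded model and create/dispose engines around it.
var mlContext = new MLContext();
ITransformer model = mlContext.Model.Load("model.zip", out _);

for (int i = 0; i < 3; i++)
{
    // With this change, disposing the engine no longer disposes the cached model.
    using var engine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(model);
    // engine.Predict(...) as needed.
}

// The caller now owns the model's lifetime and disposes it explicitly when done.
(model as IDisposable)?.Dispose();

public class ModelInput { }
public class ModelOutput { }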