pm.sample_prior_predictive fails with multinomial data #3271
Comments
Definitely a problem here: `obs.distribution.random().shape` gives `(10, 1, 6, 6)`.
You usually have to specify shapes for observed variables to sample from the prior. In this case, I tried a few "easy" fixes and none of them worked, even once the shape is specified. On the plus side, I think any fix will only need to touch the `Multinomial` distribution.
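For reference, "specifying the shape" means something like the sketch below (the 10×6 count matrix and the Dirichlet prior are illustrative stand-ins, not the reporter's model):

```python
import numpy as np
import pymc3 as pm

# Illustrative stand-in data: 10 rows of counts over 6 categories.
counts = np.random.multinomial(n=100, pvals=[1 / 6.0] * 6, size=10)

with pm.Model():
    p = pm.Dirichlet("p", a=np.ones(6))
    # Pass shape= explicitly instead of relying on it being inferred from
    # `observed`; as noted above, this alone did not fix this particular case.
    pm.Multinomial("obs", n=100, p=p, shape=counts.shape, observed=counts)
```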
Ooh, I didn't know that you had to specify the shape for `sample_prior_predictive`, thank you @ColCarroll!
Actually, you don't need to, as the shape is inferred from the observed data. The problem here is the broadcasting of `n` and `p`.
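In plain NumPy terms, the broadcasting that has to happen is roughly the following (an illustration of the expected shapes, not PyMC's actual `random` implementation; the sizes match the 10×6 example above):

```python
import numpy as np

n = np.full(10, 100)     # one trial count per row of data -> shape (10,)
p = np.full(6, 1 / 6.0)  # one probability vector over 6 categories -> shape (6,)

# n needs a trailing axis before broadcasting against p so that the last axis
# stays the category axis; a single draw should then match the data shape
# (10, 6), not the (10, 1, 6, 6) reported above.
n_b, p_b = np.broadcast_arrays(n[:, None], p)
draws = np.stack(
    [np.random.multinomial(row_n[0], row_p) for row_n, row_p in zip(n_b, p_b)]
)
print(n_b.shape, p_b.shape, draws.shape)  # (10, 6) (10, 6) (10, 6)
```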
Ok, I think I got it. @ColCarroll's intuition was right: the problem (and solution) is contained to `Multinomial`.
…er broadcasting of n and p. Added test based on pymc-devs#3271 problematic code.
Closed by #3285.
As discussed with @AustinRochford on Twitter, `pm.sample_prior_predictive` seems to fail when working with a multinomial likelihood: `TypeError: 'NoneType' object is not subscriptable`.
I suspect it comes from a shape issue. Maybe it comes from my data, but that would be odd, as `sample_ppc` works. (On a side note, the `sample_posterior_predictive` method seems to be missing from the current PyMC conda distribution.)
Here is a minimal and reproducible example:
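Something along these lines (the numbers are illustrative; a Dirichlet prior over 6 categories and a 10×6 count matrix are assumed here, not the original data):

```python
import numpy as np
import pymc3 as pm

# Illustrative stand-in data: 10 rows of multinomial counts over 6 categories.
counts = np.random.multinomial(n=100, pvals=[1 / 6.0] * 6, size=10)

with pm.Model():
    p = pm.Dirichlet("p", a=np.ones(6))
    pm.Multinomial("obs", n=counts.sum(axis=-1), p=p, observed=counts)
    # This is the call that fails with
    # TypeError: 'NoneType' object is not subscriptable
    prior = pm.sample_prior_predictive(samples=50)
```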
Please provide the full traceback.
Please provide any additional information below.
My data and model are a lot more complex than the simple example here, but it does not seem to work even in this simple case.
Versions and main components