
Commit f245f11

mattpitkin authored and twiecki committed

Manually set theano TensorType for length 1 shared variables (#3335)
* model.py: fix issue with length 1 shared variables - this patch fixes an issue highlighted in #3122 where imputation of a single missing observation fails. It implements the suggestion from the theano error message to manually force a TensorType change in cases where the variable has a length of one.
* model.py: add a comment about the previous change
* model.py: extra check to deal with test errors
* model.py: changes to the length 1 fix
* test_model.py: check shared tensor type conversion in ValueGradFunction - test the error described in #3122 and fixed in #3335.
* RELEASE-NOTES.md: added mention of fix to #3122
* Update RELEASE-NOTES.md - fix typo

Co-Authored-By: mattpitkin <[email protected]>
1 parent b64f810 commit f245f11
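The failure mode this commit addresses can be sketched without theano. As a hedged illustration of the data setup described in issue #3122 (plain numpy, not pymc3): masking exactly one observation leaves a length-1 missing slice, and it was this length-1 case that produced the shared-variable TensorType mismatch.

```python
import numpy as np

# Hypothetical reproduction of the single-missing-value setup from #3122:
# a binary observation vector in which exactly one entry is missing.
X = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
X[0] = -1                              # mark a single value as missing
X = np.ma.masked_values(X, value=-1)   # mask it

# The missing part has length 1; this is the case that triggered the
# TensorType mismatch the commit works around.
missing = X[X.mask]
print(missing.shape)  # (1,)
```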

File tree

3 files changed: +22 -0 lines changed

RELEASE-NOTES.md

Lines changed: 1 addition & 0 deletions

@@ -10,6 +10,7 @@
 - Made `BrokenPipeError` for parallel sampling more verbose on Windows.
 - Added the `broadcast_distribution_samples` function that helps broadcasting arrays of drawn samples, taking into account the requested `size` and the inferred distribution shape. This sometimes is needed by distributions that call several `rvs` separately within their `random` method, such as the `ZeroInflatedPoisson` (Fix issue #3310).
 - The `Wald`, `Kumaraswamy`, `LogNormal`, `Pareto`, `Cauchy`, `HalfCauchy`, `Weibull` and `ExGaussian` distributions `random` method used a hidden `_random` function that was written with scalars in mind. This could potentially lead to artificial correlations between random draws. Added shape guards and broadcasting of the distribution samples to prevent this (Similar to issue #3310).
+- Added a fix to allow the imputation of single missing values of observed data, which previously would fail (Fix issue #3122).
 
 ### Deprecations

pymc3/model.py

Lines changed: 8 additions & 0 deletions

@@ -401,6 +401,8 @@ class ValueGradFunction:
     """
     def __init__(self, cost, grad_vars, extra_vars=None, dtype=None,
                  casting='no', **kwargs):
+        from .distributions import TensorType
+
         if extra_vars is None:
             extra_vars = []
 
@@ -437,6 +439,12 @@ def __init__(self, cost, grad_vars, extra_vars=None, dtype=None,
         self._extra_vars_shared = {}
         for var in extra_vars:
             shared = theano.shared(var.tag.test_value, var.name + '_shared__')
+            # test TensorType compatibility
+            if hasattr(var.tag.test_value, 'shape'):
+                testtype = TensorType(var.dtype, var.tag.test_value.shape)
+
+                if testtype != shared.type:
+                    shared.type = testtype
             self._extra_vars_shared[var.name] = shared
             givens.append((var, shared))
pymc3/tests/test_model.py

Lines changed: 13 additions & 0 deletions

@@ -288,3 +288,16 @@ def test_edge_case(self):
         assert logp.size == 1
         assert dlogp.size == 4
         npt.assert_allclose(dlogp, 0., atol=1e-5)
+
+    def test_tensor_type_conversion(self):
+        # case described in #3122
+        X = np.random.binomial(1, 0.5, 10)
+        X[0] = -1  # masked a single value
+        X = np.ma.masked_values(X, value=-1)
+        with pm.Model() as m:
+            x1 = pm.Uniform('x1', 0., 1.)
+            x2 = pm.Bernoulli('x2', x1, observed=X)
+
+        gf = m.logp_dlogp_function()
+
+        assert m['x2_missing'].type == gf._extra_vars_shared['x2_missing'].type