
Commit e69dec9

ricardoV94 and bwengals authored
Simplify Potential docstrings and examples (#6772)
* Simplify Potential docstrings and examples

Co-authored-by: Bill Engels <[email protected]>
1 parent 9bb3cf0 commit e69dec9

File tree

1 file changed: +50 −33 lines changed


pymc/model.py

Lines changed: 50 additions & 33 deletions
@@ -2199,15 +2199,14 @@ def Deterministic(name, var, model=None, dims=None):
     return var


-def Potential(name, var, model=None, dims=None):
-    """
-    Add an arbitrary factor potential to the model likelihood
-
-    The Potential function is used to add arbitrary factors (such as constraints or other likelihood components) to adjust the probability density of the model.
+def Potential(name, var: TensorVariable, model=None, dims=None) -> TensorVariable:
+    """Add an arbitrary term to the model log-probability.

     Warnings
     --------
-    Potential functions only influence logp-based sampling. Therefore, they are applicable for sampling with ``pm.sample`` but not ``pm.sample_prior_predictive`` or ``pm.sample_posterior_predictive``.
+    Potential terms only influence probability-based sampling, such as ``pm.sample``, but not forward sampling like
+    ``pm.sample_prior_predictive`` or ``pm.sample_posterior_predictive``. A warning is raised when doing forward
+    sampling with models containing Potential terms.

     Parameters
     ----------
@@ -2228,62 +2227,80 @@ def Potential(name, var, model=None, dims=None):

     Examples
     --------
-    Have a look at the following example:
-
-    In this example, we define a constraint on ``x`` to be greater or equal to 0 via the ``pm.Potential`` function.
-    We pass ``pm.math.log(pm.math.switch(constraint, 1, 0))`` as second argument which will return an expression depending on if the constraint is met or not and which will be added to the likelihood of the model.
-    The probablity density that this model produces agrees strongly with the constraint that ``x`` should be greater than or equal to 0. All the cases who do not satisfy the constraint are strictly not considered.
+    In this example, we define a constraint on ``x`` to be greater or equal to 0.
+    The expression ``pm.math.log(pm.math.switch(constraint, 1, 0))`` adds either 0 or -inf to the model logp,
+    depending on whether the constraint is met. During sampling, any proposals where ``x`` is negative will be rejected.

     .. code:: python

+        import pymc as pm
+
         with pm.Model() as model:
             x = pm.Normal("x", mu=0, sigma=1)
-            y = pm.Normal("y", mu=x, sigma=1, observed=data)
+
             constraint = x >= 0
             potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1, 0)))

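Editor's note: the effect of the hard-constraint term added above can be sketched in plain Python, with no PyMC dependency. The ``switch`` helper below is a hypothetical stand-in for ``pm.math.switch``; the point is that the potential contributes 0 to the logp when the constraint holds and -inf when it does not:

```python
import math

def switch(cond, a, b):
    # stand-in for pm.math.switch: pick `a` when cond is true, else `b`
    return a if cond else b

def hard_constraint_logp(x):
    # log(switch(x >= 0, 1, 0)): log(1) = 0 if satisfied, log(0) = -inf if not
    val = switch(x >= 0, 1.0, 0.0)
    return math.log(val) if val > 0 else float("-inf")
```

A proposal with a -inf logp contribution is always rejected by a Metropolis-style accept step, which is why such draws never appear in the trace.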
-    However, if we use ``pm.math.log(pm.math.switch(constraint, 1.0, 0.5))`` the potential again penalizes the likelihood when constraint is not met but with some deviations allowed.
-    Here, Potential function is used to pass a soft constraint.
-    A soft constraint is a constraint that is only partially satisfied.
-    The effect of this is that the posterior probability for the parameters decreases as they move away from the constraint, but does not become exactly zero.
-    This allows the sampler to generate values that violate the constraint, but with lower probability.
+
+    Instead, with a soft constraint like ``pm.math.log(pm.math.switch(constraint, 1.0, 0.5))``,
+    the sampler will be discouraged, but not forbidden, from accepting negative values for ``x``.

     .. code:: python

+        import pymc as pm
+
         with pm.Model() as model:
-            x = pm.Normal("x", mu=0.1, sigma=1)
-            y = pm.Normal("y", mu=x, sigma=1, observed=data)
+            x = pm.Normal("x", mu=0, sigma=1)
+
             constraint = x >= 0
             potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1.0, 0.5)))

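Editor's note: the soft variant can be checked the same way in plain Python (again no PyMC needed). The penalty for violating the constraint is a finite ``log(0.5)`` rather than -inf:

```python
import math

def soft_constraint_logp(x):
    # log(switch(x >= 0, 1.0, 0.5)): 0 when satisfied, log(0.5) ~ -0.693 when not
    return math.log(1.0 if x >= 0 else 0.5)
```

Because the penalty is finite, negative draws of ``x`` remain possible but are roughly half as likely to be accepted, all else equal.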
-    In this example, Potential is used to obtain an arbitrary prior.
-    This prior distribution refers to the prior knowledge that the values of ``max_items`` are likely to be small rather than being large.
-    The prior probability of ``max_items`` is defined using a Potential object with the log of the inverse of ``max_items`` as its value.
-    This means that larger values of ``max_items`` have a lower prior probability density, while smaller values of ``max_items`` have a higher prior probability density.
-    When the model is sampled, the posterior distribution of ``max_items`` given the observed value of ``n_items`` will be influenced by the power-law prior defined in the Potential object
+    A Potential term can depend on multiple variables.
+    In the following example, the ``soft_sum_constraint`` potential encourages ``x`` and ``y`` to have a small sum.
+    The more the sum deviates from zero, the more negative the penalty ``-((x + y)**2)`` becomes.

     .. code:: python

-        with pm.Model():
+        import pymc as pm
+
+        with pm.Model() as model:
+            x = pm.Normal("x", mu=0, sigma=10)
+            y = pm.Normal("y", mu=0, sigma=10)
+            soft_sum_constraint = pm.Potential("soft_sum_constraint", -((x + y)**2))
+
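Editor's note: the quadratic penalty in the example above is ordinary arithmetic, so its behavior can be verified directly:

```python
def soft_sum_penalty(x, y):
    # penalty grows quadratically as the sum x + y drifts from zero
    return -((x + y) ** 2)
```

The penalty is 0 exactly when the sum is zero and strictly decreases as the sum moves away from it, which is what makes this a soft rather than hard constraint.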
+    A Potential can be used to define a specific prior term.
+    The following example imposes a power law prior on ``max_items``, under the form ``log(1/max_items)``,
+    which penalizes very large values of ``max_items``.
+
+    .. code:: python
+
+        import pymc as pm
+
+        with pm.Model() as model:
             # p(max_items) = 1 / max_items
             max_items = pm.Uniform("max_items", lower=1, upper=100)
             pm.Potential("power_prior", pm.math.log(1/max_items))

             n_items = pm.Uniform("n_items", lower=1, upper=max_items, observed=60)

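Editor's note: the power-law prior term is just ``log(1/max_items) = -log(max_items)``, so larger values receive a strictly lower log-density. A plain-Python sketch:

```python
import math

def power_prior_logp(max_items):
    # log(1 / max_items) == -log(max_items): larger values get lower log-density
    return math.log(1 / max_items)
```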
-    In the next example, the ``soft_sum_constraint`` potential encourages ``x`` and ``y`` to have a small sum, effectively adding a soft constraint on the relationship between the two variables.
-    This can be useful in cases where you want to ensure that the sum of multiple variables stays within a certain range, without enforcing an exact value.
-    In this case, the larger the deviation, larger will be the negative value (-((x + y)**2)) which the MCMC sampler will attempt to minimize.
-    However, the sampler might generate values for some small deviations but with lower probability hence this is a soft constraint.
+    A Potential can be used to define a specific likelihood term.
+    In the following example, a normal likelihood term is added to fixed data.
+    The same result would be obtained by using an observed ``Normal`` variable.

     .. code:: python

+        import pymc as pm
+
+        def normal_logp(value, mu, sigma):
+            return -0.5 * ((value - mu) / sigma) ** 2 - pm.math.log(sigma)
+
         with pm.Model() as model:
-            x = pm.Normal("x", mu=0.1, sigma=1)
-            y = pm.Normal("y", mu=x, sigma=1, observed=data)
-            soft_sum_constraint = pm.Potential("soft_sum_constraint", -((x + y)**2))
+            mu = pm.Normal("mu")
+            sigma = pm.HalfNormal("sigma")
+
+            data = [0.1, 0.5, 0.9]
+            llike = pm.Potential("llike", normal_logp(data, mu, sigma))

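Editor's note: the ``normal_logp`` in the example drops the normal density's ``-0.5 * log(2*pi)`` constant. That is fine for MCMC, which only cares about logp differences, but worth making explicit. A scalar plain-Python version (``normal_logpdf`` below is a hypothetical fully-normalized counterpart, not part of the diff):

```python
import math

def normal_logp(value, mu, sigma):
    # unnormalized normal log-density, as in the docstring example
    return -0.5 * ((value - mu) / sigma) ** 2 - math.log(sigma)

def normal_logpdf(value, mu, sigma):
    # fully normalized normal log-density: differs only by a constant
    return normal_logp(value, mu, sigma) - 0.5 * math.log(2 * math.pi)
```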
-    The potential value is incorporated into the model log-probability, so it should be -inf (or very negative) when a constraint is violated, so that those draws are rejected. 0 won't have any effect and positive values will make the proposals more likely to be accepted.

     """
     model = modelcontext(model)
