* Update to Turing 0.40
* Update version in _quarto.yml
* Fix DPPL 0.37 run_ad change
* Fix VI tutorial
* Fix model-manual's use of contexts
* Fix references to __context__
* Fix use of addlogprob for log prior
* Fix typo
* Regenerate manifest
* Remove version pin of DelayDiffEq and update Manifest
* Fix call to evaluate
* Add note about contexts tutorial being out of date
* Apply suggestions from code review
Co-authored-by: Penelope Yong <penelopeysm@gmail.com>
---------
Co-authored-by: Penelope Yong <penelopeysm@gmail.com>
`developers/compiler/minituring-contexts/index.qmd` (4 additions & 0 deletions):

```diff
@@ -14,6 +14,10 @@ Pkg.instantiate();
 In the [Mini Turing]({{< meta minituring >}}) tutorial we developed a miniature version of the Turing language to illustrate its core design. A passing mention was made of contexts. In this tutorial we develop that aspect of our mini Turing language further, to demonstrate how and why contexts are an important part of Turing's design.
 
+::: {.callout-important}
+Note: The way Turing actually uses contexts changed somewhat in releases 0.39 and 0.40. The content of this page remains relevant: the principles of how contexts operate are the same, and concepts like leaf and parent contexts still exist. However, we have moved away from using contexts for quite as many things as we used to. Most importantly, whether to accumulate the log joint, log prior, or log likelihood is no longer determined by different contexts. Please keep this in mind as you read this page: the principles remain, but the details have changed. We will update this page once the refactoring of internals happening around releases 0.39 and 0.40 is complete.
+:::
+
 # Mini Turing expanded, now with more contexts
 
 If you haven't read [Mini Turing]({{< meta minituring >}}) yet, you should do that first. We start by repeating verbatim much of the code from there. Define the type for holding values for variables:
```
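To make the new callout concrete, here is a minimal sketch assuming DynamicPPL 0.37 or later, where the prior/likelihood split is tracked by accumulators on the `VarInfo` rather than selected via evaluation contexts. The model is invented for illustration, and the accessor names (`getlogprior`, `getloglikelihood`, `getlogjoint`) should be checked against the DynamicPPL docs for your version:

```julia
using DynamicPPL, Distributions

@model function coin(y)
    p ~ Beta(1, 1)     # prior
    y ~ Bernoulli(p)   # likelihood
end

# Constructing a VarInfo evaluates the model once, accumulating log densities.
vi = VarInfo(coin(true))

# Previously you would pick a context (e.g. a prior-only context) to decide
# *what* gets accumulated; now the VarInfo tracks the pieces separately:
DynamicPPL.getlogprior(vi)       # log density of the prior terms
DynamicPPL.getloglikelihood(vi)  # log density of the likelihood terms
DynamicPPL.getlogjoint(vi)       # their sum
```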
`developers/transforms/dynamicppl/index.qmd` (1 addition & 1 deletion):

```diff
@@ -351,7 +351,7 @@ Hence, one might expect that if we try to evaluate the model using this `VarInfo`
 Here, `evaluate!!` returns two things: the model's return value itself (which we defined above to be a `NamedTuple`), and the resulting `VarInfo` post-evaluation.
```
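The changed line itself is not shown in this excerpt, but the retained context describes `evaluate!!`'s return convention. A minimal sketch of that convention, assuming DynamicPPL 0.37, where `evaluate!!` takes just the model and a `VarInfo` (the model below is invented for illustration):

```julia
using DynamicPPL, Distributions

@model function gdemo(x)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    x ~ Normal(m, sqrt(s))
    return (; s, m)   # the model's own return value, a NamedTuple
end

model = gdemo(1.5)
vi = VarInfo(model)

# `evaluate!!` returns both the model's return value and the updated VarInfo.
retval, vi_post = DynamicPPL.evaluate!!(model, vi)
retval isa NamedTuple   # true
```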
`usage/modifying-logprob/index.qmd` (3 additions & 6 deletions):

````diff
@@ -47,13 +47,10 @@ using LinearAlgebra
 end
 ```
 
-Note that `@addlogprob!` always increases the accumulated log probability, regardless of the provided
-sampling context.
-For instance, if you do not want to apply `@addlogprob!` when evaluating the prior of your model but only when computing the log likelihood and the log joint probability, then you should [check the type of the internal variable `__context__`](https://github.com/TuringLang/DynamicPPL.jl/issues/154), as in the following example:
+Note that `@addlogprob! (p::Float64)` adds `p` to the log likelihood.
+If instead you want to add to the log prior, you can use
 
 ```{julia}
 #| eval: false
-if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
````
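The replacement code block is cut off in the excerpt above, so here is a hedged sketch of the newer interface: assuming DynamicPPL 0.37 or later, where `@addlogprob!` also accepts a `NamedTuple` with `logprior` and `loglikelihood` fields (check the DynamicPPL changelog for your version), adding to the log prior rather than the log likelihood might look like:

```julia
using Turing

@model function demo(x)
    m ~ Normal()
    # Contribute to the log prior and log likelihood separately via a
    # NamedTuple (assumed DynamicPPL 0.37+ form of @addlogprob!).
    @addlogprob! (; logprior = -2.0, loglikelihood = logpdf(Normal(m), x))
end
```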