# Compatibility with DynamicPPL 0.38 + InitContext #2676
Base branch: `breaking`
**Changelog** (`@@ -1,5 +1,30 @@`)
# 0.41.0

## DynamicPPL 0.38

Turing.jl v0.41 brings with it all the underlying changes in DynamicPPL 0.38.

The only user-facing difference is that initial parameters for MCMC sampling must now be specified in a different form.
You still need to use the `initial_params` keyword argument to `sample`, but the allowed values are different.
For almost all samplers in Turing.jl (except `Emcee`), this should now be a `DynamicPPL.AbstractInitStrategy`.

TODO LINK TO DPPL DOCS WHEN THIS IS LIVE
There are three kinds of initialisation strategy provided out of the box with Turing.jl (they are exported, so you can use them directly with `using Turing`):

- `InitFromPrior()`: Sample from the prior distribution. This is the default for most samplers in Turing.jl (used if you don't specify `initial_params`).
- `InitFromUniform(a, b)`: Sample uniformly from `[a, b]` in linked (unconstrained) space. This is the default for Hamiltonian samplers. If `a` and `b` are not specified, it defaults to `[-2, 2]`, which preserves the behaviour of previous versions (and mimics that of Stan).
- `InitFromParams(p)`: Explicitly provide a set of initial parameters. **Note: `p` must be either a `NamedTuple` or a `Dict{<:VarName}`; it can no longer be a `Vector`.** Parameters must be provided in unlinked (constrained) space, even if the sampler later performs linking. See the usage sketch after the review comment below.
**Review comment:** Do I recall correctly that you did end up implementing the option of providing an unwrapped […]? Oh, also, just came to mind: does it need to be a […]?
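As a usage sketch (the `demo` model and the specific values here are illustrative, not from this PR):

```julia
using Turing

@model function demo()
    σ ~ truncated(Normal(), 0, Inf)
    x ~ Normal(0, σ)
end

# Draw initial parameters from the prior (the default for most samplers).
sample(demo(), NUTS(), 100; initial_params=InitFromPrior())

# Draw uniformly from [-2, 2] in linked space (the Hamiltonian default).
sample(demo(), NUTS(), 100; initial_params=InitFromUniform(-2, 2))

# Start from explicit values, given in unlinked (constrained) space.
sample(demo(), NUTS(), 100; initial_params=InitFromParams((; σ=1.0, x=0.5)))
```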
This change was made because `Vector`s are semantically ambiguous.
It is not clear which element of the vector corresponds to which variable in the model, nor whether the parameters are in linked or unlinked space.
Previously, both of these depended on the internal structure of the `VarInfo`, which is an implementation detail.
In contrast, the behaviour of `Dict`s and `NamedTuple`s is invariant to the ordering of variables, and it is easier for readers to see which variable is being set to which value.

If you were previously using `varinfo[:]` to extract a vector of initial parameters, you can now use `Dict(k => varinfo[k] for k in keys(varinfo))` to extract a `Dict` of initial parameters.
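For instance, a minimal migration sketch, assuming `model` is an existing `DynamicPPL.Model` and `varinfo` is built from it:

```julia
using Turing, DynamicPPL

varinfo = DynamicPPL.VarInfo(model)

# Old (Turing ≤ 0.40): a raw vector, whose meaning depended on VarInfo internals.
# sample(model, NUTS(), 100; initial_params=varinfo[:])

# New (Turing 0.41): an explicit mapping from variable names to (unlinked) values.
init = Dict(k => varinfo[k] for k in keys(varinfo))
sample(model, NUTS(), 100; initial_params=InitFromParams(init))
```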
## Initial step in MCMC sampling

HMC and NUTS samplers no longer take an extra single step before starting the chain.
This means that if you do not discard any samples at the start, the first sample will be the initial parameters (which may be user-provided).
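A quick way to see this, as a sketch with a hypothetical one-parameter model:

```julia
using Turing

@model demo() = σ ~ truncated(Normal(), 0, Inf)

chain = sample(demo(), NUTS(), 100; initial_params=InitFromParams((; σ=1.0)))

# With no discarded samples, the first stored draw is the initial point itself.
first(chain[:σ])  # ≈ 1.0
```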
**Project.toml**

```diff
@@ -45,7 +45,7 @@ Optim = "429524aa-4258-5aef-a3af-852621145aeb"

 [extensions]
 TuringDynamicHMCExt = "DynamicHMC"
-TuringOptimExt = "Optim"
+TuringOptimExt = ["Optim", "AbstractPPL"]

 [compat]
 ADTypes = "1.9"
@@ -64,7 +64,7 @@ Distributions = "0.25.77"
 DistributionsAD = "0.6"
 DocStringExtensions = "0.8, 0.9"
 DynamicHMC = "3.4"
-DynamicPPL = "0.37.2"
+DynamicPPL = "0.38"
 EllipticalSliceSampling = "0.5, 1, 2"
 ForwardDiff = "0.10.3, 1"
 Libtask = "0.9.3"
@@ -90,3 +90,6 @@ julia = "1.10.8"
 [extras]
 DynamicHMC = "bbc10e6e-7c05-544b-b16e-64fede858acb"
 Optim = "429524aa-4258-5aef-a3af-852621145aeb"
+
+[sources]
+DynamicPPL = {url = "https://github.com/TuringLang/DynamicPPL.jl", rev = "breaking"}
```
**Review comment** (on the `[sources]` entry): Likewise this.
```diff
@@ -1,9 +1,9 @@
 # TODO: Implement additional checks for certain samplers, e.g.
 # HMC not supporting discrete parameters.
 function _check_model(model::DynamicPPL.Model)
-    # TODO(DPPL0.38/penelopeysm): use InitContext
-    spl_model = DynamicPPL.contextualize(model, DynamicPPL.SamplingContext(model.context))
-    return DynamicPPL.check_model(spl_model, VarInfo(); error_on_failure=true)
+    new_context = DynamicPPL.setleafcontext(model.context, DynamicPPL.InitContext())
+    new_model = DynamicPPL.contextualize(model, new_context)
+    return DynamicPPL.check_model(new_model, VarInfo(); error_on_failure=true)
 end
 function _check_model(model::DynamicPPL.Model, alg::InferenceAlgorithm)
     return _check_model(model)
```

**Review comment** (on lines +4 to +5): Isn't there a short-hand for these two lines now?
**Review comment:** Likewise a reminder comment.
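For illustration, a sketch of what the new check amounts to, mirroring the diff above (the `demo` model is hypothetical, and this calls `check_model` outside of Turing's internals):

```julia
using Turing, DynamicPPL

@model function demo()
    σ ~ truncated(Normal(), 0, Inf)
    x ~ Normal(0, σ)
end

model = demo()

# Evaluate the model once under InitContext, so that every variable is freshly
# initialised, and raise an error if the model is found to be malformed.
ctx = DynamicPPL.setleafcontext(model.context, DynamicPPL.InitContext())
checked = DynamicPPL.contextualize(model, ctx)
DynamicPPL.check_model(checked, DynamicPPL.VarInfo(); error_on_failure=true)
```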