Commit 3cc9a19

Update publications from getcomputo-pub.fsx [skip ci]
1 parent 24d584b commit 3cc9a19

File tree

1 file changed: +70 -0 lines changed


site/published.yml

Lines changed: 70 additions & 0 deletions
@@ -1,3 +1,73 @@
- abstract: >-
    In Bayesian statistics, the choice of the prior can have
    an important influence on the posterior and the parameter
    estimation, especially when few data samples are available. To limit
    the added subjectivity from a priori information, one can use the
    framework of objective priors, more particularly, we focus on
    reference priors in this work. However, computing such priors is a
    difficult task in general. Hence, we consider cases where the
    reference prior simplifies to the Jeffreys prior. We develop in this
    paper a flexible algorithm based on variational inference which
    computes approximations of priors from a set of parametric
    distributions using neural networks. We also show that our algorithm
    can retrieve modified Jeffreys priors when constraints are specified
    in the optimization problem to ensure the solution is proper. We
    propose a simple method to recover a relevant approximation of the
    parametric posterior distribution using Markov Chain Monte Carlo
    (MCMC) methods even if the density function of the parametric prior
    is not known in general. Numerical experiments on several
    statistical models of increasing complexity are presented. We show
    the usefulness of this approach by recovering the target
    distribution. The performance of the algorithm is evaluated on both
    prior and posterior distributions, jointly using variational
    inference and MCMC sampling.
  authors: Nils Baillie, Antoine Van Biesbroeck and Clément Gauchy
  bibtex: >+
    @article{baillie2025,
      author = {Baillie, Nils and Van Biesbroeck, Antoine and Gauchy, Clément},
      publisher = {French Statistical Society},
      title = {Variational Inference for Approximate Objective Priors Using Neural Networks},
      journal = {Computo},
      date = {2025-12-01},
      doi = {10.57750/76fh-t442},
      issn = {2824-7795},
      langid = {en},
      abstract = {In Bayesian statistics, the choice of the prior can have
    an important influence on the posterior and the parameter
    estimation, especially when few data samples are available. To limit
    the added subjectivity from a priori information, one can use the
    framework of objective priors, more particularly, we focus on
    reference priors in this work. However, computing such priors is a
    difficult task in general. Hence, we consider cases where the
    reference prior simplifies to the Jeffreys prior. We develop in this
    paper a flexible algorithm based on variational inference which
    computes approximations of priors from a set of parametric
    distributions using neural networks. We also show that our algorithm
    can retrieve modified Jeffreys priors when constraints are specified
    in the optimization problem to ensure the solution is proper. We
    propose a simple method to recover a relevant approximation of the
    parametric posterior distribution using Markov Chain Monte Carlo
    (MCMC) methods even if the density function of the parametric prior
    is not known in general. Numerical experiments on several
    statistical models of increasing complexity are presented. We show
    the usefulness of this approach by recovering the target
    distribution. The performance of the algorithm is evaluated on both
    prior and posterior distributions, jointly using variational
    inference and MCMC sampling.}
    }

  date: 2025-12-01
  description: ''
  doi: 10.57750/76fh-t442
  draft: false
  journal: Computo
  pdf: ''
  repo: published-202512-baillie-varp
  title: Variational inference for approximate objective priors using neural networks
  url: ''
  year: 2025
- abstract: >-
    The Maximum Mean Discrepancy (MMD) is a kernel-based
    metric widely used for nonparametric tests and estimation. Recently,
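For context, entries in `site/published.yml` like the one added above can be consumed programmatically. A minimal sketch, assuming PyYAML is available (it is not part of the Python standard library); the inline `text` is a trimmed copy of the first entry in the diff rather than a read of the real file:

```python
# Minimal sketch: parsing a published.yml-style entry with PyYAML (assumption:
# PyYAML is installed; `text` is a trimmed inline copy of the entry above).
import yaml

text = """\
- abstract: >-
    In Bayesian statistics, the choice of the prior can have
    an important influence on the posterior and the parameter estimation.
  authors: Nils Baillie, Antoine Van Biesbroeck and Clément Gauchy
  doi: 10.57750/76fh-t442
  journal: Computo
  repo: published-202512-baillie-varp
  title: Variational inference for approximate objective priors using neural networks
  year: 2025
"""

# safe_load returns a list of dicts, one per publication entry.
entries = yaml.safe_load(text)
for entry in entries:
    print(f"{entry['year']}: {entry['title']} (doi:{entry['doi']})")
```

Note that the `>-` folded scalar joins the wrapped abstract lines into a single space-separated string, which is why the generator can wrap long fields freely in the file.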
