Commit 4dde469

update solvers to follow ProximalAlgorithms (#19)
* update solvers to follow ProximalAlgorithms
* updated title
* remove println
* REQUIRE -> Project.toml
* re-enabled tests
* increment tolerances a bit
* add Julia 1.2 to travis
1 parent 2620090 commit 4dde469

13 files changed: +176 −294 lines changed

.travis.yml

Lines changed: 1 addition & 0 deletions

```diff
@@ -6,6 +6,7 @@ os:
 julia:
 - 1.0
 - 1.1
+- 1.2
 - nightly
 matrix:
   allow_failures:
```

Project.toml

Lines changed: 29 additions & 0 deletions (new file)

```toml
name = "StructuredOptimization"
uuid = "46cd3e9d-64ff-517d-a929-236bc1a1fc9d"
version = "0.2.0"

[deps]
AbstractOperators = "d9c5613a-d543-52d8-9afd-8f241a8c3f1c"
DSP = "717857b8-e6f2-59f4-9121-6e50c889abd2"
FFTW = "7a1cc6ca-52ef-59f5-83cd-3a7055c09341"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
ProximalAlgorithms = "140ffc9f-1907-541a-a177-7475e0a401e9"
ProximalOperators = "a725b495-10eb-56fe-b38b-717eba820537"
RecursiveArrayTools = "731186ca-8d62-57ce-b412-fbd966d074cd"

[compat]
AbstractOperators = "≥ 0.1.0"
DSP = "≥ 0.5.1"
FFTW = "≥ 0.2.4"
ProximalAlgorithms = "≥ 0.3.0"
ProximalOperators = "≥ 0.8.0"
RecursiveArrayTools = "≥ 0.18.0"
julia = "1.0.0"

[extras]
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["LinearAlgebra", "Test", "Random"]
```

REQUIRE

Lines changed: 0 additions & 7 deletions
This file was deleted.

docs/src/solvers.md

Lines changed: 16 additions & 27 deletions

````diff
@@ -9,7 +9,7 @@
 !!! note "Problem warm-starting"

     By default *warm-starting* is always enabled.
-    For example, if two problems that utilize the same variables are solved consecutively,
+    For example, if two problems that involve the same variables are solved consecutively,
     the second one will be automatically warm-started by the solution of the first one.
     That is because the variables are always linked to their respective data vectors.
     If one wants to avoid this, the optimization variables need to be manually re-initialized
@@ -18,22 +18,21 @@

 ## Specifying solver and options

-As shown above it is possible to choose the type of algorithm and specify its options by creating a `Solver` object.
-Currently, the following algorithms are supported:
+You can pick the algorithm to use as `Solver` object from the
+[`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl)
+package. Currently, the following algorithms are supported:

-* *Proximal Gradient (PG)* [[1]](http://www.mit.edu/~dimitrib/PTseng/papers/apgm.pdf), [[2]](http://epubs.siam.org/doi/abs/10.1137/080716542)
-* *Fast Proximal Gradient (FPG)* [[1]](http://www.mit.edu/~dimitrib/PTseng/papers/apgm.pdf), [[2]](http://epubs.siam.org/doi/abs/10.1137/080716542)
-* *ZeroFPR* [[3]](https://arxiv.org/abs/1606.06256)
-* *PANOC* [[4]](https://doi.org/10.1109/CDC.2017.8263933)
+* `ProximalAlgorithms.ForwardBackward`, also known as the *proximal gradient*
+  method [[1]](http://www.mit.edu/~dimitrib/PTseng/papers/apgm.pdf), [[2]](http://epubs.siam.org/doi/abs/10.1137/080716542). Nesterov acceleration can be enabled, which significantly
+  improves its performance for convex problems.
+* `ProximalAlgorithms.ZeroFPR`, a Newton-type forward-backward algorithm,
+  proposed in [[3]](https://arxiv.org/abs/1606.06256), using L-BFGS
+  directions to accelerate convergence.
+* `ProximalAlgorithms.PANOC`, another Newton-type forward-backward algorithm,
+  proposed in [[4]](https://doi.org/10.1109/CDC.2017.8263933), also using
+  L-BFGS directions.

-```@docs
-PG
-FPG
-ZeroFPR
-PANOC
-```
-
-## Build and solve
+## Parse and solve

 The macro [`@minimize`](@ref) automatically parses and solves the problem.
 An alternative syntax is given by the functions [`problem`](@ref) and [`solve`](@ref).
@@ -43,18 +42,8 @@ problem
 solve
 ```

-It is important to stress out that the `Solver` objects created using
-the functions above ([`PG`](@ref), [`FPG`](@ref), etc.)
-specify only the type of algorithm to be used together with its options.
-The actual solver
-(namely the one of [`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl))
-is constructed altogether with the problem formulation.
-The problem parsing procedure can be separated from the solver application using the functions [`build`](@ref) and [`solve!`](@ref).
-
-```@docs
-build
-solve!
-```
+Once again, the `Solver` object is to be picked from
+[`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl).

 ## References
````
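As a usage sketch of the documented change: solver objects now come directly from ProximalAlgorithms.jl instead of the removed `PG()`/`FPG()` wrappers. The example below mirrors the docstrings in this commit; the `fast` keyword for enabling Nesterov acceleration is an assumption to be checked against the ProximalAlgorithms.jl documentation, and results depend on the random data.

```julia
# Requires StructuredOptimization and ProximalAlgorithms (>= 0.3).
using StructuredOptimization
using ProximalAlgorithms

x = Variable(4)                          # optimization variable
A, b = randn(10, 4), randn(10)
p = problem(ls(A*x - b), norm(x) <= 1)   # least squares over the l2 ball

# The solver is a ProximalAlgorithms object; `fast = true` is assumed
# to enable Nesterov acceleration (check the package docs).
solve(p, ProximalAlgorithms.ForwardBackward(fast = true))

~x   # data vector of x, holding the solution (and warm-starting later problems)
```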

src/StructuredOptimization.jl

Lines changed: 13 additions & 2 deletions

```diff
@@ -9,7 +9,18 @@ using ProximalOperators
 using ProximalAlgorithms

 include("syntax/syntax.jl")
-include("calculus/precomposeNonlinear.jl") #TODO move to ProximalOperators?
-include("solvers/solvers.jl")
+include("calculus/precomposeNonlinear.jl") # TODO move to ProximalOperators?
+include("arraypartition.jl") # TODO move to ProximalOperators?
+
+# problem parsing
+include("solvers/terms_extract.jl")
+include("solvers/terms_properties.jl")
+include("solvers/terms_splitting.jl")
+
+# solver calls
+include("solvers/solvers_options.jl")
+include("solvers/build_solve.jl")
+include("solvers/minimize.jl")
+

 end
```

src/arraypartition.jl

Lines changed: 36 additions & 0 deletions (new file)

```julia
import ProximalOperators
import RecursiveArrayTools

@inline function ProximalOperators.prox(
    h::ProximalOperators.ProximableFunction,
    x::RecursiveArrayTools.ArrayPartition,
    gamma...
)
    # unwrap
    y, fy = ProximalOperators.prox(h, x.x, gamma...)
    # wrap
    return RecursiveArrayTools.ArrayPartition(y), fy
end

@inline function ProximalOperators.gradient(
    h::ProximalOperators.ProximableFunction,
    x::RecursiveArrayTools.ArrayPartition
)
    # unwrap
    grad, fx = ProximalOperators.gradient(h, x.x)
    # wrap
    return RecursiveArrayTools.ArrayPartition(grad), fx
end

@inline ProximalOperators.prox!(
    y::RecursiveArrayTools.ArrayPartition,
    h::ProximalOperators.ProximableFunction,
    x::RecursiveArrayTools.ArrayPartition,
    gamma...
) = ProximalOperators.prox!(y.x, h, x.x, gamma...)

@inline ProximalOperators.gradient!(
    y::RecursiveArrayTools.ArrayPartition,
    h::ProximalOperators.ProximableFunction,
    x::RecursiveArrayTools.ArrayPartition
) = ProximalOperators.gradient!(y.x, h, x.x)
```
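The wrappers above simply unwrap the partition's `.x` tuple of component arrays, apply the underlying operation, and re-wrap the result. A dependency-free sketch of that pattern, where the `Partition` type and `prox_l1` are hypothetical stand-ins for `RecursiveArrayTools.ArrayPartition` and `ProximalOperators.prox`:

```julia
# Hand-rolled stand-in for RecursiveArrayTools.ArrayPartition: holds a
# tuple of component arrays in field `x`, as ArrayPartition does.
struct Partition{T<:Tuple}
    x::T
end

# Stand-in prox: soft-thresholding (prox of the l1 norm) applied
# componentwise to a tuple of arrays.
soft(v, gamma) = sign.(v) .* max.(abs.(v) .- gamma, 0)
prox_l1(xs::Tuple, gamma) = map(v -> soft(v, gamma), xs)

# The commit's pattern: unwrap the partition, call the underlying
# prox on the tuple of arrays, wrap the result again.
prox_l1(p::Partition, gamma) = Partition(prox_l1(p.x, gamma))

p = Partition(([1.0, -0.2], [0.5]))
q = prox_l1(p, 0.25)
# q.x == ([0.75, 0.0], [0.25])
```

The same unwrap/apply/wrap shape carries over to the in-place `prox!`/`gradient!` methods, which forward both the output and input `.x` fields.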

src/solvers/build_solve.jl

Lines changed: 30 additions & 70 deletions

````diff
@@ -1,11 +1,12 @@
 export build

 """
-`build(terms::Tuple, solver_opt::ForwardBackwardSolver)`
+    parse_problem(terms::Tuple, solver::ForwardBackwardSolver)

-Takes as input a tuple containing the terms defining the problem and the solver options.
+Takes as input a tuple containing the terms defining the problem and the solver.

-Returns a tuple containing the optimization variables and the built solver.
+Returns a tuple containing the optimization variables and the problem terms
+to be fed into the solver.

 # Example

@@ -18,82 +19,37 @@ julia> A, b = randn(10,4), randn(10);

 julia> p = problem( ls(A*x - b ) , norm(x) <= 1 );

 julia> build(p, PG());
-
 ```
-
 """
-function build(terms::Tuple, solver::ForwardBackwardSolver)
+function parse_problem(terms::Tuple, solver::T) where T <: ForwardBackwardSolver
     x = extract_variables(terms)
     # Separate smooth and nonsmooth
     smooth, nonsmooth = split_smooth(terms)
-    # Separate quadratic and nonquadratic
-    quadratic, smooth = split_quadratic(smooth)
-    kwargs = Array{Any, 1}()
     if is_proximable(nonsmooth)
         g = extract_proximable(x, nonsmooth)
-        append!(kwargs, [(:g, g)])
-        if !isempty(quadratic)
-            fq = extract_functions(quadratic)
-            Aq = extract_operators(x, quadratic)
-            append!(kwargs, [(:fq, fq)])
-            append!(kwargs, [(:Aq, Aq)])
-        end
+        kwargs = Dict{Symbol, Any}(:g => g)
         if !isempty(smooth)
             if is_linear(smooth)
-                fs = extract_functions(smooth)
-                As = extract_operators(x, smooth)
-                append!(kwargs, [(:As, As)])
-            else
-                fs = extract_functions_nodisp(smooth)
-                As = extract_affines(x, smooth)
-                fs = PrecomposeNonlinear(fs, As)
+                f = extract_functions(smooth)
+                A = extract_operators(x, smooth)
+                kwargs[:A] = A
+            else # ??
+                f = extract_functions_nodisp(smooth)
+                A = extract_affines(x, smooth)
+                f = PrecomposeNonlinear(f, A)
             end
-            append!(kwargs, [(:fs, fs)])
+            kwargs[:f] = f
         end
-        return build_iterator(x, solver; kwargs...)
+        return (x, kwargs)
     end
-    error("Sorry, I cannot solve this problem")
-end
-
-################################################################################
-export solve!
-
-"""
-`solve!( x_solver )`
-
-Takes as input a tuple containing the optimization variables and the built solver.
-
-Solves the problem returning a tuple containing the iterations taken and the built solver.
-
-# Example
-
-```julia
-julia> x = Variable(4)
-Variable(Float64, (4,))
-
-julia> A, b = randn(10,4), randn(10);
-
-julia> p = problem( ls(A*x - b ) , norm(x) <= 1 );
-
-julia> x_solver = build(p, PG(verbose = 0));
-
-julia> solve!(x_solver);
-
-```
-
-"""
-function solve!(x_and_iter::Tuple{Tuple{Vararg{Variable}}, ProximalAlgorithms.ProximalAlgorithm})
-    x, iterator = x_and_iter
-    it, x_star = ProximalAlgorithms.run!(iterator)
-    ~x .= x_star
-    return it, iterator
+    error("Sorry, I cannot parse this problem for solver of type $(T)")
 end


 export solve

 """
-`solve(terms::Tuple, solver_opt::ForwardBackwardSolver)`
+    solve(terms::Tuple, solver::ForwardBackwardSolver)

 Takes as input a tuple containing the terms defining the problem and the solver options.

@@ -102,22 +58,26 @@ Solves the problem returning a tuple containing the iterations taken and the built solver.
 # Example

 ```julia
-
 julia> x = Variable(4)
 Variable(Float64, (4,))

 julia> A, b = randn(10,4), randn(10);

-julia> solve(p, PG());
-   it |      gamma |        fpr |
-------|------------|------------|
-    1 | 7.6375e-02 | 1.8690e+00 |
-   12 | 7.6375e-02 | 9.7599e-05 |
+julia> p = problem(ls(A*x - b), norm(x) <= 1);

-```
+julia> solve(p, ProximalAlgorithms.ForwardBackward());

+julia> ~x
+4-element Array{Float64,1}:
+ -0.6427139974173074
+ -0.29043653211431103
+ -0.6090539651510192
+  0.36279278640995494
+```
 """
 function solve(terms::Tuple, solver::ForwardBackwardSolver)
-    built_slv = build(terms, solver)
-    return solve!(built_slv)
+    x, kwargs = parse_problem(terms, solver)
+    x_star, it = solver(~x; kwargs...)
+    ~x .= x_star
+    return x, it
 end
````
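The rewritten `parse_problem` collects solver arguments in a `Dict{Symbol, Any}` that `solve` then splats into the solver call (`solver(~x; kwargs...)`), replacing the old array-of-pairs plumbing. A minimal dependency-free sketch of that splatting pattern, where `run_solver` is a hypothetical stand-in for a ProximalAlgorithms solver object:

```julia
# Stand-in for a solver call taking optional keyword arguments,
# like solver(~x; kwargs...) in the new `solve`.
run_solver(x0; f = nothing, g = nothing, A = nothing) = (x0, (f, g, A))

# Build the keyword set incrementally, as parse_problem now does:
kwargs = Dict{Symbol, Any}(:g => :indball)   # nonsmooth term always present
kwargs[:f] = :leastsquares                   # added only if a smooth term exists

# Splatting a Dict{Symbol, Any} passes its entries as keyword arguments;
# unset options fall back to the solver's defaults.
x_star, terms = run_solver([0.0, 0.0]; kwargs...)
# terms == (:leastsquares, :indball, nothing)
```

This is why the quadratic-specific `fq`/`Aq` plumbing could be deleted: any option the parser does not set simply takes the solver's default.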

src/solvers/solvers.jl

Lines changed: 0 additions & 9 deletions
This file was deleted.
