IterativeNelderMead.jl

Documentation for IterativeNelderMead.jl

Installation

using Pkg
Pkg.add("IterativeNelderMead")

Details

This flavor of Nelder-Mead is based on a publicly available MATLAB algorithm, with additional tweaks. It is an excellent choice for objectives whose gradient is costly or impossible to compute. Parameters may be bounded, but any other constraints must be implemented manually through the objective function. The eventual goal for IterativeNelderMead.jl is to support the SciML Optimization.jl or Optim.jl interface.
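
Since only box bounds are handled natively, other constraints are typically folded into the objective as a penalty. A minimal sketch of this pattern (the Rosenbrock objective and the p[1] + p[2] ≤ 3 constraint are purely illustrative, not part of the package):

using IterativeNelderMead

# Illustrative unconstrained objective (Rosenbrock)
rosenbrock(p) = (1 - p[1])^2 + 100 * (p[2] - p[1]^2)^2

# Hypothetical constraint p[1] + p[2] ≤ 3, enforced with a large penalty
function penalized(p)
    p[1] + p[2] > 3 && return 1e10  # infeasible: push the simplex back
    return rosenbrock(p)
end

result = optimize(penalized, [0.0, 0.0], IterativeNelderMeadOptimizer())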

Examples

Example: Fitting a Gaussian Curve

# Imports
using IterativeNelderMead
using PyPlot

# Build a Gaussian function
function gauss(x, a, μ, σ)
    return @. a * exp(-0.5 * ((x - μ) / σ)^2)
end

# Create a noisy dataset
x = [-20:0.05:20;]
ptrue = [4.0, 1.2, 2.8] # Amp, mean, stddev
ytrue = gauss(x, ptrue...)
yerr = abs.(0.1 .+ 0.1 .* randn(length(ytrue)))
ydata = ytrue .+ yerr .* randn(length(ytrue))

# Reduced chi-square loss function
redchi2loss(residuals, yerr, ν) = sum((residuals ./ yerr).^2) / ν
loss(pars) = redchi2loss(ydata .- gauss(x, pars...), yerr, length(ydata) - length(pars))
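# Reduced chi-square: χ²_red = (1/ν) ∑ᵢ (rᵢ / σᵢ)², where rᵢ are the
# residuals, σᵢ are the per-point uncertainties (yerr), and ν = N - k is
# the number of degrees of freedom (N data points, k free parameters).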

# Initial parameters and model
p0 = [3.0, -4.2, 4.1] # Amp, mean, stddev
lower_bounds = [0.0, -Inf, 0.0]
upper_bounds = [Inf, Inf, Inf]
y0 = gauss(x, p0...)

# Optimize
result = optimize(loss, p0, IterativeNelderMeadOptimizer();
                  lower_bounds=lower_bounds, upper_bounds=upper_bounds)

# Best fit model
ybest = gauss(x, result.pbest...)
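
# Inspect the result (fields documented in the API section below);
# with this noise level, pbest should land close to ptrue
println("Best-fit parameters: ", result.pbest, " (true: ", ptrue, ")")
println("Objective value: ", result.fbest, " after ", result.fcalls, " objective calls")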

# Plot
begin
    errorbar(x, ydata, yerr=yerr, marker="o", lw=0, elinewidth=1, label="Data", zorder=0)
    plot(x, y0, c="black", label="Initial model", alpha=0.6)
    plot(x, ybest, c="red", label="Best fit model")
    legend()
    display(gcf())
end

The resulting plot is shown below.

(Curve fitting plot: the noisy data with error bars, the initial model, and the best-fit model.)

API


IterativeNelderMead.optimize — Function
optimize(obj, p0::Vector{Float64}; lower_bounds=nothing, upper_bounds=nothing, vary=nothing)

Minimize the objective function obj with initial parameters p0 using the IterativeNelderMead solver. Bounds can be provided as additional vectors via the corresponding keywords. The vary keyword accepts an optional BitVector to hold certain parameters fixed. Returns a NamedTuple with properties (a usage sketch follows the list):

  • pbest::Vector{Float64}: The final parameters corresponding to the optimized objective value fbest.
  • fbest::Float64: The final optimized objective value.
  • fcalls::Int: The number of total objective calls.
  • simplex::Matrix{Float64}: The final simplex.
  • iteration::Int: The number of iterations performed.
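
A minimal usage sketch (the quadratic objective is illustrative, and it assumes a false entry in vary holds that parameter at its initial value):

obj(p) = (p[1] - 1.0)^2 + (p[2] - 2.0)^2
result = optimize(obj, [0.5, 2.0];
                  lower_bounds=[0.0, 0.0], upper_bounds=[10.0, 10.0],
                  vary=BitVector([true, false]))  # second parameter stays fixed
result.pbest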