docs: overhaul the documentation
pnevyk committed Nov 15, 2023
1 parent b392536 commit 356543a
Showing 24 changed files with 1,013 additions and 839 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/main.yml → .github/workflows/ci.yml
@@ -1,4 +1,4 @@
name: Gomez tests
name: CI

on:
push:
2 changes: 1 addition & 1 deletion .github/workflows/publish.yml
@@ -1,4 +1,4 @@
name: Gomez publish
name: Publish

on:
push:
79 changes: 50 additions & 29 deletions README.md
@@ -1,47 +1,68 @@
# Gomez
# gomez

A pure Rust framework and implementation of (derivative-free) methods for
solving nonlinear (bound-constrained) systems of equations.
[![Build](https://img.shields.io/github/actions/workflow/status/datamole-ai/gomez/ci.yml?branch=main)](https://github.com/datamole-ai/gomez/actions)
[![License](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/datamole-ai/gomez/blob/main/LICENSE)
[![Cargo](https://img.shields.io/crates/v/gomez.svg)](https://crates.io/crates/gomez)
[![Documentation](https://docs.rs/gomez/badge.svg)](https://docs.rs/gomez)

**Warning:** The code and API are still quite rough. Expect changes.
_gomez_ is a framework and implementation for **mathematical optimization** and
solving **non-linear systems of equations**.

This library provides a variety of solvers of nonlinear equation systems with
*n* equations and *n* unknowns written entirely in Rust. Bound constraints for
variables are supported first-class, which is useful for engineering
applications. All solvers implement the same interface which is designed to give
full control over the process and allows to combine different components to
achieve the desired solution. The implemented methods are historically proven
numerical methods or global optimization algorithms.
The library is written completely in Rust. Its focus is on being useful for
**practical problems** and having an API that is simple for easy cases as well
as flexible for complicated ones. The name stands for ***g***lobal
***o***ptimization & ***n***on-linear ***e***quations ***s***olving, with a few
typos.

The convergence of the numerical methods is tested on several problems and the
implementation is benchmarked against with
[GSL](https://www.gnu.org/software/gsl/doc/html/multiroots.html) library.
## Practical problems

The main goal is to be useful for practical problems. This is manifested by the
following features:

* _Derivative-free_. No algorithm requires an analytical derivative (gradient,
Hessian, Jacobian). Methods that use derivatives approximate them using the
finite difference method<sup>1</sup>.
* _Constraints_ support. It is possible to specify the problem domain with
constraints<sup>2</sup>, which is necessary for many engineering applications
(see the sketch below).
* Non-naive implementations. The code is not a direct translation of textbook
pseudocode. It is written with performance in mind and applies important
techniques from numerical mathematics. It also tries to handle situations that
occur in practice but hurt the methods.

<sup>1</sup> There is a plan to provide ways to override this approximation with
a real derivative.

<sup>2</sup> Currently, only unconstrained and box-bounded domains are
supported.
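
To make the constraints support concrete, here is a minimal sketch of
declaring domains, composed from the `Domain` calls used in the examples in
this commit (illustrative only; the exact API may evolve):

```rust
use gomez::Domain;

fn main() {
    // Box-bounded domain: each of the two variables is constrained to [-10, 10].
    let _bounded: Domain<f64> = Domain::rect(vec![-10.0, -10.0], vec![10.0, 10.0]);

    // Unconstrained domain over two free variables.
    let _unconstrained: Domain<f64> = Domain::unconstrained(2);
}
```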

## Algorithms

* Trust region -- Recommended method to be used as a default and it will just
work in most of the cases.
* LIPO -- A global optimization algorithm useful for initial guesses search in
combination with a numerical solver.
* Steffensen -- Fast and lightweight method for one-dimensional systems.
* Nelder-Mead -- Not generally recommended, but may be useful for
low-dimensionality problems with an ill-defined Jacobian matrix.
* [Trust region](algo::trust_region) – The recommended default; it will just
work in most cases.
* [LIPO](algo::lipo) – A global optimization algorithm useful for finding good
initial guesses in combination with a numerical algorithm.
* [Steffensen](algo::steffensen) – A fast and lightweight method for solving
one-dimensional systems of equations.
* [Nelder-Mead](algo::nelder_mead) – A direct search optimization method that
does not use any derivatives.

This list will be extended in the future. At the same time, having as many
algorithms as possible is _not_ the goal; the focus is on providing quality
implementations of battle-tested methods.
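
To show how these pieces fit together, here is a minimal end-to-end sketch. It
is assembled from the `System` and `SolverDriver` APIs used in the examples in
this commit and assumes the driver defaults to the trust region algorithm;
treat it as illustrative rather than canonical. It solves the one-dimensional
equation x^2 - 2 = 0:

```rust
use gomez::nalgebra as na;
use gomez::{Domain, Problem, SolverDriver, System};
use na::{Dyn, IsContiguous};

// One equation in one unknown: r(x) = x^2 - 2, with a root at sqrt(2).
struct Sqrt2;

impl Problem for Sqrt2 {
    type Field = f64;

    fn domain(&self) -> Domain<Self::Field> {
        Domain::unconstrained(1)
    }
}

impl System for Sqrt2 {
    fn eval<Sx, Srx>(
        &self,
        x: &na::Vector<Self::Field, Dyn, Sx>,
        rx: &mut na::Vector<Self::Field, Dyn, Srx>,
    ) where
        Sx: na::storage::Storage<Self::Field, Dyn> + IsContiguous,
        Srx: na::storage::StorageMut<Self::Field, Dyn>,
    {
        rx[0] = x[0] * x[0] - 2.0;
    }
}

fn main() {
    let r = Sqrt2;
    let mut solver = SolverDriver::builder(&r).with_initial(vec![1.0]).build();

    // Stop once the residual norm is small or the iteration budget is exhausted.
    let (x, norm) = solver
        .find(|state| state.norm() <= 1e-6 || state.iter() >= 100)
        .expect("solver failed");

    println!("x = {x:?}\t|| r(x) || = {norm}");
}
```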

## Roadmap

Listed *not* in order of priority.
Listed *not* in priority order.

* [Homotopy continuation
method](http://homepages.math.uic.edu/~jan/srvart/node4.html) to compare the
performance with the Trust region method.
performance with the trust region method
* Conjugate gradient method
* Experimentation with various global optimization techniques for initial
guesses search
* Experimentation with various global optimization techniques for initial guess
search
* Evolutionary/nature-inspired algorithms
* Bayesian optimization
* Focus on initial guesses search and tools in general
* High-level drivers encapsulating the low-level API for users who do not need
fine-grained control
* Focus on initial guess search and tools for analysis in general

## License

@@ -50,5 +71,5 @@ Licensed under [MIT](LICENSE).
There are `gsl-wrapper` and `gsl-sys` crates, which are licensed under the
[GPLv3](http://www.gnu.org/licenses/gpl-3.0.html), the same as
[GSL](https://www.gnu.org/software/gsl/) itself. This code is part of the
repository, but is not part of the Gomez library. Its purpose is solely for
repository but is not part of the Gomez library. Its purpose is solely for
comparison in Gomez benchmarks.
75 changes: 75 additions & 0 deletions examples/custom_algorithm.rs
@@ -0,0 +1,75 @@
use fastrand::Rng;
use gomez::nalgebra as na;
use gomez::{Domain, Function, Optimizer, OptimizerDriver, Problem, Sample};
use na::{storage::StorageMut, Dyn, IsContiguous, Vector};

struct Random {
    rng: Rng,
}

impl Random {
    fn new(rng: Rng) -> Self {
        Self { rng }
    }
}

impl<F: Function> Optimizer<F> for Random
where
    F::Field: Sample,
{
    const NAME: &'static str = "Random";
    type Error = std::convert::Infallible;

    fn opt_next<Sx>(
        &mut self,
        f: &F,
        dom: &Domain<F::Field>,
        x: &mut Vector<F::Field, Dyn, Sx>,
    ) -> Result<F::Field, Self::Error>
    where
        Sx: StorageMut<F::Field, Dyn> + IsContiguous,
    {
        // Randomly sample a new point in the domain.
        dom.sample(x, &mut self.rng);

        // An optimizer must return the objective value at the new point.
        Ok(f.apply(x))
    }
}

// https://en.wikipedia.org/wiki/Rosenbrock_function
struct Rosenbrock {
    a: f64,
    b: f64,
}

impl Problem for Rosenbrock {
    type Field = f64;

    fn domain(&self) -> Domain<Self::Field> {
        Domain::rect(vec![-10.0, -10.0], vec![10.0, 10.0])
    }
}

impl Function for Rosenbrock {
    fn apply<Sx>(&self, x: &na::Vector<Self::Field, Dyn, Sx>) -> Self::Field
    where
        Sx: na::Storage<Self::Field, Dyn> + IsContiguous,
    {
        (self.a - x[0]).powi(2) + self.b * (x[1] - x[0].powi(2)).powi(2)
    }
}

fn main() {
    let f = Rosenbrock { a: 1.0, b: 1.0 };
    let mut optimizer = OptimizerDriver::builder(&f)
        .with_algo(|_, _| Random::new(Rng::new()))
        .build();

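    // Drive the custom optimizer: print progress and stop after 100 iterations.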
    optimizer
        .find(|state| {
            println!("f(x) = {}\tx = {:?}", state.fx(), state.x());
            state.iter() >= 100
        })
        .unwrap();
}
22 changes: 11 additions & 11 deletions examples/rosenbrock.rs → examples/equations.rs
@@ -17,31 +17,31 @@ impl Problem for Rosenbrock {
}

impl System for Rosenbrock {
fn eval<Sx, Sfx>(
fn eval<Sx, Srx>(
&self,
x: &na::Vector<Self::Field, Dyn, Sx>,
fx: &mut na::Vector<Self::Field, Dyn, Sfx>,
rx: &mut na::Vector<Self::Field, Dyn, Srx>,
) where
Sx: na::storage::Storage<Self::Field, Dyn> + IsContiguous,
Sfx: na::storage::StorageMut<Self::Field, Dyn>,
Srx: na::storage::StorageMut<Self::Field, Dyn>,
{
fx[0] = (self.a - x[0]).powi(2);
fx[1] = self.b * (x[1] - x[0].powi(2)).powi(2);
rx[0] = (self.a - x[0]).powi(2);
rx[1] = self.b * (x[1] - x[0].powi(2)).powi(2);
}
}

fn main() -> Result<(), String> {
let f = Rosenbrock { a: 1.0, b: 1.0 };
let mut solver = SolverDriver::builder(&f)
.with_initial(vec![-10.0, -5.0])
let r = Rosenbrock { a: 1.0, b: 1.0 };
let mut solver = SolverDriver::builder(&r)
.with_initial(vec![10.0, -5.0])
.build();

let tolerance = 1e-6;

let (_, fx) = solver
let (_, norm) = solver
.find(|state| {
println!(
"iter = {}\t|| fx || = {}\tx = {:?}",
"iter = {}\t|| r(x) || = {}\tx = {:?}",
state.iter(),
state.norm(),
state.x()
@@ -50,7 +50,7 @@ fn main() -> Result<(), String> {
})
.map_err(|error| format!("{error}"))?;

if fx <= tolerance {
if norm <= tolerance {
Ok(())
} else {
Err("did not converge".to_string())
53 changes: 53 additions & 0 deletions examples/optimization.rs
@@ -0,0 +1,53 @@
use gomez::nalgebra as na;
use gomez::{Domain, Function, OptimizerDriver, Problem};
use na::{Dyn, IsContiguous};

// https://en.wikipedia.org/wiki/Rosenbrock_function
struct Rosenbrock {
    a: f64,
    b: f64,
}

impl Problem for Rosenbrock {
    type Field = f64;

    fn domain(&self) -> Domain<Self::Field> {
        Domain::unconstrained(2)
    }
}

impl Function for Rosenbrock {
    fn apply<Sx>(&self, x: &na::Vector<Self::Field, Dyn, Sx>) -> Self::Field
    where
        Sx: na::Storage<Self::Field, Dyn> + IsContiguous,
    {
        (self.a - x[0]).powi(2) + self.b * (x[1] - x[0].powi(2)).powi(2)
    }
}

fn main() -> Result<(), String> {
    let f = Rosenbrock { a: 1.0, b: 1.0 };
    let mut optimizer = OptimizerDriver::builder(&f)
        .with_initial(vec![10.0, -5.0])
        .build();

    let tolerance = 1e-6;

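    // Print progress each iteration; stop below the tolerance or after 100 iterations.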
    let (_, value) = optimizer
        .find(|state| {
            println!(
                "iter = {}\tf(x) = {}\tx = {:?}",
                state.iter(),
                state.fx(),
                state.x()
            );
            state.fx() <= tolerance || state.iter() >= 100
        })
        .map_err(|error| format!("{error}"))?;

    if value <= tolerance {
        Ok(())
    } else {
        Err("did not converge".to_string())
    }
}