
Working with the optimizer

Katherine Ye edited this page Nov 7, 2019 · 7 revisions

Optimization parameters can be specified at runtime in a JSON file, as follows:

runpenrose --config=opt-config/test.json sub/twosets.sub sty/transform-rect.sty dsll/setTheory.dsl

A configuration looks like this:

{ "optMethod": "BFGS" }

If no configuration is specified, the system uses opt-config/default.json.

Available parameters can be found in the OptConfig type defined in GenOptProblem.

All functions involved in the optimization receive a config parameter containing the values supplied at runtime.
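Penrose itself is implemented in Haskell, so the following is only a minimal sketch (in Python) of the fallback behavior described above: load the given config file, or the default if none is given. The paths and the optMethod key come from the wiki text; load_opt_config is a hypothetical name, not part of Penrose.

```python
import json
import tempfile

# Hypothetical helper: not Penrose code, just an illustration of the fallback.
def load_opt_config(path=None, default_path="opt-config/default.json"):
    """Load optimizer settings from `path`; fall back to the default file."""
    with open(path if path is not None else default_path) as f:
        return json.load(f)

# Demo: write a config like the one above and load it.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"optMethod": "BFGS"}, f)
cfg = load_opt_config(f.name)
assert cfg["optMethod"] == "BFGS"
```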

Parameters

L-BFGS has a tunable parameter, defaultBfgsMemSize: the number m of correction vector pairs to store (higher m means more memory is kept, giving a better inverse-Hessian approximation at higher cost). Good values of m are determined empirically (see Nocedal). With m = 0 the method reduces to gradient descent; if m is at least the number of steps taken so far, the full history is kept and the method behaves like ordinary (full-memory) BFGS. Try changing m if the optimization is behaving strangely.
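To make the role of m concrete, here is a hedged sketch (in Python with NumPy, not Penrose's Haskell) of the standard L-BFGS two-loop recursion from Nocedal, where `memory` holds at most m stored (s, y) pairs. The m = 0 case visibly reduces to plain gradient descent, as claimed above.

```python
import numpy as np
from collections import deque

def lbfgs_direction(grad, memory):
    """Two-loop recursion: approximate -(H^-1 @ grad) from stored (s, y) pairs.

    `memory` is a deque(maxlen=m) of (s, y) pairs, where s is the step taken
    and y is the change in gradient at that step.
    """
    q = grad.astype(float).copy()
    alphas = []
    for s, y in reversed(memory):          # first loop: newest pair first
        rho = 1.0 / float(np.dot(y, s))
        a = rho * float(np.dot(s, q))
        alphas.append(a)
        q = q - a * y
    if memory:                             # initial Hessian scaling from the newest pair
        s, y = memory[-1]
        gamma = float(np.dot(s, y)) / float(np.dot(y, y))
    else:
        gamma = 1.0                        # no curvature info: identity scaling
    r = gamma * q
    for (s, y), a in zip(memory, reversed(alphas)):  # second loop: oldest first
        rho = 1.0 / float(np.dot(y, s))
        beta = rho * float(np.dot(y, r))
        r = r + (a - beta) * s
    return -r  # descent direction

# With m = 0 (no stored pairs) the direction is exactly -grad: gradient descent.
g = np.array([1.0, 2.0])
assert np.allclose(lbfgs_direction(g, deque(maxlen=0)), -g)
```

In a full optimizer, after each accepted step you would append the new (s, y) pair to the deque; its maxlen is what defaultBfgsMemSize controls in this analogy.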

TODO

  • optimization policies
  • output
  • weights
  • debugging

Writing objectives and constraints

The growth rate of an objective or constraint function can matter a great deal (e.g. x^2 vs. x^4), as seen here.
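A tiny Python check (an illustration, not Penrose code) of why the x^2 vs. x^4 choice matters: a quartic penalty has much steeper gradients far from its minimum (risking overshoot with a fixed step size) and much flatter gradients near it (slowing convergence).

```python
# Gradients of the two candidate penalty functions.
def grad_sq(x):
    return 2 * x        # d/dx of x^2

def grad_quart(x):
    return 4 * x ** 3   # d/dx of x^4

far, near = 10.0, 0.01
# Far from the minimum, the quartic gradient dominates: 4000 vs 20.
assert abs(grad_quart(far)) > abs(grad_sq(far))
# Near the minimum, it nearly vanishes: 4e-6 vs 0.02.
assert abs(grad_quart(near)) < abs(grad_sq(near))
```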
