Question from @jameslord:

I’m looking at the “Cost Function” returned by fits.

I’d expect to see (weighted) chi-squared per degree of freedom. This should be close to 1.0 for a successful fit, where the error bars on the input data accurately represent the scatter in the points and the fit function can represent the ideal curve well. I do get this with the Levenberg-Marquardt minimiser, and also from CalculateChiSquared(). All the other minimisers (including Levenberg-MarquardtMD) generally return a value which is half of this, even though the fitted parameters are the same and CalculateChiSquared() still gives 1.0 for the same function and final parameters.
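(For what it’s worth, a factor of exactly two is what you’d see if a minimiser reports the least-squares objective ½Σ(rᵢ/σᵢ)² rather than χ² itself — I don’t know Mantid’s internals, so this is just a toy illustration in plain numpy, not Mantid code:)

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic weighted residuals: data scattered about a model with known errors.
n = 1000
sigma = 0.1
residuals = rng.normal(0.0, sigma, n)

chi_squared = np.sum((residuals / sigma) ** 2)  # classic chi-squared
half_cost = 0.5 * chi_squared                   # "1/2 * sum of squares" convention

ndof = n  # no fitted parameters in this toy example
print(chi_squared / ndof)  # ~1.0, as expected when errors match the scatter
print(half_cost / ndof)    # ~0.5, i.e. half of the above
```

If that’s what is going on, it would be worth either normalising the reported value or documenting which convention each minimiser uses.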

I’ve observed this both with Fit() from a Python script (see attached) and using the Muon Interface.

The script actually tries different numbers of maximum iterations and returns the status and parameters so far. Sometimes Fit returns something like “Failed to converge after 10 iterations”. Shouldn’t this also appear as a warning (orange) in the Results Log window, between the usual “Fit started” and “Fit successful, duration 1.0 seconds”?

Some minimisers (the Conjugate Gradient ones, for example), given far more iterations than they need, stop with “iteration is not making progress towards solution” rather than “success”. The parameter values and cost function still look good. Conversely, Simplex sometimes returns “success” even though its parameters aren’t quite as good as the others’.

I note there’s now a page:

http://docs.mantidproject.org/nightly/concepts/FittingMinimizers.html

Should this list the convergence parameters (e.g. MaxError) that some minimisers take and under what conditions I might want to change them from the defaults?

It also says that some minimisers, such as Levenberg-Marquardt and BFGS, use second derivatives of the function. Should there therefore be a way for a fit function to return these higher derivatives, where they’re straightforward to derive analytically, as it already can for the first-order ones?

Minor problem while writing that script: EvaluateFunction seems to require that the OutputWorkspace property is set, even when I’m assigning its return value (that workspace reference) to a Python variable for use as input to other algorithms later, and don’t care whether it ends up in the ADS or not. This is unlike most other algorithms.

FitConvergenceComparison.py (2.6 KB)