I have a computer vision algorithm I want to tune using scipy.optimize.minimize. Right now I only want to tune two parameters, but the number of parameters might eventually grow, so I would like to use a technique that can handle high-dimensional searches. The Nelder-Mead implementation in SciPy seemed like a good fit.
I got the code all set up, but it seems that the minimize function really wants to use floating-point values with a step size of less than one. The current parameters are both integers: one has a step size of one and the other has a step size of two (i.e. the value must be odd; if it isn't, the thing I am trying to optimize will convert it to an odd number). Roughly speaking, one parameter is a window size in pixels and the other is a threshold (a value from 0-255).
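To make the constraint concrete, the rounding that happens before my algorithm runs is roughly equivalent to the sketch below (snap_parameters, window and threshold are just stand-in names, not my real code):

def snap_parameters(parameters):
    # illustration only: round the optimizer's float guesses onto the grid my code accepts
    window = int(round(parameters[0]))
    if window % 2 == 0:
        window += 1                                        # this one must stay odd
    threshold = int(min(max(round(parameters[1]), 0), 255))  # clamp the threshold to 0-255
    return window, threshold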
For what it is worth, I am using a fresh build of scipy from the git repo. Does anyone know how to tell scipy to use a specific step size for each parameter? Is there some way I can roll my own gradient function? Is there a scipy flag that could help me out? I am aware that this could be done with a simple parameter sweep, but I would eventually like to apply this code to much larger sets of parameters.
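One idea I have been toying with is to rescale the parameters so that one unit in the optimizer's space equals the real step I want for each parameter, along the lines of this sketch (step and make_scaled_objective are made-up names, and I have no idea whether this is the intended way to do it):

import numpy as np

step = np.array([2.0, 1.0])   # real-world step I want per parameter

def make_scaled_objective(objective):
    # the optimizer works in unit-sized steps; the wrapper maps back to real values
    def scaled(unit_params):
        return objective(np.asarray(unit_params) * step)
    return scaled

# e.g. start the search at parameters / step and hand
# make_scaled_objective(doSingleIteration) to minimize() instead of doSingleIteration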
The code itself is dead simple:
import numpy as np
from scipy.optimize import minimize
from ScannerUtil import straightenImg
import bson
def doSingleIteration(parameters):
# do some machine vision magic
# return the difference between my value and the truth value
parameters = np.array([11,10])
res = minimize(doSingleIteration, parameters, method='Nelder-Mead',
               options={'xtol': 1e-2, 'ftol': 1.0, 'disp': True})  # not sure if these options do anything
print "~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~"
print res
This is what my output looks like. As you can see, the same rounded inputs get evaluated over and over and the minimization never gets anywhere.
[ 11. 10.] <-- parameters scipy passes in for this evaluation
{'block_size': 11, 'degree': 10} <-- input to my algorithm rounded and made int
120 <-- output of the function I am trying to minimize
[ 11.55 10. ]
{'block_size': 11, 'degree': 10}
120
[ 11. 10.5]
{'block_size': 11, 'degree': 10}
120
[ 11.55 9.5 ]
{'block_size': 11, 'degree': 9}
120
[ 11.1375 10.25 ]
{'block_size': 11, 'degree': 10}
120
[ 11.275 10. ]
{'block_size': 11, 'degree': 10}
120
[ 11. 10.25]
{'block_size': 11, 'degree': 10}
120
[ 11.275 9.75 ]
{'block_size': 11, 'degree': 9}
120
SNIP
[ 11. 10.0078125]
{'block_size': 11, 'degree': 10}
120
Optimization terminated successfully.
Current function value: 120.000000
Iterations: 7
Function evaluations: 27
status: 0
nfev: 27
success: True
fun: 120.0
x: array([ 11., 10.])
message: 'Optimization terminated successfully.'
nit: 7
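In case it is relevant, one thing I have been wondering about is whether I could force Nelder-Mead to take bigger initial steps. Recent SciPy releases document an initial_simplex option (and xatol/fatol tolerances) for Nelder-Mead; if my git build supports it, I imagine the call would look something like this sketch (untested):

x0 = np.array([11.0, 10.0])
simplex = np.array([x0,
                    x0 + [2.0, 0.0],   # nudge the odd-valued window size by its real step of 2
                    x0 + [0.0, 1.0]])  # nudge the threshold by 1
res = minimize(doSingleIteration, x0, method='Nelder-Mead',
               options={'initial_simplex': simplex, 'xatol': 0.9,
                        'fatol': 0.5, 'disp': True})

Would that be the right way to go about it, or is there a better-supported mechanism?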