I'm bulk loading data into a Django model, and I've noticed that the number of objects loaded into memory before committing affects the average time to save each object. I realise this can be due to many different factors, so I'd rather focus on optimizing this STEPSIZE variable.
What would be a simple algorithm for optimizing this variable, in real time, taking into account that the optimum might also change during the process?
I imagine this would be some sort of gradient descent, with a bit of jitter to look for changes in the landscape? Is there a formally defined algorithm for this type of search?
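For concreteness, here's roughly the setup I mean. The names (`save_batch`, `cost_per_object`, `load_in_steps`) are placeholders of mine; `save_batch` stands in for whatever actually commits a chunk, e.g. `Model.objects.bulk_create`:

```python
import time

def cost_per_object(chunk, save_batch):
    """Average seconds spent per object when the whole chunk
    is saved with a single commit."""
    start = time.perf_counter()
    save_batch(chunk)  # e.g. Model.objects.bulk_create(chunk)
    return (time.perf_counter() - start) / len(chunk)

def load_in_steps(objects, save_batch, stepsize):
    """Save `objects` in chunks of `stepsize`, yielding each chunk's
    measured cost per object so STEPSIZE can be tuned live."""
    for i in range(0, len(objects), stepsize):
        chunk = objects[i:i + stepsize]
        yield chunk, cost_per_object(chunk, save_batch)
```

The per-chunk costs yielded here are the noisy function whose minimum the search below would track.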
I'd start out assuming that: 1) your function increases monotonically in both directions away from the optimum (i.e. it is unimodal), and 2) you roughly know the size of the region in which the optimum will live.
Then I'd recommend a bracket-and-subdivide approach, as follows. Evaluate your function outwards from the previous optimum in both directions, and stop the search in each direction when a value higher than the previous optimum is reached. Given the assumptions above, this gives you a bracketed interval in which the new optimum lives. Break this region into left and right halves by evaluating its midpoint, recurse into whichever half contains the lower values, and repeat until the region is as small as you like.
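A minimal integer-valued sketch of that search, under my own assumptions: `f` measures cost at a given step size and is unimodal, the outward expansion doubles its stride (my choice, not specified above), and the subdivide step uses two interior probes per iteration (classic ternary search) rather than a single midpoint, so the left-vs-right comparison is well-defined:

```python
def bracket(f, x0, lo=1, hi=1_000_000):
    """Expand outwards from the previous optimum x0, doubling the
    stride, until the cost rises above f(x0) on each side (or a
    bound is hit). The minimum must lie in the returned interval."""
    f0 = f(x0)
    left, step = x0, 1
    while left - step > lo and f(left - step) <= f0:
        left -= step
        step *= 2
    left = max(lo, left - step)    # first point seen to be worse, or the bound
    right, step = x0, 1
    while right + step < hi and f(right + step) <= f0:
        right += step
        step *= 2
    right = min(hi, right + step)
    return left, right

def subdivide(f, lo, hi):
    """Shrink [lo, hi] around the minimum of a unimodal f by
    discarding the worse-valued outer third each iteration."""
    while lo < hi:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        if f(m1) < f(m2):
            hi = m2 - 1   # minimum lies left of m2
        else:
            lo = m1 + 1   # minimum lies right of m1
    return lo
```

To track an optimum that drifts over time, you'd rerun `bracket` from the last result every so often; in practice you'd also want to average several noisy timings per probe of `f`.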