I am using boost::multi_index with a data type I'd like to index based on its size. However, the size() member function of this data type is expensive to execute. Does multi_index cache the values it gets from its key extractors?
For example, suppose I created a multi_index container with an ordered index keyed on a member function (element.size()), and then inserted an element whose size puts it somewhere in the middle of the container. Would the container re-invoke size() on every element it visits while traversing its internal data structure to find the right insertion point?
Well, the documentation for member-function key extractors says they invoke the referenced member function: http://www.boost.org/doc/libs/1_46_0/libs/multi_index/doc/reference/key_extraction.html#key_extractors
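In other words, const_mem_fun is conceptually just a functor that calls the member function every time the index needs a key. A rough sketch of what it amounts to (not Boost's actual implementation; the name naive_const_mem_fun is mine):

// Conceptual stand-in for bmi::const_mem_fun: the extractor simply calls the
// member function again on every access -- nothing is stored anywhere.
template <typename Class, typename Type, Type (Class::*PtrToMember)() const>
struct naive_const_mem_fun
{
    typedef Type result_type;          // key extractors must expose result_type

    Type operator()(const Class& x) const
    {
        return (x.*PtrToMember)();     // re-invoked on every comparison or lookup
    }
};

Nothing in that call path stores the result, so any caching would have to happen inside the index itself, which is what the test below checks.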
But when in doubt, profile:
#include <boost/multi_index_container.hpp>
#include <boost/multi_index/mem_fun.hpp>
#include <boost/multi_index/indexed_by.hpp>
#include <boost/multi_index/hashed_index.hpp>
#include <boost/multi_index/ordered_index.hpp>
#include <iostream>
#include <vector>
#include <algorithm>

using namespace std;
namespace bmi = boost::multi_index;

int g_calls = 0;  // counts every invocation of A::size()

struct A
{
    explicit A(int sz) : m_size(sz) { }
    int size() const { ++g_calls; return m_size; }
private:
    int m_size;
};

typedef boost::multi_index_container<
    A*,
    bmi::indexed_by<
        bmi::ordered_non_unique<
            BOOST_MULTI_INDEX_CONST_MEM_FUN(A,int,A::size)
        >
    >
> container_t;

int main()
{
    container_t cont;
    int n = 100;

    // 2n+1 distinct sizes, shuffled so insertions hit arbitrary positions
    vector<int> o(2*n+1);
    for( int i = 0; i != 2*n+1; ++i )
        o[i] = i;
    random_shuffle(o.begin(), o.end());

    // first batch of n insertions (elements are leaked; irrelevant for this test)
    for( int i = 0; i != n; ++i )
        cont.insert(new A(o[i]));
    cout << "Calls to A::size(): " << g_calls << endl;

    // second batch of n insertions, skipping o[n] for now
    for( int i = n+1; i <= 2*n; ++i )
        cont.insert(new A(o[i]));
    cout << "Calls to A::size(): " << g_calls << endl;

    // one final insertion
    cont.insert(new A(o[n]));
    cout << "Calls to A::size(): " << g_calls << endl;

    // lookups also go through the key extractor
    for( vector<int>::size_type i = 0; i != o.size(); ++i )
        cont.find(o[i]);
    cout << "Calls after calling find " << o.size() << " times: " << g_calls << endl;

    return 0;
}
This gives the following output (using Boost 1.46):
Calls to A::size(): 629
Calls to A::size(): 1465
Calls to A::size(): 1474
Calls after calling find 201 times: 3262
So it appears the answer is no: multi_index does not cache the extracted key values.
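If size() is genuinely expensive, the usual workaround is to compute it once and index the cached value, so the index only ever reads a plain data member. A minimal sketch, assuming the size can be computed at construction time (CachedA and m_cached_size are illustrative names, not anything from the code above):

#include <boost/multi_index_container.hpp>
#include <boost/multi_index/member.hpp>
#include <boost/multi_index/indexed_by.hpp>
#include <boost/multi_index/ordered_index.hpp>

namespace bmi = boost::multi_index;

// Hypothetical variant of A that computes its size once and stores it.
struct CachedA
{
    explicit CachedA(int sz) : m_cached_size(sz) { } // real code would call the expensive size() once here
    int m_cached_size;                               // the index reads this member directly
};

typedef boost::multi_index_container<
    CachedA*,
    bmi::indexed_by<
        bmi::ordered_non_unique<
            BOOST_MULTI_INDEX_MEMBER(CachedA, int, m_cached_size)
        >
    >
> cached_container_t;

The usual multi_index caveat applies: once an element is in the container, the cached member must not be changed behind the container's back (use modify()/replace() if it can change), otherwise the index goes out of sync.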