I'm writing a random-access container in C++. In my code I use this (well, in my real code I use all kinds of Allocator typedefs, this is just easier to understand):
template<typename T, typename Allocator = std::allocator<T> >
class Carray {
public:
// ...
typedef T* iterator;
typedef const T* const_iterator;
// ...
};
But I could also create a separate iterator class derived from std::iterator. This would add support for the nested typedefs (it::iterator_category, it::difference_type, etc.).
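As a rough sketch of what that alternative might look like, here is a thin class that wraps the raw pointer and derives from std::iterator to pick up the nested typedefs. The name Carray_iterator and the minimal set of operators are illustrative, not from the original code (a full random-access iterator needs a few more operators than shown here):

```cpp
#include <cstddef>
#include <iterator>

// Illustrative sketch: a thin wrapper around T* deriving from std::iterator
// so the nested typedefs (iterator_category, value_type, difference_type, ...)
// come for free. Note: std::iterator itself is deprecated since C++17.
template <typename T>
class Carray_iterator
    : public std::iterator<std::random_access_iterator_tag, T> {
    T* p_;
public:
    explicit Carray_iterator(T* p = 0) : p_(p) {}
    T& operator*() const { return *p_; }
    Carray_iterator& operator++() { ++p_; return *this; }
    Carray_iterator& operator--() { --p_; return *this; }
    Carray_iterator operator+(std::ptrdiff_t n) const { return Carray_iterator(p_ + n); }
    std::ptrdiff_t operator-(const Carray_iterator& o) const { return p_ - o.p_; }
    bool operator==(const Carray_iterator& o) const { return p_ == o.p_; }
    bool operator!=(const Carray_iterator& o) const { return p_ != o.p_; }
};
```

Since every member just forwards to the underlying pointer operation, an optimizing compiler will typically inline all of it away.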
Now my question: is there any overhead in using an iterator class instead of a raw pointer? If so, how substantial is this overhead, and is it severe enough to avoid using an iterator class?
You have the iterator category, difference type, etc. available to you even if you have a raw pointer. You see, there is this iterator_traits<> template which you should use. It is already specialized for pointers.
iterator_traits<int*>::value_type // ... etc.
// or
iterator_traits<my_custom_iterator>::value_type
If your iterator class simply wraps the pointer, there is almost certainly no overhead.
It's perfectly standard-conforming to use raw pointers as the iterators. However, some badly written code (including, as you suggest, code that tries to use nested typedefs directly instead of iterator_traits) may fail to compile. Some of the early standard libraries started with pointers for vector's iterators and changed, purely to keep such bad code working. That's really the only reason I'd think of for bothering.
BTW - if possible I'd make use of the Boost iterator support rather than deriving directly from std::iterator; a lot of subtle requirements are taken care of for you that way.