Why can't I reserve 1,000,000,000 in my vector?

https://www.devze.com 2022-12-29 11:06 (source: web)
When I type in the following code, I get the output 1073741823:

#include <iostream>
#include <vector>
using namespace std;
int main()
{
  vector<int> v;
  cout << v.max_size();
  return 0;
}

However, when I try to resize the vector to 1,000,000,000 with v.resize(1000000000);, the program stops executing. How can I get the program to allocate the required memory, when max_size() suggests it should be able to?

I am using MinGW on Windows 7 with 2 GB of RAM. Should it not be possible? If it is not possible, can't I just declare it as an array of integers instead? But even that doesn't work.

Another thing: suppose I were to use a file instead (which can easily hold that much data). How can I read and write it at the same time? Using fstream file("file.txt", ios::out | ios::in); doesn't create a file in the first place. But supposing the file exists, I am unable to read and write simultaneously. What I mean is this: let the contents of the file be 111111. Then if I run:

#include <fstream>
#include <iostream>
using namespace std;
int main()
{
  fstream file("file.txt", ios::in | ios::out);
  char x;
  while (file >> x)
  {
    file << '0';
  }
  return 0;
}

Shouldn't the file's contents now be 101010? Read one character, then overwrite the next one with 0? Or, in case the entire contents were read at once into some buffer, should there not be at least one 0 in the file, i.e. 1111110? But the contents remain unaltered. Please explain. Thank you.


  1. A 32-bit process can only address a 4 GB address space at any one time, and much of that 4 GB is already used to map other things. Your vector would need too much contiguous address space (4 billion bytes), which is unlikely to be available.

  2. You should memory map the file. See mmap.


The maximum the STL implementation will cope with is one thing; the maximum amount of memory available from the OS is another; you are hitting the latter.

You might, for example, be able to create a vector of that many char elements. Either way, don't expect blistering performance unless you physically have that much memory (plus whatever the OS and everything else running needs); accessing such a vector will no doubt cause heavy disk thrashing as the system pages memory in and out from disk.

Also, a processor with a 32-bit address space (or one running a 32-bit OS, regardless of physical address space) can only address 4 GB (physical or virtual), so there is an architectural limit. Moreover, some OSes limit the user space; for example, the user space in Win32 is fixed at 2 GB. Various versions of Win64 artificially limit the user space so that Microsoft can charge different prices, so using Win64 is no guarantee of sufficient address space.


An integer is four bytes, so 1,000,000,000 integers take up about 3.72 GiB.


You are asking to allocate one billion integers in one contiguous sequence. Apart from the difficulty of finding such a huge contiguous block of address space, you simply don't have that much memory at all. Recall that an int on a common 32-bit system occupies 32 bits, or 4 bytes. Multiply that by one billion and you go far beyond the 2 GB you have. In addition, a std::vector is allowed to reserve more memory than you ask for.

As for your second question: if you both read and write with the same fstream object, make sure you call seekg() and seekp() when switching between reads and writes.


However when I try to resize the vector to 1,000,000,000, by v.resize(1000000000); the program stops executing. How can I enable the program to allocate the required memory, when it seems that it should be able to?

It can depend on the C++ standard library implementation, but resizing may cause the application to reserve considerably more memory than you ask for.
