Map causing Seg Fault. How to increase memory?

I have a simple question. I have a few files; one file is around 20,000 lines.

It has 5 fields, and I have some other ADTs (vectors and lists), but those do not cause a segfault. The map itself stores roughly one key/value pair per line. When I added the map to my code, I would instantly get a segfault; I copied 5,000 of the 20,000 lines and still got a segfault, then tried 1,000 and it worked.

In Java there is a way to increase the amount of virtually allocated memory; is there a way to do so in C++? I have even deleted elements as they are no longer used, and I can get through around 2,000 lines, but not more.

Here is gdb:

(gdb) exec-file readin
(gdb) run
Starting program: /x/x/x/readin readin

Program exited normally.

valgrind:

HEAP SUMMARY:
==7948==     in use at exit: 0 bytes in 0 blocks
==7948==   total heap usage: 20,206 allocs, 20,206 frees, 2,661,509 bytes allocated
==7948== 
==7948== All heap blocks were freed -- no leaks are possible

code:

....
Flow flw = endQueue.top();
stringstream str1;
stringstream str2;
if (flw.getSrc() < flw.getDest()){
  str1 << flw.getSrc();
  str2 << flw.getDest();
  flw_src_dest = str1.str() + "-" + str2.str();
} else {
  str1 << flw.getSrc();
  str2 << flw.getDest();
  flw_src_dest = str2.str() + "-" + str1.str();
}
while (int_start > flw.getEnd()){
  if(flw.getFlow() == 1){
    ava_bw[flw_src_dest] += 5.5;
  } else {
    ava_bw[flw_src_dest] += 2.5;
  }
  endQueue.pop();
}


A segmentation fault doesn't necessarily indicate that you're out of memory. In fact, with C++ that's highly unlikely: running out of heap usually shows up as a std::bad_alloc exception or similar, not a segfault (unless you're putting everything in objects with automatic storage duration, i.e. on the stack).

More likely, you have a memory corruption bug in your code that just happens to become noticeable only once you have more than a certain number of objects.

At any rate, the solution to memory faults is not to blindly throw more memory at the program.

Run your code through valgrind and through a debugger, and see what the real problem is.
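To illustrate the kind of bug that only shows up past a certain input size, here is a hypothetical sketch (not the asker's code): a fixed-size buffer that silently overflows once the input grows beyond its capacity.

#include <cstring>
#include <iostream>

int main() {
  const int LIMIT = 1000;           // fixed-size buffer chosen at compile time
  int counts[LIMIT];
  std::memset(counts, 0, sizeof(counts));

  int lines = 5000;                 // imagine one entry per input line
  for (int i = 0; i < lines; ++i) {
    counts[i] = i;                  // out-of-bounds write once i >= LIMIT:
  }                                 // corrupts nearby memory, may crash much later
  std::cout << counts[0] << '\n';
  return 0;
}

With 1,000 lines this appears to work; with 5,000 it scribbles over unrelated memory, which matches the symptom described in the question.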


Be careful erasing elements from a container while you are iterating over the container.

for (pos = ava_bw.begin(); pos != ava_bw.end(); ++pos) {
  if (pos->second == INIT){
    ava_bw.erase(pos);
  }
}

I believe erase(pos) invalidates pos, so the ++pos in the loop header is then advancing an invalid iterator; at best it skips an element, and if the erased element was the last one, advancing past ava_bw.end() will fail.

I know that if you tried this with a vector, pos would be invalidated as well.
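For reference, a minimal sketch of the usual erase-while-iterating pattern, assuming C++11 or later (where std::map::erase returns the iterator following the erased element) and assuming ava_bw is a std::map<std::string, double>, as the question's snippet suggests (INIT is passed in just to keep the sketch self-contained):

#include <map>
#include <string>

void prune(std::map<std::string, double>& ava_bw, double INIT) {
  // Drop entries still at their initial value without invalidating the iterator.
  for (auto pos = ava_bw.begin(); pos != ava_bw.end(); /* no ++ here */) {
    if (pos->second == INIT) {
      pos = ava_bw.erase(pos);   // C++11: returns the next valid iterator
    } else {
      ++pos;
    }
  }
}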

Edit

In the while loop you do

while (int_start > flw.getEnd()){
   if(flw.getFlow() == 1){
      ava_bw[flw_src_dest] += 5.5;
   } else {
      ava_bw[flw_src_dest] += 2.5;
   }
   endQueue.pop();
}

You need to do flw = endQueue.top() again inside the loop; otherwise flw (and the loop condition) never changes, so the loop keeps popping until endQueue is empty and then pops an empty queue, which is undefined behaviour.
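A sketch of what that could look like, reusing the identifiers from the question's snippet; the empty() check is an addition on my part, since top()/pop() on an empty queue is undefined:

while (!endQueue.empty() && int_start > endQueue.top().getEnd()) {
  Flow flw = endQueue.top();          // refresh flw on every iteration
  // note: flw_src_dest may also need to be recomputed for the new flw
  if (flw.getFlow() == 1) {
    ava_bw[flw_src_dest] += 5.5;
  } else {
    ava_bw[flw_src_dest] += 2.5;
  }
  endQueue.pop();
}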


Generally speaking, in C/C++ the maximum amount of available heap isn't fixed at the start of the program -- you can always allocate more memory, either directly with new/malloc or through STL containers such as std::list, which handle it themselves.
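For instance, a self-contained sketch showing a std::map growing well past the ~20,000 entries in the question with no tuning at all (the numbers are arbitrary):

#include <iostream>
#include <map>
#include <string>

int main() {
  std::map<std::string, double> m;
  for (int i = 0; i < 200000; ++i) {   // ten times the question's line count
    m[std::to_string(i)] = i * 0.5;    // the heap grows on demand, no flag needed
  }
  std::cout << m.size() << " entries\n";
  return 0;
}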


I don't think the problem is memory: C++ gives you as much memory as you ask for, up to hogging all the available memory on your PC. Check whether you delete something that you access later on.
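A minimal sketch of that kind of bug (hypothetical, not taken from the question): memory is freed, but a pointer into it is still used afterwards.

#include <iostream>
#include <map>
#include <string>

int main() {
  auto* ava = new std::map<std::string, double>{{"a-b", 2.5}};
  double* bw = &(*ava)["a-b"];   // pointer into the map's storage
  delete ava;                    // the map and its nodes are freed here
  std::cout << *bw << '\n';      // use-after-free: undefined behaviour, may segfault
  return 0;
}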
