For a while now, I have been experiencing an extremely odd problem when trying to write memory to a file buffer in C++. The problem only occurs on MinGW; when I compile with GCC on Linux, everything is fine.
(Screenshot: debugging session displaying the problem)
So basically, I'm writing data from a memory buffer to a file buffer, and the binary representation in the file ends up being different from the memory I wrote. No, the file is not being modified at a later point; I made sure of this by using the debugger to exit the program right after closing the file. I have no idea how something like this is even possible. I even ran valgrind to check for memory allocation problems, but it found none.
I'll paste some of the related code.
/// a struct holding information about a data file
class ResourceFile {
public:
    string name;
    uint32 size;
    char* data;

    ResourceFile(string name, uint32 size);
};

ResourceFile::ResourceFile(string name, uint32 size)
    : name(name), size(size)
{
    // will be free'd in ResourceBuilder's destructor
    data = (char*) malloc(size * sizeof(char));
}
/// Build a data resource from a set of files
class ResourceBuilder {
public:
    ofstream out;               ///< File to put the resource into
    vector<ResourceFile> input; ///< List of input files

    /// Add a file from disk to the resource
    void add_file(string filename);
    /// Create a file that the resource will be written to
    void create_file(string filename);
    ~ResourceBuilder();
};
void ResourceBuilder::create_file(string filename) {
    // open the specified file for output
    out.open(filename.c_str());

    uint16 number_files = htons(input.size());
    out.write((char*) &number_files, sizeof(uint16));

    foreach(vector<ResourceFile>, input, i) {
        ResourceFile& df = *i;
        uint16 name_size = i->name.size();
        uint16 name_size_network = htons(name_size);
        out.write((char*) &name_size_network, sizeof(uint16));
        out.write(i->name.c_str(), name_size);
        uint32 size_network = htonl(i->size);
        out.write((char*) &size_network, sizeof(i->size));
        out.write(i->data, i->size);
    }

    out.close();
    /// \todo write the CRC
}
The following is how the memory is allocated in the first place. This is a possible source of error, because I copy-pasted it from somewhere else without bothering to understand it in detail, but I honestly don't see how the way I allocate memory could make the file buffer's output differ from the memory I'm writing.
void ResourceBuilder::add_file(string filename) {
    // loads a file and copies its contents into memory;
    // this is done by the ResourceFile class, and there is a
    // small problem with this, namely that the memory is
    // allocated in the ResourceFile directly
    ifstream file;
    file.open(filename.c_str());

    filebuf* pbuf = file.rdbuf();
    int size = pbuf->pubseekoff(0, ios::end, ios::in);
    pbuf->pubseekpos(0, ios::in);

    ResourceFile df(filename, size);
    pbuf->sgetn(df.data, size);
    file.close();

    input.push_back(df);
}
I'm really out of ideas. It's also not a bug in my particular compiler setup, since other people compiling the code under MinGW get the same error. The only explanation I can think of at this point is a bug in MinGW's file buffer implementation itself, but I honestly have no idea.
You need to open the file in binary mode. When you open it in text mode on Windows, line feeds (0x0A) will get converted to CR/LF pairs (0x0D, 0x0A). On Linux you don't see this because Linux uses a single LF (0x0A) as the line terminator in text files, so no conversion is done.
Pass the ios::binary flag to the ostream::open method:
out.open(filename.c_str(), ios::out | ios::binary);
You should also use this flag when reading binary files, so that the opposite conversion isn't performed:
ifstream file;
file.open(filename.c_str(), ios::in | ios::binary);
The problem is that you open your file in text mode! In that case, 0x0a (line feed) is converted to 0x0d 0x0a (carriage return, line feed). This is why the file differs from the memory you wrote.
Use out.open(filename.c_str(), ios::binary | ios::out); instead.