In order to improve performance when reading from a file, I'm trying to read the entire content of a big (several MB) file into memory and then use an istringstream to access the information.
My question is, what is the best way to read this information and "import" it into the string stream? A problem with this approach (see below) is that when creating the string stream the buffer gets copied, and memory usage doubles.
#include <fstream>
#include <sstream>
#include <string>

using namespace std;

int main() {
    string sFilename = "myFile";   // placeholder path; in the real code this is set elsewhere

    ifstream is;
    is.open (sFilename.c_str(), ios::binary);

    // get length of file:
    is.seekg (0, std::ios::end);
    long length = is.tellg();
    is.seekg (0, std::ios::beg);

    // allocate memory:
    char *buffer = new char [length];

    // read data as a block:
    is.read (buffer, length);

    // create string stream of memory contents
    // NOTE: this ends up copying the buffer!!!
    // (string(buffer, length) is used so that embedded '\0' bytes and the
    //  missing terminator don't truncate the data)
    istringstream iss( string( buffer, length ) );

    // delete temporary buffer
    delete [] buffer;

    // close filestream
    is.close();

    /* ==================================
     * Use iss to access data
     */
}
std::ifstream has a method rdbuf(), which returns a pointer to a filebuf. You can then "push" this filebuf into your stringstream:
#include <fstream>
#include <sstream>

int main()
{
    std::ifstream file( "myFile" );

    if ( file )
    {
        std::stringstream buffer;
        buffer << file.rdbuf();
        file.close();

        // operations on the buffer...
    }
}
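Once the content is in the stringstream you can read from it just like you would read from the file itself. For illustration (this snippet is not part of the original answer, and the line-by-line printing is only an example), using the buffer looks like this:

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main()
{
    std::ifstream file( "myFile" );
    std::stringstream buffer;
    buffer << file.rdbuf();        // pull the whole file into memory

    // read the in-memory copy exactly like the original stream
    std::string line;
    while ( std::getline( buffer, line ) )
        std::cout << line << '\n';
}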
EDIT: As Martin York remarks in the comments, this might not be the fastest solution, since the stringstream's operator<< will read the filebuf character by character. You might want to check his answer, where he uses the ifstream's read method as you used to do, and then sets the stringstream buffer to point to the previously allocated memory.
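For reference, here is a minimal sketch of that no-copy idea: read the file with ifstream::read into a buffer you own, then wrap that buffer in a small custom streambuf so a std::istream can read from it directly, with no copy into a string. The membuf class, the file name and the omitted error handling are illustrative assumptions, not code taken from Martin York's answer (which may set the stream buffer by a different mechanism):

#include <fstream>
#include <istream>
#include <streambuf>
#include <vector>

// Tiny read-only streambuf that exposes an existing char range as a stream.
struct membuf : std::streambuf
{
    membuf( char* begin, char* end ) { setg( begin, begin, end ); }
};

int main()
{
    std::ifstream is( "myFile", std::ios::binary );

    // get length of file
    is.seekg( 0, std::ios::end );
    std::streamsize length = is.tellg();
    is.seekg( 0, std::ios::beg );

    // read the data as a single block
    std::vector<char> buffer( static_cast<std::size_t>( length ) );
    is.read( buffer.data(), length );

    // wrap the existing memory -- no second copy is made
    membuf sbuf( buffer.data(), buffer.data() + buffer.size() );
    std::istream in( &sbuf );

    // use 'in' like any other input stream...
}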