Is std::ifstream significantly slower than FILE?

Jesse Beder · Jan 25, 2009 · Viewed 11.6k times

I've been informed that my library is slower than it should be, on the order of 30+ times too slow parsing a particular file (text file, size 326 kb). The user suggested that it may be that I'm using std::ifstream (presumably instead of FILE).

I'd rather not blindly rewrite, so I thought I'd check here first, since my guess is that the bottleneck is elsewhere. I'm reading character by character, so the only functions I'm using are get(), peek(), and tellg()/seekg().
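The access pattern looks roughly like this (a simplified sketch of the pattern, not the library's actual parsing code; the file name is a placeholder):

```cpp
#include <fstream>

int main() {
    std::ifstream in("input.txt");  // placeholder file name
    while (in.peek() != std::ifstream::traits_type::eof()) {
        std::streampos mark = in.tellg();           // remember the current position
        char c = static_cast<char>(in.get());       // consume one character
        if (c == '#') {                             // example: back up and stop on a marker
            in.seekg(mark);
            break;
        }
        // ... parse character c ...
    }
}
```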

Update:

I profiled, and got confusing output - gprof didn't appear to think that it took so long. I rewrote the program to read the entire file into a buffer first, and it sped up by about 100x. I think the problem may have been that tellg()/seekg() took a long time, but gprof may have been unable to see that for some reason. In any case, ifstream does not appear to buffer the entire file, even for this size.
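For reference, the reworked approach amounts to something like this (a minimal sketch; the file name and error handling are placeholders, not the actual library code):

```cpp
#include <fstream>
#include <iterator>
#include <string>
#include <iostream>

int main() {
    // Open in binary mode and slurp the whole file into one in-memory buffer.
    std::ifstream in("input.txt", std::ios::binary);  // placeholder file name
    if (!in) {
        std::cerr << "failed to open file\n";
        return 1;
    }
    std::string buffer((std::istreambuf_iterator<char>(in)),
                       std::istreambuf_iterator<char>());

    // All subsequent get/peek/seek-style work happens on `buffer` in memory,
    // so no further system calls are made while parsing.
    std::cout << "read " << buffer.size() << " bytes\n";
}
```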

Answer

jalf · Jan 25, 2009

I don't think that'd make a difference. Especially if you're reading char by char, the overhead of I/O is likely to completely dominate everything else. Why are you reading single bytes at a time? Do you know how extremely inefficient that is?

On a 326 kb file, the fastest solution will most likely be to just read the whole thing into memory at once.

The difference between std::ifstream and the C equivalents is basically a virtual function call or two. It may make a difference if executed a few tens of millions of times per second; otherwise, not really. File I/O is generally so slow that the API used to access it doesn't really matter. What matters far more is the read/write pattern. Lots of seeks are bad; sequential reads/writes are good.
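For comparison, a one-shot sequential read with the C stdio API looks something like this (a sketch with a placeholder file name); the per-call overhead difference between this and the iostream version is negligible next to the I/O itself:

```cpp
#include <cstdio>
#include <vector>

int main() {
    std::FILE* f = std::fopen("input.txt", "rb");  // placeholder file name
    if (!f) return 1;

    // Find the file size, then read it in a single sequential fread().
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);

    std::vector<char> buffer(size > 0 ? static_cast<std::size_t>(size) : 0);
    std::size_t got = std::fread(buffer.data(), 1, buffer.size(), f);
    std::fclose(f);

    // `got` bytes are now in memory; parse from the buffer from here on.
    (void)got;
}
```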