Overhead of using std::vector?

Vivek Ghaisas · Mar 8, 2013

I know that manual dynamic memory allocation is a bad idea in general, but is it sometimes a better solution than using, say, std::vector?

To give a crude example, suppose I had to store an array of n integers, where n <= 16. I could implement it using

int* data = new int[n]; //assuming n is set beforehand
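// (this version must later call delete[] data, and leaks if an exception is thrown first)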

or using a vector:

std::vector<int> data(n); // same n elements, managed automatically

Is it always better to use a std::vector, or are there practical situations where manually allocating the dynamic memory would be the more efficient choice?

Answer

us2012 · Mar 8, 2013

It is always better to use std::vector/std::array, at least until you can conclusively prove (through profiling) that the T* a = new T[100]; solution is considerably faster in your specific situation. This is unlikely to happen: std::vector and std::array are extremely thin layers around a plain old array. There is some overhead to bounds checking with vector::at, but you can circumvent that by using operator[].
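As a minimal sketch of that difference, assuming the n <= 16 bound from the question (the variable names here are illustrative, not from either post):

#include <array>
#include <iostream>
#include <vector>

int main() {
    const std::size_t n = 16;   // assumed upper bound from the question

    std::vector<int> v(n);      // one heap allocation, elements zero-initialized
    v[0]    = 42;               // unchecked access, same cost as indexing a raw array
    v.at(1) = 7;                // bounds-checked; throws std::out_of_range on a bad index

    std::array<int, 16> a{};    // no heap allocation at all: lives on the stack
    a[0] = v[0];                // usable whenever the bound is a compile-time constant

    std::cout << v[0] << ' ' << a[0] << '\n';
}

With optimizations on, the operator[] accesses typically compile down to the same instructions as the raw-pointer version; what remains of the vector's cost is its single heap allocation and the pointer/size bookkeeping it carries.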