I'm looping over a large file that I know the length of, but am processing lazily since it's too large to fit in memory. I'd like to be able to use tqdm to keep track of my progress through the file, but since it can't get the total number of examples out of the generator I'm using, the only thing it shows is the estimated iterations/second. Is there any way to tell tqdm how many elements it's going to be looping over total so I can get some of the other statistics?
You can pass the expected number of iterations to tqdm's total argument, and it will show the progress bar, percentage, and ETA as if it knew the length of the iterable.
Example:
from tqdm import tqdm
length = 1000000
generator = (3 * n for n in range(length)) # just doing something random
for n in tqdm(generator, total=length):
    pass  # process each element here
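If what you actually know is the number of lines in the file (or can count it cheaply in a first pass), you can feed that count to total while still iterating lazily. A minimal sketch, assuming a hypothetical file path data.txt and line-based processing:

import os
from tqdm import tqdm

path = "data.txt"  # hypothetical path to your large file

# Count lines once; this streams the file and is usually cheap
# compared to the real processing work.
with open(path) as f:
    total_lines = sum(1 for _ in f)

def lazy_records(path):
    # Lazily yield one record per line so the whole file
    # never has to fit in memory.
    with open(path) as f:
        for line in f:
            yield line.rstrip("\n")

for record in tqdm(lazy_records(path), total=total_lines):
    pass  # process each record here

With total set, tqdm can display the bar, percentage, elapsed/remaining time, and rate rather than only iterations per second.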