PHP: scandir() is too slow

Emil Avramov · Mar 2, 2011 · Viewed 13.7k times

I have to make a function that lists all the subfolders within a folder. I have a filter that excludes files, but the function uses scandir() for the listing, and that makes the application very slow. Is there an alternative to scandir(), even a non-native PHP function? Thanks in advance!

Answer

RobertPitt · Mar 2, 2011

You can use readdir(), which may be faster. Something like this:

function readDirectory($Directory, $Recursive = true)
{
    if(is_dir($Directory) === false)
    {
        return false;
    }

    // opendir() does not throw exceptions; it returns false on failure.
    $Resource = opendir($Directory);

    if($Resource === false)
    {
        return false;
    }

    // Ensure a trailing slash so entry names can be appended safely.
    $Directory = rtrim($Directory, '/') . '/';
    $Found = array();

    while(false !== ($Item = readdir($Resource)))
    {
        // Skip the "." and ".." entries.
        if($Item == "." || $Item == "..")
        {
            continue;
        }

        $Path = $Directory . $Item;

        if($Recursive === true && is_dir($Path))
        {
            // Merge the recursive results so a flat list of paths is returned.
            $Found = array_merge($Found, readDirectory($Path));
        }
        else
        {
            $Found[] = $Path;
        }
    }

    closedir($Resource);

    return $Found;
}
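For example, to pull back everything under /tmp (the same directory used in the snippet further down), you could call it like so:

print_r(readDirectory("/tmp"));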

It may require some tweaking, but this is essentially what scandir() does, and it should be faster. If not, please post an update, as I would like to see if I can come up with a faster solution.

Another issue is that if you are reading a very large directory, you are filling up an array in internal memory, and that may be where your memory is going.

You could try to create a function that reads in offsets, so that you can return 50 files at a time!

Reading chunks of files at a time would be just as simple to use; it would look like so (a sketch of the ReadFilesByOffset() helper follows below):

$offset = 0;
while(false !== ($Batch = ReadFilesByOffset("/tmp",$offset)))
{
    //Use $Batch here, which contains 50 or fewer files!

    //Increment the offset:
    $offset += 50;
}
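
The loop above assumes a ReadFilesByOffset() helper that the answer never defines. A minimal sketch of what it might look like, assuming it should return up to $Limit paths starting at $Offset and false once the directory has been exhausted:

function ReadFilesByOffset($Directory, $Offset, $Limit = 50)
{
    if(is_dir($Directory) === false)
    {
        return false;
    }

    $Resource = opendir($Directory);

    if($Resource === false)
    {
        return false;
    }

    $Directory = rtrim($Directory, '/') . '/';
    $Position = 0;
    $Batch = array();

    while(false !== ($Item = readdir($Resource)))
    {
        if($Item == "." || $Item == "..")
        {
            continue;
        }

        // Skip entries before the requested offset.
        if($Position++ < $Offset)
        {
            continue;
        }

        $Batch[] = $Directory . $Item;

        // Stop once we have collected a full batch.
        if(count($Batch) >= $Limit)
        {
            break;
        }
    }

    closedir($Resource);

    // Return false once the directory is exhausted so the calling loop stops.
    return count($Batch) > 0 ? $Batch : false;
}

Note that this sketch re-opens and re-walks the directory from the start on every call, so it trades a little speed for a bounded memory footprint; a fancier version could keep the directory handle open between calls instead.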