I am trying to read a large file (10 MB) with PHP's file_get_contents:
$file = 'http://www.remoteserver.com/test.txt';
$data = file_get_contents( $file );
var_dump ( $data );
It dumps back
string(32720)
followed by only part of the file's contents. Is there a limit somewhere on file_get_contents? I tried ini_set('memory_limit', '512M'), but that did not work.
EDIT: I forgot to mention that it's a remote file.
PROBLEM RESOLVED: out of HDD space. Fixed that and now everything works.
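For anyone hitting a similar truncation with a remote file, one quick check is to compare how many bytes actually arrived against the Content-Length the server reported. This is only a sketch, and it assumes allow_url_fopen is enabled and the URL uses the HTTP wrapper, since PHP only populates $http_response_header in that case:

$file = 'http://www.remoteserver.com/test.txt';
$data = file_get_contents($file);

if ($data === false) {
    die('Download failed entirely.');
}

// Look for the Content-Length header in the response PHP recorded.
$expected = null;
foreach ($http_response_header as $header) {
    if (stripos($header, 'Content-Length:') === 0) {
        $expected = (int) trim(substr($header, strlen('Content-Length:')));
    }
}

printf("received %d bytes, server reported %s\n",
    strlen($data),
    $expected === null ? 'no Content-Length' : $expected . ' bytes');

If the two numbers differ, the problem is in the transfer or in what you do with the data afterwards (as it turned out here), not in a size limit of file_get_contents itself.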
Assuming the contents of the file you want to load are logically separated by line breaks (e.g. not a binary file), you might be better off reading it line by line:
$fp = fopen($path_to_file, "r");
$fileLines = array();
while (!feof($fp)) {
    // fgets() reads up to and including the next newline
    array_push($fileLines, fgets($fp));
}
fclose($fp);
You could always implode() the array (with your choice of line break character) back into a single string if you really need the file in one "chunk".
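For example, assuming the $fileLines array built by the loop above. Note that fgets() keeps each line's trailing newline, so an empty glue string reconstructs the file as-is; only use "\n" as the glue if you trimmed the lines first:

// Rebuild the whole file as a single string from the collected lines.
$contents = implode('', $fileLines);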