I am trying to get the image size (dimensions, width and height) of hundreds of remote images, and getimagesize is way too slow.
I have done some reading and found that the quickest way would be to use file_get_contents to read a certain number of bytes from each image and examine the size within the binary data (sketched below). Has anyone attempted this before? How would I examine the different formats? Has anyone seen a library for this?
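
A minimal sketch of that file_get_contents approach, for reference. The fifth argument ($maxlen, PHP 5.1+) caps how many bytes are read, and getimagesizefromstring (PHP 5.4+) parses only the header, so the truncated tail does not matter. partial_fetch and the URL are placeholders, not a tested implementation:

// Sketch only: stop reading after $maxlen bytes. The fourth argument
// ($offset) must stay 0 for remote streams, which cannot seek.
function partial_fetch($url, $maxlen = 32768) {
    return file_get_contents($url, false, null, 0, $maxlen);
}

$raw = partial_fetch("http://www.example.com/image.jpg"); // placeholder URL
if ($raw !== false) {
    $info = getimagesizefromstring($raw); // PHP >= 5.4
    if ($info !== false) {
        echo $info[0] . " x " . $info[1]; // width x height
    }
}
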
Here is what worked for me, using cURL with a Range header to fetch only the first 32 KB:

function ranger($url) {
    // Ask the server for just the first 32 KB; servers that honor
    // Range requests reply with 206 Partial Content.
    $headers = array(
        "Range: bytes=0-32768"
    );

    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);

    $data = curl_exec($curl);
    curl_close($curl);

    return $data;
}
$start = microtime(true);

$url = "http://news.softpedia.com/images/news2/Debian-Turns-15-2.jpeg";

// Decode the partial download; GD may warn about the truncated file,
// but the dimensions come from the header, which is inside the 32 KB.
$raw = ranger($url);
$im = imagecreatefromstring($raw);

$width = imagesx($im);
$height = imagesy($im);

$stop = round(microtime(true) - $start, 5);
echo $width . " x " . $height . " ({$stop}s)";
Test output:

640 x 480 (0.20859s)

Loading 32 KB of data worked for me.
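
As for examining the different formats by hand: each format stores its dimensions at a fixed (or findable) offset near the start of the file, so a few KB from a Range request is plenty. Below is a hedged sketch following the published PNG, GIF, and JPEG header layouts; dimensions_from_bytes is a made-up helper name and the error handling is deliberately minimal:

// Return array($width, $height) or false, reading only header bytes.
function dimensions_from_bytes($data) {
    // PNG: 8-byte signature, 4-byte chunk length, 4-byte "IHDR", then
    // width and height as big-endian 32-bit integers at offset 16.
    if (substr($data, 0, 8) === "\x89PNG\r\n\x1a\n") {
        $d = unpack("Nwidth/Nheight", substr($data, 16, 8));
        return array($d["width"], $d["height"]);
    }
    // GIF: "GIF87a" or "GIF89a", then width and height as
    // little-endian 16-bit integers at offset 6.
    if (substr($data, 0, 3) === "GIF") {
        $d = unpack("vwidth/vheight", substr($data, 6, 4));
        return array($d["width"], $d["height"]);
    }
    // JPEG: walk the marker segments until a start-of-frame (SOFn)
    // marker, which stores height then width as big-endian 16-bit ints.
    if (substr($data, 0, 2) === "\xFF\xD8") {
        $pos = 2;
        $len = strlen($data);
        while ($pos + 9 < $len && $data[$pos] === "\xFF") {
            $marker = ord($data[$pos + 1]);
            if ($marker === 0xDA) {
                break; // start of scan reached without finding a SOF
            }
            // SOF0..SOF15, except DHT (C4), JPG (C8) and DAC (CC)
            if ($marker >= 0xC0 && $marker <= 0xCF
                    && $marker !== 0xC4 && $marker !== 0xC8
                    && $marker !== 0xCC) {
                $d = unpack("nheight/nwidth", substr($data, $pos + 5, 4));
                return array($d["width"], $d["height"]);
            }
            // Skip this segment: 2 marker bytes plus the 16-bit
            // big-endian length, which counts itself but not the marker.
            $seg = unpack("n", substr($data, $pos + 2, 2));
            $pos += 2 + $seg[1];
        }
    }
    return false;
}

// Reusing $raw from the test above; should yield the same 640 x 480.
print_r(dimensions_from_bytes($raw));

PNG and GIF need only the first couple dozen bytes; a JPEG's SOF marker comes after its metadata segments, which is why grabbing a generous chunk like the 32 KB above is the safe default.
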