I want to protect a PDF file from being directly linked, while still letting my logged-in users access it. I have a link which currently goes to a JavaScript function that posts a form:

$('nameofdoc').setProperty('value', doc);
document.getElementById('sendme').submit();

where sendme is the id of the form and nameofdoc is the index of the document I want to display.
This then goes to a PHP file:
$docpath = $holdingArray[0] . $holdingArray[1];
$file = $holdingArray[0]; // file name
$filename = $holdingArray[1]; // path to the file
header('Location: ' . $docpath);
header('Content-type: application/pdf');
header('Content-Disposition: attachment; filename="' . $filename . '"');
readfile($filename);
This all works fine: it loads the file and outputs the PDF. What I can't do is protect the directory from direct linking, i.e. www.mydomain.com/pathToPdf/pdfname.pdf.
I've thought of using .htaccess to protect the directory, but it's on a shared host so I'm not sure about the security, and in any case when I've tried I couldn't get it to work.
Any help would be great since this is my fourth day of trying to fix this.
thanks
Update
I've had a lot of help, thank you, but I'm not quite there yet.
I've got an .htaccess file that now launches another PHP file when a PDF is requested from the directory:
RewriteEngine on
RewriteRule ^(.*)\.(pdf)$ fileopen.php
When fileopen.php launches, it fails to open the PDF:
$path = $_SERVER['REQUEST_URI'];
$paths = explode('/', $path);
$lastIndex = count($paths) - 1;
$fileName = $paths[$lastIndex];
$file = basename($path);
$filepath = $path;
if (file_exists($file)) {
    header('Location: http://www.mydomain.com' . $path);
    header("Content-type: application/pdf");
    header("Content-Disposition: attachment; filename=" . $file);
    readfile($filepath);
} else {
    echo "file not found using path " . $path . " and file is " . $file;
}
The output is: file not found using path /documents/6/Doc1.pdf and file is Doc1.pdf
but the file does exist and is in that directory. Any ideas??
OKAY, I'm happy to report that Jaroslav really helped me sort out the issue. His method works well, but it is tricky to get all the directory stuff lined up. In the end I spent a few hours playing about with combinations to get it working, but the principle he gave works well. Thanks
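In case it helps anyone else, the piece I kept getting wrong was turning the requested URI into a real filesystem path before checking for the file. What I ended up with looks roughly like this (the session check and directory layout are just placeholders from my setup, so adjust them to yours):

// fileopen.php - rough sketch of what finally worked for me
session_start();

// Whatever login check you use; I'm assuming a session flag set at login
if (empty($_SESSION['loggedIn'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}

// e.g. /documents/6/Doc1.pdf (in real use, strip any query string
// and reject ../ sequences before trusting this value)
$requestPath = $_SERVER['REQUEST_URI'];

// Map the URI onto the real filesystem path - this was the tricky part,
// since file_exists() on the bare name only checks the script's own directory
$filePath = $_SERVER['DOCUMENT_ROOT'] . $requestPath;
$fileName = basename($filePath);

if (file_exists($filePath)) {
    header('Content-Type: application/pdf');
    header('Content-Disposition: attachment; filename="' . $fileName . '"');
    header('Content-Length: ' . filesize($filePath));
    readfile($filePath);
    exit;
} else {
    echo 'file not found using path ' . htmlspecialchars($filePath);
}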
The best way would be to protect that folder with .htaccess, as you have mentioned. So you put all PDFs in the pdf/ folder, and in the same pdf/ folder you put an .htaccess file:
RewriteEngine on
RewriteRule .* your-php-script.php
Now no file in this folder can be accessed by URL. Every request for any file in this folder will return whatever your-php-script.php returns. In your-php-script.php you do something like this:
// Check if the user has the right to access the file. If not, show "access denied" and exit the script.
$path = $_SERVER['REQUEST_URI'];
$paths = explode('/', $path);
$lastIndex = count($paths) - 1;
$fileName = $paths[$lastIndex]; // Maybe add some code to detect subfolders if you have them
// Check if the file exists; if not, show some error message
// Output headers here
readfile($fileName);
Now if a user opens domain.com/pdf/nsa-secrets.pdf, Apache will run your-php-script.php. The script will have $_SERVER['REQUEST_URI'] set to "/pdf/nsa-secrets.pdf". You take the last part (the file name) and output that file to the user (or not).
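Put together, the whole script might look something like this. It is only a sketch: the session check, the folder layout, and the variable names are assumptions you will need to adapt to your application:

<?php
// your-php-script.php - minimal sketch, assuming it sits in the protected pdf/ folder
session_start();

// 1. Check if the user has the right to access the file
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}

// 2. Take the last part of the request path as the file name;
//    basename() also strips any ../ so the user cannot escape the folder
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$fileName = basename($path);

// 3. Build the real path to the file inside this folder
$filePath = __DIR__ . '/' . $fileName;

if (!file_exists($filePath)) {
    header('HTTP/1.1 404 Not Found');
    exit('File not found');
}

// 4. Output headers and the file itself
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $fileName . '"');
header('Content-Length: ' . filesize($filePath));
readfile($filePath);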
This will stop anyone on the internet from accessing the files directly by knowing the URL. If someone has direct access to the files on your server, it will not stop them. On the other hand, I think any shared hosting stops users from getting at other clients' files; the only way to do that would be to hack the server in some way. But then we are getting very paranoid, and if that may be the case for you, you shouldn't be using shared hosting in the first place.
If you cannot make .htaccess work, you can try to obfuscate the files so that they would be difficult for someone outside to spot. For example, change the file name from mySecretData.pdf to djjsdmdkjeksm.pdf. This may help a little bit.
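For example, when saving an upload you could generate such a name with something like this (illustration only; random_bytes() needs PHP 7+, use another random source on older versions):

// Give an uploaded PDF a random, hard-to-guess name
$obfuscatedName = bin2hex(random_bytes(16)) . '.pdf'; // e.g. 3f9a...c2.pdf
move_uploaded_file($_FILES['pdf']['tmp_name'], 'pdf/' . $obfuscatedName);
// Remember the mapping mySecretData.pdf -> $obfuscatedName in your database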