Laravel 5: How do you copy a local file to Amazon S3?

clone45 · Apr 9, 2015 · Viewed 20.8k times

I'm writing code in Laravel 5 to periodically backup a MySQL database. My code thus far looks like this:

    $filename = 'database_backup_'.date('G_a_m_d_y').'.sql';
    $destination = storage_path() . '/backups/';

    $database = \Config::get('database.connections.mysql.database');
    $username = \Config::get('database.connections.mysql.username');
    $password = \Config::get('database.connections.mysql.password');

    $sql = "mysqldump $database --password=$password --user=$username --single-transaction >$destination" . $filename;

    exec($sql, $output, $result); // TODO: check $result, the command's exit status

    // Copy database dump to S3

    $disk = \Storage::disk('s3');

    // ????????????????????????????????
    //  What goes here?
    // ????????????????????????????????

I've seen solutions online that suggest doing something like:

    $disk->put('my/bucket/' . $filename, file_get_contents($destination . $filename));

However, for large files, isn't it wasteful to read the whole file into memory with file_get_contents()? Is there a better solution?

Answer

theHarvester · Sep 13, 2017

There is a way to copy files without needing to load the file contents into memory: Flysystem's MountManager, which streams the data between disks.

You will need to import the following class:

    use League\Flysystem\MountManager;

Then you can copy the file from the local disk up to S3 like so:

    // Mount both disks under the 's3' and 'local' prefixes
    $mountManager = new MountManager([
        's3' => \Storage::disk('s3')->getDriver(),
        'local' => \Storage::disk('local')->getDriver(),
    ]);

    // copy() streams between the disks, so the file is never read fully into memory
    $mountManager->copy('local://path/to/file.txt', 's3://path/to/file.txt');
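Note that the local:// and s3:// prefixes are relative to each disk's configured root (storage/app for the default local disk), so the dump needs to be written inside that root, or the local disk configured to point at your backups directory.

If you'd rather not work with MountManager directly, later Laravel 5.x releases also let you pass a PHP stream resource straight to put(), which Laravel hands off to Flysystem's stream support. A minimal sketch, assuming the dump was written to storage/app/backups/ and that backups/ is the desired path on the S3 disk:

    // Open the local dump as a read stream rather than loading it into memory
    $stream = fopen(storage_path('app/backups/' . $filename), 'r');

    // Passing a resource makes Laravel stream the upload to S3
    \Storage::disk('s3')->put('backups/' . $filename, $stream);

    // Close the handle if Flysystem has not already done so
    if (is_resource($stream)) {
        fclose($stream);
    }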