Copy data from Amazon S3 to a local drive using AWS PowerShell

Rajan S · Feb 28, 2014 · Viewed 29k times

I am trying Get-S3Object -BucketName pilot, but the bucket has a lot of subdirectories.

Answer

Anthony Neace · Feb 28, 2014

You probably want to use Copy-S3Object instead, for the actual copying. Get-S3Object returns a collection of S3Object metadata. Use Get-S3Object to isolate the objects you want to copy, and then pass the important information from them (the Key) into Copy-S3Object.

You can use the KeyPrefix parameter to filter the results down to a specific subdirectory. For example:

$srcBucketName = "myBucket"
$objects = get-s3object -bucketname $srcBucketName -KeyPrefix "mySubdirectory"

If you need to copy data from all of your directories, you can break the overall operation into smaller chunks by calling get-s3object once per directory in your bucket.

If the individual directories are still too large, you can further chunk the operation with the MaxKeys parameter.
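A rough sketch of that chunked listing might look like the following. Note this is an assumption-heavy illustration: the exact paging parameter names (-MaxKey, -Marker) vary between AWSPowerShell versions, so verify them with Get-Help Get-S3Object before relying on this.

```powershell
# Hedged sketch: list a large prefix in batches rather than all at once.
# -MaxKey and -Marker are assumed parameter names; check Get-Help Get-S3Object.
$srcBucketName = "myBucket"
$marker = $null
do {
    $batch = Get-S3Object -BucketName $srcBucketName -KeyPrefix "mySubdirectory" `
                          -MaxKey 1000 -Marker $marker
    # ...process each batch of up to 1000 objects here...
    if ($batch) {
        # Continue listing from the last key we saw in this batch.
        $marker = ($batch | Select-Object -Last 1).Key
    }
} while ($batch.Count -eq 1000)
```

Processing in batches keeps memory usage bounded and lets you resume from the last marker if the copy is interrupted.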


Once you're ready to copy, you can do something like this:

$objects | 
% { copy-s3object -BucketName $srcBucketName -Key $_.Key -LocalFile "somePath" }

...where you've defined your local path. You can even match the subdirectories and keys to your local path by resolving the key in a string: "C:/someDir/$($_.Key)"
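Putting the listing and copying together, a minimal end-to-end sketch could look like this. The bucket name, prefix, and local root are placeholders, and whether Copy-S3Object creates intermediate local directories for you may depend on your module version; if it does not, create them first with New-Item.

```powershell
# Hedged sketch: copy every object under a prefix, mirroring the key
# structure under a local root. "myBucket", "mySubdirectory", and
# "C:/someDir" are placeholders for your own values.
$srcBucketName = "myBucket"
$localRoot     = "C:/someDir"

Get-S3Object -BucketName $srcBucketName -KeyPrefix "mySubdirectory" |
    ForEach-Object {
        # Resolve the S3 key into a path under the local root,
        # e.g. "mySubdirectory/file.txt" -> "C:/someDir/mySubdirectory/file.txt"
        $localPath = Join-Path $localRoot $_.Key
        Copy-S3Object -BucketName $srcBucketName -Key $_.Key -LocalFile $localPath
    }
```

This requires configured AWS credentials (for example via Set-AWSCredential) and read access to the bucket, so it is not runnable as-is without an AWS account.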