PowerShell Snap-in

CloudBerry Explorer offers a PowerShell extension to manage file operations across Amazon Simple Storage Service (Amazon S3), Amazon Glacier, and the local file system. Windows PowerShell is a command-line shell that helps IT professionals control systems and accelerate automation. It includes a number of system administration utilities and improved navigation of common management data such as the registry, certificate store, and WMI.

What is good about PowerShell and the CloudBerry Explorer Snap-in?

The PowerShell Snap-in exposes the majority of Amazon S3 functionality, and you can combine CloudBerry Explorer commands with regular PowerShell commands. Because PowerShell operates on .NET objects, you are not limited to simple command syntax: you can write complex scripts with loops and conditions, and you can schedule periodic tasks such as data backup or cleanup.

This is an example of copying files from a local disk to an S3 bucket:

Example:

The file results.xls will be copied to the S3 bucket.

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls"

This can be scheduled to run every weekend to copy files into S3 storage (for backup purposes, for example).

Example:

This will copy all files and folders from c:\workdata\ to the S3 bucket "myBucket". A new directory named by date, such as 2008_11_01, will be created.

$new_folder_format = Get-Date -uformat "%Y_%m_%d"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket" | Add-CloudFolder $new_folder_format
$src = Get-CloudFilesystemConnection | Select-CloudFolder -path "c:\workdata\"
$src | Copy-CloudItem $destination -filter "*"
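For illustration, the "%Y_%m_%d" pattern used with Get-Date -uformat is the standard strftime date format; the same folder-name convention reproduced in Python:

```python
from datetime import date

# Same "%Y_%m_%d" pattern as Get-Date -uformat in the script above
folder_name = date(2008, 11, 1).strftime("%Y_%m_%d")
print(folder_name)  # 2008_11_01
```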

COMMANDS

USING SSE-C IN AMAZON S3

You can use Server-Side Encryption with Customer-provided keys (SSE-C) when uploading files to Amazon S3 and manage S3 files that are already SSE-C encrypted.

There are two new parameters:

-DestinationSseCustomerKey (alias: -DstSSEKey) – defines an encryption key for a copy, move, or rename operation. This key is required if you want to encrypt files with SSE-C.

-SourceSseCustomerKey (alias: -SrcSSEKey) – defines an encryption key used to download files from Amazon S3 or edit the settings of files encrypted with SSE-C.

Note: for operations such as "local to S3" and "S3 to local" you need to specify only one key: -DstSSEKey for upload, -SrcSSEKey for download. For operations such as "S3 to S3" or a rename on S3 you can use two keys, and they can be different; this allows you to change the SSE-C key for already encrypted files.
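As background: an SSE-C key is a 256-bit AES key, and the commands here expect it base64-encoded. A minimal Python sketch of generating such a key (illustration only, not part of the snap-in):

```python
import base64
import os

key_bytes = os.urandom(32)  # 256-bit (32-byte) AES key for SSE-C
base64_key = base64.b64encode(key_bytes).decode("ascii")

# A 32-byte key always encodes to 44 base64 characters
print(len(base64_key))  # 44
```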

These parameters were added to the following commands:

Copy-CloudItem
Move-CloudItem
Rename-CloudItem
Set-CloudItemStorageClass (backward compatibility: Set-CloudStorageClass)
Add-CloudItemHeaders
Get-CloudItemHeaders

There is also a new command:

Set-CloudItemServerSideEncryption – allows you to set or change the SSE settings for an existing S3 file (e.g. set or reset SSE-C encryption, reset any SSE encryption, or switch SSE to SSE-C and vice versa)

Example: Upload to Amazon S3 with SSE-C

1. Generate a 256-bit encryption key (256 bits for AES-256). This example demonstrates key generation using the password-based key derivation function PBKDF2.

$iterations = 100000
$salt = [byte[]] (1,2,3,4,5,6,7,8)
# Single quotes keep PowerShell from expanding $Super9Password as a variable
$password = 'My$Super9Password'
$binaryKey = (New-Object System.Security.Cryptography.Rfc2898DeriveBytes([System.Text.Encoding]::UTF8.GetBytes($password), $salt, $iterations)).GetBytes(32)
$base64Key = [System.Convert]::ToBase64String($binaryKey)

IMPORTANT NOTE: $password is just an example value; make sure to use your own character sequence.
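As a cross-check only (the snap-in itself is not involved): .NET's Rfc2898DeriveBytes, as constructed above, is PBKDF2 with HMAC-SHA1, which Python's standard hashlib.pbkdf2_hmac reproduces with the same password, salt, and iteration count.

```python
import base64
import hashlib

iterations = 100000
salt = bytes([1, 2, 3, 4, 5, 6, 7, 8])
password = "My$Super9Password"  # example value only -- use your own

# PBKDF2-HMAC-SHA1, 32-byte output, matching Rfc2898DeriveBytes
binary_key = hashlib.pbkdf2_hmac("sha1", password.encode("utf-8"),
                                 salt, iterations, dklen=32)
base64_key = base64.b64encode(binary_key).decode("ascii")
print(len(binary_key))  # 32 bytes = 256 bits
```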

2. Copy data from a local disk to Amazon S3 with SSE-C using the generated key:

$source | Copy-CloudItem $dest -DstSSEkey $base64Key -filter *

where $source is a local folder and $dest is an Amazon S3 bucket (folder). For example:

$source = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Company\DailyReports"
$s3 = Get-CloudS3Connection -k yourAccessKey -s yourSecretKey
$dest = $s3 | Select-CloudFolder "mycompany/reports"

Example: Download SSE-C encrypted file from Amazon S3

$dest | Copy-CloudItem $source -SrcSSEKey $base64Key -filter "monthlyReport-Jul2014.docx"

To move files, just replace Copy-CloudItem with Move-CloudItem.

Example: Rename an existing SSE-C encrypted file, keeping encryption with the same key

$dest | Rename-CloudItem -name "monthlyReport-Jul2014.docx" -newname "monthlyReport-Aug2014.docx" -SrcSSEKey $base64Key -DstSSEKey $base64Key

Example: Copy an existing SSE-C encrypted file inside S3, keeping encryption with the same key

$dest | Copy-CloudItem $dest2 -filter "monthlyReport-Jul2014.docx" -SrcSSEKey $base64Key -DstSSEKey $base64Key

Example: Set or change SSE-C encryption for existing S3 file

Encrypt a non-encrypted S3 file with SSE-C

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-May2014.docx" -DstSSEKey $base64Key

Encrypt a non-encrypted S3 file with SSE

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-Apr2014.docx" -SSE

Decrypt an SSE-C encrypted S3 file (i.e. reset SSE-C)

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-May2014.docx" -SrcSSEKey $base64Key

Reset SSE encryption for an S3 file

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-Apr2014.docx" -SSE:$false

Example: Change the storage class of an SSE-C encrypted file

$dest | Set-CloudItemStorageClass -filter "monthlyReport-May2014.docx" -SrcSSEKey $base64Key

UPLOAD TO AMAZON GLACIER

You can connect to your Amazon Glacier account, set connection options, upload files to Amazon Glacier, and set filters for the files to upload. You can also restore data from Amazon Glacier using PowerShell commands. Check out the examples below:

Example: Uploading to Amazon Glacier

# Add snap-in

add-pssnapin CloudBerryLab.Explorer.PSSnapIn

# Enable logging and specify path

Set-Logging -LogPath "C:\Users\user1\AppData\Local\CloudBerry S3 Explorer PRO\Logs\PowerShell.log" -LogLevel Info

# Create connection

$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]

# Set options

Set-CloudOption -GlacierRetrievalRateLimitType Specified
Set-CloudOption -GlacierChunkSizeMB 4
Set-CloudOption -GlacierParallelUpload 1
Set-CloudOption -GlacierPeakRetrievalRateLimit 23.5

# Select vault

$vault = $conn | Select-CloudFolder -Path "us-east-1/[YOUR VAULT]"

# Let's copy to vault

$destination = $vault

# Select source folder

$src = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Tmp"  # [YOUR SOURCE FOLDER PATH]

# Upload files to Glacier by filter

#$src | Copy-CloudItem $destination -filter "sample.txt"

# Upload all files to Glacier

$src | Copy-CloudItem $destination -filter "*"

# Delete vault

$conn | Remove-CloudBucket $vault

Example: Retrieving data from Amazon Glacier

# Add snap-in

add-pssnapin CloudBerryLab.Explorer.PSSnapIn

# Enable logging and specify path

Set-Logging -LogPath "C:\Users\user1\AppData\Local\CloudBerry S3 Explorer PRO\Logs\PowerShell.log" -LogLevel Info

# Create connection

$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]

# Get existing vault

$vault = $conn | Select-CloudFolder -Path "us-east-1/[YOUR VAULT]"

# Get vault inventory.

# Note: this command may take up to 5 hours to execute if the inventory has not been prepared yet.

$invJob = $vault | Get-Inventory

# Now read vault archives

$archives = $vault | get-clouditem

# Select destination local folder.

$dst = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Tmp"  # [YOUR DESTINATION FOLDER PATH]

# Copy files from vault. Only files located in C:\Tmp folder are copied.

# Note: this command may take many hours to execute if the files have not been prepared for copying yet.

$vault | Copy-CloudItem $dst -filter "C:\Tmp\*.*"

ENABLING SERVER SIDE ENCRYPTION

SSE is enabled with the "-sse" switch. It is applicable to the Copy-CloudItem and Copy-CloudSyncFolders commands when uploading to Amazon S3.

Example: Enabling SSE for Copy-CloudItem:

$source | Copy-CloudItem $dest -Filter *.mov -sse

Example: Enabling SSE for Copy-CloudSyncFolders:

$src | Copy-CloudSyncFolders $destination -IncludeFiles "*.jpg" -sse

Example: Enabling SSL for a connection:

# Create connection with SSL

$s3 = Get-CloudS3Connection -UseSSL -Key $key -Secret $secret

Options supported for Copy-CloudSyncFolders:

-StorageClass defines the storage class for files (it can be rrs or standard)

-IncludeFiles allows you to specify certain files for sync using standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)

-ExcludeFiles allows you to exclude certain files from sync using standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)

-ExcludeFolders allows you to skip certain folders (for example: bin; *temp*; My*)
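These filter masks follow standard wildcard semantics: * matches any run of characters and ? matches exactly one character. For illustration only, Python's fnmatch module applies the same matching rules:

```python
from fnmatch import fnmatch

# '*' matches any run of characters, '?' exactly one character
print(fnmatch("report.txt", "*.t?t"))    # True
print(fnmatch("digest.doc", "d*t.doc"))  # True
print(fnmatch("report.text", "*.t?t"))   # False
```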

Example: Sync only JPG files and set the RRS storage class while syncing the files to S3 storage

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudSyncFolders $destination -IncludeFiles "*.jpg" -StorageClass rrs

Example: Sync an entire folder, excluding the \temp folder and .tmp files

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudSyncFolders $destination -IncludeSubfolders -ExcludeFiles "*.tmp" -ExcludeFolders "temp"

SETTING A STORAGE CLASS

You can set a storage class for a certain file or for a number of files:

Set-CloudStorageClass

Storage Class: rrs, standard

Example: Setting the RRS storage class for a specified item:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$bucket = $s3 | Select-CloudFolder -Path $bucketname
$item = $bucket | Get-CloudItem $itemname
$item | Set-CloudStorageClass -StorageClass rrs

Example: Setting the RRS storage class for all text files in a specified folder:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$bucket = $s3 | Select-CloudFolder -Path $bucketname
$folder = $bucket | Get-CloudItem $foldername
$folder | Set-CloudStorageClass -Filter *.txt -StorageClass rrs

Alternatively, you can set the storage class while copying files to S3 storage using the -StorageClass parameter of Copy-CloudItem.

Example: Setting the RRS storage class for a file while uploading it to S3 storage:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls" -StorageClass rrs

ADVANCED PARAMETERS FOR "Copy-CloudSyncFolders"

Copy-CloudSyncFolders supports advanced parameters:

-DeleteOnTarget deletes files from the target if they no longer exist on the source

-IncludeSubfolders includes subfolders in the synchronization

-CompareByContent uses an MD5 hash to compare the content of files (PRO only)

-MissingOnly copies only missing files, ignoring files that exist on both source and target
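To illustrate the idea behind -CompareByContent (a sketch of the general technique, not the snap-in's internals): two files are treated as identical when their content hashes match, regardless of timestamps.

```python
import hashlib

def md5_of(data: bytes) -> str:
    # Hex digest of the file content; equal digests imply equal content
    return hashlib.md5(data).hexdigest()

local = b"quarterly figures"
remote = b"quarterly figures"
stale = b"quarterly figures (old)"

print(md5_of(local) == md5_of(remote))  # True  -> considered identical, skipped
print(md5_of(local) == md5_of(stale))   # False -> file would be re-copied
```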

GENERATING WEB URLs

Using Get-CloudUrl, you can generate HTTP, HTTPS, or RTMP URLs, as well as HTML code for streaming video files.

Example: Generating short URLs for JPG files and saving the output to a file

$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -ChilpIt >> C:\urls.txt

Example: Generating signed URL

$dest | Get-CloudUrl -Filter *.jpg -Type HTTPS -Expire 01/01/2011 >> C:\urls.txt

Example: Generating CloudFront signed URL (where $domain is a CloudFront distribution domain name)

$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -Expire 01/01/2011 -DomainName $domain >> C:\urls.txt

Example: Generating a signed URL for a private content item (where $domain is a streaming distribution domain name)

$policy = New-CloudPolicy -PrivateKey $privatekey -KeyPairId $keypairid -IsCanned
$dest | Get-CloudUrl -Filter *.flv -Type RTMP -Policy $policy -Expire 01/01/2011 -DomainName $domain >> C:\urls.txt

SETTING CUSTOM CONTENT TYPES AND HTTP HEADERS

Example: Adding a new content type for .flv

Add-CloudContentType -Extension .flv -Type video/x-flv

Get-CloudContentTypes displays a list of predefined and custom content types.

Any file with the .flv extension uploaded to S3 will then have the proper content type: video/x-flv.

Example: Getting HTTP headers for an item ($s3 is an S3 connection)

$s3 | Select-CloudFolder myvideos | Get-CloudItem cats.flv | Get-CloudItemHeaders

Example: Setting HTTP headers to items

$headers = New-CloudHeaders Expires "Thu, 01 Apr 2010 12:00:00 GMT"
$s3 | Select-CloudFolder myvideos | Add-CloudItemHeaders -Filter *.flv -Headers $headers
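HTTP headers such as Expires take an RFC 1123 date ("Thu, 01 Apr 2010 12:00:00 GMT" style). For illustration, Python's standard library can produce one from a Unix timestamp (the timestamp below is an arbitrary example):

```python
from email.utils import formatdate

# Timestamp for 2010-04-01 12:00:00 UTC; usegmt=True emits "GMT" as the zone
expires = formatdate(1270123200, usegmt=True)
print(expires)  # Thu, 01 Apr 2010 12:00:00 GMT
```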

Example: Setting HTTP headers when copying or moving

$headers = New-CloudHeaders Cache-Control private
$source | Copy-CloudItem $dest -Filter *.mov -Headers $headers

RENAMING ITEMS

Example: Renaming the folder "favourites" to "thrillers", located in the bucket "myvideos"

$s3 | Select-CloudFolder myvideos | Rename-CloudItem -Name favourites -NewName thrillers

APPLY ACL FOR ALL SUBFOLDERS AND FILES

Example: Make all files inside "myvideos/thrillers" and its subfolders public-read

$s3 | Select-CloudFolder myvideos/thrillers | Add-CloudItemPermission -UserName "All Users" -Read -Descendants

SET LOGGING FOR POWERSHELL

Set-Logging -LogPath <path> -LogLevel <value>

Values: nolog, fatal, error, warning, info, debug

ADVANCED OPTIONS (PRO ONLY)

Set-CloudOption -ThreadCount <number>

Defines the number of threads for multithreaded uploading and downloading.

Set-CloudOption -UseCompression <value>

Defines whether to use compression.

Set-CloudOption -UseChunks <value> -ChunkSizeKB <sizeinKB>

Defines the chunk size in KB; files larger than one chunk will be uploaded in chunks.
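For illustration (a sketch of the general arithmetic, not the snap-in's internals): the number of chunks needed to upload a file is the file size divided by the chunk size, rounded up.

```python
def chunk_count(file_size_kb: int, chunk_size_kb: int) -> int:
    # Ceiling division: any remainder requires one extra chunk
    return -(-file_size_kb // chunk_size_kb)

# A 10 MB (10240 KB) file with 4096 KB chunks needs 3 chunks
print(chunk_count(10240, 4096))  # 3
```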