CloudBerry Explorer offers a PowerShell extension to manage file operations across Amazon Simple Storage Service (Amazon S3), Amazon Glacier, and the local file system. Windows PowerShell is a command-line shell that helps IT professionals control systems and accelerate automation. It includes a number of system administration utilities and improved navigation of common management data such as the registry, certificate store, and WMI.

What is good about PowerShell and the CloudBerry Explorer Snap-in?

The PowerShell Snap-in exposes the majority of Amazon S3 functionality, and you can combine CloudBerry Explorer commands with regular PowerShell commands. Because PowerShell operates on .NET objects, you are not limited to simple command syntax: you can write scripts with loops and conditions, and you can schedule periodic tasks such as data backup or cleanup.
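As a sketch of what such scripting can look like (the bucket, folder paths, and credential variables below are illustrative, and the snap-in is assumed to be loaded), the following loop backs up several local folders to one bucket, skipping any folder that does not exist:

```powershell
# Sketch: back up several local folders to one S3 bucket with a condition per folder.
# Assumes the CloudBerry snap-in is loaded and $key/$secret hold your credentials.
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/backups"   # hypothetical target
$local = Get-CloudFilesystemConnection

foreach ($path in @("c:\sales", "c:\reports", "c:\archive")) {
    if (Test-Path $path) {                      # plain PowerShell condition
        $src = $local | Select-CloudFolder -Path $path
        $src | Copy-CloudItem $destination -Filter "*"
    }
}
```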

Example: Copying files from a local disk to an S3 bucket

The file results.xls will be copied to the S3 bucket:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls"

This script can be scheduled to run every weekend to copy files to S3 storage (for backup purposes, for example).
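One way to schedule such a script (an assumption here; any scheduler works) is Windows Task Scheduler. Save the commands above to a script file, for example C:\Scripts\WeeklyBackup.ps1 (a hypothetical path), and register a weekly task:

```powershell
# Register a task that runs the backup script every Saturday at 01:00.
# The task name and script path are illustrative.
schtasks /Create /TN "S3WeeklyBackup" /SC WEEKLY /D SAT /ST 01:00 `
    /TR "powershell.exe -NoProfile -File C:\Scripts\WeeklyBackup.ps1"
```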

Example:

This will copy all files and folders from c:\workdata\ to the S3 bucket "myBucket". A new directory named with the current date, such as 2008_11_01, will be created.

$new_folder_format = Get-Date -uformat "%Y_%m_%d"
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket" | Add-CloudFolder $new_folder_format
$src = Get-CloudFilesystemConnection | Select-CloudFolder -path "c:\workdata\"
$src | Copy-CloudItem $destination -filter "*"

COMMANDS

USING SSE-C IN AMAZON S3

You can use Server-Side Encryption with Customer-provided keys (SSE-C) when uploading files to Amazon S3, and you can manage S3 files that are already SSE-C encrypted.

There are new parameters:

-DestinationSseCustomerKey (alias: -DstSSEKey) – defines an encryption key for a copy, move, or rename operation. This key is required if you want to encrypt files with SSE-C.

-SourceSseCustomerKey (alias: -SrcSSEKey) – defines an encryption key for downloading from Amazon S3, or for editing the settings of files encrypted with SSE-C.

Note: for "local to S3" and "S3 to local" operations you need to specify only one key: -DstSSEKey for upload, -SrcSSEKey for download. For "S3 to S3" operations or a rename on S3 you can use two keys, and they can be different; this allows you to change the SSE-C key for already encrypted files.

These parameters were added to the following commands:

Copy-CloudItem
Move-CloudItem
Rename-CloudItem
Set-CloudItemStorageClass (backward compatibility: Set-CloudStorageClass)
Add-CloudItemHeaders
Get-CloudItemHeaders

There is also a new command:

Set-CloudItemServerSideEncryption – allows you to set or change the SSE settings for an existing S3 file (e.g., set or reset SSE-C encryption, reset any SSE encryption, or switch from SSE to SSE-C and vice versa).

Example: Upload to Amazon S3 with SSE-C

1. Generate a 256-bit encryption key (AES-256 requires a 256-bit key). This example derives the key from a password using the PBKDF2 key derivation function:

$iterations = 100000
$salt = [byte[]] (1,2,3,4,5,6,7,8)
$password = 'My$Super9Password'
$binaryKey = (New-Object System.Security.Cryptography.Rfc2898DeriveBytes([System.Text.Encoding]::UTF8.GetBytes($password), $salt, $iterations)).GetBytes(32)
$base64Key = [System.Convert]::ToBase64String($binaryKey)

IMPORTANT NOTE: $password is just an example value. Make sure to use your own character sequence, and keep it in single quotes so PowerShell does not expand any $ characters inside it.
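If you do not need a password-derived key, a random 256-bit key can be generated directly with the standard .NET RandomNumberGenerator class (an alternative to PBKDF2, not something the snap-in requires). Store the Base64 string safely: if it is lost, SSE-C encrypted data cannot be read.

```powershell
# Generate 32 random bytes (256 bits) and encode them as Base64 for SSE-C.
$binaryKey = New-Object byte[] 32
$rng = [System.Security.Cryptography.RandomNumberGenerator]::Create()
$rng.GetBytes($binaryKey)
$base64Key = [System.Convert]::ToBase64String($binaryKey)
```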

2. Copy data from local to Amazon S3 with SSE-C using generated key:

$source | Copy-CloudItem $dest -DstSSEkey $base64Key -filter *

where $source is a local folder and $dest is an Amazon S3 bucket (folder). For example:

$source = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Company\DailyReports"
$s3 = Get-CloudS3Connection -k yourAccessKey -s yourSecretKey
$dest = $s3 | Select-CloudFolder -path "mycompany/reports"

Example: Download SSE-C encrypted file from Amazon S3

$dest | Copy-CloudItem $source -SrcSSEKey $base64Key -filter "monthlyReport-Jul2014.docx"

To move files, just replace Copy-CloudItem with Move-CloudItem.

Example: Rename an existing SSE-C encrypted file, keeping encryption with the same key

$dest | Rename-CloudItem -name "monthlyReport-Jul2014.docx" -newname "monthlyReport-Aug2014.docx" -SrcSSEKey $base64Key -DstSSEKey $base64Key

Example: Copy an existing SSE-C encrypted file within S3, keeping encryption with the same key

$dest | Copy-CloudItem $dest2 -filter "monthlyReport-Jul2014.docx" -SrcSSEkey $base64Key -DstSSEkey $base64Key

Example: Set or change SSE-C encryption for an existing S3 file

Encrypt a non-encrypted S3 file with SSE-C:

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-May2014.docx" -DstSSEkey $base64Key

Encrypt a non-encrypted S3 file with SSE:

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-Apr2014.docx" -SSE

Decrypt an SSE-C encrypted S3 file (i.e., reset SSE-C):

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-May2014.docx" -SrcSSEKey $base64Key

Reset SSE encryption for an S3 file:

$dest | Set-CloudItemServerSideEncryption -filter "monthlyReport-Apr2014.docx" -SSE:$false

Example: Change the storage class for an SSE-C encrypted file

$dest | Set-CloudItemStorageClass -filter "monthlyReport-May2014.docx" -SrcSSEKey $base64Key

UPLOAD TO AMAZON GLACIER

You can connect to your Amazon Glacier account, set connection options, upload files to Amazon Glacier, and set filters for the files to upload. You can also restore data from Amazon Glacier using PowerShell commands. Check out the examples below:

Example: Uploading to Amazon Glacier

# Add snap-in

add-pssnapin CloudBerryLab.Explorer.PSSnapIn

# Enable logging and specify path

Set-Logging -LogPath "C:\Users\user1\AppData\Local\CloudBerry S3 Explorer PRO\Logs\PowerShell.log" -LogLevel Info

# Create connection

$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]

# Set options

Set-CloudOption -GlacierRetrievalRateLimitType Specified
Set-CloudOption -GlacierChunkSizeMB 4
Set-CloudOption -GlacierParallelUpload 1
Set-CloudOption -GlacierPeakRetrievalRateLimit 23.5

# Select vault

$vault = $conn | Select-CloudFolder -Path "us-east-1/[YOUR VAULT]"

# Let's copy to vault

$destination = $vault

# Select source folder

$src = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Tmp[YOUR SOURCE FOLDER PATH]"

# Upload files to Glacier by filter

#$src | Copy-CloudItem $destination -filter "sample.txt"

# Upload all files to Glacier

$src | Copy-CloudItem $destination -filter "*"

# Delete vault

$conn | Remove-CloudBucket $vault

Example: Retrieving data from Amazon Glacier

# Add snap-in

add-pssnapin CloudBerryLab.Explorer.PSSnapIn

# Enable logging and specify path

Set-Logging -LogPath "C:\Users\user1\AppData\Local\CloudBerry S3 Explorer PRO\Logs\PowerShell.log" -LogLevel Info

# Create connection

$conn = Get-CloudGlacierConnection -Key [YOUR ACCESS KEY] -Secret [YOUR SECRET KEY]

# Get existing vault

$vault = $conn | Select-CloudFolder -Path "us-east-1/[YOUR VAULT]"

# Get vault inventory.

# Note: this command may take up to 5 hours to execute if the inventory has not been prepared yet.

$invJob = $vault | Get-Inventory

# Now read vault archives

$archives = $vault | get-clouditem

# Select destination local folder.

$dst = Get-CloudFilesystemConnection | Select-CloudFolder "C:\Tmp [YOUR DESTINATION FOLDER PATH]"

# Copy files from vault. Only files located in C:\Tmp folder are copied.

# Note: this command may take many hours to execute if the files have not been prepared for copying yet.

$vault | Copy-CloudItem $dst -filter "C:\Tmp\*.*"

ENABLING SERVER SIDE ENCRYPTION

SSE is enabled with the "-sse" switch. It applies to the Copy-CloudItem and Copy-CloudSyncFolders commands when uploading to Amazon S3.

Example: Enabling SSE for Copy-CloudItem:

$source | Copy-CloudItem $dest -Filter *.mov -sse

Example: Enabling SSE for Copy-CloudSyncFolders:

$src | Copy-CloudSyncFolders $destination -IncludeFiles "*.jpg" -sse

Example: Enable SSL for connection:

# Create connection with SSL

$s3 = Get-CloudS3Connection -UseSSL -Key $key -Secret $secret

Options supported for Copy-CloudSyncFolders:

-StorageClass defines the storage class for files (rrs or standard)

-IncludeFiles allows you to specify certain files for sync using standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)

-ExcludeFiles allows you to exclude certain files from sync using standard wildcards (for example: *.exe; *.dll; d*t.doc; *.t?t)

-ExcludeFolders allows you to skip certain folders (for example: bin; *temp*; My*)

Example: Sync only JPG files, setting the RRS storage class while syncing the files to S3 storage

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudSyncFolders $destination -IncludeFiles "*.jpg" -StorageClass rrs

Example: Sync an entire folder, excluding the \temp folder and .tmp files

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudSyncFolders $destination -IncludeSubfolders -ExcludeFiles "*.tmp" -ExcludeFolders "temp"

SETTING A STORAGE CLASS

You can set a storage class for a certain file or for a number of files:

Set-CloudStorageClass

Storage Class: rrs, standard

Example: Setting RRS storage class to specified item:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$bucket = $s3 | Select-CloudFolder -Path $bucketname
$item = $bucket | Get-CloudItem $itemname
$item | Set-CloudStorageClass -StorageClass rrs

Example: Setting RRS storage class to all text files in a specified folder:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$bucket = $s3 | Select-CloudFolder -Path $bucketname
$folder = $bucket | Get-CloudItem $foldername
$folder | Set-CloudStorageClass -Filter *.txt -StorageClass rrs

Alternatively, you can set the storage class while copying files to S3 storage using the -StorageClass parameter of Copy-CloudItem.

Example: Setting RRS storage class to a file while uploading it to the S3 storage:

$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls" -StorageClass rrs

ADVANCED PARAMETERS FOR "Copy-CloudSyncFolders"

Copy-CloudSyncFolders supports advanced parameters:

-DeleteOnTarget deletes files from the target if they no longer exist on the source

-IncludeSubfolders includes subfolders in the synchronization

-CompareByContent uses an MD5 hash to compare file content (PRO only)

-MissingOnly copies only missing files, ignoring files that exist on both source and target
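A combined sketch (bucket and folder names are illustrative) that mirrors a local folder to S3, including subfolders, removing files that were deleted locally, and comparing by content:

```powershell
# Mirror c:\sales to an S3 folder. Assumes the snap-in is loaded and
# $key/$secret hold your credentials; names are hypothetical.
$s3 = Get-CloudS3Connection -Key $key -Secret $secret
$destination = $s3 | Select-CloudFolder -Path "myBucket/mirror"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"

# Include subfolders, delete on target what no longer exists locally,
# and compare files by MD5 content rather than size/date (PRO only).
$src | Copy-CloudSyncFolders $destination -IncludeSubfolders -DeleteOnTarget -CompareByContent
```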

GENERATING WEB URLs

Using Get-CloudUrl you can generate HTTP, HTTPS or RTMP URLs and also HTML code for streaming video files.

Example: Generating short URLs for JPG files and saving the output to a file

$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -ChilpIt >> C:\urls.txt

Example: Generating signed URL

$dest | Get-CloudUrl -Filter *.jpg -Type HTTPS -Expire 01/01/2011 >> C:\urls.txt

Example: Generating CloudFront signed URL (where $domain is a CloudFront distribution domain name)

$dest | Get-CloudUrl -Filter *.jpg -Type HTTP -Expire 01/01/2011 -DomainName $domain >> C:\urls.txt

Example: Generate a signed URL for a private content item (where $domain is a streaming distribution domain name)

$policy = New-CloudPolicy -PrivateKey $privatekey -KeyPairId $keypairid -IsCanned
$dest | Get-CloudUrl -Filter *.flv -Type RTMP -Policy $policy -Expire 01/01/2011 -DomainName $domain >> C:\urls.txt

SETTING CUSTOM CONTENT TYPES AND HTTP HEADERS

Example: Adding a new content type for .flv

Add-CloudContentType -Extension .flv -Type video/x-flv
Get-CloudContentTypes - displays a list of predefined and custom content types

Any file with the .flv extension uploaded to S3 will get the proper content type: video/x-flv.

Example: Getting HTTP headers for an item ($s3 is an S3 connection)

$s3 | Select-CloudFolder myvideos | Get-CloudItem cats.flv | Get-CloudItemHeaders

Example: Setting HTTP headers to items

$headers = New-CloudHeaders Expires "Thu, 1 Apr 12:00:00 GMT"
$s3 | Select-CloudFolder myvideos | Add-CloudItemHeaders -Filter *.flv -Headers $headers

Example: Setting HTTP headers when copy/move

$headers = New-CloudHeaders Cache-Control private
$source | Copy-CloudItem $dest -Filter *.mov -Headers $headers

RENAMING ITEMS

Example: Renaming the folder "favourites", located in the bucket "myvideos", to "thrillers"

$s3 | Select-CloudFolder myvideos | Rename-CloudItem -Name favourites -NewName thrillers

APPLY ACL FOR ALL SUBFOLDERS AND FILES

Example: Make all files inside "myvideos/thrillers" and its subfolders publicly readable

$s3 | Select-CloudFolder myvideos/thrillers | Add-CloudItemPermission -UserName "All Users" -Read -Descendants

SET LOGGING FOR POWERSHELL

Set-Logging -LogPath <path> -LogLevel <value>

Values: nolog, fatal, error, warning, info, debug
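For example, to capture verbose troubleshooting output (the log path is illustrative):

```powershell
# Log everything, including debug detail, to a file of your choosing.
Set-Logging -LogPath "C:\Logs\CloudBerryPS.log" -LogLevel debug
```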

ADVANCED OPTIONS (PRO ONLY)

Set-CloudOption -ThreadCount <number>

Defines the number of threads for multithreaded uploading and downloading.

Set-CloudOption -UseCompression <value>

Defines whether to use compression. Values: 1 or 0.

Set-CloudOption -UseChunks <value> -ChunkSizeKB <sizeinKB>

Defines the chunk size in KB; files larger than one chunk will be divided into chunks. Values: 1 or 0.

To download a file that was divided into chunks on S3 storage as a single file, enable "chunk transparency" mode before downloading:

Set-CloudOption -ChunkTransparency 1
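Putting the two options together (file names and sizes are illustrative, and $src/$destination are assumed to be a local folder and an S3 folder as in the earlier examples): upload a large file in 10 MB chunks, then download it back as a single file:

```powershell
# Upload in 10240 KB (10 MB) chunks.
Set-CloudOption -UseChunks 1 -ChunkSizeKB 10240
$src | Copy-CloudItem $destination -Filter "bigfile.iso"

# Later: reassemble the chunks transparently on download.
Set-CloudOption -ChunkTransparency 1
$destination | Copy-CloudItem $src -Filter "bigfile.iso"
```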

When you copy or move files to S3, they can inherit the ACL from the parent object (bucket or folder).

Set-CloudOption -PermissionsInheritance <value>

Values: "donotinherit", "onlyforcloudfront", "inheritall"

Example:

Set-CloudOption -PermissionsInheritance "inheritall"
$s3 = Get-CloudS3Connection <key> <secret>
$destination = $s3 | Select-CloudFolder -path "myBucket/weeklyreport"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "c:\sales\"
$src | Copy-CloudItem $destination -filter "results.xls"

The file "results.xls" will automatically have the same ACL as "myBucket/weeklyreport".

Set-CloudOption -KeepExistingHeaders

Keep existing HTTP headers when replacing files on S3.

Set-CloudOption -DoNotChangePermissionsForExisting <value>

Keep ACL for files when replacing them on S3.

Values: 1 or 0

Set-CloudOption -KeepExistingPemissionsOnCloudCopy <value>

Keep source permissions when copying within S3.

Values: 1 or 0

Copy-CloudSyncFolders

Copy-CloudSyncFolders synchronizes a local folder with an Amazon S3 bucket. Specify the source folder (local or S3) in the pipeline.

-Source <CloudFolder> Amazon S3 bucket or folder or local folder

-Target <CloudFolder> Amazon S3 bucket or folder or local folder

Example:

$s3 = Get-CloudS3Connection <key> <secret>
$source = $s3 | Select-CloudFolder -Path boooks/sync
$local = Get-CloudFileSystemConnection
$target = $local | Select-CloudFolder C:\temp\sync
$source | Copy-CloudSyncFolders $target

Or synchronize content in both directions:

$source | Copy-CloudSyncFolders $target -Bidirectional

New-CloudBucket

New-CloudBucket - Creates a new bucket. Specify the S3 connection in the pipeline.

-Connection <CloudS3Connection> - S3 connection

-Name <String> - Bucket name

-Location <String> - Bucket location: US (USA) or EU (Europe). The US location is used by default.

Example:

$s3 = Get-CloudS3Connection <key> <secret>
$s3 | New-CloudBucket mytestbucket EU

Remove-CloudBucket - Removes a bucket. All of the bucket's contents must be removed before the bucket itself can be removed; this can take a long time, and progress is displayed.

-Connection <CloudS3Connection> S3 Connection

-Name <String> Bucket name

-Force Suppress warning messages

-Bucket <CloudFolder> Bucket object

Example:

$s3 | Remove-CloudBucket mytestbucket

Get-CloudItemACL - Returns all access control entries for the specified item, which can be an S3 bucket, folder, or file. You can get the item using the Select-CloudFolder or Get-CloudItem commands.

-Item <CloudItem> Cloud item, it can be bucket, s3 folder or s3 file.

Example:

$fld = $s3 | Select-CloudFolder mytestbucket/documents
$fld | Get-CloudItemACL

Add-CloudItemPermission - Grants permission to a user or group. If the user is not in the ACL, a user entry is added.

-Item <CloudItem> Cloud item, it can be bucket, s3 folder or s3 file.

-UserName <String> Username or group

-Write Grant write permission

-WriteACP Grant write ACP permission

-Read Grant read permission

-ReadACP Grant read ACP permission

-FullControl Grant full control permission. This means that all other permissions will be granted as well.

-CloudACE <CloudACE> Access control entry

Example:

$fld | Add-CloudItemPermission "All Users" -Read

Remove-CloudItemPermission - Revokes permission from a user or group. If the RemoveUser parameter is specified, the user entry is removed from the access control list.

-Item <CloudItem> Cloud item, it can be bucket, s3 folder or s3 file.

-UserName <String> Username or group

-Write Revoke write permission

-WriteACP Revoke write ACP permission

-Read Revoke read permission

-ReadACP Revoke read ACP permission

-FullControl Revoke full control permission. This means that all other permissions will be revoked as well.

-CloudACE <CloudACE> Access control entry

Example:

$fld | Remove-CloudItemPermission "All Users" -Read

Set-CloudOption - Sets options for the snap-in

-PathStyle <String> - Uses path-style addressing if this flag is specified; virtual-host style otherwise.

-ProxyAddress <String> - Proxy address

-ProxyPort <Int32> - Proxy port

-ProxyUser <String> - Proxy user name

-ProxyPassword <String> - Proxy user password

-CheckFileConsistency - Checks file consistency using an MD5 hash.
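For example, to route all snap-in traffic through a corporate proxy (the host, port, and credentials below are illustrative):

```powershell
# Configure an HTTP proxy for the snap-in; values are hypothetical.
Set-CloudOption -ProxyAddress "proxy.example.com" -ProxyPort 8080 `
    -ProxyUser "proxyuser" -ProxyPassword "proxypass"
```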

OTHER COMMANDS

Add-CloudFolder - Create new folder

-Folder <CloudFolder> - Current folder
-Name <String> - New folder name

Copy-CloudItem - Copy cloud item (file or folder) to the Destination

-Destination <CloudFolder> - Destination folder
-Filter <String> - Item filter, * and ? are permitted
-Folder <CloudFolder> - Current folder

Get-CloudFilesystemConnection - Get connection to local file system

Get-CloudItem - Lists files and folders in the current folder

-Filter <String> - Item filter, * and ? are permitted
-Folder <CloudFolder> - Current folder

Get-CloudRootFolder - Get root folders

-Connection <BaseCloudConnection> - Connection object

Get-CloudS3Connection - Get S3 connection

-Key <String> - Access Key for S3 connection
-Secret <String> - Secret Key for S3 connection

Move-CloudItem - Move cloud item (file or folder) to the Destination

-Destination <CloudFolder> - Destination folder
-Filter <String> - Item filter, * and ? are permitted
-Folder <CloudFolder> - Current folder

Remove-CloudItem - Remove cloud items (file or folder)

-Filter <String> - Item filter, * and ? are permitted
-Folder <CloudFolder> - Current folder

Select-CloudFolder - Gets a cloud folder. Use it to obtain the current folder required by other commands.

-Connection <BaseCloudConnection> - Connection object
-Path <String> - Path
-Folder <CloudFolder> - Folder object

INSTALLATION

The PowerShell Snap-In must be registered and added to the console.

System Requirements

.NET Framework 4.0 (full version)
Windows Management Framework 3.0

Registering Snap-In

If PowerShell was installed prior to the installation of CloudBerry Explorer, you do not need to register the Snap-In manually. Otherwise, run the following command against the CloudBerry Explorer installation folder (C:\Program Files\CloudBerryLab\CloudBerry Explorer for Amazon S3):
C:\Windows\Microsoft.NET\Framework\v2.0.50727\installutil.exe CloudBerryLab.Explorer.PSSnapIn.dll

Note: On x64, the command looks like: C:\Windows\Microsoft.NET\Framework64\v2.0.50727\InstallUtil.exe "C:\Program Files (x86)\CloudBerryLab\CloudBerry Explorer for Amazon S3\CloudBerryLab.Explorer.PSSnapIn.dll"

Note: For PRO version the default installation folder is "C:\Program Files\CloudBerryLab\CloudBerry S3 Explorer PRO"; on x64 - "C:\Program Files (x86)\CloudBerryLab\CloudBerry S3 Explorer PRO"

Note: You can run this from the command line or from PowerShell.

You can verify that the CloudBerry Explorer Snap-in is registered. Run the following command:

Get-PSSnapin -Registered

PowerShell displays registered Snap-Ins. Check that CloudBerryLab.Explorer.PSSnapIn is in the list.
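To narrow the list down to the CloudBerry snap-in only, you can filter with standard PowerShell (the wildcard matches the snap-in name given above):

```powershell
# Show only registered snap-ins whose name starts with "CloudBerryLab".
Get-PSSnapin -Registered | Where-Object { $_.Name -like "CloudBerryLab*" }
```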

Adding Snap-In to console

To add the Snap-In to the console, run the following command:

Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn

The new commands will now be available.

Exporting console configuration

You must run the Add-PSSnapin command every time you start PowerShell, or you can save the configuration as follows:

  • Run PowerShell.
  • Add the Snap-In to the console.
  • Run the command: Export-Console CloudBerryExplorerConfig

CloudBerryExplorerConfig is the name of the console file in which the configuration is saved. To start PowerShell from a saved configuration, run:
powershell -PSConsoleFile CloudBerryExplorerConfig.psc1
The CloudBerry Explorer commands will then be available.
