s3cmd notes
s3cmd is an intuitive way to work with Amazon's S3 on the command line. I first tried s3cmd based on Alex Clemesha's recommendation. Here are my notes. I'm running on Ubuntu Karmic.
Install s3cmd¶
$ sudo apt-get install s3cmd
Configure s3cmd¶
$ s3cmd --configure
Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.
Access key and Secret key are your identifiers for Amazon S3
Access Key: XXXXXXXXXXXXXX
Secret Key: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: XXXXX
Path to GPG program [/usr/bin/gpg]:
When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]: yes
New settings:
Access Key: XXXXXXXXXXXXXX
Secret Key: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Encryption password: XXXXX
Path to GPG program: /usr/bin/gpg
Use HTTPS protocol: True
HTTP Proxy server name:
HTTP Proxy server port: 0
Test access with supplied credentials? [Y/n]
Please wait...
Success. Your access key and secret key worked fine :-)
Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)
Save settings? [y/N] y
Configuration saved to '/home/saltycrane/.s3cfg'
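To sanity-check which settings s3cmd ended up with, the --dump-config option listed in the help output below prints the parsed configuration (exact output may vary by version):
$ s3cmd --dump-config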
List all your buckets¶
$ s3cmd ls
List contents of your bucket¶
$ s3cmd ls s3://mybucket
Upload a file (and make it public)¶
$ s3cmd -P put /path/to/local/file.jpg s3://mybucket/my/prefix/file.jpg
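To upload a whole directory rather than a single file, the -r/--recursive option shown in the help below should also apply to put, or the sync command can be used instead. A sketch, assuming a local directory /path/to/local/dir (I haven't tested these here):
$ s3cmd -P put --recursive /path/to/local/dir s3://mybucket/my/prefix/
$ s3cmd sync /path/to/local/dir/ s3://mybucket/my/prefix/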
Delete a file¶
$ s3cmd del s3://mybucket/my/prefix/file.jpg
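To remove everything under a prefix, the same -r/--recursive option should cover removal as well (the help below lists it for "Recursive upload, download or removal"); I haven't tried it here, so use with care:
$ s3cmd del --recursive s3://mybucket/my/prefix/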
Get help¶
$ s3cmd --help
Usage: s3cmd [options] COMMAND [parameters]
S3cmd is a tool for managing objects in Amazon S3 storage. It allows for
making and removing "buckets" and uploading, downloading and removing
"objects" from these buckets.
Options:
-h, --help show this help message and exit
--configure Invoke interactive (re)configuration tool.
-c FILE, --config=FILE
Config file name. Defaults to /home/saltycrane/.s3cfg
--dump-config Dump current configuration after parsing config files
and command line options and exit.
-n, --dry-run Only show what should be uploaded or downloaded but
don't actually do it. May still perform S3 requests to
get bucket listings and other information though (only
for file transfer commands)
-e, --encrypt Encrypt files before uploading to S3.
--no-encrypt Don't encrypt files.
-f, --force Force overwrite and other dangerous operations.
--continue Continue getting a partially downloaded file (only for
[get] command).
--skip-existing Skip over files that exist at the destination (only
for [get] and [sync] commands).
-r, --recursive Recursive upload, download or removal.
-P, --acl-public Store objects with ACL allowing read for anyone.
--acl-private Store objects with default ACL allowing access for you
only.
--delete-removed Delete remote objects with no corresponding local file
[sync]
--no-delete-removed Don't delete remote objects.
-p, --preserve Preserve filesystem attributes (mode, ownership,
timestamps). Default for [sync] command.
--no-preserve Don't store FS attributes
--exclude=GLOB Filenames and paths matching GLOB will be excluded
from sync
--exclude-from=FILE Read --exclude GLOBs from FILE
--rexclude=REGEXP Filenames and paths matching REGEXP (regular
expression) will be excluded from sync
--rexclude-from=FILE Read --rexclude REGEXPs from FILE
--include=GLOB Filenames and paths matching GLOB will be included
even if previously excluded by one of
--(r)exclude(-from) patterns
--include-from=FILE Read --include GLOBs from FILE
--rinclude=REGEXP Same as --include but uses REGEXP (regular expression)
instead of GLOB
--rinclude-from=FILE Read --rinclude REGEXPs from FILE
--bucket-location=BUCKET_LOCATION
Datacentre to create bucket in. Either EU or US
(default)
-m MIME/TYPE, --mime-type=MIME/TYPE
Default MIME-type to be set for objects stored.
-M, --guess-mime-type
Guess MIME-type of files by their extension. Falls
back to default MIME-Type as specified by --mime-type
option
--add-header=NAME:VALUE
Add a given HTTP header to the upload request. Can be
used multiple times. For instance set 'Expires' or
'Cache-Control' headers (or both) using this options
if you like.
--encoding=ENCODING Override autodetected terminal and filesystem encoding
(character set). Autodetected: UTF-8
--list-md5 Include MD5 sums in bucket listings (only for 'ls'
command).
-H, --human-readable-sizes
Print sizes in human readable form (eg 1kB instead of
1234).
--progress Display progress meter (default on TTY).
--no-progress Don't display progress meter (default on non-TTY).
--enable Enable given CloudFront distribution (only for
[cfmodify] command)
--disable Disable given CloudFront distribution (only for
[cfmodify] command)
--cf-add-cname=CNAME Add given CNAME to a CloudFront distribution (only for
[cfcreate] and [cfmodify] commands)
--cf-remove-cname=CNAME
Remove given CNAME from a CloudFront distribution
(only for [cfmodify] command)
--cf-comment=COMMENT Set COMMENT for a given CloudFront distribution (only
for [cfcreate] and [cfmodify] commands)
-v, --verbose Enable verbose output.
-d, --debug Enable debug output.
--version Show s3cmd version (0.9.9) and exit.
Commands:
Make bucket
s3cmd mb s3://BUCKET
Remove bucket
s3cmd rb s3://BUCKET
List objects or buckets
s3cmd ls [s3://BUCKET[/PREFIX]]
List all object in all buckets
s3cmd la
Put file into bucket
s3cmd put FILE [FILE...] s3://BUCKET[/PREFIX]
Get file from bucket
s3cmd get s3://BUCKET/OBJECT LOCAL_FILE
Delete file from bucket
s3cmd del s3://BUCKET/OBJECT
Synchronize a directory tree to S3
s3cmd sync LOCAL_DIR s3://BUCKET[/PREFIX] or s3://BUCKET[/PREFIX] LOCAL_DIR
Disk usage by buckets
s3cmd du [s3://BUCKET[/PREFIX]]
Get various information about Buckets or Files
s3cmd info s3://BUCKET[/OBJECT]
Copy object
s3cmd cp s3://BUCKET1/OBJECT1 s3://BUCKET2[/OBJECT2]
Move object
s3cmd mv s3://BUCKET1/OBJECT1 s3://BUCKET2[/OBJECT2]
Modify Access control list for Bucket or Files
s3cmd setacl s3://BUCKET[/OBJECT]
List CloudFront distribution points
s3cmd cflist
Display CloudFront distribution point parameters
s3cmd cfinfo [cf://DIST_ID]
Create CloudFront distribution point
s3cmd cfcreate s3://BUCKET
Delete CloudFront distribution point
s3cmd cfdelete cf://DIST_ID
Change CloudFront distribution point parameters
s3cmd cfmodify cf://DIST_ID
See program homepage for more information at
http://s3tools.org
Comments
Hi,
I followed this site and was able to download data from S3 for a single client. But I want it for multiple clients, so is it possible to provide S3 credentials with each get command?
I am waiting for your reply.
Thank you.
Replying to the comment above:
Why not supply a different s3cmd configuration for each set of S3 credentials by passing the --config=FILE flag when calling s3cmd? Save each set of S3 credentials in its own s3cmd configuration file and reference them separately, as in the sketch below. Kinda late for the reply.
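For example (a sketch; the config file names and bucket are made up, and I haven't verified that --configure writes to the file given with --config on every version):
$ s3cmd --configure --config=/home/saltycrane/.s3cfg-clientA
$ s3cmd --config=/home/saltycrane/.s3cfg-clientA get s3://clientA-bucket/some/file.jpg file.jpg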
How would I connect to Eucalyptus 2.0.3 Walrus (open-source) using s3cmd?
How can I upload multiple directories into an S3 bucket? Will the s3cmd put command work in this case? I will be eagerly waiting for your reply. Thank you.
Hello, how do I install this script to back up my website on a cPanel shared Linux host? I really need it but have no idea how to install it. I will be happy to hear from you.
Thank you,
Ronnie
A very informative post. I found another well-explained post on installing and using s3cmd: http://jee-appy.blogspot.co...