Day 40 - s3cmd

19 May 2017

If you want to browse AWS S3 without going to the website, two command-line tools are available: the official AWS CLI and s3cmd.

Here, I will talk about s3cmd, which I use pretty often for scripting.

# Install s3cmd (Debian/Ubuntu)
$ apt-get install s3cmd

# Display available commands and options
$ s3cmd -h
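
Every example below passes the region and keys on the command line, which is what I want for scripts. For interactive use you can store them once instead:

# One-time interactive setup; writes the credentials to ~/.s3cfg
$ s3cmd --configure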

# List buckets
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo ls
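
To dump the objects of every bucket at once, there is also the la command:

# List all objects in all buckets
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo la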

# List first-level files and directories inside the idz-backups bucket
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo ls s3://idz-backups
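
ls only shows one level; add --recursive (or -r) to walk the whole tree:

# List every object under the bucket, whatever the depth
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo ls --recursive s3://idz-backups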

# Create bucket
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo mb s3://idz-test
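
The opposite is rb, which removes a bucket (it must be empty first):

# Remove an (empty) bucket
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo rb s3://idz-test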

# Download s3 file to a local path
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo get s3://idz-backups/postgresql/dump-2017.05.19.tgz /tmp/dump-2017.05.19.tgz
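
By default, get refuses to overwrite a local file that already exists; add --force if you really want to replace it:

# Overwrite the local copy if it is already there
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo get --force s3://idz-backups/postgresql/dump-2017.05.19.tgz /tmp/dump-2017.05.19.tgz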

# Upload local file to s3 bucket
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo put /tmp/dump-2017.05.19.tgz s3://idz-backups/postgresql/dump-2017.05.19.tgz
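
Since I mostly use s3cmd for scripting, here is a minimal sketch of a nightly backup job built on put (the database name and paths are made up for the example):

#!/bin/sh
# Hypothetical nightly backup: dump, compress, push to s3
DUMP="/tmp/dump-$(date +%Y.%m.%d).tgz"
pg_dump mydb | gzip > "$DUMP"
s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo \
      put "$DUMP" s3://idz-backups/postgresql/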

# Delete file from s3 bucket
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo del s3://idz-backups/tmp.sql
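
del also accepts --recursive to wipe everything under a prefix; handle with care (the tmp/ prefix below is just an example):

# Delete every object under a prefix
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo del --recursive s3://idz-backups/tmp/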

# Put file with an Expires header (HTTP cache expiry set to one year from now)
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo \
        put \
        --add-header="Expires: `date -u +"%a, %d %b %Y %H:%M:%S GMT" --date "+1 years"`" \
        /tmp/dump-2017.05.19.tgz s3://idz-backups/postgresql/dump-2017.05.19.tgz
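
Be aware that the Expires header only tells HTTP clients how long they may cache the object; it does not delete anything from s3. To actually drop old backups after a year, set a lifecycle rule on the bucket, which s3cmd can do with its expire command (the postgresql/ prefix is my assumption here):

# Lifecycle rule: delete objects under the prefix 365 days after creation
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo expire s3://idz-backups --expiry-days=365 --expiry-prefix=postgresql/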

# Put entire directory
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo put --recursive large-directory/ s3://idz-backups/postgresql/large-directory
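
For a directory that changes over time, sync is often a better fit than put --recursive: it only transfers files that are new or modified compared to what is already in the bucket:

# Incremental upload: skip files that have not changed
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo sync large-directory/ s3://idz-backups/postgresql/large-directory/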

# Move files from bucket A to bucket B
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo mv --recursive s3://idz-backups-postgresql/ s3://idz-backups/postgresql
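
If you want to keep the source, cp works with the same syntax:

# Copy instead of move, leaving the source bucket untouched
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo cp --recursive s3://idz-backups-postgresql/ s3://idz-backups/postgresql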

Please note that you may get 403 (access denied) errors. This happens when the access key you are using has only limited permissions.

Multipart upload is enabled by default. \o/
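
Multipart kicks in for files larger than the chunk size (15 MB by default), and the chunk size can be tuned for very large archives:

# Raise the multipart chunk size to 100 MB
$ s3cmd --region=eu-central-1 --access_key=foobar --secret_key=barfoo put --multipart-chunk-size-mb=100 /tmp/dump-2017.05.19.tgz s3://idz-backups/postgresql/dump-2017.05.19.tgz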