AWS S3 CP Examples – How to Copy Files to and from S3 using AWS CLI

Apr 21, 2022

In this post, we'll go over aws s3, one of the most commonly used commands in the AWS CLI.
AWS S3 provides a comprehensive toolkit for managing your S3 buckets. Today, we'll look at one specific feature of the S3 CLI: the cp command.
Before we get there, we'll need to know a few things about the AWS S3 CLI.

There are two AWS S3 CLI commands available:

  • AWS S3
  • AWS S3API

Let us see what they both have to offer.


What are AWS S3 and S3API

For accessing Amazon S3, the AWS CLI provides two tiers of commands.
The s3 tier is made up of high-level commands that make basic activities like creating, manipulating, and deleting objects and buckets much easier.
The s3api tier exposes the lower-level S3 API operations directly, which lets you perform advanced actions that aren't possible with the s3 tier.
In this article, we'll solely discuss the s3 tier and, more specifically, the s3 cp command, which allows us to copy files from and to S3 buckets.
Before we proceed any further, here are a few commands that will help you list buckets and their contents.

  1. aws s3 help – Lists all of the commands available in the high-level s3 tier.
  2. aws s3 ls – Lists all of your buckets.
  3. aws s3 ls s3://bucket-name – Lists all the objects and folders in that bucket (a sample of the output is shown after this list).
  4. aws s3 ls s3://bucket-name/path/ – Filters the output down to a specific prefix.
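As a quick illustration, listing a bucket prints any prefixes followed by the objects, roughly like the output below. The prefix and file names here are made up for illustration.

aws s3 ls s3://bucket-name

                           PRE logs/
2022-04-21 10:15:30        128 file.txt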

Quick Caveats on AWS S3 CP command

  • A download is when you copy a file from an S3 bucket to your local computer. An upload is when you copy a file from your local system to an S3 bucket.
  • Please be aware that unsuccessful uploads are not resumable.
  • If a multipart upload fails due to a timeout, or is manually aborted by pressing CTRL + C, the AWS CLI cleans up any files created and aborts the upload. This process can take several minutes.
  • If the operation is halted by a kill command or a system failure, the in-progress multipart upload remains in Amazon S3 and must be cleaned up manually in the AWS Management Console or with the s3api abort-multipart-upload command (see the example after this list).
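If you ever need to do that cleanup from the CLI instead of the console, a minimal sketch looks like this; the bucket name, key, and upload ID are placeholders you would take from the list-multipart-uploads output.

aws s3api list-multipart-uploads --bucket bucket-name

aws s3api abort-multipart-upload --bucket bucket-name --key file.txt --upload-id <upload-id>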

AWS S3 CP Command examples

Here we have listed a few examples of how to use the AWS S3 cp command to copy files.

Copying a local file to S3

Uploading a file to S3, in other words copying a file from your local file system to S3, is done with the aws s3 cp command.

Let’s suppose that your file name is file.txt; this is how you can upload it to S3:

aws s3 cp file.txt s3://bucket-name

When executed, the output of that command will look something like this:

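A successful upload typically reports a single progress line like the one below once it finishes.

upload: ./file.txt to s3://bucket-name/file.txt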

Copying a local file to S3 with Storage Class

S3 provides various storage classes that let you optimize cost and performance based on how frequently and how quickly your data needs to be accessed.

  • S3 Standard
  • S3 Intelligent-Tiering
  • S3 Standard-IA
  • S3 One Zone-IA
  • S3 Glacier
  • S3 Glacier Deep Archive

You can read more about all of them in the AWS documentation on S3 storage classes. To upload a file with a specific storage class, pass the --storage-class option:

aws s3 cp file.txt s3://bucket-name --storage-class class-name
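For example, to upload a file named test2.txt (the one referenced below) with the Standard-IA storage class, the command would look like this; the bucket name is a placeholder.

aws s3 cp test2.txt s3://bucket-name --storage-class STANDARD_IA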

In the S3 console, you can see that the test2.txt file we just uploaded shows Standard-IA as its storage class.

Copying an S3 object from one bucket to another

At times we may want to copy an object from one S3 bucket to another, and this is how it can be done with the AWS S3 CLI.

aws s3 cp s3://source-bucket-name/file.txt s3://destination-bucket-name/

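The copy happens within S3 itself, and a successful run reports a line similar to this:

copy: s3://source-bucket-name/file.txt to s3://destination-bucket-name/file.txt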

How to Recursively upload or download (copy) files with AWS S3 CP command

When passed the --recursive parameter, the aws s3 cp command recursively copies all objects from the source to the destination.

It can be used to download or upload a large set of files from or to S3.

Here is the AWS CLI S3 command to download a list of files recursively from S3. Here, the dot (.) at the destination end represents the current directory:

aws s3 cp s3://bucket-name . --recursive


The same command can be used to upload a large set of files to S3 by just swapping the source and destination:

aws s3 cp . s3://bucket-name  --recursive

Here we have changed the source to the current directory and the destination to the bucket, so all the files in the current (local) directory will be uploaded to the bucket.

This includes all subdirectories and hidden files.
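As a sketch of a slightly more targeted variant, you can also copy recursively into a specific prefix rather than the bucket root; the backup/ prefix and the file names in the sample output are illustrative only.

aws s3 cp . s3://bucket-name/backup/ --recursive

upload: ./a.txt to s3://bucket-name/backup/a.txt
upload: ./docs/b.txt to s3://bucket-name/backup/docs/b.txt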

Setting the Access Control List (ACL) while copying an S3 object

For security and compliance reasons, we might want to restrict the files we copy to S3 buckets with a set of access controls, such as read-only or read-write.

Here is the command that copies a file from one bucket to another with a specific ACL applied to the object once it lands in the destination bucket:

aws s3 cp s3://source-bucket-name/file.txt s3://dest-bucket-name/ --acl public-read-write
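If you want to confirm which ACL actually ended up on the copied object, one way is to read it back with the s3api tier; this sketch reuses the destination bucket and key from the command above.

aws s3api get-object-acl --bucket dest-bucket-name --key file.txt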

There are five types of ACL permissions available with S3 (READ, WRITE, READ_ACP, WRITE_ACP, and FULL_CONTROL), which are summarized in the snapshot below.

[Snapshot from the AWS documentation listing the S3 ACL permissions]
