Upload Files to AWS S3 with the AWS CLI

Introduction

Amazon Web Services (AWS) is a leading cloud infrastructure provider for hosting your servers, applications, databases, networking, domain controllers, and Active Directory in a widespread cloud architecture. AWS provides the Simple Storage Service (S3) for storing your objects or data with eleven 9's (99.999999999%) of data durability. AWS S3 is compliant with PCI-DSS, HIPAA/HITECH, FedRAMP, the EU Data Protection Directive, and FISMA, which helps satisfy regulatory requirements.

When you log in to the AWS portal, you can navigate to the S3 service, choose the required bucket, and download or upload files. Doing this manually in the portal is quite a time-consuming task. Instead, you can use the AWS Command Line Interface (CLI), which works best for bulk file operations with easy-to-use scripts. You can schedule these scripts to run for unattended object downloads and uploads.

Configure AWS CLI

Download and install AWS Command Line Interface V2 on Windows, macOS, or Linux.

[Image: AWS CLI configuration]

You can follow the installation wizard for a quick setup.

Create an IAM user

To access the AWS S3 bucket from the command line interface, we need to set up an IAM user. In the AWS portal, navigate to Identity and Access Management (IAM) and click Add User.

[Image: adding an IAM user]

On the Add User page, enter the username and set the access type to Programmatic access.

[Image: selecting Programmatic access]

Next, we grant permissions to the IAM user using existing policies. For this article, we have chosen [AmazonS3FullAccess] from the AWS managed policies.

[Image: attaching the AmazonS3FullAccess policy]

Review your IAM user configuration and click Create user.

[Image: reviewing the IAM user configuration]

Once the AWS IAM user is created, you get an Access Key ID and a Secret Access Key for connecting through the AWS CLI.

Note: You should copy and save these credentials. AWS does not allow you to retrieve them at a later stage.

[Image: IAM user created successfully]

Configure AWS Profile on Your Computer

To work with AWS resources through the AWS CLI, launch PowerShell and run the following command.

          >aws configure        

It requires the following user inputs:

  • IAM user Access Key ID
  • AWS Secret Access Key
  • Default AWS Region name
  • Default output format

[Image: AWS profile configuration]
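For reference, a typical configure session looks like the one below. The key values shown are the placeholder credentials from the AWS documentation, not real keys; substitute the Access Key ID and Secret Access Key generated for your IAM user:

          >aws configure
          AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
          AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
          Default region name [None]: us-east-1
          Default output format [None]: json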

Create S3 Bucket Using AWS CLI

To store the files or objects, we need an S3 bucket. We can create it using either the AWS portal or the AWS CLI.

The following CLI command creates a bucket named [mys3bucket-testupload1] in the us-east-1 region. The command returns the bucket location in the output, as shown below.

          >aws s3api create-bucket --bucket mys3bucket-testupload1 --region us-east-1        
[Image: create-bucket command output]
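One caveat: us-east-1 is the only region that works without extra arguments. For any other region, the S3 API requires an explicit location constraint, so creating the same bucket in, say, eu-west-1 would look like this:

          >aws s3api create-bucket --bucket mys3bucket-testupload1 --region eu-west-1 --create-bucket-configuration LocationConstraint=eu-west-1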

You can verify the newly created S3 bucket in the AWS console. As shown below, [mys3bucket-testupload1] was created in the US East (N. Virginia) region.

[Image: newly created S3 bucket in the AWS console]

To list the existing S3 buckets using the AWS CLI, run the command – aws s3 ls

[Image: aws s3 ls output]
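The output of aws s3 ls is one line per bucket: the bucket's creation timestamp followed by its name. For this demo bucket, it would resemble the following (your timestamp will differ):

          >aws s3 ls
          2021-09-16 10:12:08 mys3bucket-testupload1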

Uploading Objects in the S3 Bucket Using AWS CLI

We can upload a single file or multiple files together to the AWS S3 bucket using the AWS CLI. Suppose we have a single file to upload. The file is stored locally in C:\S3Files with the name Script1.txt.

To upload the single file, use the following CLI script.

          >aws s3 cp C:\S3Files\Script1.txt s3://mys3bucket-testupload1/        

It uploads the file and returns the source and destination file paths in the output:

[Image: upload output in the console]
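The console output for a copy follows a fixed upload: source to destination pattern, so for this file it would resemble:

          upload: S3Files\Script1.txt to s3://mys3bucket-testupload1/Script1.txt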

Note: The time to upload to the S3 bucket depends on the file size and the network bandwidth. For the demo, I used a small file of a few KBs.

You can refresh the S3 bucket [mys3bucket-testupload1] and view the file stored in it.

[Image: the uploaded file in the S3 bucket]

Similarly, we can use the same CLI script with a slight modification to upload all files from the source to the destination S3 bucket. Here, we use the --recursive parameter to upload multiple files together:

          >aws s3 cp c:\s3files s3://mys3bucket-testupload1/ --recursive        

As shown below, it uploads all files stored inside the local directory c:\S3Files to the S3 bucket. You get the progress of each upload in the console.

[Image: progress of each upload in the console]
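If you want to preview which files a bulk copy would transfer without actually uploading anything, the CLI supports a --dryrun flag. Each output line is then prefixed with (dryrun) and no data is moved:

          >aws s3 cp c:\s3files s3://mys3bucket-testupload1/ --recursive --dryrun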

We can see all the files uploaded with the --recursive parameter in the S3 bucket in the following figure:

[Image: all uploaded files in the S3 bucket]

If you do not want to go to the AWS portal to verify the uploads, run the following CLI script. It returns all files in the bucket along with their upload timestamps.

          >aws s3 ls s3://mys3bucket-testupload1        
[Image: aws s3 ls output with upload timestamps]
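By default, aws s3 ls on a bucket lists only the top level and shows folders as prefixes. If the bucket contains folders and you want to walk every object under them, add the --recursive parameter:

          >aws s3 ls s3://mys3bucket-testupload1 --recursive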

Suppose we want to upload only files with a specific extension into a separate folder of AWS S3. You can do this object filtering with the CLI script as well. For this purpose, the script uses the --exclude and --include parameters.

For example, the command below checks files in the source directory (C:\S3Files), filters the files with the .sql extension, and uploads them into the SQL/ folder of the S3 bucket. Here, we specify the extension using the --include parameter:

          >aws s3 cp C:\S3Files s3://mys3bucket-testupload1/SQL/ --recursive --exclude "*" --include "*.sql"

In the script output, you can verify that only the files with the .sql extension were uploaded.

[Image: script output showing only .sql files uploaded]
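A note on ordering: the CLI applies --exclude and --include filters in the sequence they appear, and filters that appear later take precedence. That is why --exclude "*" comes first (drop everything) and --include "*.sql" comes second (re-admit only the .sql files). Reversing them, as in the sketch below, would match nothing because the trailing exclude overrides the include:

          >aws s3 cp C:\S3Files s3://mys3bucket-testupload1/SQL/ --recursive --include "*.sql" --exclude "*"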

Similarly, the script below uploads files with the .csv extension into the S3 bucket.

          >aws s3 cp C:\S3Files s3://mys3bucket-testupload1/ --recursive --exclude "*" --include "*.csv"
[Image: .csv files uploaded to the S3 bucket]

Upload New or Modified Files from Source Folder to S3 Bucket

Suppose you use an S3 bucket to move your database transaction log backups. You only want to transfer the backup files that are new or have changed since the last upload.

For this purpose, we use the sync keyword. It recursively copies new and modified files from the source directory to the destination S3 bucket.

          >aws s3 sync C:\S3Files s3://mys3bucket-testupload1/ --exclude "*" --include "*.sql"

As shown below, it uploaded a file that was absent from the S3 bucket. Similarly, if you modify any existing file in the source folder, the CLI script will pick it up and upload it to the S3 bucket.

[Image: sync output uploading only the new file]
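By default, sync only adds and updates objects. If you also want the bucket to drop objects that no longer exist in the source folder, the CLI provides a --delete flag; use it with care, since it removes objects from the bucket:

          >aws s3 sync C:\S3Files s3://mys3bucket-testupload1/ --delete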

Summary

The AWS CLI can make your work easier when storing files in the S3 bucket. You can use it to upload or synchronize files between local folders and the S3 bucket. It is a quick way to deploy and work with objects in the AWS cloud.


Tags: AWS, aws cli, aws s3, cloud platform. Last modified: September 16, 2021


Source: https://codingsight.com/upload-files-to-aws-s3-with-the-aws-cli/
