As a beginner in Python, you are probably not sure where to start. This guide will help new users get started with managing AWS S3 using Python and Boto3, the AWS SDK for Python: you will create a bucket, list your buckets, upload files and folders, and copy objects between buckets.
The AWS Boto3 software development kit (SDK) for Python is your best friend if you need to transfer files to an Amazon Web Services (AWS) S3 bucket, copy files from bucket to bucket, and automate the process. Using Boto3 and S3, you can easily transfer files around in AWS.
Utilizing an example-driven approach, you will learn how to get started using the Boto3 Python library with S3 in this article.
Let’s get going!
This article will be a step-by-step guide. If you want to join in, make sure you have the following items:
- An AWS account with an IAM user set up for programmatic access and assigned the AmazonS3FullAccess policy.
- Install Python 3.6 or later on your local PC. On a Windows 10 system, Python v3.9.2 will be used in this course.
- pip, the Python package manager.
- A code editor. You can work with Python files in any text editor; this tutorial uses Visual Studio (VS) Code.
Before you can begin managing S3 with Boto3, you must install it first. Let’s start off this tutorial by installing Boto3 on your local computer.
The pip Python package manager is the simplest method to install Boto3. To use pip to install Boto3:
1. On your PC, open a cmd/Bash/PowerShell window.
2. Run the `pip install` command, specifying the Python module to install (boto3).
Pip is a Python package manager that allows you to install software that isn’t included in the standard Python library.
Boto3 should now be installed successfully!
Boto3 for AWS S3 Bucket Creation
It’s time to explore what Boto3 can accomplish now that you’ve installed it! Let’s have a look at some AWS S3 examples, beginning with establishing a new S3 bucket.
Boto3 supports two kinds of AWS interactions: resource and client. The client level gives you low-level service access, while the resource level gives you higher-level, abstracted access. Client access will be used in this lesson.
1. Launch your preferred code editor.
2. Paste the Python script below into your code editor and save it as main.py. The code creates an S3 bucket named first-us-east-1-bucket and prints a message to the console when it’s finished.
```python
# Importing the boto3 library
import boto3

# Setting up an AWS S3 client connection
s3 = boto3.client('s3')

# Creating a bucket
s3.create_bucket(Bucket='first-us-east-1-bucket')
print("Bucket created successfully")
```
3. Run the main.py script in your terminal with `python main.py`. If everything went well, you should see the single message Bucket created successfully.
4. Open your preferred web browser and go to the AWS Management Console to log in.
5. Go to the top of the console’s search box and type in “S3,” then choose the S3 menu option.
In the AWS Management Console, look for the S3 service.
Your newly-created bucket should now appear on the S3 page, as seen below.
Because the default region in the AWS profile is set to us-east-1, the bucket is in the AWS Region US East.
In the AWS Management Console, locate the newly created S3 bucket.
How to List S3 Buckets in Amazon Web Services
Now that you have at least one S3 bucket in your account, confirm it using Boto3 rather than the AWS Management Console. Boto3 can also provide a list of all your S3 buckets.
1. In your code editor, paste the following Python code and save it as list_s3_buckets.py. This script calls the list_buckets() function to query AWS, stores the response, loops (for) over the response['Buckets'] array, and prints the name (bucket['Name']) of each S3 bucket it finds.
```python
# Importing the boto3 library
import boto3

# Creating a client connection with AWS S3
s3 = boto3.client('s3')

# Storing the response from list_buckets()
response = s3.list_buckets()

# Looping over the response to print each bucket name
print('Existing buckets:')
for bucket in response['Buckets']:
    print(bucket['Name'])
```
2. Run the script, and each S3 bucket name should show up in your account.
Running list_s3_buckets.py in Python
Uploading a File to an Amazon S3 Bucket
Let’s get started generating items in your S3 bucket now that you have one. Upload a file to your S3 bucket to get started.
1. Choose an existing file on your local computer to upload or create a new one. The file ATA.txt will be used in this lesson.
2. If you still have your code editor open, copy/paste the following code into a new Python script and save it as upload_s3_file.py. The script below opens the /ATA.txt file for reading in binary mode (rb) and uploads it to first-us-east-1-bucket with upload_fileobj().
```python
# Importing the boto3 library
import boto3

# Setting up an AWS S3 client connection
s3 = boto3.client('s3')

# Opening the local file /ATA.txt in binary read mode
with open('/ATA.txt', 'rb') as data:
    # Uploading the file to S3 under the key ATA.txt
    s3.upload_fileobj(data, 'first-us-east-1-bucket', 'ATA.txt')
```
3. Run the script to upload the file.
4. If you still have the S3 page open in your browser, click on the bucket you established earlier, and the file should have been successfully uploaded!
Locating the uploaded file in the S3 bucket
How to Use Boto3 to Upload Entire Folders
You could upload a single file to an AWS S3 bucket before, but what if you need to upload a folder? There isn’t anything in the boto3 library that allows you to upload a whole directory. You can still do it using a Python script, however.
1. Make sure you have a folder with some files on your local computer. This tutorial will utilize the ATA folder.
2. If you still have your code editor open, copy/paste the following code into a new Python script and save it as upload_s3_folder.py. The script below uses the os module’s os.walk to traverse the directory tree, uses the zipfile module to add all the files in the folder to a zip archive named ATA.zip, and uploads the archive to first-us-east-1-bucket.
```python
# Importing the boto3, zipfile, and os libraries
import boto3
import zipfile
import os

# Setting up an AWS S3 client connection
s3 = boto3.client('s3')

def zipdir(path, ziph):
    # ziph is the zipfile handle
    for root, dirs, files in os.walk(path):
        for file in files:
            ziph.write(os.path.join(root, file))

# Zipping the ATA folder into ATA.zip
zipf = zipfile.ZipFile('ATA.zip', 'w', zipfile.ZIP_DEFLATED)
zipdir('ATA', zipf)
zipf.close()

# Uploading the zip file ATA.zip to S3
with open('ATA.zip', 'rb') as data:
    s3.upload_fileobj(data, 'first-us-east-1-bucket', 'ATA.zip')
```
3. Run the script, which zips the ATA folder and uploads ATA.zip, containing all of your files, to the bucket.

Running upload_s3_folder.py in Python
How to Use Boto3 to Copy Files Between S3 Buckets
So far, you’ve moved files from your local machine into S3. Let’s stay in the cloud now and copy data from one S3 bucket to another.
1. Using the knowledge from the previous part, create a second S3 bucket to move data to. As the destination bucket for this example, first-us-east-1-bucket-2 will be used.
2. Make a new Python script called copy_s3_to_s3.py and paste in the following code. The script assumes the ATA.txt file you uploaded earlier is still in the S3 bucket. It looks in first-us-east-1-bucket for the ATA.txt file and copies it to the first-us-east-1-bucket-2 S3 bucket.
The Resource() API offers a higher-level abstraction than service clients’ raw, low-level calls. You may need to connect to resources directly rather than via the Service API on occasion.
```python
# Importing the boto3 library
import boto3

# Creating the connection with the resource API
s3 = boto3.resource('s3')

# Declaring the source object to be copied
copy_source = {
    'Bucket': 'first-us-east-1-bucket',
    'Key': 'ATA.txt'
}

# Copying the file to the destination bucket
bucket = s3.Bucket('first-us-east-1-bucket-2')
bucket.copy(copy_source, 'ATA.txt')
```
3. Run the script to copy the file.
The file should now have been transferred from one bucket to the other.
Copying from one bucket to another successfully
In this tutorial, you learned how to install the Boto3 AWS SDK for Python and work with the AWS S3 service. Although many automation tools manage and work with various AWS services, Boto3 is the answer if you need to interact with AWS APIs from Python.
What do you intend to automate now that you’ve set up an AWS S3 bucket with Boto3?