File Handling in Amazon S3 with Python Boto Library
1. Introduction
Amazon Web Services (AWS) Simple Storage Service (S3) is storage as a service provided by Amazon. It is a general-purpose object store, in which objects are grouped under a namespace called a “bucket”. Bucket names are unique across all of AWS S3.
Boto is the official Python SDK for AWS software development. It provides APIs to work with AWS services like EC2, S3 and others.
In this article we will focus on how to use Amazon S3 for regular file handling operations using Python and the Boto library.
2. Amazon S3 & Workflows
In Amazon S3, the user has to first create a bucket. The bucket is a namespace with a name that is unique across AWS. Users can set access privileges on it based on their requirements. Buckets contain objects. An object is referred to as a key-value pair, where the key is the identifier used to operate on the object. The key must be unique inside the bucket. The object can be of any type: it can be used to store strings, integers, JSON, text files, sequence files, binary files, pictures & videos. To understand more about Amazon S3, refer to the Amazon documentation [2].
The following are the possible workflow operations in Amazon S3:
- Create a Bucket
- Upload a file to a bucket
- List the contents of a bucket
- Download a file from a bucket
- Move files across buckets
- Delete a file from a bucket
- Delete a bucket
3. Python Boto Library
Boto is the official Python SDK for AWS software development. Boto v2 supports Python 2.7; work on Python 3.x support is ongoing. The code snippets in this article were developed using boto v2.x. To install the boto library, the pip command can be used as below:
pip install -U boto
Also, in the below code snippets, I have used the connect_s3() API, passing the access credentials as arguments. This provides the connection object to work with. But if you don’t want to hard-code the access credentials in your program, there are other ways to provide them. We can set the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. The other way is to create a credentials file named “credentials” under the .aws directory in the user’s home directory. The file should contain the below:
File Name : ~/.aws/credentials
[default]
aws_access_key_id = ACCESS_KEY
aws_secret_access_key = SECRET_KEY
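With either approach in place, connect_s3() can be called without any arguments and boto will resolve the credentials on its own (note that support for the shared credentials file requires a reasonably recent boto v2 release). A minimal sketch:

import boto

#No arguments: boto resolves the credentials from the environment
#variables or the credentials file described above
conn = boto.connect_s3()

#Quick sanity check that the credentials were picked up
print(conn.get_all_buckets())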
4. S3 Workflow Automation
4.1 Create a bucket
The first operation to be performed before any other operation to access S3 is to create a bucket. The create_bucket() API of the connection object performs this. The bucket is the namespace under which all the objects of the user can be stored.
import boto

keyId = "your_aws_key_id"
sKeyId = "your_aws_secret_key_id"

#Connect to S3 with access credentials
conn = boto.connect_s3(keyId, sKeyId)

#Create the bucket in a specific region
bucket = conn.create_bucket('mybucket001', location='us-west-2')
In the create_bucket() API, the bucket name (‘mybucket001’) is the mandatory parameter. The location is an optional parameter; if the location is not given, the bucket will be created in the default region of the user.
A create_bucket() call might throw an error if a bucket with the same name already exists, since bucket names are unique across the whole system. The naming conventions for buckets depend on the rules enforced by the AWS region; generally, bucket names must be in lower case.
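To handle a name conflict gracefully, the call can be wrapped in a try/except. A minimal sketch, assuming boto v2’s S3CreateError is raised for the conflict:

import boto
from boto.exception import S3CreateError

keyId = "your_aws_key_id"
sKeyId = "your_aws_secret_key_id"

conn = boto.connect_s3(keyId, sKeyId)
try:
    bucket = conn.create_bucket('mybucket001', location='us-west-2')
except S3CreateError as e:
    #409 Conflict: the bucket name is already taken
    print("Bucket creation failed: %s" % e)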
4.2 Upload a file
To upload a file into S3, we can use the set_contents_from_file() API of the Key object. The Key object resides inside the bucket object.
import boto
from boto.s3.key import Key

keyId = "your_aws_key_id"
sKeyId = "your_aws_secret_key_id"
fileName = "abcd.txt"
bucketName = "mybucket001"

file = open(fileName)

conn = boto.connect_s3(keyId, sKeyId)
bucket = conn.get_bucket(bucketName)

#Get the Key object of the bucket
k = Key(bucket)
#Create a new key with the name of the file as its id
k.key = fileName
#Upload the file; result contains the size of the file uploaded
result = k.set_contents_from_file(file)
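If the data is already on disk or in memory, boto v2’s Key object also offers set_contents_from_filename() and set_contents_from_string(), which avoid managing the file handle yourself. A brief sketch:

import boto
from boto.s3.key import Key

conn = boto.connect_s3("your_aws_key_id", "your_aws_secret_key_id")
bucket = conn.get_bucket("mybucket001")

#Upload directly from a path on disk
k = Key(bucket)
k.key = "abcd.txt"
k.set_contents_from_filename("abcd.txt")

#Or upload an in-memory string as the object contents
k2 = Key(bucket)
k2.key = "greeting.txt"
k2.set_contents_from_string("Hello, S3!")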
4.3 Download a file
To download the file, we can use the get_contents_to_filename() API of the Key object.
import boto
from boto.s3.key import Key

keyId = "your_aws_key_id"
sKeyId = "your_aws_secret_key_id"
srcFileName = "abc.txt"
destFileName = "s3_abc.txt"
bucketName = "mybucket001"

conn = boto.connect_s3(keyId, sKeyId)
bucket = conn.get_bucket(bucketName)

#Get the Key object of the given key, in the bucket
k = Key(bucket, srcFileName)

#Get the contents of the key into a file
k.get_contents_to_filename(destFileName)
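For small objects, the contents can also be pulled straight into memory instead of a file, using boto v2’s get_contents_as_string(). A minimal sketch:

import boto
from boto.s3.key import Key

conn = boto.connect_s3("your_aws_key_id", "your_aws_secret_key_id")
bucket = conn.get_bucket("mybucket001")

#Read the whole object into memory as a string
k = Key(bucket, "abc.txt")
data = k.get_contents_as_string()
print(len(data))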
4.4 Move a file from one bucket to another
S3 has no native move operation; we can achieve moving a file from one bucket to another by copying the object to the destination bucket and then deleting it from the source. The copy_key() API of the destination bucket object copies the object from the given source bucket into the destination bucket.
import boto

keyId = "your_aws_access_key_id"
sKeyId = "your_aws_secret_key_id"

conn = boto.connect_s3(keyId, sKeyId)
srcBucket = conn.get_bucket('mybucket001')    #Source Bucket Object
dstBucket = conn.get_bucket('mybucket002')    #Destination Bucket Object
fileName = "abc.txt"

#Call copy_key() on the destination bucket
dstBucket.copy_key(fileName, srcBucket.name, fileName)

#Delete the object from the source bucket to complete the move
srcBucket.delete_key(fileName)
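The same copy-then-delete pattern can rename an object within a single bucket, since S3 keys themselves cannot be changed in place. A short sketch, reusing the srcBucket object from the snippet above:

#Rename abc.txt to xyz.txt inside the same bucket
srcBucket.copy_key('xyz.txt', srcBucket.name, 'abc.txt')
srcBucket.delete_key('abc.txt')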
4.5 Delete a file
To delete a file inside a bucket, we have to retrieve the key of the object and call the delete() API of the Key object. The Key object can be constructed by calling Key() with the bucket object and the object name.
import boto
from boto.s3.key import Key

keyId = "your_aws_access_key"
sKeyId = "your_aws_secret_key"
srcFileName = "abc.txt"        #Name of the file to be deleted
bucketName = "mybucket001"     #Name of the bucket, where the file resides

conn = boto.connect_s3(keyId, sKeyId)    #Connect to S3
bucket = conn.get_bucket(bucketName)     #Get the bucket object

k = Key(bucket, srcFileName)             #Get the key of the given object
k.delete()                               #Delete the object
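boto v2 also exposes a delete_key() method directly on the bucket object, which saves constructing the Key yourself. A one-line alternative, assuming the bucket object from the snippet above:

#Equivalent shortcut: delete by key name directly on the bucket
bucket.delete_key(srcFileName)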
4.6 Delete a bucket
The delete_bucket() API of the connection object deletes the bucket given as the parameter.
import boto

keyId = "your_aws_access_key_id"
sKeyId = "your_aws_secret_key_id"

conn = boto.connect_s3(keyId, sKeyId)
conn.delete_bucket('mybucket002')
The delete_bucket() call will fail if there are objects inside the bucket.
4.7 Empty a bucket
Emptying a bucket can be achieved by deleting all the objects inside the bucket. The list() API of the bucket object (bucket.list()) will provide all the objects inside the bucket. By calling the delete() API on each of those objects, we can delete them.
import boto

keyId = "your_aws_access_key_id"
sKeyId = "your_aws_secret_key_id"
bucketName = "mybucket002"

conn = boto.connect_s3(keyId, sKeyId)    #Connect to S3
bucket = conn.get_bucket(bucketName)     #Get the bucket Object

for i in bucket.list():
    print(i.key)
    i.delete()                           #Delete the object
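Deleting objects one by one issues a separate request per object. boto v2 also provides delete_keys() on the bucket object, which batches deletions into a single multi-delete request. A minimal sketch, assuming the bucket object from the previous snippet:

#Collect all key names and delete them in one batched call
result = bucket.delete_keys([k.key for k in bucket.list()])

#result.deleted lists the removed keys, result.errors any failures
print(len(result.deleted), len(result.errors))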
4.8 List All Buckets
The get_all_buckets() API of the connection object returns a list of all buckets owned by the user. This can be used to verify the existence of a bucket after you have created or deleted it.
import boto

keyId = "your_aws_access_key_id"
sKeyId = "your_aws_secret_key_id"

conn = boto.connect_s3(keyId, sKeyId)    #Connect to S3
buckets = conn.get_all_buckets()         #Get the bucket list

for i in buckets:
    print(i.name)
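To check a single bucket rather than scanning the whole list, boto v2’s connection object offers lookup(), which returns the bucket or None if it does not exist. A short sketch, assuming the conn object from above:

#lookup() returns None instead of raising when the bucket is absent
bucket = conn.lookup('mybucket001')
if bucket is None:
    print("Bucket does not exist")
else:
    print("Found bucket: %s" % bucket.name)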
5. Summary
The boto library provides connection, bucket and key objects that exactly mirror the design of S3. By understanding the various methods of these objects, we can perform all the possible operations on S3 using the boto library.
Hope this helps.
6. References
[1] Boto S3 API Documentation – http://boto.cloudhackers.com/en/latest/ref/s3.html
[2] Amazon S3 Documentation – https://aws.amazon.com/documentation/s3/