What is rs3helper
rs3helper includes a collection of functions to work with Amazon Simple Storage Service (Amazon S3). Each function executes a Python script that performs an action using the boto S3 library - boto is probably the most popular Python interface to Amazon Web Services. In this regard, the package can be considered an R wrapper of the Python boto S3 library.
Why use Python
In relation to S3, although there are a number of existing packages, many of them seem to be deprecated, immature or platform-dependent. If there is not yet a comprehensive R way of doing something, it may be necessary to create it from scratch. I have chosen Python as it is easy to learn and has a comprehensive interface to AWS. For those who are interested in an R-based tool, see the aws.s3 package of the cloudyr project.
Python and the boto library need to be installed. The Python version used for development is 2.7.10. The boto library can be installed with pip, Python's package management system. Once pip is installed (see here), installing the boto library is as simple as the following.
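Assuming pip is on the PATH, the installation is a single command:

```shell
pip install boto
```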
The package is hosted on GitHub and can be installed/loaded as shown below.
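A typical installation via the devtools package would look like the following - note that the repository path (user/rs3helper) is a placeholder, to be replaced with the actual GitHub account:

```r
# install.packages("devtools")  # if devtools is not installed yet
library(devtools)
install_github("user/rs3helper")  # 'user' is a placeholder account name
library(rs3helper)
```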
Note that, if you see the following error, reinstall the jsonlite package and run the above snippet again - this package is used to parse the responses of the Python scripts.
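Reinstalling jsonlite is a one-liner:

```r
install.packages("jsonlite")
```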
Introduction to rs3wrapper
Below is the formal definition of lookup_bucket() in the package.
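A sketch of the signature is shown below - the exact argument names are assumptions based on the description that follows, not the package's documented interface:

```r
# first two arguments: authentication; last two: connection settings
lookup_bucket(bucket_name, access_key_id, secret_access_key,
              region = NULL, is_ordinary_calling_format = FALSE)
```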
The first two arguments are for authentication (see here) while the last two modify connection settings. Specifically, if a bucket name is not DNS-compliant, is_ordinary_calling_format should be set to TRUE so that the connection can be made without an error. For example, a bucket name that begins/ends with a dot (.) or mixes lower- and upper-case letters (eg MyAWSBucket) is not DNS-compliant. For more details, see this article. It is also possible to connect to a specific region, and the available regions can be checked by executing lookup_region().
Although the last two arguments can be skipped as they have default values, the authentication arguments must be provided each time, and there may be occasions when non-default values should be entered for the others. Entering these arguments every time can be tedious, so it is convenient to have them pre-filled. For this purpose, a wrapper function called rs3wrapper() is created. As can be seen below, it returns a function that has a function argument (f) and unspecified arguments (…). It is then possible to execute a function where the connection-related arguments are pre-filled.
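A minimal sketch of this closure pattern, assuming the argument names used above (the actual definition in the package may differ):

```r
# returns a function that pre-fills the connection-related arguments;
# f is the function to execute, ... holds its remaining arguments
rs3wrapper <- function(access_key_id, secret_access_key,
                       region = NULL, is_ordinary_calling_format = FALSE) {
  function(f, ...) {
    f(...,
      access_key_id = access_key_id,
      secret_access_key = secret_access_key,
      region = region,
      is_ordinary_calling_format = is_ordinary_calling_format)
  }
}
```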
An example of applying rs3wrapper() is shown below. First a function named wrapper is created by filling in the connection-related arguments. Then lookup_bucket() is executed through it by entering the function and the other necessary arguments. It returns the same outcome as the original function, but the amount of code is reduced, especially when multiple functions are executed. A further example can be seen in the function's documentation (?rs3wrapper).
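The usage would look roughly like this - the credentials and bucket name are placeholders:

```r
# pre-fill the connection-related arguments once
wrapper <- rs3wrapper("your-access-key-id", "your-secret-access-key")

# then execute any function through the wrapper
wrapper(lookup_bucket, bucket_name = "my-bucket")
```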
From now on, demonstrations will be made using this wrapper function.
Functions to look up objects and settings
As mentioned earlier, a connection can be made to a specific region, and the available regions can be checked by executing lookup_region(). Also, a bucket can be created in a specific location, and the available locations can be looked up with lookup_location(). Both functions return a character vector.
In order to look up a specific bucket or key, lookup_bucket() and lookup_key() can be used. Both return a list of lookup information - the message element is kept to report an error or unexpected case.
Also, multiple objects can be checked with get_all_buckets() and get_keys(). As the names suggest, the former returns information on all buckets while the latter returns information on keys. get_keys() has an extra argument called prefix that filters keys. For example, if a bucket has a folder named json and only the keys in that folder need to be checked, the argument's value can be set to json. Then only the keys in the folder are looked up - subfolders can also be prefixed (eg folder/subfolder). Both functions return a data frame of lookup information.
Some examples are shown below.
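A sketch of such lookups, assuming a wrapper created with rs3wrapper() as above and placeholder bucket/key names:

```r
# settings: available regions and bucket locations (character vectors)
wrapper(lookup_region)
wrapper(lookup_location)

# single objects: lists of lookup information
wrapper(lookup_bucket, bucket_name = "my-bucket")
wrapper(lookup_key, bucket_name = "my-bucket", key_name = "json/mykey.json")

# multiple objects: data frames of lookup information
wrapper(get_all_buckets)
wrapper(get_keys, bucket_name = "my-bucket", prefix = "json")
```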
Functions to get/set access control list
Access in S3 can be granted through a set of canned access control policies, which are listed below.
- private: Owner gets FULL_CONTROL. No one else has any access rights.
- public-read: Owner gets FULL_CONTROL and the anonymous principal is granted READ access.
- public-read-write: Owner gets FULL_CONTROL and the anonymous principal is granted READ and WRITE access.
- authenticated-read: Owner gets FULL_CONTROL and any principal authenticated as a registered Amazon S3 user is granted READ access.
get_access_control_list() has two extra arguments that are not related to connection: bucket_name and key_name. If key_name is NULL, the access control list of the bucket is returned. Otherwise the specific key's access control list is returned.
By default, all objects are private, and an extra level of permission can be set up. For example, the bucket has two extra rows, which indicate that it also has public-read-write permission.
set_access_control_list() can be used to set an access control policy. Compared to its 'get' counterpart, it has one extra argument called permission - only the canned access control policies listed above can be applied. Some examples are shown below.
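A sketch of both functions, again assuming the wrapper from earlier and placeholder names:

```r
# ACL of the bucket itself (key_name left as NULL)
wrapper(get_access_control_list, bucket_name = "my-bucket")

# ACL of a specific key
wrapper(get_access_control_list, bucket_name = "my-bucket",
        key_name = "json/mykey.json")

# apply a canned policy to the bucket
wrapper(set_access_control_list, bucket_name = "my-bucket",
        permission = "public-read-write")
```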
In part 1, the motivation for the rs3helper package is illustrated, followed by its installation and basic usage. More usage will be demonstrated in the next part.