This post walks through setting up a Play module that encapsulates the functionality of uploading files to and reading files from AWS S3.

The module can be plugged into any Play application and is ready to use.

James Ward published a nice implementation of an AWS S3 plugin for Play: https://devcenter.heroku.com/articles/using-amazon-s3-for-file-uploads-with-java-and-play-2. However, plugins were deprecated in Play 2.5, and modules are advocated instead.

The source code for this module can be found on GitHub.

To create a module in a Play project, create the default Play application structure under a module directory named aws.


[Screenshot: Play Framework module directory structure]
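
Since the screenshot may not come through, here is a rough sketch of the expected layout, assuming the module lives under module/aws as in the build.sbt shown later:

play-app/
├── app/
├── conf/
│   └── routes
├── build.sbt
└── module/
    └── aws/
        ├── app/           <- module sources (controllers, module classes)
        ├── conf/
        │   └── aws.routes <- module routes file
        └── build.sbt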

In IntelliJ, go to Project Structure -> Dependencies -> click + -> and select the build.sbt file of the new module, aws.

[Screenshot: IntelliJ Play Framework module]

Once added, the module should show up in the project view in bold text inside square brackets; the name displayed is the module name as defined in its build.sbt file.
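
The module's own build.sbt gives the module its name and pulls in the AWS SDK that the code below compiles against. A minimal sketch, with the SDK artifact and version as assumptions:

name := "aws"

version := "1.0-SNAPSHOT"

// AWS SDK for the S3 client used by the module; the exact version is an assumption
libraryDependencies += "com.amazonaws" % "aws-java-sdk-s3" % "1.11.200"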

Next, create the Play Framework module class, which lets the Play application recognize the module.

import play.api.Configuration;
import play.api.Environment;
import play.api.inject.Binding;
import play.api.inject.Module;
import scala.collection.Seq;

/**
 * Play Framework module entry point. Reference this class in
 * your {@code application.conf} file to enable the Amazon PlayS3 module.
 *
 * @author Mohammed Sabhi
 * @see Module
 */
public class AmazonS3Module extends Module {

    @Override
    public Seq<Binding<?>> bindings(Environment environment, Configuration configuration) {
        return seq(
                // Bind the initializer eagerly so it is created when the application starts
                bind(AmazonS3ModuleInitializer.class).toSelf().eagerly()
        );
    }

}
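
For Play to pick this module up, it has to be declared in the application's conf/application.conf. The fully qualified class name below assumes the com.comp.awss3 package used elsewhere in this post:

play.modules.enabled += "com.comp.awss3.AmazonS3Module"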

Now the actual module implementation. Note the @Singleton annotation: the initializer is created once, when the application is loaded, so subsequent uploads do not result in a new instance; we only need a single client towards AWS. The client and bucket name are stored on a small AwsS3 holder class (not shown in this post) so that controllers can reach them.

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.AmazonS3Exception;
import play.Configuration;
import play.Environment;
import play.Logger;
import play.inject.ApplicationLifecycle;

import javax.inject.Inject;
import javax.inject.Singleton;
import java.util.concurrent.CompletableFuture;

/**
 * Implementation of {@code AmazonS3Module}.
 * Singleton to ensure that only one bucket is reserved towards the application, instead of
 * creating multiple buckets.
 *
 * @author Mohammed Sabhi
 */
@Singleton
public class AmazonS3ModuleInitializer {

    public static final String AWS_S3_BUCKET = "aws.s3.bucket";
    public static final String AWS_ACCESS_KEY = "aws.access.key";
    public static final String AWS_SECRET_KEY = "aws.secret.key";
    public static final String AWS_REGION = "aws.region";

    public static AmazonS3 amazonS3;
    public static String s3Bucket;

    private Environment environment;
    private Configuration config;

    /**
     * Create a single instance of {@code AmazonS3ModuleInitializer}.
     *
     * @param lifecycle     The application life cycle
     * @param environment   The application environment
     * @param configuration The application configuration
     */
    @Inject
    public AmazonS3ModuleInitializer(
            final ApplicationLifecycle lifecycle,
            Environment environment,
            final Configuration configuration) {
        this.environment = environment;
        this.config = configuration;

        String accessKey = this.config.underlying().getString(AWS_ACCESS_KEY);
        String secretKey = this.config.underlying().getString(AWS_SECRET_KEY);
        String awsRegion = this.config.underlying().getString(AWS_REGION);

        s3Bucket = this.config.underlying().getString(AWS_S3_BUCKET);
        AwsS3.bucketName = s3Bucket;
        Logger.info("AmazonS3Module initialization in progress");
        if (accessKey == null || secretKey == null || s3Bucket == null) {
            throw new RuntimeException("AmazonS3Module is not properly configured");
        }

        AwsS3.amazonS3 = AmazonS3ClientBuilder
                .standard()
                .withCredentials(new AWSCredentialsProvider() {
                    @Override
                    public AWSCredentials getCredentials() {
                        return new BasicAWSCredentials(accessKey, secretKey);
                    }

                    @Override
                    public void refresh() {
                        // Not used with basic AWS credentials
                    }
                })
                .withRegion(awsRegion) // The AWS region to send requests to
                .build();
        try {
            AwsS3.amazonS3.createBucket(s3Bucket);
            Logger.info("Bucket created: " + s3Bucket);
        } catch (final AmazonS3Exception ex) {
            // Ignore the error if the bucket already exists or we lack permission to create it
            if (ex.getErrorCode().compareTo("BucketAlreadyOwnedByYou") != 0
                    && ex.getErrorCode().compareTo("AccessDenied") != 0) {
                throw ex;
            }
        } finally {
            Logger.info("Using PlayS3 Bucket: " + AwsS3.bucketName);
        }

        lifecycle.addStopHook(() -> {
            Logger.info("Shutdown Module");
            return CompletableFuture.completedFuture(null);
        });
    }
}
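
The initializer reads four keys from the application configuration and throws a RuntimeException if they are missing, so they need to be present in application.conf. A minimal sketch with placeholder values:

aws.s3.bucket = "my-play-uploads"
aws.access.key = ${?AWS_ACCESS_KEY}
aws.secret.key = ${?AWS_SECRET_KEY}
aws.region = "eu-west-1"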

In case you plan to have the endpoint implementation in the module itself, specify the endpoint in the module's routes file, aws.routes.

GET /s3 @com.comp.awss3.controllers.MediaController.savetoS3()
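
The MediaController itself is not shown in this post. Below is a minimal sketch of what savetoS3() could look like, using the static AwsS3 client initialized above; the package, the location of the AwsS3 helper, and the multipart handling are assumptions, and an upload endpoint like this would normally be mapped to POST rather than GET.

package com.comp.awss3.controllers;

import com.comp.awss3.AwsS3; // location of the AwsS3 holder class is an assumption
import play.mvc.Controller;
import play.mvc.Http;
import play.mvc.Result;

import java.io.File;

public class MediaController extends Controller {

    /**
     * Hypothetical upload action: reads a multipart file part named "file"
     * and stores it in the bucket created by AmazonS3ModuleInitializer.
     */
    public Result savetoS3() {
        Http.MultipartFormData body = request().body().asMultipartFormData();
        if (body == null || body.getFile("file") == null) {
            return badRequest("Expected a multipart request with a 'file' part");
        }
        Http.MultipartFormData.FilePart filePart = body.getFile("file");
        File file = (File) filePart.getFile();
        // Key under which the object is stored; the naming scheme is an assumption
        String key = filePart.getFilename();
        AwsS3.amazonS3.putObject(AwsS3.bucketName, key, file);
        return ok("Uploaded " + key + " to bucket " + AwsS3.bucketName);
    }
}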

The module is referenced in the main application's routes file as follows:

POST /orders/transidcallback @controllers.OrderController.getTransId()

-> /aws     aws.Routes

To load all the modules in the main Play application, alter the build.sbt file:

//lazy val root = (project in file(".")).enablePlugins(PlayJava)

lazy val notfcns = (project in file("module/notfcns")).enablePlugins(PlayJava)
lazy val aws = (project in file("module/aws")).enablePlugins(PlayJava)
lazy val root = (project in file(".")).enablePlugins(PlayJava).aggregate(notfcns,aws).dependsOn(notfcns,aws)

The first, commented-out line is the default setting found in a Play application; we replace it so that the root project aggregates and depends on the modules found in the module directory.


References:

https://devcenter.heroku.com/articles/using-amazon-s3-for-file-uploads-with-java-and-play-2

https://github.com/heroku/devcenter-java-play-s3

https://github.com/pmgautam/s3-upload-play-react/tree/master/s3-back/app/s3

https://github.com/0xbaadf00d/play-s3-module