Hot questions for Using Amazon S3 in spring


I have a bunch of files inside an Amazon S3 bucket. I want to zip those files and download the resulting archive via an S3 URL using Java Spring.


S3 is not a file server, nor does it offer operating system file services, such as data manipulation.

If there are many huge files, your best bet is to:

  1. Start a simple EC2 instance
  2. Download all those files to the EC2 instance, compress them, and re-upload the archive to the S3 bucket under a new object name
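The compress step from the list above can be sketched with the JDK's zip classes. In this sketch the input streams stand in for the streams you would get from s3.getObject(bucket, key).getObjectContent(), and zipStreams is just an illustrative helper name, not an SDK method:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

class S3Zipper {

    // Bundle a set of named streams into one zip archive held in memory.
    // In practice each InputStream would come from
    // s3.getObject(bucket, key).getObjectContent(), and the result would be
    // re-uploaded with s3.putObject(...) under a new object name.
    static byte[] zipStreams(Map<String, InputStream> objects) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(out)) {
            for (Map.Entry<String, InputStream> e : objects.entrySet()) {
                zos.putNextEntry(new ZipEntry(e.getKey())); // object key becomes the entry name
                e.getValue().transferTo(zos);               // stream the object bytes into the entry
                zos.closeEntry();
            }
        }
        return out.toByteArray();
    }
}
```

For genuinely huge objects you would stream to a temporary file on the EC2 instance's disk rather than buffer the whole archive in memory as shown here.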

Yes, you can use AWS Lambda to do the same thing, but Lambda is bound to a 900-second (15-minute) execution timeout (so it is recommended to allocate more RAM to boost Lambda execution performance).

Traffic from S3 to EC2 instances and other services in the same region is free.

If your main purpose is just to read those files within the same AWS region from EC2 or other services, you don't need this extra step. Just access the files directly.

Note:

It is recommended to access and share files using the AWS API. If you intend to share files publicly, you must take security seriously and impose download restrictions: AWS traffic out to the internet is never cheap.


I'm getting this stack trace when trying to run a mock test using PowerMock:

 Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate []: Factory method   'amazonS3Client' threw exception; nested exception is org.apache.http.conn.ssl.SSLInitializationException: class configured for SSLContext:$TLSContext not a SSLContext
 Caused by: org.apache.http.conn.ssl.SSLInitializationException: class  configured for SSLContext:$TLSContext not a SSLContext
 Caused by: class configured for SSLContext:$TLSContext not a SSLContext

I have tried the suggestion of adding @PowerMockIgnore with org.apache.http.conn.ssl.*, but doing that causes my Rabbit connector to fail. I'm not sure how to have both load for my test, or how to avoid initializing one when it isn't needed for the test.

I am limited in what I can provide since this is something for my company.

Using Amazon SDK: 1.11.69

Here is how I'm configuring my tests

@PowerMockIgnore({ "*", "ch.qos.logback.*",
    "org.slf4j.*" })

Example Bean:

@Configuration
public class S3Configuration {
    @Bean
    public AmazonS3Client amazonS3Client() throws IOException {
        return new AmazonS3Client(new EnvironmentVariableCredentialsProvider());
    }
}

I was able to solve this by adding a custom configuration class that mocks the bean and returns it:

@Configuration
public class TestConfig {

    AmazonS3Client client;

    public TestConfig() {
        client = Mockito.mock(AmazonS3Client.class); // one mock shared by the context
    }

    @Bean
    public AmazonS3Client amazonS3Client() {
        return client;
    }
}

I am trying to set up a service that pulls encrypted values from AWS KMS (Amazon's Key Management Service) given the bucket, key, and region.

Doing this is not the issue I am having, but when I go to unit test it I don't really want to test a third-party method, or write an integration test and call it a unit test.

I want to be able to mock that class to just return back garbage text for testing.

Where I am struggling is with the AmazonS3ClientBuilder.

How can I create a bean that returns an instance of this without doing something like the following?

@Configuration
public class Config {
    @Bean
    public AmazonS3ClientBuilder amazonS3ClientBuilder() {
        return AmazonS3ClientBuilder.standard();
    }
}
Here is how I am currently using this.

AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .withRegion(region)
        .build();

Am I looking at this wrong? Should I be injecting an AmazonS3 client instead of the builder?

Thanks in advance.


To put what we had discussed into an example, how about something like this:

Application configuration

@Configuration
public class Config {
    @Bean
    public AmazonS3 getClient() {
        // TODO: Pull whatever you need from KMS, create credentials, define region, etc.
        return AmazonS3ClientBuilder.standard().withRegion("my region").build();
    }
}


This is what normally gets injected into the application. Instead of injecting the builder, I'm instead injecting a pre-constructed S3Client. Then, in my unit tests I can do:

Sample unit test

public class SampleTest {

    @Configuration
    static class ContextConfiguration {
        @Bean
        public AmazonS3 gets3() {
            System.out.println("Providing mocked s3...");
            // TODO: Provide when/then statements, etc.
            return Mockito.mock(AmazonS3.class);
        }
    }

    @Test
    public void testSomething() {
        // TODO: Test something that utilizes the S3 client
    }
}

This should provide a mocked version of the S3 client when the test is run. It's a bit of a bare-bones example; another option would be to take a similar approach but use @Profile to activate a test profile. Or you could hide all the S3 interaction behind another class and mock that instead, as a means of abstracting the S3 implementation away from classes that need to deal with storage.
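The last option, hiding S3 behind another class, might look like this sketch. StorageService and InMemoryStorageService are hypothetical names; the real implementation (not shown) would delegate to AmazonS3, while tests swap in the in-memory fake, e.g. via a @Profile("test") configuration:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical abstraction: application code depends only on this interface,
// so no caller outside the real implementation ever touches the AWS SDK.
interface StorageService {
    void store(String key, byte[] content);
    byte[] retrieve(String key);
}

// Simple in-memory fake that tests can use instead of a Mockito mock.
class InMemoryStorageService implements StorageService {
    private final Map<String, byte[]> objects = new HashMap<>();

    @Override
    public void store(String key, byte[] content) {
        objects.put(key, content);
    }

    @Override
    public byte[] retrieve(String key) {
        return objects.get(key); // null when the key was never stored
    }
}
```

This keeps the SDK types out of your business code entirely, so the question of mocking AmazonS3 or its builder never comes up in unit tests.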


I recently encountered an issue when using Spring's Pageable and AWS API Gateway together. Spring's URL for sorting on multiple columns looks like this:


By going through AWS API Gateway, it becomes like this:


I tried the following:


But it throws a 500 error with the message "No property asc found for type Item!"

I understand that this is an issue with AWS API Gateway, but it seems AWS has ignored it since 2015, according to this post.

Kindly let me know if you have any alternatives or workarounds, thanks!


Thanks to Oliver Gierke here, the workaround can be


so all fields are sorted in the same direction (asc or desc).
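For illustration, Spring Data reads a single sort parameter as comma-separated property names with an optional trailing direction, which is why one parameter can carry several properties sharing one direction. Below is a minimal sketch of that parsing convention (my own illustration, not Spring's actual implementation):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class SortParam {

    final List<String> properties = new ArrayList<>();
    String direction = "asc"; // Spring Data defaults to ascending

    // Parse "prop1,prop2,desc" into property names plus one shared direction,
    // mirroring the single-parameter form that survives API Gateway.
    static SortParam parse(String raw) {
        SortParam result = new SortParam();
        List<String> parts = new ArrayList<>(Arrays.asList(raw.split(",")));
        String last = parts.get(parts.size() - 1).toLowerCase();
        if (last.equals("asc") || last.equals("desc")) {
            result.direction = last;          // trailing token is the direction
            parts.remove(parts.size() - 1);
        }
        result.properties.addAll(parts);      // everything else is a property name
        return result;
    }
}
```

Mixed directions genuinely need repeated sort parameters, which is exactly what API Gateway mangles; hence the same-direction workaround.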


How do I open an image on my s3 bucket in the browser?

I have set up an Amazon S3 bucket to host my webapp's images. The issue I am having is that when I click on an image's link, it downloads the image instead of displaying it in the browser.

  1. I have set the metadata to be 'Content-Type: image/*'
  2. I have set the permissions to be public

This is the code I am using:

private void uploadFileTos3bucket(String fileName, MultipartFile file) {
    try (InputStream is = file.getInputStream()) {
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setLastModified(new Date());
        metadata.setContentType("image/*");
        s3client.putObject(new PutObjectRequest(bucketName, fileName, is, metadata)
                .withCannedAcl(CannedAccessControlList.PublicRead));
    } catch (IOException e) {
        e.printStackTrace();
    }
}

Does anyone have any other ideas or solutions?


Change your content type to image/png, assuming your image in S3 is in .png format. You need to set a content type matching the actual object: for text, set text/plain; for a PNG image, image/png. Also, the content type should not be binary/octet-stream, otherwise the browser won't display the file and will download it instead.


Is it possible to create an S3-managed Ignite cluster using DefaultAWSCredentialsProviderChain? In Java you can do it like this:

    DefaultAWSCredentialsProviderChain chain = new DefaultAWSCredentialsProviderChain();
    AWSCredentials creds = chain.getCredentials();

I know I can easily create a wrapper class that implements AWSCredentials and uses the provider chain under the hood, like here. Is there a way to do this with just Spring, without the wrapper?

Edit: Here is probably what I'm going to use:

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;

/**
 * Created by Carlos Bribiescas on 10/28/16.
 */
public class DefaultAWSCredentials implements AWSCredentials {

    /** Single shared instance; the private constructor prevents direct instantiation. */
    public static final DefaultAWSCredentials INSTANCE = new DefaultAWSCredentials();

    private DefaultAWSCredentials() {}

    private static class LazyHolder {
        private static final AWSCredentials CREDENTIALS =
                new DefaultAWSCredentialsProviderChain().getCredentials();
    }

    @Override
    public String getAWSAccessKeyId() {
        return LazyHolder.CREDENTIALS.getAWSAccessKeyId();
    }

    @Override
    public String getAWSSecretKey() {
        return LazyHolder.CREDENTIALS.getAWSSecretKey();
    }
}

You should be able to utilize Spring's factory-bean and factory-method attributes for this:

<bean id="aws.cred.chain" class="com.amazonaws.auth.DefaultAWSCredentialsProviderChain"/>

<property name="ipFinder">
    <bean class="org.apache.ignite.spi.discovery.tcp.ipfinder.s3.TcpDiscoveryS3IpFinder">
        <property name="awsCredentials">
            <bean factory-bean="aws.cred.chain" factory-method="getCredentials"/>
        </property>
    </bean>
</property>