Hot questions for using Amazon S3 on Linux

Question:

I tried this:

Downloading Java JDK on Linux via wget is shown license page instead

but I keep getting a 404 error.

This command "sudo amazon-linux-extras install java-openjdk11" just states that amazon-linux-extras doesnt exist.


Answer:

Use one of the OpenJDK distributions:

https://docs.aws.amazon.com/corretto/latest/corretto-11-ug/downloads-list.html

or

https://adoptopenjdk.net/?variant=openjdk11&jvmVariant=hotspot

Question:

I have this scenario: I created a Java servlet to be executed on a Lotus Domino server (just in case: the servlet is OUTSIDE any database; it's in the folder <domino data>/domino/servlet/my_servlet.class). The servlet accesses an S3 server using a credentials file.

When I developed the servlet, I did my tests on a Windows server and everything worked like a charm. But when I ran the same tests on a Linux server using the same credentials and the same servlet, it did not work.

The exception occurred here:

    AWSCredentials credentials = null;
    try {
        credentials = new ProfileCredentialsProvider().getCredentials();
    } catch (Exception e) {
        throw new AmazonClientException(
                "Cannot load the credentials from the credential profiles file. " +
                "Please make sure that your credentials file is at the correct " +
                "location (~/.aws/credentials), and is in valid format.",
                e);
    }

Considering that the Domino server runs as the user notes, I put the credentials in /home/notes/.aws/credentials. Nothing. I put them in /home/ec2-user/.aws/credentials (it's an EC2 server). Nothing again. Same exception.

About the Domino server: it runs as the user notes. The .aws folder and the credentials file are owned by notes, the permissions on the credentials file are 600, and the servlet is owned by notes too.

Do you have any idea how I can resolve this?

TIA,

EDIT: I added these lines to the servlet:

res.setContentType("text/html");        
PrintWriter toBrowser = res.getWriter();        
//etc.
toBrowser.println("HOME: " + System.getProperty("user.home")); 

I got this:

HOME: /home/notes 

I checked this folder again and the credentials are still there.

[root@ip-xxx-xxx-xxx-xxx notes]# ls -l /home/notes/.aws
total 4
-rw------- 1 notes notes 117 Nov 28 03:50 credentials
[root@ip-xxx-xxx-xxx-xxx notes]#

EDIT 2: I added these lines too:

File f = new File(System.getProperty("user.home") + "/.aws/credentials");

if(f.exists()){
    toBrowser.println("Credentials exists" + "<BR/>"); 
}else{
    toBrowser.println("Credentials DOES NOT exist" + "<BR/>"); 
}

And I got this:

Credentials exists

Therefore, the servlet has the right permissions to find the credentials file.

I'm stuck on this...


Answer:

Well, the problem was that the servlet had no access to the environment variables.

That's because of a bug in the Lotus Domino server since version 8.5. The solution was to modify the java.policy file at $JAVA_HOME/lib/security, adding this permission:

grant {
    [...]
    permission java.security.AllPermission;
    [...]
}

Everything works again.
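Separately from the policy fix, a minimal sketch of a workaround, assuming the AWS SDK for Java v1 (which the question's ProfileCredentialsProvider usage suggests) and assuming your SDK version provides the ProfileCredentialsProvider(String, String) constructor: point the provider at an explicit credentials file path so it does not depend on resolving the home directory or environment variables at runtime. The path and profile name below are placeholders taken from this scenario.

    import com.amazonaws.auth.AWSCredentials;
    import com.amazonaws.auth.profile.ProfileCredentialsProvider;

    // Load the "default" profile from an explicit path instead of relying on
    // ~/.aws/credentials being resolved from user.home or environment variables.
    // The path and profile name are assumptions based on the setup described above.
    AWSCredentials credentials =
            new ProfileCredentialsProvider("/home/notes/.aws/credentials", "default").getCredentials();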

Question:

I need to scan an S3 object (JPEG, PDF) in my Java application with the ESET virus scanner.

S3Object s3Object = s3Client.getObject("bucket", "key");

So I get the S3 object, and here is the command line that I should use:

@SBINDIR@/esets_scan [option(s)] FILES

How can I use this command line in a Java application?


Answer:

You can execute the scan using ProcessBuilder:

ProcessBuilder pb = new ProcessBuilder("esets_scan", "s3FileName");

Of course, you have to save your object to a file first and add more code to process the output of the scanner.
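For illustration, here is a minimal sketch of that flow, assuming the AWS SDK for Java v1. The bucket name, object key, and the path to esets_scan are placeholders (use whatever @SBINDIR@ resolves to on your installation), and the meaning of the exit code depends on ESET's documentation.

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.InputStreamReader;
    import java.nio.file.Files;
    import java.nio.file.StandardCopyOption;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.S3Object;

    public class S3VirusScan {
        public static void main(String[] args) throws Exception {
            AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

            // 1. Download the S3 object to a temporary file so the scanner can read it.
            S3Object s3Object = s3Client.getObject("bucket", "key");
            File tmp = File.createTempFile("s3-scan-", ".bin");
            Files.copy(s3Object.getObjectContent(), tmp.toPath(), StandardCopyOption.REPLACE_EXISTING);

            // 2. Run the scanner on the temporary file (the scanner path is an assumption).
            ProcessBuilder pb = new ProcessBuilder("/opt/eset/esets/sbin/esets_scan", tmp.getAbsolutePath());
            pb.redirectErrorStream(true);
            Process process = pb.start();

            // 3. Print the scanner output and its exit code.
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
            System.out.println("esets_scan exit code: " + process.waitFor());

            tmp.delete();
        }
    }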

Question:

I wrote a Java program to list all the buckets and to upload a file to an S3-compatible object storage service. The program works fine on Windows on my local machine, but when I transfer the runnable JAR to the remote Linux server (after changing the path of the file to be uploaded, of course) and execute it, I get the following error:

> Exception in thread "main"
> com.amazonaws.services.s3.model.AmazonS3Exception: The request
> signature we calculated does not match the signature you provided.
> Check your AWS Secret Access Key and signing method. For more
> information, see REST Authentication and SOAP Authentication for
> details. (Service: Amazon S3; Status Code: 403; Error Code:
> SignatureDoesNotMatch; Request ID:
> 4e271b5b-d7f5-42b3-a4ad-886988bcb785; S3 Extended Request ID: null),
> S3 Extended Request ID: null

The issue seems to be in the second half of the program: the list of buckets is returned in the Linux environment as well, but the file upload throws the error.

import java.io.File;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.S3ClientOptions;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.services.s3.model.Bucket;

/**
 * List your Amazon S3 buckets.
 */
public class ListBuckets
{
    private static void listObjects(AmazonS3 s3) {
        List<Bucket> buckets = s3.listBuckets();
        System.out.println("Your Amazon S3 buckets are:");
        for (Bucket b : buckets) {
            System.out.println("* " + b.getName());
        }
    }
    private static void putObject(AmazonS3 s3, String bucketName, String objectName, String pathName) throws Exception
    {
        s3.putObject(bucketName, objectName, new File(pathName));
    }

    private static void time(String t) {
        DateFormat dateFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        Date date = new Date();
        System.out.println(t + "-->" + dateFormat.format(date));
    }

    public static void main(String[] args) throws Exception
    {
        final String accessKey = "XXXXXXXXXXXXXX";
        final String secretKey = "XXXXXXXXXXXXXXXXXXXXXXXX";
        BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
        @SuppressWarnings("deprecation")
        final AmazonS3 s3 = new AmazonS3Client(credentials);
        S3ClientOptions opts = new S3ClientOptions().withPathStyleAccess(true);
        s3.setS3ClientOptions(opts);

        s3.setEndpoint("https://XXXXXX.com");
        ListBuckets.time("startTime");
        ListBuckets.listObjects(s3);
        //String pathName = "C:\\Users\\XXXXXX\\Documents\\New folder\\New Text Document - Copy.txt";
        String pathName = "/home/abcd/XXXXX/objectStorage/CHANGELOG.mdown";
        ListBuckets.putObject(s3, "snap-shot/sample-aws-ex", pathName, pathName);
        ListBuckets.time("end time");
    }
}


Answer:

Unbelievable! You know what the issue was on Linux? The object name and the path name are two different things.

putObject(AmazonS3 s3, String bucketName, String objectName, String pathName)

where pathName is the path of your file, i.e.

String pathName = "/home/abc/xxxxx/objectStorage/errorlog.txt";

Notice that it starts with a forward slash, whereas the object name should not start with /, i.e.

String objectName = "home/abc/xxxxxx/objectStorage/errorlog.txt";

I wish the exception had given better clarity on what was wrong; instead, it only led me away from the root cause.
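As a sketch of the fix applied to the question's own code (the bucket name and file path below are the placeholders from above), build the object key by stripping the leading slash from the path before calling putObject:

    // S3 object keys should not begin with "/"; derive the key from the file path.
    String pathName = "/home/abcd/XXXXX/objectStorage/CHANGELOG.mdown";
    String objectName = pathName.startsWith("/") ? pathName.substring(1) : pathName;
    ListBuckets.putObject(s3, "snap-shot/sample-aws-ex", objectName, pathName);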