Hot questions for using Amazon S3 with Amazon SES

Question:

I have configured AWS SES to store all incoming emails in an S3 bucket with the object key prefix Email. I have a Java application with which I am trying to read all objects in that bucket and then move them to another bucket, so that only the unread emails remain in the original one. I use the following code:

public class FileReadImpl 
{
    private static final Logger logger  = LoggerFactory.getLogger(FileReadImpl.class);

    AmazonS3 s3;

    public void init(String accessKey, String secretKey) 
    {
      s3 = new AmazonS3Client(new BasicAWSCredentials(accessKey, secretKey));
    }


    public List<S3ObjectInputStream> readEmailsAndMoveToRead(String accessKeyId, String secretAccessKey, String incomingBucket, String processedBucket)
    {
        List<S3ObjectInputStream> s3ObjectInputStreamList = new ArrayList<S3ObjectInputStream>();
        AWSCredentials credentials = new BasicAWSCredentials(accessKeyId, secretAccessKey);
        AmazonS3 s3 = new AmazonS3Client(credentials);
        ObjectListing listing = s3.listObjects(incomingBucket, "Email/");
        List<S3ObjectSummary> summaries = listing.getObjectSummaries();

        // keep fetching pages while the listing is truncated
        while (listing.isTruncated())
        {
            listing = s3.listNextBatchOfObjects(listing);
            summaries.addAll(listing.getObjectSummaries());
        }
        for (S3ObjectSummary s3ObjectSummary : summaries)
        {
            String key = s3ObjectSummary.getKey(); // the key of the item
            S3Object object = s3.getObject(
                    new GetObjectRequest(incomingBucket, key));
            S3ObjectInputStream inputStream = object.getObjectContent();
            s3ObjectInputStreamList.add(inputStream);
            if (!s3.doesBucketExist(processedBucket))
            {
                s3.createBucket(processedBucket);
            }
            s3.copyObject(incomingBucket, key, processedBucket, key);
            s3.deleteObject(incomingBucket, key);
            try
            {
                inputStream.close();
            }
            catch (IOException e)
            {
                logger.error(e.toString());
            }
        }
        return s3ObjectInputStreamList;
    }
}

I have another service class that accesses the above class to get the list of emails and store them in my database. The code is shown below:

public void getEmails()
{
    FileReadImpl fileReadImpl = new FileReadImpl();
    List<S3ObjectInputStream> s3ObjectInputStreamList = fileReadImpl.readEmailsAndMoveToRead("accessKeyId", "secretAccessKey", "incomingBucket", "processedBucket");
    for (S3ObjectInputStream s3ObjectInputStream : s3ObjectInputStreamList) 
    {
        //logic to save the email content as emails
    }
}

I am not sure how to get the email content, the sender's details, the CC details, etc. from the S3ObjectInputStream object that I have. How do I process this object to get all the details I need?


Answer:

You have to convert the input stream to a MIME message, which can be done as follows:

for (S3ObjectSummary s3ObjectSummary : summaries) 
{
    String key = s3ObjectSummary.getKey();//getting the key of the item
    S3Object object = s3.getObject(
              new GetObjectRequest(incomingBucket, key));
    InputStream mailFileInputStream = object.getObjectContent();
    String bucketKey = object.getKey();
    MimeMessage message = getMimeMessageForRawEmailString(mailFileInputStream);//converting input stream to mime message
    object.close();
}

public MimeMessage getMimeMessageForRawEmailString(InputStream mailFileInputStream) throws Exception
{
    Properties props = new Properties();
    Session session = Session.getDefaultInstance(props, null);
    MimeMessage message = new MimeMessage(session, mailFileInputStream);
    return message;
}

Once you have the MIME message, use a parser (for example, MimeMessageParser from Apache Commons Email) to read the contents of that message:

MimeMessageParser parser = new MimeMessageParser(message);
parser.parse(); // parse once, then read the individual parts
String from = parser.getFrom();
String htmlContent = parser.getHtmlContent();
System.out.println("Body: " + htmlContent);
String plain = parser.getPlainContent();
System.out.println("Plain: " + plain);

This way, you will be able to read all the contents of an email stored as an object in an S3 bucket.
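
Putting the pieces together, a rough sketch of the whole read-and-store flow could look like the code below. It parses each email inside the loop, before the object is copied and deleted (unlike the code in the question, which closes the streams before returning them), and it assumes Apache Commons Email is on the classpath; EmailProcessor, processEmails and saveEmail are placeholder names, and pagination of truncated listings is omitted for brevity:

import java.io.InputStream;
import java.util.Properties;

import javax.mail.Session;
import javax.mail.internet.MimeMessage;

import org.apache.commons.mail.util.MimeMessageParser;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class EmailProcessor
{
    // Reads every email under the Email/ prefix, saves its details, then archives it.
    public void processEmails(AmazonS3 s3, String incomingBucket, String processedBucket) throws Exception
    {
        ObjectListing listing = s3.listObjects(incomingBucket, "Email/");
        for (S3ObjectSummary summary : listing.getObjectSummaries())
        {
            String key = summary.getKey();
            if (key.endsWith("/"))
            {
                continue; // skip the folder placeholder object, if any
            }
            S3Object object = s3.getObject(new GetObjectRequest(incomingBucket, key));
            try (InputStream in = object.getObjectContent())
            {
                // Build the MIME message straight from the raw email that SES stored.
                MimeMessage message = new MimeMessage(
                        Session.getDefaultInstance(new Properties(), null), in);
                MimeMessageParser parser = new MimeMessageParser(message).parse();

                String from = parser.getFrom();               // sender
                String subject = parser.getSubject();
                String to = parser.getTo().toString();        // "To" recipients
                String cc = parser.getCc().toString();        // "Cc" recipients
                String body = parser.hasHtmlContent()
                        ? parser.getHtmlContent()
                        : parser.getPlainContent();

                saveEmail(from, to, cc, subject, body);       // placeholder for your database logic
            }
            // Archive the object only after it has been read successfully.
            s3.copyObject(incomingBucket, key, processedBucket, key);
            s3.deleteObject(incomingBucket, key);
        }
    }

    private void saveEmail(String from, String to, String cc, String subject, String body)
    {
        // Persist the email however your application requires.
    }
}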

Question:

I am trying to set up an Amazon SES receipt rule set that puts incoming emails into an S3 bucket. I have created an S3 bucket and I want these emails to be sorted into folders according to the recipient address. For example, if an email comes to 1@mydomain.com it should go into mytestbucket/1, and if it comes to 2@mydomain.com it should go into mytestbucket/2.

    AWSCredentials awsCredentials = new BasicAWSCredentials(accessKey, secretKey);
    AmazonSimpleEmailServiceClient sesClient = new AmazonSimpleEmailServiceClient(awsCredentials);
    if (sesClient != null) {
        CreateReceiptRuleRequest req = new CreateReceiptRuleRequest();
        req.withRuleSetName(ruleSetName);
        ReceiptRule rule = new ReceiptRule();
        rule.setEnabled(true);
        rule.setName(customerIdString + "-email");
        rule.withRecipients(customerIdString + "@mydomain.com");
        List<ReceiptAction> actions = new ArrayList<ReceiptAction>();
        ReceiptAction action = new ReceiptAction();
        S3Action s3Action = new S3Action();
        s3Action.setBucketName(mytestbucket);
        s3Action.setObjectKeyPrefix(customerIdString);
        action.setS3Action(s3Action);
        actions.add(action);
        rule.setActions(actions);
        req.setRule(rule);
        CreateReceiptRuleResult response = sesClient.createReceiptRule(req);
        return true;
    }

Whenever I add a customer, I call this method to add a rule to my active rule set. But it looks like only 100 rules can be added. My use case requires at least 100,000. How can I achieve this?

What I am expecting to do is:

  1. Have a single receipt rule which says: whenever an email comes to my subdomain, invoke a Lambda function.

  2. The Lambda function should put the email into the appropriate subfolder of the S3 bucket.


Answer:

Follow these steps to achieve what you desire...

  1. Create a single SES rule to place ALL emails into a single S3 folder unsorted_emails (you can call it anything).

  2. Create a Lambda function that moves each email into its proper folder (see the sketch after these steps).

  3. Set the unsorted_emails prefix as an S3 event source that triggers your Lambda function.

  4. Now, whenever a new email is added to unsorted_emails, your Lambda function will be triggered and will move it into the proper folder.
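
Here is a rough sketch of what step 2 could look like in Java, just to make the flow concrete. It is an illustration under a few assumptions, not AWS-provided code: the catch-all rule is the same CreateReceiptRule code from the question with the whole domain as the recipient and unsorted_emails as the object key prefix, the first "To" address identifies the customer, the v1 AWS SDK and aws-lambda-java-events classes are used, and EmailSorter and the per-customer key layout are placeholder choices you would adapt:

import java.io.InputStream;
import java.util.Properties;

import javax.mail.Address;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;
import com.amazonaws.services.s3.model.S3Object;

public class EmailSorter implements RequestHandler<S3Event, Void>
{
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public Void handleRequest(S3Event event, Context context)
    {
        for (S3EventNotificationRecord record : event.getRecords())
        {
            String bucket = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey();
            try (S3Object object = s3.getObject(bucket, key);
                 InputStream in = object.getObjectContent())
            {
                // Parse the raw email to find out which customer it was sent to.
                MimeMessage message = new MimeMessage(
                        Session.getDefaultInstance(new Properties(), null), in);
                Address[] to = message.getRecipients(Message.RecipientType.TO);
                if (to == null || to.length == 0)
                {
                    continue; // no "To" header; leave the object where it is
                }
                String localPart = ((InternetAddress) to[0]).getAddress().split("@")[0];

                // Move the email into its per-customer folder, e.g. 1/<message id>.
                String newKey = localPart + "/" + key.substring(key.lastIndexOf('/') + 1);
                s3.copyObject(bucket, key, bucket, newKey);
                s3.deleteObject(bucket, key);
            }
            catch (Exception e)
            {
                context.getLogger().log("Failed to sort " + key + ": " + e);
            }
        }
        return null;
    }
}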

Let me know if these steps make sense, if you have any questions, or if I can clarify more.