AWS: Send Email with Simple Email Service

This entry is part 5 of 5 in the series AWS & Java

If you want to send an email using AWS’ Simple Email Service (SES), you need to do the following. This is a very basic example.

Import the following:

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailService;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailServiceClientBuilder;
import com.amazonaws.services.simpleemail.model.Body;
import com.amazonaws.services.simpleemail.model.Content;
import com.amazonaws.services.simpleemail.model.Destination;
import com.amazonaws.services.simpleemail.model.Message;
import com.amazonaws.services.simpleemail.model.SendEmailRequest;

Setup Connection to AWS Simple Email Service:

final AmazonSimpleEmailService simpleEmailService = AmazonSimpleEmailServiceClientBuilder.standard().withRegion(myRegion)
.withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials(accessKeyId, secretKey)))
.build();

Setup Email:

final SendEmailRequest request = new SendEmailRequest().withDestination(new Destination().withToAddresses(TO)).withSource(FROM)
.withMessage(new Message().withSubject(new Content().withCharset("UTF-8").withData(SUBJECT))
.withBody(new Body().withText(new Content().withCharset("UTF-8").withData(BODY))));

Send Email:

simpleEmailService.sendEmail(request);

AWS: Java Post to Kinesis Queue

This entry is part 4 of 5 in the series AWS & Java

Posting to an AWS Kinesis stream is straightforward. As always, you should refer to the AWS documentation.

Put Multiple Records On Queue

Import the following

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesis.AmazonKinesis;
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
import com.amazonaws.services.kinesis.model.PutRecordsRequest;
import com.amazonaws.services.kinesis.model.PutRecordsRequestEntry;
import com.amazonaws.services.kinesis.model.PutRecordsResult;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

Put Records

AmazonKinesisClientBuilder clientBuilder = AmazonKinesisClientBuilder.standard().withRegion(myRegion).withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials(myAccessKeyId, mySecretKey)));
AmazonKinesis kinesisClient = clientBuilder.build();
PutRecordsRequest putRecordsRequest = new PutRecordsRequest();
putRecordsRequest.setStreamName(myQueue);
List<PutRecordsRequestEntry> putRecordsRequestEntryList = new ArrayList<>();


//You can add multiple entries to the list and send them in one request
PutRecordsRequestEntry putRecordsRequestEntry = new PutRecordsRequestEntry();
putRecordsRequestEntry.setData(ByteBuffer.wrap(myData)); //myData is a byte[]
putRecordsRequestEntry.setPartitionKey(myKey);
putRecordsRequestEntryList.add(putRecordsRequestEntry);


putRecordsRequest.setRecords(putRecordsRequestEntryList);
PutRecordsResult putResult = kinesisClient.putRecords(putRecordsRequest);
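A single PutRecords call accepts at most 500 entries, so larger lists need to be sent in batches. Here is a minimal, stdlib-only sketch of the chunking; the helper name is illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class RecordBatcher {
    // Kinesis PutRecords accepts at most 500 entries per request,
    // so split larger lists into chunks and call putRecords once per chunk.
    public static <T> List<List<T>> chunk(final List<T> entries, final int maxPerBatch) {
        final List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < entries.size(); i += maxPerBatch) {
            batches.add(new ArrayList<>(entries.subList(i, Math.min(i + maxPerBatch, entries.size()))));
        }
        return batches;
    }
}
```

You would then loop over the returned batches and call kinesisClient.putRecords once for each one.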

Put Single Record On Queue

Import the following

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.kinesis.AmazonKinesis;
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
import com.amazonaws.services.kinesis.model.PutRecordRequest;
import com.amazonaws.services.kinesis.model.PutRecordResult;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

Put Record

AmazonKinesisClientBuilder clientBuilder = AmazonKinesisClientBuilder.standard().withRegion(myRegion).withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials(myAccessKeyId, mySecretKey)));
AmazonKinesis kinesisClient = clientBuilder.build();
PutRecordRequest putRecordRequest = new PutRecordRequest();
putRecordRequest.setStreamName(myQueue);

putRecordRequest.setData(ByteBuffer.wrap(data.getBytes(StandardCharsets.UTF_8)));
putRecordRequest.setPartitionKey(myKey);

PutRecordResult putResult = kinesisClient.putRecord(putRecordRequest);

You have now put your record(s) onto the stream. Congratulations!

AWS: Java S3 Upload

This entry is part 3 of 5 in the series AWS & Java

If you want to push data to AWS S3, there are a few different ways of doing this. I will show you the two ways I have used.

Option 1: putObject

import com.amazonaws.AmazonClientException;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import java.io.InputStream;

ClientConfiguration config = new ClientConfiguration();
config.setSocketTimeout(SOCKET_TIMEOUT);
config.setMaxErrorRetry(RETRY_COUNT);
config.setClientExecutionTimeout(CLIENT_EXECUTION_TIMEOUT);
config.setRequestTimeout(REQUEST_TIMEOUT);
config.setConnectionTimeout(CONNECTION_TIMEOUT);

AWSCredentialsProvider credProvider = ...;
String region = ...;

AmazonS3 s3Client = AmazonS3ClientBuilder.standard().withCredentials(credProvider).withRegion(region).withClientConfiguration(config).build();

InputStream stream = ...;
String bucketName = ...;
String keyName = ...;
String mimeType = ...;

//You use metadata to describe the data.
final ObjectMetadata metaData = new ObjectMetadata();
metaData.setContentType(mimeType);

//There are overrides available. Find the one that suits what you need.
try {
	s3Client.putObject(bucketName, keyName, stream, metaData);
} catch (final AmazonClientException ex) {
	//Log the exception
}

Option 2: MultiPart Upload

import com.amazonaws.AmazonClientException;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.event.ProgressEvent;
import com.amazonaws.event.ProgressEventType;
import com.amazonaws.event.ProgressListener;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;
import java.io.InputStream;

ClientConfiguration config = new ClientConfiguration();
config.setSocketTimeout(SOCKET_TIMEOUT);
config.setMaxErrorRetry(RETRY_COUNT);
config.setClientExecutionTimeout(CLIENT_EXECUTION_TIMEOUT);
config.setRequestTimeout(REQUEST_TIMEOUT);
config.setConnectionTimeout(CONNECTION_TIMEOUT);

AWSCredentialsProvider credProvider = ...;
String region = ...;

AmazonS3 s3Client = AmazonS3ClientBuilder.standard().withCredentials(credProvider).withRegion(region).withClientConfiguration(config).build();

InputStream stream = ...;
String bucketName = ...;
String keyName = ...;
long contentLength = ...;
String mimeType = ...;

//You use metadata to describe the data. You need the content length so the multipart upload knows how big the stream is
final ObjectMetadata metaData = new ObjectMetadata();
metaData.setContentLength(contentLength);
metaData.setContentType(mimeType);

TransferManager tf = TransferManagerBuilder.standard().withS3Client(s3Client).build();
tf.getConfiguration().setMinimumUploadPartSize(UPLOAD_PART_SIZE);
tf.getConfiguration().setMultipartUploadThreshold(UPLOAD_THRESHOLD);
final Upload xfer = tf.upload(bucketName, keyName, stream, metaData);

ProgressListener progressListener = new ProgressListener() {
	public void progressChanged(ProgressEvent progressEvent) {
		if (xfer == null)
			return;
		
		if (progressEvent.getEventType() == ProgressEventType.TRANSFER_FAILED_EVENT || progressEvent.getEventType() == ProgressEventType.TRANSFER_PART_FAILED_EVENT) {
			//Log the message
		}
	}
};

xfer.addProgressListener(progressListener);
try {
	xfer.waitForCompletion();
} catch (final AmazonClientException | InterruptedException ex) {
	//Log the exception
}
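One thing to keep in mind when choosing UPLOAD_PART_SIZE: S3 caps a multipart upload at 10,000 parts, so the part size bounds the largest object you can send. A quick sanity check is simple ceiling division; the helper name here is illustrative:

```java
public class PartMath {
    // Number of parts the upload will use: ceil(contentLength / partSize).
    // S3 allows at most 10,000 parts per multipart upload.
    public static long partCount(final long contentLength, final long partSize) {
        return (contentLength + partSize - 1) / partSize;
    }
}
```

If partCount comes out above 10,000, increase the part size before starting the transfer.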

AWS: Java Kinesis Lambda Handler

This entry is part 2 of 5 in the series AWS & Java

If you want to write a Lambda for AWS in Java that connects to a Kinesis stream, you need a handler method.

Maven (the Lambda handler interfaces come from the aws-lambda-java libraries, in addition to the AWS SDK):

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.109</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>1.3.0</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.7.1</version>
</dependency>

This is the method that AWS Lambda will call. It will look similar to the one below.

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent.KinesisEventRecord;
import com.amazonaws.services.kinesis.model.Record;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.IOException;

public void kinesisRecordHandler(KinesisEvent kinesisEvent, Context context) throws IOException {
	final String awsRequestId = context.getAwsRequestId();
	final int memoryLimitMb = context.getMemoryLimitInMB();
	final int remainingTimeInMillis = context.getRemainingTimeInMillis();

	for (final KinesisEventRecord kinesisRec : kinesisEvent.getRecords()) {
		final Record record = kinesisRec.getKinesis();

		//We get the kinesis data information. readValue can throw an IOException, hence the throws clause.
		final JsonNode recData = new ObjectMapper().readValue(record.getData().array(), JsonNode.class);

		final String bucketName = recData.get("bucket").asText();
		final String key = recData.get("key").asText();
	}
}

The thing to note when you set up your Lambda is how to fill in the “Handler” field in the “Configuration” section on AWS. It is in the format “##PACKAGE##.##CLASS##::##METHOD##”.
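For example, if the handler above lived in a hypothetical class KinesisLambda in the package com.example.lambda, the “Handler” field would read:

```
com.example.lambda.KinesisLambda::kinesisRecordHandler
```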

AWS: Java S3 Lambda Handler

This entry is part 1 of 5 in the series AWS & Java

If you want to write a Lambda for AWS in Java that connects to S3, you need a handler method.

Maven (the Lambda handler interfaces come from the aws-lambda-java libraries, in addition to the AWS SDK):

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.109</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>1.3.0</version>
</dependency>

This is the method that AWS Lambda will call. It will look similar to the one below.

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.event.S3EventNotification.S3Entity;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;

public void S3Handler(S3Event s3e, Context context) {
	final String awsRequestId =  context.getAwsRequestId();
	final int memoryLimitMb = context.getMemoryLimitInMB();
	final int remainingTimeInMillis = context.getRemainingTimeInMillis();

	for (final S3EventNotificationRecord s3Rec : s3e.getRecords()) {
		final S3Entity record = s3Rec.getS3();
		
		final String bucketName = record.getBucket().getName();
		final String key = record.getObject().getKey();
	}
}

The thing to note when you set up your Lambda is how to fill in the “Handler” field in the “Configuration” section on AWS. It is in the format “##PACKAGE##.##CLASS##::##METHOD##”.

AWS: Node Kinesis Stream

This entry is part 2 of 2 in the series AWS & Node

If you haven’t already done so, please refer to the AWS Node setup tutorial in this series. In this tutorial we will just put something on the Kinesis stream.

We will utilise the AWS variable we created during the setup as part of this series.

First we need to create the variable that connects to our Kinesis in our region.

var kinesis = new AWS.Kinesis({region : ##REGION##});

Next we need to setup a record to send on the Kinesis stream. This will contain our data, key and the stream name.

var recordParams = {
	Data: ##DATA##,
	PartitionKey: ##FILENAME_OR_ID##,
	StreamName: ##STREAM_NAME##
};

Next we need to put the record onto the stream. This is a very basic implementation. Feel free to expand as you need to.

kinesis.putRecord(recordParams, function(err, data) {
	if (err) {
		console.error(err);
	}
	else {
		console.log("done");
	}
});

AWS: Node Setup

This entry is part 1 of 2 in the series AWS & Node

Using Node we can set up connections to AWS. As time goes on I will keep this section updated.

First we need to install the aws-sdk and save in our dependencies. This will make it show up in our package.json file.

npm install aws-sdk --save

Next we need to require the aws-sdk.

var AWS = require('aws-sdk');

Next we update the config to utilise our keys.

AWS.config.update({accessKeyId: ##ACCESS_ID##, secretAccessKey: ##SECRET_KEY##});


AWS: Python S3

This entry is part 3 of 3 in the series AWS & Python

If you haven’t already done so please refer to the AWS setup section which is part of this series. As time goes on I will continually update this section.

To work with S3 you need to utilise the “connection_s3” you set up in the previous tutorial on setting up the connection.

To load a file from a S3 bucket you need to know the bucket name and the file name.

connection_s3.Object(##S3_BUCKET##, ##FILE_NAME##).load()

If you want to check whether a file exists on S3, you can do something like the below. However, you will need to import botocore.

import botocore

def keyExists(key):
    file = connection_s3.Object(##S3_BUCKET##, key)

    try:
        file.load()
    except botocore.exceptions.ClientError:
        exists = False
    else:
        exists = True

    return exists, file

If you want to copy a file from one bucket to another bucket or subfolder, you can do it like below.

connection_s3.Object(##S3_DESTINATION_BUCKET##, ##FILE_NAME##).copy_from(CopySource=##S3_SOURCE_BUCKET## + '/' + ##FILE_NAME##)

If you want to delete the file you can use the “keyExists” function above and then just call “delete”.

##FILE##.delete()

If you just want to get a bucket object, you only need to specify the bucket and utilise the S3 connection.

bucket = connection_s3.Bucket(##S3_BUCKET##)

To upload a file to an S3 bucket, you need to set the body, type and name. Take a look at the example below.

bucket.put_object(Body=##DATA##, ContentType="application/zip", Key=##FILE_NAME##)

Looping over the objects in a bucket is pretty straightforward.

for key in bucket.objects.all():
	file_name = key.key
	response = key.get()
	data = response['Body'].read()

If you want to filter the objects in a bucket, use a prefix.

for key in bucket.objects.filter(Prefix='##PREFIX##').all():
	file_name = key.key
	response = key.get()
	data = response['Body'].read()

AWS: Python Kinesis Streams

This entry is part 2 of 3 in the series AWS & Python

If you haven’t already done so please refer to the AWS setup section which is part of this series. As time goes on I will continually update this section.

To put something on the Kinesis stream you need to utilise the “connection_kinesis” you set up in the previous tutorial on setting up the connection. You will need to set the partition key, data and stream.

response = connection_kinesis.put_record(StreamName=##KINESIS_STREAM##, Data=##DATA##, PartitionKey=##FILE_NAME##)

Depending on your data, you may need to UTF-8 encode it, for example:

bytearray(##MY_DATA##, 'utf8')

To read from the Kinesis stream you need to set up the shard iterator and then retrieve the data from the stream, not forgetting to grab the new shard iterator from the returned records. Remember not to query the stream too fast.

#shardId-000000000000 is the format of the shard ID
shard_it = connection_kinesis.get_shard_iterator(StreamName=##KINESIS_STREAM##, ShardId='shardId-000000000000', ShardIteratorType='LATEST')["ShardIterator"]

recs = connection_kinesis.get_records(ShardIterator=shard_it, Limit=1)

#This is the new shard iterator returned after reading the data from the stream.
shard_it = recs["NextShardIterator"]
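Each shard only supports a limited number of GetRecords calls per second, which is why you should pause between reads. Here is a minimal sketch of that pacing, with the client call passed in as a function so the AWS-specific pieces from the setup tutorial stay out of the way; the function and parameter names are illustrative:

```python
import time

def poll(get_records, shard_it, batches=3, wait_seconds=1.0):
    """Repeatedly fetch records, following NextShardIterator from each
    response and sleeping between calls to stay under the shard's read limit."""
    collected = []
    for _ in range(batches):
        out = get_records(ShardIterator=shard_it, Limit=100)
        collected.extend(out["Records"])
        shard_it = out["NextShardIterator"]
        time.sleep(wait_seconds)
    return collected
```

In real use you would pass connection_kinesis.get_records as the first argument.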


AWS: Python Setup

This entry is part 1 of 3 in the series AWS & Python

When you want to work with S3 or a Kinesis stream, you first need to set up the connection. At the time of this writing I am using boto3 version 1.3.1.

Next we need to import the package.

import boto3

Next we set up the session and specify what profile we will be using.

profile = boto3.session.Session(profile_name='prod')

The profile name comes from the “credentials” file. You can set the environment variable “AWS_SHARED_CREDENTIALS_FILE” to specify what credentials file to use. You can set up the credentials file like below. You can change the “local” to anything you want. I normally use “stage”, “dev” or “prod”.

[local]
aws_access_key_id=##KEY_ID##
aws_secret_access_key=##SECRET_ACCESS_KEY##
region=##REGION##

Next we need to set up the connection to S3. To do this we will need to use the profile we created above.

connection_s3 = profile.resource('s3')

If we also want to use a Kinesis stream then we need to set up that connection. To do this we will need the profile we created above.

connection_kinesis = profile.client('kinesis')