
I have an API that receives a payload in a POST request and inserts it into an Amazon SQS queue so that the message can be consumed by a different service. The basic idea is that the API needs to return a response to the caller quickly and defer the heavy processing until later.
POST request model:

class Company
{
    public int CompanyId { get; set; }
    public List<Person> People { get; set; }
}

class Person
{
    public int Id { get; set; }
    public int CompanyId { get; set; }
    public string FirstName { get; set; }
    ....
    ....
}

My initial thought was to insert the whole Company into SQS. But the number of people working at certain companies can be huge, so inserting the whole Company into SQS will not be possible because of the 256 KB message size limit imposed by SQS.
So right now I am changing the API so that instead of inserting a Company into SQS, it inserts a single Person, and the consumer handles the Company-Person relationship. Example: if a company with CompanyId: 1 has 1000 Persons, then the API would insert 1000 messages into SQS (one message per person, each person having its CompanyId property set to 1).
This works for companies that exceed 256 KB after serialization, but it is very inefficient for companies that come in under the limit.
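The per-person fan-out described above can at least be batched: SQS's SendMessageBatch accepts up to 10 messages per call, cutting the number of API round trips tenfold. A minimal sketch using the AWS SDK for .NET (the queue URL is a placeholder, and System.Text.Json is an assumed serializer choice):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading.Tasks;
using Amazon.SQS;
using Amazon.SQS.Model;

public static class PersonFanOut
{
    // Sends one SQS message per Person, batched 10 at a time
    // (10 is the SendMessageBatch maximum).
    public static async Task SendPeopleAsync(IAmazonSQS sqs, string queueUrl, Company company)
    {
        foreach (var chunk in company.People
            .Select((person, index) => (person, index))
            .GroupBy(x => x.index / 10, x => x.person))
        {
            var request = new SendMessageBatchRequest
            {
                QueueUrl = queueUrl,
                Entries = chunk.Select(p => new SendMessageBatchRequestEntry
                {
                    Id = p.Id.ToString(), // must be unique within the batch
                    MessageBody = JsonSerializer.Serialize(p)
                }).ToList()
            };
            await sqs.SendMessageBatchAsync(request);
        }
    }
}
```

This does not solve the inefficiency for small companies by itself, but it reduces the per-message overhead of the fan-out approach.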

Question: What should my strategy be to handle companies of all sizes efficiently (both companies that produce JSON under 256 KB after serialization and companies that produce JSON over 256 KB)?

NOTE: I am aware that there is a library for Java that supports SQS message sizes up to 2 GB, but I could not find an equivalent official .NET library from AWS that could handle this scenario.

3 Answers


That official library for handling larger SQS messages is Java-only, I believe, but you can reproduce its functionality fairly easily. Here's the pseudocode you would need to implement in your application:

if (JSON size > 256kb) {
    push JSON to S3
    send SQS message with the S3 URL in it
} else {
    send SQS message with the JSON in it
}

Then your SQS client receiving the messages would just need to check whether the JSON is in the message itself, or whether it needs to be downloaded from S3.
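The pseudocode above can be sketched in C# with the AWS SDK for .NET (AWSSDK.SQS and AWSSDK.S3). The "PayloadLocation" attribute, bucket name, and key scheme are illustrative choices for this sketch, not a standard:

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.SQS;
using Amazon.SQS.Model;

public class LargePayloadSender
{
    // The SQS limit is 256 KiB; leave some headroom for message attributes.
    private const int MaxSqsBytes = 250 * 1024;

    private readonly IAmazonSQS _sqs;
    private readonly IAmazonS3 _s3;
    private readonly string _queueUrl;
    private readonly string _bucket; // placeholder bucket name

    public LargePayloadSender(IAmazonSQS sqs, IAmazonS3 s3, string queueUrl, string bucket)
    {
        _sqs = sqs;
        _s3 = s3;
        _queueUrl = queueUrl;
        _bucket = bucket;
    }

    public async Task SendAsync(string json)
    {
        if (Encoding.UTF8.GetByteCount(json) > MaxSqsBytes)
        {
            // Payload too large: store it in S3 and send a pointer instead.
            var key = $"payloads/{Guid.NewGuid()}.json";
            await _s3.PutObjectAsync(new PutObjectRequest
            {
                BucketName = _bucket,
                Key = key,
                ContentBody = json
            });
            await _sqs.SendMessageAsync(new SendMessageRequest
            {
                QueueUrl = _queueUrl,
                MessageBody = $"{{\"s3Bucket\":\"{_bucket}\",\"s3Key\":\"{key}\"}}",
                // Flag so the consumer knows to fetch the payload from S3.
                MessageAttributes =
                {
                    ["PayloadLocation"] = new MessageAttributeValue
                    {
                        DataType = "String",
                        StringValue = "S3"
                    }
                }
            });
        }
        else
        {
            await _sqs.SendMessageAsync(new SendMessageRequest
            {
                QueueUrl = _queueUrl,
                MessageBody = json
            });
        }
    }
}
```

On the consumer side, check for the "PayloadLocation" attribute: if it is set to "S3", download the object from the bucket/key in the message body, process it, and delete the S3 object along with the message.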




I think you are talking about the Extended Client Library for Java, which can send payloads up to 2 GB.

What this library does can easily be implemented in other programming languages as well. Essentially, you save the huge payload in an S3 bucket and send an SQS message with a pointer to the location from which the payload can be downloaded.



For .NET developers, the library with functionality similar to the one mentioned by Ervin Szilagyi above is the Amazon SQS Extended Client Library for .NET.

It enables you to store message payloads in S3 and hence overcomes the SQS message size limitation. With this library you can:

  1. Specify whether message payloads should always be stored in S3 or only when the message size exceeds a configurable threshold.
  2. Send a message and store its payload in an S3 bucket.
  3. Receive a stored message from the S3 bucket transparently.
  4. Delete the stored payload from the S3 bucket.


