Author: Darvesh Lodhi
The SQS service has a message size limit of 256 KB. To handle larger messages (up to 1 GB here), we use an S3 bucket: the payload is stored in S3 and the SQS message carries a reference to it.
Prerequisite:
- An AWS account with access to SQS and an S3 bucket (Free Tier).
- Basic knowledge of Anypoint Studio and AWS.
AWS SQS:
- Create an SQS queue named ‘Demo-sqs-queue.’
- Generate access keys and access secrets for your IAM user.
AWS S3 Bucket:
- Create an S3 bucket named ‘Demo-s3-bucket.’
Note: Using the S3 bucket may incur AWS charges.
What do we resolve here?
- Our goal is to send large files through SQS, using an S3 bucket to store them, and retrieve them on the receiving side.
To send a large CSV file, the MuleSoft implementation consists of two flows:
Sender flow:

Receiver flow:

Sender Flow: The first flow sends large messages/files to the Amazon SQS queue, referencing the AWS S3 bucket used to store them.
Creating the flow involves creating a new Mule project and configuring the following components and operations:
- HTTP > Listener component to initiate the flow.
- File > Read component to read the file.
- Transform Message component to convert the input file to CSV/Text output.
- A second Transform Message component to build the SQS Message structure that the Send Message operation accepts.
- Amazon SQS > Send Message operation to send the message to the specified queue.
Receiver Flow: The receiver flow uses the same queue and S3 bucket to retrieve the messages/files.
So, create a flow that receives the message and writes the file as CSV.
- Amazon SQS > Receive Message operation to retrieve messages from the queue.
- File > Write component to write the file in CSV format.
Let’s start:
Create Mule Project:
1. Open Anypoint Studio, select File > New > Mule Project.
2. Enter a name for your Mule project and click Finish.
Sender Flow
Add HTTP Listener and configure:
1. In the Listener configuration, keep the defaults: host localhost and port 8081.
Path: /Demo-SQS
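For reference, the equivalent Mule XML configuration would look roughly like this (the config and flow names are illustrative; Studio generates its own):

```xml
<!-- Hypothetical global HTTP listener config using the host/port above -->
<http:listener-config name="HTTP_Listener_config">
    <http:listener-connection host="localhost" port="8081"/>
</http:listener-config>

<!-- Sender flow entry point listening on /Demo-SQS -->
<flow name="demo-sqs-sender-flow">
    <http:listener config-ref="HTTP_Listener_config" path="/Demo-SQS"/>
    <!-- File Read, Transform Message, and SQS Send Message go here -->
</flow>
```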

Add Read Component and configure:
1. In Read configuration, click + next to the File Configuration field to add a global element.
In the File Config dialog, click OK to accept the defaults (no need to set a working directory).
2. Configure the following General fields:
- Display Name: Name to display in canvas.
- Connector Configuration: Global configuration for read file operation.
- File Path: Path to the file to read.
3. The file data1.csv contains about 90 MB of records.
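In XML, the Read operation might look like this sketch (the config name and relative path are assumptions based on the steps above):

```xml
<!-- Default file config; no working directory set -->
<file:config name="File_Config"/>

<!-- Read the large CSV from disk -->
<file:read config-ref="File_Config" path="data1.csv"/>
```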

Add Transform Message:
1. In the Transform Message, write the code below to convert the payload to CSV/Text.
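The article’s exact script is not reproduced here; a minimal sketch that serializes the file contents as CSV would be:

```dataweave
%dw 2.0
output application/csv
---
// Pass the payload read from data1.csv through as CSV text
payload
```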

Add a second Transform Message:
1. In the Transform Message, write the code below to build the SQS Message object as Java.
%dw 2.0
output application/java
---
{
    delaySeconds: 0,
    body: payload,
    messageAttributes: {
        "AccountId": {
            "stringValue": "000123456",
            "dataType": "String.AccountId"
        } as Object {
            class: "org.mule.extension.sqs.api.model.MessageAttributeValue"
        },
        "NumberId": {
            "stringValue": "230.000000000000000001",
            "dataType": "Number"
        } as Object {
            class: "org.mule.extension.sqs.api.model.MessageAttributeValue"
        }
    } as Object {
        class: "java.util.HashMap"
    }
} as Object {
    class: "org.mule.extension.sqs.api.model.Message"
}
Add & Configure SQS Send Message Operation:
1. In the Mule Palette, search for Amazon SQS and add the Send message operation after the second Transform Message component (if you cannot find SQS, click Add Module and add Amazon SQS).

2. In the Send message configuration, click + next to the Connector configuration field to add a global element.
Session Token: Session token used to validate the temporary security credentials.
Access Key: Alphanumeric text string that uniquely identifies the user who owns the account.
Secret Key: Key that acts as a password.
Region Endpoint: Queue region.
Default Global Queue URL: The URL of the Amazon SQS queue.

3. Large Payload Support: Edit inline.
Bucket: Provide S3 bucket name.

4. Configure the Send message with SQS queue properties and Queue URL as shown below.
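In XML, the Send message operation is roughly as follows (the queue URL property is a placeholder; the message body comes from the Message object built by the previous transform):

```xml
<!-- Sends the payload (an SQS Message object) to Demo-sqs-queue;
     ${sqs.queue.url} is a placeholder for your actual queue URL -->
<sqs:send-message config-ref="Amazon_SQS_Configuration" queueUrl="${sqs.queue.url}"/>
```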

Receiver Flow:
Create a Flow to Receive Messages
1. Drag the Receive message component from Amazon SQS in the Mule Palette and drop it onto the canvas as a new flow.

2. While receiving the message or file, we use the same queue and bucket.
3. Configure the following fields in the General properties window:
Display Name: Name that displays for the connector operation.
Connector Configuration: Global configuration you created previously. For this example, use Amazon_SQS_Large_Payload_Configuration.
Number of Messages: Number of messages to receive. For this example, it is 10.
Queue URL: The Amazon SQS queue URL.
4. Use the same Amazon SQS Configuration global properties for receiving messages.

Add File Write Component
1. Use the same File_Config connector configuration and provide a file path ending in .csv (for example, data2.csv).
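In XML, the Write operation might look like this sketch (the output path is assumed from the final verification step):

```xml
<!-- Writes the received message body out as a CSV file -->
<file:write config-ref="File_Config" path="data2.csv"/>
```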

Save the whole project.
Run the project and call http://localhost:8081/Demo-SQS.
You will get the output. Check that the file data2.csv contains the same data as data1.csv.