By Tommy Gingras

Last update 2022-10-09

Notes · Serverless · SQS

Serverless framework SQS Offline Plugin

I ran into a few issues; in summary, here is the setup that ended up working for me.

Bundle

I use serverless v3 with Node 14.

potato@-_- crawler % serverless info
Environment: darwin, node 14.19.3, framework 3.19.0, plugin 6.2.2, SDK 4.3.2
Credentials: Local, "---" profile
Docs:        docs.serverless.com
Support:     forum.serverless.com
Bugs:        github.com/serverless/serverless/issues

The "official" serverless offline SQS plugin is broken... There is an alternative sls-offline-aws-sqs thanks a lot !!

...
  "devDependencies": {
    "serverless-offline": "^10.3.2",
    // "serverless-offline-sqs": "^6.0.0", --> Doesn't work !
    "sls-offline-aws-sqs": "^1.0.2"
  }
...
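
To make the same swap, installing the fork should be as simple as the following (assuming npm is the package manager):

npm install --save-dev serverless-offline sls-offline-aws-sqs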

serverless.yml

Here are only the parts related to the offline SQS setup.

...
plugins:
  # - serverless-offline-sqs
  - sls-offline-aws-sqs
  - serverless-offline

custom:
  sqs:
    name: "${self:service}-${sls:stage}-dispatch"
    name_dlq: "${self:service}-${sls:stage}-dispatch-dlq"
    urls:
      true: "http://localhost:9324/000000000000/${self:service}-${sls:stage}-dispatch"
      false: !Ref SitemapQueue
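    # ${env:IS_OFFLINE, false} picks the matching key above: local ElasticMQ when offline, the deployed queue otherwise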
    url: "${self:custom.sqs.urls.${env:IS_OFFLINE, false}}"
  serverless-offline-sqs:
    autoCreate: true # create queue if not exists
    apiVersion: "2012-11-05"
    endpoint: http://127.0.0.1:9324
    region: "ca-central-1"
    accessKeyId: root
    secretAccessKey: root
    skipCacheInvalidation: false


functions:
  mySuperFunction:
    ...
    environment:
      QUEUE_URL: ${self:custom.sqs.url}
      QUEUE_ENDPOINT: ${env:QUEUE_ENDPOINT, ""}
    ...

...

resources:
  Resources:
    SitemapQueue:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: ${self:custom.sqs.name}
        VisibilityTimeout: 300
        ReceiveMessageWaitTimeSeconds: 20
        RedrivePolicy:
          deadLetterTargetArn:
            Fn::GetAtt:
              - "DlqQueue"
              - "Arn"
          maxReceiveCount: 100

    DlqQueue:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: "${self:custom.sqs.name_dlq}"
        VisibilityTimeout: 300
        ReceiveMessageWaitTimeSeconds: 20

...

Using that configuration I was able to use SQS locally: the custom.sqs.urls map resolves to the local ElasticMQ URL when IS_OFFLINE is true and to the deployed SitemapQueue URL otherwise. But that is not all.

The overcomplicated part is the following. If you have a better approach, please share it!


Partial Lambda code, in case it is pertinent for you:

const { v5: uuidv5 } = require("uuid");
const { SQSClient, SendMessageBatchCommand } = require("@aws-sdk/client-sqs");
const AWSXRay = require("aws-xray-sdk");

const client = AWSXRay.captureAWSv3Client(
  new SQSClient({
    region: process.env.AWS_REGION,
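    // When QUEUE_ENDPOINT is set (offline run), point the client at the local ElasticMQ; otherwise let the SDK resolve the default AWS endpoint.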
    endpoint:
      process.env.QUEUE_ENDPOINT !== ""
        ? process.env.QUEUE_ENDPOINT ?? null
        : null,
  })
);

const { QUEUE_URL } = process.env;

// ...

client.send(
  new SendMessageBatchCommand({
    Entries: [...],
    QueueUrl: QUEUE_URL,
  })
);
// ...
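
The Entries array is elided above on purpose; for completeness, here is a hedged sketch of what building it could look like. The urls list, the JSON body shape, and the use of uuidv5 (imported above but not shown) are my assumptions, not the original code:

// Hypothetical input: the real list of items comes from the service itself.
const urls = ["https://example.com/a", "https://example.com/b"];

// Each batch entry needs a unique Id (max 80 characters) and a MessageBody;
// uuidv5 with the built-in URL namespace gives a deterministic Id per URL.
const Entries = urls.map((url) => ({
  Id: uuidv5(url, uuidv5.URL),
  MessageBody: JSON.stringify({ url }),
}));

Those Entries then go into the SendMessageBatchCommand call above (SQS accepts at most 10 entries per batch).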

Run Local Script (the needlessly complicated script...)

I should have used Docker, but even then I am not 100% sure it would reduce the complexity. I also saw that the create-queue commands can be replaced by a configuration file: https://github.com/softwaremill/elasticmq#automatically-creating-queues-on-startup

But I wanted to unblock my development flow, so I know it could be improved. I might do it if I reuse this setup in other services.
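
For reference, the automatic queue creation mentioned above would replace the create-queue calls with a small ElasticMQ config file. A hedged sketch, adapted from the ElasticMQ README and not tested here (the queue names simply follow the <service>-<stage>-dispatch pattern used in this post):

include classpath("application.conf")

queues {
  my-service-dev-dispatch {
    defaultVisibilityTimeout = 300 seconds
    receiveMessageWait = 20 seconds
    deadLettersQueue {
      name = "my-service-dev-dispatch-dlq"
      maxReceiveCount = 100
    }
  }
  my-service-dev-dispatch-dlq {
    defaultVisibilityTimeout = 300 seconds
  }
}

It would be loaded with something like java -Dconfig.file=custom.conf -jar elasticmq-server-1.3.9.jar. In the meantime, here is the script I actually used: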

#!/bin/bash

SERVICE_NAME=""
STAGE=""
AWS_REGION=""

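# Stop the background ElasticMQ process when this script exits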
trap 'kill $(jobs -p)' EXIT

# docker run --name elasticmq -p 9324:9324 -p 9325:9325 softwaremill/elasticmq-native
java -jar elasticmq-server-1.3.9.jar &

sleep 3

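# ElasticMQ does not validate credentials; these dummy values only exist so the AWS CLI will run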
export AWS_ACCESS_KEY_ID=root
export AWS_SECRET_ACCESS_KEY=root
export AWS_SESSION_TOKEN=root
aws sqs create-queue --queue-name ${SERVICE_NAME}-${STAGE}-dispatch-dlq --endpoint-url http://127.0.0.1:9324 --region ${AWS_REGION}
aws sqs create-queue --queue-name ${SERVICE_NAME}-${STAGE}-dispatch --endpoint-url http://127.0.0.1:9324 --region ${AWS_REGION}

unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
unset AWS_SESSION_TOKEN

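# Silence X-Ray locally. IS_OFFLINE selects the local queue URL in serverless.yml,
# and QUEUE_ENDPOINT points the Lambda's SQS client at the local ElasticMQ instance.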
export AWS_XRAY_CONTEXT_MISSING=LOG_ERROR
export AWS_XRAY_LOG_LEVEL=silent
export IS_OFFLINE=true
export QUEUE_ENDPOINT="http://localhost:9324/"

sleep 5
NODE_ENV=test SLS_DEBUG=* sls offline --stage=${STAGE}

Using that script, everything starts and gets prepared. Then, using a curl command, I tested my flow successfully. Hooray!
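
To double-check without going through the whole flow, a small Node script can read messages back from the local ElasticMQ queue. This is my own addition, not part of the original setup, and the queue URL placeholder has to match your service and stage:

// check-queue.js - peek at the local ElasticMQ queue (offline setup only)
const { SQSClient, ReceiveMessageCommand } = require("@aws-sdk/client-sqs");

const client = new SQSClient({
  region: "ca-central-1",
  endpoint: "http://localhost:9324",
  // ElasticMQ does not validate credentials; any values work.
  credentials: { accessKeyId: "root", secretAccessKey: "root" },
});

(async () => {
  const { Messages } = await client.send(
    new ReceiveMessageCommand({
      // Replace <service> and <stage> with your own values.
      QueueUrl: "http://localhost:9324/000000000000/<service>-<stage>-dispatch",
      MaxNumberOfMessages: 10,
      WaitTimeSeconds: 5,
    })
  );
  console.log(Messages ?? "no messages");
})();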

I hope it will be simpler in the future.

