Lambda Function on AWS (Part 2)

Abdullah Ayad
3 min read · Oct 15, 2023

See part 1 here.

Creating Parallel Step Type

Recall that in the example scenario you have three main tasks to perform:

  • Generate a report
  • Update DynamoDB tables
  • Log a metric to CloudWatch

The first two are completely independent tasks, whereas the last must run only after the previous tasks are completed. To minimize the time needed to complete the entire process, you will generate the report and update the database in parallel. Running these in parallel is possible precisely because the two tasks do not depend on each other.

Below you can see the JSON snippet that defines the Parallel task, along with its branches:

{
  "Comment": "An example of the Amazon States Language using a parallel state to execute two branches at the same time.",
  "StartAt": "StartTask",
  "States": {
    "StartTask": {
      "Type": "Parallel",
      "Next": "CWMetric",
      "Branches": [...]
    }
  }
}

The JSON template that defines a flow is composed of several fields, each describing a different aspect of the flow you are building.
The StartAt field names the state where the flow begins, and the States object contains the definitions of all reachable states.

Going back to the example, the flow begins at the StartTask state. This is a Parallel state (as you can see from its Type property), so it is made up of branches. Branches are a very powerful feature of Step Functions: each one defines a sub-flow that can contain any kinds of tasks you need. Also note that every branch has both a StartAt and a States property. In StartAt, you declare where the branch starts; in States, you declare every state of the branch.

The first branch in the JSON template is Gen Report, a very simple one whose Type is Task. A Task state represents a Lambda function, so you have to specify the function's Amazon Resource Name (ARN) in its Resource field.

You can see this in the Branches array of the template below:

{
  "StartTask": {
    "Type": "Parallel",
    "Next": "CWMetric",
    "Branches": [
      {
        "StartAt": "Gen Report",
        "States": {
          "Gen Report": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
            "End": true
          }
        }
      }
      """ UpdateDB branch goes here """
    ]
  }
}
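For orientation, here is a sketch of what the complete Branches array could look like once the second branch is added. The UpdateDB state name and its ARN below are illustrative placeholders, not the article's actual definition:

```
"Branches": [
  {
    "StartAt": "Gen Report",
    "States": {
      "Gen Report": {
        "Type": "Task",
        "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
        "End": true
      }
    }
  },
  {
    "StartAt": "UpdateDB",
    "States": {
      "UpdateDB": {
        "Type": "Task",
        "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:UPDATE_FUNCTION_NAME",
        "End": true
      }
    }
  }
]
```

Note that each branch is a self-contained state machine: the terminal state of every branch must be marked with "End": true, and the branches run concurrently once StartTask is entered.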

In order to implement the flow, you will have to create a Lambda Function that executes this task.

Note: This Lambda function will utilize an S3 bucket. Before creating the Lambda function, you need to know what S3 bucket to use.

Instructions

  1. In the AWS Management Console search bar, enter S3, and click the S3 result under Services.
  2. Create your own bucket.
  3. In the AWS Management Console search bar, enter Lambda, and click the Lambda result under Services.
  4. Click Create function.
  5. Choose Author from scratch and fill in the fields as given below:

Function name: GenerateReport
Runtime: Python 3.7

6. Expand the Change default execution role drop-down, then select an existing role or create a new one.

7. Click Create function.

8. Navigate to the Code source section and double-click lambda_function.py. You will replace the contents with the code below:

import boto3

bucket_name = "BUCKET_NAME"
s3_client = boto3.client('s3')

def lambda_handler(event, context):
    level = event.get('level')
    user_id = event.get('user_id')
    score = event.get('score')
    max_score = event.get('max_score')
    report = 'Completed Level: %s\nMy Score: %s/%s\n' % (level, score, max_score)
    s3_client.put_object(
        ACL='public-read',
        Bucket=bucket_name,
        Key="%s_report_%s.txt" % (user_id, level),
        Body=report
    )
    return event

Note: Keep the quotation marks.

9. Replace BUCKET_NAME with your S3 bucket name.

10. To deploy your function, click Deploy

Consider what this function receives as its event parameter. In general, each function in the flow receives the output of the preceding function as its input. The first function of the flow (as in this case), however, receives the event provided when the flow starts. At the end of the lab, you will see that you can provide a start event when you start an execution.
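Before moving on, it can help to see what the handler computes from a concrete start event. The snippet below is a minimal local simulation: the event field values are invented examples, and the put_object upload is left out so the code runs without AWS credentials.

```python
# Sample start event, similar to what you might provide when starting an
# execution (field values here are made-up examples).
event = {
    "level": 5,
    "user_id": "user-42",
    "score": 3,
    "max_score": 8,
}

# Reproduce the report body and S3 object key that lambda_handler builds;
# the actual put_object call is omitted so the snippet runs anywhere.
level = event.get('level')
user_id = event.get('user_id')
score = event.get('score')
max_score = event.get('max_score')

report = 'Completed Level: %s\nMy Score: %s/%s\n' % (level, score, max_score)
key = "%s_report_%s.txt" % (user_id, level)

print(key)  # -> user-42_report_5.txt
```

Running this shows the object key and report text the function would upload for that event; the real handler then returns the event unchanged so it can flow to the next state.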

Looking at the code, you can see that this Lambda function is very simple: it only calls the S3 put_object API to upload the short report.
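One detail worth keeping in mind for the CWMetric step: when all branches of a Parallel state finish, Step Functions collects their outputs into a single array, which becomes the input of the state named in Next. Since GenerateReport returns its input event unchanged, the combined output would look roughly like this (the UpdateDB entry is a hypothetical placeholder):

```
[
  { "level": 5, "user_id": "user-42", "score": 3, "max_score": 8 },
  { "...": "output of the UpdateDB branch" }
]
```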

Creating Conditional Step Type

See part 3 here.



Abdullah Ayad

Machine Learning and AI(NLP & Computer Vision) Engineer, AWS Community Builder, Kaggle Expert