I have a use case where I have an AWS Step Function that is triggered when a file is uploaded to S3. From there, the first step runs ffprobe to get the duration of the file via an external service such as transloadit, which writes the output back to S3.
I can create a new step function from that event, but I was wondering if it is possible to await a promise inside the original step function and then continue to the next step, taking into account that the ffprobe result could take a while to come back.
Any advice on how to tackle this is much appreciated.
AWS Step Functions now supports asynchronous callbacks for long-running steps as a first-class feature.
This is similar to @mixja's answer above, but simplified. A single state in your workflow can directly invoke Lambda, SNS, SQS, or ECS and wait for a call to SendTaskSuccess.
There is a good example documented for SQS, where a step function sends a message and pauses workflow execution until something provides a callback. Lambda would be equivalent (assuming the main processing, like transloadit, happens outside the Lambda itself).
Your step function definition would look like
"Invoke transloadit": {
"Type": "Task",
"Resource": "arn:aws:states:::lambda:invoke.waitForTaskToken",
"Parameters": {
"FunctionName": "InvokeTransloadit",
"Payload": {
"some_other_param": "...",
"token.$": "$$.Task.Token"
}
},
"Next": "NEXT_STATE"
}
Then in your Lambda you would do something like
def lambda_handler(event, context):
    token = event['token']
    # invoke transloadit via SSM, ECS, etc., passing the token along
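To make that hand-off concrete, here is a rough sketch of one way the Lambda could pass the token to the worker that actually calls transloadit, using an SQS message. The queue URL environment variable and message shape are assumptions for illustration, not part of the original setup:

import json
import os
import boto3

sqs = boto3.client('sqs')

def lambda_handler(event, context):
    # The task token injected by the state machine via "token.$": "$$.Task.Token"
    token = event['token']

    # Hand the token to whatever long-running process calls transloadit.
    # WORKER_QUEUE_URL is a hypothetical environment variable.
    sqs.send_message(
        QueueUrl=os.environ['WORKER_QUEUE_URL'],
        MessageBody=json.dumps({
            'task_token': token,
            'input_key': event.get('some_other_param'),
        }),
    )

    # Returning here does not advance the workflow; the state keeps waiting
    # until SendTaskSuccess / SendTaskFailure is called with this token.
    return {'status': 'dispatched'}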
Then, in your main long-running process, you would issue a callback with the token, e.g. aws stepfunctions send-task-success --task-token "$token" --task-output '{}' from a shell script / the CLI (note that --task-output is required; its JSON becomes the state's result), or the equivalent API call.
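If the long-running side is Python rather than a shell script, the same callback can be made with boto3. A minimal sketch, where the output payload and error name are just illustrations:

import json
import boto3

sfn = boto3.client('stepfunctions')

def report_result(task_token, duration_seconds):
    # Resume the waiting state; the JSON passed as output becomes the
    # state's result in the workflow.
    sfn.send_task_success(
        taskToken=task_token,
        output=json.dumps({'duration': duration_seconds}),
    )

def report_failure(task_token, error_message):
    # Or fail the state so the workflow's Catch/Retry handling kicks in.
    sfn.send_task_failure(
        taskToken=task_token,
        error='FfprobeFailed',
        cause=error_message,
    )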