I upload my Lambda function sources from AWS CodeBuild. My Python script uses NLTK, so it needs a lot of data. My .zip package is too big and a RequestEntityTooLargeException occurs. I want to know how to increase the size of the deployment package sent via the UpdateFunctionCode command.
I use AWS CodeBuild to build the source from a GitHub repository and deploy it to AWS Lambda. Here is the associated buildspec file:
version: 0.2

phases:
  install:
    commands:
      - echo "install step"
      - apt-get update
      - apt-get install zip -y
      - apt-get install python3-pip -y
      - pip install --upgrade pip
      - pip install --upgrade awscli
      # Define directories
      - export HOME_DIR=`pwd`
      - export NLTK_DATA=$HOME_DIR/nltk_data
  pre_build:
    commands:
      - echo "pre_build step"
      - cd $HOME_DIR
      - virtualenv venv
      - . venv/bin/activate
      # Install modules
      - pip install -U requests
      # NLTK download
      - pip install -U nltk
      - python -m nltk.downloader -d $NLTK_DATA wordnet stopwords punkt
      - pip freeze > requirements.txt
  build:
    commands:
      - echo 'build step'
      - cd $HOME_DIR
      - mv $VIRTUAL_ENV/lib/python3.6/site-packages/* .
      - sudo zip -r9 algo.zip .
      - aws s3 cp --recursive --acl public-read ./ s3://hilightalgo/
      - aws lambda update-function-code --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight --zip-file fileb://algo.zip
      - aws lambda update-function-configuration --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight --environment 'Variables={NLTK_DATA=/var/task/nltk_data}'
  post_build:
    commands:
      - echo "post_build step"
When I launch the pipeline, I get a RequestEntityTooLargeException because there is too much data in my .zip package. See the build logs below:
[Container] 2019/02/11 10:48:35 Running command aws lambda update-function-code --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight --zip-file fileb://algo.zip
An error occurred (RequestEntityTooLargeException) when calling the UpdateFunctionCode operation: Request must be smaller than 69905067 bytes for the UpdateFunctionCode operation
[Container] 2019/02/11 10:48:37 Command did not exit successfully aws lambda update-function-code --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight --zip-file fileb://algo.zip exit status 255
[Container] 2019/02/11 10:48:37 Phase complete: BUILD Success: false
[Container] 2019/02/11 10:48:37 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: aws lambda update-function-code --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight --zip-file fileb://algo.zip. Reason: exit status 255
Everything works correctly when I reduce the amount of NLTK data to download (I tried with only the stopwords and wordnet packages).
Does anyone have an idea how to solve this "size limit" problem?
You cannot increase the deployment package size for Lambda. The AWS Lambda limits are described in the AWS Lambda developer guide. More information on how those limits work can be seen here. In essence, your unzipped package size has to be less than 250 MB (262144000 bytes).
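If your unzipped package really is under that limit, the ~70 MB request-size error you hit on UpdateFunctionCode can be sidestepped by pointing the call at the zip you already copy to S3, instead of sending the bytes inline with fileb://. A minimal sketch, reusing the bucket and function names from your buildspec (adjust as needed):

# Upload the package to S3 (your build phase already does this),
# then reference the object instead of sending the bytes inline.
aws s3 cp algo.zip s3://hilightalgo/algo.zip
aws lambda update-function-code \
    --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight \
    --s3-bucket hilightalgo \
    --s3-key algo.zip
# Note: this only avoids the request-payload limit on the API call;
# the 250 MB unzipped limit still applies once Lambda extracts the package.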
PS: Using layers doesn't solve the sizing problem, though it helps with management and possibly faster cold starts. The package size limit includes any layers - see Lambda layers:
A function can use up to 5 layers at a time. The total unzipped size of the function and all layers can't exceed the unzipped deployment package size limit of 250 MB.
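For completeness, here is a sketch of what shipping the NLTK data as a layer would look like (the layer name and zip path are assumptions, not from your setup). The layer's contents still count toward the same 250 MB unzipped limit, and layer files are extracted under /opt, so NLTK_DATA would have to point at /opt/nltk_data instead of /var/task/nltk_data:

# Sketch: package nltk_data as a layer (layer name and zip file are illustrative).
zip -r9 nltk_data_layer.zip nltk_data
aws lambda publish-layer-version \
    --layer-name nltk-data \
    --zip-file fileb://nltk_data_layer.zip \
    --compatible-runtimes python3.6
# For a large layer zip, upload it to S3 and use --content S3Bucket=...,S3Key=... instead.
# Attach the LayerVersionArn returned by the previous command (version shown here is illustrative).
aws lambda update-function-configuration \
    --function-name arn:aws:lambda:eu-west-3:671560023774:function:LaunchHilight \
    --layers arn:aws:lambda:eu-west-3:671560023774:layer:nltk-data:1 \
    --environment 'Variables={NLTK_DATA=/opt/nltk_data}'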