After a lot of googling and experimenting, Lambda layers turned out to be a great fit and work well for me.

This GitHub repo from keithrozario has loads of pre-built layers that you can add to your Lambda function simply by referencing their ARN, including some great stuff like pandas, requests and SQLAlchemy.
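
For example, attaching one of those pre-built layers to an existing function is a single AWS CLI call. This is only a sketch: the function name and layer ARN below are placeholders, so substitute the ARN listed in the repo for your region and Python runtime.

```bash
# Attach a pre-built public layer to an existing function by ARN.
# "my-function" and the layer ARN are placeholders.
aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:eu-west-1:123456789012:layer:example-pandas-layer:1
```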

I've created a template to compile and upload a layer (containing Python dependencies) to Lambda using the AWS CLI, which you can find in my GitLab repo here.

I'm running this on an Amazon Linux EC2 instance, using a virtual environment (venv) to install libraries from a requirements.txt file, and then uploading the zipped files to Lambda using the AWS CLI.

Note the folder structure my_zip_file/python/binaries, which Lambda requires for Python layers (see the sketch below).
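
Here is a rough sketch of that build-and-upload flow, assuming placeholder names for the folder, zip file, layer and requirements file; adjust the runtime to match your function.

```bash
# Run on Amazon Linux so any compiled wheels match the Lambda runtime.
python3 -m venv venv
source venv/bin/activate

# Install the dependencies into the python/ folder that must sit at the zip root.
mkdir -p my_zip_file/python
pip install -r requirements.txt -t my_zip_file/python

# Zip the layer so that python/ is the top-level directory in the archive.
cd my_zip_file
zip -r ../my_layer.zip python
cd ..

# Publish the layer with the AWS CLI.
aws lambda publish-layer-version \
  --layer-name my-dependencies \
  --zip-file fileb://my_layer.zip \
  --compatible-runtimes python3.9
```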

Note: pandas is quite a large library. Your zipped layer file must be below 70 MB.

You may also encounter the horrible "OpenBLAS WARNING - could not determine the L2 cache size on this system" error message. I had to increase the memory from the default 128 MB for the Lambda to run successfully.
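
For reference, the memory can also be raised from the AWS CLI; the function name and the 512 MB value below are placeholders, so pick whatever your workload needs.

```bash
# Raise the function's memory above the 128 MB default.
aws lambda update-function-configuration \
  --function-name my-function \
  --memory-size 512
```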

I hope this helps!