I am trying to deploy an NN model that I trained locally on my machine using Keras. Locally, I use the model like this:
    from keras.models import load_model

    model = load_model("/path/to/model/model.h5")
    prediction = model.predict(x)

Now I need to use the same model in my Lambda function. I uploaded the model to an S3 bucket and then tried to access the file like this:
    model = load_model("https://s3-eu-west-1.amazonaws.com/my-bucket/models/model.h5")

But it tells me that the file does not exist; I guess it is a permissions problem. I also tried the following (similar to how I read JSON files from S3):
    client_s3 = boto3.client("s3")
    result = client_s3.get_object(Bucket="my-bucket", Key="models/model.h5")
    model = load_model(result["Body"].read())

But I obtain this error:
"stackTrace": [ [ "/var/task/lambda_function.py", 322, "lambda_handler", "model = load_model(result[\"Body\"].read())" ], [ "/var/task/keras/models.py", 227, "load_model", "with h5py.File(filepath, mode='r') as f:" ], [ "/var/task/h5py/_hl/files.py", 269, "__init__", "fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)" ], [ "/var/task/h5py/_hl/files.py", 99, "make_fid", "fid = h5f.open(name, flags, fapl=fapl)" ], [ "h5py/_objects.pyx", 54, "h5py._objects.with_phil.wrapper", null ], [ "h5py/_objects.pyx", 55, "h5py._objects.with_phil.wrapper", null ], [ "h5py/h5f.pyx", 78, "h5py.h5f.open", null ], [ "h5py/defs.pyx", 621, "h5py.defs.H5Fopen", null ], [ "h5py/_errors.pyx", 123, "h5py._errors.set_exception", null ] ], "errorType": "UnicodeDecodeError", "errorMessage": "'utf8' codec can't decode byte 0x89 in position 29: invalid start byte" } I suspect the result["Body"].read() function cannot be used with h5py object. What is the best way to load a h5py model from s3?
SOLUTION: Download the file into the /tmp/ folder (the only writable path on Lambda) and load it from there:
    client_s3.download_file("my-bucket", "models/model.h5", "/tmp/model.h5")
    model = load_model("/tmp/model.h5")
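On Lambda it is worth doing the download and load at module level, outside the handler, so warm invocations of the same container reuse the already-loaded model instead of fetching it from S3 on every call. A minimal handler sketch; the bucket name, key, and the event's "input" field are placeholders, not part of the original question:

    import boto3
    import numpy as np
    from keras.models import load_model

    client_s3 = boto3.client("s3")

    # Runs once per container (cold start): download the model to /tmp,
    # the only writable location in Lambda, and keep it in memory.
    client_s3.download_file("my-bucket", "models/model.h5", "/tmp/model.h5")
    model = load_model("/tmp/model.h5")

    def lambda_handler(event, context):
        # Placeholder: adapt to however your events carry the input data.
        x = np.array(event["input"])
        prediction = model.predict(x)
        return {"prediction": prediction.tolist()}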