Deploying a Model to a Serverless Cloud Function

LSTM to Google Cloud

Today we are going to look at how to deploy a machine learning model into a serverless cloud function so that any app can make an API call to it and make predictions from the model.

For this exercise, we are going to use Google Cloud. I'll assume that you already have a trained model ready to serve to apps, and that you have a Google Cloud account.

If you need to know how to create a Google Cloud account, please check this link.

Getting Started with Google Cloud

For this demo, I'm going to use my LSTM model which I trained to predict stock prices.

You can see its detailed description here.

Predicting Stock Prices Through Deep Learning

The server code and the model training notebook are available in this GitHub repo.

Alright, let's get started.

Preparing the model for deployment

Once we have a trained model, we need to save it to a local folder which we can use while building our cloud function.

In my case, I trained the model using TensorFlow Keras, and TensorFlow provides a built-in way to save and load models.

# Save the model in TensorFlow's SavedModel format
model.save('./model')

After the model is saved, we should see a local folder containing the model files, including saved_model.pb and the model's weights. This directory is what we point to when we want to load the model again.
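As a quick sanity check, we can reload the saved model locally before wiring it into the function (a minimal sketch, assuming the model was saved to './model'):

import tensorflow as tf

# Reload the saved model and print its architecture to confirm the save worked
reloaded = tf.keras.models.load_model('./model')
reloaded.summary()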

Installing Google Cloud CLI

We could log in to the Google Cloud console and build our function manually by uploading our model, but I prefer to use the Google Cloud CLI from my local machine, as it allows me to test my functions quickly without thinking about deployment. When I'm ready, I can deploy with a single CLI command.

The Google Cloud CLI installs a set of commands which we can execute from our terminal to interact with Google Cloud services such as Cloud Storage and Cloud Functions.

To install the Google Cloud CLI, follow the official instructions for your operating system, linked here.

Install the gcloud CLI

Once installed, we can verify the installation with the following command.

gcloud --version

We should see an output like this:

WARNING:  Python 3.5-3.7 will be deprecated on August 8th, 2023. Please use Python version 3.8 and up.

If you have a compatible Python interpreter installed, you can use it by setting
the CLOUDSDK_PYTHON environment variable to point to it.

Google Cloud SDK 441.0.0
alpha 2023.07.28
beta 2023.07.28
bq 2.0.95
core 2023.07.28
gcloud-crc32c 1.0.0
gsutil 5.25
Updates are available for some Google Cloud CLI components.  To install them,
please run:
  $ gcloud components update

Next, we need to connect gcloud on our local machine to our Google Cloud account. For this, we use the init command.

gcloud init

This opens a browser window where you can sign in and authorize your Google account.

Next, we need to select a project where we are going to deploy our function.

To create a project in the Google Cloud console, click the drop-down next to the Google Cloud logo and click NEW PROJECT. The view may be different if you are creating a project for the first time.

Once a project is created, you should see an ID next to the project name. IDs are alphanumeric strings generated by Google, and they are unique per project.

To connect this project to our gcloud CLI, we use the below command.

gcloud config set project PROJECT_ID

Replace PROJECT_ID with your project's ID from the console.
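To confirm that the right project is active, you can run:

gcloud config get-value project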

Perfect. Now that gcloud is set up on our machine, let's build our function.

Building the cloud function

So far we have a trained model which is saved to our local folder, and we also have gcloud CLI installed. We connected our Google account and project with the CLI. Let's now begin to write our function.

We'll create a Python script in the same folder that contains the model directory. For the Python runtime, Cloud Functions expects this entry-point file to be named main.py.

The function script itself is not very complicated. Let's see the building blocks.

Like any function, there is going to be an input that comes in as parameters; the function performs some operations and returns an output.

A cloud function is no different; we just need to prepare it to listen for inputs from the internet.
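Concretely, an HTTP cloud function is just a Python function that receives a Flask request object and returns a response. A minimal sketch of that shape (the 'ticker' parameter is a hypothetical example input, not part of the final function):

from flask import jsonify

def predict(request):
    # 'request' is a Flask Request object passed in by the Cloud Functions runtime
    ticker = request.args.get('ticker', 'AAPL')  # hypothetical query parameter
    return jsonify({'ticker': ticker})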

First, we import the required libraries.

from datetime import datetime, timedelta
from sklearn.preprocessing import MinMaxScaler
import yfinance as yf
import numpy as np
from flask import request, jsonify
import os
import tensorflow as tf

In my case, I need libraries such as sklearn, yfinance, and datetime to fetch and preprocess the data.

We define a function called 'predict' which:

  • takes an input

  • pre-processes the data

  • loads the machine-learning model

  • predicts the output using the model

  • post-processes the predictions

  • returns the output

Since the pre-processing and post-processing are very specific to the dataset and the function, I'm not going to discuss them here, but feel free to explore the code.
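To give a feel for the overall flow, here is a rough sketch of the pipeline (the 60-step lookback window and single 'Close' feature are assumptions for illustration; your preprocessing will differ):

import numpy as np
import tensorflow as tf
import yfinance as yf
from flask import jsonify
from sklearn.preprocessing import MinMaxScaler

def predict(request):
    ticker = request.args.get('ticker', 'AAPL')         # hypothetical input parameter
    closes = yf.download(ticker, period='1y')['Close']  # fetch a year of closing prices
    scaler = MinMaxScaler()
    scaled = scaler.fit_transform(np.asarray(closes).reshape(-1, 1))
    window = scaled[-60:].reshape(1, 60, 1)             # assumed 60-step lookback window
    model = tf.keras.models.load_model('./model')       # load the saved LSTM
    predictions = model.predict(window)
    # invert the scaling so the output is back in price units
    return jsonify({'predictions': scaler.inverse_transform(predictions).tolist()})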

The important piece is how to load the model and use it to predict the output.

This is done by the following two lines of code.

# load model
model = tf.keras.models.load_model('./model')

# Perform prediction using the loaded model
predictions = model.predict(stock_data)

We use TensorFlow's 'load_model' function to load the model from our local directory.

Once the model is loaded, we use the 'predict' function to get our results.
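One optional refinement worth knowing: loading the model at module level instead of inside the handler lets warm instances of the function reuse the in-memory model across invocations, rather than reloading it on every request. A minimal sketch (the zero-filled input is just a placeholder; the real function builds it from the request):

import numpy as np
import tensorflow as tf
from flask import jsonify

# Loaded once per instance at cold start; warm invocations reuse it
model = tf.keras.models.load_model('./model')

def predict(request):
    # placeholder input; the real function builds this from the request
    stock_data = np.zeros((1, 60, 1))
    return jsonify({'predictions': model.predict(stock_data).tolist()})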

That's it. From the machine-learning perspective, the function is ready. Now let's prepare it for deployment.

Deploying the function

There are a couple of things we need to take care of before deploying.

First, since the cloud function will accept requests from the open internet, we need to let it handle cross-origin (CORS) requests.

That's why we include this piece of code at the beginning of the function. It handles the standard CORS preflight (OPTIONS) request.

if request.method == 'OPTIONS':
    # Handle preflight request
    response = jsonify({})
    response.headers.add('Access-Control-Allow-Origin', '*')
    response.headers.add(
        'Access-Control-Allow-Headers', 'Content-Type')
    response.headers.add(
        'Access-Control-Allow-Methods', 'GET, OPTIONS')
    return response

# Headers to attach to the actual (non-preflight) response
headers = {'Access-Control-Allow-Origin': '*'}
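When the function later returns its actual result, a typical pattern is to attach these headers to the response, something like:

# Attach the CORS headers to the successful response (illustrative)
return (jsonify({'predictions': predictions.tolist()}), 200, headers)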

Next, we need to tell Google Cloud about the libraries that we are going to use. Remember the import statements we made earlier?

Those packages are installed locally, which is why the function runs on our machine, but the deployed cloud function won't have access to them. So we need to tell Google, at deployment time, to install those packages along with the function code.

For this, we are going to create a new file called 'requirements.txt' which will simply have the list of packages that we need to run our function.

This list can vary depending on your function logic.

So let's create a new file in the same folder as the script.

The content of this file is just a list of packages.

Flask
requests
google-cloud-aiplatform~=0.5.1
numpy
scikit_learn
yfinance
tensorflow

You can choose to pin a version for each package if your function requires it, or leave versions out and let 'pip' resolve them.

I recommend not pinning package versions, so that if there are any cross-dependencies, 'pip' can resolve them automatically. Unless a specific package version is genuinely needed, try to avoid pinning.
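For reference, if you do need to pin, requirements.txt uses pip's standard specifier syntax, for example (versions shown are just illustrative):

tensorflow==2.13.0
scikit_learn>=1.0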

The presence of the 'requirements.txt' file in the same folder as the script tells gcloud to install those packages during deployment.

Ok, now we are ready to deploy.

To deploy the function, we use the following command.

gcloud functions deploy predict --runtime python37 --trigger-http --memory 1GB

Things to look for in the command:

• The 'predict' argument in the command is the name of the entry-point function we defined.

  • The 'runtime' flag tells gcloud which Python version to use.

• The 'trigger-http' flag tells gcloud to give the function an HTTP trigger so it can be invoked over the web (see the note after this list about public access).

• The 'memory' flag tells gcloud how much RAM to allocate for the function. The default is 256MB; I raised it because we use TensorFlow. Change this setting only if the default is not enough and your deployment or invocations are failing.
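One note: during deployment, gcloud may ask whether to allow unauthenticated invocations. To make the function publicly callable without the prompt, you can pass the --allow-unauthenticated flag explicitly:

gcloud functions deploy predict --runtime python37 --trigger-http --allow-unauthenticated --memory 1GB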

The deployment takes a couple of minutes depending on the number of packages to be installed.

The output of this command looks like this after a successful deployment:

WARNING:  Python 3.5-3.7 will be deprecated on August 8th, 2023. Please use Python version 3.8 and up.

If you have a compatible Python interpreter installed, you can use it by setting
the CLOUDSDK_PYTHON environment variable to point to it.

Deploying function (may take a while - up to 2 minutes)...⠹                                                                                            
For Cloud Build Logs, visit: https://console.cloud.google.com/cloud-build/builds;region=us-central1/19869c69-b9d0-49f1-8ce6-d73f61f429c2?project=497584664360
Deploying function (may take a while - up to 2 minutes)...done.                                                                                        
availableMemoryMb: 1024
buildId: 19869c69-b9d0-49f1-8ce6-d73f61f429c2
buildName: projects/497584664360/locations/us-central1/builds/19869c69-b9d0-49f1-8ce6-d73f61f429c2
dockerRegistry: CONTAINER_REGISTRY
entryPoint: predict
environmentVariables:
  ENDPOINT_ID: '6257043596743016448'
  PROJECT_ID: '497584664360'
httpsTrigger:
  securityLevel: SECURE_OPTIONAL
  url: https://us-central1-carbide-nation-393701.cloudfunctions.net/predict
ingressSettings: ALLOW_ALL
labels:
  deployment-tool: cli-gcloud
name: projects/carbide-nation-393701/locations/us-central1/functions/predict
runtime: python37
serviceAccountEmail: carbide-nation-393701@appspot.gserviceaccount.com
sourceUploadUrl: https://storage.googleapis.com/uploads-347883678710.us-central1.cloudfunctions.appspot.com/95315054-cf14-4397-980d-7fe5deba9411.zip
status: ACTIVE
timeout: 60s
updateTime: '2023-08-15T02:43:59.389Z'
versionId: '40'


Updates are available for some Google Cloud CLI components.  To install them,
please run:
  $ gcloud components update

Notice the 'url' field under 'httpsTrigger'. That's our public URL that we can send POST requests to and receive predictions.
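For example, we can call it from the terminal (the JSON body is a hypothetical example; send whatever input your function expects):

curl -X POST https://us-central1-carbide-nation-393701.cloudfunctions.net/predict \
  -H "Content-Type: application/json" \
  -d '{"ticker": "AAPL"}'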

On the cloud console, we can check our function's details and code. Search for Cloud Functions in the search bar at the top and select it.

Now we should see a new line item for the 'predict' function.

Click on it to check the details.

Click on the 'SOURCE' tab to see our code.

We can even test our function using the 'TESTING' tab.

As we can see, the function now returns prediction outputs. Our deployment was successful.

We can now use this function URL as an API in any of our apps to serve the machine learning model.

Conclusion

If you have followed along till now, I hope you had a successful deployment as well. If you get stuck anywhere, please post it in the comments and we shall try to figure it out together.

I hope you found this article helpful. If you did, do leave a like to support me.

Thank you for reading, see you at the next one.

Cheers,

Uday

Cover image credit: Alex Knight on Unsplash.
