BentoML Demo: Iris Classifier with SQL Server Machine Learning Services

Microsoft's Machine Learning Services is a SQL Server feature that lets you run Python and R scripts against relational data. We can use BentoML and other open-source packages, alongside Microsoft's Python packages, for predictive analytics and machine learning. The scripts are executed in-database, so the data never leaves SQL Server or travels over the network.


Let's start by creating the Iris Classifier and packaging that as a bento bundle.

In [ ]:
%reload_ext autoreload
%autoreload 2
%matplotlib inline
In [ ]:
!pip install -q bentoml "scikit-learn>=0.23.0"

Creating a BentoService for model serving

BentoService is the base class for building a prediction service. The service class must not be defined in the __main__ module, so we write it out to a separate file with the %%writefile magic in the next cell.

In [ ]:
%%writefile iris_classifier.py
from bentoml import env, artifacts, api, BentoService
from bentoml.adapters import DataframeInput
from bentoml.artifact import SklearnModelArtifact

@env(infer_pip_packages=True)
@artifacts([SklearnModelArtifact('model')])
class IrisClassifier(BentoService):

    @api(input=DataframeInput())
    def predict(self, df):
        # Optional pre-processing / post-processing code goes here
        return self.artifacts.model.predict(df)
In [ ]:
from sklearn import svm
from sklearn import datasets

from iris_classifier import IrisClassifier

if __name__ == "__main__":
  # Load training data
  iris = datasets.load_iris()
  X, y = iris.data, iris.target

  # Model Training
  clf = svm.SVC(gamma='scale')
  clf.fit(X, y)

  # Create an IrisClassifier service instance
  iris_classifier_service = IrisClassifier()

  # Pack the newly trained model artifact
  iris_classifier_service.pack('model', clf)

  # Save the prediction service to disk for model serving
  saved_path = iris_classifier_service.save()
In [ ]:
# Get the path of the saved bundle

!bentoml get --print-location IrisClassifier:latest 
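Optionally, you can sanity-check the saved bundle locally before moving on. This is a minimal sketch; the path below is a placeholder for the location printed by the previous command, and the sample values are arbitrary.

import bentoml

# Placeholder: paste the path printed by `bentoml get --print-location IrisClassifier:latest`
saved_path = r"/path/to/saved/IrisClassifier/bundle"

# Load the BentoService from the saved bundle and run a prediction on one iris sample
service = bentoml.load(saved_path)
print(service.predict([[5.1, 3.5, 1.4, 0.2]]))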

Now you can serve the BentoML-packed model via a REST API by running the following command.

!bentoml serve IrisClassifier:latest
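If you do start the API server, you can test it with a quick request. This is a hedged sketch that assumes the development server's default address (http://127.0.0.1:5000) and a route named after the predict API function.

import requests

# One iris sample as a JSON list of rows, matching the DataframeInput adapter
response = requests.post(
    "http://127.0.0.1:5000/predict",  # assumed default host/port of `bentoml serve`
    json=[[5.1, 3.5, 1.4, 0.2]],
)
print(response.text)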

However, the focus of this tutorial is to serve the saved bundle using SQL Server.

We will first install the required software and packages.

To run the saved bundle in SQL Server, follow these steps:

  • Install SQL Server Machine Learning Services from here

  • Follow the installation steps from here. Don't forget to select Python in the feature selection step

  • Once the setup is ready, create a new server, and once the server is connected, download SSMS (SQL Server Management Studio). It is an integrated environment for managing SQL queries; we will use it to run our saved bento bundle and make predictions. Alternatively, we can also use Azure Data Studio

  • Install BentoML in the SQL Server Python environment. Open Command Prompt or a terminal, change directory to SQL Server/PYTHON_SERVICES/SCRIPTS, and run pip.exe install bentoml

If you get SSL-related errors:

  • cd ..

  • condabin\activate.bat

  • pip.exe install bentoml

Additionally, upgrade scikit-learn to prevent errors during deployment:

  • pip install --upgrade scikit-learn

Finally, copy the bento bundle from the location reported by !bentoml get --print-location IrisClassifier:latest to the directory where SQL Server is installed, to avoid permission-denied errors.
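For example, here is a minimal sketch of the copy step using Python's shutil. Both paths are placeholders: replace them with the bundle location from the command above and your actual SQL Server directory, and run with sufficient privileges.

import shutil

# Placeholder source: output of `bentoml get --print-location IrisClassifier:latest`
src = r"C:\path\to\IrisClassifier\bundle"
# Placeholder destination inside the SQL Server installation directory
dst = r"C:\Program Files\Microsoft SQL Server\MSSQL15.NEWSERVER\bento_bundle"

shutil.copytree(src, dst)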

Now let's switch to SSMS and write the queries for deploying the saved bento bundle.

Connect to the Database Engine with the server name you used when creating the new server.

Start a new query and run the following commands. This enables running external scripts.

sp_configure

EXEC sp_configure 'external scripts enabled', 1
RECONFIGURE WITH OVERRIDE

Create a new database to store the dataset for prediction.

CREATE DATABASE irissql
GO

Now create a new table in the database we just created, with columns matching the iris dataset. We will later load the data into it.

USE irissql
GO

DROP TABLE IF EXISTS iris_data;
GO

CREATE TABLE iris_data (
    id INT NOT NULL IDENTITY PRIMARY KEY
    , "Sepal.Length" FLOAT NOT NULL, "Sepal.Width" FLOAT NOT NULL
    , "Petal.Length" FLOAT NOT NULL, "Petal.Width" FLOAT NOT NULL
    , "Species" VARCHAR(100) NOT NULL, "SpeciesId" INT NOT NULL
);

Next, we will create a stored procedure, which works much like a function in Python: it lists a set of statements that are executed whenever the procedure is called. We will use the procedure get_iris_dataset to insert values into the table. Find more in the documentation here.

CREATE PROCEDURE get_iris_dataset
AS
BEGIN
EXEC sp_execute_external_script @language = N'Python',
@script = N'
from sklearn import datasets
iris = datasets.load_iris()
iris_data = pandas.DataFrame(iris.data)
iris_data["Species"] = pandas.Categorical.from_codes(iris.target, iris.target_names)
iris_data["SpeciesId"] = iris.target
',
@input_data_1 = N'',
@output_data_1_name = N'iris_data'
WITH RESULT SETS (("Sepal.Length" float not null, "Sepal.Width" float not null, "Petal.Length" float not null, "Petal.Width" float not null,
"Species" varchar(100) not null, "SpeciesId" int not null));
END;
GO

Finally, fill the iris_data table by inserting the output of the get_iris_dataset procedure.

INSERT INTO iris_data ("Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width", "Species", "SpeciesId")
EXEC dbo.get_iris_dataset;

The last step is to create a procedure for model deployment and prediction. We will create a procedure predict_species whose external script loads and runs the saved bento bundle.

In the script, import bentoml as usual and set saved_path to the location where the bento bundle was copied.

Load the bundle with bentoml.load(); the returned service can then make predictions on the input data. List the input columns passed to the script and the output columns returned in the result set.

Here is the complete script:

CREATE PROCEDURE predict_species (@model VARCHAR(100))
AS
BEGIN
DECLARE @svm_model VARBINARY(max)
EXECUTE sp_execute_external_script @language = N'Python'
    , @script = N'
import bentoml
saved_path = r"C:\Program Files\Microsoft SQL Server\MSSQL15.NEWSERVER\bento_bundle"
irismodel = bentoml.load(saved_path)
species_pred = irismodel.predict(iris_data[["Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width"]])
iris_data["PredictedSpecies"] = species_pred
OutputDataSet = iris_data[["id", "SpeciesId", "PredictedSpecies"]]
print(OutputDataSet)
'
    , @input_data_1 = N'select id, "Sepal.Length", "Sepal.Width", "Petal.Length", "Petal.Width", "SpeciesId" from iris_data'
    , @input_data_1_name = N'iris_data'
    , @params = N'@svm_model varbinary(max)'
    , @svm_model = @svm_model
WITH RESULT SETS((
    "id" INT
    , "SpeciesId" INT
    , "SpeciesId.Predicted" INT
));
END;
GO

The procedure is now ready. We can run it with EXECUTE predict_species 'SVM';, where SVM is the name passed for the model.

EXECUTE predict_species 'SVM';
GO

After running the final query, you can see the predictions in the form of a table. Our model is served from SQL Server easily with the help of BentoML.
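If you prefer to call the procedure from Python instead of SSMS, here is a hedged sketch using pyodbc; the driver name, server, and authentication settings are assumptions to adapt to your setup.

import pandas as pd
import pyodbc

# Placeholder connection details - adjust driver, server, and authentication to your setup
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=irissql;Trusted_Connection=yes;"
)

# Run the stored procedure and load the returned predictions into a DataFrame
predictions = pd.read_sql("EXECUTE predict_species 'SVM';", conn)
print(predictions.head())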