BentoML Example

Titanic Survival Prediction with XGBoost

This is a BentoML demo project demonstrating how to package and serve an XGBoost model for production using BentoML.

BentoML is an open-source platform for serving and deploying machine learning models.

Let's get started!

In [1]:
%reload_ext autoreload
%autoreload 2
%matplotlib inline

import warnings
warnings.filterwarnings("ignore")
In [ ]:
!pip install bentoml
!pip install xgboost numpy pandas
In [2]:
import pandas as pd
import numpy as np
import xgboost as xgb
import bentoml

Prepare Dataset

Download the dataset from https://www.kaggle.com/c/titanic/data

In [3]:
!mkdir data
!curl https://raw.githubusercontent.com/agconti/kaggle-titanic/master/data/train.csv -o ./data/train.csv
!curl https://raw.githubusercontent.com/agconti/kaggle-titanic/master/data/test.csv -o ./data/test.csv
mkdir: data: File exists
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 60302  100 60302    0     0   217k      0 --:--:-- --:--:-- --:--:--  216k
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 28210  100 28210    0     0   133k      0 --:--:-- --:--:-- --:--:--  133k
In [4]:
train = pd.read_csv("./data/train.csv")
test  = pd.read_csv("./data/test.csv")
X_y_train = xgb.DMatrix(data=train[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch']], label=train['Survived'])
X_test    = xgb.DMatrix(data=test[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch']])
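Both DMatrix calls above assume the five feature columns are present in the CSV files. A small defensive check can fail fast with a clear message when a column is missing; the `require_columns` helper below is illustrative, not part of the original example:

```python
import pandas as pd

FEATURES = ['Pclass', 'Age', 'Fare', 'SibSp', 'Parch']

def require_columns(df: pd.DataFrame, columns=FEATURES) -> pd.DataFrame:
    """Return df restricted to `columns`, raising early if any are missing."""
    missing = [c for c in columns if c not in df.columns]
    if missing:
        raise KeyError(f"missing feature columns: {missing}")
    return df[columns]

# Example: a frame with all five columns passes through unchanged.
sample = pd.DataFrame([{'Pclass': 3, 'Age': 22.0, 'Fare': 7.25, 'SibSp': 1, 'Parch': 0}])
print(list(require_columns(sample).columns))
```

The restricted frame can then be handed directly to `xgb.DMatrix`.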
In [5]:
train[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch', 'Survived']].head()
Out[5]:
   Pclass   Age     Fare  SibSp  Parch  Survived
0       3  22.0   7.2500      1      0         0
1       1  38.0  71.2833      1      0         1
2       3  26.0   7.9250      0      0         1
3       1  35.0  53.1000      1      0         1
4       3  35.0   8.0500      0      0         0

Model Training

In [6]:
params = {
    'base_score': np.mean(train['Survived']),
    'eta': 0.1,
    'max_depth': 3,
    'gamma': 3,
    # 'reg:linear' is deprecated in newer XGBoost releases in favor of 'reg:squarederror'
    'objective': 'reg:linear',
    'eval_metric': 'mae',
}
model = xgb.train(params=params,
                  dtrain=X_y_train,
                  num_boost_round=3)
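The `base_score` parameter above starts every prediction at the training-set survival rate before any boosting rounds are applied. A quick sketch of what that value looks like on toy labels (the numbers here are made up for illustration):

```python
import numpy as np

# Hypothetical labels: 3 survivors out of 8 passengers
survived = np.array([0, 1, 0, 0, 1, 0, 1, 0])
base_score = float(np.mean(survived))
print(base_score)  # 0.375: every passenger starts at the overall survival rate
```

Each boosting round then nudges individual predictions away from this shared starting point.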
In [7]:
y_test =  model.predict(X_test)
test['pred'] = y_test
test[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch','pred']].iloc[10:].head(2)
Out[7]:
    Pclass   Age     Fare  SibSp  Parch      pred
10       3   NaN   7.8958      0      0  0.341580
11       1  46.0  26.0000      0      0  0.413966
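Because the model is trained with a regression objective, `pred` is a continuous score rather than a 0/1 label. One common post-processing step (an assumption here, not part of the original notebook) is to threshold at 0.5:

```python
import numpy as np

def to_labels(scores, threshold=0.5):
    """Map continuous survival scores to 0/1 predictions."""
    return (np.asarray(scores) >= threshold).astype(int)

# Using the two scores shown above: both fall below 0.5,
# so both passengers are predicted not to survive.
print(to_labels([0.341580, 0.413966]))
```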

Create BentoService for model serving

In [8]:
%%writefile xgboost_titanic_bento_service.py

import xgboost as xgb

import bentoml
from bentoml.artifact import XgboostModelArtifact
from bentoml.handlers import DataframeHandler

@bentoml.artifacts([XgboostModelArtifact('model')])
@bentoml.env(pip_dependencies=['xgboost'])
class TitanicSurvivalPredictionService(bentoml.BentoService):
    
    @bentoml.api(DataframeHandler)
    def predict(self, df):
        data = xgb.DMatrix(data=df[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch']])
        return self.artifacts.model.predict(data)
Overwriting xgboost_titanic_bento_service.py
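The `DataframeHandler` turns an incoming JSON payload into the `df` argument of `predict`. Roughly (this sketch mimics, rather than reuses, BentoML's own parsing), a list of JSON records becomes a DataFrame like so:

```python
import json
import pandas as pd

payload = '[{"Pclass": 1, "Age": 30, "Fare": 200, "SibSp": 1, "Parch": 0}]'
df = pd.DataFrame(json.loads(payload))

# The service's predict() then selects the feature columns from this frame.
print(df[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch']].shape)  # (1, 5)
```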

Save BentoML service archive

In [9]:
# 1) import the custom BentoService defined above
from xgboost_titanic_bento_service import TitanicSurvivalPredictionService

# 2) `pack` it with required artifacts
bento_service = TitanicSurvivalPredictionService()
bento_service.pack('model', model)

# 3) save your BentoService
saved_path = bento_service.save()
[2019-12-11 17:47:09,677] WARNING - BentoML local changes detected - Local BentoML repository including all code changes will be bundled together with the BentoService bundle. When used with docker, the base docker image will be default to same version as last PyPI release at version: 0.5.3. You can also force bentoml to use a specific version for deploying your BentoService bundle, by setting the config 'core/bentoml_deploy_version' to a pinned version or your custom BentoML on github, e.g.:'bentoml_deploy_version = git+https://github.com/{username}/[email protected]{branch}'
running sdist
running egg_info
writing BentoML.egg-info/PKG-INFO
writing dependency_links to BentoML.egg-info/dependency_links.txt
writing entry points to BentoML.egg-info/entry_points.txt
writing requirements to BentoML.egg-info/requires.txt
writing top-level names to BentoML.egg-info/top_level.txt
reading manifest file 'BentoML.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
no previously-included directories found matching 'examples'
no previously-included directories found matching 'tests'
no previously-included directories found matching 'docs'
warning: no previously-included files matching '*~' found anywhere in distribution
warning: no previously-included files matching '*.pyo' found anywhere in distribution
warning: no previously-included files matching '.git' found anywhere in distribution
warning: no previously-included files matching '.ipynb_checkpoints' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
writing manifest file 'BentoML.egg-info/SOURCES.txt'
running check
warning: check: missing meta-data: if 'author' supplied, 'author_email' must be supplied too

creating BentoML-0.5.3+22.g10ffa8c
... (file-by-file copy log truncated) ...
Writing BentoML-0.5.3+22.g10ffa8c/setup.cfg
UPDATING BentoML-0.5.3+22.g10ffa8c/bentoml/_version.py
set BentoML-0.5.3+22.g10ffa8c/bentoml/_version.py to '0.5.3+22.g10ffa8c'
Creating tar archive
removing 'BentoML-0.5.3+22.g10ffa8c' (and everything under it)
[2019-12-11 17:47:36,534] INFO - BentoService bundle 'TitanicSurvivalPredictionService:20191211174709_CCB40F' created at: /private/var/folders/kn/xnc9k74x03567n1mx2tfqnpr0000gn/T/bentoml-temp-s963i88c
[2019-12-11 17:47:36,550] WARNING - Saved BentoService bundle version mismatch: loading BentoServie bundle create with BentoML version 0.5.3,  but loading from BentoML version 0.5.3+22.g10ffa8c
[2019-12-11 17:47:36,771] INFO - BentoService bundle 'TitanicSurvivalPredictionService:20191211174709_CCB40F' created at: /Users/bozhaoyu/bentoml/repository/TitanicSurvivalPredictionService/20191211174709_CCB40F

Load saved BentoService for serving

In [10]:
import bentoml

bento_model = bentoml.load(saved_path)

result = bento_model.predict(test)
test['pred'] = result
test[['Pclass', 'Age', 'Fare', 'SibSp', 'Parch','pred']].iloc[10:].head(2)
[2019-12-11 17:48:15,629] WARNING - Saved BentoService bundle version mismatch: loading BentoServie bundle create with BentoML version 0.5.3,  but loading from BentoML version 0.5.3+22.g10ffa8c
[2019-12-11 17:48:15,632] WARNING - Module `xgboost_titanic_bento_service` already loaded, using existing imported module.
Out[10]:
Pclass Age Fare SibSp Parch pred
10 3 NaN 7.8958 0 0 0.341580
11 1 46.0 26.0000 0 0 0.413966

Model Serving via REST API

In your terminal, run the following command to start the REST API server:

In [14]:
!bentoml serve {saved_path}
[14:20:10] WARNING: src/objective/regression_obj.cu:152: reg:linear is now deprecated in favor of reg:squarederror.
 * Serving Flask app "TitanicSurvivalPredictionService" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
127.0.0.1 - - [19/Sep/2019 14:20:14] "POST /predict HTTP/1.1" 200 -
127.0.0.1 - - [19/Sep/2019 14:20:16] "POST /predict HTTP/1.1" 200 -
^C

Copy the following command to make a curl request to the REST API server:

curl -i \
--header "Content-Type: application/json" \
--request POST \
--data '[{"Pclass": 1, "Age": 30, "Fare": 200, "SibSp": 1, "Parch": 0}]' \
localhost:5000/predict
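The same request can be issued from Python. Here is a sketch using only the standard library; the server must be running for `urlopen` to succeed, so the actual call is left commented out:

```python
import json
import urllib.request

records = [{"Pclass": 1, "Age": 30, "Fare": 200, "SibSp": 1, "Parch": 0}]
req = urllib.request.Request(
    "http://localhost:5000/predict",
    data=json.dumps(records).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the server running, send the request like this:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())

print(req.get_header("Content-type"))  # urllib normalizes header names
```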

Deploy it as API endpoint on AWS Lambda

In order to run this as an AWS Lambda function, make sure to configure your AWS credentials via either the aws configure command or by setting the environment variables below:

%env AWS_ACCESS_KEY_ID=
%env AWS_SECRET_ACCESS_KEY=
%env AWS_DEFAULT_REGION=
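A quick way to verify the credentials are actually set before deploying; the `missing_aws_vars` helper name is made up for this sketch:

```python
import os

AWS_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_DEFAULT_REGION")

def missing_aws_vars(env=os.environ):
    """Return the names of required AWS variables that are unset or empty."""
    return [name for name in AWS_VARS if not env.get(name)]

# With only the access key set, the other two are reported missing.
print(missing_aws_vars({"AWS_ACCESS_KEY_ID": "x"}))
```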

Make sure you have AWS SAM CLI installed:

In [ ]:
!pip install -U aws-sam-cli==0.33.1
In [16]:
!bentoml deployment create titanic-prediction \
    --bento=TitanicSurvivalPredictionService:{bento_service.version} \
    --platform=aws-lambda
[2019-12-12 15:38:26,831] INFO - Building lambda project
[2019-12-12 15:41:29,703] INFO - Packaging AWS Lambda project at /private/var/folders/kn/xnc9k74x03567n1mx2tfqnpr0000gn/T/bentoml-temp-_2fjzrcs ...
[2019-12-12 15:41:54,482] INFO - Deploying lambda project
[2019-12-12 15:43:44,620] INFO - ApplyDeployment (titanic-prediction, namespace bobo) succeeded
Successfully created deployment titanic-prediction
{
  "namespace": "bobo",
  "name": "titanic-prediction",
  "spec": {
    "bentoName": "TitanicSurvivalPredictionService",
    "bentoVersion": "20191211174709_CCB40F",
    "operator": "AWS_LAMBDA",
    "awsLambdaOperatorConfig": {
      "region": "us-west-2",
      "memorySize": 1024,
      "timeout": 3
    }
  },
  "state": {
    "state": "RUNNING",
    "infoJson": {
      "endpoints": [
        "https://u7nerrir6a.execute-api.us-west-2.amazonaws.com/Prod/predict"
      ],
      "s3_bucket": "btml-bobo-titanic-prediction-c3da9d"
    },
    "timestamp": "2019-12-12T23:43:44.799944Z"
  },
  "createdAt": "2019-12-12T23:38:22.515572Z",
  "lastUpdatedAt": "2019-12-12T23:38:22.515613Z"
}
In [13]:
!bentoml deployment describe titanic-prediction
{
  "namespace": "bobo",
  "name": "titanic-prediction",
  "spec": {
    "bentoName": "TitanicSurvivalPredictionService",
    "bentoVersion": "20191211174709_CCB40F",
    "operator": "AWS_LAMBDA",
    "awsLambdaOperatorConfig": {
      "region": "us-west-2",
      "memorySize": 1024,
      "timeout": 3
    }
  },
  "state": {
    "state": "RUNNING",
    "infoJson": {
      "endpoints": [
        "https://eg3tvb519i.execute-api.us-west-2.amazonaws.com/Prod/predict"
      ],
      "s3_bucket": "btml-bobo-titanic-prediction-89a148"
    },
    "timestamp": "2019-12-12T01:54:02.842272Z"
  },
  "createdAt": "2019-12-12T01:49:36.076229Z",
  "lastUpdatedAt": "2019-12-12T01:49:36.076270Z"
}

To send a request to your AWS Lambda deployment, grab the endpoint URL from the JSON output above:

In [18]:
!curl -i \
--header "Content-Type: application/json" \
--request POST \
--data '[{"Pclass": 1, "Age": 30, "Fare": 200, "SibSp": 1, "Parch": 0}]' \
https://u7nerrir6a.execute-api.us-west-2.amazonaws.com/Prod/predict
[0.469721257686615]
In [15]:
!bentoml deployment delete titanic-prediction
Successfully deleted deployment "titanic-prediction"