BentoML Example: spaCy named entity recognizer

BentoML makes moving trained ML models to production easy:

  • Package models trained with any ML framework and reproduce them for model serving in production
  • Deploy anywhere for online API serving or offline batch serving
  • High-Performance API model server with adaptive micro-batching support
  • Central hub for managing models and deployment process via Web UI and APIs
  • Modular and flexible design making it adaptable to your infrastrcuture

BentoML is a framework for serving, managing, and deploying machine learning models. It is aiming to bridge the gap between Data Science and DevOps, and enable teams to deliver prediction services in a fast, repeatable, and scalable way.

Before reading this example project, be sure to check out the Getting started guide to learn about the basic concepts in BentoML.

Make sure to use GPU runtime when running this notebook in Google Colab, you can set it in top menu: Runtime > Change Runtime Type > Hardware accelerator.

Impression

In [1]:
%reload_ext autoreload
%autoreload 2
%matplotlib inline
In [1]:
!pip install -q bentoml spacy>=2.3.0
WARNING: You are using pip version 20.2.2; however, version 20.2.3 is available.
You should consider upgrading via the '/usr/local/anaconda3/envs/dev-py3/bin/python -m pip install --upgrade pip' command.
In [2]:
!python3 -m spacy download en_core_web_sm
Requirement already satisfied: en_core_web_sm==2.1.0 from https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.1.0/en_core_web_sm-2.1.0.tar.gz#egg=en_core_web_sm==2.1.0 in /usr/local/anaconda3/envs/dev-py3/lib/python3.7/site-packages (2.1.0)
WARNING: You are using pip version 20.2.2; however, version 20.2.3 is available.
You should consider upgrading via the '/usr/local/anaconda3/envs/dev-py3/bin/python3 -m pip install --upgrade pip' command.
✔ Download and installation successful
You can now load the model via spacy.load('en_core_web_sm')
In [3]:
import en_core_web_sm

nlp = en_core_web_sm.load()

# Getting the pipeline component
ner=nlp.get_pipe("ner")
In [4]:
# training data
TRAIN_DATA = [
              ("Walmart is a leading e-commerce company", {"entities": [(0, 7, "ORG")]}),
              ("I reached Chennai yesterday.", {"entities": [(19, 28, "GPE")]}),
              ("I recently ordered a book from Amazon", {"entities": [(24,32, "ORG")]}),
              ("I was driving a BMW", {"entities": [(16,19, "PRODUCT")]}),
              ("I ordered this from ShopClues", {"entities": [(20,29, "ORG")]}),
              ("Fridge can be ordered in Amazon ", {"entities": [(0,6, "PRODUCT")]}),
              ("I bought a new Washer", {"entities": [(16,22, "PRODUCT")]}),
              ("I bought a old table", {"entities": [(16,21, "PRODUCT")]}),
              ("I bought a fancy dress", {"entities": [(18,23, "PRODUCT")]}),
              ("I rented a camera", {"entities": [(12,18, "PRODUCT")]}),
              ("I rented a tent for our trip", {"entities": [(12,16, "PRODUCT")]}),
              ("I rented a screwdriver from our neighbour", {"entities": [(12,22, "PRODUCT")]}),
              ("I repaired my computer", {"entities": [(15,23, "PRODUCT")]}),
              ("I got my clock fixed", {"entities": [(16,21, "PRODUCT")]}),
              ("I got my truck fixed", {"entities": [(16,21, "PRODUCT")]}),
              ("Flipkart started it's journey from zero", {"entities": [(0,8, "ORG")]}),
              ("I recently ordered from Max", {"entities": [(24,27, "ORG")]}),
              ("Flipkart is recognized as leader in market",{"entities": [(0,8, "ORG")]}),
              ("I recently ordered from Swiggy", {"entities": [(24,29, "ORG")]})
              ]
In [5]:
for _, annotations in TRAIN_DATA:
  for ent in annotations.get("entities"):
    ner.add_label(ent[2])
    
# Disable pipeline components you dont need to change
pipe_exceptions = ["ner", "trf_wordpiecer", "trf_tok2vec"]
unaffected_pipes = [pipe for pipe in nlp.pipe_names if pipe not in pipe_exceptions]
In [6]:
# Import requirements
import random
from spacy.util import minibatch, compounding
from pathlib import Path

# TRAINING THE MODEL
with nlp.disable_pipes(*unaffected_pipes):

  # Training for 30 iterations
  for iteration in range(300):

    # shuufling examples  before every iteration
    random.shuffle(TRAIN_DATA)
    losses = {}
    # batch up the examples using spaCy's minibatch
    batches = minibatch(TRAIN_DATA, size=compounding(4.0, 32.0, 1.001))
    for batch in batches:
        texts, annotations = zip(*batch)
        nlp.update(
                    texts,  # batch of texts
                    annotations,  # batch of annotations
                    drop=0.5,  # dropout - make it harder to memorise data
                    losses=losses,
                )
        print("Losses", losses)
Losses {'ner': 6.793484245426953}
Losses {'ner': 11.22245792252943}
Losses {'ner': 12.999693839577958}
Losses {'ner': 16.720011212630197}
Losses {'ner': 22.314620407627444}
Losses {'ner': 3.4436206984100863}
Losses {'ner': 3.6815590899204835}
Losses {'ner': 4.65945196559187}
Losses {'ner': 12.351311054517282}
Losses {'ner': 17.30548254583846}
Losses {'ner': 4.034456621389836}
Losses {'ner': 6.063596666761441}
Losses {'ner': 13.533002676878823}
Losses {'ner': 15.843369366608385}
Losses {'ner': 19.643523378304963}
Losses {'ner': 5.578153549526178}
Losses {'ner': 10.864336172882759}
Losses {'ner': 13.35362255142536}
Losses {'ner': 18.992132167939417}
Losses {'ner': 19.441392632019415}
Losses {'ner': 0.2953134557210433}
Losses {'ner': 5.536041476621904}
Losses {'ner': 10.480906741459279}
Losses {'ner': 13.397003056385387}
Losses {'ner': 13.686604858527375}
Losses {'ner': 6.776049214517116}
Losses {'ner': 7.188584149613234}
Losses {'ner': 9.517494688889201}
Losses {'ner': 11.517572496379216}
Losses {'ner': 12.24777794563488}
Losses {'ner': 0.636776294442825}
Losses {'ner': 2.6147034158093447}
Losses {'ner': 4.220808923160803}
Losses {'ner': 8.08370939447559}
Losses {'ner': 8.086251394740003}
Losses {'ner': 1.7114613079611445}
Losses {'ner': 5.223225952096982}
Losses {'ner': 7.277183616970433}
Losses {'ner': 9.28402422615909}
Losses {'ner': 9.814013403203944}
Losses {'ner': 3.5940807928709546}
Losses {'ner': 5.6107135348865995}
Losses {'ner': 8.85219393474108}
Losses {'ner': 10.453919755003653}
Losses {'ner': 10.463092676712904}
Losses {'ner': 1.6589720522752032}
Losses {'ner': 3.1273618906004685}
Losses {'ner': 6.810165939243916}
Losses {'ner': 8.79576247142063}
Losses {'ner': 9.540278781187908}
Losses {'ner': 1.3028818591419622}
Losses {'ner': 3.7296083254195764}
Losses {'ner': 5.745377404383362}
Losses {'ner': 6.880968070471226}
Losses {'ner': 11.516633703150092}
Losses {'ner': 0.0071973120711845695}
Losses {'ner': 3.2924554609844563}
Losses {'ner': 3.3510597148260786}
Losses {'ner': 4.9933133354525125}
Losses {'ner': 6.294441871187701}
Losses {'ner': 2.2681492138144677}
Losses {'ner': 3.72791449640863}
Losses {'ner': 4.126896042338558}
Losses {'ner': 6.843974368294596}
Losses {'ner': 9.805656714487384}
Losses {'ner': 1.6219459306994395}
Losses {'ner': 1.6283929038436327}
Losses {'ner': 3.5182508186298946}
Losses {'ner': 6.212706752013901}
Losses {'ner': 6.25000444890145}
Losses {'ner': 1.1253539330427884}
Losses {'ner': 1.1256410144254687}
Losses {'ner': 4.716753020648298}
Losses {'ner': 5.151065874615938}
Losses {'ner': 5.151607864792979}
Losses {'ner': 2.2113986849844878}
Losses {'ner': 2.6213282372746107}
Losses {'ner': 5.985223468739605}
Losses {'ner': 5.985851843129012}
Losses {'ner': 5.990487230768029}
Losses {'ner': 3.147708846663627}
Losses {'ner': 3.4843309222851895}
Losses {'ner': 4.627558185647595}
Losses {'ner': 4.9361510676003855}
Losses {'ner': 6.8415589969226085}
Losses {'ner': 0.8266773219604788}
Losses {'ner': 2.0622825462289898}
Losses {'ner': 2.1047771957027157}
Losses {'ner': 2.108320597349027}
Losses {'ner': 4.448810713033399}
Losses {'ner': 0.01155432742196183}
Losses {'ner': 0.2685400204087074}
Losses {'ner': 5.07763227942867}
Losses {'ner': 7.652900192080395}
Losses {'ner': 7.800724359820393}
Losses {'ner': 0.4986674397902675}
Losses {'ner': 0.525948429278948}
Losses {'ner': 0.7517783734000218}
Losses {'ner': 2.980656004201343}
Losses {'ner': 2.9809900089753896}
Losses {'ner': 0.3298376278135944}
Losses {'ner': 0.4864726600381797}
Losses {'ner': 2.1952483233672293}
Losses {'ner': 2.5562373987790648}
Losses {'ner': 2.8321571320563312}
Losses {'ner': 0.13427360101229624}
Losses {'ner': 0.7293831895179892}
Losses {'ner': 2.7285636895625203}
Losses {'ner': 6.241061565644902}
Losses {'ner': 6.241069861393476}
Losses {'ner': 0.0018209653428016281}
Losses {'ner': 0.002166793277961432}
Losses {'ner': 0.02006351917742588}
Losses {'ner': 0.5432098374555485}
Losses {'ner': 0.7846155536323474}
Losses {'ner': 0.8763997209989611}
Losses {'ner': 2.796796531938755}
Losses {'ner': 2.7967986863535694}
Losses {'ner': 2.841003016523067}
Losses {'ner': 2.841003748361575}
Losses {'ner': 2.2504532393163856e-07}
Losses {'ner': 2.1207791301128873}
Losses {'ner': 3.645608287275345}
Losses {'ner': 3.645622295085327}
Losses {'ner': 3.645785942153697}
Losses {'ner': 1.2952311014329965}
Losses {'ner': 1.2962882244871032}
Losses {'ner': 1.9941729686439345}
Losses {'ner': 1.9942901805610809}
Losses {'ner': 1.9955487613483989}
Losses {'ner': 0.04501903937392766}
Losses {'ner': 2.278699452399637}
Losses {'ner': 2.2788098613178036}
Losses {'ner': 4.171827080177487}
Losses {'ner': 5.798259984927087}
Losses {'ner': 1.9570832347637006}
Losses {'ner': 1.992760547754443}
Losses {'ner': 4.144613660041396}
Losses {'ner': 4.144616299460533}
Losses {'ner': 6.0388564402772245}
Losses {'ner': 5.944446286321181e-06}
Losses {'ner': 0.4389285397444951}
Losses {'ner': 1.5461257307101364}
Losses {'ner': 2.779942529974952}
Losses {'ner': 2.7906120667526695}
Losses {'ner': 0.18862055778356535}
Losses {'ner': 1.6245406331645782}
Losses {'ner': 1.6246151508465054}
Losses {'ner': 1.6254152617003816}
Losses {'ner': 2.7249306782860954}
Losses {'ner': 1.2347424742186686}
Losses {'ner': 2.396220427537693}
Losses {'ner': 3.3583024738558676}
Losses {'ner': 3.3583468924159545}
Losses {'ner': 3.358350345170464}
Losses {'ner': 3.511540418360548e-05}
Losses {'ner': 9.044514253547797e-05}
Losses {'ner': 1.4734790067782555}
Losses {'ner': 2.2060512824188834}
Losses {'ner': 2.206055593917633}
Losses {'ner': 1.7915369134170476}
Losses {'ner': 1.7988887674213514}
Losses {'ner': 1.8369329738833267}
Losses {'ner': 1.8371690362969468}
Losses {'ner': 3.796448935007074}
Losses {'ner': 0.17714454740895746}
Losses {'ner': 1.4932812116999017}
Losses {'ner': 1.4936553434480182}
Losses {'ner': 1.4942664369495362}
Losses {'ner': 2.3777366247344855}
Losses {'ner': 1.8610296161676234}
Losses {'ner': 1.8610437592066793}
Losses {'ner': 1.8611301865942376}
Losses {'ner': 2.767211547834605}
Losses {'ner': 2.849095265957959}
Losses {'ner': 0.00014264387671858447}
Losses {'ner': 0.03099238494014067}
Losses {'ner': 0.030998299150459763}
Losses {'ner': 1.0066490946387483}
Losses {'ner': 1.006660018903725}
Losses {'ner': 0.02136822724777467}
Losses {'ner': 0.20556344395880677}
Losses {'ner': 0.20812446317123093}
Losses {'ner': 1.859008424276118}
Losses {'ner': 2.4957501802599382}
Losses {'ner': 1.9103881017557e-05}
Losses {'ner': 0.003307901951048464}
Losses {'ner': 0.2601360886385411}
Losses {'ner': 0.2637238486226776}
Losses {'ner': 2.2688439023919638}
Losses {'ner': 1.3327925199855777}
Losses {'ner': 1.3378709227248242}
Losses {'ner': 1.8618570853293503}
Losses {'ner': 1.8619477649397786}
Losses {'ner': 3.706498853945373}
Losses {'ner': 7.939051109295825e-07}
Losses {'ner': 4.49256483336935e-06}
Losses {'ner': 1.9998486821881238}
Losses {'ner': 3.6794472253400015}
Losses {'ner': 5.6294693699793825}
Losses {'ner': 0.0009747663043426225}
Losses {'ner': 0.0016291928723674576}
Losses {'ner': 3.8493588820775626}
Losses {'ner': 4.037712723919587}
Losses {'ner': 4.037712732780973}
Losses {'ner': 1.6269758114336244e-06}
Losses {'ner': 0.007913523966331465}
Losses {'ner': 1.4598360683840594}
Losses {'ner': 1.48046277747679}
Losses {'ner': 1.4805080495283278}
Losses {'ner': 6.066155341932777e-08}
Losses {'ner': 0.013038177026904685}
Losses {'ner': 0.02523724979046238}
Losses {'ner': 1.896323109198792}
Losses {'ner': 1.8963231271228458}
Losses {'ner': 2.2223322394576828}
Losses {'ner': 3.0520088234183813}
Losses {'ner': 3.0520091638886844}
Losses {'ner': 3.214959132067746}
Losses {'ner': 3.2149859954851516}
Losses {'ner': 0.25924304673404563}
Losses {'ner': 1.0231498246268025}
Losses {'ner': 1.0234209887296988}
Losses {'ner': 1.055596707994928}
Losses {'ner': 1.055666479897923}
Losses {'ner': 0.0015219497108712732}
Losses {'ner': 0.0015221152179757291}
Losses {'ner': 1.3080942475201425}
Losses {'ner': 1.30810594692553}
Losses {'ner': 1.7769817441352282}
Losses {'ner': 0.2662247664381976}
Losses {'ner': 0.2663257162642301}
Losses {'ner': 0.26632652268018}
Losses {'ner': 0.3682150375198287}
Losses {'ner': 0.5795975204598893}
Losses {'ner': 0.0003573035351433877}
Losses {'ner': 0.2500960991764709}
Losses {'ner': 2.8914195721884344}
Losses {'ner': 2.8914205674126974}
Losses {'ner': 2.891446572320045}
Losses {'ner': 1.5451892552390945}
Losses {'ner': 1.5711554509654144}
Losses {'ner': 1.5711554968311214}
Losses {'ner': 1.5711583902582278}
Losses {'ner': 1.7898388005469907}
Losses {'ner': 0.03770482944020448}
Losses {'ner': 0.0613161028831446}
Losses {'ner': 0.06168050442018851}
Losses {'ner': 0.061680962781734586}
Losses {'ner': 0.06168098202901408}
Losses {'ner': 0.09786139128875737}
Losses {'ner': 0.09798271001253382}
Losses {'ner': 0.12889271964240284}
Losses {'ner': 0.12889465021494678}
Losses {'ner': 0.12889531946185048}
Losses {'ner': 0.0698594831392243}
Losses {'ner': 0.0698595017255927}
Losses {'ner': 0.09651936576350088}
Losses {'ner': 0.09895458248653888}
Losses {'ner': 0.09895476916511753}
Losses {'ner': 0.0013747215123573187}
Losses {'ner': 0.0013748601736074913}
Losses {'ner': 0.001374881548786951}
Losses {'ner': 0.8049622035700091}
Losses {'ner': 0.8049622043954353}
Losses {'ner': 7.499334320204554e-09}
Losses {'ner': 0.00021386599544505336}
Losses {'ner': 0.00021412208652944742}
Losses {'ner': 0.00021500560926795823}
Losses {'ner': 0.5927684671607404}
Losses {'ner': 0.04049796338155891}
Losses {'ner': 0.48533744192132416}
Losses {'ner': 0.4853393443193629}
Losses {'ner': 0.48533955915073423}
Losses {'ner': 0.536230133653}
Losses {'ner': 1.4283099695061585e-06}
Losses {'ner': 1.2255422092697321}
Losses {'ner': 1.2570349518606865}
Losses {'ner': 1.2570377372171972}
Losses {'ner': 1.2570377665180772}
Losses {'ner': 0.15305622572947758}
Losses {'ner': 0.15334020650604055}
Losses {'ner': 0.1533423472765147}
Losses {'ner': 0.15341868886233664}
Losses {'ner': 0.20878757668751252}
Losses {'ner': 1.8472105535218513}
Losses {'ner': 1.8472115160226832}
Losses {'ner': 1.847760779783838}
Losses {'ner': 1.8514789911499263}
Losses {'ner': 1.8514795001535225}
Losses {'ner': 0.24716954296272522}
Losses {'ner': 2.0872459719176963}
Losses {'ner': 2.0872459726432475}
Losses {'ner': 2.0872577077980154}
Losses {'ner': 3.34116060303323}
Losses {'ner': 3.902583104564705e-06}
Losses {'ner': 3.991191915850788e-06}
Losses {'ner': 0.06865872214275269}
Losses {'ner': 2.148067574921406}
Losses {'ner': 2.148067590181034}
Losses {'ner': 2.913609190323936e-07}
Losses {'ner': 4.5262755827187934e-07}
Losses {'ner': 0.04211657654479903}
Losses {'ner': 0.1738999388537689}
Losses {'ner': 2.1718979078842877}
Losses {'ner': 8.38127013037821e-06}
Losses {'ner': 0.00019690387308599766}
Losses {'ner': 0.0001970348867160092}
Losses {'ner': 0.16312216209780583}
Losses {'ner': 0.5444467111136101}
Losses {'ner': 0.0005217343549279664}
Losses {'ner': 0.0005217547055351286}
Losses {'ner': 0.0005218053992199923}
Losses {'ner': 1.8100716826888297}
Losses {'ner': 1.810071690987122}
Losses {'ner': 1.0173410786296078e-06}
Losses {'ner': 1.9600230918274268}
Losses {'ner': 1.9638089759718658}
Losses {'ner': 1.9638112695834442}
Losses {'ner': 1.9642112744778069}
Losses {'ner': 1.17004433648326e-10}
Losses {'ner': 1.5984283852729333}
Losses {'ner': 1.5984293170759352}
Losses {'ner': 3.062045016291074}
Losses {'ner': 3.062045102885809}
Losses {'ner': 1.2483731576856931}
Losses {'ner': 1.2483937295846168}
Losses {'ner': 3.0848936645514784}
Losses {'ner': 3.08489366929984}
Losses {'ner': 3.0848936706153554}
Losses {'ner': 1.0034950176573516e-07}
Losses {'ner': 1.0034988151544635e-07}
Losses {'ner': 1.9741840815612972}
Losses {'ner': 1.9741840825644248}
Losses {'ner': 3.640318807761091}
Losses {'ner': 1.729604009406903}
Losses {'ner': 1.7296040181229941}
Losses {'ner': 1.7296040230347183}
Losses {'ner': 1.7296040282424068}
Losses {'ner': 1.7296042506207912}
Losses {'ner': 0.9401483952511007}
Losses {'ner': 0.9401602044255436}
Losses {'ner': 0.9401610790418324}
Losses {'ner': 0.9403490444104955}
Losses {'ner': 0.9670788661166326}
Losses {'ner': 0.0009144096438498101}
Losses {'ner': 0.0013426856895344189}
Losses {'ner': 0.8650475968782829}
Losses {'ner': 0.8650475984006015}
Losses {'ner': 0.8650476383382986}
Losses {'ner': 3.0407321027769996e-07}
Losses {'ner': 3.054226105546471e-07}
Losses {'ner': 3.0635201478226117e-07}
Losses {'ner': 2.1605326628423906}
Losses {'ner': 2.160532669641428}
Losses {'ner': 0.0002749636761925355}
Losses {'ner': 0.3713721720103076}
Losses {'ner': 0.371372235791661}
Losses {'ner': 0.37137431408508526}
Losses {'ner': 0.3713743171041682}
Losses {'ner': 1.5578989344747218e-07}
Losses {'ner': 0.13556803565514935}
Losses {'ner': 0.1356482022390716}
Losses {'ner': 0.13564820398664182}
Losses {'ner': 0.13564829143367085}
Losses {'ner': 3.435700579702278e-05}
Losses {'ner': 4.627806401226081e-05}
Losses {'ner': 1.2119977941309705}
Losses {'ner': 1.2120003275596232}
Losses {'ner': 1.21200076127665}
Losses {'ner': 3.5242204358994895e-09}
Losses {'ner': 1.9652044769644557}
Losses {'ner': 3.9468779923770456}
Losses {'ner': 3.946878002343751}
Losses {'ner': 3.9468780467897333}
Losses {'ner': 1.998627870591277}
Losses {'ner': 3.9981219776797037}
Losses {'ner': 3.998121977681028}
Losses {'ner': 3.9981348793611797}
Losses {'ner': 3.998134914713578}
Losses {'ner': 1.99100779058371}
Losses {'ner': 1.9910078034104308}
Losses {'ner': 1.9910378844879346}
Losses {'ner': 1.9910440153836964}
Losses {'ner': 1.991044272204714}
Losses {'ner': 3.006357551374797e-08}
Losses {'ner': 2.969950190505359}
Losses {'ner': 2.994810723965377}
Losses {'ner': 2.994811180163457}
Losses {'ner': 2.9948450739687313}
Losses {'ner': 1.9177377689041581}
Losses {'ner': 1.9177441073483468}
Losses {'ner': 1.9182416676504987}
Losses {'ner': 1.9182464293676265}
Losses {'ner': 1.9262081736816072}
Losses {'ner': 0.03620493129017738}
Losses {'ner': 1.6824359925830514}
Losses {'ner': 1.682436116964832}
Losses {'ner': 3.2921452706100376}
Losses {'ner': 3.2921455541487377}
Losses {'ner': 6.25660200653685e-06}
Losses {'ner': 1.9342739203898862}
Losses {'ner': 1.9367648215059858}
Losses {'ner': 1.9367648225824525}
Losses {'ner': 1.9370671280119578}
Losses {'ner': 1.2463544901663822e-06}
Losses {'ner': 1.8803240501069225}
Losses {'ner': 1.8803240758889606}
Losses {'ner': 1.8803334469971085}
Losses {'ner': 1.8803335091777966}
Losses {'ner': 3.235556203178207e-05}
Losses {'ner': 0.00013233363096661508}
Losses {'ner': 0.005535167925730652}
Losses {'ner': 0.005535168132667602}
Losses {'ner': 1.991643243071477}
Losses {'ner': 0.0005491575660340933}
Losses {'ner': 0.5197113327993095}
Losses {'ner': 0.5197115500648383}
Losses {'ner': 0.5197116557130869}
Losses {'ner': 1.1535317518330916}
Losses {'ner': 3.6428619691218286e-06}
Losses {'ner': 1.3965212148056725}
Losses {'ner': 1.3966105070325372}
Losses {'ner': 1.433558346060225}
Losses {'ner': 3.2459970603466672}
Losses {'ner': 1.6331948699743672e-08}
Losses {'ner': 1.584764130697874e-07}
Losses {'ner': 8.737415081918867e-06}
Losses {'ner': 1.095986569280887}
Losses {'ner': 1.0959865744519273}
Losses {'ner': 1.851649777645658}
Losses {'ner': 1.8516692145879168}
Losses {'ner': 1.8517778638838134}
Losses {'ner': 1.851778046854157}
Losses {'ner': 1.8517815388040444}
Losses {'ner': 2.4211886849916594e-09}
Losses {'ner': 0.327997234118272}
Losses {'ner': 0.32799723425112076}
Losses {'ner': 0.3279972357615274}
Losses {'ner': 1.861953525404898}
Losses {'ner': 1.0656511746421424e-07}
Losses {'ner': 5.902870747994547e-07}
Losses {'ner': 0.011774220888907063}
Losses {'ner': 1.5894964162648886}
Losses {'ner': 1.5894964180944504}
Losses {'ner': 2.4603276966635873e-11}
Losses {'ner': 1.9396619802148485}
Losses {'ner': 1.9537743477508602}
Losses {'ner': 1.9537786865440776}
Losses {'ner': 1.9537786865498208}
Losses {'ner': 1.3953864687920697e-07}
Losses {'ner': 0.0023697239337364022}
Losses {'ner': 0.0023698670836624068}
Losses {'ner': 0.002369867110058304}
Losses {'ner': 0.02569027979790258}
Losses {'ner': 1.3231308211477657e-09}
Losses {'ner': 2.6323620719641557e-06}
Losses {'ner': 3.130774227665342e-06}
Losses {'ner': 0.0015803815494009107}
Losses {'ner': 0.0019012403315573508}
Losses {'ner': 0.00011477283476936575}
Losses {'ner': 0.00011478564403479763}
Losses {'ner': 0.0001147870180493834}
Losses {'ner': 0.0001220788334142942}
Losses {'ner': 0.00012208174231436548}
Losses {'ner': 8.231018469521292e-06}
Losses {'ner': 8.233649244433769e-06}
Losses {'ner': 3.610876627396926e-05}
Losses {'ner': 0.004212419417699621}
Losses {'ner': 0.004384822608773316}
Losses {'ner': 2.5547091640545705e-05}
Losses {'ner': 2.5554897648342223e-05}
Losses {'ner': 2.631116504727908e-05}
Losses {'ner': 2.6449587354382918e-05}
Losses {'ner': 3.978373075188574e-05}
Losses {'ner': 6.179613105814418e-11}
Losses {'ner': 2.803885186847664e-10}
Losses {'ner': 7.627628717247176e-05}
Losses {'ner': 7.628060787119364e-05}
Losses {'ner': 0.0010115808956123865}
Losses {'ner': 1.1748573793767141e-06}
Losses {'ner': 0.00014457556465019493}
Losses {'ner': 0.00016275733067768977}
Losses {'ner': 0.0001627654105668306}
Losses {'ner': 0.00016405569990888551}
Losses {'ner': 0.013529394503864457}
Losses {'ner': 0.01352939452612144}
Losses {'ner': 0.013531169459427757}
Losses {'ner': 0.013538839008469682}
Losses {'ner': 0.013538924606591245}
Losses {'ner': 0.0005713476612683325}
Losses {'ner': 0.0006046322449145356}
Losses {'ner': 0.0006046374480809627}
Losses {'ner': 0.0006046398985841016}
Losses {'ner': 0.000604643582234027}
Losses {'ner': 2.638922961377408e-07}
Losses {'ner': 3.4655810686038077e-07}
Losses {'ner': 0.003512165166097548}
Losses {'ner': 0.0035953377676622563}
Losses {'ner': 0.003685806869246914}
Losses {'ner': 3.0546770800478324e-09}
Losses {'ner': 0.051456391189723824}
Losses {'ner': 0.051456391217140324}
Losses {'ner': 0.05145691459139853}
Losses {'ner': 0.05146992030424576}
Losses {'ner': 2.2104228067712506e-08}
Losses {'ner': 2.2304883066558538e-08}
Losses {'ner': 0.023684076709755583}
Losses {'ner': 0.02368457838071719}
Losses {'ner': 0.023684578668602064}
Losses {'ner': 4.4397697295717244e-07}
Losses {'ner': 6.114282149398979e-07}
Losses {'ner': 0.16160812398578484}
Losses {'ner': 0.16214494341073157}
Losses {'ner': 0.16214494492190015}
Losses {'ner': 9.55195922036805e-09}
Losses {'ner': 2.6812473295095465e-05}
Losses {'ner': 2.686801277038533e-05}
Losses {'ner': 2.6868158722734196e-05}
Losses {'ner': 0.00020660717325774903}
Losses {'ner': 4.771713518080976e-08}
Losses {'ner': 1.6027310187134752e-07}
Losses {'ner': 3.121093106970471e-07}
Losses {'ner': 0.001187752365208355}
Losses {'ner': 0.0011877541392566852}
Losses {'ner': 1.3403295214068652e-06}
Losses {'ner': 1.523794505760222e-06}
Losses {'ner': 1.1468691774934434e-05}
Losses {'ner': 1.1472814962289206e-05}
Losses {'ner': 1.147281529115795e-05}
Losses {'ner': 0.10244324427698827}
Losses {'ner': 0.1024433933777601}
Losses {'ner': 0.10269233487720941}
Losses {'ner': 0.10269267318276395}
Losses {'ner': 0.10269268058638108}
Losses {'ner': 1.1821967859731875e-09}
Losses {'ner': 6.957340248703683e-06}
Losses {'ner': 7.251314858462595e-06}
Losses {'ner': 0.0003835130953135987}
Losses {'ner': 0.0011205000689471}
Losses {'ner': 3.2747876680234785e-05}
Losses {'ner': 3.318074771480328e-05}
Losses {'ner': 3.31820473760471e-05}
Losses {'ner': 0.0010795800520640335}
Losses {'ner': 0.0010795800814831155}
Losses {'ner': 0.00015693866803738585}
Losses {'ner': 0.00033225561914048145}
Losses {'ner': 0.00033290149473212124}
Losses {'ner': 0.0003343168965500414}
Losses {'ner': 0.00033434252207617245}
Losses {'ner': 6.190997750982197e-06}
Losses {'ner': 6.191238030095877e-06}
Losses {'ner': 6.1960745945698836e-06}
Losses {'ner': 6.1975084838399525e-06}
Losses {'ner': 6.197525337877753e-06}
Losses {'ner': 5.7090253444097635e-09}
Losses {'ner': 3.6520415350445327e-06}
Losses {'ner': 1.1295738991976038e-05}
Losses {'ner': 1.1295764048033127e-05}
Losses {'ner': 0.13069567164134663}
Losses {'ner': 2.0040825342379556}
Losses {'ner': 2.004082535463429}
Losses {'ner': 2.027613693808802}
Losses {'ner': 2.0276137000814964}
Losses {'ner': 2.0276137012530016}
Losses {'ner': 3.161793045889733e-10}
Losses {'ner': 0.24827794012089982}
Losses {'ner': 0.24999016458997253}
Losses {'ner': 0.24999016493988244}
Losses {'ner': 0.24999016494979454}
Losses {'ner': 0.6959256784091454}
Losses {'ner': 0.6959256908898022}
Losses {'ner': 0.6959256927796964}
Losses {'ner': 0.6959256949356715}
Losses {'ner': 0.6999607324399691}
Losses {'ner': 5.80108395396025e-05}
Losses {'ner': 5.844627298437495e-05}
Losses {'ner': 5.8446313944310485e-05}
Losses {'ner': 7.337102474952684e-05}
Losses {'ner': 0.00012155630745397812}
Losses {'ner': 1.9849603099544e-08}
Losses {'ner': 2.0615368236992316e-08}
Losses {'ner': 9.912786669660551e-08}
Losses {'ner': 9.912788719134724e-08}
Losses {'ner': 9.941641257149126e-08}
Losses {'ner': 1.347658124602965e-08}
Losses {'ner': 2.8269725374749818e-08}
Losses {'ner': 0.09195529472443319}
Losses {'ner': 0.0979763466644419}
Losses {'ner': 0.09815110322376037}
Losses {'ner': 1.2671560505040498e-08}
Losses {'ner': 1.1418934451555041e-07}
Losses {'ner': 1.7763921975577533e-07}
Losses {'ner': 0.01661084110910092}
Losses {'ner': 0.01661084415261856}
Losses {'ner': 4.058614152949993e-06}
Losses {'ner': 4.0651194458967965e-06}
Losses {'ner': 4.065672348082943e-06}
Losses {'ner': 4.065713435757823e-06}
Losses {'ner': 4.065713510023856e-06}
Losses {'ner': 7.228460324632621e-13}
Losses {'ner': 3.974040516608506e-11}
Losses {'ner': 1.930487543317739e-06}
Losses {'ner': 3.074268406309261e-05}
Losses {'ner': 3.98229227891458e-05}
Losses {'ner': 2.3784138794963215e-13}
Losses {'ner': 5.415247542411035e-06}
Losses {'ner': 5.425137359634933e-06}
Losses {'ner': 5.4251377001528885e-06}
Losses {'ner': 5.42998125733539e-06}
Losses {'ner': 1.124539361162726e-10}
Losses {'ner': 0.00012485364328412564}
Losses {'ner': 0.0004284866894442998}
Losses {'ner': 0.0004284867020525456}
Losses {'ner': 0.00042848670457390584}
Losses {'ner': 3.067101358007985e-10}
Losses {'ner': 3.8423751946581025e-06}
Losses {'ner': 3.842384352269531e-06}
Losses {'ner': 3.842384834688087e-06}
Losses {'ner': 3.860843784491352e-06}
Losses {'ner': 0.007706685139076031}
Losses {'ner': 0.0077066851592687965}
Losses {'ner': 0.0077066860273468174}
Losses {'ner': 0.0077066860393152446}
Losses {'ner': 0.007706686059632308}
Losses {'ner': 6.906608486360949e-08}
Losses {'ner': 7.035078610950978e-08}
Losses {'ner': 8.062520973465664e-07}
Losses {'ner': 8.062578499645634e-07}
Losses {'ner': 9.60348601824234e-07}
Losses {'ner': 1.10212192819129e-09}
Losses {'ner': 2.457474360466436e-09}
Losses {'ner': 2.4632132422292064e-09}
Losses {'ner': 2.6325102770298976e-09}
Losses {'ner': 2.720433582403171e-09}
Losses {'ner': 0.005779818739312797}
Losses {'ner': 0.006934500923846978}
Losses {'ner': 0.006934597702406502}
Losses {'ner': 0.006934598406470476}
Losses {'ner': 0.006950469563842528}
Losses {'ner': 0.004606261693457948}
Losses {'ner': 0.004639286694037149}
Losses {'ner': 0.004639286870528387}
Losses {'ner': 0.0046436813408552635}
Losses {'ner': 1.992989782474466}
Losses {'ner': 1.772342967875348e-09}
Losses {'ner': 2.1651090079944056e-09}
Losses {'ner': 6.803093202481149e-08}
Losses {'ner': 1.273006234528037e-07}
Losses {'ner': 1.2730353115761844e-07}
Losses {'ner': 1.3818832082273423e-06}
Losses {'ner': 1.4014199388839803e-06}
Losses {'ner': 7.664020748656814e-05}
Losses {'ner': 0.0006369514313451418}
Losses {'ner': 0.0006370793597539289}
Losses {'ner': 2.3256142256690158e-07}
Losses {'ner': 2.326882037332787e-07}
Losses {'ner': 0.002495257750912824}
Losses {'ner': 0.0024952642534576555}
Losses {'ner': 0.002495264253915522}
Losses {'ner': 1.0996129591657795e-05}
Losses {'ner': 2.0000107436169814}
Losses {'ner': 2.000010860677158}
Losses {'ner': 2.000010860783878}
Losses {'ner': 2.0000108608262765}
Losses {'ner': 1.8372905252038016e-07}
Losses {'ner': 1.9956965722895788e-07}
Losses {'ner': 2.16302897069822e-07}
Losses {'ner': 2.1705866363853031e-07}
Losses {'ner': 1.7288411043472705e-06}
Losses {'ner': 1.222463640967288e-10}
Losses {'ner': 1.2225907936426846e-10}
Losses {'ner': 0.0005335411602435179}
Losses {'ner': 0.0005335411614422952}
Losses {'ner': 0.0005335411626393923}
Losses {'ner': 1.7010795063106903e-10}
Losses {'ner': 1.978352904068085}
Losses {'ner': 1.9783529040681047}
Losses {'ner': 1.978352904802328}
Losses {'ner': 1.9783529048023951}
Losses {'ner': 7.238108456083755e-06}
Losses {'ner': 7.261049034149158e-06}
Losses {'ner': 1.1471555700884452e-05}
Losses {'ner': 1.447479101910824e-05}
Losses {'ner': 1.4474832658181138e-05}
Losses {'ner': 0.0004558533837553589}
Losses {'ner': 0.0004558761711591539}
Losses {'ner': 0.0005754106967139097}
Losses {'ner': 0.0005754111812350312}
Losses {'ner': 0.0005754111812396802}
Losses {'ner': 0.17851474983927546}
Losses {'ner': 0.17851475040434364}
Losses {'ner': 0.17911772229574005}
Losses {'ner': 0.17911822229550678}
Losses {'ner': 0.1791184360073006}
Losses {'ner': 8.04334610193287e-09}
Losses {'ner': 8.095850243604247e-09}
Losses {'ner': 2.3644266910533615e-06}
Losses {'ner': 0.011768114005238469}
Losses {'ner': 0.011814023119787225}
Losses {'ner': 2.3532004363938674}
Losses {'ner': 2.3551709000610623}
Losses {'ner': 2.3551709027499177}
Losses {'ner': 2.4131509186938915}
Losses {'ner': 2.4131509320913076}
Losses {'ner': 6.949980306432938e-08}
Losses {'ner': 1.709558194955055e-05}
Losses {'ner': 1.7095583500710086e-05}
Losses {'ner': 1.709558356096426e-05}
Losses {'ner': 1.8226540036683244e-05}
Losses {'ner': 2.893253400752029e-10}
Losses {'ner': 8.227063896113969e-07}
Losses {'ner': 8.263509148477753e-07}
Losses {'ner': 9.023429502800804e-07}
Losses {'ner': 9.023705352606662e-07}
Losses {'ner': 3.127005999687545e-09}
Losses {'ner': 0.0012191816197064948}
Losses {'ner': 0.0012197668908887267}
Losses {'ner': 0.0012198537778006456}
Losses {'ner': 0.00122087148912961}
Losses {'ner': 2.077119564411825e-08}
Losses {'ner': 0.0001979367509551632}
Losses {'ner': 0.00019795217676873248}
Losses {'ner': 0.00019795218744358544}
Losses {'ner': 0.0001979521874483864}
Losses {'ner': 3.5932821014028024e-10}
Losses {'ner': 5.176696936061751e-09}
Losses {'ner': 1.0810952283236802e-07}
Losses {'ner': 1.0810964104485945e-07}
Losses {'ner': 1.08113350003652e-07}
Losses {'ner': 2.0272668605937215e-07}
Losses {'ner': 0.00035976876625411333}
Losses {'ner': 0.0003597698158716021}
Losses {'ner': 0.0003695968795337809}
Losses {'ner': 0.00036959687988845173}
Losses {'ner': 1.0961634679375201e-08}
Losses {'ner': 1.116954877254562e-08}
Losses {'ner': 1.3453946097477735e-08}
Losses {'ner': 2.793728970826637e-06}
Losses {'ner': 2.796362523435783e-06}
Losses {'ner': 0.00021410379907269163}
Losses {'ner': 0.006682397707856155}
Losses {'ner': 0.006718951168790665}
Losses {'ner': 0.006718951168791715}
Losses {'ner': 0.006719115047967376}
Losses {'ner': 7.189944330826935e-12}
Losses {'ner': 5.493899304431031e-10}
Losses {'ner': 0.1422141362064579}
Losses {'ner': 2.1420560647670706}
Losses {'ner': 2.142056064768226}
Losses {'ner': 1.464009666475085e-07}
Losses {'ner': 3.1661613740997565e-07}
Losses {'ner': 0.00016389038602116512}
Losses {'ner': 0.0001638905932449853}
Losses {'ner': 0.00016389060257560658}
Losses {'ner': 1.0534788875464927e-09}
Losses {'ner': 1.4519130362846981e-09}
Losses {'ner': 8.541099913698081e-08}
Losses {'ner': 8.559385756167189e-08}
Losses {'ner': 8.561854048361314e-08}
Losses {'ner': 7.357683463716737e-10}
Losses {'ner': 1.4363761740370982e-08}
Losses {'ner': 1.621646382305947e-08}
Losses {'ner': 0.0006684161403206768}
Losses {'ner': 0.0006684161407151829}
Losses {'ner': 2.2577342680411514e-05}
Losses {'ner': 2.4316365358098483e-05}
Losses {'ner': 0.23011665125846553}
Losses {'ner': 0.23011665135771497}
Losses {'ner': 0.23011665137502774}
Losses {'ner': 1.9999997653839794}
Losses {'ner': 2.0099414192149414}
Losses {'ner': 2.0099477892447712}
Losses {'ner': 3.9611699716169815}
Losses {'ner': 3.9611699716175894}
Losses {'ner': 1.0802038238312157e-07}
Losses {'ner': 1.1870655016646195e-07}
Losses {'ner': 0.00016497541977184777}
Losses {'ner': 0.0001649754704086257}
Losses {'ner': 0.00016497566155969408}
Losses {'ner': 2.686074480766873e-15}
Losses {'ner': 1.602967605759634e-09}
Losses {'ner': 0.00760360194602027}
Losses {'ner': 0.007603602594479283}
Losses {'ner': 0.007603642501240125}
Losses {'ner': 1.664871549879684e-12}
Losses {'ner': 4.849069460443846e-09}
Losses {'ner': 0.0001636017238905254}
Losses {'ner': 0.00016362078302877047}
Losses {'ner': 1.7586539448051968}
Losses {'ner': 1.415666930927855e-07}
Losses {'ner': 1.9663722275182983e-07}
Losses {'ner': 1.9667289807794337e-07}
Losses {'ner': 1.9672320915586475e-07}
Losses {'ner': 1.9672936498369315e-07}
Losses {'ner': 1.6533636317207285e-08}
Losses {'ner': 2.3471738382596617e-06}
Losses {'ner': 0.0009447621471735355}
Losses {'ner': 0.0009447621473211414}
Losses {'ner': 0.0009447622207203323}
Losses {'ner': 3.001714591890157e-14}
Losses {'ner': 1.4469339137672249e-09}
Losses {'ner': 3.497146017444327e-09}
Losses {'ner': 3.4973343571496902e-09}
Losses {'ner': 4.089799613281507e-09}
Losses {'ner': 0.006644185377347812}
Losses {'ner': 0.006825776612332352}
Losses {'ner': 0.007022734692002304}
Losses {'ner': 0.00702274606653533}
Losses {'ner': 0.007022754072676432}
Losses {'ner': 0.00786121375858833}
Losses {'ner': 0.3903338075423769}
Losses {'ner': 0.3903468843376456}
Losses {'ner': 0.39035400002263343}
Losses {'ner': 0.3903540000293437}
Losses {'ner': 0.0006201722350047436}
Losses {'ner': 0.0006202094132371414}
Losses {'ner': 0.0006240393310623199}
Losses {'ner': 0.000627068219765211}
Losses {'ner': 0.0006270682200464515}
Losses {'ner': 2.0000000145918047}
Losses {'ner': 2.0000000145919272}
Losses {'ner': 2.0000000149797055}
Losses {'ner': 2.0000000150503134}
Losses {'ner': 2.0000000152952055}
Losses {'ner': 7.899748318454421e-11}
Losses {'ner': 0.0040670449105357255}
Losses {'ner': 0.00406704491126041}
Losses {'ner': 0.004067509046047062}
Losses {'ner': 0.004067509965473714}
Losses {'ner': 0.0016885085964039794}
Losses {'ner': 0.0017008454668869493}
Losses {'ner': 0.0017008456146101183}
Losses {'ner': 0.00170084574477735}
Losses {'ner': 0.0017008457482883112}
Losses {'ner': 5.691590631931123e-07}
Losses {'ner': 6.054436739205e-07}
Losses {'ner': 7.7275018208307e-05}
Losses {'ner': 0.00028664675791259086}
Losses {'ner': 0.0003643116901553876}
Losses {'ner': 2.7154862170220235e-05}
Losses {'ner': 2.7154862258668655e-05}
Losses {'ner': 2.7157353472182254e-05}
Losses {'ner': 2.7377157426078862e-05}
Losses {'ner': 2.737715744896155e-05}
Losses {'ner': 1.1197481268925935e-08}
Losses {'ner': 1.3901995516366247e-07}
Losses {'ner': 1.3902337694535648e-07}
Losses {'ner': 2.2705504149857244e-05}
Losses {'ner': 2.270558683398338e-05}
Losses {'ner': 6.828697212730208e-14}
Losses {'ner': 1.2833151412533689e-08}
Losses {'ner': 1.273478225584435e-07}
Losses {'ner': 6.825999005511148e-07}
Losses {'ner': 7.935339271588363e-07}
Losses {'ner': 2.6198249121542198e-09}
Losses {'ner': 2.5993912887855455e-08}
Losses {'ner': 2.5994329965030963e-08}
Losses {'ner': 6.2159934987926265e-06}
Losses {'ner': 1.6143586806795405}
Losses {'ner': 5.680518948419041e-10}
Losses {'ner': 5.8899365555111535e-09}
Losses {'ner': 1.7363336431762428e-08}
Losses {'ner': 1.9998419334253814}
Losses {'ner': 1.9998419334406519}
Losses {'ner': 1.9995236397160951}
Losses {'ner': 3.946834789441507}
Losses {'ner': 3.9468348216427023}
Losses {'ner': 3.946834822606404}
Losses {'ner': 3.9468348226070096}
Losses {'ner': 5.467095490564722e-11}
Losses {'ner': 1.0187800077453014e-06}
Losses {'ner': 3.759213019085533e-05}
Losses {'ner': 3.759971529744359e-05}
Losses {'ner': 3.759977354544468e-05}
Losses {'ner': 3.529129149814473}
Losses {'ner': 3.532784227023836}
Losses {'ner': 3.532784227024007}
Losses {'ner': 3.5327844842731473}
Losses {'ner': 3.5327844862005966}
Losses {'ner': 3.2525303703634606e-10}
Losses {'ner': 0.4180673107185093}
Losses {'ner': 0.41806731136568565}
Losses {'ner': 0.418067339651348}
Losses {'ner': 0.4180751537339096}
Losses {'ner': 4.9908373049689946e-08}
Losses {'ner': 5.041423948010178e-08}
Losses {'ner': 1.8303509324995202}
Losses {'ner': 1.8303509325084477}
Losses {'ner': 1.8303509325640341}
Losses {'ner': 2.5756262969758737e-08}
Losses {'ner': 2.8281985751951868e-08}
Losses {'ner': 8.43331257898858e-08}
Losses {'ner': 2.934056398487394e-05}
Losses {'ner': 2.9459844778903843e-05}
Losses {'ner': 4.550360036431145e-09}
Losses {'ner': 5.638293091659697e-09}
Losses {'ner': 0.003100529830467033}
Losses {'ner': 0.0031005298531119898}
Losses {'ner': 0.003102797880429124}
Losses {'ner': 7.424308445554451e-05}
Losses {'ner': 7.424375707457557e-05}
Losses {'ner': 7.42437571958334e-05}
Losses {'ner': 0.0004167918253441968}
Losses {'ner': 0.00041679266279414946}
Losses {'ner': 1.3996509887285654e-09}
Losses {'ner': 1.3996534483857265e-09}
Losses {'ner': 7.362075494129085e-09}
Losses {'ner': 0.00043058988985343193}
Losses {'ner': 0.0006812764249999702}
Losses {'ner': 7.364756389836995e-08}
Losses {'ner': 7.364768280063597e-08}
Losses {'ner': 9.77722433148799e-05}
Losses {'ner': 9.77722449048667e-05}
Losses {'ner': 9.77722449119465e-05}
Losses {'ner': 4.789735660469411e-10}
Losses {'ner': 9.800800459344659e-09}
Losses {'ner': 9.800874792447955e-09}
Losses {'ner': 9.80129561407966e-09}
Losses {'ner': 2.7370699331498887e-07}
Losses {'ner': 3.3812683591314496e-12}
Losses {'ner': 8.432061766739202e-10}
Losses {'ner': 5.621275609628188e-07}
Losses {'ner': 5.651739782668264e-07}
Losses {'ner': 5.651824568428071e-07}
Losses {'ner': 2.9232437816354463e-10}
Losses {'ner': 3.085836324329669e-10}
Losses {'ner': 3.107668085771983e-10}
Losses {'ner': 1.603965702631379e-05}
Losses {'ner': 1.9996941745879295}
Losses {'ner': 9.17478758657676e-11}
Losses {'ner': 2.7845944063841725e-05}
Losses {'ner': 2.788613494889536e-05}
Losses {'ner': 0.0065854447278015675}
Losses {'ner': 0.006585444727887927}
Losses {'ner': 1.1277350529923302e-10}
Losses {'ner': 0.00010379292873731906}
Losses {'ner': 0.01634067191874422}
Losses {'ner': 0.016340804535279696}
Losses {'ner': 0.01634080458894813}
Losses {'ner': 2.6172671449817737e-12}
Losses {'ner': 2.889848081985731e-07}
Losses {'ner': 2.8898509265331226e-07}
Losses {'ner': 2.889877823563402e-07}
Losses {'ner': 8.288292605624901e-07}
Losses {'ner': 2.4884691956823153e-12}
Losses {'ner': 3.299748319505103e-07}
Losses {'ner': 3.397924650212392e-07}
Losses {'ner': 3.407647591445072e-07}
Losses {'ner': 0.0009021955722980888}
Losses {'ner': 8.860202537388492e-12}
Losses {'ner': 2.3676527422351003e-07}
Losses {'ner': 0.0001391380316333567}
Losses {'ner': 0.00013913805806536257}
Losses {'ner': 0.00013942823276813239}
Losses {'ner': 3.6675844196743568}
Losses {'ner': 3.6675844196744056}
Losses {'ner': 3.667584672793923}
Losses {'ner': 3.6675846822152436}
Losses {'ner': 3.6675846822539184}
Losses {'ner': 1.1122622084768111e-07}
Losses {'ner': 1.1133688960748529e-07}
Losses {'ner': 1.2009437780442953e-07}
Losses {'ner': 1.1360900001529273e-06}
Losses {'ner': 2.6280015711821942}
Losses {'ner': 2.677989851617737e-09}
Losses {'ner': 5.235198127210138e-09}
Losses {'ner': 0.0303238573803904}
Losses {'ner': 0.03032445993333581}
Losses {'ner': 0.03032446138637143}
Losses {'ner': 3.500080805184415e-13}
Losses {'ner': 1.47508737362637e-11}
Losses {'ner': 4.985761245297261e-07}
Losses {'ner': 4.985761531637206e-07}
Losses {'ner': 5.484127170923924e-07}
Losses {'ner': 1.873676404117219e-11}
Losses {'ner': 1.6091931657301075e-10}
Losses {'ner': 0.003986944632521694}
Losses {'ner': 0.003986948512651837}
Losses {'ner': 0.003989598866766477}
Losses {'ner': 1.8589175557750703e-09}
Losses {'ner': 1.8589946368353761e-09}
Losses {'ner': 8.240243226504908e-08}
Losses {'ner': 8.240288490474281e-08}
Losses {'ner': 8.372617748178687e-08}
Losses {'ner': 1.3642220848231874e-15}
Losses {'ner': 1.5770863032619383e-08}
Losses {'ner': 1.5789432534855377e-08}
Losses {'ner': 2.9386172056401878e-05}
Losses {'ner': 2.9386174905703683e-05}
Losses {'ner': 1.436828399484143e-16}
Losses {'ner': 1.8548696614923652e-07}
Losses {'ner': 2.000536960762375}
Losses {'ner': 2.0005369618808366}
Losses {'ner': 2.000536961880837}
Losses {'ner': 2.1598451901441606e-09}
Losses {'ner': 2.2711273273383173e-09}
Losses {'ner': 2.271452946748621e-09}
Losses {'ner': 2.0023767171714066e-08}
Losses {'ner': 2.0023799300116363e-08}
Losses {'ner': 2.9639407667807565e-09}
Losses {'ner': 2.0423751722223915e-05}
Losses {'ner': 2.0424094919418648e-05}
Losses {'ner': 2.112955554153971e-05}
Losses {'ner': 2.1130720422990336e-05}
Losses {'ner': 1.2507051015518406e-06}
Losses {'ner': 1.257502450059615e-06}
Losses {'ner': 2.271269734317174e-05}
Losses {'ner': 2.2712699986672072e-05}
Losses {'ner': 2.2712907410508592e-05}
Losses {'ner': 7.162243242633878e-12}
Losses {'ner': 7.1816065717225575e-12}
Losses {'ner': 1.999849170926067}
Losses {'ner': 1.9998491733552322}
Losses {'ner': 3.9966508999889085}
Losses {'ner': 1.885120742583255e-14}
Losses {'ner': 8.491256684682108e-09}
Losses {'ner': 1.1207141330187294e-08}
Losses {'ner': 1.1207649980917679e-08}
Losses {'ner': 1.7480756172568121e-06}
Losses {'ner': 3.5257435809216055e-08}
Losses {'ner': 3.5267881206913023e-08}
Losses {'ner': 3.6101384702728944e-08}
Losses {'ner': 3.847627660105381e-08}
Losses {'ner': 3.3575069372863576e-05}
Losses {'ner': 1.8443907740419295e-09}
Losses {'ner': 6.187046925394383e-06}
Losses {'ner': 6.187046961698374e-06}
Losses {'ner': 6.18704703198404e-06}
Losses {'ner': 4.8662646863239645e-05}
Losses {'ner': 0.5906451942883376}
Losses {'ner': 0.5906451942890018}
Losses {'ner': 0.5906455102200644}
Losses {'ner': 0.5906455102201015}
Losses {'ner': 0.5906456979276961}
Losses {'ner': 1.3656707233260786e-06}
Losses {'ner': 1.3656722191311245e-06}
Losses {'ner': 5.180146019343284e-06}
Losses {'ner': 1.9047960601603844}
Losses {'ner': 1.9047960604967331}
Losses {'ner': 0.014366143650152777}
Losses {'ner': 0.014366144840598118}
Losses {'ner': 0.014366145243042343}
Losses {'ner': 0.014370630301915155}
Losses {'ner': 0.01437063038560801}
Losses {'ner': 9.730958569788966e-13}
Losses {'ner': 0.5791281473685512}
Losses {'ner': 0.5791897700279193}
Losses {'ner': 0.579190036119441}
Losses {'ner': 0.5791900361194428}
Losses {'ner': 3.513964985294945e-11}
Losses {'ner': 1.7473745313449482e-10}
Losses {'ner': 5.075090775978778e-07}
Losses {'ner': 5.075135729302221e-07}
Losses {'ner': 5.11807353884796e-07}
Losses {'ner': 4.480849948774081e-13}
Losses {'ner': 0.0015840604198532255}
Losses {'ner': 0.001584339652779847}
Losses {'ner': 0.0015853380594570779}
Losses {'ner': 0.0016657810255956047}
Losses {'ner': 2.491307255761633e-11}
Losses {'ner': 3.2033151987609255e-10}
Losses {'ner': 2.372428898406781e-06}
Losses {'ner': 2.4936408200908002e-06}
Losses {'ner': 2.497584300531365e-06}
Losses {'ner': 3.6190406584618236e-11}
Losses {'ner': 3.6494771365753685e-11}
Losses {'ner': 2.6187169204189626e-08}
Losses {'ner': 0.010008344532985554}
Losses {'ner': 0.010008344534052712}
Losses {'ner': 0.00022977056021373675}
Losses {'ner': 0.00022977056028655702}
Losses {'ner': 0.00022977068012645048}
Losses {'ner': 0.00022977068074613128}
Losses {'ner': 0.0002299232676751398}
Losses {'ner': 0.00123834123226387}
Losses {'ner': 0.00136538892739437}
Losses {'ner': 0.0013653889274525424}
Losses {'ner': 0.0013653889398123258}
Losses {'ner': 0.002768970577523604}
Losses {'ner': 3.1094467117623686e-10}
Losses {'ner': 3.1318599977871724e-10}
Losses {'ner': 0.4089915353802799}
Losses {'ner': 0.4090044835135295}
Losses {'ner': 0.40900448351362356}
Losses {'ner': 4.84086898794792e-07}
Losses {'ner': 8.818269639362592e-07}
Losses {'ner': 1.0050839656751357e-06}
Losses {'ner': 1.0051353384238735e-06}
Losses {'ner': 1.005168880474022e-06}
Losses {'ner': 6.303166353557426e-06}
Losses {'ner': 6.342397783057379e-06}
Losses {'ner': 6.342397809065166e-06}
Losses {'ner': 1.0475372190021772e-05}
Losses {'ner': 1.0484307934988272e-05}
Losses {'ner': 1.2333574917481467e-05}
Losses {'ner': 1.2333583470871404e-05}
Losses {'ner': 1.2344104442392063e-05}
Losses {'ner': 1.2344110124434725e-05}
Losses {'ner': 1.234413138957785e-05}
Losses {'ner': 7.80005885306762e-12}
Losses {'ner': 1.2156048372295666e-11}
Losses {'ner': 0.021450188069942443}
Losses {'ner': 0.021450190684782097}
Losses {'ner': 0.021450190686191587}
Losses {'ner': 1.9173300550349223e-09}
Losses {'ner': 1.955021582647213e-09}
Losses {'ner': 1.9551317697841496e-09}
Losses {'ner': 2.050613205109646e-09}
Losses {'ner': 1.5024337288132826e-08}
Losses {'ner': 6.252728160062323e-09}
Losses {'ner': 6.25886950860321e-09}
Losses {'ner': 6.259352234595106e-09}
Losses {'ner': 0.0001087325904843683}
Losses {'ner': 0.00010873576781921669}
Losses {'ner': 2.1794585608616513e-09}
Losses {'ner': 2.1795476413210515e-09}
Losses {'ner': 2.1859151153658207e-09}
Losses {'ner': 1.3376941458797685e-07}
Losses {'ner': 1.33771740315393e-07}
Losses {'ner': 1.6139304810484115e-08}
Losses {'ner': 1.6142274093821344e-08}
Losses {'ner': 7.797252909990478e-06}
Losses {'ner': 7.889990126257566e-06}
Losses {'ner': 7.889998215542699e-06}
Losses {'ner': 0.0006943893744114151}
Losses {'ner': 0.0006943893746535274}
Losses {'ner': 0.000694389376339441}
Losses {'ner': 0.0006943893763693245}
Losses {'ner': 0.0007140868864263298}
Losses {'ner': 3.311236405517796e-11}
Losses {'ner': 1.260571600827667e-06}
Losses {'ner': 1.2605749134792993e-06}
Losses {'ner': 1.2933652674925728e-06}
Losses {'ner': 1.2933730244801925e-06}
Losses {'ner': 0.00019212442582749764}
Losses {'ner': 0.0001921244746923923}
Losses {'ner': 0.00019212452059259755}
Losses {'ner': 0.00019428644566575384}
Losses {'ner': 0.00019428644610114247}
Losses {'ner': 1.9727381467938203}
Losses {'ner': 1.9727415312541252}
Losses {'ner': 1.9727415314838974}
Losses {'ner': 1.9727492507948856}
Losses {'ner': 1.972749250795299}
Losses {'ner': 2.0589380668837933e-11}
Losses {'ner': 7.03682504593486e-08}
Losses {'ner': 7.057306373992342e-08}
Losses {'ner': 7.057483651944617e-08}
Losses {'ner': 7.057483720828488e-08}
Losses {'ner': 9.366478764853089e-09}
Losses {'ner': 9.45707519749378e-09}
Losses {'ner': 9.485580851470913e-09}
Losses {'ner': 9.522746299312354e-09}
Losses {'ner': 9.887817483070293e-09}
Losses {'ner': 5.102849864577704e-13}
Losses {'ner': 5.258553980470651e-13}
Losses {'ner': 1.5353385222405392e-11}
Losses {'ner': 1.5360544509222928e-11}
Losses {'ner': 1.5567304108321303e-11}
Losses {'ner': 1.636612448434995e-13}
Losses {'ner': 8.048419586439189e-11}
Losses {'ner': 6.422997631393644e-06}
Losses {'ner': 6.422997645530055e-06}
Losses {'ner': 6.423252638074482e-06}
Losses {'ner': 8.516616776479809e-09}
Losses {'ner': 0.366044113270918}
Losses {'ner': 0.3660441163609752}
Losses {'ner': 0.366044196051853}
Losses {'ner': 0.36605011673242294}
Losses {'ner': 1.3599902851937836e-11}
Losses {'ner': 4.5503461447341575e-10}
Losses {'ner': 8.603823205727348e-08}
Losses {'ner': 8.604512225551681e-08}
Losses {'ner': 6.053690334809654e-07}
Losses {'ner': 4.222753296730254e-12}
Losses {'ner': 3.218988098786874e-05}
Losses {'ner': 3.3638856576201474e-05}
Losses {'ner': 3.363885841606006e-05}
Losses {'ner': 3.363919405792938e-05}
Losses {'ner': 2.4541654766063593e-10}
Losses {'ner': 2.0113948596980085e-09}
Losses {'ner': 2.7002824624988542e-09}
Losses {'ner': 2.7048948916411994e-09}
Losses {'ner': 2.708368047970042e-09}
Losses {'ner': 5.3311728315697505e-09}
Losses {'ner': 5.4222202870440186e-09}
Losses {'ner': 1.4601322466838562}
Losses {'ner': 1.460132252780534}
Losses {'ner': 1.4601322527805352}
Losses {'ner': 2.330866773503314e-13}
Losses {'ner': 4.81250808700688e-07}
Losses {'ner': 4.812510537026912e-07}
Losses {'ner': 4.812513726184668e-07}
Losses {'ner': 5.254889091706822e-07}
Losses {'ner': 4.006583307904835e-05}
Losses {'ner': 0.00010670366661462705}
Losses {'ner': 0.00010670378010594608}
Losses {'ner': 0.00010846151604424311}
Losses {'ner': 0.00010846153265636922}
Losses {'ner': 2.521269134720978e-12}
Losses {'ner': 1.2312573007890532e-06}
Losses {'ner': 1.2312573032035272e-06}
Losses {'ner': 1.231262218332434e-06}
Losses {'ner': 1.23126276956146e-06}
Losses {'ner': 8.212292621956555e-12}
Losses {'ner': 1.3645572246479303e-06}
Losses {'ner': 1.3645572319350697e-06}
Losses {'ner': 1.3645581659213802e-06}
Losses {'ner': 1.7255117768681228e-06}
Losses {'ner': 1.30704639460282e-07}
Losses {'ner': 1.402605053801942e-07}
Losses {'ner': 1.934337180692174e-07}
Losses {'ner': 1.966444420584381e-07}
Losses {'ner': 1.969145794993448e-07}
Losses {'ner': 2.0232364287683703e-11}
Losses {'ner': 1.3613557632374262e-09}
Losses {'ner': 1.6924108474665077e-09}
Losses {'ner': 2.1855537817058582e-09}
Losses {'ner': 2.209075668608912e-09}
Losses {'ner': 1.3291657340625512e-11}
Losses {'ner': 8.362210476572453e-09}
Losses {'ner': 1.0619922851894374e-08}
Losses {'ner': 2.3053278224982844e-08}
Losses {'ner': 2.3053592385741175e-08}
Losses {'ner': 6.419885191925206e-13}
Losses {'ner': 8.855649394756907e-13}
Losses {'ner': 0.006884048694663276}
Losses {'ner': 0.034080668356064533}
Losses {'ner': 0.03408073197620836}
Losses {'ner': 7.262357261152471e-14}
Losses {'ner': 7.431204341118718e-08}
Losses {'ner': 7.431944088033504e-08}
Losses {'ner': 7.431944088094026e-08}
Losses {'ner': 7.431948001083828e-08}
Losses {'ner': 1.3100192579532888e-09}
Losses {'ner': 1.3103950035757618e-09}
Losses {'ner': 9.801957195840056e-09}
Losses {'ner': 9.930177141722898e-09}
Losses {'ner': 3.3436448328959627e-07}
Losses {'ner': 0.00016544241960964132}
Losses {'ner': 0.00016544242436227182}
Losses {'ner': 0.00016545533269682913}
Losses {'ner': 0.00016545558959196432}
Losses {'ner': 0.00016545559165968908}
Losses {'ner': 1.6596547181210115e-12}
Losses {'ner': 7.046789518997109e-09}
Losses {'ner': 7.1526475764246335e-09}
Losses {'ner': 7.176573627244562e-09}
Losses {'ner': 2.145310254948817e-06}
Losses {'ner': 2.6411742544920014e-09}
Losses {'ner': 2.944845810349007e-09}
Losses {'ner': 1.4557685728816e-06}
Losses {'ner': 1.4557726720269957e-06}
Losses {'ner': 1.4557726869415903e-06}
Losses {'ner': 4.0000000034988545}
Losses {'ner': 4.000000003502202}
Losses {'ner': 4.000000003504487}
Losses {'ner': 4.000000003535648}
Losses {'ner': 4.000000003568853}
Losses {'ner': 2.000000000000004}
Losses {'ner': 2.000000000336953}
Losses {'ner': 2.0000000425544893}
Losses {'ner': 2.0000000434586256}
Losses {'ner': 2.0000000594142824}
Losses {'ner': 2.0775199838235034e-11}
Losses {'ner': 8.931386774073345e-05}
Losses {'ner': 8.95595809609086e-05}
Losses {'ner': 8.956197744558768e-05}
Losses {'ner': 8.956481497934219e-05}
Losses {'ner': 6.665579408579097e-09}
Losses {'ner': 1.2824611209161941e-05}
Losses {'ner': 1.6941277373687677e-05}
Losses {'ner': 0.0014066294453014344}
Losses {'ner': 0.0014066294453047086}
Losses {'ner': 3.0053448844290882e-05}
Losses {'ner': 3.2967898504181496e-05}
Losses {'ner': 3.303339778098549e-05}
Losses {'ner': 0.0005881942867085978}
Losses {'ner': 0.0005882099952485162}
Losses {'ner': 2.6102196928836347e-10}
Losses {'ner': 8.381484271828643e-07}
Losses {'ner': 1.8897452396670578}
Losses {'ner': 1.8897452435671762}
Losses {'ner': 1.8897452437224662}
Losses {'ner': 2.037063910682377e-11}
Losses {'ner': 3.079016589470299e-11}
Losses {'ner': 3.650704958582839e-11}
Losses {'ner': 6.172346168828495e-05}
Losses {'ner': 6.17234619158632e-05}
Losses {'ner': 3.1128320583792925e-10}
Losses {'ner': 3.1142125817885225e-10}
Losses {'ner': 3.30336889668081e-08}
Losses {'ner': 1.0416774071271128e-06}
Losses {'ner': 2.2672070415760238e-05}
Losses {'ner': 1.36619396870663e-08}
Losses {'ner': 1.3661940125621863e-08}
Losses {'ner': 5.546759526288907e-07}
Losses {'ner': 5.549027669509751e-07}
Losses {'ner': 5.549088141019031e-07}
Losses {'ner': 2.1425632508769956e-13}
Losses {'ner': 2.142992438910811e-13}
Losses {'ner': 4.014580455668607e-08}
Losses {'ner': 4.014580459923453e-08}
Losses {'ner': 1.8703847727270399e-06}
Losses {'ner': 0.029284490047674964}
Losses {'ner': 0.02956570968029636}
Losses {'ner': 0.029565709681764207}
Losses {'ner': 0.029565709681770247}
Losses {'ner': 0.029565709711446297}
Losses {'ner': 3.7303948999293566e-10}
Losses {'ner': 1.804542824413835e-06}
Losses {'ner': 1.804632194139689e-06}
Losses {'ner': 1.8046452412088147e-06}
Losses {'ner': 1.8052976582411506e-06}
Losses {'ner': 1.486150478897413e-11}
Losses {'ner': 1.6674324862782066e-11}
Losses {'ner': 1.828518218578726e-11}
Losses {'ner': 1.8491711495983135e-11}
Losses {'ner': 1.0595330647528404e-09}
Losses {'ner': 7.265460452627965e-13}
Losses {'ner': 7.539614774744242e-13}
Losses {'ner': 1.0352580657464828e-10}
Losses {'ner': 1.0353359222251127e-10}
Losses {'ner': 1.9894772219103942}
Losses {'ner': 2.8791193074839935e-07}
Losses {'ner': 2.8799793688186036e-07}
Losses {'ner': 2.8800128285163597e-07}
Losses {'ner': 2.8809389046800377e-07}
Losses {'ner': 2.8810586043861237e-07}
Losses {'ner': 3.994378600225631e-15}
Losses {'ner': 2.946239064433816e-13}
Losses {'ner': 6.592775665878541e-12}
Losses {'ner': 1.9999934758034033}
Losses {'ner': 1.9999934758096116}
Losses {'ner': 9.067341464849111e-07}
Losses {'ner': 1.304267125379377e-06}
Losses {'ner': 1.304547504677605e-06}
Losses {'ner': 1.3045533122558453e-06}
Losses {'ner': 1.3045533379012935e-06}
Losses {'ner': 4.937889180386318e-12}
Losses {'ner': 1.5349826055124384e-08}
Losses {'ner': 4.261454071996316e-07}
Losses {'ner': 4.2614599478387877e-07}
Losses {'ner': 5.341197810324673e-07}
Losses {'ner': 2.9997756792621288e-12}
Losses {'ner': 3.115036617426872e-12}
Losses {'ner': 1.5931480806681663e-11}
Losses {'ner': 1.4437369347055572e-05}
Losses {'ner': 1.8421943554921343}
Losses {'ner': 6.6646359357884e-11}
Losses {'ner': 0.0006596903148507433}
Losses {'ner': 0.0025599248081873737}
Losses {'ner': 0.002559943375108818}
Losses {'ner': 0.002592036255819722}
Losses {'ner': 3.588487386129748e-09}
Losses {'ner': 3.5931614130733345e-09}
Losses {'ner': 2.2231726985174066e-07}
Losses {'ner': 0.3295081692855841}
Losses {'ner': 0.3611892192774877}
Losses {'ner': 0.0004322562206614716}
Losses {'ner': 0.00043264334399239964}
Losses {'ner': 0.0030300841356199677}
Losses {'ner': 0.003030084135620387}
Losses {'ner': 0.0030300843620427774}
Losses {'ner': 4.852479409735852e-13}
Losses {'ner': 4.018169741621716e-08}
Losses {'ner': 2.187496416217603e-05}
Losses {'ner': 2.190314051965512e-05}
Losses {'ner': 2.1903140528233246e-05}
Losses {'ner': 7.335884824374118e-13}
Losses {'ner': 3.6834449738533544e-07}
Losses {'ner': 4.2808663755789687e-07}
Losses {'ner': 4.281592848660147e-07}
Losses {'ner': 8.832839238553513e-07}
Losses {'ner': 8.674588786858036e-08}
Losses {'ner': 1.9999996099131074}
Losses {'ner': 1.9999996099203123}
Losses {'ner': 1.9999996100178685}
Losses {'ner': 1.9999996100179738}
Losses {'ner': 1.2920850825558155e-13}
Losses {'ner': 1.5415693636804232e-09}
Losses {'ner': 0.001670636032022686}
Losses {'ner': 0.0016706360320236405}
Losses {'ner': 0.0016706360327334332}
Losses {'ner': 4.469212541748844e-05}
Losses {'ner': 4.4839727733979755e-05}
Losses {'ner': 4.48445493695356e-05}
Losses {'ner': 4.4844549534173496e-05}
Losses {'ner': 4.4844554758992805e-05}
Losses {'ner': 2.3715303185484996e-14}
Losses {'ner': 3.119545859766335e-10}
Losses {'ner': 6.41368681006258e-08}
Losses {'ner': 0.7099269077801377}
Losses {'ner': 0.70992690778767}
Losses {'ner': 1.9996962547413744}
Losses {'ner': 1.9996962673524732}
Losses {'ner': 1.9996962693203097}
Losses {'ner': 1.9996962693206335}
Losses {'ner': 1.9997283502510292}
Losses {'ner': 1.5200537263306068e-14}
Losses {'ner': 1.6474763707073622e-10}
Losses {'ner': 1.8525704242463935e-10}
Losses {'ner': 4.1051904567205965e-09}
Losses {'ner': 4.1576374170109405e-09}
Losses {'ner': 0.0027896790225839397}
Losses {'ner': 0.0028142752192642484}
Losses {'ner': 0.0028142753577057294}
Losses {'ner': 0.0028142765938788446}
Losses {'ner': 0.0028142801143464687}
Losses {'ner': 1.9786853790289316}
Losses {'ner': 1.9812386783161795}
Losses {'ner': 1.9812386784130254}
Losses {'ner': 1.981238678458196}
Losses {'ner': 1.9812386784581981}
Losses {'ner': 1.19358418821527e-05}
Losses {'ner': 1.1935841889822218e-05}
Losses {'ner': 1.1935842471794304e-05}
Losses {'ner': 1.1935842562568174e-05}
Losses {'ner': 1.1935842716724521e-05}
Losses {'ner': 1.0155819472283421e-08}
Losses {'ner': 1.0171399718300665e-08}
Losses {'ner': 9.353785444958635e-07}
Losses {'ner': 9.548531822436143e-07}
Losses {'ner': 9.548532168587424e-07}
Losses {'ner': 2.3745519850635216e-11}
Losses {'ner': 0.02082506557926931}
Losses {'ner': 0.02082506559214794}
Losses {'ner': 0.020825075473606115}
Losses {'ner': 0.02082507548668991}
Losses {'ner': 1.1469920735156184e-08}
Losses {'ner': 8.91767566596752e-08}
Losses {'ner': 6.945286441325241e-06}
Losses {'ner': 6.9452994369698064e-06}
Losses {'ner': 6.9463009683845775e-06}
Losses {'ner': 1.1719743673988572e-12}
Losses {'ner': 1.173546638995037e-12}
Losses {'ner': 5.622265382433491e-10}
Losses {'ner': 1.0346077987188717e-09}
Losses {'ner': 2.2755588258508553e-09}
Losses {'ner': 6.419872265479971e-11}
Losses {'ner': 6.419876205278695e-11}
Losses {'ner': 6.409173015569932e-08}
Losses {'ner': 5.006945178996889e-07}
Losses {'ner': 5.033752311945368e-07}
Losses {'ner': 5.061477818177186e-06}
Losses {'ner': 1.1420893225828401e-05}
Losses {'ner': 1.1420894034522237e-05}
Losses {'ner': 1.1421520560025317e-05}
Losses {'ner': 1.1421521887642214e-05}
Losses {'ner': 2.207364415867798e-09}
Losses {'ner': 1.3680042060891662e-06}
Losses {'ner': 4.7882391043230586e-05}
Losses {'ner': 4.788318808410859e-05}
Losses {'ner': 0.0011304944678934069}
Losses {'ner': 5.910514083181137e-14}
Losses {'ner': 1.9999814034104781}
Losses {'ner': 1.9999814034105052}
Losses {'ner': 1.9999814062877959}
Losses {'ner': 1.9999814173997894}
Losses {'ner': 1.589480087577243e-05}
Losses {'ner': 1.5894802170799516e-05}
Losses {'ner': 1.5895149612398472e-05}
Losses {'ner': 1.589514962756106e-05}
Losses {'ner': 1.3196522406713125}
Losses {'ner': 1.229663921478121e-13}
Losses {'ner': 7.339353654255568e-10}
Losses {'ner': 1.317729676654053e-05}
Losses {'ner': 1.3177659357479172e-05}
Losses {'ner': 1.3177659357922429e-05}
Losses {'ner': 9.737463070872703e-17}
Losses {'ner': 1.2426306650287657e-11}
Losses {'ner': 4.933088864364178e-07}
Losses {'ner': 0.698431568999166}
Losses {'ner': 0.6984315690027421}
Losses {'ner': 2.791485462951155e-07}
Losses {'ner': 3.0500737025205466e-06}
Losses {'ner': 3.056556496772178e-06}
Losses {'ner': 3.056608635443863e-06}
Losses {'ner': 3.056608646320827e-06}
Losses {'ner': 0.057389374665211254}
Losses {'ner': 0.05738937466680278}
Losses {'ner': 0.057389374667103615}
Losses {'ner': 0.0573893750830856}
Losses {'ner': 0.08936752777548776}
Losses {'ner': 1.321658495259311e-12}
Losses {'ner': 0.004482905381750389}
Losses {'ner': 0.004482907045851614}
Losses {'ner': 0.0044829070531044535}
Losses {'ner': 0.0044829070531278836}
Losses {'ner': 2.9387885288164896e-13}
Losses {'ner': 4.955130505170781e-13}
Losses {'ner': 5.105720444232785e-13}
Losses {'ner': 9.506519866042534e-13}
Losses {'ner': 1.2292723550807808e-08}
Losses {'ner': 2.1206961780362668e-06}
Losses {'ner': 0.000100560990275225}
Losses {'ner': 0.00010056148349746062}
Losses {'ner': 0.00010056148350036644}
Losses {'ner': 0.00010056157665418096}
In [7]:
# Testing the model
doc = nlp("I was driving a Ford")
print(doc.ents)
print("Entities", [(ent.text, ent.label_) for ent in doc.ents])
(Ford,)
Entities [('Ford', 'PRODUCT')]

Create BentoService for model serving

In [8]:
%%writefile spacy_ner.py


from bentoml import BentoService, api, env, artifacts
from bentoml.frameworks.spacy import SpacyModelArtifact
from bentoml.adapters import JsonInput


@env(auto_pip_dependencies=True)
@artifacts([SpacyModelArtifact('nlp')])
class SpacyNERService(BentoService):
    @api(input=JsonInput(), batch=True)
    def predict(self, parsed_json_list):
        result = []
        for index, parsed_json in enumerate(parsed_json_list):
            doc = self.artifacts.nlp(parsed_json['text'])
            result.append([{'entity': ent.text, 'label': ent.label_} for ent in doc.ents])
        return result
Overwriting spacy_ner.py
In [9]:
from spacy_ner import SpacyNERService

svc = SpacyNERService()
svc.pack('nlp', nlp)

saved_path = svc.save()
[2020-09-22 21:16:26,556] WARNING - Importing from "bentoml.artifact.*" has been deprecated. Instead, use`bentoml.frameworks.*` and `bentoml.service.*`. e.g.:, `from bentoml.frameworks.sklearn import SklearnModelArtifact`, `from bentoml.service.artifacts import BentoServiceArtifact`, `from bentoml.service.artifacts.common import PickleArtifact`
[2020-09-22 21:16:26,614] WARNING - Using BentoML installed in `editable` model, the local BentoML repository including all code changes will be packaged together with saved bundle created, under the './bundled_pip_dependencies' directory of the saved bundle.
[2020-09-22 21:16:27,001] INFO - Using default docker base image: `None` specified inBentoML config file or env var. User must make sure that the docker base image either has Python 3.7 or conda installed.
[2020-09-22 21:16:27,789] INFO - Detected non-PyPI-released BentoML installed, copying local BentoML modulefiles to target saved bundle path..
/usr/local/anaconda3/envs/dev-py3/lib/python3.7/site-packages/setuptools/dist.py:476: UserWarning: Normalizing '0.9.0.pre+6.g4beee0a8.dirty' to '0.9.0rc0+6.g4beee0a8.dirty'
  normalized_version,
warning: no previously-included files matching '*~' found anywhere in distribution
warning: no previously-included files matching '*.pyo' found anywhere in distribution
warning: no previously-included files matching '.git' found anywhere in distribution
warning: no previously-included files matching '.ipynb_checkpoints' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
no previously-included directories found matching 'e2e_tests'
no previously-included directories found matching 'tests'
no previously-included directories found matching 'benchmark'
UPDATING BentoML-0.9.0rc0+6.g4beee0a8.dirty/bentoml/_version.py
set BentoML-0.9.0rc0+6.g4beee0a8.dirty/bentoml/_version.py to '0.9.0.pre+6.g4beee0a8.dirty'
[2020-09-22 21:16:31,692] INFO - BentoService bundle 'SpacyNERService:20200922211627_9930A6' saved to: /Users/bozhaoyu/bentoml/repository/SpacyNERService/20200922211627_9930A6

REST API Model Serving

To start a REST API model server with the BentoService saved above, use the bentoml serve command:

In [14]:
!bentoml serve SpacyNERService:latest
[2020-09-15 16:13:38,999] INFO - Getting latest version SpacyNERService:20200915161253_DC0550
[2020-09-15 16:13:38,999] INFO - Starting BentoML API server in development mode..
[2020-09-15 16:13:39,318] WARNING - Using BentoML installed in `editable` model, the local BentoML repository including all code changes will be packaged together with saved bundle created, under the './bundled_pip_dependencies' directory of the saved bundle.
[2020-09-15 16:13:39,330] WARNING - Saved BentoService bundle version mismatch: loading BentoService bundle create with BentoML version 0.8.6, but loading from BentoML version 0.8.6+43.g53afaa73
 * Serving Flask app "SpacyNERService" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
^C

If you are running this notebook from Google Colab, you can start the dev server with --run-with-ngrok option, to gain acccess to the API endpoint via a public endpoint managed by ngrok:

In [23]:
!bentoml serve SpacyNERService:latest --run-with-ngrok
[2020-09-15 16:39:23,654] INFO - Getting latest version SpacyNERService:20200915161701_4475B2
[2020-09-15 16:39:23,655] INFO - Starting BentoML API server in development mode..
[2020-09-15 16:39:23,920] WARNING - Using BentoML installed in `editable` model, the local BentoML repository including all code changes will be packaged together with saved bundle created, under the './bundled_pip_dependencies' directory of the saved bundle.
[2020-09-15 16:39:23,937] WARNING - Saved BentoService bundle version mismatch: loading BentoService bundle create with BentoML version 0.8.6, but loading from BentoML version 0.8.6+43.g53afaa73
 * Serving Flask app "SpacyNERService" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
7=ngrok by @inconshreveable                                       (Ctrl+C to quit)                                                                                Session Status                connecting                                        Version                       2.3.35                                            Region                        United States (us)                                Web Interface                 http://127.0.0.1:4040                                                                                                             Connections                   ttl     opn     rt1     rt5     p50     p90                                     0       0       0.00    0.00    0.00    0.00                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                      Session Status                online    SsExpires7 hours, 59 minutesVrsion2.3.35            Rgio       United States (us)   Web Interface                 http://127.0.0.1:4040Frwardng htps://5acb0b850a4d.ngrok.io-> http://localhost:                                                                              Connections                   ttl     opn     rt1     rt5     p50     p90                                   0       0       0.00    0.00    0.00    0.00    :/5acb0b850a4d.ngrok.io -> htp:/localhost:5Forwarding                    https://5acb0b850a4d.ngrok.io -> http://localhost:                                                                              Connectionsttlopnrt1 rt5 p5 p9                               0       0       0.00    0.00    0.00    0.00    [2020-09-15 16:39:28,468] INFO -  * Running on http://5acb0b850a4d.ngrok.io
[2020-09-15 16:39:28,468] INFO -  * Traffic stats available on http://127.0.0.1:4040
[[]]
127.0.0.1 - - [15/Sep/2020 16:39:35] "POST /prediction HTTP/1.1" 200 -
WARNING: Logging before flag parsing goes to stderr.
I0915 16:39:35.685670 4697529792 _internal.py:122] 127.0.0.1 - - [15/Sep/2020 16:39:35] "POST /prediction HTTP/1.1" 200 -
8>

Open http://127.0.0.1:5000 to see more information about the REST APIs server in your browser.

Send prediction requeset to the REST API server

Navigate to parent directory of the notebook(so you have reference to the test.jpg image), and run the following curl command to send the image to REST API server and get a prediction result:

curl -i \
    --request POST \
    --header "Content-Type: application/json" \
    --data "{\"text\":\"I am driving BMW\"}" \
    localhost:5000/predict

Containerize model server with Docker

One common way of distributing this model API server for production deployment, is via Docker containers. And BentoML provides a convenient way to do that.

Note that docker is not available in Google Colab. You will need to download and run this notebook locally to try out this containerization with docker feature.

If you already have docker configured, simply run the follow command to product a docker container serving the IrisClassifier prediction service created above:

In [ ]:
!bentoml containerize SpacyNERService:latest
In [ ]:
!docker run -p 5000:5000 spacynerservice

Load saved BentoService

bentoml.load is the API for loading a BentoML packaged model in python:

In [25]:
from bentoml import load

service = load(saved_path)

print(service.predict([{'text': 'I am driving BMW'}]))
[2020-09-15 16:42:57,464] WARNING - Saved BentoService bundle version mismatch: loading BentoService bundle create with BentoML version 0.8.6, but loading from BentoML version 0.8.6+43.g53afaa73
[2020-09-15 16:42:57,465] WARNING - Module `spacy_ner` already loaded, using existing imported module.
[[{'entity': 'BMW', 'label': 'PRODUCT'}]]

Launch inference job from CLI

BentoML cli supports loading and running a packaged model from CLI. With the DataframeInput adapter, the CLI command supports reading input Dataframe data from CLI argument or local csv or json files:

In [27]:
!bentoml run SpacyNERService:latest predict --input "{\"text\":\"I am driving BMW\"}"
[2020-09-15 16:44:08,832] INFO - Getting latest version SpacyNERService:20200915161701_4475B2
[2020-09-15 16:44:08,871] WARNING - Using BentoML installed in `editable` model, the local BentoML repository including all code changes will be packaged together with saved bundle created, under the './bundled_pip_dependencies' directory of the saved bundle.
[2020-09-15 16:44:08,884] WARNING - Saved BentoService bundle version mismatch: loading BentoService bundle create with BentoML version 0.8.6, but loading from BentoML version 0.8.6+43.g53afaa73
[{"entity": "BMW", "label": "PRODUCT"}]

Deployment Options

If you are at a small team with limited engineering or DevOps resources, try out automated deployment with BentoML CLI, currently supporting AWS Lambda, AWS SageMaker, and Azure Functions:

If the cloud platform you are working with is not on the list above, try out these step-by-step guide on manually deploying BentoML packaged model to cloud platforms:

Lastly, if you have a DevOps or ML Engineering team who's operating a Kubernetes or OpenShift cluster, use the following guides as references for implementating your deployment strategy:

In [ ]: