Service

Flask application to serve Machine Learning models

service.features()[source]

Model features

Get the features accepted by the model. This includes feature importance if the model supports it.
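
As a rough illustration, a client could query this endpoint with the requests library; the /features route and base URL below are assumptions, since the reference does not state how the view is routed:

    import requests

    # Hypothetical route and base URL; adjust to the actual deployment.
    resp = requests.get("http://localhost:5000/features")
    resp.raise_for_status()

    # Accepted features, including importances when the model exposes them.
    print(resp.json())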

service.health_check()[source]
service.info()[source]

Model information

Get the model information: metadata, type, classifier, etc.
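
A minimal sketch of calling this view through the Flask test client, assuming the application object is importable as service.app and the view is routed at /info (neither is stated in this reference):

    from service import app  # assumed application object

    with app.test_client() as client:
        # Hypothetical /info route returning model metadata as JSON.
        resp = client.get("/info")
        print(resp.get_json())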

service.predict()[source]

Make predictions and explain them

Model inference using input data. This is the main function.

URL Params:
proba (int):

1 in order to compute probabilities for classification models, or 0 to return the predicted class (classification) or value (regression). Defaults to 0.

explain (int):

1 in order to compute model explanations for the predicted value. This will return a status 500 when the model does not support explanations. Defaults to 0.

Payload:

JSON string that can take two forms:

In the first form, the payload is a record or a list of records with one value per feature. This is interpreted directly as the input for the model.

In the second form, the payload is a dictionary with one or two elements. The key "_data" is mandatory, because it holds the input for the model, and its format is expected to be a record or a list of records. The optional key "_samples" is used to obtain different explanations (see explain()).
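
The following sketch shows both payload forms together with the proba and explain URL parameters; the /predict route, base URL, and feature names are assumptions, and the value supplied under "_samples" is only a placeholder (see explain() for its expected format):

    import requests

    base = "http://localhost:5000"  # assumed base URL
    records = [{"feature_a": 1.2, "feature_b": "red"}]  # hypothetical feature names

    # First form: the records are the model input. proba=1 requests class
    # probabilities instead of the predicted class or value.
    resp = requests.post(f"{base}/predict", params={"proba": 1}, json=records)
    print(resp.json())

    # Second form: a dictionary where "_data" (mandatory) holds the model input
    # and "_samples" (optional) is used to obtain different explanations.
    payload = {"_data": records, "_samples": records}  # placeholder "_samples" value
    resp = requests.post(f"{base}/predict", params={"explain": 1}, json=payload)
    print(resp.json())  # status 500 if the model does not support explanations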

service.preprocess()[source]

Preprocess input data

Get the preprocessed version of the input data. If the model does not include preprocessing steps, this method returns the input data unchanged.
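
A short sketch of requesting the preprocessed view of some records; the /preprocess route, base URL, and feature names are assumed:

    import requests

    records = [{"feature_a": 1.2, "feature_b": "red"}]  # hypothetical feature names

    # Returns the preprocessed records, or the same data if the model
    # has no preprocessing steps.
    resp = requests.post("http://localhost:5000/preprocess", json=records)
    print(resp.json())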

service.readiness_check()[source]
service.service_info()[source]

Service information

Get information about the service: up-time, version of the template, name of the served model, etc.
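
As with the other read-only endpoints, a hedged example; the /service-info route and base URL are assumptions:

    import requests

    # Hypothetical route; returns up-time, template version, served model name, etc.
    resp = requests.get("http://localhost:5000/service-info")
    print(resp.json())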