In this post, we are going to answer this question:
- How can I benefit from the experiment tracking and the other advantages of the Prevision.io platform while continuing to build my experiments outside the platform and/or with third-party solutions?
If you use another environment to train your models and you wish to benefit from the experiment tracking solutions offered by Prevision.io, the process is as follows:
1. You load and prepare the data in your environment, in a Kaggle notebook, or on Google Colab. In this post we use a binary classification example, sketched just below.
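A minimal sketch of this step, assuming a churn dataset stored as telecom_churn.csv with a churn target column (both names are illustrative):

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical churn dataset; replace the path and the target column with your own.
df = pd.read_csv("telecom_churn.csv")
X = df.drop(columns=["churn"])
y = df["churn"]

# Keep a holdout set so the imported model can be evaluated on the platform later.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
```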
2. You train one or more models in your environment (Prevision.io notebooks, a Kaggle notebook, or Google Colab) and export them in ONNX format.
The collapsed code snippets here covered four libraries: scikit-learn (RandomForestClassifier), LightGBM (LGBMClassifier), XGBoost (XGBClassifier), and CatBoost (CatBoostClassifier); each model was converted with convert_sklearn (update_registered_converter being registered first for the non scikit-learn learners) and then saved as an .onnx file.
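A minimal sketch of the scikit-learn case, reusing the X_train/y_train split from step 1 and assuming all features are numeric:

```python
from sklearn.ensemble import RandomForestClassifier
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Train a model as usual.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Convert the fitted model to ONNX; the initial type describes the input tensor.
model_onnx = convert_sklearn(
    model,
    initial_types=[("features", FloatTensorType([None, X_train.shape[1]]))],
)

# And save.
with open("random_forest_churn.onnx", "wb") as f:
    f.write(model_onnx.SerializeToString())
```

The non scikit-learn learners (LightGBM, XGBoost) follow the same pattern once their converters are registered with update_registered_converter, as shown in the skl2onnx documentation.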
The last snippet created the YAML file describing the features configuration and exported the train and holdout datasets, to be uploaded together with the model; a sketch is given below.
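A minimal sketch, with illustrative YAML keys only (the exact features-configuration schema expected by Prevision.io is described in its documentation):

```python
import yaml

# Illustrative configuration only: adapt the keys to the schema expected by Prevision.io.
config = {
    "target": "churn",
    "features": list(X.columns),
}
with open("model_config.yaml", "w") as f:
    yaml.safe_dump(config, f)

# Export the datasets so they can be uploaded alongside the ONNX model.
X_train.assign(churn=y_train).to_csv("train.csv", index=False)
X_test.assign(churn=y_test).to_csv("holdout.csv", index=False)
```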
3. You upload the datasets, the models, and the configuration files using the user interface.

Configuring your external model

Process to import your external model

Configuring your external experiment

For each external model, you need to set a name, a YAML file with the features configuration, and an ONNX file containing the model.

You can import as many models as you want.
To go further: external model import relies on the standardized ONNX format, and most of the standard ML libraries provide a module to export to it.
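For instance, CatBoost ships its own ONNX exporter, so a fitted classifier can be saved directly (a minimal sketch, reusing the split from step 1 and assuming numeric features):

```python
from catboost import CatBoostClassifier

model = CatBoostClassifier(iterations=200, verbose=False)
model.fit(X_train, y_train)

# CatBoost exports to ONNX natively, so no external converter is needed here.
model.save_model("catboost_churn.onnx", format="onnx")
```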


After a few minutes, you obtain a dashboard with all the models.
Now you can evaluate your experiment.

External model information

External model feature importance

External model confusion matrix

External model metrics
Good news: once imported, you can still benefit from the insightful analytics available for internally trained models.
4. You upload the datasets, the models, and the related configuration files using the SDK (Python or R).
import previsionio as pio
p = pio.Project(_id='616d9628d26230001d81b0c9', name='E2E_Telecom_churn')
for model in models:
    ...  # upload each exported ONNX model with its configuration file (see the SDK documentation)
5. Once your imported model is deployed, you can use it periodically (every hour, every day, every month, etc.), as sketched after this step.
To proceed to deployments, I refer you to the paragraph explaining how to deploy an experiment in article 2 of this series, or to the documentation here: https://previsionio.readthedocs.io/fr/latest/studio/deployments/index.html
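Purely as an illustration (the score_new_batch helper and the third-party schedule package are assumptions, not part of the platform), a periodic scoring job could be sketched like this:

```python
import time
import schedule  # third-party scheduling library: pip install schedule

def score_new_batch():
    # Hypothetical helper: collect the latest data and send it to the deployed model,
    # for example through the endpoint shown on the deployment page.
    ...

# Run the job once a day; switch to every().hour or a monthly cron job as needed.
schedule.every().day.at("02:00").do(score_new_batch)

while True:
    schedule.run_pending()
    time.sleep(60)
```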
Conclusion
In this guide, we went through the whole experiment tracking process using Prevision.io.
As we have seen, it is essential for a data scientist to document the different iterations across all the stages of a data science project: from data ingestion to feature engineering, model selection, and hyperparameter tuning, with access to in-depth visual analysis, until the model is deployed and in production.
For the iterative production and model-monitoring phases, Prevision.io facilitates this task by providing solutions for documenting and tracing the models in use, monitoring the stability of features, and continuously evaluating the performance of the models in production.