Welcome to the only model operationalization approach you will ever need.



    Execute and Manage Models

    Execute any language and any model with your existing source code management tooling.


    Rapid Incremental Adoption

    FastScore slots into your existing CI/CD tooling, allowing rapid adoption by DevOps teams. 


    Easy Model On-boarding

    Move models into production with ease, from any model creation workbench.


    Deploy Model Workflows

    Easily create ensembles of models, even from different languages, and automatically deploy the workflow with a no-code approach.


    Compare Performance

    Quickly generate customized or standard methods for comparing performance between two, or many, machine learning models.


    Track Complete Model History

    Easily identify the history of each analytic asset at every stage of the model life cycle.


    Our Partners

    We integrate with a wide range of technologies to support our customers' end-to-end model life cycle.

    Model Operations that work in the Cloud, On Premise, or in Hybrid configurations.




    FastScore gives the enterprise a modern, microservices-based approach to machine learning and AI model operationalization. FastScore is architected as a suite of microservice modules based on Docker. Each is an optional, but powerful, service used to connect the critical pieces of the analytics workflow: data science models, data sources, and applications. The underlying philosophy of FastScore is integration: provide unique value where appropriate, and leverage existing technology where available. Read on to learn more about each module and the value it brings to operationalizing models.

    FastScore Engine

    Engine represents the fundamental unit of execution for your models. With support for all the major data science languages and packages, think of an Engine as a universal production container whose job is to ingest and operate any model. The FastScore Engine provides a single unified approach to operationalizing models in dev, test, and production scenarios. Data science teams can use Engine to validate their models for production prior to promotion downstream. IT teams operate models as fleets of Engines, each a modern microservice, and can rest assured the critical math payload will execute flawlessly. Business teams receive the critical model outputs from Engine in any application, in any mode of operation. Learn more about Engine here.
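    As a rough sketch of the pattern (illustrative only, not FastScore's actual API), an engine can be pictured as a loop that binds an arbitrary model to an input stream and an output stream:

    ```python
    # Hypothetical sketch of the "universal engine" idea: bind any model
    # (here, just a callable) to an input stream and an output stream.
    # All names below are invented for illustration.

    def run_engine(model, input_stream, output_stream):
        """Apply `model` to each record from the input and emit the result."""
        for record in input_stream:
            output_stream.append(model(record))

    # A trivial stand-in "model": double the input value.
    double = lambda x: 2 * x

    outputs = []
    run_engine(double, [1, 2, 3], outputs)
    print(outputs)  # [2, 4, 6]
    ```

    The point of the sketch is separation of concerns: the engine owns data transport and execution, while the model stays a pluggable payload in whatever language it was written.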



    FastScore Manage

    Manage is a supporting microservice for the FastScore Engine. Manage provides the engine with access to name-based execution assets, including models, data schemas, and data stream descriptors. Out of the box, Manage is a stand-alone tool and can manage assets for Engines without further integrations. However, the most common pattern among our customers is to connect FastScore Manage to an enterprise source code management tool like Git or Bitbucket. With this integration, FastScore leverages existing repositories and processes in the enterprise and allows for natural and rapid adoption. Learn more about Manage here.
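    Conceptually, name-based asset resolution is just a registry that maps stable names to versioned artifacts. The sketch below illustrates the idea only; the key names and interface are invented and are not Manage's real API:

    ```python
    # Illustrative only: a minimal name-based asset registry, mimicking
    # the idea of resolving models, schemas, and stream descriptors by
    # name. In practice such a store would be backed by Git or Bitbucket.

    assets = {
        "model:churn-v2": "def score(x): ...",
        "schema:input": '{"type": "record"}',
        "stream:kafka-in": '{"transport": "kafka"}',
    }

    def resolve(name):
        """Return the asset stored under `name`; KeyError if unknown."""
        return assets[name]

    print(resolve("schema:input"))  # '{"type": "record"}'
    ```

    Because engines refer to assets by name rather than by file path, the backing store can be swapped (stand-alone, Git, Bitbucket) without changing the engine configuration.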


    FastScore Deploy

    Deploy is a supporting microservice for the FastScore Engine. Deploy provides data science teams native integrations to many of their favorite workbenches and model creation tools, like Jupyter. With Deploy, it’s simple to create and test model assets for production before the model leaves the Data Scientist’s desk. Deploy also connects with the Manage microservice, allowing model assets to be pushed into common repositories for use by others and by downstream processes. Learn more about Deploy here.


    FastScore Composer

    Composer is a supporting microservice for the FastScore Engine. Composer provides a no-code GUI approach to building workflows of models, even when they are written in different languages. Composer creates automated deployment pipelines for each machine learning model in your workflow. Composer then automatically builds entire fleets of Engines, establishes configurations, and connects data transport layers, creating an operational workflow as specified by the user.
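    The underlying idea of a model workflow is composition: the output of one model step feeds the next, regardless of how each step is implemented. A minimal sketch of that idea (names invented for illustration; Composer itself does this through a no-code GUI):

    ```python
    # Illustrative sketch of composing model steps into one workflow.
    # Each step could be a model in a different language behind an
    # Engine; here they are plain Python callables for simplicity.

    def pipeline(*steps):
        """Chain steps so each one consumes the previous step's output."""
        def run(x):
            for step in steps:
                x = step(x)
            return x
        return run

    clean = lambda text: text.strip()   # stand-in preprocessing model
    score = lambda text: len(text)      # stand-in scoring model

    workflow = pipeline(clean, score)
    print(workflow("  hello  "))  # 5
    ```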



    FastScore Lineage

    Lineage is designed to provide a comprehensive historical view of an analytic model’s critical assets, and to update in real time as the model progresses through its lifecycle and systems. Lineage gives the enterprise user the ability to audit any model in minutes by providing REST APIs for accessing information about specific model events and assets, as well as tracking various pieces of metadata. By identifying individual metadata and elements, Lineage is able to group this information together to produce pathways that show the complete model life cycle. Users are also able to push lineage into common repositories such as ArangoDB or similar databases. Learn more about Lineage here.
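    The core of a lineage pathway is grouping recorded events by asset while preserving their order. The following sketch illustrates that grouping step only; the event fields are invented for illustration and are not Lineage's actual schema:

    ```python
    # Hypothetical sketch of the lineage idea: group model events by
    # asset, in order, to reconstruct each asset's lifecycle pathway.
    from collections import defaultdict

    events = [
        {"asset": "churn-model", "stage": "created"},
        {"asset": "churn-model", "stage": "tested"},
        {"asset": "fraud-model", "stage": "created"},
        {"asset": "churn-model", "stage": "deployed"},
    ]

    def pathways(event_log):
        """Group events by asset, preserving order, to form a lifecycle view."""
        by_asset = defaultdict(list)
        for event in event_log:
            by_asset[event["asset"]].append(event["stage"])
        return dict(by_asset)

    print(pathways(events)["churn-model"])  # ['created', 'tested', 'deployed']
    ```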


    FastScore Compare

    Compare is a supporting microservice for the FastScore Engine. Compare generates statistics between two streams of data: one a “truth” stream, the other a stream under test. We created Compare as a general statistical comparison tool, having learned from our customers that we could not pre-guess the right metrics for a specific use case. Compare allows users to quickly generate customized or standard methods for comparing performance between two, or many, machine learning models. Learn more about Compare here.
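    To make the "truth stream versus stream under test" idea concrete, here is a minimal sketch computing two example statistics over paired streams. The metric choices are assumptions for illustration; the point of Compare is precisely that users define their own:

    ```python
    # Illustrative sketch: simple agreement statistics between a "truth"
    # stream and a stream under test. Metrics are examples only.

    def compare_streams(truth, test):
        """Return exact-agreement rate and mean absolute error."""
        pairs = list(zip(truth, test))
        agreement = sum(t == s for t, s in pairs) / len(pairs)
        mae = sum(abs(t - s) for t, s in pairs) / len(pairs)
        return {"agreement": agreement, "mae": mae}

    stats = compare_streams([1.0, 2.0, 3.0], [1.0, 2.5, 3.0])
    print(stats)  # agreement ≈ 0.67, mae ≈ 0.17
    ```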
