    FastScore

    Welcome to the only model operationalization approach you will ever need.


    Execute and Manage Models

    Execute any model, in any language, with your existing source code management tooling.

     

    Rapid Incremental Adoption

    FastScore slots into your existing CI/CD tooling, allowing rapid adoption by DevOps teams. 

     

    Easy Model Onboarding

    Move models into production easily, from any model creation workbench.

     

    Deploy Model Workflows

    Easily create ensembles of models, even from different languages, and automatically deploy the workflow with a no-code approach.

     

    Compare Performance

    Quickly generate customized or standard methods for comparing performance between two or more machine learning models.

    Model operations that work in the cloud, on premises, or in hybrid configurations.

     


    FastScore gives the enterprise a modern, microservices-based approach to machine learning and AI model operationalization. FastScore is architected as a suite of microservice modules built on Docker. Each is an optional, but powerful, service used to connect the critical pieces of the analytics workflow: data science models, data sources, and applications. The underlying philosophy of FastScore is integration: provide unique value where appropriate, and leverage existing technology where available. Read on to learn more about each module and the value it brings to operationalizing models.

    FastScore Engine

    Engine is the fundamental unit of execution for your models. With support for all the major data science languages and packages, think of Engine as a universal production container whose job is to ingest and operate any model. The FastScore Engine provides a single, unified approach to operationalizing models in dev, test, and production scenarios. Data science teams can use Engine to validate their models for production prior to promotion downstream. IT teams operate models as fleets of Engines, each a modern microservice, and can rest assured the critical math payload will execute flawlessly. Business teams receive the critical model outputs from Engine in any application, in any mode of operation. Learn more about Engine here.
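
    For illustration, a Python model that an Engine runs can be as small as the sketch below. The schema annotations and the begin/action function convention shown here are an assumption drawn from common scoring-engine patterns, not a verbatim FastScore specification.

        # score_model.py -- a minimal scoring-model sketch (hypothetical example).
        # The annotations below are assumptions, not FastScore-verified syntax.
        # fastscore.schema.0: input_schema    (name of the input data schema asset)
        # fastscore.schema.1: output_schema   (name of the output data schema asset)

        import pickle


        def begin():
            """Runs once when the engine loads the model: restore trained state."""
            global classifier
            with open("classifier.pkl", "rb") as f:  # attachment shipped alongside the model
                classifier = pickle.load(f)


        def action(datum):
            """Called for every input record; yields one or more output records."""
            score = classifier.predict([datum["features"]])[0]
            yield {"id": datum["id"], "score": float(score)}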

     


    FastScore Manage

    Manage is a supporting microservice for the FastScore Engine. Manage provides the engine with access to name-based execution assets, including models, data schemas, and data stream descriptors. Out of the box, Manage is a stand-alone tool and can manage assets for Engines without further integration. However, the most common pattern among our customers is to connect FastScore Manage to an enterprise source code management tool like Git or Bitbucket. With this integration, FastScore leverages the repositories and processes that already exist in the enterprise, allowing for natural and rapid adoption. Learn more about Manage here.
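
    The name-based assets Manage serves to an Engine are small, declarative files that version cleanly in Git. The sketch below writes a hypothetical Avro-style data schema and a stream descriptor into a repository layout; the directory names and descriptor fields are assumptions for illustration, not the authoritative FastScore formats.

        import json
        from pathlib import Path

        # Hypothetical repository layout for name-based assets (illustrative only).
        repo = Path("model-assets")
        (repo / "schemas").mkdir(parents=True, exist_ok=True)
        (repo / "streams").mkdir(parents=True, exist_ok=True)

        # An Avro-style data schema, referenced by name from the model.
        input_schema = {
            "type": "record",
            "name": "input_schema",
            "fields": [
                {"name": "id", "type": "string"},
                {"name": "features", "type": {"type": "array", "items": "double"}},
            ],
        }

        # A stream descriptor telling the Engine where its input data lives.
        # Field names here are assumptions, not the exact FastScore descriptor format.
        input_stream = {
            "Transport": {"Type": "kafka", "BootstrapServers": ["kafka:9092"], "Topic": "scores-in"},
            "Encoding": "json",
            "Schema": "input_schema",
        }

        (repo / "schemas" / "input_schema.avsc").write_text(json.dumps(input_schema, indent=2))
        (repo / "streams" / "scores-in.json").write_text(json.dumps(input_stream, indent=2))
        # Committing these files to Git or Bitbucket makes them available to Manage by name.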


    FastScore Deploy

    Deploy is a supporting microservice for the FastScore Engine. Deploy provides data science teams with native integrations to many of their favorite workbenches and model creation tools, like Jupyter. With Deploy it is simple to create and test model assets for production before the model ever leaves the data scientist's desk. Deploy also connects with the Manage microservice, allowing model assets to be pushed into common repositories for use by others and by downstream processes. Learn more about Deploy here.
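
    As a rough sketch of that push step from a notebook cell, the snippet below registers a model asset over HTTP. The endpoint URL, path, and content type are placeholders chosen for illustration, not the documented FastScore REST API, so treat them as assumptions.

        import requests  # standard HTTP client; the endpoint details below are assumptions

        MANAGE_URL = "https://fastscore.example.com:8002"  # placeholder Manage/Deploy endpoint

        with open("score_model.py") as f:
            model_source = f.read()

        # Push the model asset by name so Manage (and downstream Engines) can reference it.
        response = requests.put(
            f"{MANAGE_URL}/api/1/model/churn-model",           # illustrative path
            data=model_source,
            headers={"Content-Type": "application/x-python"},  # illustrative content type
            timeout=30,
        )
        response.raise_for_status()
        print("model registered:", response.status_code)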


    FastScore Composer

    Composer is a supporting microservice for the FastScore Engine. Composer provides a no-code GUI approach to building workflows of models, even if they are written in different languages. Composer creates automated deployment pipelines for each machine learning model in your workflow. It then automatically builds entire fleets of Engines, establishes configurations, and connects data transport layers, creating an operational workflow as specified by the user.
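
    Conceptually, the workflow Composer builds amounts to a declaration like the one below: a chain of models, possibly in different languages, wired together by streams, with one Engine deployed per step. The structure and field names are illustrative assumptions; Composer users express this through the GUI rather than in code.

        # Illustrative only: what a composed workflow amounts to under the hood.
        workflow = {
            "name": "fraud-pipeline",
            "steps": [
                {"model": "feature-prep", "language": "python", "input": "kafka:transactions"},
                {"model": "fraud-scorer", "language": "r", "input": "step:feature-prep"},
                {"model": "alert-filter", "language": "python", "input": "step:fraud-scorer",
                 "output": "kafka:alerts"},
            ],
        }

        # One Engine per step: Composer builds the fleet, applies configuration,
        # and connects the data transport between consecutive steps.
        for step in workflow["steps"]:
            print(f"deploy engine for {step['model']} ({step['language']}) reading {step['input']}")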


     

    FastScore Compare

    Compare is a supporting microservice for the FastScore Engine. Compare generates statistics between two streams of data: a “truth stream” and a second stream under test. We created Compare as a general statistical comparison tool, having learned from our customers that we could not pre-guess the right metrics for a specific use case. Compare allows users to quickly generate customized or standard methods for comparing performance between two or more machine learning models. Learn more about Compare here.
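
    To make the idea concrete, the snippet below computes two standard error metrics for a pair of candidate model streams against a shared truth stream. It illustrates the kind of statistics Compare reports; the metric choices and sample data are assumptions for the example, not Compare's actual API.

        import math

        # Illustrative comparison of two candidate model output streams against a truth stream.
        truth       = [3.0, 5.0, 2.5, 7.0, 4.5]
        candidate_a = [2.8, 5.1, 2.4, 6.5, 4.9]
        candidate_b = [3.5, 4.2, 3.0, 8.1, 3.9]


        def rmse(truth_stream, test_stream):
            """Root mean squared error between a truth stream and a stream under test."""
            return math.sqrt(sum((t - p) ** 2 for t, p in zip(truth_stream, test_stream)) / len(truth_stream))


        def mae(truth_stream, test_stream):
            """Mean absolute error between a truth stream and a stream under test."""
            return sum(abs(t - p) for t, p in zip(truth_stream, test_stream)) / len(truth_stream)


        for name, stream in [("candidate_a", candidate_a), ("candidate_b", candidate_b)]:
            print(f"{name}: RMSE={rmse(truth, stream):.3f}  MAE={mae(truth, stream):.3f}")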
