A new standard for unified AI platforms

At a Glance

PredictiveWorks. accelerates the complete AI lifecycle and defines a new standard for unified AI platforms. Data science work alone does not make a business solution: it will not augment decisions or create any kind of ROI.

The focus is on the AI triad of data, models and solution, a fast track from experimentation to production and a seamless integration into every specific business environment.

PredictiveWorks. cuts down average project times by over 90% and significantly reduces the size of AI project teams. It is our mission to do AI like cooking: with pre-defined ingredients and fully reproducible, version-controlled and reusable solution recipes, without pushing teams to write a single line of code.

Declarative AI solutions instead of tedious data science work and software engineering.

PredictiveWorks. is a declarative AI application platform. At its heart is the structured AI solution language, SAISL, which describes the entire solution with all its tasks and workflows. An integrated code generator and scalable runtime brings every declarative solution to action in minutes.

Every AI solution is accompanied by a declarative twin: a visual template that captures the knowledge of how to solve similar problems.

PredictiveWorks. registers every AI solution template in a version-controlled template hub for solution reproducibility and reusability. The hub shows up as a marketplace with an embedded search & recommendation engine.

AI feasibility and problem understanding phases turn into a shopping experience. And it has never been easier to keep pace with the real-life dynamics of ever-changing business & regulatory requirements and to move continuous feature drift and data changes back into production.

Integrate data at rest and in motion from and to everywhere.

Data diversity with hundreds of sources & destinations, formats and APIs is no longer an afterthought. PredictiveWorks. makes data fusion a first-class AI citizen, integrates into every business environment and supports all phases of data preparation to make AI models efficient and relevant.

Manage full spectrum machine intelligence instead of operating a plethora of isolated frameworks.

PredictiveWorks. comes with a clear commitment to Apache Spark for lightning-fast computation and unified integration: for deep and machine learning, SQL queries, business rules, natural language and time series processing.

Whether it is data fusion or machine intelligence, batch, real-time or hybrid demands, at the heart is the same user experience:

A code-free point-and-click AI workflow orchestration with standardized access to 200+ configurable pre-compiled data connectors, extractors, transformers, operators and more.

Access trained AI models in production without any delay.

PredictiveWorks. embeds a comprehensive AI model management into every data operator that is in touch with an AI model. Experimentation & production workflows become closely connected, interact with the same model and reuse the same pre-compiled component for the same pre- and post-processing task. In short: predict as you train.
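The idea behind "predict as you train" can be illustrated with a minimal sketch: a single pre-processing component is fitted once in the experimentation workflow, and the identical fitted instance is reused at prediction time, so training and production never diverge. The `Standardizer` class below is an invented stand-in for a pre-compiled component, not an actual PredictiveWorks. API.

```python
from statistics import mean, pstdev

class Standardizer:
    """Toy pre-processing component shared by training and prediction.

    Fitted once during experimentation; the same fitted instance is then
    reused in production, so both workflows apply the same transform.
    """
    def fit(self, values):
        self.mu = mean(values)
        self.sigma = pstdev(values) or 1.0  # guard against constant input
        return self

    def transform(self, values):
        return [(v - self.mu) / self.sigma for v in values]

# Experimentation workflow: fit the component on training data.
scaler = Standardizer().fit([10.0, 12.0, 14.0, 16.0])

# Production workflow: reuse the very same fitted component.
scored = scaler.transform([13.0])
```

Because the production workflow never re-implements the transform, a scaled feature means exactly the same thing at prediction time as it did at training time.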

Reusability is the first-class driver for all levels of AI solutions, from solution templates down to reusable pre-compiled software artifacts to reusing the same component for the same task.

PredictiveWorks. is made to support a simple formula:

Minimal time-to-value with the smallest possible team for a broad range of use cases.

Under the Hood

PredictiveWorks. is a declarative AI application platform with a scalable data fusion foundation and charged with full-spectrum machine intelligence. A built-in AI model management features “predict as you train” and closely ties experimentation and production together.

AI solutions are described as reproducible & reusable SAISL (Structured AI Solution Language) documents and managed & shared by a central repository.

Reusability is the first-class driver for all levels of AI solutions, from solution templates formatted as SAISL documents down to reusable pre-compiled software artifacts to reusing the same component for the same task.

Data fusion as first-class AI citizen

Data is one of the components of the AI triad, and data fusion defines the basis for data understanding. There is no ingestion, collection, aggregation, contextualization or wrangling without a scalable data fusion foundation.

PredictiveWorks. provides the most flexible approach to integrate into every infrastructure, extract and transform data from a wide variety of data sources and store data in a wide range of data destinations. No matter whether the data are at rest or in motion.

It is based on a large set of configurable pre-compiled connectors, extractors and transformers in combination with a scalable data workflow technology and prepares the ground for countless use cases. Whether there is a need to

  • extract network traffic events from Zeek (formerly Bro) to acquire data for security operations,
  • listen to aggregated sensor readings from IoT platforms like ThingsBoard or operational data from WITSML servers of the Oil & Gas industry,
  • write data to cloud platforms like Salesforce, or
  • build an enterprise-grade data lake with, e.g., Aerospike or CrateDB,
  • and more,

the answer is always the same: the right connector, extractor and transformer in combination with a code-free orchestration of the right data fusion workflow.
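The pattern described above — a declarative workflow document naming pre-compiled stages, executed by a generic runner — can be sketched in a few lines. The plugin names, config keys and registry below are invented for illustration and are not actual PredictiveWorks. or CDAP identifiers.

```python
# Minimal sketch of a declarative data-fusion workflow: a spec document
# names pre-compiled stages, and a generic runner chains them in order.
# All plugin names and config keys are hypothetical.

REGISTRY = {
    # source: ignores its input and emits configured records
    "inline-source": lambda cfg: lambda _: cfg["records"],
    # transformer: keeps only the configured fields
    "field-filter": lambda cfg: lambda recs: [
        {k: r[k] for k in cfg["keep"]} for r in recs
    ],
    # sink: appends all records to a configured output list
    "memory-sink": lambda cfg: lambda recs: cfg["out"].extend(recs) or recs,
}

def run_workflow(spec):
    """Instantiate each configured stage and pipe records through them."""
    data = None
    for stage in spec["stages"]:
        data = REGISTRY[stage["plugin"]](stage["config"])(data)
    return data

sink = []
workflow = {
    "name": "sensor-ingest",
    "stages": [
        {"plugin": "inline-source", "config": {"records": [
            {"device": "t-01", "temp": 21.5, "raw": "..."},
            {"device": "t-02", "temp": 19.0, "raw": "..."},
        ]}},
        {"plugin": "field-filter", "config": {"keep": ["device", "temp"]}},
        {"plugin": "memory-sink", "config": {"out": sink}},
    ],
}
run_workflow(workflow)
```

Swapping a source or destination means editing the spec, not the runner — which is the point of a code-free, connector-based orchestration.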

PredictiveWorks. leverages Google’s scalable data fusion platform CDAP as a solid foundation for the entire AI lifecycle and adds 50+ purpose-built configurable and pre-compiled data integration artifacts.

Full spectrum machine intelligence

PredictiveWorks. supports machine intelligence the same way that data fusion use cases are supported: with 150+ configurable pre-compiled data operators ranging from deep and machine learning to business rules & SQL queries to natural language & time series processing.

Whether it is data fusion or machine intelligence, batch, real-time or hybrid demands, at the heart is a code-free data workflow orchestration based on 200+ configurable pre-compiled software artifacts (which complement Google’s 180+ artifacts).

In total, it’s the world’s largest collection of pre-built workflow components.

PredictiveWorks. is powered by Apache Spark and uses its DataFrames API to organize and provide the following frameworks as configurable pre-compiled data operators:

  • Analytics Zoo from Intel for deep learning use cases,
  • Apache Spark MLlib for machine learning,
  • JBoss Drools for business rule processing,
  • Apache Spark SQL for query processing,
  • John Snow LABS SparkNLP for natural language processing, and
  • SparkTIME by Dr. Krusche & Partner for time series analytics.

Whether real-time ingestion needs to be augmented by machine learning, business rule processing or SQL queries, e.g., for data contextualization, there is no need to switch to other frameworks or platforms.
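What makes this possible is that every framework, whatever its origin, is exposed behind the same operator shape: created from a configuration, applied to a batch of records. The sketch below illustrates that uniform interface with two toy operators standing in for a business-rule engine and a SQL-style projection; the class names and configs are invented, not PredictiveWorks. APIs.

```python
from abc import ABC, abstractmethod

class Operator(ABC):
    """Uniform shape of a pre-compiled data operator: built from a config,
    applied to a batch of records. Concrete operators are toy stand-ins."""
    def __init__(self, config):
        self.config = config

    @abstractmethod
    def compute(self, records):
        ...

class RuleOperator(Operator):
    """Stand-in for a business-rule engine: tags records matching a rule."""
    def compute(self, records):
        field, threshold = self.config["field"], self.config["gt"]
        return [{**r, "alert": r[field] > threshold} for r in records]

class ProjectOperator(Operator):
    """Stand-in for a SQL-style column projection."""
    def compute(self, records):
        return [{k: r[k] for k in self.config["columns"]} for r in records]

# Mixing "frameworks" in one pipeline requires no context switch:
readings = [{"device": "t-01", "temp": 88.0}, {"device": "t-02", "temp": 41.0}]
pipeline = [RuleOperator({"field": "temp", "gt": 80.0}),
            ProjectOperator({"columns": ["device", "alert"]})]
for op in pipeline:
    readings = op.compute(readings)
```

Because every operator speaks the same record-in, record-out contract, rule processing and query processing compose in a single workflow.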

Next-generation AI model management

There is no data journey from business problem to AI-driven solution at lightning speed without a fast track from experimentation to AI models in production.

PredictiveWorks. introduces the “predict as you train” feature and defines a new generation of AI model management:

Model-type-specific recorders are an integral part of every pre-compiled data operator made for model training and prediction. They track and enable access to model instances, model parameters and evaluation metrics, and support automated versioning.

Each instance is stored in an internal model repository as a time series of training runs and is enriched with staging (e.g., experimentation or production) and version metadata. And a comprehensive model viewer supports visualization, comparison and validation of different model runs.
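A model repository of this kind — each model as a time series of automatically versioned runs, each run carrying staging and metric metadata — can be sketched with a few lines of standard-library Python. The class and field names are illustrative, not the actual PredictiveWorks. store.

```python
from dataclasses import dataclass, field
import time

@dataclass
class ModelRun:
    """One training run: auto-assigned version, stage label, metrics and
    a timestamp. Field names are invented for illustration."""
    name: str
    version: int
    stage: str          # e.g. "experimentation" or "production"
    metrics: dict
    created: float = field(default_factory=time.time)

class ModelRepository:
    """Stores each model as a time series of runs with automated versioning."""
    def __init__(self):
        self.runs = {}

    def record(self, name, stage, metrics):
        history = self.runs.setdefault(name, [])
        run = ModelRun(name, version=len(history) + 1, stage=stage, metrics=metrics)
        history.append(run)
        return run

    def latest(self, name, stage=None):
        """Most recent run, optionally restricted to one staging label."""
        history = self.runs.get(name, [])
        candidates = [r for r in history if stage is None or r.stage == stage]
        return candidates[-1] if candidates else None

repo = ModelRepository()
repo.record("churn", "experimentation", {"auc": 0.81})
repo.record("churn", "experimentation", {"auc": 0.86})
repo.record("churn", "production", {"auc": 0.86})
```

Filtering the same time series by stage is what lets experimentation and production workflows interact with the same model history.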

PredictiveWorks. model management is implemented on top of Google CDAP’s big data storage and distributed service layer and supports the closest connection between experimentation and production workflows.

The result is the first lightning-fast code-free track to AI operationalization.

AI solutions made reusable and shareable

Unlike all other unified AI platforms, PredictiveWorks. completely removes software engineering and supports a declarative approach for all phases of the AI lifecycle.

SAISL, short for Structured AI Solution Language, is at the heart of every data processing task and integrates four semantic layers from business case to associated tasks to data workflows down to pre-compiled software artifacts.
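Since the text names the four layers but not SAISL’s concrete syntax, the following sketch only illustrates the layered idea: a template holding all four layers, and a check that every workflow stage resolves to a pre-compiled artifact. Every field name here is invented; SAISL’s real format may look entirely different.

```python
# Hypothetical shape of a SAISL solution template: the four semantic
# layers from business case down to software artifacts. All field names
# are invented for illustration.

template = {
    "business_case": {"domain": "E-Commerce", "goal": "reduce churn"},
    "tasks": [{"id": "train-churn-model", "kind": "model-training"}],
    "workflows": [{"task": "train-churn-model",
                   "stages": ["db-connector", "feature-transformer", "ml-trainer"]}],
    "artifacts": {"db-connector": "connector-jdbc-1.4",
                  "feature-transformer": "transform-core-2.0",
                  "ml-trainer": "spark-mllib-op-3.1"},
}

LAYERS = ("business_case", "tasks", "workflows", "artifacts")

def validate(doc):
    """Check all four layers are present and every workflow stage
    resolves to a pre-compiled artifact."""
    missing = [layer for layer in LAYERS if layer not in doc]
    unresolved = [s for wf in doc.get("workflows", [])
                  for s in wf["stages"] if s not in doc.get("artifacts", {})]
    return missing, unresolved

missing, unresolved = validate(template)
```

The bottom layer is what a code generator would resolve into executable binaries, while the top layer keeps the solution tied to its business case.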

Every AI solution starts as a SAISL-formatted solution template, is visually created or modified with a SAISL-compliant solution designer and reaches its final stage as executable binaries with the help of a code generator.

A scalable runtime environment brings these binaries to action in minutes.

SAISL is made to represent the entire AI solution with all its tasks & workflows as a SAISL-formatted solution template. This declarative twin makes solutions explainable, reproducible and reusable.

PredictiveWorks. ships with a SAISL template hub and provides comprehensive search and recommendation support. Solving business problems turns into a shopping experience, and solution finding gets inspired and accelerated by looking into similar problems and solutions.
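A toy version of such hub search and recommendation over tagged templates fits in a few lines: keyword search over tags, plus recommendations ranked by tag overlap (Jaccard similarity). The template names and tags are made up for the example; the hub’s real engine is surely richer.

```python
# Toy template hub: templates indexed by tag sets (all names invented).
HUB = {
    "iot-anomaly":   {"IoT", "time-series", "anomaly-detection"},
    "churn-predict": {"E-Commerce", "classification", "churn"},
    "fraud-rules":   {"E-Commerce", "business-rules", "fraud"},
}

def search(keyword):
    """Case-insensitive keyword search over template tags."""
    return sorted(name for name, tags in HUB.items()
                  if any(keyword.lower() in t.lower() for t in tags))

def recommend(name):
    """Rank other templates by tag overlap (Jaccard similarity)."""
    base = HUB[name]
    scored = [(len(base & tags) / len(base | tags), other)
              for other, tags in HUB.items() if other != name]
    return [other for score, other in sorted(scored, reverse=True) if score > 0]

hits = search("e-commerce")
similar = recommend("churn-predict")
```

Starting from a similar solved problem rather than a blank page is what turns the feasibility phase into the shopping experience described above.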

Whether it is about Cyber Defense, Internet-of-Things, Digital Marketing, E-Commerce, Retail or any other domain: It is all about reusable AI solution templates, formatted as SAISL documents and made accessible through a template hub.

Originally published at https://www.linkedin.com.


