At a Glance
PredictiveWorks. accelerates the complete AI lifecycle and defines a new standard for unified AI platforms. Data science work alone does not make a business solution; it will neither augment decisions nor create any kind of ROI.
The focus is on the AI triad of data, models and solution, a fast track from experimentation to production and a seamless integration into every specific business environment.
PredictiveWorks. cuts down average project times by over 90% and significantly reduces the size of AI project teams. It is our mission to make AI like cooking: with pre-defined ingredients and fully reproducible, version-controlled and reusable solution recipes, without pushing teams to write a single line of code.
Declarative AI solutions instead of tedious data science work and software engineering.
PredictiveWorks. is a declarative AI application platform. At its heart is the structured AI solution language, SAISL, which describes the entire solution with all its tasks and workflows. An integrated code generator and a scalable runtime bring every declarative solution to action in minutes.
Every AI solution is accompanied by a declarative twin — a visual template that captures the knowledge of how to solve similar problems.
PredictiveWorks. registers every AI solution template in a version-controlled template hub for solution reproducibility and reusability. The hub shows up as a marketplace with an embedded search & recommendation engine.
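The SAISL specification itself is not reproduced here, so the following is only an illustrative sketch of what a declarative, version-controlled solution template could look like — every field name and artifact name below is an assumption for illustration, not the actual SAISL schema:

```python
# Hypothetical SAISL-like solution template, expressed as a Python dict.
# All keys and artifact names are illustrative assumptions.
solution_template = {
    "solution": "churn-prediction",          # business case
    "version": "1.2.0",                      # templates are version-controlled
    "tasks": [                               # associated tasks
        {
            "name": "prepare-customer-data",
            "workflow": [                    # data workflow stages
                {"stage": "connector", "artifact": "jdbc-source"},
                {"stage": "transformer", "artifact": "field-normalizer"},
            ],
        },
        {
            "name": "train-churn-model",
            "workflow": [
                {"stage": "operator", "artifact": "spark-ml-classifier"},
            ],
        },
    ],
}

def list_artifacts(template):
    """Collect the pre-compiled software artifacts a template depends on."""
    return [
        stage["artifact"]
        for task in template["tasks"]
        for stage in task["workflow"]
    ]

print(list_artifacts(solution_template))
# ['jdbc-source', 'field-normalizer', 'spark-ml-classifier']
```

Because such a template is plain, structured data, a hub can index, search, diff and version it like any other document — which is what makes reproducibility and recommendation over solution templates tractable.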
AI feasibility and problem-understanding phases turn into a shopping experience. And it has never been easier to keep pace with the real-life dynamics of ever-changing business & regulatory requirements and to bring responses to continuous feature drift and data changes back into production.
Integrate data at rest and in motion from and to everywhere.
Data diversity with hundreds of sources & destinations, formats and APIs is no longer an afterthought. PredictiveWorks. makes data fusion a first-class AI citizen, integrates into every business environment and supports all phases of data preparation to make AI models efficient and relevant.
Manage full spectrum machine intelligence instead of operating a plethora of isolated frameworks.
PredictiveWorks. comes with a clear commitment to Apache Spark for lightning-fast computation and unified integration: For deep and machine learning, SQL queries, business rules, natural language and time series processing.
Whether it is data fusion or machine intelligence, batch, real-time or hybrid demands, at the heart is the same user experience:
A code-free point-and-click AI workflow orchestration with standardized access to 200+ configurable pre-compiled data connectors, extractors, transformers, operators and more.
Access trained AI models in production without any delay.
PredictiveWorks. embeds a comprehensive AI model management into every data operator that is in touch with an AI model. Experimentation & production workflows become closely connected, interact with the same model and reuse the same pre-compiled component for the same pre- and post-processing task. In short: Predict as you train.
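The "predict as you train" idea — experimentation and production workflows reusing the same pre-compiled component for the same pre- and post-processing task — can be sketched in a few lines. The `Normalizer` class and its method names are hypothetical, not actual PredictiveWorks. components:

```python
class Normalizer:
    """One component, reused for the same task in both the
    training (experimentation) and prediction (production) workflow."""

    def __init__(self):
        self.mean = None

    def fit(self, values):
        # Learn preprocessing state during the experimentation workflow.
        self.mean = sum(values) / len(values)
        return self

    def transform(self, values):
        # Apply exactly the same preprocessing at prediction time.
        return [v - self.mean for v in values]

# Experimentation workflow: fit on training data.
prep = Normalizer().fit([1.0, 2.0, 3.0])

# Production workflow: reuse the *same* fitted component, so the
# prediction-time preprocessing cannot drift from the training setup.
print(prep.transform([2.0]))  # [0.0]
```

Sharing one component this way is what removes the classic training/serving skew between separately implemented pipelines.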
Reusability is the first-class driver for all levels of AI solutions, from solution templates down to reusable pre-compiled software artifacts to reusing the same component for the same task.
PredictiveWorks. is made to support a simple formula:
Minimal time-to-value with the smallest possible team for a broad range of use cases.
Under the Hood
PredictiveWorks. is a declarative AI application platform with a scalable data fusion foundation and charged with full-spectrum machine intelligence. A built-in AI model management features “predict as you train” and closely ties experimentation and production together.
AI solutions are described as reproducible & reusable SAISL (Structured AI Solution Language) documents and managed & shared by a central repository.
Reusability is the first-class driver for all levels of AI solutions, from solution templates formatted as SAISL documents down to reusable pre-compiled software artifacts to reusing the same component for the same task.
Data fusion as first-class AI citizen
Data is one of the components of the AI triad, and data fusion defines the basis for data understanding. No ingestion, collection and aggregation, contextualization and wrangling without a scalable data fusion foundation.
PredictiveWorks. provides the most flexible approach to integrate into every infrastructure, extract and transform data from a wide variety of data sources and store data in a wide range of data destinations. No matter whether the data are at rest or in motion.
It builds on a large set of configurable pre-compiled connectors, extractors and transformers in combination with a scalable data workflow technology, and prepares the ground for countless use cases. Whether there is a need to
- extract network traffic events from Zeek (former Bro) to acquire data for security operations,
- listen to aggregated sensor readings from IoT platforms like ThingsBoard or operational data from WITSML servers of the oil & gas industry,
- write data to cloud platforms like Salesforce, or
- build an enterprise-grade data lake with e.g., Aerospike or CrateDB,
- and more,
the answer is always the same: the right connector, extractor and transformer in combination with a code-free orchestration of the right data fusion workflow.
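The underlying pattern — a workflow that is pure configuration, resolved against a registry of pre-built components — can be sketched as follows. The registry entries and stage names are illustrative placeholders, not actual PredictiveWorks. artifacts:

```python
# Minimal sketch of code-free workflow orchestration: the workflow is
# declarative configuration; all logic lives in pre-built components
# that are looked up by name. Component names are assumptions.
REGISTRY = {
    "csv-extract": lambda rows: [r.split(",") for r in rows],
    "uppercase":   lambda rows: [[c.upper() for c in r] for r in rows],
}

def run_workflow(config, data):
    """Apply each configured stage in order, piping data through."""
    for stage in config["stages"]:
        data = REGISTRY[stage](data)
    return data

# The user only assembles configuration, never writes stage code.
workflow = {"stages": ["csv-extract", "uppercase"]}
print(run_workflow(workflow, ["a,b", "c,d"]))
# [['A', 'B'], ['C', 'D']]
```

A point-and-click designer then becomes a front end that emits exactly this kind of configuration, while the heavy lifting stays inside the pre-compiled components.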
PredictiveWorks. leverages Google’s scalable data fusion platform CDAP as a solid foundation for the entire AI lifecycle and adds 50+ purpose-built configurable and pre-compiled data integration artifacts.
Full spectrum machine intelligence
PredictiveWorks. supports machine intelligence the same way that data fusion use cases are supported: with 150+ configurable pre-compiled data operators ranging from deep and machine learning to business rules & SQL queries to natural language & time series processing.
Whether it is data fusion or machine intelligence, batch, real-time or hybrid demands, at the heart is a code-free data workflow orchestration based on 200+ configurable pre-compiled software artifacts (which complete Google’s 180+ artifacts).
In total, it’s the world’s largest collection of pre-built workflow components.
PredictiveWorks. is powered by Apache Spark and uses its DataFrames API to organize and provide the following frameworks as configurable pre-compiled data operators:
- Analytics Zoo from Intel for deep learning use cases,
- Apache Spark MLlib for machine learning,
- JBoss Drools for business rule processing,
- Apache Spark SQL for query processing,
- John Snow Labs Spark NLP for natural language processing, and
- SparkTIME by Dr. Krusche & Partner for time series analytics.
Whether real-time ingestion needs to be augmented by machine learning, business rule processing or SQL queries, e.g., for data contextualization, there is no need to switch to other frameworks or platforms.
Next-generation AI model management
There is no data journey from business problem to AI-driven solution at lightning speed without a fast track from experimentation to AI models in production.
PredictiveWorks. introduces the “predict as you train” feature and defines a new generation of AI model management:
Model-type specific recorders are an integral part of every pre-compiled data operator made for model training and prediction. They track and enable access to model instances, model parameters and evaluation metrics, and support automated versioning.
Each instance is stored in an internal model repository as a time series of training runs and is enriched with staging (e.g., experimentation or production) and version metadata. And a comprehensive model viewer supports visualization, comparison and validation of different model runs.
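A recorder of this kind — appending each training run as a new, automatically versioned entry with staging metadata — can be sketched as below. The class, field names and metric values are assumptions for illustration, not the actual PredictiveWorks. schema:

```python
import time

class ModelRecorder:
    """Sketch of a model-type specific recorder: every training run is
    appended to a time series with automated versioning and staging
    metadata. All field names are illustrative assumptions."""

    def __init__(self):
        self.runs = []  # time series of training runs

    def record(self, params, metrics, stage="experimentation"):
        self.runs.append({
            "version": len(self.runs) + 1,   # automated versioning
            "timestamp": time.time(),
            "params": params,                # model parameters
            "metrics": metrics,              # evaluation metrics
            "stage": stage,                  # e.g. experimentation | production
        })

    def latest(self, stage=None):
        """Return the most recent run, optionally filtered by stage."""
        runs = [r for r in self.runs if stage is None or r["stage"] == stage]
        return runs[-1] if runs else None

recorder = ModelRecorder()
recorder.record({"max_depth": 5}, {"auc": 0.81})
recorder.record({"max_depth": 7}, {"auc": 0.86}, stage="production")
print(recorder.latest("production")["metrics"]["auc"])  # 0.86
```

Storing runs as such a time series is what allows a model viewer to visualize, compare and validate different runs side by side.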
PredictiveWorks. model management is implemented on top of Google CDAP’s big data storage and distributed service layer and supports the closest connection between experimentation and production workflows.
The result is the first lightning-fast code-free track to AI operationalization.
AI solutions made reusable and shareable
Unlike all other unified AI platforms, PredictiveWorks. completely removes software engineering and supports a declarative approach for all phases of the AI lifecycle.
SAISL, short for Structured AI Solution Language, is at the heart of every data processing task and integrates four semantic layers from business case to associated tasks to data workflows down to pre-compiled software artifacts.
Every AI solution starts as a SAISL formatted solution template, is visually created or modified with a SAISL compliant solution designer and reaches its final stage as executable binaries with the help of a code generator.
A scalable runtime environment brings these binaries to action in minutes.
SAISL is made to represent the entire AI solution with all its tasks & workflows as a SAISL formatted solution template. This declarative twin makes solutions explainable, reproducible and reusable.
PredictiveWorks. ships with a SAISL template hub and provides comprehensive search and recommendation support. Solving business problems turns into a shopping experience and solution finding gets inspired and accelerated by looking into similar problems and solutions.
Whether it is about Cyber Defense, Internet-of-Things, Digital Marketing, E-Commerce, Retail or any other domain: It is all about reusable AI solution templates, formatted as SAISL documents and made accessible through a template hub.
Originally published at https://www.linkedin.com.