Online Decision Intelligence
for an ever-changing environment
Mind the gap. Make the most reliable, accurate & cost-efficient decisions with zero gap from reality. Orchestrate event streams, react to anomalies, prevent error accumulation, follow targets online.
Here & Now. Every minute matters
Mind the Gap
Every minute, the environment in which modern business operates changes. The data reflects that environment: demand, supply, force majeure, crises, markets, production, competitors, defects, breakdowns, downtime.

Every hour that data lags behind reality means the business makes forecasts and decisions based on irrelevant data. The bigger the gap between data and reality, the more errors accumulate, the less accurate the forecasts become, and the less efficient the business can be.

Zero Gap

Zero-gap data processing ensures that the business makes forecasts and decisions based on the most relevant data. Errors are detected and corrected immediately, continuously improving forecast accuracy and business efficiency.

The Engine Usage Scenarios

Modern business means 10 to 10,000 branches, factories and warehouses; 10 to 100,000 types of products and services; 100 to 10,000,000 clients; 10 to 100 information systems and data sources that generate 10 to 1,000,000 events every second; and 10 to 1,000 analysts struggling with this data daily.

Every minute, the business and its external environment transform, which leads to data changes and errors. Gaining access to data may take days or months. All this time, errors accumulate and go uncorrected. As a result, the business gets forecasts and makes decisions based on inconsistent and irrelevant data. Multiply that by the complexity and flexibility of today's business. The engine automates data collection, processing and distribution from hundreds of data sources, along with data validation, access control, anomaly detection and alerting, to ensure data relevance and consistency with zero gap from reality.

Today's business solves hundreds of analytical tasks such as quality control, predictive maintenance, online scoring, anomaly detection, fraud prevention, promo forecasting, supply optimization, demand forecasting, inventory management, personalization, news categorization, etc. Each project involves dozens of analysts, data scientists, data engineers and product owners.

In a constantly changing environment, business needs online analytics and the most accurate forecasts. At the same time, the predictive power of data falls off rapidly over time, and AI algorithms lose accuracy. The most accurate predictions require zero-gap data and continuous retraining on the latest data. The engine automates every stage of the AI algorithm life cycle: data connection and transformation, feature storage and versioning, algorithm decomposition for production execution, validation and testing, building and deployment, computational resource management, staging and production deployment, concurrent execution, metrics control, and optimization and retraining with zero gap from reality.
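The continuous-retraining idea can be sketched in plain Python. This is a toy illustration only, not the engine's API: a model that refits on a sliding window of the freshest observations, so stale data never influences the forecast.

```python
from collections import deque

class SlidingWindowModel:
    """Toy online model: re-'trains' (here, a moving average) on only
    the most recent observations, discarding stale data."""
    def __init__(self, window: int):
        self.window = deque(maxlen=window)  # only the freshest samples survive

    def observe(self, value: float) -> None:
        self.window.append(value)           # new event arrives; oldest drops out

    def forecast(self) -> float:
        # Retrain on every call: training here is just a moving average.
        return sum(self.window) / len(self.window)

model = SlidingWindowModel(window=3)
for v in [10.0, 10.0, 10.0, 40.0, 40.0, 40.0]:
    model.observe(v)
# The stale 10.0 observations have already left the window.
```

A real deployment would swap the moving average for an actual ML model, but the retraining discipline is the same: the window, not the archive, drives the forecast.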

Each data transformation scenario requires connecting to tens of data sources, reading and writing results in multiple storages (there is no single universal one), executing tens of sequential and parallel operations, integrating with tens of consumers, monitoring hundreds of metrics and detecting anomalies. Scenario execution may take anywhere from 0.1 second to 100 days.
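A scenario mixing sequential and parallel atomic operations can be sketched generically. The operation names here (validate, enrich, score) are illustrative, not part of any real API:

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative atomic operations: small, stateless functions over an event.
def validate(event):
    assert "value" in event, "malformed event"
    return event

def enrich(event):
    return {**event, "source": "sensor"}               # hypothetical enrichment

def score(event):
    return {**event, "anomaly": event["value"] > 100}  # hypothetical rule

def run_scenario(event):
    event = validate(event)                    # sequential stage
    with ThreadPoolExecutor() as pool:         # parallel fan-out stage
        enriched, scored = pool.map(lambda op: op(event), [enrich, score])
    return {**enriched, **scored}              # merge the parallel results

result = run_scenario({"value": 120})
```

Because every operation is stateless, the fan-out stage can be parallelized or distributed freely; an orchestrator only has to wire outputs to inputs.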

Today's business faces a choice between a vendor solution and in-house development using open source components. The first option brings the risks of vendor lock-in, slow and expensive implementation, and an insufficient pace of development. The second requires investing in software development and choosing the right architecture, with unpredictable risks. The engine provides businesses with an orchestrator and a framework that make it easy to build event-driven data transformation scenarios of any complexity from atomic operations, with zero gap from reality.

Solution Architecture

Typical scenario. An organization has 10 to 100 data sources such as online activity, payment transactions, geolocation, sensors, devices, and data changes in operational databases. The data sources generate 10 to 1,000,000 events every second. There are 10 to 1,000 consumers such as data analysts, customers, applications or business processes. The business solves hundreds of analytical tasks in parallel. The engine lets you develop and orchestrate, with zero lag from reality, data processing scenarios containing such operations as data connection, distribution, processing, validation, versioning, anomaly detection, monitoring, forecasting, AI algorithm execution, A/B testing, retraining, optimization, etc.
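One of the listed operations, anomaly detection, can be illustrated with a simple streaming z-score check. This is a generic sketch, not the engine's actual algorithm:

```python
import statistics

def detect_anomalies(stream, threshold=3.0, window=20):
    """Yield values that deviate strongly from the recent window of events."""
    recent = []
    for value in stream:
        if len(recent) >= window and statistics.pstdev(recent) > 0:
            z = abs(value - statistics.mean(recent)) / statistics.pstdev(recent)
            if z > threshold:
                yield value              # anomalous event: surface it immediately
        recent.append(value)
        recent = recent[-window:]        # keep only the freshest observations

normal = [9.0, 10.0, 11.0] * 7           # stable signal with small variance
anomalies = list(detect_anomalies(normal + [100.0]))
```

Running this flags only the 100.0 spike; the small fluctuations in the stable signal stay well under the threshold.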

Architecture. The engine is an implementation of the OLEP (Online Event Processing) approach, providing guarantees of data consistency, high performance, fault tolerance and scalability while executing complex data transformation scenarios with zero gap from reality.
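The core OLEP idea — producers append immutable events to a durable log, and each consumer processes them from its own committed offset — can be modelled with an in-memory toy. Kafka plays this role in the real stack; the EventLog class here is purely illustrative:

```python
class EventLog:
    """In-memory stand-in for a Kafka-style append-only event log."""
    def __init__(self):
        self.events = []
        self.offsets = {}                          # consumer name -> next offset

    def append(self, event):
        self.events.append(event)                  # producers only ever append

    def poll(self, consumer):
        off = self.offsets.get(consumer, 0)
        batch = self.events[off:]                  # everything not yet consumed
        self.offsets[consumer] = len(self.events)  # commit the new offset
        return batch

log = EventLog()
log.append({"order": 1})
log.append({"order": 2})
first_billing = log.poll("billing")    # billing sees events 1-2
log.append({"order": 3})
analytics = log.poll("analytics")      # independent offset: sees all of 1-3
second_billing = log.poll("billing")   # billing resumes exactly at event 3
```

Because consumers share one immutable log but track offsets independently, every consumer sees the same events in the same order — which is where OLEP's consistency guarantee comes from.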

Infrastructure. Kubernetes, Apache Kafka. Optionally: RDBMS, time-series, key-value, graph, document, or any other store. Business logic: Python & low-code/no-code DSL. ML libraries: any. Zero trust. On-premises & cloud native.

Components. The engine provides an orchestrator (MCC, Mission Control Center) and basic Python libraries (Atelier, Actor). The libraries are used to develop stateless atomic data processing operations, from which online data transformation scenarios are built. The orchestrator provides a UI and an API to build, manage and execute data processing scenarios with zero gap from reality. MCC automates building, deployment, execution, A/B testing, routing, monitoring, versioning, feature collection, metric computation, computational resource planning and management, role-based access management, logging, alerting, scenario visualization, etc.
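Since the Atelier/Actor APIs are not documented here, the following is only a generic sketch of the pattern they describe: stateless atomic operations registered by name and chained into a scenario that an orchestrator could execute.

```python
REGISTRY = {}

def operation(fn):
    """Register a stateless atomic operation under its function name."""
    REGISTRY[fn.__name__] = fn
    return fn

@operation
def parse(event):                      # hypothetical operation
    return {"value": float(event)}

@operation
def flag_high(event):                  # hypothetical operation
    return {**event, "high": event["value"] > 50}

def run(scenario, event):
    # A scenario is just an ordered list of operation names to apply.
    for name in scenario:
        event = REGISTRY[name](event)
    return event

result = run(["parse", "flag_high"], "72.5")
```

Representing a scenario as data (a list of operation names) rather than code is what lets an orchestrator version, visualize, A/B test and re-route it without redeploying the operations themselves.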

Cost Efficiency

Cost efficiency is made up of multiple variables: license cost, staff hiring and qualification, speed of implementation, of making changes and of testing hypotheses, project time to market, number of projects in production, level of automation, collaborative work, low-code & code reuse, knowledge accumulation, protection against the risk of wrong predictions, fault tolerance, performance, scalability, security, infrastructure cost, etc.
a new implementation takes 3 months
project time to market is 1 to 14 days
making changes takes from zero to 10 minutes
data is processed with zero gap from reality
the best forecast accuracy comes from the most up-to-date data
protection against the risk of wrong forecasts is provided online
high level of code reuse and collaborative work
low-code is supported via API
the required computational resources are minimal
the system works on top of existing IT infrastructure
the number of connected data sources is unlimited
the number of consumers is unlimited
the number of running projects is unlimited
the number of concurrently running strategies is unlimited
the number of computed metrics is unlimited
retraining and optimization of AI can be automated
feature storage is automated
resources are strictly separated between projects
role-based access control is provided
the zero trust approach is supported
no vendor lock-in
Request a demo! LTD
Innovation Centre, Maidstone Rd, Chatham ME5 9FD, UK
© All Rights Reserved.