AutoIncrement
AutoIncrements provide automatic value generation for attributes when entities are inserted into the repository. They ensure unique, sequential values for attributes like customer numbers, document IDs, or any other identifier that requires automatic numbering.
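To illustrate the effect, the following sketch assumes a hypothetical Construction Kit entity with a `customerNumber` attribute that has AutoIncrement enabled; all entity and field names are invented for this example. The client omits the attribute on insert, and the repository assigns the next sequential value:

```graphql
# Hypothetical sketch: customerNumber has AutoIncrement enabled, so the
# client does not supply it; the repository assigns the next value.
mutation {
  createCustomer(input: { name: "ACME Corp" }) {
    name
    customerNumber  # e.g. 1001, then 1002 on the next insert, ...
  }
}
```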
Best practices and recommendations
This section provides best practices and recommendations for creating Construction Kits.
Create
GraphQL allows you to query and mutate data. Mutations are operations such as create, update, and delete. This chapter describes how data can be created.
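As a rough sketch of what a create mutation can look like, the example below uses a hypothetical `createCustomer` mutation; the actual mutation and field names are derived from the entities defined in your Construction Kit.

```graphql
# Hypothetical create mutation; names depend on your Construction Kit.
mutation {
  createCustomer(input: { name: "ACME Corp", city: "Berlin" }) {
    id    # identifier assigned by the repository
    name
    city
  }
}
```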
Enums
Enums establish a set of predefined constants that can represent various states, types, or configurations within a library. Enum values are embedded within a Runtime Entity Object and do not require navigation through associations.
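For illustration only, assume an entity with a `status` attribute typed as a hypothetical `OrderStatus` enum (values such as `OPEN` and `CLOSED`). Because the enum is embedded in the entity, it can be read like any scalar attribute, without resolving an association:

```graphql
# Hypothetical sketch: the enum value is selected directly on the entity.
query {
  orders {
    id
    status  # e.g. OPEN or CLOSED
  }
}
```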
Fixup Scripts
Fixup Scripts are MongoDB-compatible scripts that can be applied to databases for maintenance, migration, and data correction tasks. They are executed by the bot service in a defined sequence, ensuring consistent and predictable database modifications.
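To give a feel for such a script, here is a minimal sketch in standard MongoDB shell syntax; the collection name `machines` and the field rename are invented for illustration, and real scripts would target the collections of your tenant database.

```javascript
// Hypothetical fixup script: migrate an old field name to a new one.
// Collection and field names are placeholders.
db.machines.updateMany(
  { serialNo: { $exists: true } },          // only documents still using the old field
  { $rename: { serialNo: "serialNumber" } } // rename it in place
);
```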
Installation
OctoMesh uses Communication Operators to manage distributed computing resources using Kubernetes. The Communication Operators are responsible for managing the lifecycle of the Adapters, including creating, updating, and deleting Adapters.
Introduction
In the realm of OctoMesh, adapters and pipelines play a crucial role as the connective tissue between the OctoMesh platform and external data sources and services. These small but powerful pieces of software are designed to facilitate communication and data exchange across a diverse set of endpoints, including APIs, file systems, databases, message brokers, and other custom or standard protocols. To cater to different architectural needs and deployment scenarios, OctoMesh distinguishes between two main types of adapters: Edge Adapters and Mesh Adapters.
Introduction
At the heart of OctoMesh lies the concept of Construction Kits. These kits serve as a fundamental building block for defining object models and providing the essential context that transforms data into actionable insights. With OctoMesh, you can construct models that align with your specific needs, allowing you to shape data in ways that make sense for your organization.
Introduction to Technology Guide
Welcome to the OctoMesh Technology Guide, your comprehensive resource for leveraging the transformative power of OctoMesh to architect and manage robust data mesh solutions. This guide is crafted to serve as your navigator through the expansive features of OctoMesh, shedding light on the underlying concepts, providing detailed how-to instructions, and offering practical recipes that help you harness the full potential of your data.
Maintenance Dashboard
The Maintenance Dashboard provides insights into costs and maintenance activities. It gives a comprehensive overview of the maintenance status of the assets and the costs associated with maintenance activities. The dashboard is designed to help maintenance managers and technicians monitor maintenance activities, track costs, and identify potential issues that require immediate attention.
Overview
OctoMesh is a software product designed to operate within Kubernetes environments. It is a tenant-based system, where each tenant's data is isolated and managed within its own MongoDB database. This design supports scalability and security, making it suitable for businesses that require multi-tenant capabilities.
Overview
Adapters execute pipelines, and a pipeline consists of nodes. Some nodes are common to all adapters, while others are specific to a particular adapter. For example, the Modbus adapter comes with Modbus nodes, the OPC UA adapter comes with OPC UA nodes, and so on.
Overview
The integration of OctoMesh with SAP provides a seamless and efficient way to exchange data between the two systems, leveraging the SAP NetWeaver SDK.
Overview
In OctoMesh, data pipelines are integral to the Extract, Transform, Load (ETL) processes that ensure efficient data handling across distributed environments. These pipelines are categorized into Edge Pipelines and Mesh Pipelines, each executed by specific components within the system: Edge Adapters and Mesh Adapters respectively.
Overview
Pipeline triggers start the execution of a pipeline on a cron schedule, using the Bot Service.
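Cron schedules use the standard five-field syntax (minute, hour, day of month, month, day of week). The expression below illustrates the format only and does not reflect any specific OctoMesh configuration; it would fire every 15 minutes:

```text
# minute  hour  day-of-month  month  day-of-week
  */15    *     *             *      *
```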
Overview
OctoMesh uses Communication Operators to manage distributed computing resources using Kubernetes. The Communication Operators are responsible for managing the lifecycle of the Adapters, including creating, updating, and deleting Adapters.
Overview
In OctoMesh, we understand that data is at the heart of your operations. This chapter focuses on how you can access and interact with your data through our Construction Kits (CK), tailored for both runtime data and stream (time series) data. Leveraging GraphQL endpoints, OctoMesh offers a seamless and efficient way to work with your data, regardless of its nature.
Pipeline design
Edge and Mesh Pipelines enable the data flow between the edge and the mesh (cloud) environment. Edge Pipelines preprocess the data before sending it to Mesh Pipelines, which then process the data in the cloud environment.
Prerequisites
In production, OctoMesh is operated on Kubernetes. These docs describe how to run OctoMesh in a local Docker environment.
Repository Backup & Restore
OctoMesh provides comprehensive backup and restore capabilities for repositories through the octo-cli tool. These operations are executed by the bot service and ensure data integrity and availability for disaster recovery scenarios.
Retrieve
GraphQL allows you to query and mutate data. Mutations are operations such as create, update, and delete. This chapter describes how data can be retrieved.
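As a rough sketch, a query against a Construction Kit entity might look like the example below; the field `customers` and its attributes are placeholders, since the real names are derived from your Construction Kit.

```graphql
# Hypothetical query; entity and field names depend on your Construction Kit.
query {
  customers {
    id
    name
    city
  }
}
```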
Run container locally
Clone the repository to your local machine. The command below is a sketch with a placeholder URL; substitute the actual repository location:
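```bash
# Placeholder URL; replace with the actual location of the OctoMesh repository.
git clone <repository-url>
cd <repository-directory>
```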
Start creating libraries
Stream data access
The area `streamData` provides access to stored time series data. Let's start with a simple sample that requests the voltage values of all energy meters by timestamp.
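A sketch of such a query is shown below; the field names `energyMeters`, `voltage`, and `timestamp` are assumptions made for illustration, and the actual names follow the stream definitions in your Construction Kit.

```graphql
# Hypothetical sketch against the streamData area; field names are placeholders.
query {
  streamData {
    energyMeters {
      timestamp
      voltage
    }
  }
}
```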