The hourglass model is a specific model for the transformation of data sources to a standardized model in a target datastore. It is a simplified implementation of a layered Big Data architecture. The hourglass model can be used to model specific implementations of data transformation in a pattern called the data pipe.
In a number of other diagrams, a detailed view is given of these implementations in projects like Digital Transformation, TDP, MaxLimit and others.
Version | 1.0 | Creation date | 02-05-2021 |
Different application integration types exist for application-to-application integration. Well-known integration types are:
- Web services
- File transfer, FTP, etc.
See the TenneT architecture description for details.
Requirements for consumers when connecting to a service or service implementation (interface).
Generic description of the interface for the extraction of standardized data for the various consumers.
See the package with the description of these DaMa qualities.
Integration from data storage to data storage, for example relational database integrations like:
- views
- materialized views
- database logic in packages and stored procedures
- ETL implementations
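As a sketch of the ETL style of storage-to-storage integration, the example below copies rows from a staging table into a target table with a unit conversion; all table and column names are illustrative assumptions, not taken from the TDP implementation:

```python
import sqlite3

# Hypothetical staging and target tables in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_measurements (meter_id TEXT, value_kw REAL)")
conn.execute("CREATE TABLE tgt_measurements (meter_id TEXT, value_mw REAL)")
conn.execute("INSERT INTO src_measurements VALUES ('M1', 500.0), ('M2', 1500.0)")

# Extract: read the source rows.
rows = conn.execute("SELECT meter_id, value_kw FROM src_measurements").fetchall()
# Transform: convert kW to the assumed standardized unit (MW).
transformed = [(meter_id, kw / 1000.0) for meter_id, kw in rows]
# Load: write the standardized rows to the target table.
conn.executemany("INSERT INTO tgt_measurements VALUES (?, ?)", transformed)
conn.commit()
```

In practice this logic often lives in stored procedures or a dedicated ETL tool rather than application code, but the extract-transform-load split is the same.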
External consumers of the (standardized) data and master data produced by the TDP solutions. For external consumers, extra requirements apply, for example regarding security, privacy and governance.
Data saved in a file, for example:
- semi-structured: XML, XLS, JSON, EDIFACT, etc.
- unstructured: Word, text, etc.
Transformation of a data file (mostly semi-structured), for example an XML file with an internal data model that has to be transformed to the standardized model.
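A file-based transformation of this kind could be sketched as follows; the element names of the internal model and the (CIM-like) target fields are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

# Hypothetical source file with an internal data model.
SOURCE_XML = """
<assets>
  <asset><naam>TR-01</naam><spanning>380</spanning></asset>
  <asset><naam>TR-02</naam><spanning>110</spanning></asset>
</assets>
"""

def to_standardized(xml_text):
    """Map the internal XML model to an assumed standardized structure."""
    root = ET.fromstring(xml_text)
    return [
        {"name": a.findtext("naam"), "voltage_kv": float(a.findtext("spanning"))}
        for a in root.findall("asset")
    ]

records = to_standardized(SOURCE_XML)
```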
This is a logical service for the publication of a certain standardized dataset; in the current TDP plateau 1 implementation, for example, the list of differences.
This logical data service is implemented in one or more technical interfaces, such as user interfaces or XML web services.
Data governance processes, mainly focused on realizing a data target with acceptable data qualities.
Implementation and operations processes for developing and maintaining the implemented data pipes.
Consumers in the TenneT organisation of the standardized TDP data products
An XML message, received as a stream or file, is imported in a message transformation handler with the transformation function. The message should be structured and described, for example with an XSD definition.
Transformation of an XML stream (mostly semi-structured) with an internal or specific data model that has to be transformed to the standardized model in the data target.
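Such a stream transformation could, for example, parse the XML incrementally so a large stream need not be loaded at once; the message layout and target field names below are assumptions:

```python
import io
import xml.etree.ElementTree as ET

# Hypothetical XML message stream (here backed by an in-memory buffer).
XML_STREAM = io.StringIO(
    "<readings>"
    "<reading meter='SM-1' kwh='10.5'/>"
    "<reading meter='SM-2' kwh='7.25'/>"
    "</readings>"
)

standardized = []
# iterparse processes elements as they are read from the stream.
for event, elem in ET.iterparse(XML_STREAM, events=("end",)):
    if elem.tag == "reading":
        # Map the attributes of the internal model to the standardized model.
        standardized.append({"meter_id": elem.get("meter"),
                             "energy_kwh": float(elem.get("kwh"))})
        elem.clear()  # release the processed element
```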
NoSQL is a newer data platform for the implementation of semi-structured data. There are various NoSQL database types, such as column, key-value, document and graph databases.
Transformation of a NoSQL data source to a standardized data target.
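A minimal sketch of such a mapping for a document-style NoSQL source, with documents represented as plain Python dicts and hypothetical field names:

```python
# Hypothetical documents from a document-style NoSQL store.
documents = [
    {"_id": "a1", "meta": {"type": "line"}, "props": {"length_km": 12.5}},
    {"_id": "a2", "meta": {"type": "transformer"}, "props": {"rating_mva": 300}},
]

def standardize(doc):
    """Flatten a nested source document to an assumed standardized record."""
    record = {"id": doc["_id"], "type": doc["meta"]["type"]}
    record.update(doc.get("props", {}))
    return record

standardized = [standardize(d) for d in documents]
```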
Function for the registration and extraction of governance aspects of the datasets published over the logical services, for example data qualities, connection requirements and the standardized object or information model.
Data entities saved in a relational (staging) database
Transformation of relational data entities (stored in a relational database) to the standardized data target.
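One common way to express such a transformation is a database view that renames staging columns to the standardized model; a sketch with hypothetical table, column and target names:

```python
import sqlite3

# Hypothetical staging table in an in-memory relational database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_asset (asset_nr TEXT, volt INTEGER)")
conn.execute("INSERT INTO stg_asset VALUES ('A-1', 380), ('A-2', 110)")

# The view exposes assumed standardized (CIM-like) names to consumers.
conn.execute("""
    CREATE VIEW std_equipment AS
    SELECT asset_nr AS mrid, volt AS base_voltage_kv FROM stg_asset
""")

rows = conn.execute(
    "SELECT mrid, base_voltage_kv FROM std_equipment ORDER BY mrid"
).fetchall()
```

A view keeps the transformation declarative; materialized views or ETL jobs trade that simplicity for query performance.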
Standardized data or business objects. Models are based on (open) standards like CIM.
Data that is received as a data stream, for example smart meter data or data received continuously from push web services.
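Stream input can be modeled as an iterator whose records are transformed one by one as they arrive; a sketch with assumed field names:

```python
# Hypothetical continuous stream, modeled as a generator of readings.
def meter_stream():
    for i, value in enumerate([101.2, 99.8, 100.5]):
        yield {"meter": f"SM-{i}", "kwh": value}

def transform_stream(stream):
    """Map each reading to an assumed standardized model as it arrives."""
    for reading in stream:
        yield {"meter_id": reading["meter"], "energy_kwh": reading["kwh"]}

results = list(transform_stream(meter_stream()))
```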
Transformation function from the specific source data model to the standardized target data model. This can be a model transformation or a protocol transformation.
Often this transformation is divided into sub-steps, which are analyzed and modeled in the architecture of a specific data pipe implementation and its components.
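The division into sub-steps can be sketched as a chain of small functions, one per modeled sub-step; the steps and field names below are illustrative assumptions:

```python
def parse(raw):
    # Sub-step 1: parse the source format (here a hypothetical "name;value" line).
    name, value = raw.split(";")
    return {"name": name, "value": value}

def convert(rec):
    # Sub-step 2: convert types and units.
    rec["value"] = int(rec["value"])
    return rec

def rename(rec):
    # Sub-step 3: map to assumed standardized field names.
    return {"mrid": rec["name"], "measurement": rec["value"]}

def pipeline(raw, steps=(parse, convert, rename)):
    """Apply the sub-steps in order: one data pipe from source to target model."""
    for step in steps:
        raw = step(raw)
    return raw

result = pipeline("TR-01;42")
```

Keeping each sub-step separate makes the data pipe easier to analyze, test and reuse across implementations.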
Different graphical user interfaces, like:
- Reports
- Analytical tools
- Geospatial viewers
- Forms
- Portals, widgets, etc.