Our three-level analysis model as a big data solution for industry
During process development or the monitoring of operating states, a variety of test and measurement systems leads to a rapid increase in data volume, characterized by differing sources and formats. Local data-filing structures and data redundancy often cause a lack of transparency and usability of the measurement and sensor data.
Our integral analysis model offers a modern, efficient and tailor-made solution to the technical barriers that big data presents in industrial applications.
Engineering-related and technically correct identification of targets and problems
In close consultation and with the highest confidentiality, problems are identified, targets are defined and an individual analysis strategy is developed.
» Analysis of measurement concept or process structure
» Identification of system variables
Structured storage of the raw data to be evaluated in an efficient database
Big data is characterized by the 4V model: ever more (Volume) poly-structured data (Variety) arriving in ever less time (Velocity). "Veracity" describes the vagueness and uncertainty of the data, since not every data set from the measurement sensors can be used in its entirety.
» Modern and efficient databases (e.g. Hadoop HDFS)
» Application-oriented methods for data access (e.g. MapReduce, Spark)
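To make the access pattern concrete, the following is a minimal, single-machine sketch of the MapReduce idea named above, applied to hypothetical sensor readings. The channel names and values are illustrative assumptions; a real deployment would run the same map/shuffle/reduce phases distributed across a Hadoop or Spark cluster.

```python
from collections import defaultdict

# Illustrative raw records: (channel, measured value) pairs
readings = [
    ("temp_sensor_1", 21.4),
    ("temp_sensor_2", 19.8),
    ("temp_sensor_1", 21.9),
    ("temp_sensor_2", 20.1),
]

# Map phase: emit a (key, value) pair per record
mapped = [(channel, value) for channel, value in readings]

# Shuffle phase: group all values by key
grouped = defaultdict(list)
for channel, value in mapped:
    grouped[channel].append(value)

# Reduce phase: aggregate each group (here: mean value per channel)
means = {channel: sum(vals) / len(vals) for channel, vals in grouped.items()}
print(means)  # e.g. {'temp_sensor_1': 21.65, 'temp_sensor_2': 19.95}
```

The same three phases scale from this toy example to cluster-sized data sets because each phase operates on independent key groups.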
Analysis of the physical system, considering input and output variables and special events
Before the measurement data are processed and utilized by digital signal processing, the measurement signals need to be analyzed and understood.
» Definition of target-relevant events, relations and connections
» Recording of (recurring) events (and reactions) in measurement signals
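The recording of recurring events can be sketched as a simple threshold detector that marks the spans where a signal exceeds a limit. The signal values and the threshold below are illustrative assumptions, not project data; real event definitions would come from the system analysis described above.

```python
# Hypothetical measurement signal with two short events (spikes)
signal = [0.1, 0.2, 3.5, 0.3, 0.1, 4.1, 4.0, 0.2]
threshold = 1.0  # assumed event threshold

events = []      # list of (start_index, end_index) spans
start = None
for i, x in enumerate(signal):
    if x > threshold and start is None:
        start = i                      # event begins
    elif x <= threshold and start is not None:
        events.append((start, i - 1))  # event ends
        start = None
if start is not None:                  # event still open at end of record
    events.append((start, len(signal) - 1))

print(events)  # index spans where the signal exceeds the threshold
```

Each detected span can then be catalogued together with the system reaction it triggered.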
- real / physical Level -
past and present
Signal conditioning and digital processing of signals and measurement data based on the system analysis (Descriptive Analytics)
Data conditioning and technical signal processing comprise the preparation and interpretation of the measurement signals after the profound system analysis and represent the core task of measurement data management.
» Adaptation and correction of the data set (e.g. filtering out erroneous signals, calibrating for signal drift, reducing the data set via lower sampling rates)
» Use of various statistical, mathematical and engineering signal processing methods (time domain, spectral domain, frequency and result filters, cross-correlations, counting procedures, etc.)
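As one small example of the time-domain methods listed above, the following sketch applies a moving-average filter to suppress a spurious spike in a hypothetical signal. The window length and sample values are illustrative assumptions; practical work would select filters from the profound system analysis.

```python
def moving_average(signal, window):
    """Simple time-domain low-pass filter: average over a sliding window."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

# Hypothetical signal with one spurious spike at index 4
noisy = [1.0, 1.2, 0.8, 1.1, 5.0, 1.0, 0.9, 1.1]
smoothed = moving_average(noisy, window=3)
print(smoothed[4])  # the spike is strongly attenuated
```

The same structure (slide a window, aggregate, emit) underlies many of the counting and correlation procedures mentioned above.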
- numerical / analytic Level -
Creation of an FE model to represent the real system, including validation with the gathered measurement data
By representing and converging towards the real system, the numerical validation of the model enhances the findings and results of the measurement data management.
» Realistic creation of an FE model relevant for the evaluation
» Derivation of load and boundary conditions based on the previous measurement data processing
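To illustrate the finite-element idea on the smallest possible scale, the following is a sketch of a 1D bar model with two linear elements, fixed at one end and loaded axially at the other. All parameters (material, geometry, load) are assumed illustrative values, and the result is checked against the closed-form tip displacement of a uniform bar.

```python
# Assumed illustrative parameters for a steel bar
E = 210e9    # Young's modulus in Pa
A = 1e-4     # cross-section area in m^2
Le = 0.5     # element length in m (two elements, total length 1.0 m)
F = 1000.0   # axial load at the free end in N

k = E * A / Le  # axial stiffness of one linear bar element

# After fixing node 0, the assembled global system for the free
# displacements u1, u2 is:
#   [ 2k  -k ] [u1]   [0]
#   [ -k   k ] [u2] = [F]
det = 2 * k * k - k * k          # determinant of the reduced system
u1 = (0 * k - (-k) * F) / det    # Cramer's rule
u2 = (2 * k * F - (-k) * 0) / det

# Closed-form check: tip displacement of a uniform bar = F * L_total / (E * A)
u_tip_exact = F * (2 * Le) / (E * A)
print(u2, u_tip_exact)
```

In practice the load vector and the boundary conditions on such a model are derived from the processed measurement data, which is exactly the coupling between the second and third levels described above.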
- synthetic / prognostic Level -
scenarios and predictions (future)
Database interpretation for abstract and extensive knowledge acquisition
The combination of measurement data processing and simulations makes it possible to derive (future-oriented) trends, prognoses and recommendations for action.
» Evaluation of expected, generic or possible input signals and events (expected cases, synthetic signals)
» Revealing and detecting previously undetected and undetermined incidents (intelligent detection and learning: Machine Learning)
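As a deliberately simple stand-in for the machine-learning detection step, the following sketch flags statistical outliers in a hypothetical sample series using a z-score test. The data and the threshold are illustrative assumptions; a production system would use a model trained on the accumulated measurement database.

```python
import statistics

# Hypothetical steady-state measurements with one anomalous reading
samples = [10.1, 9.9, 10.0, 10.2, 9.8, 14.7, 10.1, 10.0]

mean = statistics.fmean(samples)
std = statistics.pstdev(samples)

# Flag samples more than 2 standard deviations from the mean (assumed limit)
anomalies = [i for i, x in enumerate(samples)
             if abs(x - mean) / std > 2.0]
print(anomalies)  # indices of flagged readings
```

Detections like these feed back into the event catalogue of the system analysis, so that formerly unknown incidents become defined, recurring events.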