Visually supported workflow of AI Engineering activities

From idea to business impact, from raw data to trained Machine and Deep Learning models: innoSEP's AI Analytics Platform covers every step needed to apply AI successfully without being an AI expert.

Data Ingestion

Data pipeline for several database systems and data formats

  • Flexible connection to common database systems (MongoDB, Cassandra, and other NoSQL stores…)
  • Import module for uploading and saving multiple data formats (csv, xlsx, mdf, hdf5, tdms…)
  • Specialized in transient and time-based process data
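As an illustration of format-flexible ingestion, here is a minimal sketch using pandas. The file names and the helper `load_measurement` are hypothetical, not part of the platform's API:

```python
from pathlib import Path

import pandas as pd

def load_measurement(path):
    """Read a measurement file into a pandas DataFrame, dispatching on format."""
    suffix = Path(path).suffix.lower()
    if suffix == ".csv":
        return pd.read_csv(path)
    if suffix == ".xlsx":
        return pd.read_excel(path)
    if suffix in (".hdf5", ".h5"):
        return pd.read_hdf(path)
    raise ValueError(f"unsupported format: {suffix}")
```

A single entry point like this lets downstream dashboards and models stay agnostic about how the process data was recorded.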

Visual Data Exploration

Modular and domain-oriented dashboards to visualize data and operations

  • Eight pre-configured dashboards covering diverse plot types for different data types (transient, spectral, correlation, counting, descriptive statistics…)
  • Flexible and modular structure of visualization elements
  • High-performance plotting of big data
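One common technique behind responsive plots of very long signals is min/max decimation: each bin of samples is reduced to its extremes, so peaks and dips survive the reduction. A sketch under assumed names and sizes, illustrating the general idea rather than the platform's internals:

```python
import numpy as np

def downsample_minmax(y, bins=500):
    """Reduce a long signal to per-bin (min, max) pairs for fast plotting.
    Unlike naive every-nth-sample decimation, extremes are preserved."""
    y = np.asarray(y)
    n = (len(y) // bins) * bins          # drop the ragged tail
    chunks = y[:n].reshape(bins, -1)
    return np.column_stack([chunks.min(axis=1), chunks.max(axis=1)]).ravel()

signal = np.sin(np.linspace(0, 100, 1_000_000))
reduced = downsample_minmax(signal, bins=500)   # 1,000,000 samples -> 1,000 points
```

Plotting the 1,000 reduced points instead of a million raw samples keeps transient dashboards interactive.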

Data Engineering

Holistic operation panels from idea to trained data models

  • Diverse repository of Machine & Deep Learning algorithms using state-of-the-art Python libraries (Scikit-Learn, TensorFlow, Keras, XGBoost…)
  • Multi-functional built-in functions for Data Wrangling, Data Enrichment, Feature Engineering, and Data Modeling
  • Covers multiple engineering-specific and scientific methods (e.g. filtering, spectral analysis, counting methods…)
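A minimal sketch of such a chain from data wrangling to a trained model, written directly against Scikit-Learn with synthetic data; it illustrates the underlying library, not the platform's own API:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic process data: 3 input channels, one noisy linear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(0, 0.1, 200)

# Wrangling step (scaling) and modeling step chained into one estimator.
model = Pipeline([
    ("scale", StandardScaler()),
    ("fit", GradientBoostingRegressor(random_state=0)),
])
model.fit(X, y)
score = model.score(X, y)
```

Chaining the steps into one `Pipeline` object keeps preprocessing and model consistent between training and later deployment.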

Model Deployment

Integration of individually trained AI applications into your business workflow

  • Implement Machine & Deep Learning models in a value-generating application
  • Flexible deployment (On-Premise, embedded, cloud services)
  • Fully containerized deployment structure
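A containerized deployment typically packages the trained model, its serving code, and pinned dependencies into one image. A hypothetical Dockerfile sketch (all file names assumed, not the platform's actual layout):

```dockerfile
# Hypothetical image for serving a trained model
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY model/ model/
COPY serve.py .
EXPOSE 8080
CMD ["python", "serve.py"]
```

The same image then runs unchanged On-Premise, embedded, or on cloud services.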

Export of Workflow

Conversion of your data and workflow into re-usable and traceable Python code

  • Transfer your data and visual workflow into Python code with one click
  • Simply continue in your usual programming environment if you hit a functional limitation
  • Ensure an effective workflow for use cases with specific characteristics

Job Execution Manager

Flexible Cluster & Job Manager for large-scale processing

  • Assign and run computationally heavy operations on instances using Spark & Mesos
  • Compute remotely or locally (On-Premise, cloud services)
  • Includes automated resource allocation for efficient processing
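The assignment of heavy operations to workers can be sketched with Python's standard library as a local analogy; a cluster manager such as Spark on Mesos makes the same assignment, but to remote executors with automated resource allocation. The names `heavy` and `run_distributed` are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def heavy(chunk):
    """Stand-in for a computationally heavy per-chunk operation."""
    return sum(x * x for x in chunk)

def run_distributed(data, n_chunks=4):
    """Split the workload into chunks and fan them out to worker threads,
    then combine the partial results."""
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=len(chunks)) as pool:
        return sum(pool.map(heavy, chunks))
```

The split/map/combine shape is the same whether the workers are local threads or remote cluster instances.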


Contact us

Do you want a demo version? Do you have access to operational process data? Do you have an AI Use Case in mind?

Feel free to contact us or give us a call: +49 511 590 72 777