We provide credible information that users can interpret and use.
We have found that in most cases, everything we need to know already exists in some form, either at our customers or in open data sources. However, for data to become valuable information, it is necessary to know where to look, what the data contain, and how they were gathered. Delivering new, yet understandable and credible information goes a long way. For this purpose, we not only process big data, but also combine and verify various data sources. This work is called “BIG DATA”.
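As a minimal sketch of what combining and verifying data sources can mean in practice, the snippet below merges records from two hypothetical sources by a shared key and flags fields where the sources disagree. All names and sample records here are illustrative assumptions, not our actual data or pipeline.

```python
# Illustrative sketch only: merging two hypothetical data sources
# and flagging fields where they disagree.

# Source A: a customer's internal records (hypothetical sample data)
internal = {
    "CZ123": {"name": "Alfa s.r.o.", "city": "Praha"},
    "CZ456": {"name": "Beta a.s.", "city": "Brno"},
}

# Source B: an open data registry (hypothetical sample data)
open_registry = {
    "CZ123": {"name": "Alfa s.r.o.", "city": "Praha"},
    "CZ456": {"name": "Beta a.s.", "city": "Ostrava"},  # conflicting city
    "CZ789": {"name": "Gama spol.", "city": "Plzen"},   # only in open data
}

def combine_and_verify(a, b):
    """Merge records by key; list fields where the two sources disagree."""
    merged = {}
    for key in set(a) | set(b):
        rec_a, rec_b = a.get(key, {}), b.get(key, {})
        record = {**rec_b, **rec_a}  # prefer the customer's own values
        conflicts = [f for f in rec_a if f in rec_b and rec_a[f] != rec_b[f]]
        merged[key] = {"data": record, "conflicts": conflicts}
    return merged

result = combine_and_verify(internal, open_registry)
# "CZ456" is flagged because the two sources report different cities,
# so a human or a further source would be needed to resolve it.
```

In a real project the sources would be files, databases, or APIs rather than in-memory dictionaries, but the principle is the same: cross-checking overlapping fields is what turns raw data into credible information.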
According to the generally accepted definition, “big data” are datasets whose size is beyond the ability of commonly used software tools to capture, manage and process within a reasonable time (source: Wikipedia). A prerequisite for big data work is sufficient capacity and powerful hardware. Processing and analysis involve the use of different programming languages, databases and operating systems. Knowing how to acquire, commission and configure a platform for processing big data is part of the know-how of companies in this business.