Big Data Analytics
Analysing the Data Lake.
Hadoop-based systems are ideal for storing large volumes of unstructured and semi-structured information, but we all know that analysing this data requires teams of highly skilled (and rare) data scientists who can code – right?
Wrong: with the tools in SDG Group’s kit bag it is now possible to treat data lakes like any other data source – even performing joins with other data sets – while giving access via industry-standard APIs such as ODBC, JDBC, OData and REST, and applying security and governance (see Data Virtualisation).
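To make this concrete: once a data lake is exposed through a standard SQL interface (ODBC or JDBC), joining it with conventional sources is ordinary SQL. The sketch below uses an in-memory SQLite database purely as a stand-in for the virtualisation layer – in practice the connection would go over ODBC/JDBC to the virtualised lake – and the table and column names are hypothetical examples, not part of any specific product.

```python
import sqlite3

# SQLite stands in here for the virtualisation layer that exposes the
# data lake over a standard SQL interface (ODBC/JDBC in production).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical semi-structured lake data, already virtualised as a table.
cur.execute("CREATE TABLE lake_clickstream (customer_id INTEGER, page TEXT)")
cur.executemany("INSERT INTO lake_clickstream VALUES (?, ?)",
                [(1, "/home"), (1, "/pricing"), (2, "/home")])

# Hypothetical structured data from a conventional CRM/warehouse source.
cur.execute("CREATE TABLE crm_customers (customer_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO crm_customers VALUES (?, ?)",
                [(1, "Acme Ltd"), (2, "Globex")])

# An ordinary SQL join across the two sources: lake activity enriched
# with warehouse customer names.
rows = cur.execute("""
    SELECT c.name, COUNT(*) AS page_views
    FROM lake_clickstream AS l
    JOIN crm_customers AS c ON c.customer_id = l.customer_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

print(rows)  # [('Acme Ltd', 2), ('Globex', 1)]
```

The point of the sketch is that no Hadoop-specific programming appears anywhere: the lake behaves like any other relational source once it sits behind the virtualisation layer.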
It is also possible to design advanced analytics workflows in a visual environment that executes code on the Hadoop platform without the user needing any programming knowledge. The results can then be visualised and explored interactively alongside those from other data sources.
Couple that with the ease of standing up a Hadoop environment in the cloud, and you no longer need to be wary of incorporating data lakes into your information strategy.