I recall a senior executive of one of the world’s largest consumers of data proudly proclaiming in a much-celebrated announcement that went something like this: “We’ve made the bold decision to migrate our data to the cloud. This move will dramatically improve our information sharing because the contributing data will be stored in a common environment.” That was in about 2012, a long time ago in IT years! No one seemed to notice, however, that it’s not the physical location of data that enables “information sharing.” What happened instead is that multiple “data clouds” were stood up, and the data within each of them remained isolated. That is, myriad data collections resided in the cloud(s) but were not related; they remained disconnected. One could call them data islands in an ocean called the cloud.
Smart Data Lake platforms that provide semantic layers are automating data access and data management for accelerated insight. These platforms, based on knowledge graphs, give organizations on-demand access to all relevant data, internal or external, regardless of source format or type (structured, semi-structured, or unstructured), yielding faster answers to the questions that matter to an enterprise. IT organizations can now easily deliver harmonized, diverse data sets in their full richness, allowing stakeholders to conduct interactive, high-resolution analytics on that multi-layered data. Stakeholders can also use these semantic layers to discover, on demand, the right data sets for business intelligence, machine learning, or advanced analytics systems without any IT setup or preparation. The result: better insights, improved time to market, and faster, more timely decision making.
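To make the knowledge-graph idea behind a semantic layer concrete, here is a minimal sketch: facts from different source systems are normalized into (subject, predicate, object) triples and queried through one common model. This is an illustration only, not the Anzo Smart Data Lake API; all source, entity, and predicate names are invented.

```python
from typing import List, Optional, Tuple

Triple = Tuple[str, str, str]

class TripleStore:
    """An in-memory triple store with simple pattern matching."""

    def __init__(self) -> None:
        self.triples: List[Triple] = []

    def add(self, s: str, p: str, o: str) -> None:
        self.triples.append((s, p, o))

    def match(self, s: Optional[str] = None, p: Optional[str] = None,
              o: Optional[str] = None) -> List[Triple]:
        # None acts as a wildcard, like a variable in a SPARQL pattern.
        return [t for t in self.triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

store = TripleStore()
# Facts from a structured source (e.g. a CRM table) and an unstructured
# source (e.g. text extraction) land in the same graph model.
store.add("customer:42", "hasName", "Acme Corp")
store.add("customer:42", "hasRegion", "EMEA")
store.add("customer:42", "mentionedIn", "report:2017-Q4")

# One query now spans data that originated in different systems.
print(store.match(s="customer:42"))
```

In a real semantic layer the graph model is standards-based (RDF and SPARQL) rather than hand-rolled, but the payoff is the same: one query pattern over data that began life in many formats.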
May 25, 2018 has become a sword of Damocles hanging over the head of any company conducting business within European Union (EU) countries. That’s when the EU’s General Data Protection Regulation (GDPR) goes into effect. The GDPR requires all firms to enhance their protection of personal data, whether the firm is located within the EU or not. Organizations must be completely compliant from day one or suffer the consequences of large fines, potentially up to 2 or 4 percent of a firm’s annual global revenue, depending on the infringement.
A biomedical professional is certain to have come across some variant of the headlines mentioned in the visual above. Combinations of trending buzzwords in technology and healthcare form half of my news feed; the other half merely mirrors the first.
Superior decision making is an essential aspect of life, whether in business, national security, health, the environment, or any other aspect of human existence. Look no further than geopolitical affairs, such as North Korean or Iranian relations, to understand the importance and impact of decisions on human life, indeed on the entire planet. This is not an exaggeration. From an Information Technology point of view, the goal is to provide complete and accurate information on demand to support decision making.
Real-world events demonstrate our inability to understand rapidly and accurately what we already know. In other words, we cannot answer questions completely, despite the fact that we may hold the requisite data. For example, if someone attempted to enter the United States (US) at an airport, and US officials initiated a query to the “system” and found nothing, that person might enter the US erroneously. This can occur because officials asking a question such as “What do we know about this person?” cannot answer it with assurance, let alone in a timely fashion.
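The failure mode described above can be sketched in a few lines: each system holds a partial record under its own identifier, so a silo-by-silo lookup finds nothing even though the enterprise, taken as a whole, knows the answer. All system names, identifiers, and records below are invented for illustration, and the link table stands in for whatever entity-resolution step would produce it in practice.

```python
# Two isolated "systems", each keyed by its own local identifier.
visa_system = {"V-1001": {"name": "J. Doe", "visa_status": "expired"}}
watchlist_system = {"W-77": {"name": "J. Doe", "flag": "review"}}

# A lookup keyed on one system's identifier finds nothing in the other:
print("V-1001" in watchlist_system)  # False: the watchlist knows no "V-1001"

# Once the identifiers are linked, a unified view becomes possible.
links = {"V-1001": "W-77"}  # output of some prior entity-resolution step

def unified_view(visa_id: str) -> dict:
    """Merge everything known about one person across both systems."""
    record = dict(visa_system.get(visa_id, {}))
    linked_id = links.get(visa_id)
    if linked_id:
        record.update(watchlist_system.get(linked_id, {}))
    return record

print(unified_view("V-1001"))  # one complete answer from two silos
```

The point of the sketch is that the hard part is not the merge itself but maintaining the links: connected, semantically described data makes “What do we know about this person?” answerable in one query rather than one query per silo.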
The financial industry is facing a perfect storm of disruptive drivers for data management. While regulators seek accuracy and transparency, institutions are struggling with fragmented data and IT infrastructures. The path forward is “data engineering” – applying consistent semantics with scalable infrastructure to harmonize data and enable traceable and dynamic analytics.
We’d like to introduce you to the newest member of our team, Sam Chance, who has joined us as managing director of pre-sales. In this newly created position, Sam will work closely with the sales and engineering teams to accurately define and communicate the value of our Anzo Smart Data Lake® (ASDL) platform to our growing roster of customers, while also architecting customized solutions for their environments.
Conventional data analytics utilizes dashboards, visualizations, search, and other tools to determine appropriate data for integrated, targeted use cases. Smart data analytics techniques, on the other hand, leverage linked data graphs, comprehensive data models, and a semantic standards-based approach to publish results to those same popular tools.
Comprehending semantic technology is no longer an arduous task reserved for the back offices of data-savvy organizations. Business users and C-level executives are starting to grasp the basics of the technologies that increasingly impact their jobs.