What do you gain from AI/ML using Knowledge Graph Technology?
Any meaningful discussion of Artificial Intelligence (AI) at the enterprise, community-of-interest, or ecosystem level must include the data that AI services consume and produce; otherwise, one is creating yet another ‘island of excellence.’ In other words, the data infrastructure for AI is at least as important as AI outputs.
This article focuses on knowledge graph technology as a critical technical enabler for operationalizing AI. We provide examples of AI and knowledge graph technology working together, background on knowledge graph technology, and a way forward for exploiting knowledge graphs at large scale to operationalize AI.
Effective AI, like analytics generally, requires data derived from multiple sources, formats and structures. As a result, organizations apply significant resources to preparing disparate structured and unstructured data. Collectively, we have come to realize that data preparation is a bottleneck to data-driven decision making, and this certainly applies to AI.
As such, organizations are investing in advanced data architectures to mitigate this data processing backlog. To help them understand the various contemporary architectures, we published a blog series titled An Integrated Data Enterprise that discusses:
- Differences between popular data architectures,
- Why a data fabric wins,
- Why knowledge graph is the best data fabric enabler,
- And how to get started.
After reading these, one should come away with an appreciation for knowledge graph technology.
Succinctly, a knowledge graph is a rich, unified, semantically integrated and self-describing information access layer. The knowledge graph normalizes disparate data formats, structures and syntaxes into a simple and uniform graph model; and it harmonizes concepts and relationships using knowledge representation. The most scalable and powerful knowledge graph implementations, in our view, use W3C RDF and OWL. We elaborate on these standards in Knowledge Graphs: Origins, Inhibitors and Breakthroughs. Notably, OWL is an unambiguous, formal ontology model, and is itself a form of AI; that is, computable intelligence is inherent to the OWL specification.
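The "uniform graph model" above is the RDF idea of reducing every record, whatever its original shape, to subject-predicate-object triples. A minimal sketch in plain Python (all names, URIs and records are hypothetical; a production system would use an RDF store and SPARQL rather than hand-rolled functions):

```python
# Minimal sketch: disparate source records normalized into RDF-style
# (subject, predicate, object) triples. Names and URIs are illustrative.

EX = "http://example.org/"  # hypothetical namespace

# Two sources with different shapes: a relational row and a JSON-like doc.
crm_row = {"cust_id": "C42", "full_name": "Ada Lovelace"}
support_doc = {"customer": "C42", "ticket": "T9", "status": "open"}

def crm_to_triples(row):
    s = EX + "customer/" + row["cust_id"]
    return [(s, EX + "name", row["full_name"])]

def support_to_triples(doc):
    s = EX + "customer/" + doc["customer"]
    t = EX + "ticket/" + doc["ticket"]
    return [(s, EX + "raisedTicket", t), (t, EX + "status", doc["status"])]

# The graph is simply the union of triples: one uniform model for both sources.
graph = set(crm_to_triples(crm_row)) | set(support_to_triples(support_doc))

def match(graph, s=None, p=None, o=None):
    """Pattern match, like a one-triple SPARQL query (None = wildcard)."""
    return [(a, b, c) for (a, b, c) in graph
            if (s is None or a == s) and (p is None or b == p)
            and (o is None or c == o)]

# Everything known about customer C42, regardless of which source said it:
print(match(graph, s=EX + "customer/C42"))
```

Because both sources collapse into the same triple shape, one pattern query spans them; that is the property the rest of this article relies on.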
Organizations that continue to stitch disparate data together to feed AI algorithms will not meaningfully participate in the coming AI revolution, which will feature increasingly autonomous systems. Leaders will leverage scalable knowledge graph technology to operationalize their AI initiatives.
Knowledge Graph Use Cases
Manufacturing with Augmented AI
As an example, consider an engineer designing a new aircraft. This person interacts with software tools to design and configure aircraft components, assemblies, sub-systems, systems and eventually the platform. Augmented with AI, the engineer focuses on the design while intelligent services find, collect and inform the design with relevant information derived from many different sources. Without AI working alongside the engineer, humans must collect, aggregate and synthesize data from many disparate sources. This process is error prone and time consuming. Human capital cannot continue to be disproportionately applied to data preparation tasks. Knowledge graph technology empowers AI services to operate on behalf of humans.
Digital Recommendation Engines
Or, consider a person who wants to plan a vacation. The user opens a service that prompts for particular information such as dates, places, desired amenities, and other parameters, and the service recommends suitable alternatives. But an AI-powered service doesn’t wait for the user to ask. It prompts the user to consider a vacation based on what it already knows about the user and their behavior. The AI-powered service also leverages the knowledge graph to provide the user an explanation of its recommendations. Do you want a proactive service or a reactive service?
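The explanation piece is what the graph makes cheap: a recommendation is justified by the facts that support it. A toy sketch (hypothetical users, places and predicates):

```python
# Sketch: a proactive recommender that explains itself by citing the
# knowledge-graph facts behind a suggestion. Data is entirely hypothetical.

facts = [
    ("user1", "enjoys", "beaches"),
    ("user1", "visited", "Lisbon"),
    ("Algarve", "near", "Lisbon"),
    ("Algarve", "features", "beaches"),
]

def recommend(user):
    """Suggest an unvisited place matching the user's interests, with a reason."""
    likes = {o for s, p, o in facts if s == user and p == "enjoys"}
    seen = {o for s, p, o in facts if s == user and p == "visited"}
    for place, p, o in facts:
        if p == "features" and o in likes and place not in seen:
            # The explanation is just the supporting triple, made readable.
            why = f"{place} features {o}, which {user} enjoys"
            return place, why
    return None, None

place, why = recommend("user1")
print(place, "->", why)
```

The same traversal that finds the suggestion yields the justification, which is why graph-backed recommenders can be explainable by construction.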
Automated Insider Threat Detection
A current example of a knowledge graph powering AI is insider threat detection. In this case, the knowledge graph integrates structured data from sources such as badge swipes, user activity logs and reference data, as well as unstructured sources such as chats, emails and audio converted to text. The knowledge graph creates a fully connected access layer that AI analytics consume to understand normal patterns and anomalies. The knowledge graph remains current as events and activities occur. Designated users monitor the organization in real time, and they perform strategic analysis to further inform the models and to enhance and refine the knowledge graph.
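To make "normal patterns and anomalies" concrete, here is a deliberately simple analytic over harmonized events. It assumes the graph layer has already reduced badge swipes and log entries to (user, hour-of-day) pairs; the data, threshold and statistic are all illustrative, not a real detection model:

```python
# Illustrative sketch: events from disparate sources, already harmonized,
# feed a simple per-user baseline; out-of-pattern events get flagged.
from collections import defaultdict
from statistics import mean, pstdev

# Badge swipes and log events, reduced to (user, hour_of_day). Hypothetical.
events = [("alice", 9), ("alice", 10), ("alice", 9), ("alice", 8),
          ("bob", 14), ("bob", 13), ("bob", 15), ("bob", 14)]

baseline = defaultdict(list)
for user, hour in events:
    baseline[user].append(hour)

def is_anomalous(user, hour, threshold=2.0):
    """Flag an event more than `threshold` std devs from the user's norm."""
    hours = baseline[user]
    mu, sigma = mean(hours), pstdev(hours)
    if sigma == 0:
        return hour != mu
    return abs(hour - mu) / sigma > threshold

print(is_anomalous("alice", 9))   # within Alice's usual pattern
print(is_anomalous("alice", 3))   # a 3 a.m. access gets flagged
```

Real deployments use far richer features and models; the point is that the hard part, assembling one coherent event stream from many sources, is exactly what the knowledge graph supplies.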
360° Product Views
What about drug development in the pharmaceutical industry? The FDA uses Anzo’s knowledge graph platform to create a 360° view of drug products. AI services then access the drug-product knowledge graph through a uniform layer, avoiding the need to develop multiple queries against multiple sources and then aggregate the results. In fact, knowledge graph clients, human and automated, answer arbitrary questions on demand. This approach provides a fully informed view of one or more drugs, enables discovery of new insights and shrinks decision cycles. Do you want actionable answers, or do you want to focus on data prep?
Feature engineering is another common use case. Many AI algorithms require rapid preparation and prototyping of training sets drawn from multiple data sources. The Anzo knowledge graph platform, for example, quickly assembles training sets by joining, aggregating, and calculating values for users. The figure below shows an example of how knowledge graphs allow users to prepare disparate data for feature engineering, a common activity in AI/ML applications. This example uses Anzo to create a knowledge graph from structured and unstructured pharma, insurance, healthcare and other data. Anzo enables this function across domains of interest at scale.
Knowledge graphs create highly contextualized and normalized information that allows users to operationalize AI, ML and other Advanced Analytics.
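The join-aggregate-derive pattern described above can be sketched in a few lines. This is not Anzo's API; it is a hand-written illustration with invented patient, claims and notes data, showing the kind of cross-source assembly the platform automates:

```python
# Hedged sketch: building training features by joining and aggregating
# across sources. All identifiers and values are hypothetical.

patients = {"P1": {"age": 54}, "P2": {"age": 61}}                # structured
claims = [("P1", 120.0), ("P1", 80.0), ("P2", 300.0)]            # structured
notes = {"P1": "no complaints noted", "P2": "reported dizziness"}  # unstructured

def build_features(pid):
    """Join demographics, aggregate claims, and derive a flag from text."""
    return {
        "age": patients[pid]["age"],
        "total_claims": sum(amt for p, amt in claims if p == pid),
        "num_claims": sum(1 for p, _ in claims if p == pid),
        "mentions_dizziness": int("dizziness" in notes[pid]),
    }

# One row per patient, ready for model training or prototyping.
training_set = [build_features(pid) for pid in patients]
print(training_set)
```

Done by hand, this stitching is repeated for every new model; done over a knowledge graph, the joins are already expressed as relationships and the feature queries become reusable.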
AI at Scale Requires Semantic Interoperability
We stress to our clients the importance of semantic data integration. Increasingly, enterprises understand that ‘graph’ is a superior approach to creating links among elements in otherwise disparate data sources. Knowledge graphs go further: they integrate the semantics contained in the contributing data sources. Moreover, knowledge graphs implemented using W3C RDF and OWL yield machine-understandable context essential to achieving semantic interoperability at scale. Ad hoc semantic mediation is insufficient and does not scale; it perpetuates brittle, point-to-point data architectures.
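The difference between ad hoc mediation and semantic integration is easiest to see in miniature. Below, two sources use different field names for the same concept; mapping both to one shared ontology term (a simplified stand-in for what OWL equivalence axioms express) lets a single query span both sources. Vocabulary and data are hypothetical:

```python
# Minimal sketch: harmonizing source-specific terms to one shared ontology
# property, so a single query serves every source. Illustrative names only.

ONT = "http://example.org/ont/"  # hypothetical shared ontology namespace

# Each source uses its own field name for the same concept.
mapping = {"dob": ONT + "birthDate", "birth_date": ONT + "birthDate"}

source_a = [("emp1", "dob", "1970-01-01")]
source_b = [("emp2", "birth_date", "1985-06-15")]

# Harmonize once: rewrite source predicates to the shared ontology term.
graph = [(s, mapping[p], o) for (s, p, o) in source_a + source_b]

# One query now answers across both sources; no point-to-point mediation,
# and a third source needs only a mapping entry, not a new integration.
birth_dates = {s: o for (s, p, o) in graph if p == ONT + "birthDate"}
print(birth_dates)
```

With ad hoc mediation, each pair of systems needs its own translation; with a shared ontology, each source maps once to the common model, which is what makes the approach scale.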
Semantic interoperability forms the context that allows for discovery, new insights, automation and decision advantage. Machine-understandable knowledge graphs help to overcome brittle, point-to-point data architectures. Enterprises need semantic interoperability at scale for their users, clients, partners and stakeholders. Semantic interoperability at scale is a key enabler for AI-based, increasingly autonomous systems.
Several vendors purport to offer graph models to provide knowledge graph capabilities; yet only a few offer semantic (i.e., formal ontology) knowledge graph technology and even fewer scale to enterprise needs. In other words, not all graph models are equal. We assert that Anzo provides the most scalable and adaptable standards-based platform to help enterprises realize the enterprise knowledge graph, semantic interoperability, and ultimately scalable AI capabilities.
As AI applications proliferate, adopters of scalable knowledge graph platforms that implement the W3C Semantic Web standards (namely RDF, OWL, and SPARQL) position themselves to maximize the opportunities afforded by AI and autonomous systems.