The Smart Data Blog

Machine Understandable Context and Why It Matters

Written by Sam Chance | Apr 2, 2021 5:31:53 PM

This post continues the larger discussion of knowledge graphs, their roles, uses and significance. We discuss context, its importance, and how to establish it using formal ontology to create machine understandable information. It is useful to read posts, such as Knowledge Graphs: A Love Affair and The Rise of The Knowledge Graph, to understand this article…in context!

To create machine understandable information, namely a knowledge graph, we require a means to establish context. Lack of context significantly hinders effective communication. For example, the word “bolt” can invoke different interpretations. One might think of it as a type of fastener; another may interpret it to mean a “bolt of lightning;” still another may think you mean to depart quickly. This is just one of many opportunities for misunderstanding. In fact, there is often a divergence between intended meaning and interpretation.

Providing even a modest amount of context dramatically facilitates accurate interpretation. Correct and shared understanding allows agents to communicate effectively. Notably, agents can be people or software processes. Consider the following.

  • I left my phone on the left side of the room.
  • The baseball pitcher asked for a pitcher of water.
  • The committee chair sat in the center chair.
  • The crane flew above the construction crane.
  • While they are at the play, I’m going to play with the dog.
  • She will park the car so we can walk in the park.
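The ambiguity in the sentences above disappears once each occurrence of a word is tied to a typed entity. A minimal sketch, using plain Python triples in the subject-predicate-object shape a knowledge graph uses (the entity names and types here are hypothetical, chosen only for illustration):

```python
# Each fact is a (subject, predicate, object) triple -- the same shape
# a knowledge graph uses to carry context explicitly.
facts = {
    ("crane_1", "rdf:type", "Bird"),
    ("crane_1", "flewAbove", "crane_2"),
    ("crane_2", "rdf:type", "ConstructionEquipment"),
}

def type_of(entity):
    """Look up the declared type of an entity -- the context a machine needs."""
    for s, p, o in facts:
        if s == entity and p == "rdf:type":
            return o
    return None

# Both entities render as the English word "crane", but their declared
# types differ, so a machine can interpret each occurrence correctly.
print(type_of("crane_1"))  # Bird
print(type_of("crane_2"))  # ConstructionEquipment
```

The word itself never changes; the explicit typing statements are what resolve the homonym, for software as well as for people.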

So what exactly is context? The definition of context is “the circumstances that form the setting for an event, statement, or idea, and in terms of which it can be fully understood and assessed.” When information is understood, it is accurately interpreted.

When humans view and interpret things in isolation (absent sufficient context), they often form erroneous conclusions. This happens everywhere: in medicine, judicial proceedings, engineering, intelligence, and organizations of all kinds. We often hear of news stories that “lack context” to shape perception. If humans, with their superior cognitive abilities, struggle to understand without context, computers have no chance. We require an efficient and rigorous way to establish context so computers can arrive at the most accurate understanding of data.

Context Continuum

Consider context creation in terms of a continuum that aligns with “levels” of increasing interoperability. For more details, see my article, “Interoperability and How to Sustain It.” In this construct, as one “climbs” the continuum, one uses standardized models to become increasingly interoperable. Different models achieve increasingly sophisticated capabilities, from defining syntaxes to structures and then semantics for information.

After SICoP White Paper Series Module 2: Semantic Wave 2006 — Executive Guide to the Business Value of Semantic Technologies, May 15, 2006, Principal Author Mills Davis, Project10X.

Collectively, we have created models to improve shared understanding. For example, we created lists, tables and simple connections between tables. We worked hard to establish shared syntaxes, and we increased “syntactic interoperability.” Then we developed entity-relationship models, database schemas, and interchange schemas, which increased interoperability by agreeing on “structures.”

Syntactic and structural agreements are necessary, but not sufficient for machine understandable context. In other words, others, especially machines, still don’t know what we mean. We require “semantic interoperability” to realize machine understandable context.
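The three levels of the continuum can be sketched with a single datum. The field names, prefixes, and class names below (`ex:Person`, `ex:worksFor`, and so on) are illustrative assumptions, not a real schema:

```python
# Level 1 -- shared syntax: we agree only on how values are delimited.
csv_row = "jsmith,Acme"

# Level 2 -- shared structure: we agree on field names and positions.
record = {"username": "jsmith", "employer": "Acme"}

# Level 3 -- shared semantics: we agree on what the terms *mean*, by
# typing entities and relationships against a common ontology.
triples = [
    ("ex:jsmith", "rdf:type", "ex:Person"),
    ("ex:Acme",   "rdf:type", "ex:Organization"),
    ("ex:jsmith", "ex:worksFor", "ex:Acme"),
]

# Only at level 3 can a machine answer a question like "which entities
# are Persons?" without out-of-band documentation.
persons = [s for s, p, o in triples if p == "rdf:type" and o == "ex:Person"]
print(persons)  # ['ex:jsmith']
```

At levels 1 and 2, the meaning of "jsmith" and "Acme" lives only in human documentation; at level 3 it lives in the data itself.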

The next step in the evolution of context creation uses formal ontology derived from Description Logics, which is a family of formal knowledge representation models derived from first order logic and set theory. (See THE DESCRIPTION LOGIC HANDBOOK: Theory, implementation, and applications, Edited by Franz Baader et al, 2003.)

Ontology Creates Context

It has long been realized that [interoperability] could benefit by having content understandable and available in a machine processable form, and it is widely agreed that ontologies will play a key role in providing much enabling infrastructure to support this goal. (Baader et al)

Ontology is the study of the nature of existence, beings and their relations. In information science, ontology provides a means to create unambiguous knowledge. An ontology is a formal specification of the concepts, types, properties and interrelationships of entities within a domain of the real world. Ontologies provide humans and machines an accurately understandable context or meaning. Ontologies ensure a common understanding of information. In practice, ontologies describe and link disparate and complex data. Following are some additional benefits of ontologies related to knowledge representation.

  • Ontologies enable reuse of foundational concepts in (upper) ontologies that are domain independent and can be used across domains.
  • Modularity of ontologies allows separation and recombination of different parts of an ontology depending on specific needs, instead of creating a single common ontology.
  • Extensibility of ontologies allows further growth of the ontology for the purpose of specific applications.
  • Maintainability of ontologies facilitates the process of identifying and correcting defects, accommodates new requirements, and copes with changes in an ontology.
  • Ontologies enable separation of design and implementation concerns, so they are flexible to changes in specific implementation technologies.

Ontology Requires a Language

Informal ontologies may lead to ambiguities. Systems based on informal ontologies are more error-prone than systems based on formal ontologies. Formal ontologies allow automated reasoning and consistency checking (i.e., is my model logically sound?). Formal ontologies span from taxonomies of concepts related by subsumption relationships to complete representations of concepts related by complex relationships. Formal ontologies include axioms to constrain the intended interpretation of their concepts.
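The kind of automated reasoning a formal ontology enables can be sketched in a few lines: computing the subsumption (subclass) closure and checking a disjointness axiom. This is a toy sketch with hypothetical class names; real systems delegate this to an OWL reasoner operating over far richer axioms:

```python
# Asserted subsumption axioms: child class -> parent class.
subclass_of = {
    "Sparrow": "Bird",
    "Bird": "Animal",
    "CraneMachine": "Equipment",
}

# A disjointness axiom: nothing may be both an Animal and Equipment.
disjoint = {("Animal", "Equipment")}

def ancestors(cls):
    """All classes that subsume `cls`, via transitive closure."""
    out = set()
    while cls in subclass_of:
        cls = subclass_of[cls]
        out.add(cls)
    return out

def consistent(asserted_types):
    """An individual asserted to have several types is consistent only if
    no pair of its inferred types is declared disjoint."""
    closure = set(asserted_types)
    for t in asserted_types:
        closure |= ancestors(t)
    return not any((a, b) in disjoint or (b, a) in disjoint
                   for a in closure for b in closure)

print(consistent({"Sparrow"}))                   # True
print(consistent({"Sparrow", "CraneMachine"}))   # False: Animal/Equipment clash
```

The second check fails not because anyone wrote "a sparrow is not a crane machine," but because the inference follows from the axioms. That is what "is my model logically sound?" means in practice.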

But we require a language to create standard and shareable ontologies. When one models a portion of the real world (i.e., some domain of interest), a conceptualization exists in one’s mind. This is based on the concepts existing in the domain and their salient relationships. An ontology language provides a mechanism to represent the concepts. The entire domain specification is expressed in the language. So an ontology is an explicit specification of a conceptualization of some domain. See the figure below which is adapted from “An Ontological Approach to Logistics,” by Laura Daniele and Luís Ferreira Pires.

Ontology as an Explicit Specification of a Conceptualization.

So how do we arrive at a standard ontology language? We provide the answer to this in another post titled Knowledge Graphs: Origins, Inhibitors and Breakthroughs. However, given that we have a standard ontology language, we use it to provide the contextualization of data as a core element of what we call a “knowledge graph.”

A Knowledge Graph is a connected graph of data and associated metadata applied to model, integrate and access an organization’s information assets. The knowledge graph represents real-world entities, facts, concepts, and events as well as the relationships between them. Knowledge graphs yield a more accurate and more comprehensive representation of an organization’s data that is human and machine understandable.
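The definition above (data plus metadata, entities plus relationships) can be made concrete with a small sketch. The entities and predicates are illustrative assumptions, and the pattern-matching function stands in for what a real SPARQL engine does at scale:

```python
graph = [
    # Data: real-world entities and the relationships between them.
    ("ex:AcmeCorp", "ex:headquarteredIn", "ex:Boston"),
    ("ex:jdoe",     "ex:worksFor",        "ex:AcmeCorp"),
    ("ex:jdoe",     "ex:manages",         "ex:ProjectX"),
    # Metadata: typing statements that supply machine-readable context.
    ("ex:AcmeCorp", "rdf:type", "ex:Organization"),
    ("ex:jdoe",     "rdf:type", "ex:Person"),
    ("ex:ProjectX", "rdf:type", "ex:Project"),
]

def match(s=None, p=None, o=None):
    """Basic triple-pattern query: None acts as a wildcard.
    This is the primitive underlying SPARQL queries."""
    return [(s2, p2, o2) for s2, p2, o2 in graph
            if s in (None, s2) and p in (None, p2) and o in (None, o2)]

# "What do we know about ex:jdoe?" -- traverse from a single entity.
print(match(s="ex:jdoe"))
```

Because the data and its context travel together as triples, both a person and a software process can follow relationships outward from any entity without consulting external documentation.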

Cambridge Semantics offers a viable knowledge graph platform, called Anzo, which allows users to create, manage and use ontologies for their knowledge graph applications. As stated in other posts, Anzo knowledge graphs provide unsurpassed flexibility, enable common understanding, and are a powerful means to achieve semantic interoperability — for the enterprise and across ecosystems. Use of knowledge graphs is essential for interoperability to improve communication and cope gracefully with complexity and volatility.