The skills & expertise required for knowledge graph success

Posted by Jon Zuanich on May 30, 2022 10:15:00 AM

More often than not, Cambridge Semantics’ Anzo® is sought out by organizational data stewards who have data/application integration aspirations. These individuals know they need a knowledge graph platform that can serve as the foundation for modern data architectures like data fabric or data mesh.

Cambridge Semantics’ team of knowledge graph experts can impress with use case demonstrations and PoCs, but afterward questions and apprehensions often arise, such as:

“What skills do I need?”

“Who else do I need to hire?”

“Can my existing team easily manage Anzo?”

“What kind of training will be required?”


For those of you thinking TL;DR, I’ll sum it up: You have the skills. You have the team. It’s a small learning curve that anyone with “data” or “analyst” in their job title can handle.

What skills do I need?

W3C SPARQL and RDF are not the most widely acquired skills. You don’t need to be a master of these concepts, and we’ve seen the full gamut from “what is RDF?” to “we got this, step aside.”
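For readers who have never seen RDF, here is a minimal flavor of it. RDF represents data as subject–predicate–object statements called triples; the names below are hypothetical, written in Turtle syntax:

```turtle
@prefix ex: <http://example.org/> .

ex:Alice ex:worksFor  ex:Acme ;
         ex:jobTitle  "Data Analyst" .
ex:Acme  ex:locatedIn ex:Boston .
```

Each statement is an edge; linking statements like these is what forms the graph.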

Analysts, the data consumers working in BI tools, don’t need to know these concepts at all. They continue to use their BI tools as before, but the data available to them is accelerated, thanks to Anzo.

Data scientists can add functions and pull in any plugin or library to extend the knowledge graph. For more information, see the white paper Accelerating Data Science with Knowledge Graphs, which covers everything a data scientist needs to know about knowledge graph technology.

Data architects and data engineers, the users building models or linking and harmonizing data, will want some knowledge of SPARQL and RDF. Not to worry: the concepts are straightforward, and these trainings and tutorials will get them there. Elaborating a bit further on data modeling and SPARQL skills:

Data Modeling

There’s a lot of overlap with traditional data modeling tools (e.g., Erwin diagrams), so these skills translate easily. Anzo also provides tooling to accelerate the generation of those models. Anzo is based on open standards, meaning that any Web Ontology Language (OWL) model can easily be imported and used out of the box. Here are just a few examples of such models:
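As a hypothetical illustration of what a small OWL model looks like (this toy ontology is ours for demonstration, not one of the importable examples), here are two classes and a property in Turtle syntax:

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/model#> .

ex:Employee a owl:Class ;
    rdfs:label "Employee" .

ex:Organization a owl:Class ;
    rdfs:label "Organization" .

ex:worksFor a owl:ObjectProperty ;
    rdfs:domain ex:Employee ;
    rdfs:range  ex:Organization .
```

If you have drawn entity-relationship diagrams, the shape should look familiar: classes play the role of entities, and object properties play the role of relationships.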


SPARQL

Being able to write SPARQL queries can be important for harmonizing and transforming data. SQL and SPARQL overlap considerably, so don’t be daunted. And again, Anzo contains many features that accelerate learning and development; you may not even have to write SPARQL queries, since Anzo automatically suggests connections and generates visualizations and tables without the user writing a query. In some advanced use cases, however, you will want to understand how SPARQL works. Here are a few good resources for that:
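To make the SQL overlap concrete, here is a hypothetical SELECT query against the kind of toy employee model sketched above (all class and property names are illustrative):

```sparql
PREFIX ex: <http://example.org/model#>

# Roughly analogous to:
#   SELECT name, org_name FROM employees JOIN organizations ... ORDER BY name
SELECT ?name ?orgName
WHERE {
  ?person a           ex:Employee ;
          ex:name     ?name ;
          ex:worksFor ?org .
  ?org    ex:name     ?orgName .
}
ORDER BY ?name
```

SELECT, WHERE, and ORDER BY carry over directly from SQL; the main new idea is that the WHERE clause matches graph patterns instead of joining tables.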

All that said, Anzo® was built with the intent to make every individual in your organization successful. In other words, we’ve further simplified a lot of the above.

Knowledge Graph Analytics Dashboard: Anzo® Hi-Res

Anzo® Hi-Res is what we call our easy-to-use knowledge graph analytics dashboard. Via Anzo Hi-Res, users can click through and traverse the knowledge graph for on-demand access to all available properties (based on permissions) for answering unanticipated questions, computing new data, and even augmenting the knowledge graph with new properties. Anzo Hi-Res provides a variety of dashboard lenses that empower users to uncover new insights and discoveries.

Additionally, Anzo automatically generates endpoints to which users connect their favorite BI tools (Tableau, Power BI, etc.) for easy dashboard construction in their BI tool of choice.
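Any tool that speaks the standard W3C SPARQL Protocol can talk to such an endpoint. As a minimal sketch (not Anzo-specific, and the endpoint URL is hypothetical), here is how a SELECT query can be sent over HTTP using only the Python standard library:

```python
# Minimal SPARQL Protocol client sketch using only the standard library.
# The endpoint URL is hypothetical; substitute the endpoint your
# platform generates. This follows the generic W3C SPARQL Protocol
# (HTTP GET with a `query` parameter), not any vendor-specific API.
import json
import urllib.parse
import urllib.request


def sparql_query_url(endpoint: str, query: str) -> str:
    """Build a SPARQL Protocol GET URL with the query percent-encoded."""
    return endpoint + "?" + urllib.parse.urlencode({"query": query})


def run_select(endpoint: str, query: str) -> list:
    """Execute a SELECT query and return the JSON result bindings."""
    req = urllib.request.Request(
        sparql_query_url(endpoint, query),
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Build (but do not send) a request URL for a hypothetical endpoint.
    print(sparql_query_url(
        "https://example.org/sparql",
        "SELECT * WHERE { ?s ?p ?o } LIMIT 10",
    ))
```

BI tools hide this plumbing, but it is useful to know that the endpoint underneath is a plain, standards-based HTTP interface.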


