Knowledge Graph Engineering
A knowledge graph is a machine-readable collection of intelligently connected facts derived from one or more sources. Knowledge graphs help organizations use content as data to derive new insight, generate recommendations, track performance against goals, and implement efficient, scalable omnichannel strategies.
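At its simplest, that collection of connected facts can be pictured as subject-predicate-object statements. A minimal sketch in plain Python (the entity and relation names are illustrative, not from any particular dataset):

```python
# A knowledge graph, reduced to its core: a set of (subject, predicate, object)
# facts. All names here are hypothetical examples.
facts = {
    ("Acme Widget", "manufactured_by", "Acme Corp"),
    ("Acme Corp", "headquartered_in", "Berlin"),
    ("Acme Widget", "category", "Hardware"),
}

def facts_about(subject, graph):
    """Everything the graph states about a given subject."""
    return {(p, o) for (s, p, o) in graph if s == subject}

print(facts_about("Acme Widget", facts))
```

Because every fact shares the same uniform shape, machines can read connections off mechanically, with no per-source parsing logic.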
I help organizations design and integrate goal-driven knowledge graph strategies that unlock the value and insight hidden in traditional data and content collections.
Why Knowledge Graphs Matter
Knowledge graphs provide context to data, including content, helping machines understand the relationships that are obvious and intuitive to humans. With this “understanding,” algorithms can make better and more reliable content recommendations, comb programmatically through massive content and data stores to surface insights a human would never have seen, and ensure that the meaningful links between resources in large sets of interconnected content retain their integrity and usefulness over time. In short, knowledge graphs let us delegate jobs that machines are good at to machines, and give those machines the context they need to do those jobs well.
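The recommendation case is easy to see in miniature: once content is linked through explicit relationships, related items can be ranked by how many links they share. A hedged sketch, with hypothetical articles and a made-up "about" relation standing in for a real graph:

```python
# Toy content graph: articles linked to topics. Names are illustrative.
triples = [
    ("article-1", "about", "graph databases"),
    ("article-2", "about", "graph databases"),
    ("article-2", "about", "SPARQL"),
    ("article-3", "about", "SPARQL"),
    ("article-4", "about", "marketing"),
]

def recommend(seed, triples):
    """Other items sharing at least one topic with `seed`, ranked by overlap."""
    topics = {o for s, p, o in triples if s == seed and p == "about"}
    scores = {}
    for s, p, o in triples:
        if s != seed and p == "about" and o in topics:
            scores[s] = scores.get(s, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("article-2", triples))
```

Production systems weight relations and traverse deeper, but the principle is the same: explicit relationships replace brittle keyword matching.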
The specific steps to planning, designing, and implementing a knowledge graph strategy will depend on your organization’s needs, content and data sources, publication process, and technology stack. That said, there are a few considerations that apply in most cases.
Understand the Business Context
Algorithmic business logic in traditional systems is often hopelessly intertwined with the constraints imposed by a particular product or vendor. This creates networks of byzantine systems and legacy dependencies that grow heavier and more cumbersome year by year. Untangling these dependencies and understanding which rules are actually necessary to create value is the first step in modeling these systems for efficiency and scale.
Embrace Open Standards
Most of the technologies that power knowledge graphs and the semantic web have been around for at least 20 years, and the ecosystem and open-source tools available are mature and well integrated. By embracing open semantic web standards like RDF, OWL, and SPARQL, organizations can ensure interoperability across platforms and avoid locking in early to a particular vendor's customizations.
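What these standards buy you is a shared, declarative way to ask questions of any conformant store. At the heart of SPARQL, for instance, is triple-pattern matching. A toy illustration in plain Python (a real deployment would use a standards-compliant RDF store; the `ex:` names are hypothetical):

```python
# Illustrative graph using compact, hypothetical "ex:" identifiers.
graph = [
    ("ex:Alice", "ex:worksFor", "ex:Acme"),
    ("ex:Bob", "ex:worksFor", "ex:Acme"),
    ("ex:Acme", "ex:locatedIn", "ex:Berlin"),
]

def match(pattern, graph):
    """Match one (s, p, o) pattern; terms starting with '?' are variables.
    Roughly what the SPARQL pattern { ?who ex:worksFor ex:Acme } expresses."""
    results = []
    for triple in graph:
        binding = {}
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if term in binding and binding[term] != value:
                    break  # same variable bound to two different values
                binding[term] = value
            elif term != value:
                break  # constant term does not match this triple
        else:
            results.append(binding)
    return results

print(match(("?who", "ex:worksFor", "ex:Acme"), graph))
```

Because the query language is a standard rather than a vendor feature, the same question runs unchanged against any conformant triple store.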
Deliver Value Early
Knowledge graph solutions built with standards-compliant semantic web technologies differ fundamentally from traditional monolithic data management products: it takes only a small amount of semantic structure to create new value, and subsequent efforts expand alongside existing work rather than replacing it. This shift means it is possible to think big while starting small (with due credit to industry pioneer Dave McComb).
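To make "a small amount of structure creates new value" concrete: even a handful of broader-than links between topics lets a system answer questions no single source stated directly. A minimal sketch, with hypothetical topic names:

```python
# Three hand-entered "narrower is-a broader" links. Topic names are
# illustrative; any small taxonomy fragment works the same way.
broader = {
    ("espresso", "coffee"),
    ("coffee", "beverage"),
    ("tea", "beverage"),
}

def ancestors(topic, broader):
    """All broader topics reachable from `topic` (simple transitive closure)."""
    found, frontier = set(), {topic}
    while frontier:
        step = {b for (n, b) in broader if n in frontier}
        frontier = step - found
        found |= step
    return found

# No source ever stated "espresso is a beverage", yet it follows:
print(ancestors("espresso", broader))
```

Each new link added later compounds with the existing ones, which is why small, incremental semantic investments pay off early instead of requiring a big-bang migration.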