Wolfram Alpha – Computational knowledge engine

Back in September of last year I did a post on Cognition’s Semantic NLP, and today I spotted an article on TechCrunch about Stephen Wolfram’s knowledge engine. It’s not a search engine like Google’s but rather a knowledge engine, owing to its nature of giving direct answers to factual questions.

It doesn’t simply return documents that (might) contain the answers, like Google does, and it isn’t just a giant database of knowledge, like Wikipedia. It doesn’t simply parse natural language and then use that to retrieve documents, like Powerset, for example. Instead, Wolfram Alpha actually computes the answers to a wide range of questions that have factual answers, such as “What country is Timbuktu in?” or “How many protons are in a hydrogen atom?” or “What is the average rainfall in Seattle?”

Okay, that sounds really cool, I hear you say – but how does it work?

Wolfram Alpha is a system for computing the answers to questions. To accomplish this it uses built-in models of fields of knowledge, complete with data and algorithms, that represent real-world knowledge.

For example, it contains formal models of much of what we know about science — massive amounts of data about various physical laws and properties, as well as data about the physical world.

Based on this, you can ask it scientific questions and it can compute the answers for you, even if it has not been explicitly programmed to answer each particular question.
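To make the idea concrete, here is a toy sketch of that model-plus-data approach. This is purely my own illustration, not Wolfram Alpha’s actual design: the element table and the Seattle rainfall figures are made-up stand-ins for its curated knowledge base.

```python
# Toy "knowledge engine": answers are computed from built-in models and
# curated data, not retrieved as pre-written documents.

# Hypothetical curated data: atomic numbers (= proton counts per atom).
ATOMIC_NUMBER = {"hydrogen": 1, "helium": 2, "carbon": 6}

# Hypothetical monthly rainfall figures for Seattle, in millimetres.
SEATTLE_RAINFALL_MM = [143, 89, 94, 69, 48, 40, 18, 23, 38, 86, 167, 137]

def protons_in(element: str) -> int:
    # The answer comes from a model (atomic number = number of protons),
    # not from a stored sentence matching the question.
    return ATOMIC_NUMBER[element]

def average(values) -> float:
    # Derived quantities are computed on demand from the raw data.
    return sum(values) / len(values)

print(protons_in("hydrogen"))                      # 1
print(round(average(SEATTLE_RAINFALL_MM), 1))      # 79.3
```

The point is that the engine was never told the literal answer to “how many protons are in a hydrogen atom?”; it derives it from a small model plus data, which is what lets it cover questions it was never explicitly programmed for.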

Okay, so they’re building Skynet or something similar that’ll send back cybernetic organisms to take over the world? No, not quite.

There is no risk of Wolfram Alpha becoming too smart, or taking over the world. It’s good at answering factual questions; it’s a computing machine, a tool — not a mind.

On the topic of the Semantic Web and why Wolfram Alpha isn’t built on it, the article says the following:

The first question was could (or even should) Wolfram Alpha be built using the Semantic Web in some manner, rather than (or as well as) the Mathematica engine it is currently built on. Is anything missed by not building it with Semantic Web’s languages (RDF, OWL, Sparql, etc.)?

The answer is that there is no reason that one MUST use the Semantic Web stack to build something like Wolfram Alpha. In fact, in my opinion it would be far too difficult to try to explicitly represent everything Wolfram Alpha knows and can compute using OWL ontologies. It is too wide a range of human knowledge and giant OWL ontologies are just too difficult to build and curate.
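A rough sketch of that trade-off, using my own made-up examples rather than anything from the interview: an explicit triple store in the RDF/OWL style must enumerate every fact by hand, while a single computational rule covers an open-ended family of questions.

```python
# Explicit representation: one hand-curated triple per fact, RDF-style.
triples = {
    ("Timbuktu", "locatedIn", "Mali"),
    ("hydrogen", "protonCount", "1"),
    # ...and so on, one entry for every single fact...
}

def lookup(subject: str, predicate: str):
    # Can only answer questions whose facts were explicitly entered.
    for s, p, o in triples:
        if s == subject and p == predicate:
            return o
    return None

# Computational representation: one formula answers infinitely many
# "what is the gravitational force between X and Y?" questions.
def gravitational_force_newtons(m1_kg: float, m2_kg: float, distance_m: float) -> float:
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
    return G * m1_kg * m2_kg / distance_m ** 2

print(lookup("Timbuktu", "locatedIn"))  # Mali
# Approximate Earth-Moon force, using rounded mass and distance figures:
print(gravitational_force_newtons(5.97e24, 7.35e22, 3.84e8))
```

Curating a triple (or an OWL axiom) per fact scales with the number of facts; a formula scales with the number of *kinds* of question, which is the argument the interview is making against representing all of Wolfram Alpha’s knowledge as giant ontologies.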

For an in-depth look you can read the full interview in the TechCrunch article. Coming to a web browser near you in May 2009! [source: TechCrunch]