5. Also, using this function is just as simple:
user=> (sel d :rows (range 3)
            :cols [:full-name :name :iso :symbol])

| :full-name                  | :name   | :iso | :symbol |
|-----------------------------+---------+------+---------|
| United Arab Emirates dirham | dirham  | AED  | د.إ     |
| Afghan afghani              | afghani | AFN  | ؋       |
| Albanian lek                | lek     | ALL  | L       |
How it works…
First, some background information. Resource Description Framework (RDF) isn't an XML format, although it's often written using XML. (There are other serializations as well, such as N3 and Turtle.) RDF sees the world as a set of statements. Each statement has at least three parts (a triple): a subject, a predicate, and an object. The subject and predicate must be URIs. (URIs are like URLs, only more general. For example, uri:7890 is a valid URI.) Objects can be literals or URIs. The URIs form a graph: they link to each other and make statements about each other. This is where the "linked" in "linked data" comes from.
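To make the triple model concrete, here is a small sketch of a few statements as plain Clojure data. All of the URIs and values here are invented for illustration; they are not the recipe's actual vocabulary:

```clojure
;; Each statement is a [subject predicate object] triple. Subjects and
;; predicates are URIs; objects may be URIs or literals. These URIs are
;; hypothetical examples, not real vocabulary terms.
(def triples
  [["http://example.com/currency/AED"      ; subject (a URI)
    "http://example.com/vocab/full-name"   ; predicate (a URI)
    "United Arab Emirates dirham"]         ; object (a literal)
   ["http://example.com/currency/AED"
    "http://example.com/vocab/symbol"
    "د.إ"]])
```

Because the first triple's subject reappears in the second, the two statements share a node in the graph; that sharing is what links the data together.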
If you want more information about linked data, http://linkeddata.org/guides-and-tutorials has some good recommendations.
Now, about our recipe. From a high level, the process we used here is pretty simple:

1. Create a triple store (kb-memstore and init-kb)
2. Load the data (load-data)
3. Query the data to pull out only what you want (q and load-data)
4. Transform it into a format that Incanter can ingest easily (rekey and col-map)
5. Finally, create the Incanter dataset (load-data)
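Put together, the steps above chain into a flow roughly like the following sketch. The names init-kb, kb-memstore, and load-data come from the recipe itself; the data-file argument and the query clauses here are placeholders, not the recipe's actual code:

```clojure
;; A rough outline of the pipeline, under the assumption that load-data
;; takes the store, a source file, and a query, and returns an Incanter
;; dataset (as in the recipe). The query vector is a made-up example.
(defn load-currency-data [data-file]
  (let [store (init-kb (kb-memstore))]      ; step 1: create the triple store
    (load-data store data-file              ; steps 2-5: load, query,
               '[[?/c rdf/type money/currency]   ; rekey, and build the
                 [?/c money/name ?/name]])))     ; Incanter dataset
```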
The newest thing here is the query format. kb uses a nice SPARQL-like DSL to express the queries. In fact, it's so easy to use that we'll deal with it instead of working with raw RDF.
The items starting with ?/ are variables; they will be used as keys for the result maps. The other items look like rdf-namespace/value. The namespace is taken from the registered namespaces defined in init-kb. These are different from Clojure's namespaces, although they serve a similar function for your data: to partition it and provide context.
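As an illustration of that format, a query in this DSL might look like the following. The rdf and money prefixes are assumed to be among the namespaces registered in init-kb, and the specific vocabulary terms here are invented:

```clojure
;; Each clause is a [subject predicate object] pattern. Symbols
;; beginning with ?/ are variables; every solution binds them, so each
;; result map would carry keys such as :name and :symbol.
'[[?/currency rdf/type money/currency]
  [?/currency money/name ?/name]
  [?/currency money/symbol ?/symbol]]
```

Reusing ?/currency across all three clauses is what joins them: only subjects that match every pattern appear in the results.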
See also
The next few recipes, Querying RDF data with SPARQL and Aggregating data from different formats, build on this recipe and use much of the same setup and techniques.