'?/minorExponent :minor-exp
'?/exchangeRate :exchange-rate
'?/exchangeWith :exchange-with
'?/exchangeRateDate :exchange-date})
The specific URL that we're going to scrape is http://www.x-rates.com/table/?from=USD&amount=1.00. Let's go ahead and put everything together:
user=> (def d
         (aggregate-data t-store "data/currencies.ttl"
                         "http://www.x-rates.com/table/?from=USD&amount=1.00"
                         q col-map))
user=> (sel d :rows (range 3)
            :cols [:fullname :name :exchange-rate])
| :fullname                   | :name  | :exchange-rate |
|-----------------------------+--------+----------------|
| United Arab Emirates dirham | dirham |       3.672845 |
| United Arab Emirates dirham | dirham |       3.672845 |
| United Arab Emirates dirham | dirham |       3.672849 |
As you will see, some of the currencies from currencies.ttl don't have exchange data (the rows that start with nil). We can look in other sources for that data, or decide that those currencies don't matter for our project.
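If we choose the second option, one way to drop those rows is to filter the dataset right in the REPL. This is a minimal sketch, assuming the aggregated dataset is still bound to d and that Incanter's $where and nrow are available in the session the same way sel is above:

user=> (def with-rates
         ($where {:exchange-rate {:$fn (complement nil?)}} d))
user=> (nrow with-rates)

The {:$fn pred} form keeps only the rows whose :exchange-rate value satisfies the predicate, here anything that isn't nil.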
How it works…
A lot of this is just a slightly more complicated version of what we've seen before, pulled
together into one recipe. The complicated part is scraping the web page, which is driven by
the structure of the page itself.
After taking a look at the page's source and experimenting with it at the REPL, its structure became clear. First, we needed to pull the timestamp off the top of the table that lists the exchange rates. Then, we walked over the table and pulled the data from each row. Both data tables (the short and the long one) are in a div element with a moduleContent class, so everything begins there.
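To make that starting point concrete, here is a minimal sketch of selecting those elements, assuming the page is parsed with Enlive (aliased as html); the helper name is hypothetical, not the recipe's own:

(require '[net.cgrand.enlive-html :as html])

;; Parse the page and select the div.moduleContent elements that
;; wrap the rates tables. Everything else drills down from here.
(defn module-content [url]
  (html/select (html/html-resource (java.net.URL. url))
               [:div.moduleContent]))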
Next, we drilled down from the module's content into the rows of the rates table. Inside each row, we pulled out the currency code and returned it as a symbol in the currency namespace. We also drilled down to the exchange rates and returned them as floats. Then, we put everything into a map and converted it to triple vectors, which we added to the triple store.
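As an illustration of that last step, converting a row's map into triples only requires pairing the currency's symbol with each key and value. This is a sketch with placeholder predicate names (the ex namespace below is illustrative, not the recipe's actual vocabulary):

;; Turn one scraped row into triple vectors, with the currency
;; symbol as the subject of every triple.
(defn row->triples [subject row-map]
  (map (fn [[predicate object]]
         [subject predicate object])
       row-map))

;; For example:
;; (row->triples 'currency/AED {'ex/exchangeRate 3.672845
;;                              'ex/exchangeWith 'currency/USD})
;; => ([currency/AED ex/exchangeRate 3.672845]
;;     [currency/AED ex/exchangeWith currency/USD])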
 