Chapter 7
Breaking Dimensions: Adaptive Scoring with Sparse Grids
Abstract We introduce the concept of a sparse grid and show how this powerful approach to function space discretization can be employed to tackle high-dimensional machine learning problems of regression and classification. In particular, we address the incremental computation of sparse grid regression coefficients so as to meet the requirements of real-time data mining. Finally, we present experimental results on real-world data sets.
7.1 Introduction
In this chapter, we shall use the hierarchical methods introduced in the previous chapter to devise a powerful method for scoring: sparse grids. We first demonstrate how scoring can be used to calculate high-quality recommendations, although only for a limited number of products. Then we introduce the sparse grid method and develop an adaptive sparse grid version. Finally, we present some numerical results.
We follow the approach of the papers [GG01a, GGT01], which describe sparse grids for classification. We develop an incremental sparse grid approach, i.e., incremental learning from new data points. Although sparse grids are quite complex, it turns out that extending them to this type of adaptivity is straightforward. This chapter addresses the mathematically inclined reader and requires solid knowledge of numerical analysis. Since it is not directly connected to the following chapters, it can also be skipped.
We start by mentioning that the term “scoring” is very general. Throughout this chapter, we identify scoring with supervised learning, which represents the common scoring domain.
In supervised learning we consider the given set of already classified data (the training set)

$S = \{(x_i, y_i) \in \mathbb{R}^d \times \mathbb{R},\; i = 1, \ldots, M\}.$
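To make this setting concrete, the following minimal Python sketch builds such a training set for a toy problem. The dimension d, the sample count M, and the synthetic labelling rule are illustrative assumptions and not taken from the chapter; the sparse grid scorer itself is developed only later in the text.

import numpy as np

# Minimal sketch of a training set S = {(x_i, y_i) in R^d x R, i = 1, ..., M}.
# d, M, and the labelling rule below are illustrative assumptions.

d = 2      # dimension of the feature space
M = 100    # number of already classified data points

rng = np.random.default_rng(seed=42)
X = rng.uniform(0.0, 1.0, size=(M, d))        # data points x_i in [0, 1]^d
y = np.where(X.sum(axis=1) > 1.0, 1.0, -1.0)  # labels y_i (here binary, +/-1)

# A scoring function learned from S is a mapping f: R^d -> R that should
# reproduce the labels y_i at the points x_i and generalize to new points.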