Based on the above theorems, we note the following.
- Both RankBoost and RankNet are consistent with the unweighted Kendall's τ (and therefore inconsistent with the weighted Kendall's τ). To make them consistent with the weighted Kendall's τ, we need to introduce the position discount factor D(u, v) for each document pair into their surrogate loss functions (see the sketch after this list).
- For Ranking SVM, since the hinge loss does not satisfy the condition in Theorem 18.5, we cannot establish its consistency based on the current theoretical results.
- ListMLE is consistent with the unweighted Kendall's τ (and therefore inconsistent with the weighted Kendall's τ). If we add the position discount factor D(u) to the likelihood loss, then ListMLE becomes consistent with the Difference-Weighting Kendall's τ (also illustrated in the sketch after this list).
18.4 Consistency Analysis for Two-Layer Ranking
As far as we know, there is no existing work on consistency analysis for two-layer ranking.
18.5 Summary
In this chapter, we have introduced some representative work on statistical consistency for ranking. Table 18.1 summarizes this work in the same format as the summary tables in the previous chapter.
Based on the table and the content of this chapter, we make the following observations.
- Given that there is still no consensus on the true loss for ranking, different algorithms may be proven to be consistent with different true losses. In this case, it is difficult to directly compare whether the theoretical properties of one algorithm are better than those of another.
- A meaningful true loss should be determined by the application. For example, widely used evaluation measures in information retrieval include MAP and NDCG (or DCG). Discussions regarding these measures could be more meaningful than those regarding less frequently used true losses (e.g., the pairwise 0-1 loss and the permutation-level 0-1 loss). In [6], some discussions are made with respect to DCG.
Table 18.1  Consistency analysis for ranking

Approaches   Document ranking   Subset ranking   Two-layer ranking
Pointwise    -                  [6]              -
Pairwise     [5]                [10]             -
Listwise     -                  [10, 12, 13]     -