5.2 Comparison of the Models
We compare the proposed model with three other models as described below.
1. Support Vector Regression (SVR) [15]. This model solves the following regression
problem:
\[
\min_{\mathbf{w}} \; \frac{1}{2}\|\mathbf{w}\|^{2} + C \sum_{i} \frac{1}{y_i}\,\max\!\left(0,\; \left|y_i - \mathbf{w}^{\top}\mathbf{x}_i\right| - \epsilon\right) \qquad (13)
\]
Note that Eq. 13 differs from the original SVR formulation in the additional 1/y_i weight, which is added so that the objective directly optimizes the MAPE. We tried both linear and polynomial feature transformations and used LIBLINEAR [7] for the experiments. After several trials, we chose the regularization parameter C = 1.
2. GP_ard. GPR with the ARD SE kernel (Eq. 6).
3. GP_iso. GPR with the isotropic SE kernel (Eq. 6, with a single length-scale shared across all input dimensions).
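To make the GP_ard/GP_iso distinction concrete, the sketch below implements the squared-exponential kernel with per-dimension length-scales (ARD); passing one shared length-scale for every dimension recovers the isotropic variant. The input points and length-scale values here are illustrative, not taken from the paper.

```python
import numpy as np

def se_kernel(X1, X2, lengthscales, sigma_f=1.0):
    """Squared-exponential kernel with per-dimension length-scales (ARD).

    A single shared length-scale across all dimensions gives the
    isotropic SE kernel (GP_iso); distinct values give ARD (GP_ard).
    """
    ell = np.asarray(lengthscales, dtype=float)
    # Scale each dimension by its length-scale, then form pairwise
    # squared distances between the scaled points.
    A = X1 / ell
    B = X2 / ell
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return sigma_f**2 * np.exp(-0.5 * np.maximum(sq_dists, 0.0))

X = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
K_iso = se_kernel(X, X, lengthscales=[1.0, 1.0])   # isotropic: one shared scale
K_ard = se_kernel(X, X, lengthscales=[1.0, 10.0])  # ARD: 2nd feature down-weighted
```

With a large length-scale on the second dimension, ARD treats points that differ only in that dimension as more similar, which is how the kernel can effectively switch off uninformative features.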
There are six types of features, namely social network features (from Facebook),
opinion features, trend features, ratings from previous three episodes, rating of the
first episode, and weekday indicator variables. Therefore, there are in total 2^6 − 1 = 63
different combinations of features. Table 2 shows the average MAPE of all feature
combinations for each drama. We also rank the MAPEs obtained from the different
models within each feature combination and then compute the average rank. The result
is shown in Table 3. Our model outperforms the baseline models in terms of both
MAPE and ranking.
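The evaluation protocol above (per-combination MAPE, then per-combination ranks averaged across combinations) can be sketched as follows. The prediction values and the small MAPE matrix are hypothetical, and this simple double-argsort ranking ignores ties.

```python
import numpy as np

def mape(y, yhat):
    """Mean absolute percentage error, the metric reported in Table 2."""
    return np.mean(np.abs(y - yhat) / np.abs(y))

# Hypothetical ratings and model predictions for one drama (illustrative values).
y_true = np.array([1.20, 1.35, 1.10, 1.50])
preds = {
    "SVR":    np.array([1.10, 1.30, 1.25, 1.40]),
    "GP_ard": np.array([1.18, 1.32, 1.15, 1.45]),
}
scores = {name: mape(y_true, p) for name, p in preds.items()}

# Ranking as in Table 3: rank the models within each feature combination
# (1 = lowest MAPE), then average the ranks across combinations.
mape_matrix = np.array([[0.12, 0.10],   # model A over two feature combinations
                        [0.11, 0.13]])  # model B
ranks = mape_matrix.argsort(axis=0).argsort(axis=0) + 1
avg_rank = ranks.mean(axis=1)           # one average rank per model
```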
Table 2. Average MAPE of all possible feature combinations

Model       D1      D2      D3      D4      Avg.
SVR         0.1132  0.1027  0.1300  0.1396  0.1214
GP_ard      0.1124  0.0928  0.1297  0.1162  0.1128
GP_iso      0.1165  0.0959  0.1357  0.1158  0.1160
Our Model   0.1163  0.0918  0.1276  0.1117  0.1118
Table 3. Average ranking of all feature combinations

Model       D1      D2      D3      D4      Avg.
SVR         2.1904  3.3492  2.1587  3.0952  2.6984
GP_ard      2.4365  2.2619  2.9444  2.6349  2.5694
GP_iso      2.7540  2.4683  2.5635  2.4365  2.5556
Our Model   2.6190  1.9206  2.3333  1.8333  2.1766
Then, we compare our modified GP model with the two standard GP-based com-
petitors, GP_ard and GP_iso. Since the best feature combination is fairly different for