Linear Discriminant Analysis (LDA) is a popular technique for classification tasks. However, its applicability falters when the sample is high-dimensional, that is, when the number of input variables exceeds the sample size.
Many authors have studied modifications of the classical LDA for high-dimensional samples, with early proposals based on the independence rule and variable selection. Bickel and Levina (2004) proposed the independence rule, or diagonal LDA paradigm, which estimates the population covariance matrix by a diagonal matrix, ignoring the correlation structure of the input variables. Tibshirani et al. (2002) proposed the nearest shrunken centroids method, which combines the independence rule with soft-thresholding-based variable selection.
Recently, sparse LDA approaches have been proposed as an alternative to the independence rule. Cai and Liu (2011) proposed the linear programming discriminant rule, which finds a sparse estimate of the discriminant direction by using the Dantzig selector (James et al., 2009). Mai et al. (2012) proposed the direct sparse discriminant analysis (DSDA), which recasts the LDA as a penalized least squares problem and solves it with the LASSO penalty.
In this study, we focus on the DSDA applied with a class of concave penalties, including the smoothly clipped absolute deviation (SCAD; Fan and Li, 2001), minimax concave (MCP; Zhang, 2010), and truncated lasso (TLP; Shen et al., 2012) penalties.
The rest of the paper is organized as follows. Section 2 introduces the sparse LDA. Section 3 introduces the concave penalized LDA and presents the related theoretical results. Section 4 provides numerical studies to confirm the theoretical results, and concluding remarks are given in Section 5. Technical details and proofs are provided in the Appendix.
Fisher’s Linear Discriminant Analysis (LDA) (Fisher, 1936) is an efficient technique for discriminating a binary class label $Y \in \{1, 2\}$ from an input vector $X \in \mathbb{R}^p$. The LDA model assumes that the classes are normally distributed,
$$X \mid Y = k \sim N_p(\mu_k, \Sigma_k), \quad k = 1, 2,$$
independently, where $\mu_k$ and $\Sigma_k$ are the mean vector and covariance matrix of the $k$th class. In addition, when the covariance matrices are common, $\Sigma_1 = \Sigma_2 = \Sigma$, the Bayes rule classifies $x$ to class 1 if and only if
$$\Big(x - \frac{\mu_1 + \mu_2}{2}\Big)^\top \beta^{\mathrm{Bayes}} + \log\frac{\pi_1}{\pi_2} > 0,$$
where $\pi_k = P(Y = k)$ and $\beta^{\mathrm{Bayes}} = \Sigma^{-1}(\mu_1 - \mu_2)$ is the Bayes direction vector. Let $(x_1, y_1), \ldots, (x_n, y_n)$ be a sample from the model. The LDA rule is obtained by replacing $\mu_k$, $\Sigma$, and $\pi_k$ with their sample counterparts
$$\hat{\mu}_k = \frac{1}{n_k} \sum_{i : y_i = k} x_i \quad \text{and} \quad \hat{\Sigma} = \frac{1}{n - 2} \sum_{k=1}^{2} \sum_{i : y_i = k} (x_i - \hat{\mu}_k)(x_i - \hat{\mu}_k)^\top,$$
where $n_k = |\{i : y_i = k\}|$ and $\hat{\pi}_k = n_k / n$.
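For concreteness, the classical rule can be written in a few lines of numpy. The sketch below is our illustration of the estimator above, not code from the paper, and all names are ours.

```python
import numpy as np

def fit_lda(X, y):
    """Classical two-class LDA with the pooled covariance estimate.

    X is an (n, p) input matrix and y an (n,) label vector in {1, 2}.
    Returns (beta_hat, b0) so that x is classified to class 1 iff
    x @ beta_hat + b0 > 0.
    """
    X1, X2 = X[y == 1], X[y == 2]
    n1, n2 = len(X1), len(X2)
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    # Pooled (within-class) covariance estimate with n - 2 degrees of freedom.
    S = ((X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)) / (n1 + n2 - 2)
    beta_hat = np.linalg.solve(S, mu1 - mu2)   # estimate of Sigma^{-1}(mu1 - mu2)
    b0 = -0.5 * (mu1 + mu2) @ beta_hat + np.log(n1 / n2)
    return beta_hat, b0
```

Note that `np.linalg.solve` fails once $p > n$, since the pooled covariance estimate is singular; this is exactly the regime addressed below.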
In many applied fields, direct use of the LDA raises two challenging problems: sparsity of the model and high-dimensionality of the sample. Sparsity of the model means that the Bayes direction vector satisfies $\|\beta^{\mathrm{Bayes}}\|_0 = q \ll p$; that is, only a few input variables contribute to the Bayes rule. High-dimensionality of the sample means that the number of input variables may grow much faster than the sample size, for example, $\log p = O(n^c)$ for some positive constant $c < 1$. Hence the LDA rule described above becomes inapplicable, since the sample covariance matrix $\hat{\Sigma}$ is singular whenever $p > n$.
Given the sparsity and high-dimensionality, it is natural to employ the penalized least squares estimator (LSE):
$$(\hat{\beta}_0, \hat{\beta}) = \operatorname*{arg\,min}_{\beta_0 \in \mathbb{R},\, \beta \in \mathbb{R}^p} \; \frac{1}{n} \sum_{i=1}^{n} \big(z_i - \beta_0 - x_i^\top \beta\big)^2 + \sum_{j=1}^{p} J_\lambda(|\beta_j|),$$
where $z_i = n/n_1$ if $y_i = 1$ and $z_i = -n/n_2$ if $y_i = 2$ is the recoded response of Mai et al. (2012), and $J_\lambda$ is a penalty function indexed by the tuning parameter $\lambda > 0$.
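A minimal sketch of this estimator with the LASSO penalty is given below, using scikit-learn's coordinate descent solver. The response recoding follows Mai et al. (2012); the intercept, however, is a simplified LDA-style plug-in rather than the exact DSDA intercept, so the sketch should be read as an illustration of the framework only.

```python
import numpy as np
from sklearn.linear_model import Lasso

def dsda_lasso(X, y, lam):
    """Penalized LSE for the discriminant direction (LASSO penalty).

    y takes values in {1, 2}; the recoded response z makes the LSE
    direction proportional to the Bayes direction (Mai et al., 2012).
    """
    n = len(y)
    n1, n2 = np.sum(y == 1), np.sum(y == 2)
    z = np.where(y == 1, n / n1, -n / n2)          # recoded response
    beta = Lasso(alpha=lam, fit_intercept=True).fit(X, z).coef_
    # Simplified plug-in intercept; the exact DSDA intercept rescales the
    # log-prior term to match the estimated direction.
    mu1, mu2 = X[y == 1].mean(axis=0), X[y == 2].mean(axis=0)
    b0 = -0.5 * (mu1 + mu2) @ beta + np.log(n1 / n2)
    return beta, b0          # classify x to class 1 iff x @ beta + b0 > 0
```

Replacing the LASSO by a concave penalty requires a nonconvex solver (for example, a local linear approximation loop); the fitter interface `(X, y, lam) -> (beta, b0)` is reused in the sketches below.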
Let $\nabla J_\lambda(t)$ denote the derivative of $J_\lambda$ at $t > 0$. We consider the class of penalties satisfying the following conditions:

(J1) $J_\lambda(t)$ is nondecreasing and concave in $t \in [0, \infty)$ with $J_\lambda(0) = 0$.

(J2) $J_\lambda(t)$ is differentiable on $(0, \infty)$ with $\lim_{t \downarrow 0} \nabla J_\lambda(t) = \lambda$.

(J3) There exists a constant $a > 0$ such that $\nabla J_\lambda(t) = 0$ for all $t \ge a\lambda$.
The class of penalties that satisfy (J1), (J2), and (J3) has been studied as a representative concave penalty class (Kim and Kwon, 2012; Zhang and Zhang, 2012). It includes the smoothly clipped absolute deviation (SCAD) penalty (Fan and Li, 2001),
$$\nabla J_\lambda(t) = \lambda \Big\{ I(t \le \lambda) + \frac{(a\lambda - t)_+}{(a - 1)\lambda} I(t > \lambda) \Big\}, \quad a > 2,$$
the minimax concave penalty (MCP) (Zhang, 2010),
$$\nabla J_\lambda(t) = \Big(\lambda - \frac{t}{a}\Big)_+, \quad a > 1,$$
and the truncated lasso penalty (TLP) (Shen et al., 2012),
$$J_\lambda(t) = \lambda \min(t, a\lambda),$$
as examples, where $a$ is a constant that controls the concavity of the penalty.
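The derivatives of the three penalties can be coded directly from the displays above. The parameterization below (in particular, the role of $a$ in the TLP) follows common conventions and is ours, not necessarily the paper's.

```python
import numpy as np

# Penalty derivatives on t >= 0. Each satisfies (J1)-(J3): slope lambda
# at zero and slope exactly zero beyond a * lambda.

def scad_deriv(t, lam, a=3.7):   # requires a > 2
    t = np.asarray(t, dtype=float)
    return lam * np.where(t <= lam, 1.0,
                          np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam))

def mcp_deriv(t, lam, a=3.0):    # requires a > 1
    t = np.asarray(t, dtype=float)
    return np.maximum(lam - t / a, 0.0)

def tlp_deriv(t, lam, a=3.0):    # J(t) = lam * min(t, a * lam)
    t = np.asarray(t, dtype=float)
    return lam * (t < a * lam)
```

These derivatives are what a local linear approximation algorithm needs: each iteration solves a weighted LASSO problem whose weights are the penalty derivatives evaluated at the current coefficients.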
Let $S = \{j : \beta^{\mathrm{Bayes}}_j \ne 0\}$ denote the true signal set with $|S| = q$. Now, considering the true signal set, a natural benchmark is the oracle LSE, the least squares estimator obtained by regressing the recoded response on the signal variables in $S$ only, with all other coefficients set to zero, which is unavailable in practice without the knowledge of $S$. Hence, the main objective of the concave penalized LDA is to recover the oracle LSE from the data, where the tuning parameter $\lambda$ is chosen so that the penalized LSE coincides with the oracle LSE with high probability.
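Although the oracle LSE is infeasible without knowledge of $S$, it is trivial to compute in simulations where $S$ is known, which makes it a convenient benchmark. A sketch, with `z` the recoded response from above:

```python
import numpy as np

def oracle_lse(X, z, S):
    """Least squares on the true signal set S only, zeros elsewhere."""
    n, p = X.shape
    Xs = np.column_stack([np.ones(n), X[:, S]])    # intercept + signal columns
    coef, *_ = np.linalg.lstsq(Xs, z, rcond=None)
    beta = np.zeros(p)
    beta[S] = coef[1:]
    return coef[0], beta                           # (intercept, direction)
```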
In this subsection, we introduce three lemmas that provide non-asymptotic sufficient conditions for a given estimator to be a local or a unique local minimizer of the penalized objective function.

The conditions in Lemma 1 are simply the sub-gradient optimality conditions for the penalty class that satisfies (J1), (J2), and (J3), under which a given estimator becomes a local minimizer (Kim and Kwon, 2012).
The following lemma states that the uniqueness condition forms a sufficient condition for a local minimizer to be unique. Lemma 2 requires a stronger condition on the design than Lemma 1 but, in return, guarantees uniqueness of the local minimizer.
Two remarks are in order. (a) The uniqueness and sparse Riesz conditions restrict the eigenvalues of the Gram matrix only over sparse subsets of variables, in the spirit of the sparse Riesz condition of Zhang and Huang (2008).
In this subsection, we present the main results of the paper: the concave penalized LSE recovers the oracle LSE exactly with probability tending to one, so that the oracle property holds.

We introduce some notation. For any matrix $A$ and index sets $S$ and $T$, let $A_{ST}$ denote the submatrix of $A$ whose rows and columns are indexed by $S$ and $T$, respectively. Recall the definition of the oracle LSE: the least squares fit of the recoded response on the signal variables in $S$, with all remaining coefficients set to zero,
$$(\hat{\beta}^{o}_0, \hat{\beta}^{o}_S) = \operatorname*{arg\,min}_{\beta_0, \beta_S} \frac{1}{n} \sum_{i=1}^{n} \big(z_i - \beta_0 - x_{iS}^\top \beta_S\big)^2, \qquad \hat{\beta}^{o}_{S^c} = 0.$$
Throughout, $\Omega = \mathrm{Cov}(X)$ denotes the marginal covariance matrix of the input vector and $\Omega_{SS}$ its submatrix on the signal set. The main objective of the concave penalized LSE is to recover the oracle LSE exactly.
(C1) There exist positive constants $c_1 \le c_2$ such that the eigenvalues of $\Omega_{SS}$ satisfy $c_1 \le \lambda_{\min}(\Omega_{SS}) \le \lambda_{\max}(\Omega_{SS}) \le c_2$ for any sufficiently large $n$.
The conditions in (C1) specify technical requirements for the oracle property that are slightly weaker than those applied with the Least Absolute Shrinkage and Selection Operator (LASSO) in Mai et al. (2012).
If (C1) holds and the tuning parameter $\lambda$ is chosen at an appropriate rate, then the concave penalized LSE coincides with the oracle LSE with probability tending to one as $n \to \infty$; that is, the oracle property holds.
The penalized LDA allows for exponentially many input variables, polynomially many signal variables, and diminishing regression coefficients; that is, $p$ may grow exponentially in $n$, the number of signal variables $q$ may grow polynomially in $n$, and the minimal signal strength $\min_{j \in S} |\beta^{\mathrm{Bayes}}_j|$ may decrease to zero at a polynomial rate in $n$.
Compared with the ordinary penalized linear regression (Zhang, 2010), the requirements are stronger. The stronger requirements mainly come from the random design matrix, whose sample Gram matrix must be controlled uniformly over sparse subsets of variables, in contrast to the fixed design of ordinary penalized regression.
In this section, we report the results of simulation studies and a real data analysis.
We generated samples from the two-class Gaussian model with $p \in \{300, 3000\}$ input variables, $q \in \{10, 20\}$ signal variables, correlation parameter $\rho \in \{0.25, 0.5\}$, and sample size $n \in \{500, 1000, 2000\}$.
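The exact data-generating process is summarized only at a high level above, so the following generator is a hypothetical stand-in: it assumes an AR(1) covariance $\Sigma_{ij} = \rho^{|i-j|}$ and a mean difference of size `delta` on the first $q$ coordinates of class 1, both of which are our assumptions.

```python
import numpy as np

def generate_sample(n, p, q, rho, delta=0.5, seed=None):
    """Two-class Gaussian sample under an assumed AR(1) covariance."""
    rng = np.random.default_rng(seed)
    Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
    L = np.linalg.cholesky(Sigma)
    y = rng.integers(1, 3, size=n)            # labels in {1, 2}
    X = rng.standard_normal((n, p)) @ L.T     # N_p(0, Sigma) rows
    mu = np.zeros(p)
    mu[:q] = delta                            # signal on the first q coordinates
    X[y == 1] += mu
    return X, y
```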
We first investigated whether the oracle property can hold with finite samples; Table 1 shows the estimated probabilities of achieving the oracle property. For each simulated dataset, we first found the interval of tuning parameters $[\lambda_{\min}, \lambda_{\max}]$ over which the penalized LSE coincides with the oracle LSE, and the reported values are the proportions of repetitions for which this interval is nonempty.
The ratios for the TLP and MCP approach 1 in several cases, while the SCAD has lower ratios, with its largest being 0.79. In cases where both $p$ and $q$ are large, all of the penalties require a larger sample size to achieve the oracle property.
In addition, we checked the sign consistency, $\mathrm{sign}(\hat{\beta}) = \mathrm{sign}(\beta^{\mathrm{Bayes}})$, with the LASSO included for reference; the results are reported in the right panel of Table 1.
Second, we compared the finite sample performance of the concave penalized estimators, using the LASSO as a benchmark method. The primary objective of the simulation is to check whether the oracle property can be realized through typical tuning parameter selection criteria. Note that two natural selection criteria arise from the characteristics of the framework: minimizing the regression error of the penalized LSE on a validation sample, or minimizing the classification error of the resulting discriminant rule.
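Both criteria amount to a single validation loop over a grid of $\lambda$ values. In the sketch below, `fit_fn` stands for any penalized fitter with the interface used above; reusing the classification intercept inside the regression error is a simplification of ours.

```python
import numpy as np

def select_lambda(fit_fn, X_tr, y_tr, X_val, y_val, lams, criterion="cls"):
    """Pick lambda minimizing the validation regression ("reg") or
    classification ("cls") error."""
    n = len(y_val)
    n1, n2 = np.sum(y_val == 1), np.sum(y_val == 2)
    z_val = np.where(y_val == 1, n / n1, -n / n2)  # recoded validation response
    best_lam, best_err = None, np.inf
    for lam in lams:
        beta, b0 = fit_fn(X_tr, y_tr, lam)
        if criterion == "cls":
            pred = np.where(X_val @ beta + b0 > 0, 1, 2)
            err = np.mean(pred != y_val)
        else:
            err = np.mean((z_val - b0 - X_val @ beta) ** 2)
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam
```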
Tables 2 and 3 summarize the results. We first discuss the cases when the tuning parameters are selected by minimizing the regression errors (Table 2):
TPS increases to its maximum value $q$ as $n$ increases, for all the methods.
For the concave penalties, FPS decreases as $n$ increases, while the FPS of the LASSO remains large for all $n$.
For the concave penalties, CMS increases as $n$ increases, while the CMS of the LASSO is zero in all cases.
One of the reasons for the poor FPS and CMS of the SCAD seems to be that its shape, or concavity, on the interval $[0, a\lambda]$ differs from that of the TLP and MCP.
ERR decreases to that of the Bayes rule, the best possible performance, as $n$ increases.
For the cases when the tuning parameters are selected by minimizing the classification errors (Table 3), we observed the following:
TPS shows patterns similar to those observed with the regression error criterion.
The LASSO and SCAD select significantly fewer noise variables, producing lower FPS and slightly higher CMS, whereas the TLP and MCP show the opposite pattern.
To sum up: (a) tuning parameter selection based on the regression error exhibits better selection performance for the TLP and MCP, whereas for the LASSO and SCAD the classification error seems more informative, as indicated by smaller FPS and larger CMS. (b) The two tuning parameter selection criteria function to some extent in the simulation settings designed in this paper. (c) However, the correct-model rates never reach the probabilities in Table 1, which approach 1 for large $n$ regardless of the penalties and simulation settings. This suggests that we need to develop other alternatives, such as an information criterion based on the underlying probability structure.
We apply the penalized LDA methods to two benchmark datasets, the prostate and lung cancer datasets, which are available from the R package datamicroarray. The prostate cancer dataset consists of the expression levels of approximately 12,600 genes obtained from 52 prostate tumour samples and 50 non-tumour prostate samples (Singh et al., 2002).
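The sketch below reproduces the shape of the experiment for one random partition; restricting the analysis to the top `p_keep` genes by two-sample $t$-statistics is an assumption made for illustration, as is the split fraction.

```python
import numpy as np

def one_split(X, y, p_keep, fit_fn, lam, test_frac=0.3, seed=0):
    """One random train/test partition with marginal t-statistic screening."""
    rng = np.random.default_rng(seed)
    n = len(y)
    test = np.zeros(n, dtype=bool)
    test[rng.permutation(n)[: int(test_frac * n)]] = True
    Xtr, ytr, Xte, yte = X[~test], y[~test], X[test], y[test]
    G1, G2 = Xtr[ytr == 1], Xtr[ytr == 2]
    t = np.abs(G1.mean(0) - G2.mean(0)) / np.sqrt(
        G1.var(0) / len(G1) + G2.var(0) / len(G2) + 1e-12)
    keep = np.argsort(t)[-p_keep:]                  # top p_keep genes
    beta, b0 = fit_fn(Xtr[:, keep], ytr, lam)
    pred = np.where(Xte[:, keep] @ beta + b0 > 0, 1, 2)
    return np.mean(pred != yte), np.sum(beta != 0)  # ERR and SIZE
```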
Table 4 presents the average values of the three measures obtained from 200 random partitions of the data. In most cases, the LASSO shows the best prediction performance but selects the most variables. The TLP, MCP, and SCAD show similar prediction performances and select substantially fewer variables than the LASSO. Among the concave penalized methods, the SCAD selects more variables than the others. Similar to the simulation results, the methods tuned by the regression error exhibit better prediction performance. For model size, the LASSO and SCAD produce sparser models when tuned by the classification error, while the TLP and MCP produce sparser models when tuned by the regression error. These results suggest that the concave penalized LDA methods can be a good alternative when we wish to construct a sparse model without losing much prediction accuracy.
In this paper, we studied high-dimensional LDA based on concave penalized linear regression. We proved that the oracle property holds uniformly over a class of concave penalties, including the TLP, SCAD, and MCP. The primary advantage of the concave penalized approach is its superior selection performance compared with the convex penalized approach, as supported by the simulation studies. In addition, we found that the tuning parameter selection criteria based on prediction errors work to some extent, and hence may be used in practice. However, we believe there are better alternatives, such as the Bayesian information criterion, which has proven useful for other penalized approaches, including penalized linear regression (Fan and Tang, 2012).
Let $\|\cdot\|$ denote the Euclidean norm of a vector and, for a symmetric matrix, let $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ denote its smallest and largest eigenvalues.
From Cauchy’s interlacing theorem, the extreme eigenvalues of any principal submatrix are bracketed by those of the full matrix. By Lemma 4, it follows that the first inequality in the uniqueness condition holds with probability tending to 1, provided that $n$ is sufficiently large.
Second, we will prove that the second inequality in the uniqueness condition holds with probability tending to 1. By the triangle inequality, the left-hand side is bounded by the sum of two terms, each of which we bound in turn.
From Lemma 4, there exist positive constants such that the first term is bounded as required for all sufficiently large $n$, and the second term is bounded similarly for all sufficiently large $n$.
Third, we will show that the third inequality in the uniqueness condition holds with probability tending to 1, by using similar arguments.
From Lemma 4, there exist positive constants such that the required bounds hold for all sufficiently large $n$, which completes the proof.
Estimated probabilities of achieving the oracle property and sign consistency
| ρ | p | q | n | Oracle property | | | Sign consistency | | | |
|---|---|---|---|---|---|---|---|---|---|---|
| | | | | TLP | MCP | SCAD | LASSO | TLP | MCP | SCAD |
| 0.25 | 300 | 10 | 500 | 0.38 | 0.37 | 0 | 0.53 | 0.38 | 0.38 | 0.14 |
| | | | 1000 | 0.95 | 0.97 | 0.13 | 0.99 | 0.96 | 0.97 | 0.78 |
| | | | 2000 | 1 | 0.98 | 0.79 | 1 | 1 | 1 | 0.99 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| | | | 1000 | 0.11 | 0.11 | 0 | 0.13 | 0.11 | 0.11 | 0.03 |
| | | | 2000 | 0.87 | 0.87 | 0.01 | 0.83 | 0.87 | 0.87 | 0.43 |
| | 3000 | 10 | 500 | 0.05 | 0.07 | 0 | 0.10 | 0.05 | 0.08 | 0.02 |
| | | | 1000 | 0.89 | 0.89 | 0 | 0.90 | 0.89 | 0.90 | 0.48 |
| | | | 2000 | 1 | 0.98 | 0.33 | 1 | 1 | 1 | 0.99 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| | | | 1000 | 0 | 0 | 0 | 0.01 | 0 | 0 | 0 |
| | | | 2000 | 0.50 | 0.50 | 0 | 0.50 | 0.50 | 0.51 | 0.09 |
| 0.5 | 300 | 10 | 500 | 0.05 | 0.11 | 0 | 0 | 0.06 | 0.11 | 0 |
| | | | 1000 | 0.72 | 0.76 | 0.01 | 0.30 | 0.72 | 0.76 | 0.18 |
| | | | 2000 | 1 | 1 | 0.53 | 0.88 | 1 | 1 | 0.86 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| | | | 1000 | 0 | 0.01 | 0 | 0 | 0 | 0.01 | 0 |
| | | | 2000 | 0.38 | 0.38 | 0 | 0.04 | 0.38 | 0.38 | 0.01 |
| | 3000 | 10 | 500 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| | | | 1000 | 0.32 | 0.37 | 0 | 0.03 | 0.32 | 0.37 | 0 |
| | | | 2000 | 0.96 | 0.97 | 0.14 | 0.71 | 0.96 | 0.97 | 0.36 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| | | | 1000 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| | | | 2000 | 0.08 | 0.08 | 0 | 0 | 0.08 | 0.08 | 0 |
Averages of the four measures: Validation by regression errors
| ρ | p | q | n | TPS | | | | FPS | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | | | | LASSO | TLP | MCP | SCAD | LASSO | TLP | MCP | SCAD |
| 0.25 | 300 | 10 | 500 | 9.97 | 9.31 | 9.33 | 9.65 | 29.29 | 1.15 | 1.04 | 14.01 |
| | | | 1000 | 10.00 | 9.99 | 9.99 | 10.00 | 30.48 | 0.66 | 0.51 | 8.08 |
| | | | 2000 | 10.00 | 10.00 | 10.00 | 10.00 | 29.40 | 0.48 | 0.37 | 4.92 |
| | | 20 | 500 | 18.66 | 13.44 | 13.42 | 15.62 | 29.74 | 2.36 | 2.18 | 12.64 |
| | | | 1000 | 19.94 | 18.94 | 18.98 | 19.05 | 30.02 | 2.50 | 2.34 | 10.43 |
| | | | 2000 | 19.99 | 19.94 | 19.94 | 19.98 | 30.26 | 0.72 | 0.51 | 8.46 |
| | 3000 | 10 | 500 | 9.74 | 7.69 | 7.80 | 8.81 | 36.88 | 1.12 | 0.96 | 18.79 |
| | | | 1000 | 10.00 | 9.91 | 9.91 | 9.96 | 38.56 | 0.57 | 0.44 | 17.28 |
| | | | 2000 | 10.00 | 10.00 | 10.00 | 10.00 | 37.88 | 0.47 | 0.30 | 6.08 |
| | | 20 | 500 | 15.64 | 8.55 | 8.63 | 11.67 | 34.84 | 1.31 | 1.15 | 17.91 |
| | | | 1000 | 19.02 | 15.37 | 15.32 | 16.55 | 32.12 | 2.21 | 1.75 | 13.99 |
| | | | 2000 | 19.97 | 19.64 | 19.64 | 19.52 | 31.38 | 1.21 | 0.98 | 10.87 |
| 0.5 | 300 | 10 | 500 | 9.74 | 8.20 | 8.31 | 8.86 | 37.05 | 2.06 | 1.80 | 13.90 |
| | | | 1000 | 9.99 | 9.99 | 9.97 | 10.00 | 38.49 | 1.02 | 0.90 | 10.03 |
| | | | 2000 | 10.00 | 10.00 | 10.00 | 10.00 | 37.44 | 0.42 | 0.35 | 4.27 |
| | | 20 | 500 | 16.12 | 10.21 | 10.30 | 11.83 | 33.76 | 2.62 | 2.62 | 13.12 |
| | | | 1000 | 19.42 | 17.31 | 17.25 | 16.71 | 31.20 | 4.72 | 4.05 | 12.86 |
| | | | 2000 | 19.97 | 19.71 | 19.74 | 19.51 | 30.68 | 1.74 | 1.60 | 9.65 |
| | 3000 | 10 | 500 | 7.94 | 5.87 | 5.81 | 6.47 | 41.96 | 1.15 | 0.94 | 19.51 |
| | | | 1000 | 9.78 | 9.17 | 9.16 | 9.11 | 41.51 | 1.01 | 0.88 | 20.05 |
| | | | 2000 | 10.00 | 9.99 | 9.99 | 10.00 | 41.88 | 0.39 | 0.31 | 8.20 |
| | | 20 | 500 | 9.88 | 5.93 | 5.77 | 7.19 | 39.89 | 1.32 | 1.10 | 18.78 |
| | | | 1000 | 14.72 | 10.76 | 10.78 | 11.15 | 36.70 | 2.00 | 1.94 | 18.46 |
| | | | 2000 | 18.98 | 17.82 | 17.82 | 15.88 | 32.21 | 2.56 | 2.60 | 14.61 |

| ρ | p | q | n | CMS | | | | ERR | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | | | | LASSO | TLP | MCP | SCAD | LASSO | TLP | MCP | SCAD |
| 0.25 | 300 | 10 | 500 | 0 | 0.28 | 0.29 | 0 | 0.093 | 0.092 | 0.092 | 0.093 |
| | | | 1000 | 0 | 0.57 | 0.66 | 0.04 | 0.088 | 0.085 | 0.085 | 0.085 |
| | | | 2000 | 0 | 0.71 | 0.79 | 0.34 | 0.086 | 0.084 | 0.084 | 0.084 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.082 | 0.088 | 0.088 | 0.089 |
| | | | 1000 | 0 | 0.06 | 0.07 | 0 | 0.072 | 0.071 | 0.071 | 0.075 |
| | | | 2000 | 0 | 0.53 | 0.67 | 0 | 0.068 | 0.066 | 0.066 | 0.067 |
| | 3000 | 10 | 500 | 0 | 0.03 | 0.05 | 0 | 0.103 | 0.105 | 0.103 | 0.109 |
| | | | 1000 | 0 | 0.65 | 0.69 | 0 | 0.091 | 0.086 | 0.086 | 0.090 |
| | | | 2000 | 0 | 0.66 | 0.80 | 0.14 | 0.087 | 0.084 | 0.084 | 0.084 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.097 | 0.104 | 0.103 | 0.107 |
| | | | 1000 | 0 | 0 | 0 | 0 | 0.080 | 0.079 | 0.078 | 0.087 |
| | | | 2000 | 0 | 0.35 | 0.42 | 0 | 0.072 | 0.067 | 0.067 | 0.073 |
| 0.5 | 300 | 10 | 500 | 0 | 0.04 | 0.08 | 0 | 0.173 | 0.168 | 0.167 | 0.166 |
| | | | 1000 | 0 | 0.49 | 0.53 | 0 | 0.159 | 0.150 | 0.150 | 0.149 |
| | | | 2000 | 0 | 0.72 | 0.77 | 0.28 | 0.152 | 0.148 | 0.148 | 0.148 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.167 | 0.168 | 0.167 | 0.165 |
| | | | 1000 | 0 | 0 | 0.01 | 0 | 0.147 | 0.141 | 0.141 | 0.148 |
| | | | 2000 | 0 | 0.28 | 0.32 | 0 | 0.135 | 0.130 | 0.130 | 0.132 |
| | 3000 | 10 | 500 | 0 | 0 | 0 | 0 | 0.202 | 0.187 | 0.185 | 0.196 |
| | | | 1000 | 0 | 0.25 | 0.32 | 0 | 0.171 | 0.156 | 0.155 | 0.164 |
| | | | 2000 | 0 | 0.67 | 0.76 | 0.05 | 0.157 | 0.148 | 0.148 | 0.148 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.203 | 0.188 | 0.186 | 0.196 |
| | | | 1000 | 0 | 0 | 0 | 0 | 0.171 | 0.159 | 0.158 | 0.166 |
| | | | 2000 | 0 | 0.07 | 0.07 | 0 | 0.149 | 0.136 | 0.136 | 0.147 |
Averages of the four measures: Validation by classification errors
| ρ | p | q | n | TPS | | | | FPS | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | | | | LASSO | TLP | MCP | SCAD | LASSO | TLP | MCP | SCAD |
| 0.25 | 300 | 10 | 500 | 9.80 | 8.79 | 8.79 | 9.30 | 14.67 | 1.87 | 1.84 | 8.96 |
| | | | 1000 | 10.00 | 9.82 | 9.79 | 9.95 | 14.08 | 2.44 | 2.39 | 6.28 |
| | | | 2000 | 10.00 | 9.95 | 9.93 | 9.99 | 14.54 | 3.31 | 2.91 | 5.10 |
| | | 20 | 500 | 17.41 | 12.57 | 12.62 | 13.72 | 14.54 | 2.65 | 2.86 | 6.95 |
| | | | 1000 | 19.54 | 17.37 | 17.43 | 17.90 | 15.52 | 2.31 | 2.28 | 5.80 |
| | | | 2000 | 19.98 | 19.50 | 19.53 | 19.86 | 14.33 | 1.98 | 2.01 | 5.63 |
| | 3000 | 10 | 500 | 9.36 | 7.36 | 7.51 | 8.08 | 15.34 | 1.37 | 1.23 | 10.29 |
| | | | 1000 | 10.00 | 9.74 | 9.72 | 9.80 | 18.61 | 1.13 | 1.31 | 10.81 |
| | | | 2000 | 10.00 | 9.98 | 9.96 | 9.99 | 15.93 | 1.94 | 1.96 | 5.40 |
| | | 20 | 500 | 13.98 | 8.29 | 8.44 | 10.23 | 16.91 | 1.50 | 1.80 | 9.29 |
| | | | 1000 | 18.47 | 14.84 | 14.93 | 15.36 | 17.62 | 2.47 | 2.54 | 8.24 |
| | | | 2000 | 19.93 | 18.95 | 19.08 | 19.14 | 18.73 | 1.50 | 1.65 | 6.87 |
| 0.5 | 300 | 10 | 500 | 9.20 | 7.98 | 8.06 | 8.45 | 21.94 | 2.74 | 3.13 | 10.80 |
| | | | 1000 | 9.95 | 9.77 | 9.71 | 9.84 | 23.10 | 2.82 | 2.34 | 8.84 |
| | | | 2000 | 10.00 | 9.94 | 9.96 | 9.97 | 21.13 | 2.83 | 2.76 | 6.34 |
| | | 20 | 500 | 14.69 | 10.14 | 10.63 | 11.00 | 22.63 | 3.32 | 4.23 | 10.12 |
| | | | 1000 | 18.99 | 15.96 | 15.98 | 15.19 | 23.22 | 4.24 | 4.15 | 8.63 |
| | | | 2000 | 19.87 | 19.27 | 19.19 | 19.05 | 21.89 | 2.97 | 2.93 | 7.27 |
| | 3000 | 10 | 500 | 7.12 | 5.60 | 5.66 | 5.91 | 23.50 | 1.62 | 1.54 | 12.70 |
| | | | 1000 | 9.51 | 8.97 | 8.97 | 8.43 | 24.92 | 1.97 | 2.12 | 12.08 |
| | | | 2000 | 9.99 | 9.86 | 9.85 | 9.93 | 25.88 | 1.20 | 1.75 | 8.47 |
| | | 20 | 500 | 8.29 | 5.77 | 5.69 | 6.10 | 21.49 | 1.69 | 1.56 | 11.51 |
| | | | 1000 | 13.72 | 10.48 | 10.32 | 9.89 | 24.71 | 3.29 | 2.59 | 10.57 |
| | | | 2000 | 18.76 | 17.51 | 17.41 | 14.88 | 26.59 | 3.52 | 3.42 | 9.78 |

| ρ | p | q | n | CMS | | | | ERR | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | | | | LASSO | TLP | MCP | SCAD | LASSO | TLP | MCP | SCAD |
| 0.25 | 300 | 10 | 500 | 0.03 | 0.15 | 0.14 | 0.02 | 0.097 | 0.097 | 0.097 | 0.098 |
| | | | 1000 | 0.09 | 0.33 | 0.29 | 0.15 | 0.089 | 0.088 | 0.088 | 0.088 |
| | | | 2000 | 0.20 | 0.43 | 0.44 | 0.42 | 0.086 | 0.086 | 0.086 | 0.085 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.086 | 0.092 | 0.091 | 0.095 |
| | | | 1000 | 0 | 0.04 | 0.04 | 0 | 0.075 | 0.074 | 0.074 | 0.078 |
| | | | 2000 | 0.03 | 0.29 | 0.23 | 0.07 | 0.069 | 0.067 | 0.067 | 0.068 |
| | 3000 | 10 | 500 | 0 | 0.05 | 0.05 | 0 | 0.107 | 0.108 | 0.106 | 0.113 |
| | | | 1000 | 0.11 | 0.52 | 0.41 | 0.02 | 0.092 | 0.088 | 0.088 | 0.092 |
| | | | 2000 | 0.07 | 0.43 | 0.43 | 0.24 | 0.087 | 0.086 | 0.086 | 0.085 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.102 | 0.106 | 0.105 | 0.112 |
| | | | 1000 | 0 | 0 | 0 | 0 | 0.082 | 0.081 | 0.081 | 0.090 |
| | | | 2000 | 0.04 | 0.26 | 0.25 | 0 | 0.073 | 0.069 | 0.069 | 0.075 |
| 0.5 | 300 | 10 | 500 | 0 | 0.04 | 0.04 | 0 | 0.178 | 0.171 | 0.171 | 0.170 |
| | | | 1000 | 0 | 0.25 | 0.23 | 0.03 | 0.160 | 0.154 | 0.155 | 0.153 |
| | | | 2000 | 0 | 0.50 | 0.41 | 0.17 | 0.153 | 0.150 | 0.150 | 0.150 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.171 | 0.170 | 0.170 | 0.169 |
| | | | 1000 | 0 | 0 | 0 | 0 | 0.149 | 0.145 | 0.145 | 0.150 |
| | | | 2000 | 0 | 0.14 | 0.11 | 0 | 0.137 | 0.132 | 0.132 | 0.134 |
| | 3000 | 10 | 500 | 0 | 0 | 0 | 0 | 0.207 | 0.190 | 0.189 | 0.200 |
| | | | 1000 | 0 | 0.19 | 0.21 | 0 | 0.174 | 0.160 | 0.160 | 0.168 |
| | | | 2000 | 0 | 0.54 | 0.47 | 0.08 | 0.158 | 0.150 | 0.150 | 0.150 |
| | | 20 | 500 | 0 | 0 | 0 | 0 | 0.209 | 0.191 | 0.189 | 0.201 |
| | | | 1000 | 0 | 0 | 0 | 0 | 0.174 | 0.162 | 0.162 | 0.171 |
| | | | 2000 | 0 | 0.05 | 0.04 | 0 | 0.150 | 0.138 | 0.138 | 0.149 |
Averages of the measures in prostate and lung cancer data analysis
| Data | Measure | p | Cross-validation by regression error | | | | Cross-validation by classification error | | | |
|---|---|---|---|---|---|---|---|---|---|---|
| | | | LASSO | TLP | MCP | SCAD | LASSO | TLP | MCP | SCAD |
| Prostate | ERR(%) | 500 | 6.34 | 9.32 | 9.44 | 9.04 | 7.28 | 9.64 | 9.48 | 9.06 |
| | | 1000 | 7.06 | 9.40 | 9.24 | 9.12 | 7.84 | 10.28 | 9.80 | 9.16 |
| | | 2000 | 7.76 | 9.40 | 9.32 | 9.30 | 8.00 | 10.38 | 10.56 | 9.14 |
| | #MIS | 500 | 1.58 | 2.33 | 2.36 | 2.26 | 1.82 | 2.41 | 2.37 | 2.27 |
| | | 1000 | 1.76 | 2.35 | 2.31 | 2.28 | 1.96 | 2.57 | 2.45 | 2.29 |
| | | 2000 | 1.94 | 2.35 | 2.33 | 2.33 | 2.00 | 2.60 | 2.64 | 2.28 |
| | SIZE | 500 | 25.31 | 4.60 | 4.00 | 11.24 | 12.25 | 5.98 | 5.17 | 8.45 |
| | | 1000 | 25.65 | 3.08 | 3.10 | 13.40 | 9.12 | 6.43 | 6.68 | 9.03 |
| | | 2000 | 25.50 | 3.01 | 2.21 | 12.94 | 6.28 | 4.46 | 5.21 | 6.40 |
| Lung | ERR(%) | 500 | 0.94 | 1.69 | 1.66 | 1.64 | 1.57 | 2.03 | 2.09 | 1.84 |
| | | 1000 | 0.93 | 1.84 | 1.86 | 1.59 | 1.56 | 2.02 | 2.23 | 1.79 |
| | | 2000 | 0.94 | 1.80 | 2.00 | 1.50 | 1.54 | 2.20 | 2.26 | 1.91 |
| | #MIS | 500 | 0.43 | 0.76 | 0.75 | 0.74 | 0.71 | 0.92 | 0.94 | 0.83 |
| | | 1000 | 0.42 | 0.83 | 0.84 | 0.72 | 0.70 | 0.91 | 1.01 | 0.81 |
| | | 2000 | 0.43 | 0.81 | 0.90 | 0.68 | 0.70 | 0.99 | 1.02 | 0.86 |
| | SIZE | 500 | 58.57 | 21.36 | 18.03 | 21.60 | 12.54 | 11.31 | 9.76 | 9.39 |
| | | 1000 | 60.05 | 17.73 | 17.57 | 25.31 | 12.95 | 11.23 | 10.26 | 9.85 |
| | | 2000 | 60.81 | 17.45 | 18.09 | 27.88 | 12.51 | 11.95 | 11.80 | 10.10 |