In the small-sample setting, Linear Discriminant Analysis (LDA) may suffer from a rank-deficiency issue. Applied mathematics has long used Tikhonov regularization - also known as $$\ell_2$$ regularization or shrinkage - to stabilize ill-conditioned linear operators. Regularized Linear Discriminant Analysis (RLDA) adopts this idea to stabilize the eigendecomposition in the LDA formulation.
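As a rough sketch of the idea (not the package's internal code), the within-class scatter matrix can be made invertible by adding $$\alpha I$$ before solving the LDA eigenproblem; the function and variable names below are illustrative assumptions:

```r
## Illustrative sketch of Tikhonov-regularized LDA (not do.rlda's internals).
## Sw: within-class scatter, Sb: between-class scatter, alpha: regularizer.
rlda_sketch <- function(X, label, ndim = 2, alpha = 1) {
  X     <- as.matrix(X)
  label <- as.factor(label)
  p  <- ncol(X)
  mu <- colMeans(X)
  Sw <- matrix(0, p, p)
  Sb <- matrix(0, p, p)
  for (cl in levels(label)) {
    Xc  <- X[label == cl, , drop = FALSE]
    muc <- colMeans(Xc)
    Sw  <- Sw + crossprod(sweep(Xc, 2, muc))     # within-class scatter
    Sb  <- Sb + nrow(Xc) * tcrossprod(muc - mu)  # between-class scatter
  }
  ## Tikhonov regularization: Sw + alpha*I is invertible even when Sw is
  ## rank-deficient (e.g., n < p), so the eigenproblem is well posed.
  W <- eigen(solve(Sw + alpha * diag(p), Sb))$vectors[, 1:ndim, drop = FALSE]
  Re(W)  # keep the real part of the leading eigenvectors as the projection
}
```

The key step is replacing $$S_W^{-1}$$ with $$(S_W + \alpha I)^{-1}$$, which shifts every eigenvalue of the within-class scatter up by $$\alpha$$.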

do.rlda(X, label, ndim = 2, alpha = 1)

## Arguments

X

an $$(n\times p)$$ matrix or data frame whose rows are observations and columns represent independent variables.

label

a length-$$n$$ vector of data class labels.

ndim

an integer-valued target dimension.

alpha

Tikhonov regularization parameter.

## Value

a named list containing

Y

an $$(n\times ndim)$$ matrix whose rows are embedded observations.

trfinfo

a list containing information for out-of-sample prediction.

projection

a $$(p\times ndim)$$ matrix whose columns are basis vectors for projection.

## References

Friedman JH (1989). “Regularized Discriminant Analysis.” Journal of the American Statistical Association, 84(405), 165-175.

## Author

Kisung You

## Examples

if (FALSE) {
## use iris data
data(iris)
set.seed(100)
subid = sample(1:150, 50)
X     = as.matrix(iris[subid,1:4])
label = as.factor(iris[subid,5])

## try different regularization parameters
out1 <- do.rlda(X, label, alpha=0.001)
out2 <- do.rlda(X, label, alpha=0.01)
out3 <- do.rlda(X, label, alpha=100)

## visualize
opar <- par(no.readonly=TRUE)
par(mfrow=c(1,3))
plot(out1$Y, pch=19, col=label, main="RLDA::alpha=0.001")
plot(out2$Y, pch=19, col=label, main="RLDA::alpha=0.01")
plot(out3$Y, pch=19, col=label, main="RLDA::alpha=100")
par(opar)
}