

ica(e1071)                                   R Documentation

_I_n_d_e_p_e_n_d_e_n_t _C_o_m_p_o_n_e_n_t _A_n_a_l_y_s_i_s

_D_e_s_c_r_i_p_t_i_o_n_:

     This is an R implementation of the Matlab function by
     Petteri Pajunen (Petteri.Pajunen@hut.fi).

     For a data matrix X, independent components are
     extracted by applying a nonlinear PCA algorithm. The
     parameter `fun' determines which nonlinearity is used.
     `fun' can either be a function or one of the strings
     "negative kurtosis", "positive kurtosis", or "4th
     moment", which may be abbreviated to a unique prefix.
     If `fun' is "negative kurtosis", the function tanh(x)
     is used, which provides ICA for sources with negative
     kurtosis; if `fun' is "positive kurtosis", the function
     x - tanh(x) is used, for sources with positive
     kurtosis. For `fun == "4th moment"' the signed square
     function is used.

_U_s_a_g_e_:

     ica(X, lrate, epochs=100, ncomp=dim(X)[2], fun="negative")

_A_r_g_u_m_e_n_t_s_:

       X: The matrix for which the ICA is to be computed

   lrate: learning rate

  epochs: number of iterations

   ncomp: number of independent components

     fun: function used for the nonlinear computation part

_V_a_l_u_e_:

     An object of class "ica" containing the following
     components:

 weights: ICA weight matrix

projection: Projected data

  epochs: Number of iterations

     fun: Name of the function used

   lrate: Learning rate used

initweights: Initial weight matrix

_N_o_t_e_:

     Currently, there is no reconstruction from the ICA sub-
     space to the original input space.

_A_u_t_h_o_r_(_s_)_:

     Andreas Weingessel

_R_e_f_e_r_e_n_c_e_s_:

     Oja et al., "Learning in Nonlinear Constrained Hebbian
     Networks", in Proc. ICANN-91, pp. 385-390.

     Karhunen and Joutsensalo, "Generalizations of Principal
     Component Analysis, Optimization Problems, and Neural
     Networks", Neural Networks, v. 8, no. 4, pp. 549-562,
     1995.

_S_e_e _A_l_s_o_:

_E_x_a_m_p_l_e_s_:

