

loess {modreg}                               R Documentation

_L_o_c_a_l _P_o_l_y_n_o_m_i_a_l _R_e_g_r_e_s_s_i_o_n _F_i_t_t_i_n_g

_D_e_s_c_r_i_p_t_i_o_n_:

     Fit a polynomial surface determined by one or more
     numerical predictors, using local fitting.

_U_s_a_g_e_:

     loess(formula, data, weights, subset, na.action, model = FALSE,
           span = 0.75, enp.target, degree = 2,
           parametric = FALSE, drop.square = FALSE, normalize = TRUE,
           family = c("gaussian", "symmetric"),
           method = c("loess", "model.frame"),
           control = loess.control(...), ...)

_A_r_g_u_m_e_n_t_s_:

 formula: a formula specifying the response and one or more
          numeric predictors (best specified via an
          interaction, but can also be specified additively).

    data: an optional data frame within which to look first
          for the response, predictors and weights.

 weights: optional weights for each case.

  subset: an optional specification of a subset of the data
          to be used.

na.action: the action to be taken with missing values in the
          response or predictors.  The default is to stop.

   model: should the model frame be returned?

    span: the parameter alpha which controls the degree of
          smoothing.

enp.target: an alternative way to specify `span', as the
          approximate equivalent number of parameters to be
          used.

  degree: the degree of the polynomials to be used, up to 2.

parametric: should any terms be fitted globally rather than
          locally?  Terms can be specified by name, number
          or as a logical vector of the same length as the
          number of predictors.

drop.square: for fits with more than one predictor and
          `degree=2', should the quadratic term (and
          cross-terms) be dropped for particular predictors?
          Terms are specified in the same way as for
          `parametric'.

normalize: should the predictors be normalized to a common
          scale if there is more than one?  The
          normalization used sets the 10% trimmed standard
          deviation to one.  Set to `FALSE' for spatial
          coordinate predictors and others known to be on a
          common scale.

  family: if `"gaussian"', fitting is by least squares; if
          `"symmetric"', a re-descending M-estimator is used
          with Tukey's biweight function.

  method: fit the model or just extract the model frame.

 control: control parameters: see `loess.control'.

     ...: control parameters can also be supplied directly.

_D_e_t_a_i_l_s_:

     Fitting is done locally.  That is, for the fit at point
     x, the fit is made using points in a neighbourhood of
     x, weighted by their distance from x (with differences
     in `parametric' variables being ignored when computing
     the distance).  The size of the neighbourhood is
     controlled by alpha (set by `span' or `enp.target').
     For alpha < 1, the neighbourhood includes a proportion
     alpha of the points, and these have tricubic weighting
     (proportional to (1 - (dist/maxdist)^3)^3).  For
     alpha > 1, all points are used, with the `maximum
     distance' assumed to be alpha times the actual maximum
     distance.

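     The tricubic weighting described above can be sketched
     in a few lines (an illustration of the weight formula
     only, not of the internal implementation):

```r
## Tricube weights for a neighbourhood of x0 with span = 0.75
## (illustration of the formula only, not loess's internals).
x  <- seq(0, 10, by = 0.5)
x0 <- 5
d  <- abs(x - x0)
q  <- quantile(d, 0.75)               # distance covering the nearest 75%
w  <- ifelse(d <= q, (1 - (d / q)^3)^3, 0)
round(w, 3)                           # weight 1 at x0, decaying to 0
```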
     For the default family, fitting is by (weighted) least
     squares.  For `family="symmetric"' a few iterations of
     an M-estimation procedure with Tukey's biweight are
     used.  Be aware that, as the initial value is the
     least-squares fit, the result need not be a very
     resistant fit.
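     The difference between the two families shows up most
     clearly in the presence of outliers, as in this
     hypothetical sketch (the injected value is arbitrary):

```r
## Least-squares vs robust fitting with a gross outlier
## (hypothetical example; the injected value is arbitrary).
data(cars)
cars2 <- cars
cars2$dist[20] <- 200                 # inject a gross outlier
fit.g <- loess(dist ~ speed, cars2)                        # "gaussian"
fit.s <- loess(dist ~ speed, cars2, family = "symmetric")  # biweight
## the robust fit is typically pulled less towards the outlier:
predict(fit.g, data.frame(speed = cars2$speed[20]))
predict(fit.s, data.frame(speed = cars2$speed[20]))
```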

     It can be important to tune the control list to achieve
     acceptable speed: see `loess.control' for details.

_V_a_l_u_e_:

     An object of class `"loess"'.

_N_o_t_e_:

     As this is based on the `cloess' package available at
     `netlib', it is similar to, but not identical to, the
     `loess' function of S.  In particular, conditioning is
     not implemented.

     The memory usage of this implementation of `loess' is
     roughly quadratic in the number of points, with 1000
     points taking about 10Mb.

_A_u_t_h_o_r_(_s_)_:

     B.D. Ripley, based on the `cloess' package of
     Cleveland, Grosse and Shyu.

_R_e_f_e_r_e_n_c_e_s_:

     W.S. Cleveland, E. Grosse and W.M. Shyu (1992) Local
     regression models. Chapter 8 of Statistical Models in S
     eds J.M. Chambers and T.J. Hastie, Wadsworth &
     Brooks/Cole.

_S_e_e _A_l_s_o_:

     `loess.control', `predict.loess'

_E_x_a_m_p_l_e_s_:

     data(cars)
     cars.lo <- loess(dist ~ speed, cars)
     predict(cars.lo, data.frame(speed=seq(5, 30, 1)), se=TRUE)
     # to get extrapolation
     cars.lo2 <- loess(dist ~ speed, cars,
       control=loess.control(surface="direct"))
     predict(cars.lo2, data.frame(speed=seq(5, 30, 1)), se=TRUE)

