

Memory {base}                                R Documentation

_M_e_m_o_r_y _A_v_a_i_l_a_b_l_e _f_o_r _D_a_t_a _S_t_o_r_a_g_e

_D_e_s_c_r_i_p_t_i_o_n_:

     Use command line options to set the memory available
     for R.

_U_s_a_g_e_:

     R --vsize v --nsize n

_A_r_g_u_m_e_n_t_s_:

       v: Use `v' bytes of vector heap memory.

       n: Use `n' cons cells.

_D_e_t_a_i_l_s_:

     R (currently) uses a static memory model.  This means
     that when it starts up, it asks the operating system to
     reserve a fixed amount of memory for it.  The size of
     this chunk cannot be changed subsequently.  Hence, it
     can happen that not enough memory was allocated, e.g.,
     when trying to read large data sets into R.

     In these cases, you should restart R (after saving your
     current workspace) with more memory available, using
     the command line options `--nsize' and `--vsize'.  To
     understand these options, one needs to know that R
     maintains separate areas for fixed- and variable-sized
     objects.  The first of these is allocated as an array
     of ``cons cells'' (Lisp programmers will know what they
     are; others may think of them as the building blocks of
     the language itself: parse trees, etc.), and the second
     is a ``heap'' of ``Vcells'' (see
     `gc()["Vcells","total"]') of 8 bytes each.
     Effectively, the input `v' is therefore rounded down to
     the nearest multiple of 8.
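
     As a rough illustration (an informal sketch, not exact
     accounting): each Vcell holds 8 bytes, so a numeric
     vector of length n occupies about n Vcells, i.e. 8*n
     bytes of vector heap:

          ## Heap size in Vcells, as referenced above.
          gc()["Vcells", "total"]
          ## Bytes of heap needed by a length-n numeric vector:
          n <- 100000
          8 * n               # 800000 bytes = 100000 Vcells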

     The `--nsize' option can be used to specify the number
     of cons cells (each occupying 16 bytes) which R is to
     use (the default is 250000), and the `--vsize' option
     to  specify the size of the vector heap in bytes (the
     default is 6 MB).  Both options must be integers or
     integers ending with `M', `K', or `k' meaning Mega (=
     2^{20} = 1048576), (computer) Kilo (= 2^{10} = 1024),
     or regular kilo (= 1000). (The minimum allowed values
     are 200000 and 2M.)
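
     For instance (illustrative arithmetic only, following
     the definitions above), these two invocations request
     identical sizes:

          R --vsize 10M --nsize 500k
          R --vsize 10485760 --nsize 500000
          # 10M = 10 * 2^20 = 10485760 bytes; 500k = 500000 cells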

     E.g., to read in a table of 10000 observations on 40
     numeric variables, `R --vsize 10M' should suffice; for
     `source()'ing a large file, you'd use `R --nsize 500k'.
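
     The first suggestion can be checked with a quick
     back-of-the-envelope computation (a sketch; actual
     requirements are somewhat higher because intermediate
     copies of objects are made):

          ## 10000 observations x 40 numeric variables,
          ## 8 bytes per number:
          10000 * 40 * 8      # 3200000 bytes, about 3.1M
          ## so `--vsize 10M' leaves room for working copies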

     Note that the information on where to find vectors and
     strings on the heap is stored using cons cells.  Thus,
     it may also be necessary to allocate more space for
     cons cells in order to perform computations with very
     ``large'' variable-size objects.
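
     For example (a hypothetical stress case), a workspace
     made up of very many small objects can exhaust the
     cons-cell arena while barely touching the vector heap:

          ## Many short vectors: few Vcells used, but each
          ## element costs cons cells for bookkeeping.
          x <- vector("list", 100000)
          for (i in 1:100000) x[[i]] <- 1:3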

     You can find out the current memory consumption (the
     proportion of heap and cons cells used) by typing
     `gc()' at the R prompt.  This may help you in finding
     out whether to increase `--vsize' or `--nsize'.  Note
     that following `gcinfo(TRUE)', automatic garbage
     collection always prints memory use statistics.
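
     A typical interactive check might look as follows (the
     exact columns of the report vary between versions):

          gc()                      # report current usage
          gc()["Vcells", "total"]   # heap size, in Vcells
          gcinfo(TRUE)              # report at every collection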

     R will tell you whether you ran out of cons or heap
     memory.

     The defaults for `--nsize' and `--vsize' can be changed
     by setting the environment variables `R_NSIZE' and
     `R_VSIZE' respectively, perhaps most conveniently in
     the file `.Renviron' or `~/.Renviron'.
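
     For example, a `~/.Renviron' containing the two lines

          R_VSIZE=10485760
          R_NSIZE=500000

     starts every new session with a 10 MB vector heap and
     500000 cons cells (the equivalent of `--vsize 10M
     --nsize 500k').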

     When using `read.table', the memory requirements are in
     fact higher than anticipated, because the file is first
     read in as one long string which is then split again.
     If you run out of memory when reading in a large
     table, use `scan' instead of `read.table' if possible.
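
     A minimal sketch of this workaround (the file name and
     layout are hypothetical): for a whitespace-separated
     file of 40 numeric columns, `scan' reads the numbers
     directly into a vector, avoiding the intermediate
     string:

          ## Read 40 numeric columns without read.table's
          ## string overhead; "big.dat" is a placeholder name.
          x <- matrix(scan("big.dat"), ncol = 40, byrow = TRUE)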

_S_e_e _A_l_s_o_:

     `gc' for information on the garbage collector.

_E_x_a_m_p_l_e_s_:

     # Start R with 15MB of heap memory and 1 million cons cells

     R --vsize 15M --nsize 1000k

