[R] Maximum amount of memory
Prof Brian Ripley
ripley at stats.ox.ac.uk
Tue Mar 22 09:26:04 CET 2005
On Mon, 21 Mar 2005, Thomas Lumley wrote:
> On Mon, 21 Mar 2005, Tim Cutts wrote:
>
>>
>> On 21 Mar 2005, at 4:42 pm, marvena at tin.it wrote:
>>
>>> Hi,
>>> I have a problem: I need to use the maximum amount of memory in order
>>> to perform a very demanding analysis. If I purchase a suitable
>>> computer, what is the maximum amount of memory obtainable in R?
>>
>> Assuming that R is happy to use 64-bit memory pointers, the limit will be
>> your wallet.
(It has been for several years.)
> I believe there are still some limits on the sizes of individual objects,
> for example from C and Fortran code that uses int or INTEGER to hold
> dimensions.
>
> Many packages will definitely have problems: for example, the survival
> package cannot correctly handle a design matrix with more than 2^31-1
> elements, no matter how much memory it has. I don't know how much of the
> internal R code would also break when vectors have more than 2^31-1 entries.
For the record: R limits the length of vectors to 2^31 - 1 elements, even on
64-bit machines. We have discussed changing this, but the use of Fortran
(which does not have a longer integer type) for e.g. matrix algebra means
that a lot of work would be needed to raise the limit.
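
(As a quick illustration of where that ceiling bites, here is a small R
sketch; the 50000 x 50000 design matrix is only a made-up example:)

    .Machine$integer.max            # 2147483647, i.e. 2^31 - 1: the bound on length()
    n <- 50000; p <- 50000          # a hypothetical large design matrix
    n * p                           # 2.5e9 elements, computed in double precision
    n * p > .Machine$integer.max    # TRUE: too many elements to index with an int
    as.integer(n) * as.integer(p)   # NA, with an integer-overflow warning
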
> Now, 2^31-1 entries in a numeric matrix is 16Gb in one object, so your wallet
> is still likely to be the practical limit.
Indeed, that is why we have postponed changing the internal limit until
nearer the time when machines with, say, 64Gb of RAM become commonplace.
(You need to be able to make copies to do anything useful with R objects.)
Moore's Law suggests that will not happen until the early 2010s.
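
(Back-of-the-envelope arithmetic for that 16Gb figure, in R; the factor of
three allowed for copies below is only a rough rule of thumb, not anything
taken from the R sources:)

    max_len <- 2^31 - 1        # the current maximum vector length
    max_len * 8 / 2^30         # ~16 GiB: one full-length numeric vector, 8 bytes/element
    3 * max_len * 8 / 2^30     # ~48 GiB: the object plus room for a couple of copies
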
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595