[R] lpSolve space problem in R 2.4.1 on Windows XP
Uwe Ligges
ligges at statistik.uni-dortmund.de
Fri Mar 9 17:51:30 CET 2007
If R is closed that way (i.e. it crashes), it is a bug by definition:
either in R or (more probably) in the package. Could you please contact
the package maintainer to sort things out?
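
A quick way to look up the maintainer's address from within R
(packageDescription() is in the standard utils package):

  packageDescription("lpSolve")$Maintainer
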
Thanks,
Uwe Ligges
Talbot Katz wrote:
> Hi.
>
> I am trying to use the linear optimizer from package lpSolve in R 2.4.1 on
> Windows XP (Version 5.1). The problem I am trying to solve has 2843
> variables (2841 integer, 2 continuous) and 8524 constraints, and I have 2 GB
> of memory. After I load the input data into R, I have at most 1.5 GB of
> memory available. If I start the LP with significantly less memory
> available (say 1 GB), I get an error message from R:
>
> "Error: cannot allocate vector of size 189459 Kb"
>
> If I close all my other windows and try to maximize the available memory to
> the full 1.5 GB, I can watch the memory get filled up until only about 400
> MB is left, at which point I get a Windows error message:
>
> "R for Windows GUI front-end has encountered a problem and needs to close.
> We are sorry for the inconvenience."
>
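> As a sanity check on the Windows side, R's memory ceiling can be inspected
> and raised from within a session; a minimal sketch (memory.size() and
> memory.limit() are Windows-only, values in MB):
>
>   memory.size(max=TRUE)    # MB R has obtained from the OS so far
>   memory.limit()           # current ceiling in MB
>   memory.limit(size=2047)  # raise the ceiling (it cannot be lowered)
>
> (The ceiling can also be set at startup, e.g. Rgui --max-mem-size=2047M;
> see ?memory.limit in the installed version.)
>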
> This behavior persists even when I relax the integer constraints and
> eliminate the 2841 constraints that restrict the integer variables to values
> <= 1, so I'm just running a standard LP with 2843 variables and 5683
> constraints.
>
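> For reference, the relaxation can be reproduced on the small example below
> by dropping int.vec and the first Nm1 rows of the constraint system (the
> <= 1 rows); a minimal sketch, assuming the objects defined in that example:
>
>   relax <- lp("min", f.obj, f.con[-(1:Nm1), ], f.dir[-(1:Nm1)],
>               f.rhs[-(1:Nm1)])
>   relax$objval
>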
> I have been able to get the full MIP formulation to work correctly on some
> very small problems (~10 variables and 25 constraints).
>
> Here is the code for a working example:
>
>> library(lpSolve)
>> (v1=rev(1:8))
> [1] 8 7 6 5 4 3 2 1
>> (csv1=cumsum(as.numeric(v1)))
> [1] 8 15 21 26 30 33 35 36
>> (lencsv1=length(csv1))
> [1] 8
>> (Nm1=lencsv1-1)
> [1] 7
>> (Np1=lencsv1+1)
> [1] 9
>> ngp=3
>> f.obj=c(1,1,rep(0,Nm1))
>> f.int=3:Np1
>> bin.con=cbind(rep(0,Nm1),rep(0,Nm1),diag(Nm1))
>> bin.dir=rep("<=",Nm1)
>> bin.rhs=rep(1,Nm1)
>> gp.con=c(0,0,rep(1,Nm1))
>> gp.dir="<="
>> (gp.rhs=ngp-1)
> [1] 2
>> ub.con=cbind(rep(-1,Nm1),rep(0,Nm1),!upper.tri(matrix(nrow=Nm1,ncol=Nm1)))
>> ub.dir=rep("<=",Nm1)
>> (ub.rhs=csv1[1:Nm1]*ngp/csv1[lencsv1])
> [1] 0.6666667 1.2500000 1.7500000 2.1666667 2.5000000 2.7500000 2.9166667
>> lb.con=cbind(rep(0,Nm1),rep(1,Nm1),!upper.tri(matrix(nrow=Nm1,ncol=Nm1)))
>> lb.dir=rep(">=",Nm1)
>> lb.rhs=ub.rhs
>> f.con=rbind(bin.con,gp.con,ub.con,lb.con)
>> f.dir=c(bin.dir,gp.dir,ub.dir,lb.dir)
>> f.rhs=c(bin.rhs,gp.rhs,ub.rhs,lb.rhs)
>> lglp=lp("min",f.obj,f.con,f.dir,f.rhs,int.vec=f.int)
>> lglp$objval
> [1] 0.9166667
>> lglp$solution
> [1] 0.0000000 0.9166667 0.0000000 1.0000000 0.0000000 1.0000000 0.0000000
> [8] 0.0000000 0.0000000
>
> What this is doing is taking the points of v1 and dividing them into
> contiguous groups (the variable ngp is the number of groups) such that the
> group sums are as nearly equal as possible. So, for v1 = c(8,7,6,5,4,3,2,1),
> the split into c(8,7), c(6,5), c(4,3,2,1), with sums 15, 11, and 10, is the
> best such grouping, and the solution vector shows that the splitting occurs
> after the second and fourth elements.
>
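> A quick way to verify a solution: the entries of lglp$solution after the
> first two are the split indicators, so the groups and their sums can be
> recovered directly (a small sketch using the objects above):
>
>   cuts <- which(lglp$solution[3:Np1] > 0.5)  # split after these elements
>   groups <- split(v1, cumsum(seq_along(v1) %in% (cuts + 1)))
>   sapply(groups, sum)                        # 15 11 10 for this example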
>
> Anyway, I am wondering... Are 3000 variables and 8500 constraints usually
> too much for lpSolve to handle in 1.5 GB of memory? Is there a possible bug
> (in R or in Windows) that leads to the Windows error when the memory falls
> below 400 MB? Is there a problem with my formulation that makes it unstable
> even after the integer constraints are removed?
>
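> One observation on the memory question: a dense 8524 x 2843 matrix of
> doubles alone takes roughly 185 MB, which lines up closely with the failing
> 189459 Kb allocation, so the dense constraint matrix (and copies of it) is
> a plausible bottleneck. If the installed lpSolve supports the dense.const
> argument to lp() (check ?lp; it may be absent in older versions), the
> constraints can be passed as sparse (row, column, value) triplets instead
> of a dense const.mat. A sketch on the small example's objects:
>
>   nz <- which(f.con != 0, arr.ind = TRUE)     # nonzero positions
>   nz <- nz[order(nz[, "row"], nz[, "col"]), ] # group entries by constraint
>   triplets <- cbind(nz[, "row"], nz[, "col"], f.con[nz])
>   sp <- lp("min", f.obj, dense.const = triplets,
>            const.dir = f.dir, const.rhs = f.rhs, int.vec = f.int)
>
> (For the full-size problem the triplets would be built directly rather than
> from a dense f.con; the saving depends on how sparse the matrix really is.)
>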
> Thanks!
>
>
> -- TMK --
> 212-460-5430 home
> 917-656-5351 cell
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.