[R] Logistic regression for large data
Bill Dunlap
Mon Nov 14 16:44:45 CET 2022
summary(Base)
would show whether one of the columns of Base was read as character data
instead of the expected numeric. That could cause an explosion in the number
of dummy variables, and hence a huge design matrix.
-Bill
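To illustrate the point, here is a minimal sketch (the column name `x` and the data are invented for illustration, not taken from George's file): a numeric-looking column read as character is treated as a factor, and `model.matrix()` then creates one dummy column per distinct value.

```r
# A column of numbers accidentally read as character strings
df <- data.frame(y = rep(0:1, 50),
                 x = as.character(seq_len(100)))  # 100 distinct strings

str(df)  # shows x is character, not numeric

# model.matrix() treats the character column as a factor:
# intercept + 99 dummies = 100 columns instead of 2
mm <- model.matrix(y ~ x, data = df)
ncol(mm)  # 100

# After converting back to numeric, the design matrix is small again
df$x <- as.numeric(df$x)
ncol(model.matrix(y ~ x, data = df))  # 2
```

In a file like the one described, a single such column with thousands of distinct values would inflate the design matrix to thousands of columns, which would match the memory blow-up observed. `str(Base)` or `sapply(Base, class)` shows the column types directly.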
On Fri, Nov 11, 2022 at 11:30 PM George Brida <georgebrida94 using gmail.com>
wrote:
> Dear R users,
>
> I have a database called Base.csv (attached to this email) which
> contains 13 columns and 8257 rows, of which the first 8 columns are dummy
> variables taking the value 1 or 0. The problem is that when I run the
> following instructions to do a logistic regression, R runs for hours and
> hours without producing an output:
>
> Base=read.csv("C:\\Users\\HP\\Desktop\\New\\Base.csv",header=FALSE,sep=";")
>
> fit_1=glm(Base[,2]~Base[,1]+Base[,10]+Base[,11]+Base[,12]+Base[,13],family=binomial(link="logit"))
>
> Apparently, there is not enough memory to produce the requested output. Is
> there another function for logistic regression that handles large data and
> returns output in a reasonable time?
>
> Many thanks
>
> Kind regards
>
> George
> ______________________________________________
> R-help using r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>