[R] Need help to split a given matrix in a "sequential" way
Megh
megh700004 at yahoo.com
Tue Mar 30 09:20:24 CEST 2010
I need to split a given matrix in a sequential order. Suppose my matrix is:
> dat <- cbind(sample(c(100,200), 10, T), sample(c(50,100, 150, 180), 10,
> T), sample(seq(20, 200, by=20), 10, T)); dat
[,1] [,2] [,3]
[1,] 200 100 80
[2,] 100 180 80
[3,] 200 150 180
[4,] 200 50 140
[5,] 100 150 60
[6,] 100 50 60
[7,] 100 100 100
[8,] 200 150 100
[9,] 100 50 120
[10,] 200 50 180
Now I need to split the above matrix according to the unique values in its 1st
column. Therefore I have the following:
> dat1 <- dat[which(dat[,1] == unique(dat[,1])[1]),]
> dat2 <- dat[-which(dat[,1] == unique(dat[,1])[1]),]; dat1; dat2
[,1] [,2] [,3]
[1,] 200 100 80
[2,] 200 150 180
[3,] 200 50 140
[4,] 200 150 100
[5,] 200 50 180
[,1] [,2] [,3]
[1,] 100 180 80
[2,] 100 150 60
[3,] 100 50 60
[4,] 100 100 100
[5,] 100 50 120
Now each of dat1 and dat2 needs to be split according to its 2nd column,
i.e.
> dat11 <- dat1[which(dat1[,2] == unique(dat1[,2])[1]),]
> dat12 <- dat1[which(dat1[,2] == unique(dat1[,2])[2]),]
> dat13 <- dat1[which(dat1[,2] == unique(dat1[,2])[3]),]; dat11; dat12;
> dat13
[1] 200 100 80
[,1] [,2] [,3]
[1,] 200 150 180
[2,] 200 150 100
[,1] [,2] [,3]
[1,] 200 50 140
[2,] 200 50 180
and similarly for dat2.
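As an aside, a single level of this splitting can be done without the which()/unique() indexing by using base R's split() on the row indices. A minimal sketch on the dat1 values printed above (note that split() orders the groups by sorted factor level, not by order of first appearance as unique() does):

```r
# dat1 as printed above
dat1 <- cbind(c(200, 200, 200, 200, 200),
              c(100, 150,  50, 150,  50),
              c( 80, 180, 140, 100, 180))

# split the row indices on column 2, then index back into the matrix,
# keeping drop = FALSE so one-row groups stay matrices
by_col2 <- lapply(split(seq_len(nrow(dat1)), dat1[, 2]),
                  function(idx) dat1[idx, , drop = FALSE])
```

Here `by_col2` is a list of three sub-matrices named "50", "100", and "150" after the key values.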
This kind of sequential splitting would continue for
(no_of_cols_of_original_matrix - 1) steps. It would also be great if I could
put all those matrices into a single "list" object for further calculations.
As you can see, if the original matrix is small it can be handled manually,
but for a moderately large matrix the task would be very cumbersome, so I am
looking for some mechanized way to do this for an arbitrary matrix.
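One possible sketch of such a mechanized approach (the function name `split_seq` is my own invention, not from any package): recurse over the key columns, splitting the row indices on each column in turn and flattening the per-group results into one flat list of sub-matrices. As above, split() orders groups by sorted level rather than by first appearance:

```r
## Sketch: recursively split a matrix on the unique values of columns
## 1 .. (ncol - 1), returning one flat list of sub-matrices.
split_seq <- function(m, col = 1) {
  if (col >= ncol(m))                      # no key columns left: a leaf
    return(list(m))
  pieces <- split(seq_len(nrow(m)), m[, col])
  unlist(lapply(pieces, function(idx)
           split_seq(m[idx, , drop = FALSE], col + 1)),
         recursive = FALSE)                # flatten one nesting level
}
```

Applied to the example data, split_seq(dat) should return the dat11, dat12, ... pieces together in one list, with list names built by pasting the key values together (e.g. "200.150"), which keeps the result usable for further calculations.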
Can anyone here help me in this regard?
Thank you so much for your kind attention.