How can I export data from Matlab to R without losing the format when using the read_csv command in R?
I've tried the R.matlab package and was unsuccessful. (The data was a matrix within a matrix.) I used:
csvwrite('coeff.dat',coeff)
coeff.dat
....
4.8217e+06,4.8217e+06,4.8218e+06,4.8218e+06,4.8218e+06,4.8218e+06,4.8218e+06, ...
4.8218e+06,4.8218e+06,4.8218e+06,4.8219e+06,4.8219e+06,4.8219e+06,4.8219e+06, ...
...[large amount superfluous data array elided for brevity--dpb]
4.8225e+06,4.8225e+06,4.8225e+06,4.8225e+06,4.8225e+06,4.8225e+06,4.8225e+06,.....
Which is what I expected, but when I use the read_csv import command in R the exponents are deleted.
library(readr)
xarray <- read_csv("~/Downloads/xarray.dat", col_names = FALSE)
or
xarray <- read_csv("~/Downloads/xarray.dat",
col_names = FALSE, locale = locale(encoding = "ASCII"))
Any suggestions?
Answers (1)
dpb
26 Jul 2017
Edited: dpb
27 Jul 2017
That's an R syntax question, not a Matlab one, and it's been so long since I used R that I don't recall much about it at all. Are you sure it isn't just a case of display rather than actual value, similar to the scaling in the Matlab command window? I've not seen any package fail to read a .csv file correctly in a long time.
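One quick way to check on the R side -- a sketch only, assuming the file was read with read_csv as in the question (the file name here is illustrative) -- is to print a value at more than R's default seven significant digits:

library(readr)
xarray <- read_csv("coeff.dat", col_names = FALSE)
# read_csv stores the values as doubles, so the exponent survives even
# if the default print rounds the display; show one value in full:
print(xarray[[1]][1], digits = 15)
# or raise the default print precision for the whole session:
options(digits = 15)

If the full value shows up with more digits, nothing was deleted -- it was only the display.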
I will note, however, that you're not outputting enough significant digits to show the results accurately -- there's variation only in the last couple of decimal places of the default five-digit output. You'll need to step that up by using dlmwrite with the optional 'precision' argument --
dlmwrite('coeff.dat',coeff,'delimiter',',','precision',15)
will write nearly full precision, at the expense of a much larger output file of course (but probably mandatory if you want accurate results on the R side).
I'm certain R can also read a stream (binary) file, which would hold full precision and be much smaller and faster than using text files for data transfer, but I don't remember the syntax for that any longer either, sorry. On the Matlab side, though,
fid=fopen('coeff.bin','w');
fwrite(fid,coeff,'double');  % write as doubles so R's readBin "double" matches
fclose(fid);
will get the job done.
ff <- file("coeff.bin", "rb")
data <- readBin(ff, "double", N)
close(ff)
is, I think, the basic idea in R.
Like Matlab, R uses column-major internal storage order, so the sequence will be the same.
N will be the length of the input file in elements; I'm pretty sure there's a syntax to query the file size and return it, to make that dynamic rather than hard-coding it, but the specifics are too far in the distant past, sorry...
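For the record, a sketch of that dynamic sizing in R, assuming coeff.bin contains nothing but the 8-byte doubles written by fwrite above:

# each double is 8 bytes, so the element count is file size / 8
N <- file.info("coeff.bin")$size / 8
ff <- file("coeff.bin", "rb")
data <- readBin(ff, "double", n = N)
close(ff)
# reshape to the original dimensions if known, e.g. for m rows
# (m is hypothetical here): coeff <- matrix(data, nrow = m)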
2 comments
Walter Roberson
26 Jul 2017
That should be fwrite(fid,coeff)
If you use fwrite then watch out for the order the data appears in the binary file: it varies most quickly down the columns, whereas csv files are written row by row.
dpb
26 Jul 2017
Edited: dpb
26 Jul 2017
Good catch on leaving off the variable, Walter! I'll edit it... and I'd have no issue if you were to make such fixes when you see 'em; you know my typing and proofing are terrible and seem to be getting worse! :)
On order: with a stream file they'll be read in the same order as written. I just checked my recollection that R uses column-major storage order. It does, so we should be good to go...