Reading a large Excel file using the "read" command takes ~5 minutes, is this expected performance?

I am reading simulation/test output data from a .xlsx file into a MATLAB table through a datastore variable. The test data contains 450+ variables, each with 20000+ samples (i.e., 450+ columns and 20000+ rows), and all values are numeric. I created a datastore on the Excel file, modified the selected-variable and variable-type properties, and used the read command to read the file into a MATLAB table; it took about 5 minutes. When I tried the readtable command on the Excel file directly, it took about the same time. However, when I read the file interactively using the MATLAB import dialog, it took less than 30 seconds, so I am wondering if there is any way to achieve the same efficiency programmatically. (A sketch of the two programmatic approaches is below.)
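A rough sketch of the two approaches described above (the file name is a placeholder); both trigger automatic format detection on the spreadsheet, which is likely where most of the ~5 minutes goes:

    % Datastore route: select all variables and force numeric types
    ds = datastore('testOutput.xlsx');
    ds.SelectedVariableNames = ds.VariableNames;
    ds.SelectedVariableTypes = repmat({'double'}, 1, numel(ds.VariableNames));
    T = read(ds);

    % Direct route: let readtable detect everything itself
    T2 = readtable('testOutput.xlsx');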

Accepted Answer

J. Alex Lee on 6 Sep 2020
Try manually creating the import options with spreadsheetImportOptions().
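A minimal sketch of that suggestion (the file name, sheet, and ranges are assumptions to adapt): every property is specified up front, so readtable can skip the expensive auto-detection pass.

    nVars = 450;                                       % number of columns in the sheet
    opts = spreadsheetImportOptions('NumVariables', nVars);
    opts.Sheet = 1;
    opts.VariableNamesRange = 'A1';                    % header row
    opts.DataRange = 'A2';                             % data starts on row 2
    opts.VariableTypes = repmat({'double'}, 1, nVars); % all columns are numeric
    T = readtable('testOutput.xlsx', opts);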
2 Comments
Ajay Kumar on 7 Sep 2020
Thanks. I will read up on this function and try it out. The format of the test output sheet will not change very often, but if it does, will I have to update the options object I create?
J. Alex Lee on 7 Sep 2020
Yes, the idea is to fully specify the import parameters so that they don't have to be auto-detected.
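If only the column types change, the existing object can be retyped in place with setvartype; if columns are added or removed, the options object has to be recreated with the new NumVariables. A hedged sketch (the variable names targeted below are hypothetical):

    opts = setvartype(opts, 'double');                    % retype every variable
    opts = setvartype(opts, {'Var12','Var13'}, 'single'); % or target specific columns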


More Answers (0)

Version

R2017b
