I have a large array stored in a .dat file (see Example.dat attached) and I need to import the array into MATLAB.
At the moment I am using the following approach to load the table and convert it to an array.
Example_Table = readtable("Example.dat");
Example_Array = table2array(Example_Table);
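For reference, this is roughly how I measured the two stages separately (a simple tic/toc sketch around the same two calls, on the attached Example.dat):

```matlab
% Time reading and conversion separately to confirm where the bottleneck is
tic;
Example_Table = readtable("Example.dat");
tRead = toc;       % time spent parsing the text file

tic;
Example_Array = table2array(Example_Table);
tConvert = toc;    % time spent converting the table to a numeric array

fprintf("read: %.2f s, convert: %.2f s\n", tRead, tConvert);
```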
This process is, however, taking much longer than I would expect, given that I have a reasonably powerful PC.
I suspect that the issue is related to the array having a large number of zero entries.
The results of Run & Time are shown below.
It is clear that almost all of the time is spent reading the table, not converting it to an array.
The timing profile of table.readTextFile>textscanReadData is shown below, where nearly all of the time is spent handling the TreatAsEmpty option (presumably because of the many zero entries?).
Below is a snapshot of the CPU and RAM usage while the table is being read.
It is clear that a lot of computational power is sitting idle, so it should be possible to speed this process up one way or another.
How can I make this process run faster?
I have to read in a lot of data like this, so it is a very frustrating process.
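For context, one alternative I have been considering is readmatrix (available since R2019a), since my data is purely numeric, though I don't know whether it avoids the same bottleneck:

```matlab
% Skip the table entirely and read straight into a numeric matrix
Example_Array = readmatrix("Example.dat");

% Or steer the parser explicitly via import options, in case the
% automatic format detection is what is slowing things down
opts = detectImportOptions("Example.dat");
Example_Array = readmatrix("Example.dat", opts);
```

I would welcome advice on whether this (or something else entirely) is the right direction.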
Thanks in advance!