Matlab R2018b, slow execution issue

Good morning,
After having used Matlab R2016a, I finally switched to Matlab R2018b to get some new functionality I needed, and hopefully also faster execution times (as I had been told during a presentation session of the product).
Nevertheless, it turns out that the model I was running on Matlab R2016a takes even more time on the R2018b version, going from 7 min to 20 min. On the trial version I tested before, I had also experienced this, but after several runs the execution time quickly diminished to 5 min. So I thought it was due to a progressively improving function call management... But now that I have a "real" version of Matlab R2018b, it always takes a long time to execute (20 min), even after several runs. It's particularly disturbing because Matlab freezes during the execution and nothing else can be done in parallel.
Would someone have a clue as to why this happens and what I could try to solve the problem? Memory management and vectorization are approaches I have already tried.
To give some more details about my script: it manages several large multidimensional double arrays (400x8760x40), uses array-manipulation functions like repelem, repmat, and accumarray, and only basic operations (+, -, .*, ./). It also interacts intensively with Excel through the xlsread1 and xlswrite1 functions. I'm on 64-bit Windows with 4 cores.
Thank you in advance for your help,
Have a nice day,
Eric

16 comments

Bruno Luong
Bruno Luong on 15 Nov 2018
I did not notice any slowdown.
Did you profile your code to narrow down the culprit?
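For reference, a minimal sketch of that profiling workflow (`run_my_model` is a placeholder for the model's actual entry point, not a function from the question):

```matlab
% Profile one full run of the model and inspect where the time goes.
profile on          % start collecting timing data
run_my_model();     % placeholder: call the slow script or function here
profile off         % stop collecting
profile viewer      % open the report; sort by "Self Time" to find hot spots
```

Running the same report under R2016a and R2018b and comparing them side by side makes a release-specific regression much easier to spot.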
dpb
dpb on 16 Nov 2018
Interacting with Excel via COM is notoriously slow, although I don't know why it should change drastically between releases...
As Bruno says, profile the code and show those results; maybe that will provide a clue...
Jan
Jan on 16 Nov 2018
"it was due to a progressively improving function call management" - this is a pure guess. It is more likely that you access the same file multiple times and it is already stored in the disk cache.
Eric Francois
Eric Francois on 19 Nov 2018
Thank you very much Bruno, dpb and Jan for your answers,
Well, I did profile the code to try to find the culprit. I have several clues:
  • clearvars now takes 10 times longer than before, going from 12 sec to 2 minutes...
  • It seems the main difference in execution time is spread over "other lines". More specifically, I've seen that all the "for" bucles I had written (with preallocation) to progressively fill tables take noticeably longer...
  • Finally, the Excel-Matlab interaction is not the problem.
Do the first two points remind you of an issue that could be handled another way? I will think of a way to avoid those for loops, but I had written them because previously, with R2016a, memory issues prevented me from applying my operations to the whole inputs at once.
Thank you in advance, Eric
dpb
dpb on 19 Nov 2018
clearvars is rarely actually needed; storing over the previous variable reallocates automagically. That is a remarkable difference, however, and probably worth sending the profiler results to TMW as a support issue, for their awareness if nothing else.
"I've seen all the bucles "for" I had done ..." - "bucles"???
It might still be possible to vectorize portions of the code using partial datasets rather than the whole, depending on the calculation, or there are other data-handling tools for large data that might be better suited.
We would need more details of the actual application, and the code to go with the profile, to say much more in detail.
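A small sketch of the overwrite-instead-of-clearvars point above, with array sizes chosen to match the question:

```matlab
% Overwriting a variable frees the old storage automatically, so a
% blanket clearvars between steps is usually unnecessary.
A = rand(400, 8760, 40);   % ~1.07 GB of doubles
A = rand(400, 8760, 40);   % previous contents are released on overwrite

% If a large array is genuinely finished with, clearing just that one
% name is cheaper than clearvars, which scans the whole workspace:
clear A
```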
Bruno Luong
Bruno Luong on 19 Nov 2018
If clearvars takes that long, I suspect part of the memory is being swapped to the HD, meaning you are starting to have a serious memory-pressure issue.
Personally, I don't use clearvars either.
dpb
dpb on 19 Nov 2018
Good thought on memory thrashing, Bruno... The additional resources required by the updated release could be just enough to push what previously fit in memory over the top and cause such an issue.
You could use the Task Manager to watch memory usage and disk access to check, or one of the other freeware tools available that are a little more capable or simpler for the specific purpose.
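From within MATLAB itself, the Windows-only `memory` function gives the same picture as Task Manager; a rough sketch:

```matlab
% Report how much memory MATLAB is using versus what the system has.
% A small MaxPossibleArrayBytes relative to the arrays being created
% is a strong hint that swapping (thrashing) is the bottleneck.
[userview, systemview] = memory;
fprintf('MATLAB memory in use:   %.2f GB\n', userview.MemUsedMATLAB / 2^30);
fprintf('Largest possible array: %.2f GB\n', userview.MaxPossibleArrayBytes / 2^30);
fprintf('Physical RAM available: %.2f GB\n', systemview.PhysicalMemory.Available / 2^30);
```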
dpb
dpb on 19 Nov 2018
Although with current systems,
>> 400*8760*40*8/1024/1024
ans =
1.0693e+03
(about 1.07 GB per array) really isn't all that large, so unless there are more than just a "few" of these, one would still think all should fit in memory. Here, on a pretty low-end machine with only 8 GB of physical memory and R2017b, I could hold ~10 of those in what is shown as free memory; in reality, more than 8 would likely begin to crowd things. It doesn't seem all that unreasonable, although the use of repelem and repmat can make things grow in a hurry if one isn't careful.
dpb
dpb on 19 Nov 2018
One last question for the OP -- could your algorithms work with single instead of double? That would cut memory requirements by 2X, if single offers sufficient precision.
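A quick sketch of the double-to-single comparison, using the array size from the question:

```matlab
% double uses 8 bytes per element, single uses 4, so conversion
% halves the footprint of each array.
D = rand(400, 8760, 40);   % ~1.07 GB as double
S = single(D);             % ~0.53 GB as single
whos D S                   % compare the Bytes column

% The trade-off: single carries roughly 7 significant decimal digits
% (eps('single') is about 1.2e-7), so check the model tolerates that.
```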
Eric Francois
Eric Francois on 20 Nov 2018
Thank you again Bruno and dpb,
My code follows numerous calculation steps, so I need to create variables to store and use some data from Excel and then, once used, discard them with clearvars. I'm surprised to hear it isn't commonly used, because in my case I don't see another way of doing it...
Meanwhile, I'm currently modifying my script to convert the data to single arrays, hoping it will help; I'll tell you more soon.
Thank you again,
PS: dpb, "bucles" was for "loops", sorry...
Eric
Bruno Luong
Bruno Luong on 20 Nov 2018
Usually I put code in functions, each performing a specific task. For example, if I read an Excel sheet, I'll create a ReadExcel file. All the local variables are automatically cleared when I exit the function, keeping only the reading result as output. So I do not need to call clearvars.
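A minimal sketch of that pattern (the file name and the columns kept are placeholders, not from the question):

```matlab
% Each processing step lives in its own function; its temporaries are
% released automatically when the function returns, so no clearvars.
function data = ReadExcel(filename)
    raw = xlsread(filename);   % large temporary, local to this function
    data = raw(:, 1:3);        % placeholder: keep only what the caller needs
end                            % raw is freed here automatically
```

Called as `data = ReadExcel('inputs.xlsx');`, only `data` survives in the caller's workspace.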
Eric Francois
Eric Francois on 20 Nov 2018
Well, so it was indeed a memory congestion problem... After converting all my inputs to single arrays, the execution time drops from 10 min to 1.5 min! So I'll look again at how I can manage memory better.
Bruno: now that I know it is a memory issue, using functions wouldn't solve my problem, would it? Because the function call creates implicit variables anyway, and they take the same memory space, no? Unless it saves the clearvars running time; but do you know whether functions perform the same "clearvars" process on their implicit variables?
Thank you very much, this has been a big step toward improving my code,
Bruno Luong
Bruno Luong on 20 Nov 2018
No, indeed, putting code in functions doesn't help to reduce memory, but at least it minimizes the risk of forgetting to clear big variables.
dpb
dpb on 20 Nov 2018
Edited: dpb on 20 Nov 2018
ML does "smart copy on write" when making copies of function arguments -- it will not make local copies of input variables needlessly, so using functions is not a penalty in that sense, if that is what you are asking.
What it does, as Bruno says, is that all local variables constructed inside the function are automagically destroyed when the function goes out of scope, so you don't need to remove them explicitly as you do when everything is in a main script.
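A sketch of what copy-on-write means in practice (local functions at the end of a script require R2016b or later; `total` is an illustrative name):

```matlab
% Passing a large array into a function does NOT duplicate it;
% a copy is made only if the function writes to its input.
A = rand(400, 8760, 40);
s = total(A);        % no copy: A is only read inside the function

function s = total(X)
    s = sum(X(:));   % X shares memory with the caller's A
end
```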
Eric Francois
Eric Francois on 22 Nov 2018
OK, thank you very much to all of you for your answers,
Have a nice day,
Eric
jose
jose on 24 Jan 2019
I have the exact same issue; my version is 'a':
R2018a, 64-bit win64.
After passing through all the intense matrix calculations, cell2mat and other functions, it remains frozen, 'wondering', for a very long time compared to what those calculations took.
Calculations take 50 seconds.
It freezes for 180 seconds.
The Windows Task Manager shows no sign of running out of CPU or memory.
I need to repeat those calculations some thousands of times, so any help will be much appreciated.
12 cores
32 GB RAM


Answers (2)

Ignacio Ortiz de Landazuri
Ignacio Ortiz de Landazuri on 4 Feb 2019


I have the same problem, on both PC and Mac. R2018a is much slower than R2013b. I have run tests with different versions of Matlab on both platforms. The more recent the version, the slower the test.
Mahkalan
Mahkalan on 27 Feb 2019


I have a similar issue; code that would take 5 minutes in 2017b now takes over an hour and sometimes never finishes (after waiting 5 hours...). I found one issue, which could be solved by preallocation. Without preallocation, the function became slower and slower, so much so that 2 lines of code would take a good fraction of a second to execute (are we back in the 1970s?).
After resolving that, I ran into the cell2mat issue, and now the Neural Network Toolbox's train function is much slower than it was in the 2017b version.
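For readers hitting the same growth problem, a minimal sketch of the preallocation difference:

```matlab
% Growing an array inside a loop reallocates and copies on every
% iteration, which is what makes a function slower and slower.
tic
x = [];
for k = 1:1e5
    x(k) = k^2;            % x is resized 1e5 times
end
toc

% Preallocating sizes the array once up front.
tic
y = zeros(1, 1e5);
for k = 1:1e5
    y(k) = k^2;            % writes into existing storage
end
toc
```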
