I have a somewhat large model that I am simulating repeatedly (with slight variations of the parameters) in SimBiology, and it consistently results in "Out of Memory" errors. With the `memory` command I can see that each time I run the line

`simData = sbiosimulate(virtualPatientModel, configSettings, doses);`

the "Memory used by MATLAB" figure jumps by about 4 GB, so the loop does not get through many iterations.

If I run `clear simData`, the memory is not released (though the variable is removed from the workspace). Running `clear all` does not release it either. Setting the number of output times to something reasonably small and logging only 2 species does not reduce the additional memory consumed per iteration. If I assign every property of `simData` to its own variable, `whos` shows that none of them use much memory (KBs at most). So I'm confused about what is actually using the memory and how to release it.

The behavior is the same in R2022a and R2020b, and with both the sundials and ode23t solvers. (Integration is extremely slow with ode45 and ode15s.)

Unfortunately I cannot share the model, because both its structure and its parameters are proprietary, and I have been unable to reproduce the problem in a minimal working example: in a smaller test model, clearing `simData` releases the memory as expected, and reducing the number of output times reduces the memory accumulated per iteration.

What is causing this, and how can I diagnose or fix it? A sketch of my loop follows below.
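
In case the structure helps, here is a skeleton of what the loop looks like. The species names, the output-time grid, `parameterSets`, and the parameter-update step are placeholders, since I can't share the real project; `virtualPatientModel`, `configSettings`, and `doses` are as above.

```matlab
% Skeleton of the simulation loop (names and values are placeholders).
configSettings = getconfigset(virtualPatientModel, 'active');
set(configSettings, 'SolverType', 'sundials');               % same behavior with 'ode23t'
set(configSettings.SolverOptions, 'OutputTimes', 0:1:100);   % reasonably small output grid
set(configSettings.RuntimeOptions, 'StatesToLog', {'SpeciesA', 'SpeciesB'});  % only 2 species logged

for k = 1:numel(parameterSets)
    % ... apply the k-th parameter variation to virtualPatientModel ...

    simData = sbiosimulate(virtualPatientModel, configSettings, doses);

    % ... extract the handful of values I actually need from simData ...

    clear simData                 % removes the variable, but the memory is not released

    mem = memory;                 % Windows only
    fprintf('After run %d: MATLAB is using %.1f GB\n', k, mem.MemUsedMATLAB/2^30);
end
```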