Hyperthreading & Number of Cores. Parallel computing toolbox

191 views (last 30 days)
I am using:
MATLAB Version: 8.3.0.532 (R2014a)
MATLAB License Number:
Operating System: Microsoft Windows 8.1 Version 6.3 (Build 9600)
Java Version: Java 1.7.0_11-b21 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
----------------------------------------------------------------------------------------------------
MATLAB Version 8.3 (R2014a)
Parallel Computing Toolbox Version 6.4 (R2014a)
I checked my number of cores:
feature('numcores')
MATLAB detected: 4 physical cores.
MATLAB detected: 8 logical cores.
MATLAB was assigned: 8 logical cores by the OS.
MATLAB is using: 4 logical cores.
MATLAB is not using all logical cores because hyper-threading is enabled.
ans =
4
I also typed:
>> p = parpool(8)
p.NumWorkers
Error using parpool (line 99)
You requested a minimum of 8 workers, but the cluster "local" has the NumWorkers property set to allow a maximum of 4 workers. To run a communicating job on
more workers than this (up to a maximum of 512 for the Local cluster), increase the value of the NumWorkers property for the cluster. The default value of
NumWorkers for a Local cluster is the number of cores on the local machine.
How can I allocate all 8 logical cores for MATLAB to use when running parfor, for example? In other words, how can I make MATLAB use (or not use) hyperthreading?
thanks

Accepted Answer

Jill Reese
Jill Reese on 9 Apr 2014
Edited: MathWorks Support Team on 1 Sep 2022
From the MATLAB desktop Parallel menu, select Create and Manage Clusters (or if you are using MATLAB version R2018a or earlier, select Manage Cluster Profiles). This opens the Cluster Profile Manager window. Select a local cluster profile, click the Edit button, and change the value of the NumWorkers property.
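The same change can be made programmatically. A minimal sketch, assuming the default "local" cluster profile (names and the worker count of 8 match this question's machine):

```matlab
c = parcluster('local');   % load the local cluster profile
c.NumWorkers = 8;          % allow up to 8 workers (all logical cores here)
saveProfile(c);            % persist the change to the profile
p = parpool(c, 8);         % start a pool with 8 workers
p.NumWorkers               % confirm the pool size
```

Note that saveProfile makes the change permanent for that profile; skip it if you only want a one-off larger pool.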
  9 comments
Jeff L.
Jeff L. on 4 Jun 2014
To further follow up and put another nail in the coffin of my question, I did an additional test on a Mac laptop with 4 physical cores and 8 logical cores. When I ran my program (using parfor), it used 4 workers and took ~200 secs to run. The OSX Activity Monitor utility reported 50% CPU usage (essentially 4 of the 8 cores were used according to the monitor).
Then, I opened a second instance of MATLAB on the same computer and ran the program simultaneously in BOTH instances. This time, the OSX Activity Monitor reported 100% CPU activity and all 8 cores were fully used. HOWEVER, the run time for each instance was now 400 seconds, i.e. it took twice as long to run two instances of the program. This anecdotally shows that when running only one instance, it really is using close to 100% of the CPU resources, despite what OSX Activity Monitor says.
I hope this saves someone some time in the future. I wasted a full day trying to milk more performance out of my system.
Kay Gemba
Kay Gemba on 19 May 2016
Jeff, the 'additional' 4 logical cores share their execution resources with the 4 physical cores, so using 8 workers will not help once the physical cores are already saturated. Manufacturers can then sell you '8' cores ;)
~kai


More Answers (1)

John Videtich
John Videtich on 13 Apr 2018
To resurrect an old thread, I ran a test on a machine with 24 cores / 48 logical (Xeon Gold 6146 x 2).
With 24 workers my calculation was taking 25.5 seconds. With 36 workers it was down to 21 seconds. With 40 workers it was down to 20.3. With 46 workers it was down to 19.8 seconds.
So it would seem hyperthreading does provide a real benefit - in this case, a 22% reduction in run time (25.5 s down to 19.8 s). I'm sure it's workload-dependent.
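A benchmark of the kind described above can be sketched as follows; the svd workload and iteration count are illustrative assumptions, not the poster's actual calculation, and the profile's NumWorkers must already allow the largest pool size:

```matlab
% Time the same parfor workload at several pool sizes.
for w = [24 36 40 46]
    delete(gcp('nocreate'));      % close any existing pool
    parpool('local', w);          % start a pool with w workers
    tic
    parfor i = 1:480
        s = svd(rand(400));       % arbitrary CPU-bound work per iteration
    end
    fprintf('%d workers: %.1f s\n', w, toc);
end
```

The first timed run after opening a pool includes some warm-up overhead, so repeating each measurement gives more reliable numbers.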
  1 comment
Shifan Gao
Shifan Gao on 8 May 2018
Add-on: my task takes 33 h with a pool of 4 workers and 19 h with 8 workers (4-core i7). So does the new edition support that kind of setting?
