Negative TrainedVariance error when running the MATLAB image GAN example (I changed the code to enlarge the image size from 64x64 to 128x128)

I followed this post: https://www.mathworks.com/matlabcentral/answers/585146-how-to-change-gan-example-to-generate-images-with-a-larger-size to change the image size processed by the GAN example from 64x64 to 128x128. Someone at the end of that post said they got it working, but I get a negative TrainedVariance error when running it.
Here is the error message:
Error using nnet.internal.cnn.dlnetwork/set.State
Layer 'batchnorm_1': Invalid State. Expected TrainedVariance to be positive.
Error in dlnetwork/set.State (line 551)
net.PrivateNetworkStorage.State = values;
Error in trainGAN (line 70)
netG.State = stateG;
The error appears after training has run for 1 to 2 minutes.
I searched online and tried to replace the line netG.State = stateG; with this code:
idx = netG.State.Parameter == "TrainedVariance";
boundAwayFromZero = @(X) max(X, eps('single'));
netG.State(idx,:) = dlupdate(boundAwayFromZero, netG.State(idx,:));
But this cannot work on its own, because with the replacement netG.State never receives the updated state from stateG.
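(For reference, a minimal sketch of how this clamp would presumably be combined with the original assignment, so the state still comes from stateG before TrainedVariance is bounded away from zero; this is an assumption based on the dlupdate pattern above, not a verified fix:)
netG.State = stateG;                                   % keep the state returned by the model gradients function
idx = netG.State.Parameter == "TrainedVariance";       % rows of the state table holding the trained variances
boundAwayFromZero = @(X) max(X, eps('single'));
netG.State(idx,:) = dlupdate(boundAwayFromZero, netG.State(idx,:));  % clamp so the variance stays positive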
3 comments
Suhail Mahmud on 9 Nov 2022
Hi @Ziqi Sun, were you able to resolve the TrainedVariance issue and generate images larger than 128 pixels?


Answers (1)

Ayush Modi on 12 Jan 2024
Hi Ziqi,
As per my understanding, you are trying to process 128x128 images in the GAN example but are getting a negative TrainedVariance error while running it.
I was able to achieve this by following the answer provided by @Fred Liu in the MATLAB community question you are following (https://www.mathworks.com/matlabcentral/answers/585146-how-to-change-gan-example-to-generate-images-with-a-larger-size).
Below is the code for your reference:
% Generator: projects the latent vector to 4x4x512, then upsamples to 128x128x3
filterSize = 5;
numFilters = 128;
numLatentInputs = 100;
projectionSize = [4 4 512];
layersGenerator = [
    featureInputLayer(numLatentInputs)
    projectAndReshapeLayer(projectionSize)                                   % 4x4x512
    transposedConv2dLayer(filterSize,8*numFilters)                           % 8x8
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(filterSize,4*numFilters,Stride=2,Cropping="same")  % 16x16
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(filterSize,2*numFilters,Stride=2,Cropping="same")  % 32x32
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(filterSize,numFilters,Stride=2,Cropping="same")    % 64x64
    batchNormalizationLayer
    reluLayer
    transposedConv2dLayer(filterSize,3,Stride=2,Cropping="same")             % 128x128x3
    tanhLayer];
netG = dlnetwork(layersGenerator);
% Discriminator: downsamples the 128x128x3 input to a single probability
dropoutProb = 0.5;
numFilters = 128;
scale = 0.2;
inputSize = [128 128 3];
filterSize = 5;
layersDiscriminator = [
    imageInputLayer(inputSize,Normalization="none")
    dropoutLayer(dropoutProb)
    convolution2dLayer(filterSize,numFilters,Stride=2,Padding="same")    % 64x64
    leakyReluLayer(scale)
    dropoutLayer(dropoutProb)
    convolution2dLayer(filterSize,2*numFilters,Stride=2,Padding="same")  % 32x32
    batchNormalizationLayer
    leakyReluLayer(scale)
    dropoutLayer(dropoutProb)
    convolution2dLayer(filterSize,4*numFilters,Stride=2,Padding="same")  % 16x16
    batchNormalizationLayer
    leakyReluLayer(scale)
    dropoutLayer(dropoutProb)
    convolution2dLayer(filterSize,8*numFilters,Stride=2,Padding="same")  % 8x8
    batchNormalizationLayer
    leakyReluLayer(scale)
    convolution2dLayer(8,1)                                              % 1x1 output
    sigmoidLayer];
netD = dlnetwork(layersDiscriminator);
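As a quick sanity check (a sketch, assuming numLatentInputs = 100 as defined above and the rest of the example unchanged), you can pass a small batch of random latent vectors through the new generator and confirm it produces 128x128x3 images:
ZTest = dlarray(randn([numLatentInputs 8],"single"),"CB");  % 8 random latent vectors
XTest = predict(netG,ZTest);                                % forward pass through the generator
size(XTest)                                                 % should return 128 128 3 8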
I hope this helps!

Version

R2022b
