Why can't I repeat the nntool result?

I wanted to reproduce it in code. All the parameters are the same, and I copied all the weights from nntool into the generated m-file. However, it does not give me the same accurate target results that nntool gave me.

3 comments

Greg Heath on 22 Apr 2015
Insufficient info
zmywahrheit on 22 Apr 2015 (edited 22 Apr 2015)
function [y1] = myNeuralNetworkFunction(x1)
%MYNEURALNETWORKFUNCTION neural network simulation function.
%
% Generated by Neural Network Toolbox function genFunction, 21-Apr-2015 18:28:41.
%
% [y1] = myNeuralNetworkFunction(x1) takes these arguments:
% x1 = 16xQ matrix, input #1
% and returns:
% y1 = 1xQ matrix, output #1
% where Q is the number of samples.
%#ok<*RPMT0>
% ===== NEURAL NETWORK CONSTANTS =====
% Input 1
x1_step1_xoffset = [4.44;0;-10.2249;129.0924632;0;0;9;10;1;0;3;0;27.6414;0;0;0];
x1_step1_gain = [0.00373775883979966;0.111111111111111;0.0926822712717398;0.00151381078706761;2;0.333333333333333;0.0188679245283019;0.0208333333333333;0.0588235294117647;0.0645161290322581;0.027027027027027;0.2;0.00586066846784544;0.0555555555555556;0.4;0.666666666666667];
x1_step1_ymin = -1;
% Layer 1
b1 = [1.0187152477488607971;0.75546368634858018787;-0.74885827917346814431;0.49840389395597878286;0.40556335746536542608;0.36926239362124896326;0.21123730116657193912;-0.025373121452792912756;0.19967990101952995396;0.27041719258768764922;-0.40836152912273004922;-0.49895333981394524026;0.58280744543833351567;-0.98279507860101433803;-0.85307328904034396277];
IW1_1 = [0.021623484744762562493 -0.018729547913856961494 -0.4463095337965073961 0.11658736719799629722 0.1820883672252611718 -0.49980347081214338001 -0.004160165007411408683 -0.082218866386227537624 0.39917305997529339834 0.61907898643480729906 -0.1301702436426000753 0.14846716996946748846 0.38332367267454187099 0.1160773086298665796 0.15962567365300706079 -0.81549119765880584421;-0.10176757683786431807 -0.067082902494107324309 -0.24478093477504409003 0.14471594138371229876 -0.29774894183487810029 -0.15262040268887105965 -0.1986863270020334804 -0.26232351234069195556 0.12076020028427790487 0.23964909545605658781 -0.25164584469519668541 0.042810462694995617128 -0.0065787881505747162342 -0.32900178882780067857 0.31117611626381352918 -0.23405362834604623301;0.12642001515766773667 0.33785444088147437158 -0.35668556064442535902 0.068482427549264271449 -0.27659831693846009815 -0.05000011369585958404 -0.55287632646485118659 -0.18273422944920805766 0.36740541696062767318 -0.083338139434556784968 -0.085242419162411112166 -0.5084706279912846183 -0.24608516698393126387 0.33748127795134863494 -0.39502640692026746994 -0.73692558916202011154;-0.6024836200322895019 -0.044341517391508072921 0.77050640211124510337 0.67517290557074927104 0.10732996530031993498 -0.056070470095972400892 0.36145018684669921738 0.59180577881668994245 -0.062793441730618085317 -0.38766962046315128765 0.031648772372364131211 0.35006320617875125167 0.40827990477034670214 0.42839464093765761143 0.35428743585882521705 -0.88376568275760625237;-0.070694292106749476168 0.32371000911750785756 -0.54861817907246968051 -0.048991714583014378537 0.099183138491613601073 0.035463281719314898588 -0.17331118448100502061 -0.41584021655143516005 0.42910030147969091541 -0.11292108343009227278 -0.18113834644202786439 -0.087380999330328779084 -0.64096989986820518403 -0.98889498564086930621 0.14595010149925036202 -0.021614628789299143719;-0.054613454031374304709 0.24183805266020680369 -0.33931160461633469438 0.23508716012510333382 -0.22839672199252916696 0.40557937727428328145 0.31440148719921051645 0.062138991773293350551 -0.26859824380939545385 0.065983167140675549889 0.57995749608274271836 -0.30943415740875024333 0.14902627596684581146 -0.54694070553067097862 0.067651012378648206491 -0.48528487771831452235;0.047083020048724041895 0.12880558289986507314 -0.59400917851867540698 0.10794151475478487578 0.34986318694142370944 0.18679057874986843002 -0.23147659936162809369 0.0073143421783514392387 -0.086283885220526296278 -0.33002803393629370765 -0.024406428841466265567 -0.48965341982587790026 -0.31992183431446652442 -0.045819320894121037813 0.58283465600641481963 -0.30968005505786466713;-0.3899578417391355023 -0.3545014135780459319 0.11905622693406950341 -0.37971169719620306848 0.17049635475923849426 -0.17700412936504997252 -0.024330279753830768436 0.19511605725330882777 -0.41841029251049355331 -0.65596825287026527818 -0.24046404575081095634 0.1265011821005138648 0.043074609459639527642 0.0034185643483677996893 0.55447843109332373146 0.069441448706866457097;0.2165984661241129039 -0.03639162081514803887 -0.43480685896596982243 -0.23015794239894921125 -0.0061073632219754165479 0.13630716513286747493 -0.074504517783738150039 -0.020801561855184346611 -0.22676130995533175239 -0.26098279461622331254 -0.21400647569202768672 0.1926333324946392922 -0.45763035962025572223 -0.09621865524353988286 -0.26336989282172346138 -0.15495756018466588788;0.59226151882540312155 -0.11846113554004245039 0.1811301780101980119 0.11178390714198771505 -0.2639121408061909646 -0.23352989733955861751 -0.29149781037068411838 -0.043112544156266863205 0.0025936312271751344946 0.35163171433456030801 0.44304808428636716089 0.30308631041526185035 -0.25766484953169138672 -0.23995409434979564356 -0.64789012536203138826 -0.032048468293470910861;-0.39741292508115777515 0.033937845219689247167 0.21231650762250878217 -0.16953782105594311647 -0.40111432350039061268 0.76064845012240545064 -0.28650635121549933304 0.030500299138605389132 -0.14471965778510390011 -0.40312709703282850748 0.35751107611518712082 -0.31345831366352822078 -0.051574929318761376928 -0.16371365878678934003 -0.37979655532444256405 0.096985786826975159403;-0.10206950748650463201 0.2963128613567546088 -0.19784353761856096621 -0.3407339686979944382 0.0013000490374522467689 0.20453516605689817109 -0.072407456069634570928 0.18524576666476680331 0.34025151792148905505 0.13434184950876373876 -0.23004482630279216027 0.25951302727189096142 -0.11018146315880740505 0.18269127748774915965 -0.093677296755658653882 0.28015544786112972497;0.15804321766437562369 0.27736273470164291011 0.029885764775259726633 0.071305628812854279786 0.36335316955670526884 -0.074113555376563985955 0.37931599722472586489 0.17791751968753624635 -0.1209976805362879132 -0.3869303866517806445 0.05269103041043135538 0.00020431331478683724227 0.038762896212724287648 -0.39227080963337340735 -0.24936723713761579924 0.054042450731183008794;-0.93010226428395048615 -0.24592193293273242882 0.61063150629643236922 0.13251463249196523964 -0.37325973918559679499 0.25542482884626505069 0.35406453808947291328 0.45158322514291276972 -0.26417507175366061745 -0.76146453291028892707 0.031782236862040728897 -0.010010272582272590133 0.35527995663387901271 -0.13919861188674931274 -0.13718061928257815896 0.37830414218503993773;-0.21546932791073386548 0.26194025091142536521 0.34326841632467824939 0.052651407753684327517 0.29488725655765901656 -0.21435263996970860267 0.16487456563763155937 0.079291067227972883424 -0.07783122393518961124 -0.10225278949890470592 0.051321273549401601188 0.31206229131298474089 0.20356035099956140688 0.22621081144196322765 0.27719765247120053964 -0.18331648955562909742];
% Layer 2
b2 = -0.13483127316506471338;
IW2_1 = [0.27297526199238836275 -0.01061329768955520704 -0.017072594911850123151 -0.20712007366106333084 0.055965215774075628696 -0.40906560459621588155 0.28553803362971003077 0.2121924425974551498 0.48674934257347901578 0.092763713566969135949 -0.37525710716054566651 -0.41254929207087043252 -0.14795619655743660892 -0.22245882237440342544 -0.63015860115665145447 0.44851328525886757781];
LW2_1 = [-0.87783627413367593117 0.43448999000928678615 0.92595127483084715969 1.2425845410116103107 1.2418182495443916835 -0.73587261033737727534 -0.63558561108775402104 -0.86250864408726424237 0.37951627646893132395 -0.67129875596089905176 -0.89876237178417928142 -0.11890411404320742894 0.41562304333597371864 1.1287486653429303907 -0.20881407741400892508];
% Output 1
y1_step1_ymin = -1;
y1_step1_gain = 0.02000400080016;
y1_step1_xoffset = 0;
% ===== SIMULATION ========
% Dimensions
Q = size(x1,2); % samples
% Input 1
xp1 = mapminmax_apply(x1,x1_step1_gain,x1_step1_xoffset,x1_step1_ymin);
% Layer 1
a1 = tansig_apply(repmat(b1,1,Q) + IW1_1*xp1);
% Layer 2
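% Note: the IW2_1*xp1 term below is an input-to-output skip connection,
% i.e. a cascade-forward architecture rather than a plain feedforward net.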
a2 = repmat(b2,1,Q) + IW2_1*xp1 + LW2_1*a1;
% Output 1
y1 = mapminmax_reverse(a2,y1_step1_gain,y1_step1_xoffset,y1_step1_ymin);
end
% ===== MODULE FUNCTIONS ========
% Map Minimum and Maximum Input Processing Function
function y = mapminmax_apply(x,settings_gain,settings_xoffset,settings_ymin)
y = bsxfun(@minus,x,settings_xoffset);
y = bsxfun(@times,y,settings_gain);
y = bsxfun(@plus,y,settings_ymin);
end
% Sigmoid Symmetric Transfer Function
function a = tansig_apply(n)
a = 2 ./ (1 + exp(-2*n)) - 1;
end
% Map Minimum and Maximum Output Reverse-Processing Function
function x = mapminmax_reverse(y,settings_gain,settings_xoffset,settings_ymin)
x = bsxfun(@minus,y,settings_ymin);
x = bsxfun(@rdivide,x,settings_gain);
x = bsxfun(@plus,x,settings_xoffset);
end
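
A quick way to confirm that the exported constants really match the trained network is to run the network object and the generated function on the same data; a minimal sketch, assuming the trained net object net and the 16xQ input matrix x are still in the workspace:

% Sanity check: the generated function should reproduce the net exactly
yNet = net(x);                      % simulate the trained network object
yFun = myNeuralNetworkFunction(x);  % run the exported standalone function
disp(max(abs(yNet - yFun)))         % should be ~0, i.e. round-off only

If that discrepancy is large, the mismatch is in the copied constants or the normalization settings, not in the simulation code of the function.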
zmywahrheit on 22 Apr 2015
I copied IW1_1, IW2_1, LW2_1, b1 and b2 from nntool into this code. What else do I need to reproduce the result? Thank you!


Accepted Answer

Greg Heath on 22 Apr 2015

1 vote

Frequently, the reason two supposedly equivalent trainings don't yield the same result is that the initial state of the random number generator is not the same.
Frequently, the reason testing of two supposedly equivalent nets doesn't yield the same result is that the input data normalization and output data denormalization are not the same.
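
For example, a minimal sketch covering both points, assuming a 15-neuron fitnet and training data x and t (all names illustrative):

rng(0)                       % fix the seed so weight initialization repeats
net = fitnet(15);            % illustrative architecture
net = configure(net, x, t);  % initial weights are now reproducible
net = train(net, x, t);
% Verify that the net's normalization matches the generated function:
net.inputs{1}.processSettings{:}   % mapminmax gain/xoffset/ymin for the input
net.outputs{2}.processSettings{:}  % mapminmax settings for the output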
Hope this helps.
Greg

4 comments

zmywahrheit on 22 Apr 2015
How can I set the initial state of the random number generator? Can I copy it from nntool into my code? It is strange, because the nntool results (same method and parameters) are always better than my code's; I have tried more than 30 times.
Greg Heath on 23 Apr 2015
RNG initialization is only used for training. So, that is not the problem.
I do have a comment: The standard configuration with IW2_1 = 0 is a universal approximator.
Why did you feel compelled to add this skip-layer complication?
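(For reference: in the toolbox, feedforwardnet builds the standard configuration with no skip connection, while cascadeforwardnet adds one; a minimal illustration:)

net1 = feedforwardnet(15);     % input -> hidden -> output only (no IW2_1)
net2 = cascadeforwardnet(15);  % input also feeds the output layer directly,
                               % which produces the IW2_1 term seen above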
Also, it would have helped if you had posted the results of the command
whos
Before the simulation section I get
>> whos
  Name                Size     Bytes  Class     Attributes
  IW1_1              15x16      1920  double
  IW2_1               1x16       128  double
  LW2_1               1x15       120  double
  b1                 15x1        120  double
  b2                  1x1          8  double
  x1_step1_gain      16x1        128  double
  x1_step1_xoffset   16x1        128  double
  x1_step1_ymin       1x1          8  double
Good Luck
Greg
zmywahrheit on 23 Apr 2015 (edited 23 Apr 2015)
I just noticed that nntool forces the predicted values into a proper range. I think that is why nntool gets better results. As you can see, the right figure was generated by nntool; in the left figure, the x axis is the real (target) value and the y axis is the value calculated by my function. How can I force my calculated values into a proper range? Thank you!
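(One simple way to do that is to clamp the function's output to the range of the training targets; a sketch, assuming t holds the 1xQ training targets:)

tmin = min(t);                  % range observed in the training targets
tmax = max(t);
y1 = min(max(y1, tmin), tmax);  % clamp predictions into [tmin, tmax]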
Greg Heath on 24 Apr 2015
Maybe you need to filter out bad data points before designing a model. I use the following info before deciding whether to delete or modify data:
minmax(x)
minmax(t)
plot(x,t)
zx = zscore(x);
zt = zscore(t);
minmax(zx)
minmax(zt)
plot(zx,zt)
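
As a concrete follow-up to that checklist, one common way to act on it is to drop samples whose standardized values are extreme; a sketch, assuming x is 16xQ, t is 1xQ, and an illustrative cutoff of |z| > 3:

zx = zscore(x, 0, 2);             % standardize each input row across samples
zt = zscore(t, 0, 2);             % standardize the target row
bad = any(abs([zx; zt]) > 3, 1);  % columns containing any extreme value
fprintf('Removing %d of %d samples\n', nnz(bad), numel(bad));
x(:, bad) = [];                   % delete the suspect samples
t(:, bad) = [];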
Hope this helps. Greg


More Answers (0)
