How does the idTreePartition function of the System Identification Toolbox work?

14 views (last 30 days)
Rocco on 19 Feb 2025
I am trying to replicate what sim(model,inputdata,x0) computes, as I asked in this question: Problem with system identification toolbox example - MATLAB Answers - MATLAB Central, but for a model that uses the idTreePartition function. I tried to do it by looking at the documentation for idTreePartition:
idprops treepartition
Help on properties of the TREEPARTITION nonlinearity estimator.

TREEPARTITION is a nonlinear function y = F(x), where y is a scalar and x is a 1-by-m vector. F is a piecewise linear (affine) function of x:

    F(x) = x*L + [1,x]*C_a + d   when x belongs to D_a,

where L is a 1-by-m vector, C_k is a 1-by-(m+1) vector, and D_k is a partition of the x-space. The active partition D_a is determined as an intersection of half-spaces by a binary tree as follows: first a tree with N nodes and J levels is initialized. A node at level J is a terminating leaf, and a node at a level j < J has two descendants at level j+1. All levels are complete, so N = 2^(J+1)-1. The partition at node r is based on [1,x]*B_r > 0 or <= 0 (move to the left or right descendant), where B_r is chosen to improve the stability of the least-squares computation on the partitions at the descendant nodes. Then at each node r the coefficients C_r of the best linear approximation of the unknown regression function on D_r are computed using a penalized least-squares algorithm. The value F(x) is computed by evaluate(T,x), where T is the treepartition object. At this stage an adaptive 'pruning' algorithm is used to choose the active partition D_a (= D_a(x)) on the branch of partitions of the tree which contains x.

PROPERTIES:

NumberOfUnits - The number of nodes N in the tree. 'Auto' (default) means that N is selected from data by pruning. If set to an integer before estimation, N is given as the largest value of the form 2^(J+1)-1 less than this integer.

Parameters - A structure containing the following fields:
    RegressorMean: a 1-by-m vector containing the means of the regressors in estimation data.
    RegressorMinMax: an m-by-2 matrix containing the maximum and minimum regressor values in estimation data.
    OutputOffset: the scalar d.
    LinearCoef: the m-by-1 vector L.
    SampleLength: the length of estimation data.
    NoiseVariance: the estimated variance of the noise in estimation data.
    Tree: a structure containing the tree parameters, given by the following fields:
        TreeLevelPntr: an N-by-1 vector containing the level j of each node.
        AncestorDescendantPntr: an N-by-3 matrix, such that entry (k,1) is the ancestor of node k, and entries (k,2) and (k,3) are its left and right descendants.
        LocalizingVectors: an N-by-(m+1) matrix whose r:th row is B_r.
        LocalParVector: an N-by-(m+1) matrix whose k:th row is C_k.
        LocalCovMatrix: an N-by-((m+1)m/2) matrix whose k:th row is the covariance matrix of C_k (reshaped as a row vector).

Options - A structure containing the following fields:
    FinestCell: integer or string giving the minimum number of data points in the smallest partition. Default: 'Auto', meaning that it is determined from data.
    Threshold: threshold parameter used by the adaptive pruning algorithm. The smaller the threshold value, the shorter the branch terminated by the active partition D_a; a higher threshold value results in a longer branch. Default: 1.0.
    Stabilizer: the penalty parameter of the penalized least-squares algorithm used to compute the local parameter vectors C_k. A higher stabilizer value improves stability but may deteriorate the accuracy of the least-squares estimate. Default: 1e-6.
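To make the traversal described in that help text concrete, here is a minimal Python sketch of the evaluation without the pruning step: descend from the root to a terminating leaf using the localizing vectors B_r, then apply the leaf's local affine map. The dictionary field names mirror the Tree structure above; this is an illustration of the described algorithm, not MathWorks code.

```python
import numpy as np

def evaluate_tree(x, tree, L, offset):
    """Evaluate F(x) = x*L + [1,x]*C_a + d, taking the active partition
    to be the terminating leaf that contains x (no pruning)."""
    x = np.asarray(x, dtype=float)           # 1-by-m regressor vector
    xa = np.concatenate(([1.0], x))          # augmented [1, x]
    max_level = tree["TreeLevelPntr"].max()  # level J of terminating leaves
    node = 0                                 # root (0-based indexing here)
    while tree["TreeLevelPntr"][node] < max_level:
        B = tree["LocalizingVectors"][node]
        if xa @ B > 0:                       # [1,x]*B_r > 0 -> left descendant
            node = tree["AncestorDescendantPntr"][node, 1]
        else:                                # otherwise -> right descendant
            node = tree["AncestorDescendantPntr"][node, 2]
    C = tree["LocalParVector"][node]         # local affine coefficients C_a
    return x @ L + xa @ C + offset
```

For example, with a 3-node tree (one root, two leaves) that splits on the sign of x, each x is routed to the leaf whose local model covers its half-space.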
load twotankdata
estData = iddata(y,u,0.2,'Tstart',0);             % two-tank data, Ts = 0.2 s
M = nlarx(estData,[1 1 0],'idTreePartition');     % NLARX model, na=1, nb=1, nk=0
L_val = M.OutputFcn.LinearFcn.Value;              % linear coefficients L
tree = M.OutputFcn.NonlinearFcn.Parameters.Tree;  % tree structure
offset = M.OutputFcn.Offset.Value;                % scalar offset d
RegNorm_center = M.Normalization.RegressorCenter; % regressor normalization
RegNorm_scale = M.Normalization.RegressorScale;
OutNorm_center = M.Normalization.OutputCenter;    % output normalization
OutNorm_scale = M.Normalization.OutputScale;
x0 = 0;
regressorVector = [0, estData.u(1)]; % regressors [y(t-1), u(t)] with zero initial state
x = regressorVector;
x = (x - RegNorm_center).*RegNorm_scale; % normalize regressors
maxLevel = max(tree.TreeLevelPntr); % tree max level
node = 1;
while tree.TreeLevelPntr(node) < maxLevel
B_k = tree.LocalizingVectors(node, :)';
if [1, x] * B_k > 0
node = tree.AncestorDescendantPntr(node, 2); %go left
else
node = tree.AncestorDescendantPntr(node, 3); %go right
end
end
C_k = tree.LocalParVector(node, :)';
y = x * L_val' + [1, x] * C_k + offset;
y = y.*OutNorm_scale + OutNorm_center
y = -0.2198
y_sim = sim(M,estData.u(1),x0)
y_sim = 0.0107
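As a side note on the local coefficients used above: the help text says each C_k is computed by a penalized least-squares algorithm whose penalty is the Stabilizer option. A minimal Python sketch, assuming a plain ridge penalty (the exact penalization used by the toolbox is not documented in the help text):

```python
import numpy as np

def local_fit(X, y, stabilizer=1e-6):
    """Fit local affine coefficients C on one partition's data by
    penalized least squares: minimize ||y - [1,X]*C||^2 + s*||C||^2.
    A plain ridge penalty is assumed here for illustration."""
    Xa = np.hstack([np.ones((len(X), 1)), X])         # rows [1, x]
    A = Xa.T @ Xa + stabilizer * np.eye(Xa.shape[1])  # regularized normal matrix
    return np.linalg.solve(A, Xa.T @ y)               # (m+1)-vector C_k
```

With stabilizer near zero this reduces to ordinary least squares; raising it trades fit accuracy for numerical stability, as the help text describes.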
My guess is that I'm missing the adaptive 'pruning' algorithm used to choose the active partition, but I don't know how to replicate it. Is there other documentation I can look up that explains it? Thank you for your attention; I hope the question is clear.
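For what it's worth, here is one way such a branch-pruning step could look, sketched in Python against the same Tree fields. The stopping rule below is a guess, not the documented algorithm: it descends toward the leaf containing x but stops (choosing an interior node as the active partition D_a) once the child's local prediction differs from the current node's by more than `threshold` noise standard deviations, so a higher threshold yields a longer branch, as the Threshold option's description states.

```python
import numpy as np

def evaluate_pruned(x, tree, L, offset, noise_std, threshold=1.0):
    """Hypothetical illustration of adaptive branch pruning; the real
    stopping criterion used by treepartition is not documented here."""
    x = np.asarray(x, dtype=float)
    xa = np.concatenate(([1.0], x))
    max_level = tree["TreeLevelPntr"].max()
    node = 0                                       # start at the root
    pred = xa @ tree["LocalParVector"][0]          # root's local prediction
    while tree["TreeLevelPntr"][node] < max_level:
        B = tree["LocalizingVectors"][node]
        child = tree["AncestorDescendantPntr"][node, 1 if xa @ B > 0 else 2]
        child_pred = xa @ tree["LocalParVector"][child]
        # guessed rule: keep descending only while the child's refinement
        # stays within `threshold` noise standard deviations of the parent
        if abs(child_pred - pred) > threshold * noise_std:
            break                                  # terminate: D_a = D_node
        node, pred = child, child_pred
    return x @ L + pred + offset
```

On a toy 3-node tree this reproduces the qualitative behavior: a small threshold stops at the root's coarse model, while a large one reaches the leaf.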

Answers (0)

Version: R2023b
