I am using MATLAB's forward feature selection algorithm, sequentialfs. The code is as follows:
X = combine_6_non;
y = target;
c = cvpartition(y, 'k', 10);
opts = statset('display', 'iter');
[fs, history] = sequentialfs(@fun, X, y, 'cv', c, 'options', opts)

The function fun is as follows:
function err = fun(XT, yT, Xt, yt)
    % Train an RBF SVM on the training fold and count misclassifications on the test fold
    model = svmtrain(XT, yT, 'Kernel_Function', 'rbf', 'boxconstraint', 1);
    err = sum(svmclassify(model, Xt) ~= yt);
end

Now, for different runs of the selection algorithm, I get different feature sets. How should I narrow down to the best feature set?
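The run-to-run variation comes from cvpartition, which draws a new random 10-fold split every time it is called, so sequentialfs evaluates the criterion on different folds in each run. One way to get a more stable answer (a rough sketch, not part of the code above; the 20 repeats and the 50% threshold are arbitrary assumptions) is to repeat the selection over several random partitions and keep the features that are chosen most often:

nRepeats = 20;                                   % arbitrary number of repeated selections
counts = zeros(1, size(X, 2));                   % how often each feature gets selected
for r = 1:nRepeats
    c = cvpartition(y, 'KFold', 10);             % new random partition on each repeat
    fs = sequentialfs(@fun, X, y, 'cv', c);      % logical mask of selected features
    counts = counts + fs;                        % accumulate selection frequency
end
stableFeatures = find(counts >= nRepeats / 2);   % features selected in at least half the runs

Features that survive across many partitions are less likely to be artifacts of one particular random split.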
Comment: Is there a seed or random_state option in MATLAB, similar to Python? If so, setting it should solve the issue.
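MATLAB's equivalent of a seed is the rng function. A minimal sketch (the seed value 1 is an arbitrary choice) that makes the partition, and therefore the selected feature set, reproducible across runs:

rng(1);                                          % fix the global random number generator state
c = cvpartition(y, 'KFold', 10);                 % same 10-fold partition on every run
opts = statset('display', 'iter');
[fs, history] = sequentialfs(@fun, X, y, 'cv', c, 'options', opts);

Note that fixing the seed only makes the result repeatable; it pins the selection to one particular partition rather than identifying the "best" feature set.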