Multi-Class SVM (one-versus-all)

Date: 2022-07-21 10:59:37

I know that LIBSVM only supports one-vs-one classification for multi-class SVM. However, I would like to tweak it a bit to perform one-against-all classification. I have tried to implement one-against-all below. Is this the correct approach?

The code: 

 

TrainLabel; TrainVec; TestVec; TestLabel;   % assumed to be loaded in the workspace
u = unique(TrainLabel);
N = length(u);
if (N > 2)
    itr = 1;
    classes = 0;
    while ((classes ~= 1) && (itr <= length(u)))
        c1 = (TrainLabel == u(itr));   % binary labels: class u(itr) vs. the rest
        newClass = c1;
        model = svmtrain(TrainLabel, TrainVec, '-c 1 -g 0.00154');
        [predict_label, accuracy, dec_values] = svmpredict(TestLabel, TestVec, model);
        itr = itr + 1;
    end
    itr = itr - 1;
end

I might have made some mistakes, and I would appreciate some feedback. Thanks.

Second part: as grapeot said, I need to do sum-pooling (or voting, as a simplified solution) to come up with the final answer. I am not sure how to do that; I looked at the Python file but I am still not clear on it, so I would appreciate some help.

Expert Answer:

A complete one-against-all example using LIBSVM on the Fisher Iris dataset:

%# Fisher Iris dataset
load fisheriris
[~,~,labels] = unique(species);   %# labels: 1/2/3
data = zscore(meas);              %# scale features
numInst = size(data,1);
numLabels = max(labels);

%# split training/testing
idx = randperm(numInst);
numTrain = 100; numTest = numInst - numTrain;
trainData = data(idx(1:numTrain),:);  testData = data(idx(numTrain+1:end),:);
trainLabel = labels(idx(1:numTrain)); testLabel = labels(idx(numTrain+1:end));

%# train one-against-all models
model = cell(numLabels,1);
for k=1:numLabels
    model{k} = svmtrain(double(trainLabel==k), trainData, '-c 1 -g 0.2 -b 1');
end

%# get probability estimates of test instances using each model
prob = zeros(numTest,numLabels);
for k=1:numLabels
    [~,~,p] = svmpredict(double(testLabel==k), testData, model{k}, '-b 1');
    prob(:,k) = p(:,model{k}.Label==1);    %# probability of class==k
end

%# predict the class with the highest probability
[~,pred] = max(prob,[],2);
acc = sum(pred == testLabel) ./ numel(testLabel)    %# accuracy
C = confusionmat(testLabel, pred)                   %# confusion matrix
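
For the second part of the question: the script above already does a form of sum-pooling, since every one-against-all model contributes a class probability and the final label is the class with the highest probability. If you want the simpler voting scheme instead, each binary model can vote for "its" class whenever it predicts the positive label, and the class with the most votes wins. Below is a minimal sketch that reuses the model cell array and test variables defined above; the tie-breaking (max simply returns the first class on a tie) is our own assumption, not part of the original answer.

%# voting alternative: each binary model votes for its own class
votes = zeros(numTest, numLabels);
for k = 1:numLabels
    %# predicted label is 1 when model k assigns the instance to class k
    binPred = svmpredict(double(testLabel==k), testData, model{k}, '-b 1');
    votes(:,k) = (binPred == 1);
end

%# class with the most votes wins (ties resolved by taking the first class)
[~,predVote] = max(votes, [], 2);
accVote = sum(predVote == testLabel) ./ numel(testLabel)   %# voting accuracy

With only a few classes a plain vote can easily end in ties or all-zero rows, which is why comparing the probabilities, as done above, is usually the more robust choice.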

 
