
Single layer neural network [closed]

Date: 2022-11-24 14:25:21

For the implementation of a single-layer neural network, I have two data files.

 

In:
    0.832 64.643
    0.818 78.843

Out:
    0 0 1
    0 0 1

The above shows the format of the two data files.

The target output is "1" for the class that the corresponding input belongs to and "0" for the remaining two outputs.

The problem is as follows: 

Your single layer neural network will find A (a 3-by-2 matrix) and b (a 3-by-1 vector) in Y = A*X + b, where Y is [C1, C2, C3]' and X is [x1, x2]'.

To solve this with a neural network, we can rewrite the equation as Y = A' * X', where A' = [A b] (a 3-by-3 matrix) and X' is [x1, x2, 1]'.

Now you can use a neural network with three input nodes (one each for x1, x2, and the constant 1) and three outputs (C1, C2, C3).

The resulting 9 weights (one for each of the 9 connections between the 3 inputs and the 3 outputs) will be equivalent to the elements of the A' matrix.
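
As a quick sanity check, here is a minimal MATLAB sketch (with arbitrary random A and b, and one sample input taken from the data above) of why the augmented form is equivalent:

% minimal sketch: the bias can be absorbed into an augmented weight matrix,
% since A*X + b equals [A b] * [X; 1]
A = randn(3,2);          % a 3-by-2 weight matrix (arbitrary values)
b = randn(3,1);          % a 3-by-1 bias vector (arbitrary values)
X = [0.832; 64.643];     % one input sample [x1; x2]

Y1 = A*X + b;            % original form
Y2 = [A b] * [X; 1];     % augmented form: A' = [A b], X' = [x1; x2; 1]

disp(max(abs(Y1 - Y2)))  % prints 0 (up to floating-point round-off)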

Basically, I am trying to do something like this, but it is not working:

 

function neuralNetwork   
    load X_Q2.data
    load T_Q2.data
    x = X_Q2(:,1);
    y = X_Q2(:,2);

    learningrate = 0.2;
    max_iteration = 50;

    % initialize parameters
    count = length(x);
    weights = rand(1,3); % creates a 1-by-3 array with random weights
    globalerror = 0;
    iter = 0;
    while globalerror ~= 0 && iter <= max_iteration
        iter = iter + 1;
        globalerror = 0;
        for p = 1:count
            output = calculateOutput(weights,x(p),y(p));
            localerror = T_Q2(p) - output
            weights(1)= weights(1) + learningrate *localerror*x(p);
            weights(2)= weights(1) + learningrate *localerror*y(p);
            weights(3)= weights(1) + learningrate *localerror;
            globalerror = globalerror + (localerror*localerror);
        end 
    end  

I wrote this function in a separate file and call it from the code above.

 

function result = calculateOutput (weights, x, y)
    s = x * weights(1) + y * weights(2) + weights(3);
    if s >= 0
        result = 1;
    else
        result = -1;
    end 

Expert Answer:

I can spot a few problems with the code. The main issue is that the target is multi-class (not binary), so you need to either use 3 output nodes, one for each class (known as 1-of-N encoding), or use a single output node with a different activation function (something capable of more than just binary output -1/1 or 0/1).
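
For reference, here is a small sketch of 1-of-N encoding; the labels vector below is hypothetical, chosen to match the five example targets used in the code further down:

% a small sketch of 1-of-N (one-hot) encoding, assuming hypothetical
% integer class labels 1..3 for five samples
labels = [3; 3; 2; 3; 3];                 % class index of each sample
numClasses = 3;
T = zeros(numel(labels), numClasses);
T(sub2ind(size(T), (1:numel(labels))', labels)) = 1;
disp(T)                                   % each row has a single 1 in its class column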

In the solution below, the perceptron has the following structure: three input nodes (x1, x2, and a constant 1 for the bias) fully connected to three output nodes, one per class.

 

%# load your data
input = [
    0.832 64.643
    0.818 78.843
    1.776 45.049
    0.597 88.302
    1.412 63.458
];
target = [
    0 0 1
    0 0 1
    0 1 0
    0 0 1
    0 0 1
];

%# parameters of the learning algorithm
LEARNING_RATE = 0.1;
MAX_ITERATIONS = 100;
MIN_ERROR = 1e-4;

[numInst numDims] = size(input);
numClasses = size(target,2);

%# three output nodes connected to two-dimensional input nodes + biases
weights = randn(numClasses, numDims+1);

isDone = false;               %# termination flag
iter = 0;                     %# iterations counter
while ~isDone
    iter = iter + 1;

    %# for each instance
    err = zeros(numInst,numClasses);
    for i=1:numInst
        %# compute output: Y = W*X + b, then apply threshold activation
        output = ( weights * [input(i,:)'; 1] >= 0 );

        %# error: err = T - Y
        err(i,:) = target(i,:)' - output;

        %# update weights (delta rule): delta(W) = alpha*(T-Y)*X
        weights = weights + LEARNING_RATE * err(i,:)' * [input(i,:) 1];
    end

    %# Root mean squared error
    rmse = sqrt(sum(err.^2,1)/numInst);
    fprintf(['Iteration %d: ' repmat('%f ',1,numClasses) '\n'], iter, rmse);

    %# termination criteria
    if ( iter >= MAX_ITERATIONS || all(rmse < MIN_ERROR) )
        isDone = true;
    end
end

%# plot points and one-against-all decision boundaries
[~,group] = max(target,[],2);                     %# actual class of instances
gscatter(input(:,1), input(:,2), group), hold on
xLimits = get(gca,'xlim'); yLimits = get(gca,'ylim');
for i=1:numClasses
    ezplot(sprintf('%f*x + %f*y + %f', weights(i,:)), xLimits, yLimits)
end
title('Perceptron decision boundaries')
hold off 
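
As a side note, ezplot is discouraged in newer MATLAB releases; if it is unavailable, the boundary loop above could be sketched with fimplicit instead (reusing the same variables, with hold still on):

% a sketch of an alternative for releases where ezplot is discouraged:
% draw each one-against-all boundary w1*x + w2*y + w3 = 0 with fimplicit
% (a drop-in for the ezplot loop, i.e. while hold is still on)
for i = 1:numClasses
    w = weights(i,:);
    fimplicit(@(x,y) w(1)*x + w(2)*y + w(3), [xLimits yLimits])
end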

The results of training over the five samples you provided:

 

Iteration 1: 0.447214 0.632456 0.632456 
Iteration 2: 0.000000 0.447214 0.447214 
...
Iteration 49: 0.000000 0.447214 0.447214 
Iteration 50: 0.000000 0.632456 0.000000 
Iteration 51: 0.000000 0.447214 0.000000 
Iteration 52: 0.000000 0.000000 0.000000  

Note that the data used in the example above only contains 5 samples. You would get more meaningful results if you had more training instances in each class.
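
Once training has finished, one common way to classify a new point with the learned weights is to take the class with the largest of the three linear outputs; a minimal sketch (xNew is a hypothetical sample, not part of the data above):

% minimal sketch: classify a hypothetical new sample with the learned weights
% by taking the class with the largest of the three linear scores
xNew = [1.0; 70.0];                % hypothetical sample [x1; x2]
scores = weights * [xNew; 1];      % one linear score per class
[~, predictedClass] = max(scores);
fprintf('Predicted class: %d\n', predictedClass);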

