# Matlab Project Homework Help

The sample project solution below applies Matlab to neural networks, one of the most important application areas of Matlab. The solution, prepared by our Matlab experts, reflects the quality of our Matlab project homework help. Students can also treat it as an example of our Matlab assignment help, since Matlab assignments are complex in nature and expert guidance helps achieve excellent grades. The solution builds a neural-network classifier that predicts red wine quality from a set of chemical properties.

#### Data Mining

A function named transform.m is developed in Matlab that reshapes and encodes the data into a valid input for further processing with the Neural Network Toolbox.

```
clear variables
close all
clc
% Assumes data_train and data_test have been loaded, with the
% quality label stored in the last column
gt_train = data_train(:,end);
gt_test  = data_test(:,end);
data_train(:,end) = [];
data_test(:,end)  = [];
class_train = transform(gt_train);
class_test  = transform(gt_test);
% The Neural Network Toolbox expects one observation per column
data_train = data_train';
data_test  = data_test';
```
Here, the input to transform.m is the vector of ground-truth values. The output has dimension 11xN, where N is the length of the input vector, i.e. one row per possible quality class.
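The Matlab source of transform.m is not shown, but its described behavior is a one-hot encoding. A minimal Python sketch of that idea (assuming 11 integer quality classes, 0 through 10):

```python
# Illustrative one-hot encoder mirroring what transform.m is described as
# doing; the class count of 11 is taken from the stated 11xN output shape.
def transform(gt):
    """Encode a list of integer quality scores into an 11 x N one-hot matrix."""
    n_classes = 11
    out = [[0] * len(gt) for _ in range(n_classes)]
    for col, score in enumerate(gt):
        out[score][col] = 1  # row = class index, column = observation
    return out

encoded = transform([5, 6, 5])  # three observations with qualities 5, 6, 5
```

Each column then contains exactly one 1, marking the quality class of that observation.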

Next, the Neural Network Toolbox is used to create a network with one hidden layer and to train it. The training cross-entropy is shown in the figure below:

The training accuracy achieved is 60.91%. After that, the testing phase is initiated. The confusion matrix is shown below:

The testing accuracy is found to be 0.6008.

```
%% This is an automatically generated code from the NN Toolbox that reproduces the result of using the GUI
% Solve a Pattern Recognition Problem with a Neural Network
% Script generated by NPRTOOL
% Created Wed Oct 29 19:20:25 CET 2014
% This script assumes these variables are defined:
%   set1 - input data (here data_train).
%   class1 - target data (here class_train).
x = data_train;
t = class_train;
% Create a Pattern Recognition Network
hiddenLayerSize = 10;
net = patternnet(hiddenLayerSize);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 80/100;
net.divideParam.valRatio = 10/100;
net.divideParam.testRatio = 10/100;
```

A parser is created that converts the data archive into a correctly formatted CSV file. Formatting is very important at this step so that Weka can recognize the data correctly. The dataset was then searched for association rules indicating which items tend to co-occur:
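The original parser is not shown; the sketch below illustrates one common way to prepare market-basket data for Weka's Apriori, where present items are written as `t` and absent items as the missing marker `?` so that absences do not generate rules. The item names, basket contents, and file name are made up for illustration.

```python
# Hypothetical sketch of the parser step: writing transactions as a CSV
# that Weka can load. Items and baskets below are invented examples.
import csv

items = ["Beer", "Chips", "Mixed nuts", "Red wine", "Chocolate", "Chocolate drink"]
baskets = [{"Beer", "Chips"}, {"Chocolate", "Chocolate drink"}]

with open("transtest.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(items)  # header row: one boolean attribute per item
    for basket in baskets:
        # 't' if the item is in the basket, '?' (missing) otherwise
        w.writerow(["t" if it in basket else "?" for it in items])
```

Loading this CSV in Weka yields attributes such as `Beer=t`, matching the rule notation in the output below.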

```
=== Run information ===

Scheme:       weka.associations.Apriori -N 10 -T 0 -C 0.4 -D 0.05 -U 1.0 -M 0.1 -S -1.0 -c -1
Relation:     transtest
Instances:    9999
Attributes:   62

=== Associator model (full training set) ===

Apriori
=======

Minimum support: 0.1 (1000 instances)
Minimum metric <confidence>: 0.4
Number of cycles performed: 18

Generated sets of large itemsets:

Size of set of large itemsets L(1): 11
Size of set of large itemsets L(2): 7

Best rules found:

 1. Chocolate drink=t 1239 ==> Chocolate=t 1239    conf:(1)
 2. Chocolate=t 1449 ==> Chocolate drink=t 1239    conf:(0.86)
 3. Beer=t 2184 ==> Mixed nuts=t 1451    conf:(0.66)
 4. Beer=t 2184 ==> Chips=t 1450    conf:(0.66)
 5. Beer=t 2184 ==> Red wine=t 1444    conf:(0.66)
 6. Mixed nuts=t 2317 ==> Beer=t 1451    conf:(0.63)
 7. Chips=t 2318 ==> Beer=t 1450    conf:(0.63)
 8. Red wine=t 2317 ==> Beer=t 1444    conf:(0.62)
 9. Mixed nuts=t 2317 ==> Red wine=t 1441    conf:(0.62)
10. Red wine=t 2317 ==> Mixed nuts=t 1441    conf:(0.62)
```

The rules are listed in descending order of confidence, which is the proportion of instances containing the left-hand side that also contain the right-hand side. The strongest association has confidence 1, showing that all people who buy a chocolate drink also buy chocolate. It is interesting to note that the converse rule holds with a somewhat smaller proportion: only 86% of people who buy chocolate also buy the chocolate drink. This enables deeper analysis of the correlated items.
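The confidence figures can be checked directly from the counts in the rules above, since confidence(X ==> Y) = support(X and Y) / support(X):

```python
# Verify the reported confidence values from the Weka rule counts.
def confidence(support_both, support_antecedent):
    """Confidence of rule X ==> Y given support counts."""
    return support_both / support_antecedent

rule1 = confidence(1239, 1239)  # Chocolate drink ==> Chocolate
rule2 = confidence(1239, 1449)  # Chocolate ==> Chocolate drink
rule3 = confidence(1451, 2184)  # Beer ==> Mixed nuts
```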

```
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = gsubtract(t,y);
tind = vec2ind(t);
yind = vec2ind(y);
percentErrors = sum(tind ~= yind)/numel(tind);
performance = perform(net,t,y)
```

Naïve Bayes is a technique that makes the “naïve” assumption that all the input variables are independent of each other. The method was implemented in Matlab twice, using the built-in classifier and coded manually. The priors are calculated as maximum likelihood estimates from their relative frequencies in the training sample. Both implementations give identical results – all 4 instances are classified as outdoor sports.

Priors on classes are estimated to be 0.6/0.4 in favor of outdoor sports. For each class we have the following conditional probabilities:

| Value   | 1   | 2   | 3   |
| ------- | --- | --- | --- |
| Weather | 0.5 | 1/6 | 1/3 |
| Wind    | 1/3 | 0.5 | 1/6 |

Categorical variables are coded numerically (Weather: S=1, R=2, C=3; Wind: W=1, M=2, S=3). For the continuous temperature we assume a Gaussian distribution and find the parameters using the MLE estimate:

| Class   | Mean | SD    |
| ------- | ---- | ----- |
| Outdoor | 21.5 | 8.66  |
| Indoor  | 15.5 | 10.21 |

For each new observation we show the total log-probabilities used by the classifier:

| Observation | log P(Outdoor) | log P(Indoor) | Result  |
| ----------- | -------------- | ------------- | ------- |
| 1           | -2.9736        | -2.9851       | Outdoor |
| 2           | -2.9610        | -2.9773       | Outdoor |
| 3           | -2.6890        | -3.6551       | Outdoor |
| 4           | -1.8595        | -3.6499       | Outdoor |
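The scoring step can be sketched as follows. This Python sketch uses the stated priors (0.6/0.4) and the Gaussian temperature parameters from the table above; the categorical probabilities assigned to each class and the observation itself are hypothetical, since the original four observations are not listed in the text.

```python
# Naive Bayes log-posterior sketch: log prior plus log-likelihoods of the
# independent features (two categorical, one Gaussian). Observation and the
# per-class categorical probabilities below are illustrative assumptions.
import math

def gaussian_logpdf(x, mu, sd):
    # Log of the normal density used for the continuous temperature feature
    return -math.log(sd * math.sqrt(2 * math.pi)) - (x - mu) ** 2 / (2 * sd ** 2)

def log_posterior(prior, p_weather, p_wind, temp, mu, sd):
    # Independence assumption: probabilities multiply, so logs add
    return (math.log(prior) + math.log(p_weather) + math.log(p_wind)
            + gaussian_logpdf(temp, mu, sd))

# Hypothetical observation: weather value with p=0.5 for Outdoor, wind value
# with p=1/3 for Outdoor, temperature 20 degrees
score_outdoor = log_posterior(0.6, 0.5, 1/3, 20.0, 21.5, 8.66)
score_indoor  = log_posterior(0.4, 1/6, 0.5, 20.0, 15.5, 10.21)
result = "Outdoor" if score_outdoor > score_indoor else "Indoor"
```

The class with the larger (less negative) log-posterior wins, exactly as in the table above.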

The minimum spanning tree (MST) algorithm underlies the single-link agglomerative clustering method. The algorithm repeatedly finds the smallest metric distance between a point not yet in the MST and one already in it, and adds a link there. The process is repeated until all the observations are linked in the MST. We show a five-point example:

P1(1,2), P2(0,0), P3(-1,-1), P4(4,0), P5(0,5)

```
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotconfusion(t,y)
%figure, plotroc(t,y)
%figure, ploterrhist(e)
```

The starting point is chosen at random, so we can choose the first one. The Euclidean distance (chosen as metric here) between that point and the others is:

|    | P2   | P3  | P4  | P5   |
| -- | ---- | --- | --- | ---- |
| P1 | 2.24 | 3.6 | 3.6 | 3.16 |

The minimum distance is P1-P2, so a link is introduced there. Next we introduce P2 to the tree and update the distance tables:

|    | P3   | P4  | P5   |
| -- | ---- | --- | ---- |
| P1 | 3.6  | 3.6 | 3.16 |
| P2 | 1.41 | 4   | 5    |

We introduce a link between P2 and P3 and update the table:

|    | P4  | P5   |
| -- | --- | ---- |
| P1 | 3.6 | 3.16 |
| P2 | 4   | 5    |
| P3 | 5.1 | 6.08 |

We introduce a link between P1 and P5 and look at the last table:

|    | P4  |
| -- | --- |
| P1 | 3.6 |
| P2 | 4   |
| P3 | 5.1 |
| P5 | 6.4 |

The final link is between points P1 and P4 and the algorithm is finished.
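The worked example above can be reproduced with a short Prim-style sketch in Python: at each step, link the closest point outside the tree to the closest point inside it.

```python
# Prim's MST on the five example points, reproducing the link sequence above.
import math

points = {"P1": (1, 2), "P2": (0, 0), "P3": (-1, -1), "P4": (4, 0), "P5": (0, 5)}

def dist(a, b):
    # Euclidean distance, the metric chosen in the example
    return math.hypot(points[a][0] - points[b][0], points[a][1] - points[b][1])

tree, edges = ["P1"], []          # start from P1, as in the text
while len(tree) < len(points):
    # Smallest distance from any tree point to any point not yet linked
    t, p = min(((t, p) for t in tree for p in points if p not in tree),
               key=lambda e: dist(*e))
    edges.append((t, p))
    tree.append(p)
```

Running this yields the links P1-P2, P2-P3, P1-P5, P1-P4, matching the tables step by step (note that the P1-P2 distance is sqrt(5), about 2.24).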

```
%% Test the NN with set 2
pred = net(data_test);
[~,loc]  = max(pred);        % predicted class index for each test sample
[~,loc2] = max(class_test);  % ground-truth class index
acc = sum(loc == loc2)/length(loc);  % any index offset cancels, so compare directly
disp('Testing accuracy:')
disp(acc)
```

An alternative way to generate the MST is to start from a random spanning tree and then change the connectivity of points so that each iteration decreases the total edge length. This method is valid because the total edge length has no local minima with respect to such exchanges: every swap that decreases the total length of the edges leads closer to the global optimum in the search space of the problem.
