# Tutorial 9
Q1. Consider the following dataset: {(x = (1, 0), y = +1), (x = (−1, 0), y = +1), (x = (0, 1), y = −1), (x = (0, −1), y = −1)}, where x = (x1, x2) is the input feature vector and y is the output class. Eight weak classifiers are designed and given as follows:
h1(x) = +1 if x1 > −0.5, −1 otherwise
h2(x) = +1 if x2 > −0.5, −1 otherwise
h3(x) = +1 if x1 > 0.5, −1 otherwise
h4(x) = +1 if x2 > 0.5, −1 otherwise
h5(x) = −1 if x1 > −0.5, +1 otherwise
h6(x) = −1 if x2 > −0.5, +1 otherwise
h7(x) = −1 if x2 > 0.5, +1 otherwise
h8(x) = −1 if x1 > 0.5, +1 otherwise
a. Plot the dataset in 2-dimensional space. Is the dataset linearly separable?
b. Determine the classification error for all weak classifiers.
c. Using the AdaBoost algorithm, find the minimal ensemble classifier that reaches zero training error.
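As a numerical check (not part of the tutorial), the sketch below computes each weak classifier's training error (part b) and runs plain AdaBoost (part c) on the four-point dataset. The h1–h8 definitions follow the reconstruction above, and ties are broken toward the lowest-numbered classifier, so the exact classifiers selected are assumptions of this sketch rather than the tutorial's official answer.

```python
import math

# Training set from Q1: ((x1, x2), y)
data = [((1, 0), +1), ((-1, 0), +1), ((0, 1), -1), ((0, -1), -1)]

# Weak classifiers: axis-parallel threshold tests at +/-0.5.
classifiers = [
    ("h1", lambda x: +1 if x[0] > -0.5 else -1),
    ("h2", lambda x: +1 if x[1] > -0.5 else -1),
    ("h3", lambda x: +1 if x[0] > 0.5 else -1),
    ("h4", lambda x: +1 if x[1] > 0.5 else -1),
    ("h5", lambda x: -1 if x[0] > -0.5 else +1),
    ("h6", lambda x: -1 if x[1] > -0.5 else +1),
    ("h7", lambda x: -1 if x[1] > 0.5 else +1),
    ("h8", lambda x: -1 if x[0] > 0.5 else +1),
]

def training_error(h):
    """Unweighted fraction of the training points misclassified by h."""
    return sum(h(x) != y for x, y in data) / len(data)

errors = {name: training_error(h) for name, h in classifiers}

def adaboost(rounds=10):
    """Standard AdaBoost; stops early once the ensemble fits the data."""
    w = [1 / len(data)] * len(data)   # uniform initial sample weights
    ensemble = []                     # list of (alpha, name, h)
    for _ in range(rounds):
        # Pick the classifier with the smallest weighted error
        # (ties resolved by list order, i.e. lowest classifier number).
        name, h = min(classifiers,
                      key=lambda nh: sum(wi for wi, (x, y) in zip(w, data)
                                         if nh[1](x) != y))
        eps = sum(wi for wi, (x, y) in zip(w, data) if h(x) != y)
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, name, h))
        # Re-weight: misclassified points up, correct points down.
        w = [wi * math.exp(-alpha * y * h(x)) for wi, (x, y) in zip(w, data)]
        total = sum(w)
        w = [wi / total for wi in w]
        # Stop once the weighted vote classifies everything correctly.
        F = lambda x: sum(a * hk(x) for a, _, hk in ensemble)
        if all((1 if F(x) > 0 else -1) == y for x, y in data):
            break
    return ensemble

ensemble = adaboost()
```

Under these assumptions, four classifiers achieve training error 1/4 and four achieve 3/4, and the boosting loop reaches zero training error after three rounds.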
Q2. Using a Bagging algorithm, the 8 classifiers shown in Q1 are obtained.
a. Determine the training error (using the training dataset in Q1) given by the Bagging classifier.
b. Instead of using all 8 classifiers, which classifiers should be used to form the Bagging classifier? Determine the training error using the training dataset in Q1.
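The sketch below (same dataset and h1–h8 reconstruction assumed as in Q1) treats the Bagging classifier as an unweighted majority vote over its members. Note that with all 8 classifiers every hk is paired with its negation, so the vote is a tie at every point and the resulting error depends entirely on the tie-breaking convention; mapping ties to −1 is an assumption of this sketch.

```python
# Training set from Q1: ((x1, x2), y)
data = [((1, 0), +1), ((-1, 0), +1), ((0, 1), -1), ((0, -1), -1)]

classifiers = {
    "h1": lambda x: +1 if x[0] > -0.5 else -1,
    "h2": lambda x: +1 if x[1] > -0.5 else -1,
    "h3": lambda x: +1 if x[0] > 0.5 else -1,
    "h4": lambda x: +1 if x[1] > 0.5 else -1,
    "h5": lambda x: -1 if x[0] > -0.5 else +1,
    "h6": lambda x: -1 if x[1] > -0.5 else +1,
    "h7": lambda x: -1 if x[1] > 0.5 else +1,
    "h8": lambda x: -1 if x[0] > 0.5 else +1,
}

def vote(members, x):
    """Unweighted majority vote; a zero sum (tie) is mapped to -1 here."""
    return +1 if sum(h(x) for h in members) > 0 else -1

def bagging_error(members):
    """Training error of the majority-vote ensemble on the Q1 dataset."""
    return sum(vote(members, x) != y for x, y in data) / len(data)

all_eight = list(classifiers.values())
# The four classifiers with the lowest individual training error (1/4).
low_error = [classifiers[k] for k in ("h2", "h3", "h5", "h7")]
```

Under these assumptions, voting with only the four low-error classifiers separates the training set perfectly, while voting with all eight ties everywhere.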
Q3. Figure 1 shows two binary decision trees.
Figure 1: Two binary decision trees (internal nodes split on x1 and x2 at thresholds of ±0.5; the tree diagrams themselves are not reproduced here).

a. Using the training data given in Q1, calculate P(class = −1 | x) for each leaf node.
b. Hence, determine the class for a new sample x = (0,0) predicted by each decision tree.
c. If the two binary decision trees shown in Figure 1 were learnt using a random forest algorithm, what would be the class for a new sample x = (0, 0) predicted by this random forest?
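Since the tree diagrams in Figure 1 are not reproduced here, the sketch below only illustrates the two generic computations the question asks for: estimating a leaf's class probability from the training labels reaching it (part a) and combining trees by majority vote in a random forest (part c). The label counts and the tie/empty-leaf conventions are illustrative assumptions, not taken from the figure.

```python
def p_minus1(leaf_labels):
    """Empirical P(class = -1 | leaf) from the training labels reaching
    the leaf; an empty leaf gets an uninformative 0.5 (an assumption)."""
    if not leaf_labels:
        return 0.5
    return leaf_labels.count(-1) / len(leaf_labels)

def forest_class(tree_predictions):
    """Majority vote over the per-tree class predictions (+1 / -1);
    ties are mapped to -1 here, which is just a convention."""
    return +1 if sum(tree_predictions) > 0 else -1
```

For example, a leaf reached by two −1 points and one +1 point gives P(class = −1 | leaf) = 2/3, and a forest whose trees predict (+1, −1, −1) outputs −1.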
