DDA3020 Homework 1
Due date: Oct 14, 2024
Instructions
• The deadline is 23:59, Oct 14, 2024.
• The weight of this assignment in the final grade is 20%.
• Electronic submission: Turn in solutions electronically via Blackboard. Be sure to submit
your homework as one pdf file plus two python scripts. Please name your solution files as
"DDA3020HW1 studentID name.pdf", "HW1 yourID Q1.ipynb" and "HW1 yourID Q2.ipynb".
(.py files also acceptable)
• Note that late submissions will result in discounted scores: 0-24 hours → 80%, 24-120 hours
→ 50%, 120 or more hours → 0%.
• Answer the questions in English. Otherwise, you’ll lose half of the points.
• Collaboration policy: You need to solve all questions independently and collaboration between
students is NOT allowed.
1 Written Problems (50 points)
1.1. (Learning of Linear Regression, 25 points) Suppose we have training data:

{(x_1, y_1), (x_2, y_2), . . . , (x_N, y_N)},

where x_i ∈ R^d and y_i ∈ R^k, i = 1, 2, . . . , N.
i) (9 pts) Find the closed-form solution of the following problem:

min_{W,b} Σ_{i=1}^{N} ∥y_i − W x_i − b∥₂²

ii) (8 pts) Show how to use gradient descent to solve the problem. (Please state at least one
possible stopping criterion.)
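As an illustration of the method in part ii) (a sketch only, not the required written derivation), the following runs gradient descent on the scalar-output (k = 1) case with a gradient-norm stopping criterion, using a 1/N scaling of the stated sum. All data here is synthetic:

```python
import numpy as np

# Gradient descent for mean squared loss (1/N scaling of the objective in part i),
# scalar-output case, with a gradient-norm stopping criterion.
rng = np.random.default_rng(0)
N, d = 500, 3
X = rng.normal(size=(N, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 3.0   # noiseless synthetic targets

w, b = np.zeros(d), 0.0
lr = 0.01
for step in range(10_000):
    r = X @ w + b - y                 # residuals f(x_i) - y_i
    grad_w = 2 * X.T @ r / N          # gradient w.r.t. the weights
    grad_b = 2 * r.sum() / N          # gradient w.r.t. the bias
    w -= lr * grad_w
    b -= lr * grad_b
    # stopping criterion: gradient norm below a threshold
    if np.sqrt(np.sum(grad_w ** 2) + grad_b ** 2) < 1e-8:
        break

print(w, b)
```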
DDA3020 Machine Learning Autumn 2024, CUHKSZ
iii) (8 pts) We further suppose that x_1, x_2, . . . , x_N are drawn from N(μ, σ²). Show that the
maximum likelihood estimation (MLE) of σ² is

σ̂²_MLE = (1/N) Σ_{n=1}^{N} (x_n − μ_MLE)².
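Part iii) asks for a proof; as a quick numerical sanity check of the claimed estimator (a sketch, not part of the required derivation), one can draw a large Gaussian sample and compare the formula against the true variance:

```python
import numpy as np

# Numerical sanity check of the MLE formula: for x_n ~ N(mu, sigma^2), the MLE
# of sigma^2 is the mean squared deviation from the sample mean mu_MLE.
rng = np.random.default_rng(0)
mu, sigma2, N = 2.0, 4.0, 200_000
x = rng.normal(mu, np.sqrt(sigma2), size=N)

mu_mle = x.mean()                        # mu_MLE = (1/N) * sum(x_n)
sigma2_mle = np.mean((x - mu_mle) ** 2)  # (1/N) * sum((x_n - mu_MLE)^2)

print(sigma2_mle)  # close to 4.0 for large N
```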
1.2. (Support Vector Machine, 25 points) Given two positive samples x_1 = (3, 3)^T,
x_2 = (4, 3)^T, and one negative sample x_3 = (1, 1)^T, find the maximum-margin separating
hyperplane and support vectors.
Solution steps:
i) Formulating the Optimization Problem (5 pts)
ii) Constructing the Lagrangian (5 pts)
iii) Using KKT Conditions (5 pts)
iv) Solving the Equations (5 pts)
v) Determining the Hyperplane Equation and Support Vectors (5 pts)
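After working the five steps by hand, the result can be cross-checked numerically; a minimal sketch using sklearn's SVC with a very large C to approximate the hard margin (the same simulation trick Q2.2.2 below uses):

```python
import numpy as np
from sklearn.svm import SVC

# Numerical cross-check of the hand-derived hyperplane. sklearn has no exact
# hard-margin SVM, so a very large C approximates it.
X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
y = np.array([1, 1, -1])

clf = SVC(kernel="linear", C=1e8)
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
print("w =", w, "b =", b)                       # should match the derivation
print("support vector indices:", clf.support_)  # margin-attaining points
```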
2 Programming (50 points)
2.1. (Linear regression, 25 points) We have a labeled dataset D = {(x_1, y_1), (x_2, y_2),
. . . , (x_n, y_n)}, with x_i ∈ R^d being the d-dimensional feature vector of the i-th sample, and
y_i ∈ R being the real-valued target (label).
A linear regression model is given by

f_{w_0,...,w_d}(x) = w_0 + w_1 x_1 + w_2 x_2 + · · · + w_d x_d,    (1)

where w_0 is often called the bias and w_1, w_2, . . . , w_d are often called the coefficients.
Now, we want to utilize the dataset D to build a linear model based on linear regression.
We provide a training set Dtrain that includes 2024 labeled samples with 11 features (see linear
regression train.txt) to fit the model, and a test set Dtest that includes 10 unlabeled samples with
11 features (see linear regression test.txt) to evaluate the model.
1. Use the LinearRegression class from the sklearn package to get the bias w_0 and the coefficients
w_1, w_2, . . . , w_11, then compute ŷ = f(x) on the test set Dtest with the trained model. (Put
the estimates of w_0, w_1, . . . , w_11 and these ŷ in your answers.)
2. Implement linear regression by yourself to obtain the bias w_0 and the coefficients
w_1, w_2, . . . , w_11, then compute ŷ = f(x) on the test set Dtest. (Put the estimates of
w_0, w_1, . . . , w_11 and these ŷ in your answers. It is allowed to compute the inverse of a matrix
using an existing python package.)
(Hint: Note that linear regression train.txt has 2024 rows with 12 columns, where the
first 11 columns are the features x and the last column is the target y, while linear regression test.txt
only contains 10 rows with 11 columns (features). Both tasks require the submission of
code and results. Put all the code in a "HW1 yourID Q1.ipynb" Jupyter notebook (a ".py"
file is also acceptable).)
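Both tasks reduce to solving the least-squares problem once; the sketch below shows the sklearn route and the normal-equation route side by side, using synthetic data in place of the provided files (replace X_train/y_train/X_test with np.loadtxt on the real files):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for the provided data: 2024 samples, 11 features.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(2024, 11))
y_train = 3.0 + X_train @ rng.normal(size=11) + 0.01 * rng.normal(size=2024)
X_test = rng.normal(size=(10, 11))

# 1. sklearn: intercept_ is w0, coef_ is w1..w11.
model = LinearRegression().fit(X_train, y_train)
y_hat_sklearn = model.predict(X_test)

# 2. From scratch via the normal equation w = (A^T A)^{-1} A^T y, A = [1 | X].
A = np.hstack([np.ones((2024, 1)), X_train])
w = np.linalg.solve(A.T @ A, A.T @ y_train)
y_hat_manual = np.hstack([np.ones((10, 1)), X_test]) @ w

print(np.allclose(y_hat_sklearn, y_hat_manual))  # the two routes should agree
```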
2.2. (SVM, 25 points)
Task Description You are asked to write a program that constructs support vector machine
models with different kernel functions and slack variables.
Datasets You are provided with the iris dataset. The data set contains 3 classes of 50 instances
each, where each class refers to a type of iris plant. There are four features: 1. sepal length in cm;
2. sepal width in cm; 3. petal length in cm; 4. petal width in cm. You need to use these features
to classify each iris plant as one of the three possible types.
What you should do You should use the SVM functions from the python sklearn package, which
provides various forms of SVM. For multiclass SVM you should use the one-vs-rest
strategy. You are recommended to use the sklearn.svm.SVC() function. You can use numpy for vector
manipulation. In your technical report, you should report the required results as mentioned below
(e.g. training error, testing error, and so on).
1. (2 points) Split training set and test set. Split the data into a training set and a test set.
The training set should contain 70% of the samples, while the test set should include 30%.
The number of samples from each category in both the training and test sets should reflect
this 70-30 split; for each category, the first 70% of the samples will form the training set, and
the remaining 30% will form the test set. Ensure that the split maintains the original order
of the data. You should report the instance ids in the split training set and test set. The output
format is as follows:
Q2.2.1 Split training set and test set:
Training set: xx
Test set: xx
You should fill in xx in the template. You should write the ids for each set on the same line,
comma separated, e.g. Training set: [1, 4, 19].
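The per-class, order-preserving split described above can be sketched as follows (this assumes the provided iris data is grouped by class, as sklearn's bundled copy is; adapt the indexing if your copy interleaves classes):

```python
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

train_idx, test_idx = [], []
for c in np.unique(y):
    idx = np.where(y == c)[0]   # indices of this class, in original order
    cut = int(len(idx) * 0.7)   # first 70% -> train, remaining 30% -> test
    train_idx.extend(idx[:cut])
    test_idx.extend(idx[cut:])

print("Training set:", train_idx)
print("Test set:", test_idx)
```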
2. (10 points) Calculation using Standard SVM Model (Linear Kernel). Employ the
standard SVM model with a linear kernel. Train your SVM on the split training dataset and
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, and output the weight vector w, the bias b, and the indices of support vectors
(starting with 0). Note that the scikit-learn package does not offer a function with hard margin,
so we will simulate this using C = 1e5. You should first print out the total training error
and testing error, where the error is (number of wrong predictions) / (number of data points).
Then, print out the results for each class separately (note that you should calculate errors for
each class separately in this part). You should also mention in your report which classes are
linearly separable with SVM without slack. The output format is as follows:
Q2.2.2 Calculation using Standard SVM Model:
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
Linear separable classes: xx
If we view the one-vs-rest strategy as combining multiple different SVMs, each one being
a separating hyperplane for one class versus the rest of the points, then the w, b and support
vector indices for that class are the corresponding parameters of the SVM separating this class
from the rest of the points. If a variable is of vector form, say a = (1, 2, 3)^T, then you should
write each entry on the same line, comma separated, e.g. [1,2,3].
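Because SVC's decision_function_shape='ovr' option still trains one-vs-one classifiers internally, one way to obtain the per-class w, b and support vectors described above is to fit one explicit binary SVC per class. A sketch (fitting on the full iris data for brevity; in the homework, fit on your training split and also evaluate on the test split):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris

# One binary SVM per class, one-vs-rest; C = 1e5 simulates the hard margin.
X, y = load_iris(return_X_y=True)
names = ["setosa", "versicolor", "virginica"]

errors = {}
for c, name in enumerate(names):
    y_bin = np.where(y == c, 1, -1)               # this class vs the rest
    clf = SVC(kernel="linear", C=1e5).fit(X, y_bin)
    errors[name] = float(np.mean(clf.predict(X) != y_bin))
    print(f"class {name}: training error: {errors[name]}, "
          f"w: {clf.coef_[0]}, b: {clf.intercept_[0]}, "
          f"support vector indices: {clf.support_.tolist()}")
```

A class is linearly separable from the rest exactly when this binary training error is 0.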
3. (6 points) Calculation using SVM with Slack Variables (Linear Kernel). For each
C = 0.25 × t, where t = 1, 2, . . . , 4, train your SVM on the training dataset, and subsequently
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, the weight vector w, the bias b, the indices of support vectors, and the
slack variable ζ of the support vectors (you may compute it as max(0, 1 − y · f(x))). The output
format is as follows:
Q2.2.3 Calculation using SVM with Slack Variables (C = 0.25 × t, where t = 1, . . . , 4):
-------------------------------------------
C=0.25,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
-------------------------------------------
C=0.5,
<... results for (C=0.5) ...>
-------------------------------------------
C=0.75,
<... results for (C=0.75) ...>
-------------------------------------------
C=1,
<... results for (C=1) ...>
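The slack computation ζ = max(0, 1 − y · f(x)) can be sketched as below, shown for one class and one C value; in the homework, loop over C ∈ {0.25, 0.5, 0.75, 1.0} and the three classes, on your own train/test split:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris

# Slack variables of a soft-margin linear SVM, one class vs the rest.
X, y = load_iris(return_X_y=True)
y_bin = np.where(y == 1, 1, -1)        # versicolor vs rest, as an example

clf = SVC(kernel="linear", C=0.25).fit(X, y_bin)
f = clf.decision_function(X)           # f(x) = w.x + b
zeta = np.maximum(0.0, 1.0 - y_bin * f)

print("support vector indices:", clf.support_.tolist())
print("slack of support vectors:", zeta[clf.support_])
```

Points that are not support vectors lie strictly outside the margin, so their ζ is zero; only support vectors can have ζ > 0.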
4. (7 points) Calculation using SVM with Kernel Functions. Conduct experiments with
different kernel functions for SVM without slack variables. Calculate the classification error
for both the training and testing datasets, and the indices of support vectors, for each kernel
type:
(a) 2nd-order Polynomial Kernel
(b) 3rd-order Polynomial Kernel
(c) Radial Basis Function Kernel with σ = 1
(d) Sigmoidal Kernel with σ = 1
The output format is as follows:
Q2.2.4 Calculation using SVM with Kernel Functions:
-------------------------------------------
(a) 2nd-order Polynomial Kernel,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
-------------------------------------------
(b) 3rd-order Polynomial Kernel,
<... results for (b) ...>
-------------------------------------------
(c) Radial Basis Function Kernel with σ = 1,
<... results for (c) ...>
-------------------------------------------
(d) Sigmoidal Kernel with σ = 1,
<... results for (d) ...>
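One detail worth checking: sklearn parameterizes the RBF kernel as exp(−gamma·∥x − x′∥²), so σ = 1 in the usual exp(−∥x − x′∥²/(2σ²)) form maps to gamma = 1/(2σ²) = 0.5. Reusing that gamma for the sigmoid kernel is an assumption; confirm it against the course's kernel definitions. A sketch of the four configurations (multiclass fit for brevity; use per-class binary fits and your train/test split as in the earlier parts):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
gamma = 1 / (2 * 1.0 ** 2)   # sigma = 1  ->  gamma = 0.5 (assumed mapping)

# C = 1e5 again simulates "without slack variables".
kernels = {
    "(a) 2nd-order Polynomial": SVC(kernel="poly", degree=2, C=1e5),
    "(b) 3rd-order Polynomial": SVC(kernel="poly", degree=3, C=1e5),
    "(c) RBF, sigma=1": SVC(kernel="rbf", gamma=gamma, C=1e5),
    "(d) Sigmoid, sigma=1": SVC(kernel="sigmoid", gamma=gamma, C=1e5),
}
errors = {}
for name, clf in kernels.items():
    clf.fit(X, y)
    errors[name] = float(np.mean(clf.predict(X) != y))
    print(name, "training error:", errors[name],
          "support vector indices:", clf.support_.tolist())
```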
Submission Submit your executable code in a "HW1 yourID Q2.ipynb" Jupyter notebook (a ".py"
file is also acceptable). Indicate the corresponding question number in the comment for each cell,
and ensure that your code can logically produce the required results for each question in the required
format. Please note that you need to write clear comments and use appropriate function/variable
names. Excessively unreadable code may result in point deductions.
