Lab: Supervised Machine Learning - Regression and Classification
Optional labs - W1
Lab01 - Lab02
Just follow its instructions.
Lab03
There is a Markdown syntax error in the Notation paragraph. To fix it, change the table's separator row from

```markdown
|: ------------|: ------------------------------------------------------------||
```

to

```markdown
|:---:|:---:|:---:|
```
In the Tools paragraph, if you have put `deeplearning.mplstyle` into the same folder as your lab files but `plt.style.use('./deeplearning.mplstyle')` still fails, delete the `./` prefix in `./deeplearning.mplstyle`. If your OS is Linux, this problem will not occur.
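A minimal sketch of the working call after the change, assuming the style file sits next to the notebook:

```python
import matplotlib.pyplot as plt

# style file is in the notebook's working directory; no './' prefix
plt.style.use('deeplearning.mplstyle')
```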
Lab04
In the Tools paragraph, `%matplotlib widget` cannot run because `ipympl` is missing, even if you installed Jupyter through Anaconda; the installed `ipympl` version may simply be incompatible with your Jupyter. To solve this, run the Anaconda Prompt as administrator and:
```sh
conda install -c conda-forge ipympl
```
The `./deeplearning.mplstyle` problem will still occur; fix it as before. In addition, you should also fix the same path in `lab_utils_common.py` and `lab_utils_uni.py`. For the following labs, keep doing so whenever you encounter `./deeplearning.mplstyle`.
Lab05
You may run into an `int` overflow when running `plt_divergence(p_hist, J_hist, x_train, y_train)`. Solution:
```python
# in file lab_utils_uni.py
```
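A likely cause is that a cost array inside `plt_divergence` inherits NumPy's default 32-bit integer dtype on Windows, so the large squared errors overflow. One possible fix, given only as a sketch (the exact line in `lab_utils_uni.py` may differ), is to allocate the array with an explicit float dtype:

```python
# inside plt_divergence in lab_utils_uni.py (assumed fix):
# use float64 instead of inheriting w_array's integer dtype
cost = np.zeros_like(w_array, dtype=np.float64)
```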
Optional labs - W2
Lab01
See NumPy.
Lab02
The same bugs as in Supervised Machine Learning, Optional labs - W1, Lab03.
Lab03
The same bugs as in Supervised Machine Learning, Optional labs - W1, Lab03.
Lab04
Just follow its instructions.
Lab05 - Lab06
See Scikit-learn.
Optional labs - W3
Lab01 - Lab09_Soln
Just follow its instructions.
Lab01_user
Answer:
```python
g = 1 / (1 + np.exp(-z))
```
Lab02_user
Answer:
```python
x1 = 3 - x0
```
Besides, there is a bug in `lab_utils.plot_data`:

```python
# Add the following codes after neg = y == 0
```
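A common fix for this bug (stated as an assumption; the exact lines may differ) is to flatten the boolean masks so they index the 2-D `X` correctly when `y` has shape `(m, 1)`:

```python
# right after pos = y == 1 and neg = y == 0 (assumed fix):
pos = pos.reshape(-1,)   # turn a (m, 1) boolean array into a 1-D mask
neg = neg.reshape(-1,)
```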
Lab03_user
Answer:
```python
for i in range(m):
```
`@` can represent matrix multiplication.
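A quick illustration with made-up values, showing that `@` matches `np.dot` for the inner product used in such loops:

```python
import numpy as np

w = np.array([0.5, -1.2, 3.0])
x = np.array([1.0, 2.0, 3.0])
b = 4.0

# for 1-D arrays, @ computes the same inner product as np.dot
print(np.isclose(np.dot(w, x) + b, w @ x + b))  # True
```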
Lab04_user
Non-vectorized answer:
```python
### START CODE HERE ###
```
For each example, compute its error and apply it to each $w_j$.
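A looping sketch consistent with that hint, assuming a logistic-regression gradient (the function name and signature are assumptions, not the lab's exact code):

```python
import numpy as np

def compute_gradient_loop(X, y, w, b):
    """Accumulate each example's error into every dj_dw[j]."""
    m, n = X.shape
    dj_dw = np.zeros(n)
    dj_db = 0.0
    for i in range(m):
        f_wb_i = 1 / (1 + np.exp(-(np.dot(w, X[i]) + b)))  # sigmoid prediction
        err_i = f_wb_i - y[i]                              # this example's error
        for j in range(n):
            dj_dw[j] += err_i * X[i, j]                    # apply the error to each w_j
        dj_db += err_i
    return dj_dw / m, dj_db / m
```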
Vectorized answer:
```python
### START CODE HERE ###
```
Get the errors of all examples simultaneously.
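The equivalent vectorized sketch, under the same assumptions:

```python
import numpy as np

def compute_gradient_vectorized(X, y, w, b):
    """Compute the errors of all examples at once."""
    m = X.shape[0]
    f_wb = 1 / (1 + np.exp(-(X @ w + b)))   # predictions for every example
    err = f_wb - y                          # error vector, shape (m,)
    dj_dw = X.T @ err / m                   # weight each feature column by the errors
    dj_db = np.sum(err) / m
    return dj_dw, dj_db
```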
Lab05_user
See Scikit-learn.
Lab06_user
This lab implements multiclass classification with multiple binary classification models, i.e. the One-vs-All algorithm. Its core idea is quite simple: for an example whose label $y$ can take $n$ values, represent the label as a vector of $n$ binary elements, only one of which is 1. Then train $n$ binary classification models and choose the largest of their predictions as $\widehat{y}$.
Answer:
```python
# step 1
```
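A minimal sketch of the One-vs-All idea described above, using scikit-learn's `LogisticRegression` as the binary model (an assumption; the lab may build its binary classifiers differently):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def one_vs_all_fit(X, y, n_classes):
    # one binary model per class: "is class c or not"
    models = []
    for c in range(n_classes):
        y_c = (y == c).astype(int)
        models.append(LogisticRegression().fit(X, y_c))
    return models

def one_vs_all_predict(X, models):
    # each model's confidence that an example belongs to its class,
    # then pick the class with the largest prediction
    scores = np.column_stack([m.predict_proba(X)[:, 1] for m in models])
    return np.argmax(scores, axis=1)
```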
Lab07_user
Just follow its instructions. However, there is also a bug because of the update of sklearn
. We should turn penalty='none'
to penalty=None
.
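For example (a minimal illustration, not the lab's exact cell):

```python
from sklearn.linear_model import LogisticRegression

# recent scikit-learn versions expect None rather than the string 'none'
lr_model = LogisticRegression(penalty=None)
```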
`map_feature` is a very interesting function:

```python
def map_feature(X1, X2, degree):
```
It produces polynomials of degree up to `degree` formed by `X1` and `X2`. That is, if `degree=3`:
$$(x_1+x_2)+(x_1^2+x_1x_2+x_2^2)+(x_1^3+x_1^2x_2+x_1x_2^2+x_2^3)$$
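An implementation matching that description might look like the following sketch (an assumption, not necessarily the lab's exact code):

```python
import numpy as np

def map_feature(X1, X2, degree):
    """Emit every monomial x1^(i-j) * x2^j for total degree i = 1 .. degree."""
    X1 = np.atleast_1d(X1)
    X2 = np.atleast_1d(X2)
    out = []
    for i in range(1, degree + 1):
        for j in range(i + 1):
            out.append((X1 ** (i - j)) * (X2 ** j))
    return np.stack(out, axis=1)

# degree=3 yields the 9 terms listed above: x1, x2, x1^2, x1*x2, x2^2, ...
print(map_feature(np.array([2.0]), np.array([3.0]), 3).shape)  # (1, 9)
```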
Lab08_user
Answer:
```python
### START CODE HERE ###
```
Lab09_user
Answer:
```python
# Looping version
```
PracticeLab01
This lab requires us to implement the `compute_cost` and `compute_gradient` functions of a linear regression model with only one feature. It is quite simple:

```python
# compute_cost
```
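For reference, a minimal sketch of both functions for the single-feature case (argument order and return values are assumptions):

```python
import numpy as np

def compute_cost(x, y, w, b):
    """J(w, b) = (1 / 2m) * sum((w*x + b - y)^2)."""
    m = x.shape[0]
    err = w * x + b - y
    return np.sum(err ** 2) / (2 * m)

def compute_gradient(x, y, w, b):
    """Partial derivatives of J with respect to w and b."""
    m = x.shape[0]
    err = w * x + b - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db
```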
PracticeLab02
This lab requires us to implement a logistic regression model with regularization. It is a bit more complicated than PracticeLab01, but it's still easy to finish.
```python
# sigmoid
```
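A sketch of the main pieces under the usual regularized-logistic-regression formulas, where only `w` is penalized (function names and the `lambda_` argument are assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def compute_cost_reg(X, y, w, b, lambda_=1.0):
    """Cross-entropy cost plus an L2 penalty on w (b is not regularized)."""
    m = X.shape[0]
    f = sigmoid(X @ w + b)
    cost = -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))
    return cost + (lambda_ / (2 * m)) * np.sum(w ** 2)

def compute_gradient_reg(X, y, w, b, lambda_=1.0):
    """Gradient of the regularized cost."""
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y
    dj_dw = X.T @ err / m + (lambda_ / m) * w
    dj_db = np.mean(err)
    return dj_dw, dj_db
```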