JourneyToCoding

Code for Fun

Maximum Mean Discrepancy (MMD) is a loss function commonly used in Transfer Learning. It measures the difference between the distributions of two random variables.
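A minimal sketch of how MMD might be estimated from two sets of samples using a Gaussian (RBF) kernel; the function name, kernel choice and bandwidth below are illustrative, not taken from the post:

```python
import torch

def mmd_rbf(x, y, sigma=1.0):
    """Simple (biased) estimate of squared MMD between samples x (n, d) and y (m, d)."""
    def kernel(a, b):
        # Pairwise squared Euclidean distances, then Gaussian (RBF) kernel values.
        dist = torch.cdist(a, b) ** 2
        return torch.exp(-dist / (2 * sigma ** 2))
    k_xx = kernel(x, x).mean()
    k_yy = kernel(y, y).mean()
    k_xy = kernel(x, y).mean()
    return k_xx + k_yy - 2 * k_xy

# Two sets of samples drawn from different distributions.
source = torch.randn(100, 5)
target = torch.randn(100, 5) + 1.0
print(mmd_rbf(source, target))  # larger value => more dissimilar distributions
```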

Read more »

The Fourier transform converts a function into a form that describes the frequencies present in the original function. To a certain extent, it can also be viewed as a coordinate transformation from the time domain to the frequency domain. It is the basis for understanding convolution and CNNs.
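For example, a discrete Fourier transform (here via NumPy's FFT) recovers the frequency of a pure sine wave; the signal and sampling rate below are made up for illustration:

```python
import numpy as np

# A 5 Hz sine wave sampled at 100 Hz for one second.
fs = 100
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t)

# The discrete Fourier transform describes the frequencies present in the signal.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
peak = freqs[np.argmax(np.abs(spectrum))]
print(peak)  # ~5.0, the frequency of the original sine wave
```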

Read more »

In deep learning, numerical stability refers to keeping the training of a model stable and feasible. More specifically, it means the stability of the model's parameters, outputs and gradients.
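A toy sketch of what instability looks like: repeatedly multiplying by an ill-scaled weight matrix, as happens when values and gradients flow through many layers, makes numbers explode or vanish. The matrices below are arbitrary illustrations, not from the post:

```python
import torch

torch.manual_seed(0)
x = torch.randn(4, 4)

w_large = torch.randn(4, 4) * 1.5   # weights scaled too large
w_small = torch.randn(4, 4) * 0.3   # weights scaled too small

explode, vanish = x.clone(), x.clone()
for _ in range(50):                 # mimic 50 stacked layers
    explode = explode @ w_large
    vanish = vanish @ w_small

print(explode.abs().max())  # extremely large (may overflow) -> exploding values
print(vanish.abs().max())   # close to zero -> vanishing values
```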

Read more »

Hyperparameters are the parameters of a neural network other than w and b, including the learning rate, the depth of the network, the size of each layer and so on. Unlike the model parameters w and b, which are learned from data, hyperparameters are set entirely by ourselves. Tuning a model means finding better hyperparameters.
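A small sketch of the distinction, assuming PyTorch: the hyperparameters are plain values we pick ourselves, while the w and b inside each layer are learned by the optimizer. All numbers below are illustrative:

```python
import torch
from torch import nn

# Hyperparameters: chosen by us, not learned from data.
learning_rate = 0.1   # step size of gradient descent
num_hidden = 2        # depth: number of hidden layers
hidden_size = 64      # width: size of each hidden layer

# Model parameters (the w and b inside each Linear layer) are learned.
layers = [nn.Linear(10, hidden_size), nn.ReLU()]
for _ in range(num_hidden - 1):
    layers += [nn.Linear(hidden_size, hidden_size), nn.ReLU()]
layers.append(nn.Linear(hidden_size, 1))
net = nn.Sequential(*layers)

optimizer = torch.optim.SGD(net.parameters(), lr=learning_rate)
```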

Read more »

Dive into Deep Learning (D2L) is an online deep learning course by Mu Li. In this course, Mu Li teaches the basic concepts of deep learning and the usage of PyTorch. This post covers the environment configuration on Windows, an introduction to the course and some basic concepts of matrix derivatives.
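As a taste of the matrix-derivative material, here is a small autograd check, assuming PyTorch: for y = xᵀx the gradient with respect to x is 2x, which autograd confirms. The example is illustrative, not quoted from the post:

```python
import torch

x = torch.arange(4.0, requires_grad=True)
y = torch.dot(x, x)      # y = x^T x
y.backward()             # compute dy/dx automatically

print(x.grad)            # tensor([0., 2., 4., 6.])
print(x.grad == 2 * x)   # all True: dy/dx = 2x
```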

Read more »

Reinforcement learning is an area of machine learning concerned with how intelligent agents ought to take actions in an environment in order to maximize cumulative reward. This post includes my notes on the reinforcement learning course by Andrew Ng.
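A short sketch of the cumulative (discounted) reward that the agent tries to maximize; the reward sequence and discount factor are made up for illustration:

```python
# Discounted return: G = r_0 + gamma * r_1 + gamma^2 * r_2 + ...
def discounted_return(rewards, gamma=0.9):
    g = 0.0
    for r in reversed(rewards):  # accumulate from the last reward backwards
        g = r + gamma * g
    return g

print(discounted_return([1.0, 0.0, 0.0, 10.0]))  # 1 + 0.9**3 * 10 = 8.29
```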

Read more »