JourneyToCoding

Code for Fun

The Discrete Fourier Transform (DFT) is a linear transform that converts a finite sequence of equally spaced samples of a function in the time domain into a same-length sequence of equally spaced samples in the frequency domain; each output value is a complex number describing the amplitude and phase of one frequency component.
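
As a rough illustration of the definition above, here is a minimal sketch that computes the DFT directly from its defining sum and checks it against NumPy's FFT (the function name `naive_dft` is my own, not from the post):

```python
import numpy as np

def naive_dft(x):
    """Compute the DFT directly from its defining sum:
    X[k] = sum_n x[n] * exp(-2j*pi*k*n/N). O(N^2), for illustration only."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    n = np.arange(N)
    k = n.reshape((N, 1))
    W = np.exp(-2j * np.pi * k * n / N)  # N x N matrix of complex exponentials
    return W @ x

# Eight equally spaced time-domain samples of a simple signal.
t = np.arange(8)
x = np.sin(2 * np.pi * t / 8)

# The direct sum matches NumPy's FFT up to floating-point error.
print(np.allclose(naive_dft(x), np.fft.fft(x)))  # True
```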

Read more »

Maximum Mean Discrepancy (MMD) is a loss function commonly used in Transfer Learning; it measures the difference between the distributions of two random variables.
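
As a sketch of the idea (not necessarily the post's exact formulation), the squared MMD with an RBF kernel can be estimated from two sample batches like this; the bandwidth value below is an assumption:

```python
import torch

def mmd_rbf(x, y, sigma=1.0):
    """Biased estimate of squared MMD between sample batches x and y,
    using an RBF (Gaussian) kernel with bandwidth sigma."""
    def kernel(a, b):
        # Pairwise squared Euclidean distances, then the Gaussian kernel.
        d2 = torch.cdist(a, b) ** 2
        return torch.exp(-d2 / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

# Batches from different distributions give a larger MMD
# than batches from the same distribution.
x = torch.randn(256, 2)
y = torch.randn(256, 2) + 2.0
z = torch.randn(256, 2)
print(mmd_rbf(x, y).item(), mmd_rbf(x, z).item())
```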

Read more »

The Fourier transform converts a function into a form that describes the frequencies present in the original function. To a certain extent, it can also be viewed as a coordinate transformation from the time domain to the frequency domain. It is the basis for understanding convolution and CNNs.
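
For instance, the convolution theorem, which is one reason the transform matters for convolution and CNNs, can be checked numerically; this is a minimal NumPy sketch, not code from the post:

```python
import numpy as np

# Convolution theorem: convolution in the time domain equals
# element-wise multiplication in the frequency domain.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
h = rng.standard_normal(64)

# Circular convolution computed directly in the time domain.
time_domain = np.array(
    [sum(x[m] * h[(n - m) % 64] for m in range(64)) for n in range(64)]
)

# The same result via DFT -> multiply -> inverse DFT.
freq_domain = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

print(np.allclose(time_domain, freq_domain))  # True
```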

Read more »

In deep learning, numerical stability refers to keeping the training of a model stable and feasible. More specifically, it is the stability of the model's parameters, outputs, and gradients.
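
A quick way to see the issue is to multiply many random layer matrices together, as in the forward pass of a deep linear network: depending on their scale, the values explode or vanish. This is a toy sketch, not the post's code:

```python
import torch

def repeated_matmul(scale, depth=100, size=4):
    """Multiply `depth` random weight matrices, mimicking the forward
    pass of a deep linear network, and return the largest magnitude."""
    torch.manual_seed(0)
    m = torch.eye(size)
    for _ in range(depth):
        m = m @ (scale * torch.randn(size, size))
    return m.abs().max().item()

# Weights that are slightly too large blow up; slightly too small, and they vanish.
print(repeated_matmul(scale=1.0))   # huge values (exploding)
print(repeated_matmul(scale=0.1))   # essentially zero (vanishing)
```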

Read more »

Hyperparameters are the settings of a neural network other than the model parameters w and b, including the learning rate, the number of layers, the size of each layer, and so on. Unlike w and b, which are learned from data, hyperparameters are chosen entirely by us. Tuning a model means searching for better hyperparameters.
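
As a sketch of what that search looks like in practice (the search space and scoring function below are made up for illustration), a simple grid search just tries hyperparameter combinations and keeps the best one:

```python
import itertools

# Hypothetical search space: these values are illustrative, not from the post.
learning_rates = [1e-1, 1e-2, 1e-3]
num_layers = [2, 4]
hidden_sizes = [64, 128]

def train_and_evaluate(lr, depth, width):
    """Placeholder for a real training run that returns a validation score.
    A dummy formula is used here so the sketch stays runnable."""
    return -abs(lr - 1e-2) - 0.01 * depth + 0.001 * width

best = max(
    itertools.product(learning_rates, num_layers, hidden_sizes),
    key=lambda cfg: train_and_evaluate(*cfg),
)
print("best hyperparameters:", best)
```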

Read more »

Dive into Deep Learning (D2L) is an online deep learning course by Mu Li. In this course, Mu Li teaches the basic concepts of DL and the usage of PyTorch. This post covers the environment configuration on Windows, an introduction to the course, and some basic concepts about matrix derivatives.
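
As an example of the kind of matrix derivative the post touches on, PyTorch's autograd can verify that the gradient of the quadratic form x^T A x with respect to x is (A + A^T)x; this is a small sketch of my own, not code from the post:

```python
import torch

# Gradient of f(x) = x^T A x with respect to x should equal (A + A^T) x.
A = torch.randn(3, 3)
x = torch.randn(3, requires_grad=True)

f = x @ A @ x          # scalar quadratic form
f.backward()           # autograd computes df/dx

expected = (A + A.T) @ x.detach()
print(torch.allclose(x.grad, expected))  # True
```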

Read more »