Do you own Deep Learning, the classic bestseller of the field, but find parts of it hard to follow? Are the math prerequisites worrying you, or have they even made you give up on reading this classic? Here is a suggestion.

 

Keep a copy of Mathematics of Machine Learning at hand, master the mathematics used in machine learning, and then come back to deep learning.

Why? First consider the mathematics needed to read Deep Learning. As shown in the picture below, Part 1 of the book introduces the basic mathematical tools and the concepts of machine learning.

 

Now look at which areas of mathematics Mathematics of Machine Learning covers.

 

Probability theory

Probability theory is an important tool for machine learning. If the inputs and outputs of a machine learning algorithm are treated as random variables or random vectors, the problem can be modeled with probability theory. One advantage of this approach is that uncertainty can be modeled explicitly, which is essential for some problems. It also makes it possible to capture probabilistic dependencies between variables and to perform causal reasoning. Probability theory provides the theoretical foundation for stochastic algorithms such as Monte Carlo methods and genetic algorithms, and for random number generation, from basic random number generators to sampling algorithms. Finally, probability theory is the prerequisite course for information theory and stochastic processes.
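As a small taste of the Monte Carlo idea mentioned above, the sketch below estimates π by sampling uniform random points in the unit square; it uses only the Python standard library, and the function name is made up for this illustration.

```python
import random

def estimate_pi(n_samples=100_000):
    """Monte Carlo estimate of pi: the fraction of uniform random points
    in the unit square that land inside the quarter unit circle, times 4."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

print(estimate_pi())  # roughly 3.14; the error shrinks like 1 / sqrt(n_samples)
```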

The standard engineering textbooks on probability theory and mathematical statistics cover most of the probabilistic knowledge that machine learning requires, with the exception of the following topics:

1. Conditional independence

2. Jensen's inequality

3. Some probability distributions, such as the multinomial distribution, the Laplace distribution, and the t distribution

4. Transformations of probability distributions

5. The multivariate normal distribution

6. Transformations of multivariate probability distributions

7. Some parameter estimation methods, including maximum a posteriori estimation and Bayesian estimation

8. Random number generation algorithms, including inverse transform sampling and rejection sampling (a small sketch of inverse transform sampling follows this list)
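The sketch below illustrates inverse transform sampling for the exponential distribution; it assumes only the Python standard library, and the function name is invented for this example.

```python
import math
import random

def sample_exponential(rate):
    """Inverse transform sampling for the exponential distribution.

    If U ~ Uniform(0, 1), then X = -ln(1 - U) / rate has CDF
    F(x) = 1 - exp(-rate * x), i.e. X ~ Exponential(rate).
    """
    u = random.random()
    return -math.log(1.0 - u) / rate

samples = [sample_exponential(rate=2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))  # should be close to 1 / rate = 0.5
```

The same recipe works for any distribution whose cumulative distribution function can be inverted in closed form; when it cannot, rejection sampling or MCMC methods are used instead.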

 

Optimization methods

Optimization methods play a central role in machine learning. Unfortunately, many readers have never studied this subject systematically, including linear programming, convex optimization, and nonlinear programming; a typical numerical analysis course covers only a small part of optimization. Almost every machine learning algorithm reduces to solving an optimization problem, either to determine the model parameters or to obtain the prediction directly. Supervised learning is the typical example of the former: the model parameters are determined by minimizing a loss function or optimizing some other kind of objective function. Dimensionality reduction algorithms such as principal component analysis are typical of the latter: the reduced representation is obtained by optimizing an objective function.
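As a minimal illustration of "determine the parameters by minimizing a loss", the sketch below fits a one-dimensional linear model with gradient descent on the squared loss; the data, learning rate, and iteration count are made up for the example.

```python
# Minimal gradient descent on the mean squared error for y = w * x + b.
# The data points, learning rate and step count are illustrative only.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.1, 4.9, 7.2, 8.8]   # roughly y = 2x + 1 with a little noise

w, b = 0.0, 0.0
lr = 0.01
for step in range(2000):
    # Gradients of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should approach roughly 2 and 1
```

Training a deep network follows the same pattern, only with many more parameters and with the gradients computed by backpropagation.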

 

Information theory

Information theory is an extension of probability theory. In machine learning and deep learning it is mainly used to construct objective functions and to analyze and prove properties of algorithms. This is another course that many readers have never taken.

In machine learning, and especially in deep learning, information theory shows up everywhere:

1. Decision tree training uses entropy as a splitting criterion

2. Deep learning routinely uses cross entropy, KL divergence, JS divergence, mutual information, and related concepts (a small sketch follows this list)

3. The derivation of variational inference is based on KL divergence

4. Distance metric learning and manifold dimensionality reduction algorithms also rely on information theory
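For concreteness, the sketch below computes entropy, cross entropy, and KL divergence for two small discrete distributions; the distributions themselves are made up for the example.

```python
import math

def entropy(p):
    """H(p) = -sum_i p_i * log p_i (natural log), for a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) = H(p, q) - H(p); it is >= 0 and zero only when p == q."""
    return cross_entropy(p, q) - entropy(p)

p = [0.7, 0.2, 0.1]   # "true" label distribution (illustrative)
q = [0.5, 0.3, 0.2]   # model's predicted distribution (illustrative)
print(entropy(p), cross_entropy(p, q), kl_divergence(p, q))
```

Because H(p) does not depend on the model, minimizing the cross entropy of the predictions against the labels is equivalent to minimizing the KL divergence, which is why cross entropy is the standard classification loss.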

 

Stochastic processes

Stochastic processes are another extension of probability theory, and another course most readers have never taken. In machine learning, stochastic processes appear in probabilistic graphical models, reinforcement learning, and Bayesian optimization. Without an understanding of Markov processes, MCMC sampling algorithms will leave you confused.
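The sketch below simulates a two-state Markov chain and checks that the empirical state frequencies approach the stationary distribution; the transition matrix is invented for the example.

```python
import random

# Two-state Markov chain with a made-up transition matrix:
# P[i][j] is the probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

state = 0
counts = [0, 0]
for _ in range(100_000):
    counts[state] += 1
    # Sample the next state from the current row of the transition matrix.
    state = 0 if random.random() < P[state][0] else 1

total = sum(counts)
print([c / total for c in counts])
# The stationary distribution solves pi = pi P; here it is (5/6, 1/6),
# so the printed frequencies should be close to [0.833, 0.167].
```

MCMC methods such as Metropolis-Hastings and Gibbs sampling run exactly this kind of chain, but with the transition rule designed so that the stationary distribution is the target distribution you want to sample from.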

 

Graph theory

Graph theory is usually taught only in computer-related majors, and even there incompletely; spectral graph theory, for example, is rarely covered. In machine learning, probabilistic graphical models are a typical graph structure, and both manifold dimensionality reduction and spectral clustering rely on spectral graph theory. Computational graphs are another typical use of graphs, and graph neural networks, a newer family of deep learning models, are closely tied to graph theory as well. It is therefore worth filling in this knowledge.
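As a taste of spectral graph theory, the sketch below builds the Laplacian of a tiny made-up graph with NumPy and inspects its eigenvalues; the number of (near-)zero eigenvalues equals the number of connected components, which is the fact spectral clustering exploits.

```python
import numpy as np

# Adjacency matrix of a made-up graph with two connected components:
# nodes {0, 1, 2} form a triangle, nodes {3, 4} form a single edge.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # (unnormalized) graph Laplacian

eigenvalues = np.linalg.eigvalsh(L)
print(np.round(eigenvalues, 3))
# Two eigenvalues are numerically zero, one per connected component;
# spectral clustering uses the corresponding eigenvectors to assign clusters.
```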

Reading this, aren't you glad that you can finally keep working through this bible of deep learning?

Deep Learning

 

Deep Learning, written by three world-renowned experts, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, is the classic textbook of the field. The book is organized into three parts: Part 1 introduces the basic mathematical tools and the concepts of machine learning, which are the prerequisites for deep learning; Part 2 gives a systematic, in-depth treatment of today's mature deep learning methods and techniques; Part 3 discusses forward-looking directions and ideas that are widely regarded as the future focus of deep learning research.

Deep Learning suits a wide range of readers, including undergraduate and graduate students in related fields, as well as readers with no background in machine learning or statistics who want to pick up deep learning quickly and apply it to real products or platforms.

 

Mathematics of Machine Learning

 

The book consists of 8 chapters. In a compact space it covers, accurately and systematically, the mathematics needed for machine learning, deep learning, and reinforcement learning, accounting for most of the mathematical knowledge these three subjects require. It is a targeted supplement to the undergraduate science and engineering courses "Advanced mathematics / calculus", "Linear algebra", and "Probability theory and mathematical statistics". The main contents of each chapter are introduced below.

SSM— User module ( two ) Forget the password , Change Password , Get user information One and a half years JAVA Summary of work experience python To solve the problem of dictionary writing list in Vue Common features ( Form operation ) Rare expletives in Lei Jun's press conference : this XXX I'm definitely here to make trouble !PYTHON Summary of final review 2021 Front end interview written questions and answers After the black myth Wukong Another domestic game blown up by foreigners 27 Year old invention SQL in the future , God took him away CLion Novice step on the pit :CMake project is not loaded