I haven't blogged in a while; I've mainly been studying neural networks in MATLAB. Along the way I've learned about the perceptron, radial basis function networks, and BP (backpropagation) networks. Next, let's work through a case study of the most widely used of these: the BP neural network.

Example: predicting gasoline octane number with a BP neural network.

* First, the dataset spectra_data.mat contains two arrays, P (the NIR spectra) and T (the octane numbers). There are 60 samples, each with 401 feature values. We will use part of the data for training and the rest for testing.
* First, a look at the input dataset NIR (P):

* And the target dataset octane (T):

* The numerical ranges of the features vary widely, so we should normalize them. Here I map all the data into [0,1], which makes the later steps easier to handle; this must of course be done for both P and T. The `mapminmax` function normalizes `P_train`:

```matlab
%% III. Data normalization
[p_train, ps_input] = mapminmax(P_train,0,1);
```
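For readers outside MATLAB: `mapminmax` rescales each row linearly so that its minimum maps to `ymin` and its maximum to `ymax`, and it returns the settings needed to invert the mapping later. A minimal NumPy sketch of the same idea (the function names `minmax_scale_rows` and `minmax_reverse` are my own, not part of any library):

```python
import numpy as np

def minmax_scale_rows(x, ymin=0.0, ymax=1.0):
    """Rescale each row of x into [ymin, ymax], like MATLAB's mapminmax.
    Returns the scaled array plus the settings needed to invert it."""
    xmin = x.min(axis=1, keepdims=True)
    xmax = x.max(axis=1, keepdims=True)
    y = (ymax - ymin) * (x - xmin) / (xmax - xmin) + ymin
    return y, (xmin, xmax, ymin, ymax)

def minmax_reverse(y, settings):
    """Undo the scaling, like mapminmax('reverse', ...)."""
    xmin, xmax, ymin, ymax = settings
    return (y - ymin) / (ymax - ymin) * (xmax - xmin) + xmin

x = np.array([[2.0, 4.0, 6.0],
              [10.0, 20.0, 30.0]])
y, ps = minmax_scale_rows(x)      # every row now spans [0, 1]
x_back = minmax_reverse(y, ps)    # round-trips back to the original values
```

The `'reverse'` step used later in the post corresponds to `minmax_reverse` here: predictions made in [0,1] are mapped back to the original octane scale.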
* We use 50 samples for training and 10 for testing. To make sure the model generalizes, the 50 training samples are drawn at random:

```matlab
%% II. Randomly generate the training and test sets
temp = randperm(size(NIR,1));   % shuffle the 60 sample indices
disp(temp(1:50))
% Training set: 50 samples
P_train = NIR(temp(1:50),:)';
T_train = octane(temp(1:50),:)';
% Test set: 10 samples
P_test = NIR(temp(51:end),:)';
T_test = octane(temp(51:end),:)';
N = size(P_test,2);
```
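The same shuffle-and-split logic can be sketched in NumPy. The random arrays here are stand-ins for the real `NIR` and `octane` data from spectra_data.mat, and the seed is my own choice for reproducibility:

```python
import numpy as np

rng = np.random.default_rng(0)         # seeded for reproducibility (assumption)
NIR = rng.random((60, 401))            # stand-in for the real 60x401 spectra
octane = rng.random((60, 1))           # stand-in for the real octane numbers

temp = rng.permutation(NIR.shape[0])   # shuffle the 60 sample indices, like randperm
P_train, T_train = NIR[temp[:50]], octane[temp[:50]]   # training set: 50 samples
P_test,  T_test  = NIR[temp[50:]], octane[temp[50:]]   # test set: 10 samples
```

Because `temp` is a permutation, the two index sets are guaranteed to be disjoint, so no test sample leaks into training.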
* With the data prepared, we build and train the network:

```matlab
%% IV. Build, train, and test the BP network
% 1. Create the network; 9 is the number of hidden-layer neurons
%    (change it and compare the test results)
net = newff(p_train,t_train,9);
% This gives 3628 connection parameters: 401*9 input-to-hidden weights
% + 9 hidden biases + 9*1 hidden-to-output weights + 1 output bias
% 2. Set the training parameters
net.trainParam.epochs = 1000;   % maximum number of iterations
net.trainParam.goal = 1e-3;     % stop when the MSE falls below this value
net.trainParam.lr = 0.01;       % learning rate
% 3. Train the network
net = train(net,p_train,t_train);
% 4. Simulation test: predictions for the 10 test samples
t_sim = sim(net,p_test);
```
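The post asks where the figure of 3628 connection weights comes from. With 401 inputs, 9 hidden neurons, and 1 output, it is simply the weight and bias count of the two layers; a quick arithmetic check in Python:

```python
n_in, n_hidden, n_out = 401, 9, 1

input_to_hidden = n_in * n_hidden + n_hidden    # weights plus hidden biases: 3618
hidden_to_output = n_hidden * n_out + n_out     # weights plus output bias: 10
total = input_to_hidden + hidden_to_output
print(total)                                    # 3628
```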
* Once this is done, the predictions are still scaled into [0,1], so we need to map them back to the original octane scale:

```matlab
% 5. Denormalize the predictions
T_sim = mapminmax('reverse',t_sim,ps_output);   % invert the normalization
```
* Finally, we evaluate the model's performance and present the results in a plot:

```matlab
%% V. Performance evaluation
% 1. Relative error
error = abs(T_sim - T_test)./T_test;
% 2. Coefficient of determination R^2
R2 = (N * sum(T_sim .* T_test) - sum(T_sim) * sum(T_test))^2 / ...
     ((N * sum((T_sim).^2) - (sum(T_sim))^2) * ...
      (N * sum((T_test).^2) - (sum(T_test))^2));
% 3. Compare the results: true value, prediction, error
result = [T_test' T_sim' error']

%% VI. Plotting
figure
plot(1:N,T_test,'b:*',1:N,T_sim,'r-o')
legend('True value','Predicted value')
xlabel('Test sample')
ylabel('Octane number')
string = {'Predicted vs. true octane numbers on the test set';['R^2=' num2str(R2)]};
title(string)
```
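The R^2 expression used above is exactly the squared Pearson correlation between predictions and true values. A NumPy check with made-up vectors (the numbers below are illustrative, not from the actual model run) confirms the two agree:

```python
import numpy as np

T_sim  = np.array([87.1, 85.3, 88.9, 86.0, 84.7])   # made-up predictions
T_test = np.array([87.3, 85.0, 89.1, 86.2, 84.5])   # made-up true values
N = T_test.size

# R^2 formula from the post
R2 = (N * np.sum(T_sim * T_test) - np.sum(T_sim) * np.sum(T_test))**2 / (
     (N * np.sum(T_sim**2) - np.sum(T_sim)**2) *
     (N * np.sum(T_test**2) - np.sum(T_test)**2))

# The same quantity via the Pearson correlation coefficient
r = np.corrcoef(T_sim, T_test)[0, 1]
print(np.isclose(R2, r**2))   # True
```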
* Here is the resulting plot:

The coefficient of determination R^2 is 0.961, which is quite a good result.

Summary: I now understand what neural networks do. For data with few features, we can classify with linear or logistic regression, but for machine learning problems with many features (such as the 401 features here), linear and logistic regression tend to overfit or underfit. That is when we need a neural network: let it train on its own, and it achieves the best results. This is the advantage of neural networks!

Finally, the complete code:

```matlab
%% I. Clear the environment
clear
clc

%% II. Generate the training and test sets
% 1. Load the data
load spectra_data.mat
% 2. Randomly split into training and test sets
temp = randperm(size(NIR,1));   % shuffle the 60 sample indices
disp(temp(1:50))
% Training set: 50 samples
P_train = NIR(temp(1:50),:)';
T_train = octane(temp(1:50),:)';
% Test set: 10 samples
P_test = NIR(temp(51:end),:)';
T_test = octane(temp(51:end),:)';
N = size(P_test,2);

%% III. Data normalization
[p_train, ps_input] = mapminmax(P_train,0,1);
p_test = mapminmax('apply',P_test,ps_input);
[t_train, ps_output] = mapminmax(T_train,0,1);

%% IV. Build, train, and test the BP network
% 1. Create the network; 9 is the number of hidden-layer neurons
net = newff(p_train,t_train,9);
% 2. Set the training parameters
net.trainParam.epochs = 1000;   % maximum number of iterations
net.trainParam.goal = 1e-3;     % stop when the MSE falls below this value
net.trainParam.lr = 0.01;       % learning rate
% 3. Train the network
net = train(net,p_train,t_train);
% 4. Simulation test: predictions for the 10 test samples
t_sim = sim(net,p_test);
% 5. Denormalize the predictions
T_sim = mapminmax('reverse',t_sim,ps_output);

%% V. Performance evaluation
% 1. Relative error
error = abs(T_sim - T_test)./T_test;
% 2. Coefficient of determination R^2
R2 = (N * sum(T_sim .* T_test) - sum(T_sim) * sum(T_test))^2 / ...
     ((N * sum((T_sim).^2) - (sum(T_sim))^2) * ...
      (N * sum((T_test).^2) - (sum(T_test))^2));
% 3. Compare the results: true value, prediction, error
result = [T_test' T_sim' error']

%% VI. Plotting
figure
plot(1:N,T_test,'b:*',1:N,T_sim,'r-o')
legend('True value','Predicted value')
xlabel('Test sample')
ylabel('Octane number')
string = {'Predicted vs. true octane numbers on the test set';['R^2=' num2str(R2)]};
title(string)
```
