Analysis of 43 Cases of MATLAB Neural Network: Chapter 29, Application of the Extreme Learning Machine in Regression Fitting and Classification: Comparative Experiments
2022-06-22 05:26:00 【mozun2020】
MATLAB Neural Network: 43 Case Studies, Chapter 29: Application of the Extreme Learning Machine in Regression Fitting and Classification (Comparative Experiments)
1. Preface
MATLAB Neural Network: 43 Case Studies was planned by the MATLAB Technology Forum (www.matlabsky.com), led by teacher Wang Xiaochuan, and published by Beijing University of Aeronautics and Astronautics Press in 2013. It is an example-driven MATLAB teaching book, revised and extended from MATLAB Neural Network: 30 Case Studies, and follows the "theoretical explanation, case analysis, application extension" approach to help readers learn neural networks more intuitively and vividly.
The book has 43 chapters covering common neural networks (BP, RBF, SOM, Hopfield, Elman, LVQ, Kohonen, GRNN, NARX, etc.) and related intelligent algorithms (SVM, decision trees, random forests, extreme learning machines, etc.). Some chapters also combine common optimization algorithms (genetic algorithms, ant colony algorithms, etc.) with neural networks. In addition, the book introduces new functions and features of the neural network toolbox in MATLAB R2012b, such as parallel computing for neural networks, custom neural networks, and efficient neural network programming.
In recent years, with the rise of artificial intelligence research, neural networks have seen another wave of interest. Because of their outstanding performance in signal processing, neural network methods are being applied to many speech and image applications. This series works through the cases in the book and reproduces them in simulation; it is a form of relearning, and I hope that by reviewing the old material I can learn something new and strengthen my understanding and practice of neural network applications in various fields. I have only just started working through this book, so let's begin with the simulation examples, mainly introducing the source-code application examples in each chapter. The simulations are based on the MATLAB 2015b (32-bit) platform; this post covers the Chapter 29 example on the application of the extreme learning machine in regression fitting and classification. Without further ado, let's start!
2. MATLAB Simulation Example 1
Open MATLAB, click "Home", then "Open", and locate the example file.
Select main.m and click "Open".
The source code of main.m is as follows:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Function: application of the extreme learning machine to a classification problem
% Environment: Win7, Matlab2015b
% Modified by: C.S
% Date: 2022-06-20
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% Application of the extreme learning machine to a classification problem
%% Clear environment variables
clear all
clc
warning off
tic
%% Import data
load data.mat
% Randomly split into training and test sets
a = randperm(569);
Train = data(a(1:500),:);
Test = data(a(501:end),:);
% Training data
P_train = Train(:,3:end)';
T_train = Train(:,2)';
% Test data
P_test = Test(:,3:end)';
T_test = Test(:,2)';
tic
%% Create and train the ELM
[IW,B,LW,TF,TYPE] = elmtrain(P_train,T_train,100,'sig',1);
%% ELM simulation test
T_sim_1 = elmpredict(P_train,IW,B,LW,TF,TYPE);
T_sim_2 = elmpredict(P_test,IW,B,LW,TF,TYPE);
toc
%% Compare results
result_1 = [T_train' T_sim_1'];
result_2 = [T_test' T_sim_2'];
% Training set accuracy
k1 = length(find(T_train == T_sim_1));
n1 = length(T_train);
Accuracy_1 = k1 / n1 * 100;
disp(['Training set accuracy = ' num2str(Accuracy_1) '% (' num2str(k1) '/' num2str(n1) ')'])
% Test set accuracy
k2 = length(find(T_test == T_sim_2));
n2 = length(T_test);
Accuracy_2 = k2 / n2 * 100;
disp(['Test set accuracy = ' num2str(Accuracy_2) '% (' num2str(k2) '/' num2str(n2) ')'])
%% Display statistics
count_B = length(find(T_train == 1));
count_M = length(find(T_train == 2));
rate_B = count_B / 500;
rate_M = count_M / 500;
total_B = length(find(data(:,2) == 1));
total_M = length(find(data(:,2) == 2));
number_B = length(find(T_test == 1));
number_M = length(find(T_test == 2));
number_B_sim = length(find(T_sim_2 == 1 & T_test == 1));
number_M_sim = length(find(T_sim_2 == 2 & T_test == 2));
disp(['Total number of cases: ' num2str(569) ...
      '  Benign: ' num2str(total_B) ...
      '  Malignant: ' num2str(total_M)]);
disp(['Total number of training set cases: ' num2str(500) ...
      '  Benign: ' num2str(count_B) ...
      '  Malignant: ' num2str(count_M)]);
disp(['Total number of test set cases: ' num2str(69) ...
      '  Benign: ' num2str(number_B) ...
      '  Malignant: ' num2str(number_M)]);
disp(['Benign breast tumors correctly diagnosed: ' num2str(number_B_sim) ...
      '  Misdiagnosed: ' num2str(number_B - number_B_sim) ...
      '  Diagnostic rate p1 = ' num2str(number_B_sim/number_B*100) '%']);
disp(['Malignant breast tumors correctly diagnosed: ' num2str(number_M_sim) ...
      '  Misdiagnosed: ' num2str(number_M - number_M_sim) ...
      '  Diagnostic rate p2 = ' num2str(number_M_sim/number_M*100) '%']);
R = [];
for i = 50:50:500
    % Create and train the ELM with i hidden neurons
    [IW,B,LW,TF,TYPE] = elmtrain(P_train,T_train,i,'sig',1);
    % ELM simulation test
    T_sim_1 = elmpredict(P_train,IW,B,LW,TF,TYPE);
    T_sim_2 = elmpredict(P_test,IW,B,LW,TF,TYPE);
    % Training set accuracy
    k1 = length(find(T_train == T_sim_1));
    n1 = length(T_train);
    Accuracy_1 = k1 / n1 * 100;
    % Test set accuracy
    k2 = length(find(T_test == T_sim_2));
    n2 = length(T_test);
    Accuracy_2 = k2 / n2 * 100;
    R = [R; Accuracy_1 Accuracy_2];
end
figure
plot(50:50:500,R(:,2),'b:o')
xlabel('Number of hidden-layer neurons')
ylabel('Test set prediction accuracy (%)')
title('Effect of the number of hidden-layer neurons on ELM performance')
toc
With the code in place, click "Run" to start the simulation. The output is as follows:
Elapsed time is 0.138923 seconds.
Training set accuracy = 87% (435/500)
Test set accuracy = 88.4058% (61/69)
Total number of cases: 569  Benign: 357  Malignant: 212
Total number of training set cases: 500  Benign: 314  Malignant: 186
Total number of test set cases: 69  Benign: 43  Malignant: 26
Benign breast tumors correctly diagnosed: 40  Misdiagnosed: 3  Diagnostic rate p1 = 93.0233%
Malignant breast tumors correctly diagnosed: 21  Misdiagnosed: 5  Diagnostic rate p2 = 80.7692%
Elapsed time is 0.704174 seconds.
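The functions elmtrain and elmpredict are supplied with the book's source code and are not listed here. For readers who want to see the idea end to end, below is a minimal NumPy sketch of what such a pair of functions might do for classification (sigmoid hidden layer, one-hot targets, pseudoinverse solve). All names and the toy data are assumptions for illustration, not the book's implementation.

```python
import numpy as np

def elm_train(P, T, n_hidden, rng):
    """P: features x samples; T: integer class labels, one per sample."""
    n_features, _ = P.shape
    IW = rng.uniform(-1, 1, (n_hidden, n_features))   # random input weights, never updated
    B = rng.uniform(-1, 1, (n_hidden, 1))             # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(IW @ P + B)))           # 'sig' hidden-layer activation
    classes = np.unique(T)
    T_onehot = (T == classes[:, None]).astype(float)  # one-hot encode the labels
    LW = np.linalg.pinv(H.T) @ T_onehot.T             # output weights: one least-squares solve
    return IW, B, LW, classes

def elm_predict(P, IW, B, LW, classes):
    H = 1.0 / (1.0 + np.exp(-(IW @ P + B)))
    return classes[np.argmax(H.T @ LW, axis=1)]       # highest output wins

# Toy demo: two well-separated Gaussian blobs, 2 features x 100 samples
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))]).T
y = np.array([1] * 50 + [2] * 50)
IW, B, LW, classes = elm_train(X, y, 20, rng)
acc = np.mean(elm_predict(X, IW, B, LW, classes) == y)
print(f"training accuracy = {acc:.2f}")
```

Note that IW and B are drawn once at random and never updated; training reduces to a single least-squares solve for LW, which is why the elapsed time reported above is so short.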

3. MATLAB Simulation Example 2
Open MATLAB, click "Home", then "Open", and locate the example file.
Select main.m and click "Open".
The source code of main.m is as follows:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Function: application of the extreme learning machine to a regression fitting problem
% Environment: Win7, Matlab2015b
% Modified by: C.S
% Date: 2022-06-20
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% Application of the extreme learning machine to a regression fitting problem
%% Clear environment variables
clear all
clc
tic
%% Import data
load data
% Randomly generate training and test sets
k = randperm(size(input,1));
% Training set: 1900 samples
P_train = input(k(1:1900),:)';
T_train = output(k(1:1900));
% Test set: 100 samples
P_test = input(k(1901:2000),:)';
T_test = output(k(1901:2000));
%% Normalization
% Input data (training settings are reused for the test set)
[Pn_train,inputps] = mapminmax(P_train,-1,1);
Pn_test = mapminmax('apply',P_test,inputps);
% Output data
[Tn_train,outputps] = mapminmax(T_train,-1,1);
Tn_test = mapminmax('apply',T_test,outputps);
tic
%% Create and train the ELM
[IW,B,LW,TF,TYPE] = elmtrain(Pn_train,Tn_train,20,'sig',0);
%% ELM simulation test
Tn_sim = elmpredict(Pn_test,IW,B,LW,TF,TYPE);
% Reverse the normalization
T_sim = mapminmax('reverse',Tn_sim,outputps);
toc
%% Compare results
result = [T_test' T_sim'];
% Mean squared error
E = mse(T_sim - T_test)
% Coefficient of determination
N = length(T_test);
R2 = (N*sum(T_sim.*T_test)-sum(T_sim)*sum(T_test))^2/((N*sum((T_sim).^2)-(sum(T_sim))^2)*(N*sum((T_test).^2)-(sum(T_test))^2))
%% Plotting
figure
plot(1:length(T_test),T_test,'r*')
hold on
plot(1:length(T_sim),T_sim,'b:o')
xlabel('Test set sample number')
ylabel('Test set output')
title('ELM test set output')
legend('Expected output','Predicted output')
figure
plot(1:length(T_test),T_test-T_sim,'r-*')
xlabel('Test set sample number')
ylabel('Prediction error')
title('ELM test set prediction error')
toc
With the code in place, click "Run" to start the simulation. The output is as follows:
Elapsed time is 0.026963 seconds.
E =
    0.0012
R2 =
    1.0000
Elapsed time is 0.960279 seconds.
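As with Example 1, elmtrain and elmpredict come from the book's code. The sketch below reproduces the regression workflow in NumPy on a toy sine-fitting problem: mapminmax-style scaling to [-1, 1], a random sigmoid hidden layer, a pseudoinverse solve, and the same determination-coefficient formula used in the script. The helper names and the wider input-weight range are illustrative assumptions, not the book's implementation.

```python
import numpy as np

def scale(x, lo, hi):
    """mapminmax-style linear mapping of x from [lo, hi] to [-1, 1]."""
    return 2 * (x - lo) / (hi - lo) - 1

rng = np.random.default_rng(1)
# Toy 1-D regression target: one period of a sine wave
x = np.linspace(-np.pi, np.pi, 200)[None, :]   # 1 x 200, like the MATLAB row layout
y = np.sin(x)
xn = scale(x, x.min(), x.max())

# Train: random hidden layer (a wider weight range gives a richer sigmoid basis),
# then a single least-squares solve for the output weights
n_hidden = 20
IW = rng.uniform(-5, 5, (n_hidden, 1))
B = rng.uniform(-1, 1, (n_hidden, 1))
H = 1.0 / (1.0 + np.exp(-(IW @ xn + B)))
LW = np.linalg.pinv(H.T) @ y.T

# Predict and evaluate
y_sim = (H.T @ LW).T
E = np.mean((y_sim - y) ** 2)                  # mean squared error, cf. mse(T_sim - T_test)
# Coefficient of determination, same formula as the R2 line in the script
N = y.size
s, t = y_sim.ravel(), y.ravel()
R2 = (N * np.sum(s * t) - s.sum() * t.sum()) ** 2 / (
    (N * np.sum(s ** 2) - s.sum() ** 2) * (N * np.sum(t ** 2) - t.sum() ** 2))
print(f"E = {E:.4g}, R2 = {R2:.4f}")
```

The R2 expression is the squared Pearson correlation between predictions and targets, which is why a near-perfect fit reports a value close to 1.0000, as in the output above.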


4. Summary
The extreme learning machine (ELM), sometimes translated as "transfinite learning machine", is a machine learning method built on feedforward neural networks (FNN) and applicable to both supervised and unsupervised learning. ELM can be regarded as a special kind of FNN, or as an improvement on FNN and its backpropagation training: the weights of the hidden-layer nodes are assigned randomly (or by hand) and never updated, so the learning process only computes the output weights. A conventional ELM has a single hidden layer and, compared with other shallow learners such as the single-layer perceptron and support vector machines (SVM), is often considered to have advantages in learning speed and possibly in generalization ability. Some improved versions of ELM build or stack hidden layers with autoencoders to obtain deep structures capable of representation learning. ELM has been applied in computer vision and bioinformatics, as well as to regression problems in the earth and environmental sciences.
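Because the hidden weights are frozen and only the output weights are solved for, the hidden-layer size is the main capacity knob, which is exactly what the sweep over 50:50:500 neurons in Example 1 probes. The toy NumPy sketch below mimics that sweep on synthetic, noisy two-class data; all names and data here are hypothetical, and the exact accuracies will vary with the random draw.

```python
import numpy as np

rng = np.random.default_rng(2)
sig = lambda Z: 1.0 / (1.0 + np.exp(-Z))

def elm_accuracies(Ptr, ytr, Pte, yte, n_hidden):
    """Train one random-hidden-layer ELM; return (train accuracy, test accuracy)."""
    IW = rng.uniform(-1, 1, (n_hidden, Ptr.shape[0]))   # random input weights, fixed
    B = rng.uniform(-1, 1, (n_hidden, 1))
    Htr, Hte = sig(IW @ Ptr + B), sig(IW @ Pte + B)
    classes = np.unique(ytr)
    # Single least-squares solve for the output weights on one-hot targets
    LW = np.linalg.pinv(Htr.T) @ (ytr == classes[:, None]).T.astype(float)
    pred = lambda H: classes[np.argmax(H.T @ LW, axis=1)]
    return np.mean(pred(Htr) == ytr), np.mean(pred(Hte) == yte)

# Noisy, overlapping two-class data: 4 features x 200 samples
X = np.vstack([rng.normal(0, 1.5, (100, 4)), rng.normal(1, 1.5, (100, 4))]).T
y = np.array([1] * 100 + [2] * 100)
idx = rng.permutation(200)
tr, te = idx[:150], idx[150:]

results = {}
for n in (10, 50, 200):
    results[n] = elm_accuracies(X[:, tr], y[tr], X[:, te], y[te], n)
    print(f"hidden neurons = {n:3d}  train acc = {results[n][0]:.2f}  test acc = {results[n][1]:.2f}")
```

With 200 hidden neurons and only 150 training samples, the network can essentially interpolate the training set, so training accuracy climbs toward 100% while test accuracy on this overlapping data does not follow, which is the overfitting pattern the chapter's sweep plot is meant to reveal.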
If you are interested in the content of this chapter or want to study it in depth, I suggest working through Chapter 29 of the book. I will supplement some of these knowledge points later based on my own understanding; everyone is welcome to study and exchange ideas together.