MATLAB Feedforward Neural Networks
Getting started with deep learning: understanding neural network basics
Note: see also the related network types fitnet (function fitting) and patternnet (pattern recognition).
feedforwardnet
net = feedforwardnet;
General syntax:

feedforwardnet(hiddenSizes,trainFcn)

hiddenSizes is a vector giving the number of neurons in each hidden layer; trainFcn is the training function (default 'trainlm').
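As a minimal sketch of this syntax, the following creates a network with two hidden layers (10 and 5 neurons) and trains it on an illustrative curve-fitting task; the data here are made up for demonstration:

```matlab
% Two hidden layers of 10 and 5 neurons, Levenberg-Marquardt training.
net = feedforwardnet([10 5], 'trainlm');

% Toy fitting problem: learn x -> sin(x).
x = linspace(-pi, pi, 100);
t = sin(x);

net = train(net, x, t);   % train with the default data division
y  = net(x);              % evaluate the trained network on the inputs
```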
info
>> net

    dimensions:
             numInputs: 1    % one input (one concurrent input)
             numLayers: 2    % two layers: one hidden layer, one output layer
            numOutputs: 1    % one output layer
        % numInputDelays: 0
        % numLayerDelays: 0
        % numFeedbackDelays: 0
        % numWeightElements: 10
        % sampleTime: 1

    connections:
           biasConnect: [1; 1]      % vector: both layers have a bias
          inputConnect: [1; 0]      % vector: the input connects only to layer 1
          layerConnect: [0 0; 1 0]  % matrix: rows are destination layers, columns are
                                    % source layers; only layer 1 connects to layer 2
         outputConnect: [0 1]       % vector: marks which layer is the output layer

    subobjects:
                 input: Equivalent to inputs{1}
                output: Equivalent to outputs{2}  % all subobjects are structs stored in cell arrays
                inputs: {1x1 cell array of 1 input}
                layers: {2x1 cell array of 2 layers}
               outputs: {1x2 cell array of 1 output}
                biases: {2x1 cell array of 2 biases}
          inputWeights: {2x1 cell array of 1 weight}
          layerWeights: {2x2 cell array of 1 weight}

    functions:
              adaptFcn: 'adaptwb'
            adaptParam: (none)
              derivFcn: 'defaultderiv'
             divideFcn: 'dividerand'
           divideParam: .trainRatio, .valRatio, .testRatio
            divideMode: 'sample'
               initFcn: 'initlay'
            performFcn: 'mse'
          performParam: .regularization, .normalization
              plotFcns: {'plotperform', plottrainstate, ploterrhist, plotregression}
            plotParams: {1x4 cell array of 4 params}
              trainFcn: 'trainlm'   % training function
            trainParam: .showWindow, .showCommandLine, .show, .epochs,
                        .time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
                        .mu_inc, .mu_max

    weight and bias values:
                    IW: {2x1 cell} containing 1 input weight matrix
                    LW: {2x2 cell} containing 1 layer weight matrix
                     b: {2x1 cell} containing 2 bias vectors

    methods:
                 adapt: Learn while in continuous use
             configure: Configure inputs & outputs
                gensim: Generate Simulink model
                  init: Initialize weights & biases
               perform: Calculate performance
                   sim: Evaluate network outputs given inputs
                 train: Train network with examples
                  view: View diagram
           unconfigure: Unconfigure inputs & outputs

    evaluate: outputs = net(inputs)
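Most of these properties can be read and modified directly before training. A minimal sketch (parameter values here are illustrative, not recommendations):

```matlab
net = feedforwardnet(10);

net.layers{1}.size                 % number of neurons in the hidden layer
net.trainParam.epochs = 500;       % raise the epoch limit
net.divideParam.trainRatio = 0.8;  % adjust the train/validation/test split
net.divideParam.valRatio   = 0.1;
net.divideParam.testRatio  = 0.1;

view(net)                          % show the network diagram
```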
layers
>> net.layers{1}

              name: 'Hidden'  % user-settable layer name
        dimensions: 10        % neuron arrangement (relevant for topologies such as SOMs);
                              % for a feedforward layer it matches size
       distanceFcn: (none)
     distanceParam: (none)
         distances: []
           initFcn: 'initnw'
       netInputFcn: 'netsum'  % net input function: sums the weighted inputs plus the bias
     netInputParam: (none)
         positions: []
             range: [10x2 double]
              size: 10        % number of neurons; net.layers{i}.size
       topologyFcn: (none)
       transferFcn: 'tansig'  % transfer function; net.layers{i}.transferFcn
     transferParam: (none)
          userdata: (your custom info)
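To see what 'dotprod', 'netsum', and 'tansig' actually compute for layer 1, here is a minimal hand-rolled sketch (the weights and input are random, for illustration only):

```matlab
% Layer 1 computes n = W*x + b, then a = tansig(n).
W = rand(10, 1);   % input weight matrix: 10 neurons, 1 input
b = rand(10, 1);   % bias vector
x = 0.5;           % one input sample

n = W*x + b;       % 'netsum' of the 'dotprod' weighted input plus the bias
a = tansig(n);     % tansig(n) = 2./(1 + exp(-2*n)) - 1
```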
layerWeights
This corresponds to layerConnect: [0 0; 1 0] in the info listing above: rows are destination layers, columns are source layers, and only layer 1 connects to layer 2.
>> net.layerWeights{2,1}

           delays: 0
          initFcn: (none)
     initSettings: .range
            learn: true
         learnFcn: 'learngdm'
       learnParam: .lr, .mc
             size: [0 10]
        weightFcn: 'dotprod'  % weight function
      weightParam: (none)
         userdata: (your custom info)
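These per-layer and per-weight settings can be customized before training. A minimal sketch, with illustrative random data used only to size the weights:

```matlab
net = feedforwardnet(10);

net.layers{1}.transferFcn      = 'logsig';   % swap tansig for logsig
net.layerWeights{2,1}.learnFcn = 'learngd';  % gradient descent without momentum

% configure sizes the weights to match example inputs/targets.
x = rand(1, 20);
t = rand(1, 20);
net = configure(net, x, t);

net.LW{2,1}   % inspect the (now 1x10) layer weight matrix
```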
Source: https://www.yuejiaxmz.com/news/view/440973