>> p11=[0.64 1.37 0.71 0.78]';
>> p12=[0.68 1.31 0.62 1.31]';
>> p21=[1.65 1.66 0.9 4.48]';
>> p22=[1.35 1.39 0.95 2.89]';
>> p31=[8.24 2.23 0.99 2]';
>> p41=[2.01 1.65 0.94 4.39]';
>> p51=[0.93 1.33 0.73 1.54]';
>> p=[p11 p12 p21 p22 p31 p41 p51];
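Each vector p11 ... p51 holds the four evaluation indicators of one sample, and p stacks the seven samples as columns, so p is a 4-by-7 input matrix (rows = indicators, columns = samples). A quick dimensional check, as a sketch using only the variables defined above:

size(p)    % returns [4 7]: 4 indicators per sample, 7 training samples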
>> t11=[0 0 0 0]';
>> t12=[0 0 0 0]';
>> t21=[1 0 0 0]';
>> t22=[1 0 0 0]';
>> t31=[0 1 0 0]';
>> t41=[0 0 1 0]';
>> t51=[0 0 0 1]';
>> t=[t11 t12 t13 t21 t22 t31 t41 t51];
Undefined function or variable 't13'.
>> t=[t11 t12 t21 t22 t31 t41 t51];
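The targets use a 4-bit class code over the four output neurons: the two samples of the first class (p11, p12) are coded [0 0 0 0]', and the remaining four classes are coded with the unit vectors [1 0 0 0]', [0 1 0 0]', [0 0 1 0]' and [0 0 0 1]'. The error above was only a typo (a vector t13 was never defined), and the corrected assignment yields a 4-by-7 target matrix whose columns follow the order p11 p12 p21 p22 p31 p41 p51. A sanity check of that layout, again only as a sketch:

% verify the target coding described above
isequal(t, [0 0 1 1 0 0 0; 0 0 0 0 1 0 0; 0 0 0 0 0 1 0; 0 0 0 0 0 0 1])   % returns true (logical 1)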
>> net=newff(minmax(p),[8,4],{'logsig','purelin'},'trainlm'),
Warning: NEWFF used in an obsolete way.
> In obs_use at 18
In newff>create_network at 127
In newff at 102
See help for NEWFF to update calls to the new argument list.
>> net.trainparam.show=100,
net =
Neural Network
name: 'Custom Neural Network'
efficiency: .cacheDelayedInputs, .flattenTime,
.memoryReduction
userdata: (your custom info)
dimensions:
numInputs: 1
numLayers: 2
numOutputs: 1
numInputDelays: 0
numLayerDelays: 0
numFeedbackDelays: 0
numWeightElements: 76
sampleTime: 1
connections:
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
subobjects:
inputs: {1x1 cell array of 1 input}
layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}
functions:
adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: (none)
divideParam: (none)
divideMode: 'sample'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization, .squaredWeighting
plotFcns: {'plotperform', 'plottrainstate', 'plotregression'}
plotParams: {1x3 cell array of 3 params}
trainFcn: 'trainlm'
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
methods:
adapt: Learn while in continuous use
configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
evaluate: outputs = net(inputs)
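The NEWFF warning above only concerns the calling convention: passing minmax(p) together with an explicit output-layer size is the pre-R2010b argument list, and the call still builds the intended 4-8-4 network (logsig hidden layer, purelin output layer, trainlm training, 76 weight and bias elements). On a release that provides feedforwardnet, a roughly equivalent modern construction would be the sketch below (an assumption about the reader's toolbox version, not part of the original session); the input and output sizes are inferred from the data when train is called.

% Modern replacement for the obsolete newff call (sketch)
net = feedforwardnet(8, 'trainlm');      % 8 hidden neurons, Levenberg-Marquardt training
net.layers{1}.transferFcn = 'logsig';    % hidden-layer transfer function, as above
net.layers{2}.transferFcn = 'purelin';   % output-layer transfer function, as above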
>> net.trainparam.epoch=2000,
Warning: 'epoch' is not a legal parameter.
> In param>do_test at 63
In param at 6
In network.subsasgn>setTrainParam at 2024
In network.subsasgn>network_subsasgn at 466
In network.subsasgn at 13
net =
Neural Network
name: 'Custom Neural Network'
efficiency: .cacheDelayedInputs, .flattenTime,
.memoryReduction
userdata: (your custom info)
dimensions:
numInputs: 1
numLayers: 2
numOutputs: 1
numInputDelays: 0
numLayerDelays: 0
numFeedbackDelays: 0
numWeightElements: 76
sampleTime: 1
connections:
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
subobjects:
inputs: {1x1 cell array of 1 input}
layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}
functions:
adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: (none)
divideParam: (none)
divideMode: 'sample'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization, .squaredWeighting
plotFcns: {'plotperform', 'plottrainstate', 'plotregression'}
plotParams: {1x3 cell array of 3 params}
trainFcn: 'trainlm'
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max, .epoch
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
methods:
adapt: Learn while in continuous use
configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
evaluate: outputs = net(inputs)
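The warning printed here, and again before every later trainParam access, comes from a typo: the legal trainlm parameter is epochs, not epoch. The assignment is nevertheless stored as an extra field (the trainParam list in the display above now ends with .epoch), so the intended epoch limit is never applied and training keeps its default maximum. The intended settings, written with the standard parameter names, would be:

% Corrected trainlm parameter names (sketch matching the settings attempted above)
net.trainParam.show   = 100;     % progress report interval (iterations)
net.trainParam.epochs = 2000;    % maximum number of training epochs
net.trainParam.goal   = 1e-3;    % target mean squared error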
>> net.trainparam.goal=1e-3,
Warning: 'epoch' is not a legal parameter.
> In param>do_test at 63
In param at 6
In network.subsasgn>setTrainParam at 2024
In network.subsasgn>network_subsasgn at 466
In network.subsasgn at 13
>> [net,tr]=train(net,p,t),
Warning: 'epoch' is not a legal parameter.
> In param>do_test at 63
In param at 6
In trainlm at 92
In network.train at 106
tr =
trainFcn: 'trainlm'
trainParam: [1x1 nnetParam]
performFcn: 'mse'
performParam: [1x1 nnetParam]
derivFcn: 'defaultderiv'
divideFcn: 'dividetrain'
divideMode: 'sample'
divideParam: [1x1 nnetParam]
trainInd: [1 2 3 4 5 6 7]
valInd: []
testInd: []
stop: 'Performance goal met.'
num_epochs: 11
trainMask: {[1]}
valMask: {[0]}
testMask: {[0]}
best_epoch: 11
goal: 1.0000e-03
states: {'epoch' 'time' 'perf' 'vperf' 'tperf' 'mu' 'gradient' 'val_fail'}
epoch: [0 1 2 3 4 5 6 7 8 9 10 11]
time: [1x12 double]
perf: [1x12 double]
vperf: [NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN]
tperf: [NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN]
mu: [1x12 double]
gradient: [1x12 double]
val_fail: [0 0 0 0 0 0 0 0 0 0 0 0]
best_perf: 5.8284e-05
best_vperf: NaN
best_tperf: NaN
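Training stops after 11 epochs with 'Performance goal met.': best_perf (about 5.83e-05) is already below the 1e-3 goal, and since divideFcn is 'dividetrain' all seven samples go into the training set, which is why vperf and tperf stay NaN. The trained network can now be evaluated on the training inputs, or on a new indicator vector, as in the sketch below (xnew is a hypothetical sample, not taken from the session):

% Evaluate the trained network (sketch)
y = sim(net, p);                  % outputs for the 7 training samples; compare with t
xnew = [1.40 1.50 0.90 2.50]';    % hypothetical new sample with 4 indicators
ynew = sim(net, xnew)             % class code predicted for the new sample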