
Neural Network Fault Detection


Posted on 4-9-2012 15:33:51



>> p11=[0.64 1.37 0.71 0.78]';
>> p12=[0.68 1.31 0.62 1.31]';
>> p21=[1.65 1.66 0.9 4.48]';
>> p22=[1.35 1.39 0.95 2.89]';
>> p31=[8.24 2.23 0.99 2]';
>> p41=[2.01 1.65 0.94 4.39]';
>> p51=[0.93 1.33 0.73 1.54]';
>> p=[p11 p12 p21 p22 p31 p41 p51];
>> t11=[0 0 0 0]';
>> t12=[0 0 0 0]';
>> t21=[1 0 0 0]';
>> t22=[1 0 0 0]';
>> t31=[0 1 0 0]';
>> t41=[0 0 1 0]';
>> t51=[0 0 0 1]';
>> t=[t11 t12 t13 t21 t22 t31 t41 t51];
Undefined function or variable 't13'.

>> t=[t11 t12 t21 t22 t31 t41 t51];
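For orientation, p collects the seven 4-element feature vectors as columns and t holds the matching 4-element target columns. The class coding below is inferred from the assignments above, not stated in the original post: the all-zero columns (t11, t12) appear to mark fault-free samples, and a single 1 in a column flags one of four fault classes.

>> size(p)   % expected: 4 7  (four features per sample, seven samples)
>> size(t)   % expected: 4 7  (one target column per sample)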
>> net=newff(minmax(p),[8,4],{'logsig','purelin'},'trainlm'),
Warning: NEWFF used in an obsolete way.
> In obs_use at 18
  In newff>create_network at 127
  In newff at 102
          See help for NEWFF to update calls to the new argument list.
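The warning only means the old, pre-R2010b argument list was used; the network is still created. A rough sketch of the updated call (assuming a toolbox version that supports the new signature), which takes the input and target data directly instead of minmax(p) and infers the output layer size from t:

>> net = newff(p, t, 8, {'logsig','purelin'}, 'trainlm');  % hidden layer of 8 neurons; output size taken from t
or, using the function that later replaced newff:
>> net = feedforwardnet(8, 'trainlm');                     % hidden transfer defaults to tansig, output to purelin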

>> net.trainparam.show=100,

net =

    Neural Network

              name: 'Custom Neural Network'
        efficiency: .cacheDelayedInputs, .flattenTime,
                    .memoryReduction
          userdata: (your custom info)

    dimensions:

         numInputs: 1
         numLayers: 2
        numOutputs: 1
    numInputDelays: 0
    numLayerDelays: 0
numFeedbackDelays: 0
numWeightElements: 76
        sampleTime: 1

    connections:

       biasConnect: [1; 1]
      inputConnect: [1; 0]
      layerConnect: [0 0; 1 0]
     outputConnect: [0 1]

    subobjects:

            inputs: {1x1 cell array of 1 input}
            layers: {2x1 cell array of 2 layers}
           outputs: {1x2 cell array of 1 output}
            biases: {2x1 cell array of 2 biases}
      inputWeights: {2x1 cell array of 1 weight}
      layerWeights: {2x2 cell array of 1 weight}

    functions:

          adaptFcn: 'adaptwb'
        adaptParam: (none)
          derivFcn: 'defaultderiv'
         divideFcn: (none)
       divideParam: (none)
        divideMode: 'sample'
           initFcn: 'initlay'
        performFcn: 'mse'
      performParam: .regularization, .normalization, .squaredWeighting
          plotFcns: {'plotperform', plottrainstate,
                    plotregression}
        plotParams: {1x3 cell array of 3 params}
          trainFcn: 'trainlm'
        trainParam: .showWindow, .showCommandLine, .show, .epochs,
                    .time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
                    .mu_inc, .mu_max

    weight and bias values:

                IW: {2x1 cell} containing 1 input weight matrix
                LW: {2x2 cell} containing 1 layer weight matrix
                 b: {2x1 cell} containing 2 bias vectors

    methods:

             adapt: Learn while in continuous use
         configure: Configure inputs & outputs
            gensim: Generate Simulink model
              init: Initialize weights & biases
           perform: Calculate performance
               sim: Evaluate network outputs given inputs
             train: Train network with examples
              view: View diagram
       unconfigure: Unconfigure inputs & outputs

    evaluate:       outputs = net(inputs)


>> net.trainparam.epoch=2000,
Warning: 'epoch' is not a legal parameter.
> In param>do_test at 63
  In param at 6
  In network.subsasgn>setTrainParam at 2024
  In network.subsasgn>network_subsasgn at 466
  In network.subsasgn at 13

(The trailing comma echoes net again; the display is identical to the one above except that .epoch now appears at the end of the trainParam list, i.e. the misspelled field was added despite the warning.)
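The field that trainlm actually accepts is epochs (plural), as listed under trainParam in the display above. A corrected sketch of the three settings, with semicolons added so net is not echoed after every assignment:

>> net.trainParam.epochs = 2000;   % maximum number of training epochs
>> net.trainParam.goal   = 1e-3;   % performance (mse) goal
>> net.trainParam.show   = 100;    % progress display interval

As the transcript below shows, once the bogus .epoch field has been written, the same "'epoch' is not a legal parameter" warning reappears on every later access to trainParam, including the train call; training still runs.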

>> net.trainparam.goal=1e-3,
Warning: 'epoch' is not a legal parameter.
> In param>do_test at 63
  In param at 6
  In network.subsasgn>setTrainParam at 2024
  In network.subsasgn>network_subsasgn at 466
  In network.subsasgn at 13


>> [net,tr]=train(net,p,t),
Warning: 'epoch' is not a legal parameter.
> In param>do_test at 63
  In param at 6
  In trainlm at 92
  In network.train at 106

(train first echoes the trained network; its structural display is identical to the one above, so only the training record tr is reproduced below.)

tr =

        trainFcn: 'trainlm'
      trainParam: [1x1 nnetParam]
      performFcn: 'mse'
    performParam: [1x1 nnetParam]
        derivFcn: 'defaultderiv'
       divideFcn: 'dividetrain'
      divideMode: 'sample'
     divideParam: [1x1 nnetParam]
        trainInd: [1 2 3 4 5 6 7]
          valInd: []
         testInd: []
            stop: 'Performance goal met.'
      num_epochs: 11
       trainMask: {[1]}
         valMask: {[0]}
        testMask: {[0]}
      best_epoch: 11
            goal: 1.0000e-03
          states: {'epoch'  'time'  'perf'  'vperf'  'tperf'  'mu'  'gradient'  'val_fail'}
           epoch: [0 1 2 3 4 5 6 7 8 9 10 11]
            time: [1x12 double]
            perf: [1x12 double]
           vperf: [NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN]
           tperf: [NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN]
              mu: [1x12 double]
        gradient: [1x12 double]
        val_fail: [0 0 0 0 0 0 0 0 0 0 0 0]
       best_perf: 5.8284e-05
      best_vperf: NaN
      best_tperf: NaN

>>
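Training stopped with 'Performance goal met.' after 11 epochs (best_perf of about 5.8e-05), so the network reproduces the seven training targets to within the 1e-3 mse goal. A minimal sketch of how the trained net could then be applied to a new measurement (pnew and its values are hypothetical; sim is the evaluation method listed in the network display above):

>> pnew = [1.70 1.60 0.92 4.40]';   % hypothetical 4-element feature vector
>> y = sim(net, pnew)               % network output; y = net(pnew) is equivalent
>> round(y)                         % compare with the target columns: an all-zero result
                                    % matches the fault-free pattern, [1 0 0 0]' the first
                                    % fault class, and so on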



Posted on 17-4-2025 12:22:07
Judging from the data you posted, it looks like you ran into some neural-network issues while working on a fault detection task. The matrix p you define appears to contain several parameter vectors, probably fault features or other related measurements, while t represents the corresponding labels or classification results. For this kind of problem the data are usually fed into a neural network for training and then used for detection. If you need further help, for example with building the network model, training it, or performing the fault detection, please describe your problem more specifically. As an automotive engineer I have a fairly deep understanding of applications in this area; if the reading above is off, please share the concrete background and requirements.