FANN xor_test - eiichiromomma/CVMLAB GitHub Wiki

(FANN) xor_test

A test of XOR: load a trained network from a file and evaluate it on the XOR patterns.
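
This program expects two files to already exist in the working directory: xor_float.net (a network trained on XOR and saved to disk) and xor.data (the XOR input/output patterns). The xor.data file itself is not shown on this page; assuming the standard FANN training-data layout (a header line with the number of patterns, inputs and outputs, followed by alternating input and output lines), a file matching the results shown later would look like this:

    4 2 1
    -1 -1
    -1
    -1 1
    1
    1 -1
    1
    1 1
    -1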

Source

    #include <stdio.h>
    #include "fann.h"
    int main()
    {
      fann_type *calc_out;
      unsigned int i;
      int ret = 0;
      struct fann *ann;
      struct fann_train_data *data;

      printf("Creating network.\n");

      //Load the network (from a file)
      ann = fann_create_from_file("xor_float.net");
      //Error handling
      if(!ann)
        {
          printf("Error creating ann --- ABORTING.\n");
          return -1;
        }
      //Print the connections
      fann_print_connections(ann);
      //Print the network parameters
      fann_print_parameters(ann);

      printf("Testing network.\n");
      //Load the test data (from a file)
      data = fann_read_train_from_file("xor.data");

      for(i = 0; i < fann_length_train_data(data); i++)
        {
          //Reset the mean squared error
          fann_reset_MSE(ann);
          //Run one test pattern through the network
          calc_out = fann_test(ann, data->input[i], data->output[i]);
          printf("XOR test (%f, %f) -> %f, should be %f, difference=%f\n",
                 data->input[i][0], data->input[i][1], calc_out[0], data->output[i][0],
                 (float) fann_abs(calc_out[0] - data->output[i][0]));
        }
      //Clean up
      printf("Cleaning up.\n");
      fann_destroy_train(data);
      fann_destroy(ann);

      return ret;
    }
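
For reference, the xor_float.net file loaded above has to be produced by a separate training run. The following is only a minimal sketch of such a training program, using standard FANN C API calls (fann_create_standard, fann_train_on_file, fann_save); the layer sizes, epoch limits and target error are illustrative values, not taken from this page:

    #include "fann.h"

    int main()
    {
      //2 inputs, one hidden layer with 3 neurons, 1 output
      struct fann *ann = fann_create_standard(3, 2, 3, 1);

      //Symmetric sigmoids, so inputs and outputs can range over -1..1
      fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
      fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

      //Train on the data file until the desired error (or the epoch limit) is reached
      fann_train_on_file(ann, "xor.data", 500000, 1000, 0.0001);

      //Save the trained network for xor_test to load
      fann_save(ann, "xor_float.net");

      fann_destroy(ann);
      return 0;
    }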

Execution results

Creating network.
Layer / Neuron 0123456
L   1 / N    3 CCc....
L   1 / N    4 CCC....
L   1 / N    5 CCC....
L   1 / N    6 .......
L   2 / N    7 ...dBBd
L   2 / N    8 .......
Input layer                          :   2 neurons, 1 bias
  Hidden layer                       :   3 neurons, 1 bias
Output layer                         :   1 neurons
Total neurons and biases             :   8
Total connections                    :  13
Connection rate                      :   1.000
Shortcut connections                 :   0
Training algorithm                   :   FANN_TRAIN_RPROP
Training error function              :   FANN_ERRORFUNC_TANH
Training stop function               :   FANN_STOPFUNC_BIT
Learning rate                        :   0.700
Learning momentum                    :   0.000
Quickprop decay                      :  -0.000100
Quickprop mu                         :   1.750
RPROP increase factor                :   1.200
RPROP decrease factor                :   0.500
RPROP delta min                      :   0.000
RPROP delta max                      :  50.000
Cascade output change fraction       :   0.010000
Cascade candidate change fraction    :   0.010000
Cascade output stagnation epochs     :  12
Cascade candidate stagnation epochs  :  12
Cascade max output epochs            : 150
Cascade max candidate epochs         : 150
Cascade weight multiplier            :   0.400
Cascade candidate limit              :1000.000
Cascade activation functions[0]      :   FANN_SIGMOID
Cascade activation functions[1]      :   FANN_SIGMOID_SYMMETRIC
Cascade activation functions[2]      :   FANN_GAUSSIAN
Cascade activation functions[3]      :   FANN_GAUSSIAN_SYMMETRIC
Cascade activation functions[4]      :   FANN_ELLIOT
Cascade activation functions[5]      :   FANN_ELLIOT_SYMMETRIC
Cascade activation steepnesses[0]    :   0.250
Cascade activation steepnesses[1]    :   0.500
Cascade activation steepnesses[2]    :   0.750
Cascade activation steepnesses[3]    :   1.000
Cascade candidate groups             :   2
Cascade no. of candidates            :  48
Testing network.
XOR test (-1.000000, -1.000000) -> -0.982257, should be -1.000000, difference=0.017743
XOR test (-1.000000, 1.000000) -> 0.990769, should be 1.000000, difference=0.009231
XOR test (1.000000, -1.000000) -> 0.991142, should be 1.000000, difference=0.008858
XOR test (1.000000, 1.000000) -> -0.988301, should be -1.000000, difference=0.011699
Cleaning up.

Judging from the connection printout, the weights are spread fairly evenly. This run just happened to work out well.
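
As a side note, fann_test also accumulates the mean squared error inside the network structure. If an overall error figure is wanted in addition to the per-pattern differences above, a small sketch using fann_test_data (which runs every pattern through fann_test and returns the MSE) could look like this; it reuses the same file names as the listing above:

    #include <stdio.h>
    #include "fann.h"

    int main()
    {
      struct fann *ann = fann_create_from_file("xor_float.net");
      struct fann_train_data *data = fann_read_train_from_file("xor.data");
      if(!ann || !data)
        return -1;

      //fann_test_data feeds every pattern through the network and returns the MSE
      printf("MSE on xor.data: %f\n", fann_test_data(ann, data));

      fann_destroy_train(data);
      fann_destroy(ann);
      return 0;
    }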
