Sunday, October 26, 2014

Matlab Code for FOAGRNN in the past




Generalized Regression Networks

A generalized regression neural network (GRNN) is often used for function approximation. It has a radial basis layer and a special linear layer.

Network Architecture

The architecture for the GRNN is shown below. It is similar to the radial basis network, but has a slightly different second layer. The nprod box (code function normprod) produces S2 elements in vector n2. Each element is the dot product of a row of LW{2,1} and the input vector a1, normalized by the sum of the elements of a1. For instance, suppose that

   
      LW{2,1}= [1 -2;3 4;5 6];

      a{1} = [0.7;0.3];

Then

          aout = normprod(LW{2,1},a{1})

  aout =

          0.1000
          3.3000
          5.3000
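For readers without the toolbox, the same computation can be sketched in plain Python. This is a re-implementation of what normprod computes (row dot products normalized by the sum of a1's elements), not the toolbox code itself:

```python
import numpy as np

def normprod(LW, a):
    # Dot product of each row of LW with a, normalized by the sum of a's elements
    return LW @ a / np.sum(a)

LW21 = np.array([[1.0, -2.0],
                 [3.0, 4.0],
                 [5.0, 6.0]])
a1 = np.array([0.7, 0.3])

aout = normprod(LW21, a1)
print(aout)  # [0.1 3.3 5.3]
```

Since the elements of a1 sum to 1 here, the normalization leaves the plain dot products unchanged.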

The first layer is just like that for newrbe networks. It has as many neurons as there are input/target vectors in P. Specifically, the first-layer weights are set to P'. The bias b1 is set to a column vector of 0.8326/SPREAD. The user chooses SPREAD, the distance an input vector must be from a neuron's weight vector for that neuron's output to be 0.5.

Again, the first layer operates just like the newrbe radial basis layer described previously. Each neuron's weighted input is the distance between the input vector and its weight vector, calculated with dist. Each neuron's net input is the product of its weighted input with its bias, calculated with netprod. Each neuron's output is its net input passed through radbas. If a neuron's weight vector is equal to the input vector (transposed), its weighted input will be 0, its net input will be 0, and its output will be 1. If a neuron's weight vector is a distance of spread from the input vector, its weighted input will be spread, and its net input will be sqrt(-log(.5)) (or 0.8326). Therefore its output will be 0.5.
The second layer also has as many neurons as input/target vectors, but here LW{2,1} is set to T.
Suppose you have an input vector p close to p_i, one of the input vectors among the input vector/target pairs used in designing the layer 1 weights. This input p produces a layer 1 output a_i close to 1, which leads to a layer 2 output close to t_i, one of the targets used to form the layer 2 weights.
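The first-layer arithmetic described above (distance times bias, passed through radbas) is easy to check numerically. The following Python sketch re-implements radbas as exp(-n^2); it is an illustration, not the toolbox code:

```python
import math

def radbas(n):
    # Gaussian radial basis transfer function: a = exp(-n^2)
    return math.exp(-n ** 2)

spread = 2.0              # an arbitrary example value chosen by the user
b1 = 0.8326 / spread      # first-layer bias, as described above

# Input vector equal to the weight vector: distance 0 -> output 1
print(radbas(0.0 * b1))               # 1.0
# Input vector at distance `spread`: net input 0.8326 -> output ~0.5
print(round(radbas(spread * b1), 3))  # 0.5
```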

A larger spread leads to a large area around the input vector where layer 1 neurons will respond with significant outputs. Therefore if spread is small the radial basis function is very steep, so that the neuron with the weight vector closest to the input will have a much larger output than other neurons. The network tends to respond with the target vector associated with the nearest design input vector.
As spread becomes larger the radial basis function's slope becomes smoother and several neurons can respond to an input vector. The network then acts as if it is taking a weighted average between target vectors whose design input vectors are closest to the new input vector. As spread becomes larger more and more neurons contribute to the average, with the result that the network function becomes smoother.
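The two regimes described above can be demonstrated with a from-scratch GRNN forward pass. The sketch below is my own minimal Python re-implementation (not the toolbox code): at a small spread the network snaps to the nearest stored target, and at a large spread it tends toward the average of all targets.

```python
import numpy as np

def grnn_predict(P, T, p, spread):
    # Layer 1: distance to each stored input, scaled by the bias 0.8326/spread
    n = np.abs(np.asarray(P) - p) * (0.8326 / spread)
    a1 = np.exp(-n ** 2)                    # radbas outputs
    # Layer 2: targets weighted by a1, normalized by sum(a1) (normprod)
    return float(np.dot(T, a1) / np.sum(a1))

P = [0.0, 1.0, 2.0]
T = [0.0, 10.0, 0.0]
# Small spread: the nearest design point dominates
print(round(grnn_predict(P, T, 0.95, spread=0.05), 3))  # 10.0
# Large spread: the output tends toward the average of all targets
print(round(grnn_predict(P, T, 0.95, spread=100.0), 3)) # 3.333
```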

Design (newgrnn)

You can use the function newgrnn to create a GRNN. For instance, suppose that three input and three target vectors are defined as
   
      P = [4 5 6];
      T = [1.5 3.6 6.7];

You can now obtain a GRNN with
   
      net = newgrnn(P,T);

and simulate it with
   
      P = 4.5;
      v = sim(net,P);

You might want to try demogrn1. It shows how to approximate a function with a GRNN.
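Assuming newgrnn's default spread of 1.0, the small design example above can also be reproduced with a from-scratch forward pass. This is a Python sketch of the same computation, not the toolbox implementation:

```python
import numpy as np

def grnn_predict(P, T, p, spread=1.0):
    # Radial basis layer: distance to stored inputs times the bias 0.8326/spread
    a1 = np.exp(-(np.abs(np.asarray(P) - p) * 0.8326 / spread) ** 2)
    # Special linear layer: normprod with LW{2,1} = T
    return float(np.dot(T, a1) / np.sum(a1))

P = [4.0, 5.0, 6.0]
T = [1.5, 3.6, 6.7]
v = grnn_predict(P, T, 4.5)
print(round(v, 3))  # ~3.011, a weighted average of the stored targets
```

The prediction lands between the two nearest targets, 1.5 and 3.6, pulled slightly upward by the third design point.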

(Reference: From the Matlab Help file)

%%***********************************************

FOAGRNN Program: FOA+GRNN

 
%%************************************************
%% EMA Economic Department, Soochow University, Taipei, Taiwan
%% 
%% Pan's Original 2D-FOAGRNN
%% Use the FOA to adjust the spread of GRNN
%%
% % Copyright by W-T Pan (2011)
% % Revised by W-Y Lin (2011)

%*************************************
% Begin of Program
% Set parameters
% Clear the operating environment
clc;
clear all;
load TXY.txt;

% for testing length of TXY

LengthofInputdata=length(TXY);

% TXY;
% Input No. of Normalized Data
%  Or use mapminmax;

TrainOb=228  % No. of training observations

% LenghtofTrain=length(OP)

P = TXY(1:TrainOb,1:7);
LenghtofTrain=length(P)

 P=P'

%  Normalized the Data

 for i9=1:length(P(:,1))

    P(i9,:)=(P(i9,:)-min(P(i9,:)))/(max(P(i9,:))-min(P(i9,:)));

 end

NP=P

LtofTrNormal=length(NP);

Ltr=length(NP);

[row,col]=size(TXY);

set=row/5;   % hold out one fifth of the rows for testing
row=row-set; % remaining rows are used for training
row1=row/2;  % split the training rows into two folds

%***************************
Lth=length(TXY)
OP = TXY(1:TrainOb,1:7);
LenghtofTrain=length(OP)
NP=NP'

% for testing length of traindata1
traindata1=NP(1:row1,1:col-1);

% length(traindata1);
% for testing length of traindata2

traindata2=NP(row1+1:row,1:col-1);

%length(traindata2);
% target of traindata1

t1=NP(1:row1,col);

% target of traindata2

t2=NP(row1+1:row,col);
t1=t1'
t2=t2'
tr1=traindata1'
tr2=traindata2'

la=1;           % toggle between the two cross-validation folds
X_axis=rand();  % initial fruit fly swarm location
Y_axis=rand();
maxgen=100;     % number of iterations
% maxgen=50; 
sizepop=10;     % population size
%*********
for i=1:sizepop

X(i)=X_axis+20*rand()-10;
Y(i)=Y_axis+20*rand()-10;
D(i)=(X(i)^2+Y(i)^2)^0.5;
S(i)=1/D(i);

%***

g=0;
p=S(i); % Spread of the GRNN to be learned

if p<0.001   % guard against a near-zero spread
  p=1;
end

% Cross validation

if la == 1

  net=newgrnn(tr1,t1,p);
  yc=sim(net,tr2);

  y=yc-t2;%

  for ii=1:row1
    g=g+y(ii)^2;
  end
 
Smell(i)=(g/row1)^0.5; % RMSE

la=2;

 else

  net=newgrnn(tr2,t2,p);
  yc=sim(net,tr1);

  y=yc-t1;%

  for ii=1:row1
    g=g+y(ii)^2;
  end

  Smell(i)=(g/row1)^0.5; % RMSE
 
la=1;

 end

end

%***

[bestSmell bestindex]=min(Smell);

%%
X_axis=X(bestindex);
Y_axis=Y(bestindex);
bestS=S(bestindex);
Smellbest=bestSmell;
%

for gen=1:maxgen

gen

bestS

  for i=1:sizepop

 %

  g=0;

  X(i)=X_axis+20*rand()-10;
  Y(i)=Y_axis+20*rand()-10;

  %

  D(i)=(X(i)^2+Y(i)^2)^0.5;

  %

  S(i)=1/D(i);

  %
  p=S(i); % Spread of the GRNN to be learned

  if p<0.001   % guard against a near-zero spread
    p=1;
  end

% Cross validation

if la == 1
  net=newgrnn(tr1,t1,p);
  yc=sim(net,tr2);

  y=yc-t2;%

  for ii=1:row1
    g=g+y(ii)^2;
  end

  Smell(i)=(g/row1)^0.5;  % RMSE
 
la=2;

 else

  net=newgrnn(tr2,t2,p);
  yc=sim(net,tr1);

  y=yc-t1;

  for ii=1:row1
    g=g+y(ii)^2;
  end

  Smell(i)=(g/row1)^0.5; 

  la=1;

 end

end
   
  %***

  [bestSmell bestindex]=min(Smell); % find the min of RMSE

  %***
   if bestSmell<Smellbest

         X_axis=X(bestindex);
         Y_axis=Y(bestindex);
         bestS=S(bestindex);
         Smellbest=bestSmell;
   end

   %

   yy(gen)=Smellbest;
   Xbest(gen)=X_axis;
   Ybest(gen)=Y_axis;
end

%

figure(1)

plot(yy)

title('Optimization process','fontsize',12)
xlabel('Iteration Number','fontsize',12);ylabel('RMSE','fontsize',12);

bestS
Xbest
Ybest

figure(2)

plot(Xbest,Ybest,'b.');
title('Fruit fly flying route','fontsize',14)
xlabel('X-axis','fontsize',12);ylabel('Y-axis','fontsize',12);
%*******Begin to Predict

% TestData

LengthofInputdata=length(TXY)

% Input No. of Normalized Testing Data
% LenghtofAll=length(OP)

P = TXY(1:LengthofInputdata,1:7);

% LenghtofTallData=length(P);
% Length of testing data (All Data Normalized)
% Changed Non-normalized Data into Normalized Data

P=P';

for i9=1:length(P(:,1))
  P(i9,:)=(P(i9,:)-min(P(i9,:)))/(max(P(i9,:))-min(P(i9,:)));
end

Nt=P';

% Training Data

TrainData=Nt(1:row,1:col-1);
tr=TrainData';

% tr=[tr1 tr2]
% LTr=length(tr)
% Testing Data

TestData=Nt(row+1:LengthofInputdata,1:col-1);

% predict value of testdata
% No target Y

test3=TestData';
LengthofTestData=length(TestData)
t3=TXY(row+1:LengthofInputdata,col);

% length_tr3=length(tr3);
% tt=Nt(1:row,col);

tt=[t1 t2];

% Ltt=length(tt)
% bestS for parameter p;

p=bestS;

% Put TrainData into the GRNN

net=newgrnn(tr,tt,p);

%% predict value of testdata

ytest=sim(net,test3);
Y_hat=ytest'

% length_Y_hat=length(Y_hat)
% Predicted output Y_hat normalized
Lny=length(Y_hat);
P = Y_hat(1:Lny,1);
P=P';
LenghtofTrain=length(P)

% Changed Non-normalized Data into Normalized Data

for i9=1:length(P(:,1))

  P(i9,:)=(P(i9,:)-min(P(i9,:)))/(max(P(i9,:))-min(P(i9,:)));

end

 NPP=P';

% target of testdata
Target3=t3;

save Y_hat

% End of Program
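Two building blocks the script uses repeatedly, row-wise min-max normalization and the RMSE used as the "Smell" fitness, can be sketched in Python (the helper names here are my own):

```python
import numpy as np

def minmax_rows(P):
    # Scale each row of P to [0, 1], as the i9 loops in the script do
    P = np.asarray(P, dtype=float)
    lo = P.min(axis=1, keepdims=True)
    hi = P.max(axis=1, keepdims=True)
    return (P - lo) / (hi - lo)

def rmse(yc, t):
    # Root mean squared error: the 'Smell' used to rank each candidate spread
    yc, t = np.asarray(yc, dtype=float), np.asarray(t, dtype=float)
    return float(np.sqrt(np.mean((yc - t) ** 2)))

print(minmax_rows([[4.0, 5.0, 6.0]]))          # rows scaled to [0, 1]
print(round(rmse([1.0, 2.0], [1.0, 4.0]), 4))  # 1.4142
```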

Test it!

Good Luck!

References:

1. Pan, W.-T. (2011). Fruit Fly Optimization Algorithm. Taiwan: Tsang Hai Book Publishing Co., ISBN 978-986-6184-70-3 (in Chinese).
2. Nien, Benjamin (2011). Application of Data Mining and Fruit Fly Optimization Algorithm to Construct a Financial Crisis Early Warning Model – A Case Study of Listed Companies in Taiwan. Master Thesis, Department of Economics, Soochow University, Taiwan (in Chinese). Adviser: Wei-Yuan Lin.
3. Lin, W.-Y. (2012). "A Hybrid Approach of 3D Fruit Fly Optimization Algorithm and General Regression Neural Network for Financial Distress Forecasting". Working paper, Soochow University, Taiwan, Jan. 2012.
 

Jing Si Aphorism:
The greater our generosity
the greater our blessings



Soochow University EMA

Friday, October 24, 2014

How to improve Pan's FOA program

 

 

 

Question from Universiti Malaysia Pahang below:

Dear Dr Wei-Yuan Lin

My name is Kamal from Universiti Malaysia Pahang. I am trying to
understand your Fruit Fly Optimization Algorithm and implement it to
solve the t-way testing problem.

I would really appreciate it if you could send me sample source code,
especially in Java, if you have one. FYI, I have downloaded all your
papers, but I am still having a hard time understanding, especially how
to put in the objective function based on the smell.

I have got this simple quadratic problem and intend to find the x that
gives the minimum, but I am still unable to solve it using FOA:

f(x) = (x-10)(x-2)


Ans: You are looking for the minimum instead of the maximum.

The program is modified as follows:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% Pan's Original 2D-FOA
%% EMA Economic Department, Soochow University, Taipei,Taiwan
%
% Copyright by W-T Pan (2011)
% Revised by W-Y Lin (2011)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%***Topic: How to find  the Min. value of a quadratic function

% Clear the operating environment.
tic
clc
clear

% Randomize the initial Drosophila population positions.
X_axis=10*rand();
Y_axis=10*rand();

% Set parameters
maxgen=500;  % No.of iterations
sizepop=100;  % Population size

% Start the FOA: Flies use the sense of smell to find food

for i=1:sizepop

% Each fly uses its sense of smell to search for food
% in a random direction and at a random distance

X(i)=X_axis+2*rand()-1;
Y(i)=Y_axis+2*rand()-1;

% Since a fly cannot know the exact location of the food, we first estimate its distance from the origin (D),
% then compute the smell concentration judgment value (S), which is the reciprocal of that distance.

D(i)=(X(i)^2+Y(i)^2)^0.5;
S(i)=1/D(i);

% The judgment value (S) is substituted into the fitness function to obtain the smell concentration Smell(i) of each fly.

% Smell(i)=7-S(i)^2;
Smell(i)= (S(i)-10)*(S(i)-2);

end

% Find the fly with the best smell concentration in the swarm (here, the minimum value).

[bestSmell bestindex]=min(Smell); % modified

% Retain the best concentration value and the best x, y coordinates of the flies

X_axis=X(bestindex);
Y_axis=Y(bestindex);
Smellbest=bestSmell;

% Start the Drosophila iterative optimization

for g=1:maxgen   

 for i=1:sizepop

X(i)=X_axis+2*rand()-1;
Y(i)=Y_axis+2*rand()-1;
D(i)=(X(i)^2+Y(i)^2)^0.5;
S(i)=1/D(i);
Smell(i)= (S(i)-10)*(S(i)-2);


end

[bestSmell bestindex]=min(Smell); % modified

% Determine whether the new best concentration value is better (smaller) than the previous one.
% If so, the best value and the location of that fly are retained.
% Then all flies use their vision to fly to this best position.

if bestSmell<Smellbest  % modified
X_axis=X(bestindex);
Y_axis=Y(bestindex);
Smellbest=bestSmell;
S_best=S(bestindex);

end

% Record the optimal Smell value of each generation in the yy array.
% Record the coordinates of the optimum at each iteration

yy(g)=Smellbest;
Xbest(g)=X_axis;
Ybest(g)=Y_axis;
end

% *** Draw the optimal concentration values and flight path for every iteration

figure(1)
plot(yy)
grid  on;
title('Optimization process (x-10)(x-2)','fontsize',14)
xlabel('Iteration Number','fontsize',12);ylabel('Smell','fontsize',14);

figure(2)
plot(Xbest,Ybest,'b.');
grid on;
title('Fruit fly flying route (x-10)(x-2)','fontsize',14)
xlabel('X-axis','fontsize',12);ylabel('Y-axis','fontsize',12);

% pause(0.5)

S_best

% yy
Smellbest
toc
%%******************************************

Simulated Outputs:

S_best =     5.9991

Smellbest =   -16.0000

Elapsed time is 0.441575 seconds.

%%*********************************
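For readers working outside Matlab, the same minimization loop can be sketched in Python. This is a re-implementation of the logic above (variable names are my own); the true minimum of (s-10)(s-2) is -16 at s = 6:

```python
import math
import random

random.seed(1)

def smell(s):
    # Fitness from the question, minimized at s = 6 where f(6) = -16
    return (s - 10) * (s - 2)

maxgen, sizepop = 500, 100
x_axis, y_axis = 10 * random.random(), 10 * random.random()
smell_best, s_best = float("inf"), None

for gen in range(maxgen):
    flies = []
    for _ in range(sizepop):
        x = x_axis + 2 * random.random() - 1   # random smell-search step
        y = y_axis + 2 * random.random() - 1
        s = 1 / math.hypot(x, y)               # concentration judgment value
        flies.append((smell(s), x, y, s))
    best = min(flies)                          # lowest smell this generation
    if best[0] < smell_best:                   # vision: move swarm to the best fly
        smell_best, x_axis, y_axis, s_best = best

print(round(s_best, 3), round(smell_best, 3))  # converges to ~6.0 and ~-16.0
```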

The case of more than one variable:

Dear kamal :

You cannot find the minimum value of f(x)=(x1-10)(x2-2) by any method, including FOA. It does not exist: the function is unbounded below. You can try another function.

First, I will give you the modified Pan's Matlab FOA program below; please find its defect, then email me. I will offer you a correct program to solve it.
 
Pan's 3D-FOA Program:

%*****************************************************************
% Pan's  3D-FOA
% EMA Economic Department, Soochow University, Taipei,Taiwan
% Copyright by W-T Pan (2011)
% Revised by W-Y Lin (2011)
%******************************************************************
%  A defective QP with two variables
 % Finding Min  Smell(i) = f(x1,x2) = (x1(i)-10)^2 + (x2(i)-10)^2
 % Optimal solution: x1* = 10, x2* = 10, f*(x1,x2) = 0

%***
tic
clc
clear

pp=20;
X_axis=pp*rand();
Y_axis=pp*rand();
Z_axis=pp*rand();

%
maxgen=500;
sizepop=100; 

%***
for i=1:sizepop

%%***
X(i)=X_axis+2*rand()-1;
Y(i)=Y_axis+2*rand()-1;
Z(i)=Z_axis+2*rand()-1;

%***
D(i,1)=(X(i)^2+Y(i)^2+Z(i)^2)^0.5;
D(i,2)=(X(i)^2+Y(i)^2+Z(i)^2)^0.5;
x1(i)=1/D(i,1);
x2(i)=1/D(i,2);

%***
%
% f(x) = x^2 - 12x + 20 = (x-10)(x-2)
% unbounded solution
 % Smell(i)=(x1(i)-10)^2+(x2(i)-2)^2;

 Smell(i)=(x1(i)-10)^2+(x2(i)-10)^2;
end
%***
[bestSmell bestindex]=min(Smell);

%***
X_axis=X(bestindex);
Y_axis=Y(bestindex);
Z_axis=Z(bestindex);
Smellbest=bestSmell;

% add xValueBest,yValueBest
xValueBest=x1(bestindex);
yValueBest=x2(bestindex);

%***
for g=1:maxgen   
%***
  for i=1:sizepop
 
  X(i)=X_axis+2*rand()-1;
  Y(i)=Y_axis+2*rand()-1;
  Z(i)=Z_axis+2*rand()-1;
 
  
  D(i,1)=(X(i)^2+Y(i)^2+Z(i)^2)^0.5;
  D(i,2)=(X(i)^2+Y(i)^2+Z(i)^2)^0.5;

%***
  
   x1(i)=1/D(i,1);
   x2(i)=1/D(i,2);
  
   Smell(i)=(x1(i)-10)^2+(x2(i)-10)^2;
  
  end
 
  %***
  [bestSmell bestindex]=min(Smell);
  %***
 
if bestSmell<Smellbest
    X_axis=X(bestindex);
    Y_axis=Y(bestindex);
    Z_axis=Z(bestindex);
    Smellbest=bestSmell;

% add xValueBest,yValueBest
    xValueBest=x1(bestindex);
    yValueBest=x2(bestindex);
  
   end

%***
    optimalobj(g)=Smellbest;
    Xbest(g)=X_axis;
    Ybest(g)=Y_axis;
    Zbest(g)=Z_axis;
   
    xVBest(g)=xValueBest;
    yVBest(g)=yValueBest;
  
end

%***
figure(1)

subplot(2,2,1)
plot(optimalobj);
grid on;
title(' QP function ','fontsize',14)
xlabel('Evolution','fontsize',12);ylabel('Objective Function','fontsize',12);

subplot(2,2,2)
plot3(Xbest,Ybest,Zbest,'b.');
grid on;
title('Fruit fly flying route','fontsize',14)
xlabel('X-axis','fontsize',12);ylabel('Y-axis','fontsize',12);zlabel('Z-axis','fontsize',12);

subplot(2,2,3)
plot(xVBest,'b.')
grid on;
title('x1','fontsize',14)
xlabel('Evolution','fontsize',12);ylabel('Value','fontsize',12);

subplot(2,2,4)
plot(yVBest,'b.')
grid on;
title('x2','fontsize',14)
xlabel('Evolution','fontsize',12);ylabel('Value','fontsize',12);

yValueBest
xValueBest
optimalobj
toc

%%**********************************************************************

Reference: Nien, Benjamin (2011). Application of Data Mining and Fruit Fly Optimization Algorithm to Construct a Financial Crisis Early Warning Model – A Case Study of Listed Companies in Taiwan. Master Thesis, Department of Economics, Soochow University, Taiwan (in Chinese). Adviser: Wei-Yuan Lin.


Jing Si Aphorism:

To give with joy is to help others with a happy mood

Soochow University EMA
 

Wednesday, October 22, 2014

FOA for constrained evolutionary optimization


 

Solving Problem G6, f(x)

%%*********************************
 % Matlab Code by A. Hedar (Nov. 23, 2005).
 % Min y = (x(1)-10)^3+(x(2)-20)^3;

  % Constraints

% y(1) = -(x(1)-5)^2-(x(2)-5)^2+100

% y(2) = (x(1)-6)^2+(x(2)-5)^2-82.81;

 % Variable lower bounds
% y(3) = -x(1)+13;

% y(4) = -x(2);

 % Variable upper bounds

% y(5) = x(1)-100;

% y(6) = x(2)-100;
 

Optimal value: x* = (14.095, 0.84296), f* = -6961.81
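The stated optimum can be verified directly: at x* both nonlinear constraints hold with equality, i.e. they are active. A quick Python check:

```python
def f(x1, x2):
    return (x1 - 10) ** 3 + (x2 - 20) ** 3

def g1(x1, x2):
    # Feasible when g1 <= 0
    return -(x1 - 5) ** 2 - (x2 - 5) ** 2 + 100

def g2(x1, x2):
    # Feasible when g2 <= 0
    return (x1 - 6) ** 2 + (x2 - 5) ** 2 - 82.81

x1s, x2s = 14.095, 0.84296
print(round(f(x1s, x2s), 2))   # -6961.81
print(round(g1(x1s, x2s), 3))  # ~0.0: first constraint active
print(round(g2(x1s, x2s), 3))  # ~0.0: second constraint active
```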



 %%*********************************

Case 1:

Clever 3D-FOA (More efficient):

Output:

Optimal value x* = (13.1708, 0.0017), f* = -7966.1
%%************************************************

Original 3D-FOA:

Output:

Optimal value x*= (13.0467, 0.7330), f*= -7123.9

 %%**************************************************************
 

Case 2:

% Min y = (x(1)-10)^3+(x(2)-20)^3;

 % Constraints

%  -(x(1)-5)^2-(x(2)-5)^2 + 100 <=0
%  (x(1)-6)^2+(x(2)-5)^2-82.81 <=0;

 % Variable lower bounds
%  -x(1)+13 <=0 ;
%  -x(2) <=0 ;  ;

% Variable upper bounds
%  x(1)-100 <=0 ;
%  x(2)-100 <=0 ;
%%********************
Output:

Optimal value x*= (14.0989, 0.8514), f*= -6952.3

Smellbest1 = -6.9523e+03

x1Best = 14.0989
x2Best = 0.8514
g1 = -1.7997
g2 = -0.0140
%%*********************************

PS: After my paper is published, I will put these programs on my blog.

 
Jing Si Aphorism:
 
 
Willing to think,
cultivate ourselves,
and take mindful action,  
there is nothing
we cannot achieve.
 
 
 
 
 Soochow University EMA