Assumption University
Bangkok, Thailand
November 1996
GENETIC
FORECASTING ALGORITHM
by
Somsong Chiraphadhanakul

Submitted to
Qualifying Examination Committee
Graduate School of Computer Information Systems
Assumption University, Bangkok, Thailand.
In partial fulfillment of the requirements for
THE DOCTORAL
DEGREE
IN
COMPUTER INFORMATION SYSTEMS
Approval Page
RESEARCH TITLE : GENETIC FORECASTING ALGORITHM
CANDIDATE NAME : Ms. Somsong Chiraphadhanakul
ADVISOR NAME : Dr. Vichat Avatchanakorn
ACADEMIC YEAR : 1996
GENETIC FORECASTING ALGORITHM
Abstract
 Recently, genetic algorithms (GAs) have been used more frequently in business, scientific, and engineering applications. The increasing popularity of GAs is due to their adaptability and simplicity. In this research, a new forecasting algorithm using features of GAs is proposed. With GAs' blind search, the algorithm takes into account every related component assigned to it. The algorithm combines the powerful search of GAs with their capability to learn the relationship patterns of past data in order to forecast future values.
 The genetic forecasting algorithm consists of two steps: a forecasting step and a learning step. GAs are used in the forecasting step to estimate parameters of the problem domain. Pattern learning is incorporated into the algorithm to capture the pattern relationships of the learning data. The effectiveness of the algorithm is examined by applying it to two problem domains: commercial bank deposits and bankruptcy prediction. The investigation is performed by computer simulation.

Table of Contents

Abstract
List of Figures
Introduction
Problem Definition
Fundamentals of GAs
Literature Review
Genetic Forecasting Algorithms
Conclusion and Recommendation
Work Plan
References

LIST OF FIGURES

Figure 1 Genetic Forecasting Algorithm
Figure 2 Genetic Forecasting Loop
Figure 3 Gantt chart showing the schedule of the project plan
INTRODUCTION
 Genetic algorithms (GAs) are search procedures based on natural genetic evolution. Since their theoretical development by J. Holland [1] in 1975, these procedures have come to be viewed as general-purpose optimization methods. Moreover, they are employed in difficult search, optimization, and machine-learning problems. The application of these search procedures is increasing over a wide range. There are several reasons why GAs are attractive in applied work [2]: GAs can solve hard problems quickly and reliably, are easy to interface with existing simulation models, and are extensible and easy to hybridize. The basic idea of GAs is that the genetic program works toward finding better solutions to problems, just as species adapt to their environments. Recently, GAs have been used more frequently in business, scientific, and engineering applications [3]. Their increasing popularity is due to their adaptability and simplicity.
 Forecasting is an important function for planning in various areas such as control engineering, electric power systems, economics, business strategy development, production, and finance. There are various forecasting techniques based on statistical approaches, such as moving averages, exponential smoothing, time series, regression, and econometric modeling. Some of these traditional techniques are limited in their scope, and their predictions are frequently of low accuracy [4]. Forecasters always look for ways to improve their forecasts, which means alternative forecasting techniques are continually being examined. In the last decade, artificial intelligence (AI) concepts, including knowledge engineering and expert systems, fuzzy systems, neural networks, and genetic algorithms, have been introduced to the task of forecasting.
 In this research, a forecasting method using genetic algorithms (GAs) is introduced. With GAs' blind search, the algorithm takes every component into account. The algorithm combines the powerful search of GAs with their capacity to learn the relationship patterns of past data in order to forecast future values. Conceptually, the genetic forecasting algorithm is composed of two loops: the genetic forecasting loop and the pattern learning loop. The genetic forecasting loop aims at minimizing the error between the actual value and the forecast value as well as minimizing the average pattern error. The pattern learning loop serves as a means of pattern relationship learning, which enhances the predictive power of the algorithm.
PROBLEM
DEFINITION
 Forecasting plays a most important role in doing business. Top management needs forecasts for planning and implementing long-term strategy. Financial management in both private and public organizations operates under uncertainty or risk; therefore, forecasting is fundamentally needed. The traditional forecasting techniques suffer from several shortcomings:
 1. In the multiple regression method, there is more than one independent variable in the equation, and these variables may be highly correlated with one another. Multicollinearity occurs when these variables interfere with each other, and the highly correlated variables are left out. This may create a marginal error, since the missing components may carry significant information for the forecasting process.
 2. In the time series method, a large data set is required to extrapolate past behavior, and the causal relationships with other factors are not revealed. Thus, in this method, some important indicators may be left out.
 3. In artificial neural networks, the problem is entrapment in a local minimum, which often occurs with a gradient descent algorithm such as backpropagation or with the energy minimization of the Hopfield network [5]. In addition, neural networks are modeled on the human brain and learn by themselves from the given patterns. Consequently, the resulting models cannot explain the physical characteristics of the problem domain under study.
 To overcome these drawbacks, a forecasting method using genetic algorithms is introduced. With the use of GAs, the search process can avoid differentiation and can escape from local minima. In GAs' blind search, the algorithm takes into account every related component. The algorithm combines the powerful search of GAs with their capability to learn the relationship patterns of past data in order to forecast future values. The objective of this research is to propose a new forecasting algorithm that can serve as a guideline for employing various types of models to simulate different agency environments.
FUNDAMENTALS OF SIMPLE GAs
 Genetic algorithms (GAs) are search procedures based on natural genetic evolution. The principle behind GAs is essentially the Darwinian principle of biological evolution, in the sense that within a population, those elements which are stronger or fitter have a better ability to reproduce the next generation of elements.
 The basic idea of GAs is to maintain a constant-size set of candidate solutions to the current problem. Candidates in the population compete with one another over many iterations to attain the best solution to the objective problem, under the control of the GA operators. Theoretical analyses have shown that GAs exploit the knowledge accumulated during a search in a way that efficiently balances the need to explore new areas of the search space with the need to focus on high-performance regions of that space [6].
 In simple GAs, a reproduction operator maintains the size of the population, guided by a selection mechanism. The search always has new information to explore with the assistance of a crossover operator and a mutation operator. Crossover drives information exchange between two randomly selected parent chromosomes to form new offspring, while mutation guards against premature loss of important information by simply altering the bit at a randomly chosen site.
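 These three operators can be sketched as follows. This is a minimal illustration of a simple GA, not the implementation used in this research; the population size, bit length, and operator rates are assumed values.

```python
import random

def simple_ga(fitness, n_bits=10, pop_size=20, generations=50,
              crossover_rate=0.7, mutation_rate=0.01):
    """Minimal simple GA: reproduction (roulette-wheel selection),
    one-point crossover, and bit-flip mutation on fixed-length strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def select(pop, scores):
        # Roulette-wheel selection: probability proportional to fitness.
        r = random.uniform(0, sum(scores))
        acc = 0.0
        for ind, s in zip(pop, scores):
            acc += s
            if acc >= r:
                return ind
        return pop[-1]

    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(pop, scores), select(pop, scores)
            c1, c2 = p1[:], p2[:]
            if random.random() < crossover_rate:
                # One-point crossover: exchange tails at a random site.
                point = random.randint(1, n_bits - 1)
                c1 = p1[:point] + p2[point:]
                c2 = p2[:point] + p1[point:]
            for child in (c1, c2):
                # Bit-flip mutation at randomly chosen sites.
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        child[i] = 1 - child[i]
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

# Toy objective: maximize the number of ones in the string.
best = simple_ga(fitness=sum)
```

The toy objective stands in for the forecasting fitness function defined later; only the encoding and the three operators carry over.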
LITERATURE REVIEW
 There are various applications using GAs for search, optimization, and machine learning in a wide range of problem domains. Here are some examples that emphasize the use of GAs in forecasting tasks.
 Financial forecasting and portfolio management : GAs are widely used in financial forecasting and money markets. Karjalainen R.E. [7] used GAs to solve an optimization problem and find technical trading rules. In addition, several applications have shown that GAs can improve a trading system [8, 9]. In the paper of Wittkemper H.G. and Steiner M. [10], GAs were used to optimize and forecast the systematic risk of stocks. In portfolio management, GAs are used to manage financial portfolios and to optimize portfolio performance [11, 12, 13]. Schwartz E.I. [14] reported neural networks trained by GAs that helped traders forecast market patterns and earned 25% annual returns on investment in the currency market.
 Economics : Arifovic J. [15, 16] used GAs as a machine-learning method to learn correct decision rules in overlapping-generation economies and to examine the behavior of the exchange rate in genetic algorithm and experimental economies.
 Production planning : In a dynamic manufacturing environment, GAs are used to find the optimal parameters to predict machine requirements, workload assignment, and scheduling [17, 18].
GENETIC FORECASTING ALGORITHM
 Fig. 1 illustrates the simplified idea of the proposed genetic forecasting algorithm. Conceptually, the algorithm is composed of two loops: the genetic forecasting loop and the pattern learning loop. The genetic forecasting loop aims to minimize the error between the actual value and the forecast value as well as to minimize the average pattern error. The pattern learning loop serves as pattern relationship learning, adding predictive power to the algorithm.
 The genetic forecasting algorithm uses the following mathematical model to represent the pattern of the problem domain:

 y(k+1) = b0 + b1x1(k) + b2x2(k) + ... + bnxn(k) + we(k), (1)

 where y is the output, x1, x2, ..., xn are the input components, e is an error, and k is a step.
 At any step k, the input components x1(k), x2(k), ..., xn(k) and the error e(k) are used to calculate a forecast of the output at the next step, y(k+1), according to the following equation:

 y(k+1) = b0 + b1x1(k) + b2x2(k) + ... + bnxn(k) + we(k), (2)

 where b0, b1, ..., bn and w are parameters estimated using the input data at step k-1 and the output data y at step k. The parameters are optimally selected by the genetic forecasting loop from a set of chromosomes in the population.
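 For illustration, once a candidate parameter set is given, equation (2) is a direct evaluation. The numeric example values below are assumptions, not data from this research.

```python
def forecast(b, w, x, e):
    """Evaluate y(k+1) = b0 + b1*x1(k) + ... + bn*xn(k) + w*e(k).

    b -- parameter list [b0, b1, ..., bn]
    w -- weight on the error term
    x -- input components [x1(k), ..., xn(k)]
    e -- error term e(k)
    """
    return b[0] + sum(bi * xi for bi, xi in zip(b[1:], x)) + w * e

# Assumed example with n = 3 input components:
y_next = forecast(b=[0.5, 1.0, -2.0, 0.25], w=0.1, x=[2.0, 1.0, 4.0], e=0.5)
# 0.5 + 2.0 - 2.0 + 1.0 + 0.05 = 1.55
```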
A.Genetic Forecasting Loop
 The genetic forecasting loop, the main forecasting function in the proposed algorithm, is illustrated in detail in Fig. 2. The aim of the genetic forecasting loop is to determine the unknown parameters b0, b1, ..., bn and w. Genetic forecasting uses chromosomes to represent these parameters at step k. Each chromosome is composed of n+2 parameters, in order, concatenated to form a string. Each parameter is represented by ten bits: the first bit represents the positive/negative sign, while the remaining nine bits represent the parameter value. To decode the nine binary bits to real values, a decoding factor a is used; the range of each parameter is limited by a. These decoded parameter values form the search space that the GA traverses to find the optimal parameters.
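 One plausible reading of this encoding can be sketched as follows. The exact decoding formula is an assumption: here the nine-bit magnitude is scaled by the decoding factor a into the interval (-a, a), which is consistent with, but not fixed by, the description above.

```python
def decode_chromosome(bits, a, n_params):
    """Decode a chromosome of n_params concatenated 10-bit fields.

    Each field: 1 sign bit followed by 9 magnitude bits. The magnitude
    is scaled by the decoding factor a so every parameter lies in (-a, a).
    """
    assert len(bits) == 10 * n_params
    params = []
    for i in range(n_params):
        field = bits[10 * i : 10 * (i + 1)]
        sign = -1 if field[0] == 1 else 1
        magnitude = int("".join(map(str, field[1:])), 2)  # 0 .. 511
        params.append(sign * a * magnitude / 2**9)
    return params

# A chromosome for n = 2 inputs carries n + 2 = 4 parameters (b0, b1, b2, w).
bits = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1] * 4   # each field decodes to +a/512
params = decode_chromosome(bits, a=2.0, n_params=4)
# → [0.00390625, 0.00390625, 0.00390625, 0.00390625]
```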
 The genetic forecasting loop selects the best-fit parameter set based on the selection mechanism (stated below), whose fitness function is derived by taking into account a number of patterns in the pattern learning loop. The genetic forecasting loop is deemed satisfactory when the forecasting error is less than a predefined value s.
B. Pattern Learning Loop
 The pattern learning loop is incorporated into the proposed algorithm to increase the effectiveness of the forecasting result. A set of data (x1, x2, x3, x4, x5)k and yk+1, k = 1, ..., m (m is the number of patterns to be learned), is sequentially presented to the genetic forecasting loop to capture the pattern relationship between x at step k and y at the next step k+1. All patterns are recursively presented until the genetic forecasting loop has learned the m patterns for a predefined number of rounds, R. At this point the optimal parameters are identified. The identified parameters at step m and x at step m are then used to forecast the output y for step m+1. The genetic forecasting component determines the optimal yopt(m+1) by the following equation:

 yopt(m+1) = b0 + b1x1(m) + b2x2(m) + ... + bnxn(m) + we(m) (3)
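 The recursive presentation of the m patterns for R rounds can be outlined schematically. The `ga_step` callable is a hypothetical stand-in for one pass of the genetic forecasting loop of Fig. 2; the toy update used in the example is for illustration only.

```python
def learn_patterns(patterns, rounds, ga_step):
    """Present the m learning patterns for R rounds, then return the
    final parameter estimate.

    patterns -- list of (x_k, y_next) pairs, k = 1..m
    rounds   -- the predefined number of rounds R
    ga_step  -- callable updating the parameter estimate from one pattern
                (hypothetical stand-in for the genetic forecasting loop)
    """
    params = None
    for _ in range(rounds):                  # R rounds
        for x_k, y_next in patterns:         # patterns presented in sequence
            params = ga_step(params, x_k, y_next)
    return params

# Toy stand-in: track the ratio y / sum(x) as a single "parameter".
def toy_step(params, x_k, y_next):
    ratio = y_next / sum(x_k)
    return ratio if params is None else 0.5 * (params + ratio)

patterns = [([1.0, 1.0], 4.0), ([2.0, 2.0], 8.0)]   # here y = 2 * sum(x)
p = learn_patterns(patterns, rounds=3, ga_step=toy_step)
# → 2.0
```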
C. Selection Mechanism
 The selection mechanism of the genetic forecasting is based on the evaluation of a fitness function derived from two categories of errors: (i) the error between the actual and the forecast values at any presented pattern k:

 error = | y(actual) - y(forecast) | (4)

 (ii) the average error over all learning patterns:

 pattern[error] = ( Σ j=1..m errorj ) / m, (5)

 where m is the number of patterns presented to the genetic forecasting loop.
 The selection mechanism chooses the optimal parameters at step k from the set of chromosomes in the population. It minimizes the following fitness function:

 fitness = error + pattern[error] (6)

 The optimal identified parameters, together with the inputs at step k, are used to calculate the forecast value at step k+1.
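 Equations (4)-(6) translate directly into code. This is a minimal sketch; the example values are assumed.

```python
def fitness(actual, forecast_val, pattern_errors):
    """fitness = error + pattern[error], per equations (4)-(6).

    actual, forecast_val -- y(actual) and y(forecast) for the current pattern
    pattern_errors       -- the m per-pattern errors from the learning loop
    """
    error = abs(actual - forecast_val)                          # (4)
    pattern_error = sum(pattern_errors) / len(pattern_errors)   # (5)
    return error + pattern_error                                # (6)

# Assumed example: current error 0.5, past pattern errors averaging 0.3.
f = fitness(actual=10.0, forecast_val=9.5, pattern_errors=[0.2, 0.4])
```

Because the GA selects chromosomes by minimizing this quantity, a chromosome is rewarded both for fitting the current pattern and for fitting the learned patterns on average.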
CONCLUSION AND RECOMMENDATION
 This research proposes the genetic forecasting algorithm, which consists of two loops: the genetic forecasting loop and the pattern learning loop. The selection mechanism minimizes a fitness function in which the errors from both the genetic forecasting and pattern learning loops are taken into account. The effectiveness of the algorithm will be examined by applying it to two problem domains: commercial bank deposits and bankruptcy prediction. The investigation will be performed by computer simulation.
WORK
PLAN
 Figure 3: A Gantt chart showing the schedule of the project plan.
REFERENCES
[1] J. Holland, Adaptation in Natural and Artificial Systems, Ann Arbor: University of Michigan Press, 1975.
[2] D.E. Goldberg, "Genetic and Evolutionary Algorithms Come of Age", Communications of the ACM, vol. 37, no. 3, March 1994, pp. 113-119.
[3] D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, 1989.
[4] Jae K. Shim, Joel G. Siegel, Handbook of Financial Analysis, Forecasting & Modeling, Prentice-Hall, Inc., 1988.
[5] Li-Min Fu, Neural Networks in Computer Intelligence, McGraw-Hill, Inc., 1994.
[6] John J. Grefenstette, "Genetic Algorithms", IEEE Expert, vol. 8, no. 5, October 1993, pp. 6-8.
[7] R.E. Karjalainen, "Using Genetic Algorithms to Find Technical Trading Rules in Financial Markets", Ph.D. Dissertation, University of Pennsylvania, 1994.
[8] Bob Johnstone, "Research & Innovation: Remaking Markets", Far Eastern Economic Review, vol. 156, iss. 12, Mar. 25, 1993, p. 50.
[9] Murray A. Ruggiero Jr., "Breeding a Super Trader (Part 1)", Futures, Cedar Falls, vol. 26, iss. 1, Jan. 1997, pp. 52-54.
[10] Hans-Georg Wittkemper; Manfred Steiner, "Using Neural Networks to Forecast the Systematic Risk of Stocks", European Journal of Operational Research, vol. 90, iss. 3, May 10, 1996, pp. 577-588.
[11] Joe Mullich, "Software Evolves as 'Thinking' Tool", Advertising Age's Business Marketing, vol. 81, iss. 2, Mar. 1996, pp. 10-12.
[12] Rebecca Lewin, "Maximizing Money Management", Bank Systems & Technology, vol. 32, iss. 10, Oct. 1995, pp. 38-43.
[13] Karen Corcella, "Market Prediction Turns the Tables on Random Walk", Wall Street & Technology, vol. 12, iss. 13, May 1995, pp. 36-42.
[14] E.I. Schwartz, "Where Neural Networks Are Already at Work", Business Week, Nov. 2, 1992, pp. 136-137.
[15] Jasmina Arifovic, "The Behavior of the Exchange Rate in the Genetic Algorithm and Experimental Economies", Journal of Political Economy, vol. 104, iss. 3, Jun. 1996, pp. 510-541.
[16] Jasmina Arifovic, "Genetic Algorithms and Inflationary Economies", Journal of Monetary Economics, vol. 36, iss. 1, Aug. 1995, pp. 219-243.
[17] B. Porter; K.L. Mak; Y.S. Wong, "Machine Requirements Planning and Workload Assignment Using Genetic Algorithms", IEEE Proceedings of ICNN 1995, Australia, pp. 711-715.
[18] Christian Bierwirth; Herbert Kopfer; Dirk C. Mattfeld; Ivo Rixen, "Genetic Algorithm Based Scheduling in a Dynamic Manufacturing Environment", IEEE Proceedings of ICNN 1995, Australia, pp. 439-443.
