Monday, October 14, 2019

Optimizing Cash Management Model With Computer Intelligence

Alli and M.M. Ramya

Abstract: In today's technical era, financial organizations face great challenges in optimizing the cash management process. Maintaining too little cash leads to customer frustration, while holding excess cash is a loss to the organization. Hence, soft-computing-based cash management solutions are required to maintain an optimal cash balance. An Artificial Neural Network (ANN) is one such technique, and it plays a vital role in the fields of cognitive science and engineering. In this paper, a novel ANN-based Cash Forecasting Model (ANNCFM) is proposed to identify the cash requirement on a daily, weekly and monthly basis. The six cash requirement parameters, Reference Year (RY), Month of the Year (MOY), Working Day of the Month (WDOM), Working Day of the Week (WDOW), Salary Day Effect (SDE) and Holiday Effect (HDE), were fed as input to ANNCFM. Trials were carried out to select the ANNCFM network parameters; setting the number of hidden neurons, the learning rate and the momentum to 10, 0.3 and 0.95, respectively, yielded better results. Mean absolute percentage error (MAPE) and mean squared error (MSE) were used to evaluate the performance of the proposed model. An MSE of less than 0.01 demonstrates the capability of the proposed ANNCFM in estimating the cash requirement.

Keywords: ANN, ANNCFM, neuron, back-propagation, momentum, learning rate.

1.0 Introduction: Forecasting cash demand needs to be accurate for any financial organization, including banks [1-3]. A flawed forecast not only causes financial losses to the bank but also results in customer dissatisfaction. In the banking industry, an earlier cash requirement study used a feed-forward neural network with back-propagation on short-term data covering two months [1]. Subsequently, a comparative study of cash anticipation was carried out using classic time series models and artificial neural networks [2]. Daily cash requirement models for a bank were also optimized with particle swarm optimization and compared with the least-squares method on short-term data [3]. The main objective of this paper is to design, develop and test a supervised method that forecasts the cash requirement of banks from their historical data.

1.1 ANN Background: ANN is an efficient tool for understanding the complexities of real-world problems in all fields of daily life [4]. It is used as a function optimizer for linear as well as nonlinear problems in science, engineering, technology, management and finance [5-9]. Artificial neural network learning methods provide an effective approach for approximating discrete, real and vector-valued target functions [10-12] in complex problems that cannot be solved by conventional mathematical methods such as analytical and numerical techniques. ANNs are applied in forex market prediction, portfolio optimization, decision making, meteorological parameter forecasting [13-19], and so on. The various ANN-based approaches applied by researchers in finance as an alternative to traditional time series models include financial and economic forecasting, credit authorization screening, simulation of market behaviour, mortgage risk assessment, risk rating of investments and detection of regularities in security price movements [15-19].
2.0 Design of Proposed ANNCFM Architecture: Neural network designs in many fields have produced satisfactory performance, but building a neural network forecaster for a particular problem is a nontrivial task. The modeling issues that affect the performance of the neural network must be selected carefully.

2.1 Selection of ANN Parameters: In general, a multilayer ANN can have many layers, where a layer represents a set of distributed parallel processing nodes. A three-layered ANN with one input layer, one output layer and one intermediate hidden layer is sufficient to approximate any complex nonlinear function. In forecasting studies, many experimental results also confirm that an ANN with one hidden layer is enough to predict the required data [6-8]. The model architecture of ANNCFM is shown in Fig. 1.

Fig. 1: Architecture of ANNCFM Model

The critical decisions that determine the architecture are: i) the number of layers, ii) the number of neurons in each layer, iii) the number of arcs interconnecting the nodes, iv) the activation function of the hidden and output nodes, v) the training algorithm, vi) data transformation or normalization, vii) the training and test sets, and viii) the performance measures.

3.0 Design of Proposed ANN Models: The proposed ANNCFM model consists of one input, one hidden and one output layer, as discussed in Section 2.1. In this study the data was collected from a semi-urban bank branch located in India. The typical daily cash requirement of the bank for one year is shown in Fig. 2.

Fig. 2: Typical Cash Requirement for a Year

The collected data covered a period of three years (2010 to 2012) and was used for training and testing with the following input parameters:
RY - Reference year: ranges from 1 to 3, for the three years,
MOY - Month of the year: ranges from 1 to 12,
WDOM - Working day of the month: ranges from 1 to 27,
WDOW - Working day of the week: ranges from 1 to 6,
SDE - Salary day effect: ranges from 1 to 3, and
HDE - Holiday and weekend effect: either 0 or 1.

The aforementioned parameters were used as six input neurons. In the hidden layer, the number of neurons was varied from 8 to 50. The output layer had one neuron that corresponds to the optimal cash requirement for a day. A sketch of how these inputs can be assembled and normalized is given below.
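As a concrete illustration of this input encoding, the minimal Python sketch below assembles the six input features and min-max normalizes them before training. It is not the authors' MATLAB code: the sample records, the helper name minmax_normalize and the [0.1, 0.9] target range are assumptions made for illustration only.

```python
import numpy as np

# Hypothetical sample records: (RY, MOY, WDOM, WDOW, SDE, HDE, daily cash requirement).
# The real data set (879 working days, 2010-2012) is not published in the paper.
records = [
    (1, 1, 1, 1, 1, 0, 4200000.0),
    (1, 1, 2, 2, 1, 0, 3900000.0),
    (1, 1, 3, 3, 2, 0, 2750000.0),
    (1, 1, 4, 4, 3, 0, 1800000.0),
]
data = np.array(records, dtype=float)
X, y = data[:, :6], data[:, 6:]

def minmax_normalize(a, lo=0.1, hi=0.9):
    """Scale each column into [lo, hi] so values stay well inside the tansig range."""
    a_min, a_max = a.min(axis=0), a.max(axis=0)
    span = np.where(a_max > a_min, a_max - a_min, 1.0)  # guard against constant columns
    return lo + (hi - lo) * (a - a_min) / span

X_norm, y_norm = minmax_normalize(X), minmax_normalize(y)
print(X_norm.shape, y_norm.shape)  # (4, 6) (4, 1)
```

The same scaling statistics would have to be stored and reused when normalizing the six-month test split, so that training and test data share one scale.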
3.1 Pseudocode - ANNCFM

Main()
{
    [W, V, Voj, Wok] = ANNCFMtrain(x, nip, nh, op, α, μ, t)
    yk = ANNCFMtest(ts, W, V, Voj, Wok, t)
    [Mserr, Mape] = ANNCFMevaluate(tk, yk, ts)
}

Function ANNCFMtrain(x, nip, nh, op, α, μ, t) returns the network with modified weights
{
    Repeat
    {
        For each training sample x(i, nip)
            // Feed-forward computation
            // Determine the output of the neurons between the input and hidden layer
            // Determine the output of the neurons between the hidden and output layer
            // Compute the error signal between the output and hidden layer
            // Update the weights between the output (k) and hidden (j) layer:
            //   if itr = 1 apply the plain gradient step, else add the momentum term
            // Update the bias between the output and hidden layer (same rule)
            // Update the weights between the input (i) and hidden (j) layer (same rule)
            // Update the bias between the hidden and input layer (same rule)
    } Until the MSE goal or the maximum number of iterations is reached
}

Function ANNCFMtest(ts, W, V, Voj, Wok, t) returns the output yk
{
    For each test sample ts
        // Feed-forward computation
        // Determine the output of the neurons between the input and hidden layer
        // Determine the output of the neurons between the hidden and output layer
}

Function ANNCFMevaluate(tk, yk, ts) returns [Mserr, Mape]
{
    // Compare the targets tk with the forecasts yk and compute MSE and MAPE
}

4.0 Evaluation Metrics: In order to evaluate the performance of the most appropriate model that fulfils our objective of optimizing cash management, the following metrics were used. The accuracy of the proposed ANNCFM is evaluated using MAPE and MSE, defined as

MAPE = (100/n) * Σ |e_t / X_t|   and   MSE = (1/n) * Σ e_t^2,   with e_t = X_t - F_t,

where X_t is the actual data at period t, F_t is the forecast at period t, e_t is the forecast error at period t, and n is the number of observations (both sums run over t = 1, ..., n). An illustrative implementation of this training and evaluation flow is sketched below.
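The following minimal Python sketch mirrors the pseudocode and the metrics above: a feed-forward network with tansig (tanh) activations trained by batch gradient descent with momentum, plus the MSE/MAPE evaluation. It is an assumed re-implementation, not the authors' MATLAB (traingdm) code; the function names, the weight initialization range and the batch-style update are choices made for illustration.

```python
import numpy as np

def tansig(a):
    """Tan-sigmoid activation (the tanh function, as in MATLAB's tansig)."""
    return np.tanh(a)

def anncfm_train(X, T, n_hidden=10, lr=0.3, momentum=0.95,
                 max_iter=6000, goal=1e-4, seed=0):
    """Train a feed-forward net (inputs -> n_hidden -> outputs) with gradient descent plus momentum."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], T.shape[1]
    V = rng.uniform(-0.5, 0.5, (n_in, n_hidden))   # input-to-hidden weights
    Voj = np.zeros((1, n_hidden))                  # hidden biases
    W = rng.uniform(-0.5, 0.5, (n_hidden, n_out))  # hidden-to-output weights
    Wok = np.zeros((1, n_out))                     # output biases
    dV = np.zeros_like(V); dVoj = np.zeros_like(Voj)
    dW = np.zeros_like(W); dWok = np.zeros_like(Wok)
    for itr in range(max_iter):
        # Feed-forward computation
        Zh = tansig(X @ V + Voj)                     # hidden layer outputs
        Y = tansig(Zh @ W + Wok)                     # output layer outputs
        err = T - Y
        if np.mean(err ** 2) <= goal:
            break
        # Error signals (back-propagation); tansig'(a) = 1 - tansig(a)^2
        delta_k = err * (1.0 - Y ** 2)               # output layer
        delta_j = (delta_k @ W.T) * (1.0 - Zh ** 2)  # hidden layer
        # Weight/bias updates; the momentum term is zero on the first iteration
        dW = lr * Zh.T @ delta_k / len(X) + momentum * dW
        dWok = lr * delta_k.mean(axis=0, keepdims=True) + momentum * dWok
        dV = lr * X.T @ delta_j / len(X) + momentum * dV
        dVoj = lr * delta_j.mean(axis=0, keepdims=True) + momentum * dVoj
        W += dW; Wok += dWok; V += dV; Voj += dVoj
    return V, Voj, W, Wok

def anncfm_test(Xts, V, Voj, W, Wok):
    """Feed-forward pass on the test samples."""
    return tansig(tansig(Xts @ V + Voj) @ W + Wok)

def anncfm_evaluate(actual, forecast):
    """MSE and MAPE as defined in Section 4.0 (both arrays on the same scale)."""
    e = actual - forecast
    return float(np.mean(e ** 2)), float(100.0 * np.mean(np.abs(e / actual)))
```

With the normalized data from the earlier sketch, V, Voj, W, Wok = anncfm_train(X_norm, y_norm) followed by anncfm_evaluate(y_norm, anncfm_test(X_norm, V, Voj, W, Wok)) reproduces the Main() flow of the pseudocode.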
5.0 Results and Discussion: The data for a period of three years (2010-2012) was collected from the City Union Bank (CUB) ukt branch to simulate the network using MATLAB. For the proposed study the total number of data points for the three years is 879, of which the first two and a half years (737 data points, about 80%) were used for training and the remaining six months (142 data points, about 20%) were used for testing. Studies have found that normalizing the input data according to certain criteria prior to the training process is crucial for obtaining good results, as well as for speeding up the calculations significantly [J. Sola and J. Sevilla]. Hence the input data was normalized before training.

In ANNCFM, 15 runs were made by varying the number of hidden neurons from 10 to 50 using gradient descent with momentum back-propagation (traingdm) and the default training parameters learning rate = 0.01, momentum = 0.95, goal = 0 and number of iterations = 6000, as illustrated in column 2 of Table 1. The convergence of ANNCFM is influenced by the number of hidden neurons, which was varied from 10 through 50. The error was minimal when the number of hidden neurons was set to 10, 20, 40, 45 or 50, each achieving an MSE of 0.0079, as observed from column 3 of Table 1. Since the computational time increases significantly as the number of hidden neurons grows, the number of hidden neurons in the proposed study was fixed at 10. The MSE obtained for each number of hidden neurons is shown in Fig. 3.

Fig. 3: Optimal number of hidden neurons

A high learning rate tends to drive the network into a local optimum, whereas a slow learning process favours the global optimum. Different trials were therefore made to identify the optimal learning rate that avoids unstable behaviour and fluctuations in the results. The learning rate 'lr' was varied between 0.1 and 0.5, and 0.3 proved optimal for the given data set, as shown in Fig. 4.

Fig. 4: Optimal learning rate

Momentum plays a vital role in reaching the convergence point. When the momentum is set too low, the network may get stuck in a local minimum; when it is too high, the network becomes unstable. Hence the optimal momentum value for ANNCFM had to be identified: various momentum values were tested between 0.8 and 1.0, and the training results show that the optimal value was 0.95, as shown in Fig. 5.

Fig. 5: Optimal momentum rate

To train and test the cash requirement for a day, a week and a month with the ANNCFM model, the following parameter values were selected based on their performance over the different runs described above: i) number of input neurons = 6, ii) maximum number of iterations = 6000, iii) learning rate = 0.3, iv) momentum = 0.95, v) transfer function = tansig/tansig (hidden and output layer). The careful selection of these parameters helped improve the performance by minimizing the error rate; this is evident from Table 1, which shows the MSE achieved before and after parameter selection. An illustrative sketch of such a parameter sweep is given at the end of this section.

Table 1: ANNCFM performance for different numbers of hidden neurons

ANNCFM was then used to estimate the daily, weekly and monthly cash requirement. The estimated values are compared with the actual values for the testing period in Fig. 6a, Fig. 6b and Fig. 6c for the daily, weekly and monthly predictions, respectively. The obtained results show that ANNCFM performs reasonably well for all three models. The weights calculated by ANNCFM were found to be sufficient for cash prediction, with RY, MOY, WDOM and WDOW as essential parameters and SDE and HDE as additional parameters. The connection weight approach was used to quantify the importance of the input variables [20]. The preference order of the input parameters, derived from the obtained weights, is evident from column 4 of Table 2.

Table 2: ANNCFM weights and preferences

The input parameters SDE and HDE play a vital role in the daily and weekly models: as observed from the above table, they effectively take care of the peak cash requirement at the beginning of every month and during holiday periods. The role of SDE in the weekly cash prediction is easily understood for weeks such as 1, 5 and 14, where the cash requirement is maximal because the beginning of the month lies within the week. Similarly, for the 9th and 10th weeks as well as the 18th and 19th weeks, the cash requirement reflects a new month starting between the weeks. The monthly model was plotted for six months, as shown in Fig. 6c; the experimental results show that the estimated values were most influenced by WDOM. The cash required and predicted was minimal for the fourth month, in which WDOM was minimal. The MAPE and MSE for ANNCFM are shown in Table 3.

Fig. 6a: ANNCFM daily model
Fig. 6b: ANNCFM weekly model
Fig. 6c: ANNCFM monthly model

Table 3: MAPE and MSE errors for ANNCFM

The comparison between the actual and forecast data shown in these figures indicates that the six input variables selected in our model are sufficient to identify the cash need as it changes from time to time.
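The parameter selection described above can be reproduced, under assumptions, with a simple sweep over candidate values similar to those reported in the paper. The sketch below reuses anncfm_train, anncfm_test and anncfm_evaluate from the earlier sketch together with an 80/20 chronological split; the candidate grids and the selection by test MSE are illustrative choices, not the authors' exact procedure.

```python
import itertools
import numpy as np

def select_parameters(X, y, train_frac=0.8):
    """Sweep hidden-neuron count, learning rate and momentum; keep the lowest test MSE."""
    n_train = int(len(X) * train_frac)
    X_tr, y_tr = X[:n_train], y[:n_train]   # first ~2.5 years
    X_ts, y_ts = X[n_train:], y[n_train:]   # last ~6 months
    best = None
    for nh, lr, mom in itertools.product([10, 20, 30, 40, 45, 50],
                                         [0.1, 0.2, 0.3, 0.4, 0.5],
                                         [0.80, 0.85, 0.90, 0.95]):
        V, Voj, W, Wok = anncfm_train(X_tr, y_tr, n_hidden=nh, lr=lr, momentum=mom)
        mse, mape = anncfm_evaluate(y_ts, anncfm_test(X_ts, V, Voj, W, Wok))
        if not np.isfinite(mse):
            continue  # skip diverged runs (e.g. too aggressive lr/momentum pairs)
        if best is None or mse < best[0]:
            best = (mse, mape, nh, lr, mom)
    return best

# Example (with X_norm, y_norm from the earlier sketches):
# mse, mape, n_hidden, lr, momentum = select_parameters(X_norm, y_norm)
```

In the paper, this kind of search settled on 10 hidden neurons, a learning rate of 0.3 and a momentum of 0.95.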
6.0 Conclusion: The experimental results of this study show that ANNCFM is a useful tool for predicting the cash requirement in the emerging banking sector. ANNCFM, a feed-forward neural network trained with the back-propagation algorithm, optimizes the cash needs on a daily, weekly and monthly basis. In the implementation, the data set for the years 2010 to 2012 was used for training and testing to measure the performance. The input parameters were initialized and different runs were made for the proposed model, which identified the optimal number of hidden neurons as 10, the momentum as 0.95 and the learning rate as 0.3 for training and testing the network with the sigmoid (tansig) transfer function. The estimated results showed minimal error, with an accuracy of 91.23%.

References

Fraydoon Rahnama Roodposhti, Farshad Heybati and Seyed Reza Musavi, "A comparison of classic time series models and artificial neural networks in anticipation of cash requirements of banks: A case study in Iran", Academic and Business Research Institute International Conference, Orlando, USA, 2010.

PremChand Kumar and Ekta Walia, "Cash Forecasting: An Application of Artificial Neural Networks in Finance", International Journal of Computer Science Applications, Vol. 3, No. 1, pages 61-77, 2006.

Alli A, Ramya M M and Srinivasa Kumar V, "Cash Management Using Particle Swarm Optimization", International Conference on Data Mining and Soft Computing, SASTRA University, Thanjavur, India, 2013.

Haykin, Simon, Neural Networks: A Comprehensive Foundation, Macmillan College Publishing Company, New York, 1994.

Nakamura, Emi, Inflation forecasting using a neural network, Economics Letters, Volume 86(3), pages 373-378, 2006.

Refenes, A.P. and H. White, Neural Networks and Financial Economics, International Journal of Forecasting, Volume 6(17), 1998.

F. Aminian, E. Suarez, M. Aminian and D. Walz, Forecasting economic data with neural networks, Computational Economics 28, pages 71-88, 2006.

A. Hanna, D. Ural and G. Saygili, Evaluation of liquefaction potential of soil deposits using artificial neural networks, Engineering Computations 24, pages 5-16, 2007.

W. Gorr, D. Nagin and J. Szczypula, Comparative study of artificial neural network and statistical models for predicting student grade point averages, International Journal of Forecasting 10, pages 17-34, 1994.

Zhang, G., Patuwo, B. E. and Hu, M. Y., Forecasting with artificial neural networks: The state of the art, International Journal of Forecasting 14, pages 35-62, 1998.

Z. W. Geem and W. E. Roper, "Energy demand estimation of South Korea using artificial neural network", Energy Policy, vol. 37, no. 10, pages 4049-4054, 2009.

R. Yokoyama, T. Wakui and R. Satake, "Prediction of energy demands using neural network with model identification by global optimization", Energy Conversion and Management, vol. 50, no. 2, pages 319-327, 2009.

Bishop, C., Neural Networks for Pattern Recognition, Oxford University Press, New York, 1999.

H. Taubenböck, T. Esch, M. Wurm, A. Roth and S. Dech, Object-based feature extraction using high spatial resolution satellite data of urban areas, Journal of Spatial Science, Volume 55, Issue 1, pages 117-132, 2010.

P. Tenti, "Forecasting Foreign Exchange Rates Using Recurrent Neural Networks", Applied Artificial Intelligence, Vol. 10, pages 567-581, 1996.

W. Leigh, R. Hightower and N. Modani, Forecasting the New York stock exchange composite index with past price and interest rate on condition of volume spike, Expert Systems with Applications, pages 1-8, 2005.

Manfred Steiner and Hans-Georg Wittkemper, Portfolio optimization with a neural network implementation of the coherent market hypothesis, Volume 100, Issue 1, pages 27-40, July 1997.

M. Carolin Mabel and E. Fernandez, Analysis of wind power generation and prediction using ANN: A case study, Volume 33, Issue 5, pages 986-992, May 2008.
Sharda, R. and Delen, D., Predicting Box-office Success of Motion Pictures with Neural Networks, Expert Systems with Applications, Volume 30, pages 243-254, 2006.

Julian D. Olden, Michael K. Joy and Russell G. Death, An accurate comparison of methods for quantifying variable importance in artificial neural networks using simulated data, 2004.
