Wednesday, October 9, 2019

The Determinants of Consumer Price Index in Indonesia

THE DETERMINANTS OF CONSUMER PRICE INDEX IN INDONESIA

Instructor: Dr. Moussa Larbani

Prepared by: Ali Faris (G0912449), Imala Hussain (G0822498), Ma Yue (G0918271), Mia Fathia (G0827756), Nurma Saleah (G0912298), Suthinee Suayngam (G0916798), Ulfah Hidayatun (G0815892)

ECON 6030 Advanced Quantitative Method, Term Paper
Kulliyah of Economics and Management Sciences, Department of Business Administration, 2009/2010

Abstract

The best-known and most widely quoted economic indicator is the Consumer Price Index (CPI). It estimates the change in the prices of consumer goods and services; in general terms, it measures what we spend on the goods and services used to meet our day-to-day needs. Abrupt changes in the prices of consumer goods and services can cause severe problems for the overall economy. This paper examines the factors that influence the Consumer Price Index. We observe four variables: money supply, gross domestic product, interest rate, and share price. Using quarterly data from 1996 to 2008, the study applies the multiple regression method to find the best model and the factors that explain the Consumer Price Index. The results indicate that gross domestic product, interest rate, and share price have a significant effect on the consumer price index, whereas money supply does not. The study also finds that the highest adjusted R2, used as the goodness-of-fit criterion for the model, is obtained when all the factors are included in the model. Hence, we conclude that each of these factors makes either a strong or a weak contribution to the consumer price index.

Keywords: Consumer Price Index

1. INTRODUCTION

From the beginning of civilization, tribes, countries, and nations have looked for ways to attain prosperity and growth so as to improve the standard of living of their people. From the times of Caesar to leaders such as John F. Kennedy, this has not changed much. One of the most important requirements for prosperity is a healthy economy. However, many factors threaten a healthy economy, such as inflation and economic recessions. Despite these threats and the inevitable slumps and declines, an economy can be monitored, and the Consumer Price Index is one of the most important economic indicators for doing so. Using the Consumer Price Index, the health of the economy can be kept in check and the state can take preventive measures that, if not taken, could lead to devastating effects in the form of high unemployment, bankruptcies, and major financial losses. The CPI is a fixed-basket price index: it represents the price of a constant basket of goods and services purchased by the average consumer. The CPI is one of the statistics most frequently used to identify periods of inflation or deflation, because large rises in the CPI over a short period typically denote inflation and large drops over a short period usually mark deflation. In the United States it is compiled by the Department of Labor's Bureau of Labor Statistics. To obtain the final CPI figure, extensive surveys of the prices of the goods and services included in the consumer basket are carried out, and the results are entered into a program that performs the calculations. The importance of the CPI is also seen in the fact that the values of many other payments, services, and benefits are directly linked to its level. For example, if the CPI increases, Social Security benefits rise as well. Other things that are directly linked to the CPI include:
• Wages
• Lease agreements
• Union contracts
• Benefit statements
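The fixed-basket idea can be illustrated with a small calculation: a Laspeyres-type index prices the same basket of quantities in a base period and in the current period and reports the ratio. The goods, quantities, and prices in the sketch below are invented for illustration and are not part of the paper's data; official CPI methodology is considerably more involved.

```python
# Illustrative fixed-basket (Laspeyres-type) price index with made-up data.
basket = {                         # constant quantities bought by the "average consumer"
    "rice_kg": 10,
    "fuel_litre": 20,
    "electricity_kwh": 100,
}
base_prices = {"rice_kg": 1.00, "fuel_litre": 0.80, "electricity_kwh": 0.10}
current_prices = {"rice_kg": 1.10, "fuel_litre": 0.95, "electricity_kwh": 0.11}

base_cost = sum(qty * base_prices[item] for item, qty in basket.items())
current_cost = sum(qty * current_prices[item] for item, qty in basket.items())

index = 100 * current_cost / base_cost   # equals 100 in the base period
print(round(index, 1))                   # values above 100 mean the basket got more expensive
```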
Abrupt changes in the prices of consumer goods and services can cause severe problems for the overall economy. Most people associate the CPI with inflation: an increase in the value of the CPI means that inflation has been observed. When inflation increases, the purchasing power of money is eroded, people change their spending habits as they reach their purchasing thresholds, and producers suffer and are forced to cut output. This is readily tied to higher unemployment rates, and the whole economy can fall into a recession.

The objective of this paper is to find a linear regression model that accurately estimates the Consumer Price Index of Indonesia using the following independent variables: (1) money supply, (2) gross domestic product, (3) interest rate, and (4) stock price.

In economics, the money supply is the total amount of money available in an economy at a particular point in time. There are several ways to define "money", but standard measures usually include currency in circulation and demand deposits. Gross domestic product (GDP), or gross domestic income (GDI), is a basic measure of a country's economic performance; it is the market value of all final goods and services produced within the borders of a country in a year. It is a fundamental measure of production and is very often positively correlated with the standard of living. An interest rate is the price a borrower pays for the use of money they do not own (for instance, a small company might borrow from a bank to kick-start its business) and the return a lender receives for deferring the use of funds by lending them to the borrower. Interest rates are normally expressed as a percentage per year. Stock price in this paper refers to a stock market index, which is a statistical compilation of the share prices of a number of representative stocks.

We observe four variables: money supply, gross domestic product, interest rate, and stock price. Using quarterly data from 1996 to 2008, this study applies the multiple regression method to find the best model and the factors that explain the Consumer Price Index (see Appendices 1 and 2).

2. METHODOLOGY

2.1 Bivariate Pearson Correlation

Pearson r is typically used to describe the strength of the linear relationship between two quantitative variables. Often, these two variables are designated X (predictor) and Y (outcome). Pearson r takes values between -1.00 and +1.00. The sign of r gives the direction of the relationship between X and Y: a positive correlation indicates that as scores on X increase, scores on Y also tend to increase; a negative correlation indicates that as scores on X increase, scores on Y tend to decrease. The absolute magnitude of Pearson r provides information about the strength of the linear association between scores on X and Y. For values of r close to 0, there is no linear association between X and Y. When r = +1.00, there is a perfect positive linear association; when r = -1.00, there is a perfect negative linear association. Intermediate values of r correspond to intermediate strengths of relationship (Warner, 2008).
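To make the raw-score computation of Pearson r (equation (2.1) in Section 2.1.2 below) concrete, here is a short Python sketch; the two series are the first few quarters of log GDP and log CPI from Appendix 2, rounded, and are used only for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation from raw scores, following equation (2.1):
    r = [n*SumXY - SumX*SumY] / sqrt([n*SumX2 - (SumX)^2] * [n*SumY2 - (SumY)^2])."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    sum_y2 = sum(yi ** 2 for yi in y)
    numerator = n * sum_xy - sum_x * sum_y
    denominator = math.sqrt((n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2))
    return numerator / denominator

# First five quarters of log GDP and log CPI from Appendix 2, rounded (illustration only).
log_gdp = [11.72, 11.77, 11.83, 11.88, 11.89]
log_cpi = [3.85, 3.86, 3.86, 3.87, 3.90]
print(pearson_r(log_gdp, log_cpi))  # a value near +1: strong positive linear association
```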
2.1.1 Assumptions for Pearson r (Warner, 2008)

The assumptions that must be met for Pearson r to be an appropriate statistic for describing the relationship between a pair of variables are as follows:
1. Each score on X should be independent of the other X scores (and each score on Y should be independent of the other Y scores).
2. Scores on both X and Y should be quantitative and normally distributed.
3. Scores on X should be linearly related to scores on Y.
4. The X, Y scores should have a bivariate normal distribution.

2.1.2 Computation of Pearson r (Warner, 2008)

The formula for calculating Pearson r from the raw scores on X and Y is:

r = [n ΣXY - (ΣX)(ΣY)] / sqrt{[n ΣX² - (ΣX)²] [n ΣY² - (ΣY)²]}    (2.1)

2.1.3 Correlation matrix (Warner, 2008)

A correlation matrix, usually denoted by R, contains the correlations among all possible pairs of k variables. The entire set of correlations in an R matrix is as follows:

R = | 1    r12  ...  r1k |
    | r21  1    ...  r2k |
    | ...  ...  ...  ... |
    | rk1  rk2  ...  1   |

Note several characteristics of this matrix. All the diagonal elements equal 1 (because the correlation of a variable with itself is, by definition, 1.0). The matrix is "symmetric" because each element below the diagonal equals the corresponding element above the diagonal.

2.2 Multiple Regression

Multiple regression analysis provides an equation that predicts the raw score on a quantitative Y variable from raw scores on k X variables, with k ≥ 2. The predictor (X) variables are usually also quantitative, but a predictor can also be a dichotomous (dummy) variable. Regression analysis is usually applied in non-experimental research situations, in which the researcher has manipulated none of the variables. In the absence of an experimental design, causal inferences cannot be made. However, researchers often select at least some of the predictor variables for regression analysis because they believe these might be "causes" of the outcome variable. If an X variable that is theorized to be a "cause" of Y fails to account for a significant amount of variance in the Y variable in the regression analysis, this outcome may weaken the researcher's belief that the X variable has a causal connection with Y. On the other hand, if an X variable that is thought to be "causal" does uniquely predict a significant proportion of variance in Y even when confounded or competing causal variables are statistically controlled, this outcome may be interpreted as consistent with the possibility of causality (Warner, 2008).

2.2.1 The Multiple Regression Model Equation

The raw-score version of the regression equation with k predictor variables is written as follows:

Y' = b0 + b1 X1 + b2 X2 + ... + bk Xk    (2.2)

where Y' is the predicted score on the outcome (Y) variable, b0 is the intercept or constant term, b1, ..., bk are the regression coefficients, and X1, ..., Xk are the predictor variables. Each regression coefficient bi represents a partial slope: the predicted change in Y for a one-unit increase in Xi, controlling for all the other predictor variables included in the regression analysis. The standard-score version of a regression equation with k predictors is:

z_Y' = β1 z_X1 + β2 z_X2 + ... + βk z_Xk    (2.3)

where z_X1, ..., z_Xk are the z scores on X1, ..., Xk and β1, ..., βk are the beta coefficients used to predict z_Y'. The beta coefficients in the standard-score version of the regression can be compared across variables to assess which of the predictor variables are more strongly related to the Y outcome variable when all the variables are expressed in z-score form. Beta coefficients may be influenced by many types of artifacts, such as unreliability of measurement and a restricted range of scores in the sample (Warner, 2008).
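As an illustration of equations (2.2) and (2.3), the sketch below fits a multiple regression with statsmodels and also obtains standardized (beta) coefficients by z-scoring all variables first. The data frame, its column names (lcpi, lm1, lgdp, ir, lsp), and the synthetic values are placeholders, not the paper's actual data or SPSS output.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame; in the paper the columns would be the logged series
# lcpi, lm1, lgdp, lsp and the interest rate ir (see Appendices 1 and 2).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(50, 5)),
                  columns=["lcpi", "lm1", "lgdp", "ir", "lsp"])

y = df["lcpi"]
X = sm.add_constant(df[["lm1", "lgdp", "ir", "lsp"]])  # adds the intercept b0

# Raw-score equation (2.2): Y' = b0 + b1*X1 + ... + bk*Xk
model = sm.OLS(y, X).fit()
print(model.params)        # b0, b1, ..., bk

# Standard-score equation (2.3): z_Y' = beta1*z_X1 + ... + beta_k*z_Xk
z = (df - df.mean()) / df.std()
beta_model = sm.OLS(z["lcpi"], z[["lm1", "lgdp", "ir", "lsp"]]).fit()
print(beta_model.params)   # beta coefficients, comparable across predictors
```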
2.2.2 Model building

This paper uses stepwise regression model building to develop the least-squares regression in steps, through forward selection, backward elimination, or standard stepwise regression. The coefficient of partial determination measures the marginal contribution of each independent variable, given that the other independent variables are already in the model.

2.2.3 Statistics

Sum-of-squares terms. Several regression statistics are computed as functions of the sum-of-squares terms:

SST = Σ(Yi - Ȳ)²,  SSR = Σ(Y'i - Ȳ)²,  SSE = Σ(Yi - Y'i)²    (2.4)

Partitioning of variation. The regression equation is estimated such that the total sum of squares can be partitioned into components due to regression and to the residuals:

SST = SSR + SSE    (2.5)

Coefficient of determination. The explanatory power of the regression is summarized by its "R-squared" value, computed from the sum-of-squares terms as

R² = SSR / SST    (2.6)

R², also called the coefficient of determination, is often described as the proportion of variance "accounted for", "explained", or "described" by the regression. It is important to keep in mind that a high R² does not imply causation. The relative sizes of the sum-of-squares terms indicate how "good" the regression is in terms of fitting the calibration data. If the regression is "perfect", all residuals are zero, SSE is zero, and R² is 1. If the regression is a total failure, the sum of squares of the residuals equals the total sum of squares, no variance is accounted for by the regression, and R² is zero.

Adjusted R². The R² value for a regression can be made arbitrarily high simply by including more and more predictors in the model. The adjusted R² is one of several statistics that attempt to compensate for this artificial increase in accuracy. The adjusted R² is given by

Adjusted R² = 1 - (1 - R²)(n - 1) / (n - p - 1)    (2.7)

where n is the sample size (e.g., the number of years of data in the calibration period) and p is the number of predictors in the model, not counting the constant term. As the equation shows, the adjusted R² is lower than R², and adding predictors increases the difference between the adjusted R² and R². The adjusted R² is also useful for comparing models.

ANOVA table and definition of "mean squared" terms. The sum-of-squares terms and related statistics are often summarized in an Analysis of Variance (ANOVA) table:

|Source |SS |df |MS |
|Regression |SSR |p |MSR = SSR / p |
|Residual |SSE |n - p - 1 |MSE = SSE / (n - p - 1) |
|Total |SST |n - 1 | |

where Source is the source of variation, SS is the sum-of-squares term, df is the degrees of freedom for the SS term, and MS is the "mean squared" term. The mean squared terms are the sum-of-squares terms divided by their degrees of freedom.

Standard error of the estimate. The residual mean square (MSE) is the sample estimate of the variance of the regression residuals. The population value of the error variance is sometimes written as σe², while the sample estimate is given by

se² = MSE    (2.8)

where MSE has been defined previously. The square root of the residual mean square is called the root-mean-square error (RMSE), or the standard error of the estimate:

RMSEc = sqrt(MSE)    (2.9)

The subscript "c" is attached (RMSEc) in (2.9) to distinguish the RMSE derived from calibration from the root-mean-square error derived by cross-validation (see later).
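A small numpy sketch of how the quantities in equations (2.4) through (2.9) could be computed once a model's fitted values are in hand; the observed and fitted arrays below are made-up numbers, and p is the number of predictors.

```python
import numpy as np

def regression_summary(y, y_hat, p):
    """Sum-of-squares decomposition and fit statistics, equations (2.4)-(2.9)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    n = len(y)
    sst = np.sum((y - y.mean()) ** 2)               # total sum of squares
    sse = np.sum((y - y_hat) ** 2)                  # residual sum of squares
    ssr = sst - sse                                 # regression sum of squares (2.5)
    r2 = ssr / sst                                  # coefficient of determination (2.6)
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)   # adjusted R^2 (2.7)
    mse = sse / (n - p - 1)                         # residual mean square (2.8)
    rmse = np.sqrt(mse)                             # standard error of the estimate (2.9)
    return {"SST": sst, "SSR": ssr, "SSE": sse,
            "R2": r2, "adj_R2": adj_r2, "RMSE": rmse}

# Example with made-up observed and fitted values for a model with p = 4 predictors.
y_obs = [3.85, 3.86, 3.86, 3.87, 3.90, 3.91, 3.92, 3.96]
y_fit = [3.84, 3.86, 3.87, 3.87, 3.89, 3.90, 3.93, 3.95]
print(regression_summary(y_obs, y_fit, p=4))
```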
F ratio or "overall F". Recall that the explanatory power of a regression is given by the regression R², which is computed from sum-of-squares terms. The F-ratio, or overall F, which is computed from the mean squared terms in the ANOVA table, assesses the statistical significance of the regression equation. The F-ratio is given by

F = MSR / MSE    (2.10)

The advantage of the F-ratio over R² is that the F-ratio takes into account the degrees of freedom, which depend on the sample size and the number of predictors in the model. A model can have a high R² and still not be statistically significant if the sample size is not large compared with the number of predictors in the model. The F-ratio incorporates the sample size and the number of predictors into the assessment of the significance of the relationship. The significance of the F-ratio is obtained by referring to a table of the F distribution with degrees of freedom {df1, df2}, where df1 and df2 are the degrees of freedom of the regression mean square and the residual mean square from the ANOVA table.

F-test (overall significance):
H0: β1 = β2 = ... = βk = 0
HA: not all of the βj are zero
α = .05
Decision: reject H0 if the F statistic falls in the rejection region (p-value < α = .05).

T-test. The t-test shows whether there is a linear relationship between an individual predictor Xi and Y. The test statistic is

t = bi / s(bi)    (2.11)

where s(bi) is the estimated standard error of the coefficient bi.

T-test (individual significance):
H0: βi = 0
HA: βi ≠ 0
α = .05
Decision: reject H0 for a given variable if its test statistic falls in the rejection region (p-value < .05).

Confidence intervals for estimated coefficients. If the regression assumptions on the residuals are satisfied, including the normality assumption, then the sampling distribution of an estimated regression coefficient is normal, with a variance proportional to the residual mean square (MSE). The variance of the estimator also depends on the variances and covariances of the predictors. The idea is best illustrated for simple linear regression, for which the variance of the regression coefficient is given by

Var(b1) = se² / Σ(xi - x̄)²    (2.12)

where se² is the residual mean square, xi is the value of the predictor in year i, x̄ is the mean of the predictor, and the summation is over the n years in the calibration period. The 100(1 - α)% confidence interval is b1 ± t(α/2) s(b1), where t(α/2) is obtained from a t distribution with n - 2 degrees of freedom. For more than one predictor, the confidence intervals can be computed similarly, but the equation is more complicated. The variances and covariances of the estimated coefficients are expressed in matrix terms by

Var(b) = se² (X'X)⁻¹    (2.13)

where X is the time-series matrix of predictors. This equation returns a matrix, with the variances of the parameters along the diagonal and the covariances as the off-diagonal elements (Weisberg, 1985, p. 44). The appropriate degrees of freedom of the t distribution are df = n - K - 1, where K is the number of predictors in the model and n is the sample size.
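A minimal sketch of how the overall F-test, the individual t-tests, and the coefficient confidence intervals could be read off a statsmodels fit; the synthetic data frame simply re-creates the hypothetical model used in the earlier sketches.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Refit the illustrative model from the earlier sketch (synthetic placeholder data).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(50, 5)), columns=["lcpi", "lm1", "lgdp", "ir", "lsp"])
model = sm.OLS(df["lcpi"], sm.add_constant(df[["lm1", "lgdp", "ir", "lsp"]])).fit()

print(model.fvalue, model.f_pvalue)   # overall F ratio (2.10) and its p-value
print(model.tvalues, model.pvalues)   # t = b_i / s(b_i) (2.11) and two-sided p-values
print(model.bse)                      # s(b_i): sqrt of the diagonal of se^2 (X'X)^-1 (2.13)
print(model.conf_int(alpha=0.05))     # 95% confidence intervals b_i +/- t(alpha/2) s(b_i)
```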
Multicollinearity

The predictors in a regression model are often called the "independent variables", but this term does not imply that the predictors are statistically independent of one another. In fact, for natural systems, the predictors can be highly intercorrelated. "Multicollinearity" is a term reserved for the case in which the intercorrelation of the predictor variables is high. It has been noted above that the variance of the estimated regression coefficients depends on the intercorrelation of the predictors. Haan (2002) concisely summarizes the effects of multicollinearity on the regression model. Multicollinearity does not invalidate the regression model in the sense that the predictive value of the equation may still be good, as long as the predictions are based on combinations of predictors within the same multivariate space used to calibrate the equation. But there are several negative effects of multicollinearity. First, the variance of the regression coefficients can be inflated so much that the individual coefficients are not statistically significant, even though the overall regression equation is strong and its predictive ability good. Second, the relative magnitudes and even the signs of the coefficients may defy interpretation. For example, the regression weight on a tree-ring index in a multivariate regression equation to predict precipitation might be negative even though the tree-ring index by itself is positively correlated with precipitation. Third, the values of the individual regression coefficients may change radically with the removal or addition of a predictor variable in the equation; in fact, the sign of a coefficient might even switch.

Signs of multicollinearity. Signs of multicollinearity include (1) high correlation between pairs of predictor variables, (2) regression coefficients whose signs or magnitudes do not make good physical sense, (3) statistically non-significant regression coefficients on important predictors, and (4) extreme sensitivity of the sign or magnitude of regression coefficients to the insertion or deletion of a predictor variable.

Variance Inflation Factor (VIF). The Variance Inflation Factor (VIF) is a statistic that can be used to identify multicollinearity in a matrix of predictor variables. "Variance inflation" refers here to the effect of multicollinearity, mentioned above, on the variance of the estimated regression coefficients. Multicollinearity depends not just on the bivariate correlations between pairs of predictors, but on the multivariate predictability of any one predictor from the other predictors. Accordingly, the VIF is based on the multiple coefficient of determination from a regression of each predictor on all the other predictors:

VIFi = 1 / (1 - Ri²)    (2.14)

where Ri² is the multiple coefficient of determination in a regression of the ith predictor on all the other predictors, and VIFi is the variance inflation factor associated with the ith predictor. Note that if the ith predictor is independent of the other predictors, the variance inflation factor is one, while if the ith predictor can be almost perfectly predicted from the other predictors, the variance inflation factor approaches infinity; in that case the variance of the estimated regression coefficients is unbounded. Multicollinearity is said to be a problem when the variance inflation factor of one or more predictors becomes large. How large is too large appears to be a subjective judgment. According to Haan (2002), some researchers use a VIF of 5 and others a VIF of 10 as the critical threshold; these values correspond, respectively, to Ri² values of 0.80 and 0.90. Some compute the average VIF over all the predictors and declare that an average "considerably" larger than one indicates multicollinearity (Haan, 2002). At any rate, it is important to keep in mind that multicollinearity requires strong intercorrelation of the predictors, not just non-zero intercorrelation. The VIF is closely related to a statistic called the tolerance, which is 1/VIF. Some statistics packages report the VIF and some report the tolerance (Haan, 2002).
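A sketch of how the VIF in equation (2.14) could be computed, both by hand (regressing each predictor on the others) and with the statsmodels helper; the predictor matrix and its column names are again hypothetical placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical predictor matrix (placeholder values, not the paper's data).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(50, 4)), columns=["lm1", "lgdp", "ir", "lsp"])

# Equation (2.14) by hand: regress each predictor on all the others.
for col in X.columns:
    others = sm.add_constant(X.drop(columns=col))
    r2_i = sm.OLS(X[col], others).fit().rsquared
    print(col, 1.0 / (1.0 - r2_i))          # VIF_i = 1 / (1 - R_i^2)

# The same thing via the statsmodels helper (expects a design matrix with a constant).
design = sm.add_constant(X)
vifs = [variance_inflation_factor(design.values, i) for i in range(1, design.shape[1])]
print(dict(zip(X.columns, vifs)))
```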
3. MODEL SPECIFICATION AND DATA SOURCE

Based on the theory reviewed in the previous section, we build the following specification to capture the determinants of the consumer price index in Indonesia:

CPI = β0 + β1 M1 + β2 GDP + β3 IR + β4 SP + ε

The variables are defined as follows:
1. Money supply (M1) is M0 (physical currency) plus demand deposits (checking accounts). It is used by economists as a measure of the amount of money in circulation. M1 is a very liquid measure of the money supply, as it contains cash and assets that can quickly be converted to currency.
2. Gross domestic product (GDP) is the income of individuals or nations after adjusting for inflation.
3. Consumer price index (CPI) is an index number measuring the average price of consumer goods and services purchased by households.
4. Interest rate (IR) is the fee paid on borrowed capital.
5. Share price (SP) is the price of one share of stock.

This paper uses quarterly data from the first quarter of 1996 to the second quarter of 2008, taken from the International Financial Statistics (IFS). We use SPSS to estimate the model above.

4. EVALUATION

4.1 Model Estimation

We present the results of the data analysis using multiple regression analysis, which provides an equation that predicts the raw score on a quantitative Y variable from raw scores on k X variables, with k ≥ 2. The best model is indicated by the highest adjusted R² and the lowest standard error. In this study, the consumer price index (CPI) was predicted from the following variables: money supply (M1), gross domestic product (GDP), interest rate (IR), and share price (SP). The sample size n is 50.

4.2 Bivariate correlation

In this part, we observe the strength of the linear relationship between each independent variable and the CPI.

Table 1. Correlations
| |CPI |M1 |GDP |IR |SP |

For the estimated model, the quarterly predictors (LGDP, IR, LSP), the predicted LCPI, the observed LCPI, and the residual are listed below:

|Quarter |LGDP |IR |LSP |Predicted LCPI |LCPI |Residual |
|1996Q1 |11.716111 |19.30 |4.771904 |3.788341 |3.853983 |0.065642 |
|1996Q2 |11.766373 |19.24 |4.819983 |3.822246 |3.858643 |0.036398 |
|1996Q3 |11.827298 |19.17 |4.717570 |3.877778 |3.863081 |-0.014697 |
|1996Q4 |11.879324 |19.16 |4.810590 |3.909344 |3.872063 |-0.037281 |
|1997Q1 |11.889998 |18.98 |4.934683 |3.905124 |3.897606 |-0.007518 |
|1997Q2 |11.914423 |18.72 |4.941414 |3.921527 |3.906252 |-0.015275 |
|1997Q3 |12.002958 |23.38 |4.778997 |4.036802 |3.924765 |-0.112037 |
|1997Q4 |12.039144 |26.19 |4.477901 |4.111444 |3.959830 |-0.151614 |
|1998Q1 |12.262335 |26.33 |4.624532 |4.270861 |4.140733 |-0.130127 |
|1998Q2 |12.314070 |32.16 |4.495154 |4.363053 |4.309088 |-0.053965 |
|1998Q3 |12.484700 |34.93 |4.308177 |4.530508 |4.491942 |-0.038566 |
|1998Q4 |12.457244 |35.20 |4.294247 |4.512515 |4.538626 |0.026111 |
|1999Q1 |12.510708 |34.11 |4.396215 |4.536868 |4.585091 |0.048223 |
|1999Q2 |12.512071 |30.34 |4.767910 |4.478028 |4.578437 |0.100409 |
|1999Q3 |12.533785 |24.52 |4.754038 |4.455257 |4.555728 |0.100470 |
|1999Q4 |12.525806 |21.68 |4.830264 |4.422381 |4.555029 |0.132648 |
|2000Q1 |12.689215 |19.58 |4.798267 |4.536245 |4.579349 |0.043104 |
|2000Q2 |12.725801 |18.46 |4.615507 |4.572988 |4.589384 |0.016396 |
|2000Q3 |12.796032 |17.98 |4.534614 |4.630870 |4.611431 |-0.019440 |
|2000Q4 |12.817033 |17.80 |4.436443 |4.654596 |4.639514 |-0.015082 |
|2001Q1 |12.894097 |17.85 |4.423641 |4.715383 |4.668689 |-0.046693 |
|2001Q2 |12.957670 |18.26 |4.396349 |4.769620 |4.695093 |-0.074527 |
|2001Q3 |12.980581 |18.88 |4.453272 |4.786409 |4.731538 |-0.054871 |
|2001Q4 |12.967675 |19.20 |4.357638 |4.787355 |4.758569 |-0.028785 |
|2002Q1 |13.014972 |19.32 |4.495629 |4.812124 |4.804455 |-0.007668 |
|2002Q2 |13.038967 |19.18 |4.670443 |4.813862 |4.813371 |-0.000492 |
|2002Q3 |13.083051 |18.87 |4.499660 |4.860963 |4.830240 |-0.030724 |
|2002Q4 |13.067842 |18.42 |4.383610 |4.856585 |4.856372 |-0.000213 |
|2003Q1 |13.119451 |18.20 |4.382903 |4.894774 |4.879052 |-0.015721 |
|2003Q2 |13.127729 |17.68 |4.568618 |4.880785 |4.881073 |0.000288 |
|2003Q3 |13.168067 |16.44 |4.706932 |4.890676 |4.889544 |-0.001132 |
|2003Q4 |13.145558 |15.43 |4.867750 |4.851847 |4.910358 |0.058511 |
|2004Q1 |13.193018 |14.80 |5.023394 |4.869902 |4.926710 |0.056808 |
|2004Q2 |13.243557 |14.28 |5.023446 |4.905169 |4.946239 |0.041070 |
|2004Q3 |13.296856 |13.88 |5.055704 |4.940406 |4.956855 |0.016449 |
|2004Q4 |13.303815 |13.54 |5.255827 |4.925389 |4.972241 |0.046852 |
|2005Q1 |13.357168 |13.36 |5.375579 |4.954380 |5.001198 |0.046817 |
|2005Q2 |13.415743 |13.29 |5.403708 |4.996403 |5.019906 |0.023503 |
|2005Q3 |13.477237 |13.78 |5.410051 |5.046527 |5.037628 |-0.008900 |
|2005Q4 |13.539065 |15.78 |5.387751 |5.110080 |5.135998 |0.025918 |
|2006Q1 |13.570606 |16.34 |5.539352 |5.124611 |5.157502 |0.032891 |
|2006Q2 |13.608447 |16.23 |5.624725 |5.145281 |5.164111 |0.018831 |
|2006Q3 |13.676882 |16.00 |5.678345 |5.191494 |5.176234 |-0.015260 |
|2006Q4 |13.679898 |15.35 |5.839146 |5.174745 |5.194761 |0.020015 |
|2007Q1 |13.732362 |14.70 |5.885796 |5.206364 |5.219177 |0.012812 |
|2007Q2 |13.777640 |14.08 |6.039466 |5.223014 |5.222613 |-0.000401 |
|2007Q3 |13.848229 |13.56 |6.144445 |5.264208 |5.239273 |-0.024935 |
|2007Q4 |13.855779 |13.11 |6.305412 |5.252354 |5.259836 |0.007483 |
|2008Q1 |13.930695 |12.94 |6.292750 |5.309960 |5.292817 |-0.017143 |
|2008Q2 |14.023264 |12.95 |6.172412 |5.392000 |5.199684 |-0.192316 |
|Sum of squared errors | | | | | |0.175227 |

From the table above, the sum of squared errors is 0.175.

Predict the consumer price index for a quarter in which the logarithm of GDP is 12.89, the interest rate is 17.52, and the logarithm of the share price is 4.42:

LCPI = -4.927 + 0.769 (LGDP) + 0.007 (IR) - 0.090 (LSP)
     = -4.927 + 0.769 (12.89) + 0.007 (17.52) - 0.090 (4.42)
     = 4.72

A confidence interval for the mean LCPI value and a prediction interval for an individual LCPI value can then be constructed around this point estimate.
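The point prediction and the confidence and prediction intervals described above can be obtained directly from a fitted model. The sketch below uses statsmodels; the data are synthetic placeholders (so the numbers will not reproduce the paper's estimates), and the new observation uses the values from the worked example (LGDP = 12.89, IR = 17.52, LSP = 4.42).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative only: synthetic placeholder data standing in for the paper's
# quarterly series (LCPI regressed on LGDP, IR and LSP).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(50, 4)), columns=["lcpi", "lgdp", "ir", "lsp"])
fit = sm.OLS(df["lcpi"], sm.add_constant(df[["lgdp", "ir", "lsp"]])).fit()

# New observation from the worked example: LGDP = 12.89, IR = 17.52, LSP = 4.42.
x_new = pd.DataFrame({"const": [1.0], "lgdp": [12.89], "ir": [17.52], "lsp": [4.42]})
pred = fit.get_prediction(x_new)
frame = pred.summary_frame(alpha=0.05)
print(frame[["mean", "mean_ci_lower", "mean_ci_upper"]])   # point estimate and CI for the mean
print(frame[["obs_ci_lower", "obs_ci_upper"]])             # prediction interval for an individual value
```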
CONCLUSION

We have employed the multiple regression analysis method with five variables: the consumer price index, the interest rate, the stock price, GDP, and the money supply (M1), of which the last four are expected to affect the consumer price index. The data are taken from the International Financial Statistics for Indonesia. In recent years Indonesia has been successful in controlling its money supply to achieve stability in economic conditions.

From the study, we find a strong relationship between the consumer price index (CPI) and GDP: when gross domestic product (GDP) increases, the consumer price index also increases, as the two have a positive linear relationship. There is also a strong correlation between the money supply and the consumer price index, which means that the mean of the CPI increases when the money supply increases. In addition, there is a positive correlation between the stock price and the CPI: when the stock price increases, the CPI tends to increase. However, there is a negative correlation between the interest rate and the CPI: when the interest rate increases, the CPI decreases. Our findings show an R-squared of 96 percent, which indicates that the model describes the relationship between the CPI and the other variables used in this study well.

REFERENCES

Lawrence S. Meyers, Glenn Gamst, and A. J. Guarino. (2006). Applied Multivariate Research: Design and Interpretation. Thousand Oaks, London, and New Delhi: Sage Publications.

Miles, Jeremy and Mark Shevlin. (2001). Applying Regression & Correlation: A Guide for Students and Researchers. London: Sage Publications.

Warner, R. M. (2008). Applied Statistics: From Bivariate Through Multivariate Techniques. Los Angeles, London, New Delhi, Singapore: SAGE Publications.

Watson, Collin J., et al. (1993). Statistics for Management and Economics, 5th Edition. Massachusetts: Allyn and Bacon.

http://www.investopedia.com
http://www.stock-market-investors.com
http://www.wikipedia.org

Appendix 1. Variables Data

|   |M1 |Stock Price |CPI |INTEREST RATE |GDP | |1996Q1 |53162. 00 |118. 14 |47. 8 |19. 30 |122530. 00 | |1996Q2 |56448. 00 |123. 96 |47. 40 |19. 24 |128846. 00 | |1996Q3 |59684. 00 |111. 90 |47. 61 |19. 17 |136940. 00 | |1996Q4 |64089. 00 |122. 80 |48. 04 |19. 16 |144253. 00 | |1997Q1 |63565. 00 |139. 03 |49. 28 |18. 98 |145801. 00 | |1997Q2 |69950. 00 |139. 97 |49. 71 |18. 72 |149406. 00 | |1997Q3 |66258. 00 |118. 99 |50. 64 |23. 8 |163237. 00 | |1997Q4 |78343. 00 |88. 05 |52. 45 |26. 19 |169252. 00 | |1998Q1 |98270. 30 |101. 96 |62. 85 |26. 33 |211575. 00 | |1998Q2 |109480. 00 |89. 58 |74. 37 |32. 16 |222809. 00 | |1998Q3 |102563. 00 |74. 30 |89. 29 |34. 93 |264263. 00 | |1998Q4 |101197. 00 |73. 28 |93. 56 |35. 20 |257106. 00 | |1999Q1 |105705. 00 |81. 14 |98. 01 |34. 11 |271226. 0 | |1999Q2 |105964. 00 |117. 67 |97. 36 |30. 34 |271596. 00 | |1999Q3 |118124. 00 |116. 05 |95. 18 |24. 52 |277558. 00 | |1999Q4 |124633. 00 |125. 24 |95. 11 |21. 68 |275352. 00 | |2000Q1 |124663. 00 |121. 30 |97. 45 |19. 58 |324232. 00 | |2000Q2 |133832. 00 |101. 04 |98. 43 |18. 46 |336314. 00 | |2000Q3 |135430. 00 |93. 19 |100. 63 |17. 98 |360783. 00 | |2000Q4 |162186. 0 |84. 47 |103. 49 |17. 80 |368440. 00 | |2001Q1 |148375. 00 |83. 40 |106. 56 |17. 85 |397956. 00 | |2001Q2 |160142. 00 |81. 15 |109. 41 |18. 26 |424077. 00 | |2001Q3 |164237. 00 |85. 91 |113. 47 |18. 88 |433905. 00 | |2001Q4 |177731. 00 |78. 07 |116. 58 |19. 20 |428341. 00 | |2002Q1 |166173. 00 |89. 62 |122. 05 |19. 32 |449087. 00 | |2002Q2 |174017. 00 |106. 5 |123. 15 |19. 18 |459993. 00 | |2002Q3 |181791. 00 |89. 99 |125. 24 |18. 87 |480725. 00 | |2002Q4 |191939. 00 |80. 13 |128. 56 |18. 42 |473469. 00 | |2003Q1 |181239. 00 |80. 07 |131. 51 |18. 20 |498546. 00 | |2003Q2 |195219. 00 |96. 41 |131. 77 |17. 68 |502690. 00 | |2003Q3 |207587. 00 |110. 71 |132. 89 |16. 44 |523382. 00 | |2003Q4 |223799. 00 |130. 03 |135. 9 |15. 43 |511733. 00 | |2004Q1 |219087. 00 |151. 93 |137. 93 |14. 80 |536605. 00 | |2004Q2 |226147. 00 |151. 93 |140. 65 |14.
28 |564422. 00 | |2004Q3 |234676. 00 |156. 92 |142. 15 |13. 88 |595321. 00 | |2004Q4 |245946. 00 |191. 68 |144. 35 |13. 54 |599478. 00 | |2005Q1 |244003. 00 |216. 07 |148. 59 |13. 36 |632331. 00 | |2005Q2 |261814. 00 |222. 23 |151. 40 |13. 9 |670476. 00 | |2005Q3 |267762. 00 |223. 64 |154. 10 |13. 78 |713000. 00 | |2005Q4 |271166. 00 |218. 71 |170. 03 |15. 78 |758475. 00 | |2006Q1 |270425. 00 |254. 51 |173. 73 |16. 34 |782779. 00 | |2006Q2 |303803. 00 |277. 20 |174. 88 |16. 23 |812968. 00 | |200 6Q3 |323885. 00 |292. 47 |177. 02 |16. 00 |870551. 00 | |2006Q4 |347013. 00 |343. 49 |180. 33 |15. 35 |873181. 0 | |2007Q1 |331736. 00 |359. 89 |184. 78 |14. 70 |920214. 00 | |2007Q2 |371768. 00 |419. 67 |185. 42 |14. 08 |962838. 00 | |2007Q3 |400075. 00 |466. 12 |188. 53 |13. 56 |1033260. 00 | |2007Q4 |450055. 00 |547. 53 |192. 45 |13. 11 |1041090. 00 | |2008Q1 |409768. 00 |540. 64 |198. 90 |12. 94 |1122080. 00 | |2008Q2 |453093. 00 |479. 34 |181. 22 |12. 95 |1230910. 00 |Appendix 2. Lag of Variable Data |   |lm1 |lsp |lcpi |lgdp |ir | |1996Q1 |10. 881099 |4. 771904 |3. 853983 |11. 716111 |19. 30 | |1996Q2 |10. 941075 |4. 819983 |3. 858643 |11. 766373 |19. 24 | |1996Q3 |10. 996819 |4. 717570 |3. 863081 |11. 827298 |19. 17 | |1996Q4 |11. 068028 |4. 810590 |3. 872063 |11. 879324 |19. 16 | |1997Q1 |11. 059818 |4. 934683 |3. 897606 |11. 889998 |18. 98 | |1997Q2 |11. 55536 |4. 941414 |3. 906252 |11. 914423 |18. 72 | |1997Q3 |11. 101311 |4. 778997 |3. 924765 |12. 002958 |23. 38 | |1997 Q4 |11. 268852 |4. 477901 |3. 959830 |12. 039144 |26. 19 | |1998Q1 |11. 495477 |4. 624532 |4. 140733 |12. 262335 |26. 33 | |1998Q2 |11. 603497 |4. 495154 |4. 309088 |12. 314070 |32. 16 | |1998Q3 |11. 538233 |4. 308177 |4. 491942 |12. 484700 |34. 93 | |1998Q4 |11. 524824 |4. 294247 |4. 538626 |12. 57244 |35. 20 | |1999Q1 |11. 568407 |4. 396215 |4. 585091 |12. 510708 |34. 11 | |1999Q2 |11. 570855 |4. 767910 |4. 578437 |12. 512071 |30. 34 | |1999Q3 |11. 679490 |4. 754038 |4. 555728 |12. 533785 |24. 52 | |1999Q4 |11. 733129 |4. 830264 |4. 555029 |12. 525806 |21. 68 | |2000Q1 |11. 733369 |4. 798267 |4. 579349 |12. 689215 |19. 58 | |2000Q2 |11. 804341 |4. 615507 |4. 589384 |12. 725801 |18. 46 | |2000Q3 |11. 16210 |4. 534614 |4. 611431 |12. 796032 |17. 98 | |2000Q4 |11. 996499 |4. 436443 |4. 639514 |12. 817033 |17. 80 | |2001Q1 |11. 907498 |4. 423641 |4. 668689 |12. 894097 |17. 85 | |2001Q2 |11. 983816 |4. 396349 |4. 695093 |12. 957670 |18. 26 | |2001Q3 |12. 009066 |4. 453272 |4. 731538 |1 2. 980581 |18. 88 | |2001Q4 |12. 088026 |4. 357638 |4. 758569 |12. 967675 |19. 20 | |2002Q1 |12. 020785 |4. 495629 |4. 804455 |13. 14972 |19. 32 | |2002Q2 |12. 066908 |4. 670443 |4. 813371 |13. 038967 |19. 18 | |2002Q3 |12. 110613 |4. 499660 |4. 830240 |13. 083051 |18. 87 | |2002Q4 |12. 164933 |4. 383610 |4. 856372 |13. 067842 |18. 42 | |2003Q1 |12. 107572 |4. 382903 |4. 879052 |13. 119451 |18. 20 | |2003Q2 |12. 181877 |4. 568618 |4. 881073 |13. 127729 |17. 68 | |2003Q3 |12. 243306 |4. 706932 |4. 889544 |13. 168067 |16. 44 | |2003Q4 |12. 18504 |4. 867750 |4. 910358 |13. 145558 |15. 43 | |2004Q1 |12. 297224 |5. 023394 |4. 926710 |13. 193018 |14. 80 | |2004Q2 |12. 328941 |5. 023446 |4. 946239 |13. 243557 |14. 28 | |2004Q3 |12. 365961 |5. 055704 |4. 956855 |13. 296856 |13. 88 | |2004Q4 |12. 412867 |5. 255827 |4. 972241 |13. 303815 |13. 54 | |2005Q1 |12. 404936 |5. 375579 |5. 001198 |13. 357168 |13. 36 | |2005Q2 |12. 475390 |5. 403708 |5. 19906 |13. 415743 |13. 29 | |2005Q3 |12. 497854 |5. 410051 |5. 037628 |13. 477237 |13. 78 | |2005Q4 |12. 
510486 |5. 387751 |5. 135998 |13. 539065 |15. 78 | |2006Q1 |12. 507750 |5. 539352 |5. 157502 |13. 570606 |16. 34 | |2006Q2 |12. 624135 |5. 624725 |5. 164111 |13. 608447 |16. 23 | |2006Q3 |12. 688144 |5. 678345 |5. 176234 |13. 676882 |16. 00 | |2006Q4 |12. 757118 |5. 839146 |5. 194761 |13. 679898 |15. 5 | |2007Q1 |12. 712095 |5. 885796 |5. 219177 |13. 732362 |14. 70 | |2007Q2 |12. 826025 |6. 039466 |5. 222613 |13. 777640 |14. 08 | |2007Q3 |12. 899407 |6. 144445 |5. 239273 |13. 848229 |13. 56 | |2007Q4 |13. 017125 |6. 305412 |5. 259836 |13. 855779 |13. 11 | |2008Q1 |12. 923346 |6. 292750 |5. 292817 |13. 930695 |12. 94 | |2008Q2 |13. 023853 |6. 172412 |5. 199684 |14. 023264 |12. 95 |
