The consequences of fair-value accounting during the financial crisis of 2007

Ralph van de Leur

Student Number: 10268766

Faculty of Economics and Business

Specialization: Economics and Finance

Supervisor: Ms. Lin Zhao


Abstract.

This paper investigates the consequences of fair-value accounting and the resulting impact of liquidity pricing during the financial crisis of 2007. First, the development of historical cost accounting and its evolution into fair-value accounting is summarized to provide a frame of reference. The unfolding of the financial crisis affected the liquidity of financial markets and the overall state of the economy. This research uses the dimensions that characterize the liquidity of the stock market to provide insight into these changing conditions. According to the transaction cost measures, the dimensions of breadth and resiliency decreased significantly and transaction costs in the financial market increased. The intrinsic liquidity of the stocks in our sample also decreased; in other words, the impact of volume on the trading price increased. For both these reasons we can state that a situation of liquidity pricing is more likely during the crisis for our sample.


Table of contents.

Abstract.
Table of contents.
1. Introduction.
2. Literature Review.
   A. Development of the Fair-value standards.
   B. Regulation of FAS no. 157 and IFRS no. 13.
   C. Proponents and Opponents.
3. Theoretical Framework.
   I. Liquidity Pricing.
   II. Liquidity Channels.
   III. Liquidity Dimensions.
   IV. Liquidity measurement.
      A. Transaction cost measures.
      B. Volume-based measures.
      C. Price-based measures.
      D. Market-impact measures.
4. Method and Data.
5. Results.
   A. Transaction cost measures.
   B. Volume-based measures.
   C. Price-based measures.
   D. Market-impact measures.
6. Conclusion.
7. Appendix 1.
8. Appendix 2.
9. Appendix 3.
10. Bibliography.


1. Introduction.

The recent financial crisis of 2007 and its consequences for financial institutions have led to an international debate on the suitability of the classification and valuation of financial assets and liabilities on the statements according to IFRS no. 13 and FAS no. 157 (Heaton et al., 2010). Both of these recently issued standards concern the treatment of fair-value accounting (Allen & Carletti, 2008a, p. 358). Fair-value accounting (FVA) introduces market values into the financial reporting statements, as opposed to historical cost accounting (HCA), which reflects the original cost price adjusted for depreciation over time. The introduction of these standards is part of a shift towards an increasing use of FVA. Currently both methods, FVA and HCA, are assigned to a variety of asset and liability categories, creating a hybrid system.

The International Accounting Standards Board (IASB) and the Financial Accounting Standards Board (FASB) are now converging toward fair-value accounting for an increasing number of categories (Schaffer, 2010, p. 4). In October 2002 the two bodies agreed upon the ‘Norwalk agreement’ to develop compatible standards, resulting in comparable standards for fair-value accounting in the near future (SEC, 2008, p. 178). Besides the expanding categorization, a framework of judgment has come into place that defines fair-value measurement and guides its process. Both FAS no. 157 and IFRS no. 13 provide a structure, consisting of three levels based upon the availability of information, which determines the valuation method apart from the categorization. When market prices are available and used, fair-value accounting is also called mark-to-market accounting (Laux & Leuz, 2009, p. 1). Presuming the reporting standards only presented the information currently at hand without directly impacting the market, the differences between the accounting methods would be irrelevant. In reality, however, the economic environment can be assumed to be imperfect (Sapra, 2008, pp. 1-3).

The consequences of these imperfections give rise to an extensive debate concerning the accounting methods applied. The evolution from HCA towards a more frequently used FVA fosters this debate about the desirability of the direction in which the system is developing, because this change of standards for the accounting of financial assets and liabilities is often argued to have contributed to or exacerbated the recent crisis (SEC, 2008, pp. 11-12). In particular, when markets are illiquid, assets are written down and valued below their true economic value. The balance sheet values reported may therefore be driven by short-term market fluctuations instead of the underlying long-term value. Moreover, these write-downs can be substantial because they are said to trigger a downward spiral when margin and regulatory capital requirements are no longer met: the forced sale of assets to comply with the regulatory requirements induces an unwanted feedback effect.

The regulation on FVA makes the reported statements dependent on the liquidity in the market. Liquidity pricing occurs if prices in times of financial distress, when there is likely to be a liquidity shortage in the market, do not reflect the fundamentals but the amount of liquidity available in the market. In these situations the actual price according to FVA is lower than the fundamental value, so the liquidity situation in the market affects the statements reported. This introduces significant risks for firms with a substantial part of their assets reported under FVA regulations (Sapra, 2008, pp. 1-3).

The problems induced by FVA are highly relevant according to the ECB report, because relatively minor changes to these standards can inflict substantial changes on the financial statements reported by the banking and financial industry (Enria et al., 2004, p. 4). This is the case since the affected asset classes make up a major part of the balance sheet. Fair-value reporting has been used for 31% of banks’ assets, and the insurance industry reports up to 71% of its assets at fair value (SEC, 2008, pp. 47-48). The write-offs have substantial consequences because the categorization results in a significant share of fair-valued products on the balance sheets of firms in the financial industry (SEC, 2008, pp. 47-49). Therefore any changes to those standards give rise to extensive debates and objections by many of the affected parties. Further insight into the debate is provided in the literature review section.

Since the existing literature on liquidity pricing, a situation that arises when the price is based not on the value of the underlying but on the liquidity in the market, is ambiguous on the best solution to the topic of fair-value measurement, further research is needed. Looking at the models currently used in the literature, the core assumption concerns the presence or absence of market liquidity. Note that this research will not focus on the internal holdings of liquid assets or liabilities, but only on market liquidity. The market is assumed to be either fully liquid or fully illiquid in order to compute the models correctly; however, the actual level prevailing will lie somewhere in between. The range of liquidity situations cannot be properly captured in a two-state, binary model of liquidity. The varying impact of liquidity pricing on the measured value of assets and liabilities is left out of most research. The proponents’ arguments can only be weighed when the real liquidity pricing impact is researched. Therefore, by researching the liquidity pricing impact, the almost binary liquidity input used in many models can be refined. Moreover, the arguments put forward in the current debate might be refined by the results presented. This has resulted in the thesis research question: in which way and to what extent has liquidity pricing, through the use of fair-value accounting, impacted the values of the reported financial assets and liabilities during the financial crisis that started in 2007?

The research consists of a literature review combined with empirical data research. To answer the research question, a literature review is performed by interpreting and summarizing the points made in academic articles, starting off with the identification and comparison of fair-value accounting under IFRS and local GAAP. The extent of the consequences may depend on the standard used. To identify the consequences, fair-value accounting is illustrated by comparing it to alternative valuation methods. Thereafter, in the theoretical framework, the theories will be discussed


and evaluated; this is done with the help of professionals dealing with these issues on a day-to-day basis. Deloitte offers the opportunity to discuss the implications with various experienced audit experts. This is an essential link to verify the actual implications apart from the theoretical ones.

The data research is executed by examining the dimensions of liquidity using data from the S&P 500. The sample consists of the five companies with the largest market capitalization on the index: Exxon Mobil, Microsoft, Apple, Johnson & Johnson, and Procter & Gamble. The findings suggest an impact on market liquidity during the financial crisis and therefore provide a possible basis for liquidity pricing.

The rest of the paper proceeds as follows. Section 2 describes the development of the fair-value standards over time and the views of both proponents and opponents; it also describes the current regulation of FAS no. 157 and IFRS no. 13. Section 3 outlines a theoretical framework in which the models and theories are further elaborated; both econometric models and theoretical arguments are integrated in the framework. Section 4 describes the research method and the data sample used. Section 5 contains the empirical research and the results. Finally, section 6 presents the conclusions that can be drawn from sections 3, 4 and 5.


2. Literature Review.

An initial literature review is presented in this section after extensive research of a variety of articles. To provide a complete insight, the review is divided into three main components: the development of the fair-value standards, the regulation, and lastly the proponents and opponents.

A. Development of the Fair-value standards.

Fair-value accounting as described in the introduction originates from the period prior to the Great Depression, according to the report of the Securities and Exchange Commission (SEC, 2008, p. 34). Initially the balance sheets of firms recorded current or appraised values as a reflection of their asset value. In addition, banking organizations were required to report their investment securities portfolios at current market value. Performance measurement and investment decisions were complex and involved a high degree of uncertainty for investors, since firms were left unregulated in terms of valuation principles. Following the Great Depression, fixed assets and intangibles were reported by the historical cost accounting method (SEC, 2008, p. 34). This method became a widespread standard for reporting assets until the market fluctuations of 1973-1975. During the upswings the written-down securities recovered up to their original value, while there was no guidance on revaluing these assets. To cover the prevailing issues, SFAS No. 12 was issued to provide guidance for marketable security revaluation. Nevertheless, the remaining historical cost accounting method contained some flaws, which were emphasized in the Savings and Loan Crisis (Allen & Carletti, 2008a, p. 376). These flaws masked the problems by revealing them only gradually over time. A project was started to solve the issue of off-balance-sheet financing and the disclosure of financial instruments, resulting in SFAS No. 107. Meanwhile these institutions kept their long-term fixed-rate mortgage loans, which were under water due to rising short-term financing rates, at an artificial value on their balance sheets. The sale of these items would have led to substantial direct losses and insolvent firms. On the one hand, detection of insolvency and information about the current state of asset values was delayed. On the other hand, the retarding effect of HCA enabled firms to use gain trading to their advantage (Jordan et al., 1997, p. 50). Gain trading opportunities are restricted by the regulations on banks’ equity and volatility ratios. The regulatory focus imposed on the insurance industry, however, is aimed more at liquidity and reserve ratios, which offers more opportunities for gain trading (Jordan et al., 1997, p. 51). SFAS 115, implemented in 1994, was issued to solve some of these problems of inconsistency in recognition. This standard required firms to report both trading and available-for-sale securities at fair value (Laux & Leuz, 2009, pp. 4-5). Jaggi et al. (2010, p. 472) argue that the main objective of this standard has been to increase the transparency of the assets’ estimated value by defining three categories: ‘trading securities’, ‘available-for-sale’ and ‘held-to-maturity securities’. Equity and debt instruments held for the purpose of selling them in the near future are classified as trading securities. This group is reported at fair value, and unrealized changes in value are reported in the profit and loss


account. Instruments a company intends to hold to maturity are classified as held-to-maturity and are reported at amortized cost reduced by impairments. All others are reported as available-for-sale and valued at fair value; their unrealized gains and losses are reported as a component of shareholders’ equity under comprehensive income, rather than in the profit and loss account. Jaggi et al. also point out the absence of strict guidance for the market value determination of securities. Moreover, the use of derivatives expanded drastically during the 1990s (SEC, 2008, pp. 37-38). Because of their leveraged nature, these positions make HCA an unreliable model to capture all the associated risks and value changes. Accounting for derivatives was changed by the issuance of SFAS No. 133, which requires that all derivative instruments be reported at fair value from 1998 onwards. These standards, however, lack any guidance on the determination of market values; up to this point the regulators had only focused on the categorization of assets (Jaggi et al., 2010, pp. 472-473).
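The category-to-measurement mapping described above can be sketched as a small lookup. This is an illustrative sketch of the SFAS 115 rules as summarized in this section; the function and dictionary are our own, not part of any standard or library.

```python
# Illustrative sketch of the SFAS 115 measurement rules described above.
# The category names follow the standard; the function itself is hypothetical.

def sfas115_measurement(category: str) -> dict:
    """Map a security category to its measurement basis and to where
    unrealized gains and losses are reported."""
    rules = {
        "trading": {
            "basis": "fair value",
            "unrealized_gains_losses": "profit and loss account",
        },
        "available-for-sale": {
            "basis": "fair value",
            "unrealized_gains_losses": "equity (other comprehensive income)",
        },
        "held-to-maturity": {
            "basis": "amortized cost less impairments",
            "unrealized_gains_losses": "not recognized",
        },
    }
    return rules[category]

# Example: only the trading category pushes unrealized value changes
# through the profit and loss account.
print(sfas115_measurement("trading")["unrealized_gains_losses"])
```

The sketch makes explicit why the mix of categories matters for the debate: two of the three categories are fair-valued, but only one feeds unrealized changes directly into earnings.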

To provide a solution, fair-value measurement has been defined more specifically by SFAS No. 157 since 2006. The new standard is effective for fiscal years beginning after the 15th of November 2007. Prior to this statement, different definitions of fair-value measurement existed, and the guidance was scattered across multiple pronouncements. The reasons for issuing this statement were to increase the consistency and comparability of the measurements and to expand the disclosures. The renewed definition of fair value is “the price that would be received to sell an asset or paid to transfer a liability in an orderly transaction between market participants at the measurement date” (FASB, 2006, p. 8). Fair value is defined similarly under IFRS (Laux & Leuz, 2009, p. 4). The use of this core definition throughout the whole set of standards increases the consistency of reporting. The statement makes use of the exit price of an orderly transfer to emphasize that a forced transaction price is not the observed fair value. This exit price is preferably observed in the principal market and otherwise in the most advantageous alternative market for the asset or liability. The standard prescribes a hierarchy of three levels of valuation inputs. The first two levels are based upon observable inputs and the third level upon unobservable valuation. First-level inputs are quoted prices in active markets for identical accessible assets or liabilities. The second level uses quoted prices in inactive markets, or quoted prices for identical or similar assets and liabilities, as inputs; in the absence of quoted prices, other observable inputs may be used. Third-level inputs are unobservable and apply to the asset or liability in case of an inactive market, in which case the firm shall reflect asset or liability prices according to its own assumptions. The valuation techniques allowed are the market, income and cost approaches.
The eventual market value is a weighted average of the previously mentioned relevant techniques.

B. Regulation of FAS no. 157 and IFRS no. 13.

The FAS no. 157 and IFRS no. 13 standards were issued to establish a framework for measuring fair value and to expand the disclosures about fair-value measurements. The standards consist of three main components: the definition of fair value, a framework for measurement, and a requirement on disclosures about fair-value measurements (FASB, 2006, p. 1).

The definition of fair value is stated as “the price that would be received to sell an asset or paid to transfer a liability in an orderly transaction between market participants at the measurement date”. According to the FASB (2006, pp. 3-6) the price should be determined by the exit price of a hypothetical transaction, provided it is not a forced transaction. Forced transactions are excluded to eliminate excessive volatility in times of forced liquidation or distress sales. The assumption is made that the principal market is used to sell the asset or to transfer the liability. This market is defined as the market in which the reporting entity would sell the asset or transfer the liability with the greatest volume and level of activity; it is not by definition the most advantageous market available with respect to price. The standards do not specifically refer to the liquidity in the market, but they do mention two important aspects, as presented in the theoretical framework. The price determined in these markets should not include the transaction costs charged. Market participants are buyers and sellers in the principal market that are independent of the reporting entity, while being able and willing to transact. This excludes artificial prices created by bid and ask quotes produced by the reporting entity itself.

The valuation techniques distinguish three main approaches, to be applied according to data availability and observability (FASB, 2006, pp. 9-12). First-level inputs are quoted prices in active markets for identical accessible assets or liabilities that the reporting entity has the ability to access at the measurement date. This market approach uses prices of actual transactions in the market. For a market to be characterized as active, frequent trading and corresponding volume must occur so as to provide pricing information on an ongoing basis (FASB, 2006, p. 10). Restrictions are, however, placed on the derived prices. When adjustment is needed due to significant events, the valuation is characterized as being of level-two or level-three quality. Moreover, when a significant share of the trading volume is owned by the reporting firm, it is not allowed to adjust by a blockage factor. Even when the block owned is larger than the daily trading volume, and thus very probably exerts downward pressure on the quoted trading prices, this may not be corrected for.

The second level uses quoted prices for identical or similar assets and liabilities in markets that are not active, or inputs other than quoted prices, as inputs. In the absence of quoted prices, other observable inputs and correlated observable data may be used. This income approach uses expectations of discounted future amounts. Adjustments will be necessary to account for various factors concerning the condition and/or location of the approximated assets and liabilities (FASB, 2006, p. 11).

Third-level inputs are unobservable and are used for the asset or liability in case of an inactive market. The firm shall then reflect the asset or liability prices according to its own assumptions. This cost approach is frequently based on the amount that would currently be required to replace the service capacity of an asset. The assumptions made have to be elaborated on in the notes, and the information used has to involve the information that is reasonably available.
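The input hierarchy described above can be summarized in a small decision function. This sketch is illustrative only; the parameter names are our own shorthand for the conditions named in the standards, not official terminology.

```python
# Illustrative decision rule for the three-level fair-value input
# hierarchy of FAS no. 157 / IFRS no. 13 (a sketch, not official logic).

def fair_value_input_level(quoted_identical_active: bool,
                           other_observable_inputs: bool) -> int:
    """Return the hierarchy level implied by the available inputs."""
    if quoted_identical_active:
        # Level 1: quoted prices in active markets for identical assets.
        return 1
    if other_observable_inputs:
        # Level 2: quoted prices for similar assets or in inactive
        # markets, or other observable (correlated) market data.
        return 2
    # Level 3: no observable inputs; the entity's own assumptions apply.
    return 3

# An actively traded stock would be level 1; an instrument with no
# observable comparables in an inactive market falls to level 3.
```

The point of the sketch is that the level is a property of the market, not of the asset: the same instrument can migrate from level 1 toward level 3 as its market dries up, which is exactly the channel the liquidity pricing discussion below is concerned with.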

Finally, the disclosures require an overview of the asset measurements according to the hierarchy. The purpose of the disclosures is to provide information on the firm-specific measurement techniques. This should be supplemented by a more profound disclosure concerning the changes and the transfers in and out of the third level. The valuation technique used and its development have to be explained to clarify the firm-specific measurement (FASB, 2006, pp. 12-16).

These safeguards, which include the activity and continuity requirements for the market, are provided to ensure the reliability of the prices incorporated in the valuation. However, they lack any provision for the market’s ability to absorb an abnormal volume without significant observable price changes. According to Allen & Carletti (2008a, p. 377), level 2 or 3 inputs may be more accurate when an abnormal volume is supplied to the market. Moreover, they suggest the use of HCA in these situations; according to them this will provide more accurate measurement, since this method combines the features of historic costs and impairments (Allen & Carletti, 2008a, p. 377).

So the level of input used to value an asset is determined by the quality of information that can be retrieved from the market. The quality of information is characterized by the activity of observable markets and the activity of the market for identical or similar assets and liabilities. The market liquidity of a financial asset or liability is therefore of great importance, since it impacts the observation of prices; this concept is called liquidity pricing (Allen & Carletti, 2008a, pp. 376-377). In the theoretical framework this concept will be discussed in more detail.

C. Proponents and Opponents.

The debate on fair-value accounting mentioned in the introduction leaves both academics and business professionals undecided on the most appropriate policy. Proponents of fair-value accounting for assets and liabilities state that information relevance is optimized by reflecting the current condition of the market; the timely information and the accompanying transparency enable stakeholders to take action (Laux & Leuz, 2009, p. 5). Building on this, Burkhardt and Strausz (2006) state that FVA reduces asymmetric information and thereby stimulates increasing liquidity. Other proponents state that hiding potential problems has become more difficult under FVA, which therefore reduces the severity of crises. As can be seen from the savings and loan crisis, the delay of information can be very harmful (Allen & Carletti, 2008a, p. 376). The true reflection provided by FVA enables investors and policy makers to better assess the risk profile and undertake more timely market discipline (Allen & Carletti, 2008a, p. 358). Various proponents, including Laux & Leuz (2009, p. 19), argue that regulations should be changed rather than accepting the losses of transparency and relevance that accompany deviations from the standard of FVA. Especially since FVA provides an opportunity to assess the risk profile of firms, according to Allen and Carletti (2008b, p. 2), the previously mentioned deviation would be harmful. Besides the optimized information relevance, FVA could contribute to the integration and efficiency of European financial markets according to the ECB report (Enria et al., 2004, p. 4). In addition, European firms will be better prepared to access the international financial markets.

Some opponents of fair-value accounting criticize the consequences of FVA for the stability of the financial system. According to Allen and Carletti (2008a), these consequences are emphasized during times of illiquidity in the markets. In such an environment, prices reflect the amount of scarce liquidity still available. To provide incentives to buy assets in the market, a substantial decrease of the sales price during a rather short period of time is essential (Sapra, 2008, p. 380). During these periods liquidity pricing becomes a key friction, which under FVA leads to contagion in the financial system from banking to insurance firms. The core concept of contagion is described by Sapra (2008, p. 385) as the catalytic role of FVA in spreading cracks appearing in one part of the financial system to other related parts. The article presents the consequences of this contagion for the economy as excessive and artificial volatility. In contrast, the application of HCA does not introduce any form of contagion, since the values are based upon the initial price and its depreciation. According to Sapra (2008), the best accounting method is determined by the relative proportion of welfare losses caused by contagion in the financial system, compared to the welfare losses from inefficient continuation under HCA.

Besides the liquidity pricing impact, the assumptions of market efficiency and investor rationality may not be reliable (Laux & Leuz, 2009, p. 5). This becomes even more relevant for valuing assets that are held for a longer period. In this case the procyclicality contributed by FVA during booms and busts also magnifies the swings through upward and downward spirals (ECB, 2004). Other opponents criticize the definition of fair-value accounting. The notions of an orderly transaction and a measurement date are theoretically attainable; however, these situations do not occur in practice. Both conceptually and in practice, the definition will cause problems when markets become illiquid (Ryan, 2008, p. 27). Ryan also states that even when internal models are used to make a fair-value measurement, inputs may be disturbed due to a lack of trading in illiquid markets. Moreover, a lack of trading forces firms to incorporate data from a longer time period, which may be incomparable to the current situation. Besides information disturbance, the level 3 fair values are hard to interpret because there is no quantitative disclosure containing the primitive variables that underlie the valuation (Ryan, 2008, pp. 1606-1607). These variables can vary across companies and therefore reduce comparability. Well-disclosed level 3 measurements, on the other hand, may be more reliable when the quality of input signals is poor due to illiquid markets. Another point of criticism concerning the definition, presented by Ronen (2012), focuses on the use of exit values. The exit value is defined as the value that is received to sell an asset or paid to transfer a liability. According to his theory, exit values are biased downwards when markets are illiquid, exposing firms to various procyclical harmful effects. It is therefore proposed that exit values are used not as a measure of


value, but as a measure of risk (Ronen, 2012, pp. 162-163). The measurement of risk is only feasible when the exit values are accompanied by discounted cash flow (DCF) values.

3. Theoretical Framework.

At this point the regulation of fair-value accounting and its development have been explained. Further theory is needed to assess whether the use of FVA impacted the reported values of financial assets and liabilities during the crisis. The framework starts with an assessment of the concept of liquidity pricing and the channels created by FVA that transfer it to the balance sheets of firms. The concept of liquidity pricing is explained according to the models introduced by Sapra (2008) and Allen and Carletti (2008a). These models elaborate on liquidity pricing and its extent in the banking and insurance sectors through the interaction between the two. Thereafter the link between theory and practice is formed by exploring the dimensions of liquidity and the appropriate liquidity measurements.

I. Liquidity Pricing.

The core phenomenon of liquidity pricing is based upon liquidity frictions in the markets. By conceptualizing these frictions, Sapra (2008, p. 382) concludes that liquidity pricing occurs when prices in times of financial distress, when there is likely to be a liquidity shortage in the market, might not reflect the fundamentals but the amount of liquidity available in the market. In this case the asset price is the ratio of the money available in the market looking for an opportunity to purchase to the asset supply. In the presence of sufficient liquidity, however, the FVA methods do value according to the fundamentals.

The article brings forward an intuitive equation to understand and map both liquidity situations. Figure 1 from Sapra (2008, p. 383) provides an insight into the liquidity pricing situation.

P = min(γ / L, E(R))    (1)

where:
P = asset price
γ = amount of liquidity in the market
L = asset supply
E(R) = expectation of the future asset return
γ* = market liquidity threshold

Formula (1), which determines the price, takes the minimum of the ratio γ/L and E(R). In the case of an excess of liquidity in the market, γ > γ*, the price therefore depends on the fundamental E(R). On the other hand, when γ < γ*, the price of the asset becomes subject to liquidity pricing. Accordingly, P drops below its fundamental value and becomes fully dependent on γ; the key rationale is: the lower the γ, the lower the P (Sapra, 2008, pp. 382-383). The ability to identify situations of excess and shortage of liquidity enables market participants to distinguish between fundamental and liquidity pricing.
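The two regimes of the pricing rule can be illustrated with a minimal numerical sketch; the numbers below are made up purely for illustration.

```python
# Sketch of Sapra's cash-in-the-market pricing rule:
#   P = min(gamma / L, E(R))
# gamma: liquidity available in the market, supply: asset supply L,
# expected_return: fundamental value E(R). Numbers are illustrative.

def asset_price(gamma: float, supply: float, expected_return: float) -> float:
    """Price is the minimum of liquidity per unit of supply and the
    fundamental expected return."""
    return min(gamma / supply, expected_return)

# Ample liquidity (gamma / L > E(R)): the price equals the fundamental.
print(asset_price(gamma=200.0, supply=100.0, expected_return=1.5))  # 1.5
# Liquidity shortage (gamma / L < E(R)): the price is set by the
# available liquidity and drops below the fundamental value.
print(asset_price(gamma=80.0, supply=100.0, expected_return=1.5))   # 0.8
```

The sketch makes the key rationale concrete: below the liquidity threshold, halving γ halves P regardless of the fundamentals, which is the mechanism behind the write-downs discussed earlier.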

Therefore a thorough understanding of the interactions of liquidity in the markets and its consequences is relevant, and it has consequently been studied by a variety of models. The theory of liquidity pricing is based upon incremental models built in various articles. One of the central models is that of Allen and Carletti (2008a), which is based on an analysis of the crisis. The main statement of their paper is that the use of FVA, in the case of a liquidity shortage, may not be appropriate, since in this situation financial institutions may become insolvent while under HCA the same institutions would be solvent. The use of FVA exacerbates the solvency problems through contagion in an illiquid environment. In the model three necessities have to be fulfilled for the occurrence of contagion: the existence of systemic risk, the possibility of liquidation of long assets owned by both banks and insurance companies, and lastly liquidity pricing (Allen & Carletti, 2008a, pp. 359-360). The model introduces a system of bank and insurance firms that interact with each other in the primary market on three different dates (t = 0, 1, 2). Two securities are traded on this market, one short term and one long term. The short-term security, when traded at date t, provides a return at t+1. The long-term security takes two periods to mature but can be liquidated after the first period at t+1. So when bought at t, the return, R > 1, comes at t+2; when liquidated at t+1 it yields a modified return. The long security can be looked upon as a bond holding.

Both the banks and the insurance companies have unique investment opportunities and liabilities with different returns and payments. The banks operate in an environment where both long and short-term investments can be made; the returns of the short and long-term investments have to cover the claims made by the early and late depositors. The insurance firms, on the other hand, insure the machines of a group of manufacturers against any damage in the upcoming period. If the quantity of damaged machines is low the insurer will pay out; if the quantity is above a certain threshold it is optimal to liquidate the firm and sell off all the investments. The high or low performance of the investments of the banks and insurance firms during the first period creates four different states. The situations, as explained in the table, are referred to as HH, HL, LH and LL.

State   Bank investment performance   Insurance investment performance   Consequence
HH      High                          High                               No market
HL      High                          Low                                Insurance firm liquidation > long asset market at t=1
LH      Low                           High                               No market
LL      Low                           Low                                Insurance firm liquidation > long asset market at t=1

Table 1. Investment performance in the Allen and Carletti model.

The liquidation of long-term investments in a market where both banks and insurers own only long-term investments would imply extremely low prices for this long-term investment. Such a price would provide an opportunity for an enormous return for a market entrant (Allen & Carletti, 2008a, p. 368). In the other states, HH and LH, the price of the asset is R. This is therefore not an equilibrium, because the investors will hold their liquidity until the liquidation date. The liquidity in the market at t = 1 is provided by the banks, which try to maximize their return by buying the liquidated assets. The table below gives an insight into the expected payoffs on the different dates.

Initial investment (t = 0)   State   Expected return at t = 1   Expected return at t = 2
Long                         -       0                          R
Short                        HH      α                          α
Short                        HL      α                          (1-α) · R/PL
Short                        LH      α                          α
Short                        LL      α                          (1-α) · R/PL

Table 2. Expected return for banks in the Allen and Carletti model.

For holding liquidity the banks require compensation in the second period through abnormally high returns, to make up for the additional risk. This high return can be earned through a low initial asset price, PL.

Therefore the price of the long-term asset has to be low in state HL or LL, where the price level will depend on the amount of assets supplied (Allen & Carletti, 2008a, p. 368). This constraint concerning the price of long-term assets can be represented by the following equation in a situation where insurance firms can go bankrupt. This possibility creates a supply market for the long asset at t = 1; subsequently banks will be willing to hold the short-term asset from t = 0 to t = 1 in order to buy the liquidated long assets at t = 1 at PHL = PLL = PL. In equilibrium the following formula (2) has to be balanced.

ρ = α + (1 − α) · R/PL    (2).

ρ = the opportunity cost of capital
α = expected payoff to holding short-term assets if insurance firms do not sell their long assets
1 − α = expected payoff to holding short-term assets if insurance firms liquidate
R = payoff to holding long-term assets at date 2
PL = low long-asset price at t = 1

The return on the long-term asset has to be larger than the cost of capital for the banks before they will provide liquidity. PL has to be sufficiently low to incentivise banks to invest in the long-term asset. From this formula it can be concluded that providing liquidity is costly, since ρ > R > 1 (Allen & Carletti, 2008a, pp. 368-369).

The core concept is in line with the concept introduced in the article by Sapra (2008) presented above. The constraint of the model is denoted by the minimum price, which is driven only by liquidity considerations (Allen & Carletti, 2008a, p. 369). This minimum price, also called the investors' participation constraint, is represented by the following equation (3).

PL = (1 − α) · R / (ρ − α)    (3).

The intuition behind this formula, assuming ρ > R > 1, is that as α approaches 1, PL approaches 0. If PL < 1 < R, the prices of the assets are dependent on the liquidity in the market. The price is affected by liquidity pricing because it depends on the ratio between the cash holdings and the long asset holdings.

PL = γ / x    (4).

γ = cash holdings of investors
x = long asset holdings of insurance firms
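The determination of PL can be sketched numerically. The snippet below is a minimal illustration, not the authors' implementation; it assumes the participation constraint takes the form PL = (1 − α)·R/(ρ − α) and cash-in-the-market pricing the form PL = γ/x, with the higher of the two binding, and all input values are hypothetical.

```python
# Illustrative sketch, not the authors' implementation. Assumed forms:
# participation constraint (3): PL = (1 - alpha) * R / (rho - alpha), and
# cash-in-the-market pricing (4): PL = gamma / x; the higher of the two binds.

def liquidity_price(gamma, x, alpha, R, rho):
    """Price of the long asset at t = 1 under liquidity pricing."""
    cash_in_market = gamma / x                        # equation (4)
    participation = (1 - alpha) * R / (rho - alpha)   # equation (3)
    return max(cash_in_market, participation)

# With rho > R > 1 and scarce cash, the price falls below the fundamental R:
p = liquidity_price(gamma=0.3, x=1.0, alpha=0.8, R=1.5, rho=2.0)
print(p)   # 0.3, well below R = 1.5: liquidity pricing
```

The example shows the key comparative static: the less cash γ investors hold relative to the assets supplied, the further the price falls below the fundamental R.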

The minimum price PL is a new element compared to the model presented by Sapra (2008, p. 383). This provides a boundary for the asset prices, equal to the opportunity cost, which is lacking in the Sapra model. Allen and Carletti (2008a, p. 370) therefore present a different graph, which is sketched below in Figure 2.

Figure 2. PL determination in a situation of liquidity pricing.

The model as presented forces banks to devalue their assets according to the current fair market value when liquidity is low (Allen & Carletti, 2008a, pp. 371-372). Since banks try to remain solvent in all states, they optimally choose to reallocate their assets towards other categories. This distortion in portfolio and contract choices has a welfare cost. The use of fair value measurement thus comes at a cost in this case (Allen & Carletti, 2008a, p. 372).

The compensation for liquidity risk mentioned above is induced by the uncertainties in the market. According to the Allen and Carletti (2008b) model, banks face two types of liquidity-related uncertainties (risks) in financial markets, in which financial intermediaries, banks and consumers interact. These uncertainties consist on the one hand of idiosyncratic risk, driven by the liquidity needs of the consumer, and on the other hand of aggregate risk, triggered by the exposure to a liquidity shock. Both these risks channel the liquidity pricing into the reported values.

II. Liquidity Channels.

After the introduction of the standards and the intuition of liquidity pricing, we now explore the channels through which the liquidity pricing effect impacts the reported values. According to Ronen (2012) the exit value definition of FVA is an important channel. The exit values reported do not accurately reflect the value of the assets or liabilities; discounted cash flow values do this better. Particularly in situations of illiquid markets, caused by a crisis, the exit prices do not reflect the future earnings power, and this can lead to unnecessary liquidation (Ronen, 2012, p. 152). The Long Term Capital Management (LTCM) collapse illustrates this concept. The convergence trades made by the highly leveraged fund relied on the markets being complete and perfect. The flight to quality after the deferral of payment on Russian government debt led to the divergence of prices from the expected discounted future cash flows (Allen & Carletti, 2008b, pp. 3-4). The exit values caused a fast liquidation in this case. In the article by Allen and Carletti (2008a), it is stated that this divergence is caused by the interaction of financial institutions and markets. The use of FVA in these circumstances impacts the value of banks' assets through a reflection of the money available in the market. In a perfect market it would be possible to use a range of instruments to make sure a certain amount of liquidity is received from the counterparties. However, in the actual incomplete markets not all instruments are traded and not all firms hedge their liquidity risk. Those companies raise capital through the sale of assets. The moment of these sales has consequences for the price, as presented in the 'Liquidity Theory' section.

The definitions used in the standards provide several channels for the liquidity pricing effect to spread. Specifically, the overall relevance and reliability of the exit values with regard to the second and third level are questionable (Ronen, 2012, p. 151). The relevance of exit values applied in the second level is based upon the link between the actual requested value and its estimator. The definition of the principal market introduces another channel of impact for liquidity. The principal market may be subject to a reduction of volume and a decrease in the level of activity. In this case the principal market is replaced by the most advantageous market, with uncertain implications. Possibly the maximum amount that is received in this market deviates strongly from the price in the normal situation.

According to Allen & Carletti (2008b), some of the issues concerning liquidity pricing in crisis situations, when liquidity is scarce, can be handled by supplementing FVA with HCA and level 3 models. Their article states that in instances of liquidity scarcity the initial framework will provide neither useful nor reliable input for valuations.

Based upon the liquidity pricing theory and the channelling presented, certain expectations can be formed regarding the use of FVA during the crisis. The liquidity pricing models presented in the articles of Sapra (2008) and Allen & Carletti (2008a) suggest that in situations of low liquidity asset prices will drop. These market conditions do not have to impact the statements, since the regulations introduced by FAS no. 157 and IFRS no. 13 require active markets for level 1 valuation. In the case of inactive markets the assets should be valued through level 2 or 3. Allen & Carletti (2008a, p. 377) state however that this requirement is not enough to ensure that the markets are liquid, since a large supply can suddenly hit market prices when the market is unable to absorb this volume. Therefore the current regulations do not fully cover the issue of liquidity prices, and firms are still subject to liquidity pricing. Further research on the liquidity dimensions and their measurement is needed to assess the impact and the extent of liquidity pricing through FVA.

The hypothesis, according to the theory and the literature provided, is that liquidity pricing due to the financial crisis has moderately impacted the financial statements directly, as explained by Sapra (2008). There are sufficient channels through which asset prices valued at fair value find their impact on the balance sheet. Therefore I expect that most of the liquidity will be sustained. On the one hand this liquidity decreases the liquidity pricing losses; on the other hand it forces firms to value their financial assets at level 1 or 2, which results in a rapid loss accumulation if the underlying products lose value. The regulations of FVA force firms to reflect these losses on their financial assets directly on their balance sheets.


III. Liquidity Dimensions.

Up to this point liquidity has been presented as a single concept. This will first be refined to get a true insight into liquidity pricing. In the model of Allen and Carletti (2008a) liquidity is described as a single market feature; Nikolaou (2009) however refines this by explaining the liquidity cycle model. The latter model distinguishes several types of liquidity, which all may contribute to channelling liquidity pricing into the statements. The cycle is argued to consist of three distinct but interconnected types of liquidity, defined as central bank liquidity, market liquidity and funding liquidity. The debate, as discussed in the introduction, is focussed on market liquidity; therefore this type is now described more accurately.

Understanding the concept of market liquidity is vital for the assessment of the liquidity pricing impact on valuations. The concept is used in explaining a variety of mechanisms, namely asset liquidity, asset market liquidity, liquidity of a firm and financial market liquidity. The focus of this paper is on the latter. A market is perceived as liquid if it is possible to quickly sell large amounts of an asset without adversely impacting the price (Lybek & Sarr, 2002, pp. 4-5). In these markets financial assets are characterized by small transaction costs, easy trading and timely settlement. According to Keynes, market liquidity incorporates the key elements of volume, time and transaction costs (Nikolaou, 2009, p. 14). The elements Keynes formulated can be described by three dimensions: depth, breadth and resiliency. Baker concluded in 1996 that there is no single unambiguous and accepted definition of liquidity (Lybek & Sarr, 2002, p. 5). Therefore the full range of the liquidity dimensions includes multiple existing definitions. Subsequently, liquid markets were identified as sharing the following five characteristics (Lybek & Sarr, 2002, pp. 4-8):

 Tightness, referring to low transaction costs including the difference in buy and sell prices and implicit costs. The former is noted in quote-driven markets as the bid-ask spread.

 Immediacy, describing the speed with which orders can be executed and settled. This reflects the efficiency of the trading, clearing and settling systems.

 Depth, the characteristic that covers orders placed at prices away from the current trading price. These prices are not the actual execution prices but the incoming bids and asks to the market.

 Breadth, represents the quantity and size of the orders.

 Resiliency, the corrective action that is undertaken, and the time span it covers, when the market is in imbalance, meaning the prices do not reflect the fundamentals.

The main challenge lies in the fact that most of the data does not fully correspond to one of these dimensions (Lybek & Sarr, 2002, p. 5). Moreover, the measurement of market liquidity in financial markets has to incorporate some qualitative factors as well that assist in identifying the secondary market. The liquidity measurement section elaborates further on all dimensions and their corresponding measurement.

IV. Liquidity measurement.

This section presents multiple measurement tools of stock market liquidity to quantify the five dimensions explained in the last section. Insight into the changing market liquidity during the crisis is necessary to assess the impact of liquidity pricing through FVA. The theory of market liquidity provides traction on the measurement issues described before. The measurements presented below can be applied to a variety of financial instruments; however, the observability of the typical over-the-counter markets is lower than for exchange-traded markets such as equities (Lybek and Sarr, 2002, pp. 12-20). Following the dimensions brought forward by Lybek and Sarr (2002, pp. 5-8), a measurement system is developed. The basis of liquidity measurement lies in four different categories: (A.) transaction cost measures, (B.) volume-based measures, (C.) price-based measures, (D.) market-impact measures. Each of these categories measures the dimensions of liquidity to some extent.

A. Transaction cost measures.

This category captures the tightness dimension: the costs of trading financial assets and the associated frictions (Lybek and Sarr, 2002, pp. 8-11). The costs of trading consist of two types of expenses, implicit and explicit costs. The implicit costs are the execution costs incurred in making the transaction. The explicit costs are the collection of processing costs and taxes payable. Almost all of the implicit and explicit costs are included in the bid-ask spread. The primary costs reflected consist of order-processing, asymmetric-information, inventory-carrying and oligopolistic market structure costs (Lybek and Sarr, 2002, p. 9). The order-processing costs are caused by the regular handling costs. The provision of continuous markets, and the resulting immediacy to market participants, entails risks for the dealers. To compensate for these risks they have to charge a premium for the uncertainty of potential gains/losses in their process. As an increasing number of participants trade in the market, asymmetric information is revealed earlier and the costs of market continuity decrease. The cost reductions, as more trades are executed, in turn attract an increasing number of participants. Besides these feedback effects, the existence of a highly competitive dealer market reduces the transaction fee charged and accordingly improves the liquidity dimensions. In the case of high transaction costs, on the other hand, the number of participants decreases. The resulting small number of trades gives rise to situations of discontinuous prices. Meanwhile, large transactions take longer to execute in these markets, which reduces the ability to correct imbalances. These examples illustrate that market breadth (the quantity and size of orders) and market resiliency (the imbalance corrections) are impacted by a change in the bid-ask spread. In measuring the spread, a choice can be made between an absolute (5) and a relative measure (6).


Sabs = Pa − Pb    (5).

Srel = (Pa − Pb) / ((Pa + Pb)/2)    (6).

Pa = lowest quoted ask price
Pb = highest quoted bid price

The higher the absolute and the relative spread, the lower the market breadth and resiliency. The Pa and Pb used in the formulas can be approximated in two ways. Firstly, Pa can be the lowest ask and Pb the highest bid price. Secondly, a weighted average can be derived from the executed trades to determine the actually realized spread. These approximations may differ from each other, since not all trades are made in the middle of the bid-ask spread. In this research high-frequency databases are not used, because intraday patterns may not provide good indications of spread changes at a daily or weekly frequency. The analysis of the actual spread movements and their patterns is too complex for this research; therefore the bid-ask spread measures are calculated on a daily basis.
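The two spread measures can be illustrated with a short sketch; the quote values below are hypothetical, and scaling the relative spread by the mid-quote is an assumption about the exact form of (6).

```python
# Minimal sketch of the absolute (5) and relative (6) spread measures.
# Scaling by the mid-quote in the relative spread is an assumption;
# the input quotes are hypothetical.

def absolute_spread(ask, bid):
    """Equation (5): difference between best ask and best bid."""
    return ask - bid

def relative_spread(ask, bid):
    """Equation (6): absolute spread scaled by the mid-quote."""
    mid = (ask + bid) / 2.0
    return (ask - bid) / mid

print(absolute_spread(100.10, 100.00))   # roughly 0.10
print(relative_spread(100.10, 100.00))   # roughly 0.001 (10 basis points)
```

Applied to daily best bid/ask quotes, a widening of either measure signals lower tightness, breadth and resiliency.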

B. Volume-based measures.

This measurement category gives a solid basis for the evaluation of the dimensions of breadth and depth (Lybek and Sarr, 2002, pp. 11-14). In the article of Lybek and Sarr (2002, p. 11) it is stated that these definitions can be used interchangeably when using volume-based measures, since a large order (breadth) can be split into multiple components in order to leave the price unchanged (depth). Note however that incorrect inferences are often drawn about resiliency based upon a volume-based measure. The basic volume-based measure is the turnover rate, equation (8).

V = Σi Pi · Qi    (7).

Turnover rate = V / (S · P)    (8).

S = outstanding stock
Pi = price of the trades at end of day i
Qi = quantity of trades during day i
V = dollar volume traded
P = average closing price

The resulting turnover rates give a first indication of the trading volume (Lybek and Sarr, 2002, p. 12). The trading volume outcomes can be used to describe the developments of breadth and depth during the crisis. The volume measure presented above can, however, be substantially impacted by turnover volatility, which decreases its consistency and reliability as an approximation of the breadth and depth dimensions. For example, if a large order is placed, the high turnover of the entire period may lead to an unfounded conclusion of high market liquidity. To correct for this unusual turnover volatility, the Hui-Heubel liquidity ratio, as reflected in equation (9), is a more reliable measure. By assessing the separate firms in the index, the Hui-Heubel liquidity ratio includes information on the stock outstanding. The rationale is that for a judgement on the liquidity of a stock not the absolute volume is relevant, but the volume traded relative to the number of instruments outstanding. For example, a trading volume of $20 million in absolute terms is not a strong indicator, but when the trading volume is compared to the number of shares outstanding it provides more useful information.

Lhh = [(Pmax − Pmin) / Pmin] / [V / (S · P)]    (9).

Pmax = highest daily price over the last 5 days
Pmin = lowest daily price over the last 5 days
V = total dollar volume traded over the last 5 days
S = number of instruments outstanding
P = average closing price of the instrument over the 5-day period

The lower this ratio, the higher the liquidity of the financial market and the greater the breadth (Lybek and Sarr, 2002, p. 13).
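A short sketch of how the turnover rate (8) and the Hui-Heubel ratio (9) could be computed over a 5-day window; the closing prices, dollar volume and shares outstanding below are hypothetical.

```python
# Sketch of the turnover rate (8) and the Hui-Heubel ratio (9) over a
# 5-day window. The daily closes, dollar volume and shares outstanding
# are hypothetical.

def turnover_rate(dollar_volume, shares_outstanding, avg_price):
    """Equation (8): traded value relative to the market value of the stock."""
    return dollar_volume / (shares_outstanding * avg_price)

def hui_heubel(prices, dollar_volume, shares_outstanding):
    """Equation (9): relative price range divided by the turnover rate."""
    p_max, p_min = max(prices), min(prices)
    p_avg = sum(prices) / len(prices)
    turnover = turnover_rate(dollar_volume, shares_outstanding, p_avg)
    return ((p_max - p_min) / p_min) / turnover

prices = [50.0, 51.0, 49.5, 50.5, 50.0]   # five daily closing prices
lhh = hui_heubel(prices, dollar_volume=2.0e7, shares_outstanding=1.0e6)
print(lhh)   # lower values indicate higher breadth and resiliency
```

Scaling the price range by turnover is what distinguishes the ratio from the raw volume measure: the same price swing counts as less liquid when it is produced by less trading relative to the float.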

C. Price-based measures.

According to Lybek and Sarr (2002, pp. 5-15), new information in the markets may, and most frequently does, change equilibrium values. The price reaction to this information depends on the resiliency of the market. The resiliency dimension covers both the price changes and their speed. They state that a relatively high market resiliency is characterized by a high market efficiency coefficient (MEC). The MEC, equation (10), is based on the concept that price movements are more continuous in liquid markets. In contrast, markets with lower resiliency tend to follow a more volatile price path, with a correspondingly low MEC. The coefficient provides a measure for the continuity of price changes (Lybek & Sarr, 2002, pp. 14-15). Moreover, it distinguishes short-term from long-term price changes.

MEC = Var(Rt) / (T · Var(rt))    (10).

Var(Rt) = variance of the logarithm of long-period returns
Var(rt) = variance of the logarithm of short-period returns
T = number of short periods in each longer period
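The MEC can be sketched as follows, assuming daily log returns as the short period and a 5-day long period; the return series is hypothetical.

```python
# Sketch of the market efficiency coefficient (10). Long-period returns are
# built as sums of T consecutive short-period log returns; the sample
# variance uses the n-1 denominator. Inputs are hypothetical daily returns.

def mec(log_returns, T):
    long_returns = [sum(log_returns[i:i + T])
                    for i in range(0, len(log_returns) - T + 1, T)]

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    return var(long_returns) / (T * var(log_returns))

returns = [0.010, -0.005, 0.007, -0.012, 0.003,
           -0.008, 0.011, 0.002, -0.004, 0.006,
           0.009, -0.010, 0.001, 0.005, -0.003,
           0.008, -0.007, 0.004, -0.002, 0.012]
print(mec(returns, T=5))   # values near 1 suggest continuous, resilient prices
```

In a liquid market short-period volatility roughly aggregates into long-period volatility, so the ratio stays close to (slightly below) one; strongly mean-reverting or overshooting prices push it away from one.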

At the same time Lybek and Sarr (2002, p. 15) question the direct link between continuity and resiliency. For instance, when the fundamental values of the assets or liabilities afterwards turn out to be lower than assumed, the market creates an order imbalance. This imbalance shifts the market price downwards. If, due to this price shift, the excess supply is countered by new buying orders, the market seems to be very efficient, while in reality it is not. Therefore Lybek and Sarr (2002, pp. 15-17) recommend the use of lagged regression techniques as well, to improve the fit of the MEC as an estimate of resiliency.

D. Market-impact measures.

The Hui-Heubel liquidity ratio corrects the basic volume indicator for unusually large orders; however, it does not differentiate between permanent and transitory price changes (Lybek and Sarr, 2002, p. 17). The price changes occurring while new information is revealed may be misinterpreted, because a significant price movement can occur despite a small trading volume. Price changes due to significant news events should for this reason be extracted to measure the market liquidity more effectively. The dimension of breadth is better understood by excluding these price changes due to new information.

The aforementioned unexpected price changes are systematic risks that affect all stocks (Lybek and Sarr, 2002, p. 17). The capital asset pricing model (CAPM), equation (11), has a measure for these risks, called market beta.

Ri = α + β · Rm + ui    (11).

Ri = daily stock return

Rm = daily market return

β = regression coefficient of systematic risk
ui = regression residuals or specific risk

By subtracting the market risk from the equation it is possible to calculate the market-adjusted liquidity for equities, by using the residuals to run a new regression on the volume of stock traded. This regression, formula (12), determines the intrinsic liquidity of the asset (Lybek and Sarr, 2002, p. 18).

ui² = γ1 + γ2 · Vi + ei    (12).

ui² = residuals squared
Vi = percentage change in dollar volume traded
ei = residual

The γ2 coefficient represents the impact of the trading volume on the variance of asset prices. A relatively low value of the coefficient implies high asset liquidity and accordingly a higher breadth dimension.
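The two-step procedure in (11) and (12) can be sketched with simple bivariate OLS; the return and volume series below are hypothetical, and a real application would use daily data over the sample period.

```python
# Sketch of the market-impact measures: equation (11) fitted by simple
# bivariate OLS, then equation (12) on the squared residuals. The return
# and volume series are hypothetical.

def ols(x, y):
    """Bivariate OLS of y on x: returns (intercept, slope, residuals)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    slope = cov / var
    intercept = my - slope * mx
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    return intercept, slope, resid

market = [0.010, -0.020, 0.015, 0.005, -0.010, 0.008]   # daily market returns
stock  = [0.012, -0.018, 0.014, 0.004, -0.009, 0.007]   # daily stock returns
volume = [0.05, 0.20, -0.10, 0.02, 0.15, -0.05]         # % change in $ volume

_, beta, u = ols(market, stock)                    # equation (11)
_, gamma2, _ = ols(volume, [ui ** 2 for ui in u])  # equation (12)
# A low gamma2 implies volume barely moves prices: high intrinsic liquidity.
print(beta, gamma2)
```

The first regression strips out the systematic (market-wide) component of returns; the second asks how much of the remaining firm-specific variance is explained by trading volume, which is the intrinsic-liquidity reading of γ2.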


4. Method and Data.

The previous chapter has provided a theoretical basis on liquidity, liquidity pricing and its measurement dimensions. To evaluate these dimensions properly, the data has to be selected and utilized carefully. This chapter describes the data that has been used, the empirical testing executed and the hypothesis that is set.

Since market liquidity is not directly observable, approximations have to be made to extract useful information from alternative sources. As explained in the liquidity measurement section, stock market statistics offer important information on market liquidity. The information required to describe the dimensions according to the paper by Lybek and Sarr (2002) can be obtained from stock exchange data for the S&P 500. The essential factors brought forward in this paper are the price and the volume. The other input needed can be computed from these exogenous variables. The S&P 500 is a suitable data source for various reasons.

First of all, the index consists of a broad variety of firms and therefore provides a fairly complete indication of overall market liquidity. Besides this, the basic data is publicly available through the finance websites of Yahoo and similar sources. The data used is graphed in Appendix 2, which can be used to get a basic insight into the sample.

Moreover, stock market prices are reported at frequent time intervals. A daily interval is used for executing this research. These daily prices are accessible and prevent an overload of details and unnecessary complications compared to data at even smaller time intervals. By selecting daily time intervals, we accept that part of the volatility may not be captured. This is the case when the price of a certain stock or index has undergone a larger fluctuation during the day than the closing prices suggest. Being aware of this issue, the end-of-day prices are used.

Most of the additional information needed can be further derived from the quoted information. Since the stock information is provided on a daily basis, the developments of the other variables can be tracked over time as well.

The daily data sample used consists of S&P 500 index and firm information. This index consists of a selection from the NASDAQ, NYSE and AMEX based upon market capitalization. The representation depends on the weighted size of the current market capitalization of the firms. The data used in this research is retrieved from a Bloomberg terminal, as shown in Appendix 5. The period of research focuses on the financial crisis that started in 2007. To provide a reference for the liquidity measurements, the period before the start of the crisis is also included in the sample. The volume-based measure requires the number of outstanding stock (S) as input. This cannot be provided by the data from an index; therefore the research is based upon the data of five firms. This sample is taken to represent the financial markets. As the resources for this research are limited, the data sample is based on the firms with the largest market capitalization in the S&P 500 index. The total sample period ranges from the 3rd of January 2006 to the 31st of December 2010.


The first step is to compute the missing values needed to calculate the measures of all the dimensions. Both transaction cost measures, the absolute and relative spread, can be calculated using the bid and ask prices provided by Bloomberg. The spread will be tested by comparing the pre-crisis, crisis and post-crisis situations.

The volume-based measures, the turnover rate and the Hui-Heubel liquidity ratio, are calculated for the five firms with the largest weight in the S&P index. Those companies are Exxon Mobil, Microsoft, Apple, Johnson & Johnson, and Procter & Gamble.

The third measure is price-based and is computed with the variance of the returns over long and short periods. The ratio used is the market efficiency coefficient, which provides an indication of the resiliency of the market (Lybek & Sarr, 2002, pp. 14-15). A value slightly below one should be expected, because of a minimum of short-term volatility. Lower values can be seen as indicators of lower market resiliency.

The fourth measure is based on the market impact of trading. The measure results from the CAPM model combined with the calculation of intrinsic liquidity. Regressing the daily stock excess return on the market excess return provides α, β and the residuals. The impact of the trading volume on the variance of the prices provides an insight into the dimension of breadth.


5. Results.

A. Transaction cost measures.

The dimensions of tightness, breadth and resiliency are reflected by the bid-ask spread measures in formulas (5) and (6). Both the absolute and the relative spread are researched. These measures are mapped in Appendix 1, which provides an overview of their development over time. Both spreads provide evidence on the dimensions described.

The absolute and the relative bid-ask spread results show a large spread for all the stocks from the beginning of 2006 until the second quarter of 2007. The S&P 500 has a bigger spread over the whole period. The spread increases substantially when Lehman Brothers files for bankruptcy on the 15th of September 2008. According to these findings, the tightness, breadth and resiliency of the financial stock market were low during 2006 and the first quarter of 2007. It can also be stated that these dimensions were negatively impacted during the six months after September 2008 for the S&P 500 index.

B. Volume-based measures.

The volume-based measures, the turnover rate (8) and the Hui-Heubel liquidity ratio (9), are computed for the five firms included in the data sample, see Appendix 3. Neither of these measures can be calculated for the S&P 500 index, since the number of outstanding stock is one of the input variables. On the one hand the S&P 500 itself does not have shares outstanding; on the other hand the mutual funds that track the index are not suitable to represent the number of outstanding stock. The turnover rate gives an indication of the speed at which the stocks change hands (Lybek & Sarr, 2002, pp. 11-13). The higher the turnover rate, the higher the breadth and depth of the market. The turnover rate, while different for every stock, increased for half a year after the 15th of September 2008. The graph shows that the slight rise was consistent for a substantial period. Compared to the turnover rate earlier in the time sample, the rise is very minor.

The Hui-Heubel liquidity ratio measures the breadth and the resiliency of the stocks. A low Hui-Heubel ratio indicates high liquidity; a high ratio indicates a situation of low liquidity. As shown in Appendix 3, the ratios concerning J&J, P&G and Exxon Mobil increased at the start of the crisis. This indicates a lower breadth and resiliency during this period. After the second quarter of 2009 the ratios stabilize around the original level again. The Hui-Heubel ratio shows that the breadth and resiliency in the financial markets were restored after some time. The recovery to the original level of the ratio suggests that the ratio had deviated from its usual value, supporting the point that the market conditions of breadth and resiliency were unusually low during 2008.

C. Price-based measures.

In essence the MEC (10) is about the response time of the market to a change in the equilibrium price. This continuous price movement in liquid markets can be observed more closely if the short-term intervals are very small. The time intervals proposed by Gabrielsen et al. (2011, pp. 15-16) originate from 1988. At that time Hasbrouck and Schwartz recommended using a short time interval of half an hour and a long interval of two days. Since the speed of trading has changed so dramatically, a more applicable interval nowadays would be a few seconds or a minute. During the research I have not been able to obtain access to this kind of data. Therefore these measures are left out of further consideration in this research. For additional research this could be an important starting point.

D. Market-impact measures.

First, an insight into the overall data is provided by calculating the CAPM coefficients as introduced in equation (11). The results below, the outcome of the regression in formula (11), are obtained by regressing the returns of the five stocks on the S&P 500 return during the period 2006 to 2010. A 95% confidence level is used.

Regression of firm returns on S&P 500 returns, 2006-2010

                 Coefficient   Std. Err.       P         F      R²
Exxon Mobil   β   0.9604100   0.0226730   0.000   1794.30  0.5882
              α   0.0258094   0.0356796   0.470
Microsoft     β   0.9262803   0.0295527   0.000    982.41  0.4389
              α   0.0162685   0.0465057   0.727
Apple         β   0.9980826   0.0361425   0.000    762.60  0.3778
              α   0.1372563   0.0568759   0.016
J&J           β   0.4955868   0.0159192   0.000    969.17  0.4355
              α   0.0014074   0.0250513   0.955
P&G           β   0.5670388   0.0171625   0.000   1091.60  0.4650
              α   0.0090786   0.0270079   0.737

Table 3. Regression results of the firms' returns on the S&P 500 return, 2006-2010.

These results serve as a reference for the further results presented. As shown in table 3, only the α of Apple is significant over the 2006-2010 period, as indicated by its p-value of 0.016. As expected by general CAPM theory, both the technology firms, Apple and Microsoft, and the oil firm, Exxon Mobil, have a higher β than the consumer goods firms, J&J and P&G. A higher beta coefficient implies a higher sensitivity of asset returns to market returns. For now, the information in table 3 can be used to compare the annual numbers presented in table 4.
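The per-stock CAPM regression of equation (11) can be reproduced with ordinary least squares. A minimal sketch in NumPy (the thesis used Stata; the data here is synthetic and the names are illustrative):

```python
import numpy as np

def capm_ols(stock_ret, market_ret):
    """OLS estimate of r_stock = alpha + beta * r_market + u, as in equation (11).

    Returns alpha, beta and the residuals u, which are squared later
    for the market-impact regression.
    """
    X = np.column_stack([np.ones_like(market_ret), market_ret])
    coef, *_ = np.linalg.lstsq(X, stock_ret, rcond=None)
    alpha, beta = coef
    residuals = stock_ret - X @ coef
    return alpha, beta, residuals

# Synthetic check: returns constructed exactly from alpha = 0.02, beta = 0.9
# should be recovered with (near-)zero residuals.
rng = np.random.default_rng(0)
market = rng.normal(0.0, 1.0, 250)   # ~250 trading days of market returns
stock = 0.02 + 0.9 * market
alpha, beta, u = capm_ols(stock, market)
```

With real daily data, one call per firm per year reproduces the structure of table 4.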

In this research the stock data is split into separate years to track the development of the market-impact measures over time. The five one-year samples contain the daily return and volume data. In the first step, the daily stock returns are regressed on the S&P 500 return of the same year according to equation (11). The results, as produced by Stata, are provided in table 4, which summarizes the estimated coefficients. All β coefficients are significant at the 95% level, while among the α only those of Apple in 2007 and 2009 prove significant.


Regression of firms on S&P 500 annually

2006
                 Coefficient   Std. Err.       P        F      R²
Exxon Mobil   β   0.9247134   0.1086659   0.000    72.41  0.2260
              α   0.0725201   0.0683964   0.290
Microsoft     β   0.7708370   0.1214098   0.000    40.31  0.1398
              α   0.0152138   0.0764176   0.842
Apple         β   1.6291220   0.2224643   0.000    53.63  0.1778
              α   0.0035868   0.1400232   0.980
J&J           β   0.3916151   0.0771390   0.000    25.77  0.0941
              α   0.0124436   0.0485528   0.798
P&G           β   0.5867121   0.0756823   0.000    60.10  0.1951
              α   0.0117969   0.0476359   0.805

2007
                 Coefficient   Std. Err.       P        F      R²
Exxon Mobil   β   1.1608490   0.0641693   0.000   327.26  0.5679
              α   0.0701712   0.0647822   0.280
Microsoft     β   1.0701340   0.0822256   0.000   169.38  0.4048
              α   0.0640156   0.0830110   0.441
Apple         β   1.1843910   0.1294879   0.000    83.66  0.2515
              α   0.3441857   0.1307246   0.009
J&J           β   0.3765821   0.0438111   0.000    73.88  0.2288
              α   0.0000818   0.0442296   0.999
P&G           β   0.4880357   0.0518692   0.000    88.53  0.2623
              α   0.0483999   0.0523646   0.356

2008
                 Coefficient   Std. Err.       P        F      R²
Exxon Mobil   β   1.0455340   0.0467295   0.000   500.60  0.6660
              α   0.1566952   0.1207702   0.196
Microsoft     β   0.9633622   0.0476510   0.000   408.73  0.6195
              α  -0.0458557   0.1231517   0.710
Apple         β   0.9711708   0.0653945   0.000   220.55  0.4677
              α  -0.1103322   0.1690092   0.514
J&J           β   0.5809039   0.0294049   0.000   390.27  0.6086
              α   0.0674592   0.0759957   0.376
P&G           β   0.5910540   0.0299586   0.000   389.23  0.6080
              α   0.0449222   0.0774267   0.562

2009
                 Coefficient   Std. Err.       P        F      R²
Exxon Mobil   β   0.7661725   0.0416956   0.000   337.66  0.5746
              α  -0.1229030   0.0716821   0.088
Microsoft     β   0.8305764   0.0819574   0.000   102.70  0.2912
              α   0.1406348   0.1408994   0.319
Apple         β   0.8769359   0.0558006   0.000   246.98  0.4970
              α   0.2956068   0.0959312   0.002
J&J           β   0.3893825   0.0334156   0.000   135.79  0.3520
              α  -0.0026890   0.0574473   0.963
P&G           β   0.5844357   0.0430286   0.000   184.48  0.4246
              α  -0.0533566   0.0739739   0.471

2010
                 Coefficient   Std. Err.       P        F      R²
Exxon Mobil   β   0.8430254   0.0426932   0.000   389.91  0.6093
              α  -0.0105025   0.0485400   0.829
Microsoft     β   0.8728318   0.0697751   0.000   156.48  0.3850
              α  -0.0695335   0.0793308   0.382
Apple         β   1.0361370   0.0660562   0.000   246.04  0.4960
              α   0.1267864   0.0751025   0.093
J&J           β   0.4305232   0.0389846   0.000   121.96  0.3279
              α  -0.0357919   0.0443235   0.420
P&G           β   0.4675450   0.0369156   0.000   160.41  0.3909
              α   0.0017604   0.0419712   0.967

Table 4. Annual regression results of the firms' returns on the S&P 500 return, 2006-2010.


The results in table 4 already reveal that the regression coefficients of the stocks decreased when the crisis commenced in 2008. The main interest of our research, however, is the return response to changes in volume. Therefore the daily squared residuals are computed from the coefficients in table 4 using the CAPM. To calculate the residuals, β is assumed to be constant over each year. The α are included in the computation of the u_i² only when their p-values indicate significance.

The squared residuals resulting from formula (11) are used for further computations. As introduced by formula (12), the squared residuals, u_i², are regressed on the percentage change in dollar volume traded, ΔV_i. The resulting coefficients correspond to γ1 and γ2. The latter is important for determining the intrinsic liquidity of the stock, which can also be described as the impact of trading volume on the variance of asset prices. The results for γ2 are presented below in table 5.
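The second-stage regression of formula (12) can be sketched in the same way as the first stage, again with NumPy least squares (illustrative names and synthetic data, not the thesis code):

```python
import numpy as np

def market_impact_ols(sq_residuals, pct_vol_change):
    """OLS estimate of u_i^2 = gamma1 + gamma2 * dV_i, as in formula (12).

    gamma2 captures how strongly trading volume moves the variability of
    the stock price; a lower gamma2 implies higher intrinsic liquidity.
    """
    X = np.column_stack([np.ones_like(pct_vol_change), pct_vol_change])
    (gamma1, gamma2), *_ = np.linalg.lstsq(X, sq_residuals, rcond=None)
    return gamma1, gamma2

# Synthetic check: data generated with gamma1 = 1.0 and gamma2 = 0.05
# should be recovered exactly.
dv = np.linspace(-0.5, 0.5, 250)   # daily percentage change in dollar volume
u2 = 1.0 + 0.05 * dv               # squared residuals from the first stage
g1, g2 = market_impact_ols(u2, dv)
```

Feeding in the squared residuals from the annual CAPM regressions reproduces the structure of table 5, one γ1/γ2 pair per firm per year.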

The lower the γ2 coefficient, the smaller the impact of trading volume on the variability of the stock price (Lybek & Sarr, 2002, pp. 17-18), and accordingly the higher the intrinsic liquidity of the stock. As can be seen, the intrinsic liquidity changes over the years, and both Microsoft and Apple have a lower intrinsic liquidity than the other firms in our sample. In the case of Apple this may be caused by the high absolute share price, which excludes small investors. The consumer goods firms in our sample, J&J and P&G, have a substantially higher intrinsic liquidity.


Regression of u_i² on ΔV_i annually

2006
                  Coefficient   Std. Err.       P        F      R²
Exxon Mobil   γ2   0.0100737   0.0036783   0.007     7.50  0.0294
              γ1   1.1264550   0.1102529   0.000
Microsoft     γ2   0.1004581   0.0077226   0.000   169.22  0.4056
              γ1   0.5816178   0.4226257   0.170
Apple         γ2   0.0813441   0.0180253   0.000    20.37  0.0759
              γ1   4.2243780   0.8312848   0.000
J&J           γ2   0.0135611   0.0021519   0.000    39.71  0.1380
              γ1   0.5133800   0.0804571   0.000
P&G           γ2   0.0203352   0.0018060   0.000   126.79  0.3383
              γ1   0.4106160   0.0832634   0.000

2007
                  Coefficient   Std. Err.       P        F      R²
Exxon Mobil   γ2   0.0123268   0.0034800   0.000    12.55  0.0480
              γ1   0.9961468   0.1123792   0.000
Microsoft     γ2   0.0227424   0.0087005   0.009     6.83  0.0267
              γ1   1.5857290   0.3213721   0.000
Apple         γ2   0.0924629   0.0124352   0.000    55.29  0.1817
              γ1   3.6987930   0.5207818   0.000
J&J           γ2   0.0060985   0.0013792   0.000    19.55  0.0728
              γ1   0.4475144   0.0534168   0.000
P&G           γ2   0.0098920   0.0023864   0.000    17.18  0.0645
              γ1   0.6183670   0.0961212   0.000

2008
                  Coefficient   Std. Err.       P        F      R²
Exxon Mobil   γ2   0.0463029   0.0181151   0.011     6.53  0.0254
              γ1   3.5144420   0.4823382   0.000
Microsoft     γ2   0.0403633   0.0168580   0.017     5.73  0.0223
              γ1   3.6033590   0.5644785   0.000
Apple         γ2   0.1099401   0.0331887   0.001    10.97  0.0419
              γ1   6.7279560   1.0284740   0.000
J&J           γ2   0.0030106   0.0061067   0.622     0.24  0.0010
              γ1   1.4349480   0.1962426   0.000
P&G           γ2   0.0138328   0.0044983   0.002     9.46  0.0363
              γ1   1.4355230   0.1498875   0.000

2009
                  Coefficient   Std. Err.       P        F      R²
Exxon Mobil   γ2   0.0316378   0.0033196   0.000    90.83  0.2665
              γ1   1.1699930   0.1266325   0.000
Microsoft     γ2   0.1542239   0.0246275   0.000    39.22  0.1356
              γ1   4.1064510   1.0190330   0.000
Apple         γ2   0.0385502   0.0102667   0.000    14.10  0.0534
              γ1   2.1517290   0.3136514   0.000
J&J           γ2   0.0113922   0.0038756   0.004     8.64  0.0334
              γ1   0.7837861   0.1120417   0.000
P&G           γ2   0.0279480   0.0045826   0.000    37.20  0.1295
              γ1   1.2355190   0.1589350   0.000

2010
                  Coefficient   Std. Err.       P        F      R²
Exxon Mobil   γ2   0.0029325   0.0016322   0.074     3.23  0.0127
              γ1   0.5740959   0.0569427   0.000
Microsoft     γ2   0.0198784   0.0051367   0.000    14.97  0.0565
              γ1   1.4587650   0.1896794   0.000
Apple         γ2   0.0210058   0.0057469   0.000    13.36  0.0507
              γ1   1.3260690   0.1949793   0.000
J&J           γ2   0.0049739   0.0014149   0.001    12.36  0.0471
              γ1   0.4629285   0.0546766   0.000
P&G           γ2   0.0068324   0.0013302   0.000    26.38  0.0955
              γ1   0.3933263   0.0574502   0.000

Table 5. Annual regression results of the squared residuals u_i² on the percentage change in dollar volume ΔV_i.
