So a typical company in manufacturing might do 8 inventory turns. Samsung does 17. Dell, which practically invented hardcore electronics supply chain management, does 36. Apple is doing 74!

Click through to Gartner to get the full report.
although the study of payment flows is of immediate interest to central bankers, it may miss an essential aspect of systemic risk, namely the ‘contagion dynamics’ of public perceptions and asset valuation associated with the interaction of balance-sheets (the mutual financial obligations and exposures that link companies). For example, how contagious are inflated valuations of Internet stocks? Are there hidden, mutually dependent risks associated with such high valuations? It could be useful to examine the dynamic network of balance-sheets, and if possible to quantify the interactive effects of valuations, credit policies, hedging and so on among financial institutions, especially investment banks. Such balance-sheet networks could be helpful in studying the effects of asset-pricing bubbles, credit crises and the poorly understood but potentially worrying effects of the current widespread use of derivatives (futures and options) and dynamic hedging by investment banks to manage risk on the fly.

Just discovered Sugihara through a link in an interesting post on O'Reilly Radar.
Fascinating interview with Gary Gorton re: the gory details of the credit crisis of late 2008, which continues to the present:
Securitization basically allows the traditional banks to finance their loans by selling them rather than holding them on balance sheet, and the source of value here is avoidance of bankruptcy costs. A firm that originates loans does so by lending money to any number of borrowers—both corporate and consumer—and it then selects a large portfolio of its loans to sell, in a very specific legal sense, to a “special purpose vehicle,” an entity it creates for that very precise reason. The main advantage of doing so—of establishing the SPV and legally selling the loans to it—is that this arrangement circumvents the costs associated with bankruptcy.
Via Alex Tabarrok at MR.
I understand why Gorton is in favor of innovation in revenue generation, and I understand why he considers shadow banking an inevitable next step given the amount of institutional money moving around now. I even understand why there need to be bankruptcy-remote entities -- there should be a spread of higher- and lower-risk vehicles.
What I don't understand is how or why these special purpose vehicles get to stay off the balance sheet.
[Submitted today in response to this call for papers. -MFM]
Thanks to the ubiquity and integration of digital computers, each consumer of a product or service now produces a digital stream of data. Bundled together, these streams of data record how particular products or services, including any patented features, are valued by actual consumers in different geographical regions and in different windows of time. Most relevant to patent royalty calculations are the histories of transactions for a particular product or service. These transactions eventually show up in financial statements, which permit top-down estimates of value, such as revenue or gross margins. But alternative metrics drawn from these streams, such as change in the frequency of transactions before and after the introduction of a patented feature, also permit bottom-up estimates of the value of patented features. Demand can also be estimated using metrics for customer engagement, such as the frequency and length of visits to a particular website or the conversion ratio of number of independent visits per completed transaction. In some cases, analyses of these data permit calculation of a more reliable lower bound for the marginal value of a patented feature to a complex product or service. More generally, time-frequency analyses of data on consumer behavior complement existing methods for calculating lost profits or reasonable royalties, and can help resolve questions about the marginal value a patented feature adds to a complex or highly integrated product or service.
The Damages Apportionment Problem for Complex Systems
Under 35 U.S.C. § 284, “[u]pon finding for the claimant the court shall award the claimant damages adequate to compensate for the infringement but in no event less than a reasonable royalty for the use made of the invention by the infringer, together with interest and costs as fixed by the court.” Famously, the Southern District of New York laid out a “comprehensive list of [fifteen] evidentiary facts relevant, in general, to the determination of the amount of a reasonable royalty for a patent license.” Georgia-Pacific Corp. v. U.S. Plywood Corp., 318 F. Supp. 1116, 1120 (S.D.N.Y. 1970). The thirteenth in this list is “[t]he portion of the realizable profit that should be credited to the invention as distinguished from non-patented elements, the manufacturing process, business risks, or significant features or improvements added by the infringer.” Id.
When Georgia-Pacific was decided in 1970, Jimi Hendrix was still performing, the Internet was known only as ARPANET, and access to information about cutting-edge research and development usually required national security clearance. Four decades later, instead of secrets about national defense leaking to the press through a former first lieutenant in the Marine Corps with high-level security clearance, an Australian hacker is leaking them to the Internet. Instead of a carefully choreographed ballet of contractors producing a 747, a globally distributed swarm of programmers continuously releases new features and combinations of features in an effort to attract a stable base of users. Consumers depend upon producers – producers who compete vigorously with each other to make the sales necessary for their own survival – to cooperate in the development and adoption of industry standards that are necessary for complex, integrated systems to exist at all. The Federal Circuit put it well in describing one of the most commercially successful productivity software products as “an enormously complex software program comprising hundreds, if not thousands or even more, features.” Lucent Technologies, Inc. v. Gateway, Inc., 580 F.3d 1301, 1333 (Fed. Cir. 2009). The Federal Circuit found “it inconceivable to conclude, based on the present record, that the use of one small feature … constitutes a substantial portion of the value.” Id.
Most lawyers agree with the Federal Circuit that the base for calculating royalties should be discounted to adjust for the value that other features contribute to a product. Other features may include features themselves covered by other patents (not owned in common with a patent-in-suit), or features not easily protectable through patents (such as the durability of parts, precision of manufacturing tolerances, or the timely availability of the product or service at a requested order volume). Moreover, what feature or features were sufficient or necessary to the completion of a sale may vary from customer to customer, or even for the same customer over time. In most cases, it is indeed inconceivable that any single patented feature could be responsible for the entire revenue (much less profit) generated by a complex or highly integrated product or service.
On the other hand, the very fact that an infringer has chosen to implement a patented feature rather than a non-infringing alternative is strong evidence that the patented feature most likely contributes some finite value to the product or service. That value, in turn, is ultimately reflected in the revenue and profit generated by the sale of the complex product or service that includes the patented feature.
[Discussion of economic case for patents, focusing on the potential social benefit of granting property rights in easily appropriable innovations in markets with high barriers to entry. Exclusionary rights permit supracompetitive prices in proportion to inelasticity of demand for patented innovations. Thus, measuring inelasticity of demand is core problem for patent valuation in general, and for damages apportionment in particular.]
Measuring the Price Elasticity of Demand
The Federal Circuit has acknowledged that price elasticity is at the core of the problem of patent valuation. The opinion in Crystal Semiconductor v. Tritech Microelectronics, 246 F.3d 1336 (Fed. Cir. 2001) is particularly instructive, although it deals specifically with price erosion. In Crystal Semiconductor, the Federal Circuit explained the economic theory of price elasticity:
For example, if substitution of a product were impossible and the product were a necessity, elasticity of demand would be zero — meaning consumers would purchase the product at identical rates even when the price increases. This very rare type of market is called inelastic. On the other side of the spectrum, if any price increase would eradicate demand, elasticity of demand would be infinite — meaning consumers would decline to purchase another single product if the price increases by any amount. This very rare type of market is called perfectly elastic. Markets typically have an elasticity greater than zero and less than infinity.
Crystal Semiconductor, 246 F.3d at 1359 (italics added). In other words, the market value of a patented feature should be tied to its effect on price elasticity, and price elasticity can be estimated by reference to the relative rate of sales at different price levels. The Federal Circuit went on to rule that the patentee in Crystal Semiconductor had failed to show sufficient evidence of whether or how much the rate of sales would have declined as a result of a price increase. Id. at 1360-61. It is enough for my purposes, however, to note that the Crystal Semiconductor case provides a legal precedent for the use of frequency-averaged measures of the value of patented innovations, such as relative rates of sales.
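To make this concrete, here is a minimal sketch (in Python, with invented numbers -- nothing here is drawn from the case) of how an arc elasticity can be estimated from relative rates of sales at two price levels:

    # Arc (midpoint) elasticity of demand from sales rates at two price points.
    # All figures are hypothetical; a real analysis would pull them from
    # transaction records before and after a price change.

    def arc_elasticity(q1, q2, p1, p2):
        """Percent change in quantity over percent change in price,
        computed against midpoint averages so the estimate is symmetric."""
        dq = (q2 - q1) / ((q1 + q2) / 2)
        dp = (p2 - p1) / ((p1 + p2) / 2)
        return dq / dp

    # 1,000 units/week at $10 versus 800 units/week at $12:
    print(arc_elasticity(1000, 800, 10.0, 12.0))  # about -1.22: demand is elastic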
Top-Down Estimates from Time-Averaged Revenues and Profits
The approach to estimating price elasticity described in Crystal Semiconductor requires the use of revenue and profit figures, which would typically be derived from financial statements. Financial statements provide a highly compact summary of the entire history of transactions that occurred among a company and its many stakeholders, from the purchasing of capital equipment and parts from suppliers, to the payment of labor for design and assembly, to the actual payments of cash from customers for delivery of the final product. Thus, financial statements provide a good foundation for what I will call “top-down” estimates of the value of patented features.
[Discussion of the information architecture of financial statements: Income and cash-flow statement record time-integrated flow of revenue and expenses; balance sheet records snapshot in time of assets and liabilities. The information architecture of financial statements lends itself to optimization of time-averaged measures of efficiency in increasing sales, decreasing costs of sales. Trends in sales or costs of sales are visible only after two or more periods have elapsed.]
Top-down estimates of the value of patented innovations are most appropriate for valuation of products or services that are protected by a single patent. Top-down estimates provide only a ceiling on valuation when multiple patents cover each product or service sold in the transactions that ultimately result in the financial statements from which the top-down estimates are drawn.
Frequency-Averaged Measures of Productivity
Every time-averaged measure of profitability has a frequency-averaged counterpart. In practice, experienced investors and managers use both to develop a picture of the health of an organization. For example, in addition to gross margin – a time-averaged ratio of profit to revenue – experienced investors and managers also track inventory turnover – a frequency-averaged ratio of costs of goods sold to average inventory. A company with high gross margins but dropping inventory turnover is unlikely to maintain its margins going forward. Conversely, low margins with increasing inventory turnover signal that higher margins might be available in the future. [Discuss analogy to driving without a tachometer.]
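For readers who want the two measures side by side, a minimal sketch in Python (the figures are invented for illustration):

    # Time-averaged versus frequency-averaged measures from the same books.
    # All figures are hypothetical.

    revenue = 1_000_000.0        # sales over the period
    cogs = 700_000.0             # cost of goods sold over the period
    avg_inventory = 140_000.0    # average inventory held during the period

    gross_margin = (revenue - cogs) / revenue   # time-averaged ratio
    inventory_turns = cogs / avg_inventory      # a frequency: turns per period

    print(f"gross margin    = {gross_margin:.1%}")     # 30.0%
    print(f"inventory turns = {inventory_turns:.1f}")  # 5.0 per period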
Inside large corporations, managers don’t wait until the end of the quarter to spot trends in supply or demand for a particular good or service. Rather, managers monitor daily reports about order and inventory volume to ensure that the supply of products and services stays synchronized with customer demand.
Bottom-up Estimates from Frequency-Averaged Measures
The stream of data that management uses for frequency-averaged measures of productivity provides the basis for alternative, bottom-up estimates of demand for patented innovations. Bottom-up estimates derived from frequency-averaged measures are useful in valuing patents on innovations introduced as features of an existing product or service, or in valuing patents on products or services that are sold as a bundle with many other products or services. Bottom-up estimates may also be helpful in putting a more reliable floor on the minimum value of a patented innovation. A variety of techniques may be used to produce bottom-up estimates using frequency-averaged measures. I give two as examples: Fourier analysis of a time-series of transactions and A|B testing of a web service.
Sales of a particular product are recorded as a time-series of transactions, each of which is associated with a particular price. Within a particular window of time, one can count up the number of transactions and the total sales. The former is a frequency-integrated measure of sales, the latter the time-integrated measure of revenue. Focusing on the former, we can see how the frequency of sales changes within different same-width and different-width windows of time. Most relevant to patent valuation, we can look at how the frequency of sales changes before and after the introduction of a new patented feature. If frequency increases even as price is held constant or decreased, the patented feature probably increased the demand for the product or service.
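As a sketch of the idea (in Python, with invented timestamps; the window widths and launch date are hypothetical):

    # Compare transaction frequency in equal-width windows before and after
    # a feature launch. Timestamps are in days from an arbitrary origin.
    from bisect import bisect_left

    def rate(timestamps, start, end):
        """Transactions per day within [start, end); input must be sorted."""
        n = bisect_left(timestamps, end) - bisect_left(timestamps, start)
        return n / (end - start)

    sales_days = sorted([1.2, 3.5, 8.0, 14.1,                             # before
                         21.0, 22.4, 24.9, 26.3, 28.8, 31.5, 33.0, 36.2]) # after
    launch = 20.0  # day the patented feature shipped (hypothetical)

    before = rate(sales_days, launch - 20, launch)  # 0.20 transactions/day
    after = rate(sales_days, launch, launch + 20)   # 0.40 transactions/day
    # If price was held constant and the rate doubled, the feature
    # probably increased demand.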
Because frequencies change seasonally, many web services rely upon a more direct method for testing demand for a feature, called A|B testing. Using A|B testing, managers can watch how actual consumers interact in real-time with a web service that includes slightly different features. It is conceivable that A|B testing could be used in some cases to demonstrate that a patented feature is less desirable than an existing version of the web service, effectively resolving some disputes that cannot be resolved using only top-down estimates of value. [Discussion of A|B testing, Google Analytics, Optimizely, &c.]
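A minimal sketch of the statistics behind such a test (Python; the visit and conversion counts are invented, and real A|B tools handle this bookkeeping for you):

    # Two-proportion z-test for an A|B test of conversion rates.
    from math import erf, sqrt

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """z-score for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p = (conv_a + conv_b) / (n_a + n_b)           # pooled rate
        se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
        return (p_b - p_a) / se

    # A: existing page, 480 conversions in 10,000 visits.
    # B: page with the new feature, 560 conversions in 10,000 visits.
    z = two_proportion_z(480, 10_000, 560, 10_000)
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    print(f"z = {z:.2f}, two-sided p = {p_two_sided:.3f}")  # z = 2.55, p = 0.011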
In an effort to make Canadian government expense data more accessible, FFunction designed the Expense Visualizer. A slider on top lets you filter by time, and small graphs show spending by different departments. Rearrange panels as you wish, and select among several scaling options, such as absolute or relative values. Bookmark your custom views or send them to others.
Over two years ago, I put up a post called Cost Accounting and Lean Accounting as a Fourier Pair. That post has turned out to be one of the most popular on this blog. The point of the post (which is short -- go read it) is that the way accounting information is collected and reported through financial statements is not optimal. The post concludes with the following recommendation:
Financial statements should report average turnover rates for each balance sheet account.
Under the proposed rule, all companies would have to disclose not only how much debt they have at the end of the quarter but also average and maximum figures during the quarter.
Technically, a turnover rate would give you the same information. But the average and maximum figures are easier to understand, and hence probably harder to game.
I like to think that somebody involved with the proposal -- directly or indirectly -- might have seen or heard about the post. Either way, I'm very happy. I think this is one of the most significant advances in accounting rules since the invention of double-entry bookkeeping.
It's about @#$@# time! WSJ reports: (see their graphic especially)
The agency's staff has been considering whether banks should be required to provide more frequent disclosure of their average borrowings, which would give a better picture of their debt throughout a quarterly period than do period-end figures.
See related posts: (By Nyquist-Shannon, quarterly balance sheet sampling permits only semiannual and longer-term trends to be spotted.)
Great Suggestions for Financial Reform (from Lloyd Blankfein)
The senators' words are still ringing in my ears:
"Rather than relying on carefully staged quarterly and annual snapshots, investors and creditors should have access to a complete real-life picture of a company's financial situation," the senators wrote to SEC Chairman Mary Schapiro, citing the Journal articles, among other things.
Or to put the question differently, how should patents be valued if not by the cost of litigation?
It seems to me that there ought to be a way to measure the introduction of new products, and rank them by novelty and by widespread acceptance, in some way that reflects a more substantial measure of innovation and its impact on the economy.
There is no single right answer to this question. For earlier ruminations, see Reigniting the Engine of Growth, which was cross-posted to PatentlyO (where some of the comments were very good). But the best answers will share at least this in common: Simplicity
Complex rules are too easy to game. To discourage fraud and bad behavior, rules have to be simple (not just valuation rules, by the way; this is true for any rules).
But innovation is complex. Doesn't that preclude any simple measure? Yes and no. Yes in that any measure will be an approximation. This is true for any complex phenomenon, however. That doesn't stop us from measuring temperature, humidity, barometric pressure, and then forecasting the weather. Weather forecasts are not perfect, but at certain times and in certain places they are useful. Why else would we bother? The small talk alone is not sufficient.
The ultimate answer must be that there are ways to measure innovation. Indeed, everything that for-profit organizations do is complex, even putting aside innovation. Yet we have a highly developed set of rules we follow and measurements we take in tracking the performance of these for-profit organizations. They're called accounting rules and financial statements.
Some people think that you can go even one more step removed and look at the public market price of for-profit organizations, compare that against the number of patents pending or issued, and make predictions about how a company's stock price will move in response to the number of patents. That may work in a few isolated cases, but in general public stock prices are driven up and down by too many other influences. By sticking with the financial statements, you only increase your signal-to-noise ratio.
To recap what is obvious for anybody exposed to accounting but unfamiliar to the rest, financial statements include (1) a snapshot in time of the assets, liabilities, and equity of a company (this is called the balance sheet because it has to balance assets with liabilities and equity) and (2) a time-averaged summary of revenues and expenses (this is the income statement, or just P&L, for "profit and loss"). Actually, most also include a cash-flow statement, but we can ignore that for purposes of explaining how innovation might be measured. The cash-flow statement is there so that people can figure out how much of the revenue and expenses were "accounting adjustments," which smear revenue and expenses out over time, rather than actual cash changing hands.
So how can a financial statement give you insight into how innovative a company is? The answer is that whenever a company makes an actual improvement (whether or not that improvement is patented!) in a product or process for making a product, that actual improvement will be reflected in its stream of revenues and expenses.
For example, say PayPal introduces a new feature on April 1. Before April 1, PayPal was spending $0.30 per customer to generate $0.50 in revenue per customer. After April 1, PayPal spends $0.25 per customer and generates the same $0.50 in revenue. The value of the feature is $0.05 per customer. In this example, PayPal increased its profit margin. That's the most obvious way that innovations add value to companies, and it's the one that most people think of, but it's not the only one!
In some cases, the company may not actually have lower costs or be able to charge higher prices as the result of an innovation. Yet the company may still be more profitable as the result of the innovation. This seems to be the more common case for most incremental innovations, which in fact are the majority of those patented in my experience.
Some of you are scratching your heads, wondering: "How can a company be more profitable if it doesn't charge higher prices or pay less in costs?" The answer is turnover. Consider Company X, which sells 10 widgets a day for $100 after paying $60 to make each of them. Total profit per day = 10 x ($100 - $60) = $400.
Now consider Company Y, which sells 100 widgets a day for $100 after paying $60 to make each of them. Total profit per day = 100 x ($100 - $60) = $4000.
What's the difference between these companies? Turnover rate. Company Y is selling widgets for the same price at the same cost much faster than Company X. You might think this is easy to do, but in fact it is very difficult.
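In code, the comparison is a one-liner (this sketch just restates the example's arithmetic):

    # Profit per day = units sold per day x (price - unit cost).
    def profit_per_day(units_per_day, price, unit_cost):
        return units_per_day * (price - unit_cost)

    print(profit_per_day(10, 100, 60))   # Company X: 400
    print(profit_per_day(100, 100, 60))  # Company Y: 4000
    # Same price, same cost, same margin -- the 10x difference is pure turnover.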
Now of course if Company X and Company Y have otherwise identical accounting entries, the difference in turnover rates will show up as a difference in earnings on the next financial statement. And if the differential in turnover is sustainable over many accounting periods, it will almost certainly show up as increased earnings.
The problem that readers of public financial statements have in backing out these inventory turnover rates is that many products (both old and new) are combined into a single earnings figure. Moreover, adjusting entries often wash out the signal reflected in increased (or decreased!) inventory turnover rates. In practice, at least retail investors have to do a little extra research to figure out whether inventory turnover ratios are increasing or decreasing for particular products, and by how much.
So there you have it, Tim. That's my suggestion for a simple way to measure the value of innovation: Read it off the time-series of revenue and expenses. If you can convince the SEC to get public companies to report this info, we will all be indebted to you.
Thanks to Paul Kedrosky for the link.
But even a country as cautious, sound, and generous as Basicland could come to ruin if it failed to address the dangers that can be caused by the ordinary accidents of life. These dangers were significant by 2012, when the extreme prosperity of Basicland had created a peculiar outcome: As their affluence and leisure time grew, Basicland's citizens more and more whiled away their time in the excitement of casino gambling. Most casino revenue now came from bets on security prices under a system used in the 1920s in the United States and called "the bucket shop system."

Via Reflections in Value Investing.
The winnings of the casinos eventually amounted to 25 percent of Basicland's GDP, while 22 percent of all employee earnings in Basicland were paid to persons employed by the casinos (many of whom were engineers needed elsewhere). So much time was spent at casinos that it amounted to an average of five hours per day for every citizen of Basicland, including newborn babies and the comatose elderly. Many of the gamblers were highly talented engineers attracted partly by casino poker but mostly by bets available in the bucket shop systems, with the bets now called "financial derivatives."
Stillwell spends a bit of time on the history of the binomial theorem, which gives the expansion of a binomial raised to an arbitrary power:
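$$(x + y)^n = \sum_{k=0}^{n} \binom{n}{k}\, x^{n-k}\, y^{k}$$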
The coefficients of the expansion are equivalent to the number of combinations of k elements taken from a set of n elements:
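$$\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$$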
These coefficients have a use in probability theory, which is probably more familiar to most people -- namely, that the probability of getting k successes in n binary trials (e.g., coin flips) is the coefficient divided by the total number of possibilities (2^n for coin flips). They can also be calculated using Pascal's Triangle:
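        1
      1   1
    1   2   1
  1   3   3   1
1   4   6   4   1

Each entry is the sum of the two entries immediately above it, and row n lists the coefficients of the expansion for exponent n.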
The binomial expansion is sometimes expressed in a simplified formula, in which y is set equal to 1, so that the equation reduces to:
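$$(1 + x)^n = \sum_{k=0}^{n} \binom{n}{k}\, x^{k}$$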
The relationship between the binomial theorem and probability theory is of interest because the same formula turns up in a pretty important place in finance -- namely, in the formula for compound interest:
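$$FV = PV\,(1 + i)^n = PV \sum_{k=0}^{n} \binom{n}{k}\, i^{k}$$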
Here, the binomial expansion is multiplied by a fixed coefficient equal to the principal (or "present value") PV. The total expected value of the sequence of interest payments is simply the binomial expansion, with interest i as the variable x.
Curiously, one does not often see cash-flows expanded through the binomial theorem. Rather, the tendency is for each of the n payments to be multiplied out. But that's of course equivalent to multiplying the present value PV times the binomial expansion of interest payments.
What interpretation should be attached to the fact that compound interest is a binomial expansion? Here are a few observations:
The last is a consequence of the fact that as compounding occurs more and more frequently over the period in which payments are made, future value nonetheless approaches a finite limit:
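$$\lim_{m \to \infty} PV \left(1 + \frac{i}{m}\right)^{mn} = PV\,e^{in}$$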
Thus, if compound interest is like a binomial expansion in powers of interest, then continuously compounded interest is an exponential function of interest.
A final thought to ponder: Why do we calculate future value with this formula? In many cases, the answer is obvious -- because that's how we agree payments will be made by a borrower.
In other cases, however, that answer is less obviously satisfactory, and may even be incorrect. We can use the binomial expansion as an approximation for future value when there is some predictable process of growth occurring. But when growth occurs at different rates, or when compounding cannot be carried out efficiently, some other approximation for future value is necessary. The use of the binomial expansion as a crutch may be positively misleading.
Accounting is a trade, but the history of accounting is a subject of disinterested history -- a liberal art. And the accountant who knows something about the history of accounting will be a better accountant. That knowledge pays off in the marketplace. Similarly, future lawyers benefit from learning about the philosophical aspects of the law, just as literature majors learn more about poetry by writing poems.
Judge Posner makes a similar point about legal education in The Problematics of Jurisprudence. The context in Menand suggests that accounting may indeed have been neglected as a subject of liberal inquiry (i.e., as a subject of disinterested literary, historical, and/or scientific theory) because of its rep as a trade.
My personal experience suggests this to be an accurate depiction of how accountants are educated. They were even more parochial in their interests and education than the physics and chemistry majors I rubbed elbows with as an undergrad and grad student.
Could this be part of the explanation for the repeated failures of our financial system? Many earlier posts on this blog have addressed the serious weaknesses of financial accounting theory as a basis for measuring and managing our financial system. But others see the problems too: Northwestern, Chicago, and Stanford Law Schools have begun teaching accounting theory to their students, at least through electives.
It's the unsexy, quotidian words that carry the heavy baggage in language -- a lesson learned early by every patent litigator. Similarly, it's the unsexy, quotidian rules of accounting that move the cash-flows that represent the very lifeblood of our economy.
Click the accounting category at right for earlier posts. A fascinating historical fact, which I believe is not without larger significance, is that both a patent system and double-entry bookkeeping were adopted around the same time in Venice in the 15th century.
Basically, the direct method of accounting tracks cash changes from the bottom up to arrive at net income, rather than starting with net income and making adjustments.
In other words, you calculate profit and loss using a measure of actual dollars and cents flowing in and out of the company. Sounds sensible, right? Not if you know how things are done now!
"We don't collect this type of information [required by the direct method]," asserted Bond, a panelist at the conference. "That's not the way our ERP system is set up to do things." Steve Whaley, controller for WalMart, agreed, saying the enterprise-resource-planning system for the king of the box stores would also have to be reprogrammed to spit out financial results under the direct method.
Which raises the question: what the heck kind of information are we collecting right now?
See also the academic literature on this topic.
The answer is to redesign financial statements. Don't leave it to the FASB, which is like letting the foxes guard the henhouse.
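For readers who haven't seen the two methods side by side, here's a minimal sketch (Python, with invented figures; real statements have many more line items):

    # Direct method: build operating cash flow from actual receipts and payments.
    cash_from_customers = 900_000
    cash_to_suppliers = -500_000
    cash_to_employees = -250_000
    direct_operating_cash = (cash_from_customers + cash_to_suppliers
                             + cash_to_employees)             # 150,000

    # Indirect method: start from net income and undo the accruals.
    net_income = 120_000
    depreciation = 40_000              # non-cash expense, added back
    increase_in_receivables = -30_000  # revenue booked but not yet collected
    increase_in_payables = 20_000      # expense booked but not yet paid
    indirect_operating_cash = (net_income + depreciation
                               + increase_in_receivables
                               + increase_in_payables)        # 150,000

    # Same bottom line, but only the direct method shows where cash actually moved.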
The occasion for this reflection came today as Christine Varney announced a series of workshops aimed at improving the Merger Guidelines. Lots of interesting hints in this speech. I was particularly interested in this topic:
[W]e are interested in your views on the use of more direct evidence that is not strictly based on inferences drawn from increases in market concentration. There are several categories of such evidence worth exploring: (1) evidence of the actual, post-merger competitive effects of consummated mergers, (2) evidence of "natural experiments" obtained by looking across different geographic markets, time periods, customer categories, or similar product markets; (3) evidence of the firms' post-merger plans; (4) evidence of customer views of post-merger competition; (5) historical evidence of actual head-to-head competition between the merging firms; and (6) historical evidence of actual or attempted coordination in the industry. Although the Agencies routinely rely heavily on these kinds of evidence to assess competitive effects, the Guidelines address their relevance only in passing and only secondarily, after the relevant market is defined and concentration in that market is measured. Courts also regularly rely on this type of evidence in assessing competitive effects. We are interested in views on whether we should adjust the Guidelines to address explicitly what kinds of direct evidence are pertinent and how they should be weighed.
Is it too much to ask that the FTC and the SEC coordinate on this? Recall that Goldman Sachs CEO Lloyd Blankfein recently suggested reforming the accounting rules to require all exposure to flow through to financial statements. Not only might these kinds of reforms benefit the SEC in enforcing its rules, they would also give the FTC a better source of data -- at least for public companies -- on which to base econometric analyses of substitutability, for example. Needless to say, investors and ultimately consumers would benefit from having better information too.
Long-time readers of Broken Symmetry may recall earlier posts (here, here, and here, for example) on how financial statements report only snapshots in time and time-averages of sales and costs of sales. These were all that was needed when buying and selling patterns changed only annually, or at fastest quarterly. To handle more rapid changes, you have to increase the sampling rate. What we need is a time-series of balance sheet accounts, or income/cash-flow statements on shorter intervals.
Having better designed financial statements would benefit consumers far more overall than even perfectly clear merger guidelines or perfectly efficient SEC enforcement.
How hard would it be for economists from the FTC to sit down with economists from the SEC and hammer out a set of economic data that should be required from public companies but that isn't in the financial statements now? To make it palatable politically, they could offer to get rid of Sarbanes-Oxley hassles in exchange for more information. Have a time-delay of the release of real-time balance sheet info to deal with the trade secret issues.
[A]ll of the exposures of a financial institution should be reflected through the P&L. Consider Structured Investment Vehicles or SIVs and other off-balance sheet vehicles that represented significant sources of funding for financial institutions around the world. Unfortunately, risk models failed to capture the risk inherent in these activities. Post Enron, that is quite amazing. If contingent liabilities and stand-by credit commitments don’t flow through the P&L, how can risk managers and regulators see the risks that a bank is exposed to?

Blankfein's views are balanced and obviously well-informed. We could do worse than to follow the lead of the leader of one of the handful of institutions that came through the crisis smelling like a rose. Goldman Sachs clearly had a better understanding of what they were doing and of how the system as a whole was functioning than did other stakeholders, including government regulators.
Everything else being equal, the 10% margin business with one inventory turn is no better or worse than the 2% margin business with five turns a year.

In fact, as Joe explains in the rest of his post, the company doing five turns a year is often in better shape. If you want an analogy, it's like the difference between two cars, both of which can carry you 100 miles on 9 gallons of gasoline, but one of which does so at 5000 rpm the whole way while the other does so at 1000 rpm. Same fuel efficiency, but you'll pick the 5000 rpm machine every time: you can accelerate or slow down much quicker with it.
Sometimes inventory turnover rates are reported as part of financial statements, usually in the notes. An inventory turnover rate is a frequency -- i.e., it has units of # / time. Repeatedly on this blog I have suggested that financial statements should report frequencies for each balance sheet account. Such frequencies are called "turnover rates" when the accounts in question are inventory accounts.
Note that accountants usually order balance sheet accounts from short-term to longer-term liquidity. That's a convention, not a measurement. Wouldn't it be nice to have actual numbers for each account?
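Here's a minimal sketch of what such numbers could look like, computed from data firms already collect (Python; the flows and balances are invented):

    # Turnover frequency for any balance sheet account:
    # flow through the account during the period / average balance.
    def turnover(flow, average_balance):
        """Turns per period -- a frequency, with units of 1/time."""
        return flow / average_balance

    accounts = {
        # account: (flow during the quarter, average balance), hypothetical
        "inventory": (700_000, 140_000),
        "accounts receivable": (900_000, 300_000),
        "accounts payable": (500_000, 250_000),
    }
    for name, (flow, avg) in accounts.items():
        print(f"{name:20s} {turnover(flow, avg):.1f} turns/quarter")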
The results are mixed. Not a surprise since even among economists the answers to these questions are mixed! But the title of the post raises a large question, which I'd like to answer using systems theory -- i.e., the theory of how highly integrated social networks self-organize. Systems theory has been a central theme for this blog.
If we understand the question in terms of systems, then we can answer in layers the question, "Is economics a science?" These categories are linked to the three generalized components that make up "the system of economics": the domain, the field, and the data.
The high-level answer is that economics is a science if it has active components of domain, field, and data. Economics is not only dismal, but dead if none of these is changing. But so long as each of these three have some time-dependence, the difference between economics and other sciences is a difference in degree, not kind.
But on closer inspection, there are indeed apparent problems with these components for economics that do not exist for other sciences. The biggest problem seems to be that, at least until about ten or twenty years ago, there was no stream of data on individual or group behavior that could be used to falsify the domain of neoclassical theory that developed around WWII. In other words, there have been few new experiments or new kinds of data sets since neoclassical theory was developed. The field has become somewhat entrenched as it has labored at refining its theories using the same sets of data over and over again.
There are exceptions, of course. I'm in the middle of reading Vernon Smith's Rationality in Economics, in which he describes the interesting ways that actual behavior deviates from neoclassical predictions in actual experiments and field studies. And Smith is not laboring in obscurity, of course. He won a Nobel for his efforts in this regard. But in general, the "science" of economics seems to have been limited by a lack of data. There simply has not been an experimental or observational mechanism for the field to reach consensus on what hypotheses have been falsified.
In perhaps not-so-humble fashion for a non-economist, I would like to propose that experiments and observations of the average time period between individual acts of consumption or production is exactly the kind of data that economists could use to falsify models of individual or group behavior. The fact that such observations have not been made in the past is no accident, for they were generally too expensive to be collected before the Internet became widely available. (Csikszentmihalyi's Experience Sampling Method is a notable exception.)
Journal entries themselves encode the information necessary for managers, investors, and owners of corporations to see the frequency spectrum of activity within a firm -- a unique signature of the firm's activities. Time-averaged measures of costs are not the only variables important to the health of a corporation. Profit is a function of sales and costs of sales over time.
Economics could become more scientific through statistical observation of individual and group behavior over time.
Either the price is right or it ain't, right? Why would "synergies" matter if not to price?
Firms often pay a substantial premium to the market price when making acquisitions. Does their willingness to pay a premium suggest the shares of target firms were mispriced?
EFF: The empirical evidence says that all the gains from mergers are eaten up in the premiums paid to acquire firms. On average, the acquiring firm gets nothing. This doesn't necessarily imply that the shares of the acquired firm were mispriced since there can be synergies (real business gains) from mergers.
KRF: Takeover premiums do not imply that the target firms were mispriced. Since we do not expect the market to accurately forecast every acquisition that will create value, we should not be surprised that prices rise when tender offers and mergers are announced.
Firms are subsystems embedded in a larger economic ecosystem. Synergies do matter. Beyond certain thresholds, otherwise linear correlations can go nonlinear, with concomitant consequences for growth or decay. But F&F are equivocating here. If synergies matter, then why wouldn't they be reflected in pre-announcement price? Shouldn't "the efficient market" be able to price in the probability of a merger and its resulting synergies?
As usual, a close look at the temporal dynamics suggests problems with EMH.
The key to using music to teach modern physics, in my mind, is harmonic analysis. Harmonic analysis is nothing other than Fourier analysis, a tool which is highly valued in physics and engineering. It is not widely known by the non-scientist, yet it is easy to explain in the musical context—when someone looks at a graphic equalizer on their stereo they are looking at a real-time Fourier amplitude analysis of the music that is being played. One can explain how the synthesis of harmonics affects the timbre of musical sound—how the coefficients in a Fourier series can describe different real functions. A very simple example is the comparison between the organ pipe and the piano string. The ‘stopped’ organ pipe has only odd-numbered harmonics and the piano has all harmonics. (Because the ‘stopped’ organ pipe has a pressure node at one end and a pressure antinode at the other, whereas a string has nodes at both ends.) The different symmetry of the boundary conditions accounts for a major aspect of the difference in the timbre of the organ and the piano.

That's from The birth of the blues: how physics underlies music by J.M. Gibson. Via Physics and Physicists.
If you're a regular Broken Symmetry reader, then you should know already that it is not only music that can be understood better with Fourier analysis, but any time-series that can be decomposed into oscillations of different frequency, including cash-flows. The balance sheet is sort of like the graphic equalizer, except that the frequency of the accounts is not rigorously ordered in terms of liquidity. If it were, then the analogy would be complete.
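To push the analogy a step further, here's a minimal sketch (Python with NumPy; the daily cash-flow series is synthetic) of the firm-as-graphic-equalizer view:

    # Fourier amplitude analysis of a daily cash-flow series: a weekly
    # payroll cycle plus a monthly billing cycle plus noise (all synthetic).
    import numpy as np

    days = 360
    t = np.arange(days)
    rng = np.random.default_rng(0)
    cash_flow = (5.0 * np.sin(2 * np.pi * t / 7)     # weekly cycle
                 + 3.0 * np.sin(2 * np.pi * t / 30)  # monthly cycle
                 + rng.normal(0, 1, days))           # noise

    spectrum = np.abs(np.fft.rfft(cash_flow))
    freqs = np.fft.rfftfreq(days, d=1.0)             # cycles per day

    top = np.argsort(spectrum[1:])[-2:] + 1          # two biggest peaks, skipping DC
    for k in sorted(top):
        print(f"period ~ {1 / freqs[k]:.0f} days, amplitude {spectrum[k]:.0f}")
    # The peaks land near 7 and 30 days -- the firm's frequency signature.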
That focus on flow is at the very heart of excellent manufacturing - not merely finding tidbits of waste and eliminating it. The final step in the logical thought progression is that the most fertile ground in the effort to accelerate the flow of products across the fixed costs within the value stream is to know what is limiting the flow - and that is where Eli Goldratt comes in. He said that an hour saved at the bottleneck (constraint) is an hour saved for the system (in lean terms, an hour saved for everything within the value stream). An hour saved anywhere else is a mirage.

Both theories focus on the frequency spectrum of activities within the firm rather than a time-averaged picture of costs. That's why lean is such a pain for cost accounting, which is based on time-averaged measurement of costs.
As mentioned in an earlier post here, what Goldratt calls "throughput" could also be called flow. Both are quantities per unit time.
If balance sheets included frequency-averages (i.e., turnover rates) for each account, would accountants be able to move assets or liabilities on and off the accounts without there being a trace on the financial statements? Right now, an asset moved on and off the balance sheet between the periods of financial statements may not show up in the income statement or cash-flow statement. These are time-averaged pictures of income and cash, and by definition an asset moved on and off within the period produces no net effect on the time-average.
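Here's a minimal sketch of the blind spot (Python with NumPy; the daily balances are invented):

    # An asset carried mid-quarter and dropped before the reporting date
    # leaves no trace in the quarter-end snapshot, but average and maximum
    # balances (or a turnover rate) would expose it.
    import numpy as np

    daily_balance = np.full(90, 100.0)  # a steady $100 position...
    daily_balance[30:60] += 400.0       # ...plus $400 carried mid-quarter only

    print(daily_balance[-1])            # snapshot: 100 -- looks clean
    print(daily_balance.mean())         # average: ~233 -- tells another story
    print(daily_balance.max())          # maximum: 500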
We have blinded ourselves by excluding information about flow from our financial statements.
However, how do you measure value when IP can be so frustratingly intangible - what price that collaborative relationship with a competitor that a piece of IP may enable, for example? One at least partial solution to this conundrum may be to make royalty rates far more transparent. One suggestion was that all licensing deals agreed in the US should be publicly recorded at the USPTO so that people could see the sums involved. This would have to be compulsory as no-one would do it voluntarily.

From the IAM Blog.
The activities of firms are cyclical. Even if a firm has only one customer that places only one order, or multiple orders at apparently random intervals, the activities of the firm that serves that customer will be cyclical in response to the order or orders. The ultimate reason for this is that human biology is cyclical, characterized by (for example) circulatory, circadian, and neurological rhythms that can be adjusted only within certain limits.
Managers and accountants keep track of the cycles of activity within firms using a variety of tools, including financial statements. In a nutshell, each set of financial statements gives managers a snapshot (the balance sheet) and a time-integrated sum (the income or cash-flow statement) of the result of the cycles of activity that occurred during the period of the financial statement.
It almost goes without saying that no manager would try to run her company reading only the financial statements. On a day to day basis, managers look at a host of other measurements in order to understand whether the company is growing or shrinking as a result of the cycle of activities within the firm. Nonetheless, at the end of the day, the key question for every manager is whether each cycle of activity within the firm is increasing profitability (and hence equity) over some time horizon.
Again, it almost goes without saying that investors are interested in the same thing -- i.e., with increasing the profitability (and hence, equity) of the firm. Yet unlike managers, investors now have access only to the financial statements produced by the firm. And financial statements are available now only once per quarter.
It is a mathematical fact that the frequency of measurement must be at least twice the frequency of a signal for the frequency of the signal to be detected accurately. What this means is that even if all relevant accounting information is accurately reported on financial statements, investors now have no way of measuring changes in the frequency of cycles within firms that occur faster than twice a year!
Ask any electrical engineer and they will confirm that the shortest cycle you can measure accurately is twice the length of the sampling interval. Any cycle shorter than that will be "folded over" (aliased) into a slower apparent cycle.
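Here's a minimal sketch of that folding (Python with NumPy; the business cycle is synthetic):

    # A roughly quarterly business cycle, sampled quarterly, is aliased:
    # it shows up as a spurious multi-year drift instead of a 3-month cycle.
    import numpy as np

    t_daily = np.arange(720)                       # two years, daily
    signal = np.sin(2 * np.pi * t_daily / 91.25)   # ~3-month cycle

    quarterly_samples = signal[::91]               # one sample per quarter
    print(np.round(quarterly_samples, 2))
    # Eight nearly identical, slowly drifting values: at this sampling rate
    # the quarterly cycle is invisible. You need at least two samples per
    # cycle to see it at all.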
Isn't this basic insight important to understanding how we could prevent a recession or depression like this one from spiralling out of control in the future? Maybe if investors could see how things change within firms more often than twice a year, they could be more effective at putting the brakes on bad management.
Maybe the SEC should focus on making accounting information available to the public in real time through technology like XML rather than prosecuting managers for misconduct years after the damage has been done.
Any readers have experience or knowledge here?
A time series of journal entries records sales and costs of sales. Later these are summarized into the income statement, balance sheet, and cash-flow statement. Lost in these financial statements is information about how often each unit of sales and costs of sales was earned or incurred, respectively. But obviously the company has to keep track of the rate or frequency of sales and costs; the two have to stay synchronized to keep profits increasing.
How is this done now?
After yesterday's SEC hearing on mark to market, I'm hoping the message is getting through that this is a rule without a reason. Isn't the name itself inappropriate? For when an asset is marked to a "market" value without any transaction having taken place, the value is NOT a market value at all, but rather a pro forma estimate, and a bad one at that since it will not usually include consideration of the total volume and liquidity of the asset in question.
I'm not a CFA or registered broker agent, so this isn't investment advice. But have you noticed that at the same time that short-term treasuries have been at historic lows, the long-term treasuries are very low also? Sure, the story about 1, 3 and 6 month t-bills makes for great headlines -- those are dramatic cliffs in mid-September. At the same time, there hasn't been the volatility in the longer-term notes. 30 year notes, for example, are still hanging around 4 percent, where they've been all year.
Which demonstrates quite vividly that nobody really doubts that the equity market will resume a normal, sustainable (8% to 15%, depending on how optimistic you are politically) rate of growth over the next 30 years. Yet most investors are not ready to jump into the equity market for the long term (as the short-term rates ably demonstrate).
Again, I'm not an investment advisor or macroeconomist by training, but the money being used to buy short-term notes will have to go somewhere within the next six months, when those short-term notes expire. Real estate, like all assets whose value isn't fundamentally derived from human capital, doesn't have the growth potential of equity. Debt markets will surely pick up for some high-quality issuers, including the manufacturers of staple commodities. But for long time horizons, equity is looking better and better as 2008 goes on. Besides that, selling equity now means you fail a basic test of investing -- if Warren Buffett is buying, are you buying or selling?
Take a look at the term sheet for the injection. The government plans to buy preferred stock, now in vogue. It'll carry a 5% dividend, rising to 9% after five years; it can't be redeemed for at least 3 years; restricts dividends on junior preferred stock or common stock; has no voting rights; and restricts executive compensation to be in accordance with the Emergency Economic Stability Act's requirements. One other thing: it has a "perpetual life," apparently because the term sheet says so. Realistically, the Treasury is not going to be a longer-term player in these preferreds any longer than it has to be - and there are no restrictions on the transferability of its investment.

Is this complaint sound? Yes and no. At this point in time, there's no way to argue with the facts they cite in evidence for their reasoning. But it's important to remember how we got here. Fifty years ago, most public companies paid a substantial dividend to their shareholders, and holding preferred shares was a good way to invest in smaller or struggling companies whose common dividends were not secure. With the shift away from paying dividends toward reinvestment of free cash flow, the significance of the distinction between preferred and common has diminished. Thus the recent suggestion of some on the FASB that only common shares should be considered equity.
Some readers may argue that tax efficiency is a justification for retaining earnings. Maybe. But in my view, if a company can't earn more than a dollar for each dollar it reinvests, then it should give me my cash and pay taxes rather than keeping some pet project on life support. I don't like taxes, but the government is set up to spend them -- and is subject to more checks, balances, and transparency in its spending.