Date: 2022-07-07
The Basic Concepts of True Value Metrics
Determining Cause and Effect
This is not an academic exercise
Needs to be timely and low cost
Making decisions needs to be done on a timely basis … not on a schedule convenient to academic researchers and measured in years, but perhaps in days. The academic community may say that this is not the “right” way to do things, which may be true in an ivory tower, but it is not the best way when the family is hungry and likely to starve!
Most problems in society are unique … not in every aspect, but in enough aspects to make the usual form of academic research inadequate. In many situations the need for societal improvement is urgent, and rapid decisions are required.
An academic analysis of the correlation between water quality and health does not inform the decision … the connection between decent water and better health is well known, so knowing more about this is not going to add much to decision making. What is needed is knowledge about how best to make water available, and how to pay for it both in terms of capital cost and in terms of ongoing maintenance and operation.
Cause and Effect NOT Just Correlation!
Decision making requires a practical understanding of the process … in other words responsible people have to know the behavior of costs and the cause and effect in the process. This is not an academic exercise … it is real experience in play.
Data have some value in helping to identify causal relationships. In general, though, systems that are influenced by humans are complex, and good data and analysis can only go so far to understand how the system works.
There will never be strong accountability using economic analysis alone for decision making and oversight. Decision making needs data that illuminate “cause and effect” in ways that are very specific to a community … not simply a generic causality … something that can emerge using value analysis along the lines of TVM. Accountability follows when there are data about progress and performance … a routine part of TVM analysis.
Most, if not all, economic models that simulate the development process are based on statistical correlation, which has limited reliability when causality … cause and effect … is not understood. TVM discounts planning and projections based on models that rely on correlation, preferring data that are more closely associated with action items that produce measurable results.
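The distinction the text draws between correlation and causality can be illustrated with a small sketch. The scenario and numbers below are illustrative assumptions, not taken from the TVM text: a hidden community-income factor drives both household water spending and health spending, producing a strong correlation between the two even though neither causes the other.

```python
import random

random.seed(42)

# Hidden confounder: community income drives both observed variables.
income = [random.gauss(100, 20) for _ in range(1000)]
water_spend = [0.3 * x + random.gauss(0, 3) for x in income]
health_spend = [0.5 * x + random.gauss(0, 5) for x in income]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(water_spend, health_spend)
print(round(r, 2))  # strong positive correlation, yet no causal link at all
```

A model built on this correlation would predict that raising water spending improves health spending, which is false by construction here; only knowledge of the underlying process (the income confounder) reveals the real cause and effect.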
Statistics alone insufficient
Management information not rigorous … but it works
In any complex system ascertaining cause and effect is not easy … and in fact it is essentially impossible using small surveys and statistical interpretation. In the TVM analysis framework the question of cause and effect is addressed using a process of drill down, and data acquisition about matters that might be meaningful.
Where the balance sheet comparison shows some element of the balance sheet has changed, there should be focus on what activities of the community or what set of externalities might be the cause of the change.
Conversely, where it is known that there have been substantial changes in the activities of the community and the externalities associated with the community … then it is reasonable to be looking for changes in the balance sheet of the community that serve as metrics for progress.
Large scale datasets and complex multivariate analysis may make it possible to discern relationships within complex systems ... but it takes very large datasets that in many cases are impractical to obtain.
Big computer or a little bit of common sense?
Some years ago the operators of one of the biggest commercial computers in the world were asked a simple costing question … how much does it cost to process a piece of luggage through an airport? They took their available data about luggage throughput and the available data from the accounting department, and worked out complex algorithms to calculate the answer. After hundreds of computer hours … piles of printouts … the computer analysts admitted defeat.
For a tiny proportion of the computer analysis cost, it would have been possible to do a process study, cost that, and draw conclusions. The answer would have been easier to understand, and likely more correct … at a far lower cost, and in less time.
Use location specific small datasets
By using location specific small datasets it is possible to figure out some of the most important causal factors by simple observation over a short period of time. Good observation and common sense judiciously used are powerful analytical tools. Quite brief observation of a situation where performance is very bad may be sufficient to identify a practical solution ... a low cost analysis giving a high value outcome.
Operations analysis … simple common sense
A factory production line is producing on average 20,000 units per shift. The theoretical production should be nearly 40,000 per shift. Labor cost was nearly double what it should be! What is wrong?
Watching the production line for a short time showed the unreliability of several of the machines ... strung together in series ... and downtime escalating exponentially.
There were two possible solutions ... redesign the production line to have parallel equipment ... or using a spare line approach so that the production crew could be fully utilized. The second approach was used and labor costs dropped to almost what they should be theoretically!
Analysis of a small dataset in an academic setting may produce few or no conclusions of value. Considerable care needs to be taken in drawing conclusions from small observations. This problem is pervasive in the official relief and development arena. In many cases material improvement can be achieved by getting additional data simply by “walking around” and making observations. In particular, such observation validates other data. Most important, it may add the dimension of externalities that are the bane of small dataset analysis.
Data getting worse … statistical methods better!
This is what I was told at FAO in Rome a long time ago, and updated more recently. From 1950 to about 1965 the data about fisheries resources improved substantially … from 1960 to 1980 the mathematics and statistical methods for fisheries resource data analysis improved substantially. From 1965 to the present much fisheries resource data has diminished in quantity and deteriorated in accuracy. No matter how good the statistical analysis of data … bad data produces bad conclusions.
Capital markets … all correlation
The public news about capital markets and investment is all about correlation. There is a big industry associated with the distribution of news about capital markets and the correlation of all sorts of money economy data.
Stock prices are reported as a continuous stream ... indexes of capital market valuation are also reported in real time. Why? Bloomberg News and others facilitate some knowledge about economic activity ... but rather little of the totality of what is required for an informed public. The socio-economic behavior of society would be changed substantially if more understanding of cost and profit were in the public debate.
But this is not how money is made in the capital markets. From time to time there are scandals about “insider trading”, which is not allowed. Insider trading happens because it is very profitable … if the perpetrators are not caught … and it exploits pure “cause and effect” rather than the mushy correlation accessible to the general public.
Some make money in the capital markets because they have done their research and know their industry very well. Many of these people are able to bring this knowledge to bear and improve the performance of an organization because they know more about cause and effect in the industry than other managers who are under-performing in their jobs. This is the essence of some successful hedge funds … profitable and value adding because they reduce waste and inefficiency.
TrueValueMetrics (TVM) is an Open Source / Open Knowledge initiative. It has been funded by family and friends. TVM is a 'big idea' that has the potential to be a game changer. The goal is for it to remain an open access initiative.
The information on this website may only be used for socio-enviro-economic performance analysis, education and limited low profit purposes
Copyright © 2005-2021 Peter Burgess. All rights reserved.