Burgess Book Manuscript
Basic Concepts for TrueValueMetrics
Version of 2010 by Section
Chapter 3 ... Data Types and Attributes
3-1 Characteristics of TVM Data

What Impact Will TVM Have?

Data centric everything

In a world where there are complex linkages there must be management information so that decisions can be made … and data about performance so that decision making can be revised to take advantage of opportunities for progress and to minimize the impact of unfavorable outcomes. The key is to make data drive everything … and for the data to include not only money accounting elements but also elements that relate to operations and to the value that impacts society.

TVM is not a management system … it is only part of a management system. It is an important part … but only a part. TVM provides a way to do “scorekeeping” and to have “stats” so that performance can be improved. The data and metrics of TVM are independent and aim to be objective and useful. The goal is to help achieve real progress and high performance … not merely a data construct that makes people feel good despite bad progress and performance.


Data that gets used … scaling down

Data are only useful if they get used, and in part this is going to be achieved by having data that are useful to people who “do things” rather than merely of interest to the academic world and to those who only work at the policy level.

Many dataflows are designed to be aggregated and used to give top level indexes of performance … but these dataflows do not help to make good decisions at an operational level. TVM does not have a focus on scaling up and aggregation, but aims at having very good data at the lowest scale rather than at the biggest scale. In the TVM data model a small amount of data will tell a lot about a small place … something that gets more and more difficult as the scale increases.


A different set of perspectives

TVM is a paradigm shift for the metrics that are used in the decision making processes of society. The prevailing process uses metrics that are about an organization and its profit performance, and then the impact of these and other decisions on metrics like the national gross domestic product (GDP) and the various measures of capital market performance. Very little in the prevailing system of metrics relates directly to society, quality of life, and the dynamics of value economics and value accountancy.


Nature of Data

Representation of reality

Good data are a digital representation of a reality … with no meaning at all except as they relate to the reality.

TVM avoids advanced statistical methods that seem to create a virtual reality where there is really no tangible reality at all. That is not to say that statistical method has no utility … but merely that in the context of socio-economic data and analysis, other methods for understanding performance are likely to be much more reliable.

Data are used to store knowledge ... data are knowledge ... and data are used to communicate knowledge. Reality becomes data ... then with analysis, information ... and with human interaction, knowledge ... and with experience and reflection, wisdom.


Materiality … don't sweat the small stuff

The purpose of TVM data is to facilitate progress and make it possible to manage resources better. Having more and more data is not the goal ... rather it is to have more and more productivity and social value. In order for this to be achieved, the data must relate to matters that are material ... that can make a difference.

A surprising amount of activity is small stuff that does not have much impact … and is, individually, not of great moment. This small stuff must not be allowed to clog the system and slow down or stop good decision making.

There are times when the small stuff adds up to something that is important. Sometimes it becomes possible for great progress to be made because some small item can be deployed over and over again … and in aggregate becomes very valuable.

Knowledge has the potential to be very important … small increments of knowledge have the power to release millions of people from their constraints … but where to start? The potential of knowledge to change people's lives is fairly obvious … but what is surprising is that something this obvious does not seem to be working very well!


Materiality … relevance of data

The data needed for analysis and decision making are those data that are relevant to the solution of the problem. An iterative approach to data is required. Simple data about everything are important to identify where the problem needing solution is to be found … and then more data are needed about those matters. For example, if a problem seems to be related to water … then more data about water are needed, whether about its availability, its quality, or something else. If the problem is about water, data about housing, health, and education are not relevant and need not be taken into consideration in the analysis of the water problem.
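To make the iterative idea concrete, here is a minimal sketch in Python … the field names and the sample records are hypothetical illustrations, not part of any TVM specification. A first pass over simple data about everything locates the problem area, and a second pass keeps only the data that are material to it.

```python
# A minimal sketch of the iterative, materiality-driven approach described
# above. All field names ("topic", "detail") and sample records are
# hypothetical illustrations, not part of any TVM specification.

records = [
    {"topic": "water",     "detail": "availability", "value": "2 hrs/day"},
    {"topic": "water",     "detail": "quality",      "value": "high coliform"},
    {"topic": "housing",   "detail": "stock",        "value": "120 units"},
    {"topic": "education", "detail": "enrollment",   "value": "85%"},
]

# Pass 1: simple data about everything locates the problem area.
problem_topic = "water"

# Pass 2: gather more detail only about the material topic; housing and
# education data are set aside as irrelevant to the water problem.
relevant = [r for r in records if r["topic"] == problem_topic]

for r in relevant:
    print(f'{r["topic"]}/{r["detail"]}: {r["value"]}')
```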

The relational construct

TVM organizes data about socio-economic performance using a normalized relational construct. This approach makes it possible for a small amount of incremental data to be related to everything else and to have useful value within the established analysis framework.
Codd in 1970
When E.F. Codd published the relational model for database design in 1970, there was a paradigm shift in the way accounting could be done. The relational model made it very much simpler to handle the aggregation of data and drill down ... and to be able to study the data from different perspectives.
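As an illustration of the construct … not a TVM schema … the following sketch uses Python's built-in sqlite3 module with hypothetical place and activity tables. The same small set of incremental records, once related to a place, supports both aggregation and drill-down.

```python
# A minimal sketch of a normalized relational layout for socio-economic
# data, using Python's built-in sqlite3. The tables, columns, and numbers
# are hypothetical illustrations of the construct, not a TVM schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE place (
        place_id INTEGER PRIMARY KEY,
        name     TEXT NOT NULL
    );
    CREATE TABLE activity (
        activity_id INTEGER PRIMARY KEY,
        place_id    INTEGER NOT NULL REFERENCES place(place_id),
        sector      TEXT NOT NULL,
        cost        REAL NOT NULL,
        value       REAL NOT NULL   -- value delivered to society
    );
""")
db.executemany("INSERT INTO place VALUES (?, ?)",
               [(1, "Village A"), (2, "Village B")])
db.executemany("INSERT INTO activity VALUES (?, ?, ?, ?, ?)", [
    (1, 1, "water",     100.0, 400.0),
    (2, 1, "education",  80.0, 250.0),
    (3, 2, "water",     120.0, 150.0),
])

# Aggregation: roll the incremental records up by place ...
for row in db.execute("""
        SELECT p.name, SUM(a.value - a.cost) AS net_value
        FROM activity a JOIN place p USING (place_id)
        GROUP BY p.name"""):
    print(row)

# ... and drill-down: the same records viewed by sector within one place.
for row in db.execute(
        "SELECT sector, value - cost FROM activity WHERE place_id = 1"):
    print(row)
```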

Ubiquitous

Data are everywhere. The more we learn about life ... about almost anything ... we learn that there is a data component that makes life work. The brain is all about data ...
Mali ... 1980s
I had a conversation with an old man in Mali ... a village elder ... when I was in his village during the Sahel famine of the 1980s. He knew a lot about the history of rain in his village ... much more than was recorded in sophisticated data systems. It taught me that “If I do not know something ... it does not mean that it is not known”.
In the broader context, I argue that very little is known by economists and planners about community ... but a lot is known about community by the people who live in the community. They have the data ... but not in a form that we find easy to use!

Data of many different types

In meaningful metrics, the data are a real representation of a reality. Some realities change slowly and so do the data ... some move rapidly and those data change rapidly. Both are important in the proper context. TVM uses data as efficiently as possible, organizing them using a concept that has its origins in classic business accounting, where data are of several types.

Data may be characterized as either permanent or transient. Permanent data change slowly, while transient data change all the time. For example, the name of a town and its location are permanent data, while the current weather is changing all the time and is transient data. Transient data sometimes change very rapidly ... for example, data about economic transactions ... while the results or impact change more slowly.
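A minimal sketch of the distinction, with hypothetical class and field names: the permanent record is stored once, and each transient event simply refers to it.

```python
# A minimal sketch of the permanent/transient distinction described above.
# The class names, fields, and sample values are hypothetical illustrations,
# not part of TVM itself.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)          # permanent data: changes slowly, if at all
class Town:
    name: str
    latitude: float
    longitude: float

@dataclass                       # transient data: a new record per event
class Transaction:
    town: Town
    when: date
    description: str
    amount: float

accra = Town("Accra", 5.56, -0.20)
sales = [
    Transaction(accra, date(2010, 3, 1), "grain sale", 140.0),
    Transaction(accra, date(2010, 3, 2), "grain sale", 95.0),
]

# The permanent record is stored once; each transient event references it.
print(accra.name, len(sales), "transactions")
```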


Data need to be believable

Data need to be right. The analysis of data that have little relationship to reality has little value ... worse, the analysis may result in bad decision making. There is a need to ensure that dataflows have integrity and that valid information is not replaced with fictional data. There is also a need to ascertain, through a system of validation, that the data in the system are correct.

GIGO: Garbage In … Garbage Out


While it is good practice to have fully normalized data in a relational system for the most efficient data processing ... it is sometimes desirable to have redundancy in the data and dataflows so that data may be verified in an independent manner. Data should not only be right, but be seen to be right!
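One way to use such redundancy, sketched below with hypothetical names, numbers, and tolerances, is to compare two independently collected totals for the same activity and flag any mismatch for review before the data are used in decision making.

```python
# A minimal sketch of deliberate redundancy as validation: two
# independently collected totals for the same activity are compared, and a
# mismatch flags the data for review. All names, values, and the tolerance
# are hypothetical illustrations.

def validate(reported_total: float, independent_total: float,
             tolerance: float = 0.01) -> bool:
    """Return True when two independent dataflows agree within tolerance."""
    if independent_total == 0:
        return reported_total == 0
    relative_gap = abs(reported_total - independent_total) / abs(independent_total)
    return relative_gap <= tolerance

clinic_report = 1250.0      # total reported by the operating unit
community_survey = 1190.0   # the same total from an independent source

if not validate(clinic_report, community_survey, tolerance=0.02):
    print("Mismatch: data need review before being used for decisions")
```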


Detailed data … no more tyranny of the average

The socio-economic system is complex … but with meaningful data about a specific place it is possible to identify critical constraints in that place and address them. It becomes possible to understand cause and effect, and to identify and build on the specific possibilities of the community. The difference between the performance of one place and another is substantial … and data that improve decisions about the allocation of resources and the application of effort will make a big difference. In any specific place some things are good and some are bad … nothing, or very little, is average.
Good Managers Use Specific Data
Good managers understand their operations … they know what works well and they know what is not working well. They achieve improved performance by replicating what works well and eliminating what does not work well. As a result of this, the average improves. They do not work on the “average” … they work in one way on things that are good and in another way on things that are bad.
With meaningful metrics it becomes possible for managers and decision makers to facilitate more of what is good and to sort out what it is that makes some things bad. Decision making based on this approach results in impact that is huge … but again not easy to quantify.
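The point can be made with a few lines of Python … the places and numbers are hypothetical illustrations. The average says almost nothing, while the specifics say exactly where to replicate and where to intervene.

```python
# A minimal sketch of why the average hides actionable detail. The place
# names and scores are hypothetical illustrations.
from statistics import mean

yield_by_place = {
    "Village A": 3.8,   # works well: candidate for replication
    "Village B": 0.6,   # works badly: candidate for intervention
    "Village C": 2.1,
    "Village D": 1.5,
}

print(f"average: {mean(yield_by_place.values()):.2f}")  # 2.00 ... says little

# A manager works the specifics, not the average:
good = [p for p, v in yield_by_place.items() if v >= 3.0]
bad  = [p for p, v in yield_by_place.items() if v < 1.0]
print("replicate:", good)
print("fix:", bad)
```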

Beyond proxy measures of performance

Causality based on statistics is academically interesting … but not practical for operational decision making. In a specific place, there must be knowledge and understanding of specific cause and effects. Broad policy agendas do not translate efficiently into local action and impact without local specifics. The devil is in the detail … and only at local level is detail recognizable and issues solvable.

Good metrics improve decision making … and the impact from this will be huge, though difficult to predict and quantify. TVM metrics are meaningful about community priorities … not statistical constructs around some simplistic proxy for performance.
World Bank … disbursement a key progress metric
The World Bank for years used the amount of project disbursement as a key measure of the performance of a project. A project that did not disburse was considered a bad project. The idea of how well the funds were used was a secondary matter. The World Bank is a huge institution that has compiled a massive amount of data, almost all of it with a “Top Down” perspective and of little utility for practical improvement of society using “Bottom Up” community level decision making and activity.


