Burgess Book Manuscript
Basic Concepts for TrueValueMetrics
Version of 2010 by Section
LAST ... Open bk0002010-v2019 CHAPTER ... L0900-BCofTVM-2010-030000 NEXT ... Open bk0004010-v2019
LAST ... Open bk0003020-v2019 Section Navigation NEXT ... Open bk0003040-v2019
Chapter 3 ... Data Types and Attributes
3-3 Easy Data

Making Data Acquisition Efficient

Use what is available!

The fact is that there is a huge stock of data … much of which never gets used. Some is compiled at great expense, used just once, and then forgotten. Consultants have been paid enormous amounts of money to study a variety of things … and in practically every case the work includes compiling data, doing analysis and drawing conclusions. Once the study is done, the data exist but do not get used again. The system is high cost and inefficient. These data can have value in a system that seeks to understand community state, progress and performance at least cost.

Easy data are everywhere

Some data are easy to acquire ... some very difficult. Easy data should be used to the maximum extent possible, and may often be obtained very quickly. The key is not to ask for specific information in a specific form, but to ask what data are already available that broadly relate to the subject at hand, and then to use those data as fully as possible. In many cases such data are readily available.

Some easy data have the added advantage of providing some history from past periods that cannot be obtained in a data acquisition program that is only collecting current data.


Data repositories and documentation centers

A surprising amount of data exists … but it will only be found when there is a pro-active search. Much older historic data are in paper documents … and while not immediately usable in electronic form, the data may be transcribed if they seem to be of some value.

Of course care should be taken in using data … whether new data or old historic data … that the data represent what they purport to relate to! Much data has been “fabricated” over the years, serving to satisfy some dataflow conditionality without in fact representing any reality at all.
Some data … probably not very precise!
I know of a hospital in Africa that reported completely fictional data about its patients for years. The hospital was chronically short of doctors and nurses, had no admin staff … and was also short of money and medical supplies. It had to report extensive and intrusive statistics in order to receive even a limited supply of drugs from the government. The forms were filled with numbers every month based on pure guesswork, not on any data collection process! It was a sensible decision for the hospital … but it does not make for great data. We need to be careful about drawing important conclusions from unverified statistics!


Walking around … observation and perception

A large amount of data may be obtained simply by “walking around” … but converting this into a useful record is not particularly easy. Increasingly this is being done using photographic images, but too often there is inadequate labeling of the image. The time and the place are critical information … together with some brief narrative.
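A minimal sketch (in Python, with hypothetical field names chosen for illustration) of the record-keeping this implies: each image is stored with the time, the place and a brief narrative, rather than as a bare file.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Observation:
    """A 'walking around' record: an image is only useful with context."""
    image_file: str       # path to the photograph
    taken_at: datetime    # when -- critical information
    place: str            # where -- e.g. village, street, GPS string
    narrative: str = ""   # brief note on what the image shows

obs = Observation(
    image_file="market_stall.jpg",
    taken_at=datetime(2010, 3, 14, 9, 30, tzinfo=timezone.utc),
    place="Central market, north entrance",
    narrative="Three vegetable stalls open; prices posted.",
)
print(obs.place)
```

Any record structure with these four elements would serve; the point is that time, place and narrative are captured at the moment of observation, not reconstructed later.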

Training in “observation and perception” is helpful … too many people do not see what there is to be seen. Hardly anything of what people see gets into any system of metrics about the progress and performance of society. This has to change!


Not more and more data … more information.

The goal is not to get more and more data … but to get more and more understanding of the community and the socio-economic state, progress and performance.

Some duplicate data are an advantage. When the same set of facts is reported by two separate sets of data, there is a good probability that the data are accurate. If three separate sets of data show the same facts, it is very likely that the data are accurate. Further sets of data beyond this add nothing except cost.
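The corroboration idea above can be sketched in Python. The function name and the 5% tolerance are assumptions for illustration, not part of TVM:

```python
def corroborated(values, tolerance=0.05):
    """Return True when independent reports of the same fact agree.

    values: readings of one quantity from separate data sets.
    tolerance: maximum relative spread treated as agreement (assumed 5%).
    """
    if len(values) < 2:
        return False  # a single source cannot corroborate itself
    lo, hi = min(values), max(values)
    return (hi - lo) <= tolerance * max(abs(hi), 1e-9)

# Two independent surveys of the same village population:
print(corroborated([1480, 1510]))  # close agreement -> likely accurate
print(corroborated([1480, 2300]))  # wide disagreement -> investigate
```

With two or three agreeing sources the check passes; adding a fourth source changes nothing in the result, which mirrors the point that extra duplication only adds cost.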

Data about other things add to understanding. If one set of data is about health, another set about education would be interesting … and so would data about any other sector that seems to be of importance in the community, especially the production sectors.


Advanced common sense

The key goal of data acquisition is to have data that are useful and help improve performance. The goal of TVM is not to have data suited to research studies, but to have data that are useful for decision making and measuring performance.
Example of data acquisition for fishing fleet
A group of experienced scientists was asked to collect data about the structure of a fishing fleet. They designed a survey and a statistical method for their inquiries and conducted a perfectly random set of interviews three times a week for six months. At the end of this time they had almost nothing of value.
I was faced with the problem of time and money spent and no useful data. I am an accountant who does not particularly like statistical data. Every fishing boat has a license. To get a license a fishing boat must be registered ... and to get registered a form has to be filled in and filed somewhere! I found the filing cabinets, and now had details of every fishing boat ever registered ... date of registration, size, type of construction, date of construction, engine make and horsepower, fishing gear type, refrigeration equipment or not, and so on. After a day of data entry there was a respectable database. After a few days of checking at the fishing port we were able to verify much of the data ... and now had complete and good data about the fishing fleet.
This cost-effective data collection was achieved by building on data that were already available ... but unused because they sat in another department!
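A minimal sketch of this kind of "easy data" database, using Python's built-in sqlite3. The field names and sample rows are invented for illustration, loosely following the registration-form details described above:

```python
import sqlite3

# Hypothetical fields transcribed from paper registration forms.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE fleet (
        boat_id      INTEGER PRIMARY KEY,
        registered   TEXT,     -- date of registration
        length_m     REAL,     -- size
        hull         TEXT,     -- type of construction
        engine_hp    REAL,
        gear         TEXT,     -- fishing gear type
        refrigerated INTEGER   -- 1 if refrigeration equipment fitted
    )
""")
forms = [
    (1, "1978-05-02",  9.5, "wood",   40, "gillnet", 0),
    (2, "1983-11-19", 12.0, "steel", 120, "trawl",   1),
]
conn.executemany("INSERT INTO fleet VALUES (?,?,?,?,?,?,?)", forms)

# A first management question answered from data that already existed:
count, = conn.execute(
    "SELECT COUNT(*) FROM fleet WHERE refrigerated = 1").fetchone()
print(count)  # -> 1
```

A day of transcription into a table like this, followed by spot checks at the port, is the whole method: no sampling design, just data that was already sitting in a filing cabinet.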
Sometimes, the understanding of data may be enhanced by statistical study ... but good techniques of data collection, accounting and analysis are usually sufficient to get good management information for decision making. The key is to fully understand what data are important and what issues have a material impact on performance.




The information on this website may only be used for socio-enviro-economic performance analysis, education and limited low profit purposes
Copyright © 2005-2021 Peter Burgess. All rights reserved.