Saturday, January 14, 2012

How do we Maintain the Quality of Distributed Monetary Operations & Decision-making?


First, always remember that currency is a tool created by an issuing group, and that all fiscal/monetary/tax issues involve how best to use that tool. The better the tool, the more options it creates for both using and misusing it.

Separating tool & tool use is a good way to look at what goes wrong everywhere. For now, I'm going to attribute all usage problems to Cargo-Cult Process Management.  How we misuse monetary operations is similar to how we mismanage anything.

Instead of being careful, we're being stupidly lazy and making our nation one big skunk-works. We are, by definition, always in an ongoing aggregation process, yet it seems our thinking is typically stuck somewhere between:
1) personal gain,
2) system gain, or
3) co-optimizing both components + system.

Our greatest need? Refined methods for keeping higher proportions of our electorate practiced at stage 3 thinking.

What can we best use monetary operations for?  Obviously, to advance Public Purpose, which boils down to a 2-stage optimization task of maintaining components (people) as well as system (nation).  That goal is easily stated, but difficult to practice.   How do we actually tune monetary operations to help rather than hinder Public Purpose?  To answer that, we need to step back and look more broadly at context.

When naive people recognize a process for the first time, there's a temptation to think that its use is adequately defined by a few variables ... when, in fact, there's a polynomial list of variables involved and no easy way to see which permutations scale across multiple contexts. Otherwise, it wouldn't have required 3.5 billion years on Earth to produce the densely-engineered, incredibly complex system called "us".

Take something as mundane as orange juice.
Some mechanical engineers & organic chemists say, "How come the growers/grocers have all the orange juice? It's mostly sugars, ascorbic acid, and some volatile oils. That's not stable. Let's fractionate it for separate storage, then recombine on-demand." Voila! Cargo-cult context management!

Once deoxidized & fractionated, the purified parts that remain stable are hoarded in bulk storage.

Eventually though, such overly simplistic, Cargo-Cult thinking causes apoplexy among nutritionists, biochemists, physiologists, diverse clinicians, statisticians and general system scientists ... not to mention gourmets & chefs, and ultimately parents, educators & diverse developmental specialists.  Defeat was chosen from the beginning, because all the tool users weren't adequately linked to why they were using their tools.

We're always novices at emerging context. We presume that one-pass sampling of a complex system provides enough predictive power to cavalierly tinker with the whole system. To improve, we always have to explore more group options faster, so tinkering is needed, but must initially be safely isolated until beginners learn from most of the mistakes that others would see coming, if asked.  After that, all tinkering always needs full-group review.  There is no known mathematical alternative that can define the quality of distributed decision-making.  Decision quality, like data, is meaningless without full-group defined context.

Our greatest need - even in monetary operations - is to extend the sanity-check process to better sample all emerging sub-processes.  Here's how.
1) Have a rough model or checklist of all stakeholder repercussions (contingency tables).
2) Have a rough model for what constitutes adequate sampling of stakeholders (actually use the contingency tables).
3) Have a rough model for the ratio of disruptive/adaptive momentum (the summary from reading extended contingency tables).
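The three steps above can be sketched in code. This is only a toy illustration, not anything from the original post: the stakeholder groups, repercussion names, and scores below are invented, and "adequate sampling" is reduced to the naive rule that every group has at least one scored entry.

```python
# Hypothetical sketch of the three-step sanity check.
# All stakeholder names and scores are invented for illustration.

# Step 1: a rough contingency table -- rows are stakeholder groups,
# columns are repercussions scored as disruptive (-) or adaptive (+).
repercussions = {
    "growers":       {"storage_costs": -1, "year_round_sales": +2},
    "nutritionists": {"lost_volatiles": -2, "shelf_stability": +1},
    "consumers":     {"flavor_loss": -1, "lower_price": +1},
}

# Step 2: adequate sampling -- naively, every stakeholder group
# must have at least one scored repercussion before we proceed.
def adequately_sampled(table):
    return all(len(scores) > 0 for scores in table.values())

# Step 3: the ratio of disruptive to adaptive momentum,
# summarized by reading across the whole table.
def disruption_ratio(table):
    disruptive = sum(-v for scores in table.values()
                     for v in scores.values() if v < 0)
    adaptive = sum(v for scores in table.values()
                   for v in scores.values() if v > 0)
    return disruptive / adaptive if adaptive else float("inf")

if adequately_sampled(repercussions):
    print(f"disruptive/adaptive ratio: {disruption_ratio(repercussions):.2f}")
```

A ratio well above 1 would mean the change is imbalancing groups faster than they can re-organize, which is the warning condition the next paragraph names.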


Basically, don't imbalance groups faster than you can re-organize them. Otherwise, we're sawing off the limb we're sitting on.

[Note: It's not really as bad as it seems. Sub-processes more than ~5 layers deep in any system converge to stability. New options are statistically shielded from black boxes more than ~5 fractals in the past?]