Part 1 of 2
When it comes to compliance, sometimes knowing the “why” can make all the difference. In Part 1 of this series, Ken Agle provides some background on how the gold standard, historical wars, economic development, and other events have shaped modern transaction limitations.
“My financial institution stopped paying interest on my money market account because I wrote too many checks. Can they do this?”
One of the artifacts we request as part of our AffirmX compliance solution services focuses on how institutions monitor and control this particular aspect. Frequently, institutions and customers/members alike wonder why there are limitations on certain accounts, and how this all came to be.
Why are there such strict regulations?
Whenever this issue comes up, there usually isn’t time to fully explain the history behind the matter, which would go far to explain the “why.” So when I heard the question come up again recently, I resolved to finally write this brief recap of how these regulations came to be. Having a fundamental background can help us understand the nuances of these transaction limitations, which is especially useful in this particular instance, because the nuances are rather significant.
Modern reserve requirements and transaction limitations have been shaped by two major historical factors: metal standards and developing reserve requirements.
The Metal Standards Play their Role
Defining the first US monetary system
The United States came into being at a time when precious metals were used to establish monetary value. In the 1780s our financial founding fathers, led by Thomas Jefferson, Robert Morris, and Alexander Hamilton, recommended to Congress the value of a decimal system. This system would also apply to monies in the United States. The question was what type of standard: gold, silver, or both. The United States adopted a silver standard based on the Spanish milled dollar in 1785.
You can clearly see the impact of history upon that decision: at the time, the US seemed inclined to favor other nations over Great Britain.
Seven years later, in 1792, Congress passed the Mint and Coinage Act, which allowed the Government to use the Bank of the United States to hold its reserves. This act also established a fixed ratio between gold and silver in defining the US dollar. Spanish influence persisted, as the Spanish real remained part of the equation along with gold and silver coins.
Gold in turmoil
By 1806, the lingering cost of paying for the Revolutionary War had contributed to the suspension of minting silver coins, since they had largely been used to pay for the war. Over the next generation, fluctuations in gold and silver prices would wreak havoc on their fixed pricing. This was further exacerbated by the discovery of gold in California in 1848, when gold gluts caused its value to fall to nearly the same level as silver.
The next major blow to the bi-metallic standard was, of course, the US Civil War. The staggering costs of the war and inflationary pressures made payments in gold and silver exceptionally difficult, resulting largely in the creation of bank liabilities in the form of bank notes and deposits. In 1862, paper money was made legal tender; these notes were then, and are even now, called “greenbacks.”
For the next 30 or so years the battle raged over which standard to establish. At one point, silver was eliminated as a reserve element. During this time greenbacks and gold were considered the primary methods of payment and reserves. The battles over silver’s role would continue through the late 1800s until the Gold Standard Act of 1900 ended free silver as an effective implement of American politics, declaring the gold dollar to be the U.S. standard of value.
Giving up on metallic standards
The 1900s would see the end of the metallic standards. The inflationary pressures of World War I substantially undermined European nations’ ability to adhere to any form of the standard. Even so, many would find their way back to some form of the gold standard by 1927. Of course, that was before the Great Depression.
While most consider the Great Depression a 1929 event, it wasn’t really until 1931 that international banks were hit. Runs on some of the largest banks in Europe destroyed any remaining confidence in the metallic standard, as reserves were insufficient to meet the demands of a panicked global population.
The uneven international suspension of, and reliance on, the gold standard during the early 20th century is considered a material factor in both initiating and protracting the Great Depression. Some see Great Britain’s abandonment of the gold standard as the key event that shredded confidence in the banking system. Other arguments abound, but it remains logical that the US’ and other nations’ adherence to the gold standard prevented governments from expanding the money supply enough to stimulate their economies.
World War II gave us Bretton Woods, a gold exchange standard whereby participating countries pegged their currencies to the US dollar, which was in turn convertible to gold at an exchange rate of $35 an ounce. This system worked relatively effectively until the cost of yet another war, in this case the Vietnam War, made reliance on gold unfeasible; President Nixon ended convertibility on August 15, 1971. As of 2015, no country maintains a true gold standard (specie or bullion). However, that doesn’t mean that reserves are not maintained.
All of this leads us back to the original question (or complaint) we hear quite frequently: why does anyone care about transaction limitations? There may be many answers, but hopefully the history of the turbulent gold standard over the past 200-plus years illustrates that a delicate balance is at work, one frequently thrown into chaos during turbulent times.
Check back next week for part 2 in this series, “[Those Who Do Remember History: Part 2].”