
Archive for the ‘Guest Post’ Category

Objective Truths Regarding Minimum Wage

[Photo: portrait of Matthew Lenzini]

Guest post by Matthew Lenzini, chairman of the Colonial Region Republicans.

I can’t recall a time when both of my parents didn’t work. For most of my childhood, my father worked nights and weekends managing a diner in South Philadelphia. My mother waited tables on the night shift at the Philadelphia Airport Marriott. My first job in high school paid $4.75 an hour, and truth be told, given my skill set at the time, I was probably overpaid. My parents instilled in me a sense of hard work and frugality that exists to this day. We never had much money, and though I have been able to find some success in my life, I am a firm believer in adding value to society and living below my means. I believe that personal growth and success are core American values. I believe that we live in the land of opportunity and that with hard work and a little luck, everyone has the ability to better their circumstances. This by no means guarantees success, but unlike in many countries in the world, we all have a shot at the American Dream.

There has been a tremendous amount of discourse on the national stage regarding the federal minimum wage. Economically speaking, raising the minimum wage is actually bad for growth and harms those we are most seeking to help: entry-level workers and the working poor. I believe that much of what has been cast in the media is misleading, either intentionally so for political reasons or misguided due to misinformation and a lack of understanding. Unfortunately, feel-good economic policies catch the attention of the press, and politicians often present what appear to be efforts geared at helping the public but are in fact bad policies based on popular demand rather than sound judgment. I’d like to take a more objective and analytical look at four popular myths surrounding the minimum wage in America that, when analyzed, turn out to be wrong and misleading:

  1. Minimum wage has not kept up with inflationary pressures
  2. Increases to minimum wage will improve the economy and decrease unemployment
  3. Minimum wage earners are on average 35 years old
  4. Raising the minimum wage helps working families

Myth 1: Minimum Wage has not kept up with Inflationary Pressures

The first minimum wage was enacted in the United States in 1938. At the time, the rate was set at $0.25 an hour. There is a popular myth that the minimum wage rate has not kept up with inflation.

Objective Truth 1: Minimum Wage has kept up with inflation and the Consumer Price Index

If the 1938 wage were adjusted for annual inflation through 2015, the minimum wage would currently be around $4.90, give or take a few pennies. The federal government has reset the minimum wage a number of times, typically during or after periods of rapid inflation. The table below highlights a few of the federal adjustments to the minimum wage and what each would be worth today, adjusted for inflation, if the federal government had made no further adjustments.

[Tables: selected federal minimum wage adjustments and their inflation-adjusted values (Lenzini 1-3)]

I’d like to note a few things regarding the table above. Adjusted for inflation, the minimum wage has fluctuated somewhat but typically stays within one or two standard deviations of the average, which is approximately $8.20. The inflation-adjusted high was $10.07 in 1981 and the low was $4.97 in 1939. All in all, the current rate of $7.25 is well within one standard deviation of the average, which is about +/- $1.25 (well within tolerance limits for statistical measurements).
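As a quick back-of-the-envelope check on that claim, here is a minimal sketch in Python using only the rounded figures quoted above (not the underlying BLS series):

    # Check that the current $7.25 rate falls within one standard deviation
    # of the historical inflation-adjusted average, using the rounded
    # figures quoted in this post (illustrative, not official data).
    average = 8.20   # approximate inflation-adjusted average
    std_dev = 1.25   # approximate one-standard-deviation band
    current = 7.25   # current federal minimum wage

    within_one_sd = abs(current - average) <= std_dev
    print(within_one_sd)  # True: |7.25 - 8.20| = 0.95 <= 1.25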

Now, there is an argument that rather than using inflation, one should use the Consumer Price Index. After all, the cost of goods and services changes over time. The CPI uses a baseline year (1984) as an index: 1984 has an index of 100, and all other years are expressed as a percentage of that index. For instance, 1939 has an index of 13.9. If we take the 1981 wage of $3.35 and multiply it by 13.9 percent, we get a 1939 equivalent rate of $0.47. Using dates similar to those in our inflation-based analysis, we get the following CPI-adjusted rates. Again, we find that the federal rates are within a reasonable tolerance of the average, with the greatest deviation occurring in 1939, one of the first years a minimum wage existed and a year that fell in the midst of the Great Depression.

[Table: CPI-adjusted minimum wage rates for selected years (Lenzini 4)]
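The arithmetic behind that conversion is simple enough to sketch in a few lines of Python. The index values below are the rounded figures quoted in this post, and the sketch treats the 1981 wage as being in (roughly) base-year dollars, as the example above does; it is illustrative, not a reproduction of the official BLS series.

    # Convert a wage from one year's dollars to another using the ratio of
    # CPI index values (index figures are the rounded ones quoted above).
    def convert_wage(wage, cpi_from, cpi_to):
        """Express `wage`, set in a year with index `cpi_from`, in the
        dollars of a year with index `cpi_to`."""
        return wage * (cpi_to / cpi_from)

    # The 1981 wage of $3.35 expressed in 1939 dollars (index 13.9 vs. 100).
    print(round(convert_wage(3.35, 100, 13.9), 2))  # 0.47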

Myth 2: Increases to minimum wage will improve the economy and decrease unemployment

The U.S. labor participation rate is currently sitting below 63 percent. The last time the rate was this low was in the late 1970s, when, according to President Carter, we had a “crisis of confidence.” The reason I use the labor participation rate instead of the more popular unemployment rate is that the unemployment rate only counts people who are actively seeking work. Anyone who has given up hope of finding work does not show up in the U.S. unemployment figures. The “real” rate of unemployment is closer to 10 or 11 percent. The unemployment rate has improved for the wrong reason: people have stopped looking for jobs. A number of legislators have pushed for an increase in the minimum wage. Their thought process is that a higher minimum wage will improve the economy and get people back to work. Unfortunately, that is very far from the economic reality.

Objective Truth 2: Increasing the Minimum Wage will actually increase unemployment rates

The labor market reacts like any other market: it is primarily driven by the laws of supply and demand. Without a minimum wage (which is effectively an artificial floor on the price of labor), the market would settle at a rate where the demand for labor and the supply of labor meet. In other words, without an artificial floor, we would reach maximum economic output and the lowest level of unemployment (excluding externalities such as the disincentive to work created when one can “make” more on government subsidies, i.e., welfare).

Artificial floors create economic loss. Supply will exceed demand, and the gap becomes unemployment. The higher the wage, the more people will be willing to work, so supply rises. However, given the higher wage, a business with limited resources will be able to hire fewer people. The graphic below highlights the conceptual increase in unemployment when the artificial floor is raised. This is not unique to labor; it is an economic reality for all goods and services. The unemployment rate is the gap between the supply of labor and the demand for labor.

[Figure: labor supply and demand with an artificial wage floor (Lenzini 5)]
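To make the mechanics concrete, here is a minimal sketch with made-up linear supply and demand curves; the slopes and intercepts are purely illustrative and are not estimates of the actual labor market.

    # Illustrative linear labor market: quantities in millions of workers,
    # wages in dollars per hour. All parameters are invented for this sketch.
    def labor_demanded(wage):
        return max(0.0, 100 - 5 * wage)  # firms hire less as the wage rises

    def labor_supplied(wage):
        return max(0.0, 10 + 5 * wage)   # more people offer to work at higher wages

    # Market clears where 100 - 5w = 10 + 5w, i.e. at w = $9.00 (55M employed).
    # Any floor above that creates a surplus of labor -- unemployment.
    for floor in (9.00, 10.10, 15.00):
        gap = labor_supplied(floor) - labor_demanded(floor)
        print(f"wage floor ${floor:.2f}: unemployment gap = {max(0.0, gap):.1f}M")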

Howard Schultz, the CEO of Starbucks, was asked about the $15 minimum wage in Seattle. He said that Starbucks would adapt: it can leverage technology and automation, allowing it to hire fewer people, but smaller companies don’t have that option. “I wouldn’t want to see the unintended consequences of job loss as a result of going that high. That would not be the case at Starbucks, but I suspect that most companies, especially small- and mid-sized companies, would not be able to afford it.” The net effect will be one of a few scenarios: (a) the company uses more technology to maintain its margins, (b) the company raises prices, or (c) the company goes out of business. None of these bode well for the employees or the consumer.

Myth 3: Minimum wage earners are on average 35 years old

There is a myth that the average age of a minimum wage earner is 35 years old. I believe that this is a purposely misleading statistic, put forth by certain politicians seeking populist support for re-election.

Objective Truth 3: The distribution of minimum wage earners is skewed

If you have four 17-year-olds and two 68-year-old retirees (who are most likely working to earn a second income on top of any retirement benefits), you have an average age of about 35. When we look at the underlying data, we do in fact see that the distribution is skewed toward young people working their first job and older Americans who may be working to supplement their income. Let’s also be clear that only about 4.3% of all working Americans earn the minimum wage; most workers earn more than the current rate. That number is down significantly from the 1980 high, when 15% of working Americans earned the minimum wage. Fifty percent of minimum wage earners are under the age of 24, and 25% are teenagers.
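The point about an average hiding a skewed distribution is just arithmetic; here is the hypothetical mix from the example above.

    # Hypothetical group from the example above: four 17-year-olds and two
    # 68-year-old retirees.
    ages = [17, 17, 17, 17, 68, 68]
    print(sum(ages) / len(ages))  # 34.0 -- an average of "about 35" in a group
                                  # that contains no one anywhere near 35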

[Charts: distribution of minimum wage earners (Lenzini 6, Lenzini 7)]

The vast majority, 64%, are part-time employees who mostly occupy low-skill jobs in food service or retail. Only about 20% of those who earn the minimum wage are married, and only 13% are married and over the age of 25. Nearly 80% do not have a college education, and most do not have a high school diploma. So the argument that too many Americans are trying to support their families on the minimum wage is just not true. The percentage of minimum wage earners who are heads of household is actually the lowest it has been since the metric has been tracked, and the path to higher wages is, and always should be, improving your skill set.

Myth 4: Raising the minimum wage helps working families

A common element that is often missed in the debate surrounding the minimum wage is the role of the earned income tax credit. Many supporters of raising the minimum wage state that the intent is to help young working heads of household support their families.

Objective Truth 4: Raising the minimum wage hurts those we intend to help

Unfortunately, by raising the minimum wage, we do the exact opposite: as mentioned above, when we raise the wage uniformly, we actually create higher levels of unemployment. A better-targeted approach to helping families is to continue to leverage the earned income tax credit, which provides a financial benefit to those who are heads of household and those with children. In essence, it is a much better way to pinpoint those working adults while still allowing businesses to employ as many people as possible (for instance, high school students in their first jobs, who have minimal skills and few responsibilities). The graphic below is from the United States Treasury Department.

In 2014, a married minimum wage earner with two children received an additional $5,460 in tax credits. Assuming that the average work year consists of 2,080 hours at 40 hours a week, this equates to an additional $2.63 an hour above the minimum wage. Adding the two, the effective hourly wage would be $9.88. This figure does not include any other state benefits, housing assistance, education assistance, or other government program support.
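The per-hour figure works out as follows; a quick sketch using only the numbers quoted in the paragraph above.

    # Spread the 2014 EITC for a married filer with two children over a
    # full-time work year, then add it to the federal minimum wage.
    # All figures are the ones quoted above.
    eitc = 5460           # earned income tax credit, in dollars
    hours = 2080          # 40 hours/week * 52 weeks
    minimum_wage = 7.25   # current federal minimum wage

    per_hour_credit = eitc / hours
    print(per_hour_credit)                 # 2.625, i.e. roughly $2.63/hour
    print(minimum_wage + per_hour_credit)  # 9.875, i.e. about $9.88/hour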

[Figure: earned income tax credit schedule, from the U.S. Treasury Department (Lenzini 9)]

Summary: I have only touched on a few of the more common misconceptions surrounding the minimum wage debate in America. The truth of the matter is that many politicians push for an increased minimum wage either because they know it will get votes or because they are ill-informed. The minimum wage has been raised in the past, and it will be raised again. We cannot, however, afford to do so arbitrarily and without thoughtful, informed decision making. Plenty of ideas feel good, but they have to make sense in the long run. We need to do away with feel-good economics and political ideologies crafted on ideals that are not grounded in economic truths, and with policies that pander to populist ideas and will only hurt those we intend to help.

Matthew can be reached at Lenzinml@yahoo.com


Below is a guest post from the Sound Money Defense League. CRI does not necessarily endorse or oppose the views expressed in the article below, but we do believe it’s important to understand the value of money and how currencies get debased.

We Americans no longer carry gold and silver money in our pockets and purses as our grandparents did during their lives. But we still carry the history, legacy and spirit of those gold and silver coins in our language – with more meaning than you might imagine.

“Sound money” has a clear message recognized for centuries around the world. It describes the musical, metallic ring of a gold, silver, or copper coin dropped on any hard surface of glass, stone, wood, or metal. Sound money literally refers to real wealth, with a natural, unmistakable signature of honesty and integrity, as opposed to the swishy paper and plastic debt used almost exclusively today.

The term “sound money” is believed to come from Ancient Rome, where small silver coins were standard in everyday commerce, from paying Roman soldiers to buying exotic goods from all corners of the known world. As Rome squandered its wealth, it found what seemed an easy shortcut to shore up the treasury. It gradually debased those silver coins with common metals, ultimately cutting the silver content to just 5 percent.

But that didn’t fool anyone for long, most of all disciplined Roman soldiers, who did not appreciate being paid with worthless mystery metal in return for risking their lives on Rome’s bloody battlefields.

Do You Want True Money or a Debased Dud?

Not every Roman soldier had room in his gear for a touchstone, usually fieldstone or slate, used to test the purity of metals. But they quickly discovered the difference in sound between true money and a debased dud.

They recognized that real silver had a distinctive melodious ring when bounced on a hard surface, such as the blade of a handy sword, a bronze breastplate, or an ornate marble floor. Sound money carried the ‘ring of truth,’ while debased coinage landed with a dull, disappointing thud.

The debasement of Rome’s silver currency unmasked the deceit of a bankrupt empire, which ended with the fall of Rome, a pattern repeated many times. Sound money’s “ring of truth” had found its place in the history of money and of nations.

As the United States grew westward to the Pacific Coast and north to Alaska, gold, silver, and copper coins of all nations were legal tender in the young republic until the 1850s, and they remained in use long after that. Americans with no formal education in reading, writing, and arithmetic relied on the sight, sound, and feel of the only money they knew. Learning the different musical ringing sounds of those coins could easily qualify even a prairie settler fresh off the wagon train as an economic expert.

In the Old West of the range-roving American cowboy, the ring of a silver dollar tossed on the polished oak bar told the saloon keeper he was pouring whiskey for sound money, and not for a counterfeit.

The sound money test unmasked one of the most famous counterfeiting schemes in American coinage history. The Liberty Nickel (1883-1913) was originally struck without the words “Five Cents,” bearing instead only the Roman numeral “V.” Gold-plated Liberty Nickels were passed off as a newly designed $5 gold piece, but the sound money test quickly exposed the scandal. Within six months of issuing the first “V” nickels, the U.S. Mint added the words “Five Cents.” But for many years afterward, every Liberty $5 Half Eagle in town was tested for its ring of truth.

Sound money means simplicity, honesty, and trustworthy recognition. It stands for strength and durability, which were also characteristics of those pioneering Americans who built our nation.

The ring of sound money for centuries has transcended borders and nationalities by singing its own melodic language. No matter what words were stamped into a precious metal coin, that ring of sound money certified its value, or exposed the deception.

Governments Have Distorted the Meaning of Money

“Sound money” carries such a powerful message there’s little wonder that governments issuing paper fiat currency have attempted to corrupt its meaning, with help from unimaginative and lazy educators and journalists.

“Hard currency” first referred to metal coins, not paper money, but the term over the years has come to mean that flimsy, paper, folding cash is more trustworthy than a handwritten check or IOU.

“Good as gold” is another aberration of “sound money,” usually referring to credit worthiness, even though there is no credit as good as gold.

When Washington and Wall Street began pushing plastic credit cards, which are nothing more than debt disguised as wealth, Americans were introduced to the gold card along with the credit rating and FICO score as a false measure of one’s financial worth. Today, the newest edition of the $100 Federal Reserve note carries a golden inkwell and feather pen, as if to sarcastically say that money itself is a masquerade of paper scrip and not precious metal.

Americans today have no memory of those times when gold, silver, and copper coins were tossed across a store counter, or counted out by hand, to pay for everything from penny candies to Ford Model-T automobiles. That era began ending when President Roosevelt in 1933 outlawed the use of gold coins in everyday American commerce.

The separation of Americans from their Constitutional heritage to true money continued through 1964, with the end of small coinage containing 90% silver. The deception was complete by 1982 when copper quietly disappeared from the Lincoln penny.

But no government could remove the ringing echo of sound money from history, or from us. And government cannot camouflage its counterfeits with gold colored paint. You can experience sound money’s evident ring of truth for yourself. Toss any gold or silver coin on your kitchen table and you will hear the history of honest money ringing down through the centuries.

And perhaps, thanks to grassroots projects like the Sound Money Defense League, you will hear the trumpeting of better days to come.

Sound Money Defense League and MoneyMetals.com columnist Guy Christopher is a veteran writer living on the Gulf Coast. A retired investigative journalist, published author, and former stockbroker, Christopher has taught college as an adjunct professor and is a veteran of the 101st Airborne in Vietnam.

 


Photo credit: learnaboutancientrome.weebly.com.

Below is a guest post from Lawrence Reed, president of the Foundation for Economic Education. The Foundation for Economic Education, founded in 1946, is the leader in education, publishing, and the production of ideas related to the economic, ethical and legal principles of a free society. Republished with permission.

More than 2,000 years before America’s bailouts and entitlement programs, the ancient Romans experimented with similar schemes. The Roman government rescued failing institutions, canceled personal debts, and spent huge sums on welfare programs. The result wasn’t pretty.

Roman politicians picked winners and losers, generally favoring the politically well connected — a practice that’s central to the welfare state of modern times, too. As numerous writers have noted, these expensive rob-Peter-to-pay-Paul efforts were major factors in bankrupting Roman society. They inevitably led to even more destructive interventions. Rome wasn’t built in a day, as the old saying goes — and it took a while to tear it down as well. Eventually, when the republic faded into an imperial autocracy, the emperors attempted to control the entire economy.

Debt forgiveness in ancient Rome was a contentious issue, and relief measures were enacted multiple times. One of the earliest Roman populist reformers, the tribune Licinius Stolo, passed a bill that was essentially a moratorium on debt around 367 BC, a time of economic uncertainty. The legislation enabled debtors to subtract the interest paid from the principal owed if the remainder was paid off within a three-year window. By 352 BC, the financial situation in Rome was still bleak, and the state treasury paid many defaulted private debts owed to the unfortunate lenders. It was assumed that the debtors would eventually repay the state, but if you think they did, then you probably think Greece is a good credit risk today.

In 357 BC, the maximum permissible interest rate on loans was roughly 8 percent. Ten years later, this was considered insufficient, so Roman administrators lowered the cap to 4 percent. By 342, the successive reductions apparently failed to mollify the debtors or satisfactorily ease economic tensions, so interest on loans was abolished altogether. To no one’s surprise, creditors began to refuse to loan money. The law banning interest became completely ignored in time.

By 133 BC, the up-and-coming politician Tiberius Gracchus decided that Licinius’s measures were not enough. Tiberius passed a bill granting free tracts of state-owned farmland to the poor. Additionally, the government funded the erection of their new homes and the purchase of their farming tools. It’s been estimated that 75,000 families received free land because of this legislation. This was a government program that provided complimentary land, housing, and even a small business, all likely charged to the taxpayers or plundered from newly conquered nations. However, as soon as it was permissible, many settlers thanklessly sold their farms and returned to the city. Tiberius didn’t live to see these beneficiaries reject Roman generosity, because a group of senators murdered him in 133 BC, but his younger brother Gaius Gracchus took up his populist mantle and furthered his reforms.

Tiberius, incidentally, also passed Rome’s first subsidized food program, which provided discounted grain to many citizens. Initially, Romans dedicated to the ideal of self-reliance were shocked at the concept of mandated welfare, but before long, tens of thousands were receiving subsidized food, and not just the needy. Any Roman citizen who stood in the grain lines was entitled to assistance. One rich consul named Piso, who opposed the grain dole, was spotted waiting for the discounted food. He stated that if his wealth was going to be redistributed, then he intended to get his share of grain.

By the third century AD, the food program had been amended multiple times. Discounted grain was replaced with entirely free grain, and at its peak, a third of Rome took advantage of the program. It became a hereditary privilege, passed down from parent to child. Other foodstuffs, including olive oil, pork, and salt, were regularly incorporated into the dole. The program ballooned until it was the second-largest expenditure in the imperial budget, behind the military. It failed to serve as a temporary safety net; like many government programs, it became perpetual assistance for a permanent constituency who felt entitled to its benefits.

In 88 BC, Rome was reeling from the Social War, a debilitating conflict with its former allies in the Italian peninsula. One victorious commander was a man named Sulla, who that year became consul (the top political position in the days of the republic) and later ruled as a dictator. To ease the economic catastrophe, Sulla canceled portions of citizens’ private debt, perhaps up to 10 percent, leaving lenders in a difficult position. He also revived and enforced a maximum interest rate on loans, likely similar to the law of 357 BC. The crisis continually worsened, and to address the situation in 86 BC, a measure was passed that reduced private debts by another 75 percent under the consulships of Cinna and Marius.

Less than two decades after Sulla, Catiline, the infamous populist radical and foe of Cicero, campaigned for the consulship on a platform of total debt forgiveness. Somehow, he was defeated, likely because bankers and Romans who actually repaid their debts opposed his candidacy. His life ended shortly thereafter in a failed coup attempt.

In 60 BC, the rising patrician Julius Caesar was elected consul, and he continued the policies of many of his populist predecessors with a few innovations of his own. Once again, Rome was in the midst of a crisis. In this period, private contractors called tax farmers collected taxes owed to the state. These tax collectors would bid on tax-farming contracts and were permitted to keep any surplus over the contract price as payment. In 59 BC, the tax-farmer industry was on the brink of collapse. Caesar forgave as much as one-third of their debt to the state. The bailout of the tax-farming market must have greatly affected Roman budgets and perhaps even taxpayers, but the catalyst for the relief measure was that Caesar and his crony Crassus had heavily invested in the struggling sector.

In 33 AD, half a century after the collapse of the republic, Emperor Tiberius faced a panic in the banking industry. He responded by providing a massive bailout of interest-free loans to bankers in an attempt to stabilize the market. Over 80 years later, Emperor Hadrian unilaterally forgave 225 million denarii in back taxes for many Romans, fostering resentment among others who had painstakingly paid their tax burdens in full.

Emperor Trajan conquered Dacia (modern Romania) early in the second century AD, flooding state coffers with booty. With this treasure trove, he funded a social program, the alimenta, which competed with private banking institutions by providing low-interest loans to landowners while the interest benefited underprivileged children. Trajan’s successors continued this program until the devaluation of the denarius, the Roman currency, rendered the alimenta defunct.

By 301 AD, while Emperor Diocletian was restructuring the government, the military, and the economy, he issued the famous Edict of Maximum Prices. Rome had become a totalitarian state that blamed many of its economic woes on supposed greedy profiteers. The edict defined the maximum prices and wages for goods and services. Failure to obey was punishable by death. Again, to no one’s surprise, many vendors refused to sell their goods at the set prices, and within a few years, Romans were ignoring the edict.

Enormous entitlement programs also became the norm in old Rome. At its height, the largest state expenditure was an army of 300,000–600,000 legionaries. The soldiers realized their role and necessity in Roman politics, and consequently their demands increased. They required exorbitant retirement packages in the form of free tracts of farmland or large bonuses of gold equal to more than a decade’s worth of their salary. They also expected enormous and periodic bonuses in order to prevent uprisings.

The Roman experience teaches important lessons. As the 20th-century economist Howard Kershner put it, “When a self-governing people confer upon their government the power to take from some and give to others, the process will not stop until the last bone of the last taxpayer is picked bare.” Putting one’s livelihood in the hands of vote-buying politicians compromises not just one’s personal independence, but the financial integrity of society as well. The welfare state, once begun, is difficult to reverse and never ends well.

Rome fell to invaders in 476 AD, but who the real barbarians were is an open question. The Roman people who supported the welfare state and the politicians who administered it so weakened society that the Western Roman Empire fell like a ripe plum that year. Maybe the real barbarians were those Romans who had effectively committed a slow-motion financial suicide.

Read the original post at the FEE website.


This is a guest post from Mary Sandra Marie, Marketing Coordinator with the Regency Shop, based in Los Angeles. CRI is posting it in support of Breast Cancer Awareness Month. CRI is not being paid to post this, nor are we affiliated in any way with the Regency Shop. If you are interested in bidding on these items to raise money for breast cancer awareness, you may visit the links below.

Here Is How Regency Shop Is Supporting Breast Cancer Survivors

October is the official Breast Cancer Awareness Month, and individuals and businesses alike are coming up with creative and innovative ideas to support the cause. Regency Shop, a renowned modern furniture retailer, has organized a charity auction to financially support breast cancer research and encourage breast cancer survivors.

The retailer is providing beautiful custom-designed chairs while encouraging people to contribute to an important social cause. Regency Shop is participating in the movement with three themed auctions. Participants will get a chance to show their support for breast cancer survivors and an opportunity to win one-of-a-kind stylish lounge chairs.

According to recent studies, about one in every eight American women will develop invasive breast cancer, and that count is expected to rise. You can raise awareness of the disease and support millions of women by taking part in Regency Shop’s exclusive auction during Breast Cancer Awareness Month, from October 1 to October 30, 2014.

The charity auction organized by Regency Shop is not merely a contest; it is a chance to become one of October’s heroes by supporting a global cause. It is a way to show your care for breast cancer survivors and bring a positive change to the world.

Regency Shop is auctioning three magnificent chairs: an Eero Aarnio ball chair, a hanging bubble chair, and a Barcelona chair. All three chairs are 100% custom designed and pink, as pink is the signature color used in breast cancer awareness campaigns.

For auction listings, visit:
Eero Aarnio Ball Chair
http://www.regencyshop.com/p514/Eero-Aarnio-Style-Ball-Chair-Special-Edition/product_info.html

Barcelona Chair
http://www.regencyshop.com/p513/Ibiza-Chair-Special-Edition/product_info.html

Hanging Bubble Chair
http://www.regencyshop.com/p515/Hanging-Bubble-Chair-Special-Edition/product_info.html

Mary Sandra Marie

Marketing Coordinator, Regency Shop
