Government intervention, the President and current administration tell us–daily–is the way that we will achieve our goals for the future.
There are many reasons to take exception to this statement. Some of the more obvious (at least to me): First, what happened in the years before the modern government existed? How did people even survive, let alone thrive, in the absence of the vast resources available from current government programs? Second, why are so many people telling us, to the point of shouting down any dissenters, that this is THE ONLY WAY to do business in today’s world? Third, toward whose goals are we pressing, again? Fourth, what makes you (the progressives) so sure that “everybody” wants what you do?
Just a Thought: Independent men built this country…not bureaucrats.
One of the email newsletters to which I subscribe is titled “The Morning Jolt” and features bold headlines and news that will “set your blood to simmer[ing],” as the author wrote in the February 13, 2013 edition. This daily imprint features Jim Geraghty’s thoughts on a number of subjects, and he is usually dead-on in his analysis of current events. He is committed to combating the pervasive influence that lovers of the “progressive agenda” (read “retrogressive suppression of independence”) exert on America.
Jim Geraghty is a conservative political pundit and a contributing editor to The National Review.
One of the subjects he addresses head-on in the 2/13 edition of Morning Jolt is the rising cost of employing full-time workers here in the US. The article he cites, written by Charles Hugh Smith, is worth reading because it addresses several factors influencing today’s prevalent economic stagnation (one of which is the employer’s return-on-investment consideration, woefully under-discussed in the current climate), and because it does so in such a well-written format that there is no need for this author to reinvent the wheel.
Geraghty quotes extensively from the article cited above, and then offers a few words of his own on this subject:
“In short, the unemployed, the departed-the-workforce, the just-entered-the-workforce and soon-to-enter-the-workforce cannot be sufficiently productive to justify the expense of hiring them. And we know this pretty much has to be true, because corporations are sitting on roughly $1.7 trillion in cash right now [according to a recent article from one of the blogs of the New York Times, apparently. I (Dave here) was not able to follow this link to research it.]. It’s not that they don’t have the money to hire people. They just don’t think that hiring people would generate more money than having it just sit there in their accounts, which is a phenomenally depressing conclusion.”
That’s pretty simple, and it’s pretty clear, too. Excessive and punitive regulation has driven the cost of adding new workers so high that it has exacerbated the underemployment of the younger and (usually) less-experienced members of the workforce. Why do I say it has made this problem worse, rather than that “excessive regulations have caused underemployment of the young”? Because this underemployment is rooted in the understandable predisposition of employers to hire more experienced workers rather than younger ones, who require more investment (time to train, money to pay minimum wages, benefits, etc.) and yield less return for said outlay.
In other words, the employer is made slightly worse off by being forced to be more choosy about whom he hires; or, alternatively, by being forced to buy machinery to automate what a minimum-wage worker would otherwise do. The potential minimum-wage worker, however, is made much worse off: not only is he robbed of the opportunity to work for minimum wage, he is also (often) denied the chance to gain the experience that would qualify him for more appealing, better-paying jobs.
As one of the young and underemployed, I know whereof I speak. A six-month job hunt–during which I have aggressively sought employment by repeated phone calls, in-person submissions of my resume, and face-to-face introductions–has resulted in only one interview.
Am I complaining? No, I’m simply pointing out that I understand (from personal experience) how tough it is to find a job in today’s depressed economy.
While regulation that is punitive (from an employer’s perspective) is certainly a key factor, to say that it is the sole variable in this complicated problem would be to commit an unforgivable oversimplification.
I cannot in the scope of this article address every economic factor that contributes to the high unemployment rate among young workers; however, I believe there are two surpluses, surpluses that go largely unaddressed in the current discussion on the unemployment rate, that play a critical role in this phenomenon.
The first surplus is a surplus of degrees. More and more, we see students entering liberal arts colleges to pursue majors such as Twelfth-Century Poetry, Ancient Literature Interpretation, and Underwater Basket-Weaving. (OK, I made the last one up, but you get the idea.) Even students in fields that featured robust demand a few years ago are seeing a dramatic drop in employment opportunities. This decrease has affected not only “soft” majors such as the ones above, but also “not-so-soft” majors like liberal arts, communication arts, art, and others. Geraghty wrote a very insightful paragraph in his February 13 “Morning Jolt” column:
“Folks, the art world and publishing world are fiercely competitive even in the very best of times, so you’re going to need a backup career just in case things don’t work out. This also applies to those who aspire to fame and fortune in journalism, professional athletics, the music industry, most of the entertainment industry, and most of the jobs that the world covets. You’ve got to be really talented, and really hard-working. And yes, lucky. I realize I’m very, very, very, very lucky to have a job that I (usually) enjoy and that allows me to make a living. Of course, I suspect those outside those fields overestimate the role of luck. My buddy Cam — now on the Sportsman Channel! – will periodically hear from someone, ‘Boy, you’re really lucky to find a job where you get to host a radio show!’ and he has to bite his tongue and refrain from mentioning all the years he worked as reporter and assistant news director, driving all over the state of Oklahoma on any assignment he could get, long hours, lousy pay, and so on.”
He also makes a very pithy, observant statement: “Nobody just hands you a plum job in journalism.” Truth! It may shock some college students to realize that nobody “just hands you” any plum job. Most of the plum jobs in the world go to those who have busted their behinds for them.
Is that fair?
Before you answer that, stop and think about how you would feel if you spent ten years of your life working at any job you could find in your chosen profession, striving for that “dream job,” only to see it handed to some fresh-faced newbie straight off the education assembly line because they “deserved it.” How would that make you feel?
Yeah, you’re right…there are two sides to every coin.
The second surplus, a surplus of graduates without practical skills, is the result of a lack of practical education. Increasingly, college graduates are sorely lacking in portable skills that can be obtained only through personal contact and interaction with people. Why is there such a deficit of ordinary, everyday interpersonal skills? Again, the full answer is too long and complex for a post of this length, but some contributing factors are the increasing obsession with screen media, the widespread revolt against traditional values, and the epidemic not of illiteracy, but of a-literacy.
According to Susan Jacoby’s book The Age of American Unreason, published in 2008, only 57 percent of the American public has ever read a non-fiction book. Making the reasonable assumption that some people’s only exposure to non-fiction comes from required reading for school, the number of people who read non-fiction books out of personal desire is quite possibly substantially smaller. This is relevant to the discussion at hand precisely because readers are better equipped than non-readers to follow, evaluate, and process a complex train of thought in a logical fashion.
In education, as in many other areas of life, many Americans have forsaken personal responsibility for the convenience of a “pre-packaged” curriculum. Some seem to think that if it’s not taught in school, or required reading for one of their classes, it’s just not worth the time it would take to read about it.
It’s heartbreaking to see education of all types going to ruin here in the States. It’s particularly sad when one considers the historical successes of the “self-made man,” the man who educated himself–outside the scope of the marbled halls and manicured lawns of the university–at great expense of time and effort, and built a business, a trade, a living, and (for more than a few) a fortune. The current scoffing at those who have made their living in this way reflects the near-idolatrous regard many hold for the “almighty college degree.” Current disdain notwithstanding, self-education–as with many other forms of self-reliance–is a phenomenon that is disproportionately represented in the United States. Freedom to keep the proceeds of one’s efforts truly is the greatest encouragement to innovation and initiative.
In short, the history of government intervention of all sorts into employment contracts is fraught with examples of stagnation following efforts taken for the (express) purpose of producing greater economic growth. In every case, more government intervention and regulation resulted in LESS growth, not the MORE that the politicians predicted. The brighter tomorrow our elected officials pay lip service to will only come about by a return to the truly American principles of independence, self-education, individual choice and liberty, personal responsibility, and self-governance.