Thursday, December 31, 2009

I was re-reading Roger Zelazny’s Amber Chronicles this past week, and it occurred to me to wonder about an eternal country built on what amounted to a feudal system. The problem in any fictional work, of course, is that any system will have its flaws, and so any form of government described will have failings which readers will discover and announce. Then again, some are better than others – J.K. Rowling is a very good writer, for example, but her attempt to create a believable magic world parallel to the mundane world we know falls on its face as soon as any critical thought is applied to the matter. Zelazny’s Amber and Chaos realms function relatively well as monarchies, even if that includes byzantine rules and Machiavellian conspiracies. The focus is on the royal person, and the story works because the reader naturally focuses on him as the protagonist. That, in turn, made me wonder how much we have really advanced in our sense of democracy.
Far and away, the voters who supported Barack Obama for President in 2008 did so because they liked the man, not because of his qualifications. Frankly, one incomplete term in the U.S. Senate, some talk show appearances to plug his books, and a network of cronies in Al Capone’s old neighborhood do not begin to qualify a man for the most powerful political office on the planet. President Obama is far from the first President, however, to win office over more qualified opponents. America’s history is full of men who became President because of eloquent speeches, wartime heroism, or even physical appearance – some voters told the press that they chose Warren Harding over James Cox in 1920 because Harding looked more honest. Critics from the media and rival political factions have noted the royal demeanor of Presidents Reagan and Kennedy, and even of far earlier men like John Quincy Adams and Thomas Jefferson. In that manner, Barack Obama is merely acting in a long-established pattern, though a strange one for a nation conceived in the notion that men should build their own fortunes and the government should work for the people. This brings up the second class to consider: the villains.
In modern parlance, “villain” means the bad guy, but for many years a villain simply meant a person of the village – that is, a common and crude person. The inference was usually that the villain was a hindrance to the (usually noble) hero of the story, because he was unable to understand the hero’s mission and did not properly respect the noble for who he was. Vestiges of this thinking continue – in most movies and TV shows, the villain is not especially smart, and is almost never good-looking or successful, while the hero is smart, wise, good-looking, and conveys a clear biological superiority over everyone else. So naturally, in politics the major players always try to cast some villains, in order to shine in comparison. It was not enough, therefore, for Barack Obama to cast himself as the hero of his story; it was necessary for him to mock and insult Senator Clinton, and after winning the party nomination, to do the same to Senator John McCain and Governor Sarah Palin. Since winning the White House, Obama continues to deride anyone who stands in the way of his policies and proposals, rather than defend his position or prove that it is the best course or even well-considered. It’s just easier, and no doubt more fun, to rail at his enemies a la Richard Nixon.
The problem is, there are a lot more ordinary people than privileged people. And eventually, folks grow tired of a privileged person lording it over them, especially when he already enjoys a life with more luxury and comfort than they have. The glamour of supporting a charismatic hero fades if that hero is only seeking his own advantage.
As we open 2010, one can only hope that we can finally find leaders who can grow beyond the one-dimensional pretense of fictional leadership.
Thursday, December 24, 2009
The Meaning of Blogging
I have been blogging for about seven years now. Like so many others, I started by blogging at my own site, lost interest for a while, and spent some time posting comments at larger blogs. I am happy to say that some of my comments produced a great many responses and some support, so that I was invited to be a regular poster at a major blog, Polipundit.com, an enterprise of group blogging which lasted several years with great success. Along the way, I found myself quoted in the Wall Street Journal and interviewed by a Japanese news agency, and received attention, both support and attacks, from a number of prominent bloggers. Unfortunately, the site owner had a regrettable change of opinion regarding freedom of expression; he ejected all of his writers and brought in a new cast. That incident led to some additional publicity and attention in the blogosphere, but in the end it was a sad turn of events.
I received a number of offers to blog on other sites, and soon thereafter joined the team at Wizbang, an eclectic and upbeat right-of-center site. One thing I particularly appreciated was that I could post when I liked, on a great variety of subjects, including my faith, my fight with cancer, and my return to school. Wizbang readers know they can find not only political commentary, but a community of conservative thought, including topics that speak to common life issues.
You may have noticed that it’s been nearly a month since my last post. Sorry for that; it comes down to my Mom’s hospitalization and rehab therapy, my new job, and the fact that I don’t want to repeat old things long said or just parrot popular themes. But as I tried to write posts earlier this month, I also became aware of the flood of blogs and bloggers, and it occurred to me that we might discuss what blogging has become and where it is going.
One thing which drives me to distraction is the amount of unwanted mail I get in my email. Not the junk mail from people trying to sell me things (that’s just a dismal fact of life), but the constant deluge of links to posts and entire columns reprinted for my “convenience” that I never asked to receive. If you are one of those people, let me say for your own good that spam is not good writing. If you are sending your stuff out by email to people who never asked to be on a mailing list, please just STOP. You are not making anyone happy or improving your reputation. Spamming folks with email blog updates is just plain creepy.
Now as to blogging itself. Bloggers frequently criticize the print and broadcast media, and with good reason – there’s a lot of bias and inaccuracy in what is commonly called journalism, and some of it is just plain fraud (Rathergate, anyone?). But criticizing the media requires the blogger to demonstrate a standard of conduct himself, and a lot of bloggers are deficient in that regard. Some post opinion while presenting it as fact. Some cannot tolerate any difference of opinion, even from allies. Some, speaking bluntly, are egocentric fools who imagine that they alone have perfect wisdom and are the next Reynolds or Limbaugh – such bloggers are usually deficient in humility, insight, and humor, but I will leave it at that. As a result, bloggers are commonly read mostly by other bloggers and by a select audience; we must accept that even today, most people are unfamiliar with blogs at all (not to mention Twitter), and unless and until blogs demonstrate a social value great enough to attract the attention and respect of the public, they will remain of marginal interest. That, by the way, is just one reason there are no “blogger millionaires”.
Blogs are sometimes trivial, with the staying power of, say, a snowball in Houston. But some are well-written, potent accounts of seminal issues that represent serious thought on a range of subjects. To a large degree, bloggers respond to readers, and they thrive on reader support. This is one reason most blogs allow comments. Comments not only provide a forum for people, they also give the blogger direct feedback, something so lacking in print and broadcast media that its absence is killing off a number of media businesses. They also provide the energy that many bloggers need to keep posting – there’s a real incentive to produce when you know someone’s paying attention.
But it comes down to a symbiotic relationship between the blogger and the readers. What matters to you, and why? Any good blogger wants to build on that basis, and any blog worth reading will pay attention not to the superficial bilge that TMZ or the like can supply, but to the things that keep you up at night or provoke the ageless question, “What if?”
Sunday, November 29, 2009
A Personal Perspective on Health Care
Ten days ago, my mother fell while visiting her primary care physician. He had to put three stitches in the back of her head. Last Monday, she fell on the porch outside her house, and it was necessary for her to have surgery on her arm, which was broken in two places with the elbow dislocated. The hospital did the surgery Wednesday and kept her overnight for observation. She was released Thanksgiving morning, and my brother and I drove her home. I went to my own home, where my wife was preparing Thanksgiving dinner, and received a call from my brother around 3:45 PM; Mom had fallen yet again and hit her head.
Mom spent the rest of Thursday in the Emergency Room at Memorial Hermann Hospital, and she was admitted into the NTICU around 1 AM Friday. Her vital signs were unstable and the tests showed a possible aneurysm. Fortunately, her condition improved over the next couple of days, and today Mom is being transferred to a different ICU. She has insurance through Medicare and a supplemental provider. The multiple stays will be expensive, but for the most part Mom’s costs will be controlled. On the one hand, my Mom paid for many years on insurance she never had to use, but now it is an invaluable resource. Overall, in my Mom’s case the system seems to work the way it is meant to.
Health Care needs reform; there is no serious argument on that point. The rules can be burdensome, the costs rise more quickly than anyone can really afford, and there is real concern that a large segment of the population is underserved. The solution, however, cannot be found through simple application of a mathematical formula, or any political theory. Medicine, first and foremost, is about people. The patients and their families, the doctors, nurses and other providers, and the general public all have authentic interests in health care and how it is provided.
There are economic and political reasons why I do not like the reform plan proposed by the Obama Administration and the Democrats in Congress. But the main reason is far more personal – the Democrats make a lot of promises about access and choice, vague and lacking substance, but they refuse to have a serious debate about the cost, effects, and conditions of their proposed changes. I need to know what the changes will mean to a 76-year-old woman who needs multiple trips to the ICU. I need to know what the changes will do to my continuing treatment for cancer, a type which at this time has no known cure. I need to know what the changes will do for my other relatives, including my wife and daughter. The proposed plans seem focused only on cost reductions, with no protections for significant demographics, like seniors, people with chronic and incurable conditions, or people who are healthy now but have a history of serious problems. The problem is not that these questions have not been answered yet; it’s that the Democrats refuse even to admit the problems exist, much less address them in public discussion. Debate and disagreement have been suppressed and punished, with constituents locked out of allegedly public meetings and alternative proposals by Republicans denied consideration and debate. The only plan offered to the public is the Democrats’ plan, with no serious attention given to its flaws.
Until Congress listens to the people who are affected by this plan, and serious effort is made to craft an effective, functional plan, I cannot support it on any rational basis. Nor do I think that anyone who lives in the real world will find the proposed plan acceptable once they understand how it works, or more to the point, how it will not.
Thursday, November 26, 2009
Why the Wife Does All The Cooking
Back in 2007, my wife was tired and did not want to cook a big turkey dinner for Thanksgiving.
“No problem,” I said. “I’ve got it covered.”
Well, on the plus side it was the talk of the neighborhood for quite a while.
[ disclaimer - the above-mentioned event never actually happened, but my wife assures me that if I were to actually try making Thanksgiving dinner on my own, it would ]
Saturday, November 21, 2009
The Precautionary President
One of the more Orwellian concepts of the Twenty-First Century has to be the so-called “Precautionary Principle”. The harmless-sounding concept is described by proponents as a simple extension of ‘better safe than sorry’, and is most commonly explained as taking action in the absence of complete scientific proof when there is significant risk of serious harm in doing nothing. Such advocates point out that this is similar to the reasoning behind insurance, locking doors, and inoculations – a small action taken to ward against more serious possibilities if you were to ignore the danger. The disciples of Precautionism claim that they represent a reasonable sense of caution. And President Barack Obama has been an eager acolyte for the cause, whether the venue is Climate Change, American Imperialism, the Corporate Greed Culture, or the Need for More Guilt among Ordinary Americans. President Obama is all about “We Can’t Wait”.
The main problem is that the Precautionary Principle is self-contradictory in practice. The stated premise of the Precautionary Principle (PP for short) is that once a reasonable argument exists that a certain action is necessary to prevent a serious danger, or a likely event which would cause serious harm, it is necessary to act immediately to mitigate that danger. The reader may note that the key words are ‘reasonable’, ‘necessary’, and ‘serious’ – all words with subjective meanings. As a result, the decisions on when an argument is reasonable rather than merely an opinion, on what defines the necessity of an action rather than its advocacy, and on when the possible consequences of continuing on the present course are serious enough to require change, are variables that even the most intelligent people may differ upon. Conclusive evidence has been the standard for many years, precisely because demonstrated proof of cause and effect, not only of the problem but of the proposed solution, is the only valid means for establishing a valid consensus. Anything else is no better than mob psychology. PP is self-contradictory because in actual practice it is invariably applied to demand radical and risky change on nothing more than emotional contention. It’s not that conclusive proof is unused in the theory, but that no real tipping point exists at all; the activists decide what they want to do and deploy what amounts to a mix of Chicken Little paranoia and Hitlerian thug tactics to get it.
PP’s self-contradictory practice can be seen in how activists in the Ecology and Trade Regulation debates demand radical action with no substance whatsoever behind either their claim of imminent danger or their claim that the actions they demand will have the effects promised. Climate Change radicals, for example, not only refuse to defend their sweeping claims of imminent catastrophe if humans don’t abandon their cars, living standards, and capitalism, they become apoplectic if anyone suggests they demonstrate how their proposed actions would actually improve the Earth’s climate to any extent – the theory of Climate Change is practiced as Modern Fascism, and there’s just no pretending otherwise. In fact, every major social effort based on PP is practiced in the Fascist mode, where government acts in thuggish manner to demand compliance, where evidence which contradicts the party line is suppressed and opponents are marginalized and harassed, and where the true focus of the effort is political gain.
The Precautionary Principle guides Barack Obama in his major decisions. Rather than go to the trouble of weighing all of the rational options, their costs and known or likely effects, Obama simply chooses policies which benefit his friends and crafts his argument to support what he wanted to do in the first place, as was the case with the first Stimulus Bill – almost a trillion dollars stolen from the American people in order to help his cronies. When he does not have a chosen plan, Obama defers to the people who can make or break him, which is why he is having such trouble coming up with a plan for Afghanistan or a way to address consumer confidence. Like the Precautionary Principle, Obama is not concerned with the evidence in any major issue, nor with the cost of his intended actions, since he plans for others to pay for his policies.
Thursday, November 19, 2009
Career Calculus
When I was a teenager, my friends and I gave little thought to what we would do for work. We all had vague ideas about what sort of careers we wanted, but no specifics. Part of this came from a lack of experience with the working world, but some of it came from the sense that we were still in development, which implied that once we finished our preparation we would find our proper role, where we ought to be. Of course, the real world does not work that way, and most of us discovered that as we finished college and tried to find work in our chosen fields. Top students enjoyed multiple job offers from prominent companies, good, rewarding jobs that many people would have found satisfying, but most of us had to search and struggle to find any offers at all. It turns out that you need experience to get anything but entry-level posts, and even the entry-level jobs drew several applicants for each opening. It was difficult to stand out, and to get a serious look you had to stand out. I mention this here because, from my experience and the stories I have heard from friends and colleagues over the years, most of us have had to work at jobs that were not our first or second choice. Not that the jobs were bad, but they were not the career path we expected.
I have always been aware that my perspective was unusual, though I could not always tell whether it gave me an advantage or a handicap. For instance, I have long heard that the average tenure at a job lasts somewhere between three and four years, so in 26 years of working I ought to have had about eight jobs; in fact I have worked for only four companies. In three of those jobs I stayed an average of over eight years, largely because I am a strong believer in loyalty to my employer. I have also worked in three different industries, which again is off the normal track, although I used similar skill sets in all of my work.
So after a person has worked for a while in a job, and they either choose or need to find a new position, what they find is a range of choices. Two such ranges, actually, which is where the calculus comes in. Ordinary math is about finding a specific answer, but calculus produces a range of possibilities, and your answer lies somewhere within that range. This is how a job search works – you determine the range of positions in which you have interest, determine the range of positions which fit you, and the intersection of the two provides your options.
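As a quick illustration of that intersection, here is a minimal sketch in Python; the job titles are invented for the example, not taken from any real search:

    # The two "ranges" of a job search, modeled as sets.
    # Job titles here are hypothetical, for illustration only.
    jobs_of_interest = {"credit manager", "collections analyst",
                        "financial analyst", "office manager"}
    jobs_that_fit = {"credit manager", "collections analyst", "billing clerk"}

    # The intersection of the two ranges is the realistic target list.
    options = jobs_of_interest & jobs_that_fit
    print(sorted(options))  # ['collections analyst', 'credit manager']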
So far I’m playing Captain Obvious. Everyone knows that they won’t apply to every job that is available, and that not every company they apply to will show interest in them. But understanding why you select a company for application, and why companies choose certain people for interviews and final-selection interviews, will help you refine your search effort to target jobs where you have a better chance of winning the job, as well as jobs you will be more likely to enjoy.
There are basically five stages in the hiring process:
1. Company posts a position
2. You apply for the position
3. Your qualifications are screened
4. You are interviewed for personality and skills ‘fit’
5. An offer is extended and, following negotiations, accepted (if the offer is not accepted, it falls back to the fourth stage, as something was not a good fit)
The fun part is that you have to prepare for four of the five stages, and to understand that you have little direct control in any of them. Instead, you may influence the decision through your choices and presentation. If you’re like me, you will send out a lot of resumes that never get a response; some will see a return, though not immediately. I sent out, by my count, over a hundred applications in two months, got calls from 8 recruiters or placement firms, met with five of those firms in interviews, had 11 phone interviews, 8 face-to-face interviews with the companies themselves, 3 second interviews, and finally a job offer that was good enough to accept. The significance of the numbers is that my early efforts were kind of broad, but I soon learned to target my resume to specify what I did really well: Credit & Collections. That cut out a lot of jobs that I could technically do, but where my experience was not enough to make me a top contender. I still sent out my resume to jobs that were not C&C positions per se, but only one of my company interviews was for a position not tied to Credit & Collections work. And I was one of very few candidates chosen for interviews in those applications:
Co. 1: One of 4 interviewed, not invited to second interview
Co. 2: One of 3 interviewed, only 1 offered second interview, received offer and accepted
Co. 3: One of 2 interviewed, received second interview, withdrew when offered job at another company
Co. 4: One of 4 interviewed, not invited to second interview
Co. 5: One of 2 interviewed, received second interview, withdrew when offered job at another company
Co. 6: One of 3 interviewed, accepted job offer with another company before second round
Co. 7: One of 4 interviewed, offered second interview but accepted job offer at other company so withdrew
It’s interesting to see that in those situations, no more than four candidates got an interview, and twice only two were considered. Being considered at all pretty much makes you a finalist from the start. So the trade-off is that targeted applications give you a smaller pool of companies, but specializing earns a much better response rate, and those interviews are pretty close to the goal.
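To put rough numbers on that funnel, here is a back-of-envelope sketch in Python using the counts reported above (the application total was “over a hundred,” so 100 serves as a conservative floor):

    # Search funnel built from the counts reported above.
    applications = 100        # conservative floor for "over a hundred"
    phone_interviews = 11
    onsite_interviews = 8
    second_interviews = 3
    offers = 1

    print(f"phone-interview rate: {phone_interviews / applications:.0%}")  # 11%
    print(f"onsite rate: {onsite_interviews / applications:.0%}")          # 8%
    print(f"offer rate: {offers / applications:.0%}")                      # 1%
    # Once an onsite happens, though, the pool is only 2-4 candidates,
    # so the odds jump from roughly 1-in-100 to roughly 1-in-3.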
Sunday, November 08, 2009
The Future of Islam
Yesterday, I wrote an unpopular piece reminding readers that anger against Islam for the actions of a non-representative few is a dangerous thing, a vicious prejudice that is not only contrary to the American spirit, but also works against American interests in the long term. I presented two contentions in that article: that prejudice against Muslims in general is immoral and foolish, and that Islam must define its creed and standards in order to prosper and grow in the long run. This article examines the possible courses available to Islam in the next decade.
Islam was created through the teachings of the prophet Mohammed in the 7th Century. The faith spread through conquest of territory and the coerced conversion of defeated Arabs. In its first century a major dissent arose within Islam after the death of Husayn in 680, leading to the creation of the Shi’a sect. Islam continued to invade and conquer territory, entering Africa proper and also Europe, a campaign halted at the Battle of Tours in 732. A short age of prosperity began, but the conflicts between the Sunni and Shi’a sects continued, and the Shi’a also split, with Isma’ili extremists rising within the Shiites. The rise of the Fatimid caliphs in the 10th Century was soon followed by a schism between the Fatimids and the Umayyads, diluting Islamic political clout. The Crusades came after that, further fragmenting Islam; the reign of Saladin proved to be the exception to the decline of Islam’s potency. The Mongol invasion in the 13th Century ended significant Islamic power outside of a few regional pockets. Although Islam continued to expand as a faith, by the 14th Century the original Islamic territories had become the property of the Ottoman Empire, which remained the case through 1918.
The 20th Century was noted for political instability and the rise of Fascism and Marxism. Islamic political theorists bought into both concepts to varying degrees, which is why Middle Eastern militants aligned with both Nazi Germany and the Soviet Union, especially Palestinian front organizations. At the same time, Muslim extremists tied religious fervor to political goals, especially through the proto-terrorist Muslim Brotherhood, which was connected to extortion, election fraud, and assassinations as early as 1924. This produced a shadow-government effect in most regional governments through 1972.
Muslim extremists had pressed for the exit of all “colonial” powers from the region following World War II, and after the United Kingdom began to withdraw from the Gulf in 1971, the United States became the focus of Islamic Nationalism, especially during the time of the short-lived United Arab Republic. Political and economic pressure mounted, along with escalating violence against businessmen, but with negligible results until the Iranian Revolution. Islamic terrorism became more and more organized as the PLO’s mercenary strategy was replaced by groups like Islamic Jihad and Hezbollah, and state sponsorship of terrorism became the norm. The withdrawal of US forces from Lebanon, and the futility of the campaigns in Sudan and Somalia, encouraged Islamic Fascists to pursue aggressive strategies targeting any government or leader allied with the United States or Western democracy. The key to Islamic Fascism is that it represents the views of a small minority of Muslims, but achieves its goals through brutality and threats. Islamic Fascists promote their political agenda through a campaign touting nationalism and supposed piety, counting on the lack of a focused political identity among Muslims to preclude effective rebuttal. Democracy is anathema to Fascism, and so democratic parties and coalitions are the natural target for Islamic violence. The battleground for the past three decades has been cultural disinformation versus globalism.
Following the fall of the Soviet Union, Islamic Fascists played on the old myth of American Imperialism and the inconsistency of American policy regarding Islam. The Fascists have been able to play up the lie that Americans do not respect Islam or want Muslims to have equal rights. It was this thinking that inspired Al Qaeda’s campaign of 9/11; Osama bin Laden made no secret that he hoped to spur a US-led invasion of Afghanistan, in the belief that the US would fare as poorly as the USSR had in that country. The Islamic Fascists have followed basically the same strategy for 85 years, which makes it simple, but not easy, to defeat them. The trick is to convince Islam to create its own Renaissance.
Islam is a religion built as much on a cult of personality and political ambition as on faith in a God of truth and righteousness. That said, most Muslims are peaceful and wish no harm to other people. The splintered organization of Islam, however, and the cultural suppression of dissent by lay people, has allowed the Fascists to gain an advantage in polemics and in control of the major political organizations. Damascus and Teheran may not look very much like Tammany Hall, but there is no mechanism for grassroots political movements in most Middle East countries, nor any venue for reform. Part of that comes from the short history of most political parties in the region. People in the West too often forget that democracy is uncommon in Islamic countries. The royal families did not want to encourage parties they could not control, and Islamic-focused parties similarly expect to direct their members and voters, not answer to the public. A caste system continues to exist in many countries, and there is genuine fear of the potential chaos which might ensue if democracy were given free rein. The contrived support for Palestine, the drummed-up hatred of Israel and the Jews and America, the contempt for Western-style protection of due process and law, and the imbalance between contemporary moral values and the harsh conditions under Sharia are artificial constructs likely to fail if given a true public choice.

And despite the history of rigid control by Muslim hardliners, the trend since 2001 has been toward reform. While by Western standards there is much work to be done, governments in most Middle Eastern nations have adopted a more pro-American stance. Students have shown public support for democracy and opposition to dictatorships and oligarchies. And most important, revenue from foreign organizations in support of terrorist groups has diminished, and in some quiet but vitally important efforts has been cut off completely. The reduced effectiveness of terror as a political instrument has freed an increasing number of government officials to make decisions freely on the basis of the commonwealth, the good of the nation and its people. The tide is shifting, but more fundamentally, the channel now exists for Muslims to decide where to take their religion.
The Christian religion has made its way through debate, dissent, schism, internal conflicts and more than a few wars. But the faith grew most steadily and achieved its best results when politics was kept separate from the doctrine. Islam may do well to learn that lesson.
But can Islam, which was founded on political goals as much as on religious beliefs, accept a foundation of pure spirituality and ethics? The answer to that question is the essential directive for Islam. The evidence against the Dar-al-Islam campaign is overwhelming; the Islamic Empire reached its zenith less than a century after the death of Mohammed, and has never come close since. The Islamic standard of living, once praised as the highest in the world, has also fallen far behind many other nations and cultures. If governments are based on historic Islamist objectives, the most likely consequences are violence, instability, and poverty. From that perspective, it would be absurd and cruel to the world’s Muslims to pursue what is a hopeless strategy.
The problem comes from the belief among extremist Muslims that world conquest is the only strategy acceptable to Allah. While certain verses in the Quran have been used to claim holy direction for such a bloody plan, many more condemn the unjust and cruel practices of terrorists and Fascist regimes. Many of the arguments made by the Islamist Fascists are actually based on controversial interpretations of statements alleged to have been made by Mohammed or by early Islamic leaders, such as Ali. This is significant, given how Christian militants in the eras of leaders like the Emperor Constantine or Pope Leo X tried to justify Christian conquest with verses from the Book of Daniel or Revelation. The Muslim quandary is more difficult, given the historical example of Mohammed himself, but the Muslim faith can adjust its goals according to its precepts. If peace and goodwill are incorporated into the faith through active discussion of the morals and purpose of Islam, leaders in major mosques and madrasas can begin to lay a foundation upon which Islam can continue its growth, while at the same time accomplishing its secular goals through advancement of the human condition.
Why should the leaders of Islam choose to do this? In the first place, many imams have urged that a simple reading of the Quran leads the individual to accept Allah’s will, which is peaceful and just. Abandoning the historical excuse for violence and focusing on the healing and constructive teachings of Mohammed would strengthen that argument, make Islamic apologetics more effective, and defang many of the splinter groups which have hijacked major sects in the past. The hashashin are a thing of the past; there is no reason why terrorism as a path to the will of Allah should not also be rejected.
Second, Islam remains splintered across the world. In addition to the continuing schism between Sunni and Shi’a, a number of extremist cults have poisoned many schools of theology. Just as there are Christians who do not agree completely with their denomination’s dogma, and Jews and Buddhists who have not been to temple in years because they find the stricter requirements burdensome, there are many Muslims whose commitment to the pillars is strong, but who have doubts about their role in Islam. A leader who advances the purpose of democracy as service to Allah may be able to gain a great deal of the public trust. Certainly even within Islam there are Gen-Y people, who demand to be persuaded rather than accept orders without good reason.
And third, the huge growth in Islam comes from its promise to believers, that Allah’s will is made manifest in the faith. True imams and mullahs will recall that for most of history, the work of teachers in the faith has been to help families and communities, to protect the innocent and advance hope. Conquest has always been the aberration, and it only takes a charismatic leader at the right time to lead Islam to its rightful, peaceful place in the world.
Islam was created through the teachings of the prophet Mohammed in the 7th Century. The faith spread through conquest of territory and the coerced conversion of defeated Arabs. In its first century a major dissent rose in Islam after the death of Husayn in 680, and the creation of the Shi’a sect. Islam continued to invade and conquer territory, entering Africa proper and also Europe, the campaign stopped at the Battle of Tours in 732. A short age of prosperity began, but the conflicts between Sunni and Shi’a sects continued, and the Shi’a also split, with Isma’iliyaa extremists rising within the Shiites. The rise of the Fatimid caliphs in the 10th Century was soon followed with a schism between the Fatimids and the Umayyads, diluting Islamic political clout. The Crusades came after that, further fragmenting Islam until the reign of Saladin, which proved to be the exception to the decline of Islam’s potency. The Mongol invasion in the 13th Century ended significant Islamic power outside of a few regional pockets. Although Islam continued to expand as a faith, by the 14th Century the original Islamic territories became the property of the Ottoman Empire, which remained the case through 1918.
The 20th Century was noted for political instability and the rise of Fascism and Marxism. Islamic political theorists bought into both concepts to varying degrees, which is why Middle Eastern militants aligned with both Nazi Germany and the Soviet Union, especially Palestinian front organizations. At the same time, Muslim extremists tied religious fervor to political goals, especially through the proto-terror Muslim Brotherhood, which was connected to extortion, election fraud, and assassinations as early as 1924. This led to a shadow government effect in most regional governments through 1972.
Muslim extremists had pressed for the exit of all “colonial” powers from the region following World War 2, and after the United Kingdom began to withdraw from the Gulf in 1971, the United States became the focus of Islamic Nationalism, especially during the time of the short-lived United Arab Republic. Political and economic pressure mounted, along with escalations of violence against businessmen, but with negligible results until the Iranian Revolution. Islamic terrorism became more and more organized as the PLO’s mercenary strategy was replaced by groups like Islamic Jihad and HizBollah, and nation-sponsorship of terrorism became the norm. Withdrawal of US forces from Lebanon, and the futility of the campaigns in Sudan and Somalia encouraged Islamic Fascists to pursue aggressive strategies targeting any government or leader allied with the United States or Western democracy. The key to Islamic Fascism is that it represents the views of a small minority of Muslims, but achieves its goals through brutality and threats. Islamic Fascists promote their political agenda through a campaign touting nationalism and supposed piety, counting on the lack of a focused political identity among Muslims to preclude effective rebuttal. Democracy is anathema to Fascism, and so democratic parties and coalitions are the natural target for Islamic violence. The battleground for the past three decades has been cultural disinformation versus globalism.
Following the fall of the Soviet Union, Islamic Fascists played on the old myth of American Imperialism and the inconsistency of American policy regarding Islam. The Fascists have been able to play up the lie that Americans don’t respect Islam or Muslims to have equal rights. It was this thinking that inspired Al Qaeda’s campaign of 9/11; Osama bin Laden made no secret that he hoped to spur a US-led invasion of Afghanistan, in the belief that the US would fare as poorly as the USSR in that campaign. The Islamic Fascists have basically followed the same strategy for 85 years, which makes it simple, but not easy, to defeat them. The trick is to convince Islam to create its Renaissance.
Islam is a religion built as much on a cult of personality and on political ambition as it was on faith in a God of truth and righteousness. That said, most Muslims are peaceful and wish no harm to other people. The splintered organization of Islam, however, and the cultural suppression of dissent by lay people has allowed for the Fascists to gain an advantage in polemics and in control of the major political organizations. Damascus and Teheran may not look very much like Tammany Hall, but there is no mechanism for grassroots political movements in most Middle East countries, nor any venue for reform. Part of that comes from the lack of history in most political parties. People in the West too often forget that democracy is uncommon in Islamic countries. The royal families did not want to encourage parties they could not control, and Islamic-focused parties similarly expect to direct their members and voters, not answer to the public. A caste system continues to exist in many countries, and there is genuine fear of the potential chaos which might ensue if democracy were given free rein. The contrived support for Palestine, the drummed-up hatred of Israel and the Jews and America, the contempt for western-style protection of due process and law, and the imbalance between contemporary moral values and the harsh conditions under Sharia are artificial constructs likely to fail if given a true public choice. And despite the history of rigid control by Muslim hardliners, the trend since 2001 has been for reform. While by Western standards there is much work to be done, governments in most Middle Eastern nations have adopted a more pro-American stance. Students have shown public support for democracy and opposition to dictatorships and oligarchies. And most important, revenue from foreign organizations in support of terrorist groups has diminished, been cut off completely in some quiet but vitally important efforts. The reduced effectiveness of terror as a political instrument has freed an increasing number of government officials to make decisions freely on the basis of the commonwealth, the good of the nation and its people. The tide is shifting, but more fundamentally, the channel now exists for Muslims to decide where to take their religion.
The Christian religion has made its way through debate, dissent, schism, internal conflicts and more than a few wars. But the faith grew most steadily and achieved its best results when politics was kept separate from the doctrine. Islam may do well to learn that lesson.
But can Islam, which was founded on political goals as much as religious beliefs, accept a foundation of pure spirituality and ethics? The answer to that question is the essential directive for Islam. The evidence against the Dar-al-Islam campaign is overwhelming; the Islamic Empire reached its zenith less than a century after the death of Mohammed, but never came close since then. The Islamic standard of living, once praised as the highest in the world, has also fallen far behind many other nations and cultures. If governments are based on historic Islamist objectives, the most likely consequences are violence, instability, and poverty. From that perspective, it would be absurd and cruel to the world’s Muslims to pursue what is a hopeless strategy.
The problem comes from the belief among extremist Muslims that world conquest is the only strategy acceptable to Allah. While certain verses in the Quran have been used to claim holy direction for such a bloody plan, many more condemn the unjust and cruel practices of terrorists and Fascist regimes. Many of the arguments made by the Islamist Fascists are actually based on controversial interpretation of statements alleged to have been made by Mohammed or early Islamic leaders, such as Ali. This is significant, given how Christian militants during the reign of leaders like the Emperor Constantine or Pope Leo X tried to justify Christian conquest in verses from the Book of Daniel or Revelation. The Muslim quandary is more difficult, given the historical example of Mohammed himself, but the Muslim faith can adjust its goals according to its precepts. If peace and goodwill are incorporated into the faith through active discussion of the morals and plan of Islam, leaders in major mosques and madrasas can begin to lay a foundation upon which Islam can continue its growth, while at the same time accomplishing its secular goals through advancement of the human condition.
Why should the leaders of Islam choose to do this? In the first place, many imams have urged that a simple reading of the Quran leads the individual to accept Allah’s will, which is peaceful and just. Abandoning the historical excuse of violence and focusing on the healing and constructive teachings of Mohammed would strengthen that argument, make Islamic apologetics more effective, and defang many of the splinter groups which have hijacked major sects in the past. The Hashasheen are a thing of the past; there is no reason why terrorism as a path to the will of Allah should not also be rejected.
Second, Islam remains splintered across the world. In addition to the continuing schism between Sunni and Shi’a, a number of extremist cults have poisoned many schools of theology. Just as there are Christians who do not agree completely with their denomination’s dogma, and Jews and Buddhists who have not been to temple in years because they find the stricter requirements burdensome, there are many Muslims whose commitment to the pillars is strong, but who have doubts about their role in Islam. A leader who advances the purpose of democracy as service to Allah may be able to gain a great deal of the public trust. Certainly even within Islam there are gen-Y people who demand to be persuaded rather than accept orders without good reason.
And third, the huge growth in Islam comes from its promise to believers that Allah’s will is made manifest in the faith. True imams and mullahs will recall that for most of history, the work of teachers in the faith has been to help families and communities, to protect the innocent and advance hope. Conquest has always been the aberration, and it only takes a charismatic leader at the right time to lead Islam to its rightful, peaceful place in the world.
Saturday, November 07, 2009
The Balance Between Judgment and Hysteria
Thursday's shootings at Fort Hood have naturally evoked strong emotions. And the media and some prominent political leaders have taken all-too-predictable postures, two of which I feel compelled to address: the falsehood that Islam is aligned with terrorism and malice, and the falsehood that Islam is a victim in situations such as this, its innocents left to worry about an unreasonable backlash. Both contentions are wrong.
First, about Islam. There are over 1.5 billion practicing Muslims in the world. There have been 221 terrorist incidents in 2009 through November 2, so even if we say that every single one of them was committed by a Muslim (including attacks in Greece, Ireland, South Africa, and the Starbucks bomb in New York) and that two dozen Muslims were involved on average in each incident, that only implicates 5,304 Muslims and means that over 99.999% of Muslims worldwide had nothing to do with terrorism so far this year. Frankly, if even one percent of Muslims worldwide had it in for America, or even for Israel, we'd see an unprecedented level of violence and murder, because even one percent of Islam would be a force of 15 million terrorists, and no realistic estimate of terrorist activity has ever come close to a million total, much less 15 million. Between Iraq and Afghanistan, over 50 million Muslims have come into close and regular contact with U.S. troops. While there are places of hostility against the West and the U.S. in particular, and some spots rank with pure evil and hatred, the soldiers who have been there will tell you that it has much more to do with culture and politics than religion. For most Muslims, their faith is a private matter between them and God, a question of living honestly and by their best ideals, and hatred towards another human is a sin to be avoided. Most Muslims love their families and their nation, and have a generally tolerant outlook towards everyone else.
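For readers who want to check the arithmetic, here is a minimal sketch in Python that reproduces the estimate above; every figure in it is this paragraph's own rough assumption (221 incidents, two dozen participants each, 1.5 billion Muslims), not an official statistic.

    # Sanity check of the rough estimate above; all figures are this
    # post's assumptions, not official statistics.
    incidents = 221               # terrorist incidents in 2009 through Nov. 2
    per_incident = 24             # generous guess: two dozen Muslims per incident
    muslims = 1_500_000_000       # practicing Muslims worldwide

    implicated = incidents * per_incident
    uninvolved_pct = 100 * (1 - implicated / muslims)
    print(f"Implicated at most: {implicated:,}")        # 5,304
    print(f"Uninvolved: {uninvolved_pct:.5f}%")         # 99.99965%
    print(f"One percent of Islam: {muslims // 100:,}")  # 15,000,000

Run it and the numbers in the paragraph fall out directly.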
So what happens when someone like Nidal Hasan (allegedly) decides to kill innocents while screaming the name of his god? To me, once you get past the emotion and look at the facts, pretty much the same thing as with any fanatic who goes psychotic. Hasan had no wife or girlfriend and no close friends; even his family and those at Fort Hood who had the most contact with him noted that he was distant and aloof. While Hasan complained to some family that he was being mocked for his Muslim beliefs, other Muslims at Fort Hood emphasized that the military accommodated them at all times and that they felt proud to serve with the men and women of the U.S. Military. When you dig down to the bottom of it, Hasan was a lot like another Islamist loser: Khalid Sheikh Muhammad.
Does he look like a chick magnet? A guy who wants to raise a family and be a good husband and father, someone who thinks first about his moral duties and personal integrity?
No. In any culture, this guy is a loser, albeit a clever and dangerous psychopathic loser. He and the real world were quits, so he joined up with a group of other losers who tried to compensate for their personal failures by blaming everyone else. And that is what happened with Major Hasan. He became bitter about his place in the world, and decided to punish everyone else. That's really the only way to explain how someone could decide to kill a roomful of innocent people, against most of whom he had no grievance whatsoever, including a 21-year-old pregnant soldier, a band member, two soldiers who had just returned from Iraq, and two others who were being deployed to Afghanistan just as Major Hasan was scheduled to go, among others. This was not a blow against some imperial power; it was the impotent scream of a coward.
Islam has its share of such cowards, to be sure, but so have Christianity and other religions. Remember the cowards who bombed abortion clinics in the name of Christ? The anti-war protesters who think nothing of attacking soldiers in the name of peace? Look at the Troubles in Northern Ireland for nearly a century - there is nothing in either the Anglican or Roman Catholic dogmas to excuse the kidnappings, torture, bombings, and murders that happened there for so long. Consider the tribal conflicts in Rwanda and Burundi not so long ago, or the cold-blooded extermination practices of Milosevic and his Serbs. Even Buddhism, founded on a clear commitment to reverence for life, has its share of extremists, including Triad groups who see no conflict in murdering people and then going to temple to be 'spiritually cleansed' so they won't feel bad about their crimes. My point is that some people will abuse the tenets of any religion.
So why do Muslims not march in outrage over the hijacking of their faith? For one thing, I don't believe they feel they should have to state what they think is obvious. Even though most serial killers are white males, I have never felt it necessary to point out that most white males would never commit murder. Even though many crimes were committed in the name of Christ over the years, most notably during the various Inquisitions, even atheists and Muslims recognize that Christianity in its essence had nothing to do with the spirit of evil which tortured and killed in the name of the Prince of Peace.
It is true that certain teachings of Mohammed are troubling to non-Muslims, but let's not forget that other beliefs have had similar problem areas. Most Mormons today live exemplary lives of charity, tolerance and humility, and so have very little in common with the racist, xenophobic Joseph Smith. Many Scientologists are open-minded and just want to live by their creed, and so have almost nothing in common with the arrogant and greedy L. Ron Hubbard. Come to that, I am a Southern Baptist but have little in common with most of the denomination's leading ministers. I'm not saying they aren't fine and honest men, but a man whose career focuses on only one creed and point of view has trouble seeing things the same way as a working man who sees real life from the perspective of the street and the diversity of a truly global community.
Also, Islam is rooted in the culture of the Middle East. While American Muslims live in the modern world, their faith comes from a place where women and the young are expected to give way to the men and the elders, where criticism is uncommon because it so often leads to conflict and escalation, and where challenging those in authority is seen as rebellion rather than reasonable doubt and skepticism. Even the Roman Catholics have their Jesuits to challenge assumptions; Islam has not yet reached the point where theologians can help the faith become relevant to changing social and cultural conditions. Whatever he was, the prophet Mohammed did not prepare his people for a world of cultural diversity and demographic trend shifts.
That brings us to the second problem. Islam likes to play the victim card, even when a Muslim is the criminal. It is very difficult for a mullah to explain why a man like Osama bin Laden, educated and from a good family, would countenance the murder of innocents on a Hitler-like scale. So they evade the question and try to leverage a sense of guilt from the victims, because the United States is a generous and open-minded country, one of very few willing to examine its own behavior in a critical way. No one in the Saudi royal family, for instance, has ever shown an interest in criticizing the family's own past policies and behavior, and the Palestinians are even worse. These guys have made the wrong choice in every major decision since they chose to back the Nazis in World War II. But rather than consider the foundation of so many bad choices, Palestinian leaders chose instead to insult and attack Israel, precisely because Israel is careful of its behavior and considerate of the rights of Palestinians in most cases.
In the United States, Islam has always acted as if it sensed its place in America was that of a tolerated segment of the population, rather than a welcome member of the community. This comes to some degree from a certain discomfort with the way Muslims speak and act and dress, but it also comes from Muslims' self-chosen segregation. Muslims do not eat the same foods as most Americans, do not attend the same entertainment and recreational events as most Americans, and do not treat Americans as close friends in most cases. Islam is not liberal in the traditional sense, many Muslims act as if Americans carry a kind of infection, and so it is difficult for a non-Muslim to be close friends with a believer. Even in the heart of America, Muslims often act as if they must live apart. This happens with other faiths, of course. Hasidic Jews, for example, also cordon themselves off from contact with Gentiles, and they have strict dress and dietary codes which set them apart. Some fundamentalist Christians also dress, eat, and behave in ways that seem strange to most Americans. But there are many more Muslims than Hasidic Jews or fundamentalist Christians, and so the segregated culture becomes more obvious.
The acts of a Hasan or of other psychotic Muslims present an issue with no easy answer, but it is important for non-Muslims to recognize that such behavior is anomalous to Islam, just as it is important for Islam as a whole to recognize that these extremists must be denounced, in the interest of understanding what makes someone a Muslim and what does not.
Sunday, November 01, 2009
The Good Grudge
Job hunters are under a lot of stress. In the first place, few people are looking for a job as a luxury, and almost as few feel that they have most of the control in getting the job they want. For all the books, seminars and classes in pursuing your ideal career, I’d venture to say that while most people like their jobs for the most part, fewer than one in forty would say they are in the job of their dreams and that they landed it through a disciplined job search. Luck plays a role, for bad as well as good fortune. Accordingly, it seems reasonable to me that people whose job search is slow or less than satisfying may display indications of their discontent. It’s only human to reflect the stress of the endeavor, exasperation with bull-headed bureaucracy, and anger at a system which seems to reward appearance over substance, and style over real ability. Oddly enough, many hiring managers suffer the same stress, of trying to find truly qualified candidates in an ocean of poseurs. But it’s more difficult for the individual than for the business.
The people trying to help job seekers, pretty much unanimously, emphasize the need for an upbeat, positive attitude. Generally, they are right. It is important not to be negative about your past work experience when speaking to recruiters and at interviews, and it is essential to be positive about your skills and what you can do for the company if they offer you the position. But at the same time, a lot of things get attacked as “negative” when they are actually important, in more than one way. For one thing, the people offering advice are sometimes wrong. One discussion board I joined had an administrator telling everyone that they needed to join groups, then make sure to use those groups as a network to find leads toward jobs. Having been part of a number of groups apart from professional societies, I can promise that such behavior is a very poor idea, and would likely kill your credibility and poison your network. That’s because special interest groups exist for the specific purpose of the group, and members don’t like or respect people who join just so they can advance their own interests, especially when those interests have nothing to do with the group’s purpose. For example, I spent a number of years as a high school official in three sports, and each sport had a local chapter with weekly meetings to go over events, rule changes and interpretations, game films, and anything else related to the sport. A member of any of these chapters is expected to come to the meetings to learn how to be a better sports official. While it’s fine to discuss informal and personal items, even then those topics tend to be related to sports; loving the sport is why these guys become officials in the first place. So a first-year member who starts trying to find out if the chapter members can help him find job leads is not going to be considered legitimate; the topic is just plain wrong. Many groups I have belonged to have demonstrated a similar attitude. So the administrator of that job-seeker group was absolutely wrong, and was actually hurting members with her advice. While positive attitudes and finding inventive ways to expand your professional network are good things, it is also important to rein in wild, untested theories and assumptions. It’s also necessary to deal with your grudges.
With very few exceptions, everyone in a job search has some bad feelings lurking around. While I agree that you should focus on your skills, experience, and positive attitude when applying and interviewing for a job, it is also important to deal with the things which cause you anger, resentment, or other negative emotions. After all, in most cases the person who lost the job did not deserve to lose it, and even those employees who could blame themselves for their job loss often have good qualities to their work which they may feel should have been considered. Also, the way in which the company lets employees go is often a cause for unhappiness, and then there is the difficulty of the job search itself. Put it all together, and you have a condition where the stress and frustration need an outlet, ideally one which helps the individual move forward in the search. Rather than tell people to suppress or hide their grudges, assistance groups should help people find ways to turn their grudges to good purpose.
As I mentioned earlier, I agree that when making an application or in an interview, the focus must be on how your skills help the company and how your attitude is positive and team-centered. But you also need to deal with the weight and fire of your negative side, and find ways to use it to your advantage. For instance, I once had a boss, pretty high up in his company, who was afraid that his managers would discover they were underpaid and quit on him. His solution was to attack, harass and demean those managers at every opportunity, really rip them up so they would be in constant fear of being fired for some trivial (or even nonexistent) mistake and never realize their own worth, even though any one of us would have been fired on the spot if we had treated our staffs the way we were ourselves being treated. At first, the other managers and I just took the abuse, but years later I reflected on the behavior and used it to remind myself of the importance of actively listening to my people, to make sure my behavior was as ethical and courteous as I believed it to be. This not only helped my relations with my staff in my next three jobs, it also helped me get promoted when my consideration earned me credibility as someone who practiced what he preached.
Finding positive uses for negative experiences is one way to deal with your grudges. Another important outlet, however, is to talk about them in a confidential setting. Let’s be clear: I am not saying you should ever give potential employers indications that you might be a malcontent at their company or that you can’t let go of bad experiences. But it is important to recognize valid grievances; just as every company sooner or later suffers from a bad or dishonest employee, so too most of us have had the misfortune to work for a company which was unethical and dishonest. Imagine someone who quit working at Enron in 2000, before it came out how corrupt its officers were. Imagine someone who left WorldCom, or an auditor for Arthur Andersen who quit because of how those companies did business. Looking back, not only would it make no sense to praise a former employer whose business practices are now documented in ethics textbooks as egregious examples of criminal behavior, it would also make perfect sense for these individuals to feel that they had been badly used for staying true to higher ethical standards.
When you have to find a new job, you also look at the horizon in every direction, and that includes things you did not like about past jobs, things you hope to avoid the next time around. In a best-case situation, you can apply those lessons at your new job, avoiding the damage done in the prior experience. Recognizing that high-level bosses tended to berate and ignore the floor staff at one job helped me focus on selling bosses on opening informal feedback channels at another, beyond the usual ‘open door’ claim that is made so often. The experience of support for a project evaporating because the superior forgot about it reminded me to include update reports on pending projects to superiors, with reference to their prior written support, to keep the idea fresh and familiar. Almost any bad experience can be useful in building tools to prevent it from happening later on. So even a grudge can be a good thing, if you use it to positive effect.
Finally, it needs to be said that humans are not machines. The facts of a situation can be handled rationally, but the effects carry emotional weight. There needs to be a way to deal with mistreatment and injustice suffered in a job, even if it’s just that you were sometimes under-recognized or happened to be one of the employees in your group laid off in a company reorganization. I notice that the help groups always point out that there is no “stigma” in being let go these days, which is a help on one level, but being laid off always has an impact. You may intellectually understand that there was no personal insult meant, but being laid off when you have been doing good work and were relatively happy at your job will always carry the emotional weight that you were selected to be let go, a sense that the work you did was not recognized the way you hoped it would be, that your skills and experience were not valued enough to be kept on board. If you’re like me, at some point you may even wonder why your value to the company was lower than the old furniture in the reception area – there are chairs that have almost no value anymore, yet they are kept forever, while in any downturn many good employees are let go. Understanding the economics of the variable cost of human capital does not satisfy the sense of injustice which comes from being considered not only not indispensable, but a disposable asset of little consequence. We all like to believe that we matter, and being let go is an assault on that sense. It is only reasonable, therefore, that while the feeling should be managed in a productive way, there is a valid need to address that grudge. The grudge is authentic, even if the help groups say it must be suppressed. I would argue that it is far healthier to recognize that the grudge is real, that it exists for valid reasons, and that it can and should be applied to good purpose in personal reflection and planning for the future. The short version may be as simple as ‘don’t get fooled again’, but in a less cynical sense it also carries the value of hard-earned experience, unique lessons that can be applied to real-world situations, and which may have specific value to your new company and team.
Wednesday, October 28, 2009
Regional and Industry Recession Effects
Patrick Jankowski, the Vice-President of Research for the Greater Houston Partnership, spoke Tuesday about the recession and its effects, both nationally and in Houston. His lecture was informative and fascinating, as it reminded our group of an important point about economics – local economies are influenced by the national economy, but are not directly controlled by it. That is, while the entire nation is in a recession, different regions, industries, and demographic classes are affected by that recession to different degrees, and the recovery from it varies in length and difficulty as well.
Take Detroit, for example. Detroit stands out among U.S. cities because, according to economic data, the city has been losing jobs (that is, fewer jobs have been created than have been lost in consecutive months) since May 2000, and currently suffers a 22.2% unemployment rate, making it one of 13 metropolitan areas with unemployment of 15% or higher, and one of 117 with unemployment at or above 10%.
This gives Detroit the unfortunate distinction of having one of the highest unemployment rates among metropolitan areas in the United States, as well as the longest recession specific to a metropolitan area. El Centro, California, has the worst unemployment at 30.1%, but El Centro is about one-three-hundredth the size of Detroit and has been in recession only since 2008.
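To make the banding concrete, here is a minimal sketch in Python of the kind of screen behind those counts; Detroit and El Centro use the rates quoted above, while the other entries are hypothetical stand-ins, since the full metro list is not reproduced here.

    # Banding metro areas by unemployment rate, as in the figures above.
    # Only Detroit and El Centro are real figures; the rest are hypothetical.
    rates = {
        "Detroit, MI": 22.2,
        "El Centro, CA": 30.1,   # worst in the nation
        "Sampletown, ND": 2.9,   # hypothetical small metro
        "Othertown, TX": 8.4,    # hypothetical
    }
    over_15 = sorted(m for m, r in rates.items() if r >= 15.0)
    over_10 = sorted(m for m, r in rates.items() if r >= 10.0)
    print(f"15% or higher ({len(over_15)}): {over_15}")
    print(f"10% or higher ({len(over_10)}): {over_10}")

With the full data set, the two counts would come out to 13 and 117 respectively.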
It’s interesting to note that El Centro’s job base is agricultural, with significant retail and service sectors, while Detroit’s job base was heavily committed to the auto and truck building industry.
The flip side would be to consider those metropolitan areas which are not suffering badly from the recession. Twelve metropolitan areas have unemployment rates ranging from 4.8% down to 2.9%, rates which would be envied by most of the nation. Three cities in North Dakota, two in Nebraska, two in South Dakota, two in Iowa, and one each in Utah, Kansas, and Montana make up that happy club.
It is reasonable, on the available data, to say that smaller towns have handled the recession with lower unemployment and shorter duration of job loss than have the major cities.
Industry matters in unemployment, as well. The highest unemployment by industry is in construction, durable goods manufacturing, leisure and hospitality services, business services and information technology, in that order. As a sector, government workers have by far the lowest unemployment rate, at 4.2%.
The industry unemployment landscape shows that all businesses are cutting non-essential costs, including research and development, growth activity, and service activities. Government, following its historical pattern, is making no effort at all to scale back costs or headcount. The problem there is that as foreclosures rise and tax revenue from income and sales falls, an inevitable shortfall will occur at most levels of government, creating an incentive to raise tax rates at the time when taxpayers would most resent such actions. The possibility of a political backlash increases significantly for the 2010 and 2012 election cycles.
Mr. Jankowski also presented his forecast for economic recovery from the recession. In general, the recession will technically be over for most of the country by mid-to-late 2010, meaning that the economic conditions which define a recession will no longer be in place, but the effects of the recession will linger for some time afterwards. Specifically, Mr. Jankowski warned that it will take from one to four more years for the jobs lost in the recession to be replaced in full, to the point where each metropolitan area produces GDP equal to or greater than what it was prior to the recession. That means it will be anywhere from 2010 to 2013 before the jobs lost in this recession are replaced at the same professional and wage level. And that cheery projection does not address the loss of career growth; the projection is that people who lose their jobs between 2008 and 2012 will spend between 12 and 48 months finding a position equal to the one they lost, and the savings lost while unemployed and the career growth which ordinarily would have happened in that time will be permanently lost, which may matter a great deal when the individual retires. Young workers will be competing with people still younger and cheaper than themselves, but with no superior experience or position to use to their advantage, and older workers will have to delay retirement or forget it altogether, as their savings decay from the cost of being unemployed. As a result, the effects of this recession will be felt by many people for a long time to come. In many situations the damage may be permanent.
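The permanence is easiest to see in the retirement math. Here is a minimal sketch, with assumed inputs of my own (salary, savings rate, return) rather than figures from Mr. Jankowski's talk, showing how a two-year gap compounds by retirement age.

    # Rough illustration of the permanent cost of a job gap at retirement.
    # All inputs are illustrative assumptions, not figures from the talk.
    salary = 60_000            # annual salary before the layoff
    savings_rate = 0.10        # fraction of salary normally saved
    annual_return = 0.06       # assumed long-run investment return
    gap_years = 2              # within the 12-48 month range cited above
    years_to_retirement = 25

    missed = salary * savings_rate * gap_years
    lost_at_retirement = missed * (1 + annual_return) ** years_to_retirement
    print(f"Contributions missed: ${missed:,.0f}")                   # $12,000
    print(f"Value lost at retirement: ${lost_at_retirement:,.0f}")   # ~$51,500

Under those assumptions, a $12,000 hole in savings grows into roughly $51,500 of missing retirement money, and that is before counting the lost raises and promotions.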
Friday, October 23, 2009
Defensive Staffing
Many job seekers get frustrated by the time and effort it takes to get a response from potential employers. Many times it seems that the HR departments of most companies exist only to prevent applicants from actually speaking to the people who need new employees, and the nature of the process is not unlike a minefield – you have to find out about the position, successfully apply, and make your way past the computer screener, the phone interview, and the initial interview in person, to reach the point where you are speaking to someone – perhaps – who would actually be making the hiring decision, all the while risking immediate elimination if someone inside the company is given the job, if someone else reaches the hiring manager personally, or if the company decides not to fill the position after all. But for all of that, companies have worries as well, and the big one is the fear that someone may interview well enough to win the job, but prove a bad choice once they have locked in the position. It’s difficult for a company to let someone go once they have been hired without committing to a period of training and review, and even then the position has to be opened and applicants considered all over again, losing time and spending resources. Most companies would love some way to insure their choice, especially for a position that needs to be filled quickly but which needs both competency and a good fit with the company. And that brings me to the professional contract-to-hire position.
Many companies use contractors, but most often for low-level responsibilities and without the option for permanent hires. That is, a contractor often has to be exceptional to be offered a permanent position. This comes from budget concerns and the types of jobs filled, as many of them are seasonal or specific to a limited project. However, in recent months that thinking has changed a bit, as recruiters have offered a more enticing option – a sort of ‘no long commitment’ contract for managers and skilled professionals. The contractor in these cases would be hired for a specific project, usually lasting six months to a year, but the company would fill not only staff positions but management positions as well, and if the contractor performs up to a certain level, they might be offered a permanent position at the same level or even higher. For instance, a contractor might be hired to head a strategic project lasting six months, but if he surpasses a certain level of proficiency and ability, he might be offered the role of managing the whole department. This can be done most often in companies where managers have shifting responsibilities, and a new manager may be brought on to ease the workload of existing managers. While specifics have been closely guarded by the companies involved, informal accounts exist of even senior managers being hired in this way at some companies. While the practice is too new to be judged on its strategic value to the companies, the concept demonstrates a way by which a firm may take on high-level talent without committing for a long term until the new hire proves his or her worth.
Wednesday, October 21, 2009
Career Building: A Look from Reality
Every so often, I have read up on the latest books and articles on the subject of career building. The idea seems simple enough; we all understand that the ideal job situation is much more than just getting a job somewhere and hoping for the best. We all would like to believe that we can improve our opportunities and find, if not the perfect job, a position which meets our ideal or at least where we enjoy the work and its benefits. Trouble is, most of us are working somewhere where we dislike something about the job. It may be the pay, the company culture, the office politics, or maybe it’s the location or some of the specific duties, but for most of us there’s always something. Speaking for myself, I have always been able to enjoy my work and I have great loyalty to all the companies which brought me onto their teams, but even then I could not ever say the position I had was “ideal”.
Some of that was education. My BA was in English. That’s a long story, but generally it came down to indecision and a counselor who assured me that I could do “anything” with an English degree. The real world, it turns out, held a different opinion. So it became clear to me that I needed a better degree. Since I love Accounting, I decided I wanted to earn a CPA license, and doing that required not only a slew of Accounting courses, but a set of Business courses as well. So I pursued and earned an MBA with a concentration in Accounting. I did it with a 3.94 GPA, too, which got me into an honor society, Beta Gamma Sigma. I felt pretty good about my progress.
The plan at the time I graduated with my MBA was this: One reason I love Accounting is that every business needs accountants, good ones. More than that, a successful business needs accountants with management experience, for the plain reason that you have to see first-hand the effect of accounting decisions. You can talk about Activity-Based Costing or Managerial Accounting strategies, but they need to be understood at the pointy end of the business to really grasp how a decision will affect the company. That’s something I knew I could provide for any employer: the ability to connect real-world effects to theoretical decisions. So I figured that after earning my MBA, I would try to transfer to the Accounting department at my company, maybe work as an internal auditor while completing my educational requirements to sit for the CPA exams. My manager at the time was very supportive and genuinely wanted me to succeed on this track. However, when my company was bought out by another company in May of 2009, all of that changed. The new company has its own accounting staff, and it does not plan to expand the Houston staff. That left my career plans in a sort of drift, until reorganization and layoffs came in late September. Being laid off changed my career search from keep-this-job-until-I-get-a-better-fit to must-find-work.
That was more than a month ago. In job search terms, five weeks is not so very long. But when you go that long without much success in even getting interviews, you begin to get tired and frustrated. Part of it is the economy; in a down economy, professional positions are not usually filled in the last quarter. Part of it is the nature of a professional job hunt; if you have higher-than-average salary expectations or specialized skills, there will be fewer positions available that match what you are looking for. And part of it is the stress of simply having to find a job; many companies which have had to fill an opening complain about how hard it is to find qualified applicants, while job seekers similarly complain about the difficulty of finding a suitable position. Since the day I was laid off, I have searched every day for jobs, reviewed more than two hundred open positions (and screened out more than half, which either demanded qualifications I did not have or offered unreasonable compensation – who seriously expects to get an experienced Credit Manager for $30k a year?), applied for more than eighty positions, and had a total of three phone interviews and two face-to-face interviews, counting the one I have tomorrow morning. It’s not a strong return on the effort.
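Expressed as a conversion funnel, those numbers look like this; a minimal sketch using the rounded counts from my own search above.

    # The job-search funnel above, expressed as conversion rates.
    postings_reviewed = 200    # "more than two hundred" open positions
    applications = 80          # "more than eighty" applications
    phone_interviews = 3
    in_person_interviews = 2

    print(f"Applied to {applications / postings_reviewed:.0%} of postings")
    print(f"Application -> phone interview: {phone_interviews / applications:.1%}")
    print(f"Application -> in-person: {in_person_interviews / applications:.1%}")

A 3.8% phone-interview rate and a 2.5% in-person rate per application is what 'not a strong return' means in hard numbers.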
That’s not to say that you should not do everything you can when searching for work. You never know when, where, or just how your next job will turn up, but the one thing I can say for sure is that it won’t show up looking for you while you sit around waiting for it. You have a degree of control in your search, in that you choose what area of work, which companies to apply to, and which jobs to try for. However, unless you happen to have a pile of lottery winnings sitting around, you have a limited amount of time in which to find your next job. Also, the psychological weight of not knowing exactly when your next job will begin makes the passing of time feel longer and more ominous. Intellectually, you understand how long your savings can last, but emotionally the uncertainty is poisonous to your confidence. As a result, after a month or two you begin to question whether your goals are too high, whether you are being too picky about how much you want to be paid, or whether you should accept a position that you would earlier have rejected as a bad fit for your skills and experience. As time passes, the likelihood that you will hold out for a ‘perfect’ fit decreases until, unless you are lucky or very well-connected, it becomes just another thing that would have been nice but does not happen in real life. We all want a perfect job, but we have to pay the mortgage and the bills, and for most people the idea of building a career is not really feasible.
Thursday, October 15, 2009
The New Reality
Sorry for the hiatus. It turns out being unemployed is hard work, at least if you’re serious about trying to get your next post and, like me, are made aware of the many things you always knew you should catch up on but now are compelled to address. Laying a career foundation is a lot like going to the doctor, keeping up your physical conditioning, or preparing your own tax return – you always knew you should have been working on it all along, that it was best done in a smooth, consistent manner, but somehow it always got put on the back burner, until of course you suddenly discovered you need it right now.
Ouch. In my case, the pain comes from networking. I am, as you may suspect, pretty old school in a lot of ways. I don’t twitter, I don’t even IM, and in fact I need to start up my cell service again; I stopped using a cell when I realized no plan offered what I really wanted – a simple way to make and receive phone calls about 8-10 times a month. No texting, no cameras, no calling everybody I ever met in a single month, just a simple means for emergency calls and accessibility when I’m out of the office or home. So now I have to catch up on that. Yes, I know about pre-paid plans, but my general opinion of them is not much better than my opinion of the kid-centered packages most services offer.
So anyway, I updated my LinkedIn account, started my WorkInTexas searches, updated Monster of course, started service with Jobfox, and signed up with SimplyHired. I am also taking courses from an outplacement company on resume writing, interviewing, and, naturally, networking. I have also been working on my references. Odd, that. As a manager I know how important references can be, but despite my blogging I am pretty much a private person, so the number of people I know well is limited to family and recent work colleagues. The fun part there is not just that no potential employer is going to be impressed by a reference from your wife and kids, but that the company I spent the last decade working for has a policy against specific references; they will release only general data confirming your department, title, and dates of employment. So the people I have known for the last nine years in my work are not allowed to offer a recommendation for me, which means I am chasing down some folks I know who left the company years ago. Hardly optimal, but better than an empty page on that score.
It’s also hard coming up with a really good resume. What I mean is that while I can list my skills and what I have been doing, it’s not easy to convey what I have done that sets me apart. I’ve always been a team player, not least because rarely does one person do the whole job in a major project, so I have not spent a lot of effort looking for ways to brag about why my role made a key difference. Resumes, of course, are built on such accomplishments, so I have had to think and write about those places where my efforts and work created real results. There have actually been many situations where I am justifiably proud of my work, through leadership, initiative, or just plain being willing to do what was needed to get the job done. The hard part is explaining that in a way that still respects my team and colleagues. That takes a while and a bunch of rewrites.
So here I am. In case you did not already know, job hunting means a lot of effort for very little apparent return. I’ve sent out about a hundred applications and resumes so far, with almost no response. Part of that is the economy, I am told, but part of it is just the continuing problem any job seeker finds: there are always a lot of people applying for any job, especially if the company and/or the position appears to offer career potential. You come to feel like a salmon swimming upstream after a while.
And then there are the sharks, the people who offer job-hunting assistance, especially the resume writing companies, who make promises just vague enough to be legal but whose ethics are clearly absent. Also present are companies which try to hire at wages far below industry scale, which seems extremely short-sighted to me in terms of strategy. I had a laugh early on, as I received an email invitation to interview for a sales position at a Saturn dealership. It sounded like an ad to join the crew of the Titanic, you know?
More later, and hopefully something more interesting than my personal career pursuit, but I owed an update to a few friends who had asked. And to those friends, thanks for thinking of me.
Saturday, October 03, 2009
Economics Perspective of the Unemployed
I have been laid off three times from jobs. Oddly or not, all three times came when a Democrat was President. The most recent came a couple of weeks ago, when my company decided to ‘reorganize’, a useful word that can mean anything from making judicious use of your resources to maximize effectiveness, to panic-induced blunders that will eventually hurt everyone involved. During the settling of things after the decision, I put my resume on the internet and was soon invited to interview as a salesman for Saturn. Since I am not a sales person by nature – and given, hmm, the things I have read in the news – I decided not to pursue that wonderful opportunity.
I also have been making adjustments in how I live. Since my healthcare coverage was going to end soon, I made sure that I, my wife, and my daughter all had trips to the doctor, and my daughter’s dentist visit was also moved up, just to be sure. Which brings me to that unending fountain of joy, COBRA coverage. When I say ‘Cobra’, I don’t mean that venomous snake or the nemesis of the G.I. Joe team – or at least I don’t think there’s a connection – I mean the “Consolidated Omnibus Budget Reconciliation Act”, also known as the “you want how much for health insurance?” plan.
Near as I can make out, the way COBRA works is that the government has set up health insurance coverage for people who lose their jobs. The formula for what you pay is really very simple: calculate how much you could realistically afford, double that, then add another 20 percent, and that’s your COBRA premium. I think it’s meant to remind us how nice our employers were to offer us healthcare coverage, and to punish us for losing our jobs. Here’s a place where I actually give some props to the Obama Administration – the Stimulus bill that passed earlier this year includes a government subsidy for some of the COBRA cost. The bad news is that the process is a bit long and complex, and my insurance contact said they’d “get back to me” when they knew what COBRA would cost with the reduction from the subsidy. I sure hope that subsidy money didn’t end up being used for Cash For Clunkers instead.
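For the curious, here is my tongue-in-cheek “formula” in Python, with the stimulus subsidy bolted on. The 65 percent figure is the subsidy rate commonly reported for the ARRA program, and the $500 starting point is just an example; check your own plan’s numbers before trusting mine.

# My satirical COBRA premium "formula", plus the stimulus subsidy.
# The 65% subsidy rate is the commonly reported ARRA figure; verify with your plan.

def cobra_premium(affordable_per_month: float) -> float:
    """Double what you could realistically pay, then add another 20 percent."""
    return affordable_per_month * 2 * 1.20

premium = cobra_premium(500.00)     # suppose $500/month were affordable
subsidized = premium * (1 - 0.65)   # the subsidy covers roughly 65% of the premium

print(f"Full COBRA premium: ${premium:,.2f} per month")
print(f"With the subsidy:   ${subsidized:,.2f} per month")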
Obviously, losing your job makes you very conscious of how much everything costs. I haven’t exactly been living hand-to-mouth, but the sudden end to an income, even with a severance package, means that everything is considered in terms of budget life, how long you can live on a certain asset if you need to do so. The first order of economics is really determining how much you have to have, and where you will get what you need. The short-term is no problem, but until interviews and job offers come in, you really become much more aware of the financial horizon.
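“Budget life” comes down to one division, which is why it sits in your head all day. A minimal sketch, with made-up numbers for illustration:

# Budget life: how long a given asset lasts at your monthly burn rate.
# The figures here are hypothetical, for illustration only.

savings = 24000.00          # severance plus cash on hand
monthly_expenses = 3000.00  # mortgage, bills, groceries, the works

runway_months = savings / monthly_expenses
print(f"Runway: {runway_months:.1f} months")  # 8.0 months in this example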
Ironically, you also become cautious about what sort of job you will consider. I don’t mean that you become overly picky about jobs you apply for (I’ve sent out about three dozen applications so far), but you consider all the aspects of a job, including elements that did not seem important before, such as how far you would have to drive to get to work, how much travel is involved, whether you would be willing to relocate and if so to where, what base pay is acceptable, what working conditions are must-haves, what nature of work you would consider, and the like. I found some surprises already, like two jobs at different companies which are very similar in their requirements and duties, yet one pays barely half of what the other pays. Or the company which is extremely particular about whom they will even consider, which explains why the job has been open since May. For the same general responsibilities, some companies are very demanding about who they want, which makes it hard to get in but at least leaves you clear about what they want, while other companies are very general, even vague, which may seem attractive until you ask whether they know what they are looking for in a candidate. I’ve already had one job in my past where the business owner did not know what the job needed. You have to consider everything in the job posting, to make sure you understand what you are walking into; a rough sketch of that kind of side-by-side weighing follows below.
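Here is a minimal sketch of that weighing in Python; the factors, weights, and scores are entirely my own inventions for illustration, not real postings.

# Weighing job postings on more than salary (weights and scores are illustrative).

weights = {"pay": 0.35, "commute": 0.20, "duties": 0.25, "stability": 0.20}

jobs = {
    "Credit Manager, Company A": {"pay": 8, "commute": 3, "duties": 7, "stability": 6},
    "Credit Manager, Company B": {"pay": 4, "commute": 9, "duties": 7, "stability": 8},
}

for name, scores in jobs.items():
    total = sum(weights[factor] * scores[factor] for factor in weights)
    print(f"{name}: weighted score {total:.2f}")

On these made-up numbers, the lower-paying job actually edges ahead once the commute and stability are counted, which is exactly the sort of surprise I keep running into.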
The economy is a big topic of discussion among the unemployed. The Workforce office is jam-packed, so is my outplacement company, and so are the employment recruiters I have talked to. Pretty much no one believes that the economy is in good shape or that finding work is easy. The rotten economy is also punishing the poor more than anyone else. People on the low end of pay and position get let go more often than anyone else, and they get less in severance as well. All the fine speeches spinning how making businesses pay more in taxes will be good for the economy somehow get no traction when all you see is belt-tightening. What’s interesting, to me at least, is that most folks just want to work a decent job. No one I have met is looking for a free ride, and they are all getting pretty disgusted with a government that spends so much time on spin that it never considers the effect its new laws have on regular people.
Sooner or later, it will dawn even on Congress that unemployed people vote too, and they are in no mood to continue on the present course.
Thursday, September 24, 2009
Bambi Meets Godzilla 2009; President Bambi and Nukes
Somehow I knew this one was coming.
Notice the observation, “In a first for a U.S. president, Obama presided over the 15-member meeting”. Yep, that’s President Bambi leading the charge to rid the world of nukes. Wattamessiah!!
I mean, it’s a given that no thinking person likes nuclear weapons or wants to contemplate the consequences of even a ‘minor’ nuclear exchange. All leaders in government, religion, or any beneficent social effort would love to see nuclear weapons be removed from the world. So what’s wrong with the U.N. Security Council voting to move towards doing just that?
Reality.
The science behind nuclear weapons is well-established. That’s why, for decades after the U.S. and U.S.S.R. both came to the conclusion that an actual nuclear exchange could never be allowed to happen, both sides not only maintained their nuclear stockpiles, but created new weapons and delivery systems for them. Back when adults were in charge, they understood that deterrence was not only the status quo, but a critical mission. If either side reached a point where it held a commanding advantage in such weapons, the temptation to use them in a first strike to eliminate the threat would rise significantly. And even after the Cold War ended, the U.S. and Russia still maintained nuclear stockpiles, because other nations, some of them unstable and belligerent toward civilization, possessed or were pursuing nuclear weapons, and the stockpiles of the greater powers were necessary to dissuade development programs in countries that otherwise would see their chance to claim territory and regional influence through the threat of nuclear war. Which brings us, of course, to the flies in the soup for even today’s naïve contestants.
“North Korea tested a second nuclear weapon this year, and Iran has resisted greater international oversight for its nuclear program. Iran says its nuclear activities are for peaceful purposes, but the United States and other major powers fear they are a cover for a weapons program.”
Yeah, and the sun came up in the East this morning. Call me cynical, but I worked in electricity for nearly nine years, and it’s just silly to pretend that a breeder reactor in the middle of nowhere, not even connected until very recently to any external delivery grid, much less to established step-down distribution stations for industrial factories or metropolitan residential areas, is somehow meant to provide lights and power for ordinary folks. Never mind the fact that heavy-water plants like the ones used by Iran are now used exclusively for weapons-grade plutonium and uranium production. As for North Korea, anyone who thinks Dear Leader would be willing to give up his ambition to possess a nuclear arsenal is in need of a jacket with the sleeves in the back, the kind which tie together to prevent self-injury.
Bambi thinks he can wish the nukes away. He is dangerously wrong in this, his most recent fallacy.
Monday, September 21, 2009
Thin Skin, Thick Head
My wife thinks President Obama is working for Al-Qaida. I kid you not. She brought up this contention while we were watching the news Saturday night. To Mikki, the combined effect of President Obama’s policies – clearly anti-business (at any level of business), anti-entrepreneur, and anti-consumer – means that he is working to destroy the infrastructure of the United States. Bear in mind that my wife is from Hong Kong, and tends to see things from a different perspective than most people. The thing is, Mikki is no Republican; in fact she does not vote much, except when she really likes someone, like Mayor Bill White of Houston, or is furious with someone, like Barack Obama. And Obama has really set her off. So far as she is concerned, Obama is directly responsible for just about every problem in America right now, if for no reason beyond the fact that as President, Obama should be setting the example and building optimism and confidence. I do not agree that Obama is trying to undermine America, though. I think the man just has the interests of the United States of America far lower on his list of priorities than his own personal agenda.
The reason I mention that point of view is that I am hearing it more and more, and from black Americans as often as from white Americans. Obama seems to have exhausted his mojo, and the canned promises sound, well, like canned promises that aren’t worth spit. This happens a lot with politicians, though, so the real question is the way an individual politico handles the turbulence. Some, like Harry Truman, tough out the flak, explain their reasoning, and push ahead. Some, like Jimmy Carter, go into hiding and let things fall apart. Some, like Bill Clinton, recognize the battles they can’t win, cut their losses, and regroup for the next proposal. And some, like FDR, pull back from a sure loss, embrace the other side’s solution, and build credibility and support from that action in advancing their broader agenda. I mention Democrats who were President because it reminds us that Barack Obama is in no way the first or last President who has had to face an angry public for a misstep, whether for the way he delivered his decisions or because he just plain got the call wrong. But in every successful President’s administration, the solution begins with recognizing what is wrong, and why. Barack Obama has been stubborn in holding on to the pretense that his plans are perfect from the start, and that if he pushes hard enough he will inevitably win. Given his record in getting to the White House, you can see where he got that idea, but the guy is no student of history.
Admitting you are wrong is hard for many people, and flat out impossible for some. I personally know some very fine people who have been completely wrong, yet won’t admit their error in the least, much less make necessary changes. It’s not about intellect, but ego. Certainly we’ve seen many debates in politics where the truth came out in an obvious way, but the party in the wrong refused to budge from its position, no matter the cost. This obstinacy is relatively harmless, even amusing, in the low-level debates we see on the blogs, but it’s a bit more serious when the President of the United States, in effect, puts his fingers in his ears and yells “I can’t hear you! I can’t hear you!” Not smart, just plain dumb.
The problem comes down to sensitivity. Barack Obama is about as tetchy a POTUS as we’ve had since Richard “They’re all out to get me” Nixon. Any criticism of the man’s policies and politics is immediately tagged as “racism”, which would just be a crude spin tactic if it were not so obvious that our Narcissist-in-Chief buys into that conspiracy theory himself. Or that President Orwell engages in so much Newspeak, like telling us that taxing people if they don’t have health insurance would not really be a tax, and certainly not a tax on the poor, even though the people who do not have insurance now, and who would be most likely not to have insurance in the future, are the poor. Like telling us that he has personally “created or saved” millions of jobs, in spite of a national unemployment average rising above 10 percent at times, with some urban minorities seeing local unemployment above thirty percent. Like telling us that there is no risk in “investigating” CIA agents through political committees on the assumption that the ‘civil rights’ of foreign terrorists may have been violated by men doing their jobs. The man imagines himself spotless, but that’s only because his moral vision is so poor.
That’s not to demean President Obama. Every President makes mistakes and has areas of weakness. Most of them are aware of their deficiencies and work to improve them and to avoid the worst mistakes. Almost all have to face situations which result from such mistakes. Sooner or later, President Obama is going to have to face the consequences of his mistakes and errors. The delay only means the consequences may be much more serious than he imagines.
Tuesday, September 15, 2009
The Cancer Did NOT Win
Family, friends and fans of Patrick Swayze are mourning his passing today, myself among them. There have been moving accounts written and hopefully his wife Lisa is coping well with the support of loved ones; certainly her support for Patrick was important and uplifting. But among the things written and said, is something with which I cannot agree – the claim that Patrick ‘lost his battle with Cancer’. Patrick won, not the Cancer, and here’s why.
Cancer is a terrible thing to have to fight, even if you are lucky like me and it gets caught early. If the words “Stage Four” are attached to the diagnosis, it’s going to be painful and in most cases fatal. That does not mean, however, that you have to give up hope or stop living. That’s what Patrick understood, and he explained that if you focus only on fighting to stay alive, you might forget to really live. Patrick Swayze was true to his beliefs; he worked, rode his horses and maintained his farm right up to the end. It was painful for him, at times horribly so, and it cost Patrick in ways that no healthy person can understand, just to get through his days. But Patrick was determined to live his life on his terms, to not give in on anything that was important to him, and to fight his cancer with every ounce of strength he possessed.
Yes, Patrick Swayze died from his cancer, but so what? We all must die, sooner or later, and if not from one thing then from another. No one gets out alive, as the saying goes, so the people who think that dying is losing have lost sight of why we live in the first place.
Patrick Swayze was brilliantly successful in pretty much everything he tried. He was a popular and accomplished dancer, actor, horse breeder and pilot (Patrick held an instrument rating for his twin-engine Cessna). But what impresses many people the most is the quality of the man himself. Patrick married Lisa Niemi in 1975, and their marriage lasted all his life. Patrick was raised as a Roman Catholic, but also studied Tai Chi and several schools of Buddhism. More to the point, Patrick established a well-deserved reputation as a thoroughly honest and hard-working man, a fighter for what he believed in, be it his work, his family, or his values. To the end, Patrick showed the world a man worthy of respect and admiration, a man of purpose.
Cancer took his life, but even so, Patrick Swayze won his battles.
Friday, September 11, 2009
What 9/11 Means
Eight years ago, a team of terrorists killed three thousand innocent people for the advancement of an evil conspiracy. On the eighth anniversary of the most horrific act of evil deliberately perpetrated against the United States, the man in the White House is arguably the least ready in memory to effectively deal with an enemy who would, if they could, repeat that act on an even greater scale. The Congress of the United States has taken steps to disarm the men and women who protect the nation, while all but apologizing to the colleagues of the murderers for the U.S. getting in their way during the Bush Administration. And the people of America, once united in the face of the crisis, are divided and worn out by petty bickering and farcical mockeries of the duties and obligations of politicians in both major parties.
The nation is at war. It is an obscene fact that many Americans have managed to somehow forget that fact, to take for granted the efforts of our military to secure stable, free nations in Iraq and Afghanistan, or that the war still continues to this day, with all the stakes and risk that existed from the beginning for those who put nation first.
Terrorists are not ‘criminals’, they are not ‘freedom-fighters’, they are not ‘misunderstood’. They do not have equal standing with the people they attack and kill. They do not have civil rights under any established law. They do not enjoy protection under the Geneva Convention. The Geneva Convention was designed to protect combatants serving nations under clearly defined rules of conduct, and it is not difficult at all to confirm that people who do not belong to any national army or militia, who do not operate under military protocols, and who commit atrocities not in isolated cases but as deliberate strategy, do not qualify as ‘combatants’ in the sense of that treaty. Terrorists commonly enter foreign countries to perform their murders, so it is not correct to presume that they enjoy the protection of law that is accorded citizens. And the very nature of their conduct and strategy makes it necessary to treat terrorists by a simple standard of identification and extermination. Find them and kill them, end of story. If there is doubt, investigate, but if there is no doubt, then there is no quarter to be given. Terrorism by its nature is anathema to humanity, and therefore such groups must be exterminated in total whenever and wherever they are found.
Before 9/11, it was politically sensitive to act directly against terrorists. This can be seen in the policies of airlines, for example, which told their crews not to resist hijackers but to cooperate in order to save lives. The 9/11 attacks made it clear that the old system was not merely dysfunctional, but hopelessly naïve. On the international level, as well, the clear focus and imperatives of the Bush Administration after 9/11 made it clear that informal wink-and-nod arrangements between terrorist groups and certain national political groups would no longer be tolerated. A new U.S. doctrine took effect, one which required President Bush to set aside all his original plans and policies in deference to his commitment to defend America from the threat of terrorism.
Before 9/11, religious thought regarding terrorists was sparse, especially among Muslims. Since Islam does not emphasize a separation of Church and State (quite the opposite: the very concept of Dar-al-Islam presumes mutual religious and military conquest of all the world), there was no overt debate on the morality of terrorist actions – those who opposed such actions thought them too incidental to address in the context of the faith as a whole, and those who supported such actions thought it unnecessary to risk dissension by discussing the religious context of the actions. A few Muslim sheiks had observed that the Koran prohibits killing known innocents, especially women and children, and that suicide was permissible in defense of innocents but not in the murder of same, even in Dar-al-Harb. After 9/11, Muslims found themselves more compelled to examine their faith in the light of such actions, to decide not only whether terrorism should be part of their faith but also what their response as Muslims should be to terrorist acts by Muslim extremists. Non-Muslims found that they knew little of Islam, and often judged the entire faith by the actions of its most extreme. As with all faiths, prejudice and history have been difficult for people to overcome, both outside of and within the community of faith.
Prior to 9/11, there was a belief in some quarters of the world that supporting a terrorist group could advance a national strategy, and in so doing produce an economic benefit for their country or government. It is now more generally recognized that terrorism is economic parasitism. By their nature, terrorist groups consume goods and destroy people and materials; it is literally impossible for terrorism to create gain or improve economic conditions. Economics is not, has never been, and never will be a zero-sum game; any farmer can tell you that his neighbor’s misfortune in no way helps his crops or livestock, and in many ways another’s loss threatens his own well-being. That fact is now more apparent than ever before, giving regimes pause in considering the results of supporting such groups.
Before 9/11, there was some discussion that American military force was not up to the job in all places, that “lessons” in places like Mogadishu and Haiti showed the limits of U.S. power and influence. Various apologists for defeatism and appeasement pushed to scale back the size and mission of the American military, and to replace pro-American doctrines with policies of retreat and surrender, similar to the British pull-backs following World War 2, on the theory that U.S. interests represented imperial designs. This lie ironically found increased support in the fiction of the “peace dividend” after the fall of the Warsaw Pact, as if the ensuing chaos in a part of the world with more than 50,000 nuclear warheads was of no concern, or as if other nations would not rush in to fill the void of power left by the fall of the Soviet Union, with attendant threat to American interests and citizens. While American power was clearly impressive in the first Gulf War of 1990-91, critics charged that things would be far different if the U.S. were committed to a long war, or tried to actually change the political structure of a major Mid-East country.
The long war in Iraq and Afghanistan proved the critics wrong again. The war has been difficult, costly, painful, and if Obama lacks the backbone to stay the course, great damage could still be done to American goals and interests in the region, but at present any objective analysis of the war would conclude that the governments of Iraq and Afghanistan are a vast improvement on their predecessors, in terms of freedom, economic opportunity, and security for their citizens and the region. Whether the war’s cause was just or the results worth the cost may be debated, but the U.S. military clearly established an unsurpassed and undeniable ability to assert its power anywhere, anytime. The ramifications of this proof become obvious when the history of the region is examined.
In summary, 9/11 was a horrific atrocity, perpetrated by evil minds who have in some part come to a just yet terrible consequence, and in others deferred their reckoning to when they must stand before God. We have seen valor and heroism from many places, some unexpected, and loathsome hypocrisy and pusillanimity from others, especially those in privileged and public positions of celebrity and the avant-garde. We have seen the Mainstream Media sabotage its own credibility, and grass-roots bloggers rise to a degree of public acclaim and success. Our military has suffered thousands of casualties in two campaigns, only to see the new President discount their sacrifice in hopes of gaining political coin for himself from our enemies. Both major political parties have demonstrated a grievous lack of commitment to fundamental American priorities and values, and few federal elected officials make themselves available to regular citizens, let alone accountable.
Yet for all of this, eight years after 9/11, our friends and enemies alike understand that there is a core of resolve in America unlike any other country, that there is a well of strength and purpose in this nation which no enemy may hope to overcome and no friend may fear will totally fail. We may be delayed, and we may take losses, but in the end, sooner or later we shall prevail. Not because Americans are better than other nations, but because this nation stands for the best of every nation, and while our methods may falter, our cause is just. No tyrant, no terrorist, no turncoat, no traducer shall win against us.
The nation is at war. It is an obscene fact that many Americans have managed to somehow forget that fact, to take for granted the efforts of our military to secure stable, free nations in Iraq and Afghanistan, or that the war still continues to this day, with all the stakes and risk that existed from the beginning for those who put nation first.
Terrorists are not ‘criminals’, they are not ‘freedom-fighters’, they are not ‘misunderstood’. They do not have equal standing with the people they attack and kill. They do not have civil rights under any established law. They do not enjoy protection under the Geneva Convention. The Geneva convention was designed to protect combatants serving nations under certain rules of conduct, defined clearly and it’s not difficult at all to confirm that people who do not belong to any national army or militia, who do not operate under military protocols, who commits atrocities not in isolated cases but as deliberate strategy, do not enjoy identification as ‘combatants’ in the sense of that treaty. Terrorists commonly enter foreign countries to perform their murders, so it is not correct to presume that they enjoy the protection of law that is accorded citizens. And the very nature of their conduct and strategy makes it necessary to treat terrorists on a simple means of identification and extermination. Find them and kill them, end of story. If there is doubt, investigate, but if there is no doubt, then there is no quarter to be given. Terrorism by its nature is anathema to humanity, and therefore such groups must be exterminated in total whenever and wherever they are found.
Before 9/11, it was politically sensitive to deal directly against terrorists. This can be seen in the policies of airlines, for example, which told their crews not to resist hijackers, but cooperate in order to save lives. The 9/11 attacks made it clear not only that the old system was not functional, but hopelessly naïve. On the international level, as well, the clear focus and imperatives of the Bush Administration after 9/11 made it clear that informal wink-and-nod arrangements between terrorist groups and certain national political groups would no longer be tolerated. A new U.S. doctrine took effect, which required President Bush to set aside all his original plans and policies in deference to his commitment to defend America from the threat of terrorism.
Before 9/11, religious thought regarding terrorists was sparse, especially among Muslims. Since Islam does not emphasize a separation of Church and State (quite the opposite, the very concept of Dar-al-Islam presumes mutual religious and military conquest of all the world), there was no overt debate on the morality of terrorist actions – those who opposed such actions thought them too incidental to address in the context of the faith as a whole, and those who supported such actions thought it unnecessary to risk dissension by discussing the religious context of the actions. A few Muslim sheiks had observed Koranic prohibitions against killing known innocents, especially women and children, that suicide was permissible in defense of innocents but not in murder of same, even if Dar-al-Harb. After 9/11, Muslims found themselves more compelled to examine their faith in the light of such actions, to decide not only whether terrorism should be part of their faith but also what their response as Muslims should be to terrorism acts by Muslim extremists. Non-Muslims found that they knew little of Islam, and often judged the entire faith by the actions of its most extreme. As with all faiths, prejudice and history have been difficult for people to overcome, both outside of and within the community of faith.
Prior to 9/11, there was a belief in some quarters of the world that supporting a terrorist group could advance a national strategy, and in so doing produce an economic benefit for their country or government. It is now more generally recognized that Terrorism is economic parasitism. By its nature, terrorist groups consume goods and destroy people and materials; it is literally impossible for terrorism to create gain or improve economic conditions. Economics is not, and has never been and never will be, a zero-sum game; any farmer can tell you that his neighbor’s misfortune in no way helps his crops or livestock, and in many ways another’s loss threatens his own well-being. That fact is now more apparent than ever before, giving regimes pause in considering the results of supporting such groups.
Before 9/11, there was some discussion that American military force was not up to the job in all places, and that “lessons” in places like Mogadishu and Haiti showed the limits of U.S. power and influence. Various apologists for defeatism and appeasement pushed to scale back the size and mission of the American military, and to replace pro-American doctrines with policies of retreat and surrender, similar to the British pull-backs following World War 2, on the theory that U.S. interests represented imperial designs. This lie ironically found increased support in the fiction of the “peace dividend” after the fall of the Warsaw Pact, as if the ensuing chaos in a part of the world holding more than 50,000 nuclear warheads were of no concern, or as if other nations would not rush in to fill the void of power left by the fall of the Soviet Union, with attendant threats to American interests and citizens. While American power was clearly impressive in the first Gulf War of 1990-91, critics charged that things would be far different if the U.S. were committed to a long war, or tried to actually change the political structure of a major Mid-East country.
The long war in Iraq and Afghanistan proved the critics wrong again. The war has been difficult, costly, and painful, and if Obama lacks the backbone to stay the course, great damage could still be done to American goals and interests in the region; but at present any objective analysis would conclude that the governments of Iraq and Afghanistan are a vast improvement on their predecessors in terms of freedom, economic opportunity, and security for their citizens and the region. Whether the war’s cause was just or the results worth the cost may be debated, but the U.S. military clearly established an unsurpassed and undeniable ability to assert its power anywhere, anytime. The ramifications of that proof become obvious when the history of the region is examined.
In summary, 9/11 was a horrific atrocity, perpetrated by evil minds who have in some cases come to a just yet terrible consequence, and in others have deferred their reckoning to the day they must stand before God. We have seen valor and heroism from many places, some unexpected, and loathsome hypocrisy and pusillanimity from others, especially those in privileged and public positions of celebrity and the avant-garde. We have seen the Mainstream Media sabotage its own credibility, and grass-roots bloggers rise to a degree of public acclaim and success. Our military has suffered thousands of casualties in two campaigns, only to see the new President discount their sacrifice in hopes of gaining political coin for himself from our enemies. Both major political parties have demonstrated a grievous lack of commitment to fundamental American priorities and values, and few federal elected officials make themselves available to regular citizens, let alone accountable.
Yet for all of this, eight years after 9/11, our friends and enemies alike understand that there is a core of resolve in America unlike that of any other country, a well of strength and purpose in this nation which no enemy may hope to overcome and no friend need fear will fail. We may be delayed, and we may take losses, but in the end, sooner or later, we shall prevail. Not because Americans are better than other nations, but because this nation stands for the best of every nation, and while our methods may falter, our cause is just. No tyrant, no terrorist, no turncoat, no traducer shall win against us.
Sunday, September 06, 2009
Responsibility and Accountability
On Thursday, September 3, the University of Oregon played Boise State in a season-opening game that was important to both schools, as each was nationally ranked and hoping to start off strong. The game ended in a 19-8 win for Boise State, after beginning with a larger-than-usual show of sportsmanship – an ironic gesture, given the ending. At the game's end, Byron Hout of Boise State approached LeGarrette Blount of Oregon, slapped him on the shoulder pad to get his attention, and yelled something at him which has not yet been revealed to the public. As Hout turned away to face Boise State head coach Chris Peterson, who was pulling him away from Blount, Blount angrily launched a punch which landed on Hout's jaw. To make matters worse, Blount then attempted to punch another player, struggled with his own teammates as they wrestled him toward the locker room, and had to be restrained by police from going after fans who taunted him from the stands as he left the field. Still worse, the game and Blount's actions were nationally televised by ESPN. And the day after that, it came out that Blount had already been suspended from the Oregon team once before, back in February.
On Friday, Oregon coach Chip Kelly suspended running back LeGarrette Blount for the rest of the season.
To some degree, the decision to end Blount's collegiate career (he is a senior, and the suspension includes any bowl games Oregon may earn) was predictable. Blount's action was not only blatant and deliberate but nationally televised; moreover, Coach Kelly serves on the NCAA committee which addresses athletes' sportsmanship, and he had already been under fire for an apparent lack of discipline on the Ducks' team. Public opinion seemed to demand a heavy punishment, and so the axe fell quickly in this case. To be honest, I don't know that I disagree all that much with the decision, except of course that Blount will not have a public opportunity to show his better side. I might have expected an indefinite suspension to be a better fit, but on the other hand the Oregon officials have sent a clear message and presumably have put this behind them.
But I am writing this article to address the other man who needs to accept accountability: Byron Hout. No, I am not saying that Hout should be suspended or even given any kind of official punishment for his part in the incident. That said, I am concerned about his part in the event. Hout chose to come over to Blount; whatever he said was obviously meant as trash talk, and his grinning face as he turned toward his coach indicates that he was just fine with insulting a key player on an opposing team. What Hout did was clearly out of bounds. Had it happened during the game, it would have earned a penalty for taunting – and I speak as a former UIL football official in Texas (the UIL uses the NCAA rulebook). Normally, a good coach considers the damage done when assessing punishment to a player for an infraction. A face-mask penalty, for example, is one thing, but if the other team scores the winning touchdown because you tackled the runner by his facemask on 3rd-and-20, you are in big trouble. A false start may not be a big deal, unless of course it happens on 4th down and pushes you just out of field goal range. And sportsmanship is a much bigger issue when something you say affects the game's outcome or the image of the school. Back in my day, players were expected to wear dress clothes and ties on the bus and to represent the school and team with total respect. It was silly at times and made the trip longer, but you knew you stood for something worth your work, win or lose. Call me old school, but the sport could do with getting back to that.
Coach Peterson has said that he will meet privately with Hout and considers the incident a 'teachable moment'. The problem, of course, is that the incident was public, and Hout needs to make some gesture showing he recognizes that his taunt started a chain of actions with serious consequences. Hout did not make Blount throw a punch, but he knew he was not acting in the best interests of his school, his team, or the game. His unsporting behavior was public, and so it needs a public response. At the very least, Hout should apologize in public for his behavior, and Peterson needs to show that such behavior has real consequences.