Friday, January 29, 2016

Disparate Impact

Right now Iowa and New Hampshire are in winter's grasp, though their respective landscapes will reveal themselves after snow and ice give way to the thaws of spring and summer.

Iowa's corn fields will produce the kind of harvest that is the envy of nations, while the jagged peaks of New Hampshire's White Mountains attract outdoorsmen like moths to a flame. Iowa's largest city (recession-proof, insurance-centric Des Moines) will again prove to be one of America's most stable, and New Hampshire's median household income will again be among America's top ten.

These states embody much of what is good about our nation, from family values to covered bridges and so on... Over the years Iowa has given us Buffalo Bill, John Wayne, Johnny Carson, and five Nobel Prize winners... New Hampshire has given us Daniel Webster, Horace Greeley, Alan Shepard, and Carlton Fisk... Generally speaking, Iowans embrace the virtues of diligence and neighborliness; New Hampshirites, the virtues of diligence and thrift.

In short, these states contribute more than their share and the USA would not be the USA without them. But with the 2016 election process about to kick off in earnest, I find myself, for the umpteenth time, wondering why it is that they have such an outsize impact on who it is that the rest of the USA will be able to vote for come November.

Combined, they account for 1.38 percent of the nation's population and 1.73 percent of its land area. The Tampa Bay, Florida metropolitan area (where I live) has never been mistaken for Los Angeles or Chicago, yet it has more than three times as many people as the state of New Hampshire and over a million more than the state of Iowa. Almost 20 American cities have larger metro-area populations than the combined statewide populations of these two states.

Nonetheless -- and almost entirely because Iowa and New Hampshire are the first two states to vote in the primary process -- it is axiomatic that a candidate who doesn't make a big splash in the former's caucuses or the latter's primary cannot wind up being his party's nominee for the presidential election.

Why is that? Where is the logic in that?

Why should candidates be written off before the likes of Texas, Florida, New York, California, and Ohio (to say nothing of Wyoming, West Virginia, and Vermont) have even had a chance to have their say?

Sure, smart campaigning and smart allocation of resources are important; and if a candidate fails to smartly orchestrate his campaign through the first handful of states that vote for who will become the party nominees, that must give us a clue about whether he will smartly execute the duties of the office he seeks. But still, why is it that quality candidates must face the prospect of being financially and "media-ly" driven to bow out simply because voters in these two small states -- just these two -- don't flock to them? How is that fair to the rest of the nation?

Right now there are more than a dozen Republican candidates vying for the nomination, all of whom bring significant ideas and talents to the table. Yet, despite all that variety lined up in the starting gate, the smart money is prognosticating that it could all be over (or at least be nearly over) before the other 48 states ever make it to the batter's box.

Up until a few hiccups ago, the smart money was saying that Ted Cruz would win Iowa and Donald Trump New Hampshire, and that the race would then transfigure into a two-man contest between them and them alone -- unless, that is, Marco Rubio happened to finish third in Iowa and second in New Hampshire, in which case he still wouldn't have a chance to win the nomination, but would at least be able to influence the outcome by eventually persuading his supporters to back either Cruz or Trump.

Now, that smart money is saying that Trump will win both states and then nobody -- nobody! -- will stand a chance of overcoming his juggernaut momentum.

To which I say: What?!?!

And to which I also say: How can it be that winning two states with 1.38 percent of the population equals the momentum of invincibility?

To repeat: There is an unprecedented wealth of talent on the GOP side, with more than a dozen candidates offering a multitude of robust and often conflicting ideas. Yet I am being told that the votes of less than two percent of my fellow Republicans can go almost all the way to deciding things even before more than ninety-eight percent of my fellow Republicans have even touched a ballot.

In other words, I am being told that the "less than two percent" can prevent the "greater than ninety-eight percent" from offering America's non-Republicans any choice other than Trump or Cruz. And, that the "less than two percent" could even prevent the "greater than ninety-eight percent" from offering any choice other than Trump. How can that be good?

Granted, an axiom is not a rule, and the axiom that you must win Iowa and/or New Hampshire to win your party's nomination does not always hold true... But it usually does. The last time a Republican won his party's nomination after not winning either of those states' contests was 52 years ago, when Iowa did not hold its caucuses and thus nobody could win them. The last time a Democrat won his party's nomination after not winning either of those states' contests was 24 years ago, when Bill Clinton finished fourth in Iowa and second in New Hampshire.

I do not mind that some states have an amount of influence which outweighs their share of the population. Such an arrangement helps protect against tyranny by the majority, and in a nation founded on individual liberty, protecting against tyranny by the majority is every bit as important as protecting against tyranny by the minority.

However, the amount by which Iowa and New Hampshire disproportionately affect presidential politics is just not good.

I don't have an answer for this situation. By which I mean that I don't have an answer I'm comfortable with, since every possible solution has flaws of its own. But I do know that this situation is a problem, and we would be better off without it.


Thursday, January 28, 2016

Challenger

With all the hype focused on tonight's sans-Trump primary debate, I figured I'd sit that topic out by acknowledging that today is the 30th anniversary of the Challenger explosion. Below is the short post I published five years ago:



Twenty-five years ago today I was a freshman at St. Petersburg High School. Walking from PE to English class, I glanced up at the sky and saw a contrail that was split into two short branches. I didn’t think much of it, even though it was not the most ordinary of sights. But when I got to class I heard that the space shuttle Challenger had exploded; and when I watched the replays of its explosion after school, I realized that the contrail I had seen was the Challenger’s.

It is hard to believe that the 25th anniversary is upon us. Let us never forget those who lost their lives that day, and let us never forget the bereaved loved ones they left behind. Rather than conjure up platitudes about their lives, I will simply leave you with the words that were spoken on national television that night by President Reagan. I still get a tingle in my spine every time I hear them:

“The crew of the space shuttle Challenger honored us by the manner in which they lived their lives. We will never forget them, nor the last time we saw them, this morning, as they prepared for their journey and waved goodbye, and slipped the surly bonds of earth to touch the face of God.”

To watch the speech, go here.

Tuesday, January 26, 2016

Flint

Many news outlets (some of which are nothing more than partisan hack web sites) have been atwitter about the scandal involving the water supply of Flint, Michigan... And they should be, because it is a genuine scandal in which governments have actively harmed the citizens they are supposed to protect... But unfortunately, much of the so-called reporting about this scandal is off base.

Since I'm talking about the media, it should not surprise anyone that the coverage has tried to blame Republicans for Flint's water scandal. What makes this suggestion laughable is that everyone directly involved in the Flint situation is a Democrat. While it's true that Michigan's governor (Rick Snyder) is a Republican, he is, at most, only a peripheral figure where Flint is concerned.

*     *     *     *     *

Like many Midwestern cities, Flint is a one-party jurisdiction that has been run by Democrats for generations. And like many of those cities, it has spent the last few decades in a state of erosion. A number of reasons for that erosion can be cited -- not least of which are that Flint's Democrat-run government does a bad job providing basic services, and that it creates an environment in which it's hard for businesses to succeed.

For years, Flint purchased its drinking water from Detroit. As you probably know, Detroit has also been run by Democrats for generations and is a shadow of its former self... Unfortunately for Democrat-controlled Flint, this set-up resulted in it being price-gouged by Democrat-controlled Detroit. Therefore, Flint decided to stop buying water from Detroit and start buying it from the Karegnondi Water Authority. That decision, made in 2013, was to take effect in 2016. However, Detroit reacted like a jilted lover and told Flint that it would stop providing it with water two years early, in spring of 2014... This drove Flint to find a temporary water source for the period from 2014 to 2016, and the source it chose was the Flint River -- which makes sense when you consider that the river was already the city's back-up source; but does not make sense when you consider that the river is polluted, and that the infrastructure for delivering its water to the public was known to be inadequate.

If you want to keep scoring this on a Democrat vs. Republican basis, it looks even worse for the former party when you consider the agencies charged with ensuring such things as safe drinking water... The U.S. Environmental Protection Agency (which was founded by Nixon, but has for years been a leftist oasis staffed almost exclusively by Democrats) knew that "switching the water supply could enhance pipe corrosion and thus increase lead levels in residents' drinking water," yet it failed to inform the public. This has led to the resignation of Region 5 Administrator Susan Hedman... Meanwhile, at the state level, Michigan's Department of Environmental Quality (another leftist oasis staffed almost exclusively by Democrats) simply told the city to "monitor the water for a year" -- a horrific bit of advice when you consider that Flint's 100,000 residents were drinking the water every day and many of them were being sickened by it.

Consuming the water exposed residents not "only" to lead, but also to Legionnaires' disease, E. coli, and harmful chemical byproducts. An outbreak of Legionnaires' resulted in 10 deaths, plus 77 serious but non-fatal illnesses. It is reported that as a result of the scandal, somewhere between 6,000 and 12,000 people have dangerously high levels of lead in their systems.

Large segments of the press would have you believe the Republican Party is to blame for all of this. In the most disgraceful example that I have seen, Cagle Cartoons published this doodle depicting Governor Snyder shoving the city of Flint into toxic sludge "in the name of deregulated trickle-down government," while the GOP elephant stands nearby crowing "Amen." Curiously absent from this truth-murdering cartoon is the Democrat donkey, despite the fact that it is Democrats who have been calling the shots all along -- and of course, the cartoon has been published by the likes of Occupy Democrats and has gone viral on social media.

*     *     *     *     *

Are there any Republican fingerprints at the crime scene that is the Flint Water Scandal? It would be a lie to say "absolutely not," but those fingerprints are vanishingly faint compared to the ones left by Democrats.

Like I already said, Governor Snyder is a Republican and Flint is a city in his state, so he bears some degree of accountability -- but that degree is minuscule, since it is not in his job description to oversee every municipality's decision regarding its water supply.

Yes, Snyder appointed the emergency city manager who was in office when Flint's water-related decisions were made -- but that manager was a Democrat, and Snyder appointed him specifically to keep from being accused of orchestrating a "Republican takeover" of a Democrat city.

And as I have already laid out, the reason an emergency city manager was needed was that Democrats threw the city into the shitter.

*     *     *     *     *

Republican scandal? Not even close. But it's sad that we (including me) are even talking about political parties. There is no way that Republicans ever wanted to sicken the residents of Flint, but there is also no way that Democrats ever wanted to sicken the residents of Flint. What happened here is a textbook example of what happens whenever a single party holds the reins of power for too long.

What happens with one-party rule is this: The party becomes entrenched; its unelected workers (i.e., bureaucrats) focus on themselves and their benefits, inevitably losing sight of the citizens who employ them but can't discipline them; the party becomes a machine whose goal is to sustain itself; and in due course, incompetence and corruption become unavoidable, with the only question being how bad (in this case, how deadly) the incompetence and corruption will be.

Governments are staffed by people, and every person is both fundamentally flawed and tragically fallible. Each of us is tempted by both good and bad; and perhaps worst, by naked self-interest. And each of us occasionally succumbs to the temptations of bad and, especially, to the temptations of self-interest.

Give people power and make them accountable to nothing except their collective conscience -- on the hope that that collective conscience will stay vigilant 24/7 -- and your hopes are guaranteed to be dashed, often in disastrous ways.

The Flint Water Scandal is a perfect teaching moment. It teaches how government cannot be trusted, and how the stated aims of political parties cannot be taken at face value.

It would teach America these things very clearly, if only the American media that covers politics would report the story without partisan shading. But the American media that covers politics won't do that, and perhaps that is the biggest lesson of this whole sordid affair.


Thursday, January 14, 2016

Oregon

In a remote snowy valley in Eastern Oregon, a figurative battle line has been drawn by a handful of American citizens who believe that America's government is eradicating the rights of America's people.

As happens whenever battle lines are drawn, people are lining up on either side for any number of reasons. The divide can be summarized in many ways (country boys vs. city slickers, individualists vs. collectivists, the rednecks vs. the refined, etc.) but at the end of the day, even though there is a major political component to this conflict, there is no denying that the divide is primarily a cultural one -- and there is no denying that the cultural divide on display in Oregon is a precise reflection of the one that is threatening to tear our country apart.

Plenty of people have done their version of commenting about what is happening in Oregon, by posting one-sentence opinions on Twitter and photo mock-ups on Facebook. Unfortunately, most of them have no knowledge -- I repeat, no knowledge -- about the background to what is happening.

Yes, a group of citizens has taken over a small building at the Malheur National Wildlife Refuge, a sprawling wilderness of 293 square miles. And yes, they have guns with them. But to flippantly say they have "occupied" or "seized" the building and incessantly refer to them as "armed" is to obscure the real issue.

The building was vacant and they knew it would be. They chose it so as to avoid creating a situation in which people could be harmed; and as they have repeatedly said, they brought their perfectly legal guns not because they were itching for a shootout, but to defend themselves in the event the government attacks them for being in the building. (Which doesn't sound wise, but given the government's track record [Elian Gonzalez, Ruby Ridge, etc.] it is quite reasonable for them to consider the possibility of being attacked.)

But the most important question is: Why did the protesters drive across state lines in the dead of winter to sit in a vacant building in the middle of nowhere? On this count, ignorance reigns supreme among the public because: 1) most of the media has declined to deal with the question; 2) few people have bothered to ask it; and 3) fewer still have bothered to listen to the protesters' explanation, despite the fact that the protesters have not been shy about volunteering it.

The long answer to the question follows, but the short answer is that the protesters came here to raise public awareness about the federal government abusing and ruining the lives of innocent people. In other words, they came to engage in civil disobedience, which has a time-honored tradition in this country (John Brown did it in a bad way, Rosa Parks in a good way, but they both intentionally broke the law "to arouse the conscience of the community," and they both succeeded).

*     *     *     *     *

When it comes to abusing individuals who own property in our western states, the U.S. government's track record is so long and goes back so far that there isn't enough room for it in a hundred blogposts. So let's just say that it is repugnant, harrowing, and adversarial to human rights.

To begin with, it's important to be aware that the feds own 47 percent (and counting) of the land in our 11 westernmost states not called Hawaii or Alaska -- a percentage that has been achieved by hook and by crook -- and meanwhile, they own one-fifth of Hawaii and a whopping 69 percent of Alaska. In staggering contrast, they own only 4 percent of the land in the other 37 states.

Though the events which precipitated the "occupation" of the vacant building in Malheur National Wildlife Refuge peaked during this decade, they began in the 1980's, when the federal government sought to buy people's ranches in the Silvies Plain. The feds coveted the Silvies because it was adjacent to what was then the border of Malheur. The feds had long wanted to increase Malheur's size, and the only way to do that was by acquiring private land.

Many property owners in the Silvies refused to sell their ranches to the feds because they were, after all, their ranches, their homes, and their livelihoods. In light of the property owners' refusals, the feds could have attempted to acquire the properties by eminent domain, which might have looked bad but at least would have required them to pay each owner what his or her property was worth. However, eminent domain was not attempted.

What did happen was that curiously timed decisions were made to divert water into the already voluminous Malheur Lakes, which proceeded to overflow and flood the Silvies. Entire homes and barns were washed away and grazing land was tossed up and destroyed. Just like that, ranchers lost the roofs over their heads, were rendered unable to ply their trade, and saw the value of their land sink like an anvil in the ocean. Desperate and without recourse, they were compelled to come to the (ahem) "bargaining table" and practically beg Master Government to take their land off of their hands at basement prices.

It would be good, for lack of a better word, if their experience was some kind of bizarre exception to the way our government treats property owners, but it was not; and in the three decades since that chicanery on the Silvies Plain, the government has actually gotten worse. The protesters holed up in Oregon, unlike their critics, are aware of this, and they give a damn.

Which brings us to the more recent injustice that has occurred with regard to Malheur. The injustice which, more than any other, prodded the protesters to try to get the public's attention.

*     *     *     *     *

The Hammond family is one of the few that still owns land in the Silvies. Dwight Hammond is 74 years old and his son Steven is 46.

As anyone familiar with land management and rural life knows, if you own acreage, sometimes you need to start fires on it. It is necessary to clear undergrowth, in order to lower the risk of wildfires burning out of control. It is necessary to clear invasive species and prevent their spread. Landowners do this regularly on private property, and both the federal and state governments do it regularly on public property (though, by using the phrase "controlled burns" to describe the blazes they start, governments successfully gloss over the fact that it's not uncommon for said burns to spread onto private property).

Being ranchers, the Hammonds have obviously set fires on their land. It is two particular fires, one in 2001 and one in 2006, that are now the epicenter of an absurd drama... The first one crossed an unmarked line between private and public property and wound up burning an estimated 140 acres of the latter. For the record, that equals less than one-tenth of one percent of Malheur... The second blaze also crossed the unmarked line, and burned, in the words of the U.S. Ninth Circuit Court of Appeals, "about an acre of public land" -- which, if you're keeping score, equals less than one-thousandth of one percent of Malheur. At trial, the value of the Malheur land that caught fire was found to be less than $1,000.

For that, the feds chose to tar the Hammonds not only as criminals, but as felons; and not only as felons, but as terrorists -- and as if that wasn't bad enough, they came after them years after those relatively piddling fires took place.

They charged the Hammonds with a total of 19 counts, including charges under the Federal Anti-Terrorism and Effective Death Penalty Act, which mandates a minimum five-year prison sentence for anyone who "maliciously damages or destroys...any...personal or real property in whole or in part owned or possessed by, or leased to, the United States." If you believe that the existence of the adverb "maliciously" in front of the words "damages or destroys" would be enough to protect the Hammonds, all I can say is that you've never seen a federal prosecutor in action.

The charges were brought by Master Government in 2010; i.e., nine years after the first fire and four years after the second. As surely as night follows day (federal prosecutors are famously good at getting their way), the Hammonds were found guilty and faced with the prospect of spending at least five years behind bars -- five years during which there was a not-small chance that Dwight Hammond would die.

At sentencing, Judge Michael Hogan did something that should have earned him praise, and which goes to show that judges actually can, you know, make judgments. Rather than follow the minimum-sentencing mandate and send the Hammonds away for five or more years, he ruled that doing so would violate the U.S. Constitution's Eighth Amendment, which forbids "cruel and unusual punishment."

In support of his decision, Hogan noted that the fires "could not have been conduct intended (to be targeted) under" anti-terrorism laws, and that five years in the federal pen would be "grossly disproportionate to the severity of the offenses...would not meet any idea of justice...would shock the conscience to me."

Because he had to sentence them to something, seeing as how they were found guilty, Hogan gave Dwight Hammond three months and Steven Hammond twelve months plus one day. They went to prison and did their time without incident. When their sentences were over, they were released and then they returned home to resume their lives. You might think that that would be that, but you would be wrong.

Remember, Master Government gets what Master Government wants. And Master Government was unhappy that Judge Hogan, one of its own employees, had left the reservation by roguishly refusing to impose a Soviet-style punishment for the Hammonds' trifling and victimless "crimes"... So Master Government appealed not the jury's verdict, but the judge's sentencing, up to another one of its courts; and to the Hammonds' great misfortune, said court happened to be the Frisco-based loony bin officially known as the U.S. Ninth Circuit Court of Appeals... Three months ago, that court, in keeping with its reputation (Rush Limbaugh has long referred to it not as the Ninth Circuit but as the Ninth Circus), overrode Judge Hogan and ordered the Hammonds back to prison until they have each been there for five years.

This means that Dwight Hammond -- despite having already served his sentence, and despite being in the midst of trying to pay a $400,000(!) civil settlement to Master Government -- has now been sent back to prison by Master Government with no new evidence submitted and no new charges filed... and will not be free again until after he turns 79 years old, assuming he is still alive when the date rolls around.

This is why American citizens who have never met the Hammonds (and only three of whom are related to Cliven Bundy) decided to raise awareness about their plight. The protesters' decision was made in the hope of igniting one of those "public conversations" about "social justice" that metropolitan liberals are always claiming they want to have.

Predictably, however, the metropolitan liberals -- as well as the social justice warriors they cavort with, plus an alarmingly high percentage of conservatives -- have responded by ignoring the issues and basically telling them to go fuck themselves.

*     *     *     *     *

Metropolitan liberals and social justice warriors (i.e., Democrats who are likely to vote) would have you believe they care about the little guy, and that they "have his back"... They would have you believe that they abhor the idea of the powerful railroading the powerless... They would have you believe that they abhor the idea of judging people based on roots, cultural background, ethnicity, etc... But in this case (and others) it is demonstrable that what they say they think does not always line up with what they actually think.

They are either fooling themselves about their beliefs or lying to everyone else about their beliefs. I suspect the former is true in most cases, for I do believe that most metropolitan liberals (and a sizeable number of social justice warriors) honestly see themselves as caring for the little guy. I know enough of them to say, with certainty, that most of them are good people. Nevertheless, their cultural instincts are driving them to prop up a political structure which, in the end, forces human beings to surrender their personal autonomy and hand it over to the state. Having decent intentions does not make that okay.

By "cultural instincts," I am referring to the ways that people's knees jerk, the subconscious ways that people presume things about those they perceive as the "Other." Big city liberals are not the only ones susceptible to this phenomenon, for it is part of human nature and rural conservatives are guilty of it as well. But by and large, rural conservatives lack the powerful connections that metropolitan liberals enjoy, and contrary to what the mainstream media would have you believe, they are much less likely to want to use government to "impose their values on others."

Where the situation in Oregon is concerned, most liberals are not even hearing the message for the simple reason that they instinctively dislike the messengers. In the fronts of their minds, they see rural folk with guns and think "rednecks." And in the backs of their minds, when they think "rednecks" they at best think "idiots" and at worst think "racists." Actually, they don't even really think these things, they feel them -- which is even worse, because feelings are resistant to thought and thus are resistant to being changed.

The shame of it all is that this results in the rights of the individual getting burned to death and sacrificed to the tribal tendencies of the mob -- with consequences that could deprive all of our children of freedom.

The Oregon protesters have attempted to bring to the public's attention the exact kind of injustice that metropolitan liberals and social justice warriors say they want to prevent, and they have made this attempt at personal risk -- only to find that metropolitan liberals and social justice warriors are unwilling to even listen.

Because metropolitan liberals are dominant in professional media -- and are, along with social justice warriors, very good at saturating social media -- it turns out that liberals at large and society at large: 1) have not heard what it is that brought protesters to Oregon, but 2) have heard loud and clear the implication that the protesters are backwards gun-toting hooligans.

One need not like the protesters in order to listen to their complaints. Throughout history, there have been plenty of instances in which individuals found that they agreed with somebody about an important topic, even if they considered the somebody himself to be beyond the pale. However, it seems to happen less and less in today's dumbed-down, short-attention-span world. I think that trend is unhealthy for civil society, and no, I will not hesitate to allege that liberals are much worse than conservatives when it comes to listening to opponents. In my experience, most conservatives are fully capable of explaining liberal viewpoints and explaining the rational and emotional reasons for those views (even though they proceed to explain why they think those views are in error). On the other hand, a majority of liberals have little grasp of what conservatives think, and a supermajority of liberals are positively clueless about why conservatives think what they think.

The combined inability and unwillingness to listen to other people -- simply because of the way knees jerk when other people appear on Facebook or on TV screens -- is a very real poison in the bloodstream of our society. It closes minds and jeopardizes civility. In so doing, it puts human potential at risk and threatens to abet not social justice, but social injustice.

Know this about the situation in Oregon: It justifies civil disobedience, even if you don't care for the specific means of civil disobedience that the protesters have chosen. The things that outrage the protesters should outrage every American, liberal and conservative alike, even if the protesters themselves aren't the kind of people you would have over for dinner.

In a society based on self-government, it's a dangerous thing for mass numbers of citizens to plug their ears to a message simply because their eyes don't care for the messenger.


Much thanks to David French for writing at length about this case.


Tuesday, January 5, 2016

Gone Too Soon



Around lunchtime on New Year's Day, I hit the "Publish" button on my post that looked back at influential people who died in 2015. In it, I noted that Wayne Rogers had been claimed by pneumonia on the year's final day.

Little did I know that news was about to break that Natalie Cole had also passed away on the year's final day.

Although she deserves her own post and own recognition, it is impossible to mention her without mentioning her father -- and the good news is that I think she would like it that way.

They both died before their time, he at 45 and she at 65, yet they left such deep footprints in American culture that they will never be forgotten. Both of them had perfect timing and overflowing talent, both of them impacted several genres of music -- and most importantly, they both transcended race without sacrificing one bit of their true selves.

*     *     *     *     *

Nathaniel Adams Cole was born in Alabama one year after the end of World War I and started becoming nationally known during the World War II era. If you've noticed that some writers refer to him as Nat King Cole and others simply as Nat Cole, you might be interested to learn that the word choice usually reflects the writer's perception not just of Cole, but of himself.

The majority of people think of him as the mild-mannered crooner whose silky baritone wove tunes like "Too Young" and "Walkin' My Baby Back Home" into the American songbook. There is nothing phony about that image, and the people who hold to it always refer to him as Nat King Cole; however, many of them forget (if they ever knew) that he was a superb jazzman who ranked as one of the best swing pianists of the twentieth century.

Conversely, people who pay homage to the jazzy Cole tend not to use his royal sobriquet when they refer to him, presumably because they consider it too "commercial." As far as I'm concerned, they need to dial back their elitism and be reminded that Edward Ellington went by "Duke" and William Basie went by "Count."

Although Cole performed with a big band and came of age in a time when big bands ruled, he put his shoulder against the tide by making the jazz trio his forte. Specifically, he championed and pulled off an unconventional threesome of instruments that consisted only of a piano, guitar, and bass. Cole always played the piano while the latter instruments were usually played by Oscar Moore and Wesley Prince, respectively.

When he began performing more as a crooner and appealing to whiter, wider audiences, he often recorded with a lush string section. Far from being some kind of betrayal of his background, his use of strings was but a faint echo of his diverse tastes and deep schooling. In Cole's own words, his study of the great composers of European classical music ranged "from Johann Sebastian Bach to Sergei Rachmaninoff," and in that regard he was similar to Miles Davis.

While doing all this, Cole had scuffles with bigotry and he effectively faced it down, which makes it odd that he is seldom mentioned when people talk about the Civil Rights Movement... When he purchased a house in LA's prestigious Hancock Park neighborhood, the property owners' association approached him and said it did not want any "undesirables" moving in. He responded by saying "if I see anybody undesirable coming in here, I'll be the first to complain"... While performing a concert in Birmingham, AL, less than 100 miles from where he was born, three members of a KKK faction rushed the stage in an attempt to kidnap him. He responded by never again performing a concert anywhere in the entire South.

Nat King Cole died of lung cancer in February 1965, nine days after his daughter Natalie's fifteenth birthday. She would later tell the Wall Street Journal that his death "crushed me. Dad had been everything to me."

*     *     *     *     *

Natalie Cole's musical rearing was, like her father's, a rearing steeped in diversity.

She soaked up her dad's stylings, learned from her mother (who had sung with the Duke Ellington Orchestra), and was raised personally knowing Louis Armstrong, Peggy Lee, Lena Horne, and many other luminaries. When Natalie reached adulthood and sought a record deal, pretty much every label was happy to listen to her self-recorded works -- given her name, they surely had dollar signs swimming in their eyes -- but when their listening was done, every label except one turned her down after realizing that she was also strongly influenced by the likes of addled rocker Janis Joplin and aggressive soulster Aretha Franklin.

The only label willing to sign her was Capitol Records, which probably shouldn't be a surprise. Nat King Cole had recorded for it and earned it a fortune, and its higher-ups knew enough about the relationship between cart and horse that they still referred to Capitol's distinctive Hollywood office tower as "The House That Nat Built." So, it made sense that they refused to turn down his talented daughter even when he was no longer around to advocate for her.

Once under contract with Capitol, Natalie Cole hit the ground running. Her debut album, Inseparable, was released in 1975; went to #1 on Billboard's soul chart; generated a pair of hit songs; and resulted in her winning Grammys for Best New Artist and Best Female R&B Vocal Performance (for the single "This Will Be").

Recording and performing at a dizzying pace, she released three more albums in a span of 19 months between April 1976 and November 1977, followed by a live album in 1978 and new studio albums in 1979, 1980, and 1981. Throughout that period she collected five more Grammy nominations and another Grammy win, and was awarded a star on Hollywood's Walk of Fame one day shy of her 29th birthday.

But burning the candle at both ends while being besieged by personal demons often takes a big toll, and Natalie's struggles with drug addiction -- which had been foreshadowed by her 1975 arrest in Toronto for heroin possession -- spun out of control in the early 1980's. Hooked not "only" on heroin but also on coke and booze, she largely shrank from the public eye and checked herself into a rehab facility in 1983, remaining there for six months.

Her first album after overcoming her addiction, 1985's Dangerous, had low sales and none of its singles received much play on the radio. Then, her trajectory started to rise again with 1987's Everlasting, which was her first album with Capitol Records since 1981; it spawned a top ten single with her infectiously dancy cover of Bruce Springsteen's "Pink Cadillac."

In 1991 Natalie released the album Unforgettable...With Love, in which she lent her smooth crystalline voice to standards previously recorded by her father. Its pinnacle was the creative rendering of Nat's signature tune, "Unforgettable," with verses of his voice from his 1951 recording dubbed around her contemporary singing, thus creating a touching duet with the deceased.

Unforgettable...With Love went platinum seven times over, won six Grammys, and cemented Natalie's return to the music scene as a force to be reckoned with. It also increased young people's knowledge of Nat, since many of them previously knew him only as a dead guy who had sung about chestnuts roasting on an open fire.

In the years following it, Natalie released ten more albums, won three more Grammys, was nominated for yet another four Grammys, and toured extensively. Proof of her versatility can be found in the fact that she won vocal Grammys in three different categories: R&B, jazz, and pop.

*     *     *     *     *

Natalie Cole's death last week caught me by surprise.

I knew she had been diagnosed with Hepatitis C in 2008. I knew that she suspected she had contracted it decades before when she was on drugs, and that she suspected the infection had just been lying low during the intervening years.

But I also knew that lots of people with Hepatitis C get along well, and I knew she had been quite active since her diagnosis.

What I did not realize is that her kidneys had been faltering for a long time and were in such bad shape that she had been undergoing dialysis while on tour. Those issues did not lack for publicity -- in fact, publicity about them resulted in her getting a transplant when a donor family specifically requested that she receive the kidney of their deceased loved one -- but somehow, inexplicably, those news stories slipped under my radar.

Looking back, I am thinking of March 27, 1999, the day Erika and I got married. At my request, the DJ played "Unforgettable" towards the end of our reception. It was the Nat-only version from 48 years earlier, and the dance floor filled with all generations when it came on. Erika and I danced to it and were in our late twenties. Ten or fifteen feet away, my Great Uncle Tom and Great Aunt Helen danced to it and were in their sixties (though, now that I'm thinking about it, that vivacious cradle-robbing Helen might have already burst into her seventies by then!).

But the bottom line is this: That song is universal and timeless in the way it speaks to the human heart, and were it not for the Coles, father and daughter alike, it might never have become or remained popular.

I did not need the duet version of "Unforgettable" to make me aware of the song, and I'd like to believe that I would have thought to include it in our wedding repertoire even if Natalie hadn't recorded it; but if I'm being honest with myself, I know I probably would not have. And, I know without a doubt that our reception was elevated by virtue of "Unforgettable" being played.

So, Natalie, I thank you and pray that you rest in peace.

And you too, Nat. You and her both uplifted American culture, and without you there would have been no her.


Friday, January 1, 2016

2015: In Memoriams

As the curtain rises on 2016, here is a look back at some of the titans we lost in 2015:

Maureen O'Hara
I am listing Maureen O'Hara first as a kind of salute to my late grandfather, who was never shy about praising her beauty, but I would have put her on this list even if he never mentioned her name (and please don't say anything to my grandmother about me calling this a "salute").

I first saw O'Hara when I watched a VHS of The Parent Trap, that 1961 Disney classic in which she starred as the divorced mother of twins. O'Hara was 41 when it was released, which is to say that she was older than most leading ladies of the time, yet she owned the screen with her skill -- and yes, her drop-dead looks were more bedazzling than those of actresses 10 years her junior.

Born in Ireland in 1920, she lived an unusual childhood by being both an artistic performer and a tomboy. By the time O'Hara was in her teens she had ridden horses, trained in judo, played Gaelic football, implored her father to found a women's Gaelic football team -- and had won a national acting award for her portrayal of Portia in The Merchant of Venice.

She made her screen debut in 1938, and her first major movie role came a year later when she was cast as the female lead in Alfred Hitchcock's Jamaica Inn. This led to RKO casting her in The Hunchback of Notre Dame, which in turn prompted her to move to California, and as they say, the rest is history.

Maureen O'Hara went on to become a Hollywood legend, playing lead roles alongside Charles Laughton, John Wayne, Tyrone Power, Douglas Fairbanks, Errol Flynn, Henry Fonda, and Brian Keith... Miracle on 34th Street, The Quiet Man, Rio Grande, The Black Swan, This Land is Mine, and The Long Gray Line represent a mere sampling of the movies in which she shined... She had fiery red hair and mischievous eyes, and was a star during the precise period in history when color movies became common. This combination made her sex symbol status inevitable and resulted in her being nicknamed "the Queen of Technicolor."

But Maureen O'Hara was too talented and too active for her relevance to be limited to one window of time. In 1999, at the age of 78, she served as Grand Marshal of New York City's St. Patrick's Day parade. Six years after that she was named Irish-American of the Year... In December 2010, at the age of 90, she established a center in Glengarriff, Ireland, that is dedicated to training people to become actors and actresses... In 2013 she attended the groundbreaking of the John Wayne Birthplace Museum, and in 2014 she attended the TCM Film Festival.

On October 24th, O'Hara passed away in her sleep at her home in Idaho, 95 years old, still in possession of her mental capacities, and looking decades younger than her actual age. Instead of having her remains buried in a Hollywood cemetery surrounded by the graves of celluloid celebrities, she had them laid to rest in a place where they are surrounded by the graves of heroes: Arlington National Cemetery, next to the remains of her husband, U.S. Air Force Brigadier General Charles Blair.


Dean Smith
When he was 27 years old, Dean Smith accepted a position as an assistant basketball coach at the University of North Carolina. Three years later he became the head coach and remained in that role until he retired 36 years later. Astonishingly, his teams won more than 77 percent of their games across that 36-year span.

Smith retired having won more games than any coach in NCAA history. His legacy includes two national championships, thirteen conference championships, eleven Final Four appearances, and one run of 23 straight appearances in the NCAA Tournament. It also includes a gold medal from when he served as head coach of the U.S. Olympic Team in 1976... He is credited with creating a number of defensive sets, including the point zone and the run-and-jump, and also with being the first person to have his defense double-team the screen-and-roll... Dozens of the players he taught made it to the pros and many achieved stardom, including one particular chap named Michael Jordan.

But those things are only one part of his legacy. Taking a cue from his father Alfred (who had integrated the basketball team of Kansas's Emporia High School when he coached it in 1934), Assistant Coach Dean Smith confronted Jim Crow in 1961 by dining with a black theology student at Chapel Hill's segregated Pines Restaurant. In 1965, Head Coach Dean Smith helped a black grad student purchase a home in an all-white neighborhood. And in 1966 he inked Charlie Scott to a basketball scholarship, making him the first black athlete in any sport to receive a scholarship to UNC.

His first national championship came in 1982, when UNC edged Georgetown by a score of 63-62. In the title game's waning seconds, when Georgetown still had a chance to win, Georgetown guard Fred Brown threw an errant pass to UNC's James Worthy and Worthy dribbled away with the ball as time expired. In the words of opposing coach John Thompson, rather than celebrate, "Dean Smith's first reaction was to come down and console me. I hope I would have been classy enough to have done that."

Naturally, an overwhelming majority of college basketball players have no chance of ever making it to the NBA, so it goes without saying that the vast majority of players who fell under Smith's tutelage spent their post-college years in regular jobs or playing in secondary basketball leagues. It says much about his character that they were well equipped for the real world (more than 96 percent of them graduated) and that he kept in touch with them as the years went by.

Former player Derrick Phelps had this to say when Smith retired: "I didn't become a star in the NBA and he still calls me all the time. When he does, my teammates in the CBA or Europe can't believe it. They're always like, 'I wish my college coach still cared about me'" -- which makes it unsurprising that when Smith died in February, his will called for every player he ever coached at UNC to be sent a $200 check along with a note reading: "Enjoy a dinner out compliments of Coach Dean Smith."

He embodied everything that college athletics -- and in fact, Americanism itself -- is supposed to be.


Meadowlark Lemon
George Lemon III was born in Wilmington, NC, in 1932. When he was 11 years old he saw a newsreel of the Harlem Globetrotters and immediately wanted to play for them when he grew up. The first time he tried his hand at basketball, he did so by bending a coat hanger to serve as the rim, hanging an onion sack from it as the net, and shooting with a Carnation milk can instead of a ball.

While stationed in Austria during the two years he served in the Army, Lemon got a chance to play with the Globetrotters during their European tour and impressed them enough to earn a tryout after his discharge. He officially became a 'Trotter in 1954; adopted his now-famous nickname; and became renowned for sinking hook shots from half-court, completing no-look passes in the blink of an eye, and creating many of the gags the team continues to perform even today. Wilt Chamberlain declared that "Meadowlark was the most sensational, awesome, incredible basketball player I've ever seen."

He could have played in the NBA but chose to be a 'Trotter instead, largely because he felt his life's calling was to entertain people; but also, the mid-century was a different era in the game of hoops because the NBA was in its infancy and not much on the nation's radar. Lemon remained with the 'Trotters for an incredible 24 years, until 1978. His career stretched from the days when they were more of a competitive team -- playing serious foes not called the Washington Generals and saving their antics until the outcome was not in doubt -- to the days when they played the hired stooges called the Washington Generals and every game was antics from beginning to end.

A born-again Christian, he became an ordained minister in 1986 and spent the last three decades involved in outreach. He gave motivational speeches and attended youth basketball camps in an effort to, in the words of ABC's John Marshall, "spread a message of faith through basketball." He visited inmates in juvenile detention because, in his own words, "I feel if I can touch a kid in youth prison, he won't go to the adult prison." Meadowlark Lemon died two days after Christmas and will be sorely missed.


Darryl Dawkins and Moses Malone
2015 was a rough year for losing basketball luminaries. Darryl Dawkins and Moses Malone deserve to have their own entries on this list, but my mind can't help but think of them simultaneously, so I am mentioning them together.

In an age when almost all professional players made it to the pros by playing in the college ranks, Dawkins and Malone broke the mold by jumping straight from high school to the pros... Malone signed with the ABA's Utah Stars in 1974 and Dawkins with the NBA's Philadelphia 76ers in 1975, after which the leagues merged in 1976 and Malone signed with the Houston Rockets... After the end of the 1981-82 season, Dawkins left the 76ers and Malone joined them.

Darryl Dawkins will always be remembered for his emphatic slam dunks, which he executed with such ferocity that he broke backboards on multiple occasions and caused the NBA to start imposing fines and suspensions when one broke. He gave individual names to some of his dunks, including the Yo-Mama, the Spine-Chiller Supreme, and the In-Your-Face Disgrace. He called his first backboard-breaker "The Chocolate-Thunder-Flying, Robinzine-Crying, Teeth-Shaking, Glass-Breaking, Rump-Roasting, Bun-Toasting, Wham-Bam, Glass-Breaker-I-Am Jam." And as if that wasn't enough, he said he spent the offseasons on a planet called Lovetron studying "interplanetary funkmanship" with a girlfriend named Juicy Lucy.

Needless to say, Moses Malone couldn't match Darryl Dawkins when it came to personality, but then again, who could? He was content to dominate the key and establish himself as one of the best centers of his era -- which is saying something when you consider that he played at the same time as Kareem Abdul-Jabbar, Robert Parish, Artis Gilmore, (H)akeem Olajuwon, Bill Walton, and Dave Cowens. Unlike Dawkins, Malone won an NBA championship; and of course, he did so with the 76ers in the first season he played for them, which was also the first season Dawkins did not.

They died 17 days apart, both before their time: Dawkins on August 27th of a heart attack at the age of 58, and Malone on September 13th of heart disease, at the age of 60.


Robert Conquest
Robert Conquest is nowhere near as well-known as the people mentioned above. However, he did invaluable work to protect the pro-freedom, pro-individual, anti-totalitarian society which allowed them to thrive on their own terms and merits.

This dogged historian and chronicler was born in the United Kingdom 98 years ago, back when its Prime Minister was David Lloyd George and our president was Woodrow Wilson. His 1968 book The Great Terror was the first comprehensively researched account of the Great Purge of the 1930's, during which the USSR's communist government oppressed any citizens it perceived as opponents -- from peasants to generals alike -- by accusing them of subversion, ramming them through show trials in front of kangaroo courts, and imprisoning them according to whim.

Another of his books, The Harvest of Sorrow, exposed how the USSR destroyed human rights and created a man-made famine with its confiscation of farms, particularly in Ukraine.

Robert Conquest exposed the wickedness of what Ronald Reagan would later call the Evil Empire. He met with world leaders to insist that free enterprise be defended and Soviet Communism be confronted and beaten, rather than accommodated and enabled. Based on his observations of humanity through time, he expressed what has come to be known as Conquest's Law: The more one knows about a subject, the more conservative one becomes about that subject.

When Conquest passed away last summer, living in the shadow of Stanford University in California, John O'Sullivan described him as "the single most important historian of the Soviet Union and its crimes while also being eminent in other fields, notably literature and criticism, and not least an influential adviser to Margaret Thatcher and Ronald Reagan at a key turning point in the Cold War." Had he not been around, it is not at all clear whether the West's leaders would have taken the Soviet threat as seriously as it needed to be taken.


B.B. King
Riley B. King, aka B.B. King, was active as a professional musician for 67 years, right up until he died on May 14th. His innovative style of string-bending and vibrato (performed on the Gibson guitars he invariably referred to as Lucille) made him a force, and it has influenced countless guitarists across multiple generations and multiple genres of music. He won 15 Grammy Awards and Rolling Stone ranks him as the sixth best guitar player to have ever lived.

B.B. King absolutely earned the nickname "King of the Blues," and when a smattering of self-professed purists complained that not all of his songs expressed emotions downtrodden enough to be called blues, he countered by saying he was a well-rounded person and therefore "I don't have the blues all the time."

In 1987, 16 years after his first Grammy, King's versatility was rewarded when he was inducted into the Rock and Roll Hall of Fame. Twenty-seven years after that, he was inducted into the Rhythm & Blues Hall of Fame, and in the interim he received the 2004 Polar Music Prize, which is given to artists "in recognition of exceptional achievements in the creation and advancement of music."

King's signature song is called "The Thrill Is Gone" -- but we will always be able to catch a thrill as long as there are recordings of him soloing on Lucille and belting out lyrics in his perfectly gruff voice.


Yogi Berra
Born to immigrants in a heavily Italian section of St. Louis, Lorenzo Pietro Berra was arguably the best catcher in baseball history and definitely the most famous. When Berra was a young man playing in the amateur American Legion leagues, Jack Maguire observed that he resembled a Hindu yogi because he often sat with his arms and legs crossed -- thus his timeless nickname was born.

He served in the U.S. Navy during World War II and fought on D-Day, both at Omaha Beach and Utah Beach, receiving several commendations for bravery. A year after the war ended, he played his first Major League game with the New York Yankees on September 22, 1946. The following season he played in 83 games and made his first appearance in the World Series, which the Yankees won by beating the Brooklyn Dodgers in seven games. Berra went on to play 18 seasons with the Yankees (including his brief stint in 1946), winning 10 World Series rings during that time; and after a year away from the game he came out of retirement to play the 1965 season with the Mets.

When all was said and done, Yogi Berra was a three-time league MVP and 18-time All-Star. The first catcher ever to leave one finger outside of his glove while playing, he ranks as one of only four catchers ever to finish a season with a 1.000 fielding percentage. He caught more shutouts than anyone else (173) and retired with the American League record for putouts (8,723). And in addition to all those World Series rings he earned as a player, Berra won three more as a coach -- for the Mets in 1969 and Yankees in 1977 and 1978.

Today he is best remembered for his verbal quips -- such as "a nickel ain't worth a dime anymore" and "baseball is 90% mental and the other half is physical" -- but his prowess on the diamond should never be forgotten.


Wes Craven
Were it not for Wes Craven, horror films as we know them would not exist. Born into a strict Baptist family in Ohio near the end of the Great Depression, he earned undergraduate degrees in English and psychology followed by a master's in philosophy and writing from Johns Hopkins. That sounds like the perfect background to enable someone to conjure horror stories -- and it is -- but Wes Craven's foray into silver screen macabre did not come until years later.

Between his college years and Hollywood years, Craven was a public school teacher in New York and got his first job in the film industry when he worked as a sound editor for a production company owned by Harry Chapin, the self-proclaimed "third-rate rock star" who is best known for the song "Cat's in the Cradle." Between his time with Chapin and his emergence as a big-name Hollywood figure, Craven opted to chase easy bucks by working on pornographic films under pseudonyms, reportedly including uncredited work on the X-rated classic Deep Throat.

His turn toward horror began with his first feature (read: non-porno) film The Last House on the Left in 1972, and continued during that decade with such films as The Hills Have Eyes and the made-for-TV Stranger in Our House. Then, in 1984, he hit the really big time with A Nightmare on Elm Street, which he both wrote and directed -- and which should be remembered not only for introducing the character of Freddy Krueger, but for being the very first movie in which Johnny Depp appeared. As the subsequent years unfolded, Craven continued to drive the horror genre by churning out such works as The Serpent and the Rainbow, The People Under the Stairs, and Scream.


Leonard Nimoy
It's tempting to say that were it not for Spock, there would be no Leonard Nimoy. But perhaps we should look at it through the opposite lens and say that were it not for Leonard Nimoy, there would be no Spock. No other actor could have pulled the role off in the way that Nimoy did. Without that probing voice, steady demeanor, and crazily upturned eyebrow, Star Trek would not have had the character we recognize today.

But Leonard Nimoy was not limited to the role of Spock. He narrated the 1970's documentary series In Search Of..., and as good as that series was, it's hard to imagine it getting off the ground without his voice leading the way. Nimoy was also an excellent photographer, having had his work exhibited at the R. Michelson Galleries and Massachusetts Museum of Contemporary Art. This decade, he returned to the small screen by playing the pivotal, mysterious character of Dr. William Bell in the TV series Fringe.

Nimoy lived long and prospered, and slipped Earth's surly bonds in February at the age of 83.


Ken Stabler
It takes a lot for an Auburn grad to honor a Bama football star, but here I go.

Ken "The Snake" Stabler was born in Foley, AL on Christmas Day, in the same year that World War II ended. After quarterbacking Foley High School to a combined record of 29-1 throughout his high school career, he earned a scholarship to the University of Alabama and became its starting QB in his junior year. Stabler guided the Crimson Tide to an undefeated season in 1966, and while the 1967 season had three losses, it ended with him sprinting 53 yards to defeat Auburn (damn you, you sonofabitch!) in a play that became known as "The Run in the Mud."

After college he was drafted by the Oakland Raiders and went on to become one of the most decorated football players of the 1970's, quarterbacking them to victory in Super Bowl XI and appearing in four Pro Bowls. He twice led the league in touchdown passes, was the league MVP in 1974, and was named to the all-decade team.

Stabler's hard partying ways were legendary enough to personify the rebelliousness of the 1970's Raiders. Describing training camp in his autobiography, he wrote that its "monotony was...so oppressive that without the diversions of whiskey and women, those of us who were wired for activity and no more than six hours sleep a night might have gone berserk."

The Snake died in July, felled by colon cancer at the age of 69.


I know I'm leaving plenty of deserving people off of this list -- from Stuart Scott succumbing to cancer four days into the year, to Wayne Rogers being claimed by pneumonia on its final day, and in between, Ornette Coleman packing up his sax and taking it to Heaven -- but time and space are limited, and I am off to enjoy the day with my family.

Happy New Year's, everyone.