Saturday, June 09, 2007
It's interesting, and amusing, to track the permutations of these releases via Google and Google News. (Oddly, I get slightly different hits using the same search terms on Google and Google News. Go figure.) This led me to send e-mail to a couple-three far-flung correspondents regarding the propriety of my now claiming all of these various sources that used my release verbatim or nearly so as places where "my work" has "been published."
Unfortunately, one or two of the far-flungeds seemed to think I was serious, and replied seriously. This leaves me with the dilemma of how now to look like an idiot while at the same time not making them look like idiots. There are those who will tsk-tsk and say it's because e-mail is an inferior communications medium, rife with the possibility of misunderstandings such as this. To which, after much thought, I reply: hooey! The written language has been around for a little while now, and with scant reliance on "emoticons" or "smileys" or abbreviations or any other crap to indicate the placement of tongue-in-cheek.
Anyhow, I as usual digress. The point is that now my CV is crammed to bulging with all of these new places where "my work" has "appeared." No idea how I will keep up with it all.
It then occurred to me that I have been short-changing myself for years now, for I can in complete honesty say that "my writing" has appeared in Newsweek, The Atlantic, Smithsonian, and Macworld...since at one point or another they have all printed letters from me! (Hey, I wrote the letters and they published them--how much clearer could I be?)
Then I got to thinking about "the likes of," an expression I found odious during my magazine-editing days: By "the likes of" do you mean these? Then why not say so? Well, here's why: you can fudge like mad with "the likes of"...and with just a little ingenuity I can pad the old vitae even more by pointing out that I have written for the likes of Esquire--since there was certainly an interval back in the 1980s when the editor and art director of the magazine on which I worked seemed intent on making it look and read an awful lot like Esquire, Jr. ("No, I never said I wrote for Esquire; I said I wrote for the likes of Esquire!")
It also occurs to me that I can claim to have "contributed" to several sources, including the Associated Press--sources that did not use my releases verbatim but which adapted them or otherwise built on them for their own reports. It's a shame I didn't think to send releases to Reuters. Oh, well, there's always next year. As it is, it's going to take me several days just to update my CV.
He said jokingly!!
"[O]ne of the all-time great grammatical shibboleths [is] that when writing a sentence or a clause, you must not ... make a preposition the last word you put in. This notion apparently originated with the poet John Dryden, who in a 1672 work quoted Ben Jonson's line 'The bodies that these souls were frighted from' and commented 'The Preposition at the end of a sentence; a common fault with him, and which I have but lately observ'd in my own writings.' Probably, Dryden based his stand on two foundations. First, prepositions in Latin never appear at the end of a sentence, not surprising since praepositio is Latin for something that 'comes before.' Second, a principle of composition that's as valid in the twenty-first century as it was in the seventeenth holds that, whenever possible, sentences should end strongly--and prepositions, as necessary as they undeniably are, are usually more of a whimper than a bang.
"Whatever its origin, the ban found favor with prescriptivists through the centuries, including Edward Gibbon; John Ruskin, who in an entire book (Seven Lamps) concluded a sentence with a preposition precisely one time; Lily Tomlin's officious Ernestine the telephone operator, who asked, 'Is this the party to whom I am speaking?'; and my mother-in-law, Marge Simeone, who is prone to saying things like 'In which car are we going?'... [B]ut [this rule] was always a bit suspect. It was blown out of the water by [Henry] Fowler, who wrote in A Dictionary of Modern English Usage, 'Those who lay down the universal principle that final prepositions are 'inelegant' are unconsciously trying to deprive the English language of a valuable idiomatic resource, which has been used freely by all our greatest writers except those whose instinct for English idiom has been overpowered by notions of correctness derived from Latin standards.' Fowler then gave twenty-four examples of the 'rule' being broken by such writers as Chaucer, Spenser, Milton, Pepys, Swift, Defoe, Burke, Kipling, and the authors of the King James Bible. ... When the preposition occurs in a phrasal verb, the transposition task can be close to impossible. To 'fix' a phrasal-verb-concluding sentence like 'I'm turning in,' you'd have to come up with something like 'Turning in I am,' which not even Yoda from Star Wars could say with a straight face.
"To anyone still unconvinced, I offer two small anecdotes, in reverse order of familiarity.
"1. Winston Churchill, when corrected for violating this rule, supposedly replied, 'That is the sort of nonsense up with I will not put.'
"2. A guy from South Philadelphia, on vacation in London, asks a bowler-hatted gent, 'Where's the subway at?' The Londoner replies, 'Don't you Yanks realize that it's poor English to end a sentence with a preposition?' To which the South Philly guy says, 'Okay, where's the subway at, asshole?' "
Ben Yagoda, When You Catch an Adjective, Kill It, Broadway Books, 2007, pp. 163-165.
Thursday, June 07, 2007
Anyhow, here's this thought-provoking piece by Mark Buchanan:
The political party that claimed it would restore “honor and dignity to the White House” has done nothing of the sort. Having on false pretenses led us into the disaster of Iraq, the administration and its supporters are now beginning – cravenly and shamefully – to shift blame onto the Iraqi people. The administration continues to hold hundreds of people without charges in secret prisons around the world, while arguing that torture is O.K. and that President Bush can disregard the laws he doesn’t like. I haven’t even mentioned illegal spying or efforts to keep scientists quiet if they’re saying the wrong thing.
Where’s the honor and dignity?
In her testimony last week before a House panel, Monica Goodling, the Justice Department’s liaison to the White House, admitted that she had “crossed the line” in using political considerations to judge potential Justice Department employees. She may well have broken laws that forbid political influence over civil service positions. But “crossing the line” has been business as usual for the past six years. Goodling’s behavior follows a pattern established across almost all federal agencies, where the administration has sought loyalty over competence at every turn.
Another word for it, of course, is corruption – and it’s natural to wonder how we got so deeply mired in it. If the gathering storm of investigations forces Karl Rove and other White House officials one day to testify under oath, we may have some chance of finding out. And I suspect, if we do, that we’ll discover that honor and dignity were sacrificed at the very top. It will be a familiar story – of a few power-hungry and largely amoral political operatives, the real drivers, whose actions encouraged and directed a small army of fairly ordinary people, the Monica Goodlings of this world, many of whom were hardly aware they were doing something wrong.
People who engage in corrupt acts often do not see them as such. This much has emerged from studies of corporate scandals and fraud at places like Enron or WorldCom. In a study two years ago, for example, business professors Vikas Anand, Blake Ashforth and Mahendra Joshi concluded that most fraud within institutions takes place through the willing cooperation of many otherwise upstanding individuals with no psychological predisposition to be criminals.
Whether embezzling money, undermining product safety regulations, or even selling completely fake products, the perpetrators rationalize away their responsibility. They deny that they actually had any choice, saying that “everyone was doing it.” Or they deny that anyone really got hurt, so there really was no crime: “They’re a big company, they can afford to overpay us.”
Then there’s the popular appeal to higher authority, a mechanism with special relevance, perhaps, to the loyalty-rewarding Bush administration: “I had to do it out of loyalty to my boss.”
All of this isn’t so surprising, actually, when you realize that we like to feel good about ourselves and about those with whom we work, and that our brains have immense talent for producing reasons why we should. People engaged in corruption, the academic researchers suggest, create a kind of psychological atmosphere in which what they’re doing seems normal or even honorable. So if congressional oversight does ultimately expose the machinations behind anything from secret prisons to the United States prosecutor purge, brace yourself for a litany of the usual excuses – “We didn’t know it was wrong” and “We were told to do it.”
But the psychology of rationalization is only part of the story. The other element in all such cases seems to be a chain-like linking together of individual actions that can undermine social norms with surprising speed – or keep them safe, sometimes if just a single person remains strong.
In the late 1970s, Stanford sociologist Mark Granovetter pointed out that the differences among people – in their willingness to engage in certain kinds of acts – can lead to surprises. Think of the dance floor at a party. Some people are more than happy to be the first out there, dancing alone, but lots of the rest of us would like some others out there first. You might be willing to go out if five or six went before you, while others might require 20 or 30. Some might not go out unless everyone at the party was out there.
The point is that each of us has a threshold for joining in, which depends on personality, the music being played and so on, but also – and this is the really important part – on how many others have already joined in. As Granovetter argued, this can make a group’s behavior extremely difficult to predict.
Just imagine, for example, that 100 people at the party have thresholds ranging from 0 to 99. In this case, everyone will soon be dancing, you can be sure of it. The natural extrovert with threshold 0 will kick it off, soon to be joined by the person with threshold 1, and the dancing will grow, eventually involving even the reluctant people.
But notice how delicately the outcome depends on the precise interlocking of these thresholds. If the person with threshold 1 goes home, then after the first person starts dancing the rest will simply stand by watching. With no one willing to be the second person onto the floor, there’s no chain reaction. So just one person can have a dramatic effect on the overall group.
This is just a toy model, but it illustrates something about the logic of people joining not only dance floors, but riots or protests, trips to the pub in the evening, getting in with others to skim cash from the restaurant till – or violating well-known rules against taking political affiliation into account when hiring. Tiny differences in the group makeup, the presence or absence of a few people of the right type, might be the difference between a few renegade violators and division-wide corruption.
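The dance-floor arithmetic above is easy to check for oneself. Here is a minimal sketch of Granovetter's threshold model in Python (the function name and setup are illustrative, not from Granovetter's paper): each person joins once the number already participating meets their personal threshold, and we iterate until no one new joins.

```python
def joiners(thresholds):
    """Return how many people end up joining, given each person's
    threshold: the number of others who must already be participating
    before that person will join in."""
    joined = 0
    while True:
        # Everyone whose threshold is met by the current count joins.
        new_total = sum(1 for t in thresholds if t <= joined)
        if new_total == joined:  # no one new joined; cascade is over
            return joined
        joined = new_total

# The party with thresholds 0 through 99: the full cascade unfolds.
print(joiners(list(range(100))))        # everyone dances: 100

# Same party, but the person with threshold 1 has gone home:
# the chain reaction never gets past the lone extrovert.
print(joiners([0] + list(range(2, 100))))   # just one dancer: 1
```

The second call shows the fragility Buchanan describes: removing a single person from the middle of the threshold chain changes the outcome from universal participation to almost none.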
I can’t help thinking of the bizarre attempt by then-White House officials Andrew Card and Alberto Gonzales to get then-Attorney General John Ashcroft, drugged and in the hospital, to sign off on a secret National Security Agency wiretapping program. Ashcroft – who back then I would have thought would rubber-stamp anything Bush wanted – was clearly made of sterner stuff and refused, as did Deputy Attorney General James B. Comey. Again, we won’t know how much effect these refusals had – and just how extreme the program was that Bush wanted to authorize – until someone manages to get past White House stonewalling and digs up the real information.
But the fragility of social outcome, its potential sensitivity to the actions of just one person, brings home the profound importance of individual responsibility. Everyone’s actions count. The laws and institutional traditions we have were put in place precisely to help us avoid these social meltdowns, and to give people the incentive not to step over the line, especially when lots of others are doing so already. In particular, the laws of the civil service prevent hiring on the basis of political affiliation (at least for many positions), and the routine violation of those laws puts our democracy at risk. Many people went along with it, and so might have many more, had the creeping corruption not been exposed when it was.
Restoring honesty and dignity. One might say of it what Gandhi said when asked what he thought of Western Civilization: “I think it would be a good idea.”