In "Question of Ethics: Template Op-Eds Under Fire," Dan Keeney says, "The common public relations practice of providing template materials to grassroots advocates is under intense scrutiny and may fall victim to the scandals that have rocked the New York Times and USA Today..."
My comments below>>>
I very much agreed when he made the point, "Providing template language for op-eds and letters to the editor is a practice used to give voice to those who agree with a particular point of view but lack the time or the skills necessary to articulate their opinions. It serves the positive purpose of extending the reach of important societal issues into hometowns across America. However, given the plagiarism scandals and the increased scrutiny to ensure all published work is original, the practice may very well be unethical."
It will be difficult, if not impossible, to get various NGOs to fall in line and observe proper ethics and procedures.
The reason I say this is that many activist projects operate on "advocacy research": quoting numbers and stats that were never actually verified but that sounded good to someone at the time. I'm a veteran of NGOs, with nearly 20 years either volunteering or serving as a paid employee at the admin level for different orgs, and I have often seen how they use their numbers to make themselves look better. I have participated in this kind of "number crunching" myself. How far the truth is stretched varies from agency to agency.
An org that can claim "an epidemic of _____" or "1 in 3 residents are affected by ____" can attract a lot more donations than one that is scrupulous in its truth-telling. Often an org will publicize its number of service units (meals served, counseling sessions provided, etc.) while deliberately wording its statements so that this appears to equal the number of unduplicated individuals receiving its services on an ongoing basis. These are not the same thing; the number of service units will be many times larger in nearly all cases. Some orgs don't even track the actual number of clients they have, since most of the agencies that provide their funding (United Way, city or county gov'ts, etc.) are more concerned with service units and whether they rise or fall.
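To put made-up numbers on it, purely as an illustration: a single client who eats two meals a day at a soup kitchen for a year generates 2 x 365 = 730 service units, but is still only one unduplicated individual. Report the 730 without explanation and it reads like 730 people served.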
A couple of cases in point: when Governor Jane Hull was in office, the newsletter of the Governor's Task Force on Domestic Violence reported over 67,000 clients of various women's shelters over the course of a year, and 23,000 turned away due to lack of facilities. This was not a count of unduplicated individuals; it was the number of times services were used by women and children, sometimes the same women and children multiple times. (A service unit in this case equals one night in one bed.) Yet it was easily misread as saying that 90,000 women in Arizona sought shelter from domestic violence. (I'm going by memory here, but I do have the actual document somewhere if anyone really needs it.) In a state with a total population of only about 4 million, that looks like a serious problem -- an epidemic, if you will.
On most websites for domestic violence services in the US you will see one or both of these statements: "95% of the victims of domestic violence are women." "Every nine seconds, a woman is battered in the United States." Neither statement has any basis in fact; they fall into the category of things that sounded good at the time. Unfortunately, they seem to have taken on lives of their own, and the likelihood is that any administrator will stand by these figures despite having no way of confirming them statistically. What happened in this case is that they are "based" on truthful, verifiable statements, but something was left out of the originals, and then the numbers were "prettied up" in a way that makes sense to the casual reader. In a way, it's a case of a lie being told often enough that people begin to believe it.
The problem arises when a civic-minded reporter, wary of appearing to be anything other than supportive of the local human services agencies (which may even be publicly supported by the media outlet the reporter works for), accepts these kinds of figures without question. Often the person presenting bogus statistics or advocacy research as fact believes the numbers they were given by the main office or another respected source, and based on their own experience has no reason to doubt them. So even if the numbers are questioned, the reporter is given a source, and that's the end of it. The numbers go into the paper or on the air, and everyone goes away feeling they supported their community. I'm using these examples because they're what I have at hand at the moment, but I have witnessed the same thing in other kinds of orgs. Creative accounting is certainly not limited to any one kind of charity or service.
What it all boils down to is this: providing templates for any purpose could very easily propagate misinformation, which I'm sure happens all the time. So it's even more than a simple question of who wrote the letter or article; if a work is to be considered factual, where is the proof of those facts?
What is to be considered acceptable as proof?
Thanks for the explanation.
I have often wondered at the figures I've seen.
Posted by: David St Lawrence | Wednesday, June 02, 2004 at 05:56 AM