Bogus News Reports
Original post made
by Jag Singh
on Nov 30, 2006
The Center for Media and Democracy has exposed another major threat to our democracy: a damning report of 46 stations in 22 states airing bogus clips disguised as news reports. These Video News Releases, or VNRs, are carefully crafted to fool the public into believing that the prepackaged views of 'experts' beamed into our living rooms are delivered in real time. Although the number of stations is down from a high of 77 last April, this practice shows no signs of letting up.
The clips appear on the 6 o'clock news to promote commercial products or political messages. Large media conglomerates, such as Disney, News Corp. and the Tribune, have been identified, and the clips are funded by the country's largest corporations, such as Allstate Insurance, General Motors and GlaxoSmithKline. Public relations firms, the hired guns of this insidious practice, have sunk to new lows in promoting these corporate agendas. Sadly, corporate profits always seem to take precedence over the greater good. For example, the PR firm Medialink Worldwide contracted a VNR production with "TCS Daily Science Roundtable" to challenge the 'uncomfortable truth' of global warming.
It is outrageous that these attempts to influence public opinion by "fake reporters" are often paid for with our tax dollars. I ask concerned readers to contact lawmakers and demand that the Federal Communications Commission call an immediate halt to this gross abuse of the public airwaves.
Posted by Critical Consumer,
a resident of Midtown
on Dec 1, 2006 at 12:24 am
I bet a lot of these things get on the air because people at the stations aren't smart enough to know that they are fluff pieces from PR firms.
If you watch the news and they report on a topic you know a lot about, you can usually catch plenty of mistakes, or hear things that are not really "news" at all but these positioned PR pieces.
It's a good point, but I doubt legislating it will help - just don't believe everything you hear on TV!
Posted by Draw the Line,
a resident of Stanford
on Dec 3, 2006 at 1:28 pm
Ignorance is the real threat to Democracy. Always read information yourself; don't trust anyone to filter it for you.
Often you learn more from what ISN'T reported than from what is. These omissions reveal the bias more than anything. For example, have you noticed that we have only gotten reports that the Army missed its recruitment goal a couple of times in the last few years? Ask yourself WHY we haven't gotten the news all the other months that the military is turning away recruits, that the military has a higher average level of education than the general US population, and that it has the highest re-enlistment rate amongst troops who have been to Iraq. Could it be because the mainstream press doesn't want the public to wonder what one million military VOLUNTEERS know that the rest of us don't?
Or, what about the news that was released locally (for those who don't read the Wall Street Journal) that the economy is the best it has been, with the lowest unemployment in 30 years? But, amazingly enough, this was released AFTER the elections. Why was that news missing for the last two years? It didn't suddenly get great one week post-election.
So, Mr. Singh is correct. Be careful what you believe without sources. Study reliable sources. Ignorance is the real threat to democracy.
Posted by anonymous,
a resident of Barron Park
on Dec 23, 2006 at 9:01 am
Jag, as usual you have fallen for the propaganda of the far left -- the group that did this report is as hard left as you can get. But their real failing in this case is that they simply got their facts wrong. See Web Link
If you want to slam TV news, slam it for being superficial. Slam it for failing to cover elections in any meaningful way. Slam it for emphasizing sex over substance. But these guys at this Democracy center simply missed the mark.
Posted by Draw the Line,
a resident of Stanford
on Dec 30, 2006 at 4:22 pm
I keep thinking this absurd number of 655,000 is going to disappear in the shame of the lie, but it hasn't, so I am going to come back to it.
Put two and two together. Supposedly in November there was the "worst day" for the number of Iraqis who died since we went into Iraq. The number was about 100 dead, which is indeed awful. But if you take that "worst day" number and think for just a moment about what the total would be if 100 people were killed every single day for 3 1/2 years, how many people do you come up with?
Nowhere near the 655,000 that is still floating around from the horrifically incompetent, shameful "research" from... who was it, Duke? Correct me, please, because I dismissed it from my memory.
You come up with 3,000 per month, for a total of about 130,000 dead. So just thinking for a moment, which is all we need to do when we read these "stories" to sniff out what is wrong, tells anyone that the 655,000 number is made up out of thin air. Then all you have to do is Google the story and follow the rebuttals by the peers who debunked the "research". It was as if someone sampled Mission Street in San Francisco and extrapolated from a few stories to determine the views and the number of murders in the entire United States.
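The arithmetic in the post can be checked directly. This is only a back-of-the-envelope sketch using the figures quoted above (100 deaths per day, 3 1/2 years), not independent data:

```python
# Back-of-the-envelope check of the figures quoted in the post above.
deaths_per_day = 100           # the reported "worst day" toll
days = int(3.5 * 365)          # roughly 3.5 years of war

per_month = deaths_per_day * 30
total = deaths_per_day * days

print(per_month)   # 3,000 per month, as the post says
print(total)       # about 128,000 -- in the ballpark of the 130,000 cited
```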
I mean, really, for the first time in history there is a totally free press in Iraq... does anyone really think that somehow all these reporters MISSED all these dead people? Especially in a journalistic world determined to undermine the effort to bring democracy to Iraq?
CNN and every other "news" agency that was housed in Iraq BEFORE we went in managed to overlook the hundreds of thousands of REAL dead killed by Saddam (thank God he is gone), but that won't happen anymore. Democracy, with its freedom of speech, is well rooted in Iraq now. Fascism is dying.
Posted by Draw the Line,
a resident of Stanford
on Dec 31, 2006 at 2:20 pm
655,000 War Dead?
By STEVEN E. MOORE
October 18, 2006; Page A20 (Wall Street Journal)
After doing survey research in Iraq for nearly two years, I was surprised to read that a study by a group from Johns Hopkins University claims that 655,000 Iraqis have died as a result of the war. Don't get me wrong, there have been far too many deaths in Iraq by anyone's measure; some of them have been friends of mine. But the Johns Hopkins tally is wildly at odds with any numbers I have seen in that country. Survey results frequently have a margin of error of plus or minus 3% or 5% -- not 1200%.
The group -- associated with the Johns Hopkins Bloomberg School of Public Health -- employed cluster sampling for in-person interviews, which is the methodology that I and most researchers use in developing countries. Here, in the U.S., opinion surveys often use telephone polls, selecting individuals at random. But for a country lacking in telephone penetration, door-to-door interviews are required: Neighborhoods are selected at random, and then individuals are selected at random in "clusters" within each neighborhood for door-to-door interviews. Without cluster sampling, the expense and time associated with travel would make in-person interviewing virtually impossible.
However, the key to the validity of cluster sampling is to use enough cluster points. In their 2006 report, "Mortality after the 2003 invasion of Iraq: a cross-sectional sample survey," the Johns Hopkins team says it used 47 cluster points for their sample of 1,849 interviews. This is astonishing: I wouldn't survey a junior high school, much less an entire country, using only 47 cluster points.
Neither would anyone else. For its 2004 survey of Iraq, the United Nations Development Program (UNDP) used 2,200 cluster points of roughly 10 interviews each for a total sample of 21,688. True, interviews are expensive and not everyone has the U.N.'s bank account. However, even for a similarly sized sample, 47 is an extraordinarily small number of cluster points. A 2005 survey conducted by ABC News, Time magazine, the BBC, NHK and Der Spiegel used 135 cluster points with a sample size of 1,711 -- almost three times as many cluster points as the Johns Hopkins team for 93% of the sample size.
What happens when you don't use enough cluster points in a survey? You get crazy results when compared to a known quantity, or to a survey with more cluster points. There was a perfect example of this two years ago. The UNDP's survey, in April and May 2004, estimated between 18,000 and 29,000 Iraqi civilian deaths due to the war. It was conducted four months before the Johns Hopkins team's earlier study, which used 33 cluster points and estimated between 69,000 and 155,000 civilian deaths -- four to five times as high as the UNDP figures, even though the UNDP survey used 66 times as many cluster points.
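Moore's point about cluster points can be illustrated with a small simulation. This is an illustrative sketch with invented parameters (a true rate of 10% and made-up neighborhood variation), not a reconstruction of any of the actual surveys: when responses are correlated within neighborhoods, spreading the same number of interviews over fewer clusters widens the spread of the resulting estimates.

```python
import random

def run_survey(num_clusters, interviews_per_cluster, rng):
    """Estimate a population rate by sampling clusters whose local
    rates vary around a true national rate of 10% (invented numbers)."""
    hits = total = 0
    for _ in range(num_clusters):
        # Each cluster has its own local rate: within-cluster correlation.
        local_rate = min(max(rng.gauss(0.10, 0.08), 0.0), 1.0)
        for _ in range(interviews_per_cluster):
            hits += rng.random() < local_rate
            total += 1
    return hits / total

rng = random.Random(0)
# Same total sample (~1,880 interviews), different numbers of clusters.
few  = [run_survey(47,  40, rng) for _ in range(200)]   # 47 clusters
many = [run_survey(188, 10, rng) for _ in range(200)]   # 188 clusters

def spread(xs):
    """Standard deviation of repeated survey estimates."""
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

print(spread(few), spread(many))  # fewer clusters -> larger spread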
The 2004 survey by the Johns Hopkins group was itself methodologically suspect -- and the one they just published even more so.
Curious about the kind of people who would have the chutzpah to claim to a national audience that this kind of research was methodologically sound, I contacted Johns Hopkins University and was referred to Les Roberts, one of the primary authors of the study. Dr. Roberts defended his 47 cluster points, saying that this was standard. I'm not sure whose standards these are.
Appendix A of the Johns Hopkins survey, for example, cites several other studies of mortality in war zones, and uses the citations to validate the group's use of cluster sampling. One study is by the International Rescue Committee in the Democratic Republic of Congo, which used 750 cluster points. Harvard's School of Public Health, in a 1992 survey of Iraq, used 271 cluster points. Another study in Kosovo cites the use of 50 cluster points, but this was for a population of just 1.6 million, compared to Iraq's 27 million.
When I pointed out these numbers to Dr. Roberts, he said that the appendices were written by a student and should be ignored. Which led me to wonder what other sections of the survey should be ignored.
With so few cluster points, it is highly unlikely the Johns Hopkins survey is representative of the population in Iraq. However, there is a definitive method of establishing if it is. Recording the gender, age, education and other demographic characteristics of the respondents allows a researcher to compare his survey results to a known demographic instrument, such as a census.
Dr. Roberts said that his team's surveyors did not ask demographic questions. I was so surprised to hear this that I emailed him later in the day to ask a second time if his team asked demographic questions and compared the results to the 1997 Iraqi census. Dr. Roberts replied that he had not even looked at the Iraqi census.
And so, while the gender and the age of the deceased were recorded in the 2006 Johns Hopkins study, nobody, according to Dr. Roberts, recorded demographic information for the living survey respondents. This would be the first survey I have looked at in my 15 years of looking that did not ask demographic questions of its respondents. But don't take my word for it -- try using Google to find a survey that does not ask demographic questions.
Without demographic information to assure a representative sample, there is no way anyone can prove -- or disprove -- that the Johns Hopkins estimate of Iraqi civilian deaths is accurate.
Public-policy decisions based on this survey will impact millions of Iraqis and hundreds of thousands of Americans. It's important that voters and policy makers have accurate information. When the question matters this much, it is worth taking the time to get the answer right.
Mr. Moore, a political consultant with Gorton Moore International, trained Iraqi researchers for the International Republican Institute from 2003 to 2004 and conducted survey research for the Coalition Forces from 2005 to 2006.