

Note: This op-ed column appeared in the March 26, 1998 Bangor (Maine) Daily News



Maine’s Student Drug Use Statistics:
Can We Believe Them?

Analysis by Jean Hay

The first job for members of the new legislative committee set up to combat substance abuse is to get a clear idea of the real size of Maine's drug problem.

I would not recommend they use data from the state's last three student drug surveys. A study I did last year while at the University of Maine showed many of those results were either exaggerated or provably unreliable.

Among the problems I uncovered in the 1992, 1995 and 1996 surveys: surveys from students who admitted lying were kept in the sample, students who claimed use of a fake drug were counted, unflattering results were hidden, student claims contradicted official crime statistics, and random-sampling standards were compromised.

Those issues and others uncovered "certainly will be addressed" before a similar drug survey is conducted by a new research team this spring, Lynn Dube, director of Maine's Office of Substance Abuse, said in a recent phone interview after reviewing my report.

Rep. Joe Brooks (D-Winterport), a member of the new committee, said he welcomes a new drug survey "because it will go right to my concerns that the results we were seeing were a bit high. The 'alarming rates' didn't seem to be consistent with what we were monitoring in Eastern Maine." But, he said, "I am not a trained researcher. We had no means of questioning the surveys."

Philip H. Person, Ph.D., of Orland is a trained researcher. Before his retirement, Person worked for 8 years as a Branch Chief and senior statistician with the National Institute on Drug Abuse and for 17 years as an analytical statistician with the National Institute of Mental Health.

After reviewing both the original drug surveys and my findings, Person said my study "raised some astute and legitimate questions about the very basis of the whole set of surveys. The value of truth in these surveys is placed into considerable doubt."

Maine's first statewide survey was done in 1988 by the late Dr. Barrie E. Blunt. Dr. Robert Q. Dana, director of the University of Maine Substance Abuse Services, was the lead researcher in the last three surveys. In all four statewide surveys, a sampling of 6th to 12th grade students was asked to anonymously self-report many kinds of behavior, including the personal use of various drugs.

Blunt was very concerned about accuracy, so he included a fake drug, as well as questions asking the students how honestly they had answered the survey.

"All students who indicated any lack of honesty were removed from the sample," Blunt wrote. He threw out about 9 percent of his surveys.

Dana parted company with his predecessor in 1992 and later years. First, he kept all the surveys of students who admitted lying, despite admitted overall lying rates of 10-15 percent (with junior-high boys topping out at 24 percent). Then, when computer checks showed a high correlation between students who said they used the fake drug and those who said they had used real, hard drugs, Dana made a point of counting those surveys too. He explained in his summary that "a heavy drug user might not remember all of the names of particular drugs that have taken (sic)."

What studies supported that contention?

"I'm not sure if we looked into that or not," Dana replied during an interview last summer.

When asked what research supported another contention that anonymity produced honest answers, and not more exaggeration, Dana became angry. "The idea of justifying survey research is not something we have to do today," he said.

Person was startled by Dana's dismissal of the need to justify his research. He also had a different perspective than Dana on the issue of lying.

"Expecting drug-abusing students to answer truthfully questions about 'their own illegal or undesirable behaviors' strains one's credibility," Person said.

It also does not help when researchers hide controversial data.

The 1992 survey results had 10-14 percent of 11th and 12th graders admitting to regularly driving while drunk or stoned on marijuana. Dana that year called that rate "an epidemic."

What he did not say back then was that the raw statistics for 6th to 10th graders were nearly twice as high -- 17-23 percent. But Dana did not just forget to mention those figures -- he blacked them out in his summary charts behind lines of NA's (not applicable). "We were trying to get an accurate representation of the LICENSED population," he insisted in our August 1997 interview.

But weren't those lower-grade drunk-driving figures astonishing? "There is too much information to include," Dana replied. "One makes decisions. I didn't consider it to be important."

Another way to check for accuracy is to compare the students' answers with hard statistics such as crime reports.

In response to one question, the students surveyed said they stole or attempted to steal between 2,300 and 3,581 motor vehicles in 1995 and between 1,219 and 1,441 in 1996. Extrapolating to 100 percent, those answers imply that between 20,300 and 52,660 attempts or actual thefts of motor vehicles were made in Maine by 6th-12th graders in those years.

Yet official figures from the state's Uniform Crime Reporting Office in Augusta show only 1,756 motor vehicles -- including cars, trucks, buses, snowmobiles and motorcycles -- were reported stolen in Maine in 1994. The figures were 1,720 in 1995, and 1,768 in 1996.
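The arithmetic behind that comparison can be sketched as follows. The sampling fraction below is a hypothetical figure -- the column does not report the surveys' actual response rates -- but the structure of the extrapolation is the same:

```python
# Sketch of the extrapolation arithmetic the survey relied on.
# The 10 percent sampling fraction is hypothetical; the theft count
# and the official 1995 figure are taken from the column.

def extrapolate(sample_claims: float, sampling_fraction: float) -> float:
    """Scale a count reported by sampled students to a statewide estimate."""
    return sample_claims / sampling_fraction

# Suppose sampled students claimed ~2,300 thefts and the sample covered
# ~10 percent of Maine's 6th-12th graders (hypothetical fraction):
implied_statewide = extrapolate(2_300, 0.10)   # 23,000 implied thefts

# Official Uniform Crime Reporting figure for 1995:
reported_stolen_1995 = 1_720

ratio = implied_statewide / reported_stolen_1995  # claims exceed reports ~13x
```

Whatever fraction is assumed, the extrapolated student claims dwarf the official theft counts by an order of magnitude -- the gap Person called "a telling finding."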

Many students also exaggerated their arrest records, claiming up to five times more arrests than the actual juvenile arrests recorded in those years. Person called the comparison of the student claims of motor vehicle thievery and state crime statistics "a telling finding."

Random sampling, a scientific standard, also proved a problem for Dana's researchers. In 1992, when school and parental permission slips did not come back as expected, researchers decided to include all willing schools and all students whose parents did not object. Still, eight of Maine's 16 counties surveyed less than half the desired number of students. Sagadahoc County surveyed only 96 students and Somerset County a mere 47.

Later surveys had similar problems. In both 1995 and 1996, targeted numbers were not reached in seven counties. Cumberland County, which includes the greater Portland area, surveyed only 338 students in 1995 and 286 in 1996, out of a student population of about 20,000. In 1996, Waldo County had the low count of 91.

Undeterred, the researchers decided to "adjust" the discrepancies by weighting the responses based on where the students took the test. That meant counting every survey from the Portland area four times, but discounting over-represented Knox County kids by 70 percent.
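A minimal sketch of that weighting scheme, assuming the usual target-over-actual formula: the column gives only the resulting weights (Portland-area surveys counted four times, Knox County discounted 70 percent), so the target and actual counts below are inferred or hypothetical.

```python
# Post-stratification weighting as described in the column: surveys from
# under-sampled counties are counted more heavily, over-sampled ones less.

def county_weight(target_n: int, actual_n: int) -> float:
    """Weight applied to each survey collected in a county."""
    return target_n / actual_n

# Under-sampled Cumberland County (286 surveyed in 1996, per the column;
# a target of 1,144 is inferred from the stated four-times weight):
w_cumberland = county_weight(target_n=1_144, actual_n=286)  # = 4.0

# Over-sampled Knox County (counts hypothetical; the column states only
# the resulting 70 percent discount):
w_knox = county_weight(target_n=300, actual_n=1_000)        # = 0.3
```

Weighting of this kind can rebalance a sample, but only if the surveys within each county were themselves randomly drawn -- the very assumption Person found compromised.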

In a 1997 BDN news story, Dana claimed a scientific margin of error of 1 percent for the 1996 report, which he said provided researchers with "unparalleled confidence" in the study.
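A 1 percent figure of that kind presumably comes from the textbook margin-of-error formula for a simple random sample -- a formula that is only valid when the sampling plan is actually random. The sample size below is illustrative:

```python
import math

# Standard margin of error for a proportion from a simple random sample.
# Valid only under genuine random sampling -- the assumption at issue here.
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95 percent margin of error for proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# With roughly 10,000 usable surveys (illustrative count) and p = 0.5:
moe = margin_of_error(0.5, 10_000)   # about 0.0098, i.e. ~1 percent
```

The formula yields "1 percent" mechanically from the sample size alone; it says nothing about whether the sample was drawn the way the formula requires.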

Person, after reviewing Dana's reports, did not share that unparalleled confidence.

"These drug abuse surveys appear to have compromised seriously the random sampling plan," he said. "Deviations, compromises, and adjustments invalidate, or make meaningless, the use of sampling errors and confidence limits, both of which depend on these rules'. The compromises, adjustments, and exclusions that were made in executing the plan cast much doubt upon any results."
--------------------------------------------------
Freelance writer Jean Hay of Bangor is a former reporter and bureau chief for the Bangor Daily News.

