
Note: This report was written as an independent study project for the journalism department at the University of Maine, Orono, in the summer of 1997. It was updated, with new budget figures, the complete list of surveyed schools, and a critique by retired statistician Philip Person, in February 1998. Comments from Lynn Duby of the Office of Substance Abuse, and Rep. Joe Brooks, were added to this report on March 26, 1998.

Click Here for Op-Ed Column that ran in March 26, 1998 Bangor Daily News

Student Drug Surveys:
More Noise than Substance?

Analysis by Jean Hay

Contents: Introduction | Background | Fake Drugs | Honest, I Lied | Driving Drunk | Inept Car Thieves | Criminal Mentality | Why Bother? | Options | What Do You Mean? | Sampling | Fudge, Anyone? | Home to Roost | Surveyed Schools | Recommendations | Dana Defends | Independent Critique

"An epidemic."

That’s how Dr. Robert Q. Dana, director of the University of Maine Substance Abuse Services, characterized the 1992 statewide student drug survey results which showed that 10 to 14 percent of 11th and 12th graders had admitted to regularly driving while drunk or stoned on marijuana.

What Dana did not say in his address to his concerned audience in Houlton, Maine that day in 1992 was that the statistics for 6th to 10th graders were nearly twice as high – 17 to 23 percent.

But Dana did more than just fail to mention those remarkable results – he blacked them out in his summary charts behind a series of NA’s. Questioned about it recently, Dana insisted the data blackout was appropriate. "We were trying to get an accurate representations of the LICENSED population," he said in an August 1997 interview.

But weren’t those lower-grade drunk-driving figures astonishing?

"There is too much information to include," Dana replied. "One makes decisions. I didn't consider it to be important."

So, how many students said they were driving drunk or stoned in the 1995 and 1996 surveys?

Not a single one.

Despite the "tremendous concern among drug and alcohol educators regarding substance-impaired driving" which Dana had noted in his 1992 survey summary, no drunk or substance-impaired driving question appears in either the 1995 or 1996 surveys.


Introduction

 Four times in the last decade the state of Maine, with financial help from the federal government, has conducted drug surveys among its 6th-12th grade students.

The results are guaranteed to make headlines.

"Survey of Maine Students presents grim statistics" (Bangor Daily News, July 21, 1994)
"Study finds drug use up among kids" (BDN, Oct. 14, 1995)
"Drug use up among Maine students" (BDN, Feb. 13, 1997)

The statistics are used to justify prevention and containment efforts, from the D.A.R.E. program in schools to the national "war on drugs."

But do these statistics represent reality? How far should we go in trusting students to honestly self-report their own illegal, anti-social, and/or undesirable behaviors, many of which, if traced back to them, could land them in juvenile court?

More importantly, can we trust researchers to tell us what their research showed?

An in-depth review of the four survey reports, plus an interview with the lead researcher for the three most recent polls, has turned up a number of inconsistencies which challenge the veracity of those reports. Among the problems uncovered: startling drunk-driving data for the lower grades blacked out of a summary report; surveys kept in the sample even when students admitted lying or claimed to have used a nonexistent drug; self-reported car thefts and arrests that dwarf official crime statistics; student feedback solicited and then discarded; vague questions with incomplete answer choices; and a compromised random sampling plan whose results were then heavily weighted.


Background

The first survey, Alcohol and Other Drug Use by Maine Students, was done in 1988 by the late Dr. Barrie E. Blunt, using 5th-12th grade students as subjects. Fifth grade was dropped from the 1992 and subsequent surveys.

In 1995 and 1996, at the request of the federal Substance Abuse Prevention and Treatment program, a survey similar in many respects to the one previously used in Maine was given to students in six states – Maine, Kansas, Oregon, South Carolina, Utah, and Washington.

As director of the University of Maine Substance Abuse Services, Dr. Robert Q. Dana was the lead researcher in the 1992, 1995 and 1996 surveys. Staff at the Margaret Chase Smith Center for Public Policy at the University of Maine at Orono helped administer the latest two surveys.

While the 1988 budget total was not available, survey budgets were $39,731 in 1992, $59,022.24 in 1995, and $83,748 in 1996.

But those figures do not tell the whole drug-prevention budget story.

Maine's Office of Substance Abuse reported it spent $3.7 million for primary drug prevention in fiscal year 1997 (July 1, 1996 to June 30, 1997). Of that total, $729,285, or just under 20 percent, came from the state's general fund. The rest was from a federal block grant ($1.1 million) and a Safe and Drug-Free Schools and Communities Act block grant ($1.9 million).

In all four statewide surveys, a sampling of students was asked to anonymously self-report their personal use of various substances, ranging from tobacco, alcohol, and marijuana to LSD, cocaine, steroids, and inhalants.

Fake drugs

Surveys of this sort are useful only if the results reflect reality. For that to happen, students must answer the questions honestly.

"Great care was taken to exclude any respondent who might have been less than truthful in completing the questionnaire," Blunt wrote in explaining why he had included a fake drug, menotropins, among his 1988 survey questions. Menotropins, he wrote in his summary conclusions, "is a non-existent drug and any students responding that they had used this drug were excluded from the sample."

Dana, however, in the 1992 survey, parted company with his predecessor. Cross-checking showed those students who claimed to have used menotropins, the drug that didn't exist, were the same ones who "use the most drugs." Clearly, excluding those surveys as Blunt had done would have significantly lowered the reported user results for many real drugs.

Dana decided that excluding those students who said they used a fake drug was a big mistake.

"After reviewing our factor analyses and tests for reliability we have elected NOT to follow the protocol of the previous study." Dana wrote in his 1992 summary. "While students who indicated using menotropins might be inflating their level of substance abuse for other drugs, it may also be that can that (sic) a heavy drug user might not remember all of the names of particular drugs that have taken (sic). By discarding those students who reported using menotropins we could be discarding the most serious users in the sample."

What studies existed to support his contention that heavy drug users don't in fact remember the names of the drugs they have taken?

"I'm not sure if we looked into that or not," Dana replied during an interview in August 1997 in his office on the University of Maine campus in Orono. But he was emphatic that his decision was the correct one.

"Students may use drugs that have street names, slang names," Dana said. "They may not be excellent historians about what drugs they did use. They may not be able to articulate by name every drug, but they are aware of the amount of drugs they use. I don't expect them to be pharmacologists."

He added, "I feel OK about it."

Tables in the survey results broken down by grade level show reported usage of non-existent menotropins as high as 5.6 percent in some groups. By comparison, the highest reported usage was 3.8 percent for heroin and, with one interesting exception, 6 percent for cocaine or crack and 5.1 percent for steroids.

The interesting exception was the 6th-8th graders in rural Somerset County, an identical 8.3 percent of whom reported using PCP, cocaine, crack, steroids, and hallucinogens including LSD and mescaline at least once in their lives. Somerset County polled only 47 students that year. Statistically, one student out of 12, or two out of 24, would account for an 8.3 percent figure. Oddly, none in that same group said they had ever used marijuana, heroin, caffeine in large doses, inhalants, depressants, stimulants -- or menotropins.

"Menotropins" does not make a repeat appearance in either the 1995 or 1996 surveys. But one question in later surveys asked about use of a drug called "derbisol."

Neither survey summary report indicates whether derbisol is a fake drug.

It is.

Dana said it was "an oversight" that he did not include that information in the final reports.


Honest, I lied

The use of a fake drug was only one of three questions used by Blunt in his 1988 survey to weed out "less than truthful" subjects. Two others, #106 and #107 near the end of his survey, asked how honest students had been when answering the questions about their use of alcohol and drugs. Three answers were possible:

1. I was very honest
2. I said I used it more than I really do
3. I said I used it less than I really do

"All students who indicated any lack of honesty were removed from the sample," Blunt wrote in his conclusions.

Using this three-question filter, 344 surveys, or about 9 percent of the total, were automatically excluded from all analysis in 1988.

While publicly parting company with Blunt on the issue of fake drugs, Dana is silent in his 1992 report on what, if anything, he did with his own honesty question, #116, which read:

When I answered the questions contained in this questionnaire

a. I was completely truthful
b. I was truthful on most questions
c. I was truthful on some questions
d. I was not very truthful.

Dana said he couldn't recall whether or not any forms were weeded out as a result of this question.

"I don't know if we discarded any. I do know we had a certain percentage of students who were dishonest. My presumption is that we would have kicked those or made a statement." He said he did not know why such a statement was not in the summary report.

After checking his records, Dana corrected himself.

"We did not discard those students who recorded dishonesty," he said.

So, were cross-tabulations done with the dishonest forms as they had been done with the fake drug replies, to see what statistics might be distorted?

"We did not do cross-tabs on the group that indicated dishonesty," Dana said.

On average 10 percent of the students in 1992 admitted to having lied on the survey, at least in part. While 16.7 percent of 6th grade boys said they had lied, only 3.4 percent of 12th grade girls admitted to lying.

In 1995 and 1996, the wording changed once again:

How honest were you in filling out this survey?
1. very honest
2. pretty honest
3. honest sometime
4. honest occasionally
5. not honest at all

In 1995, 13.7 percent overall said they had not been "very" honest in filling out the survey. Eighth-grade boys reported lying the most (23.6 percent), with 7.3 percent of senior girls admitting they lied. However, a full 16.4 percent, one in six, of the 7,477 students surveyed in 1995 did not answer this question.

In 1996, 15.3 percent of those answering the question said they had lied, with 7th grade boys leading the pack at 24 percent, and 10th and 12th grade girls tied for a low of 6.6 percent. About 9.2 percent did not answer the question.

By refusing to discard those forms as Blunt had done, were researchers guaranteeing that the survey results would be highly inaccurate, by at least the admitted overall lying rate of 10 to 15 percent, and as much as 24 percent in some categories?

"Fifteen percent of any group is reasonable to look at," Dana said. "That's something I would be interested in knowing more in detail. I don't recall writing about it, but I'm sure we looked at it."

Dana was also asked why researchers believed that asking students to self-report their own illegal, anti-social, and/or undesirable behaviors, many of which could land them in juvenile hall, would produce honest answers.

"Because we protect their responses from people like you," he said. "They know it's anonymous, we tell them it's anonymous. There is research that young people want to talk about their drug use.'"

Since no prior research on the issue of honesty had been cited in any of his reports, Dana was asked what research had been done to show that the promise of confidentiality and anonymity produced more honest answers, and not more exaggeration in those age groups.

"A lot of studies, go look them up," Dana responded, without offering any examples. "Do you think I just think these up? These things are hugely referenced. Huge scientific bodies approve these designs. These are major multi-state research initiatives. The idea of justifying survey research is not something we have to do today. I feel very confident with the data set, that this sample reflects the population."

Five other questions on the survey address students' attitudes toward honesty. If the answers are to be believed, between a quarter and a third of all middle and high school students are constantly testing those limits.

#63. I do the opposite of what people tell me, just to get them mad.
(Somewhat or very true for 27.5 percent in 1995 and 26.2 percent in 1996)

#70. I ignore rules that get in my way.
(Somewhat or very true for 30.5 percent in 1995, 31 percent in 1996)

#72. It is important to be honest with parents, even if they become upset or you get punished.
(22.5 percent in 1995 and 22.8 percent in 1996 answered "no" or "NO!")

#80. It is sometimes OK to cheat at school.
(38.8 percent in 1995 and 38.3 percent in 1996 answered "yes" or "YES!")

#81. I like to see how much I can get away with.
(Somewhat or very true for 35.5 percent in 1995 and 38 percent in 1996)

Driving Drunk

"There is tremendous concern among drug and alcohol educators regarding substance-impaired driving. Therefore two items (#66 & 67) related to this issue were included in the survey." Dana wrote in his 1992 survey summary. "...The second `operating under the influence' item [#67] asked respondents to indicate how often, if ever, they operated a motor vehicle after, or while, they had been drinking alcohol and/or smoking marijuana. This study reports the results for the 11th and 12th grade students only, to be sure that only those students of driving age were included...(emphasis added)"

Table 3, on page 17 of the summary results, showed that between 10 and 14 percent of 11th and 12th graders had reported that they regularly drove while drinking alcohol or smoking marijuana. The slots for the results of this question for grades 6 through 10 were marked with a series of NA's, meaning they were not applicable.



But were those results for the lower grades really irrelevant?

Although you couldn't tell it from the summary table, the raw data for 1992 showed that between 23 and 34 percent of the 6th to 10th graders reported they had driven a motor vehicle while drunk or stoned at least once or twice in their lives. A full 17 to 23 percent reported doing so on a regular basis -- from monthly to daily.

Dana insisted the data blackout on lower grades in the summary results and on the summary chart was appropriate. Despite the fact that the question was asked of all students, "We were trying to get an accurate representations of the LICENSED population," he said. "It was not excluded to protect them [the younger students], or because I didn't want to get them into trouble."

He said he did not assume that all those students below the legal driving age, one in every five, were lying when they said they regularly drove while drunk or stoned, but "I cannot explain with any reasonable assuredness why they [underage students] are driving."

Asked why he did not consider those astonishingly high findings significant enough to mention, Dana said, "There is too much information to include. One makes decisions. I didn't consider it to be important."

Yet, in a newspaper interview in 1992, Dana had felt the 10 to 14 percent rate of drunk driving by assumedly licensed 11th and 12th graders was so important that he called it an "epidemic."

Under the front page headline, "High school students drink, drive" in the Sept. 10, 1992 Bangor Daily News, the fourth paragraph in the story reads:

" `Drunk driving is an epidemic in this age group [11th and 12th grades],' said Dr. Robert Q. Dana, director of the University of Maine Substance Abuse Services, and director of the study. With all the publicity around drunken driving, `I'm shocked that the message has not gotten out' to these young people, he said."

This question had proven ticklish for researchers before.

In the 1988 report, Blunt said it appeared that younger children in his sample had misunderstood questions about drinking and driving, since 14 of the 5th graders said they had received a traffic ticket because of their alcohol or drug use, while only seven 5th graders reported they had ever driven while consuming alcohol or marijuana.

In the 1992 survey, to help clarify the question which asked about driving while drinking alcohol or smoking marijuana, Dana added the phrase "Answer only if you drive." But it was to no avail. The same kind of answers showed up.

No comparable figures on these questions are available for 1995 or 1996. Despite the "tremendous concern among drug and alcohol educators regarding substance-impaired driving" which Dana had noted in his 1992 survey report, not a single drunk or substance-impaired driving question -- for any age category -- appears in either the 1995 or 1996 surveys.

 

Inept car thieves?

One way to check the accuracy of drug and risk assessment surveys is to compare the student-reported rates of various illegal activities with hard statistics such as crime reports.

By that measure, it appears that Maine students are either grossly exaggerating their efforts, or are particularly inept at stealing motor vehicles.

A question in 1995 and 1996 asked: "How many times in the past year (the last 12 months) have you stolen or tried to steal a motor vehicle such as a car or motorcycle?" Students could check off "none" or one of seven categories ranging from 1-2 times, to 40+ times.

Based upon the self-reported responses, the 6th-12th graders actually surveyed stole or attempted to steal between 2,300 and 3,581 motor vehicles in 1995 and between 1,219 and 1,441 in 1996.

The number of students surveyed represented 6.8 percent of that student population in 1995, and 6 percent in 1996. Extrapolating to 100 percent of the 6th-12th grade student population, the survey results imply that between 20,300 and 52,660 attempts or actual thefts of motor vehicles must have been made in Maine by 6th-12th graders in those years.
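
The arithmetic is easy to check. Here is a minimal sketch in Python, assuming (as the wording above implies) that the statewide figures were obtained by dividing each reported count by that year's sampling fraction:

    # Scale counts reported by the sampled students up to the full
    # 6th-12th grade population. The sampling fractions and reported
    # ranges are the figures given in this report.
    def extrapolate(reported, sampling_fraction):
        return reported / sampling_fraction

    # 1995: those surveyed were 6.8 percent of the student population
    print(extrapolate(2300, 0.068), extrapolate(3581, 0.068))  # ~33,800 and ~52,700
    # 1996: those surveyed were 6 percent of the student population
    print(extrapolate(1219, 0.06), extrapolate(1441, 0.06))    # ~20,300 and ~24,000

The same arithmetic reproduces the extrapolated arrest figures in the "Criminal mentality" section below (2,230 reported arrests divided by 0.068 is about 32,800, and so on).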

Official figures from the state's Uniform Crime Reporting Office in the Department of Public Safety in Augusta show 1,756 motor vehicles, including cars, trucks, buses, snowmobiles and motorcycles, were reported stolen in Maine in 1994. The figures were 1,720 in 1995, and 1,768 in 1996.

 

Criminal mentality

Many students also appear to have exaggerated their rap sheets.

Students surveyed in 1995 reported having been arrested between 2,230 and 2,887 times in the previous 12 months. In 1996, student self-reported arrests ranged in number from 1,554 to 2,141.

Extrapolated to 100 percent of the student population, those reports represent between 32,800 and 42,450 juvenile arrests in 1995, and between 25,900 and 35,680 in 1996.

State figures of actual juvenile arrests are 10,538 in 1994; 11,626 in 1995; and 12,965 in 1996.

 

Why bother to ask?

Another questionable decision by researchers uncovered in this investigation was their wholesale dismissal of the answers to three questions.

At the end of the 1995 and 1996 survey forms, students were asked for feedback. The first part of each question had the usual computer bubbles, followed by a blank line for comments.

These were the questions:

#121. Were there any questions that you did not like? If "Yes," which ones? "Why?"
(1,476 students in 1995 and 1,530 students in 1996 answered "yes.")

#122. Were there any questions that upset you or made you feel uncomfortable? If "Yes," which ones? "Why?"
(697 in 1995 and 704 in 1996 answered "yes.")

#123. Are there things we should have asked about? If "Yes," what?
(829 students in 1995 and 636 in 1996 answered "yes.")

Despite nearly 6,000 positive answers to the three questions, not a single penciled-in comment on any blank line was deemed worthy enough to record.

"We got no fruitful data," Dana said, "things that we might want to use or talk about, something we might use, of utility for the state, research perspectives. We did discuss these, but no, we have no documentation. We would have [documented the responses] if there was something there, either a repeating theme, even a suggestion from one person. What we found is that generally you get something silly, like `suck my ass,' that has no real relevance."

Since not a single written response was recorded, and the original surveys have since been destroyed, Dana could not even say how many of the students who answered "yes" followed up that response by filling in the blank provided, if only to identify a troubling question by its number.

So why were the questions asked if no attempt would be made to record the answers?

The last two surveys were part of a six-state study, Dana explained, and "some of the states have questions that are relevant to them but are a dead hole here."

 

What are the options?

If honesty is the goal, it would help to have a full range of possible answers to difficult or tricky questions.

Several serious questions in the 1995 and 1996 surveys ask how easy it would be for a student to get beer, wine, hard liquor, cigarettes, marijuana, illegal drugs, or a handgun near where they live. Yet the students are not given the option of saying "I don't know" or "I never tried." Their only choices are "very hard," "sort of hard," "sort of easy," or "very easy."

The same lack of options follows questions about whether parents would catch them if they drank alcohol without permission, carried a handgun without permission, or skipped school. The answers there are limited to "NO!," "no," "yes," and "YES!"

 

What do you mean by that?

To get accurate responses, it also helps to ask specific questions. Yet several terms used in the 1995 and 1996 surveys are vague, some apparently deliberately so.

Students were asked if they had ever carried a handgun, with no specifics as to where it might have been carried (across a room?), in what manner or under what circumstances, whether concealed, with or without a permit, for hunting or trapping, or if under adult supervision.

In another example, immediately after 25 questions asking students about their lifetime and current use of chewing tobacco, cigarettes, alcohol, marijuana, LSD, cocaine or crack, steroids, inhalants, and derbisol, were these two questions:

#61. On how many occasions have you used other drugs in your lifetime?
#62. On how many occasions have you used other drugs during the past 30 days?

Were researchers asking about students' use of other illegal drugs they had failed to mention elsewhere in the survey?

Apparently the researchers thought they were, because they were initially stunned by the high response. Attempting to explain the results, the 1995 report states: "...The questions, however, did not define 'other drugs,' and it is possible that students understood the question to include prescription and over the counter medications. Lifetime use of 'other drugs' was somewhat high for grades 7 through 12 (16 to 25 percent) and use over the past 30 days was moderate for grades 7 through 12 (8 to 14 percent). The meaning of these data with respect to substance abuse are ambiguous because of the wording of the questions."

Despite that known ambiguity, the identical two questions appear the following year. No attempt was made to clarify the situation by asking separate questions about other illegal drugs, prescribed medication, or over-the-counter drugs such as aspirin or cough syrup.

But this time researchers were not puzzled by the results. The 1996 report on those questions states flatly that "No distinction was made between illicit and prescription drugs or over the counter medications. Use was not high, and nearly the same among boys and girls." Lifetime use was between 15 and 20 percent, and current use between 8 and 11 percent.

Such vagueness can have widespread repercussions, leading to distortions in public perceptions. National news broadcasts last summer touted a slight drop in the reported use of drugs among students, based on annual surveys such as these. Some, but not all, news anchors interjected the word "illegal" before the word "drugs."

 

Sampling and margin of error

To be considered scientifically sound, most social science research must be able to claim a margin of error of below 5 percent. The first three of Maine's drug surveys say they meet that standard overall, with the 1995 report claiming a margin of error of plus or minus 1.13 percent for any question answered by all students polled. The most recent survey, 1996, does not make any claims in its text, but Dana, in a Feb. 13, 1997 BDN story, placed the figure at 1 percent, which he reportedly said provided researchers with "unparalleled confidence" in the study.

Many people think "margin of error" refers to the accuracy of the survey. It does not. It refers to how closely the cross-section sampled accurately reflects the population at large. A small margin of error does not guarantee that the answers are true and honest, only that the people surveyed would answer about as truly and honestly as would the population as a whole.
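
For a simple random sample, that sampling figure follows from a standard formula. A quick sketch in Python, assuming the 1995 number was computed the usual way (95 percent confidence, worst-case proportion p = 0.5):

    from math import sqrt

    def margin_of_error(n, p=0.5, z=1.96):
        # z = 1.96 corresponds to 95 percent confidence;
        # p = 0.5 gives the widest, most conservative margin.
        return z * sqrt(p * (1 - p) / n)

    # 7,477 students answered the 1995 survey:
    print(round(100 * margin_of_error(7477), 2))  # 1.13 (percent)

As the statistician quoted at the end of this report points out, the formula is meaningful only if the sample really is random.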

Random sampling is the most basic of sampling techniques developed to get an accurate cross-section. This sampling technique means that any subject in a given population (in this case, Maine 6th-12th graders) has an equal or known chance of getting picked to take the survey. Any sampling technique less rigorous than a real random sample reduces the scientific validity. If the randomness is tinkered with too much, the validity can disappear entirely.

The 1992 study set a sampling goal of 400 randomly selected students from randomly selected schools in each of Maine's 16 counties. The plan soon fell apart.

"Initially, the plan involved having the field researchers randomly select approximately 30 students per grade from each school in order to yield a sample of approximately 400 students per county," the survey report states. "However, the permission slip return rate was less than desirable in many places. Therefore, we decided to include all students who returned parental consent forms. This procedure resulted in a sample size of 4,849 Maine students, approximately 81 percent of our original target of 6,000 students."

Not only was the sample size nearly 20 percent smaller than planned, but the randomness was badly compromised when any and all subjects who had returned permission slips were included. Despite that blanket inclusion, eight of the 16 counties polled fewer than 200 students, less than half the sample target. Sagadahoc County polled only 96 students and Somerset County a mere 47. At the other extreme, because of the blanket inclusion, Cumberland polled 911, York 590 and Aroostook 597.

While admitting that such county figures "will produce a larger standard error and wider confidence intervals," and "obviously hinders county-level analyses," researchers stuck to their guns. Without explaining why, the 1992 report states that despite the sampling problems, "we are confident that the state-wide sample is representative of Maine students in grades 6-12."

Breakdowns of student participation by county are not included in the 1995 or 1996 survey reports. A notation in the raw data portion of each report states that geographic data was not included "in order to protect the confidentiality of responding students and their respective schools."

There are hints that the same internal geographical distribution problems that arose in 1992 were encountered again in 1995 and 1996. In the 1996 report, the target of 400 per county is again the sample design, and a total of 6,398 usable survey forms were reportedly completed, just two shy of the target. But a footnote on the 1995 charts which compared some county percentages mentions that no high schools in Hancock or Oxford Counties were surveyed. And a footnote in the 1996 report reads: "Although all counties are represented in the sample, several yielded insufficient numbers for adequate county-by-county analysis."

List of Surveyed Schools
School -- Years Surveyed

Adams School -- Castine: 1992
Appleton Village School: 1995, 1996
Auburn Middle School: 1992
Bath Middle School: 1992, 1995, 1996
Belfast High School: 1996
Biddeford Middle School: 1995, 1996
Boothbay Elementary: 1996
Boothbay High School: 1992
Brunswick Junior High School: 1992
Buckfield Jr-Sr HighSchool: 1996
Bucksport High School: 1992
Camden-Rockport High School: 1995
Cape Elizabeth Middle School: 1992, 1996
Caribou High School: 1995, 1996
Caribou Middle School: 1995, 1996
Central Grade School: 1996
Central High School: 1995, 1996
Central Middle School: 1995, 1996
Chelsea Elementary School: 1995
Cherryfield Elementary: 1996
Columbia Falls Elementary: 1996
Cunningham Middle School: 1992
Deer Isle-Stonington High School: 1992
Dirigo Middle School: 1996
Eagle Lake Elementary School: 1992
Edward Little High School: 1992, 1996
Ella Lewis School: 1996
Ellsworth High School: 1996
Ellsworth Middle School: 1996
Elm Street School: 1995, 1996
Forest Hills Consolidated: 1995
Fort Fairfield High School: 1992
Fort Fairfield Middle School: 1992
Fort Kent High School: 1996
Fort Kent Elementary School: 1992
Foxcroft Academy: 1992, 1995
Franklin High School: 1992
Frisbee Elementary School: 1995, 1996
Fryeburg Academy: 1992
Gardiner Middle School: 1996
Gen.Bryant E. Moore School: 1995
Gouldsboro Grammar School: 1995, 1996
Great Falls -- Auburn: 1995
Great Salt Bay Community School: 1992, 1995, 1996
Greel Junior High School: 1995
Hall-Dale Elementary: 1992
Hall-Dale High School: 1992
Hichborn Middle School: 1995
Hiram Elementary School: 1992
Houlton High School: 1992
Houlton Middle School: 1992
Huse Memorial School: 1992
Islesboro Central: 1996
Jay High School: 1992, 1996
Jay Middle School: 1996
Jonesboro Elementary: 1996
Jonesport-Beals High School: 1992
Kennebunk High School: 1992
Kennebunk Middle School: 1992
Kingfield Elementary School: 1992
Lake Regional High School: 1992
Lake Regional Middle School: 1992
Lambert Middle School: 1992
Lawrence High School: 1995, 1996
Lawrence Junior High School: 1995, 1996
Leavitt Area High School: 1995
Leonard Middle School: 1992
Lewiston Middle School: 1995
Limestone High School: 1992
Limestone Middle School: 1992
Lincoln Academy: 1992
Livermore Falls High School: 1992, 1995
Livermore Falls Middle School: 1992
Lubec Consolidated School: 1995, 1996
Madison High School: 1992
Madison Junior High School: 1992
Mahoney Middle School: 1992
Manchester Elementary School: 1995
Maranacook Community School: 1995, 1996
Marshwood High School: 1992
Marshwood Junior High School: 1992
Mary Taylor School: 1992
Mollyocket Middle School: 1992
Morse High School: 1992, 1995, 1996
Mount Abrams Regional School: 1995
Mount Ararat High School: 1996
Mount Desert Island High School: 1992
Mount View Elementary School: 1992
Mount View High School: 1992
Mount View Junior High School: 1992, 1995
Mountain Valley Middle School: 1995, 1996
Noble High School: 1992
Noble Junior High School: 1992
NoKomis Regional High School: 1996
North Haven Community School: 1996
Oak Hill High School: 1992
Old Town High School: 1992
Oxford Hills Elementary School: 1996
Oxford Hills High School: 1996
Oxford Hills Junior High School: 1996
Penobscot Elementary School: 1992
Penquis Valley High School: 1992, 1995
Penquis Valley Middle School: 1992, 1995, 1996
Phillips Middle School: 1992
Piscataquis High School: 1995
Presque Isle High School: 1992
Princeton Elementary School: 1992
Rangeley Lake: 1995, 1996
Rockland SAD 5 Middle School: 1995, 1996
Rockland District High School: 1992, 1996
Sabattus Elementary School: 1992
Sacopee Valley Jr-Sr High School: 1992
Sanford High School: 1992
Sanford Junior High School: 1992
Sanford Middle School: 1992
Searsport Elementary School: 1992
Searsport High School: 1992, 1995
Searsport Middle School: 1992
Sedomocha Middle School, Dover: 1992
South Hiram Elementary School: 1992
South Portland High School: 1995
Stockton Springs Elementary: 1992
Stratton Elementary School: 1992, 1995, 1996
Strong Elementary: 1992
Sugg Middle School: 1996
Telstar High School: 1992
Telstar Middle School: 1992
Thornton Academy: 1996
Traip Academy: 1995
Tripp Elementary School: 1996
Warsaw Middle School: 1995, 1996
Washington Academy: 1995
Waterville High School: 1992
Waterville Junior High School: 1992, 1995
Webster School: 1992
Wells High School: 1995, 1996
Wells Junior High School: 1992, 1995, 1996
Westbrook Junior High School: 1992
Winslow High School: 1992
Winslow Junior High School: 1992
Wiscasset High School: 1992, 1995, 1996
Wiscasset Middle School: 1992
Woodland Elementary School: 1992, 1995
Woodland Junior-Senior High School: 1992, 1995, 1996
Woodside Elementary School: 1996

Data obtained from Maine's Office of Substance Abuse in Augusta and the Margaret Chase Smith Center in Orono confirmed the suspicions. In 1995, seven of Maine's 16 counties failed to reach the target of 400 students surveyed. One of those was Cumberland County, which includes the greater Portland metropolitan area -- only 338 students were surveyed out of a student population of about 20,000. Other numbers ranged from a low of 130 in Oxford County to a high of 1,013 in Penobscot County.

In 1996, a different set of seven counties failed to poll 400 students, with a low of 91 in Waldo County and a high of 613 in Penobscot County. Cumberland County (Portland metro area), which had managed to find 911 students to poll in 1992, surveyed only 286 students in 1996.

 


Fudge, anyone?

It appears that researchers in 1995 and 1996 felt the figures resulting from these discrepancies could not stand on their own, and needed some adjusting. This was done by weighting the responses depending on where the students took the test. Both the 1995 and 1996 reports explain:

"The data resulting from the study were weighted to adjust for the varying size of school populations in the 16 counties. The sample design called for approximately equal numbers of students to be selected from each county, while we know that some counties have much larger grade 6-12 enrollment than do other counties. The resulting data were statistically adjusted so that large and small counties are represented in the statewide results in proportion to their actual student population, not according to their size in the sample."

These adjustments amounted to far more than a little tinkering or fine-tuning.

Weighting ratios were not included in the official survey report, but were obtained from researcher Suzanne Hart at MCSC. Those showed that each student survey from Cumberland County was counted four times -- multiplied by 3.94 in 1995 and by 4.18 in 1996. Because of wildly fluctuating polling numbers in different years, Oxford County kids were counted three times each in 1995, but less than once each (0.6) in 1996.

In 1995, when no high schools in Hancock County were surveyed, answers from the 210 students polled in the lower grades were multiplied by 1.4 to balance out the numbers. In 1996, when one high school participated, the multiplier figure dropped to 0.65 in Hancock County.

York County's multiplier was 1.21 in 1995, and 2.04 in 1996.

Knox County, which polled an inordinate number of students for its size (637 in 1995 and 599 in 1996) had discounted rates of 0.30 and 0.285. It took more than three Knox County students added together to count as one student in those surveys.
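
Weights of this kind are typically the ratio of a county's share of statewide enrollment to its share of the completed surveys. A rough sketch in Python, using approximate enrollment figures taken from this report rather than the researchers' actual inputs:

    def county_weight(county_enrollment, state_enrollment,
                      county_sampled, total_sampled):
        # Ratio of the county's share of the student population
        # to its share of the completed surveys.
        population_share = county_enrollment / state_enrollment
        sample_share = county_sampled / total_sampled
        return population_share / sample_share

    # 1995: the 7,477 completed surveys were 6.8 percent of enrollment,
    # implying roughly 110,000 students statewide; Cumberland County
    # polled 338 students out of about 20,000.
    state_enrollment = 7477 / 0.068
    print(round(county_weight(20000, state_enrollment, 338, 7477), 2))  # ~4.02

That lands close to the 3.94 multiplier obtained from MCSC; the difference presumably reflects the exact enrollment counts the researchers used.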

 

Home to roost

So how can someone find out if the local school or nearby schools were among those polled? Don't ask Dana. He's not telling.

Dana refused to release a list of the schools or school districts involved in the three surveys he conducted, even though he admitted the information was a public record.

"I've made a commitment not to release them," Dana said. "I try to keep that confidential to not compromise their trust in research groups. Schools are skittish about being exposed in this way. Yes, all schools get reports back about their districts. I believe the list is a public record, you could probably get it from the state. I don't know how funky they are about that.

"I wouldn't be surprised if I am somewhat constrained by protection statutes," Dana said vaguely in explaining his refusal. "Drug and alcohol research is especially protected by federal regulations because you are talking about illegal activity. Some of these places are very tiny, there is fear of individualization."

School participation in such surveys is usually not a secret in local communities. After all, school officials must give their approval, parents are asked for consent, students certainly know they are being polled. School board briefings on results are often reported in the local papers. In fact, Dana's name turns up in many of those news stories.

Yet Dana contended that if schools are identified as having participated in the statewide study, "people are not going to want to be involved because some of the findings are not that positive. We're going to be out of the research business if I start divulging" which schools participated.

Dana said no high schools in either Hancock or Oxford Counties participated in the 1995 survey because "we were closed out, couldn't get in."

Maria Faust, associate director of the state's Office of Substance Abuse in Augusta, said there was no question in her mind that the lists of participating schools are public records, and released the lists from computer data.

The 1992 list showed a total of 87 schools, with 48 schools polled in 1995, and 55 schools in 1996.

Maine has 828 public and private schools. According to the state Department of Education, 552 of those schools have some combination of grades 6-12.

Dana said that so far most schools have cooperated when asked. He said his school rate of refusal is "not very high, less than a quarter."

A comparison of the survey reports and the school lists shows otherwise. It turns out researchers had the same self-selection randomness problem with schools that they had with students.

In 1995, although 48 schools participated, 90 had been invited, a 53 percent participation rate. In 1996, 121 schools were asked, and only 55 said yes, a 45 percent participation rate. If 87 schools out of "slightly less than 200" participated in 1992, that's a 43 percent participation rate.

But those figures do not mean that 190 schools were polled in the latest three surveys. Only 145 were.

The lists show that seven schools were surveyed all three years. Fifteen schools were polled in both 1992 and 1995. Eleven schools were polled in both 1992 and 1996. And, of the 48 schools polled in 1995, 26 -- more than half -- were surveyed again in 1996.
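
Those overlap figures square with the 145 total under simple inclusion-exclusion, assuming each pairwise count includes the schools surveyed all three years:

    appearances = 87 + 48 + 55  # school slots across the three surveys: 190
    both_92_95 = 15
    both_92_96 = 11
    both_95_96 = 26
    all_three = 7

    # Subtract each pairwise overlap, then add back the schools
    # surveyed all three years, which were subtracted once too often.
    unique_schools = (appearances - both_92_95 - both_92_96
                      - both_95_96 + all_three)
    print(unique_schools)  # 145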

Dana said some schools are reluctant to participate because "they have very big agendas. They have a lot of people wanting to do a lot of studies. They have to have the belief that there is value in the data."

Hart said no drug survey was in the works in 1997 and none has been scheduled for 1998.

 

Recommendations

At the end of each report, Dana makes policy recommendations as well as recommendations for future research.

Many of his policy recommendations do not seem to bear much relationship to the published results. They include: 

"Illegal behaviors and the normative belief structure among youth that `I can do this (behavior) and not get caught' cannot be tolerated. It is recommended that illegal behaviors result in arrests or, at a minimum, direct confrontation and a conference with parent(s)/guardian(s). The threat of detection and subsequent punishment is a powerful disincentive to use alcohol, tobacco products, or other drugs."

Yet, nothing in the survey results points to arrests or confrontations as having a remedial effect. In fact, the survey questions on arrest records and car theft attempts would seem to indicate that students see illegal behavior as a badge of honor.

But the impact of marijuana legalization is not addressed in these surveys. Dana's recommendation against the legalization of marijuana appears to be a personal opinion.

His recommendations for future research included a call to include "high risk" groups in future evaluations, among them Native Americans and other minorities, students with disabilities, and juveniles not attending school.

Dana's concern about including Native Americans and other minorities in future evaluations implies they were under-represented in the surveys. Yet demographic information in the 1996 survey shows that Native Americans, African-Americans, Hispanics, Orientals and "others" were surveyed in larger numbers than their percentage of the Maine student population would warrant. For example, Native Americans represent 0.5 percent of the 6th-12th grade students in Maine, yet they accounted for 2.6 percent of the students sampled in 1996, and 2.4 percent in 1995.

Also, while it is obvious that juveniles not attending school were excluded from the school-based polling, no demographic questions were asked to determine whether students had any disabilities to justify that part of Dana’s recommendation.

In addition, Dana provides no evidence that these minority groups are "high risk."

 

Dana defends

Despite the serious problems with the sampling and the data uncovered in the reports, Dana was adamant in defending his work. When given the opportunity to sum up at the end of the August interview, he said:

"These are very important studies for the state. A state can be driven by popular convention or conventional wisdom. This state is trying to be more empirical and data driven. That's very important, because you have to make decisions about a lot of different things, like the allocation of funds, preventive work, what's needed. This state is getting into the position where they can do it.

"Studies like this can mobilize a community. Communities are frightened, they are desperate for this [information]. Drug use may be a barometer of problems. This will often give them a springboard to do something."

 

Independent Review

Philip H. Person, Ph.D., of Orland, Maine, is retired now after 8 years as a Branch Chief and senior statistician with the National Institute on Drug Abuse and 17 years as an analytical statistician with the National Institute of Mental Health. He was asked in early February to review the findings of this independent investigation. Person also had access to the original drug survey reports.

The investigation, he said in a written reply, "has raised some astute and legitimate questions about the very basis of the whole set of surveys."

"These drug abuse surveys appear to have compromised seriously the random sampling plan," he said. "Deviations, compromises, and adjustments invalidate, or make meaningless, the use of sampling errors and confidence limits, both of which depend on these rules….The compromises, adjustments, and exclusions that were made in executing the plan cast much doubt upon any results."

Person also had a different perspective than Dana on the issue of lying.

"Expecting drug-abusing students to answer truthfully questions about 'their own illegal or undesirable behaviors' strains one's credibility," Person wrote. "Drug abuse, alcohol abuse, and other substance abuse commonly contain large components of denial on the part of the user. Lying about such use is also common to these conditions so lying ought to be expected and considered in the question formation and analysis."

The comparison of the student claims of motor vehicle thievery and state crime statistics was "a telling finding," he said. With student credibility in doubt, Person said, "One might wonder whether these students who said they lied may have lied about having lied! There seems to be no good way to deal with answers that may be exaggerated, minimized, denied, or truthful."

Person concluded, "The value of truth in these surveys is placed into considerable doubt."

This is the text of his written critique:

The high school population samples for the various years present several problems. Clearly, in a survey about anything the whole population is usually not, and need not be, questioned. A random sample, well carried out and which follows the mathematical and statistical rules and criteria for random sampling, is a sensible alternative. The number of sample cases needed can be determined based on the expected occurrence in the population of the behavior being examined and the size of differences to be identified.

These drug abuse surveys appear to have compromised seriously the random sampling plan. The compromises, adjustments, and exclusions that were made in executing the plan cast much doubt upon any results, especially since none of the implications of these deviations (introductions of bias) was discussed. Further, there was no consideration given to the effect of under- and non-reporting. For example, were non-respondents, such as refusals of whole schools which had been randomly selected or of randomly selected individuals for whom parents would not fill out permission slips, any different from those students who, when selected for the sample, actually filled out the questionnaires? Did the excluded questionnaires come from a different sub-population from those not excluded? What biases were introduced into the study by weighting more heavily completed questionnaires in under-sampled areas and discounting to some degree the responses in other areas?

The concept of a sampling error or a "margin of error" depends entirely on following the rules of random sampling. Deviations, compromises, and adjustments invalidate, or make meaningless, the use of sampling errors and confidence limits, both of which depend on these rules.

Comparisons over time (among surveys) depends on the surveys being done with the same rules and without deviation from the random sampling rules. Clearly, there are such problems in these drug abuse surveys which were not discussed in making comparisons among them.

Ms. Hay has raised some astute and legitimate questions about the very basis of the whole set of surveys. She also goes on to illustrate that the randomness of a sample may be lost in ways other than selection of the respondents. For example, disqualifying responses or questionnaires and allowing self-withdrawal from the study (either whole schools or individuals failing to return permission slips) should suggest a discussion on how these events may have introduced biases.

In any survey, the question statements are always a problem. Words should be used which do not elicit socially approved responses. Drug abuse survey questions are particularly vulnerable since many (or perhaps most) of them deal with very personal, potentially socially unapproved behavior. Expecting drug-abusing students to answer truthfully questions about "their own illegal or undesirable behaviors" strains one's credibility.

Drug abuse, alcohol abuse, and other substance abuse commonly contain large components of denial on the part of the user. Lying about such use is also common to these conditions so lying ought to be expected and considered in the question formation and analysis. In the story it says on this point:

Did that make the survey results wrong by at least 10 to 15 percent overall, and as much as 24 percent in some groups?

One might wonder whether these students who said they lied may have lied about having lied! There seems to be no good way to deal with answers that may be exaggerated, minimized, denied, or truthful.

Ms. Hay has hit upon a telling finding by comparing the answers about stealing motor vehicles to the state crime statistics. She makes the point well; and the value of truth in these surveys is placed into considerable doubt.

After reviewing this independent report, Lynn Duby, director of Maine's Office of Substance Abuse, said the issues raised in this investigation "certainly will be addressed" before a similar survey by a different, as-yet-to-be-chosen research team is undertaken in Maine schools this spring.

"We will be taking an intensive look at those areas" when working with the new researchers on the design of the new survey, she said in a telephone interview Friday, Feb. 27. A decision had already been made to go with a new researcher, she said, "in part to validate previous studies." The new survey, like two earlier ones, will be done in cooperation with several other states.

Rep. Joe Brooks (D-Winterport), a member of the Health and Human Services Committee, said he will welcome the results of this spring's survey "because it will go right to my concerns that the results we were seeing were a bit high."

Brooks had spent more than a decade on the now-defunct Eastern Regional Council on Alcohol and Drug Abuse, as that group monitored substance abuse spending programs in Penobscot, Piscataquis, Hancock and Washington counties.

When statistics were released in the early and mid 1990s, he said, "I was in awe of the results. I think we all were. The 'alarming rates' didn't seem to be consistent with what we were monitoring in Eastern Maine." But, he said, "I am not a trained researcher. We had no means of questioning the surveys. A few people said 'This can't be.' But nobody took the University or Bob Dana to task."

-------------------------------------------------

Freelance writer Jean Hay of Bangor is a former reporter and bureau chief for the Bangor Daily News.

© Copyright 1997 by Jean Hay

