Thursday, September 4, 2025

A MISSIVE FROM MRIP


A few days ago, I got a big envelope in the mail.

The return address showed the National Oceanic and Atmospheric Administration, c/o Gallup [the polling firm] in Lynbrook, New York.  I was a little curious, because although it’s not unusual for me to get mail originating at NOAA, the packaging, and the fact that it was addressed not to me but to “NEW YORK RESIDENT” at my address, seemed a little strange.

The enclosed letter was hardly enlightening, reading, in part,

“I am writing to ask for your help in a study that the Gallup Poll is conducting on behalf of the National Oceanic and Atmospheric Administration (NOAA).  This survey asks questions about severe weather and outdoor activities.  The results will be used to learn more about the environment and help improve the quality of marine and coastal resources.

“For this study to be accurate, we need all households who receive this short survey to complete it and send it back.  Your address was randomly picked from a list of addresses in New York, and we can’t replace you with someone else.  Your responses will help all residents of New York have their voices heard.

“This survey asks about many outdoor activities.  Some people enjoy many of these activities, while others aren’t interested in these activities.  It is very important that your household complete the survey, even if no one participates in these activities.

“This survey should be completed by an adult living at this address.  We have included a small gift as a way of saying thank you for your help…”

The letter was signed by John Foster, Chief, Recreational Fisheries Statistics Branch, NOAA Fisheries Office of Science and Technology.  At about the same time that I noticed the signature, two crisp new dollar bills slipped out from between the letter and the enclosed survey form.

At that point, things began to make a little more sense.  Once I started to take the survey, and got a look at the questions, everything became clear.

The package that I had received was part of the Fishing Effort Survey, that portion of the Marine Recreational Information Program that is mailed to households in coastal states in an effort to gauge how many recreational fishing trips were taken in each two-month “wave.”

The Fishing Effort Survey is built around two so-called “frames” that are used to collect effort data.  The first frame is composed of addresses drawn from each state’s saltwater fishing license (or registration) database, which guarantees that the surveys will reach saltwater anglers.  The second frame consists of surveys mailed to random addresses within the state, which is intended to reach anglers who, for whatever reason, never purchased a license.
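For those who like to see such things made concrete, here is a minimal sketch of how a dual-frame draw might work.  The list names and sample sizes are entirely hypothetical; the real survey’s frames, sample sizes, and statistical weighting are far more elaborate.

```python
import random

# A minimal sketch of a dual-frame sample draw.  All names and sizes here
# are hypothetical, not the Fishing Effort Survey's actual design.

def draw_sample(license_frame, address_frame, n_license, n_address, seed=None):
    """Draw independent random samples from each of the two frames."""
    rng = random.Random(seed)
    return (rng.sample(license_frame, n_license),
            rng.sample(address_frame, n_address))

# Frame 1: addresses from the state's license/registration file (all anglers).
licensed = [f"license_addr_{i}" for i in range(5_000)]
# Frame 2: random residential addresses, mostly non-anglers, but the only way
# to reach anglers who never registered.
residential = [f"random_addr_{i}" for i in range(100_000)]

license_mailing, address_mailing = draw_sample(licensed, residential, 50, 200, seed=1)
print(len(license_mailing), "license-frame surveys;",
      len(address_mailing), "address-frame surveys")
```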

Although I have an up-to-date saltwater registration, my survey seemingly came from the frame that was sent to random residents.

Regardless of which frame a survey recipient belongs to, the two dollars that were enclosed were carefully calculated to maximize the chance that the survey would be completed and returned.  Apparently, studies have shown that amounts less than two dollars lead to fewer returned surveys, while higher amounts don’t materially improve the response rate.  Two dollars sits at the Goldilocks point, where it is just right to encourage cooperation without adding undue cost.

A lot of thought goes into the surveys.  But sometimes, the thought processes backfire a bit.

Two years ago, a pilot study sponsored by the National Marine Fisheries Service discovered an apparent flaw in the design of the Fishing Effort Survey, one that might have resulted in fishing effort being overestimated by 30 to 40 percent (comments made at yesterday’s Highly Migratory Species Advisory Panel meeting suggested that the actual overage is much smaller, at least with respect to anglers fishing from private boats).  Further research revealed that the overestimate seems to have arisen out of two questions asked of every angler covered by a survey:  How many times did they fish in the past two months, and how many times did they fish in the past calendar year?

It seems that a substantial number of anglers reported that they had fished more in the immediately preceding two months than they had in the previous year, which was an obvious impossibility.
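Put as a rule, the contradiction is easy to state: the trips a household reports for the past two months can never legitimately exceed the trips it reports for the year that includes them.  Here is a minimal sketch, using hypothetical field names, of how such responses could be flagged:

```python
# A minimal sketch of the consistency check, with hypothetical field names:
# trips reported for the past two months can never legitimately exceed
# trips reported for the past twelve.

def flag_inconsistent(responses):
    """Return responses whose two-month trip total exceeds the annual total."""
    return [r for r in responses
            if r["trips_past_2_months"] > r["trips_past_12_months"]]

sample_responses = [
    {"id": 1, "trips_past_2_months": 3, "trips_past_12_months": 10},  # plausible
    {"id": 2, "trips_past_2_months": 8, "trips_past_12_months": 5},   # impossible
]

for r in flag_inconsistent(sample_responses):
    print(f"Respondent {r['id']} reported {r['trips_past_2_months']} trips "
          f"in two months but only {r['trips_past_12_months']} all year.")
```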

The pilot study considered the issue, and determined that such misreporting was the result of something called “telescoping error,” which occurs when

“a respondent misplaces an event in time, usually placing the event more recently in time than it actually occurred.”

There was also a suggestion that some anglers were reluctant to admit that they either didn’t fish much or didn’t fish at all, and so exaggerated the number of trips taken rather than enter a zero.

The problem seemed to revolve around the order of the two questions.  Typically, in constructing a survey, designers ask the simplest and most easily answered questions first, and then move on to questions of greater complexity.  Thus, the Fishing Effort Survey first asks each respondent how many times they have fished in the past two months, and then goes on to ask how many times they have fished in the previous year.

But somehow, many anglers were providing seemingly the “wrong” answer—the higher number—in response to how many times they fished in the past two months, while entering a lower number in response to the question about how many times they fished over the last year.  That’s where the explanations of “telescoping error” and similar things came in.

I’m neither a statistician nor a designer of surveys, so I have to believe the experts when they tell us that’s where the Fishing Effort Survey went wrong.  But I suspect that at least some of the error might have been due to something that we can call carelessness, or inadvertent error, or maybe just dumb mistakes.  Because I’ll confess right here—I never try to hide the truth from my readers—that I screwed up when answering my own questionnaire.

Maybe it happened because I’m a lawyer, and when faced with a series of questions, I tend to read all of them first, before answering any, because I’m trying to figure out if the questions are set out in a pattern designed to take me to a particular place. 

Maybe it was just a simple brain fart.

But whatever the cause, when I answered the questions about how much I fished, I found myself writing down my annual trip total first, even though the question actually asked me how many trips I had taken over the past two months.  I knew that both questions were being asked, but somehow my mind decided to answer them in the wrong sequence.

In my case, I happened to catch—and correct—the error, but it led me to wonder how many people might have done the same thing that I had.  No telescoping error, no reporting of trips never made, but instead just a moment of mental lapse that led them to put the right numbers in the wrong boxes.

In the end, I sent the survey back with the answers all as right as I could get them, recognizing that, after a lifetime of saltwater fishing, I had just participated in the Fishing Effort Survey for the first time.

Which led to another question:  How many other people, here in New York and elsewhere on the coast, have received a similar survey and, not knowing what it was, either pocketed the two bucks and tossed the rest in the trash or, hopefully, dutifully filled out the form and put it in the mail, without ever knowing that they were contributing to the MRIP data pool?

Because one thing we constantly hear, whether at fisheries meetings, in conversations, or on Internet chats, is anglers arguing that the Marine Recreational Information Program’s data must be invalid, because such anglers have never been surveyed, and no one they know has been surveyed, so how can the information be any good?

Leaving aside the statistical side of things, which tells us that, so long as a survey reaches a representative sample of the population, it doesn’t need a very large number of responses to reach a reasonably accurate result, being the recipient of what I belatedly recognized as a Fishing Effort Survey questionnaire made me wonder how many anglers who received such a questionnaire never realized that they were being surveyed by MRIP at all.
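For anyone curious about the arithmetic behind that statistical claim, the usual back-of-the-envelope margin of error for a simple random sample is easy to compute.  This is a deliberate simplification, since MRIP’s actual estimates involve weighting and a more complex design, but it shows why the size of the sample, not the size of the population, is what matters:

```python
import math

# Back-of-the-envelope 95 percent margin of error for a proportion estimated
# from a simple random sample, using the worst case p = 0.5.  This simplifies
# away the weighting and design effects of a real survey like MRIP's.

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 400, 1_000, 10_000):
    print(f"n = {n:>6,}: roughly ±{margin_of_error(n):.1%}")

# Population size never enters the formula: 1,000 representative responses
# give about a ±3 percent margin whether the state holds one million
# anglers or ten million.
```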

After all, it’s easy for an angler to know whether they’re part of the Access Point Angler Intercept Survey—that portion of MRIP where surveyors make personal contact with anglers at marinas, piers, and shore access spots, speak with them, and physically count and measure their fish.  That can’t really be mistaken for anything else.

But a seemingly random bit of mail that asks whether a person has had any recent experiences with severe weather conditions, whether they engage in outdoor activities, go to the beach, fish in fresh water and—after all that—whether they went saltwater fishing, and how many times, and whether anyone else in the household might have done the same, might not leave as clear an impression.

I suspect that more than a few people might have responded to the Fishing Effort Survey without ever realizing that that’s what they were doing.

And so long as they responded, that’s probably OK.

But as my experience demonstrates, MRIP sometimes comes calling, whether we expect it or not, and I have to wonder whether it might not get a more effusive welcome if people could more easily recognize it for just what it is.