Thursday, August 21, 2014

THE RISKS OF "COOPERATIVE" SCIENCE

The United States Senate is making a pretty good effort to reauthorize the Magnuson-Stevens Fishery Conservation and Management Act.

I haven’t had a chance to carefully study the latest draft, which came out about a month ago, or to compare it to the initial draft that was released last spring (when I do, I’ll let you know what I find).  But after a quick read, I noticed that the latest draft eliminates one of the original bill’s flaws, a so-called research program which was nothing more than a poorly camouflaged scheme to let folks kill more South Atlantic red snapper.

Unfortunately, the new draft preserves another section that calls for

“a report on facilitating greater incorporation of data, analysis, stock assessments, and surveys from nongovernmental sources, including fishermen, fishing communities, universities, and research institutions, into fisheries management decisions.”

It’s another one of those things that looks pretty good at first glance, but falls apart when you examine the details.

Certainly, the biologists at NOAA Fisheries could use a little bit of help.  They have hundreds of stocks to manage, and lack the money and the personnel to get the job done on anything like a comprehensive basis.  The result is that the important stocks get assessed on a semi-regular basis, stocks that are not important to industry (but may have real significance to the ecosystem) go largely ignored, and everything else is studied in a sort of hit-or-miss fashion when money and manpower allow.

Even some fairly important species may receive inadequate study.  Black sea bass cause a lot of angst and harsh words in the Mid-Atlantic, but, because of their unique life history and stock structure, they are a devilishly hard—and expensive—stock to assess.  The last assessment was unanimously rejected by a peer review panel, and the peer review summary report noted that

“It was suggested that the assessment team continue to consider alternative methods for assessing black sea bass stock status…although achieving a new framework should not be expected in the short term…
“The three models suggested…are a major research task and may require additional data.  We do not anticipate that such models could be produced within an operational assessment framework.”

So it would seem that any help that non-governmental sources could provide, in the form of “cooperative science,” could only make things better.

But that’s not necessarily true.

When you talk about “cooperative science,” you’ve got to stop and think about why anyone would want to cooperate in the first place.

When it comes to fishermen and fishing communities, the answer is usually pretty simple—they want to kill more fish (although, to be fair, that’s not always the case; striped bass anglers’ continuing advocacy for science-based harvest reductions is certainly one notable exception).

The essence of science is the use of unbiased data, or at least an effort to identify and account for the bias in whatever data exists.  However, we can be pretty certain that any “data, analysis…and surveys” provided by fishermen and their communities will be anything but bias-free.

Often, that’s not really the fishermen’s fault, it’s just a reflection of who they are and what they do.  Scientists are trained to be objective and skeptical, to search for sources of bias and error, and to seek defensible, repeatable results.  Fishermen are trained to catch fish.  That simple difference in outlook places any fisherman-generated data in doubt.

You can see the problem arise just about any time that controversial harvest restrictions (and what harvest restrictions are not controversial?) are proposed.

The scientists will come to the table and argue that their surveys, conducted in the same places with the same gear over a period of years, are showing far lower catch per unit effort, poor recruitment, decreasing average size or other symptoms of an ailing stock.  They will talk about abundance falling below established indices or long-term averages.

Fishermen will look at them with a sort of bemused look in their eye, and say “There’s lots of fish out there!”  They’ll tell the scientists that they’re not catching fish because the scientists aren’t sampling where the fish happen to be, or that their gear is rigged wrong.  Sometimes, they’ll offer to trawl side-by-side with the survey boats, to prove just how many fish they can catch while the scientists catch just a few.

The concept of year-to-year comparisons designed to exclude sources of bias is completely lost on them.  The fishermen’s business is built on bias.  If you don’t catch fish in the first place that you try, you move, and you keep on moving until you’re successful.  If you finally catch fish in the only place holding them in a thousand square miles of ocean—well, what does that matter?  The hold is still full, and all is OK.
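
Fisheries scientists actually have a name for this pattern: “hyperstability,” where the fleet’s catch rates stay high even as a stock collapses, because fishermen keep finding the shrinking aggregations.  The toy model below is my own sketch, with made-up numbers rather than anything from a real assessment; it compares a survey that samples randomly chosen stations against a fleet that fishes wherever the fish remain.

```python
# Toy illustration (my own invented numbers, not any real assessment) of
# "hyperstability": a survey samples random stations, so its index tracks
# abundance, while the fleet concentrates on whatever patch still holds
# fish, so its catch rate barely moves as the stock collapses.

import random

random.seed(1)
PATCHES = 1000          # patches of ocean the stock can occupy
YEARS = 10
population = 100_000.0  # total fish in year one

for year in range(1, YEARS + 1):
    # As the stock shrinks, it piles into fewer and fewer patches.
    occupied = max(10, int(PATCHES * population / 100_000))
    densities = [0.0] * PATCHES
    for i in random.sample(range(PATCHES), occupied):
        densities[i] = population / occupied

    # Survey CPUE: average catch across 50 randomly chosen stations.
    stations = random.sample(range(PATCHES), 50)
    survey_cpue = sum(densities[s] for s in stations) / len(stations)

    # Fishery CPUE: the fleet keeps moving until it finds the fish.
    fishery_cpue = max(densities)

    print(f"Year {year:2d}: stock {population:9.0f}, "
          f"survey CPUE {survey_cpue:6.1f}, fishery CPUE {fishery_cpue:6.1f}")

    population *= 0.8   # the stock declines 20% a year
```

Run it, and the survey index falls right along with the stock while the fleet’s catch rate barely budges.  That is how “there’s lots of fish out there” and an honest survey showing decline can both be true at the same time.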

We see a good example of this in the debate over crashing Gulf of Maine cod.

The fishermen sincerely believe that the stock is still fine, but sincerity is no obstacle to error.

And the last thing that anyone wants is sincerely wrong data ending up in the stock assessment.

At least, that’s the last thing that scientists and conservation advocates want.  The Gloucester Daily Times recently ran an editorial lamenting the fact that no fishermen are on the panel established to peer-review the recent Gulf of Maine cod assessment.  Apparently, trusting that task to unbiased scientists makes them uncomfortable…

Yet those problems exist when fishermen are just being themselves.  Things get worse when they bring in hired guns in the form of consultants—normally academics or former government managers who have set up their own firms—to influence the stock assessment process.

Sometimes, the industry scientists really do have insights that improve the management process.  Commercial fishermen up in New England frequently point to their success in safely increasing sea scallop quotas after their team convinced the Northeast Fisheries Science Center that higher harvest levels were completely sustainable.

The recreational fishing industry in the Mid-Atlantic claims a victory, too, after convincing biologists to lower the target biomass figure for summer flounder.  The industry made its point—and avoided harvest reductions—by bringing in a biologist to successfully argue that males have a higher natural mortality rate than females, which prevented the population from growing as large as scientists once thought it could.
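
It’s worth pausing on why that single parameter matters so much.  The sketch below is my own back-of-the-envelope illustration, using the textbook exponential-survival model and an invented weight-at-age curve rather than anything from the actual assessment: raise the natural mortality rate M, and fewer fish survive to the old, heavy ages, so the unfished stock, and any biomass target derived from it, shrinks.

```python
# Rough sketch (my own illustration, not the summer flounder assessment):
# under exponential survival, numbers-at-age fall off as exp(-M * age), so
# a higher natural mortality M means fewer old, heavy fish and a smaller
# unfished stock -- and therefore a lower biomass target.

import math

def unfished_biomass_per_recruit(M, max_age=15):
    """Sum survival-at-age times a made-up weight-at-age schedule."""
    total = 0.0
    for age in range(max_age + 1):
        survival = math.exp(-M * age)                     # fraction still alive
        weight = 5.0 * (1 - math.exp(-0.25 * age)) ** 3   # illustrative kg
        total += survival * weight
    return total

b_low = unfished_biomass_per_recruit(0.20)    # previously assumed M
b_high = unfished_biomass_per_recruit(0.25)   # newly adopted M

print(f"M=0.20: {b_low:.2f} kg per recruit")
print(f"M=0.25: {b_high:.2f} kg per recruit")
print(f"Higher M shrinks the unfished stock by {100 * (1 - b_high / b_low):.0f}%")
```

A stock that cannot grow as large as previously believed is, by definition, closer to its new, lower target.  That is how a biological argument about male mortality translated directly into avoided harvest cuts.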

However, despite all the crowing that took place at the time, that “win” wasn’t as decisive as some in the industry would have us believe.  In fact, the data supporting the higher natural mortality rate appeared fairly weak.  In the end, the three scientists on the peer review panel had no deep conviction about the higher figure at all, with one, Dr. Michael Armstrong of the Centre for Environment, Fisheries and Aquaculture Science in Suffolk, England, writing that

“The specific [natural mortality] by sex will be a function of the physiological determinants of longevity in males and females as well as the abundance of predators taking different sizes of summer flounder.  The specific value for summer flounder at present cannot be determined from existing data… I do not have a basis for arguing against the [Southern Demersal Working Group’s] expert judgment in proposing a combined-sex value of M=0.25…  [emphasis added]

“Overall, the strength of evidence for the adopted value is not strong but the evidence considered suggested the value was unlikely to be below the previously assumed value of 0.2 and could be higher than the newly assumed value of 0.25.  The [Stock Assessment Review Committee] was in no better position to determine an alternative value and accepted the [Southern Demersal Working Group]-adopted value of 0.25…

“My view is that the use of M=0.25 for assessment, reference point calculation and stock status determination is justifiable, although the strength of evidence…for the specific value is weak.  [emphasis added]”

Those are hardly ringing endorsements, and they demonstrate industry-paid scientists’ potential to skew the assessment process.

They are professionals, with too much integrity to knowingly present bad or falsified data.  However, just like an “expert witness” hired to provide testimony in a courtroom trial, the hired guns for the fishing industry are paid to advance the industry’s cause, which is generally either killing more fish or avoiding harvest reductions.

They do that either by raising biologically plausible alternatives to the current management approach—alternatives such as higher mortality for male summer flounder—or by trying to impeach unfavorable data, challenge stock assessment models, and question the methodologies used to survey the stock.

One fairly recent example of that was the pollock assessment, where an expert hired by the commercial fishing industry successfully argued that the reason no pollock more than eight or nine years old were being caught—either by fishermen or in the scientists’ surveys—was that such fish managed to escape the nets, not that they didn’t exist.  The fishermen’s hired gun managed to convince the stock assessment panel to substantially raise the annual harvest limit, justifying it by the existence of “cryptic” fish—pollock that nobody saw, but that were nonetheless believed to be out there.
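
The logic is easy to sketch.  The toy calculation below uses my own invented numbers, not the actual pollock assessment: zero catches of old fish are consistent both with gear that catches all adults equally (meaning the old fish are simply gone) and with “dome-shaped” selectivity, under which big fish escape the nets (meaning the old fish are there but invisible).  Which story you accept changes the biomass you credit to the stock.

```python
# Toy sketch (my own numbers, not the pollock assessment) of the "cryptic
# biomass" argument. If the nets' selectivity is dome-shaped and drops to
# roughly zero past age 8, old fish exist but are never sampled -- and
# plain exponential survival says how much biomass that assumption adds.

import math

M = 0.2  # assumed natural mortality rate

biomass_seen = 0.0     # ages 3-8: ages the gear samples well
biomass_cryptic = 0.0  # ages 9+: invisible if selectivity is dome-shaped

for age in range(3, 16):
    numbers = math.exp(-M * age)   # survivors per recruit at this age
    weight = 1.0 + 0.5 * age       # illustrative weight-at-age (kg)
    if age <= 8:
        biomass_seen += numbers * weight
    else:
        biomass_cryptic += numbers * weight

print(f"Biomass the surveys can see:     {biomass_seen:.2f}")
print(f"'Cryptic' biomass, if it exists: {biomass_cryptic:.2f}")
print(f"Believing in it raises the stock by "
      f"{100 * biomass_cryptic / biomass_seen:.0f}%")
```

Accept the dome, and a substantial block of biomass, and of quota, appears on the books without a single one of those fish ever having been seen.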

Such a faith-based stock assessment even managed to pass peer review, although not without reservation, as the peer review panel warned that

“There is, however, significant concern over the presumed large and as yet unobserved adult biomass (i.e. cryptic biomass) and the implications for management decisions.  For this reason, the Panel recommends a risk analysis approach…to determination of the consequences of assumptions on this biomass for management.  In addition, the Panel emphasizes the need for research that would confirm (or not) its existence.”

Which leads to the question, “Is it better to take a precautionary approach, and cut harvests to protect the fish that we know are there, or should we increase harvests in the belief that fish that no one—including the fishermen—has ever seen really exist in sufficient numbers to support the population?”

I know what my views are on that, and they’re probably 180 degrees opposed to those of the fishing industry’s hired guns.

And examples like these make me pretty leery of “data” supplied by fishermen, whether recreational or commercial.

That’s probably enough reason not to accept fishermen’s data, but what about data provided by “universities, and research institutions”?

The answer to that probably is “It depends.”

There’s a lot of good work being done in such places by some very bright and dedicated people.  Some are experienced scientists teaching the next generation of fisheries biologists; some are the graduate students themselves, investigating novel questions in fisheries management as they earn their place in academia.  

But researchers are human, and both financial and personal biases can impact the research that’s done.

Competition for grants is formidable, and someone who receives significant and ongoing outside funding—say, from a petroleum company, for a continuing study assessing the impact of oil spills on fish populations—might be reluctant to present findings that clearly show that their funder is causing real harm.  

Studies might even be set up to prove that oil spills are benign.

The same sort of thing happens when researchers work closely with the fishing industry.  They often seem to lose their objectivity.  Instead of conducting unbiased research, they enter into projects best calculated to support the industry’s goals.

One of the biggest examples of that sort of industry-supported research was the so-called “Research Set-Aside Program” conducted by the Mid-Atlantic Fishery Management Council.  

Pursuant to that program, 3% of the annual harvest limit for each council-managed species was set aside, and auctioned off by the National Fisheries Institute, a commercial fishing trade organization.  Fishermen who purchased the fish at auction were allowed to catch them during the closed season, when they would bring a better price and thus make the investment worthwhile.  The money that they paid funded research, most of which (except for the NEAMAP survey of fish abundance) was related to the use of fishing gear.

Unfortunately, a lot of the science that the program produced wasn’t very good, and the program was used by unscrupulous fishermen to hide massive illegal harvests.  As a result, earlier this month, the Mid-Atlantic Council decided to take a long look at the research set-aside program and better determine its worth.

Dr. Richard J. Seagraves, an experienced fisheries scientist who serves as the Council’s Senior Scientist, didn’t mince words.  He noted that

“…while there were projects which produced tangible results that were subsequently incorporated into the Council’s management programs…there were also a number of projects which, after completion, failed to pass peer review and could not be used for science or management purposes.
“The fact that a number of RSA Projects failed scientific review after completion raised major concerns about the process by which RSA Proposals were vetted and the oversight of the projects as they were being conducted…considering the costs associated with administration and enforcement, as well as the value of the RSA quota, it’s probable that the program costs have far outweighed the benefits to the Council and public.”

If a council-sponsored program such as the Mid-Atlantic Council’s RSA program couldn’t pass scientific muster, how much faith can we have in “data, analysis, stock assessments, and surveys from nongovernmental sources, including fishermen, fishing communities, universities, and research institutions” that are subject to no council oversight at all?

America’s fisheries resources are a national treasure.  In order to rebuild them and keep them healthy, more and better science is needed.

But what our fish stocks don’t need is to have management plans diluted and corrupted by shoddy, biased and agenda-driven research that calls itself “science” but cannot stand up to rigorous peer review.

And that’s why the Senate reauthorization bill can use a little more tweaking, to ensure that fishery management decisions—and decision-makers—are not led astray by bad information produced by folks with more interest in their own bottom lines than in our fisheries' future.



