50 years' worth of studies show hatcheries negatively impact salmonids

Matt Paluch

Steelhead
Forum Supporter
There's some awesome science linked to in the article.

Here's a link to the study the article is based on:

Finally, here's a link to a searchable database of the studies they analyzed for the meta-analysis:
 

Matt B

RAMONES
Forum Supporter
Excuse me if I don't put much weight in a half-assed meta-analysis.
No problem whatsoever, but your assertion of half-assedness would be better taken with some justification. Otherwise it's a pretty boring conversation. ("It's good!" "No it's not, it's bad!")
 

the_chemist

Steelhead
Forum Supporter
No problem whatsoever, but your assertion of half-assedness would be better taken with some justification. Otherwise it's a pretty boring conversation. ("It's good!" "No it's not, it's bad!")
Let me quote the actual paper.

"Ours was not a formal meta-analysis of quantitative effects, nor an assessment of fisheries that hatcheries can provide"

"Not a formal meta-analysis" is scientific lingo for half-assed.

A couple of glaring observations.

1. This field of research is too small for this type of analysis. Meta-analyses need very large data sets to cut through the noise of comparing different studies (think breast cancer research: libraries of studies, not hand-selected papers that would be lucky to fill half a shelf). I'd bet $20 this started as a meta-analysis, and when they didn't find any statistically significant effects it was reworked into the current format.

2. Their selection criteria are ambiguous, and they added bias by including preselected papers that didn't come up in their literature screen.
 

Tallguy

Steelhead
Let me quote the actual paper.

"Ours was not a formal meta-analysis of quantitative effects, nor an assessment of fisheries that hatcheries can provide"

"Not a formal meta-analysis" is scientific lingo for half-assed.
I will throw an observation into the ring here, with my science and researcher hat on in terms of basic reading:

Science generally communicates precisely. The statement listed above is written as "not a formal meta-analysis OF QUANTITATIVE EFFECTS".

The above statement does not preclude it being a fully assed, well-done, and complete meta-analysis of qualitative effects, such as "adverse", "benign", or "beneficial". It's the difference between direction and degree. I know that's a lot for some people.

Have I read it all, and do I have any opinion on its findings? No. But it seems important to recognize exactly what the authors are trying to communicate before their results and hard work are disparaged, taken out of context, or twisted.
 

Tallguy

Steelhead
My second observation: 206 papers met their selection criteria. Must be a big bookshelf if 206 papers can't fill half of it. Statistically speaking, if they were going to bias the trend by hand-selecting anything, they would have had to hand-select, or leave out, somewhere around 100 papers to arrive at an 83%/17% split in the results from 206. Seems like a lot to me, but I don't know the field.
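The "around 100 papers" guess can be sanity-checked with quick arithmetic. A minimal sketch, assuming (my assumption, not anything from the paper) that an unbiased result would have been an even 50/50 split:

```python
# Back-of-the-envelope check of the "how many papers to bias the split" claim.
# Assumption (mine, not the paper's): the unbiased baseline would be an even
# 50/50 split between adverse and non-adverse findings.
adverse, total = 170, 206        # 83% of 206 papers reported adverse effects
non_adverse = total - adverse    # 36

# To turn an even split (170 vs 170) into the observed 170 vs 36, the authors
# would have had to quietly omit this many non-adverse papers:
omitted = adverse - non_adverse
print(omitted)  # 134 -- same ballpark as the ~100 guessed above
```

So under that (generous) baseline, the required hand-selection is even larger than 100, which makes deliberate bias at that scale seem implausible.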
 

Matt B

RAMONES
Forum Supporter
As a certified Arm Waving Ecologist, I'll also observe that ecological studies frequently are only able to make suggestions or draw conclusions with a level of certainty that is unsatisfying to folks accustomed to fields that are necessarily more robustly quantitative, like medicine or chemistry. It's the nature of the beast, given the inherent limitations of ecological research capacity and knowledge.

I'll dig back into the paper at lunch.
 

nwbobber

Steelhead
Forum Supporter
I'd be more concerned about cherry-picking if they had removed documents outside their screening procedure than about adding documents that were referred to by other documents. The point is to find all of the relevant studies, whether you like what they say or not. The more studies they use, the more useful the data at the end.
I am not a scientist, but I have always been interested in science. Environmental science is chronically underfunded and is never going to get grants the way something like breast cancer does, so a study like this is never going to have that much material to digest. It is a good thing to gather all the work that has been done in the past, looking for clues that show a trend one way or another.
Those casting aspersions make me wonder whether it's really that the conclusions don't fit their particular biases, rather than that they have real questions about the way the study was conducted.
 

wmelton

Steelhead
Forum Supporter
1. This field of research is too small for this type of analysis. Meta-analyses need very large data sets to cut through the noise of comparing different studies (think breast cancer research: libraries of studies, not hand-selected papers that would be lucky to fill half a shelf).

2. Their selection criteria are ambiguous, and they added bias by including preselected papers that didn't come up in their literature screen.
1. They use ~200 studies and find that 83% of them indicate adverse effects. I think any statistician would agree they have a sufficient sample size to call that finding statistically significant. A "very large data set" is certainly not necessary; the quality of the data is much more relevant. Even in the medical field, meta-analyses are often made up of fewer studies than this one and are still considered solid.

2. The selection criteria don't seem at all ambiguous to me; they are actually quite prescriptive. From what I can tell, if you had the free time, you could sort through the ~10,000 studies and end up with the same ~200. They don't explain the preselected papers at all, so I'll give you that. Otherwise, I don't see any glaring reason to be critical of the selection criteria. Reading through the ~200 studies is the only way to judge the quality of the data.
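To put a number on the sample-size point: here is an illustration (my own, not anything from the paper) of an exact one-sided binomial test asking whether 170 adverse results out of 206 could plausibly arise if adverse and non-adverse findings were equally likely:

```python
from math import comb

# Illustration (not the paper's analysis): exact one-sided binomial test.
# Null hypothesis: each study is equally likely to report adverse or
# non-adverse effects (p = 0.5). How likely is 170+ adverse out of 206?
n, k = 206, 170
p_one_sided = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(p_one_sided)  # astronomically small, so n = 206 is ample here
```

Whatever one thinks of the screening, 206 is far more than enough studies for an 83/17 split to be distinguishable from noise.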

"Ours was not a formal meta-analysis of quantitative effects, nor an assessment of fisheries that hatcheries can provide"

"Not a formal meta-analysis" is scientific lingo for half-assed.

I'd bet $20 this started as a meta-analysis and they didn't find any statistically significant effects so it was reworked into the current format.
Not going to pile on to what Tallguy said, but these quotes are taken completely out of context.

I'd be more concerned about cherry picking if they had removed documents outside their screening procedure than adding documents that were referred to by other documents. The point is to find all of the relevant studies, whether you like what they say or not.
This is exactly right. There is no subjective element to the selection process except for the 9 mystery studies. If someone can make an argument for how the criteria introduce bias, be my guest, but from what I can tell the only glaring bias comes from the 9 additional studies. Now let's remember that 83% (170/206) of all the studies found adverse effects. Assuming all 9 of the introduced studies found adverse effects, 82% (161/197) of the remaining studies also found adverse effects.
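That sensitivity check is easy to verify directly. The "all 9 mystery studies found adverse effects" worst case is the assumption stated above:

```python
# Sensitivity check: drop the 9 "mystery" studies, assuming (worst case,
# per the post above) that all 9 reported adverse effects.
adverse, total = 170, 206
adverse_wo, total_wo = adverse - 9, total - 9      # 161 / 197

print(round(100 * adverse / total))       # 83
print(round(100 * adverse_wo / total_wo)) # 82
```

Even in the worst case, removing the 9 preselected papers barely moves the headline percentage.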
 

FinLuver

Native Oregonian…1846
Popcorn with extra butter, a box of Good n Plenty, and a large root beer, please….

(These types of threads pop up occasionally…accomplishing about the same…and I like a good rerun) 😉
 