One problem with a lot of these studies is that they suffer from heavy confounding, and estimates of lead's harm seem to be rising even as exposure decreases, which should give people pause. But this doesn't come up much, because lead is obviously bad for you, there's generally no good reason not to avoid exposure, and few people care precisely how bad it is.
But it's weird that lead exposure has been dropping while the things it's supposed to cause don't seem to be decreasing in proportion to the drop in exposure.
The idea that the effects should grow is hard to make sense of, because what's grown is only the lead exposure gap between classes, not the IQ gap. An unchanged IQ gap alongside a changing lead gap suggests the lead effect is picking up confounding rather than becoming larger.
The IQ gap would have to grow to be consistent with the claim that sensitivities have gone up, unless both groups have trended down in lead levels non-linearly.
> estimates of lead's harm seem to be rising even as exposure decreases, which should make people stop and think more
Is the effect size rising, or the confidence? The latter makes sense: we're moving from a population systematically exposed to lead (no real control group) to one with lead-free kids for a change.
Good studies from the different research eras can be used to illustrate how lead effect sizes have changed over time.
Landrigan et al. (1975) represents the Early Era. In this study, there were 46 children in the high lead group and 78 in the control group. Their respective BLLs were 48.3 and 26.9 μg/dL in 1972 and 40.5 and 26.5 μg/dL in 1973, and they were 8.3 and 9.3 years old, respectively. So we have a gigantic 14 μg/dL gap between these groups (using the 1973 measurements). The high lead group had an average IQ of 88.02 versus 92.88 for the low lead group, a 4.86-point IQ gap, and thus a per-μg/dL effect of 0.35 IQ points.
Baghurst et al. (1992) represents the Middle Era. In this study, there were 494 children who had IQ results, and they were divided into quartiles by BLLs. The mean blood lead concentrations at assessment age were 6.6 μg/dL for the lowest quartile, 10.1 for the second, 13.7 for the third, and 20.0 for the highest. Their IQs were 109.6, 107.7, 102.7, and 98.7, respectively. Going from the lowest to the highest lead exposure quartile, we have a BLL difference of 13.4 μg/dL and an IQ difference of 10.9 points. Going quartile to quartile, the effect per μg/dL of lead was 0.54 IQ points, then 1.39 IQ points, then 0.63 IQ points, with the aggregate (Q1 to Q4) being 0.81 IQ points.
Kim, Yu & Lee (2010) represents the the Modern Era. In this study, there were 302 children who were median-split by BLLs. The high BLL group had a mean BLL of 3.74 and the low BLL group had a mean of 1.92 with IQs of 106.4 and 110, respectively. These differences of 1.82 μg/dL and 3.60 IQ points mean that the per μg/dL IQ drop was 1.98 points.
You think the effect of a toxin follows a certain dose-response curve. A giant initiative to remove it is moderately successful, but the harm numbers in the population are not coming down as much as you expected.
Is it that the toxin is correlated with another substance that is responsible for part of the harm? Is it corruption, with the cleanups faked, so that exposure is underreported while outcomes are not? Or did you underestimate the harm at lower exposures (maybe because you underestimated the exposure of some of your subjects)?
It'd be hard for all the studies to mismeasure the lead levels in people's blood, though. How would one get all the different people studying the subject to conspire?