The papers I read are mainly physics papers, especially particle physics. Non-replicated results are so rare there that they often get significant attention in the community (blog article) or even mainstream media (the OPERA neutrino speed measurement).
The usual study-and-publication process for a new particle detector looks like this:
identify particles flying through the detector (known for >50 years)
find the decays of frequent short-lived particles (known for >30 years) and use them as calibration
look for other known particles and compare their masses and decays with the existing values
look for known decay modes of those particles and related properties, compare them with existing values and improve them by a significant factor
find new things
Completely new measurements are just a small fraction of the studies; most results confirm earlier experiments and improve their precision.
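To make the "improve the precision" step concrete, here is a minimal sketch of inverse-variance weighted averaging, the standard way independent measurements of the same quantity are combined. The numbers are invented for illustration; this is not any experiment's actual analysis code.

def combine(measurements):
    """Combine independent (value, uncertainty) pairs via inverse-variance weighting."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    value = sum(w * x for (x, _), w in zip(measurements, weights)) / sum(weights)
    sigma = sum(weights) ** -0.5  # combined uncertainty is smaller than any input
    return value, sigma

# Hypothetical mass measurements in MeV: an older result and a newer, more precise one.
old = (1864.8, 0.5)
new = (1864.84, 0.1)
print(combine([old, new]))  # combined uncertainty ~0.098 MeV, better than either input

The point of the sketch is simply that each confirming measurement tightens the world average a little, which is why "mere" replications still get published and cited.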