If a manuscript meets scientific standards and contributes to the advancement of science, it can be expected that two or more reviewers will agree on its value. Manuscripts are rated reliably when there is a high level of agreement between independent reviewers. This study investigates for the first time whether inter-rater reliability, which is low under the traditional model of closed peer review, is equally low under the newer system of public peer review, or whether public peer review yields higher coefficients. To investigate this question we examined the peer-review process practiced by the interactive open access journal Atmospheric Chemistry and Physics (based on 465 manuscripts submitted between 2004 and 2006, receiving 1,058 reviews in total). The results of the study show that inter-rater reliability in public peer review is low when measured by the kappa coefficient and reasonable when measured by the intraclass correlation coefficient.
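Cohen's kappa, one of the agreement measures named above, corrects the raw fraction of identical ratings for the agreement two raters would reach by chance alone. A minimal sketch of the computation for two raters, using hypothetical ratings on a three-point scale (the data and category labels are illustrative, not taken from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters judging the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: assume the raters are independent and
    # each uses categories with their own marginal frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical reviewer recommendations for six manuscripts.
a = ["accept", "revise", "reject", "revise", "accept", "reject"]
b = ["accept", "reject", "reject", "revise", "revise", "reject"]
print(cohens_kappa(a, b))  # prints 0.5
```

Here the raters agree on 4 of 6 items (0.667 observed agreement), but chance agreement is 0.333, so kappa drops to 0.5; a kappa near 0, as reported for peer review, means agreement barely exceeds chance.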
Learned Publishing is the journal of the Association of Learned and Professional Society Publishers, published in collaboration with the Society for Scholarly Publishing. The journal is published quarterly in January/April/July/October.