The academic social network Academia.edu now has more than 25 million registered participants, who use the network mainly to post papers. Its new tool, called Sessions, gives researchers constant access to peer critiques of their work and can broaden the circle of colleagues who interact with their research.
“In Sessions, researchers upload a draft paper and then invite a list of other scholars on the network to comment on it during a 20-day period. After that time, the author can either extend the session for another 20 days or close off comments”.
Read more in this article published by The Chronicle of Higher Education: Academic Social Network Hopes to Change the Culture of Peer Review (Sept. 25, 2015).
Sessions uses the Scribd viewer to display content in your browser, along with a hypothes.is-like annotation system that allows every participant to anchor comments anywhere in the document, or simply to post a general comment.
Open peer review is based on the idea of transparency: the identities of those reviewing each work are disclosed, as opposed to anonymous commenting, known as anonymous or blind peer review.
- Authors invite expert peers to formally evaluate their work posted in any online archive (libraries, repositories, preprint servers, etc.).
- Reviewers who accept submit a detailed qualitative and quantitative assessment of the work.
- The reviewer’s name and any conflict of interest are publicly disclosed.
- Reviews are published with a creative commons license (or similar) and become publicly available along with the original work.
- Reviews are subject to commentary and evaluation by the entire community.
- Author-guided open peer review can be implemented at any stage of an article’s lifetime: (a) before journal submission, (b) during journal peer review (in agreement with the journal’s editor), and (c) after journal publication.
Why can Open Peer Review be interesting?
- Traditionally, comments were managed by journal editors, who could therefore control access to each work and its subsequent review.
- Anonymity allowed comments to be skewed: friends or colleagues of the author could leave unduly positive commentary, while critics could hide behind anonymity to leave harsher criticism.
- The accountability of an attached reviewer profile gives reviewers an incentive to provide helpful, high-quality reviews, and lets authors contact them directly about further work.
- Lower costs (no publisher/editor monopoly).
- Aims to push scholarly advancement further.
- Encourages collaboration and more reviewing, rather than academic competition.
The Libre project has adopted free Open Peer Review as a platform for a community of volunteer scholars to share and develop their work, and its popularity is growing.
To find out more about Open Peer Review, read this article!
John Kratz of the California Digital Library recently published an article entitled ‘Fifteen ideas about data validation (and peer review)’.
He describes it as a “longish list of non-parallel, sometimes-overlapping ideas about how data review, validation, or quality assessment could or should work,” and lays out fifteen observations and recommendations to improve the process.
Problems with data validation often arise because academic researchers frequently publish only raw datasets alongside their articles. As a result, it can be difficult to assess the reliability and relevance of this data.
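To make the problem concrete, the kind of basic checks a data review might automate can be sketched as follows. This is a minimal illustration only: the schema format, column names, and value ranges are assumptions invented for the example, not part of any standard validation tool or of Kratz’s proposals.

```python
# Illustrative sketch: automated sanity checks on a raw tabular dataset.
# The schema format (column -> (type, optional (min, max) bounds)) is a
# hypothetical convention chosen for this example.

def validate_records(records, schema):
    """Return a list of human-readable problems found in the records."""
    problems = []
    for i, row in enumerate(records):
        for column, (expected_type, bounds) in schema.items():
            value = row.get(column)
            if value is None:
                problems.append(f"row {i}: missing value for '{column}'")
                continue
            if not isinstance(value, expected_type):
                problems.append(f"row {i}: '{column}' is not {expected_type.__name__}")
                continue
            if bounds is not None:
                low, high = bounds
                if not (low <= value <= high):
                    problems.append(f"row {i}: '{column}'={value} outside [{low}, {high}]")
    return problems


# Hypothetical dataset: pH measurements at sampling sites.
schema = {"ph": (float, (0.0, 14.0)), "site": (str, None)}
records = [
    {"ph": 7.1, "site": "A"},
    {"ph": 19.0, "site": "B"},   # outside the plausible pH range
    {"site": "C"},               # measurement missing entirely
]
print(validate_records(records, schema))
```

Checks like these catch only mechanical errors; as discussed below, genuine review of a dataset also requires human judgement about metadata, context, and fitness for purpose.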
Whilst, as the author notes, there are some mechanisms in place to validate data, they are severely lacking compared with those in place for citations, for example, where several widely recognised styles already exist.
This is somewhat surprising: data validation is clearly important to the credibility of an academic article, so strong mechanisms, and even a standard procedure, should be in place to ensure it.
One theme running throughout Kratz’s ideas is the depth to which the data needs to be reviewed: not by one person alone, but divided among people or even organisations. Both data and metadata should be reviewed, not only by other academics but also by experts in the field, the community, and the users of the data. Similarly, beyond formal validation, actual use of the data is a form of review in itself, confirming its true relevance and application and establishing whether it really is fit for purpose.
Watch this video from Nature for more information on the subject: