Everyone who has published a paper has gone through the peer-review process. Many people and (technical) systems take part in this process and, naturally, its outcome is not always to the benefit of the authors of the submitted manuscript; and even if it is, the experience can sometimes be unsatisfactory. For instance, you may have had to convert your entire 30-page review manuscript into a doc file because, after acceptance, the typesetters would not accept your tex file… Wouldn’t it be great to share such experiences with other people? Or to read about such experiences before you submit?
SciRev is about sharing and rating these experiences. Last week, the SciRev team emailed me, asking me to have a look at their website and to tell my colleagues about it. I had never heard of SciRev before, but I was immediately excited about it!
Jolene over at Vox magazine (NL) and the people at Hedda have already discussed several aspects of reviewing the peer-review process in general. Thus, I thought I would focus more on the implementation in this blog post. This post is a review of a review site, which is about reviewing the peer-review process by other peer reviewers. True peer-reviewception!
When visiting SciRev.sc, the user is presented with two options to search the SciRev database: by journal title or by scientific discipline. The former option is intended for finding a specific journal to get information about it and perhaps add a rating/review. The latter option lists all journals belonging to a discipline such as ‘organic chemistry’, ‘cheminformatics’, or ‘materials science’ (unfortunately, there is no ‘analytical chemistry’). Additionally, the list gives an overview of different aspects of each journal’s peer-review process, such as the average duration of the first review round, the number of reviewers, and ratings of the quality and difficulty of the reviewing process. The user can sort the list and thus, theoretically, find the best option for the next submission.
However, at the moment this feature still lacks a decent number of ratings and participants. The SciRev team claims that there are over 500 reviews so far, but they are distributed over hundreds of journals. Hence, some journals already have one, two, or maybe three reviews/ratings, but thousands of journals have no content at all. Of course, filling this gap is not (directly) the duty of the SciRev team (two members at the moment, see below) but of the research community. If we as researchers want to support such a project (and I think it is a very good idea), we have to participate and provide information to push the number to 50k or 500k reviews!
To add a review, one has to create an account, go to any of the journal pages on SciRev, and click ‘Review this journal’. For convenience, the user is guided by a questionnaire and only has to rate the individual steps of the peer-review process or provide the number of days between certain ‘events’, such as submission and the editor’s answer. At the end, the user can comment on the journal’s peer-review process as well as on SciRev’s questionnaire. The whole process takes just a couple of minutes. Altogether, a very easy and well-thought-out system!
Unfortunately, an (unknown) error occurred the first time I submitted a review, and all my data was lost. Maybe the submitted POST data should be printed (if possible) on the error page so one can copy and save it (the second time, I kept a backup in Notepad). There also seems to be an issue with the log-in timeout or the log-in system in general: if I stayed too long on a page (including the review page), it logged me out and I had to log in again. Nevertheless, the second time I submitted my review, it worked flawlessly.
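To illustrate the suggestion above: a minimal, framework-agnostic sketch (in Python, with hypothetical field names — I obviously don't know how SciRev is implemented) of how an error page could echo the submitted form data back to the user so nothing is lost:

```python
from urllib.parse import parse_qs

def render_error_page(raw_post_body: bytes) -> str:
    """Build an error page that includes the user's submitted form data,
    so the user can copy it and retry later without retyping everything."""
    fields = parse_qs(raw_post_body.decode("utf-8"))
    lines = [f"{name}: {', '.join(values)}" for name, values in sorted(fields.items())]
    return (
        "An error occurred while saving your review.\n"
        "Your submitted data (copy it somewhere safe):\n" + "\n".join(lines)
    )

# Example with made-up questionnaire fields:
page = render_error_page(b"journal=Example+Journal&days_first_round=42")
print(page)
```

In a real application, the echoed values would of course need to be HTML-escaped before being rendered into the page.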
The people at Hedda raise some issues regarding the balance of the comments and reviews. For instance, people who have had bad experiences with a journal’s peer-review process are much more likely to write a review, and therefore a bad rating, than those who have not. Additionally, inefficient processing in the past might lead to an excess of bad ratings even if the journal has since updated and improved. I share these concerns, but I think that if enough researchers participate in rating peer-review processes, this could definitely work out!
What I am missing is a rating system for the comments, like those on the pages of big online shops. Other people could then see and judge whether a comment is just the product of frustration or a good, impartial review. Imagine that a journal has 100 comments. Wouldn’t it be great to have two columns, with the good experiences on the left side and the bad experiences on the right side, sorted by the rating of these comments? At the top there would be the best-rated good experience versus the best-rated bad experience, which would be extremely valuable!
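The proposed layout boils down to a simple partition-and-sort. A small sketch (the data structure is invented for illustration; helpfulness votes would come from the readers rating the comments):

```python
def two_columns(comments):
    """Split comments into good- and bad-experience columns, each sorted
    by helpfulness votes in descending order.

    comments: list of (text, experience, helpfulness_votes) tuples,
    where experience is 'good' or 'bad'.
    """
    good = sorted((c for c in comments if c[1] == "good"), key=lambda c: -c[2])
    bad = sorted((c for c in comments if c[1] == "bad"), key=lambda c: -c[2])
    return good, bad

# Made-up example comments:
comments = [
    ("Fast, fair reviews", "good", 12),
    ("Editor never replied", "bad", 30),
    ("Helpful reviewer comments", "good", 25),
    ("Lost my figures twice", "bad", 7),
]
good, bad = two_columns(comments)
print(good[0][0])  # best-rated good experience: 'Helpful reviewer comments'
print(bad[0][0])   # best-rated bad experience: 'Editor never replied'
```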
It is ‘OK’ to read that everything went well, but the most valuable contributions are those comments that describe problems during the peer-review process. What was the problem? Was it caused by the reviewer (bad mood?), by the author, or by the publisher (technical reasons or license issues)? How (if at all) did the editor try to solve the problem? Was the solution reasonable or just easy? Were all involved persons always polite? Was the paper accepted or rejected in the end? And so on. Those are the stories I want to read, and they should be easily accessible.
Finally, I wanted to know more about the people behind SciRev, who came up with the idea, and who wrote me the email (it was only signed ‘SciRev Team’). At the moment, SciRev is maintained by two people: Jeroen Smits and Janine Huisman. Jeroen Smits is an associate professor at the Nijmegen Center for Economics (Radboud University Nijmegen, NL). He studied sociology and psychology. According to his personal homepage, his central research interests are
Unfortunately, only short biographies are available on SciRev itself, without additional links or further information about the two authors. Thus, I had to look them up at their respective institutes to get an impression of them. I would suggest expanding the biographies a little, so (especially new) visitors get a better impression of the authors: Are they students? Are they experienced researchers? What and where have they published? In short: what is their research history or background? And maybe: what is each author’s (personal) motivation for building a project like SciRev?
Sharing experiences is very valuable and important in science. This is true not only for research itself but also for the peer-review process. Reviewing, rating, and commenting on it on a public platform is a great idea, and SciRev has made a very good start towards such a platform with a well-thought-out system. The only thing missing now is more content, i.e. more reviews. So write one! Now!