2024 ISMRM Abstract Review Instructions
Online Review & Scoring Deadline:
Please read all instructions carefully before you begin.
Thank you for agreeing to review abstracts for the 2024 ISMRM Annual Meeting. Your role as a reviewer is integral to the success of the scientific program, and your efforts are greatly appreciated. Your reviews will assist the Annual Meeting Program Committee (AMPC) as they construct the scientific program for the Annual Meeting. Please read the following instructions and check the acknowledgement before reviewing your assigned abstracts. Finally, please note that the content and scoring of all abstracts are strictly confidential.
NEW THIS YEAR:
Impact, Quality and Novelty: We are bringing back scoring in 3 areas, using the same 7-point scale as in recent years. While all 3 areas are important, we realize that some abstracts may be stronger or weaker in different areas. Impact is the overall contribution to the field and how the work may change science or practice. Novelty is simply how new the work is relative to the field. Quality covers the research study design, justification of methods, rigor of the analysis, and the clarity and organization of the figures and presentation. You may use your judgment in applying these areas and definitions.
Comments and Normalization: We ask that you provide brief comments whenever you can, and we do require comments to justify any score of 1 or 7 in any area. Taking 30 seconds to write a comment is very helpful to the program committee. We will normalize your scores so that all reviewers’ scores impact the program.
Group Sizes and Match: This year we have more categories and review groups in an effort to better match reviewer preferences with abstract primary and secondary categories. Most reviewers will receive abstracts from different review groups. Please try to score consistently across topic areas, and avoid scoring one group higher than another.
Journal Recommendation (new last year): In an effort to streamline submission, review, and publication of strong abstracts in our journals, we invite you to flag 2 to 3 (or about 5%) of your assigned abstracts as being of sufficiently high quality to merit a journal submission. This information will be sent to the MRM and JMRI editors, who will have the discretion to use it as they see fit.
When asked whether you recommend an abstract as a journal submission, please answer no if you do not. If you answer yes, additional questions will appear asking you to:
(1) recommend a journal,
(2) specify your willingness to review the resulting paper, and
(3) offer high-level comments to the authors, if you have any, to guide them in preparing an article.
All subsequent processes using this information (invitations, review, blinding of the review, and decisions) are entirely at the discretion of the journal editors. Reviewer names are kept confidential by the ISMRM and the journals. Note that this information will NOT be shared with the Annual Meeting Program Committee during final selection; only the scores and main comments are passed on.
Again, the goal of this option is not only to streamline submission, review, and publication of strong abstracts, but also to incentivize authors of the best abstracts to write them up as full papers and to help feature our best ISMRM meeting content in our own journals.
TECHNICAL NOTE:
We have found that Google Chrome sometimes cannot render certain LaTeX elements. For example, the \mathcal symbol cannot be rendered:
https://proofwiki.org/wiki/Symbols:%5Cmathcal
We have found this to be an issue specifically with Chrome running on M1 Macs (macOS Ventura); Chrome on other platforms appears to render correctly. Please be aware of this issue during the abstract review process, as many equations may not show up correctly on some reviewers' screens. An example follows.
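As an illustration, here is a minimal LaTeX expression of the kind that may be affected (the equation itself is hypothetical, chosen only because it uses \mathcal for the Fourier transform operator common in MR signal equations):

    \[ \mathcal{F}\{\rho(\mathbf{r})\} = \int \rho(\mathbf{r})\, e^{-i 2\pi\, \mathbf{k}\cdot\mathbf{r}}\, d\mathbf{r} \]

On an affected Chrome/M1 setup, the \mathcal{F} may appear blank or as raw source rather than as a calligraphic F.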
FULL REVIEW INSTRUCTIONS:
A. Getting Started: After you log in, you will see the Summary Table followed by a list of abstracts assigned to you. The Summary Table summarizes your abstract scores, including your average score, standard deviation, and range of scores; it is provided to help you achieve a diverse range of scores. This screen will remain visible throughout the review process.
Below the Summary Table, you will see the list of abstracts that are assigned to you. You may start reviewing the first abstract (press “Enter Review”) or you may batch download the entire stack of your assigned abstracts if you wish to read them offline. If you would like to batch download all of your abstracts, please check the box to the right of “Status.” This will select all of your abstracts.
If you choose to batch download, a window with all of the abstracts listed sequentially will pop up. Please be sure that the download has completed as this may take some time. You may save these as a PDF file, print them, or save as an HTML file. With the HTML format, you can zoom in and out of the documents.
To start the review of an individual abstract, click the "Enter Review" button on the right. This will open the individual review window below. Use the View Abstract button to bring up a pop-up window with the individual abstract. Clicking on the figure thumbnails will allow you to view the individual figures in detail.
B. Review and Scoring Guidelines: Please read the following carefully. The following sections require your input:
1. Conflict of Interest (required): Please indicate if you have a conflict. A conflict exists when you are a co-author, the work is from your institution/employer or a close collaborator, you hold patents directly related to the research, or there is any other reason generally considered a conflict. Blinded review can make it difficult to identify all conflicts based on authorship, but we ask that you do your best. To indicate a conflict, click "yes" and hit the Submit button; your review for this abstract is then complete.
2. Scores (required): The purpose of the scores is to assist the AMPC in assembling the program sessions, a complicated task that reflects the strong interdisciplinary nature of the society. This year we are asking you to score 3 areas for each abstract. Be aware that the average of the 3 area scores IS used for ranking (a worked example follows the list below) – excellent research should score well in all:
- Impact – the degree to which the work will influence the field, including how likely it is to change scientific discovery, clinical practice, or further technique development and capability.
- Novelty – reflects how new the work is, or whether it is somewhat incremental. Work does NOT have to be novel to have high impact! Novelty can include taking existing research in a new direction or area.
- Quality – reflects the quality of the research (including study design, appropriateness of the research goal/question, and rigor of the data analysis) AND the clarity with which the work is presented. Note that work may be high in Quality yet receive only medium scores for Impact and Novelty, or the reverse.
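For illustration, the ranking score is simply the arithmetic mean of the three area scores; the numbers in the example below are hypothetical:

    \[ \text{ranking score} = \tfrac{1}{3}\left(\text{Impact} + \text{Novelty} + \text{Quality}\right), \qquad \text{e.g.,}\quad \tfrac{1}{3}(6 + 4 + 5) = 5.0 \]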
Past feedback is that reviewers already use these aspects in forming their overall score, so providing the individual scores is helpful to the program committee.
For each of the 3 review areas, please use the following 1-7 scale, again trying to spread your scores and aiming for a mean of about 4 and a standard deviation of about 1.5:
- 7 – Accept – Oral Presentation
- 6 – Accept – Prefer as Oral
- 5 – Accept – Strong Poster, Oral Acceptable
- 4 – Accept – Poster, not for Oral Presentation
- 3 – Accept – Weaker Poster, Reject in Competitive Cases
- 2 – Likely Reject – accept only if topic of special interest
- 1 – Definite Reject
Please use the entire 1-7 scoring range as much as possible. Note that reviewers are intentionally blinded to the authors' preference for poster vs. oral presentation. Please use the above scoring system for all abstracts assigned to you. We will normalize all reviewers' scores to the same mean and standard deviation so that every reviewer has influence; a sketch of what such a normalization can look like is given below.
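As an illustrative sketch only (these instructions do not specify the AMPC's exact procedure), a standard linear z-score rescaling maps each reviewer's raw scores onto a common target mean and standard deviation:

    \[ s' = \mu_{\text{target}} + \sigma_{\text{target}} \cdot \frac{s - \bar{s}_r}{\sigma_r} \]

where \( s \) is a raw score from reviewer \( r \), \( \bar{s}_r \) and \( \sigma_r \) are that reviewer's mean and standard deviation of scores, and \( \mu_{\text{target}} \approx 4 \) and \( \sigma_{\text{target}} \approx 1.5 \) match the targets suggested above. Under such a mapping, a uniformly generous reviewer's scores are pulled down toward the common scale and a harsh reviewer's are pulled up, so no single reviewer's scale dominates.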
3. Comments: While comments are always appreciated, there are some important cases where we ask you to comment:
- We require comments on extreme scores (1 or 7). For example, not every reviewer may know that a similar paper already exists, so explaining a "reject" or a top score is important.
- If you believe the abstract is a duplicate of another abstract or a published paper, please score the abstract, but identify the duplicate in the comments.
- If you think there is a better category, please indicate this in the comment, but please try to score the abstract as best you can. Note that there may be excellent work for which there is no good category match.
C. General Principles of Reviewing Abstracts: Authors invest an enormous amount of time, effort, and resources to generate their research. As reviewers, you have a responsibility to take this process seriously, and you are serving as representatives of the ISMRM. By agreeing to review abstracts, you are agreeing to abide by the following principles:
- Confidentiality: The content of every abstract is strictly confidential until it is published in the Proceedings of the Annual Meeting. Strict confidentiality matters in many ways, including for grant funding, intellectual property, and publications. Do not share any aspect of your assigned abstracts or your scoring with anyone.
- Conflict of Interest: Please do not score any abstracts with which you have a conflict of interest (see above).
- Reviewer Conflicts: Reviewers are required to disclose all relevant financial relationships involving any commercial interest.
- Reviewer Bias: Please remember to evaluate the science only. We do not reveal authors, institutions, or acknowledgements, to help you, the reviewer, focus on the science and avoid unconscious bias. However, for some submissions you may recognize the authors or institution based on your familiarity with specialized research equipment or continuation of prior work. We ask that you set that aside and still evaluate the science alone.
- Standards Involving Recommendations for Clinical Care:
- All recommendations involving clinical medicine must be based on evidence that is accepted within the profession of medicine as adequate justification for their indications and contraindications in the care of patients.
- Basic science / engineering research referred to, reported or used in support or justification of any patient care recommendation must conform to generally accepted standards of experimental design, data collection and analysis.
- Please note any deviations from these principles in the Comments section.
- Quality of Submitted Abstracts: Accepted abstracts will be published and referenced. For this reason, abstracts should be of high quality. Poorly prepared abstracts with spelling errors, confusing formatting, and poor grammar should be scored accordingly.
- Duplication of Abstracts: Clear duplication or strongly overlapping abstracts may be grounds for rejection of one or both abstracts. Please score duplicated abstracts as if they were independent, but please note any duplication (provide abstract numbers) in the comment section. Also, abstracts that are essentially the same as previously published journal articles should be identified. Please note this kind of duplication in the comment section as well, including a brief citation if possible.
- Abstract Categories: If you think an abstract is in the wrong primary category, please use the Comments field to suggest an alternate category if you can. Do not score an abstract poorly because the primary category is a poor fit – in some cases there is no good category match.
- Abstracts Outside of Your Field of Expertise: It is possible (and understandable) that some of your assigned abstracts are beyond your ability to provide an informed evaluation. You must use your judgment here – if possible please review the abstract. However, if you feel unqualified to assess an abstract fairly, please check “yes” under Conflict of Interest and make a note in the comment section that the abstract is beyond your expertise.
- Blinded Review: Abstracts are blinded by author and institution. If authors inadvertently identify themselves by name or affiliation, you may choose to review or not review, whichever you consider most appropriate. If inadvertent disclosure occurs, please note this in the Comments section and inform the ISMRM Office as quickly as possible so the disclosure can be corrected. (If authors cite their own work, you need not comment on that.)
- Facts about the Review (a reward for reading this far!): The review is a complicated matching process that depends on the distribution of reviewer preferences and abstract categories.
- We received 6750 abstracts, in 14 categories and 174 subcategories.
- There are 1018 member reviewers (many of them first-time trainee reviewers) and an additional 75 program committee reviewers.
- We aim for a minimum of 5 reviewers per abstract, leading to over 33,000 reviews!
- Most trainee, first-time reviewers will receive an "introductory" load of up to 12 abstracts; these reviews are in addition to the minimum 5 reviews per abstract.
- The average reviewer load is 36 abstracts, with a maximum of 60.
- 82% of reviewer-group matches are within the top 3 reviewer preferences.
- 37% of abstract reviews have BOTH primary and secondary subcategories within the reviewer’s preference.
Staff Assistance
Melissa Simcox, Director of Education, melissa.simcox@ismrm.org
Anne-Marie Kahrovic, Executive Director (interim), anne-marie@ismrm.org
Sally Moran, Director of IT & Web, sally@ismrm.org
Thank you for your efforts to help make the 2024 ISMRM Annual Meeting a success!