
lywinagain/awesome-slr-tools


Awesome Systematic Literature Review (SLR) Tools

A comprehensive collection of tools, resources, and tutorials for conducting systematic literature reviews. Perfect for young researchers starting their SLR journey!


💡 New to Systematic Literature Reviews? Start with the Quick Start Guide below!

Table of Contents


🚀 Quick Start Guide

Choose your path based on your needs:

For Beginners (Just Starting)

  1. Learn the basics: Read Systematic Literature Review: Easy Guide
  2. Follow standards: Download PRISMA 2020 Checklist
  3. Start simple: Use Rayyan (free) or Covidence (free trial)
  4. Manage references: Install Zotero (free, open-source)

For Developers/Programmers

  1. Install Python tools: pip install asreview litstudy
  2. Install R packages: install.packages(c("metafor", "bibliometrix", "PRISMA2020"))
  3. Explore GitHub: Browse systematic-review topic
  4. Try notebooks: Use Jupyter Guide

For Health/Medical Researchers

  1. Search: PubMed + Cochrane Library
  2. Screen: ASReview or Rayyan
  3. Quality: Cochrane RoB 2 Tool
  4. Analyze: RevMan (free Cochrane tool)

For Engineering/CS Researchers

  1. Search: IEEE Xplore + ACM DL + Scopus
  2. Visualize: VOSviewer for citation networks
  3. AI Help: Elicit or Consensus
  4. Publish in: IEEE COMST or ACM CSUR

📊 Recommended Workflow

Step-by-step process with tool recommendations:

Phase | What to Do | Free Tools | Paid Tools
1. Planning | Define question (PICO), write protocol | PRISMA-P, PROSPERO | -
2. Search | Search multiple databases | PubMed, Google Scholar, Semantic Scholar | Scopus, Web of Science
3. Deduplication | Remove duplicate records | ASySD, Zotero | EndNote, Covidence
4. Screening | Title/abstract screening | ASReview, Rayyan | Covidence, DistillerSR
5. Full-text Review | Read and select studies | Rayyan, Zotero | Covidence
6. Data Extraction | Extract study data | Covidence (trial), Google Sheets | DistillerSR, NVivo
7. Quality Assessment | Assess risk of bias | JBI Tools, RoB 2 | -
8. Analysis | Meta-analysis/synthesis | RevMan, R metafor | Comprehensive Meta-Analysis
9. Visualization | Create figures | PRISMA2020 R package, VOSviewer | Tableau
10. Writing | Write manuscript | Overleaf, Google Docs | Microsoft Word
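The record counts a PRISMA flow diagram reports follow directly from these phases; a sketch in plain Python with made-up numbers:

```python
# Sketch of PRISMA flow-diagram arithmetic; all counts are invented.
identified  = 1200  # phase 2: hits across all databases
duplicates  = 300   # phase 3: removed by deduplication
excluded_ta = 700   # phase 4: excluded at title/abstract screening
excluded_ft = 150   # phase 5: excluded at full-text review

screened  = identified - duplicates  # records entering title/abstract screening
full_text = screened - excluded_ta   # full texts retrieved and assessed
included  = full_text - excluded_ft  # studies entering the synthesis
print(screened, full_text, included)  # 900 200 50
```

Tools like the PRISMA2020 R package (phase 9) take exactly these counts as input when drawing the diagram.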

💰 Budget-Friendly Full Workflow:

  • All steps: Rayyan + Zotero + RevMan + R (metafor) + Overleaf = $0
  • Everything is free and open-source!

🚀 Premium Workflow:

  • All steps: Covidence + EndNote + Comprehensive Meta-Analysis + NVivo ≈ $500-1500/year
  • Professional features, support, and integrations

What is Systematic Literature Review?

A Systematic Literature Review (SLR) is a rigorous and comprehensive method of reviewing existing research on a specific topic or question. It follows a clearly defined protocol and systematic methodology to identify, select, and critically appraise research.

Key Characteristics:

  • Focused research question using frameworks like PICO, PICOT, or SPIDER
  • Systematic and explicit methodology that can be replicated
  • Comprehensive search across multiple databases and grey literature
  • Critical appraisal of study quality and risk of bias
  • Transparent reporting following standards like PRISMA

Essential Reading:


Key Journals & Publications

Top Journals for Publishing SLRs:

  1. IEEE Communications Surveys & Tutorials

    • Impact Factor: High-tier venue for comprehensive surveys
    • Scope: Communications, networking, and computing
    • Author Guidelines
  2. ACM Computing Surveys (CSUR)

    • Impact Factor: 23.8 (2023)
    • Scope: Computer science and computing surveys
    • Author Guidelines
  3. Systematic Reviews

    • Open access journal dedicated to systematic review methodology
    • Publishes protocols and completed reviews

Planning & Protocol

PRISMA Framework:

PRISMA Extensions:

  • PRISMA-P - For protocols
  • PRISMA-S - For searching (16-item checklist)
  • PRISMA-ScR - For scoping reviews

Protocol Development Tools:

Planning Frameworks:

  • PICO - Population, Intervention, Comparison, Outcome
  • PICOT - PICO + Time
  • SPIDER - For qualitative research
  • PCC - Population, Concept, Context
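For illustration, a PICO frame can be turned into a Boolean search string programmatically; the `build_search` helper and the synonym lists below are hypothetical examples, not part of any named tool:

```python
# Sketch: OR synonyms within each PICO element, AND the elements together.
# The frame and synonyms are invented for illustration.

def build_search(pico: dict) -> str:
    groups = []
    for element, terms in pico.items():
        # Quote multi-word phrases so databases treat them as one term
        quoted = [f'"{t}"' if " " in t else t for t in terms]
        groups.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(groups)

pico = {
    "Population": ["adolescents", "teenagers"],
    "Intervention": ["mindfulness training"],
    "Outcome": ["anxiety", "anxiety disorder"],
}

query = build_search(pico)
print(query)
```

Translating such a string into each database's exact syntax is what tools like the Polyglot Search Translator (below) automate.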

Search & Discovery

Academic Search Engines:

General & Multidisciplinary:

  • Google Scholar - Broad coverage, free access
  • Semantic Scholar - AI-powered, 200M+ papers, identifies influential citations
  • Dimensions - Research analytics with patent/clinical trial links
  • Lens.org - Free scholarly search with patent integration

Health & Medicine:

  • PubMed - Primary database for biomedical literature
  • Cochrane Library - Gold standard for health systematic reviews
  • Embase - Biomedical and pharmacological database

Engineering & Computer Science:

Social Sciences:

  • PsycINFO - Psychology and behavioral sciences
  • ERIC - Education research
  • ScienceDirect - Multidisciplinary scientific database

Search Strategy Development Tools:

  1. SR-Accelerator Polyglot Search Translator ⭐ Essential

    • Cross-database search translation
    • Supports: PubMed, Ovid MEDLINE, Cochrane, Embase, Web of Science, Scopus, PsycInfo, ProQuest, SPORTDiscus, CINAHL
    • Web interface + npm package
    • Validation study: 32% reduction in search translation time, without introducing errors
    • npm install sra-polyglot
    • GitHub: https://github.com/IEBH/sra-polyglot
  2. BOOL

    • AI-powered Boolean search assistance
    • Plugs into Google, Bing, Google Scholar, Scopus
    • CSV import/export
    • Free web-based tool
  3. PRESS 2015 ⭐ Essential

    • Peer Review of Electronic Search Strategies guideline
    • Evidence-based checklist for peer reviewing database search strategies

Grey Literature & Special Databases:

Grey Literature:

  • CADTH Grey Matters - Practical toolkit for health grey literature
  • OpenGrey - European grey literature
  • Overton - Policy documents, guidelines, think tank publications

Trial Registries:

Preprint Servers:

Citation Discovery:

  • Connected Papers - Visual citation network exploration

  • ResearchRabbit - "Spotify for research" - personalized recommendations

  • CitNetExplorer - Citation network analysis

  • Citation Gecko ⭐ Free

    • Forward and backward citation chasing
    • BibTeX import, Zotero integration
    • Network visualization with timeline view
    • Iterative paper expansion
    • Uses OpenCitations, Crossref, Microsoft Academic
    • DOI: 10.5281/zenodo.7068284
  • Local Citation Network ⭐ Free

    • Visualizes local citation networks from seed articles
    • "Top Cited" and "Top Citing" recommendations
    • Uses OpenAlex, Semantic Scholar, Crossref, OpenCitations
    • GPL-3 license, web-based
  • Citation Graph Builder

    • PDF parsing (GROBID) with bibliographic APIs
    • SVG export for publication-ready figures
    • MIT license
    • Install via conda
  • Citograph

    • Citation graphs from Semantic Scholar
    • JSON output for Excalidraw
    • Apache 2.0 license

API Resources:

  • OpenCitations - 2+ billion citation links (July 2024)

    • SPARQL endpoint, REST APIs, dataset dumps
    • CSV, N-Triples, Scholix formats
    • CC0 licensed data, ISC licensed software
    • Sources: Crossref, NIH-OCC, DataCite, OpenAIRE, JaLC
  • OpenAlex API Tutorials

    • Jupyter notebooks with bibliometric analysis
    • Python code snippets
    • No authentication required
    • 100k requests/day limit
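As a sketch of what "no authentication required" access looks like, the snippet below composes an OpenAlex /works request URL; the search text and email are placeholders, and the full filter syntax is documented at docs.openalex.org:

```python
# Sketch: building an OpenAlex /works query URL (no API key needed).
# Values are placeholders; consult the OpenAlex docs for all filters.
from urllib.parse import urlencode

def openalex_works_url(search: str, mailto: str, per_page: int = 25) -> str:
    params = {
        "filter": f"title.search:{search}",  # filter on words in the title
        "per-page": per_page,
        "mailto": mailto,  # identifies you for the faster "polite pool"
    }
    return "https://api.openalex.org/works?" + urlencode(params)

url = openalex_works_url("systematic review automation", "you@example.org")
print(url)
```

Passing `mailto` is optional but recommended: it routes requests to the polite pool with better rate limits.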

Systematic Review Toolbox:

  • SR Toolbox - Directory of 235+ tools for literature reviews

Study Management

🎯 Quick Pick: Free users → Rayyan | Budget available → Covidence

Comprehensive Platforms:

  1. Covidence ⭐ Recommended

    • Cochrane's official production platform
    • Full workflow support: screening, extraction, quality assessment
    • Subscription required (free trial available)
  2. Rayyan ⭐ Free

    • Free with unlimited collaborators
    • Excellent for screening and abstract management
    • AI-assisted screening
  3. DistillerSR

    • Professional-grade systematic review software
    • Strong data extraction features
    • Commercial license required
  4. EPPI-Reviewer

    • All types of literature reviews
    • Meta-analysis capabilities
    • Academic and commercial licenses
  5. Nested Knowledge

    • Modern interface with AI features
    • Visual data extraction
    • Growing platform (87% feature coverage)

Screening & Selection

🎯 Quick Comparison:

Tool | Cost | Best Feature | Workload Reduction
ASReview | Free | AI active learning | Up to 95%
Rayyan | Free | Collaborative screening | Good
Abstrackr | Free | Semi-automated | ~67%
Covidence | Paid | Full workflow | Good

AI-Powered Screening Tools:

  1. ASReview ⭐ Free & Open Source

    • Active learning for efficient screening
    • Can reduce workload by up to 95%
    • Open source, published in Nature Machine Intelligence
  2. Abstrackr - Free

    • Semi-automated screening
    • Median workload savings: 67.2%
    • Web-based, no installation required
  3. SWIFT Active Screener

    • Machine learning-assisted screening
    • Supports all mandatory features

Web-Based Screening Platforms:

  1. Colandr - Open Source

    • Machine learning-assisted screening
    • Full evidence synthesis workflows
    • PICO framework support
    • Customizable data extraction forms
    • MIT-licensed
    • Uses word2vec for ML
    • Free access
  2. SRDR+ ⚠️ Retiring Nov 28, 2025

    • Machine learning-informed screening
    • FHIR-compliant data extraction
    • Free web-based platform
    • Customizable extraction forms
    • Important: migrating to an alternative platform before retirement is recommended
    • Website: https://srdr.ahrq.gov/
  3. SRA2 (Systematic Review Accelerator)

    • Search optimization
    • Reference deduplication
    • Multi-researcher consolidation
    • Open-source PHP/MySQL tool
    • Successor to original SRA
  4. SyRF (Systematic Review Facility)

  5. CADIMA

    • Free open-access platform
    • Developed for agriculture and environmental sciences
    • Supports entire systematic review process
    • Protocol development
    • Statistical analysis integration with R
    • Export files for offline work
  6. SysRev.com

    • Free collaborative data extraction
    • Boolean, categorical, and string labels
    • Multiple reviewers support
    • Abstract and full-text screening

Manual Screening Support:

  • Covidence - Structured screening workflow
  • Rayyan - Collaborative blind screening
  • RevMan - Cochrane's review manager

Data Extraction

Structured Extraction:

Qualitative Data Analysis:

  1. NVivo

    • Industry standard for qualitative analysis
    • Supports text, audio, video, social media
    • AI-powered autocoding
    • Great for systematic reviews with qualitative data
  2. MAXQDA

    • Mixed methods data analysis
    • AI-powered features
    • Automatic transcription
  3. Atlas.ti

    • Qualitative and mixed methods analysis
    • Visual analysis tools
  4. Dedoose

    • Web-based collaborative analysis
    • Mixed methods support

Data Exchange:

  • REFI-QDA - Standard format for exchanging data between NVivo, MAXQDA, and Atlas.ti

AI-Powered Data Extraction:

Automated Extraction from Clinical Trials:

  1. Trialstreamer ⭐ Living Database
    • Living database of 700,000+ RCTs
    • Automatic RCT identification from PubMed
    • PICO extraction and MeSH mapping
    • Sample size extraction
    • Risk of bias prediction
    • Daily updates with free database download
    • Searches WHO ICTRP and PubMed
    • Structured query interface
    • Marshall et al. (2020) JAMIA

PICO Extraction Systems:

  1. BioBERT/BioELECTRA-PICO (Hugging Face models)

    • Named Entity Recognition for PICO elements
    • BioELECTRA-PICO (110M parameters)
    • BioMobileBERT (compact, faster)
    • F1 scores 70-90% depending on element
    • Domain-adapted for biomedical text
  2. EBM-NLP Dataset

    • 5,000 annotated RCT abstracts
    • Gold standard for PICO extraction research
    • Publicly available corpus
  3. Section-Specific Learning Approach (Hu et al., 2023 Bioinformatics)

    • 90% extraction accuracy from Methods sections

    • Two-step pipeline: sentence classification + NER
    • Focus on structured sections
  4. AlpaPICO

    • LLM-based PICO extraction from clinical trials
    • Research code available
  5. OpenAI Batch API - Large Scale

    • Parallel PICO extraction from 680,000+ abstracts
    • PostgreSQL database storage
    • <3 hours processing for 700,000 abstracts
    • For EU HTAR compliance

LLM-Based Extraction:

  1. ChatGPT/GPT-4 - API-based

    • ~80% accuracy for PICO elements in feasibility studies
    • Prompt engineering with JSON schemas
    • pip install openai
    • Commercial API costs ~$1.50/1000 pages
    • Limitations: hallucinations requiring validation
  2. Claude (Anthropic)

    • 65-82% agreement across runs for binary outcomes
    • Kappa: 0.65-0.82 for consistency
  3. ChatPDF

    • LLM-powered PDF analysis
    • 70%+ correct responses for binary outcomes
    • Kappa: 0.78-0.96 for agreement
  4. Mistral OCR

    • Advanced document understanding via mistral-ocr-latest API
    • Handles complex elements (math, tables, images)
    • Thousands of scripts supported
    • 2000 pages/minute processing
    • $1/1000 pages pricing
    • Self-hosting option
    • Markdown output including images
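One common way to rein in the hallucination problem noted above is to constrain the model's output with a JSON schema and validate every response against it. The schema and prompt below are an illustrative sketch, not the format any specific tool requires:

```python
# Sketch: a JSON Schema used to constrain LLM output during PICO extraction.
# Field names are illustrative; adapt them to your own extraction form.
import json

PICO_SCHEMA = {
    "type": "object",
    "properties": {
        "population":   {"type": "string"},
        "intervention": {"type": "string"},
        "comparison":   {"type": "string"},
        "outcome":      {"type": "string"},
        "sample_size":  {"type": ["integer", "null"]},
    },
    "required": ["population", "intervention", "comparison", "outcome"],
}

prompt = (
    "Extract the PICO elements from the abstract below. "
    "Answer ONLY with JSON matching this schema:\n"
    + json.dumps(PICO_SCHEMA, indent=2)
)
print(prompt)
```

Responses that fail schema validation can then be flagged for human review rather than silently accepted.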

Specialized Tools:

  1. MedTrialExtractor

    • Entity and relationship extraction
    • BiLSTM and BERT-based models
    • Designed for systematic reviews
  2. Supervised Distant Supervision (SDS) (Wallace et al., 2016 PubMed)

    • Extracts PICO sentences from full-text
    • Uses Cochrane Database of Systematic Reviews (CDSR)

Data Extraction Templates:

  1. Cochrane Data Extraction Templates

  2. PIECES Excel Workbook (Texas A&M Libraries)

    • Developer: Margaret J. Foster
    • Advanced Excel with drop-down menus
    • Range checks, data validation
    • Prevents entry errors
    • Customizable workbooks
  3. REDCap

    • HIPAA-compliant survey and database creation
    • Custom forms builder
    • Data validation
    • Export to Excel/CSV/SPSS
    • Requires institutional license
  4. Epi Info + R Pipeline


Quality Assessment

Risk of Bias Tools:

  1. Cochrane Risk of Bias Tool (RoB 2)

    • Gold standard for RCTs
    • Domain-based approach
    • RoB 2.0 current version
  2. JBI Critical Appraisal Tools ⭐ Free

    • Updated 2023-2024
    • Multiple study designs
    • Adaptable framework
    • Free checklists
  3. ROBINS-I - v2 Released Nov 2024

Study Design-Specific Tools:

  1. Newcastle-Ottawa Scale (NOS)

  2. QUADAS-2

  3. QUIPS

    • Assesses prognostic factor studies
    • 6 domains with three-grade risk of bias scale
    • Median 20 minutes per study
    • Inter-rater kappa 0.56-0.82
    • Recommended by Cochrane Prognosis Methods Group
    • Visualization supported by robvis
  4. PROBAST
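Inter-rater statistics like the QUIPS kappas quoted above can be computed directly; here is a minimal Cohen's kappa in plain Python, with made-up reviewer ratings:

```python
# Sketch: Cohen's kappa for two reviewers' risk-of-bias judgements.
# The ratings below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[c] * cb[c] for c in ca) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

a = ["low", "low", "high", "moderate", "low", "high"]
b = ["low", "moderate", "high", "moderate", "low", "low"]
print(round(cohens_kappa(a, b), 3))  # 0.478
```

Values above roughly 0.6 are usually read as substantial agreement; disagreements below that threshold are typically resolved by discussion or a third reviewer.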

Qualitative & Mixed Methods Tools:

  1. MMAT (Mixed Methods Appraisal Tool)

    • Version 2018 current
    • Covers qualitative studies, RCTs, non-randomized quantitative, descriptive, mixed methods
    • 5 core criteria per category
    • Download: http://mixedmethodsappraisaltoolpublic.pbworks.com/
    • Developed through modified e-Delphi with 73 experts
  2. COREQ

    • 32-item checklist for interviews and focus groups
    • Covers research team reflexivity, study design, data analysis
    • Most frequently cited qualitative research reporting guideline
  3. ENTREQ

    • 21-item checklist for systematic reviews of qualitative research
    • 5 domains
    • Higher uptake in reviews (28%) than COREQ (17%)

Systematic Review Quality Assessment:

  1. AMSTAR 2 ⭐ Essential

    • 16-item critical appraisal tool for systematic reviews
    • 7 critical domains
    • Overall ratings: high, moderate, low, critically low confidence
    • Published in BMJ 2017: https://pmc.ncbi.nlm.nih.gov/articles/PMC5833365/
    • Training videos and resources available
    • Used in ~80% of umbrella reviews
  2. ROBIS (Risk Of Bias In Systematic reviews)

    • Three-phase instrument focusing on bias
    • Complementary to AMSTAR 2
    • More sophisticated bias assessment

Visualization & Evidence Grading:

  1. robvis ⭐ Essential (See R packages section)

    • Publication-quality risk of bias figures
    • Supports: RoB 2, ROBINS-I, ROBINS-E, QUADAS-2, QUIPS
    • Colorblind-friendly schemes
    • Over 1,500 academic citations
    • Web app: https://mcguinlu.shinyapps.io/robvis (no R required)
  2. GRADEpro GDT

    • Creates Summary of Findings tables
    • Evidence Profiles with evidence-to-decision frameworks
    • Free web-based software: https://gdt.gradepro.org/app/
    • Over 100,000 users worldwide
    • Mobile/desktop compatibility

Additional Quality Assessment Frameworks:

  • CASP Checklists ⭐ Free Downloads

    • 8 different checklists: systematic reviews, RCTs, qualitative, cohort, case control, diagnostic, economic evaluations, clinical prediction rules, cross-sectional
    • All free downloads under Creative Commons license
  • NIH Quality Assessment Tools

    • Multiple study designs
    • Developed by NHLBI

Analysis & Synthesis

🎯 Quick Pick: Free → RevMan or R metafor | Easy to use → CMA ($295/year)

Meta-Analysis Software:

  1. RevMan (Review Manager) ⭐ Free

    • Cochrane's official software
    • Forest plots and meta-analysis
    • Free download
  2. R with metafor package ⭐ Free & Flexible

    • Most powerful and flexible
    • Free and open source
    • Steep learning curve
    • Tutorial
  3. Comprehensive Meta-Analysis (CMA)

    • User-friendly interface
    • Highest usability scores
    • Academic: $295/year
  4. OpenMeta[Analyst] - Free

    • Open source
    • Binary, continuous, diagnostic data
    • Developed at Brown University

Additional Analysis Tools:

  • Meta-Essentials - Free Excel-based tool
  • Stata - Statistical software with meta-analysis commands
  • JASP - Free, user-friendly statistical software
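Under the hood, tools like RevMan and metafor pool studies by inverse-variance weighting. A plain-Python sketch with invented effect sizes, showing fixed-effect pooling plus the DerSimonian-Laird between-study variance used by random-effects models:

```python
# Sketch: inverse-variance meta-analysis in plain Python.
# yi are study effect estimates (e.g. log odds ratios), vi their variances;
# the numbers are made up for illustration.
import math

def fixed_effect(yi, vi):
    w = [1 / v for v in vi]
    pooled = sum(wi * y for wi, y in zip(w, yi)) / sum(w)
    se = math.sqrt(1 / sum(w))
    return pooled, se

def dersimonian_laird_tau2(yi, vi):
    """Between-study variance; random-effects models reweight by 1/(vi + tau2)."""
    w = [1 / v for v in vi]
    pooled, _ = fixed_effect(yi, vi)
    q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, yi))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    return max(0.0, (q - (len(yi) - 1)) / c)

yi, vi = [0.2, 0.6], [0.04, 0.04]
pooled, se = fixed_effect(yi, vi)
tau2 = dersimonian_laird_tau2(yi, vi)
print(round(pooled, 3), round(se, 3), round(tau2, 3))  # 0.4 0.141 0.04
```

metafor and the meta package implement these same estimators (among many others) with full diagnostics and plotting.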

Visualization

Bibliometric & Network Visualization:

  1. VOSviewer ⭐ Free

    • Citation network visualization
    • Co-authorship analysis
    • Keyword mapping
    • Easy to use, widely adopted
  2. CiteSpace - Free

    • Temporal visualization of research trends
    • Burst detection for emerging topics
    • Java-based
  3. Bibliometrix / Biblioshiny - Free

    • R package for bibliometric analysis
    • Web interface (Biblioshiny) available
    • Comprehensive metrics

Data Visualization:

PRISMA Flow Diagrams:


Writing & Reporting

Collaborative Writing:

  1. Overleaf ⭐ For LaTeX

    • Real-time collaborative LaTeX editor
    • SLR Templates
    • Free tier available
    • University licenses common
  2. Google Docs

    • Real-time collaboration
    • Free, accessible
    • Good for drafting
  3. Microsoft Word + OneDrive

    • Track changes
    • Comments and reviews
    • Familiar interface

Academic Writing Tools:

Journal Templates:

  • Overleaf Templates - 1000+ journal templates
  • Publisher templates (Elsevier, Springer, IEEE, ACM)

Reference Management

🎯 Quick Pick: Most users → Zotero (free) | Large SLRs (5000+ refs) → EndNote

Top Reference Managers:

  1. Zotero ⭐ Free & Open Source

    • Best browser integration
    • Most accurate bibliographies
    • Free unlimited storage
    • Open source
    • Best for: Diverse content types
  2. EndNote

    • Best for large systematic reviews (5,000-10,000+ refs)
    • Superior deduplication
    • Most database search options
    • Best for: Large-scale SLRs
    • Commercial license required
  3. Mendeley - Free

    • Strong community features
    • PDF annotation
    • Free cloud storage
    • Owned by Elsevier

Deduplication Tools:

🎯 Performance Comparison: Rayyan (highest sensitivity/specificity) → ASySD → Deduklick → EndNote/Mendeley

  1. Rayyan ⭐ Best Performance

    • Highest sensitivity (0.98) and specificity (0.96+)
    • Automatic deduplication with screening
    • Tagging and highlighting features
    • Free for up to 3 active reviews
  2. ASySD (Automated Systematic Search Deduplicator) - Free

    • Rapid, open-source, interoperable
    • For biomedical systematic reviews
    • R package + Shiny app
    • Higher sensitivity than EndNote
    • GitHub: https://github.com/camaradesuk/ASySD
    • Published in BMC Biology (Hair et al. 2023)
  3. Deduklick - Novel Algorithm

    • Explainable automated deduplication
    • 99.51% mean recall
    • 100% precision
    • Published in Systematic Reviews (Borissov et al. 2022)
  4. Systematic Review Accelerator Deduplicator

    • Sorts by likelihood of duplication
    • XML file input (EndNote)
    • Higher sensitivity than EndNote
    • Note: Moving to TERA platform, paid after October 31, 2025
  5. Mendeley - Free

    • Good balance: 0.93 sensitivity and 0.93 PPV
    • Freely available
    • Built-in deduplication
  6. EndNote, Zotero - Built-in

    • EndNote: Best for large reviews (5,000-10,000+ refs)
    • Zotero: Free and open source
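Most of these tools start from the same idea: normalise each record, then match. A minimal title-based pass in Python with invented records (real tools also compare DOIs, years, and authors, and use fuzzier matching):

```python
# Sketch: title-normalisation deduplication, the first pass tools like ASySD
# and reference managers apply. Records are invented for illustration.
import re

def norm(title: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", title.lower())).strip()

records = [
    {"id": 1, "title": "Mindfulness for Anxiety: A Systematic Review"},
    {"id": 2, "title": "Mindfulness for anxiety - a systematic review"},
    {"id": 3, "title": "Exercise and Depression in Adults"},
]

seen, unique = set(), []
for rec in records:
    key = norm(rec["title"])
    if key not in seen:       # keep the first copy of each normalised title
        seen.add(key)
        unique.append(rec)

print([r["id"] for r in unique])  # [1, 3]
```

This exact-match-after-normalisation step is cheap and safe; the sensitivity differences reported above come from how aggressively tools fuzzy-match what survives it.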

Collaboration Tools

Project Management:

  1. Notion

    • All-in-one workspace
    • Documentation and knowledge base
    • Free for individuals
  2. Miro

    • Visual collaboration whiteboard
    • Brainstorming and mapping
    • Integrates with Notion, GitHub, Slack
  3. Trello

    • Simple project boards
    • Task tracking
    • Free tier available

Research-Specific:


AI-Powered Tools

🎯 Quick Pick: Start with Elicit + ResearchRabbit (both have free tiers)

AI Research Assistants:

  1. Elicit ⭐ Highly Recommended

    • Systematic literature search and screening
    • 125M+ papers (Semantic Scholar)
    • AI-powered extraction to tables
    • Real-time collaboration
    • Free tier + paid plans
  2. Consensus

    • AI-powered literature consensus
    • Extracts key points from papers
    • Shows agreement/disagreement in literature
    • Free + Pro plans
  3. Scite

    • 1.2B citation statements analyzed
    • Shows if claims are supported/contradicted
    • Smart citations
    • Free tier + subscriptions
  4. ResearchRabbit - Free

    • "Spotify for research"
    • Personalized paper recommendations
    • Collection-based learning
    • Completely free

Additional AI Tools:

Important Notes:

  • Don't use general ChatGPT for sensitive research data
  • Combine tools for best results (e.g., Elicit + Scite + ResearchRabbit)
  • Always verify AI-generated content

Open Source Tools & GitHub Repositories

🎯 Quick Picks:

  • Python: pip install asreview litstudy → screening + analysis
  • R: install.packages(c("metafor", "bibliometrix")) → meta-analysis + bibliometrics
  • PDF extraction: GROBID (3500+ stars)

This section highlights open-source tools and code repositories that you can use, modify, and contribute to. Perfect for developers and researchers who want programmatic access or customizable solutions.

Python Libraries & Tools

Screening & Machine Learning

  1. ASReview ⭐ Highly Recommended

    • Active learning for systematic reviews
    • Published in Nature Machine Intelligence
    • Reduces workload by up to 95%
    • Privacy-first, no data collection
    • pip install asreview
    • Language: Python
  2. LatteReview

    • Low-code AI-powered automation
    • Multi-agent review systems
    • Customizable agent roles
    • Supports multiple review rounds
    • Language: Python
  3. systematic-reviewpy

    • Automates PRISMA workflows with NLP
    • Generates analysis tables and workflow diagrams
    • Creates ASReview files
    • Provides stemming, lemmatization, paper sorting
    • pip install systematic-reviewpy
    • Language: Python
  4. ProfOlaf

    • Iterative snowballing with LLM-assisted extraction
    • Human-in-the-loop filtering
    • Topic extraction and query answering
    • Citation network analysis for systematic reviews
    • Released 2024
    • Language: Python
  5. LLAssist

    • LLM-based literature screening
    • Automated relevance assessment
    • Key information extraction
    • Matches papers against research questions
    • Released 2024
    • Language: Python
  6. RobotReviewer

    • Automatically extracts risk of bias from clinical trials
    • Identifies PICO elements
    • Machine learning for RCT identification
    • Web-based interface at https://www.robotreviewer.net/
    • pip install robotreviewer
    • Language: Python
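The core loop of these screeners is "rank the unlabelled pile by similarity to what you have already marked relevant". A toy sketch in plain Python using Jaccard similarity; real tools train classifiers and re-rank after every new label:

```python
# Sketch: the ranking step inside active-learning screeners, reduced to
# bag-of-words Jaccard similarity. Abstracts are invented for illustration.

def tokens(text: str) -> set:
    return set(text.lower().split())

def rank_unlabelled(relevant: list[str], unlabelled: list[str]) -> list[str]:
    rel = set().union(*(tokens(t) for t in relevant))
    def score(t: str) -> float:
        tk = tokens(t)
        return len(tk & rel) / len(tk | rel)  # Jaccard similarity
    return sorted(unlabelled, key=score, reverse=True)

relevant = ["mindfulness reduces anxiety in adolescents"]
unlabelled = [
    "crop rotation effects on soil nitrogen",
    "mindfulness therapy for adolescent anxiety outcomes",
]
print(rank_unlabelled(relevant, unlabelled)[0])
```

The up-to-95% workload reductions reported for ASReview come from stopping early once the ranked tail stops yielding relevant records.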

Literature Analysis & Bibliometrics

  1. LitStudy

    • Automate literature analysis from Jupyter notebooks
    • Scientometrics and bibliometrics
    • Network analysis and visualization
    • Natural language processing
    • pip install litstudy
    • Language: Python
  2. pyBibX

    • Bibliometric and scientometric analysis
    • AI-powered features
    • Generates EDA reports
    • Creates document/author/institution IDs
    • pip install pybibx
    • Language: Python
  3. metaknowledge

    • Bibliometric research simplification
    • Reads meta-data from various sources
    • Network analysis capabilities
    • pip install metaknowledge
    • Language: Python

Meta-Analysis

  1. PythonMeta

    • Comprehensive meta-analysis with effect sizes
    • Supports OR, RR, RD for dichotomous data
    • MD, SMD for continuous data
    • Fixed/random effects models with MH/Peto/IV algorithms
    • Forest and funnel plots
    • pip install PythonMeta
    • Web interface: https://www.pymeta.com/
    • Language: Python
  2. PyMARE

    • Mixed-effects meta-regression
    • Multiple variance estimation methods (ReML, ML, DL)
    • pip install pymare
    • Documentation: https://pymare.readthedocs.io/
    • Language: Python
  3. NiMARE

    • Neuroimaging meta-analysis
    • Coordinate- and image-based methods
    • General meta-analysis tools included
    • pip install nimare
    • Language: Python

Citation Snowballing

  1. snowballing

    • Literature snowballing tools
    • Chrome plugin for Google Scholar
    • Jupyter Notebook widgets
    • Citation graph visualization
    • Forward/backward snowballing
    • Documentation: https://joaofelipe.github.io/snowballing
    • Language: Python

Citation Network & API Access

  1. semanticscholar

    • Unofficial Python client for Semantic Scholar API
    • Typed responses, paginated navigation
    • Async support
    • pip install semanticscholar
    • Language: Python
  2. PyS2

    • Typed pydantic objects for Semantic Scholar
    • Retries and rate limiting
    • pip install pys2
    • Language: Python
  3. PyAlex

    • Comprehensive OpenAlex support (Works, Authors, Sources, etc.)
    • Polite pool access, retry mechanism
    • Abstract parsing, CC0 data license
    • Covers 200M+ works, free access
    • pip install pyalex
    • Language: Python
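OpenAlex serves abstracts as an inverted index (word → positions) rather than plain text; clients such as PyAlex reconstruct the readable abstract roughly like this (the index below is hand-made):

```python
# Sketch: rebuilding an abstract from an OpenAlex-style inverted index.
# The index is a hand-made example, not a real API response.

def rebuild_abstract(inverted: dict) -> str:
    # Flatten {word: [positions]} into {position: word}, then read in order
    positions = {p: w for w, ps in inverted.items() for p in ps}
    return " ".join(positions[i] for i in sorted(positions))

idx = {"Systematic": [0], "reviews": [1, 4], "synthesise": [2], "prior": [3]}
print(rebuild_abstract(idx))  # Systematic reviews synthesise prior reviews
```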

PDF Data Extraction

  1. GROBID ⭐ Industry Standard

    • Machine learning for PDF extraction
    • Focus on scientific publications
    • Structured XML/TEI output
    • Extracts metadata + full text
    • 68 fine-grained structure labels
    • Language: Java/Python
  2. PDF-Extract-Kit

    • High-quality content extraction
    • Layout detection, formula recognition
    • OCR for complex PDFs
    • Works on papers, textbooks, reports
    • Language: Python
  3. MinerU

    • Transforms PDFs to markdown/JSON
    • LLM-ready format
    • Machine-readable outputs
    • Language: Python
  4. PDFDataExtractor

    • Semantic information extraction
    • Focused on scientific articles
    • Language: Python
  5. Marker

    • Converts PDF to Markdown using Surya OCR
    • Formula extraction (LaTeX)
    • Text detection and layout analysis
    • 90+ language support
    • Language: Python
  6. Unstructured

    • ETL for ML tasks from PDFs
    • Document parsing library
    • Multiple format support
    • pip install unstructured
    • Language: Python
  7. Tesseract OCR

    • Open-source OCR from Google
    • Supports 100+ languages
    • pip install pytesseract
    • Language: C++/Python
  8. PaddleOCR

    • Handles 80+ languages
    • Table recognition
    • Excellent for Asian languages
    • pip install paddleocr
    • Language: Python
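GROBID's TEI output can be mined with any XML parser; below is a deliberately simplified, namespace-free TEI fragment (real GROBID output is namespaced and much richer):

```python
# Sketch: pulling the title and author surnames out of a simplified
# TEI fragment of the kind GROBID emits.
import xml.etree.ElementTree as ET

tei = """
<TEI>
  <teiHeader>
    <titleStmt><title>Active Learning for Screening</title></titleStmt>
    <author><persName><surname>Smith</surname></persName></author>
    <author><persName><surname>Chen</surname></persName></author>
  </teiHeader>
</TEI>
"""

root = ET.fromstring(tei)
title = root.find(".//title").text
surnames = [s.text for s in root.findall(".//surname")]
print(title, surnames)
```

For real GROBID output you would register the TEI namespace and use namespaced paths, but the traversal pattern is the same.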

Text Mining & NLP

  1. medaCy
    • Medical text mining framework
    • Built on spaCy
    • Fast model prototyping
    • Highly predictive medical NLP
    • Language: Python

R Packages

Search Strategy Development

  1. litsearchr ⭐ Highly Recommended

    • Revolutionizes search strategy development
    • Quasi-automatic keyword extraction using text mining
    • Co-occurrence networks for term relationships
    • Interactive Shiny GUI available
    • install.packages("litsearchr")
    • Shiny app: https://elizagrames.shinyapps.io/litsearchr/
    • Language: R
  2. searchAnalyzeR

    • Validates search strategies
    • PRISMA-compliant reporting
    • Precision-recall analysis
    • Cross-database performance comparison
    • PubMed API integration
    • install.packages("searchAnalyzeR")
    • Language: R

Screening & Workflow Management

  1. metagear

    • GUI for abstract/title screening
    • Inter-reviewer reliability assessment
    • PDF downloading capabilities
    • PRISMA flow diagrams with multiple design layouts
    • Data extraction from scatter plots and bar plots
    • install.packages("metagear")
    • Language: R
  2. revtools

    • Article screening for evidence synthesis
    • Topic modeling for literature exploration
    • Visual screening methods
    • Document tagging
    • install.packages("revtools")
    • Website: https://revtools.net/
    • Language: R

Citation Tracking

  1. citationchaser

    • Forward and backward citation chasing
    • Lens.org API integration
    • Automated reference discovery
    • RIS file output
    • install.packages("citationchaser")
    • Shiny app: https://estech.shinyapps.io/citationchaser/
    • Published in Research Synthesis Methods 13(4):533-545 (2022)
    • Language: R
  2. semscholar

    • R interface to Semantic Scholar API
    • Author profiles and paper metadata
    • Citations and references
    • Zotero integration
    • remotes::install_github("njahn82/semscholar")
    • Language: R
  3. semanticscholar (KTH Library)

    • Lightweight Semantic Scholar API access
    • Author metrics (h-index, citation count)
    • Paper metadata with TLDR
    • API key support (100 requests/sec)
    • install.packages("semanticscholar")
    • Language: R
  4. openalexR

    • R interface to OpenAlex
    • Polite pool support
    • Premium API key compatibility
    • 200M+ works, CC0 licensed
    • install.packages("openalexR")
    • Language: R

Meta-Analysis

  1. metafor ⭐ Gold Standard

    • Comprehensive meta-analysis package
    • Equal/fixed/random/mixed-effects models
    • Meta-regression analysis
    • Forest, funnel, radial plots
    • Extensive documentation
    • install.packages("metafor")
    • Website: https://wviechtb.github.io/metafor/
  2. meta

    • Comprehensive meta-analysis package
    • metagen(), metabin(), metacont(), metainc() functions
    • Fixed/random effects, three-level meta-analysis
    • GLMM, Hartung-Knapp method, trim-and-fill
    • RevMan 5 imports
    • install.packages("meta")
    • Website: https://cran.r-project.org/package=meta
  3. metaplus

    • Robust meta-analysis using t-distribution
    • Mixture of normals for random effects
    • Outlier detection via testOutliers() and outlierProbs()
    • install.packages("metaplus")
    • Language: R

Network Meta-Analysis

  1. netmeta ⭐ Most Popular

    • Frequentist network meta-analysis
    • 3,645 monthly downloads
    • Graph-theoretic methods
    • Component network meta-analysis
    • Inconsistency assessment via netheat() and netsplit()
    • install.packages("netmeta")
    • Language: R
  2. gemtc

    • Bayesian NMA via JAGS/OpenBUGS/WinBUGS
    • Consistency and inconsistency models
    • 1,781 monthly downloads
    • install.packages("gemtc")
    • Language: R
  3. BUGSnet GitHub stars

    • PRISMA/ISPOR/NICE-DSU compliant outputs
    • Bayesian NMA using JAGS
    • League tables via league.table()
    • SUCRA plots via sucra()
    • install.packages("BUGSnet")
    • Language: R
  4. multinma

    • Network meta-analysis and meta-regression
    • Individual patient data (IPD) and aggregate data
    • install.packages("multinma")
    • Language: R
  5. pcnetmeta

    • Arm-based network meta-analysis
    • Treatment-specific parameters
    • Bayesian framework
    • install.packages("pcnetmeta")
    • Language: R
  6. bnma

    • Bayesian network meta-analysis
    • Various model specifications
    • install.packages("bnma")
    • Language: R
  7. crossnma

    • Cross-design and cross-format NMA
    • Handles different study designs
    • install.packages("crossnma")
    • Language: R

Bayesian Meta-Analysis

  1. brms GitHub stars

    • Highly flexible Bayesian multilevel models
    • Stan backend
    • Meta-analysis as special case of hierarchical models
    • Formula: y | se(se_y) ~ 1 + (1|study)
    • install.packages("brms")
    • Language: R
  2. bayesmeta

    • Bayesian random-effects meta-analysis
    • No MCMC (faster, reproducible)
    • Shrinkage estimates and prediction intervals
    • Flexible priors
    • 919 monthly downloads
    • install.packages("bayesmeta")
    • Language: R
  3. MetaStan

    • Binomial-normal hierarchical models
    • Weakly informative priors
    • Using Stan
    • install.packages("MetaStan")
    • Language: R
  4. baggr

    • Bayesian meta-analysis using Stan
    • Hierarchical models
    • Graphical facilities
    • install.packages("baggr")
    • Language: R
  5. metaBMA

    • Bayesian model averaging
    • Variety of priors
    • Custom prior definition
    • install.packages("metaBMA")
    • Language: R

Specialized Meta-Analysis Packages

  1. MAd

    • Meta-analysis of standardized mean differences
    • Range of graphics
    • install.packages("MAd")
    • Language: R
  2. psychmeta

    • Hunter-Schmidt method for psychometric meta-analysis
    • Reliability corrections
    • install.packages("psychmeta")
    • Language: R
  3. metap

    • Meta-analysis of significance values
    • Edgington, Fisher, Lancaster, Stouffer, Tippett, Wilkinson methods
    • install.packages("metap")
    • Language: R
  4. robumeta

    • Robust variance estimation
    • For dependent effect sizes
    • install.packages("robumeta")
    • Language: R
  5. POMADE

    • Power analysis for meta-analysis
    • Dependent effect sizes
    • install.packages("POMADE")
    • Language: R
  6. metamisc

    • Diagnostic and prognostic studies
    • Bayesian methods with suggested priors
    • install.packages("metamisc")
    • Language: R
  7. dosresmeta

    • Dose-response meta-analysis
    • install.packages("dosresmeta")
    • Language: R
  8. metamedian

    • Meta-analyzes studies reporting medians
    • install.packages("metamedian")
    • Language: R
  9. mixmeta

    • Multivariate and multilevel meta-analysis
    • install.packages("mixmeta")
    • Language: R
  10. mvmeta

    • Multivariate meta-analysis
    • install.packages("mvmeta")
    • Language: R

Visualization Packages

  1. robvis ⭐ Essential GitHub stars

    • Risk-Of-Bias VISualization
    • Publication-quality figures for RoB 2, ROBINS-I, ROBINS-E, QUADAS-2, QUIPS
    • Colorblind-friendly schemes
    • Over 1,500 academic citations
    • devtools::install_github("mcguinlu/robvis")
    • Web app: https://mcguinlu.shinyapps.io/robvis (no R required)
    • Language: R
  2. forestplot

    • Highly customizable forest plots
    • Publication-quality graphics
    • install.packages("forestplot")
    • Language: R
  3. forestploter

    • Advanced forest plots
    • Flexible layouts
    • install.packages("forestploter")
    • Language: R
  4. metaviz

    • Advanced visualization for meta-analysis
    • Influence plots and cumulative plots
    • install.packages("metaviz")
    • Language: R
  5. viscomp GitHub stars

    • Visualizes multi-component interventions in NMA
    • Heat plots, violin plots, waterfall plots
    • Network graphs
    • Language: R
  6. dmetar

    • Companion to "Doing Meta-Analysis in R"
    • Utility functions for meta/metafor
    • Heterogeneity tests and utilities
    • devtools::install_github("MathiasHarrer/dmetar") (not on CRAN)
    • Language: R
  7. metaDigitise

    • Extracts data from published figures
    • Graphical interface
    • install.packages("metaDigitise")
    • Language: R

Bibliometric Analysis

  1. bibliometrix GitHub stars
  • Comprehensive science mapping
  • Citation analysis, co-citation, coupling
  • Network analysis at multiple levels
  • Visualization capabilities
  • Web interface: Biblioshiny
  • install.packages("bibliometrix")
  • Website: https://www.bibliometrix.org/

PRISMA Flow Diagrams

  1. PRISMA2020 ⭐ Official GitHub stars
  2. livingPRISMAflow GitHub stars
  • Living systematic review diagrams
  • PRISMA checklist conformance
  • Language: R

Deduplication

  1. ASySD GitHub stars
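The fuzzy matching that deduplication tools such as ASySD and the Systematic Review Accelerator rely on can be approximated with Python's standard-library `difflib`. This is a simplified illustration, not any tool's actual algorithm; the `dedupe` helper and the 0.9 similarity threshold are assumptions:

```python
from difflib import SequenceMatcher

def dedupe(records, threshold=0.9):
    """Keep the first record of each group of near-identical titles.

    A record is dropped when its normalized title is at least
    `threshold` similar to an already-kept record's title."""
    kept = []
    for rec in records:
        title = rec["title"].lower().strip()
        is_new = all(
            SequenceMatcher(None, title, k["title"].lower().strip()).ratio() < threshold
            for k in kept
        )
        if is_new:
            kept.append(rec)
    return kept

records = [
    {"title": "Deep learning for diagnosis"},
    {"title": "Deep Learning for Diagnosis."},   # near-duplicate: dropped
    {"title": "Climate adaptation in cities"},
]
print([r["title"] for r in dedupe(records)])
```

Production tools also compare authors, year, and DOI and use smarter normalization; prefer ASySD or SRA for real reviews.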

Automation & Workflow Tools

  1. Systematic Review Accelerator (SRA) GitHub stars

    • Reference deduplication
    • Fuzzy matching for semi-identical refs
    • Merge multiple search sources
    • Developed at Bond University
    • Language: JavaScript/Node.js
  2. Jupyter Guide for Reproducible Research GitHub stars

    • Crowdsourced guidelines
    • Ten Simple Rules companion
    • Reproducible SLR workflows
    • Language: Jupyter Notebooks
  3. Ten Rules for Jupyter GitHub stars

    • Best practices for notebooks
    • Sharing computational analyses
    • Reproducibility guidelines
    • Language: Jupyter Notebooks

Text Mining Resources

  • text_mining_resources GitHub stars
    • Learning resources for text mining
    • NLP techniques and tools
    • Seven-step framework
    • Document clustering, entity extraction

GitHub Topic Collections

Explore curated collections of repositories:

Key Datasets on GitHub

  • SYNERGY Dataset GitHub stars
    • Open dataset for study selection
    • 169,288 academic works
    • 26 systematic reviews
    • Benchmarking ML models

Installation Quick Start

Python Environment:

# Create virtual environment
python -m venv slr-env
source slr-env/bin/activate  # On Windows: slr-env\Scripts\activate

# Screening & AI-powered tools
pip install asreview lattereview systematic-reviewpy

# Bibliometrics & literature analysis
pip install litstudy pybibx metaknowledge

# Meta-analysis
pip install PythonMeta pymare nimare

# API access for citation tracking
pip install semanticscholar pys2 pyalex

# PDF extraction & OCR
pip install marker unstructured pytesseract paddleocr

# All-in-one installation
pip install asreview lattereview systematic-reviewpy litstudy pybibx metaknowledge PythonMeta pymare semanticscholar pyalex

R Environment:

# Search strategy development
install.packages(c("litsearchr", "searchAnalyzeR"))

# Screening & workflow
install.packages(c("metagear", "revtools"))

# Citation tracking
install.packages(c("citationchaser", "openalexR"))

# Meta-analysis core
install.packages(c("metafor", "meta", "metaplus"))

# Network meta-analysis
install.packages(c("netmeta", "gemtc", "BUGSnet", "multinma", "pcnetmeta", "bnma", "crossnma"))

# Bayesian meta-analysis
install.packages(c("brms", "bayesmeta", "MetaStan", "baggr", "metaBMA"))

# Specialized meta-analysis
install.packages(c("MAd", "psychmeta", "metap", "robumeta", "POMADE", "metamisc", "dosresmeta", "metamedian", "mixmeta", "mvmeta"))

# Visualization (robvis and dmetar are installed from GitHub)
devtools::install_github("mcguinlu/robvis")
devtools::install_github("MathiasHarrer/dmetar")
install.packages(c("forestplot", "forestploter", "metaviz", "metaDigitise"))

# Bibliometrics & PRISMA
install.packages(c("bibliometrix", "PRISMA2020"))

# Deduplication
install.packages("ASySD")

# All-in-one installation (core packages)
install.packages(c("litsearchr", "metagear", "revtools", "citationchaser", "openalexR", "metafor", "meta", "netmeta", "brms", "bibliometrix", "PRISMA2020", "ASySD", "forestplot", "metaviz"))

Node.js Tools:

# Search translation
npm install sra-polyglot

Contributing to Open Source

Many of these tools welcome contributions:

  • 🐛 Report bugs and issues
  • 📖 Improve documentation
  • 💡 Suggest features
  • 🔧 Submit pull requests
  • ⭐ Star repositories you find useful

Datasets & Benchmarks

CLEF eHealth Datasets:

  • CLEF eHealth Evaluation Lab
    • Technology Assisted Reviews (TAR) datasets
    • 50+ Diagnostic Test Accuracy reviews
    • Freely available for research
    • TAR 2017-2019 benchmarks

Standard Benchmarks:

  • 20-Newsgroups - Text classification
  • TREC Total Recall - Information retrieval
  • CLEF eHealth TAR 2019 - Most widely used SLR benchmark

Research Datasets:


Learning Resources

Comprehensive Methodology Guides:

  1. Cochrane Handbook for Systematic Reviews of Interventions ⭐ Gold Standard

    • Version 6.5 (current as of 2024)
    • Free online access
    • Key chapters:
      • Chapter 2: Determining scope and questions
      • Chapter 3: Defining eligibility criteria (PICO)
      • Chapter 9: Summarizing study characteristics
      • Chapter 17: Intervention complexity
    • Updated continuously with Technical Supplement
  2. PRISMA 2020 Statement ⭐ Essential

  3. PRISMA Extensions:

  4. JBI Manual for Evidence Synthesis ⭐ Comprehensive

  5. Campbell Collaboration Standards

  6. Centre for Reviews and Dissemination (CRD) Guidance ⭐ Comprehensive

    • 300+ page guide covering all stages
    • From protocol to dissemination
    • Includes: effectiveness, tests, qualitative research, economic evaluations
    • Homepage: https://www.york.ac.uk/crd/guidance/

University Library Guides:

Comprehensive Multi-Page Guides:

  1. Cornell University - Evidence Synthesis

    • All 12 steps from protocol development through synthesis
    • Framework comparisons (PICO vs SPIDER vs PEO)
    • Reporting standards
  2. UNC Chapel Hill - Systematic Reviews

  3. University College London (UCL)

    • Collaboration with EPPI-Centre
    • Formulating questions
    • Searching databases
    • Appraising studies
    • Synthesis methods
  4. UCLA Library

    • Methodology requirements
    • Decision trees
    • Primer resources
  5. University of Cambridge

    • Medical Library guide
    • Stages visualization
    • Timeline guidance
    • Critical appraisal

Additional University Guides:

Online Courses:

University Guides:

Video Tutorials:

Communities & Forums:


Specialized Review Types & Methodologies

Living Systematic Reviews

What are Living Systematic Reviews?

  • Continuously updated systematic reviews
  • Evidence updated as new studies emerge
  • Daily or regular automated searches
  • Critical for rapidly evolving fields

Platforms & Tools:

  1. L·OVE (Living OVerview of Evidence) ⭐ Largest Platform

    • Maps and organizes ~300,000 systematic reviews
    • Epistemonikos database
    • AI algorithms and expert networks
    • Daily evidence updates
    • PICO format organization
    • Notifications for new evidence
    • Customized reports
  2. LIvE (Living Interactive Evidence Synthesis)

    • Living Interactive Systematic Reviews (LISRs)
    • 5 major components: automated search, scanner, extractor, analyzer, tabulator
    • Three pathways: conventional, semi-automated human-in-the-loop, AI-powered
  3. Pitts.ai

    • Web application for living systematic reviews
    • Network meta-analysis support
    • AI-driven data extraction from full-text articles
    • Mission: living systematic reviews in every disease
  4. Cochrane Living Evidence Network

    • Guidance documents
    • Tech enablers: Covidence, MAGICapp, Screen4Me, RCT-classifier, RevMan Replicant, Systematic Review Accelerator
    • Updated practical guidance (December 2019)
  5. DistillerSR

    • Supports living review methodology
    • Screening and data extraction tools
    • PRISMA checklist integration

Key Guidance:

Rapid Review Methodologies

What are Rapid Reviews?

  • Accelerated systematic reviews (weeks to months vs. 12-18 months)
  • Balance timeliness with rigor
  • Common shortcuts: limited databases, single screener, no quality assessment
  • Used for time-sensitive policy decisions

Resources & Guidance:

  1. NCCMT Rapid Review Guidebook ⭐ Essential

  2. Cochrane Rapid Reviews Interim Guidance

    • March 2020 guidance
    • Cochrane Rapid Reviews Methods Group
    • Specific methodological standards
  3. WHO EMRO Training Package

    • Comprehensive training materials
    • Webinars available
    • Ottawa Hospital Research Institute experiences

Methodological Papers:

University Guides:

Scoping Review Tools & Frameworks

What are Scoping Reviews?

  • Map available evidence in a field
  • Identify key concepts, gaps, and types of evidence
  • Broader than systematic reviews
  • Don't typically assess quality

Frameworks:

  1. Arksey & O'Malley Framework (2005) - Original

    • Six-stage iterative process
    • Identifying research questions → identifying studies → study selection → charting data → collating/summarizing/reporting → optional consultation
  2. Enhanced Frameworks:

  3. JBI Scoping Review Methodology ⭐ Current Standard

  4. PRISMA-ScR - Reporting Standard

University Guides:

Umbrella Review Tools

What are Umbrella Reviews?

  • Reviews of systematic reviews
  • Synthesize evidence from multiple systematic reviews
  • High-level evidence synthesis
  • Also called "overview of reviews" or "meta-reviews"

Quality Assessment Tools:

  1. AMSTAR 2 ⭐ Most Used (~80% of umbrella reviews)

    • 16-item critical appraisal tool
    • See Quality Assessment section above
  2. ROBIS

    • Risk Of Bias In Systematic reviews
    • Three-phase instrument
    • Focuses on bias (complementary to AMSTAR 2)
  3. JBI Critical Appraisal Checklist

    • Systematic review checklist for umbrella reviews
    • Recommended by JBI (not validated)

Guidance Documents:

Overlap Management:

  • GROOVE (Graphical representation of overlap for overviews)
  • CCA index (Corrected covered area) for managing primary study overlap

Realist Review Frameworks

What are Realist Reviews?

  • Theory-driven synthesis
  • Explains "what works, for whom, in what circumstances, and why"
  • Uses Context-Mechanism-Outcome (CMO) configurations
  • Based on Pawson and Tilley's realist evaluation

RAMESES Standards:

  1. RAMESES Project ⭐ Gold Standard

  2. RAMESES I - Realist Syntheses

  3. RAMESES II - Realist Evaluations

Training & Resources:

Protocol Registration Platforms

Why Register?

  • Increases transparency and reduces publication bias
  • Prevents duplication of effort
  • Some journals require pre-registration

Platforms:

  1. OSF (Open Science Framework) Registries ⭐ Most Flexible

  2. INPLASY - Specialized SLR Registry

    • Launched March 2020
    • Accepts: systematic reviews, scoping reviews, meta-analyses, umbrella reviews, rapid reviews, methodological reviews, meta-research
    • INPLASY unique identifiers + DOIs
    • ORCID automatic updates
    • 94%+ protocols published within 24 hours
    • Over 4,658 protocols (as of March 2023)
    • Registration fee
    • Analysis: https://pubmed.ncbi.nlm.nih.gov/37588882/
  3. protocols.io

  4. Research Registry

    • Systematic review/meta-analysis registry
    • Immediate visibility
    • Registration fee, DOIs provided
  5. PROSPERO - Health Sciences

    • International registry (mentioned in Planning & Protocol section)
    • Required for many health journals
    • Free but can have long approval times

Comparison Resource:

Qualitative Synthesis Methods

Common Approaches:

  1. Meta-ethnography (Noblit & Hare 1988) - Most Common

    • 7-stage approach
    • Reciprocal translation and line-of-argument synthesis
    • Third-order interpretations beyond original studies
    • Best for: Developing analytical findings, lived experiences research
  2. Thematic synthesis (Thomas & Harden 2008)

    • Combines meta-ethnography and grounded theory
    • Line-by-line coding → descriptive themes → analytical themes
    • Best for: Intervention need, appropriateness, acceptability questions
  3. Framework synthesis

    • Uses a priori frameworks to structure synthesis
    • Data extraction into predetermined frameworks
    • Modified iteratively
    • Best for: Theory-driven reviews, policy questions with clear structure
  4. Meta-aggregation

    • JBI approach
    • Assembling findings through categorization
    • Credibility ratings
    • Best for: Straightforward evidence aggregation
  5. Critical Interpretive Synthesis (CIS)

    • Iterative approach problematizing literature
    • Elastic terminology, multi-disciplinary teams
    • Best for: Broad review questions, generating theory, cross-disciplinary topics

⚠️ Common Challenges & How to Avoid Them

Based on experienced researchers' insights, here are the most common pitfalls and how to avoid them:

1. Poor Search Strategy

❌ Mistake: Search too narrow (miss relevant papers) or too broad (overwhelmed with irrelevant results)

✅ Solution:

  • Use PICO/PICOT frameworks to structure your question
  • Combine synonyms with OR: (machine learning OR deep learning OR neural network*)
  • Combine concepts with AND: (machine learning) AND (healthcare) AND (diagnosis)
  • Test search strings on pilot searches (should find known key papers)
  • Document all search strings and dates for reproducibility

Example Search String:

("systematic review" OR "literature review" OR "meta-analysis") AND ("machine learning" OR "deep learning" OR "artificial intelligence" OR AI) AND (healthcare OR medical OR clinical) 
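Assembling such strings by hand is error-prone. A small helper (a hypothetical `build_query` that quotes multi-word phrases) shows the OR-within-concept, AND-across-concepts pattern:

```python
def build_query(concepts):
    """OR together synonyms within each concept group,
    then AND the groups. Multi-word phrases are wrapped in quotes."""
    groups = [
        "(" + " OR ".join(f'"{t}"' if " " in t else t for t in terms) + ")"
        for terms in concepts
    ]
    return " AND ".join(groups)

query = build_query([
    ["machine learning", "deep learning", "AI"],
    ["healthcare", "clinical"],
])
print(query)
# ("machine learning" OR "deep learning" OR AI) AND (healthcare OR clinical)
```

Database syntax differs (field tags, truncation, proximity operators), so verify the translated string in each database before running the final search.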

2. Inadequate Protocol Planning

❌ Mistake: Starting without a clear protocol, leading to scope creep and inconsistent decisions

✅ Solution:

  • Write protocol BEFORE searching (use PRISMA-P checklist)
  • Register protocol on PROSPERO for transparency
  • Define clear inclusion/exclusion criteria upfront
  • Pre-specify all outcomes and analysis plans
  • Get protocol peer-reviewed by colleagues

3. Poor Inter-Rater Reliability

❌ Mistake: Reviewers disagree on study selection, leading to bias

✅ Solution:

  • Use at least 2 independent reviewers for screening
  • Pilot screening on 50-100 abstracts to calibrate
  • Measure agreement (Cohen's Kappa ≥ 0.6 is acceptable)
  • Have clear decision rules for conflicts
  • Use tools like Rayyan or Covidence for blind screening

4. Data Extraction Errors

❌ Mistake: Inconsistent or incorrect data extraction, transposition errors

✅ Solution:

  • Create standardized extraction forms (pilot test on 5-10 papers)
  • Use double data extraction (2 reviewers extract independently)
  • Cross-check numerical data carefully
  • Use tools: Covidence, DistillerSR, or structured spreadsheets
  • Keep extraction audit trail

5. Inadequate Documentation

❌ Mistake: Can't reproduce the review or explain decisions months later

✅ Solution:

  • Document everything: search dates, databases, filters used
  • Keep decision logs (why studies were excluded)
  • Save all search results and versions
  • Use version control (GitHub) for protocol and data
  • Follow PRISMA 2020 reporting guidelines

6. Ignoring Grey Literature

❌ Mistake: Only searching academic databases, missing important studies

✅ Solution:

  • Search conference proceedings, dissertations, preprints
  • Check clinical trial registries (ClinicalTrials.gov)
  • Contact key authors for unpublished work
  • Search: Google Scholar, OpenGrey, ProQuest Dissertations

7. Poor Time Management

❌ Mistake: Underestimating time required (SLRs typically take 6-18 months!)

✅ Solution:

  • Realistic timeline: Planning (1-2 months) → Searching (1 month) → Screening (2-4 months) → Extraction (2-3 months) → Analysis (1-2 months) → Writing (2-3 months)
  • Use project management tools: Notion, Trello, or OSF
  • Set weekly goals and track progress
  • Build in buffer time for unexpected issues

8. Tool Overwhelm

❌ Mistake: Trying to learn too many tools at once

✅ Solution:

  • Start with simple, free tools: Rayyan + Zotero
  • Add complexity only when needed
  • Use our Quick Start Guide based on your role
  • Stick to one tool per task (don't switch mid-review)

9. Poor Quality Assessment

❌ Mistake: Skipping quality assessment or using wrong tools

✅ Solution:

  • Choose appropriate tool for study type:
  • Have 2 independent assessors
  • Report quality scores in results
  • Consider sensitivity analysis (excluding low-quality studies)

10. Publication Bias

❌ Mistake: Only including published studies (positive results bias)

✅ Solution:

  • Search trial registries for unpublished trials
  • Use funnel plots to detect publication bias
  • Contact authors for unpublished data
  • Report potential limitations in discussion

📝 Practical Examples & Templates

Search String Examples

Example 1: Machine Learning in Healthcare

Title/Abstract/Keywords:
("machine learning" OR "deep learning" OR "neural network*" OR "artificial intelligence" OR AI)
AND (diagnos* OR predict* OR classif* OR detect*)
AND (medical OR clinical OR healthcare OR patient*)

Filters: English, 2015-2024, Peer-reviewed

Example 2: Software Engineering Education

("software engineering" OR "computer science" OR "programming") AND (education* OR teaching OR learning OR pedagog* OR curriculum) AND ("systematic review" OR "literature review" OR survey OR "state of the art") 

Example 3: Climate Change Adaptation

("climate change" OR "global warming" OR "climate crisis") AND (adapt* OR resilien* OR mitigation) AND (urban OR cit* OR "built environment") AND (policy OR policies OR governance OR planning) 

PICO/PICOT Framework Examples

Example: Clinical Question

  • Population: Adults with Type 2 diabetes
  • Intervention: Continuous glucose monitoring
  • Comparison: Standard blood glucose monitoring
  • Outcome: HbA1c levels, hypoglycemic events
  • Time: At least 6 months follow-up

Example: Technology Question (PICO adaptation)

  • Problem: Software bug prediction
  • Intervention: Machine learning approaches
  • Comparison: Traditional static analysis
  • Outcome: Precision, recall, F1-score

Downloadable Protocol Templates

  1. PRISMA-P Template (Word)

    • Official PRISMA protocol checklist
    • 17-item checklist
    • Free to use
  2. Covidence Protocol Guide (PDF)

    • Practical guide with examples
    • Templates included
    • Real-world case studies
  3. OSF Protocol Template

    • Preregistration template
    • For systematic reviews, scoping reviews, meta-analyses
    • Open access under CC license
  4. York CRD Template (PDF)

    • Systematic review protocol outline
    • Structured format
    • Widely used in health sciences

Inclusion/Exclusion Criteria Examples

Example 1: Quantitative Studies

Inclusion:

  • Peer-reviewed journal articles
  • Published 2015-2024
  • Randomized controlled trials (RCTs)
  • Adults aged 18-65
  • Intervention duration ≥ 12 weeks
  • Reported outcomes: quantitative measures with statistics

Exclusion:

  • Conference abstracts, posters
  • Non-English language
  • Animal studies
  • Case reports, editorials, reviews
  • Duplicate publications
  • No full text available

Example 2: Mixed Methods

Inclusion:

  • Empirical studies (qualitative, quantitative, mixed)
  • Focus on software engineering practices
  • Published in ACM, IEEE, or top-tier SE venues
  • Full research papers (≥ 8 pages)

Exclusion:

  • Opinion pieces, position papers
  • Tool demos without evaluation
  • Studies with <10 participants (qualitative)
  • Pre-2010 (outdated technology)

Data Extraction Form Template

| Field | Example Value | Notes |
| --- | --- | --- |
| Study ID | Smith2024 | FirstAuthorYear |
| Title | ML for Medical Diagnosis | |
| Authors | Smith, J., Doe, A. | |
| Year | 2024 | |
| Journal | Nature Medicine | |
| Study Design | RCT | RCT/Cohort/Case-Control |
| Sample Size | n=500 | |
| Population | Type 2 Diabetes patients | |
| Intervention | CNN-based diagnosis | |
| Comparison | Traditional diagnosis | |
| Outcome Measures | Accuracy, Sensitivity, Specificity | |
| Main Results | Acc: 95%, Sens: 92%, Spec: 97% | |
| Quality Score | 8/10 (Cochrane RoB) | Low/Moderate/High |
| Notes | Large sample, well-designed | |
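For programmatic extraction, the same form can be captured as a typed record. This is a sketch only: field names are adapted from the template, and the `missing()` completeness check is a hypothetical helper:

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class ExtractionRecord:
    """One row of the data extraction form (fields mirror the template)."""
    study_id: str                       # e.g. "Smith2024" (FirstAuthorYear)
    title: str
    year: int
    study_design: str                   # RCT / Cohort / Case-Control
    sample_size: Optional[int] = None
    main_results: str = ""
    quality_score: str = ""             # e.g. "8/10 (Cochrane RoB)"

    def missing(self):
        """Names of fields still unset, for completeness checks."""
        return [f.name for f in fields(self)
                if getattr(self, f.name) in (None, "")]

rec = ExtractionRecord("Smith2024", "ML for Medical Diagnosis", 2024, "RCT")
print(rec.missing())  # ['sample_size', 'main_results', 'quality_score']
```

A list of such records exports cleanly to CSV and keeps an audit trail under version control, which plain spreadsheets make harder.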

PRISMA Flow Diagram Template

You can generate this automatically with the PRISMA2020 R package or its companion Shiny web app.


❓ FAQ for Young Researchers

Getting Started

Q: How long does a systematic review take? A: Typically 6-18 months depending on scope. Budget realistic time:

  • Small review (50-200 papers): 6-9 months
  • Medium review (200-500 papers): 9-12 months
  • Large review (500+ papers): 12-18 months

Q: Can I do a systematic review alone? A: Not recommended! You need at least 2 reviewers for:

  • Independent screening (reduce bias)
  • Data extraction validation
  • Quality assessment
  • Conflict resolution However, you can be the lead with 1-2 collaborators.

Q: Do I need to register my protocol? A: Strongly recommended (required for some journals):

  • Health sciences: PROSPERO (required)
  • Other fields: OSF Registries (recommended)
  • Prevents duplication and selective reporting

Q: How many databases should I search? A: Minimum 3-4 major databases for your field:

  • Health: PubMed + Cochrane + Embase + Scopus
  • Engineering/CS: IEEE Xplore + ACM + Scopus + Web of Science
  • Social Sciences: PsycINFO + ERIC + Scopus + Web of Science

Search Strategy

Q: How do I know if my search string is good? A: Test it:

  1. Should find 3-5 known key papers (seed papers)
  2. Precision > 5% (not too broad)
  3. Recall > 90% (not too narrow)
  4. Validated by a librarian or search expert
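Given a set of known key papers, the precision and recall checks above reduce to simple set arithmetic. The `search_performance` helper and the record IDs below are illustrative:

```python
def search_performance(retrieved_ids, relevant_ids):
    """Precision and recall of a search against a known-relevant gold set."""
    retrieved, relevant = set(retrieved_ids), set(relevant_ids)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# 100 retrieved records; 4 known key papers, 3 of them retrieved
precision, recall = search_performance(range(1, 101), [1, 2, 3, 200])
print(precision, recall)  # 0.03 0.75
```

True recall is unknowable until screening is complete, so recall against seed papers is only a proxy; a missed seed paper is a strong signal the string is too narrow.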

Q: Should I use MeSH terms or free text? A: BOTH! Use controlled vocabulary (MeSH, EMTREE) AND free text synonyms:

MeSH Terms: "Diabetes Mellitus, Type 2"[Mesh]
OR
Free Text: (diabetes OR diabetic*) AND (type 2 OR type II OR T2DM)

Q: What if I get 10,000+ results? A: Refine your search:

  1. Add more specific AND terms
  2. Limit date range
  3. Add publication type filters
  4. Consult with search expert
  5. Use AI tools like ASReview for efficient screening

Tools & Software

Q: What's the best free tool for beginners? A: Start with:

  1. Screening: Rayyan (free, unlimited collaborators)
  2. References: Zotero (free, open-source)
  3. Analysis: RevMan (free, from Cochrane)
  4. Writing: Overleaf (free tier for LaTeX)

Q: Is Excel okay for data extraction? A: Yes, for small reviews (<50 studies). For larger reviews:

  • Use dedicated tools: Covidence, DistillerSR
  • Or structured databases with version control
  • Excel pros: familiar, flexible
  • Excel cons: error-prone, no audit trail, hard to collaborate

Q: Can I use ChatGPT for systematic reviews? A: ⚠️ Use with caution:

  • ❌ DON'T use for: screening decisions, quality assessment, data extraction
  • ✅ CAN use for: brainstorming search terms, paraphrasing text, organizing notes
  • ✅ BETTER alternatives: Elicit, Consensus, Scite (designed for research)
  • ALWAYS verify AI outputs with original sources!

Screening & Selection

Q: How many reviewers do I need for screening? A: Minimum 2 independent reviewers:

  • Title/abstract screening: 2 reviewers independently
  • Full-text screening: 2 reviewers independently
  • Conflicts: resolved by discussion or 3rd reviewer
  • Pilot screening: ~10% of records to calibrate

Q: What's an acceptable inter-rater agreement? A: Measured by Cohen's Kappa (κ):

  • κ > 0.80: Excellent
  • κ = 0.60-0.80: Good (acceptable)
  • κ = 0.40-0.60: Moderate (needs improvement)
  • κ < 0.40: Poor (recalibrate criteria)
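Cohen's kappa corrects raw agreement for the agreement expected by chance alone. A self-contained sketch (the `cohens_kappa` helper is hypothetical; labels are whatever your screening decisions use):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two reviewers' include/exclude decisions.
    Undefined when expected agreement is 1 (all labels identical)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    expected = sum((count_a[label] / n) * (count_b[label] / n)
                   for label in set(count_a) | set(count_b))
    return (observed - expected) / (1 - expected)

a = ["include", "exclude", "include", "exclude"]
b = ["include", "exclude", "exclude", "exclude"]
print(round(cohens_kappa(a, b), 2))  # 0.5
```

A value of 0.5 falls in the "moderate" band above, signaling that the inclusion criteria need recalibration before continuing.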

Q: Can I use AI to reduce screening workload? A: Yes! AI tools can reduce workload by 60-95%:

  • ASReview: 95% workload reduction
  • Abstrackr: ~67% reduction
  • Must still manually review AI suggestions
  • Document AI tool use in methods

Quality & Analysis

Q: Do I have to assess study quality? A: YES! It's a core requirement of systematic reviews:

  • Shows which studies are trustworthy
  • Helps interpret conflicting results
  • Allows sensitivity analyses
  • Use appropriate tools: Cochrane RoB 2, JBI Tools

Q: When should I do meta-analysis vs narrative synthesis? A: Meta-analysis requirements:

  • ✅ Studies measure same outcome
  • ✅ Sufficient statistical data reported
  • ✅ Studies are comparable (population, intervention)
  • ✅ At least 3-5 studies available
  • ❌ If not: do narrative synthesis
  • Tools: RevMan, R metafor

Q: What if studies are too different (heterogeneous)? A: Options:

  1. Use random-effects meta-analysis (accounts for heterogeneity)
  2. Do subgroup analyses (by population, intervention type, etc.)
  3. Perform meta-regression (explore sources of heterogeneity)
  4. Resort to narrative synthesis
  5. Report heterogeneity statistics (I², τ²)

Writing & Reporting

Q: What's the difference between systematic review and literature review? A:

| Feature | Literature Review | Systematic Review |
| --- | --- | --- |
| Research question | Broad | Focused, specific (PICO) |
| Search | Selective | Comprehensive, documented |
| Selection criteria | Implicit | Explicit, pre-defined |
| Quality assessment | Variable | Mandatory, standardized |
| Synthesis | Narrative | Systematic +/- meta-analysis |
| Protocol | Optional | Required (PRISMA-P) |
| Reproducible | No | Yes |

Q: Do I have to follow PRISMA? A: Highly recommended:

  • Many journals REQUIRE PRISMA compliance
  • Improves transparency and reproducibility
  • Makes writing easier (follows logical structure)
  • Use PRISMA 2020 Checklist
  • Generate flow diagram: R package or web tool

Q: Where should I publish my systematic review? A: Depends on field:

  • General: Systematic Reviews (open access)
  • CS/Engineering: ACM CSUR, IEEE COMST
  • Health: Cochrane Library, medical specialty journals
  • Consider: Impact factor, open access options, audience

Common Issues

Q: I found a paper after screening is done. What do I do? A: It happens! Options:

  1. If before analysis: add to full-text screening
  2. If after analysis: mention in limitations
  3. For living reviews: add to next update
  4. Document the paper and why it was missed

Q: What if I can't access a full-text paper? A: Try:

  1. University library access + interlibrary loan
  2. Email corresponding author directly (surprisingly effective!)
  3. ResearchGate request
  4. Legal repositories: PubMed Central, institutional repositories
  5. Last resort: exclude and note in PRISMA diagram

Q: Reviewers disagree - how do we resolve conflicts? A: Protocol should specify:

  1. Discussion between reviewers (most common)
  2. Third reviewer makes final decision
  3. Pre-defined decision rules
  4. Document all conflicts and resolutions
  5. Report agreement statistics (Kappa)

Time-Savers

Q: Any tips to speed up the process? A:

  1. Screening: Use ASReview (up to 95% workload reduction)
  2. Deduplication: ASySD R package
  3. Citation snowballing: ResearchRabbit, Connected Papers
  4. PDF extraction: GROBID for automated extraction
  5. AI assistance: Elicit for quick summaries
  6. Team collaboration: Covidence or Rayyan for parallel work

Q: Can I update an existing systematic review? A: Yes! Systematic Review Updates are valuable:

  • Search from last review's end date
  • Use same criteria (or justify changes)
  • Cite original review
  • Some journals specialize in updates
  • Consider "living systematic reviews" for rapidly evolving fields

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

Guidelines:

  • Add new resources with clear descriptions
  • Include links (verify they work!)
  • Categorize appropriately
  • Indicate if free/paid

License

CC0

To the extent possible under law, the contributors have waived all copyright and related rights to this work.


Acknowledgments

This resource was created to help young researchers navigate the complex world of systematic literature reviews. Special thanks to all the tool developers, open source communities, and institutions making research more accessible.

Good luck with your systematic literature review! 🎓📚


Last Updated: November 2024

Maintainer: Open to community contributions
