Evidence Synthesis in the Social Sciences
Running Searches
Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)
Use the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist to monitor reporting items specifically for the searching component of the systematic review.
Methodological Expectations of Campbell Collaboration Intervention Reviews (MECCIR) Reporting Standards
View the Methodological Expectations of Campbell Collaboration Intervention Reviews (MECCIR) Reporting Standards for more detailed guidance on what should be included concerning the search and other aspects of the review protocol.
Recommended article: Searching for studies: A guide to information retrieval for Campbell systematic reviews
Documenting Decision-making
Documenting decisions made while developing a search strategy is highly beneficial. A record of why specific search terms were included or excluded helps you and your team stay aligned with your protocol, avoid duplicating effort when refining the search, and recall the rationale behind choices made during testing.
Consider using version control as you design and develop your search strategy. Keep a record of the different versions of your search and note changes made in new versions, rather than simply overwriting your search whenever significant changes are made.
Documenting Searches
Searches should be documented so that others can reproduce them and so that reviews can be updated more easily when necessary. Detailed information about the database, platform, and search strategy improves reproducibility. The search date records the state of the database when the searches were run and indicates the date up to which your review is valid.
When you are ready to run your final searches, you should document the following six items:
- The full name of the database
- The platform on which the database was run
- The date of the search, including the date of the most recent searches if the search was updated during the review
- A line-by-line search strategy, precisely as it was run in the database
- Any filters or limits that were applied
- The number of search results
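As a sketch, the six items above could be captured in a simple structured record. The following Python example uses invented values (the database name, search lines, and counts are illustrative, not a standard schema):

```python
# Minimal sketch of a search documentation record.
# All values below are illustrative; field names are not a standard schema.
search_record = {
    "database": "PsycINFO",                  # full name of the database
    "platform": "Ovid",                      # platform the database was searched on
    "search_date": "2024-05-01",             # date of the (most recent) search
    "strategy": [                            # line-by-line strategy, exactly as run
        "1. exp Bullying/",
        "2. (cyberbully* or online harass*).ti,ab.",
        "3. 1 or 2",
    ],
    "filters": ["English language", "2000-current"],  # filters or limits applied
    "n_results": 1432,                       # number of search results
}

# Check that all six reporting items are present before running final searches.
required = {"database", "platform", "search_date", "strategy", "filters", "n_results"}
assert required <= search_record.keys()
```

Keeping one such record per database makes it straightforward to report the search in full and to rerun it when the review is updated.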
Exporting Results
Generally, it is recommended that you export results as an .RIS file. This is the most widely supported file format among citation managers and screening platforms.
Watch this video for support on exporting search results from the ProQuest platform.
De-duplication
You will likely retrieve multiple versions of the same study as you search many databases, and will want to de-duplicate your results before screening. After you've performed your searches and imported the results into your citation management software:
- In RefWorks, from the menu, select Tools > "Find Duplicates." One or more instances of each duplicate record will be automatically selected, allowing for the quick deletion of all duplicate items.
- In Zotero, click on your library's "Duplicate Items" collection. You can resolve duplicates by merging the files.
- In Mendeley, select your folder of interest. Go to your Tools menu and select "Check for Duplicates." Select the details you would like to keep from each document. Click "Merge" to create a single entry containing all the document details.
Here are the steps to get from running a search to being ready to screen the results against your eligibility criteria:
- Run your search in each database and source you identified as relevant to your review. Document the total number of results from each source.
- Export the results of each search to an .RIS file or another file type that can be read by the citation or reference management tool you plan to use.
- Import each set of search results into your citation or reference management tool, keeping track of which results came from which database or source and ensuring that the imported numbers align with the totals you documented for each search.
- Using the citation management tool's deduplication feature, merge duplicate items found in multiple sources. Be sure to confirm that they are true duplicates rather than relying solely on the tool's definition of a duplicate.
- Export the deduplicated set of results as an .RIS file or other file type that can be imported into your screening tool. Document the total number of search results after deduplication.
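The deduplication step above can be sketched in code. This minimal Python example keys duplicates on a normalized title plus year, which is a simplification: real citation managers also compare DOIs, authors, and page numbers, and the sample records are invented.

```python
import re

def normalize(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical titles match."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def deduplicate(records):
    """Keep the first record seen for each (normalized title, year) key."""
    seen = {}
    for rec in records:
        key = (normalize(rec["title"]), rec["year"])
        if key not in seen:
            seen[key] = rec
    return list(seen.values())

# Invented sample records, as if imported from several databases.
records = [
    {"title": "Bullying in Schools", "year": 2019, "source": "ERIC"},
    {"title": "Bullying in schools.", "year": 2019, "source": "PsycINFO"},  # duplicate
    {"title": "Online Harassment", "year": 2021, "source": "Scopus"},
]
unique = deduplicate(records)
# As noted above, confirm candidate merges manually rather than
# relying solely on an automated match.
```

Any automated key like this will miss some duplicates (e.g., differing years for a preprint and its published version) and may over-match, which is why the manual confirmation step remains essential.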