AI Reference Finder — Real, Verifiable DOIs for Biomedical Claims
BioSkepsis is an AI reference finder and citation tool built for biomedical research. Every reference it returns points to a real, indexed, peer-reviewed paper with a verifiable DOI — never a fabricated one. Export directly to Zotero, Mendeley, EndNote, or BibTeX.
Start finding real references free
Verifiable DOIs. One-click Zotero export. No sign-up fee.
Start free

The hallucinated-citation problem is real
When researchers first used ChatGPT and other general-purpose LLMs to find references for medical and scientific claims, a failure mode emerged quickly: the model would return fluent, plausible, authoritative-looking citations that did not exist. Peer-reviewed analyses documented the scale of the problem:
- Bhattacharyya and colleagues (2023, Cureus) examined 178 references generated by ChatGPT in response to medical queries and found that 69 were fabricated entirely — the papers did not exist — and a further substantial fraction contained inaccurate DOIs or author lists [1].
- Later analyses across JAMA-family journals, radiology, urology, and psychiatry reproduced the pattern: double-digit percentages of LLM-generated medical citations were either fabricated or mis-attributed.
For any output that feeds a manuscript, a grant, a thesis, or a clinical decision, an unverifiable reference is a liability — it can trigger desk rejection at submission, correction notices at publication, or worse. An AI reference finder that cannot guarantee real DOIs is not fit for research use.
How BioSkepsis returns real references
BioSkepsis is built around a simple invariant: if a reference appears in the output, it exists in the corpus. Four mechanisms enforce that:
1. Retrieval-first, generation-second
Most hallucinated citations come from models that generate a reference string directly. BioSkepsis does the opposite. It retrieves candidate papers from its curated corpus of 40M+ biomedical papers first, then uses the LLM layer only to rank, explain, and quote what retrieval returned. There is no step at which a reference is invented from training-data memory.
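The retrieval-first invariant can be illustrated with a toy sketch. Everything here is illustrative, not BioSkepsis internals: `ToyIndex` is a stand-in for the real corpus index, and the scoring is naive keyword overlap. The point is structural — only papers the index actually returns are eligible for citation, so an empty result set yields a decline, never an invented reference.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Paper:
    title: str
    doi: str

class ToyIndex:
    """Stand-in for a corpus index: naive keyword-overlap scoring."""
    def __init__(self, papers):
        self.papers = papers

    def search(self, query, top_k=50):
        terms = set(query.lower().split())
        scored = [(len(terms & set(p.title.lower().split())), p) for p in self.papers]
        # Keep only papers with at least one matching term, best matches first.
        hits = [p for score, p in sorted(scored, key=lambda sp: -sp[0]) if score > 0]
        return hits[:top_k]

def find_references(claim, index, k=10):
    """Retrieval-first: only papers returned by the index can ever be cited."""
    candidates = index.search(claim)
    if not candidates:
        return []  # decline rather than fabricate a reference
    return candidates[:k]
```

In a production system the index would be a semantic retriever over the full corpus and the final ordering would come from an LLM ranking step, but the invariant is the same: the generation layer can only reorder and explain retrieval output, never add to it.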
2. Every citation carries a verifiable DOI
Each returned reference ships with its DOI, PMID (where applicable), journal, year, and authors — all sourced from the canonical record, not regenerated. Click the DOI; it resolves to the publisher's landing page.
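Readers can apply the same check themselves: a registered DOI has a metadata record in the public Crossref REST API, which returns HTTP 200 for a known DOI and 404 otherwise. The helper below is a minimal sketch of that check (the `User-Agent` string is an illustrative placeholder, per Crossref's etiquette guidance):

```python
from urllib.error import HTTPError
from urllib.parse import quote
from urllib.request import Request, urlopen

CROSSREF = "https://api.crossref.org/works/"

def crossref_url(doi: str) -> str:
    """Build the Crossref metadata URL for a DOI."""
    return CROSSREF + quote(doi, safe="/")

def doi_exists(doi: str, timeout: float = 10.0) -> bool:
    """Return True if the DOI is registered with Crossref (HTTP 200)."""
    req = Request(crossref_url(doi), headers={"User-Agent": "doi-check/0.1"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except HTTPError:
        return False  # 404 means the DOI is not registered
```

For example, `doi_exists("10.7759/cureus.39238")` checks the Cureus paper cited at the bottom of this page, while a fabricated DOI returns False.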
3. Biology-native disambiguation
Gene symbols, MeSH descriptors, and Gene Ontology terms are resolved to canonical biological entities before retrieval runs. A query mentioning "BRCA1" does not match papers that happen to use the string in an unrelated context; a query about "lupus" includes papers indexed under SLE even if the string "lupus" does not appear.
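The effect of entity resolution can be shown with a toy synonym table. A real system resolves against MeSH and Gene Ontology rather than a hand-written dict; this sketch only demonstrates why "lupus" should retrieve papers indexed under SLE:

```python
# Toy synonym table; a real system resolves terms against MeSH / Gene Ontology.
SYNONYMS = {
    "lupus": {"lupus", "systemic lupus erythematosus", "sle"},
    "sle": {"lupus", "systemic lupus erythematosus", "sle"},
}

def expand_query_term(term: str) -> set:
    """Map a surface term to the set of strings indexed for its canonical entity."""
    return SYNONYMS.get(term.lower(), {term.lower()})
```

A query term like "Lupus" expands to cover papers that never use that exact string, while an unknown term passes through unchanged.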
4. Declines when evidence is insufficient
If no paper in the corpus adequately supports a claim, BioSkepsis says so rather than returning a weak or off-topic reference. This is the single most important behaviour to check in any AI citation tool.
Use cases
Find references for a claim you have already written
Paste a sentence from a draft manuscript. BioSkepsis returns the 3–10 most supportive peer-reviewed references with the specific passages they cite, each with a verifiable DOI. Accept, reject, or swap references one at a time — no silent insertion.
Build a reference list from a research question
Start from a question ("does metformin affect Alzheimer's disease progression?"). BioSkepsis returns a structured reference list grouped by study type — systematic reviews, RCTs, observational studies, mechanistic — with DOIs and one-line relevance summaries for each.
Replace a shaky ChatGPT citation with a verified one
Paste a citation that a generic LLM produced. BioSkepsis checks whether it exists (often it does not), and — critically — finds real papers that do support the underlying claim. This is the single highest-value workflow for researchers migrating off generic LLMs.
Pull citations for a systematic review search string
Turn a PICO question into a curated reference set across the 40M-paper corpus. Export directly to Zotero, Mendeley, EndNote, BibTeX, or RIS.
Zotero and reference-manager export
BioSkepsis supports one-click export to the major reference managers:
- Zotero — direct push to your library, with collections preserved.
- Mendeley — BibTeX/RIS export, ready to import.
- EndNote — RIS export.
- LaTeX users — BibTeX export with canonical citation keys.
Every exported entry carries DOI, PMID, authors, journal, year, volume, issue, and pages — the fields you need for a conforming bibliography in any major style (APA, Vancouver, AMA, Harvard, MLA, Chicago).
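As an illustration, a BibTeX export of the study cited at the bottom of this page (Bhattacharyya et al., 2023) would carry fields like these (the citation key and field layout are illustrative):

```bibtex
@article{bhattacharyya2023fabricated,
  author  = {Bhattacharyya, M. and Miller, V. M. and Bhattacharyya, D. and Miller, L. E.},
  title   = {High Rates of Fabricated and Inaccurate References in {ChatGPT}-Generated Medical Content},
  journal = {Cureus},
  year    = {2023},
  volume  = {15},
  number  = {5},
  pages   = {e39238},
  doi     = {10.7759/cureus.39238}
}
```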
BioSkepsis vs generic LLM citation generators
| Feature | BioSkepsis | Generic LLMs (ChatGPT, Claude, Gemini) |
|---|---|---|
| Citation method | Retrieval-first from a curated corpus | Generation from model memory (unless browsing) |
| Guarantees real DOIs | Yes | No — documented fabrication rates |
| Biomedical-specific corpus | 40M+ peer-reviewed papers | Open-web training data |
| Biology-native disambiguation | Yes (GO + MeSH + genes) | No |
| Declines when evidence insufficient | Yes (explicit) | No — typically confabulates |
| One-click Zotero export | Yes | No |
| Safe for manuscript submission | Yes (cited passages shown for verification) | Not without manual DOI verification |
The takeaway: a generic artificial intelligence citation generator can be useful for brainstorming what a reference list might look like. An AI reference finder built for research — like BioSkepsis — is what you use when the references have to be real.
Frequently asked questions
Does BioSkepsis ever invent citations?
No. Every reference is drawn from the curated corpus of 40M+ biomedical papers and carries a verifiable DOI. If the corpus does not contain sufficient evidence for a claim, BioSkepsis declines to return a reference rather than inventing one. This contrasts with generic LLMs, which peer-reviewed analyses have shown fabricate a substantial fraction of medical citations (Bhattacharyya et al., Cureus, 2023).
Can I export references to Zotero?
Yes. One-click export to Zotero, plus support for Mendeley, EndNote, BibTeX, and RIS. Every entry carries DOI, PMID, authors, journal, year, volume, issue, and pages.
Does it work for non-biomedical fields?
BioSkepsis is optimised for biomedical and life-science literature (biology, medicine, pharma, biotech, agricultural/veterinary/environmental science). For papers outside that scope — education, law, humanities — a generalist tool may return better coverage. For anything life-science, BioSkepsis is the higher-precision option.
Can it generate citations in APA, Vancouver, AMA, or Harvard format?
Yes. Exported entries carry the full metadata needed for any major style. For in-text citations, use your reference manager (Zotero, Mendeley, EndNote) to apply the target style automatically — the canonical route and the one most journals expect.
How does it handle the citation for a passage I am certain exists?
Paste the passage or the claim, and BioSkepsis returns the most probable source papers with supporting quotations. If the exact passage does not exist in the corpus, BioSkepsis will not invent a match — it will either surface a near-neighbour paper and say so, or decline.
Is it free to try?
Yes. The free tier covers 100 papers per session with no time limit and no credit card. That is enough for most reference-verification workflows and single-paper reference lists.
Find real references free — no credit card
40M+ peer-reviewed biomedical papers. Verifiable DOIs on every result. One-click export to Zotero.
Start free

Source
1. Bhattacharyya M, Miller VM, Bhattacharyya D, Miller LE. High Rates of Fabricated and Inaccurate References in ChatGPT-Generated Medical Content. Cureus 15(5): e39238, 2023. doi:10.7759/cureus.39238