‘Sophisticated global networks’ are gaming journals. A new study warns the fraud is outpacing real science — and Thailand is already feeling the effects

A major new analysis published in the Proceedings of the National Academy of Sciences concludes that “sophisticated global networks” are systematically undermining academic publishing by pushing fraudulent papers into journals at industrial scale — and doing so faster than science can contain them. The researchers find that suspected “paper mill” submissions are doubling every 18 months, far outpacing the overall growth of legitimate research, which doubles roughly every 15 years. The authors warn that without urgent reforms, parts of the scientific literature risk becoming “completely poisoned,” a scenario with direct implications for Thai universities and national research priorities. The study’s key findings and expert warnings were first reported by Times Higher Education, which underscores that existing systems to combat misconduct are struggling to keep up with an increasingly organized underground industry of fake science built on collusion, image manipulation and “journal hopping” to evade detection (Times Higher Education).
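
The gap between those two doubling times is easy to underestimate. A back-of-envelope calculation, a minimal sketch using only the doubling times reported in the study and assuming clean exponential growth, shows how quickly an 18-month cycle compounds:

```python
# Compound-growth comparison using the doubling times reported in the study.
# Assumes idealized exponential growth; real publication counts are noisier.
horizon_years = 15  # roughly how long legitimate research takes to double

fraud_factor = 2 ** (horizon_years / 1.5)    # paper-mill output: doubles every 18 months
science_factor = 2 ** (horizon_years / 15)   # legitimate research: doubles every ~15 years

print(f"Over {horizon_years} years: suspect output x{fraud_factor:.0f}, "
      f"legitimate output x{science_factor:.0f}")
# Over 15 years: suspect output x1024, legitimate output x2
```

On those assumptions, suspect output grows a thousandfold in the time honest output merely doubles, which is the arithmetic behind the authors' "completely poisoned" warning.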

The publication, led by researchers at Northwestern University, synthesized large datasets of retractions, editorial handling records, and networks of image duplications to map how organized fraud spreads across journals and subfields. It characterizes some journal-editor clusters as hubs that accept disproportionately high numbers of problematic papers, and documents how paper mills shift to fresh venues when journals become de-indexed or tighten controls. In interviews about the work, the lead author, a professor of engineering sciences and applied mathematics at Northwestern, described these networks as “essentially criminal organizations, acting together to fake the process of science,” and stressed that “the scientific community needs to police itself better” before norms erode beyond repair (Times Higher Education).

Why this matters in Thailand is not abstract. In recent years, Thai higher education authorities have uncovered a domestic market for purchased papers and suspected paper-mill outputs, with investigations spanning dozens of universities. The Ministry of Higher Education, Science, Research and Innovation (MHESI) has moved against websites selling manuscripts and, in a series of disciplinary actions, universities have dismissed staff found to have submitted ghostwritten or purchased work. The numbers are stark: MHESI ordered probes into 109 academics across 33 institutions after discovering at least five websites peddling academic papers, and several lecturers were fired as cases were substantiated (Bangkok Post, 2023; Bangkok Post, 2024). The scale and speed of the global networks described in the new study help explain how such cases can arise domestically — and why conventional, case-by-case enforcement has struggled to deter them.

The new analysis, titled “The entities enabling scientific fraud at scale are large, resilient, and growing rapidly,” argues that today’s problem is not a scatter of isolated bad actors but an ecosystem. Using network theory and forensic checks, the team linked clusters of authors and editors whose patterns of behavior — including submission and acceptance flows — strongly suggest coordinated efforts to push batches of fake papers through peer review. They also mapped large “image duplication” networks: webs of articles that recycle manipulated or fabricated figures across dozens or even thousands of papers, often concentrated in the same publisher portfolios and appearing in tight time windows. These patterns, the authors argue, are hallmarks of industrialized paper-mill production where image “banks” are re-used to manufacture manuscript lots that get funneled to known permissive editors (PubMed record; Times Higher Education).
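
The paper does not ship reference code, but the core of an image-duplication network is straightforward to sketch: hash every extracted figure, link near-identical pairs, and treat large connected components as candidate clusters. The folder name, hash threshold, and cluster-size cutoff below are illustrative assumptions, not the authors' actual pipeline or parameters:

```python
# Minimal sketch of an image-duplication network: perceptually hash each
# figure, link near-duplicates, and read off connected components.
from pathlib import Path
from PIL import Image
import imagehash            # pip install imagehash
import networkx as nx       # pip install networkx

figures = list(Path("extracted_figures").glob("*.png"))  # hypothetical folder
hashes = {p.name: imagehash.phash(Image.open(p)) for p in figures}

G = nx.Graph()
G.add_nodes_from(hashes)
names = list(hashes)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        if hashes[a] - hashes[b] <= 6:  # small Hamming distance => near-duplicate
            G.add_edge(a, b)

# Large connected components suggest a shared "image bank" reused across papers.
for cluster in nx.connected_components(G):
    if len(cluster) >= 3:
        print(sorted(cluster))
```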

Independent reporting has corroborated key elements of this picture. A detailed New York Times account describes how the Northwestern team compiled a database of retracted and suspected paper-mill articles and used it to uncover dense networks of editors and authors repeatedly working together. The analysis estimated that the true scale of paper-mill output could be orders of magnitude higher than what has been publicly flagged, with suspicious papers doubling every 1.5 years — a pace that overwhelms journal clean-up efforts and threatens to “poison” entire subfields, such as parts of microRNA-oncology research, where retraction and anomaly rates have spiked (New York Times). Chemistry World’s coverage highlights a striking statistic from the study’s editor-network analysis: at PLOS One, just 0.25% of handling editors oversaw 30.2% of the journal’s retractions in 2024, and those editors tended to send submissions to each other more than to the broader pool — a pattern the researchers also observed in datasets from Hindawi journals and IEEE conferences. This concentration implies that even a small number of compromised gatekeepers can enable large volumes of counterfeit work (Chemistry World).
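
The "editors who send submissions to each other" pattern can be tested directly: compare the share of referrals that stay inside a flagged group against what random mixing would predict. A toy sketch, with entirely hypothetical referral data:

```python
# Do flagged editors route submissions to each other more than chance predicts?
# Hypothetical edge list: (referring editor -> handling editor).
referrals = [("A", "B"), ("A", "C"), ("B", "A"), ("B", "C"),
             ("C", "A"), ("D", "E"), ("E", "D"), ("D", "F")]
flagged = {"A", "B", "C"}  # e.g., editors with outlier retraction shares

all_editors = {e for pair in referrals for e in pair}
outgoing = [(s, t) for s, t in referrals if s in flagged]
within = sum(1 for _, t in outgoing if t in flagged)

observed = within / len(outgoing)
# Under random mixing, a referral lands on another flagged editor with:
expected = (len(flagged) - 1) / (len(all_editors) - 1)
print(f"within-cluster share: {observed:.2f} observed vs ~{expected:.2f} expected")
```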

The study also documents “journal hopping,” whereby mills shift their targets as journals get de-indexed or tighten standards. One case involves an entity the researchers linked to an evolving list of journals where publication could be “guaranteed,” with the targets changing as databases like Scopus or Web of Science took action. Retraction Watch recently reported a separate large-scale clean-up: publisher Frontiers began retracting 122 articles across five journals after uncovering a network that manipulated peer review, with the publisher detailing the findings in its own integrity report (Retraction Watch; Frontiers announcement).

For readers in Thailand, these global patterns mirror and contextualize recent local developments. MHESI’s 2023 probe followed suspicious publication bursts by lecturers outside their expertise and uncovered at least five domestic websites selling papers, with most buyers employed at public universities where publication counts influence career progression. The ministry sought police action against the sites and pressed universities to impose severe penalties. Investigations led to dismissals at multiple institutions in 2024, and a ministerial committee demanded faster, evidence-based reporting to standardize enforcement and prevent recurrences (Bangkok Post, 2023; Bangkok Post, 2024). Regional reporting suggests Thailand is not alone; Indonesian institutions have also disciplined faculty over paper-mill ties, underscoring a Southeast Asian dimension to the problem in academic labor markets shaped by “publish or perish” incentives (Library Learning Space).

Experts say those incentives are core to the crisis. The Center for Scientific Integrity, which runs the Retraction Watch Database, has warned that publishers have added thousands of new journal titles to meet demand, while researchers face intensifying pressure to produce papers for jobs, promotion and grants. Investigators of paper mills describe a thriving marketplace selling authorship slots, citations, and entire manuscripts, often using AI paraphrasing to evade plagiarism checks and synthetic images to pass naive screening — trends that make fraud both cheaper and harder to detect at scale (New York Times). Editorial groups and scholarly bodies have also issued cautions. The European Association of Science Editors noted in an August brief that suspected paper-mill products are doubling every 18 months, and that image-duplication networks can encompass thousands of articles clustered in time and journal venue, making post-publication corrective work Sisyphean (EASE).

The new PNAS study’s co-authors argue that the community has been looking in the wrong place. The popular focus on catching individual cheaters misses the architecture enabling fraud at scale: brokers who steer manuscripts to complicit editors, paper mills that re-use content across batches, and a persistent ability to migrate into new journals as old routes close. They call for “enhanced scrutiny” of editorial processes, improved pre-publication detection of fabricated research and image manipulation, and sustained exposure of the networks that coordinate misconduct. One co-author warns that if systems are “not prepared to deal with the fraud that’s already occurring,” they will be even less prepared for what generative AI will enable — including a feedback loop in which AI models are trained on tainted literature, producing more plausibly phrased but false papers that further pollute the record (Times Higher Education).

Thailand has taken notable steps in recent years to define ethical baselines in human research and to professionalize journal standards, including national guidelines overseen by the National Research Council of Thailand (NRCT) and ethics frameworks for human subjects. Institutional networks like the Thai-Journal Citation Index (TCI) have pushed for stronger editorial policies and hosted training sessions on research integrity and publication ethics for university staff and editors — a sign that Thai scholarly infrastructure is engaging with the problem proactively, even as pressures persist (NRCT/Mahidol guidelines; ThaiJo ethics exemplar). But the international evidence now suggests a shift from reactive casework to systemic risk management.

Practically, the PNAS findings point to several prioritized actions. First, vet the editor pipeline. The concentration of retractions under a small subset of editors at certain journals, as described in the study and independent reporting, implies that publishers and indexing bodies should subject editorial appointments and performance to ongoing integrity analytics, flagging anomalies such as outlier acceptance rates, unusually short handling times, and recurrent co-occurrence with later retractions. Where Thai faculty serve on editorial boards or submit to journals with known risk signals, Thai universities and research funders can apply their own integrity screens before counting such publications toward promotions or grant performance (Chemistry World).
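
As a concrete illustration of such integrity analytics, the sketch below flags editors whose statistics are outliers relative to their peers. The column names, toy numbers, and threshold are assumptions chosen for this tiny sample; a production screen would use far larger editor panels and more robust statistics:

```python
# Sketch of editor-level integrity analytics: flag editors whose acceptance
# rate, handling time, or later-retraction share is a statistical outlier.
import pandas as pd

editors = pd.DataFrame({
    "editor": ["A", "B", "C", "D", "E"],
    "acceptance_rate": [0.31, 0.28, 0.92, 0.35, 0.30],   # fraction accepted
    "median_days_to_accept": [94, 102, 12, 88, 97],      # handling time
    "retraction_share": [0.01, 0.00, 0.22, 0.02, 0.01],  # later retracted
})

def zscores(s: pd.Series) -> pd.Series:
    return (s - s.mean()) / s.std(ddof=0)

# Threshold of 1.5 is illustrative for this 5-row toy sample.
flags = (
    (zscores(editors["acceptance_rate"]) > 1.5)
    | (zscores(editors["median_days_to_accept"]) < -1.5)
    | (zscores(editors["retraction_share"]) > 1.5)
)
print(editors.loc[flags, "editor"].tolist())  # ['C'] -- a lead for human review, not a verdict
```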

Second, invest in pre-submission and pre-acceptance screening. Tools for image forensics and text-anomaly detection — including checks for “tortured phrases,” unlikely methodological phrasing, and duplicated western blots or microscopy panels — should be standard in Thai university research offices and journal editorial workflows. While AI makes fakes more realistic, it also enables affordable triage at scale. The Northwestern team’s approach to building networks of duplicated images, for example, could inspire national or TCI-level services that help Thai editors and institutional review committees identify suspicious clusters before publication, not after (New York Times; EASE).
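
"Tortured phrase" screening, a check associated with tools like the Problematic Paper Screener, is among the cheapest to run. A minimal sketch using a handful of documented examples from that literature; real screens maintain thousands of such fingerprints:

```python
# Minimal "tortured phrase" screen: scan manuscript text for known
# AI-paraphrase artifacts. The phrase list is a tiny illustrative sample.
import re

TORTURED_PHRASES = {
    "counterfeit consciousness": "artificial intelligence",
    "bosom peril": "breast cancer",
    "irregular backwoods": "random forest",
    "flag to commotion": "signal to noise",
}

def screen(text: str) -> list[tuple[str, str]]:
    hits = []
    lowered = text.lower()
    for phrase, likely_original in TORTURED_PHRASES.items():
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            hits.append((phrase, likely_original))
    return hits  # matches are signals for human review, not proof of fraud

print(screen("We apply counterfeit consciousness to improve the flag to commotion ratio."))
```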

Third, change incentives that reward quantity over quality. MHESI’s recent enforcement drive arose partly from career schemes that elevate raw publication counts. International editors and metascience experts argue for rebalancing toward rigor, openness and reproducibility, including rewarding pre-registration, data and code sharing, and replication studies. For Thai promotion and grant rules, piloting “narrative CVs” and capping the number of works considered during evaluation (favoring depth over breadth) could reduce demand for dubious shortcuts that mills supply (Chemistry World).

Fourth, coordinate regionally. The Southeast Asian experience shows that mills and their brokers do not respect borders. Thailand could partner with ASEAN research councils and journal consortia to share blacklists of brokered special issues, compromised editors, and fraudulent-service websites, while also aligning due-process standards to avoid false positives. Recent large-batch retractions by prominent publishers reinforce how coordination can scale corrective action when evidence is shared across outlets (Retraction Watch; Frontiers announcement).

There is also an essential cultural dimension. Thailand’s education tradition places a premium on collective harmony and institutional reputation. That can inadvertently discourage whistle-blowing or robust post-publication critique against colleagues. Yet the rise of community-driven platforms like PubPeer has shown how transparent, respectful scrutiny can surface image reuse and statistical anomalies that pre-publication peer review misses. Incorporating structured, non-punitive post-publication review into Thai journals — and protecting those who raise concerns in good faith — could align with Thai values of community responsibility while elevating scientific rigor. Internationally, watchdog data show that retractions hit a record high in 2023, exceeding 10,000, a trend widely interpreted as both a sign of growing fraud and improved detection. For Thailand, normalizing correction is critical: the goal is not to avoid embarrassment but to preserve trust (Forbes summary of Nature analysis).

Some publishers are reorganizing in response. Frontiers joined UNITED2ACT, a cross-industry initiative to counter paper mills, months before its July 2025 retractions announcement, a signal that large houses recognize the need for shared infrastructure to profile risks and exchange intelligence on submission patterns tied to mills (Frontiers UNITED2ACT). For Thai editors, membership in such networks and adoption of common integrity taxonomies could help manage resource constraints in smaller editorial offices.

What happens if the community fails to act? The PNAS authors caution that certain subfields are already so polluted that legitimate researchers avoid them, fearing that investing years into a literature seeded with falsehoods is career-ending. In practical terms, this could compromise Thai strategic priorities in biomedicine, agriculture and engineering if local researchers unknowingly build on corrupted literatures. It also risks misallocating public funds. For example, if grant panels rely on publication counts without integrity checks, mills profit while authentic Thai science stalls.

There is reason for measured optimism: the same network tools that revealed the scope of the problem can be repurposed to defend the literature. Journal analytics can flag editors with outlier profiles; image-similarity graphs can alert reviewers to duplicated figures at submission; and anomaly detection can prioritize human scrutiny where it’s most needed. But experts stress that detection is not enough. The system must also make fraud less rewarding. That means funding and promotion based on robust contributions, better training, and clear consequences for those who buy or sell scientific authorship. Thailand’s recent cases show enforcement is possible, but the Northwestern study implies that lasting progress depends on structural prevention, not just punishment after the fact (Times Higher Education; Chemistry World).

For Thai universities, research offices and early-career scholars, the immediate takeaways are practical:

  • Verify journal and editor integrity before submitting. Check if a journal has had clusters of retractions tied to special issues or specific editors, and look for transparent editorial policies. Use signals from indexing services and community reports to avoid venues associated with paper-mill activity (Retraction Watch).

  • Strengthen internal pre-submission checks. Run image-duplication and text-forensics tools on all manuscripts, and require raw data and analysis code deposits for quantitative studies. Encourage co-authors to certify their contributions and the availability of underlying data.

  • Rebalance evaluation metrics. Faculty appraisal should prioritize a limited set of high-impact, transparent outputs with open data over publication counts. Recognize replication work and negative results to reduce perverse incentives.

  • Support and protect whistle-blowers. Establish confidential reporting channels and clear, fair procedures to evaluate concerns. Encourage post-publication commentary and corrections as part of a healthy scientific process.

  • Collaborate with national bodies. Align institutional policies with MHESI and NRCT guidelines, and engage with TCI to share intelligence on suspicious patterns. Consider cross-institutional integrity audits of high-risk subfields.

The Thai academic community has already shown it can act when confronted with evidence. The new PNAS-backed analysis raises the stakes, revealing a coordinated, adaptive, and fast-growing fraud economy that exploits gaps in editorial systems and incentive structures worldwide. For Thailand — a country investing heavily in research to drive health, education and innovation — the path forward combines vigilance with reform: modern tools to detect organized deceit, evaluation systems that reward quality and openness, and a scholarly culture that values correction over face-saving. Those steps can help ensure Thai science continues to contribute credibly to global knowledge, rather than be derailed by an underground industry built to deceive.

Sources cited in this report include the original coverage and study summaries by Times Higher Education and global media, the PubMed record for the PNAS article, and Thai and international reports on enforcement actions and editorial reforms: Times Higher Education; PubMed; New York Times; Chemistry World; Retraction Watch; Frontiers; Bangkok Post, 2023; Bangkok Post, 2024; Library Learning Space; EASE; NRCT/Mahidol guidelines; ThaiJo ethics exemplar.
