A new initiative from the Biden-Harris administration highlights efforts to advance national open science policy
The Biden-Harris administration recently declared 2023 the Year of Open Science in the United States, offering an opportunity to advance national open science policy and provide greater and more equitable access to research in key areas of scientific study.
The MIT Press centers open access in much of the work we do; we take pride in making high-quality, well-researched scholarship freely available to the public. In honor of the Year of Open Science, we are highlighting our own open access journals that publish groundbreaking scientific research. Read on to explore these journals, and learn more about open access initiatives at the Press.
Computational Linguistics edited by Hwee Tou Ng
2021 Impact Factor: 7.778
Computational Linguistics is the longest-running publication devoted exclusively to the computational and mathematical properties of language and the design and analysis of natural language processing systems. The quarterly offers university and industry linguists, computational linguists, artificial intelligence and machine learning investigators, cognitive scientists, speech specialists, and philosophers the latest information about the computational aspects of all the facets of research on language.
Read inside the journal: “What Is a Paraphrase?” by Rahul Bhagat and Eduard Hovy; Computational Linguistics (2013) 39 (3): 463–472
Data Intelligence edited by James Hendler, Zhixiong Zhang, and Ying Ding
Data Intelligence is an open-access, metadata-centric journal intended for data creators, curators, stewards, policymakers, and domain scientists as well as communities interested in sharing data. DI informs industry leaders, researchers, and scientists engaged in sharing and reusing data, metadata, knowledge bases, and data visualization tools. In addition to traditional articles addressing methodologies and/or resources, the journal also publishes “data articles” in the form of knowledge graphs, ontologies, and linked datasets.
Read inside the journal: “Virtual Knowledge Graphs: An Overview of Systems and Use Cases” by Guohui Xiao, Linfang Ding, Benjamin Cogrel, and Diego Calvanese; Data Intelligence (2019) 1 (3): 201–223
Harvard Data Science Review edited by Xiao-Li Meng
As an open access platform of the Harvard Data Science Initiative, Harvard Data Science Review (HDSR) features foundational thinking, research milestones, educational innovations, and major applications, with a primary emphasis on reproducibility, replicability, and readability. The journal publishes content that helps define and shape data science as a scientifically rigorous and globally impactful multidisciplinary field based on the principled and purposed production, processing, parsing, and analysis of data. By disseminating inspiring, informative, and intriguing articles and media materials, HDSR aspires to be a global forum on everything data science and data science for everyone.
Read inside the journal: “Motivating Data Science Students to Participate and Learn” by Deniz Marti and Michael D. Smith; Harvard Data Science Review (2023) 5.1
Journal of Climate Resilience and Climate Justice edited by William Shutkin
The Journal of Climate Resilience & Climate Justice (CRCJ), is an online, open access resource providing research reports, case studies, essays, and opinions from the working edge of the climate resilience and climate justice fields written in a non-technical, digestible, and educational style for a broad audience.
The Journal of Climate Resilience & Climate Justice will launch soon in 2023. Sign up for our newsletter to hear more news about the first publication.
Network Neuroscience edited by Olaf Sporns
2021 Impact Factor: 4.980
Network Neuroscience features innovative scientific work that significantly advances our understanding of network organization and function in the brain across all scales, from molecules and neurons to circuits and systems. Positioned at the intersection of brain and network sciences, the journal covers empirical and computational studies that record, analyze, or model relational data among elements of neurobiological systems, including neuronal signaling and information flow in circuits, patterns of functional connectivity recorded with electrophysiological or imaging methods, studies of anatomical connections among neurons and brain regions, and interactions among biomolecules or genes.
Read inside the journal: “Theoretical foundations of studying criticality in the brain” by Yang Tian, Zeren Tan, Hedong Hou, Guoqi Li, Aohua Cheng, Yike Qiu, Kangyu Weng, Chun Chen, and Pei Sun; Network Neuroscience (2022) 6 (4): 1148–1185
Neurobiology of Language edited by Steven L. Small and Kate E. Watkins
Neurobiology of Language provides a new venue for articles across a range of disciplines addressing the neurobiological basis of speech and language. Offering open access publishing, rigorous double-blind peer review, and quick turnaround times for authors, the journal aims to facilitate the replicability of experimental findings through modern open science requirements such as sharing of raw data and detailed methods.
Read inside the journal: “Bilingualism, Executive Function, and the Brain: Implications for Autism” by Celia Romero and Lucina Q. Uddin; Neurobiology of Language (2021) 2 (4): 513–531
Open Mind edited by Edward Gibson and Samuel J. Gershman
Open Mind provides a new venue for the highest quality, most innovative work in cognitive science, offering affordable open access publishing, concise and accessible articles, and quick turnaround times for authors. The journal covers the broad array of content areas within cognitive science, using approaches from cognitive psychology, computer science and mathematical psychology, cognitive neuroscience and neuropsychology, comparative psychology and behavioral anthropology, decision sciences, and theoretical and experimental linguistics.
Read inside the journal: “Representations of Abstract Relations in Infancy” by Jean-Rémy Hochmann; Open Mind (2022) 6: 291–310
Quantitative Science Studies edited by Ludo Waltman
Quantitative Science Studies is the official open access journal of the International Society for Scientometrics and Informetrics (ISSI). It publishes theoretical and empirical research on science and the scientific workforce. Emphasis is placed on studies that provide insight into the system of science, general laws of scientific work, scholarly communication, science indicators, science policy, and the scientific workforce.
Read inside the journal: “Pandemic publishing: Medical journals strongly speed up their publication process for COVID-19” by Serge P. J. M. Horbach; Quantitative Science Studies (2020) 1 (3): 1056–1067
Rapid Reviews: Infectious Diseases edited by Stefano M. Bertozzi
RR\ID is an open-access overlay journal that accelerates peer review of important infectious disease-related research preprints. The journal is an evolution of Rapid Reviews: COVID-19. RR\ID expands beyond COVID-19 to advance our understanding of infectious diseases, leaning on a similar “curate, review, publish” model. RR\ID aims to prevent the dissemination of false/misleading scientific information and accelerate the validation and diffusion of robust findings.
Read inside the journal: “Reviews of ‘Omicron-induced interferon signaling prevents influenza A virus infection’” by Dennis Metzger, Tarani Kanta Barman, and Lok-Yin Roy Wong; Rapid Reviews: Infectious Diseases
Transactions of the Association for Computational Linguistics edited by Asli Celikyilmaz, Ani Nenkova, and Roi Reichart
2021 Impact Factor: 9.194
A companion journal to the highly regarded quarterly Computational Linguistics, Transactions of the Association for Computational Linguistics publishes articles in all areas of natural language processing. This annual, open access journal disseminates work of vital relevance to academic and industry computational linguists, natural language processing experts, artificial intelligence and machine learning investigators, cognitive scientists, speech specialists, as well as linguists and philosophers.
Read inside the journal: “Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale” by Laurent Sartran, Samuel Barrett, Adhiguna Kuncoro, Miloš Stanojević, Phil Blunsom, and Chris Dyer; Transactions of the Association for Computational Linguistics (2022) 10: 1423–1439