Our recent article introduces a system termed ‘Schol-AR’, which enables digital data to be directly merged into scientific publications and viewed through either augmented reality (AR) or web browsers. A picture is worth 1,000 words, so before going further I’d recommend seeing for yourself. You can aim your smartphone at the QR code and image below to view the augmentation through AR, or visit www.Schol-AR.io/publications/ard2022 to view our article directly with all of its augmentations. The system is open and accessible: you can augment your own papers, posters, or other communications with your own data at www.Schol-AR.io.
The goal of this project is to bridge the gap between the digital basis of modern research and the printing-press-era technology we use to communicate it. Yes, scientific articles are now distributed digitally, but they are still printable documents composed of text and static figures, just as they have been for centuries. This problem is not new, and we are not the first to notice it. In fact, many notable efforts have been made to modernize the scientific article in a variety of ways. Perhaps the most common is the practice of including digital media with articles as supplementary material. Unfortunately, recent metrics indicate these materials are accessed by as few as 0.04% of readers [1]. And that is just the tip of the iceberg: many other efforts have been made to modernize the scientific article, including the Journal of Visualized Experiments, the Wolfram Computational Essay, Elsevier’s Article of the Future, and more. While we cannot cover them all in the detail they deserve, and some remain active projects, their cumulative impact has yet to displace static text and figures as the de facto standard of scientific communication [2-5].
As a result, the scientific figure has become so ingrained in the publication system that we don’t question the ‘representative’ image that reports only 3% of a dataset, the 3D anatomical model visible from only a single angle, or the time-series data shown as a single snapshot. Even when we do acknowledge the limitations of printable media, they are often treated as a necessary evil, because how else can you get the data into the paper? But as research methods and data continue to progress, it becomes harder and harder not to see the reduction of all forms of scientific data to 2D static figures as the proverbial ‘square peg in a round hole.’
Fortunately, several technologies have recently become available that enable a solution that was not previously possible. Specifically, the global adoption of smartphones, the emergence of consumer-grade augmented reality, and native browser integration of the Web Graphics Library (WebGL) now make it possible to merge modern digital data directly with standard scientific communications. With this combination of technologies, we can ‘attach’ augmented digital media directly to figures, allowing readers to quickly access them through their smartphones or standard computer screens, whichever they prefer.
We can also apply some lessons from the past as we incorporate these technologies, avoiding pitfalls that plagued some previous efforts. First and foremost, access is designed to be as easy as possible for readers. This means no login required, no finding and tracking URLs, and no installing special software to view a new supplementary file format. Our goal is for readers to view data by simply pointing their phone at a figure or, for those who prefer not to use mobile devices, by opening a PDF in an online reader. Second, and just as important, the system automatically works with all publishers without requiring anything from them. The augmentations are ‘attached’ to the figures themselves, not to any submission system. If a manuscript can include figures, then the augmented data can be reviewed, published, and seen by anyone who has access to those figures. This not only removes the burden on publishers to implement new and changing technology, but also provides a more approachable system for authors by avoiding manuscript preparations specific to the capabilities and nuances of each submission system.
Speaking of authors, a third lesson we learned is to make things as easy for authors as possible. This means no coding, no technical formats like XML, and no complex data-formatting requirements. We aim to make data inclusion as simple as ‘upload your data and the figure you want it attached to; the rest is done for you.’ Well, aside from placing an automatically provided QR code inside the document, which lets the augmentations follow the figure wherever it winds up.
Overall, we hope the capabilities provided by this system can help make the inclusion of accessible, comprehensive data the expected standard of publication rather than the exception. This could support not only more thorough reporting, but also meaningfully improve our ability to communicate the complex information that underlies modern science.
Read the full article at:
Ard, T., Bienkowski, M.S., Liew, SL. et al. Integrating Data Directly into Publications with Augmented Reality and Web-Based Technologies – Schol-AR. Sci Data 9, 298 (2022). https://doi.org/10.1038/s41597-022-01426-y
1. Flanagin, A. et al. Editorial Evaluation, Peer Review, and Publication of Research Reports With and Without Supplementary Online Content. JAMA 319, 410 (2018).
2. Shotton, D. Semantic publishing: the coming revolution in scientific journal publishing. Learned Publishing 22, 85–94 (2009).
3. Somers, J. The Scientific Paper Is Obsolete. The Atlantic https://www.theatlantic.com/science/archive/2018/04/the-scientific-paper-is-obsolete/556676/ (2018).
4. Sopinka, N. et al. Envisioning the scientific paper of the future. FACETS 5, 1–16 (2020).
5. Barnes, D. G. et al. Embedding and Publishing Interactive, 3-Dimensional, Scientific Figures in Portable Document Format (PDF) Files. PLOS ONE 8, e69446 (2013).