Why academic conference posters suck

I’ve been to dozens of conference poster sessions, but I struggle to think of a single thing I’ve learned from them. I don’t think I’m alone, considering the antipathy toward academic posters I’ve noticed among colleagues and librarians.


Open for collaboration—panel discussion (Open Access Week)

The second half of the Open for Collaboration event (I summarized the first half earlier) was intended as a panel discussion—but was more like five jam-packed mini-presentations—about not only open access but also open education and open source software. The theme of the evening was, “Is it time for Canada to implement a unified open strategy for higher education?”

Juan Pablo Alperin, Publishing @ SFU and Public Knowledge Project (PKP)

Juan Pablo Alperin is a faculty member in the Canadian Institute for Studies in Publishing and has worked with PKP for almost ten years.

“Open access to knowledge doesn’t just happen,” said Alperin. “And we need to do more. Open access is only one piece of the puzzle. We have to teach openness—to get students to understand what openness is—and we have to teach the practice of openness.” These students will become future faculty who are completely comfortable in openness and will teach it to the next generation.

Alperin’s guide to teaching openness in five easy steps:

  1. Make all the readings open access.
  2. Have students annotate them openly. Using a tool like hypothes.is, students can comment on an article and ask each other questions online.
  3. Have students publish all their work. Alperin asks his students to get into the habit of making their scholarly work public by publishing their work anywhere online, whether it’s on a blogging platform like Blogger or WordPress, the course website, or an open-access post-publication journal like The Winnower.
  4. Give students feedback through open annotation.
  5. Have students openly review each other.

An optional sixth step is, if your teaching involves data, to use open data.
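The open-annotation steps (2 and 4) can also be worked with programmatically: Hypothesis exposes a public search API, so an instructor could pull every annotation a class has left on a reading. Here's a minimal sketch; the article URL is a placeholder, and the endpoint details are worth confirming against the current Hypothesis API documentation.

```python
from urllib.parse import urlencode

# Public, unauthenticated search endpoint of the Hypothesis service
# (confirm against the current API docs before relying on it).
API_SEARCH = "https://api.hypothes.is/api/search"

def annotation_search_url(article_url, limit=50):
    """Build a query URL for the public annotations on one article."""
    return API_SEARCH + "?" + urlencode({"uri": article_url, "limit": limit})

# Fetching this URL returns JSON whose "rows" list holds the annotations;
# each row carries the annotator's username and the comment text.
print(annotation_search_url("https://example.org/open-access-article"))
```

From there, a simple script could list which students have annotated the week's reading, which fits Alperin's goal of making the practice of openness habitual.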

“The number of students that would have done this on their own? Probably zero,” said Alperin. But the number of students who have resisted these steps is also zero. “Students don’t have a problem with doing this,” said Alperin. “Just get them into the habit of putting their knowledge out there.”

David Ascher, VP of Product for the Mozilla Foundation

David Ascher has been at Mozilla for seven years, where he manages software projects (other than Firefox) and tries to make them as widely available as possible. Mozilla is a global nonprofit with a thousand employees and tens of thousands of volunteers, and its mission is to promote an open internet. “Mozilla has a policy of open by default,” said Ascher. Most of its programs are open for participation and critique. “We have an imperfect view of the world,” he said. “And open drives many decision-making processes and lets the world tell us why we’re wrong. Open keeps us honest.” Part of Mozilla’s work involves pushing for open standards and open policies.

One of its projects most relevant to open access is Open Badges, a micro-credentialing system that allows people to earn badges for what they learn and display these badges online as a way to show their set of skills. Any organization can use the free software and open technical standard to create and issue badges to people who have learned from them. The ever-growing list of organizations issuing badges is diverse and includes 3d GameLab, the Dallas Museum of Art, NASA, and NOAA Planet Stewards, among many others. “A lot of good work went into that,” said Ascher, referring to the Open Badges initiative. “It pushes students to learn throughout their lives, and there’s a rich community built around promoting badges.”

“For an institution, teacher, and students who get open, it can work beautifully, collaboratively,” said Ascher. “If it’s forced through because of policy, the hierarchy of badge systems reflects the hierarchy of the institution’s power structure.”

Ascher touched on the Open Source movement, which he said “got stuck on licensing and software distribution as a topic and lost track of the social, societal impact of that work.” Powerful open source software is behind a lot of what tech giants like IBM and Google have developed. These companies “follow the letter of the law,” said Ascher, “but the reality is that people don’t interact with open source software. They interact with online products and services.”

“IBM embraced open source and used it strategically,” said Ascher. Meanwhile, players in the Open Source movement have spent a lot of time fighting one another and have missed opportunities to advance the movement’s core goals.

Clint Lalonde, senior manager of open education at BCcampus

Since 2013 Clint Lalonde has been with BCcampus, which manages the Open Textbook Project on behalf of the provincial government. Open textbooks are a subset of open educational resources (OER): teaching resources with Creative Commons licensing that can be freely copied, shared, modified, and reused.

However, “we can’t assume that there’s a common understanding of what it means to be open,” said Lalonde. For example, a lot of MOOCs (massive open online courses)—such as those at Coursera—have open registration but are not openly licensed. There are still copyright restrictions on the material used in those courses.

Fourteen countries around the world have made commitments to OER, said Lalonde, with the view that “publicly funded resources should be openly licensed resources. We paid for something—we should make that something as widely used as possible.”

What complicates developing a unified open strategy is that whereas in other countries, post-secondary education is considered a national responsibility, in Canada that responsibility lies with the provinces. A promising step forward is that BC, Alberta, and Saskatchewan have signed a memorandum of understanding for sharing OER.

Open textbooks have saved BC students more than $1 million over the past two years. “Students using OER are doing as well, maybe even better, than students using textbooks from publishers. It would be nice to make OER the default, not the exception.”

Inba Kehoe, Copyright Officer and Scholarly Communication Librarian at University of Victoria Libraries

Inba Kehoe helped create the BCOER Librarians group, which began in 2013 with the aim of building a community of practice to address the question, “Why not open?” The group shared knowledge, participated in hackfests, created guides and posters, and developed a rubric for librarians reviewing OER. All of the discussions are open and take place online through forums such as wikis.

At the national level, the Canadian Association of Research Libraries is pushing toward sustainable scholarly publishing, and within the organization there’s a working group dedicated to open access issues.

Internationally, the SHERPA/RoMEO database at the University of Nottingham stores publishers’ policies about self-archiving and open-access repositories. Librarians can add to the database by looking into publishers’ policies and providing the evidence to RoMEO. “Canadian publishers are not well represented,” said Kehoe. “Let’s get our publishers’ policies into this database.”

By checking the membership of the Canadian Association of Learned Journals against the list of journals in the RoMEO database, Kehoe and her collaborators hope to figure out which ones are missing and start collecting the data to fill in the holes.

Back at her home institution, Kehoe oversees the Journal Publishing Service, which uses Open Journal Systems to publish twenty-eight open-access journals. “Three-quarters of them are student journals,” said Kehoe. “Students learn the publishing process and learn about open access.” Kehoe also handles faculty requests to publish research and teaching resources, as well as open access books. “We work with the bookstore, which has a print-on-demand machine,” said Kehoe. “And I ask for research funds for the editorial work and design,” which usually runs from about $2,500 to $3,000.

Rosemary (Rosie) Redfield, UBC Department of Zoology

Since 2006, Rosie Redfield has run the RRResearch blog, an open science blog. She proposed a few top-down efforts that might help openness:

1. Pressure faculty to produce open resources wherever possible

“We need policies in place to make faculty justify it if they want to work with a publisher,” said Redfield.

2. Set top-down expectations that researchers will use OER by default

“There’s inertia among faculty about using open textbooks. Put in top-down policies to make faculty explain why” they decide to use non-OA textbooks. “The material on OpenStax is completely free, and it’s just as good.”

3. Support faculty when they have copyright issues with open-access publications

Open-access publishing requires authors to accept a CC BY licence, which means that anyone can use the material, even for commercial purposes, as long as the original author is credited. “Unscrupulous publishers like Apple Academic Press will take open articles, repackage them in a glossy book, and sell the book for more than $100 online. The people who wrote the articles have no idea this is being done,” said Redfield. Often the original publication is not mentioned. “To the scientific community, it looks like the researcher is trying to self-plagiarize. Authors can’t take these presses to court. Journals don’t own copyright so can’t take them to court. We need centralized support to defend researchers’ interests when we use OA.”

4. Encourage faculty to assign coursework that goes beyond the classroom

Make open access the expectation for students, not the exception. Get students used to having their work be open. Whether the work involves adding pages to Wikipedia or speaking at an Ignite event, which archives its five-minute talks on YouTube, “set an expectation that the work students are doing for their grade is producing benefits outside the classroom.”

5. Help libraries escape the clutches of journal publishers

Two major barriers to OA are that “big academic publishers combine strong and weak journals into exorbitantly priced bundles, and researchers protest when they lose free access to their favourite (often obscure) journal.” A possible solution is a nationwide program that gives all faculty and students at Canadian institutions free access to all paywalled journal articles.

Open for collaboration—John Willinsky on cooperation for open access (Open Access Week 2015)

For Open Access Week (October 19–25), SFU Library, UBC Library, BCcampus, and the Public Knowledge Project joined forces to host an evening of speakers and discussion about all things open. Ostensibly, the theme of the event was “Is it time for Canada to implement a unified open strategy for higher education?” but the speakers deviated from that theme quite a bit, which, in my opinion, ultimately made the evening more interesting. A summary of John Willinsky’s keynote is below. I’ll blog about the panel discussion in a separate post.

(An overly fastidious editor’s confession: I vacillated between uppercasing and lowercasing Open Access, feeling that open access as a concept is widespread enough that it doesn’t have to be capitalized. Just to be perfectly insufferable, I’ve retained the caps for the name of the movement. If you have strong feelings about this issue, I’d love to hear your thoughts.)

***

John Willinsky is a professor at the Stanford Graduate School of Education and the director of the Public Knowledge Project, headquartered at SFU. Award-winning author of books such as Empire of Words: The Reign of the OED and Learning to Divide the World: Education at Empire’s End, Willinsky has been a champion of the Open Access movement for about two decades. “Open Access started to take on the name in 2001,” said Willinsky, but the idea emerged in the 1990s with the growth of the internet. “There was the sense that, if there is an internet, there should be more access to knowledge,” said Willinsky. “Information should be free, and now we have the means to begin sharing it.”

Until recently, open access has applied largely to research articles, although now more books are being made open access. “Open access means it is available to anyone,” said Willinsky, and in the past decade “we’ve had a great achievement”: Jamali and Nabavi (2015) found that 61 percent of all research articles were freely available. (Ironically, Jamali and Nabavi’s paper is not open access.)

“I have to add a qualifier,” said Willinsky. “Eighty percent of those articles are illegally posted online.”

“Fortunately, there has been very little prosecution in this,” he added.

“The movement has taken hold. The White House has a policy on open access: research supported by all federal granting agencies has to be open access. The Tri-Council in Canada [NSERC, SSHRC, and CIHR] requires open access.”

Has open access had an impact on the public? “What happens when we start making these articles free? Will people care?” Willinsky cited research by Juan Pablo Alperin, who found that in Latin America, 20 to 25 percent of people who viewed peer-reviewed articles were not affiliated with universities, and Laura Moorhead, who gave doctors free access to research articles for one year and asked, “Would they be too busy to use these resources?” Moorhead found that, indeed, two-thirds of the doctors who signed up were too busy to look at research articles, but the remaining one-third consulted research to inform their practice, to confirm what they’ve known intuitively, and to resolve disagreements with their colleagues.

The next frontier, said Willinsky, is open data. Pharmaceutical giant Bayer, he told us, has committed to making its patient-level information available to researchers and registering all of its clinical trials. “In the past,” said Willinsky, “pharma did not tell people about trials. They used to be very proprietary about data, which made it difficult to create generic drugs.”

“We cannot take [open data] for granted the way articles are taken for granted,” said Willinsky. The international community has to come together to develop standards that will allow us to take advantage of this open data. Personalized medicine is based on the ability to share and compile and analyze genomic data, he said, but right now we have no standards for sharing it in an interoperable way.

The spirit of cooperation that open data calls for builds on the cooperation that has helped open access succeed. “We’re no longer talking about whether open access is good,” said Willinsky, “but the final kilometre will be the biggest challenge.” Worldwide, we spend $12 billion on journal subscriptions, a number that isn’t decreasing despite open access. “We need to rethink the idea of subscription as a way to exclude,” he said. Subscription, at its birth, was a way to support the creation of a work. In 1612, when the Accademia della Crusca found that it had no money to print the Vocabolario degli Accademici della Crusca, the first dictionary of Italian, it called on people to contribute to the cost of printing in exchange for an acknowledgement in the publication. Subscribers directly supported the publisher, and this model of subscription as micro-patronage was borrowed to print similar works.

Willinsky calls on multiple stakeholders—libraries, journals, professional associations, presses, and funders—to create cooperatives that function more like this original subscription model. Some of the $12 billion we spend for subscriptions can surely be reallocated to pay for open access.

We’re seeing the first signs of how these kinds of cooperatives might work. Érudit, for example, is a consortium of universities in Quebec that makes the 130 Canadian journals in its collection available to all of those institutions’ libraries. And the Canadian Research Knowledge Network cuts a single cheque to a publisher and makes that publisher’s journal available to all member libraries. “There’s no advantage to libraries to have that knowledge locked up,” said Willinsky. “Let’s have the same cheque for subscription journals to make all of them open access, available to all Canadians and everyone in the world.”

Professional societies can also pave the way for more access: the American Anthropological Association, for example, receives $500,000 a year in revenue from subscriptions to twenty-two journals that only library users can see. The association publishes with Wiley-Blackwell but owns its content and could work with any other publisher, including one that would support opening up access to those journals to the public, in a way that wouldn’t cost the organization anything in revenue. “The idea that we have to lock up knowledge to create value doesn’t work with research. Research is a public good like health is a public good. The quality of the whole is lost,” said Willinsky, if we divide it and put part of it behind a paywall.

“We know we have enough resources,” he said. “We are not powerless in the face of that challenge… On the basis of cooperation, we will get access to the whole.”

Authoring and delivery platforms for open educational resources (webinar)

The Community College Consortium for Open Educational Resources (CCCOER) hosted a webinar about a few platforms for authoring and delivering open educational resources (OER). CCCOER was founded almost eight years ago to expand access to openly licensed material, support faculty choice, and improve student success. It has more than 250 member colleges in twenty-one states. The organization understands that faculty need user-friendly authoring tools, institutions need to integrate OER into their existing course management infrastructure, and students need to be able to easily find and use OER. Representatives from three OER platforms explained their tools in this webinar. I’ll cover all three, but my focus will be on Clint Lalonde’s presentation about Pressbooks Textbook, because it’s the most relevant to publishing in BC. (Slides of the session are on Slideshare.)

Courseload Engage, presented by Etienne Pelaprat, User Experience Director at Courseload Inc.

Courseload is a platform that offers students access to text-based OER, video, audio, journal articles, library content and catalogues, proprietary content, and other uploaded content through a single application that can be integrated into existing learning management systems. Courseload lets institutions curate their own content based on learning objectives, and it manages all of the metadata (including library catalogue data and ONIX feeds). This metadata allows institutions to generate custom catalogues and course packs, and the system tracks content use via analytics that can help institutions improve discoverability and respond to student demand.

PressBooks Textbook, presented by Clint Lalonde, Open Education Manager at BCcampus

BCcampus’s Open Textbook Project was launched to provide BC post-secondary students with access to free textbooks in the forty subject areas with the highest enrolment. Rather than start from scratch, said Lalonde, BCcampus wanted to take advantage of existing textbook content already in the commons. The focus would be on adaptation, although they would also create some new content.

For students, open textbooks had to be free to use and retain, and available in several formats. For faculty, they had to be high-quality material that would be easy to find and adapt.

Hugh McGuire had predicted that the book would merge with the web and that books would be created web first; he founded PressBooks with that idea in mind. PressBooks is an open source WordPress plugin that allows authors to write once but output in many different formats, including HTML, EPUB, and PDF.

BCcampus worked with a programmer to customize PressBooks for easy textbook authoring, and the result is the PressBooks Textbook plug-in. It works together with Hypothes.is to allow students and faculty to annotate content. Lalonde and his team also added an application programming interface (API) that facilitates searching and sharing with others on different platforms and allows the textbooks to become more than just static content. Unfortunately, Lalonde explained, PressBooks Textbook isn’t fully open source at the moment, because it relies on a proprietary PDF output engine, the licence for which institutions would have to pay.
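The search-and-share workflow such an API enables can be sketched in a few lines. To be clear, the JSON field names and sample payload below are illustrative assumptions for this sketch, not the plugin's documented interface.

```python
import json

def filter_books(payload, subject):
    """Return (title, licence) pairs for textbooks matching a subject."""
    books = json.loads(payload)["books"]
    return [(b["title"], b["licence"]) for b in books
            if subject.lower() in (s.lower() for s in b["subjects"])]

# A made-up API response payload, for illustration only.
sample = json.dumps({"books": [
    {"title": "Introductory Chemistry", "licence": "CC BY",
     "subjects": ["Chemistry"]},
    {"title": "Adult Literacy Fundamentals", "licence": "CC BY",
     "subjects": ["Adult Literacy"]},
]})

print(filter_books(sample, "chemistry"))  # → [('Introductory Chemistry', 'CC BY')]
```

The point is the one Lalonde made: once textbook metadata is exposed as machine-readable data, other platforms can discover and reuse the books rather than treating them as static files.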

BCcampus’s next steps with this plug-in include

  • integrating accessibility features via the FLOE Project
  • finding an open source PDF engine to replace Prince XML
  • expanding the output formats to include Word-compatible ODT files.

Lalonde has blogged about PressBooks Textbook’s architecture.

Open Assembly, presented by founder and CEO Domi Enders

Non-traditional students and adjunct instructors are less likely to be reached by OER initiatives because they may work remotely much of the time and are poorly integrated into their institutions. As a result, they have limited access to their peer communities. Domi Enders wanted to develop an open learning system that would not only give users access to OER but also give students and adjunct faculty the continuity and agency they need to remain engaged with their learning and teaching. Open Assembly can be integrated into existing learning management systems and allows users to collaborate on content curation. By offering users a space to meet and create new knowledge, it facilitates peer-to-peer learning in a way that helps remote students and faculty stay connected.

Writing about First Nations (Read Local BC)

As part of the Association of Book Publishers of British Columbia’s Read Local BC campaign, Laraine Coates of UBC Press hosted a panel discussion on writing about First Nations.

After Coates acknowledged that the evening’s event was taking place on unceded Coast Salish territories, she launched into the program by asking each panellist to describe their books.

Written as I Remember It was Elsie Paul’s idea, said Raibmon, and consists primarily of teachings and historical stories from Paul’s life. Paul, one of the last remaining mother-tongue speakers of Sliammon, wanted to create a booklet of teachings to share with her family. Raibmon thought Paul’s stories would interest a wider audience, and they decided to work together, along with Paul’s granddaughter, Harmony Johnson, to turn the booklet into a UBC Press book, which was organized into chapters based on key themes, including grief, education, spirituality, and pregnancy. “All of these stories were told and lived in a completely different language,” said Raibmon. “Elsie has lived a fascinating life, and she has a lot of interesting stories to tell.”

Jean Barman has written about BC history before, but “I’d always acted as if French Canadians didn’t exist in the province,” she said. She wanted to redress this deficiency and find out more about them. “That’s the nice thing about being an academic,” she said. “I get paid to find out!” As she did research for the book, her focus expanded from the French Canadians themselves to the fur trade that brought them to the province and the indigenous women who kept them here.

Jennifer Kramer co-edited Native Art of the Northwest Coast with art historian Charlotte Townsend-Gault and Nuuchaanulth historian Ḳi-ḳe-in. They wanted to challenge the “one monolithic idea of what native Northwest Coast art is”—the red, black, and white ovoids and formlines we so often see. The book unearths 250 years’ worth of commentary about Northwest Coast art from multiple perspectives, beginning chronologically with writings by Captain James Cook and including contemporary native artist–authors, to show the heterogeneity and richness of the region’s artistic past and present.

Coates noted that although the three books are different, they all deal with Aboriginal lives and legacy. She asked the panellists what they learned in their research.

Barman said that although over 90 percent of the men and all of the women she researched for the book were illiterate, she could still find traces of them in fur trade records or in the work of other people who had written about them. Barman looked at the relationships Aboriginal groups forged with the newcomers—particularly the way indigenous men encouraged their daughters to interact with the fur traders so that they could get access to trade goods—as well as the motivations French Canadian men had to stay rather than return to Quebec.

Raibmon said that unlike Barman’s project, hers “came with a workaround of the problem of finding traces.” Elsie Paul invited Raibmon to pull together audio material to create a book and allowed her to learn from the inside out, interconnecting teachings with history.

Kramer’s goal with her book was to consciously and actively address the problem that the majority of writing about Northwest Coast art has been by non-native authors. She wanted to bring in as many voices as possible to undermine the narratives repeated by Western, non-Aboriginal authors. “As an anthropologist, my number-one concern is, ‘Who am I to write about someone who isn’t me?’ We have this chronic problem or paradox: museums represent people who want to represent themselves. How do we get around that power imbalance?”

Kramer described the critical shift in the 1990s toward reflexivity, making the research process open to reflection and collaboration. “First Nations don’t have just one perspective, either,” said Kramer. “They’ll have many opinions. There’s no one way to write this. It’s not about correcting an incorrect history—it’s about acknowledging all the ways of knowing.” Kramer saw the draft of the book as a living, breathing archive, and she expressed apprehension about taking it to press and fixing it to a page. “It might have been better as an online blog, like Wikipedia, with many people engaging. We’re in this engagement together, and we’re co-creating these products of representation.” She also mentioned the discomfort that some of the artists felt, having the huge responsibility of representing not only their own artwork but also their culture, by extension.

Raibmon’s experience uncovered a bit of that tension as well. “Elsie did not get permission from the Sliammon people to write the book. She didn’t want to be seen as taking authority or speaking for her community.” She added that the university set up procedures requiring researchers who work with First Nations communities to get band approval, but “that’s not always appropriate. Elsie found it offensive that UBC wanted to get band council agreement so that she could tell her story.”

As a historian, said Barman, “I carefully document where all the bits and pieces come from so that others can add to them or challenge them.” She wants to make it clear that she’s telling a story, not the story, and there will always be pieces that are right to some and wrong to others. But if we don’t risk criticism and put our work out there, we’ll never learn, and our knowledge will never grow. “You’re risking something, but at least you’re doing something.”

Barman described a perennial difficulty that comes with historical research and writing: what to do about names. “What do we mean by the Northwest Coast?” To Americans, it includes Alaska and Washington but sometimes also Oregon and northern California. “What do you do before we had borders? What was something named in the past, and how have names changed? These issues can get you into conflict.”

Kramer agreed that names carry a lot of weight, and people can react strongly to them. She wanted her book to take an unconventional look at Northwest Coast art, which would naturally entail unconventional names and terms, yet still be discoverable to people using more familiar search terms. “That one would be accused of cultural appropriation is always a fear,” she said. Many First Nations groups have a very real fear of theft, given the historical theft of their land, their children, their sovereignty. But she had to grapple with the reality that no one member of the community could tell her that what she was doing was acceptable or give her a blank cheque. “You have to know you’re doing it with a good heart, that your intentions are clean.”

Kramer asked Raibmon if she had a voice in her book or if she felt as though she had to keep quiet and let Paul take the lead. The approach to narrative was different from her usual approaches, said Raibmon, but “the goal was to get Elsie’s voice on the page.” She still made a historical argument, but in an engaging way that foregrounds Paul’s voice. “I hope people who read the book will still see the historical connections, the connecting themes.” She added that she didn’t consider herself to be the historian and Paul to be her subject. “We were two historians working together, from different historical traditions. Personally I didn’t feel any tension from letting Elsie decide what topics would go in.”

“I didn’t actually understand why certain topics were off limits,” Raibmon continued. “Why are certain stories so important? There were chapters that were super important to them, but I didn’t understand it at the time. I learned how long it can take to let go of our assumptions that block our understanding… I understand now. But if my authority had trumped Elsie’s, I wouldn’t even have remembered the question, let alone learned what I’ve learned.

“Elsie had stories of other families, but she didn’t feel that was appropriate to have in the book. She didn’t want to assume the stories would offend them. Cultural difference is understanding human difference.”

Open textbooks and the BC Open Textbook Accessibility Toolkit (webinar)

In fall 2012, the BC Open Textbook Project was launched to reduce the financial burden on post-secondary students, who spend an average of $1,200 per year on textbooks. As part of Open Education Week, BCcampus hosted a webinar about the project as well as the associated BC Open Textbook Accessibility Toolkit, created to help people who develop learning resources to make them as accessible as possible from the outset.

Open Textbook Project (presented by Amanda Coolidge)

In 2012, the BC Open Textbook Project received a grant of $1 million to develop open textbooks for the top-forty enrolled subject areas. It received another $1 million in 2014 to create resources for skills and trades training. BC has now committed to working together with Alberta and Saskatchewan to develop and share open textbooks.

Many people think open textbooks are e-textbooks, but what makes them open is their Creative Commons (CC) licence: they can be copied, modified, and redistributed at no charge. Instructors can therefore change open textbooks to suit their courses, and students are able to get these books for free. In two years the project has saved more than five thousand students over $700,000 in textbook costs.

BCcampus carried out the Open Textbook Project in three phases:

  • First, they collected existing textbooks with CC licenses and asked faculty to review them.
  • Second, they modified these books based on faculty reviews. At the end of this process, they had covered thirty-six of the top-forty subject areas.
  • Finally, they funded the creation of four textbooks from scratch.

Open textbooks are now being used in fourteen post-secondary institutions across the province, and BCcampus has eighty-one textbooks in its collection. To create these materials, they use Pressbooks, a plugin that lets you write once and publish to many different formats.

Accessibility testing (presented by Tara Robertson)

Tara Robertson helps run CAPER-BC, which provides alternate formats of learning materials to twenty institutions across the province. They specialize in accommodations, including remediating textbooks for people with print disabilities. One reason the Open Textbook Project is exciting, said Robertson, is that instead of taking something broken and fixing it, she now has the opportunity to make the textbooks accessible from the start.

Seven students with special needs volunteered to test the open textbook resources for accessibility, reading selected chapters from textbooks in five subject areas and offering feedback on their usability. Robertson also ran a focus group with five students. She found recruiting testers challenging, and she acknowledges that the students who participated in the focus group, all of whom had visual impairments, were not representative of the many students with other print disabilities. Still, the testers offered a lot of constructive feedback.

The chapters the students reviewed each had features that might interfere with assistive technology like text-to-speech software: formatted poetry, tables, images, quizzes, and so on. Testing revealed that the software would skip over embedded YouTube videos, so the textbooks would have to include URLs; formatted poems were problematic when enlarged because readers would have to scroll to read each line; and layout sometimes led to a confused reading order.

Robertson sees the accessibility consultation with students as an ongoing process to refine accessibility best practices.

BC Open Textbook Accessibility Toolkit (presented by Sue Doner)

BCcampus has just launched an accessibility toolkit for faculty, content creators, instructional designers, and others who “don’t know what they don’t know about accessible design.” Their aim is to build faculty capacity for universal design and to highlight the distinctions between accommodations and accessibility. Accommodations involve individualizing resources and providing alternative learning options for students who identify as having a disability. If we were proactive about creating materials that were accessible from day one, we’d have no need for accommodations.

Universal design recognizes that different students learn differently—some prefer visual materials, whereas others prefer text, for example. It offers students multiple access points to the content, and it’s better for all students, not just those who register with their disability resource centre. For example, aging students may appreciate being able to enlarge text, and international students may benefit from captions to visual material.

The toolkit offers plain language guidelines for creating different types of textbook content with a student-centred focus, using user personas to inform key design concepts and best practices. It asks content developers to think about what assumptions they’re making of the end users and how those assumptions might affect the way they present the material.

It might take a bit of time for creators of some types of content to catch up with all accessibility features—for example, video and audio should, as a rule, come with transcripts, but a lot of YouTube content doesn’t, and you may run into copyright issues if you try to offer material in different formats.

The next steps for BCcampus are to incorporate the toolkit into the development process for all new open textbooks they create, to modify existing textbooks for accessibility, and to encourage the province’s post-secondary community to formally adopt these guidelines. The toolkit, like the open textbooks, is available under a CC license and can be thought of as a living document that will change and grow as different types of content (e.g., math) become amenable to accessible design.

Doner sees these steps as “an opportunity to create a community of practice—a new literacy skill.”

***

This webinar (along with others offered during Open Education Week) is archived on the BCcampus site.

Biomedical research reports as structured data: Toward greater efficiency and interoperability

I’ve been working on this paper since September, and I was hoping to publish it in a journal, but I learned today I’ve been scooped. So I see no harm now in publishing it here. I want to thank Frank Sayre and Charlie Goldsmith for their advice on it, which I clearly took too long to act on. I’m posting it as is for now but will probably refine it in the weeks to come.

Apologies to my regular readers for this extra-long and esoteric post.

Comments welcome!

***

Introduction

Reporting guidelines such as CONSORT,[1] PRISMA,[2] STARD,[3] and others on the EQUATOR Network [4] set out the minimum standards for what a biomedical research report must include to be usable. Each guideline has an associated checklist, and the implication is that every item in the checklist should appear in a paragraph or section of the final report text.

But what if, rather than a paragraph, each item could be a datum in a database?

Moving to a model of research reports as structured or semi-structured data would mean that, instead of writing reports as narrative prose, researchers could submit their research findings by answering an online questionnaire. Checklist items would be required fields, and incomplete reports would not be accepted by the journal’s system. For some items—such as participant inclusion and exclusion criteria—the data collection could be even more granular: each criterion, including sex, the lower and upper limits of the age range, medical condition, and so on, could be its own field. Once the journals receive a completed online form, they would simply generate a report of the fields in a specified order to create a paper suitable for peer review.
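To make the idea concrete, here is a minimal sketch of what a report-as-data record and its required-field check might look like. The field names and the validation logic are my own illustrative assumptions, loosely echoing CONSORT checklist items; they are not part of any real guideline or submission system.

```python
# Hypothetical sketch: a trial report as structured fields rather than prose.
# Field names are illustrative only, not a real reporting standard.

REQUIRED_FIELDS = {"title", "trial_design", "eligibility", "interventions", "outcomes"}

report = {
    "title": "Drug X vs. placebo for condition Y",
    "trial_design": "randomised trial (parallel)",
    "eligibility": {            # granular, queryable criteria
        "sex": "both",
        "age_min": 18,
        "age_max": 65,
        "conditions": ["E11"],  # e.g., codes from a controlled vocabulary such as the ICD
    },
    "interventions": ["Drug X 10 mg daily", "placebo"],
    "outcomes": ["HbA1c at 12 weeks"],
}

def missing_fields(report):
    """Return the required fields the submission system would flag as empty."""
    return sorted(REQUIRED_FIELDS - {k for k, v in report.items() if v})

assert missing_fields(report) == []  # complete report: accepted

del report["outcomes"]
assert missing_fields(report) == ["outcomes"]  # incomplete report: rejected
```

The point of the sketch is only that completeness becomes a mechanical check rather than something a reviewer must notice.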

The benefits of structured reporting have long been acknowledged: Andrew’s proposal in 1994[5] for structured reporting of clinical trials formed the basis of the CONSORT guidelines. However, although in 2006 Wager did suggest electronic templates for reports and urged researchers to openly share their research results as datasets,[6] to date neither researchers nor publishers have made the leap to structuring the components of a research article as data.

Structured data reporting is already becoming a reality for practitioners: radiologists, for example, have explored the best practices for structured reporting, including using a standardized lexicon for easy translation.[7] A study involving a focus group of radiologists discussing structured reporting versus free text found that the practitioners were open to the idea of reporting templates as long as they could be involved in their development.[8] They also wanted to retain expressive power and the ability to personalize their reports, suggesting that a hybrid model of structured and unstructured reporting may work best. In other scientific fields, including chemistry, researchers are recognizing the advantage of structured reporting to share models and data and have proposed possible formats for these “datuments.”[9] The biomedical research community is in an excellent position to learn from these studies to develop its own structured data reporting system.

Reports as structured data, submitted through a user-friendly, flexible interface, coupled with a robust database, could solve or mitigate many of the problems threatening the efficiency and interoperability of the existing research publication system.

Problems with biomedical research reporting and benefits of a structured data alternative

Non-compliance with reporting guidelines

Although reporting guidelines do improve the quality of research reports,[10],[11] Glasziou et al. maintain that they “remain much less adhered to than they should be”[12] and recommend that journal reviewers and editors actively enforce the guidelines. Many researchers may still not be aware that these guidelines exist, a situation that motivated the 2013 work of Christensen et al. to promote them among rheumatology researchers.[13] Research reports as online forms based on the reporting guidelines would raise awareness of reporting guidelines and reduce the need for human enforcement: a report missing any required fields would not be accepted by the system.

Inefficiency of systematic reviews

As the PRISMA flowchart attests, performing a systematic review is a painstaking, multi-step process that involves scouring the research literature for records that may be relevant, sorting through those records to select articles, then reading and selecting among those articles for studies that meet the criteria of the topic being reviewed before data analysis can even begin. Often researchers isolate records based on eligibility criteria and intervention. If that information were stored as discrete data rather than buried in a narrative paragraph, relevant articles could be isolated much more efficiently. Such a system would also facilitate other types of literature reviews, including rapid reviews.[14]

What’s more, the richness of the data would open up avenues of additional research. For example, a researcher interested in studying the effectiveness of recruitment techniques in pediatric trials could easily isolate a search to the age and size of the study population, and recruitment methods.
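As a sketch of the kind of query this would enable, the snippet below filters invented records by the age range of their study populations. The field names and cutoff are assumptions for illustration, not drawn from any real database schema.

```python
# Sketch: isolating candidate studies for a review from structured records.
# Records and field names are invented for illustration.

reports = [
    {"id": 1, "eligibility": {"age_min": 2, "age_max": 12}, "recruitment": "school flyers"},
    {"id": 2, "eligibility": {"age_min": 18, "age_max": 65}, "recruitment": "clinic referral"},
    {"id": 3, "eligibility": {"age_min": 0, "age_max": 17}, "recruitment": "social media"},
]

def pediatric(report, cutoff=18):
    """Select trials whose entire study population is below the age cutoff."""
    return report["eligibility"]["age_max"] < cutoff

hits = [r["id"] for r in reports if pediatric(r)]
# Pediatric trials (records 1 and 3) are isolated without reading a single
# methods paragraph; their recruitment fields could then be compared directly.
```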

Poorly written text

Glasziou et al. point to poorly written text as one of the reasons a biomedical research report may become unusable. Although certain parts of the report—the abstract, for instance, and the discussion—should always be prose, information design research has long challenged the primacy of the narrative paragraph as the optimal way to convey certain types of information.[15],[16],[17] Data such as inclusion and exclusion criteria are best presented as a table; a procedure, such as a method or protocol, would be easiest for readers to follow as a numbered list of discrete steps. Asking researchers to enter much of that information as structured data would minimize the amount of prose they would have to write (and that editors would have to read), and the presentation of that information as blocks of lists or tables would in fact accelerate information retrieval and comprehension.

Growth of journals in languages other than English

According to Chan et al.,[18] more than 2,500 biomedical journals are published in Chinese. The growth of these and other publications in languages other than English means that systematic reviews done using English-language articles alone will not capture the full story.[19] Reports that use structured data will be easier to translate: not only will the text itself—and thus its translation—be kept to a minimum, but, assuming journals in other languages adopt the same reporting guidelines and database structure, the data fields can easily be mapped between them, improving interoperability between languages. Further interoperability would be possible if the questionnaires restricted users to controlled vocabularies, such as the International Classification of Diseases (ICD) and the International Classification of Health Interventions (ICHI) being developed.
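A rough sketch of that mapping, under the assumption that journals in different languages share field keys and controlled-vocabulary codes: only the human-readable labels differ between editions. The labels and keys below are invented for illustration.

```python
# Sketch: language-neutral field keys with per-language labels.
# Keys, labels, and the record are illustrative assumptions.

FIELD_LABELS = {
    "en": {"trial_design": "Trial design", "eligibility": "Eligibility criteria"},
    "zh": {"trial_design": "试验设计", "eligibility": "纳入与排除标准"},
}

record = {
    "trial_design": "parallel",
    "eligibility": {"conditions": ["E11"]},  # ICD-style codes need no translation
}

def render(record, lang):
    """Map shared field keys to localized labels; coded values carry over as-is."""
    return {FIELD_LABELS[lang][key]: value for key, value in record.items()}
```

Because the underlying keys and codes are shared, a reviewer could query Chinese- and English-language reports with the same search, translating only the residual free text.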

Resistance to change among publishers and researchers

Smith noted in 2004 that the scientific article has barely changed in the past five decades.[20] Two years later Wager called on the research community to embrace the opportunity that technology offered and publish results on publicly funded websites, effectively transforming the role of for-profit publishers to one of “producing lively and informative reviews and critiques of the latest findings” or “providing information and interpretation for different audiences.” Almost a decade after Wager’s proposals, journals are still the de facto publishers of primary reports, and, without a momentous shift in the academic reward system, that scenario is unlikely to change.

Moving to structured data reporting would change the interface between researchers and journals, as well as the journal’s archival infrastructure, but it wouldn’t alter the fundamental role of journals as gatekeepers and arbiters of research quality; they would still mediate the article selection and peer review processes and provide important context and forums for discussion.

The ubiquity of online forms may help researchers overcome their reluctance to adapt to a new, structured system of research reporting. Many national funding agencies now require grant applications to be submitted online,[21],[22] so researchers will already be familiar with the interface and process.

A model interface

To offer a sense of how a reporting questionnaire might look, I present mock-ups of select portions of a form for a randomized trial. I do not submit that they are the only—or even the best—way to gather reporting details from researchers; these minimalist mock-ups are merely the first step toward a proof of concept. The final design would have to be developed and tested in consultation with users.

In the figures that follow the blue letters are labels for annotations and would not appear on the interface.

Figure 1: The first screen an author will see after logging in. (A) Each author will have an account and profile, including affiliations; many journals already have author accounts as part of their online submission infrastructure. (B) An autocomplete field with a controlled vocabulary of the types of study supported by the system. (C) Many types of articles either have no associated reporting guidelines or are unlikely to have a set structure (such as commentaries and letters). This button allows authors to submit those articles in the traditional way.
Figure 2: Once the author selects the type of study, the appropriate associated form will load. If the author had chosen “randomised trial (cluster)” in Figure 1, for example, the CONSORT form with the cluster extension would load.
Figure 3: First page of the CONSORT questionnaire. (A) Because reporting guidelines and checklists vary in length, only after the form loads can the interface indicate progress through the questionnaire. A user-friendly system would also include a way for users to jump to a specific question or page. (B) The help button to the right of each field could bring up the associated section of the CONSORT Explanation and Elaboration document. (C) An autocomplete field with a controlled vocabulary of the possible sections in a structured abstract, such as the one from the National Library of Medicine.[23] (D) Required fields are indicated by an asterisk. (E) Users should be able to navigate to the next page without filling required fields in this particular page. Only at the end of the questionnaire will the system flag empty required fields. (F) Users should be able to save their progress at any time. Better yet, the system could autosave at regular intervals. (G) Users should also be able to exit at any point.
Figure 4: Item 3a on the CONSORT checklist. (A) The trial design field could autocomplete with the trial design types in a controlled vocabulary. If the study design is novel, users may click on the “Design type not listed” button to submit their articles traditionally. (B) Each checklist item should allow authors to elaborate if necessary. This box could support free-flowing text with formatting (e.g., Markdown) and LaTeX or MathML capabilities. (C) If a statement needs a citation, users could click on the “Cite” button, which would allow them to input structured bibliographic data. An ideal system would let them import that information from a reference management system. (D) The “+” button generates another “Additional information” box. Content from multiple boxes would be printed in sequence in the final report.
Figure 5: The user has chosen a full factorial design, and the system automatically brings up a box asking the user to fill in the number of variables and levels.
Figure 6: Completing the number of variables and levels generates a table that the user can use to fill in the allocation ratios.
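The form logic behind Figures 5 and 6 can be sketched briefly: given the variables and levels the user enters, the system enumerates every cell of the full factorial design and emits one table row per group. The function name and data shapes are my own assumptions.

```python
from itertools import product

# Sketch: generating the group rows for an allocation-ratio table
# from a full factorial design. Names and shapes are illustrative.

def factorial_groups(variables):
    """variables: {name: [levels]}; returns one row per cell of the full factorial."""
    names = list(variables)
    return [dict(zip(names, combo)) for combo in product(*variables.values())]

groups = factorial_groups({"drug": ["X", "placebo"], "dose": ["low", "high"]})
# Four rows, one per drug/dose combination, each awaiting its allocation ratio.
```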
Figure 7: An example showing how structured inclusion and exclusion criteria might be collected. Sex and age are required fields; researchers may select Male, Female, Both, or Other (for example, if the study population is intersex). (A) Users may fill in other inclusion criteria below. The field to the left is a controlled vocabulary with an “” option. Selecting “condition” will allow the user to select from a controlled vocabulary, for example, the ICD, in the field to the right. (B) The “+” button allows the user to add as many criteria as necessary. The subsequent screen, for exclusion criteria, could be similarly structured (minus the age and sex fields).
Figure 8: Item 5 on the CONSORT checklist, which asks for “The interventions for each group with sufficient details to allow replication, including how and when they were actually administered.” Subsequent screens would let the user fill in details for Groups a, b, and ab. Because interventions are often procedural, journals may wish to encourage users to enter this information as a numbered list, which would help readability and reproducibility.
Figure 9: Participant flow diagram, generated based on study type. The participant flow in each group would have the same fields as the one shown for Group 1. They are collapsed in the figure to save space.
Figure 10: Item 15 on the CONSORT checklist asks for “a table showing baseline demographic and clinical characteristics for each group.” Once again, the number of groups is based on the trial design specified earlier. Users could generate their own tables in the system, upload tabular text (.dat, .csv, .tsv) or spreadsheets (e.g., .xlsx, .ods), or link to data-sharing sites. For analysis and discussion sections, the interface would also accommodate uploading figures, much as online journal submission systems already do.
Figure 11: Final page of the CONSORT questionnaire. (A) A user should be able to preview the paper before submitting. The preview would be generated as a report in the same way as the versions for peer review and for eventual publication—a compilation, in a specific order, of the data entered. (B) Button for final article submission. Once users clicked on this button, they would be alerted to any required fields left empty.
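The preview step in Figure 11 amounts to compiling the stored fields in a specified order. A minimal sketch, with an invented field order and naming convention:

```python
# Sketch: generating the preview/peer-review document by printing stored
# fields in checklist order. REPORT_ORDER and field names are illustrative.

REPORT_ORDER = ["title", "abstract", "trial_design", "interventions"]

def compile_report(fields):
    """Concatenate field values in the specified order, skipping empty ones."""
    return "\n\n".join(
        f"{name.replace('_', ' ').title()}\n{fields[name]}"
        for name in REPORT_ORDER
        if fields.get(name)
    )

preview = compile_report({"title": "Drug X vs. placebo", "trial_design": "parallel"})
```

The same compilation routine would produce the preview, the version sent to peer reviewers, and the published article, so all three stay consistent with the underlying data.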

Other considerations

Archives

When journals moved from print to online dissemination, publishers recognized the value of digitizing their archives so that older articles could also be searched and accessed. Analogously, if publishers not only accepted new articles as structured data but also committed to converting their archives, the benefits would be enormous. First, achieving the eventual goal of completely converting all existing biomedical articles would help researchers perform accelerated systematic reviews on a comprehensive set of data. Second, the conversion process would favour published articles that already comply with the reporting guidelines; after conversion, researchers would be able to search a curated dataset of high-quality articles.

I recognize that the resources needed for this conversion would be considerable, and I foresee the development of a new class of professionals trained in assessing and converting existing articles. For articles that meet almost but not quite all reporting guidelines, particularly more recent publications, these professionals may succeed in acquiring missing data from some authors.[24] Advances in automating the systematic review process[25] may also help expedite conversion.

Software development for the database and interface

In “Reducing waste from incomplete or unusable reports of biomedical research,” Glasziou et al. call on the international community to find ways to decrease the time and financial burden of systematic reviews and urge funders to take responsibility for developing infrastructure that would improve reporting and archiving. To ensure interoperability and encourage widespread adoption of health reports as structured data, I urge the international biomedical research community to develop and agree to a common set of standards for the report databases, in analogy to the effort to create standards for trial registration that culminated in the World Health Organization’s International Standards for Clinical Trial Registries.[26] An international consortium dedicated to developing a robust database and flexible interface to accommodate reporting structured data would also be more likely to secure the necessary license to use a copyrighted controlled vocabulary such as the ICD.

Implementation

Any new system with wide-ranging effects must be developed in consultation with a representative sample of users and adequately piloted. The users of the report submission interface will largely be researchers, but the report generated by the journal could be consulted by a diverse group of stakeholders—not only researchers but also clinicians, patient groups, advocacy groups, and policy makers, among others. A parallel critical review of the format of this report would provide an opportunity to assess how best to reach audiences that are invested in discovering new research.

Although reporting guidelines exist for many different types of reports and can each serve as the basis of a questionnaire, I recommend a review of all existing biomedical reporting guidelines together to harmonize them as much as possible before a database for reports is designed, perhaps in collaboration with the BioSharing initiative[27] and in an effort similar to the MIBBI Foundry project to “synthesize reporting guidelines from various communities into a suite of orthogonal standards” in the biological sciences.[28] For example, whereas recruitment methods are required according to the STARD guidelines, they are not in CONSORT. Ensuring that all guidelines have similar basic requirements would ensure better interoperability among article types and more homogeneity in the richness of the data.
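Harmonization could start with a simple audit of where the guidelines’ required fields diverge. The field sets below are invented stand-ins; real CONSORT and STARD checklists are longer and differ in other ways.

```python
# Sketch: comparing two guideline schemas for shared basic requirements.
# Field sets are invented for illustration; real checklists differ.

CONSORT_FIELDS = {"title", "trial_design", "eligibility", "outcomes"}
STARD_FIELDS = {"title", "eligibility", "outcomes", "recruitment"}

shared = CONSORT_FIELDS & STARD_FIELDS       # requirements common to both
only_stard = STARD_FIELDS - CONSORT_FIELDS   # e.g., recruitment: required by
                                             # STARD but not by CONSORT
```

Running this kind of comparison across all guidelines would surface exactly the gaps (like recruitment methods) that a harmonization effort should close before the database schema is fixed.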

Conclusions

Structuring biomedical research reports as data will improve report quality, decrease the time and effort it takes to perform systematic reviews, and facilitate translations and interoperability with existing data-driven systems in health care. The technology exists to realize this shift, and I, like Glasziou et al., urge funders and publishers to collaborate on the development, in consultation with users, of a robust reporting database system and flexible interface. The next logical step for research in this area would be to build a prototype and run a usability study with researchers.

Reports as structured data aren’t a mere luxury—they’re an imperative; without them, biomedical research is unlikely to become well integrated into existing health informatics infrastructure clinicians use to make decisions about their practice and about patient care.

Sources

[1] “CONSORT Statement,” accessed October 04, 2014, http://www.consort-statement.org/.

[2] “PRISMA Statement,” accessed October 04, 2014, http://www.prisma-statement.org/index.htm.

[3] “STARD Statement,” n.d., http://www.stard-statement.org/.

[4] “The EQUATOR Network | Enhancing the QUAlity and Transparency Of Health Research,” accessed September 26, 2014, http://www.equator-network.org/.

[5] Erik Andrew, “A Proposal for Structured Reporting of Randomized Controlled Trials,” JAMA: The Journal of the American Medical Association 272, no. 24 (December 28, 1994): 1926, doi:10.1001/jama.1994.03520240054041.

[6] Elizabeth Wager, “Publishing Clinical Trial Results: The Future Beckons.,” PLoS Clinical Trials 1, no. 6 (January 27, 2006): e31, doi:10.1371/journal.pctr.0010031.

[7] Roberto Stramare et al., “Structured Reporting Using a Shared Indexed Multilingual Radiology Lexicon.,” International Journal of Computer Assisted Radiology and Surgery 7, no. 4 (July 2012): 621–33, doi:10.1007/s11548-011-0663-4.

[8] J M L Bosmans et al., “Structured Reporting: If, Why, When, How-and at What Expense? Results of a Focus Group Meeting of Radiology Professionals from Eight Countries.,” Insights into Imaging 3, no. 3 (June 2012): 295–302, doi:10.1007/s13244-012-0148-1.

[9] Henry S Rzepa, “Chemical Datuments as Scientific Enablers.,” Journal of Cheminformatics 5, no. 1 (January 2013): 6, doi:10.1186/1758-2946-5-6.

[10] Robert L Kane, Jye Wang, and Judith Garrard, “Reporting in Randomized Clinical Trials Improved after Adoption of the CONSORT Statement.,” Journal of Clinical Epidemiology 60, no. 3 (March 2007): 241–49, doi:10.1016/j.jclinepi.2006.06.016.

[11] N Smidt et al., “The Quality of Diagnostic Accuracy Studies since the STARD Statement: Has It Improved?,” Neurology 67, no. 5 (September 12, 2006): 792–97, doi:10.1212/01.wnl.0000238386.41398.30.

[12] Paul Glasziou et al., “Reducing Waste from Incomplete or Unusable Reports of Biomedical Research.,” Lancet 383, no. 9913 (January 18, 2014): 267–76, doi:10.1016/S0140-6736(13)62228-X.

[13] Robin Christensen, Henning Bliddal, and Marius Henriksen, “Enhancing the Reporting and Transparency of Rheumatology Research: A Guide to Reporting Guidelines.,” Arthritis Research & Therapy 15, no. 1 (January 2013): 109, doi:10.1186/ar4145.

[14] Sara Khangura et al., “Evidence Summaries: The Evolution of a Rapid Review Approach.,” Systematic Reviews 1, no. 1 (January 10, 2012): 10, doi:10.1186/2046-4053-1-10.

[15] Patricia Wright and Fraser Reid, “Written Information: Some Alternatives to Prose for Expressing the Outcomes of Complex Contingencies.,” Journal of Applied Psychology 57, no. 2 (1973).

[16] Karen A. Schriver, Dynamics in Document Design: Creating Text for Readers (New York: Wiley, 1997).

[17] Robert E. Horn, Mapping Hypertext: The Analysis, Organization, and Display of Knowledge for the Next Generation of On-Line Text and Graphics (Lexington Institute, 1989).

[18] An-Wen Chan et al., “Increasing Value and Reducing Waste: Addressing Inaccessible Research.,” Lancet 383, no. 9913 (January 18, 2014): 257–66, doi:10.1016/S0140-6736(13)62296-5.

[19] Andra Morrison et al., “The Effect of English-Language Restriction on Systematic Review-Based Meta-Analyses: A Systematic Review of Empirical Studies.,” International Journal of Technology Assessment in Health Care 28, no. 2 (April 2012): 138–44, doi:10.1017/S0266462312000086.

[20] R. Smith, “Scientific Articles Have Hardly Changed in 50 Years,” BMJ 328, no. 7455 (June 26, 2004): 1533–1533, doi:10.1136/bmj.328.7455.1533.

[21] Australian Research Council, “Grant Application Management System (GAMS) Information” (corporateName=The Australian Research Council; jurisdiction=Commonwealth of Australia), accessed October 04, 2014, http://www.arc.gov.au/applicants/rms_info.htm.

[22] Canadian Institutes for Health Research, “Acceptable Application Formats and Attachments—CIHR,” November 10, 2005, http://www.cihr-irsc.gc.ca/e/29300.html.

[23] “Structured Abstracts in MEDLINE®,” accessed January 14, 2015, http://structuredabstracts.nlm.nih.gov/.

[24] Shelley S Selph, Alexander D Ginsburg, and Roger Chou, “Impact of Contacting Study Authors to Obtain Additional Data for Systematic Reviews: Diagnostic Accuracy Studies for Hepatic Fibrosis.,” Systematic Reviews 3, no. 1 (September 19, 2014): 107, doi:10.1186/2046-4053-3-107.

[25] Guy Tsafnat et al., “Systematic Review Automation Technologies.,” Systematic Reviews 3, no. 1 (January 09, 2014): 74, doi:10.1186/2046-4053-3-74.

[26] World Health Organization, International Standards for Clinical Trial Registries (Geneva, Switzerland: World Health Organization, 2012), www.who.int/iris/bitstream/10665/76705/1/9789241504294_eng.pdf.

[27] “BioSharing,” accessed October 12, 2014, http://www.biosharing.org/.

[28] “MIBBI: Minimum Information for Biological and Biomedical Investigations,” accessed October 12, 2014, http://mibbi.sourceforge.net/portal.shtml.