I made a ten-minute video for my faculty colleagues at Bemidji State University to answer the question many are asking as I begin advocating for open learning on campus. It incorporates some of the data from the latest Florida survey I wrote about yesterday, especially the part about required textbooks going unused.
In March 2019 Florida’s Office of Distance Learning and Student Services published a follow-up to the 2010, 2012, and 2016 student surveys, which have been a valuable source for many OER advocates. The new survey was conducted in spring 2018 and involved over 21,000 respondents. Among the survey’s findings:
- For the first time since the 2012 survey, overall textbook costs did not increase. Only 43.8% of students reported spending more than $300 for the semester, with about ten percent of respondents shifting from the “above $300” column to the “below $300” column relative to 2016. This result doesn’t quantify the real savings, though, since it doesn’t tell us whether students went from $305 to $295 or from $400 to $200 between the two surveys.
- Students increased their efforts to reduce textbook costs by finding cheaper vendors for new textbooks and by buying used copies or renting print or digital textbooks. It is worth noting that buying or renting from cheaper online sources and buying used, practices that have all increased since 2016, could be threatened by publishers’ “inclusive access” plans that require students to acquire their materials from a single source.
- Students continued to report that they had not acquired required textbooks (64.2%), had taken fewer courses (42.8%), had avoided a particular course (40.5%), had earned a poorer grade (35.6%), or had dropped a course (22.9%) due to textbook expense.
- More students reported that required textbooks went unused in their classes. In 2012, students reported that an average of 1.6 required textbooks were not used in class; in 2016, the figure was 2.6 per student. In 2018, students said 3.6 of the textbooks they had been required to buy were not used. Over a sample of 21,000 students, that means more than 75,000 textbooks were purchased and never used; if the average price was $100, roughly $7,500,000 in student funds were wasted. The survey suggests that courses switching to digital resources may account for this change. If that’s the case, instructors should stop requiring the textbook as well as the ancillaries.
- Students reported a much greater willingness to use digital textbooks. The question was worded around textbook renting (which also increased), but 41.4% indicated willingness to rent digital textbooks, which is a hopeful sign for digital OER acceptance. In addition, 57.2% of students said they used interactive practice questions and 44.8% used PowerPoint slide decks, suggesting that digital, interactive learning is making headway in publisher formats and potentially in OER formats as well.
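The back-of-envelope math behind the unused-textbook figures above can be checked in a few lines. The 3.6 books per student and the 21,000 respondents come from the survey as reported here; the $100 average price is my own round-number assumption, not a survey result:

```python
# Rough estimate of the cost of unused required textbooks.
unused_per_student = 3.6   # average unused textbooks per student (2018 survey)
respondents = 21_000       # approximate survey sample size
avg_price = 100            # assumed average textbook price, not a survey figure

unused_books = unused_per_student * respondents   # total unused textbooks
wasted_dollars = unused_books * avg_price         # estimated wasted spending

print(f"{unused_books:,.0f} unused textbooks, ${wasted_dollars:,.0f} wasted")
```

Running this gives 75,600 unused books and about $7.56 million wasted, which is where the “over 75,000 textbooks” and “$7,500,000” figures come from.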
Image source: All images from 2018 Florida Student Textbook & Course Material Survey, Donaldson, Opper, Shen, 2019. CC-BY.
This document by Dan Allosso, 2019, CC-BY-SA
And here’s an even shorter (2 1/2 minute) video about using Hypothesis inside the D2L LMS. It’s a kluge, but hopefully we’ll get the Hypothesis LMS app installed and it will get easier for students to navigate.
I just made a short video to introduce my students to Hypothesis. I’ll be using it for annotation and discussion in all my online and in-person courses this fall. After they’ve watched the video, I have the students create an account, follow a link I provide to the private group I’ve set up for each section, install the plugin in their browser, and leave at least one comment on the course syllabus. Here’s the video:
One of the things that jumped out at me, during a day full of meetings yesterday related to the beginning of the fall semester, was a guest speaker who opened our meeting in the College of Arts, Education, and Humanities. John Eggers is a Bemidji Pioneer columnist and an advocate for 100% high school graduation. John argues that we should set a goal of getting 100% of Bemidji’s high school students to graduate, and he claims this goal could be achieved not in five years or three years, but in one, if we really put our minds to it.
(Goals by Nick Youngson CC BY-SA 3.0 Alpha Stock Images)
I’m not going to argue whether John’s goal is attainable. The thing that struck me about it is, it’s obviously the right goal. How could one justify setting a goal that aspired to less than 100%? “Yeah, we want to leave 2% or 3% or 5% behind each year” doesn’t cut it. That may be a reality, but it’s not a vision. Whether or not you believe it can be achieved in a year, three, five, or maybe never at all…how could you argue that we want less?
I thought this was a useful idea for OER, especially in the context of the currently popular idea of Z-Degrees. The Minnesota Legislature has mandated that three Associate Degrees be created with zero textbook costs. This will save money not only for the students who complete those particular tracks, but for all the students around them, who will benefit from the Z-courses created even if their entire programs aren’t free of textbook costs. And of course the focus on creating Z-courses will inspire other changes, and the benefits will snowball.
Similarly, as I’ve mentioned before, students at 4-year schools like Bemidji State University would benefit from substantial decreases in their textbook costs even if we can never eliminate them entirely. When John was talking about 100% Graduation, I wrote in my notes, “an unattainable goal isn’t necessarily a bad thing.” Then I immediately started thinking of ways it can be a bad thing. Like if you’ve pegged your compensation to a goal you can’t achieve. But then I ultimately decided that on the whole, I like the idea of refusing to compromise on a vision and then celebrating getting as close to it as possible.
(Roadblock, Ada Gonzalez CC-BY 2008)
I wouldn’t say my experience in the business world was entirely “move fast and break things.” But working in high tech certainly included an understanding that it’s sometimes better to ask forgiveness than permission. The situation couldn’t be more different in higher education.
As part of the OER advocacy I’m planning on my campus this fall, I’ve always assumed I’d do a couple of campus-wide surveys: one of faculty and one of the students affected by high textbook costs. The idea was both to locally replicate the results of student surveys like the famous Florida study, and to signal to all the students and faculty on campus that something is about to begin.
I was informed last spring that in order to survey the faculty, I would need to get the permission of the Interfaculty Organization (IFO), our union. I sent an email to the President of the BSU Faculty Association with a link to the 25-question Qualtrics survey I was planning on using. He said he’d put it on the agenda of the first Faculty Senate meeting in September, but he didn’t think there would be any resistance. So, by the middle of September I’ll probably be sending out my faculty survey via the official mailing list. But he also suggested I contact the Institutional Review Board (IRB), an organization I hadn’t heard of previously.
This is where it gets a bit sticky. The IRB, it turns out, is also known as the Human Subject Committee. It was apparently formed in response to a Federal regulation (45 CFR 46.102(f)) that requires review and approval to do research that “deals with human subjects” in a way “designed to develop or contribute to generalizable knowledge.” The only survey activities that seem to be exempt are student and faculty evaluations and “information collected for program improvement, evaluation, and accreditation.”
I exchanged a couple of emails with the Director of Graduate Studies at BSU, who oversees the IRB. He verified that if I planned on making the data public in any way (conferences, website, publications, etc.) I would need to get IRB approval. If the information was solely for my own course development and not for public distribution, I would not be required to get approval.
What was not clear was what I would need to do to get approval. I studied the IRB website and it seemed that in addition to filling out a number of forms, I would need to get a certificate from another organization called the Collaborative Institutional Training Initiative (CITI) that I had completed a training course of some type. The course was not specified and the provided link took me to CITI’s homepage, which was no help. At this point, I have no idea how much time I would need to put in, to simply get to the point where I could submit a proposal to get my survey approved.
This is a major institutional impediment to my getting the data I was hoping to collect on BSU students and faculty to guide my campaign. While I appreciate the sensitivity of using data collected from people and the need to understand issues of privacy and when a line of questioning might be inappropriate, this vague, poorly defined requirement seems like an unnecessarily obnoxious roadblock. The IRB requirement acts as a sort of unfunded mandate, requiring me to invest an undefined amount of time not only in meeting its requirements but in figuring out what they are. This is the sort of bureaucratic black hole that seems like it could have been designed expressly to prevent innovation rather than to protect “human subjects.” Or is it? There seems to be a loophole, both in the published official guidelines and in the Graduate Program Director’s communication. I may be able to run my surveys on campus if I direct them only at improving my program (increasing OER acceptance and adoption on campus) and if I don’t publicize the data I collect.
It would be unfortunate if I were unable to discuss the data I collected from student and faculty surveys at the OE Global conference in the fall, or if I were unable to create charts and marketing materials documenting the significance of students’ attitudes toward excessive textbook costs. But it wouldn’t be the end of the world. We already have published studies that document these facts. Even without IRB approval, maybe I could still conduct surveys and use the data to plan my campaign, communicate with the administration and other stakeholders about the project (“information collected for program improvement, evaluation, and accreditation”), and track changes over time as I implement the program.
Maybe in the future I’ll be able to find a collaborator who either has or is interested in getting all the certifications and permissions needed to run a survey I could publicize the results of. I might make this a goal of the second-year survey, after we (hopefully) have some change to report. People at the system office and also at the IFO have expressed interest in my surveys and their results. So maybe I could involve them in some way in this future “publication” collaboration, after providing my first-year results under the limited “improvement, evaluation, and accreditation”, non-public guidelines.
In the meantime, I think I’ll try to move forward (which is the goal, after all) in the best way I can, and not let this roadblock stop me in my tracks. I’ll survey students and faculty, but with the express understanding that I will not publicize the results. Or, in other words, that the results will be expressed in what I do about the data, not what I say about it.
One of the useful aspects of Pressbooks is that authors can edit a title and add content whenever they need to. This allows errors to be corrected and materials to remain up to date as new information becomes available. How often have you discovered a problem in a textbook you’re using, and hoped it would be caught and corrected in the next edition in a few years?
Keeping up with research isn’t an issue only in the sciences, though. New information becomes available in all fields as researchers continue discovering new facts or refining their interpretations. For example, I recently discovered another historical source for my volume of primary readings relating to the Ranney brothers and their migrations across the continent in the nineteenth century.
The source wasn’t exactly new: it was a volume called the Compendium of history and biography of Hillsdale County, Michigan, written by Elon G. Reynolds in 1903. Reynolds’ work was typical of the genre, including about 80 pages of general history of the county and then over 450 pages of short biographical sketches of Hillsdale’s leading men and institutions. On pages 302 and 303 there is a sketch of Henry Ranney’s younger brother, Lemuel Sears Ranney.
The passage adds some details to Lemuel’s life I was not aware of, provides validation of some of the events Lemuel and his brothers describe in their letters, and gives us an interesting look at the elements of Lemuel’s story that seemed interesting to the editors of this 1903 volume, and presumably its readers. It also shows the degree of respect Michigan residents seem to have had for Ranney, who was still alive when the book was published.
This was all interesting enough to me that I wrote an extra short “chapter” about it and added it to the end of my ebook. Readers viewing it online will find it automatically appended after what had been the final chapter, which covered Henry Ranney’s obituary. Folks who have downloaded the ebook or pdf versions to their own devices can return to the book’s Pressbooks homepage and download a new copy. I’ll probably not be adding a lot more to this volume, but if I come across any new material, it’s nice to be able to!