As we travel across campus to engage faculty and students about the ongoing LMS pilot project, we’ve been routinely asked what is driving the project.
It’s simple, actually: the OU Desire2Learn contract expires next summer. In the ever-changing technology industry, a couple of years can be an eternity, so before we embark on another multi-year contract for something so critical to the student learning experience, we want to ensure that we are making the right decision! The point of this pilot is to take an in-depth look at where we’ve been, where we are now, and where we’re going.
While opinions are important, so is objectivity. As such, we are doing our best to gather as much data as possible this semester. Through a mixture of faculty and student demos, usage tests, surveys, personal demonstrations, deep technical vetting, and live courses, we hope to collect as many different types of feedback as we can, from as much of the University as possible. The Center for Teaching Excellence and the Center for Educational Development and Research are also working to collect additional data to supplement and complement what OU IT is doing.
The other common questions we’ve been receiving relate to the timeline: when will a decision be made, and how long will we have to make adjustments?
Our plan is to deliver a completed analysis to University leadership by June 1, in hopes that a decision can be made by July. This would leave us at least one complete year to make any necessary transitions. As we’ve been telling everyone we talk to, something will change. Having the right tool is only one piece of the puzzle; we will also be making recommendations for how the University can better support the tool and the students and faculty who depend on it.
We sincerely appreciate all the feedback we have received so far, and want to thank you all for any time you have devoted to this project. Please continue to send your questions and feedback to email@example.com
If the goal is to gather information from as many faculty and students as possible, why were there not more surveys? That’s something I really don’t understand. There are something like 1500 full-time faculty at OU; I don’t see how you can get anything like representative feedback from all those faculty members without a lot of surveys gathering a lot of data – and surveying is now MUCH easier than it was many years ago when the last LMS change took place.

It seems to me you need detailed information about HOW we teach in order to identify what technologies would best support us in our teaching. But so far, I only remember one survey. It was all about D2L, if I recall correctly; it was not really what I would call in-depth in terms of asking us about our teaching; and those of us who participated never got to see results, because the results were not made public, at least not to my knowledge (it was administered through CTE and I’m on their mailing list; survey results were never shared via that mailing list, unless I missed something).

This post makes it sound like there is some kind of objective vetting that can be done for this software, but the goodness of the software is dependent on how it is being used by teachers and students. It doesn’t seem to me like very much information has been gathered from teachers or students about how to anticipate that usage in terms of our goals/strategies as teachers and learners. Without that information, how do you know what features need to be technically vetted, prioritized, etc.?
Meanwhile, as I said on another post here about this process, I am very glad that we have this blog to at least attempt some kind of public conversation. That is something I really appreciate!!!
Great points! Thankfully, we have some help from the Center for Educational Development and Research (CEDaR) on this project, and while their surveys won’t reach ALL faculty, they will reach more than we have. CEDaR is much better with research than we are (it’s what they do!), so we have great confidence in the information they are gathering.
Our most recent outreach, through the Pilot Roadshow and demos, is aimed at the entire campus. It doesn’t count as a survey, but it is an opportunity for those not directly involved in our pilot to provide feedback.