Notes from “Learning analytics: One university’s journey”, presented by Ruth Grindey and Sandra Stevenson-Revill, from the University of Derby, at BbWorld16.
The University of Derby is a public university in the city of Derby, England.
Began initial implementation of Blackboard in Spring 2015, including the Analytics B2. Blackboard is a strong partner in the installation process for Analytics, and makes the process much easier. Pulling in historical data is possible, but takes time.
The configuration was made complex by pulling in data from both Blackboard and PeopleSoft, and by the granularity of their PeopleSoft data. They also had some difficulty configuring UK versus US data formats (e.g., dates).
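The UK/US format problem is presumably the classic day-first vs. month-first date ambiguity. A minimal sketch (not Derby's actual configuration; the sample value is invented) of why the same raw string parses to two different dates:

```python
from datetime import datetime

raw = "03/04/2016"  # ambiguous without a known locale

uk = datetime.strptime(raw, "%d/%m/%Y")  # UK reading: 3 April 2016
us = datetime.strptime(raw, "%m/%d/%Y")  # US reading: March 4, 2016

print(uk.date(), us.date())  # 2016-04-03 2016-03-04
```

If the warehouse load assumes the wrong convention, every date where day ≤ 12 silently loads as a valid but incorrect value, which is why format configuration has to be settled before historical data is pulled in.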
It is critical to test the data within the Analytics platform, to make sure that the data is coming together correctly. Check the data, challenge the data, and fix problems.
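The "check the data, challenge the data" step can be sketched as simple sanity checks run against a merged extract. A hypothetical example (the row shape and thresholds are invented, not Derby's actual pipeline):

```python
# Hypothetical rows from a merged Blackboard/SIS extract.
rows = [
    {"student_id": "S001", "course_id": "C100", "logins": 14},
    {"student_id": "S002", "course_id": "C100", "logins": -3},  # 'dirty' value
    {"student_id": "",     "course_id": "C101", "logins": 7},   # broken join
]

# Challenge the data: flag rows that cannot be correct.
problems = [r for r in rows
            if r["logins"] < 0 or not r["student_id"]]

print(len(problems))  # 2 suspect rows to investigate and fix
```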
Learning Analytics Online
- Online teaching runs in trimesters (17 weeks each)
- Focusing on students at risk and tutor engagement
- All student support staff
- All teaching areas
- Continue to identify key questions
- Identified four dashboards, one for each audience, using out-of-the-box reports
  - Senior management (9 reports)
  - Academic management (13 reports)
  - Customer service management (7 reports)
  - Content management (18 reports)
Learning Analytics On-Campus
- Establishing a pilot group (September 2015-September 2016)
  - 30 users
  - Across all colleges
- Identify common questions and themes from tutors
- Initial dashboard creation
  - Student journey
  - Personal tutoring
  - Time spent in Blackboard
  - Number of times accessed
  - Which tools and documents?
Made a conscious choice not to roll out student dashboards at this time (although may do so in the future). Reasons: (1) They wanted to make sure the data would be right, (2) They were not ready to field questions from students about the data.
Why Adopt Learning Analytics?
First, they wanted to be proactive in customer service to their students. They had been operating reactively, acting only when someone asked. With analytics, they can instead take a pastoral-care approach: identifying and diagnosing problems proactively, and supporting students better and sooner.
- You’re not interacting – can we help?
- You’re struggling to study online – can we improve the materials?
- What is the relationship between faculty and students?
- Is this student struggling alone, or are there more students in the course? Is there a pattern?
Performance Dashboards, Retention Centre, and Module Reports
This lets you pull information from a variety of Blackboard tools at once, rather than going into each one individually. Building dashboards lets you quickly identify hot spots, peaks, and valleys.
Make business decisions on tool adoption based on high and low usage, and identify best practices for the top tools.
They use the “Learn Course at a Glance Report” quite a bit, which is an out-of-the-box report in Analytics for Learn. It shows data about students and their interaction within a course.
Important to be sensitive about using data for instructor interaction and engagement, particularly around making assumptions. Lower engagement of instructors might mean higher efficiency. It is difficult, even with the data provided by A4L, to correlate instructor interaction/engagement with student success.
Dashboard/Report Data Considerations
- Data assumptions can be misleading
- Beware of averages, especially across future timelines
- Look out for ‘dirty’ data
- Understand Blackboard roles (e.g., using Quick Enroll affects instructor-interaction data, while student preview users are ignored)
- Data triangulation is required
- Reports often generate more questions, which require investigation before a conclusion is drawn
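The "beware of averages" caution can be illustrated with hypothetical numbers (not from the session): two cohorts with identical mean activity can have very different risk profiles.

```python
# Hypothetical weekly login counts per student for two cohorts.
engaged = [5, 6, 5, 4, 5]     # steady activity across the cohort
bimodal = [10, 12, 0, 0, 3]   # a few heavy users, several disengaged students

avg = lambda xs: sum(xs) / len(xs)
print(avg(engaged), avg(bimodal))  # both 5.0 -- the average hides the at-risk students
```

A dashboard reporting only the course average would score both cohorts identically, even though the second contains students who have stopped logging in entirely; distributions, not just means, need checking before conclusions are drawn.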