Learning Analytics

Recently I found myself in several meetings discussing 'learning analytics'. Basically, we want to identify potential data sources that will help inform our decisions around retention, student success, advising, placement and a plethora of other student-centered topics. The Chronicle just released a piece on learning analytics, citing examples from Harvard to Rio Salado College, a community college in Arizona. 

Regardless of what lens I view learning analytics through, I see incredible opportunity to better guide and support our students. From an institutional research perspective, I think we can use these analytics to enhance things like retention and advising. From a faculty perspective, I can see using analytics to increase engagement in my classroom. Being part of a small committee looking at potential new CMS platforms for Penn State, I'm thrilled to report that all of our potential platforms have a wide variety of learning analytics modules. 

While I feel this is an extremely positive movement, Gardner Campbell, director of professional development and innovative initiatives at Virginia Tech, has a different take (from the Chronicle article):
"Counting clicks within a learning-management system runs the risk of bringing that kind of deadly standardization into higher education."

The article summarizes Gardner's concerns, pointing out that these CMS environments are not necessarily the best platforms for measuring real student engagement and creativity. I wholeheartedly agree with Gardner! This could be a slippery slope for some universities. But I would still argue that counting clicks is an important piece of guiding decision making around retention and student success.

Take Rio Salado, for example. I attended a webinar by their project lead, who reported that much of the variance in student success (a "C" or better) can be explained by just two variables from the CMS:
  1. Date of first login
  2. Whether or not the student has clicked on (and, presumably, viewed) the course syllabus.
If these two simple, easy-to-track variables play such a large role in predicting whether a student will succeed or fail in a course, why not track them? This allows the instructor, or student adviser, to intervene very early in the semester, which in turn greatly increases that student's chance of success.
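To make the idea concrete, here's a minimal sketch of what an early-alert check on those two variables might look like. The record fields, the seven-day grace period, and the function name are all my own illustrative assumptions, not anything Rio Salado or any particular CMS actually exposes:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical record shape -- field names are illustrative only,
# not from any real CMS reporting API.
@dataclass
class EnrollmentRecord:
    student_id: str
    first_login: Optional[date]   # None if the student has never logged in
    viewed_syllabus: bool

def needs_early_alert(record: EnrollmentRecord,
                      semester_start: date,
                      grace_days: int = 7) -> bool:
    """Flag a student for adviser follow-up if either early-warning
    signal is missing: no login within the grace period, or no
    syllabus view. The grace period is an assumed threshold."""
    if record.first_login is None:
        return True
    logged_in_late = (record.first_login - semester_start).days > grace_days
    return logged_in_late or not record.viewed_syllabus

# A student who logged in on day two but never opened the syllabus
student = EnrollmentRecord("psu001", date(2013, 1, 8), viewed_syllabus=False)
print(needs_early_alert(student, semester_start=date(2013, 1, 7)))  # True
```

A rule this simple is obviously not a predictive model, but it captures the point: both signals are already in the CMS logs, so an adviser could get a flag like this in week one rather than after the first exam.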

I look forward to the launch of Penn State's new CMS, and to the data-driven initiatives we can spin up to enhance student success and retention.
