Wednesday, December 21, 2016

What You Need to Know About Learning Analytics

This special report caught my attention partly because of our workshop with Diana David, in which she showed us how to access BTC's learning analytics. The report is from The Chronicle of Higher Education. It won't be available online for 30 days, but I'll leave a print copy in the eLearning Lab. (It carries some serious copyright warnings; The Chronicle can charge up to $90.00 for a 24-page special report.) The report reprints a series of the Chronicle's best articles on the subject, and I will highlight a few of them.


Jeffrey Young's This Chart Shows the Promise and Limits of 'Learning Analytics' (2016) describes what an instructor, Courtney Stewart, learned from "the spider graphic" created from one of his online courses. The graphic shows student activity across the course. Stewart has seven years of online teaching experience, and with each lesson he presents the material in several formats: PowerPoint, podcast, video, and straight text. He expected the graphic to reveal the most popular formats, or which format worked well for what percentage of students. Instead, he discovered that over half of his students never used the home page or the lessons but jumped straight to the homework. He decided to attach the lesson material to the homework posts so students would have only one place to look. He also included these analytics in his tenure portfolio.
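
As a side note, a chart like Stewart's is not hard to produce once you have the activity counts. Here is a rough sketch of a "spider" (radar) chart in Python with matplotlib; the course areas and view counts are made-up illustration data, not figures from Young's article:

```python
# A rough sketch of a "spider" (radar) chart of student activity.
# The course areas and view counts below are made-up illustration data.
import numpy as np
import matplotlib.pyplot as plt

areas = ["Home page", "Lessons", "PowerPoint", "Podcast", "Video", "Homework"]
views = [12, 15, 40, 22, 35, 95]  # hypothetical per-area view counts

# Spread the areas evenly around the circle and close the polygon.
angles = np.linspace(0, 2 * np.pi, len(areas), endpoint=False).tolist()
angles += angles[:1]
values = views + views[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(areas)
ax.set_title("Student activity by course area (illustrative data)")
plt.show()
```

Notice that the "students skip to the homework" pattern is already visible in the raw counts, which is worth keeping in mind before investing in the graphic.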


The article that I thought would interest me most was Michael Feldstein's Reaching Students in the Back Row (2016). He discusses three methods that he and his colleague, Phil Hill, have found can help students who are struggling or uninterested:

1) Move content broadcast out of the classroom. Online, this means finding alternate ways to present the information: videos, podcasts, etc.
2) Make homework time contact time. Don't use robo-grading all of the time; personalize some of it, and make the homework relevant to what students are learning or will use later in the class or in their profession.
3) Hire a tutor. Sometimes machine-graded homework can also work as a robo-tutor; these tools are supposed to personalize learning.

This article disappointed me. The title promised more than it delivered.


I discovered in the next article, Adaptive Learning Earns an Incomplete (Feldstein, 2016), that I did not fully understand the meaning of adaptive learning; I thought it meant making learning more accessible. It is actually a computerized way of doing what teachers have always done: noticing that certain students are not understanding the material and adapting the instruction to teach the same thing in different ways. The research discussed was funded by the Bill & Melinda Gates Foundation. Of the 23 courses studied, only four showed a modest gain in grades; most had "no discernible impact." The author points out some limitations in how the data were collected and interpreted.
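
To make that definition concrete, here is a toy sketch of the adaptive idea. This is my own illustration, not any vendor's actual algorithm; the 70% mastery threshold and the list of formats are invented for the example:

```python
# A toy sketch of the adaptive-learning idea, not any vendor's algorithm:
# if a learner keeps missing questions on a concept, re-teach it in a
# different format instead of repeating the same presentation.

FORMATS = ["text", "video", "podcast"]  # alternate presentations, in order

def next_step(recent_scores):
    """Return the next presentation format, or None to advance.

    recent_scores is a list of 0/1 results on the learner's most
    recent questions for the current concept.
    """
    if recent_scores and sum(recent_scores) / len(recent_scores) >= 0.7:
        return None  # mastery looks adequate; move to the next concept
    misses = recent_scores.count(0)
    return FORMATS[min(misses, len(FORMATS) - 1)]

print(next_step([1, 1, 1]))  # None -> move on
print(next_step([0, 1, 0]))  # 'podcast' -> two misses, try a third format
```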


The last article, also by Michael Feldstein, Understand the Origins of Ed-Tech Snake Oil, has implications beyond educational research to today's current affairs. It discusses how companies need to be held accountable for the claims they make about the effectiveness of their products. It details a report from the maker of an educational software product claiming that the product increases graduation rates. The report was never peer-reviewed, but because it was presented and reported at many educational conferences, people started quoting its statistics. Then a critic questioned the findings, arguing that poor design had produced statistical errors; one critic found that feeding students chocolate could produce the same effect. When the company's designers were asked to present additional information, the company chose not to comment. It is a good example of how snake oil, or what we now call fake news, spreads. The author argues not only for better research design and peer review, but also for inviting vendors to be part of the peer-reviewed research.


It would be interesting to see what analytics Canvas can produce and how instructors can access them. It will be important to determine whether the analytics are worth the time it would take to create them. Young concluded that looking at the straight data can often be as valuable as, or more valuable than, data presented in a labor-intensive graphical form.
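
For the curious, Canvas does expose course analytics through its REST API. Below is a minimal sketch of pulling per-student activity summaries; the host name, token, and course ID are placeholders, and the exact endpoint should be confirmed against your institution's Canvas API documentation:

```python
# A minimal sketch of pulling per-student activity from the Canvas
# Analytics REST API. The host, token, and course ID are placeholders.
import requests

BASE_URL = "https://yourschool.instructure.com"  # your Canvas host
TOKEN = "YOUR_ACCESS_TOKEN"                      # a personal access token
COURSE_ID = 12345                                # the course to inspect

resp = requests.get(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/analytics/student_summaries",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Each summary includes page views and participations for one student.
for s in resp.json():
    print(s["id"], s["page_views"], s["participations"])
```

(Canvas paginates its list endpoints, so a real script would also follow the Link headers to fetch every page.)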



Friday, December 9, 2016

Learner Attributes & Thinking About Our Student Focus Groups

"Instead of looking at the attributes of learners who eventually drop out to determine how to help learners sustain motivation and meet learning goals, discovering the attributes of successful learners can help us understand factors that contribute to motivation and persistence. Attributes that successful learners have include a high level of self-confidence and self-efficacy, and an internal locus of control."

This quotation, from "Effective Online Teaching" by Tina Stavredes, a book I'm reading with a few other Quality Matters Coordinators, recalled for me our recent FLC conversation about our upcoming Student Focus Groups. I think it will be important to hear from students who struggle with online learning as well as from those who are more successful.

Self-confidence: "individuals' belief in themselves and their ability to succeed in general"


Here's an interesting and brief post from Cengage: "Three Factors that Boost College Students' Confidence"


Self-efficacy: "person's belief that he or she can succeed at a specific task or range of tasks in a given domain"


Here's a good scholarly article called "Academic Self-Efficacy and First-Year College Student Performance and Adjustment"


Locus of Control: "beliefs about what determines...successes or failures in life"


Here's a Khan Academy video called "Locus of control, learned helplessness, and the tyranny of choice" (not a fan of animal research, btw)


So, what kinds of questions might draw out information about our focus group participants' self-confidence, self-efficacy, and locus of control?