
Competencies, assessment & learning analytics  

COURSE DESCRIPTION

What are the intended outcomes of education, how will we know if we’ve made a difference, and what can we do to improve learning along the way? These hard but important questions are at the heart of learning design. The act of assessment verifies that learning has taken place, but it also provides opportunities for refining plans and improving student learning. Some strategies are easily implemented, while others require advanced expertise. Recent advances in technology make it possible to gather a wealth of data on how people interact within the environments in which they learn, recording each click of the mouse. In education, the use of this data to improve learning is referred to as learning analytics.

COURSE ASSIGNMENTS

Keeping it Real

How Learning Analytics has Influenced my View of Assessment

                       

Throughout my education, I always questioned whether what I was learning mattered, especially when I saw little relevance to what I expected to encounter in the real world. And now, with technology’s expanding role in education, it is no wonder so many are lost in their learning experience. Learning analytics, when used appropriately, has the power to identify hindrances to learner engagement. This paper shares the ways that learning analytics has influenced my thinking about assessment, but also how I believe it is possible to assess using big data while still “keeping it real” – or, put another way, making room in the day-to-day for the data to drive meaningful change.

 

Purposeful Learning and Student Engagement

“Instructors have the knowledge and expertise that can be embodied in an online course experience…but just putting knowledge in a static or inert package is not creating instruction and it certainly isn’t the definition of learning.” – Dr. Marsha Lovett, Carnegie Mellon University

 

Learning Analytics and Big Data in the Real World

 

Startling college placement test scores and decreasing retention and graduation rates tell us something is not working. Granted, I don’t believe the fault lies solely with higher education, but educators and administrators do well to examine their institution’s state of affairs. Many would say assessment is just the right tool for this job. Partnered with big data – simply put, large amounts of data gathered in one place – assessment collects usable data to support decision making and, hopefully, meaningful change. This, in a nutshell, is what I have surprisingly discovered in this course.

 

I would like to begin by sharing evidence of the first course outcome found in the course syllabus: “Articulate the role of competencies and outcomes in driving instructional design and assessment decisions at the institutional, program and course level” (Matthews-Denatale, 2016). Put in my own words: successful courses are designed to do what we need them to do, and educators are deliberate about the content, instructional design, and competencies selected for the program or course. As I survey my many personal experiences with courses – and I am sure we can all relate – a few examples come to mind of what didn’t work.

 

By “work” I mean a few things, starting with keeping the learner engaged. This is a big deal when there are so many distractions, online and all around us. Consider the non-traditional student, who is likely taking courses online while working, raising a family, and perhaps participating in community activities; many distractions compete with the learning experience. The last thing such a student needs is to enroll in an online course, be disengaged by irrelevant content, and wade through wordy course design just to reach the directions for an assignment, leaving them weary before they even get to the meat of the course, which too often has little relevance to their daily work.

 

Secondly, in my experience a course works when it addresses different modes of learning. There is nothing worse for a kinesthetic learner than taking an online course, to say nothing of a lecture course that lives up to its name, with an instructor who mainly lectures with little involvement from the students, again leaving the learner disengaged or, at best, working very hard to find ways to engage. Course design should serve and enrich the learner. With regard to assessment, deliberate course design can serve multiple purposes, such as assessing through embedded assignments. I am particularly fond of this approach, so much so that in surveying my personal learning experiences I realize now what I didn’t realize then: I was being assessed without even knowing it. That is indeed rich, effective, meaningful assessment.


There are a few highlights of this course that I would like to point out. I thoroughly enjoyed the interviews with Dr. Lovett and Dr. Siemens, both of whom I greatly admire for research that has influenced higher education on so many levels. I was particularly drawn to Dr. Lovett’s method of adaptive learning and instructional alignment, using instructional activities themselves to assess. A quote from her interview that really resonates with my vision of education: “instructors have the knowledge and expertise that can be embodied in an online course experience…but just putting knowledge in a static or inert package is not creating instruction and it certainly isn’t the definition of learning” (Lovett, 2016).

 

In terms of assessment and learning analytics (LA), Dr. Lovett also points out, as I discovered last term in Education as an Advanced Field of Study, that weaving meta-cognitive activities into the design of a course addresses higher-order thinking and learning. This aligns with my professional experience: I have embedded indirect and qualitative sources in the New Student Orientation I facilitate as an Academic Advisor at St. Petersburg College (SPC). Through analysis of clicks (data mining) and data collected from indirect, summative sources, assessment happens in a non-intrusive manner, which in turn creates a learning atmosphere where assessment is ongoing and effective without being intimidating or robotic.

 

Dr. Siemens, on the other hand, informs my practice by keeping me current with the latest trends in learning analytics. His research influences different fields, but primarily higher education. He coined the brilliant term “connectivism”: the collection and connection of data from three categories – biological, technological and social – that can be used in the learning experience. I liken it to scaffolding, where connections are made in the learning experience by the learner. In other words, the connections I am making in this paper are a culmination of prior knowledge from earlier courses and experiences, including what I learned in this course (Siemens, 2013). If the goals of education are to “create conditions which stimulate students’ intellectual, moral and emotional growth so that they may ground their skills in a more mature, humane framework of values” (Bowen, 1977), then we do well to consider how we should assess.

 

Another highlight of this course was the discussion of different learning credentials. Competency-Based Education (CBE) is one example, and a great alternative (or addition) to skill-based academic programs that lead to professional state credentials. Alternative learning credentials like CBE align with my vision of education and take us full circle to the title of this paper, Keeping It Real, because what we learn in a classroom should most definitely be relevant and enrich our professions. Let’s be honest: when have you used algebra unless you’re a math teacher, yet it was required to graduate? Or when have you needed to identify, unless you’re a meteorologist, the types of clouds covered in Oceanography (an option for the graduation science requirement)? I wonder how incorporating CBE into academic programs beyond A.S. degrees would improve retention and graduation rates.

 

Learning Analytics in my Organization

 

I foresee learning analytics playing a key role in my organization (SPC), a two-year public institution that also offers four-year degrees. I believe there is opportunity for growth in areas like retention and persistence, persistence being the skills and strategies a student needs to push through personal and academic hurdles that could hinder the learning experience. LA has the potential to reveal what affects student engagement, for both the student and the institution. Several learning management systems used today provide data analytics to the user, including both faculty and students. Establishing buy-in from all administration and staff is critical to realizing the full potential of learning analytics, and it creates transparency across all stakeholders, including the student. Key to student success is self-awareness; LA provides meaningful, practical data for students to self-evaluate and create change where needed, which in turn encourages students to own their education.

 

A recent initiative called Academic Pathways is evidence of data captured through SPC’s business intelligence (a form of data mining) showing a correlation between student career decisions and student success. The initiative creates ten communities, each comprising programs within a subject area, with each program having a pathway of courses that leads the student toward their career goals. St. Petersburg College is currently in the planning stages of implementing this design into the framework of the learning experience, and I am very excited about the possibilities. One of the several goals of Academic Pathways is to contextualize courses; consequently, this makes room for learning analytics to play a greater role. As the article 7 Things You Should Know About Analytics puts it, “students could benefit from data compared across classes for the same course…because such data could reveal links between teaching styles and student learning success” (Educause, 2010).

 

In conclusion, I would say this has been my favorite course. In hindsight, however, I would not recommend taking this course in the summer, and I would even suggest taking it by itself. I feel I wasn’t able to fully capture all that the course offers my profession because it was crammed into eight weeks. This course addressed major gaps in my thinking and preconceived notions about assessment. While creating my assessment plan, I was effectively challenged to think through the details of deliberate, purposeful assessment. Lastly, through the instructional activities and the use of rubrics my writing is more concise and clear, and the assignments allowed me to create, evaluate, analyze, apply, and understand (in keeping with Bloom’s taxonomy).

 

References

 

Bowen, H. R. (1977). Goals: The intended outcomes of higher education. In J. L. Bess (Ed.), Foundations of American Higher Education (pp. 54-69). Needham Heights: Simon & Schuster Custom Publishing.

 

Educause. (2010). 7 things you should know about analytics. Retrieved from https://net.educause.edu/ir/library/pdf/ELI7059.pdf

 

Lovett, M. (2016). Video interview: Competencies, assessment & learning analytics [Video]. Retrieved from https://www.youtube.com/watch?v=LG3Zh6ICvJI&feature=youtu.be

 

Siemens, G. (2013). Making sense of, and finding a way through, learning analytics as a field [Video]. Retrieved from https://www.youtube.com/watch?v=KqETXdq68vY

COURSE REFLECTION

"Reflection makes learning visible to the learner, making it available for connecting and deepening."
