#further reflection on Muddiest Point


What did I learn today? That there may be scope for further research from my current project on an issue called the “muddiest point”.
How was it useful? I was reading back over material from my literature review and I was again looking at the paper which mentions the muddiest point, Pinder-Grover, Green, and Millunchick (2011), and a journal article by Mosteller (1989) explaining the concept.
What thoughts came up? The term “muddiest point” is used to describe an area or concept that students are struggling with. Identifying and addressing these muddy points can be really helpful to students (obviously!). Pinder-Grover, Green, and Millunchick suggest using screencasts to explain these muddy points.
What did I think? Getting students to admit that they do not understand something is a tough job; could I use an app or some software for anonymous polls during lectures to identify muddy points?
What will I do with the information i.e. what next? Start looking at apps that might be suitable; www.socrative.com comes to mind. It was featured in the DIT 12 Apps of Christmas 2014 and I used it for a couple of in-class polls, though more for demonstration purposes.

Pinder-Grover, T., Green, K. R., & Millunchick, J. M. (2011). The efficacy of screencasts to address the diverse academic needs of students in a large lecture course. Advances in Engineering Education, 2(3), 1-28.

Mosteller, F. (1989). The ‘Muddiest Point in the Lecture’ as a feedback device. On Teaching and Learning: The Journal of the Harvard-Danforth Center, 3, 10-21.

#reflection on “Muddiest Point”

What did I think about today? I was reading about the “muddiest point” in lectures and how one group used screencasts to clear them up, Pinder-Grover, Green, & Millunchick (2011), and it struck me that this would be a great way to extend the use of my screencasts.
How was it useful? Initially, my screencasts were for “mechanical” tasks, purely the “how-to”, but they are evolving into “why”.
What thoughts came up? Can I use any other technology to find the “muddiest point”? I immediately thought of Socrative (www.socrative.com), an app I learned of through the DIT 12 Apps of Christmas, and which I have demonstrated in class before.
What did I learn? That my use of technology is evolving and I am hopefully moving up the SAMR ladder, Puentedura (2010), by combining technologies to help my learners.
What will I do with the information i.e. what next? Create some Socrative prompt slides within some lectures and see if I can use them to solve some “muddiest point” issues.

Pinder-Grover, T., Green, K. R., & Millunchick, J. M. (2011). The efficacy of screencasts to address the diverse academic needs of students in a large lecture course. Advances in Engineering Education, 2(3), 1-28.

Puentedura, R. (2010). SAMR and TPCK: Intro to advanced practice. Retrieved February 12, 2013.

#reflection on Research Results

What did I reflect on today? The phrase “intended and unintended consequences”, Seery & Donnelly (2012), resonated with me.
How was it useful? While reviewing my notes I identified a number of results which were not in my mind at the start of the project, such as helping students find reliable sources of information and improving their research skills. My notes and tutorial structures have also changed as a result of the screencasts, and in my mind the lab instructions have improved.
What thoughts came up? That the unintended consequences can have equal, if not greater, importance than the intended ones.
What did I learn? I referred to my project recently as an onion, with many layers; this is another illustration of that analogy.
What will I do with the information i.e. what next? Review my observation journal again to see if I have missed anything else in my results.

Seery, M. K., & Donnelly, R. (2012). The implementation of pre-lecture resources to reduce in-class cognitive load: A case study for higher education chemistry. British Journal of Educational Technology, 43(4), 667-677.

#reflection on Cognitive Load Theory

What did I think about today? That my project has a particular relationship to cognitive load theory.
How was it useful? I knew from the start that Cognitive Load was one of the aspects of the project, but this was a minor revelation or clarification for me.
What came up? I was reading about program design for cognitive load (Kirschner, 2002), which is important, but I had the realisation that the program design was not done by me. I need to take a step back and analyse the program (Packet Tracer) design and see whether it meets good design practice, or is that why I feel the need to produce screencasts about using it?
What did I learn? Research projects are like onions, just as you peel away one layer, another is revealed beneath (and it can make you want to cry ; ))
What will I do with the information i.e. what next? I’m going to do a quick analysis of the Packet Tracer interface to see whether it meets good design practice and, if not, why not? Is this why it needs screencasting?

Kirschner, P. A. (2002). Cognitive load theory: Implications of cognitive load theory on the design of learning. Learning and Instruction, 12(1), 1-10.

#reflection on Cognitive Load of Novice Learners

What did I learn today? That asking novice learners to analyse their cognitive load may be over-ambitious.
How was it useful? I am trying to measure how much mental effort students have to put into my screencasts.
What thoughts came up? When they are looking at my screencasts, how much thought will they put into cognitive load, or mental effort? This is not normally reflected on by novice learners (Martin, 2014), so why should I expect them to remember?
What did I learn? That every aspect of research has two views: the view of the researcher and the view of the subject, and those two are not (or are usually not) the same. Why should we expect the subjects to analyse themselves?
What will I do with the information i.e. what next? Measure in a different way? Is that possible? Or highlight the issue to research subjects? Will that create an “observer effect”? This needs more research…

Martin, S. (2014). Measuring cognitive load and cognition: metrics for technology-enhanced learning. Educational Research and Evaluation, 20, 592-621.

#reflection on Video Lectures by Course Tutors

What did I learn today? That students use YouTube to help them with study and extra material.
How was it useful? Because they have a hard time sorting the good information from the bad.
Why is it useful? Because they like to get videos prepared by their own lecturers; that way they feel they can trust the information.
What did I learn? Despite the wide range of resources available, students prefer to stay “local” instead of “global”, and research skills, i.e. sorting good information from bad, are new skills for modern learners.
What will I do with the information i.e. what next? Make more screencasts, since students definitely like them, and consider how we can help students find the good stuff. Maybe create some areas in the LMS for sharing good sources (or make it an assignment?).

#reflection on instructional design

What did I learn? That my slides showing people how to use software were not as good as I thought.

How was it useful? I am about to use screencasts to “mirror” what I do in the tutorials but I have discovered that the slides I use for the tutorial don’t cover everything I had recorded as screencasts.
Why is this useful? Because it highlights a gap in my material and shows how using technology has forced me to reflect on the static version of the notes.
What will I do next? So much to do; I could start by rehearsing screencasts for other tutorials to see if they match up with what I thought I was teaching.

#reflection on class sizes

What did I reflect on today? Whether or not I can give the same level of teaching to large groups that I provide for small groups. I was comparing exams given to large classes with those given to smaller groups and there was a slight disparity in the level of questions.
How was it useful? It made me think: am I making it too easy for the large group or too hard for the small one?
What thoughts came up? I need to ensure there is no bias or imbalance in assessment.
What did I learn? The learning objectives are the key to assessment, and other factors must be secondary to them. That is not what the idealist educator wants to see, but it is a reality we all must face.
What will I do with the information i.e. what next? Revisit old assessments for an “audit” and examine ways, going forward, to use technology so that assessments can “scale” up.

#reflection on Evaluation

What did I learn today? That I need to take a step back from my artefact and try to evaluate it with a critical eye.
How was it useful? Because I may be adding to my workload unnecessarily.
What thoughts came up? That I need to think more about what I will measure and how.
What did I learn? Just thinking about evaluation is a worthwhile process because it identifies gaps and areas to concentrate on.
What will I do with the information i.e. what next? Look at my questions, look at the types of questions others have asked about similar products, and identify the key points I need/want to evaluate.

All my material from my MSc in Applied E-Learning