e-Learning Ecologies MOOC’s Updates

Recursive Dialogical Learning for Continuous Improvement

The concepts of learning analytics, computer adaptive testing, and peer review converge to offer an environment for continuous learning and improvement. With existing tools we can assess a learner's knowledge and 'capacity', which I define as the ability to process and apply knowledge within a certain context. Learning content can be curated for the individual and recursively tested and adapted as required. Introducing peer review into this environment presents some interesting possibilities. As we move from 'the expert (teacher) determines the right answer' to sourcing new learning from students and practitioners, learning can be dynamic and practical as new ideas are shared and tested. As learning evolves, organizational capacity increases.
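The assess-then-curate cycle described above can be sketched in a few lines of code. This is a minimal illustration, not a real adaptive-testing implementation: the function names, the mastery threshold, and the two-tier content structure are all assumptions introduced for the example.

```python
# Hypothetical sketch of a recursive assess-adapt loop. All names and
# thresholds here are illustrative assumptions, not a real adaptive-testing API.

def assess(responses, answer_key):
    """Score a learner's responses as the fraction answered correctly."""
    correct = sum(1 for q, a in answer_key.items() if responses.get(q) == a)
    return correct / len(answer_key)

def adapt(content_by_level, score, mastery=0.8):
    """Curate the next content bundle: advance on mastery, otherwise remediate."""
    return content_by_level["advanced"] if score >= mastery else content_by_level["remedial"]

# Example data (assumed): a three-question quiz and two content tiers.
answer_key = {"q1": "b", "q2": "d", "q3": "a"}
responses = {"q1": "b", "q2": "d", "q3": "c"}
content = {"remedial": ["review module"], "advanced": ["next module"]}

score = assess(responses, answer_key)   # two of three correct
next_content = adapt(content, score)    # below mastery, so remediate
```

Each pass through `assess` and `adapt` would feed the learner new content, after which the loop repeats; peer-reviewed contributions could enter the system by updating `answer_key` and `content` between iterations.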

Students in this environment can not only direct the content but also prioritize learning in ways an "expert" instructor might not. Learners within a corporate environment are likely closer to the work and therefore have a better understanding of what is required to execute the functions that the training is designed to improve. Ann Cunliffe suggested that "we need to go beyond a purely intellectual critique to one grounded in the more informal, everyday ways of sense-making and learning practice" (Cunliffe, 2002). One example of the need to engage staff in recursive dialogical training is that formal training may not address the 'workarounds' common in daily tasks. A top-down approach may likewise miss tribal knowledge and fail to deliver an increase in capacity, as learners complete a course that does not accurately reflect the way work actually gets done. Contrast that with a dialogical, learner-driven process in which learners actively develop new content and use feedback to improve systems and update training in concert, based on new information and real-world testing.

Dr. Cope (2019) referenced this dialogical learning environment, using the example of a Twitter thread where someone posts an idea and subsequent contributors add their thoughts. This type of dialogical content still risks veering toward opinion and conjecture unless there is some mechanism requiring references or other validation. In the context of an organization performing work in the "real world," there exists the opportunity to validate dialogical learning through testing in a live-fire environment. As Cunliffe put it, "By questioning at many levels; self, others, theory, language, knowledge, reality, ideology, we may become more critical and responsive practitioners - better able to be actively engaged in the much-needed search for fundamental alternatives to ways of organizing and 'doing things'" (Cunliffe, 2002, citing Prasad and Cavanaugh, 1997: 315). Cunliffe wrote this in 2002, and the tools at hand for multi-modal, two-way dialogical learning have since evolved considerably. I believe Socrates would be delighted by the possibilities for organizations to self-reflect, test, and learn.

Cunliffe, A. (2002) 'Reflexive Dialogical Practice', Management Learning, 33(1): 33-61. London, Thousand Oaks, CA and New Delhi: Sage. Retrieved from: https://journals.sagepub.com/doi/10.1177/1350507602331002

Cope, B. (2019) 'Recursive Feedback, Part 4C: Crowdsourcing Prospective or Constitutive Assessment', e-Learning Ecologies MOOC, Coursera. Retrieved from: https://www.coursera.org/learn/elearning/lecture/WGn5G/recursive-feedback-part-4c-crowdsourcing-prospective-or-constitutive-assessment