Talk:Learning techniques


Research in the field of education has a substantial problem of terminology vagueness.[1][2][3][4] Instructors often conduct education research using their own course materials; however, without a shared, well-defined vocabulary, the field lacks a coherent body of research. There are more than 70 different models of how learning works. The sheer number of conflicting definitions and the conceptual vagueness are

"both bewildering and off-putting to practitioners and to other academics who do not specialize in this field."[1]

Definitions

Learning is defined as a gain in knowledge or skills.[5] Learning differs from memory in its timescale: learning may be a slow and laborious process, whereas memory is a near-instant expression. Education aims to improve meaningful learning, not merely to develop test-taking abilities.[6]

  • Knowledge is defined as the state of being familiar with something or aware of its existence.[7]
  • Critical thinking is defined as intentional and self-regulatory judgement. It produces explanations of whether the evidence for a specific judgement is appropriate.[8]
  • Skills are defined as the practical application of knowledge.[9]

Metacognition is defined as the awareness and conscious use of the psychological processes involved in perception, memory, thinking and learning.[1]

Knowledge

Critical Thinking

Critical thinking enables individuals to decide what to believe in a given context.[8] It is a non-linear, circular process encompassing interpretation, analysis, evaluation, and inference across the evidential, conceptual, methodological, referential, and contextual aspects of judgement.

Concept Mapping

(put a concept map image here)

Concept mapping reliably improves critical thinking abilities.[8] Concept maps are node-link diagrams in which meaning is signaled by node proximity, shape, and color.[8][10] They perform as well as or better than other stand-alone information sources. The improvements in critical thinking ability were greater for high-ability and older students when measured across a large range of education levels and subjects.
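
As a stand-in for the missing image, the structure can be sketched in code. A minimal sketch, assuming Python with the networkx library; the node names and linking phrases here are illustrative, not taken from the cited sources:

  import networkx as nx

  # A concept map is a node-link diagram: nodes are concepts and each
  # directed, labeled edge forms a proposition (concept, link, concept).
  # Example propositions are illustrative, not from the cited meta-analyses.
  cmap = nx.DiGraph()
  cmap.add_edge("concept mapping", "critical thinking", link="reliably improves")
  cmap.add_edge("concept mapping", "visual chunks", link="organizes nodes into")
  cmap.add_edge("visual chunks", "spatial memory cues", link="provide")

  # Read the map back as propositions, the unit of meaning in a concept map.
  for source, target, data in cmap.edges(data=True):
      print(f"{source} --[{data['link']}]--> {target}")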

The organization of nodes into visual chunks leads to efficiencies that cannot be obtained from text; it adds a spatial sense.[8][10] There are detectable benefits to studying maps rather than outlines or lists, and to constructing maps rather than reading text, attending lectures, or attending class discussions. The largest benefits arise from individuals constructing the maps themselves and from groups studying concept maps; the effect for individuals constructing their own maps was roughly twice that for individuals merely studying maps. These results are potentially due to individuals adding a metacognitive reflection step, or to groups adding discussion-based reflection on top of memory retrieval.

Concept mapping improves retention of both central ideas and details, although central ideas to a much greater extent.[8][10] Mapping appears to offer a greater benefit in subject areas more saturated with verbal knowledge, such as a lecture or discussion, as opposed to text or a list. There is no detectable loss of knowledge that could have been gained elsewhere, making concept mapping a strictly preferable medium.

Teachers

Teachers may be another medium for transferring critical thinking ability. When comparing teacher attributes with student cognitive outcomes, the highest correlations for attribute transfer were in creative and critical thinking abilities.[11]

Skills

Problem-based learning (PBL) is not significantly effective in improving critical thinking.[8]

PBL students demonstrate better understanding of the principles that link concepts (an effect almost twice as large as for application).[12]

"Given the strong evidence for its memorial benefits, many cognitive and educational psychologists now classify testing as among the most effective educational techniques discovered to date."[13]

Tests help with application and inference questions (effect size 0.30).[13]

There is substantial evidence that testing is better than restudying and rereading: test-enhanced learning is often *substantially better* than non-testing re-exposure conditions.[13]

Post-test, restudying the materials or providing other elaborative feedback is better.

Using both response congruency and elaborated retrieval practice yielded the best initial test performance. Simple correct-answer feedback was not associated with improved performance in any of the meta-analyses, which conflicts with other results in the educational science literature.[13]

PBL students are better at applying their knowledge. None of the studies reported significant negative findings.[9]

There are no significantly different effects on achievement between a single course and a curriculum-wide implementation, although there are some significant effects on knowledge in curriculum-wide implementations.[9]

Although PBL has a negative effect on knowledge (students do not know as many facts), their knowledge is more elaborated and thus better recalled.[9]

There is a tendency for negative effects of PBL on knowledge.[9]

The combined effect size for skills is moderate but of practical significance. The effect on knowledge, already described as non-robust, is also small and not practically significant.[9]

PBL has an extremely minor negative effect on concepts, but PBL students still perform at least as well as under conventional instruction.[12]

Formative assessments (as used in problem-based learning) also produce significant metacognitive gains (effect sizes typically between 0.4 and 0.7).[1]

Testing effects are larger following more difficult initial tests and longer intervals between retrievals: days to weeks rather than minutes. (Testing is distinct from studying.)[14]

Generally, more effortful or difficult tests (recall more than recognition) yield larger testing effects. Repeated testing is likely beneficial overall, but has rapidly diminishing returns.[14]

PBL outperforms traditional instruction both in improving test scores and in improving students' self-study, learning interest, team spirit, problem solving, analysis, scope of knowledge, communication, and expression.[15]

Effect sizes were generally larger for skills than for knowledge. Only group instruction failed to demonstrate consistently positive effects, which is incongruous with other studies.[3]

Increasing the allocation of curriculum time to physical education benefits multiple facets of student learning with varied effect: cognitive (d = 0.14), affective (d = 0.66), and psychomotor (d = 0.83). Additionally, learning across all three domains (d = 0.41) surpasses what students might achieve by allocating that time to practising standardized tests.[16]
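
The d values in these notes are standardized mean differences (Cohen's d). For reference, the standard two-group formula (a general definition, not tied to any one cited meta-analysis) is, in LaTeX:

  d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\mathrm{pooled}}},
  \qquad
  s_{\mathrm{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}

By Cohen's conventional benchmarks, d ≈ 0.2 is small, 0.5 moderate, and 0.8 large.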

Comprehension monitoring is a particularly important outcome of quizzing. Most students are remarkably poor at judging whether they have studied a piece of material well enough to have mastered it; this skill is called making "judgements of learning" (JOLs). (This is just problem-based learning.)[2]

Providing online quizzes does not seem to be more effective than other tactics such as assigning homework.[2]

Quiz taking and group discussion did not differ statistically, suggesting it is activity in general that mitigates loss in learning.[2]

Metacognition

Scaffolding helps here.

Learning styles fall entirely under metacognition.[1] The notion of learning styles tends to imply something fixed and unchanging.

Effect sizes for different types of intervention (Table 43):
  • Reinforcement: 1.13
  • Student's prior cognitive ability: 1.00
  • Instructional quality: 1.04
  • Direct instruction: 0.82
  • Student's disposition to learn: 0.61
  • Class environment: 0.56
  • Peer tutoring: 0.50
  • Parental involvement: 0.46
  • Teacher style: 0.42
  • Affective attributes of students: 0.24
  • Individualization: 0.14
  • Behavioral objectives: 0.12
  • Team teaching: 0.06

Active is better than passive: interventions involving "setting goals, choosing appropriate strategies, and monitoring progress" are more effective in improving knowledge outcomes than those which simply aim to engage learners at the level of presenting information for understanding and use.[1] Interventions targeting improved metacognition averaged a 26% gain (across 556 studies), compared with an average 5% gain (across 1,772 studies) for those without an explicit metacognitive component.[1]
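
For intuition, an effect size can be converted into a percentile shift via the standard normal cumulative distribution function \Phi (a common interpretive convention; whether the cited review computed its percentage gains this way is an assumption):

  \text{percentile gain} = \Phi(d) - 0.5

For example, d = 0.7 gives \Phi(0.7) - 0.5 \approx 0.26, i.e. a median student moving roughly 26 percentile points up the control-group distribution.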

Critical thinking involves metacognition (reflection and retrieval).[8]

Three levels of knowledge structure can be targeted by assessment of PBL:
  1. understanding of concepts,
  2. understanding of the principles that link concepts,
  3. linking of concepts and principles to conditions and procedures for application.
PBL performs best at level 2.[12]

All models agree that an organized, domain-specific knowledge base, together with the metacognitive functions that operate on that knowledge, is an essential part of successful problem solving.[12]

The clearest recommendation for practice is to incorporate mechanisms that promote student reflection on their level of understanding.[2]

Variety

There exists substantial evidence for specific strengths and weaknesses in visual, auditory, and kinaesthetic processing. However, it has not been established that matching *instruction* to these strengths and weaknesses is more effective than designing for all learners; most people simply have preferences that differ from one another.[1]

Blended instruction is better.[10]

Diverse teaching and learning methods are more effective than traditional approaches. [8]

Fully distance digital PBL is as good as, or better than, in-person (face-to-face) PBL.[17]

Pooled effect sizes favoring a technique:[3]
  • Range of difficulty (variety): 0.68
  • Distributed practice: 0.66
  • Interactivity: 0.65
  • Multiple learning strategies: 0.62
  • Individualized learning: 0.52
  • Feedback: 0.44
  • Longer time practicing: 0.34



Timing

Small breaks are better than immediate testing.[15]

Groups

Constructing maps in a group is far more effective than individual construction.[10]

Studying maps individually is more effective than studying them in a group.[10]

Group practice showed no significant effect.[3]

Myths

Both brain hemispheres are involved in all activities.[1]


Nature vs Nurture

Learning environments have considerable influence.[1]

There are no brain characteristics or personality traits so strongly determined by genes that they could explain a supposedly fixed nature of any cognitive style. Statements about the biological basis of learning styles have no direct empirical support.[1]




References

  1. Coffield, F., Moseley, D., Hall, E., Ecclestone, K. (2004). "Learning styles and pedagogy in post-16 learning: a systematic and critical review" (PDF). Learning and Skills Research Centre (Great Britain).
  2. Means, B., Toyama, Y., Murphy, R., Bakia, M., Jones, K. (May 2009). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. US Department of Education.
  3. Cook, D. A., Hamstra, S. J., Brydges, R., Zendejas, B., Szostek, J. H., Wang, A. T., Erwin, P. J., Hatala, R. (January 2013). "Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis". Medical Teacher. 35 (1): e867–e898. doi:10.3109/0142159X.2012.714886. ISSN 0142-159X.
  4. Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., Rohrer, D. (2006). "Distributed practice in verbal recall tasks: A review and quantitative synthesis". Psychological Bulletin. 132 (3): 354–380. doi:10.1037/0033-2909.132.3.354. ISSN 0033-2909.
  5. Kazdin, A. E., ed. (2000). Encyclopedia of Psychology. American Psychological Association; Oxford University Press. ISBN 9781557986504.
  6. Adesope, O. O., Trevisan, D. A., Sundararajan, N. (June 2017). "Rethinking the Use of Tests: A Meta-Analysis of Practice Testing". Review of Educational Research. 87 (3): 659–701. doi:10.3102/0034654316689306. ISSN 0034-6543.
  7. APA Dictionary of Psychology.
  8. Lee, J., Lee, Y., Gong, S., Bae, J., Choi, M. (December 2016). "A meta-analysis of the effects of non-traditional teaching methods on the critical thinking abilities of nursing students". BMC Medical Education. 16 (1): 240. doi:10.1186/s12909-016-0761-7. ISSN 1472-6920.
  9. Dochy, F., Segers, M., Van den Bossche, P., Gijbels, D. (October 2003). "Effects of problem-based learning: a meta-analysis". Learning and Instruction. 13 (5): 533–568. doi:10.1016/S0959-4752(02)00025-7. ISSN 0959-4752.
  10. Nesbit, J. C., Adesope, O. O. (September 2006). "Learning With Concept and Knowledge Maps: A Meta-Analysis". Review of Educational Research. 76 (3): 413–448. doi:10.3102/00346543076003413. ISSN 0034-6543.
  11. Cornelius-White, J. (March 2007). "Learner-Centered Teacher-Student Relationships Are Effective: A Meta-Analysis". Review of Educational Research. 77 (1): 113–143. doi:10.3102/003465430298563. ISSN 0034-6543.
  12. Gijbels, D., Dochy, F., Van den Bossche, P., Segers, M. (March 2005). "Effects of Problem-Based Learning: A Meta-Analysis From the Angle of Assessment". Review of Educational Research. 75 (1): 27–61. doi:10.3102/00346543075001027. ISSN 0034-6543.
  13. Pan, S. C., Rickard, T. C. (July 2018). "Transfer of test-enhanced learning: Meta-analytic review and synthesis". Psychological Bulletin. 144 (7): 710–756. doi:10.1037/bul0000151. ISSN 0033-2909.
  14. Rowland, C. A. (2014). "The effect of testing versus restudy on retention: A meta-analytic review of the testing effect". Psychological Bulletin. 140 (6): 1432–1463. doi:10.1037/a0037559. ISSN 0033-2909.
  15. Liu, L., Du, X., Zhang, Z., Zhou, J. (March 2019). "Effect of problem-based learning in pharmacology education: A meta-analysis". Studies in Educational Evaluation. 60: 43–58. doi:10.1016/j.stueduc.2018.11.004. ISSN 0191-491X.
  16. Dudley, D., Burden, R. (February 2020). "What effect on learning does increasing the proportion of curriculum time allocated to physical education have? A systematic review and meta-analysis". European Physical Education Review. 26 (1): 85–100. doi:10.1177/1356336X19830113. ISSN 1356-336X.
  17. Tudor Car, L., Kyaw, B. M., Dunleavy, G., Smart, N. A., Semwal, M., Rotgans, J. I., Low-Beer, N., Campbell, J. (28 February 2019). "Digital Problem-Based Learning in Health Professions: Systematic Review and Meta-Analysis by the Digital Health Education Collaboration". Journal of Medical Internet Research. 21 (2): e12945. doi:10.2196/12945. ISSN 1438-8871.