Research in the field of education suffers from pervasive terminological vagueness. Instructors often conduct education research using their own course materials; however, without a shared, well-defined vocabulary, the field lacks a coherent body of research. There are more than 70 different models of how learning works. The sheer number of conflicting word definitions and conceptual vagueness is
- "both bewildering and off-putting to practitioners and to other academics who do not specialize in this field."
Learning is defined as a gain in knowledge or skills. Learning differs from memory chiefly in timescale: learning can be a slow and laborious process, whereas memory is a near-instant expression. Education aims to improve meaningful learning, not merely to develop test-taking abilities.
- Knowledge is defined as the state of being familiar with something or aware of its existence.
- Critical thinking is defined as an intentional and self-regulatory judgement. It produces explanations of whether the evidence for a specific judgement is appropriate.
- Skills are defined as the practical application of knowledge.
Metacognition is defined as the awareness and conscious use of the psychological processes involved in perception, memory, thinking and learning.
Metacognition is thinking about how thinking works. Using the learning techniques described here is itself an example of metacognition, and they all have metacognitive components. Explicitly targeting metacognition produced an average 26% increase in academic scores (556 studies), compared to an average of 5% without an explicit metacognitive component (1772 studies).
Metacognition involves setting goals, choosing strategies, and monitoring progress; it is not passively sitting through lecture info-dumps. The most definitive recommendation for improving student understanding is incorporating strategies that make students reflect on their level of understanding. All learning models agree that metacognition is essential for successful problem solving.
Critical thinking enables individuals to decide what to believe in a given context. It is a non-linear, circular process encompassing interpretation, analysis, evaluation, and inference across evidential, conceptual, methodological, referential, and contextual aspects of judgement.
Concept mapping reliably improves critical thinking abilities. Concept maps are node-link diagrams in which meaning is signaled by node proximity, shape, and color. They work as effectively as or better than other stand-alone information sources. The improvement in critical thinking ability held across a large range of education levels and subjects studied.
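A concept map's node-link structure can be sketched as a small labeled graph. The class, the relation labels, and the example concepts below are illustrative assumptions for the sketch, not part of the research summarized here.

```python
# A minimal sketch of a concept map as a node-link graph.
# Node names and link labels are illustrative placeholders.

class ConceptMap:
    """Concept map: nodes are concepts, labeled links carry the relation."""

    def __init__(self):
        self.links = []  # list of (source, relation, target) triples

    def add(self, source, relation, target):
        self.links.append((source, relation, target))

    def neighbors(self, concept):
        """Concepts directly linked to `concept` (in either direction)."""
        out = set()
        for s, _, t in self.links:
            if s == concept:
                out.add(t)
            elif t == concept:
                out.add(s)
        return out


cmap = ConceptMap()
cmap.add("learning", "produces", "knowledge")
cmap.add("knowledge", "applied as", "skills")
cmap.add("metacognition", "monitors", "learning")

print(sorted(cmap.neighbors("learning")))  # ['knowledge', 'metacognition']
```

Constructing such a map forces the learner to make relations explicit, which is the metacognitive reflection step the research points to.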
The spatial organization of nodes, akin to visual chunking, yields efficiencies that cannot be obtained from text. There are detectable benefits to studying maps rather than outlines or lists, and to constructing maps rather than reading text, attending lectures, or attending class discussions. The greatest benefits arise when individuals create the maps themselves and when groups study concept maps together. Individuals who created the maps themselves gained roughly twice the benefit of individuals who merely studied maps. These results are potentially due to individuals adding a metacognitive reflection step, or to groups adding discussion-driven reflection on top of memory retrieval.
Concept mapping improves retention of both central ideas and details, although central ideas to a much greater extent. Mapping appears to offer a greater benefit in subject areas saturated with verbal knowledge, such as lecture or discussion, as opposed to text or a list. There is no detectable loss of knowledge that could have been gained elsewhere; concept mapping is an entirely preferable medium.
Teachers themselves may be another medium through which critical thinking transfers. When comparing teacher attributes and student cognitive outcomes, the highest correlations for attribute transfer were in creative and critical thinking abilities.
Problem-based learning (PBL) improves the ability to apply knowledge. It commonly exists in the forms of homework, quizzes, and tests. Many cognitive and educational psychologists classify PBL among the most effective educational techniques.
More specifically, students in this paradigm demonstrate a better understanding of the principles that link concepts (about twice the gain seen for practical applications). PBL does not improve critical thinking. There is also a conflicting tendency to report minor negative effects of PBL on knowledge, although the technique still performs as well as or better than traditional lecture-based instruction. Even though PBL possibly reduces the total amount of knowledge acquired, the knowledge that is applied becomes more intricately elaborated.
An important outcome of PBL is comprehension monitoring: most students are remarkably poor at judging whether they have studied material well enough to master it. There is substantial evidence that testing, quizzing, and homework outperform restudying and rereading. Quizzes do not seem to be more effective than homework. Generally, more difficult or effortful initial tests and longer intervals between tests are better. Repeated testing is likely beneficial on the order of days to weeks, but has diminishing returns. Following a test with restudy of the material or with feedback is better than testing alone.
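The test-then-restudy cycle described above can be sketched as a short loop: quiz first, then restudy and retest only the items that were missed. The quiz items, the grading function, and the round limit are hypothetical placeholders for the sketch.

```python
# A minimal sketch of comprehension monitoring via retrieval practice:
# test first, then restudy only what was missed (the testing effect).
# Items, answers, and the round limit are illustrative assumptions.

def quiz(items, answer_fn):
    """Return the subset of items answered incorrectly."""
    missed = []
    for question, correct in items:
        if answer_fn(question) != correct:
            missed.append((question, correct))
    return missed

def study_cycle(items, answer_fn, max_rounds=3):
    """Alternate testing and restudy of missed items until mastery."""
    for _ in range(max_rounds):
        missed = quiz(items, answer_fn)
        if not missed:
            return True          # every item recalled correctly
        items = missed           # restudy and retest only the misses
    return False

facts = [("2+2", "4"), ("capital of France", "Paris")]
knows_all = lambda q: {"2+2": "4", "capital of France": "Paris"}[q]
print(study_cycle(facts, knows_all))  # True
```

The point of the loop is the monitoring step: the quiz itself reveals what the learner has not yet mastered, which self-assessment alone does poorly.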
It is important to keep in mind that PBL is only one part of a multifaceted education curriculum. Recently, standardized testing has begun sapping time allotted to physical education. Yet physical education outperforms standardized testing, with observable practical significance, at improving cognitive, emotional, and psychomotor abilities.
Additional variables to consider (in order of importance)
The brain distributes neural activity. Cognitive functioning exists as self-sustained cyclic patterns of activation, otherwise known as neural networks. Regions implicated in creating associations, processing environments, and memorization form the most highly interconnected crossroads of neural circuitry. Nontraditional techniques that encourage more associations and more environments are better.
- Making instructional material as multi-sensory as possible outperforms tailoring it to specific learning styles. Individual preferences can still exist, but learning styles are not fixed and unchanging; they develop as the individual does.
- Mixing question formats (multiple choice plus writing), varying the number of practice tests, and mixing practice and final test formats improve information retention the most.
- Presenting social and emotional education alongside cognitive education produced an 11 percentile-point academic gain.
- Blended instruction, partly in-person and partly online, clearly outperforms in-person-only instruction.
Spacing study sessions has a large effect on retention of verbal material and skills. Cramming for a test may provide a boost for the next hour, but over the span of a semester-long course it will not help. For most practical purposes in the context of life-long learning, the optimal spacing interval will likely be more than one day. The average academic benefit of this practice is +15%, and it holds for children as well as adults.
Generally, when retention is measured up to a month later using the same studied materials, a break of one day outperforms all other timings, including breaks of 1-15 minutes and of 1-15 days. For retention measured from 30 days out to 2900 days, intervals of 2-28 days perform better. Optimal spacing increases as the targeted retention period increases; relearning a topic reinforces the information further.
Unfortunately, it is not possible to say how long study sessions should be, or exactly how study should be distributed to optimize long-term retention. What is clear is that distributing study time over multi-day periods greatly improves the amount of material retained for sizable periods of time.
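One way to operationalize multi-day distributed study is an expanding review schedule, where gaps grow as the targeted retention period lengthens. The one-day starting gap and the doubling factor below are assumptions consistent with the "more than one day" guidance above, not values prescribed by the research.

```python
# A minimal sketch of an expanding review schedule. The starting
# one-day gap and the 2x growth factor are illustrative assumptions.

from datetime import date, timedelta

def review_dates(start, n_reviews, first_gap_days=1, growth=2.0):
    """Schedule reviews with gaps that expand over successive sessions."""
    dates, gap = [], float(first_gap_days)
    current = start
    for _ in range(n_reviews):
        current = current + timedelta(days=round(gap))
        dates.append(current)
        gap *= growth  # widen the gap as retention strengthens
    return dates

schedule = review_dates(date(2024, 1, 1), 4)
# gaps of 1, 2, 4, 8 days -> Jan 2, Jan 4, Jan 8, Jan 16
print(schedule)
```

Any monotonically expanding schedule fits the finding that optimal spacing grows with the desired retention interval; the exact growth rate remains an open question in the literature.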
Group practice and group instruction fail to demonstrate consistent positive effects; where positive effects do appear, they arise from small groups. Peers assisting their colleagues in learning perform comparably to faculty members' teaching.
The differences in terminology between education research and neuroscience research contribute an additional layer of communicative ambiguity. Neuroscience is rarely included in the training of teachers. Oversimplification is fertile ground for introducing biases from which misunderstanding develops.
| Myth | Reality |
| --- | --- |
| We only use 10% of our brain. | No. |
| Individuals learn better when they receive information in their preferred learning style. | The brain's interconnectivity makes this assumption unsound. Preferences can exist, but multiple sensory modes support better learning. |
| Differences in hemispheric dominance (left/right brain) can help explain individual differences amongst learners. | Some brain processes have areas with additional neural activity associated with them; however, the brain distributes all activities entirely. 'Hot spots' on brain scan images represent a statistical map where activity has exceeded an arbitrary threshold. |
| Drinking less than 6-8 glasses of water a day can cause the brain to shrink. | Dehydration can influence cognitive function, but there is no evidence of under-performance in school children who fail to meet this target. |
| Multiple intelligences | This is an untestable hypothesis. The general processing complexity of the brain makes it unlikely that anything resembling this can ever be used to describe it; it seems neither accurate nor useful to reduce the vast range of individual differences to any limited number of capabilities. |
| Learning problems associated with developmental differences cannot be remediated with education. | Worldwide surveys indicate that teachers who believe in a biologically programmed intelligence feel less able to help their students, and, even unintentionally, help them less. Neural circuits develop at different rates until early adulthood. |