Reasoning by analogy generally involves abstracting away the details of a particular set of problems and recognizing structural similarities between problems that previously seemed distinct. Analogical reasoning refers to this process of recognition followed by applying the solution from the known problem to the new one; the technique is often called case-based reasoning. Analogical learning typically involves developing a set of mappings between the features of two instances. Paul Thagard and Keith Holyoak have developed a computational theory of analogical reasoning that is consistent with the outline above, provided that abstraction rules are supplied to the model.
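To make the "mapping between features of two instances" concrete, here is a minimal sketch of analogical mapping in Python. It is not Thagard and Holyoak's actual model (their systems use constraint-satisfaction networks); it is a toy illustration under simplified assumptions, with hypothetical helper names like `best_mapping`: each case is a set of relational facts, and we brute-force the entity correspondence that preserves the most relations, then transfer the leftover relation as the analogical inference.

```python
from itertools import permutations

# Toy case-based/analogical mapping sketch (illustrative only).
# A "case" is a set of relational facts (relation, arg1, arg2).

def score_mapping(source_facts, target_facts, mapping):
    """Count how many source relations survive translation into the target."""
    translated = {
        (rel, mapping.get(a, a), mapping.get(b, b))
        for rel, a, b in source_facts
    }
    return len(translated & set(target_facts))

def best_mapping(source_facts, target_facts, source_entities, target_entities):
    """Brute-force the entity correspondence with the highest relational overlap."""
    best, best_score = {}, -1
    for perm in permutations(target_entities, len(source_entities)):
        candidate = dict(zip(source_entities, perm))
        s = score_mapping(source_facts, target_facts, candidate)
        if s > best_score:
            best, best_score = candidate, s
    return best, best_score

# Source case: the solar system (the classic analogy for the atom).
source = [("attracts", "sun", "planet"), ("revolves_around", "planet", "sun")]
# Target problem: the atom, where only one relation is known so far.
target = [("attracts", "nucleus", "electron")]

mapping, score = best_mapping(source, target,
                              ["sun", "planet"], ["nucleus", "electron"])
print(mapping)  # {'sun': 'nucleus', 'planet': 'electron'}
# The unmatched source relation, translated through the mapping, becomes
# the analogical inference: "electron revolves_around nucleus".
```

The transferred relation is exactly the "solution from the known problem applied to the new problem" described above; real models add structural constraints and semantic similarity rather than raw overlap counting.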
Analogy is a powerful cognitive mechanism that people use to make inferences and learn new abstractions. The history of work on analogy in modern cognitive science is sketched, focusing on contributions from cognitive psychology, artificial intelligence, and philosophy of science. This review sets the stage for the 3 articles that follow in this Science Watch section.
Many professors rely on analogy, metaphor, and over-generalization to make complex material learnable, and it is a skill taught in graduate schools across the world. I've taught over 80 graduate classes, and the only effective way I've found to teach complex topics like statistics and databases is through analogies that build on prerequisite conceptual-framework courses such as "Algorithms" and "Data Structures".
Teaching by analogy and over-generalization is an integral part of the learning process, especially for complex concepts, and academic research supports the long-held belief that children learn complex concepts best through analogy and over-simplification.
"Ontogeny recapitulates phylogeny" is the analogy (untrue, by the way) that a developing zygote or fetus replays its own evolutionary stages.
A quote from William Wordsworth:
Science appears as what in truth she is,
Not as our glory and our absolute boast,
But as a succedaneum, and a prop
To our infirmity.