Assessing Model-Based Reasoning Using Evidence-Centered Design
Details
This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure yet crucial science education standards. Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is therefore central to science education, and thus to science assessment. Interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified modeling as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks needed to assess model-based reasoning in its fullest forms are difficult to develop.

Building on research in assessment, science education, and the learning sciences, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based reasoning: Model Formation, Model Use, Model Elaboration, Model Articulation, Model Evaluation, Model Revision, and Model-Based Inquiry. Each design pattern lays out considerations concerning targeted knowledge and ways of capturing and evaluating students' work. These design patterns are available at http://design-drk.padi.sri.com/padi/do/NodeAction?state=listNodes&NODE_TYPE=PARADIGM_TYPE. The ideas are illustrated with examples from existing assessments and the research literature.
- Helps integrate standards, instruction, formative assessment, and accountability measures
- Builds on principles of evidence-centered assessment design
- Provides links to interactive, online versions of design patterns
- Includes supplementary material: sn.pub/extras
About the Authors
Robert J. Mislevy, PhD, is Frederic M. Lord Chair in Measurement and Statistics at the Educational Testing Service in Princeton, New Jersey, and Professor Emeritus at the University of Maryland.
Geneva D. Haertel, PhD, is Director of Assessment Research and Design at the Center for Technology in Learning, SRI International.
Michelle M. Riconscente, PhD, is Director of Learning and Assessment at GlassLab in California.
Daisy Wise Rutstein, PhD, is Education Researcher in the Education Division at SRI International.
Cindy S. Ziker, PhD, MPH, is Senior Researcher for Assessment in the Education Division of the Center for Technology in Learning at SRI International.
Contents
Preface.- Introduction.- Model-Based Reasoning.- Evidence-Centered Assessment Design.- Design Patterns for Model-Based Reasoning.- Model Formation.- Model Use.- Model Elaboration.- Model Articulation.- Model Evaluation.- Model Revision.- Model-Based Inquiry.- Conclusion.- References.- Appendix: Summary Form of Design Patterns for Model-Based Reasoning.
Further Information
- General information
- GTIN 09783319522456
- Language English
- Edition 1st ed. 2017
- Dimensions H 218 mm × W 131 mm × D 9 mm
- Year 2017
- EAN 9783319522456
- Format Softcover
- ISBN 978-3-319-52245-6
- Title Assessing Model-Based Reasoning Using Evidence-Centered Design
- Authors Robert J. Mislevy, Geneva Haertel, Michelle Riconscente, Daisy Wise Rutstein, Cindy Ziker
- Subtitle A Suite of Research-Based Design Patterns
- Weight 248 g
- Publisher Springer-Verlag GmbH
- Number of pages 130
- Reading motive Understanding
- Genre Linguistics & Literature