Session 8

=March 16, 2010=

Ch. 7 PowerPoint

Differentiation

Ch. 8 PowerPoint

How will we know if students have achieved the desired results and met the expectations? What will we accept as evidence of student understanding and proficiency? What is evidence of in-depth understanding as opposed to superficial or naive understanding? What kinds of assessment evidence will anchor our curricular units and thus guide our instruction?

The backward design approach encourages us to think about a unit in terms of the collected assessment evidence needed to document and validate that the desired learning has been achieved, so that the course is not just content to be covered or a series of learning activities. This approach encourages teachers and curriculum planners to **first think like an assessor before designing specific units and lessons, and to consider up front how they will determine whether students have attained the desired understandings.**

Because understanding develops as a result of ongoing inquiry and rethinking, the assessment of understanding should be thought of as a collection of evidence over time rather than a single event or test at the end of instruction. The unit will be anchored by performance tasks or projects that provide evidence that students are able to use their knowledge in context, a more appropriate means of evoking and assessing enduring understanding. Traditional assessments round out the picture by assessing the essential knowledge and skills that contribute to the culminating performances.

To think like an assessor before designing lessons, as backward design demands, does not come naturally or easily to many teachers. We are far more used to thinking like an activity designer once we have a target. We easily and unconsciously jump from Stage I to Stage III of the backward design process, from content expectations to the design of lessons, without asking ourselves whether we will have the evidence we need to assess for the desired knowledge and skills.
**Stage II: Determine Acceptable Evidence**

When planning to collect evidence of understanding, teachers should consider a range of assessment methods. Assessment should also include a culminating performance that demonstrates evidence of understanding of the concepts of the unit. The culminating performance is designed using the acronym **GRAPE**:

 * **G**oal
 * **R**ole and situation
 * **A**udience
 * **P**roduct and presentation
 * **E**vidence of learning

Thinking like an assessor addresses two basic questions: Where should we look to find hallmarks of understanding, and what should we look for in determining and distinguishing degrees of understanding? We need to consider the necessary evidence in general, the kinds of performance or behavior indicative of understanding, and to focus on the most salient and revealing criteria for identifying and differentiating levels or degrees of understanding.

| Thinking Like an Assessor | Thinking Like an Activity Designer |
| --- | --- |
| What would be sufficient and revealing evidence of understanding? | What would be interesting and engaging activities on this topic? |
| What performance tasks must anchor the unit and focus the instructional work? | What resources and materials are available on this topic? |
| How will I be able to distinguish between those who really understand and those who don't (though they may seem to)? | What will students be doing in and out of class? What assignments will be given? |
| Against what criteria will I distinguish work? | How will I give students a grade (and justify it to their parents)? |
| What misunderstandings are likely? How will I check for those? | Did the activities work? Why or why not? |

(Source: Wiggins & McTighe, 1998, Understanding by Design)
**Types of Assessment**

**Quiz & Test Items** — These are simple, content-focused questions. They...

 * Assess for factual information, concepts, and discrete skills.
 * Use selected-response or short-answer formats.
 * Are convergent; typically they have a single, best answer.
 * May be easily scored using an answer key (or machine scoring).
 * Are typically secure (not known in advance).

**Academic Prompts** — These are open-ended questions or problems that require the student to think critically, not just recall knowledge, and then to prepare a response, product, or performance. They...

 * Require constructed responses under school or exam conditions.
 * Are open; there is not a single, best answer or a best strategy for answering or solving them.
 * Often are ill-structured, requiring the development of a strategy.
 * Involve analysis, synthesis, or evaluation.
 * Typically require an explanation or defense of the answer given or methods used.
 * Require judgment-based scoring based on criteria and performance standards.
 * May or may not be secure.

**Performance Tasks & Projects** — As complex challenges that mirror the issues and problems faced by adults, they are authentic. Ranging in length from short-term tasks to long-term, multistaged projects, they require a production or performance. They differ from prompts because they...

 * Feature a setting that is real or simulated: one that involves the kind of constraints, background noise, incentives, and opportunities an adult would find in a similar situation.
 * Typically require the student to address an identified audience.
 * Are based on a specific purpose that relates to the audience.
 * Allow the student greater opportunity to personalize the task.
 * Are not secure; task, criteria, and standards are known in advance and guide the student's work.