Assessment Challenges, Contradictions, and Inequities: An Analysis of the Use of Digital Technology and OALCF Milestones
Christine Pinsent-Johnson and Matthias Sturm, 2016

What was the project about? Why was the project done?
- Use Digital Technology Milestones are used far more than other Milestones.
- In PMS training sessions it was suggested that the way LBS programs report on Milestone data was not consistent.
- Three previous studies described challenges.
- The theory and methods could be a problem.

Our research questions
1. Why are digital technology Milestones used more often?
2. Why are other Milestones not as popular?
3. How do assessors and instructors understand and use results?
4. What practices do programs develop when using Milestones?

How did we collect our data?
1. Analysed OALCF documents and Milestones.
2. Surveyed 181 assessors (28% from school boards, representing 31% of learners).
3. Interviewed 26 coordinators, assessors and practitioners from six programs across the province, in all streams and sectors.
4. Analysed data from EOIS-CaMS (2013-14).

What are the main findings?
1. Unique design and generic content disconnect the Milestones from teaching and learning.
2. They are primarily used for compliance and not instruction.
3. Programs then rely on digital technology Milestones, which are more predictable.
4. Guidelines make the Milestones administrable but disregard teaching and learning.
5. Programs are impacted in different ways and have devised various strategies to mitigate impacts and show compliance.
Unique test design and content
Based on our analysis, we found that:
- The OALCF was developed in part based on IALS, the International Adult Literacy Survey.
- This sort of testing doesn't measure how or if people are learning literacy skills.
- It also doesn't consider the skills and abilities of those with low levels of education.
- Making a direct connection between program performance and results from international adult literacy surveys is an unachievable goal.

What did survey respondents say?
- Milestones are confusing.
- Milestones are too difficult for some or too easy for others.
- Learners can be confused by the instructions.
- Learners are not familiar with the content and kinds of questions asked.
- Milestones do not benefit learner goals, program purposes or the existing curriculum being used in the program.
- Milestones cannot provide useful information to assess learner progress.

What did the survey data indicate about the use of assessments in programs?
- Nearly all respondents (89%) continue to use other assessments.
- Nearly all (86%) continue to use other curriculum frameworks aligned with K-12 and traditional approaches.
- The OALCF is mainly assessment. ESKARGO and other documents from CESBA make the OALCF Curriculum Framework usable! They provide tangible guidance for program administrators and instructors to follow and adopt into programming.

"All the service providers were terrified that they were getting funding pulled if they don't get the numbers in and if they don't have the Milestones. So there is all that tension, implementing without the skill base to know what we are actually doing. This whole curriculum framework didn't change what we did at all. We changed the language but we still do the same things. We are using textbooks. We are teaching to specific skills. We always try to show you how to use the skills in the real world [...] So we are changing how we are reporting it, but the other information we want we are tracking ourselves." (Interview participant)
Survey respondents also said that they are experiencing a general sense of time pressure.

EOIS-CaMS data revealed that Milestones are used to meet the reporting minimum
- In 2013-14, 43,145 learners completed 66,775 Milestones, an average of roughly 1.5 Milestones per learner.

What did survey respondents say?
- "Milestones are often in too big a chunk for learners in the Secondary School goal path to show progress. There are so many skills needed to prepare for PLAR and the credit system."
- "Milestones aren't really related to what we teach to prepare them for college. Most will score the same on Milestones the day they start and the day they leave. They're not really a measure of progress, just a measure for the ministry to use."
- "Milestones are rarely applicable and seem like a 'hoop' we have to jump through; useless much of the time in relation to true goals, and even goal path."

EOIS-CaMS data shows assessors rely on digital Milestones
Most commonly completed Milestones (rates of OALCF Milestones completed, 2013-14):
1. Milestone 54 - Log into a user account on a computer: 12%
2. Milestone 55 - Conduct an Internet search: 5.5%
3. Milestone 57 - Begin to manage learning: 5.3%
4. Milestone 1 - Read a classified advertisement and an email: 3.4%
5. Milestone 4 - Read a detailed course description: 3.2%
6. Milestone 33 - Complete a somewhat complex form: 3.0%
7. Milestone 39 - Verify costs and make calculations: 3.0%
Digital Milestone use by LBS sector

Sector            Milestone 54              Milestone 55              Milestone 56
                  Selected     Completed    Selected     Completed    Selected     Completed
Community-based   84 (52%)     2,816 (36%)  72 (52%)     1,291 (35%)  23 (56%)     407 (33%)
College           41 (25%)     3,327 (42%)  9 (7%)       1,282 (35%)  13 (32%)     434 (35%)
School board      37 (23%)     1,723 (22%)  56 (41%)     1,099 (30%)  5 (12%)      403 (32%)
TOTAL             162 (100%)   7,866 (100%) 137 (100%)   3,672 (100%) 41 (100%)    1,244 (100%)

Comparison of Selection and Completion of Use Digital Technology Milestones by Sector (2013-14)
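The headline figures on these slides can be cross-checked with some quick arithmetic. The short Python sketch below was added for illustration and is not part of the original study; it uses only the counts reported above (43,145 learners, 66,775 completions, and the Milestone 54 and 55 totals from the table).

    # Illustrative check of the reported figures (not from the original analysis).
    learners = 43_145                # learners with Milestones recorded in EOIS-CaMS, 2013-14
    completions = 66_775             # total Milestone completions, 2013-14
    ms54_completed = 7_866           # Milestone 54 completions (TOTAL row above)
    ms55_completed = 3_672           # Milestone 55 completions (TOTAL row above)

    print(round(completions / learners, 1))               # 1.5 Milestones per learner
    print(round(100 * ms54_completed / completions))      # 12  -> % of all completions
    print(round(100 * ms55_completed / completions, 1))   # 5.5 -> % of all completions

Run as-is, it reproduces the 1.5 Milestones-per-learner average and the 12% and 5.5% completion rates quoted for Milestones 54 and 55.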

Our analysis explains why digital Milestones are different.

What did the survey and interview data reveal about the digital tech Milestones?
- The digital technology Milestones are appealing to learners, easy to use, and predictable.
- They can be adapted to ensure the use of truly authentic and skill-appropriate texts and activities, particularly for learners with limited literacy skills, knowledge and strategies.
- They align with actual activities that learners are engaged in (learners recognize texts used for testing purposes).
- They can be used in blended learning programs and computer courses.
- They introduce learners to the Milestone testing scheme and the concept of competencies.
- They ensure compliance.

Guidelines make the Milestones administrable but not teachable
Only assessors can see the Milestones. Instructors cannot:
- analyse Milestone content to adequately prepare their learners for the Milestones and use this information to plan lessons for learners;
- provide additional information about the content and test instructions (they are not allowed to add content or information during testing);
- see detailed results that could be used to help learners complete unsuccessful Milestones (they are not allowed to share results with learners).
Do all learners experience these impacts?
[Chart: selection and completion rates (0-20%) for Milestones 1, 54, 55 and 57]

Do all learners experience these impacts? Milestone 54
[Chart: Milestone 54 selection and completion rates (0-35%) by learner education level, Grades 0-8 through Grade 12]

Education levels by sector
[Chart: learner education levels (Grades 0-8, Some High School, Completed High School, Some or Completed PSE) across community-based, school board and college programs and the province overall]

Strategies to mitigate impacts and show compliance

"Milestones are silly in their present form... I appreciate the need for a structured and consistent method of evaluation, but the Milestones miss the mark. I generally regard them as a means to an administrative end rather than an educational tool. I never select Milestones at intake to get service plans open ASAP, but I also rarely select them in the way that they are intended (select a planned start date, end date, mark the actual start and finish as the student progresses, etc.). I get my students to attempt a Milestone and enter it if they complete it. If they aren't successful, I don't record it since it is a waste of my time and will reflect negatively on my program." (Interview participant)

What a large program did (case study)
- Work together
- Hire extra admin. staff
- Shift extra work to admin. staff
- Embed Milestones into existing activities
- Keep learning activities the same
- Downplay the meaning and importance
- Not spend time preparing learners
- Milestones embedded in a course

What a small program did (case study)
- Work independently to develop a strategy
- Change and adapt regular curriculum planning
- Spend time preparing students and developing additional activities to help complete Milestones
- Questioned her strategies and expertise
Feedback from a School Board
- Offsetting learners who need more time to complete a Milestone in their program
- Focus on developing short-term computer-related activities with employment goals
- Use of digital Milestones to report progress

"We're bringing in people who can show progress more efficiently. Learners have to be able to complete a Milestone. We're just bringing in anyone who meets this criteria for learning. The problem with a lot of the Milestones, unless you have a person working on those exact skills, they really don't fit. The Milestones are very limited." (Interview participant)

Conclusion: The contradictions
1. Results are primarily used to provide data rather than assess learning
2. Too difficult, too easy
3. High-stakes, low-stakes
4. No alignment with other systems
5. The most commonly used Milestone results are useful for programs but do not provide objective and standardized results to MAESD (formerly MTCU)

Conclusion: The inequities
1. Extra work and effort
2. Unfair assessment
3. Interference with existing curriculum
4. Disconnection of the LBS system from provincial education and training initiatives

Implications for the LBS system
- Milestones work counter to LBS objectives
- Compliance-centred, not learner- and learning-centred
- Programs can't show actual progress and learner accomplishments
- Undermines the aim to ensure accountability to all stakeholders
- Interferes with program aims to help learners transition
- Not appropriate for learners with the least amount of education and/or with disabilities
- LBS programs are being held accountable for things they don't actually do

Discussion
- What are your conclusions about the use of the Milestones?
- What resources and forms of knowledge building have you found most useful?
- What are the benefits to and barriers for your learners?

Your recommendations
- For CESBA and each other
- For researchers like us
- For the LBS Program and MAESD
- For other stakeholders

Our recommendations
- Do not connect funding to results.
- Do not rely on international literacy testing methods and results to build the effectiveness measure in the PMF.
- Review a complex and inconsistent assessment system.
- Ensure appropriate, fair, consistent and meaningful assessment by using experts who will not profit.
- Involve practitioners in a meaningful way in future re-design efforts.

Moving forward
- You can use the research report and results to support your decisions.
- You can also use the findings (and slides) to explain your concerns to your ETC.
- Do you have other suggestions about what we can do moving forward?

More information
- Research Overview: Assessment Challenges, Contradictions and Inequities: An analysis of the use of digital technology and OALCF Milestones
- Literacy and Basic Skills (LBS) Program Data
- Lessons Learned From Analysing the OALCF Use Digital Technology Milestones
- Practices Developed When Using the OALCF Milestones
Download at AlphaPlus.ca
