Manitoba Math Meeting #mbmath

The Department of Education in the province of Manitoba asked every division to send one or two representatives to a two-day meeting on numeracy in Manitoba. What follows is an archive of the tweets I sent on the first day, May 27th, 2013.

  1. Starting 2days w/ Manitoba Ed on numeracy, data, and “shifting the dialogue”. Reps from each division were asked to attend. #mbedu #mathchat
  2. Day’s agenda will include PISA, PCAP, district results, and the new curriculum revealed.
  3. “We really need to look at things differently” a phrase used 4x in past 5 minutes.
  4. Here’s what “doing things differently” means to Manitoba Ed.
  5. Tweets from today with MB ed and math meeting will have hashtag of #mbmath. Follow along!
  6. “Where are we now” 1st Q… Assessment branch rep to show PCAP results next. #mbmath
  7. PISA, PCAP, and provincial assessment data being shared. #mbmath
  8. PISA 2009: “Manitoba more equity in math performance compared to most OECD countries” #mbmath
  9. @tjthiessen thanks for tweeting about this, will be following along #mbmath
  10. Interesting 2 day agenda. Hoping device batteries last! RT@miken_bu: thanks for tweeting about this, will be following along #mbmath
  11. PISA 2009 Manitoba French vs English, female vs male scores no statistically significant difference #mbmath
  12. Ken Clark from MB Assessment Branch talking about math PISA trends. #mbmath
  13. #mbmath PISA results from 2003 to 2009, avge went from 528 to 501. Ken says “statistically significant drop”. Here we go…
  14. I will reserve opinions for blogging later, so my tweets today at #mbmath are to convey info to Tweeps. Want to pick this apart badly tho!
  15. On to PCAP results. 2007 was 13yr olds, 2010 grade 8 students “to connect tchrs to student data” (K Clark) #mbmath
  16. “PISA and PCAP (old SAIP) have similar Q format” (K Clark) #mbmath
  17. PCAP 2010 math cohort will be PISA 2012 math cohort. Clark suggesting tying data together. #mbmath
  18. PCAP english results dropped from 479 to 468 (2007 to 2010) “statistically significant” (K Clark) #mbmath
  19. Comparison of PISA and PCAP proficiency 2003 to 2010 (K Clark) #mbmath
  20. Moving on to MB grade 7 math results 2008-2012 “largely stable” (K Clark) #mbmath
  21. “Agreement between PCAP and provincial gr.7 math assessment” (K Clark) #mbmath
  22. MB below, approaching, meeting gr.7 math (K Clark) #mbmath (resisting urge to tweet my opinion getting tough!)
  23. “Shape and space strand shows biggest drop” (K Clark) #mbmath
  24. Grade 12 applied math results “fairly stable” (K Clark) #mbmath
  25. Previous slide missing column titles 4 Essential, Applied and Precalc “can’t remember which is which but all fairly stable” (Clark) #mbmath
  26. “Problem solving on report card is what makes math matter” (Clark) #mbmath
  27. @jcordovasjsd #mbmath no chance for talk yet, Ken Clark whipping thru data slides with almost no Q&A though he’s willing to answer Qs
  28. And data whirlwind of slides over at #mbmath, now “PCAP 2010 Contextual Report on S math achievement” being handed out.
  29. Wonder what @pasi_sahlberg would say if he were in the #mbmath meeting today and tomorrow? Hmm…
  30. Looking at PCAP variables “associated with higher achievement in math” #mbmath
  31. “We need to look carefully at the context and not jump to conclusions” (Aileen Najduch, Assistant Deputy Minister of Ed) #mbmath
  32. “Cannot assume cause and effect” (Najduch) #mbmath
  33. Time breakdown (weekly math time) compared to PCAP scores… #mbmath
  34. Ken Clark reminding #mbmath that PCAP and PISA are surveys of Ss, not all Ss participate. Reminder to be mindful of this when looking @ data
  35. Clark: “I don’t like the language of ‘robust’ variables” #mbmath
  36. No time yet! They’re going mile-a-minute w/ info!! RT@miken_bu: go ahead, tweet your opinions? #mbmath
  37. Aileen Najduch highlighting Ontario Ed and numeracy blocks may be worthy of table discussions at #mbmath
  38. “Sometimes helpful to look at top performing jurisdictions: ON, AB, QUE” (Aileen Najduch) #mbmath
  39. Q from audience – any national assessments comparing 8-10yr olds? A–no. #mbmath
  40. Talk time to begin now… Tweets on hold for table discussion! #mbmath
  41. Warning – opinion tweets on #mbmath for the next little bit! 1st thot – “equity results for MB not stat significant” Disagree!
  42. #mbmath elephant in the room not being discussed (yet? I remain hopeful) – Aboriginal population and supports
  43. #mbmath 2nd elephant in the room – data discussion not useful when administration of assessments inconsistent
  44. #mbmath 3rd elephant in the room – pedagogy results from PCAP survey are just that … A Survey, not indicative of research best practice.
  45. #mbmath table discussion looking at PCAP results tied to conventional assessment = higher score on PCAP? Causality or no? harumph
  46. Q at #mbmath table discussion: why only 2 provinces fully participated in PCAP? Effect on results…
  47. I need a clone to do tweeting at #mbmath session I’m in. Sorry tweeps lots of table convo happening!
  48. #mbmath Student PCAP survey – perception among higher scoring students that more tests, quizzes given in class. *perception*
  49. #mbmath Teacher PCAP survey perception – that heterogeneous classes produce lower test results. *perception*
  50. #mbmath only MB and PEI had 100% PCAP participation. MB highest rate of exclusion.
  51. Aileen Najduch asking us to raise hands if we’re uncomfortable looking at PCAP and PISA results at #mbmath meeting. Hmm, truthful response?
  52. #mbmath I’m comfortable with the discussion. (For the record). I’m *frustrated* with our Minister’s perception of the data and … (1 of 2)
  53. #mbmath … frustrated with lack of discussion of elephants. Hopeful (delusional?) it will occur in pm… Lunch break! Back at 1pm
  54. #mbmath Website where PCAP “Contextual Report on Student Achievement in Math” is housed found at
  55. #mbmath Actual report link used during morning discussion at… (warning, BIG pdf!)
  56. #mbmath Our table group given Assessment category, variable “methods of classroom assessment”, and results from S survey (PCAP participants)
  57. #mbmath Morning table groups were each given one category and variable, with the results from the “Contextual Report on S Achievement in M”
  58. #mbmath S survey linked to PCAP results. Ss who responded mostly “conventional” assessments used (tests, quizzes) scored higher on PCAP.
  59. #mbmath Table discussed dislike of “conventional” and “unconventional” labels – how does this solidify perception in Ss, Ts, public?
  60. #mbmath PCAP definition of “unconventional” assessment included self-assessment, peer assess, journals, portfolios, group work.
  61. #mbmath Table group wanted student population data coordinated with survey and PCAP results.
  62. #mbmath Table group brought up 4th elephant in the room: student engagement. How does this affect PCAP results? “It’s not for marks, so…”
  63. #mbmath Finding it frustrating that hashtag not highlighted by organizers at event. Wishing more tweeters in room would join conversation.
  64. #mbmath Wondering abt dynamics of an Assistant Deputy Minister of Education asking everyone in the room to “self proclaim”, raising hands…
  65. #mbmath …in answer to Q where clearly there is an expected A. Wondering if twitter convo wld be different than room convo…
  66. #mbmath Multiple people mentioning work of EQAO while same ppl talking about not having pendulum swing. Disconnect!
  67. #mbmath Lots of emphasis on “balance” – in instruction, assessment, approaches. Unspoken fears in the room about possible pendulum swing…
  68. @jcordovasjsd I found it ironic this particular slide was opener to 1min table conversation. Irony of “outdated expect’ns” fr 1997! #mbmath
  69. @joevans as “Maple God”, & knowing u r at #mbmath session w/me tmrw, is there a way to access “old Maple stuff” from previous division (1of2
  70. #mbmath session reconvening with Ken Clark from Assessment Branch handing out envelopes of divisional data from gr.3/4 and MY assessments
  71. #mbmath each division to now discuss individual divisional math data to identify strengths and needs, based on data provided.
  72. #mbmath Divisional discussion also to include Qs on instructional practice (has this shifted to affect data? should it?).
  73. #mbmath Divisional discussion also to include Qs on assessment practice (has this shifted to affect data? should it?)
  74. Hi @WLW21st ! If u want to get a feel for what’s going on so far, follow the #mbmath hashtag. I’m collecting copies of handouts for u. 🙂
  75. “There’s more to how you assess & how you instruct, than just this gr.3/4 and MY data, that affects S performance.” (K.Clark) #mbmath
  76. #mbmath though I’m a table group of one to discuss our district data, convo going really well! (Grin)
  77. #mbmath conversation with consultant from Dept of Ed – if we’re using research based instruct’n, assessm’t but results r “stable”, ask why!
  78. #mbmath “What does a ‘good math class’ look like based on best research?” (L Girling, Dept of Ed)
  79. #mbmath Sharing what other divisional assessment data tools r used. Wpg Div’n sharing “data rich”, Nto6 assessment “Learning Pathway”.
  80. #mbmath Wpg Div’n sharing gr.8 common exam, different data than what gr.6 competencies show.
  81. #mbmath Whiteshell Div’n looking for baseline K-6 assessment, and SY too.
  82. #mbmath Western Div’n working with MY & SY PLCs, common assessments.
  83. #mbmath Turtle River tracking K-6, starting discussions on this.
  84. #mbmath Turtle Mtn tracking more literacy, less numeracy, looking @ better numeracy thru grade grp mtgs.
  85. #mbmath Swan Valley doing gr2,4,6,8,10 literacy tracking.
  86. #mbmath Sunrise working on problem posing, problem solving rubrics and exemplars.
  87. #mbmath St James Assiniboia doing K assessment, tracking report card data, gr1-2 math assessment on patterns, relations, number.
  88. #mbmath Southwest Horizons having gr4 and gr8 math tchrs create essential outcomes and tool to assess.
  89. #mbmath 7oaks focusing on provincial assessments, tchr pd.
  90. #mbmath Rolling River looking for baseline assessment tool.
  91. #mbmath River East using provincial results, early numeracy intervention, & gr2-3 data.
  92. #mbmath Red River Valley using Strong Beginnings assessment process but no common tool for that process. Num ldrship team created.
  93. #mbmath Prairie Spirit has school based, not yet divisional assessment.
  94. #mbmath Prairie Rose tracking EDI data, provincial data, grad rates.
  95. #mbmath Portage la Prairie looking at software to pull collected data together.
  96. #mbmath Pine Creek looking at creating numeracy cohort for next 2 yrs, action research to improve tching.
  97. #mbmath I shared that Pembina Trails is doing First Steps in Math, Strong Beginnings tool, guided math, differentiation, PLCs, program ldrs
  98. #mbmath Mystery Lake first year to have numeracy support in district. Will create plan!
  99. #mbmath Mountain View using prov data, CAT4, gr9 formative div’l assessment.
  100. #mbmath Louis Riel working on literacy initiative, will start numeracy initiative next year. Common path to come!
  101. #mbmath Lord Selkirk using provincial data, Peggy Hill numerical knowledge assessment.
  102. #mbmath Lakeshore using Strong Beginnings for K-4 but looks different in dif schools.
  103. #mbmath Kelsey school div’n small but school based, teacher made assessments K-5. Using prov assessments.
  104. #mbmath Interlake 1st year with numeracy teams to create common assessments. Using EDI, prov data.
  105. #mbmath Garden Valley focusing on literacy, hoping to shift to numeracy in future.
  106. #mbmath Frontier doing pre & post test 3xyr for division. Fractions, stats, place value.
  107. @tjthiessen what do you feel is the benefit of division based assessment? #mbmath
  108. Very important for consistent understanding! RT@MsAKonrad: what do you feel is the benefit of division based assessment? #mbmath
  109. @MsAKonrad (among other things) but common assessment should never preclude individual student needs! #mbmath
  110. #mbmath Fort la Bosse 3yr plan for numeracy 2013-2016. EY diagnostic tool, using Numeracy Nets, PLCs.
  111. #mbmath Flin Flon using CAT4 for 3yrs. Includes LA and math, tracking student progress. Written in April, discussed Sept by tchrs.
  112. #mbmath Evergreen has divisional student council, resiliency initiative K-12.
  113. #mbmath Brandon using EDI results, gr.5&9 data, end of year assessments for gr3,5,7.
  114. #mbmath Borderland min 2tchrs per building trained in First Steps in Math, all resource tchrs trained.
  115. #mbmath Beautiful Plains using CTBS, Tell Them From Me, provincial data.
  116. #mbmath I missed just one division in the past set of tweets (they talked while I tweeted what Pembina Trails was doing). Apologies to them!
  117. #mbmath Next portion of afternoon on provincial report card implementation currently happening.
  118. #mbmath “How might we use report card data to improve relationships, numeracy learning, teaching and leadership?” Are our goals SMARTER?
  119. #mbmath SMARTER goals include “Engagement” and “Response to current learning gap”.
  120. #mbmath Still misconceptions out there about “when can we give a 4/4” Yes it can and SHOULD be given in November, March AND June.
  121. #mbmath Department encouraging use of report card data to explore numeracy.
  122. #mbmath Reminder from Department that if a teacher is entering NA “not assessed” this must be a conversation w/, approved by, administrator.
  123. #mbmath We’re now being given homework packages to come with (completed) for tomorrow’s session.
  124. #mbmath Here’s our homework for tomorrow…curriculum reveal occurs then. (Unless we read it tonite…which I will!!)
  125. #mbmath Aaaaaaand more homework. Day is done. See y’all tomorrow!

About tjthiessen

explorer, administrator, consultant, student, leader
This entry was posted in Mathematics.