How the Pandemic and Changes to the SPP/APR Affected Arkansas’s IDEA Data


SPP/APR Resources at a Glance


Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you!
You can contact us via the Podcast page on the IDC website at https://ideadata.org/.
### Episode Transcript ###

00:00:01.55 >> You're listening to "A Date with Data" with your host, Amy Bitterman.
00:00:07.37 >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every day.
00:00:19.53 >> "A Date with Data" is brought to you by the IDEA Data Center.
00:00:24.76 >> On this episode, we're joined by Jody Fields, Director of IDEA Data and Research Office with the Arkansas Department of Education. Welcome. Can you tell us a little bit about yourself to get us started?
00:00:35.65 >> So, I'm Dr. Jody Fields. I actually have a grant from the Arkansas Department of Education Special Education Unit. We're located at the University of Arkansas at Little Rock, and I have been serving as the Special Education Data Manager since December 2003.
00:00:52.33 >> Between the pandemic and all the changes to the SPP/APR, states have seen a dramatic impact on their 618 data and also the indicators. Can you talk about how the pandemic has impacted your data, trends and the results you've seen?
00:01:07.02 >> Of course, the obvious data trend that was interrupted was assessment, and everyone lost that data set. The accountability systems kind of crashed for the year in that regard. And while Arkansas was actually back in the buildings in the 2021 school year, we still saw impact. So many of the students were virtual. About two-thirds of special ed students were actually virtual in the 2021 school year, and so we saw a lot of issues around the data even though we were all submitting them as normal through the statewide student management system: drops in assessment scores. The drop in discipline records probably was the biggest one, and then there was also graduation. In March of 2020, when everything shut down and students went home to be virtual instead, a lot of school districts just went ahead and graduated everybody, so you didn't ... We had this higher graduation rate where we may have had a higher dropout rate if they were still in the buildings, because they may not have decided to stick around for the last month or so of school. Instead we ended up with this higher graduation rate and a lower dropout rate, and there was a good three-to-four-percentage-point swing on both of those.
00:02:25.52 >> And what about the impact on the quality of your IDEA data? What did that look like?
00:02:30.54 >> I think the quality issue is really tied to how the districts handled dismissing students, because we have a single student management system in Arkansas, so every school district uses the exact same system. And so it really came down to, what were they entering? And a lot of the data entry staff were doing it from home in the spring of 2020 as we finished up that school year. They were still required to submit everything. No one had a waiver for any data sets to be submitted, whether it was IDEA or under ESSA or CCD, Common Core data. It didn't matter. Everything still had to be there. So some of the issues were about how the districts handled that, but also, for early childhood, getting access to students to finish off early childhood outcomes was a big one: how do you do an early childhood outcome final measurement for those kids heading to kindergarten or being dismissed that was going to be really valid and reliable when you couldn't even get access to the students? And I think that was probably one of the bigger ones, besides just how data changed around school exit with graduation and dropout. For early childhood, it was outcomes, not having access, and of course, that then ties into referrals and not having access for those initial evaluations. While a lot of referrals seemed to come in as normal, a lot of parents waited on the evaluation: "My kid is not going back to school next year," especially for preschoolers. Day cares and preschools were not open. Parents were just like, "We're going to wait and see. We're at home. They can just stay home with us," and so there was a lot more waiving off of the referral process. And then, our data didn't look that bad, but that's only because we have a standing practice of not holding the districts accountable for things they can't have any control over, such as a parent not giving them access to a child, when it comes to timelines. So we tend to apply everything that's an indicator [Indistinct] around C to B transition. We apply those same reasons to indicator 11, and so our percentages look pretty good. They don't really look like the pandemic hit, but there actually were a lot of delays, even though our data doesn't show that, because parents delayed giving evaluators access to students.
00:04:50.99 >> So there's a real trickle-down effect. You might see something really change and be impacted, but you also have to think about how, down the line, that's going to affect other data moving into the future, I would imagine.
00:05:04.22 >> When you're not getting access to the data, to the student to generate that data, then you have this trickle-down about, how soon are they going to get services? When does their IEP start? You have that whole issue where you have a 6-month delay because you couldn't get access to the student, so is there something you can do to help make up for that 6 months of not being able to have an IEP in place 6 months earlier than when you actually started one, because of all of these trickle-down effects?
00:05:36.40 >> What are some of the ways that you've tried to mitigate some of those effects and address these data quality issues and also the results and impact themselves?
00:05:46.83 >> Trying to mitigate it is a little bit more of a trick because it is at the local level and not necessarily at the state, so we did have a lot of things about reminding districts what their responsibilities were, making sure that we were still doing trainings. While we couldn't do face-to-face trainings, we still had a lot of webinars and Zoom trainings for everybody. I have a training coordinator for the student management system who was on there almost every day with a different district, training new data entry staff about what had to happen, what they had to submit, reminding them what they had to submit at the end of the school year and the start of the school year. A lot of it is reminding the districts, "Don't forget. These are your codes. These codes go with this group. These codes go with that program," because we have everything in the student management system. We have certain codes that go to early childhood exit, school-age exit, CEIS exits and such, so we have to kind of remind them, especially as new staff come into the districts, what all those codes mean and how they have impacts within their data and how we use that, then, with the reporting. And we're probably one of the states that kept the timely and accurate piece of our APR, and so a lot of the time it's just reminding them, hey, this has to be done right, you have to go back. You have a period to review it and fix things, and when you don't, timely and accurate comes into play.
00:07:06.67 >> Can you say a little more about the timely and accurate piece with the districts that you mentioned with the APR?
00:07:12.36 >> So we look at basically everything for timely and accurate, and while it's not measured within the APR anymore, we kind of still call it the old indicator 20, so timely and accurate reporting. It's not just the data coming in for 618. Did we get the data in? Did we get it initially from the student management system when they have their month of review for the various data sets? Did they clean up all the errors and conflicts that we found in the data? Did they get that all resolved within the time frame? And so it really comes down to a yes and a no. But then we also go to monitoring and say, so, someone had a finding. Did they submit everything that they were supposed to submit in the time frame you gave them to submit? Did we have audit findings back in finance? Did they submit everything to finance they were supposed to submit in a timely manner? Do we have anything still that's long outstanding in that regard, even? On the APR it just shows yes or no, but we do our determinations based on how many items you didn't submit. So on our APR profiles, it might say, "No, you didn't meet the indicator," but for the determination, under timely and accurate, we're counting how many items you failed to do or submit appropriately, and your score is based on the number of items.
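
To make the yes/no-versus-item-count distinction concrete, here is a minimal sketch of how such a determination score could be computed. The checklist items and data shapes are illustrative assumptions drawn from the description above, not Arkansas's actual rubric.

```python
# Hypothetical sketch of the "timely and accurate" scoring described
# above: the APR shows a simple yes/no, while the determination counts
# how many individual items a district failed to submit or fix.
# Checklist items are illustrative assumptions, not the real rubric.

CHECKLIST = [
    "618_data_submitted_on_time",
    "errors_and_conflicts_resolved_in_review_window",
    "monitoring_documentation_submitted_on_time",
    "finance_submissions_on_time",
]

def timely_and_accurate(results: dict) -> tuple[str, int]:
    """Return the yes/no APR answer plus the missed-item count
    used for determinations."""
    missed = sum(1 for item in CHECKLIST if not results.get(item, False))
    apr_answer = "Yes" if missed == 0 else "No"
    return apr_answer, missed

# A district that fixed its errors late "fails" the indicator on the
# APR, but the determination only counts the one missed item.
print(timely_and_accurate({
    "618_data_submitted_on_time": True,
    "errors_and_conflicts_resolved_in_review_window": False,
    "monitoring_documentation_submitted_on_time": True,
    "finance_submissions_on_time": True,
}))  # ("No", 1)
```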
00:08:27.99 >> And did you see that take a hit?
00:08:29.81 >> Probably a little bit. There were a few that didn't clean up as much as they should have been cleaning up, and we had to go back after the fact and ask them to give us verification of what it was so we could fix it before we started to use the data. But it does go into my spreadsheet for timely and accurate, my spreadsheet that they never want me to go into. It's kind of funny at times. They're more concerned at times about getting hit with that timely and accurate and their superintendent asking them, "Why didn't you do this on time?"
00:09:01.46 >> Right.
00:09:01.72 >> That concerns them more, at times, than just being told, "You have a compliance audit." But yes, because the superintendents also sign off on all of this data, they're really having those conversations when you're saying you didn't submit it timely. The superintendent goes, "Why didn't you submit it? What didn't you fix that you were supposed to fix?"
00:09:22.09 >> Because that does seem more like low-hanging fruit, something that really everyone should be able to do at a minimum.
00:09:28.61 >> It did make it more challenging, and there were ... When people were working from home, and they're like, "All this stuff is in the office," and some school districts allowed staff still to come in and work in their office, and other districts were like, "No one can come to the buildings," so it was very much a local decision, which made it a bigger challenge for some districts than it did for others.
00:09:49.74 >> What about the changes to the SPP/APR for federal fiscal years 2020 through 2025? What impact did those have on your state and your results?
00:10:00.05 >> Well, I think the biggest change and the impact on the results has a lot to do with graduation and dropout using the 618 data and not the ESSA data. That is something that the districts are really going to have to get used to because of what the calculation is: the calculation for graduation and dropout uses those five categories in which students are in the exiting data for leaving special ed as the denominator. So when you look at graduation or dropout, it's the percent of students who are coded as graduating with a regular diploma within that data set. It's not, who was supposed to graduate with a regular diploma, or, what students could have dropped out? It's just, who within that data set is coded as a graduate or a dropout? And that probably has a bigger impact on dropout than it does on graduation. The graduation numbers, in that methodology, actually look really good for the state. But when you take it down to the district level, it looks kind of strange. If you only have one student in those five categories, and let's say they're a maximum-age exit, then if you follow the calculation, graduation comes out as zero. And while we have put a disclaimer on our local APRs that says, "This is a percent of leavers, not a graduation rate," if we put out zero percent for graduation, it would be a major issue. The districts, the superintendents, they're all going to be going, "What on earth ... What do you mean?" And so we made the decision on the local ones that if you have zero counts in diploma, certificate, or alternate diploma, then it becomes a not-applicable for graduation and is not posted out to the public as a zero, because that would just raise all kinds of flags. And the same thing can happen with dropout: you could have one student in that data set who is a dropout and nobody else in anything, because you might be ... You could be a charter school that only has grades K through 7, but you might have that 14-year-old who disappeared on you, and so now you have one of one in the data set for 14-to-21-year-olds, and you're sitting at 100 percent. So for public reporting, we have actually put in the criterion that you have to have more than five in that dropout category for us to report it publicly. Instead it becomes a not-applicable, just like we're doing with graduation and making things not applicable. At the state level, that calculation is really easy. You look at it and go, okay, no big deal. But at the district level, those two indicators are major issues around reporting to the public and them understanding that it's not a graduation rate. It's not a dropout rate. It is a percent of leavers who exited special ed in these five categories, and only those five categories.
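
As a concrete illustration of the leaver-based arithmetic and the small-n suppression rules described here, this is a minimal sketch in Python. The category names, function shapes, and thresholds are hypothetical stand-ins, not the actual EDFacts/618 exit codes or Arkansas's reporting system.

```python
# Sketch of the leaver-based graduation/dropout calculation described
# above. Category names are illustrative assumptions, not real 618 codes.

EXIT_CATEGORIES = [
    "regular_diploma",
    "certificate",
    "alternate_diploma",
    "maximum_age",
    "dropped_out",
]

def leaver_percent(counts, numerator_category):
    """Percent of leavers (all five exit categories) coded in one category."""
    leavers = sum(counts.get(c, 0) for c in EXIT_CATEGORIES)
    if leavers == 0:
        return None
    return 100 * counts.get(numerator_category, 0) / leavers

def public_graduation_rate(counts):
    """Suppress to not-applicable when no student exited via diploma,
    certificate, or alternate diploma, so a lone maximum-age exit
    doesn't show up publicly as a 0% 'graduation rate'."""
    completers = sum(counts.get(c, 0)
                     for c in ("regular_diploma", "certificate", "alternate_diploma"))
    if completers == 0:
        return None  # reported as not applicable
    return leaver_percent(counts, "regular_diploma")

def public_dropout_rate(counts):
    """Suppress unless more than five students are in the dropout
    category, so one missing 14-year-old can't read as 100% dropout."""
    if counts.get("dropped_out", 0) <= 5:
        return None  # reported as not applicable
    return leaver_percent(counts, "dropped_out")

# The K-7 charter example: one 14-year-old dropout and nobody else.
print(leaver_percent({"dropped_out": 1}, "dropped_out"))  # 100.0 raw
print(public_dropout_rate({"dropped_out": 1}))            # None (suppressed)
```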
00:12:52.43 >> With these changes to the SPP/APR, how did you convey them to districts and other stakeholders?
00:12:58.60 >> So part of our requirements within the APR is, of course, to hold stakeholder meetings, and we actually initiated that with a group of selected stakeholders from across the state, beyond those who are part of our state advisory. We had about 40 who participated in webinars, but special education also does a monthly call with LEA supervisors on the last Thursday of each month, so we put things out through there too, and we've been doing it at different meetings. We had [Indistinct] a few states; we actually had in-person meetings last summer. We had two state conferences, one in June and one in September or October, that we got to present things at. That was one of mine, when we did the LEA academy, which was at the end of September, I believe, or early October. I had a whole session about graduation and dropout to really explain what this was going to mean, which is why we also came up with the disclaimer to put on the APR profiles for each district that shows, this is what this means. Don't confuse it and such. So it is a challenge, especially as staff turn over, and the fact that we know the special ed directors have to go back and explain this to superintendents, and the bigger challenge is superintendents also understanding. And Arkansas is getting ready to have its first cohort graduate off an alternate pathway next year, and we explained that, in the APR, this is going to count against them. But when it comes to accountability under ESSA, it's good for them. They're part of that 4-year cohort. So we're having to explain the differences, where we had kept things so aligned for 8 years of aligning ESSA and the APR, and then we've totally flipped back to the way it was before, where there was no alignment. That's been a big piece of the understanding, of getting the message out to the districts, to the stakeholders, to our advisory council, and explaining, this is how this is going to affect things ... This is good, which is why we brought it up at the advisory a couple of months ago around graduation and dropout: this is the criteria we've laid out before we put this to the public on the local APRs, because it's going to raise all kinds of flags if we put out there that someone has a zero-percent graduation rate or a 100-percent dropout rate on one kid.
00:15:12.95 >> Given how long you've been in this position as data manager, how has your role changed and evolved over time?
00:15:20.28 >> Probably one of the best things about being around as long as I have is all the iterations that had to happen within IDEA, like having to collect referral data. We were already collecting child count for school age and early childhood, and personnel, in the student management system, but it was in-house, kind of sitting alongside the student management system, not actually integrated, and now it's fully integrated into it. But we had to create referral tracking. We had to create a module for early intervening and such, and so a lot of times people go, "I can't believe you just know this off the top of your head." When you had to create it all, you know it, because you were the one who had to help create it and such. So, yeah, that was one of the biggest things. And, of course, one of the things when they did the grant to the university in the spring of '05 was, we could have staff. So I've had an analyst. We took over the training: instead of me doing the training with the school districts, I was able to hire a training coordinator who handles the training with the school districts and helps troubleshoot during the submission cycles and such. A lot of times, I feel like I'm pretty spoiled because I do have staff, but having staff also means that I get to go to more meetings while they get to do all the analysis stuff. And there are times when I'm like, "Can I just go back to the analysis and not do all the meetings instead?" because it's one of those where you're like, "I kind of miss doing what I started out doing." Now you're the administrator, and you get pulled into all these meetings. And it took time, over years, of getting pulled into meetings, and it depended on who was the director of special ed, because I've gone through four directors of special ed. I'm the only person left who actually went to a monitoring with OSA, and that was under the old Craig before they started looking at RDA, and we have a monitoring company now. But I'm the only one who's ever gone through it.
00:17:17.00 >> Jody, you have such a wealth of knowledge after being in your position for as long as you have and with your state, and you're always sharing your experience and expertise and the tips and tricks you've learned over the years with data managers in other states, and I know how much they appreciate hearing from you and learning from you.
00:17:34.89 >> Thank you. I appreciate that. I try to help out where I can. There are times, out of the blue, I'll get an e-mail from someone, and I'm like, "Okay," and I just ... This is what I know. This is where you can find it. Somewhere out there, there's a PowerPoint that OSA did. We'll see if anybody puts it on the listserv and see if anyone still has it. And there's a group of us, probably five or more of us, who've actually been around this long: Connecticut, Pennsylvania, Kansas, Missouri. There is a group of us who have all been around together and through all the changes.
00:18:07.70 >> Thank you so much, Jody, for joining us today on the podcast. We really appreciate it.
00:18:11.62 >> I appreciate you having me, and I hope the podcasts are a big success.
00:18:17.25 >> If you'd like to access podcast resources, submit questions related to today's episode, or share ideas for future topics, we'd love to hear from you. The links are in the episode content. Or connect with us via the podcast page on the IDC website at IDEAdata.org.