|
Post by N.E. Brigand on Oct 23, 2022 1:21:08 GMT -6
Fate unknown (at least to me): ... To judge from the SMBF schedule updated at 1:48 a.m. (get some sleep, fellas!), Hamilton Township and Kalida qualified.
|
|
|
Post by rlrrll on Oct 23, 2022 13:27:47 GMT -6
Meaning no offense to Finneytown, which has put on some fine shows in recent seasons; and with much respect to Davidson's entertaining show (but can they please make it so the audience can't hear the metronome they use at the start of every movement?); but: what's up with Kettering this year? Maybe nothing's up with Kettering, given they had one of their better showings ever at BOA Indianapolis this week? Either they had a really rough run at Wilmington two weeks ago, or OMEA's judges were all wet, or BOA's judges screwed up in Indy, but there's no way a one-point gap in OMEA (between Kettering and Springboro) translates to a ten-point gap in BOA. I'm glad I'll get to see all these bands at OMEA state finals, and can make up my own mind in person.

It absolutely can if you understand the judging sheets/philosophies of both organizations and the quality of judges (or lack thereof) between the circuits. Success in one circuit does not automatically equate to success in the other. 34 Ohio bands are going to BOA Grand Nationals; maybe only two (Centerville and Mason) are safe bets to make semifinals, and neither of them competes in OMEA. I don't think that is a coincidence.
|
|
|
Post by ohioguy2 on Oct 23, 2022 15:01:43 GMT -6
Maybe nothing's up with Kettering, given they had one of their better showings ever at BOA Indianapolis this week? Either they had a really rough run at Wilmington two weeks ago, or OMEA's judges were all wet, or BOA's judges screwed up in Indy, but there's no way a one-point gap in OMEA (between Kettering and Springboro) translates to a ten-point gap in BOA. I'm glad I'll get to see all these bands at OMEA state finals, and can make up my own mind in person. It absolutely can if you understand the judging sheets/philosophies of both organizations and the quality of judges (or lack thereof) between the circuits. Success in one circuit does not automatically equate to success in the other. 34 Ohio bands are going to BOA Grand Nationals; maybe only two (Centerville and Mason) are safe bets to make semifinals, and neither of them competes in OMEA. I don't think that is a coincidence.

Yeah… it kind of is a coincidence, especially when you realize both Mason and Centerville don't compete much in MSBA either. Both might do 1-2 MSBA shows a year beyond the ones they host, maybe. It's not like doing MSBA instead of OMEA as their "local" circuit is what makes them semifinals contenders. I would argue that being two of the five biggest high schools in Ohio, both from incredibly affluent communities, has a lot more to do with it than the local circuit they compete in.
|
|
|
Post by neop on Oct 23, 2022 16:04:13 GMT -6
As an undergraduate CS student taking data science courses and an overall nerd, I feel that it is time to weaponize my powers of meaningless data analysis! I took the scores from the 2021 OMEA MB season to put together a sort of model for the growth of the average band's score over the course of a 7-week season. Then, I applied it to the scores from this season in a vague attempt to predict which bands were destined to qualify for state and which were not. I did the same for superior ratings at state, but that's not the story I want to discuss in this post. I also ran this same prediction for all the WV bands that made appearances, but since they are very unlikely to participate at state (are they even allowed?), I do not include them here. Without further ado, let's see which bands might take up those final spots at OMEA SMBF this year.

First, the bands that have not qualified and do not have any competitions this weekend:

(Band (Class) - Most Recent Score (rating, week number) - Last State Appearance)

Colerain (AA) - 77.6 (II, week 6) - 2015 (II - AA)
Portsmouth West (B) - 76.356 (II, week 6) - 2021 (II - B)
Liberty-Benton (B) - 75.48 (II, week 6) - NEVER
Minford (B) - 74.616 (II, week 6) - 2021 (II - B)
Valley (C) - 74.488 (II, week 6) - 2019 (II - C)
East Clinton (C) - 73.16 (II, week 5) - 1984 (II - C)
Brookville (C) - 71.5 (II, week 5) - 2009 (I - B)
Symmes Valley (C) - 69.644 (II, week 6) - 1996 (III - C)
Firelands (B)* - N/A (II, week 6) - NEVER
Mississinawa Valley (C) - 71.76 (III, week 6) - 1983 (I - C)
National Trail (C) - 70.713 (III, week 6) - 1997 (II - B)
Cambridge (B) - 65 (III, week 6) - 2021 (II - B)
Trimble (C) - 63.04 (III, week 6) - 2003 (II - C)
Elgin (C) - 62.6 (III, week 4) - 1983 (III - B)
Portsmouth (B) - 61.064 (III, week 6) - 1997 (II - B)
|
*Firelands was not scored at their only show.
Some of these bands (like Colerain and East Clinton) may not have attended SMBF even if they had qualified, but I included them here anyway. Some of the other bands have not been to state in quite a while! Here's hoping they have more success next season and continue to work hard, as I'd love to see some new (or old?) faces at SMBF.

There's also one band in particular I'd like to give an honorable mention to: Sylvania Southview will compete this weekend for the first time, and as such, I have exactly zero data on them and cannot make a prediction. Best of luck to them!

Now for the bands who have not yet qualified for state and are competing this weekend. I will organize them into sections by the percent chance I give them of reaching state based on their scores this season.
(Band (Class) - Most Recent Score (rating, week number) - Last State Appearance - Shows Tomorrow)
Bands with a >95% chance of qualifying:

Buckeye (B) - 78.82 (II, week 6) - 2021 (I - B) - Thomas Worthington
Carlisle (B) - 79.04 (II, week 6) - 2021 (II - B) - Shawnee, Tecumseh
Chesapeake (C) - 80.92 (II, week 6) - 2021 (I - C) - Lancaster
Delaware Hayes (AA) - 78.292 (II, week 5) - 2021 (I - AA) - Thomas Worthington
Hamilton Township (A) - 78.659 (II, week 5) - 2021 (II - A) - Lancaster
Lancaster (AA) - 78.64 (II, week 4) - 2021 (II - AA) - Newark, Lancaster
Tippecanoe (A) - 78.64 (II, week 5) - 2021 (I - A) - Tecumseh
Walter E. Stebbins (A) - 79.02 (II, week 6) - 2021 (I - A) - Forest Hills
|
Bands with a 75-94.99% chance of qualifying:

Bellefontaine (A) - 77.5 (II, week 5) - 2019 (II - A) - Shawnee, Tecumseh
Canal Winchester (AA) - 77.76 (II, week 6) - 2021 (II - AA) - Tri-Valley
Crooksville (C) - 77.56 (II, week 6) - NEVER - Tri-Valley
Fort Frye (C) - 77.76 (II, week 6) - 2005 (II - C) - Tri-Valley
Indian Valley (B)* - 77.74 (II, week 5) - 2021 (II - B) - Newark, New Philadelphia
Kalida (C) - 77.42 (II, week 5) - 2021 (II - C) - Swanton
Lynchburg-Clay (C) - 77.208 (II, week 6) - 2021 (I - C) - Shawnee, Tecumseh
Madison-Plains (C)** - 76.466 (II, week 5) - 2019 (I - C) - Shawnee, Thomas Worthington
Newton (A) - 77.36 (II, week 4) - 2021 (II - B) - Shawnee
St. John's Jesuit (B) - 78.34 (II, week 6) - 2021 (I - B) - Swanton
|
*Indian Valley was accidentally awarded a Superior at Norton on 10/8, and remains on the state schedule. I don't think OMEA would remove them if they didn't qualify, but I'm including them here because they have not technically qualified yet.

**Absolute best of luck to Madison-Plains tomorrow. After the turbulent and tragic start to their season, I would love to see them qualify.

Bands with a 50-74.99% chance of qualifying:

Bethel (C) - 75.6 (II, week 6) - 2016 (II - C) - Tecumseh
Big Walnut (AA) - 74.7 (II, week 5) - 2021 (II - A) - Newark
Chillicothe (A) - 76.162 (II, week 6) - 2021 (II - A) - Nelsonville-York
Coventry (B) - 74.92 (II, week 5) - 2021 (II - B) - New Philadelphia
Fort Recovery (C) - 74.42 (II, week 6) - 2021 (II - C) - Swanton
Greenville (A) - 74.52 (II, week 4) - 2021 (II - A) - Tecumseh
Marion Local (C)* - 76.12 (II, week 6) - 2021 (II - C) - Tecumseh
Marlington (B) - 74.36 (II, week 6) - 2019 (II - A) - New Philadelphia
New London (C) - 75.5 (II, week 6) - NEVER - Thomas Worthington
Ottawa-Glandorf (B) - 74.14 (II, week 5) - 2021 (II - B) - Shawnee, Tecumseh
Perkins (B) - 76.6 (II, week 6) - 2021 (II - B) - Copley
Watkins Memorial (AA) - 74.6 (II, week 6) - 2021 (II - AA) - Newark, Tri-Valley
Wauseon (B) - 73.8 (II, week 4) - 2018 (I - B) - Swanton
Wellington (C) - 74.46 (II, week 6) - 2021 (I - C) - Copley
|
*May the streak continue! Fingers crossed!
Bands with a 25-49.99% chance of qualifying:

Adena (C) - 72.6 (II, week 6) - 2013 (II - C) - Thomas Worthington
Fairbanks (C) - 70.38 (II, week 3) - 2009 (II - C) - Lancaster
Groveport Madison (AA) - 73.42 (II, week 6) - 2021 (II - AA) - Newark
Marietta (A) - 74.2 (II, week 6) - 2018 (II - A) - New Philadelphia
Morgan (B) - 72.82 (II, week 6) - 2018 (II - B) - Tri-Valley
Norwood (B) - 73.3 (II, week 6) - 2015 (II - B) - Forest Hills
Union Local (B) - 72.56 (II, week 6) - 2019 (II - C) - New Philadelphia
Waynesville (B) - 72.8 (II, week 6) - 2021 (II - B) - Tecumseh
Wellston (C) - 73.658 (II, week 6) - 2021 (II - B) - Nelsonville-York
Woodmore (C) - 73.24 (II, week 5) - 2019 (II - C) - Swanton
Zanesville (A) - 70.108 (II, week 2) - 2019 (II - A) - Tri-Valley
|
Bands with a <25% chance of qualifying:

Bishop Watterson (A) - 71.1 (III, week 5) - 2021 (I - A) - Thomas Worthington
Blanchester (C)* - 67.22 (III, week 5) - 2019 (II - C) - Forest Hills
Coshocton (C) - 71.3 (II, week 6) - 2016 (II - B) - New Philadelphia
Crestview (C) - 64.24 (III, week 1) - 2001 (I - B) - New Philadelphia
Gallia Academy (B) - 70.54 (II, week 4) - 2019 (II - B) - Nelsonville-York
Maysville (B) - 69.22 (III, week 4) - 2021 (II - B) - Tri-Valley
Millersport (C) - 71.2 (II, week 6) - NEVER - Lancaster
Northeastern (C) - 70.94 (III, week 6) - 2019 (II - C) - Shawnee, Tecumseh
Shawnee (C) - 69.7 (III, week 6) - 2021 (I - B) - Shawnee
Southeastern (B) - 71.3 (II, week 6) - NEVER - Lancaster
Tecumseh (A) - 63.5 (III, week 4) - 2019 (II - A) - Tecumseh
Westfall (C) - 67.1 (III, week 4) - 2011 (II - B) - Thomas Worthington
|
*Blanchester did receive a II last weekend, but they were not scored.

The OMEA SMBF schedule currently shows 21 empty slots. The chances of exactly 21 bands qualifying tomorrow are not very high, but if I were to predict 21 bands qualifying tomorrow based on my analysis, they would be: Chesapeake, Hamilton Township, Lancaster, Tippecanoe, Carlisle, Walter E. Stebbins, Delaware Hayes, Buckeye, Newton, St. John's Jesuit, Bellefontaine, Kalida, Canal Winchester, Fort Frye, Lynchburg-Clay, Madison-Plains, Crooksville, Perkins, Chillicothe, Marion Local, and Greenville.
Overall, I'd say it's possible for any of these bands to qualify for state. This is just a ROUGH estimate based on past scores. I cannot overstate how unpredictable the judging environment can be in OMEA, so it is reasonable to suggest that multiple bands with >95% chances of qualifying may not make it, and bands with <25% chances may make it. After all, probabilities are not certainties! This is just something I threw together because I love numbers and band.
(Also, please don't think that this model even holds a candle to hostrauser's. I am sure that one using his method would be much more accurate. I also can't imagine how awful this looks on mobile, so please proceed with caution!)
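For the curious, the core of the approach can be sketched in a few lines of Python. Everything numeric below (the growth rate, the cutoff, the logistic mapping) is an illustrative placeholder I made up for this sketch, not the fitted values from the actual model:

```python
import math

# Rough sketch of the prediction idea: project each band's final score
# from its most recent score using an average per-week growth rate, then
# map the projected margin over a cutoff to a probability.
# Every constant here is an illustrative placeholder, not a real OMEA number.

AVG_GROWTH_PER_WEEK = 1.2   # hypothetical average score gain per week
QUALIFYING_SCORE = 80.0     # hypothetical cutoff used for illustration
FINAL_WEEK = 7              # length of the season in weeks

def projected_score(last_score, last_week):
    """Project a band's final-week score from its most recent result."""
    return last_score + AVG_GROWTH_PER_WEEK * (FINAL_WEEK - last_week)

def qualify_chance(last_score, last_week):
    """Crude qualification probability: a logistic curve on the margin.

    A band projected exactly at the cutoff gets 50%; well above it
    approaches 100%, well below it approaches 0%.
    """
    margin = projected_score(last_score, last_week) - QUALIFYING_SCORE
    return 1.0 / (1.0 + math.exp(-margin))

# e.g. a band that scored 77.4 in week 5 projects to 77.4 + 1.2*2 = 79.8
print(f"{projected_score(77.4, 5):.1f}")    # 79.8
print(f"{qualify_chance(77.4, 5):.0%}")
```

In practice, the growth rate would be fit from the 2021 season's week-over-week scores rather than hard-coded, and the probability mapping would be calibrated against how often bands at a given projected margin actually qualified.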
So, how did I do? Looks like 9 of these 21 predictions were incorrect. That's... not awful, but it's not great either. In the future, I could try to improve this by trying to factor in the individual growth of a band during the current season. I could also try to estimate how much a score might be inflated (or deflated) at a show based on the strength of the bands there.
I do also think that some of this is not my fault. OMEA judging can be wildly inconsistent, and it is likely that last night's shows were no exception. There is no question in my mind that some last-second qualifying bands last night may have failed to qualify at another show, or that some near-miss bands may have qualified at another show. A band that might score an 82 with one panel of judges might score a 75 with another panel on the same day. That is the kind of inconsistency that gives OMEA a bad reputation, since they may not give bands as accurate a picture of their performance versus competitors as MSBA or BOA might. I think OMEA should increase their focus on scoring consistency. Maybe it's poor judge selection, maybe it's unclear scoring criteria on the sheets, maybe it's both, but things could be much better than they are.
I will say that this issue seems to affect bands with scores between 70 and 80 more than bands with scores over 80. I also know that score =/= rating in OMEA, but it does give a rough idea. Wild scoring inconsistencies can affect the rating of a band just as easily on the individual caption level.
I'd be interested in hearing how others would address OMEA's issues, or whether they consider things to be fine as they are.
|
|
|
Post by rlrrll on Oct 23, 2022 16:36:47 GMT -6
It absolutely can if you understand the judging sheets/philosophies of both organizations and the quality of judges (or lack thereof) between the circuits. Success in one circuit does not automatically equate to success in the other. 34 Ohio bands are going to BOA Grand Nationals; maybe only two (Centerville and Mason) are safe bets to make semifinals, and neither of them competes in OMEA. I don't think that is a coincidence. Yeah… it kind of is a coincidence, especially when you realize both Mason and Centerville don't compete much in MSBA either. Both might do 1-2 MSBA shows a year beyond the ones they host, maybe. It's not like doing MSBA instead of OMEA as their "local" circuit is what makes them semifinals contenders. I would argue that being two of the five biggest high schools in Ohio, both from incredibly affluent communities, has a lot more to do with it than the local circuit they compete in.

Or you can ask why the remaining OMEA bands that go to BOA aren't more successful. The standard Ohio education system that is guided by OMEA doesn't strive for a higher level of performance and rewards mediocrity. I'm certainly not saying MSBA is better. It's not much better at all, but why perform at an OMEA show if the feedback you receive does nothing to help you prepare for performing at a higher level? Money has lots to do with high-end success at BOA. Money affords you more and better teachers, but you can still put together creative show designs on a budget. Kentucky seems to do well to educate kids in small county schools that perform very strongly in A class. OMEA is desperate for judges, and many of the ones they do have are not well trained for today's modern demands and styles. Even those that have been judging for 20-30 years. The parity of shows and scoring is all over the place depending on who is on your panel, and there is very little consistency due to this poor training. I know some very good judges in OMEA that get it.
And I've also heard some of the most worthless tapes imaginable, with no useful feedback or information, across a number of years. This isn't a new problem.
|
|
|
Post by N.E. Brigand on Oct 23, 2022 17:05:19 GMT -6
So, how did I do? Looks like 9 of these 21 predictions were incorrect. That's... not awful, but it's not great either. In the future, I could try to improve this by trying to factor in the individual growth of a band during the current season. I could also try to estimate how much a score might be inflated (or deflated) at a show based on the strength of the bands there.
I do also think that some of this is not my fault. OMEA judging can be wildly inconsistent, and it is likely that last night's shows were no exception. There is no question in my mind that some last-second qualifying bands last night may have failed to qualify at another show, or that some near-miss bands may have qualified at another show. A band that might score an 82 with one panel of judges might score a 75 with another panel on the same day. That is the kind of inconsistency that gives OMEA a bad reputation, since they may not give bands as accurate a picture of their performance versus competitors as MSBA or BOA might. I think OMEA should increase their focus on scoring consistency. Maybe it's poor judge selection, maybe it's unclear scoring criteria on the sheets, maybe it's both, but things could be much better than they are.
I will say that this issue seems to affect bands with scores between 70 and 80 more than bands with scores over 80. I also know that score =/= rating in OMEA, but it does give a rough idea. Wild scoring inconsistencies can affect the rating of a band just as easily on the individual caption level.
I'd be interested in hearing how others would address OMEA's issues, or whether they consider things to be fine as they are.

Is BOA really so much more consistent? In last year's Grand Nationals, we saw the same band score 83.70 for a performance Friday at 6:00 p.m. and 76.38 for a performance Saturday at 4:15 p.m. That's more than a 7-point drop in less than 24 hours. I don't recall anybody who saw both performances indicating the band was noticeably worse the second time.
|
|
|
Post by trumpet300 on Oct 23, 2022 17:38:45 GMT -6
So, how did I do? Looks like 9 of these 21 predictions were incorrect. That's... not awful, but it's not great either. In the future, I could try to improve this by trying to factor in the individual growth of a band during the current season. I could also try to estimate how much a score might be inflated (or deflated) at a show based on the strength of the bands there.
I do also think that some of this is not my fault. OMEA judging can be wildly inconsistent, and it is likely that last night's shows were no exception. There is no question in my mind that some last-second qualifying bands last night may have failed to qualify at another show, or that some near-miss bands may have qualified at another show. A band that might score an 82 with one panel of judges might score a 75 with another panel on the same day. That is the kind of inconsistency that gives OMEA a bad reputation, since they may not give bands as accurate a picture of their performance versus competitors as MSBA or BOA might. I think OMEA should increase their focus on scoring consistency. Maybe it's poor judge selection, maybe it's unclear scoring criteria on the sheets, maybe it's both, but things could be much better than they are.
I will say that this issue seems to affect bands with scores between 70 and 80 more than bands with scores over 80. I also know that score =/= rating in OMEA, but it does give a rough idea. Wild scoring inconsistencies can affect the rating of a band just as easily on the individual caption level.
I'd be interested in hearing how others would address OMEA's issues, or whether they consider things to be fine as they are. Is BOA really so much more consistent? In last year's Grand Nationals, we saw the same band score 83.70 for a performance Friday at 6:00 p.m. and 76.38 for a performance Saturday at 4:15 p.m. That's more than a 7-point drop in less than 24 hours. I don't recall anybody who saw both performances indicating the band was noticeably worse the second time.

In BOA, something we have to remember is that in prelims, scores are really only comparable on the same panel, and to the other bands in that class that are on said panel. It may look odd when large drops happen, but everything is relative. A band may score really well in their class compared to the other groups in that class, but when it comes to semis, they are now being compared to all of the semifinalists, who span all four classes. It isn't really a consistency issue; it's more of a relativity issue when comparing groups within a class vs. all bands in all classes.
|
|
|
Post by rlrrll on Oct 23, 2022 18:00:56 GMT -6
So, how did I do? Looks like 9 of these 21 predictions were incorrect. That's... not awful, but it's not great either. In the future, I could try to improve this by trying to factor in the individual growth of a band during the current season. I could also try to estimate how much a score might be inflated (or deflated) at a show based on the strength of the bands there.
I do also think that some of this is not my fault. OMEA judging can be wildly inconsistent, and it is likely that last night's shows were no exception. There is no question in my mind that some last-second qualifying bands last night may have failed to qualify at another show, or that some near-miss bands may have qualified at another show. A band that might score an 82 with one panel of judges might score a 75 with another panel on the same day. That is the kind of inconsistency that gives OMEA a bad reputation, since they may not give bands as accurate a picture of their performance versus competitors as MSBA or BOA might. I think OMEA should increase their focus on scoring consistency. Maybe it's poor judge selection, maybe it's unclear scoring criteria on the sheets, maybe it's both, but things could be much better than they are.
I will say that this issue seems to affect bands with scores between 70 and 80 more than bands with scores over 80. I also know that score =/= rating in OMEA, but it does give a rough idea. Wild scoring inconsistencies can affect the rating of a band just as easily on the individual caption level.
I'd be interested in hearing how others would address OMEA's issues, or whether they consider things to be fine as they are. Is BOA really so much more consistent? In last year's Grand Nationals, we saw the same band score 83.70 for a performance Friday at 6:00 p.m. and 76.38 for a performance Saturday at 4:15 p.m. That's more than a 7-point drop in less than 24 hours. I don't recall anybody who saw both performances indicating the band was noticeably worse the second time.

My comment to that is whether their placement was consistent, regardless of the actual number. Numbers are relative, based on the size of the show and the performance draw, as judges manage the numbers they give. If placement was consistent, or within reason, that is all that matters. Other than that, marching band is a subjective art and completely open to opinion. Anyone who has been around this activity for a while learns which judges like certain styles or techniques as a personal preference.
|
|
|
Post by thestraightestlegs on Oct 23, 2022 22:22:50 GMT -6
Yeah… it kind of is a coincidence, especially when you realize both Mason and Centerville don't compete much in MSBA either. Both might do 1-2 MSBA shows a year beyond the ones they host, maybe. It's not like doing MSBA instead of OMEA as their "local" circuit is what makes them semifinals contenders. I would argue that being two of the five biggest high schools in Ohio, both from incredibly affluent communities, has a lot more to do with it than the local circuit they compete in. Or you can ask why the remaining OMEA bands that go to BOA aren't more successful. The standard Ohio education system that is guided by OMEA doesn't strive for a higher level of performance and rewards mediocrity. I'm certainly not saying MSBA is better. It's not much better at all, but why perform at an OMEA show if the feedback you receive does nothing to help you prepare for performing at a higher level? Money has lots to do with high-end success at BOA. Money affords you more and better teachers, but you can still put together creative show designs on a budget. Kentucky seems to do well to educate kids in small county schools that perform very strongly in A class. OMEA is desperate for judges, and many of the ones they do have are not well trained for today's modern demands and styles. Even those that have been judging for 20-30 years. The parity of shows and scoring is all over the place depending on who is on your panel, and there is very little consistency due to this poor training. I know some very good judges in OMEA that get it. And I've also heard some of the most worthless tapes imaginable, with no useful feedback or information, across a number of years. This isn't a new problem.

This is a whole can of worms, however… I can attest that there will likely be changes to at least the visual sheet within the next couple of years. Hopefully the same can be said of other sheets in OMEA as well.
The judging panels are starting to skew younger than they were a few years ago. This isn't necessarily a good thing, but I see it as a sign of hope that the circuit is starting to credit a more modern approach to the activity, and show design in particular. I think there are significant flaws in both local Ohio circuits, and that BOA does a much better job than most local circuits. But yes, this is a subjective activity. Regardless of circuit, judging is HARD. I do think if you've been doing it for a decade, you should have valuable feedback, good numbers management, and a deep knowledge of the criteria on the sheet. But if we want OMEA to change, we have to be patient. They recently adopted Competition Suite and a 100-point scoring system. That's not nothing. I'm looking forward to seeing the circuit continue its evolution over the next few years. There's hope!
|
|
|
Post by N.E. Brigand on Oct 23, 2022 22:59:54 GMT -6
Or you can ask why the remaining OMEA bands that go to BOA aren't more successful. The standard Ohio education system that is guided by OMEA doesn't strive for a higher level of performance and rewards mediocrity. I'm certainly not saying MSBA is better. It's not much better at all, but why perform at an OMEA show if the feedback you receive does nothing to help you prepare for performing at a higher level? Money has lots to do with high-end success at BOA. Money affords you more and better teachers, but you can still put together creative show designs on a budget. Kentucky seems to do well to educate kids in small county schools that perform very strongly in A class. OMEA is desperate for judges, and many of the ones they do have are not well trained for today's modern demands and styles. Even those that have been judging for 20-30 years. The parity of shows and scoring is all over the place depending on who is on your panel, and there is very little consistency due to this poor training. I know some very good judges in OMEA that get it. And I've also heard some of the most worthless tapes imaginable, with no useful feedback or information, across a number of years. This isn't a new problem. This is a whole can of worms, however… I can attest that there will likely be changes to at least the visual sheet within the next couple of years. Hopefully the same can be said of other sheets in OMEA as well. The judging panels are starting to skew younger than they were a few years ago. This isn't necessarily a good thing, but I see it as a sign of hope that the circuit is starting to credit a more modern approach to the activity, and show design in particular. I think there are significant flaws in both local Ohio circuits, and that BOA does a much better job than most local circuits. But yes, this is a subjective activity. Regardless of circuit, judging is HARD. I do think if you've been doing it for a decade, you should have valuable feedback, good numbers management, and a deep knowledge of the criteria on the sheet. But if we want OMEA to change, we have to be patient. They recently adopted Competition Suite and a 100-point scoring system. That's not nothing. I'm looking forward to seeing the circuit continue its evolution over the next few years. There's hope!

Could someone be specific about what is wrong with either the visual sheets or the music sheets in OMEA? (For that matter, what was wrong with a 300-point system? Or with not using one particular brand of tabulation software?)
|
|
|
Post by neop on Oct 23, 2022 23:27:21 GMT -6
Could someone be specific about what is wrong with either the visual sheets or the music sheets in OMEA? (For that matter, what was wrong with a 300-point system? Or with not using one particular brand of tabulation software?)

My answers to both questions might not be very helpful, but I like this discussion, so I'll jump in. I have personally never seen any of the judging sheets in OMEA, so I have no idea what the criteria look like. I was only listing that as a possibility for why I think there's some problematic inconsistency in OMEA judging. To speak to an earlier point of yours, I definitely see inconsistency in scoring in other circuits as well. The problem in OMEA is that it leads to, for example, a band entirely missing state versus making it. As you know, the cutoff for this is score-based in OMEA, just as it is in many other circuits. Some circuits, like BOA regionals or UIL, set the cutoff for their finals events at a certain top-x placement. Scoring inconsistency isn't as much of a problem in BOA, since scores will typically be affected evenly across the event, but it is in OMEA if it affects whether a band attends the final event or not. Placement inconsistency isn't as much of a problem in OMEA, since any number of bands at an event are able to qualify for state, but when BOA judges are unable to consistently determine which band is better, it leads to no shortage of grumbling (see the 2001 Grand Nationals Semis draw). Both of these designs (as well as others) have many pros and cons, and I ultimately don't think there's a good answer to how it could be improved upon. There's no way to be completely factually correct in the judging of such an opinion-heavy activity, after all. The reason people argue about judging being wrong in the arts is not often because they have a sensible solution. You are going to hear more about something being wrong than about what exactly is wrong. And hey, sometimes complaining is fun.

I don't think anything is inherently wrong with using a 300-point system, but I know our brains like to think in terms of a scale of 0-100. To me, it gives a better idea of how close a band is to achieving the next higher rating, or how close they were to what the judges would've called "perfection." Lots of things in our lives operate on similar scales, like currency or percentages. It's just familiar to us. I don't want to imply that blind conformity is a good thing, but just about every other marching band point system I know of operates on that scale. Maybe keeping the 300-point system would have made OMEA stand out as being unique, but I personally don't think that being different just for the sake of being different is a good enough reason.

If there was any improvement I could offer to the activity, it's that spectators should focus more on the well-being of the activity, the programs that operate within it, and the kids that are responsible for the existence of this conversation in the first place. The numbers that the judges write on the score sheets are less important. They offer neat points of discussion, but man do people get heated about scores in this activity!
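To illustrate the point that the scale itself carries no information: assuming both scales start at zero, a 300-point score maps onto the familiar 0-100 scale with a simple linear rescale, so nothing is gained or lost either way (the function name here is just for illustration):

```python
def to_100_scale(score_300):
    """Linearly rescale a score out of 300 to the familiar 0-100 scale."""
    return score_300 / 3.0

print(to_100_scale(255.0))  # 85.0
```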
|
|
|
Post by rlrrll on Oct 24, 2022 7:10:15 GMT -6
This is a whole can of worms, however… I can attest that there will likely be changes to at least the visual sheet within the next couple of years. Hopefully the same can be said of other sheets in OMEA as well. The judging panels are starting to skew younger than they were a few years ago. This isn't necessarily a good thing, but I see it as a sign of hope that the circuit is starting to credit a more modern approach to the activity, and show design in particular. I think there are significant flaws in both local Ohio circuits, and that BOA does a much better job than most local circuits. But yes, this is a subjective activity. Regardless of circuit, judging is HARD. I do think if you've been doing it for a decade, you should have valuable feedback, good numbers management, and a deep knowledge of the criteria on the sheet. But if we want OMEA to change, we have to be patient. They recently adopted Competition Suite and a 100-point scoring system. That's not nothing. I'm looking forward to seeing the circuit continue its evolution over the next few years. There's hope! Could someone be specific about what is wrong with either the visual sheets or the music sheets in OMEA? (For that matter, what was wrong with a 300-point system? Or with not using one particular brand of tabulation software?)

I don't have the sheets in front of me and can't find them posted online, so I'll refrain from discussing the actual sheets. It is quite possible that the sheets themselves aren't the problem, but rather how judges are trained to interpret them. I will say the change to Competition Suite is simply a symbol of their slowness to adapt to change. They are always the last to adapt to everything. Competition Suite has existed for a number of years and has been used successfully in many circuits, but it was ridiculous that bands still had to submit their own SD cards when checking in at OMEA shows until this season, I believe. I think last season was a "demo" program for certain shows to try out Comp Suite.
|
|
|
Post by oldarmybandguy on Oct 24, 2022 10:23:32 GMT -6
I wanted to thank everyone that posts in this thread...it's what brought me to hornrank in the first place. I have 2 wind players and a cellist in junior high in Fairfield, and we have been to exactly 1 show to see the band perform. Part of that is that the school, OMEA, and the other Ohio circuits are shockingly bad at giving out information on when and where events are and who is participating. Additionally, the only circuit that seems to give out score and recap information is MSBA, and there really aren't enough participants to make this valuable. It has been very helpful to be able to track their future band through these posts. Thank you!
|
|
|
Post by thestraightestlegs on Oct 24, 2022 10:45:38 GMT -6
This is a whole can of worms, however… I can attest that there will likely be changes to at least the visual sheet within the next couple of years. Hopefully the same can be said of other sheets in OMEA as well. The judging panels are starting to skew younger than they were a few years ago. This isn’t necessarily a good thing, but I see it as a sign of hope to start to credit a more modern approach to the activity, and show design in particular. I think there are significant flaws in both local Ohio circuits, and that BOA does a much better job than most local circuits. But yes, this is a subjective activity. Regardless of circuit, judging is HARD. I do think if you’ve been doing it for a decade, you should have valuable feedback, good numbers management, and a deep knowledge of the criteria on the sheet. But if we want OMEA to change, we have to be patient. They recently adopted Competition Suite and a 100-point scoring system. That’s not nothing. I’m looking forward to seeing the circuit continue its evolution over the next few years. There’s hope! Could someone be specific about what is wrong with either the visual sheets or the music sheets in OMEA? (For that matter, what was wrong with a 300-point system? Or with not using one particular brand of tabulation software?) So in my opinion, there’s too much put on any one judge in OMEA. In OMEA, the criteria you’d see in visual effect, visual ensemble, and visual individual are basically all lumped together to create the sheets used for the visual judge. I think the same thing is true of the music sheets. The GE judges (as I’ve heard on a tape before) are judging everything. Next to no one is qualified to judge that effectively, and there are dozens of judges in OMEA. In my opinion, that’s why the poor quality of judging and inconsistency in numbers exists. It’s not necessarily that the judges are bad, but instead that the sheets are asking them to do way too much.
In BOA, the criteria and responsibility to give feedback visually is split up between three separate captions. The effect judges are also split to focus on music or visual depending on their training and expertise. This is really important. Such a broad scope of things to watch makes it difficult to dig into anything, or even cover everything you’re supposed to look for, across an 8-minute show. It’s even tougher when the show isn’t complete. Moving from a 300-point system to a 100-point scale isn’t really all that beneficial, other than it being the universal standard. It’s easier to comprehend out of 100, but I don’t think it changes the experience for anyone. Making the change to Competition Suite is very important from my perspective. It’s not about how the tabulation is done, but rather how the commentary gets from the judges to the staff. It used to be that with SD cards, you had to wait to get them back, then download them, then share that with your staff. Now the commentary is available immediately. The recaps can be seen by everyone with access (whenever they choose to use that function). Although not very common, when there’s an OMEA show with critique it really helps to have the commentary available for your whole staff right away to take advantage of your limited time in critique with the judge. You’re wasting your time if the judge just recites all the feedback to you that they already gave you but, because of an antiquated system, you haven’t been able to listen to yet. Critique should be staff driven. I think ideally the fix is to create more captions in OMEA, but barring that, I think the sheets could just hone in a little more on specific things to look for and have less overlap between captions. I can’t tell you the number of times I’ve listened to GE judges talk about timing of feet or tone quality in the mellophones. It’s just not the best way to judge band from my perspective. You get a wide variety of results from show to show when this happens.
|
|
|
Post by N.E. Brigand on Oct 24, 2022 11:50:43 GMT -6
Could someone be specific about what is wrong with either the visual sheets or the music sheets in OMEA? (For that matter, what was wrong with a 300-point system? Or with not using one particular brand of tabulation software?) Making the change to Competition Suite is very important from my perspective. It’s not about how the tabulation is done, but rather how the commentary gets from the judges to the staff. It used to be that with SD cards, you had to wait to get them back, then download them, then share that with your staff. Now the commentary is available immediately. The recaps can be seen by everyone with access (whenever they choose to use that function). Although not very common, when there’s an OMEA show with critique it really helps to have the commentary available for your whole staff right away to take advantage of your limited time in critique with the judge. You’re wasting your time if the judge just recites all the feedback to you that they already gave you but, because of an antiquated system, you haven’t been able to listen to yet. Critique should be staff driven. Very helpful information, thanks. I still remember listening to actual cassettes (or mini cassettes?) on the bus rides back from competitions.
|
|
|
Post by N.E. Brigand on Oct 24, 2022 15:11:34 GMT -6
Newton did not qualify at Shawnee. There is Youtube video of most of the awards ceremony taken by some very happy Middletown students (they were grand champion), and it includes the full announcement of qualifiers. Thanks! I've found judging to be a bit abnormal all season, but a few of tonight's results have been especially confusing. Maybe I would understand better if I was able to see the final shows put on by some of these near-miss bands. That's only Newton's second miss, no? Answering more definitely: yes. Newton missed previously only in 2006. Also I was wrong in an earlier comment: as shown below, they were not the last band whose streak broke prior to Grove City missing in 2009. I thought it might be of interest to see how the number of bands with a "perfect" record changed over time. Thirty-six bands qualified for OMEA's first State Marching Band Finals in 1980. Here they all are ordered by when their streaks ended:
1980 -- 36 bands. Streak ends after 1 year for: Carlisle, Franklin Heights, Hilliard*, Park Hills**, Springboro, Tallmadge, Carrollton
1980-1981 -- 29 bands. Streak ends after 2 years for: Caldwell, Fairborn-Baker**, Northmont, Upper Arlington
1980-1982 -- 25 bands. Streak ends after 3 years for: Fairfield, Marlington, Miamisburg, New Philadelphia, Xenia
1980-1983 -- 20 bands. Streak ends after 4 years for: Licking Heights
1980-1984 -- 19 bands. Streak ends after 5 years for: Field, Highland, Wadsworth
1980-1985 -- 16 bands. Streak ends after 6 years for: Versailles
1980-1987 -- 15 bands. Streak ends after 7 years for: Piqua
1980-1988 -- 14 bands. Streak ends after 9 years for: Athens, Shenandoah
1980-1989 -- 12 bands. Streak ends after 10 years for: Watkins Memorial
1980-1990 -- 11 bands. Streak ends after 11 years for: Teays Valley, Westfall
1980-1998 -- 9 bands. Streak ends after 19 years for: Fort Recovery
1980-2002 -- 8 bands. Streak ends after 23 years for: Westland
1980-2005 -- 7 bands. Streak ends after 26 years for: Newton
1980-2006 -- 6 bands. Streak ends after 27 years for: Cambridge
1980-2008 -- 5 bands. Streak ends after 29 years for: Grove City
1980-2015 -- 4 bands. Streak ends after 36 years for: Perkins
1980-2021 -- 3 bands. Streak ends after 41 straight appearances*** for: Marion Local
1980-2022 (and counting) -- 42 straight appearances*** for 2 bands: Newark, Troy
*Hilliard became Hilliard Davidson.
**Fairborn-Baker and Park Hills merged in 1982 to become Fairborn, which didn't appear at state finals until 1988.
***No competitions in 2020.
I think it's likely that some of these bands (e.g., New Philadelphia) deliberately skipped SMBF in one or more years to focus on BOA. If you spot an error, let me know and I'll fix it.
|
|
|
Post by neop on Oct 24, 2022 15:29:47 GMT -6
Thanks! I've found judging to be a bit abnormal all season, but a few of tonight's results have been especially confusing. Maybe I would understand better if I was able to see the final shows put on by some of these near-miss bands. That's only Newton's second miss, no? Answering more definitely: yes. Newton missed previously only in 2006. ... If you spot an error, let me know and I'll fix it. Perhaps it's time we compile other lists of few-miss bands?
1 miss:
Berne Union (1980)
Grove City (2009)
Hilliard Davidson (1981)
Lancaster (1980)
Marion Local (2022)
Teays Valley (1991)
2 misses:
Brunswick (1980, 1981)
Cloverleaf (1980, 1981)
Dublin Coffman (1980, 1981)
Newton (2006, 2022)
Perkins (2016, 2022)
Tri-Valley (1980, 1981)
3 misses:
Pickerington Central (1980, 1981, 1985)
4 misses:
Athens (1989, 2010, 2011, 2013)
Meadowbrook (1980, 1986, 2016, 2021)
River View (1980, 2008, 2009, 2010)
Ross (1980, 1981, 1982, 1983)
5 misses:
Fairfield (1983, 1986, 1987, 1988, 1995)
Louisville (1980, 1981, 1982, 1983, 1984)
Marietta (1980, 1986, 2019, 2021, 2022)
Warren Local (1980, 1981, 1982, 2011, 2012)
|
|
|
Post by N.E. Brigand on Oct 24, 2022 16:20:03 GMT -6
Perhaps it's time we compile other lists of few-miss bands?
... 2 misses:
Brunswick (1980, 1981) Sometimes people at these schools don't know their own history. Some Brunswick students at the 2015 state finals, upon glancing at the program book as it then was, told me that it must be in error, because their director had told them the band never earned a Superior at state prior to his arrival in the 2000s. Did he actually tell them that? (And if so, was he himself unaware?) Or did they misunderstand something he said? Because as you would know, they got a I rating nine times in the 1980s and 1990s. At that time, the state finals program book only showed results back to 1991. I'd like to think I played a small part in getting them to expand that to include the full list. (I do wish they'd kept the grid format they used to use, though.) I corresponded around that time with a director on their adjudication committee, and sent that person my own spreadsheet, in which I'd been able to include the results back to 1990 because I had a 1991 program, which listed that information. (Our college band director recruited a group of students to work state finals at Cooper Stadium in 1991 and 1992. I don't seem to have kept a book for the latter year. I'd really love to collect a full set for all years to date.) There are still some mistakes, I think. Youtube has video of Vandalia Butler performing at state in 1994 and 1995, for instance, but last I checked those performances weren't listed in the program book.
|
|
|
Post by neop on Oct 24, 2022 16:51:07 GMT -6
Sometimes people at these schools don't know their own history. ... There are still some mistakes, I think. Youtube has video of Vandalia Butler performing at state in 1994 and 1995, for instance, but last I checked those performances weren't listed in the program book. I personally have the programs from 1992-94, 2004, 2011, 2013-19, and 2021. They tend to pop up on eBay here and there. I'm still keeping an eye out for 2002-03 and 2005-10 programs so that I can see the listings for Amherst Steele. I also know Bowling Green has copies of the 1987-90, 1998, 2004-08, 2011-12, and 2016-18 programs in their library.
|
|
|
Post by N.E. Brigand on Oct 24, 2022 17:05:37 GMT -6
I personally have the programs from 1992-94, 2004, 2011, 2013-19, and 2021. They tend to pop up on eBay here and there. I'm still keeping an eye out for 2002-03 and 2005-10 programs so that I can see the listings for Amherst Steele. I also know Bowling Green has copies of the 1987-90, 1998, 2004-08, 2011-12, and 2016-18 programs in their library. Good to know! I've been to Bowling Green's library to copy some Tolkien fanzines; it hadn't occurred to me that they might have OMEA programs.
|
|
|
Post by oldarmybandguy on Oct 24, 2022 17:21:32 GMT -6
As I’ve said earlier, I really appreciate all the info about the program my kids are about to get into, so this is an open question: what is the incentive to perform at OMEA finals without scores or competition? Like, I honestly can’t fathom the cost associated with that trip to come away with nothing but a judges’ tape. Any insight is appreciated.
|
|
|
Post by N.E. Brigand on Oct 24, 2022 17:53:39 GMT -6
As I’ve said earlier, I really appreciate all the info about the program my kids are about to get into, so this is an open question: what is the incentive to perform at OMEA finals without scores or competition? Like, I honestly can’t fathom the cost associated with that trip to come away with nothing but a judges’ tape. Any insight is appreciated. Well, yes. I don't actually disagree. Although I would note the same is true of OMEA concert band adjudication (but there it's also true at the local level: ratings only), and I've never heard anyone complain about that. And it's been that way for a long, long time. From the SMBF program: Elsewhere they imply that, when OMEA got into marching band in 1980, allowing regular season competitions to keep giving placements and scores was a sop to local traditions. You might say it's a variation on the mid-show narration from the Reading Buccaneers' 2017 production: "Look in the mirror: that is your competition ... We are in competition with no one. We have no desire to play the game of being better than anyone. We are simply trying to be better than we were yesterday." (Mind you, that show was very much in competition and was in fact the winner that year by a substantial margin following the corps' unusual loss the prior year.)
|
|
|
Post by oldarmybandguy on Oct 24, 2022 18:54:43 GMT -6
I can get the mindset to a degree; it’s just a difficult mindset, especially considering the name of this message board is “Hornrank”. I would think part of the appeal for both audience and performer is an outward recognition of the drive for greater excellence. I love watching football no matter who is playing, but I still wouldn’t enjoy it if there was no winner.
|
|
|
Post by N.E. Brigand on Oct 24, 2022 19:10:41 GMT -6
I can get the mindset to a degree; it’s just a difficult mindset, especially considering the name of this message board is “Hornrank”. I would think part of the appeal for both audience and performer is an outward recognition of the drive for greater excellence. I love watching football no matter who is playing, but I still wouldn’t enjoy it if there was no winner. I suppose another argument OMEA might make is that professional musical organizations generally aren't in competition with one another. The Cleveland Orchestra and the New York Philharmonic aren't meeting up in a big match, vying for the highest score. That said, there are contests for younger adult musicians, e.g., international piano competitions. There are the Grammys (for the pop side mostly), but like the Oscars, it's broadly understood that those are to some degree popularity contests. There is the Pulitzer for American composers, but we might remember the example of Charles Ives, who upon winning that award is reported to have said either "Prizes are for boys, and I'm all grown up," or "Prizes are the badges of mediocrity." Maybe he said both.
|
|
|
Post by neop on Oct 26, 2022 12:34:25 GMT -6
Looks like Sylvania Southview received a rating of II, but did get some Is in individual categories. Impressive showing for their first competition!
|
|
|
Post by N.E. Brigand on Oct 26, 2022 12:45:42 GMT -6
As for Swanton ... I have to think this will be tight between Van Buren and North Royalton, and if I were betting, my money would be on Van Buren to win. I would have lost that bet, although possibly Van Buren took both music and visual? North Royalton took G.E. (and percussion and guard). Van Buren did indeed take music and visual, so the margin between first and second place was likely very close. (Initially I wrote that this meant Van Buren was in first place with three of the five judges whose determinations contribute to the overall score and rating, but that's not necessarily so. North Royalton could have been first with one music judge, but by a smaller margin than Van Buren was ahead with the other music judge. On the other hand, Van Buren could have been on top with one G.E. judge.)
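The caption arithmetic in that parenthetical can be made concrete with a toy example. Everything below is hypothetical (made-up judge counts and scores, not OMEA's actual sheet, weights, or panel makeup): it just shows how one band can take the combined music caption and the visual caption, and even trail with one of the two music judges along the way, while the other band wins overall on general effect.

```python
# Hypothetical panel: five judges (two music, one visual, two general
# effect), each scoring out of 20, summed into a 100-point total.
# These numbers are invented for illustration only.

def total(scores):
    """Overall score: the sum of all five judges' numbers."""
    return sum(scores.values())

band_a = {"music1": 18.0, "music2": 16.5, "visual": 17.0, "ge1": 15.0, "ge2": 15.5}
band_b = {"music1": 16.8, "music2": 17.0, "visual": 16.5, "ge1": 17.5, "ge2": 17.5}

# Band A takes combined music (34.5 vs 33.8) and visual (17.0 vs 16.5)...
assert band_a["music1"] + band_a["music2"] > band_b["music1"] + band_b["music2"]
assert band_a["visual"] > band_b["visual"]

# ...even though Band B was ahead with one of the two music judges...
assert band_b["music2"] > band_a["music2"]

# ...and Band B still wins overall on the strength of general effect.
assert total(band_b) > total(band_a)
```

So "won music and visual" constrains the recap much less than it sounds: with caption scores summed into one total, a modest G.E. edge can outweigh two caption wins.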
|
|
|
Post by ohioguy2 on Oct 27, 2022 8:31:17 GMT -6
OMEA Nelsonville-York 10/22
Class AA
1. Hilliard Davidson 92.2 (I) 2. Reynoldsburg 84.2 (I)
Class A
1. Berne Union 91.0 (I) 2. Dawson-Bryant 85.8 (I) 3. Athens 81.1 (I) 4. Chillicothe 78.8 (I)
Class B
1. Meigs 79.2 (II) 2. Gallia Academy 75.6 (II) 3. Wellston 74.72
Class C
1. Belpre 81.9 (I) Ex. Nelsonville-York (II)
OMEA Forest Hills 10/22
Class AA
1. Lakota East 94.0 (I) 2. Walnut Hills 89.0 (I) 3. Dublin Scioto 83.74 (I) 4. Loveland 82.9 (I) Rating Only - Sycamore (I) Ex Forest Hills (I)
Class A
1. Ross 87.6 (I) 2. Piqua 84.2 (I) 3. Walter E. Stebbins 82.5 (I)
Class B
1. Finneytown 88.25 (I) 2. Madeira 85.5 (I) 3. Arcanum 83.4 (I) 4. Deer Park 81.2 (I) 5. Clinton-Massie 80.4 (I) 6. Norwood 71.9 (II)
Class C
1. Grove City Christian 76.9 (II) 2. Blanchester 72.7 (II)
OMEA Thomas Worthington 10/22
Class AA
1. Grove City 93.1 (I) 2. Dublin Jerome 89.0 (I) 3. Worthington Kilbourne 88.2 (I) 4. Pickerington Central 85.36 (I) 5. Reynoldsburg 82.19 6. Delaware Hayes 81.26 (I) Ex Thomas Worthington (I)
Class A
1. Bishop Watterson 79.6 (I)
Class B
1. Heath 80.8 (I) 2. London 76.5 (II) 3. Buckeye 73.75 (II)
Class C
1. Adena 74.6 (II) 2. Madison-Plains 73.7 (II) 3. New London 70.46 (II) 4. Westfall 68.12 (III)
OMEA Tri-Valley 10/22
Class AA
1. Dublin Coffman 93.68 (I) 2. Hilliard Darby 93.06 (I) 3. Canal Winchester 84.7 (I) 4. Watkins Memorial 80.44 (I)
Class A
1. Point Pleasant 84.86 (I) 2. Zanesville 74.06 (II)
Class B
1. John Glenn 84.5 (I) 2. Licking Valley 83.76 (I) 3. Meadowbrook 83.0 (I) 4. Maysville 77.66 (II) 5. Morgan 73.3 (II)
Class C
1. Philo 83.28 (I) 2. Crooksville 77.42 (II) 3. Fort Frye 77.36 (II)
OMEA Tecumseh 10/22
Class AA
1. Marysville 84.38 (I)
Class A
1. Wilmington 85.5 (I) 2. Greenville 84.06 (I) 3. Tippecanoe 81.84 (I) 4. Bellefontaine 78.7 (II) Ex Tecumseh (II)
Class B
1. Versailles 85.6 (I) 2. Northwestern 82.9 (I) 3. Carlisle 81.0 (I) 4. Waynesville 73.6 (II) 5. Ottawa-Glandorf 71.7 (II)
Class C
1. Covington 81.6 (I) 2. Lynchburg-Clay 81.0 (I) 3. Columbus Grove 77.7 (I) 4. Bethel 74.4 (II) 5. Marion Local 73.4 (II) 6. Northeastern 69.58 (III)
OMEA New Philadelphia 10/22
Class AA
1. Nordonia 88.004 2. Hoover 86.992
Class A
1. Louisville 88.884 2. Revere 84.794 (I) 3. Crestwood 84.362 (I) 4. Marietta 75.564 (II)
Class B
1. Ridgewood 87.364 (I) 2. St. Clairsville 85.43 (I) 3. Coventry 81.788 (I) 4. Indian Valley 81.132 (I) 5. Union Local 78.114 (II) 6. River View 77.946 (II) 7. Marlington 77.934 (II)
Class C
1. Coshocton 74.214 (II) 2. Crestview 67.062 (III)
OMEA Lancaster 10/22
Class AA
Ex. Lancaster (I)
Class A
1. Berne-Union 92.8 (I) 2. Dawson-Bryant 86.64 (I) 3. Franklin Heights 81.78 (I) 4. Hamilton Township 80.02 (I) 5. Chesapeake 79.5 (II)
Class B
1. Eastern Brown 86.8 (I) 2. Bloom Carroll 84.1 (I) 3. Highland 82.56 (I) 4. Rock Hill 82.44 (I) 5. Northwest 81.78 (I) 6. Minford 77.5 (II) 7. Southeastern 76.96 (II)
Class C
1. Belpre 80.6 (I) 2. Fairbanks 73.2 (II) 3. Millersport 71.8 (II)
OMEA Shawnee 10/22
Class AA
1. Middletown 87.0 (I) 2. Marysville 84.5 (I) 3. Dublin Scioto 83.74 (I) Rating Only - Sycamore (I)
Class A
1. Edgewood 83.02 (I) 2. Newton 79.58 (II) 3. Bellefontaine 78.33 (II)
Class B
1. Northwestern 81.3 (I) 2. Deer Park 79.5 (I) 3. Carlisle 77.38 (II) 4. Ottawa-Glandorf 71.7 (II) 5. London 71.2 (II)
Class C
1. Lynchburg Clay 81.9 (I) 2. Madison-Plains 77.6 (II) 3. Northeastern 72.14 (II) Ex Shawnee (II)
OMEA Newark 10/22
Class AA
1. Hilliard Bradley 91.42 (I) 2. Licking Heights 90.31 (I) 3. Dublin Jerome 89.72 (I) 4. Westland 86.57 (I) 5. Pickerington Central 85.28 (I) 6. Watkins Memorial 78.5 (I) 7. Lancaster 79.65 (II) 8. Big Walnut 76.32 (II) 9. Groveport Madison 74.66 (II) Ex. Newark (I)
Class A
1. Lexington 85.42 (I) 2. Point Pleasant 81.62 (I) 3. Rock Hill 78.48 (II)
Class B
1. Licking Valley 83.08 (I) 2. Indian Valley 80.84 (I) 3. River View 80.15 (I)
OMEA Swanton 10/22
Class AA
1. North Royalton 87.22 (I)
Class A
1. Springfield 85.26 (I) 2. Sylvania Southview 79.43 (II)
Class B
1. Van Buren 86.76 (I) 2. Rossford 84.38 (I) 3. St. John's Jesuit 81.72 (I) 4. Wauseon 78.39 (II)
Class C
1. Fort Recovery 80.36 (I) 2. Ada 80.3 (I) 3. Kalida 79.94 (I) 4. Arcadia 79.51 (I) 5. Woodmore 76.92 (II)
MSBA Miamisburg 10/22
Class AAAAA
1. Larry A. Ryle 83.9 2. Campbell County 82.9
Class AAAA
1. Olentangy 78.8 2. Central Crossing 73.8 3. Marion Harding 71.0 4. Westerville 69.75 5. West Carrollton 68.3 6. Hamilton 64.0
Class AAA
1. Dixie Heights 70.2 2. Colerain 65.8
Class AA
1. Fairborn 67.6 2. Greenon 64.2 3. John Hardin 61.3 4. Goshen 59.0 5. Newton 54.1
Class A
1. Indian Hill 60.4 2. Simon Kenton 56.3 3. Switzerland County 54.7 4. National Trail 48.3 5. Bardstown 44.4 6. Rising Sun 43.6
|
|
|
Post by oldarmybandguy on Oct 27, 2022 9:18:13 GMT -6
I know scoring in OMEA isn't the same as BOA...curious to see if Lakota East, or their in-town rival, Lakota West makes it to Semi-finals. Given the field, it could be possible.
|
|
|
Post by N.E. Brigand on Oct 27, 2022 12:20:42 GMT -6
I would have lost that bet, although possibly Van Buren took both music and visual? North Royalton took G.E. (and percussion and guard). Van Buren did indeed take music and visual, so the margin between first and second place was likely very close. (Initially I wrote that this meant Van Buren was in first place with three of the five judges whose determinations contribute to the overall score and rating, but that's not necessarily so. North Royalton could have been first with one music judge, but by a smaller margin than Van Buren was ahead with the other music judge. On the other hand, Van Buren could have been on top with one G.E. judge.) Close, but not as close as I would have guessed:
OMEA Swanton 10/22
Class AA
1. North Royalton 87.22 (I)
Class A
1. Springfield 85.26 (I) 2. Sylvania Southview 79.43 (II)
Class B
1. Van Buren 86.76 (I) 2. Rossford 84.38 (I) 3. St. John's Jesuit 81.72 (I) 4. Wauseon 78.39 (II)
Class C
1. Fort Recovery 80.36 (I) 2. Ada 80.3 (I) 3. Kalida 79.94 (I) 4. Arcadia 79.51 (I) 5. Woodmore 76.92 (II)
A half point puts big North Royalton here ahead of little Van Buren by as much as North Royalton was behind Nordonia and Amherst two weeks earlier. (Speaking of Amherst, here's hoping the Copley scores cross your path.) Also, what a score for Southview to put up in their first ever OMEA appearance! And let me just offer you a huge thanks for gathering and sharing all these scores for the past two years. That's been enormously helpful in shining some much needed light on the state of the activity in Ohio.
|
|
|
Post by neop on Oct 27, 2022 14:43:47 GMT -6
(Speaking of Amherst, here's hoping the Copley scores cross your path.) I've been told the scores for 1st and 2nd place. Amherst Steele scored 91.66, and Brunswick scored 89.04.
|
|