New Judging System Debuts in U.S. Domestic Competition

The first domestic competition in the U.S. judged entirely under the new judging system was held by the Glacier Falls Figure Skating Club of Anaheim, Calif. on April 23, 2005.  This was a club competition, meaning entries were limited to club members, as opposed to an open competition, which any USFSA skater may enter.  Unlike previous tests of the new judging system at some U.S. competitions over the past year, all events were judged under the new system, and those scores were used to calculate the official results of the competition.

Nearly 80 skaters performed their routines, ranging from the Basic 2 level (6- to 8-year-old skaters) through the Junior level.  In addition to Short Programs and Long Programs, Compulsory Moves events and Artistic events were judged using the new system.  The role of Technical Specialist was filled by a local National Judge, Wayne Hundley, and a local PSA Master-rated coach, Sherri Terando.  Three of the club's higher-level skaters served as Assistant Technical Specialists and data entry operators.

Since the new judging system was not designed for events below the Junior level, and event requirements for judging under the new system have yet to be finalized in the U.S., the club created its own set of guidelines for those events and made a number of small modifications to customize the system for the U.S. competition structure.  For Novice events and above, however, the standard ISU requirements were used.

Requirements for the number of elements were established for every event; in all but one case these turned out to agree with requirements recently put forward by the USFSA scoring system implementation committee.  Weighting factors were chosen for each event level and type of event to balance the value of elements against program components.  At the lowest levels some of the program components were not used, and deduction penalties were reduced, since point totals at those levels are fairly small compared to senior-level events.
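The general shape of such a calculation, as a minimal sketch: element points are added to the program components, which are scaled by a weighting factor appropriate to the event level, and deductions are subtracted.  The function and all values below are illustrative only, not the club's actual factors.

```python
def segment_score(element_points, component_marks, component_factor, deductions):
    """Element points plus factored program components, minus deductions.

    component_factor scales the components so their weight relative to
    the technical elements suits the event level (illustrative only).
    """
    return element_points + sum(component_marks) * component_factor - deductions

# Hypothetical lower-level event: only three components used, a small
# factor, and a reduced deduction (all numbers made up).
print(segment_score(element_points=14.3,
                    component_marks=[3.25, 3.00, 3.50],
                    component_factor=1.0,
                    deductions=0.5))  # prints 23.55
```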

To score the competition, a full hardware system with instant replay capability was used.  This hardware supports the use of seven judges and a caller, and cost about $18,000.  The competition was scored using seven judges on each panel, the minimum number we feel is necessary to produce result calculations of acceptable accuracy.  A single trimmed mean was used in the scoring calculation; neither random selection of judges nor judge anonymity was used.  Posted results identified each judge and their marks, the total points each judge awarded each skater, and the order of finish determined from each judge individually.  The skaters also received an ISU-style protocol of all their marks.
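For reference, a single trimmed mean discards the highest and lowest marks on the panel and averages the remainder, so with seven judges the middle five marks are averaged.  A minimal sketch (the panel data are made up):

```python
def single_trimmed_mean(marks):
    """Average the marks after dropping the single highest and lowest."""
    if len(marks) < 3:
        raise ValueError("need at least three marks to trim both ends")
    trimmed = sorted(marks)[1:-1]
    return sum(trimmed) / len(trimmed)

# A hypothetical seven-judge panel's grades of execution for one element
panel = [1.0, 0.0, 1.0, 2.0, 1.0, -1.0, 0.0]
print(single_trimmed_mean(panel))  # 0.6, the mean of the middle five marks
```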

Events where only one skater entered were judged like all the other events, with seven judges marking the program.  One nice aspect of the new system is that these skaters receive a set of marks and a point total they can use to gauge their progress, even in competitions where they have no one to skate against.

To limit the number of officials required (saving on manpower and expenses), the competition made use of a Technical Specialist (TS) and Assistant Technical Specialist (ATS).  There was no Technical Controller, no dedicated data entry operator, and no dedicated instant replay operator; the ATS served as both the data entry operator and the replay operator.  Element reviews were decided by a vote of the TS, ATS, and the event Referee.  The video feed for the instant replay was provided by the event videographer, so the competition did not incur any cost for video equipment or a camera operator.

Due to a scheduling conflict at the rink, setup of the hardware was not completed until two hours into the competition.  This little "wrinkle" gave the competition the unintended "opportunity" to test three different calculation approaches and compare them directly in a real competition environment.

For all events, the judges were asked to record their marks on manual scoring sheets, and the ATS recorded the calls of the TS.  For the first third of the competition, the accountants manually entered the calls from the TS worksheet and the marks from each judge's worksheet.  This process was painfully slow and significantly delayed the posting of results.  The experience made it abundantly clear that both a fully manual system and a caller-less system are completely unrealistic, in terms of the required activities both on the judges' stand and in the accounting room.  So much so that it would seem better not to hold a competition at all than to use this method!

For the middle third of the competition a semi-automated calculation process was used.  A data file created by the "caller" computer was used to input the calls and deductions directly into the accounting program.  The marks of the judges were then entered manually by the accountants from the scoring sheets, using a graphical interface.  After a little practice, the accountants were able to calculate event results nearly as quickly as they do in the current system under closed judging.
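The article does not describe the format of the caller computer's data file, but the idea is simple: the accounting program reads the called elements per skater and waits only for the judges' marks.  A sketch assuming a simple comma-separated layout (the file format and field names below are entirely hypothetical):

```python
import csv

def load_calls(path):
    """Read element calls exported by the caller computer into a dict
    keyed by skater, ready for the accountants to attach judges' marks.
    Assumed (hypothetical) columns: skater, element_code, base_value."""
    calls = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            calls.setdefault(row["skater"], []).append(
                (row["element_code"], float(row["base_value"])))
    return calls
```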

For the last third of the competition the full hardware system was used.  The accounting process was completed using the computer files generated by the system, and it took less than 2 minutes to calculate and print out the results sheets for an individual event.

Based on this most recent experience, the Glacier Falls FSC remains committed to refining the process of scoring local competitions with full hardware support.  Most USFSA clubs, however, cannot afford to do this, or lack the manpower to implement this approach on their own.  One lesson learned from this competition is that the semi-automated approach works better than we would have predicted beforehand.  It does not adversely affect the timing or cost of a competition, and it does not require major changes to the current practices of holding one.  It is also very cost-effective: about $1,500 for one caller computer with instant replay (which the callers made good use of during this competition), or, for a club that already has a sufficiently powerful computer, about $700 to upgrade it to a "caller" computer with instant replay.

While using the semi-automated approach during the competition, we found that simply having computer displays available to each judge was a great way to distribute the calls.  Thus, one enhancement to the semi-automated approach, demonstrated to work well at this competition, is to give each judge a "dumb" screen with the list of calls and a video feed.  With these screens, the judges have an up-to-the-moment display of the calls, and they can also see the replays viewed by the caller.  The cost for this is about $175 per judge.

Other than the delay in posting the marks, the skaters and coaches were enthusiastic about the new process.  The skaters were particularly excited about seeing their protocols and studying their marks.  Overall, "the great experiment" proved extremely useful in providing real-world experience and information about how the new judging system can be implemented at the lower levels of the USFSA competition structure, a subject sure to receive considerable debate at the USFSA Governing Council meeting next month.


Copyright 2005 by George S. Rossano