The use of instant replay had its first tryout in a U.S. competition recently. The Glacier Falls Figure Skating Club (of Anaheim, California) made use of an instant replay system at its 2004 Summer Classic competition. As far as we are aware, this is the first time a video replay system has been used at a domestic competition in the U.S. The competition ran for four days and had about 330 entrants and over 600 starts.
More than just the introduction of replay into U.S. competition, this was also the first step in a series of tests the club is undertaking to introduce the new judging system into local competitions in a way that is both practical and cost-effective for a typical USFSA club competition.
If this had been an ISU test, the result, regardless of actual experience, would be that the test was a complete success, everybody thought it was just fabulous, nobody had any issues with it, nothing went wrong, the skaters and coaches all loved it, and everybody wondered why this wasn't done years ago. That would be followed by enthusiastic cheerleading testimonials from skaters and coaches, but no useful information about what was done or what happened in the test.
This, however, was a real test, and this is a real test report. Overall it was a great success, with only minor problems discovered. A great deal was learned about what works and does not work in using instant replay, and hardware capable of running the new judging system, in a typical ice rink.
This article describes the results of this test and provides a "how to" look at the system. Note that this was not a USFSA-endorsed test or project. It was undertaken by the club on its own in order to be prepared to use the new judging system when it is adopted by USFSA, and to gain some practical experience before that happens.
Each judge was provided with an LCD computer screen on which real-time video of the skating was displayed. Five of the seven screens were touchscreens and two were normal LCD flat-panel displays with touchpads. The purpose of using the touchpads was to see if the added cost of touchscreens could be avoided, to make the system more economical.
The result of this part of the test was that saving the video clips using the touchpads was easy enough, but the judges all found navigating through the program for playback awkward and time consuming. Even though the use of touchscreens raises the system cost by about $250 per screen, it was concluded that they are well worth the expense and essential to the system. CRT touch monitors could be used instead, which would also provide some cost savings, but these are too big, too heavy, and consume too much power to be a practical choice.
The screens used were 15-inch LCD flat-panel displays with 1024 × 768 resolution. They afford a large, clear display of the video clips, which were digitized at 640 × 480 resolution, and there is enough area left over on the displays for the user interface needed to implement the new judging system.
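For a sense of how much room that leaves, a quick layout calculation is sketched below. This is an illustration only; the actual placement of the video window and controls is up to the software.

    # Screen real estate available for the judging interface: a 640 x 480 video
    # window on a 1024 x 768 panel leaves a side column and a bottom strip free.
    screen_w, screen_h = 1024, 768
    video_w, video_h = 640, 480
    print(f"Side column:  {screen_w - video_w} x {screen_h} pixels")   # 384 x 768
    print(f"Bottom strip: {screen_w} x {screen_h - video_h} pixels")   # 1024 x 288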
This was a self-serve system. In ISU competition there is a video cameraman and a replay technician. In this competition, however, the video feed was provided by the competition videographer, Hurd Video, and the judges selected the video clips themselves. By taking this approach, the cost of video equipment and personnel is eliminated. Competitions that do not make use of an official videographer, or have one that cannot, or will not, provide a video feed, would have to come up with their own video equipment and cameraman, raising the cost.
In addition to the touchscreens, the system consists of two small cubical equipment racks, about 2 ft (60 cm) on a side, plus a video distribution amplifier and a network switch. The total amount of equipment (including miscellaneous supply boxes and several small tables on which the touchscreens are placed) takes up about 100 cubic feet (2.8 cubic meters) and fills the entire bed of a full-size pickup truck to a depth of three feet (0.9 m).
Clubs configure their judging stand in a variety of ways, depending on the layout of their arenas. In some rinks there is enough room to use an actual stand with tables and office chairs, as is used at Nationals or in ISU competition. In others, the judges sit on director's chairs shoulder to shoulder, with nothing but clipboards to write on. In many others, removable table-tops are clamped onto the rink boards. This latter approach is the one used at the Summer Classic.
The Disney Ice Arena, where the competition is held, has ample room in its hockey boxes to place seven judges, the referee, and all the equipment and cables without crowding the judges. The touchscreens were placed next to each table-top on pedestal-type tables with a 12 in (30 cm) diameter top and an 18 in (45 cm) octagonal base. Their height was chosen so that the middle of each screen was at the top of the boards. This afforded the judges a comfortable view and reach without blocking their view of the ice. This approach worked very well and was very easy to implement. The pedestal tables took two afternoons to make and cost about $20 each.
Installation of the system, from the time the truck pulled up at the back door to the time the system was up and running, took about 2 1/2 hours with two people doing the work. Although the rink was not used by hockey or the public after each day of the competition, it was decided to lock up the touchscreens and computers each night but leave the rest of the installation intact. It was found that this part of the system could be stowed each night in about 10-15 minutes and set up the next morning in about 20-30 minutes, again with two people doing the work. In rinks where a complete teardown and setup are required each night, these times would probably double.
At the end of the competition, final teardown of the system from shutdown to final loading of the truck took two people about 1 hour.
One very basic question in this test was whether the system would blow a circuit breaker in the rink. Sufficient power cords were on hand to run off two circuits if required, but that proved unnecessary. A single 15 amp circuit was able to handle the load. Disney Ice, however, is a fairly modern rink with a good electrical infrastructure. In rinks where that is not the case, using a system of this type will require careful forethought and planning to ensure there is an adequate, uninterrupted source of electricity.
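A back-of-the-envelope power budget suggests why one circuit was enough. The wattages below are assumptions for illustration, not measurements taken at the competition.

    # Rough power budget for a single 15 A, 120 V circuit.
    # All wattages are assumed values, not measured ones.
    loads_w = {
        "eight 15-inch LCD screens": 8 * 35,   # assumed ~35 W per panel
        "two rack computers":        2 * 200,  # assumed ~200 W per computer
        "video amp and switch":      50,       # assumed
    }
    total = sum(loads_w.values())
    available = 15 * 120                       # 1800 W available
    print(f"Estimated load: {total} W of {available} W available")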
In the self-serve approach, the judges select the things they want to review and replay them themselves. Video clips are captured after the event to be reviewed occurs. The system captures the prior 8 seconds for each clip. In this test a maximum of four clips could be captured for each performance.
For replay, the judge can call up the last 1, 2, 4 or 8 seconds of the clip. The clips can be stopped, restarted, stepped through frame by frame, or played back at double, half or quarter speed.
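A minimal sketch of how a "save the prior 8 seconds" feature can be built on a rolling frame buffer is shown below. This illustrates the general technique only; it is not the club's actual software, and the 30 frames-per-second capture rate is an assumption.

    from collections import deque

    FPS = 30                 # assumed capture frame rate
    BUFFER_SECONDS = 8       # the system keeps the prior 8 seconds
    MAX_CLIPS = 4            # clips allowed per performance in this test

    class ReplayBuffer:
        """Rolling buffer that always holds the most recent 8 seconds of frames."""

        def __init__(self):
            self.frames = deque(maxlen=FPS * BUFFER_SECONDS)
            self.clips = []

        def add_frame(self, frame):
            # Oldest frame drops off automatically once the buffer is full.
            self.frames.append(frame)

        def save_clip(self):
            # Called when a judge touches "save": freeze the last 8 seconds.
            if len(self.clips) < MAX_CLIPS:
                self.clips.append(list(self.frames))

        def playback(self, clip_index, seconds=8, speed=1.0):
            # Yield the last 1, 2, 4 or 8 seconds of a saved clip.
            # speed=2.0 skips every other frame; speed=0.5 or 0.25 repeats
            # frames for slow motion; frame-by-frame stepping is just
            # indexing into the saved list one frame at a time.
            clip = self.clips[clip_index]
            frames = clip[-int(seconds * FPS):]
            if speed >= 1:
                step, repeats = int(speed), 1
            else:
                step, repeats = 1, int(round(1 / speed))
            for i in range(0, len(frames), step):
                for _ in range(repeats):
                    yield frames[i]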
One goal of this test was to see how many clips the judges really need and how long they should be. Eight seconds is enough to review errors in jumps, and spins that might be short, but not enough to review entire step sequences. This limitation was chosen to prevent the delay between skaters from becoming excessive, since time is money at a local competition.
When the club applied for its competition sanction, it was warned that instant replay would blow the competition schedule, but that was found not to be the case.
In this competition the Referee scheduled an extra 20 seconds after each performance in the Short Programs to review video clips. No extra time was scheduled in any other event segment.
It was found that the judges rarely saved all four video clips they were allowed, so the choice of four was a pretty good starting point. Nevertheless, the intent is to increase the capability of the system to support a greater number of clips. For reviewing spins in which the skater rotates slowly, it was found that slightly longer clips might occasionally be useful, say 10 seconds maximum. In only one case, however, did a judge feel limited by the lack of ability to capture a full step sequence. Adding one long clip for this purpose is under consideration.
Because of the importance of getting deductions correct in the short program, the extra 20 seconds proved worthwhile. In the free skating, however, the judges were mainly interested in checking the landings of jumps. Even though no extra time was scheduled, after a day of experience most judges were able to review 2 or 3 jump landings in the typical time scheduled between skaters in a local competition.
The experience at this competition was that the use of replay added less than one hour to the total competition schedule and did not cause the competition to run late. On the other hand, had 20 seconds been added to each start, the competition schedule would have been extended by one hour a day, at a cost of about $1,200 for added ice time (equivalent to about $4 per entrant).
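The arithmetic behind those numbers is straightforward; the sketch below redoes it, with the per-day start count taken as an assumption from the roughly 600 starts spread over four days.

    # Reproducing the scheduling arithmetic above. The per-day start count
    # is an assumption derived from ~600 starts over four days.
    starts_per_day = 600 / 4                      # about 150 starts a day
    extra_seconds = 20                            # per start, as in the short programs
    added_hours = starts_per_day * extra_seconds / 3600
    print(f"Added ice time per day: {added_hours:.1f} hours")   # roughly one hour

    ice_cost = 1200                               # added ice cost for the competition
    entrants = 330
    print(f"Cost per entrant: ${ice_cost / entrants:.2f}")      # about $4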
The system operated reliably for the four days it was in use. As with any new software put to its first full test, the judges did manage to find unexpected combinations of buttons to push that had unintended consequences, but these problems have since been corrected. For the most part the system was operated without a technician in attendance, but it was clear that when used for actual scoring, a technically qualified person must be on hand at the judges' stand at all times to provide assistance to the judges and quickly resolve any technical problems that might arise.
Overall, nearly all the judges used the system to some extent, and many were very enthusiastic about it. Some clubs expressed interest in using the system in their own competitions, and it is available to local clubs for that purpose.
Due to the mathematical characteristics of the new judging system, skaters would be best served if local competitions were judged by a minimum of nine judges, and qualifying competitions by 11 or more. We also assume that random selection of judges will not be used in U.S. competitions, and that the scores from all judges on a panel will count.
Obviously, the greater the number of judges used, the greater the cost of the hardware and the greater the cost to the competition for the added officials. Aside from the issue that random selection of judges frequently skews the results and produces the wrong answer, clubs cannot afford the expense of officials whose marks don't actually contribute to the results.
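To illustrate the concern about random selection, consider the toy simulation below. It is purely illustrative; the score model, the noise level, and the five-of-nine draw are assumptions, not the actual mathematics of the new judging system. When two skaters are close, counting only a random subset of the marks makes it more likely the weaker skater comes out ahead.

    import random

    # Toy Monte Carlo: two skaters whose true scores differ by 0.3 points are
    # marked by nine judges, each with random scoring noise. Compare how often
    # the better skater wins when all nine marks count versus when a random
    # five-judge subset is counted.
    def better_skater_wins(counted, panel=9, gap=0.3, noise=1.0):
        a = [70.0 + gap + random.gauss(0, noise) for _ in range(panel)]
        b = [70.0 + random.gauss(0, noise) for _ in range(panel)]
        idx = random.sample(range(panel), counted)
        return sum(a[i] for i in idx) > sum(b[i] for i in idx)

    trials = 20000
    all_nine = sum(better_skater_wins(9) for _ in range(trials)) / trials
    random_5 = sum(better_skater_wins(5) for _ in range(trials)) / trials
    print(f"Better skater wins {all_nine:.0%} with all nine marks, "
          f"{random_5:.0%} with a random five-judge draw")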
It is our current view that the bare minimum approach to using the new judging system at a local competition requires seven judges, with the referee and one caller sharing an eighth screen. With adequate spare hardware, supplies, cables and packing cases the total cost for a turnkey system is about $20,000. With nine judges and a second caller, the cost would increase by about $6,000.
The system described here was packaged for local use and lovingly transported by local volunteers. If the system were to be transported by a commercial shipper, more robust packaging would be required, increasing the cost by about $2,000. In addition, the cost of transportation would have to be added to the cost of use at each competition.
It is the author's experience, from frequently transporting hardware of similar complexity in his day job, that a system such as this has a lifetime of about four years. For cost of ownership one should assume repair and maintenance costs of at least $1,000 per year.
The cost of a system such as this is beyond the means of most clubs in the U.S., and given that one club would use it no more than a few days a year, justifying the expense would be difficult. However, anything less than a computer system with replay capability makes use of the new scoring system unmanageable and is a disservice to the competitors.
On a regional basis, the system described here is cost effective for a group of clubs sharing use of the hardware. For example, 20 clubs in an interclub association that purchased a system and used it 50 days a year would recover the cost of purchase and operation over the four-year lifetime of the system with a rental fee of $120 per day.
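The break-even arithmetic follows directly from the figures quoted earlier in this article.

    # Checking the cost-recovery claim against the figures quoted above.
    purchase = 20_000           # turnkey system for seven judges plus referee/caller
    maintenance = 1_000 * 4     # at least $1,000 per year over the ~4-year lifetime
    income = 120 * 50 * 4       # $120 per day, 50 days a year, for 4 years
    print(f"Cost of ownership: ${purchase + maintenance:,}")   # $24,000
    print(f"Rental income:     ${income:,}")                   # $24,000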
The hardware and replay software are now being tweaked based on the experience gained in this first test. The next step is to take the scoring software and caller software, which currently exist as separate programs, and integrate all aspects of the new judging system into a single program. Once that is completed, further operational testing will begin.
Following the Junior Grand Prix in Long Beach, which the club is hosting, the user interface will be tested in local competition, and a mock competition with events at all levels will be held. If these two operational tests are successful, the system is expected to be ready for use with the new judging system by early October 2004. At that point use by shadow panels in actual competition would be feasible. It is the goal of the club to begin use of the new judging system in its competitions beginning in calendar 2005.
Copyright 2004 by George S. Rossano
11 August 2004