In preparation for their square-off in Sunday's Super Bowl LVII, players and coaches from the Kansas City Chiefs and Philadelphia Eagles have spent hours poring over video footage of each other's games, hoping to find an edge that will make a difference in the title clash.

It’s a practice that goes back to the early days of the NFL, well before the advent of videotape recording, and one that, even with a host of new digital tools, still requires a human to watch the action to see what’s really happening.

But new research from a team at BYU could make that drudgery a thing of the past, thanks to a system under development that leverages artificial intelligence to create a new, and very efficient, analysis tool.

BYU professor D.J. Lee, master's student Jacob Newman, and Ph.D. students Andrew Sumsion and Shad Torrie are using AI to automate the time-consuming, manual process of analyzing and annotating game footage. Using deep learning and computer vision, the researchers have created an algorithm that can consistently locate and label players from game film and determine the formation of the offensive team — a process that currently requires a slew of video assistants.

The first seeds of the research were sown amid the pandemic, when Lee connected with analysts from BYU’s football program to talk about how AI-driven, automated film analysis might help the program.

“They showed us their current process, which does incorporate some software tools, but still requires someone to scroll through video footage of games,” Lee said. “We thought, if we could automate this it would save hours of manual work and change the whole system.”


To begin the work, Lee and his team analyzed hours of game video provided by BYU football staffers — but realized the camera angles were all wrong and didn’t provide consistent, unobstructed views of what all the players were doing at the same time.

Flummoxed by the inconsistent video footage, the BYU researchers turned to a wildly popular and hyper-realistic video game to help solve the dilemma.

“We came up with the idea to use the Madden NFL video game,” Lee said. “It actually allows you to position the camera view in a variety of different places. It has some limitations, but it was a much better way to train our algorithm, one that gave us control and was more consistent.”

They used those images to train a deep-learning algorithm to locate the players, which then feeds into a Residual Network framework to determine what position the players are playing, according to the researchers. Finally, their neural network uses the location and position information to determine what formation (of more than 25 formations) the offense is using — anything from the Pistol Bunch TE to the I Form H Slot Open.
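The three-stage pipeline described above — detect player locations, classify each player's position, then map the labeled locations to a formation — can be sketched in simplified form. The code below is an illustrative toy, not the researchers' implementation: the class names, coordinate conventions, and the geometric rule for spotting an I formation are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Player:
    """A detected player, as stage one of the pipeline might output it."""
    x: float        # lateral offset along the line of scrimmage (yards)
    y: float        # depth behind the line of scrimmage (yards)
    position: str   # label from stage two, e.g. "QB", "RB", "FB", "C", "WR"

def classify_formation(players: list[Player]) -> str:
    """Toy stage three: an 'I Form' stacks C, QB, FB and RB in one column."""
    backfield = [p for p in players if p.position in {"C", "QB", "FB", "RB"}]
    if len(backfield) == 4:
        spread = max(p.x for p in backfield) - min(p.x for p in backfield)
        if spread < 0.5:  # all four nearly aligned laterally -> a vertical stack
            return "I Form"
    return "Unknown"

# Hypothetical detections for a classic I formation.
offense = [
    Player(0.0, 0.0, "C"),
    Player(0.0, 1.5, "QB"),
    Player(0.1, 5.0, "FB"),
    Player(0.0, 7.0, "RB"),
    Player(-8.0, 0.0, "WR"),
]
print(classify_formation(offense))  # → I Form
```

A real classifier covering the more than 25 formations the researchers handle would, of course, use a trained neural network on the full set of player locations and labels rather than hand-written rules like this.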

Lee said the algorithm can accurately identify formations 99.5% of the time when the player location and labeling information is correct. Some formations, like the I, in which four players (center, quarterback, fullback and running back) line up one directly behind the next, proved among the most challenging to identify due to obstruction issues.

The BYU algorithm is detailed in a journal article, “Automated Pre-Play Analysis of American Football Formations Using Deep Learning,” recently published in the journal Electronics as part of a special issue, Advances of Artificial Intelligence and Vision Applications.

Lee said the research is in the early stages but has the potential to be expanded beyond identifying formations to tracking, and annotating, individual player movement after the ball is snapped and a play commences.

Some changes in how game video is shot will be required to take full advantage of the developing AI tool. Lee said higher camera placement, like one set up where most stadiums house press boxes, would capture a full-field, unobstructed view that would work well with how the algorithm functions.

The process could also be adapted, Lee said, to provide analysis for other team sports like baseball or soccer. Lee underscored that the tool is not designed or intended to replace team coaches and analysts but rather to provide a new resource that makes the work of game preparation more efficient and more accurate. And, he noted, the tool could give a competitive edge to the teams that put it to use.

“Once you have this data there will be a lot more you can do with it,” Lee said. “You can take it to the next level.

“Big data can help us know the strategies of this team, or the tendencies of that coach. It could help you know if they are likely to go for it on fourth down and two or if they will punt. The idea of using AI for sports is really cool, and if we can give them even 1% of an advantage, it will be worth it.”