In 2016, the artificial intelligence program AlphaGo defeated 18-time world-champion Go player Lee Sedol, four matches to one, in a five-match series.
The victory of machine over man sent shockwaves around the world. For Xin Wang, then an undergraduate engineering student at Tongji University in China, it sparked an interest in combining civil engineering with the continually advancing interface between human and machine.
At the time, Go—a board game in which players attempt to enclose more territory than their opponent—was considered too complex for AI. AlphaGo’s victory signaled the beginning of a new era of AI, capable of handling more intricate challenges than earlier generations could. Wang has since followed that interest to his civil and environmental engineering graduate studies at the University of Wisconsin-Madison.
He is designing an AI-based system that may one day bring advanced gesture-based input to construction sites. Construction workers on today’s sites use hand gestures to quickly communicate with one another. Such gestures can be especially useful in loud construction environments, where noise can overpower vocal instructions. Wang’s research may allow construction workers to use hand signals to control construction equipment directly, reducing accidents caused by human error or misinterpretation and making worksites safer.
Wang works under Mortenson Assistant Professor of Civil and Environmental Engineering Zhenhua Zhu. “Professor Zhu really has an interest in the application of computer science for civil engineering and I was drawn to him because that’s also my research interest,” Wang says. “Together, we decided to apply artificial intelligence to the construction environment in order to improve it.”
To do this, Wang and Zhu first conducted a feasibility study, which has since been published in the journal Automation in Construction, to determine whether a computer could capture and decipher hand gestures used on construction sites. Through the study, which employed neural networks, Wang and Zhu found that, in a test environment, AI could interpret hand gestures in just under two-tenths of a second with 93% accuracy.
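The published study's pipeline, at its simplest, takes a numeric description of a hand pose, learns from labeled examples, and outputs a signal label. As a minimal sketch of that shape only: the study used neural networks, but the nearest-centroid classifier below (with made-up two-number feature vectors and hypothetical "stop"/"raise" labels) is a stand-in, not the authors' model.

```python
# Minimal sketch: classify a hand-gesture feature vector by nearest
# class centroid. The real study used neural networks; the feature
# vectors and gesture labels here are illustrative assumptions.
from math import dist  # Euclidean distance, Python 3.8+


class GestureClassifier:
    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def fit(self, examples):
        """examples: list of (feature_vector, label) pairs."""
        sums, counts = {}, {}
        for vec, label in examples:
            acc = sums.setdefault(label, [0.0] * len(vec))
            for i, v in enumerate(vec):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {
            label: [v / counts[label] for v in acc]
            for label, acc in sums.items()
        }

    def predict(self, vec):
        # Return the label whose centroid is closest to the input pose.
        return min(self.centroids, key=lambda lab: dist(vec, self.centroids[lab]))
```

Training on two hypothetical gestures and predicting a new pose would look like `clf.fit([([0.0, 0.0], "stop"), ([1.0, 1.0], "raise")])` followed by `clf.predict([0.1, 0.0])`.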
“We developed a weighted-based framework,” Wang says. “On construction sites, some workers won’t just stand still. They’ll be moving around the site as they make their gestures, and the detection and tracking system we developed can track the workers as they move.”
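Tracking a worker who moves while gesturing means linking each new camera detection back to the same person across frames. As a rough illustration of that idea only: the greedy overlap-based matcher below is a common baseline technique, not the weighted framework Wang describes, and the bounding-box coordinates are assumed inputs from some upstream detector.

```python
# Minimal sketch of detection-plus-tracking: each frame, match new
# worker bounding boxes (x1, y1, x2, y2) to existing tracks by
# intersection-over-union (IoU), so a moving worker keeps one ID.
# This is a generic baseline, not the paper's weighted framework.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0


class Tracker:
    """Assigns a stable ID to each detected worker across frames."""

    def __init__(self, iou_threshold=0.3):
        self.iou_threshold = iou_threshold
        self.tracks = {}  # track ID -> last known box
        self.next_id = 0

    def update(self, detections):
        assigned = {}
        unmatched = list(self.tracks.items())
        for box in detections:
            # Greedily pick the existing track that overlaps most.
            best = max(unmatched, key=lambda t: iou(t[1], box), default=None)
            if best and iou(best[1], box) >= self.iou_threshold:
                tid = best[0]
                unmatched.remove(best)
            else:
                # No sufficient overlap: treat as a newly seen worker.
                tid, self.next_id = self.next_id, self.next_id + 1
            self.tracks[tid] = box
            assigned[tid] = box
        return assigned
```

Feeding the tracker a box at (0, 0, 10, 10) in one frame and a shifted box at (2, 0, 12, 10) in the next returns the same ID both times, which is the property that lets a recognizer keep associating gestures with one moving worker.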
With the initial step completed, Wang is now testing the Tap Strap 2, a wearable sensor that can feed gesture-based input directly into a computer’s gesture recognition system. In the future, he and Zhu may work to combine the Tap Strap 2 with their weighted detection system and a camera to improve gesture reading accuracy and usability. They ultimately hope to partner with a construction company to test the system in a real-world setting.
“That is the type of work we’d need to partner with companies to test because it can’t just be accomplished in a lab,” Wang says. “Construction sites can be loud and hectic, and sometimes the misunderstood hand gestures between workers can cause accidents. Part of why we’re conducting this research is because we want to help prevent such accidents. We want to improve the construction industry by improving safety and efficiency.”
Wang also views the research as foundational. As the world moves towards automation, it’s likely that more robotic systems will make their way onto construction sites. If the research continues to advance as planned, Wang hopes that it will help position future technologies to play a role in creating safer construction sites.
“Our research is really for the future since there’s not yet a lot in the way of robotics on construction sites,” Wang says. “With the advent of more construction robots, we think our research can have a broader impact across the construction scene.”
Author: Alex Holloway