
Focus on new faculty: Rebecca Willett, harnessing big data for myriad collaborations

“Hopefully, the tools we develop will lead to deeper insight into people’s responses to the media, to national and local events, and to one another,” Rebecca Willett says.

Rebecca Willett began work at UW-Madison in fall 2013, yet already names a wide array of fields in which she has found or is seeking collaborators: astronomy, social media, medical imaging, computer science, statistics, political history.

Fortunately for Willett—who recently joined the faculty as an associate professor of electrical and computer engineering and a fellow in the Wisconsin Institute for Discovery (WID) optimization group—her particular expertise in big data processes proves handy wherever people need to make sense of overwhelming amounts of information.  

As she earned her master’s and PhD in electrical and computer engineering at Rice University, Willett came to focus on point processes, in which a researcher must analyze a body of data that comprises many individual points, or “discrete events.” A discrete event can range from a Tweet to the firing of a neuron in the brain or from a photon hitting a detector in an X-ray telescope to a packet traveling through an Internet router. 
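To make the idea concrete, here is a minimal illustrative sketch (not taken from Willett's work) of the simplest such model, a homogeneous Poisson point process, in which discrete events arrive at a constant average rate; the rate and duration values are arbitrary assumptions.

    import numpy as np

    # Minimal sketch: simulate discrete events (photon arrivals, tweets, packets)
    # as a homogeneous Poisson point process with `rate` events per unit time.
    rng = np.random.default_rng(0)

    def simulate_poisson_events(rate, duration):
        """Return sorted event times on [0, duration)."""
        n_events = rng.poisson(rate * duration)               # total count is Poisson
        return np.sort(rng.uniform(0.0, duration, n_events))  # times are uniform given the count

    events = simulate_poisson_events(rate=5.0, duration=10.0)
    print(f"{events.size} events; first few times: {events[:5]}")

Real data of this kind are rarely so well behaved; the research questions arise when the rate itself varies over time, depends on past events, or must be inferred from limited, noisy observations.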

“One of the reasons I came here is the breadth of collaborative opportunities,” Willett says. The interdisciplinary adventures she’s begun since arriving in Madison include working with Computer Science Associate Professor Jerry Zhu to better understand what causes users’ interactions on Twitter. “Hopefully, the tools we develop will lead to deeper insight into people’s responses to the media, to national and local events, and to one another,” she says.

One of her over-arching goals is to develop tools for using discrete event data to draw reliable conclusions about an underlying physical phenomenon. “It’s a broad statistical problem arising in a variety of different applications, leading to significant mathematical and computational challenges.”

Typically the underlying physical system (such as a social network or a scene being imaged) is highly complex, with a large number of unknown elements to be inferred from noisy, distorted data. It’s Willett’s job to develop methods of finding structure within these systems. “Often we have access to physical models which may be accurate but are impossible to fold into practical software. We then work to develop relaxed, less stringent models which allow rapid computation without losing accuracy,” Willett says. 

“Less stringent” criteria can actually mean a more sophisticated development process, because Willett’s approach has to adapt to a variety of fields. During her eight years (first as an assistant professor, then as an associate professor of electrical and computer engineering) at Duke University, she collaborated with a North Carolina State University astronomer to explore the physical processes that led to the formation of Kepler’s Supernova. In this case, a “discrete event” was a photon hitting a satellite’s detector. “Different wavelengths of light tell you what materials are present,” Willett says. “I want to be able to say with some certainty whether or how much iron is present. How much light do I have to collect before I can answer this reliably?”
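As a rough illustration of that last question (an assumption-laden back-of-the-envelope sketch, not her analysis): if photons arrive independently, the relative uncertainty of a count of N photons is roughly 1/sqrt(N), so a target precision sets a minimum number of photons to collect.

    import math

    # Back-of-envelope sketch: under Poisson photon statistics, the relative
    # uncertainty of a count N is about 1/sqrt(N), so a desired relative
    # precision implies a minimum photon count. Values are illustrative.
    def photons_needed(relative_precision):
        return math.ceil(1.0 / relative_precision ** 2)

    print(photons_needed(0.05))  # roughly 400 photons for 5% relative precision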

Light figures heavily into much of Willett’s other research. One project focuses on improving the quality of night-vision images. “Service members identify allies using flag arm patches like this one,” she says, displaying a dark, blurry American flag. Reducing noise in such images relies on finding and exploiting structure in the underlying scene. Rather than treating the image as a mass of unrelated pixels, she identifies blocks of pixels with common features, such as blocks that capture the repeated intersection between a star and the flag’s background.

By splitting the image into repeating blocks, and understanding the structure that exists among those blocks, Willett can use a complex process of averaging to extrapolate a clearer image. The result is a de-noising process far more sophisticated than anything in consumer photo-editing software. The hard part is designing the underlying averaging process, because previously established methods break down in these settings. “There are complex and interlaced engineering-design choices to be made here,” she says.
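The flavor of this block-based averaging can be sketched in a few lines of code. The following is a deliberately simplified illustration of grouping similar patches and averaging them, not Willett’s actual algorithm; the patch size and similarity threshold are arbitrary assumptions.

    import numpy as np

    # Simplified sketch of block-based denoising: split the image into patches,
    # group patches that look alike, and replace each with the group average.
    def denoise_by_patch_averaging(img, patch=8, sim_thresh=0.05):
        h, w = img.shape
        out = np.zeros_like(img, dtype=float)
        counts = np.zeros_like(img, dtype=float)
        blocks, coords = [], []
        for i in range(0, h - patch + 1, patch):
            for j in range(0, w - patch + 1, patch):
                blocks.append(img[i:i + patch, j:j + patch].astype(float))
                coords.append((i, j))
        blocks = np.stack(blocks)
        for k, (i, j) in enumerate(coords):
            # Mean squared difference between patch k and every other patch.
            d = np.mean((blocks - blocks[k]) ** 2, axis=(1, 2))
            similar = blocks[d < sim_thresh]        # always includes patch k itself
            out[i:i + patch, j:j + patch] += similar.mean(axis=0)
            counts[i:i + patch, j:j + patch] += 1.0
        return out / np.maximum(counts, 1.0)

    # Example: a noisy, nearly constant image becomes visibly smoother.
    noisy = 0.5 + 0.1 * np.random.default_rng(1).standard_normal((64, 64))
    clean = denoise_by_patch_averaging(noisy)

Practical methods go much further, for example by weighting patches by how similar they are and by modeling how noise statistics change in low light, which is where the engineering-design choices she mentions come in.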

Some of the tools Willett has developed for imaging can also be used in politics. Applied to U.S. Senate voting records from 1795 to 2011, her methods track how senators formed shifting voting blocs and factions. Her analysis doesn’t take into account the actual content of the bills or their historical context. Yet her visualizations of the data align accurately with the historical narrative, reflecting the stark divisions of Congress during Reconstruction and after the 2010 elections, and the less pronounced divides during the civil rights era.

Willett currently supervises a master’s student in computer science and plans to bring four of her PhD students here from the Duke ECE department. At WID, she’s working on finding optimal approaches to analyzing huge data sets, particularly in the biological sciences. 

“You can’t just plug in standard code or analysis techniques,” she says. “It’s a rich field for research.”

Scott Gordon
11/1/2013