Raginsky receives CAREER Award to apply information theory to machine learning problems
By Elise King, Coordinated Science Lab
February 22, 2013
- ECE Assistant Professor Maxim Raginsky has received a CAREER Award from the National Science Foundation to develop an information-theoretic approach to machine learning problems that involve multiple resource-constrained learning agents in a large network.
- His goal is to make sure that the network resources are allocated in a smart way, and each user receives only the data they need without significant waste of bandwidth or power.
- The NSF CAREER award supports a five-year project and is given to junior faculty members.
ECE Assistant Professor Maxim Raginsky, a researcher in the Coordinated Science Lab, has received a CAREER Award from the National Science Foundation to develop an information-theoretic approach to machine learning problems that involve multiple resource-constrained learning agents in a large network.
Traditionally, communications researchers have focused on delivering information reliably from point A to point B, such as a Netflix user who wants movie content to stream smoothly to their screen. However, today's technology users want information not only delivered, but also analyzed and interpreted. Traditional compression algorithms exploit redundancy in the signal, but are oblivious to the goals of the many different types of users that may rely on the network.
“The overall design objective is to make sure that the network resources are allocated in a smart way, and each user receives only the data they need without significant waste of bandwidth or power,” said Raginsky.
Raginsky uses ecological monitoring as an example. If someone is tracking a rare bird species in a specific habitat and wants to record how many of these birds fly in and out of the area, it would be a waste of resources to continuously stream video if what the person really wants is just the arrivals and departures of the birds. A big part of the problem is learning to detect events of interest and to reliably communicate only the data describing these events.
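The idea in this example can be sketched in code. The snippet below is a minimal illustration (not from the article or from Raginsky's project): instead of streaming every raw frame, a sensor runs a simple change detector on per-frame bird counts and transmits only compact event records for arrivals and departures. The function name and data format are hypothetical.

```python
def event_reports(counts):
    """Given a stream of per-frame bird counts, emit only the
    arrival/departure events instead of the full raw stream."""
    events = []
    prev = 0
    for t, count in enumerate(counts):
        if count > prev:
            events.append((t, "arrival", count - prev))
        elif count < prev:
            events.append((t, "departure", prev - count))
        prev = count
    return events

# A 10-frame stream compresses to just three event records:
stream = [0, 0, 1, 1, 1, 3, 3, 3, 2, 2]
print(event_reports(stream))
# -> [(2, 'arrival', 1), (5, 'arrival', 2), (8, 'departure', 1)]
```

Here ten frames of raw data reduce to three short messages, which is the kind of bandwidth saving the project aims for when only the events, not the full signal, are relevant to the user.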
“So I want to make sure that only the relevant information gets to those who need it, despite the fact that everyone is using the same network and the kinds of information that are relevant to one user are different than the kinds of information that are relevant to somebody else,” Raginsky said.
These problems are messy and complex, and there is no hope of building an accurate model for every kind of data transmitted and received over networks, given the growing size and complexity of both the networks and the data, Raginsky said. Machine learning offers a variety of tools for extracting predictively relevant information from observations, but most machine learning research to date has not addressed the network setting and the resource constraints it imposes.
This project will systematically explore what is and is not possible in large networks with multiple learning agents, identifying how bandwidth limitations, losses, delays, and the lack of central coordination affect the performance of statistical learning algorithms, and thereby guiding the design of efficient and robust coding and decoding schemes.
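One well-known way bandwidth limits interact with learning, offered here purely as an illustration and not as a description of Raginsky's proposed methods, is gradient quantization: each agent sends only the sign of its local gradient (one bit per round) instead of a full-precision value. The sketch below is hypothetical and self-contained; note that the 1-bit messages change what is learned, driving the shared parameter toward the sample median rather than the mean.

```python
def sign(x):
    """Return -1, 0, or +1 -- the one-'bit' message each agent sends."""
    return (x > 0) - (x < 0)

def distributed_fit(data, steps=200, lr=0.05):
    """Fit a shared scalar w when each agent i holds one sample x_i and
    may transmit only sign(w - x_i), the sign of its local gradient,
    per round. The server averages the bits and takes a small step."""
    w = 0.0
    for _ in range(steps):
        bits = [sign(w - x) for x in data]
        w -= lr * sum(bits) / len(bits)
    return w

# With three agents holding 1.0, 2.0, and 4.0, the 1-bit scheme
# settles near the sample median (2.0), not the mean (~2.33).
print(distributed_fit([1.0, 2.0, 4.0]))
```

The point of the sketch is the trade-off the project studies: severe quantization saves bandwidth but measurably changes the behavior of the learning algorithm.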
The CAREER Award is given specifically to “junior faculty members who exemplify the role of teacher-scholars through outstanding research, excellent education, and the integration of education and research within the context of the mission of their organizations,” according to NSF’s website.
Raginsky said that because these awards are for five-year projects, the proposals take a lot of time and effort.
“You propose to research something you’re really passionate about, and presumably you want to work on this topic even if it did not get funded,” he said. “So, when I heard about my proposal being recommended for funding, of course it was a relief. I will have a good time working on this problem.”
Editor's note: media inquiries should be directed to Brad Petersen, Director of Communications, at email@example.com or (217) 244-6376.