Networks such as social media platforms, road systems and even our genes hold vast amounts of data hidden in plain sight. However, the techniques scientists devise to learn more about the nonlinear relationships within these structures often result in unintentional discrimination against historically disadvantaged groups. These biased results are what electrical engineering and computer science professor Sucheta Soundarajan strives to prevent by bringing fairness to network algorithms.
Soundarajan received a National Science Foundation (NSF) CAREER Award for her research on network analysis algorithms. The CAREER Award is a one-time grant intended to support the professional development of an individual researcher. In addition to funding Soundarajan’s research, it will support a number of non-research service projects.
“Every time I get a grant, I feel good because it’s validation from the scientific community as a whole,” says Soundarajan. “This one above all, because it is linked to me as an individual and not just to a project. I feel validated as a scientist. It means a lot.”
While the award is an individual achievement, it supports research that can benefit communities around the world. More and more information is being extracted through network analysis, and scientists are discovering that even though algorithms do not have access to protected attributes such as age, disability, gender identity, religion and national origin, they still end up discriminating against these groups.
“What we are seeing is that people from these minority and disadvantaged groups are unfairly discriminated against at a higher rate,” says Soundarajan. “We want to create algorithms that automatically find people at the center of a network, but do so in a fair way.”
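To give a sense of what “finding people at the center of a network” can mean, here is a minimal sketch of degree centrality, one of the simplest centrality measures: each person is scored by the fraction of other people they are directly connected to. The toy graph and names are purely illustrative assumptions, not data or methods from Soundarajan’s research.

```python
# Toy friendship network (hypothetical names, for illustration only).
edges = [("ana", "bo"), ("ana", "cy"), ("ana", "dee"), ("bo", "cy")]

# Build an adjacency map: who is directly connected to whom.
neighbors = {}
for u, v in edges:
    neighbors.setdefault(u, set()).add(v)
    neighbors.setdefault(v, set()).add(u)

n = len(neighbors)  # 4 people in the toy network

# Degree centrality: fraction of the other n-1 people each node touches.
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in neighbors.items()}
# "ana" is connected to all 3 others (score 1.0); "dee" to only 1 of 3.
```

Fairness questions arise when scores like these systematically rank members of some demographic groups lower, even though the algorithm never sees group membership directly.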
Soundarajan says criminal sentencing and lending are two areas where algorithms are used to make crucial decisions and where scientists have detected potentially unlawful discrimination. Another example of a fairness issue is the way we connect with each other on social platforms: friendship recommendation algorithms can exacerbate people’s tendency to seek out those who are like them.
“Taken to the extreme, if people follow these recommendations, they end up in silos where they only connect to people who are like them, and that’s how you end up with echo chambers,” explains Soundarajan.
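The silo effect can be sketched with the classic friend-of-friend recommendation rule: suggest the stranger with whom you share the most mutual friends. In a network that already leans homophilous, that rule tends to recommend yet another in-group member. The graph, groups and `recommend` function below are a hypothetical illustration, not Soundarajan’s algorithm.

```python
from collections import defaultdict

# Toy graph with two homophilous groups (illustrative only): nodes 1-4
# are in group "a", nodes 5-7 in group "b", with one bridge edge.
edges = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4),   # group "a"
         (5, 6), (5, 7), (6, 7),                   # group "b"
         (4, 5)]                                   # lone bridge
group = {1: "a", 2: "a", 3: "a", 4: "a", 5: "b", 6: "b", 7: "b"}

neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

def recommend(node):
    """Friend-of-friend rule: suggest the non-neighbor sharing the
    most common neighbors with `node`."""
    scores = {other: len(neighbors[node] & neighbors[other])
              for other in neighbors
              if other != node and other not in neighbors[node]}
    return max(scores, key=scores.get)
```

For node 1, the rule recommends node 4 (two mutual friends) over anyone in group “b” (zero mutual friends), so following such recommendations keeps new ties inside the existing group.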
Beyond her research, Soundarajan will have the opportunity to hire a graduate student to help develop ethics-based modules that can become part of computer science courses, in the hope that this will help students develop ethically oriented thinking.
“We’re going to design these labs where we’ll give the students a set of data and they’ll apply algorithms to it, and then they’ll look at the results and have to ask themselves if those results are right,” says Soundarajan.
Soundarajan will also look into developing continuing education for lawyers. She hopes to create courses that explain how algorithms can cause discrimination problems.
Devoting her time and talent to something meaningful for society is important to Soundarajan. She credits the lifelong support she received as a factor in choosing her area of research, and acknowledges that the help she received from members of her department contributed to her latest achievement.
“So much has been invested in me as a scientist that I feel I have a moral obligation to do something that benefits everyone,” says Soundarajan. “I’ve been very lucky to be around people who really want to see me succeed, and that’s also true at Syracuse University. People gave me their time, spending hours reading the proposal that won me this award, and it means a lot to me.”