
Inclusive Reads & Conversations with UWGB Libraries

Find information about this month's speaker and reading.

This Month's Speaker

Gaurav Bansal, Ph.D.


Dr. Gaurav Bansal is the Frederick E. Baer Professor in Business and Professor of MIS/Statistics at the Austin E. Cofrin School of Business at UW-Green Bay. He served as the Founding Chair and Academic Director of the Master of Science in Data Science program at UW-Green Bay. His research examines how our cognitive limitations and various biases affect our decision-making and online behavior.

This Month's Topic

In our discussion and readings for this month, we will be exploring "the biased world we live in, made of biased data, powered by biased data, for generating more biased data" and its relationship to diversity, equity, and inclusion.

From Gaurav:

"Today’s information-driven society runs on algorithms that are built from data and that spin out data for other algorithms to use. These algorithms increasingly drive every aspect of our lives: our news feeds, the people we meet, the friends we keep, the employees we hire, the food we eat, the emotions we feel, the media we watch, the products we buy, and whom we vote for. The list grows every day. But little do we realize that these algorithms make certain assumptions that are not true for all populations under all circumstances. The algorithm development process, and the algorithms themselves, suffer from several limitations that hinder our efforts to create an inclusive society. We as a society need to be aware of these limitations and develop mechanisms to raise them in appropriate forums, so that action can be taken to modify and improve the algorithms and make our world more inclusive and welcoming for all.

A few things to note. First, these algorithms are built on historical data. They are remarkably “smart” at catching the deep biases inherent in that data and “baking” them into their decision-making. For instance, because leadership positions have historically gone mostly to men, a hiring algorithm trained on that record swiftly learns to reject female applicants from the recruitment pool.

Second, these algorithms rely on proxy variables. An AI-based app that allocates health care by assuming prior medical history signals the need for care will mistakenly treat poor people as healthy, because most poor people have no medical history. It fails to recognize that they lack a medical history not because they are healthy, but because they lack health insurance. Since poverty is highly correlated with race, this oversight in the algorithm quickly creates a disadvantaged world for one particular race.

Third, these algorithms create a personalized world. Social media algorithms know what we like to watch and read, so they present only one facet of the world. Little do “they” realize that, by making a unique and personalized world for each of us, they are creating deep factions in society. And as they increase exclusivity, they, beyond doubt, lower inclusivity."

Access the Readings

Directions for Setting Up NYT Account

Recommended Further Reading from Gaurav