Exposing fakes
As you scroll through your social media feeds, are you sure what you’re seeing is real?
“The human visual system is very advanced, and we can easily recognize real-world objects,” says Aparna Bharati (pictured), an assistant professor of computer science and engineering. “But online, we can very easily be fooled by high-quality fakes, and sometimes, even by the low-quality ones.”
Bharati researches computer vision, an interdisciplinary field that enables computers to understand the visual world through photographs and videos. Specifically, she's developing algorithms that detect and profile fake content in order to combat visual misinformation. As our use of social media has expanded, so have the tools and algorithms to edit, filter, and generate misleading or fabricated images.
“Those images can be used to support false narratives, or erode trust in the information ecosystem,” says Bharati. “The goal of our research is to help restore that trust so people can make more-informed decisions.”
She says the operations that users employ to create artificial content leave behind telltale signs, or "statistical fingerprints." Her team will use a range of deep-learning techniques to analyze large volumes of images, including real ones and their artificially edited variants. They'll train the model not only to distinguish the authentic from the fake, but also, for the fake images, to identify which regions within them are spurious. In other words, the algorithm will pick up and highlight all those little details we miss as we casually scroll through our feeds.
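The article doesn't describe Bharati's actual model, but the general setup it alludes to can be sketched in a few lines of PyTorch: a shared encoder learns the "statistical fingerprints" left by editing, and feeds two heads, one scoring whether the whole image is authentic, the other flagging which pixels look manipulated. Everything here (the ForgeryDetector name, layer sizes, and losses) is an illustrative assumption, not her group's published architecture.

```python
# Minimal sketch of a multi-task forgery detector: image-level classification
# plus pixel-level localization of edited regions. Illustrative only.
import torch
import torch.nn as nn

class ForgeryDetector(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared encoder: learns low-level artifacts left by editing operations.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
        )
        # Head 1: whole-image authenticity score (real vs. fake).
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1)
        )
        # Head 2: per-pixel mask highlighting likely manipulated regions.
        self.localizer = nn.Sequential(
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 1, 1),
        )

    def forward(self, x):
        features = self.encoder(x)
        return self.classifier(features), self.localizer(features)

if __name__ == "__main__":
    model = ForgeryDetector()
    images = torch.randn(4, 3, 256, 256)                    # batch of RGB images
    labels = torch.randint(0, 2, (4, 1)).float()            # 1 = manipulated, 0 = authentic
    masks = torch.randint(0, 2, (4, 1, 256, 256)).float()   # ground-truth edit masks

    logits, pred_masks = model(images)
    # Joint loss: image-level authenticity plus pixel-level localization.
    loss = (nn.functional.binary_cross_entropy_with_logits(logits, labels)
            + nn.functional.binary_cross_entropy_with_logits(pred_masks, masks))
    loss.backward()
    print(loss.item())
```

In a sketch like this, training on pairs of real images and their edited variants supplies both the image-level labels and the region masks, so the network learns to classify and to localize in a single pass.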
“Developing algorithms that can inform users about the history of the content, with respect to how it was edited or generated, can reduce misinformation for online users,” says Bharati. “Well-informed citizens are the backbone of any democratic or economic system, and in order to take actions for the betterment of ourselves and the world, we need to know what is real.”
Main image: Cundra/iStock