
‘Privacy by design’: Purdue tech protects against identity leaking during AI photo editing

03.12.26 | Purdue University


WEST LAFAYETTE, Ind. — Consumers, businesses and institutions may soon have private, secure and trustworthy generative AI tools for editing and sharing profile photos, ID images and personal pictures without exposing their private identities to external platforms.

Purdue University researchers Vaneet Aggarwal, Dipesh Tamboli and Vineet Punyamoorty have developed the patent-pending system, which is applied before and after photos are uploaded to an AI editing platform.

“Results of validation testing show that we can preserve editing quality while dramatically reducing what AI models can learn about your identity,” Aggarwal said. “This is a critical step toward trustworthy generative AI.”

Their research has been published in the peer-reviewed journal IEEE Transactions on Artificial Intelligence.

Aggarwal is a University Faculty Scholar and the Reilly Professor of Industrial Engineering with courtesy appointments in the Department of Computer Science and the Elmore Family School of Electrical and Computer Engineering. Tamboli is a doctoral alumnus and Punyamoorty is a doctoral candidate in computer and electrical engineering; both worked in Aggarwal’s research group.

“Our system allows users to mask sensitive regions on their photo, like the face, from an AI editing service,” Tamboli said. “Those regions are masked locally on the user’s device using a detailed outline of the region.”

Tamboli said only the masked image is sent to the AI editing service.

“After the image is edited by AI, our system reintegrates the sensitive region back into the edited image using geometric alignment and blending,” he said.
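The mask-upload-reintegrate flow described above can be sketched in a few lines. This is a minimal toy illustration, not the researchers' implementation: the region is a simple boolean mask, the "cloud edit" is a stand-in brightness shift, and the reintegration is a direct pixel paste rather than the geometric alignment and blending the paper describes. All function names and values here are hypothetical.

```python
import numpy as np

def mask_region(image, region, fill=127):
    """Withhold a sensitive region locally before uploading to the editor."""
    masked = image.copy()
    masked[region] = fill  # neutral fill; the service never sees the real pixels
    return masked

def reintegrate(edited, original, region):
    """Paste the withheld pixels back into the edited result, locally."""
    out = edited.copy()
    out[region] = original[region]
    return out

# Toy 8x8 grayscale "photo" with a 4x4 sensitive block standing in for a face.
image = np.arange(64, dtype=np.uint8).reshape(8, 8)
region = np.zeros((8, 8), dtype=bool)
region[2:6, 2:6] = True

uploaded = mask_region(image, region)
assert uploaded[3, 3] == 127                 # sensitive pixels withheld from upload

edited = uploaded + 10                       # stand-in for the cloud-side AI edit
final = reintegrate(edited, image, region)
assert np.array_equal(final[region], image[region])  # identity restored on-device
```

A production pipeline would replace the rectangular mask with the detailed region outline Tamboli describes, and the direct paste with seam-aware blending so the boundary looks natural.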

Aggarwal said the Purdue system is the first solution of its kind.

“It’s privacy by design,” he said. “With our system, the AI platform never sees the face, but the final edited image still looks completely natural.”

The researchers disclosed the system to the Purdue Innovates Office of Technology Commercialization, which has applied for a patent to protect the intellectual property. Industry partners interested in developing or commercializing the work should contact Parag Vasekar, business development and licensing manager, at psvasekar@prf.org about track code 71122.

Addressing privacy risks from AI editing tools

Tamboli said modern generative AI tools edit photos with impressive realism but require users to upload full, unaltered images to cloud-based systems. These images contain private details, including the face and other identifying features.

“Requiring full, unaltered images creates serious privacy and security risks,” he said. “Once a photo is uploaded, users lose control over where their biometric data goes, how it is stored or how it might be misused.”

Tamboli said previous privacy approaches relied on blurring sensitive regions, locking parts of an image, using stylization filters or avoiding cloud upload entirely. Some also use traditional anonymization or differential privacy.

Aggarwal said these traditional solutions have major drawbacks.

“So these traditional methods break the editing process or fail to fully protect personal identity,” he said.

Validating and developing the Purdue privacy system

The researchers validated their system by testing how well leading AI foundation models infer biometric attributes from masked versus unmasked images.

They found the Purdue system significantly reduced the ability of AI models to detect attributes such as eye color, facial hair and age group. In some cases, attribute-classification accuracy dropped by more than 80%, demonstrating strong protection against identity leakage.
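The reported "dropped by more than 80%" figure refers to the relative reduction in attribute-classification accuracy between unmasked and masked images. The sketch below shows how such a reduction is computed; the accuracy numbers are purely illustrative placeholders, not results from the paper.

```python
def relative_drop(acc_unmasked: float, acc_masked: float) -> float:
    """Percent reduction in classifier accuracy caused by masking."""
    return 100.0 * (acc_unmasked - acc_masked) / acc_unmasked

# Illustrative numbers only: a classifier that identifies an attribute
# (e.g., eye color) 90% of the time on raw photos but 15% on masked ones.
drop = relative_drop(0.90, 0.15)
assert drop > 80.0  # a relative accuracy drop of more than 80%
```

A drop of this size means the AI model's guesses about the masked attribute approach chance level, which is what "strong protection against identity leakage" measures here.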

The research team is taking steps to bring the technology closer to real-world deployment, including expanding the system to protect additional sensitive features such as medical details, ID documents and other privacy-critical content.

About Purdue Innovates Office of Technology Commercialization

The Purdue Innovates Office of Technology Commercialization operates one of the most comprehensive technology transfer programs among leading research universities in the U.S. Services provided by this office support the economic development initiatives of Purdue University and benefit the university’s academic activities through commercializing, licensing and protecting Purdue intellectual property. In fiscal year 2025, the office reported 161 deals executed with 269 technologies licensed, 479 invention disclosures received, and 267 U.S. and international patents received. The office is managed by the Purdue Research Foundation, a private, nonprofit foundation created to advance the mission of Purdue University. Contact otcip@prf.org for more information.

About Purdue University

Purdue University is a public research university leading with excellence at scale. Ranked among the top 10 public universities in the United States, Purdue discovers, disseminates and deploys knowledge with a quality and at a scale second to none. More than 106,000 students study at Purdue across multiple campuses, locations and modalities, including more than 57,000 at our main campus in West Lafayette and Indianapolis. Committed to affordability and accessibility, Purdue’s main campus has frozen tuition 14 years in a row. See how Purdue never stops in the persistent pursuit of the next giant leap — including its integrated, comprehensive Indianapolis urban expansion; the Mitch Daniels School of Business; Purdue Computes; and the One Health initiative — at https://www.purdue.edu/president/strategic-initiatives.

Media contact: Steve Martin, sgmartin@prf.org

IEEE Transactions on Artificial Intelligence

10.1109/TAI.2026.3671211

PRIVATEEDIT: A Privacy-Preserving Pipeline for Face-Centric Generative Image Editing

6-Mar-2026

Contact Information

Steve Martin
Purdue Research Foundation
sgmartin@prf.org

How to Cite This Article

APA:
Purdue University. (2026, March 12). ‘Privacy by design’: Purdue tech protects against identity leaking during AI photo editing. Brightsurf News. https://www.brightsurf.com/news/8X5DEOP1/privacy-by-design-purdue-tech-protects-against-identity-leaking-during-ai-photo-editing.html
MLA:
"‘Privacy by design’: Purdue tech protects against identity leaking during AI photo editing." Brightsurf News, 12 Mar. 2026, https://www.brightsurf.com/news/8X5DEOP1/privacy-by-design-purdue-tech-protects-against-identity-leaking-during-ai-photo-editing.html.