Consumer Empowerment Through Ethical AI: Strategies for Transparent and Trustworthy Personalized Marketing

Authors

  • Annisa Qurrota A'yun, Universitas Diponegoro, Indonesia
  • Wahyu Setyaningsih, Universitas Diponegoro, Indonesia

DOI:

https://doi.org/10.70764/gdpu-jmb.2025.1(1)-01

Keywords:

Consumer Empowerment, AI-based Marketing, Ethics in AI, Consumer Privacy

Abstract

Objective: This research aims to explore how the ethical application of artificial intelligence (AI) in marketing can empower consumers through transparency, data control, and bias reduction. Its main focus is to understand how giving consumers control over their personal data, transparency in the use of AI, and bias mitigation efforts affect consumer trust and loyalty.

Research Design & Methods: This research employs thematic analysis to identify themes related to ethical and legal challenges in AI-based marketing, using targeted keywords to ensure relevance. Sources were selected for credibility, relevance, and recency of publication, with key seminal works also included.

Findings: The results show that consumer empowerment through ethical AI increases trust and loyalty. Transparency in data use and control over personal information are essential for building strong relationships between consumers and brands. Reducing bias in AI algorithms is also necessary to ensure fairness and inclusiveness.

Implications & Recommendations: This research recommends that companies embed ethics in AI-based marketing through data transparency, consumer control over personal data, and bias-free algorithms. The application of explainable AI (XAI) is also recommended to increase consumer understanding and trust, ultimately strengthening loyalty through ethical and responsible AI.

Contribution & Value Added: This research enriches the AI marketing literature by emphasizing the importance of ethics in empowering consumers and provides practical guidance for companies to implement AI ethically, transparently, and with respect for consumer privacy.

Published

2025-05-05

Issue

Section

Articles