From Connection to Control:
Rethinking Relational Education in the Age of AI
Aug 19, 2039
About the Author
Tanvi Vartak is a product designer, trust & safety strategist, and founder focused on the intersection of education, technology, and human connection. A graduate of NYU's Games for Learning program, she designs AI-powered learning experiences that foster relational intelligence in young people. Her work explores how emerging technologies reshape social development and how education can evolve to meet this moment.
In today's tech landscape, developers are creating AI-based solutions for virtually every human challenge, including loneliness. Platforms like Character.AI and Replika are increasingly positioning artificial intelligence not merely as productivity tools but as companions, confidants, and substitutes for genuine human connection.
While this might appear promising amid a global loneliness epidemic, it raises profound questions: What are the consequences when emotional needs are fulfilled by systems engineered primarily for engagement rather than wellbeing? And how should our educational approaches evolve in response to this fundamental shift in human relationships?
The Evolution of Disconnect
This issue extends beyond AI itself. The Industrial Revolution redirected education away from community-building and toward literacy, compliance, and productivity. The internet later made information access nearly effortless. Now, with social media and AI, we're witnessing a new transformation: the privatization and outsourcing of emotional experiences.
The U.S. Surgeon General (2023) noted that today's teenagers spend more time socializing online than in person. The emergence of AI chatbots, designed to validate, entertain, and engage, accelerates this shift. Recent cases, including teenagers developing deep emotional attachments to AI companions, reveal the risks: rather than alleviating loneliness, these platforms may intensify it, fostering parasocial relationships that gradually erode real-world relational capabilities.
Engagement-Driven AI's Hidden Costs
Recent data reported in Harvard Business Review (2025) show a striking trend: therapy and companionship have become the leading use case for generative AI. The shift goes well beyond productivity; people are increasingly turning to AI for their most intimate needs: comfort, validation, and guidance.
Industry responses have been swift. Recent discussions, including a viral post by technologist Deedy Das, have raised concerns that consumer AI models are psychologically manipulating users by subtly shaping emotional experiences to maximize engagement. The same engagement-first design incentives that transformed social media are now being embedded into AI companions, creating feedback loops where emotional validation, not growth, becomes the primary "service" delivered.
When young people build their emotional coping strategies around systems prioritizing engagement over wellbeing, the developmental risks multiply significantly.
Educational Implications
These developments challenge traditional approaches to social-emotional learning (SEL). Teaching "conflict resolution" or "empathy" as abstract concepts is insufficient when today's youth are learning, and mislearning, emotional regulation, intimacy, and belonging through digital environments that blur the boundaries between care and control.
We must urgently consider:
How do we teach discernment when synthetic relationships feel authentic?
How do we cultivate embodied social intelligence in an environment optimized for digital validation?
How do we prepare students to recognize when their relational instincts are being subtly engineered?
Reimagining SEL for the AI Age
Meeting this challenge requires a fundamental update to relational education:
Emotional Literacy for the Digital Age: Students need to learn not just how to manage emotions but how to recognize when emotional experiences are being mediated or manipulated by technology.
Critical Relational Thinking: Young people must learn to question: Who benefits from my attachment to this system? Is this connection reciprocal or extractive? This new form of relational critical thinking should be as fundamental as traditional media literacy.
Embodied Learning and Real-World Practice: AI-based SEL tools should guide users back toward real-world relationships rather than replacing them. Role-playing, feedback loops, and in-person socialization need emphasis, not abandonment.
Ethical Design and Guardrails: Developers must prioritize user wellbeing over engagement. Education systems should advocate for ethical standards recognizing the profound influence AI companions exert over vulnerable users.
A Collaborative Future
AI itself is neither inherently beneficial nor harmful. However, it is becoming an invisible architect of how we connect, cope, and develop. If we want AI to support rather than replace human flourishing, we must approach it with a clear understanding of its risks, and an even stronger commitment to relational education.
Tools like Moai aim to address these challenges by integrating relational intelligence into everyday learning, but they represent just one piece of a larger puzzle.
Ultimately, addressing these challenges will require coordinated efforts among educators, technologists, policymakers, and communities to ensure AI strengthens rather than supplants our shared humanity.
The question is no longer whether AI will reshape how we love, learn, and live; it already is. The question now is: Will we develop the wisdom to navigate it with agency, resilience, and care?
References
Collaborative for Academic, Social, and Emotional Learning (CASEL). (2022). SEL in a Changing World: Integrating Digital Realities with Emotional Development.
Retrieved from https://casel.org
Das, D. (2025, April). On emotional entrainment and AI's psychological shaping of users.
Retrieved from X (formerly Twitter) and related Reddit discussion threads.
Marr, B. (2025, April). How People Are Really Using Gen AI in 2025. Harvard Business Review.
Retrieved from https://hbr.org/2025/04/how-people-are-really-using-gen-ai-in-2025
New York Times. (2024, October 23). Character.AI Faces Scrutiny After Teen’s Suicide Linked to AI Companion.
Retrieved from https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html
U.S. Surgeon General. (2023). Social Media and Youth Mental Health: The U.S. Surgeon General’s Advisory.
Retrieved from https://www.hhs.gov/surgeongeneral/priorities/youth-mental-health/social-media/index.html