New York, New York, United States
Categories: Information technology, Marketing strategy, Market research, Media
Recent projects
Policy Strategy for Combating AI-Assisted Coercive Control
The Feminine Intelligence Agency (FIA) is a research and public policy organization dedicated to advancing Social Discernment: the ability to detect manipulation early, respond effectively, and protect personal and collective agency in complex social environments. FIA is developing a comprehensive curriculum, scalable tools, and legal frameworks to address coercive control, emotional exploitation, and tech-enabled manipulation in relationships and digital life. Our work focuses on:

- Social Discernment Curriculum: a new field of emotional intelligence that helps people recognize coercion, power games, and deceptive language before harm occurs.
- Player Identifier Chatbot: a predictive tool that analyzes early relationship patterns to help users recognize emotionally dangerous individuals.
- AI Language Training App (ChatBoy): an app that teaches users to spot manipulative language across different contexts (dating, work, friendship, and family).
- Peer-to-Peer Wellness Platform: a non-therapeutic, group-based support system designed for women navigating emotional burnout and frustration in high-conflict or confusing relationships.
- Cybersecurity for Women: a decentralized, peer-led training program that focuses on emotional safety, social engineering, and digital resilience, especially for those targeted by psychological abuse or coercion online.

FIA's work bridges public policy, digital safety, and emotional intelligence to help individuals and institutions detect invisible threats and respond with clarity, ethics, and courage.

This project explores how the United States might adopt policies to criminalize coercive control in intimate and institutional relationships, following the lead of countries like the UK, Scotland, and Ireland.
Specifically, it will examine how emerging technologies, particularly generative AI, are amplifying patterns of manipulation, isolation, and psychological domination, a phenomenon we refer to as AI-Assisted Psychological Exploitation (AIPEx). The student will assess:

- How Britain successfully passed coercive control laws (strategy, framing, resistance)
- Why similar efforts have struggled in the U.S. (cultural, legal, and political factors)
- How new threats, especially tech-enabled coercion, may change the narrative or legislative appetite

The outcome will be a policy roadmap for how coercive control could be reframed, regulated, or outlawed in the U.S., including legal frameworks, political feasibility, and public engagement strategies.

Policy Problem: The U.S. lacks legal frameworks to address non-physical forms of abuse like gaslighting, digital surveillance, emotional manipulation, and isolation tactics, despite their well-documented psychological and economic impacts. The problem is compounded by AI tools that are being misused to amplify these tactics at scale (e.g., deepfakes, real-time surveillance, chatbot mirroring, voice cloning). This project addresses both a market failure (unregulated tools enabling harm) and a government failure (lack of legal protections and prevention strategies for psychological abuse).

Key Questions to Explore:

- What legislative, advocacy, and cultural strategies made coercive control laws pass in the UK and Ireland?
- What would a U.S.-specific policy roadmap need to look like, legally, politically, and culturally?
- Could technology-driven manipulation (AIPEx) be a wedge issue for reform or regulation?
- What risks or resistance would this proposal face from political, tech, or legal sectors?
Legal Pre-Exit Toolkit Development for Women in Coercive Control Situations
The Feminine Intelligence Agency (FIA) is a research and innovation lab focused on uncovering, naming, and dismantling hidden forms of social coercion. Our work spans psychology, AI, public health, and human rights, but at the center of it all is one urgent truth: coercive control is still legal in most of the world, and even in jurisdictions where laws exist, it remains nearly impossible to prosecute.

Coercive control is a form of invisible violence: the manipulation, isolation, gaslighting, financial entrapment, and slow erosion of autonomy that occurs without physical bruises. Because it operates through covert aggression, it leaves women without the kind of hard evidence courts typically demand. Most survivors are told there's "nothing they can do," or worse, that it's their fault. Even in the best-case scenario, these women are expected to gather and present their own evidence, interpret complex laws, and advocate for themselves, all while under psychological siege and often without money, legal support, or safety.

That's where this project comes in. We're inviting your team to design a step-by-step legal preparation toolkit for women experiencing coercive control who may not be able to leave immediately, but who want to act intelligently and strategically now. The goal is to equip them with a concrete, discreet path forward: what to document, how to protect themselves legally and digitally, and how to begin building a trauma-informed case that courts might one day take seriously.

Your research and insight, especially if it includes interviews with real attorneys and advocates, could help shape a resource that doesn't exist anywhere else. With your help, we can begin translating constitutional protections and human rights principles into something women can actually use.

This is not just a research exercise. It is an opportunity to close a devastating legal blind spot and make a real difference in the lives of women navigating invisible forms of abuse with no roadmap, no allies, and no precedent.
Cybersecurity Fair Pilot Readiness Assessment
The goal of this project is to help the Feminine Intelligence Agency (FIA) prepare its innovative Cybersecurity Fair platform for public launch. The Fair is an interactive, peer-learning environment where students teach each other practical cybersecurity micro-skills: a scalable, low-cost model to raise digital awareness and resilience across campuses. Student teams will focus on testing, improvement, and pilot design to ensure the platform is secure, effective, and engaging. Through hands-on evaluation and structured reporting, the team will connect cybersecurity principles with real-world educational deployment.

Key Objectives:

- Conduct a cybersecurity and usability audit of the Cybersecurity Fair prototype, identifying vulnerabilities, privacy risks, and accessibility issues.
- Design and document a pilot launch plan, including test procedures, feedback collection, and key performance indicators (KPIs).
- Evaluate learning effectiveness: does the content help users understand emerging threats like AI-assisted psychological exploitation (AIPEx) and social engineering?
- Deliver an implementation readiness report summarizing system health, educational quality, and risk mitigation priorities.
- Develop a professional presentation package suitable for executive review and potential university partners.

By the end of the project, students will produce a set of practical, adoption-ready recommendations that strengthen FIA's ability to deliver cybersecurity literacy to thousands of college students, especially women, in a fast-changing digital threat landscape.
Global Student Program Scalability Model
The Feminine Intelligence Agency (FIA) is a global research and innovation lab dedicated to advancing social discernment, digital safety, and emotional intelligence through AI-driven tools and peer-to-peer learning systems. FIA collaborates with universities, researchers, and students worldwide to tackle complex social and ethical challenges at the intersection of technology, psychology, and social systems.

FIA seeks to enhance its global research and learning ecosystem by developing a scalable student infrastructure model. The current framework brings together students from diverse disciplines, including data science, psychology, design, and technology, to co-develop tools that address pressing social issues such as AI ethics, manipulation detection, and digital safety. This project invites MBA students to analyze the existing FIA student infrastructure and propose a scalable global model that ensures sustainability, operational efficiency, and cross-institutional collaboration. The final deliverable will serve as a blueprint for future FIA academic partnerships and global student programs.

Objectives:

- Assess the current FIA student collaboration model, including onboarding, task management, and deliverable tracking.
- Identify bottlenecks in communication, project continuity, and knowledge transfer between cohorts and partner institutions.
- Design a replicable, modular framework that enables seamless onboarding, collaboration, and research production across global teams.
- Recommend digital infrastructure and governance mechanisms (e.g., Notion, Slack, or LMS systems) to support scalability and institutional memory.
- Deliver a final proposal that outlines strategic, operational, and implementation plans for a sustainable FIA global education pipeline.