The Path to Legitimacy: Jasmin Chavez Cruz on Policy, Partnerships & Building Trustworthy AI
At The Graylight Lab, we’re not just interested in where the future is headed—we’re asking who’s shaping it, who’s being left behind, and how we can build across those gaps with care. Few people understand the weight of that responsibility better than Jasmin Chavez Cruz, a coalition-builder and policy strategist who’s worked at the highest levels of government to uplift underrepresented communities and reimagine what education, advocacy, and equity can look like in practice.
In a moment when AI is pushing institutions to rethink everything from access to accountability, Jasmin’s cross-sector lens reminds us that technology isn’t the whole story—people are. In this conversation, we explore what it means to raise an ethically grounded generation, design systems that earn trust, and lead with the kind of vision that honors both history and possibility.
Q: You’ve led coalitions across government, education, and community sectors. In this next chapter of AI integration, who’s missing from the table—and how do we invite them meaningfully, not just symbolically?
A: In this next chapter of AI integration, the voices most missing from the table belong to the very communities that will be most impacted by these technologies: low-income families, non-English speakers, people of color, workers in industries vulnerable to automation, and young people navigating a rapidly changing future. Too often, these communities are discussed as beneficiaries or case studies but rarely as co-creators or decision-makers.
Meaningful inclusion begins with reimagining who we consider experts. That means investing in trusted community leaders, educators, and youth to build their capacity to engage with AI policy and innovation. It is not just about inviting them for a listening session but also about ensuring they are shaping the agenda from the beginning. It also means forging partnerships between technologists and people-centered sectors, such as public education, labor, and immigrant advocacy, where equity and access are at the core.
We invite people in a meaningful way by shifting power. That includes resourcing their participation, creating feedback loops that lead to real accountability, and translating complex systems into accessible and actionable knowledge. When we do that, we are not just building coalitions. We are building public trust, and that is essential for a future where AI serves everyone.
Q: We often hear about the need to “prepare students for the future of work.” But what about preparing systems to be more worthy of the students we’re raising? What would a values-aligned education system look like in your eyes?
A: We often discuss preparing students for the future of work, but the real challenge is preparing our systems to be worthy of the brilliance, resilience, and potential that our students already possess. A values-aligned education system would start by honoring the whole child, not just as a future worker but as a current contributor, leader, and change agent.
A values-aligned system centers on equity, dignity, and opportunity. It treats multilingualism and cultural identity as strengths. It ensures that every student, regardless of zip code or immigration status, has access to a rigorous and relevant education that connects learning to community, civic engagement, and personal purpose. It invests in the adults who shape those experiences, including teachers, counselors, and families, by building trust, providing resources, and offering meaningful support.
This kind of system is not built solely through mandates. It must be co-created with students and communities, rooted in lived experience, and designed to uplift every learner, not just to succeed in the system but to help transform it.
Q: You’ve been at the helm of major stakeholder engagement efforts. What’s the biggest misconception tech leaders have about how public policy and community advocacy actually function?
A: One of the biggest misconceptions tech leaders have is that public policy and community advocacy operate on the same timelines and logic as product development. In tech, speed and iteration are key. In public policy and advocacy, trust, relationship-building, and process are everything. It is not just about finding the most efficient solution. It is about building consensus, navigating competing priorities, and ensuring the people most affected have a seat at the table.
Another common misconception is assuming that policy is something you influence only when you need something. In reality, effective engagement means showing up consistently long before there is a crisis or an ask. It means understanding history, power dynamics, and the fact that, for many communities, skepticism toward new technologies is not resistance but lived experience.
When tech leaders recognize that policy and advocacy are not obstacles but essential pathways to legitimacy and long-term impact, that is when real, values-aligned partnerships can take shape.
Q: In conversations around upskilling and AI readiness, how do we make sure we’re not just talking to underrepresented communities, but building with them—and being led by them?
A: To truly build with underrepresented communities, we must start by shifting our mindset from one of outreach to one of shared ownership. That means recognizing that these communities are not just recipients of training or policy; they are also active participants in shaping them. They are experts in their own right, with lived experience that should guide how we design, implement, and evaluate upskilling and AI readiness efforts.
Being led by these communities requires us to create spaces where their voices are not only heard but where their insights shape decision-making from the beginning. It means investing in local leaders, compensating their time, and building long-term partnerships rooted in trust. It also means being transparent about risks, trade-offs, and power—especially in spaces where emerging technologies may widen existing gaps if we are not intentional.
We make progress when we center dignity, agency, and accountability. Building with communities means inviting them to co-create the future and recognizing that their leadership is essential to getting it right.
Q: As someone who’s championed both education and technology policy, where do you see the greatest opportunity for synergy—and where do you see risk if we don’t move with care?
A: The most significant opportunity for synergy between education and technology lies in expanding educational access. Technology can help bridge learning gaps by connecting students to resources, mentors, and experiences that might otherwise be out of reach. It can personalize instruction, support multilingual learners, and give educators new tools to engage students in meaningful and relevant ways. At its best, technology can help level the playing field.
But there is a real risk if we move without care. If we do not address existing inequities in broadband access, device availability, or digital literacy, we risk deepening the very divides we hope to close. There is also a danger in letting tech drive decisions without community input or without considering the social and emotional needs of students.
The path forward requires collaboration between educators, technologists, families, and students. We need policies that are rooted in equity, informed by classroom realities, and developed in partnership with the people most impacted. That is how we create systems that serve all learners, not just those already positioned to succeed.
Q: How do we expand the definition of “public servant” in an AI age—especially when so many change-makers are now operating outside of traditional government roles?
A: In an AI age, expanding the definition of a public servant means recognizing that meaningful service to the public does not begin or end with a government title. Today, some of the most impactful change-makers are building in community organizations, advocacy groups, startups, and research institutions. They are designing ethical technologies, organizing for digital equity, and pushing for accountability and transparency from the outside in.
To expand this definition, we need to value public impact as much as public position. That means celebrating those who are working to close opportunity gaps, protect rights, and ensure that AI serves the public good, regardless of their title. It also means creating new pathways for collaboration between traditional institutions and these emerging leaders and removing the barriers that prevent talent from moving between sectors.
A public servant in this era is anyone who is committed to advancing equity, justice, and access in the systems we all rely on. The more we uplift and support those working toward that mission, the stronger our collective future will be.
Q: You’ve built trust across agencies, nonprofits, and grassroots groups. What does sustainable trust look like when applied to tech and data systems?
A: Sustainable trust in tech and data systems puts transparency, accountability, and community voice at its core. It is not just about protecting privacy or securing data. It is about ensuring that communities understand how their information is being used, who benefits, and how decisions are made.
In my experience building trust across government, nonprofit organizations, and grassroots groups, I have learned that trust is earned through consistency, active listening, and follow-through. The same applies to tech. Communities need to see that their concerns are not only heard but acted on. That means involving them early in the design of systems, co-creating solutions, and making space for feedback that shapes outcomes.
Sustainable trust also requires long-term investment. It cannot be built through one-time engagement or surface-level consultation. It must be grounded in relationships, shared values, and a commitment to do no harm. When people feel respected, informed, and empowered, they are more likely to engage with systems and contribute to building tools that work for everyone.
Q: If you could embed one civic value into every AI system being piloted today, what would it be—and why?
A: If I could embed one civic value into every AI system being piloted today, it would be accountability. Without it, even the most innovative tools risk causing harm, deepening inequities, or operating without regard for the people they affect most.
Accountability means that systems are transparent in how they are built, who they serve, and what impact they have. It means giving communities the ability to question, challenge, and influence decisions that shape their lives. It also requires developers, institutions, and decision-makers to take responsibility for outcomes—not just celebrate progress but confront unintended consequences and make changes when needed.
Embedding accountability ensures that AI is not just powerful but principled. It is how we build public trust, protect rights, and ensure that technology advances the common good, not just private interests.
Q: What gives you hope when it comes to building equitable education pipelines in a time when both democracy and data are under pressure?
A: What gives me hope is the rising generation of leaders who are not just preparing for the future; they are actively shaping it. As an Advisory Board member for ZETA, I have the privilege of supporting a community of Gen Z innovators, changemakers, and leaders who are connecting directly with policymakers and industry to push for a future that reflects their values.
ZETA is building equitable education-to-leadership pipelines by ensuring young people have access to the tools, networks, and knowledge they need to lead in civic and digital spaces. Whether it is harnessing the power of AI for good, combating misinformation, or expanding access to opportunities in the digital economy, these young leaders are not waiting for permission. They are leading the way.
Even as democracy and data face real pressure, I am hopeful because I see what is possible when we center equity, invest in potential, and create space for youth-led solutions. ZETA reminds me every day that the future is already here, and it is bold, brilliant, and ready to build.
Q: In one word, how would you describe the kind of leadership this moment requires—and what do you think it demands from all of us, regardless of sector?
A: Courage. This moment demands that we lead with truth, take risks for what is right, and stay grounded in the communities we serve.
Q: Tell people how to follow you/where to find you!
A: