October 3, 2025

Balancing A.I. fluency, privacy, and safety in school communities

    Key points summarized

  • A.I. fluency is becoming a graduation requirement at Ohio State University, positioning it as a future-ready skill.
  • Challenges with Gaggle are an important reminder about balancing A.I. monitoring tools with fostering honest communication and well-being.
  • ChatGPT offers teen-specific safety features, but guidance and modern life skills education remain essential for student accountability.

A.I. Fluency Becomes a Graduation Requirement

The Gist: Ohio State University has launched a bold new initiative requiring all students to graduate fluent in both their major and artificial intelligence. Freshmen are now required to take a generative A.I. course and participate in multiple workshops designed to connect classroom learning with real-world applications. Leaders at Ohio State say the goal is to ensure that every student can use A.I. effectively while maintaining strong critical thinking skills.

What to Know: The program, called A.I. Fluency, reflects how relevant educators believe A.I. will be in the job market. A recent analysis found a 600% increase in job postings asking for A.I. skills over the last decade, with a 103% jump in the past year alone. Companies like Duolingo report that A.I. is accelerating their work dramatically, allowing them to create nearly 150 new language courses in just one year, compared to 100 over the previous decade. Some remain skeptical about A.I.'s effect on creativity; in a recent survey, more than half of adults said they worry it will make people less creative. Still, as Ohio State weaves A.I. into the curriculum, could this give students a competitive edge in the workforce?

TSI’s Take: Ohio State’s program highlights how A.I. literacy has become an important skill as students start exploring career paths. K-12 schools don’t need to wait to begin building this skill; a tiered, developmentally appropriate approach to A.I. fluency is most effective. With A.I. Literacy lessons from The Social Institute, educators can:

  • Introduce A.I. early: Integrate lessons on the ethics of A.I. into existing subjects so students can begin to understand its implications.
  • Model critical thinking: Remind students that A.I. is a thought partner, not a substitute for their own reasoning or creativity.
  • Connect to real-world skills: Highlight how A.I. is reshaping industries from healthcare to education, giving students tangible reasons to learn.

By starting the conversation earlier, schools can give students a head start, preparing them not only for college initiatives like Ohio State’s but also for a workforce where A.I. is already shaping opportunity. Start the conversation with students with the #WinAtSocial Lesson, Exploring career paths and the impact of A.I. on jobs, to encourage them to think critically about A.I.’s role in their lives today.

Monitoring or Mentoring? Protecting Student Privacy in the Age of A.I.

The Gist: Over 1,500 school districts across the U.S. currently use an A.I.-powered tool called “Gaggle” to monitor student activity on school-issued devices and flag potentially unsafe behavior. While this platform was intended as a tool for keeping students safe online, it’s now sparking concerns nationwide about student privacy and security. In light of the controversy around the platform, schools are reminded of the importance of mentoring over monitoring when supporting student well-being, and of empowering students to protect their privacy.

What to Know: Districts from Kansas to Washington state have seen both benefits and challenges with Gaggle. Some districts reported that the A.I. tool led to timely interventions, while others reported instances where the tool flagged and deleted non-harmful content, such as one art student’s portfolio. One district even faced a lawsuit from current and former students alleging that the monitoring overstepped privacy boundaries and discouraged honest conversations about health and well-being. While Gaggle has kept students safe in certain instances, these incidents reveal that, like other A.I. tools, it is not completely accurate. For educators, this is a reminder to empower students to protect their privacy and to create a safe learning environment where students feel comfortable speaking up about well-being challenges.

TSI’s Take: Protecting student privacy is not just about compliance or avoiding risks; it’s about building trust. While Gaggle may be a helpful resource for school communities, it’s important that schools protect student privacy, avoid overstepping boundaries, and never lose the human touch when supporting students. When students know their personal information, online activity, and communications are respected, they are more likely to feel safe, valued, and willing to share concerns with trusted adults. Here’s how you can empower students to protect their privacy while building a strong relationship with them:

  • Model responsible privacy practices: Showing students how to manage their own information not only helps protect them from potential harm, it also helps them develop essential modern life skills.
  • Encourage critical thinking, especially around A.I.: Remind students to pause before they search online or prompt an A.I. tool and ask: is this data they want to share?
  • Emphasize ownership: Remind students that their data, photos, and words are part of their digital footprint. Frame privacy as protecting something that belongs to them.

With A.I. tools becoming part of students’ daily lives, it’s more important than ever to empower students to think critically about what they share online and with artificial intelligence. Start the conversation with students with the #WinAtSocial Lesson, Protecting our personal information from A.I., to help them consider how they can take control of their personal information. 

ChatGPT Rolls Out Teen-Specific Features Aimed at Safety, But Is It Enough?

The Gist: OpenAI has introduced a new version of ChatGPT designed specifically for teens under 18. This edition includes built-in parental controls, blackout hours, and stricter content filters. The move reflects a growing effort to make A.I. tools both safe and supportive for younger users while addressing concerns about how teens interact with chatbots in their daily lives.

What to Know: The new safeguards are meant to protect teens while still encouraging them to explore A.I. in positive ways. Features like chat history settings and blackout hours show how companies are responding to calls for healthier tech habits and age-appropriate boundaries. At the same time, these updates are a reminder of the limitations of in-app features and that savvy students can get around them if they want. By framing these new features as guides and pairing them with modern life skills education that teaches students to navigate tech like A.I. in positive ways, educators can support students in building accountability as they grow.

TSI’s Take: ChatGPT’s teen safeguards highlight how A.I. can evolve to promote balance, fairness, and responsibility. Schools can use this moment to guide students in thoughtful ways:

  • Promote balance: Model how features like blackout hours can encourage healthier habits and focus.
  • Teach originality: Emphasize that chatbots are thought partners, not replacements for authentic voices or lived experiences.
  • Discuss fairness and integrity: Lead conversations about when using a chatbot is helpful and when it crosses a line in academic or personal growth.

Updates like this are a step in the right direction, but they don’t replace teaching students resilience and responsibility. Schools can inspire students to shape a future where A.I. is navigated safely, fairly, and with authenticity at the center. Start the conversation with the #WinAtSocial Lesson, Breaking down ChatGPT and the role of artificial intelligence in our lives, to help them reflect on responsibility and balance in their own digital choices.

A.I. is here to stay, and it’s shaping the way students learn, create, and connect. By starting conversations early, modeling critical thinking, and teaching responsible choices, schools can equip students with the skills they need to thrive in a world where A.I. is part of everyday life. Request a demo of our A.I. Literacy lessons to empower students to think critically, protect their privacy, and navigate technology with confidence.


The Social Institute (TSI) is the leader in equipping students, families, and educators with modern life skills to impact learning, well-being, and students’ futures. Through #WinAtSocial, our interactive, peer-to-peer learning platform, we integrate teacher PD, family resources, student voice insights, and more to empower entire school communities to make positive choices online and offline. #WinAtSocial Lessons teach essential skills while capturing student voice and actionable insights for educators. These insights help educators maintain a healthy school culture, foster high-impact teaching, and build meaningful relationships with families. Our unique, student-respected approach empowers and equips students authentically, enabling our solution to increase classroom participation and improve student-teacher relationships. Through our one-of-a-kind lesson development process, we create lessons for a variety of core and elective classes, incorporating timely topics such as social media, A.I., screen time, misinformation, and current events to help schools stay proactive in supporting student health, happiness, and academic success.