As AI leadership changes, how can educators help students navigate the evolving landscape of ChatGPT?
Be sure to check out the suggested Huddle question at the bottom of this article to discuss this important topic with your students in class, if you feel it is appropriate.
In a whirlwind of events, Sam Altman, co-founder of OpenAI, found himself ousted and then swiftly reinstated as CEO of the company behind the ever-evolving and popular AI chatbot, ChatGPT. This unexpected turn of events sent shockwaves through the AI community, prompting discussions about who should be trusted to govern and regulate artificial intelligence: tech companies or the government?
Are tech companies prioritizing student safety and well-being while managing the ethical dilemmas that AI raises, such as bias and influence over our behavior? If they’re not, how involved should the government be in protecting students from AI-generated misinformation? While this is still up for debate, educators must stay updated on AI regulations and features so they know their part in empowering students to use AI for good. It’s just as important as teaching students to write or helping them learn the basics of a new language.
Let’s dive in.
Unraveling AI Leadership Turmoil: A period of disruption and questions
On November 17th, 2023, the tech industry witnessed the dramatic removal of Sam Altman as CEO of OpenAI, the driving force behind the widely used ChatGPT chatbot. Altman was reinstated five days later, with Microsoft, a significant investor in OpenAI, playing a role in facilitating his return. While the back-and-forth shocked tech users, we still don’t know precisely why he was fired or what led to his return. The episode suggests instability within OpenAI’s leadership. With the widespread use of the platform among students, is it time for the government to step in and provide stronger oversight of the company?
While we don’t know exactly why Altman was fired or rehired, we do know that before he was let go, some board members expressed concern that Altman’s focus on ChatGPT’s expansion might overshadow the board’s desire to balance growth with AI safety. Upon Altman’s reinstatement, several of the directors who had initially fired him left the board, and two new members joined OpenAI’s board.
The instability of AI leadership and the fragility of OpenAI’s governance structure raise concerns in the tech community about whether OpenAI’s leaders are fit to run a business that significantly impacts user safety and behavior. Half of K-12 students currently use ChatGPT, and the decisions of OpenAI’s upper management will ultimately affect them. And if safeguarding AI users from biased and false information isn’t a priority for those leaders, how can educators empower students to navigate misinformation responsibly?
ChatGPT’s impact on schools
Teachers are increasingly incorporating ChatGPT into the classroom. In fact, 60% of teachers reported using ChatGPT as part of their job. Since many educators and students use AI as part of their work, staying informed about OpenAI’s leadership situation is important for two reasons. First, it shows how AI technology is changing and becoming safer for students. As new leaders come in and the government passes more AI regulations, the models behind these tools should become more reliable and provide students with more helpful information.
Second, it highlights how crucial a role educators play in encouraging students to think critically about the information and content they receive from AI, since the platform isn’t free from bias or misinformation. Whether or not regulations and improvements are made to the platform, students must stay informed on the safest and most reliable ways to navigate ChatGPT so they are not misled by false or biased information.
Empowering students to think critically and spot misinformation in AI
While OpenAI undergoes these changes, we as educators are reminded of the importance of empowering students to navigate AI responsibly by thinking critically about the information they share and receive. When students fact-check the information they receive from ChatGPT and evaluate it for bias, they become better critical thinkers about content they see on both AI and social media. Empowering students to analyze and reflect on the information they get from AI helps them avoid misinformation and take advantage of the valuable information ChatGPT can provide. Whether it’s a subject they’re learning in school or a topic they want to know more about, ChatGPT can be a powerful study tool when used positively.
Here are some essential skills you can help your students build when navigating AI:
- Critical thinking skills: Teach students not to take the information they receive from ChatGPT at face value. While AI technology is smart, it’s not always correct. Encourage students to double-check what ChatGPT tells them against reliable sources to make sure the information is factual.
- Responsible decision-making: AI tools like ChatGPT have a variety of beneficial uses for students, such as helping them study or coming up with fun activities to do with friends, but they can also be used in harmful ways. Remind students to treat others how they want to be treated and not to use AI in a way that could hurt others or damage their reputations.
- Self-awareness: Inform students that ChatGPT can reflect biases, such as racial or political biases, that stem from human prejudice. Encourage students to word their questions carefully and avoid opinion-loaded prompts that can draw out those biases.
- Using technology for good: Remind students of all of the positive ways they can use ChatGPT to support their academics, such as using the chatbot as a study partner, using it to help improve grammar skills, or even doing vocab lessons with AI. There are so many creative and positive ways that students can use AI to enhance their learning.
As technology continues to evolve, reminding students to use tech for good helps them maintain their well-being. For more strategies and skills to help students use AI responsibly, check out our ChatGPT playbook and learn how you can use it to empower students to use technology for good.
#WinAtSocial Huddle Question
Huddle with your students
ChatGPT can assist students with studying in numerous ways, such as helping them review certain subjects, quizzing them in math, and so much more. But ChatGPT can also be used in class to help run lessons: it can generate problem-solving scenarios and test students’ knowledge of a certain topic. As a student, would you want your teacher to use ChatGPT in their lessons? Why?
The Social Institute (TSI) is the leader in empowering students by understanding students. Through #WinAtSocial, our gamified, peer-to-peer learning platform, we equip students, educators, and families to navigate their social world – in the classroom and beyond, online and offline – in healthy, high-character ways. Our unique, student-respected approach empowers and equips, rather than scares and restricts. We incorporate timely topics about social media, tech use, and current events that are impacting student well-being and learning. #WinAtSocial Lessons teach life skills for the modern day, capture student voice, and provide school leaders with actionable insights. Through these insights, students play an essential role in school efforts to support their own health, happiness, and future success as we enable high-impact teaching, meaningful family conversations, and a healthy school culture.