Safe AI Tool Usage with Privacy Protection in Mind
More and more teachers are putting student writing into ChatGPT for feedback or entering parent consultation content into AI for analysis. It is convenient, but one important question follows: "Where does this information go?" The Personal Data Protection Act and school information security policies apply just as much to AI tool usage. A single student name entered carelessly could lead to legal issues. This post compiles practical principles for using AI in educational settings safely and effectively.
Table of Contents
- What Counts as Personal Data in an Educational Setting
- Understanding How AI Tools Process Data
- Five Principles for Safe Use
- How to Check Privacy Settings for Each Tool
- AI Usage Guidelines for Students and Staff
What Counts as Personal Data in an Educational Setting
The Scope of Protected Personal Data
Personal data that should not be entered into AI in a school environment is broader than most people realize.
Student-related personal data:
- Name, student number, date of birth
- Family background, household composition, financial situation
- Academic grades and achievement assessment results
- Health information, disability status
- Psychological counseling content, school violence records
- Contact information, home address
Staff-related personal data:
- Name, employee number, contact information
- Personnel records, salary information
- Teacher evaluation results
Sensitive Information Requires Extra Protection
The Personal Data Protection Act requires a higher standard of protection for "sensitive information" including health status, sexual orientation, religion, and political views. A student's family circumstances or psychological counseling records may fall into this category.
Understanding How AI Tools Process Data
Where Does My Data Go?
It is important to understand how data entered into an AI tool is processed.
ChatGPT (OpenAI):
- Conversations from both free and paid users may be used to train the model by default
- Enabling the "opt out of training" setting (Settings → Data Controls) prevents conversations from being used for training
- ChatGPT Team and Enterprise plans are excluded from training data by default
NotebookLM (Google):
- Google's standard privacy policy applies
- Using a Google Workspace for Education account applies a separate educational data protection policy
- When using a personal Google account, Google's general terms of service apply
Gemini (Google):
- Conversations are saved to your Google account by default
- Turning off "Gemini Apps Activity" prevents conversations from being saved to your account
Server Location and Legal Application
Most AI tools use servers in the United States. Korea's Personal Data Protection Act applies to personal data collected and processed within Korea, but when data is transferred to overseas servers, "international transfer" provisions come into play. Entering student information into overseas AI services at the school level is a legally sensitive area requiring review.
Five Principles for Safe Use
Principle 1: De-identify Before Entering
If you must enter content containing personal data into AI, de-identify it first.
- Name → anonymous code (Student A, Student B)
- School name → delete or replace with "K High School"
- Student number and class → delete
- Specific event details → generalize as a pattern description
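The substitution steps above can be sketched in a few lines of Python. This is a minimal illustration, not a complete de-identification tool: the name list, the "Student A/B" coding scheme, and the five-digit student-number pattern are all assumptions you would adapt to your own records.

```python
import re

def deidentify(text, names):
    """Replace known names with anonymous codes (Student A, Student B, ...)
    and blank out student-number-like digit runs."""
    for i, name in enumerate(names):
        # Assumed coding scheme: Student A, Student B, ... in list order
        text = text.replace(name, f"Student {chr(ord('A') + i)}")
    # Assumed format: student numbers are five-digit runs like 20341
    text = re.sub(r"\b\d{5}\b", "[ID removed]", text)
    return text

sample = "Jiho Kim (20341) struggled with group work; Minji Lee helped."
print(deidentify(sample, ["Jiho Kim", "Minji Lee"]))
# prints: Student A ([ID removed]) struggled with group work; Student B helped.
```

Simple find-and-replace like this only catches names you already know are in the text, so a manual read-through before submitting the prompt is still essential.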
Principle 2: Minimum Information Principle
Enter only the minimum information necessary to achieve your purpose. When asking "analyze this student's adjustment difficulties," there is no need to include detailed family background information.
Principle 3: Enable the Opt-Out Training Setting
In the settings of the AI tool you use, find and activate the option to prevent conversations from being used as training data. Most major AI services have this setting.
Principle 4: Separate Work and Personal Accounts
When using AI for school-related work, use your school Google Workspace account or work Microsoft account where possible. These have stricter data protection policies than personal accounts.
Principle 5: Review Generated Output Before Distributing
Before distributing any AI-generated document, always check whether it contains personal data. AI can sometimes recombine personal data from its input and include it in its output.
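That final review can be partly automated with a simple pattern scan. As a minimal sketch, the regular expressions below are illustrative assumptions (a Korean-style mobile number, an email address, a five-digit student ID); a real checklist would use the formats your school actually handles, and the scan supplements rather than replaces a human read-through.

```python
import re

# Assumed formats; adapt these patterns to your school's actual data.
PATTERNS = {
    "phone number": re.compile(r"\b01[016789]-?\d{3,4}-?\d{4}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "student ID": re.compile(r"\b\d{5}\b"),
}

def scan_for_personal_data(text):
    """Return (label, match) pairs flagging possible personal data."""
    hits = []
    for label, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

draft = "Contact the homeroom teacher at teacher@example.com about student 20341."
for label, value in scan_for_personal_data(draft):
    print(f"Review needed: {label}: {value}")
```

An empty result does not prove the document is clean (names, for instance, match no fixed pattern), but any hit is a clear signal to stop and edit before distributing.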
How to Check Privacy Settings for Each Tool
ChatGPT Settings
- Click the account icon in the upper right of ChatGPT
- Settings → Data Controls
- Toggle off "Improve the model for everyone"
- To delete past conversations, use "Delete all chats"
Gemini Settings
- Go to the settings menu in the Gemini app
- "Gemini Apps Activity" → turn off or set an auto-delete period
- Delete already-saved conversations using "Delete activity"
NotebookLM Settings
NotebookLM is linked to your Google account.
- Google Account Settings → Data & Privacy
- Check app permissions related to NotebookLM
- Consider deleting notebooks containing sensitive content after use
AI Usage Guidelines for Students and Staff
What to Tell Students
Guide students on what to observe when using AI:
- Do not enter your own or others' names or contact information into AI
- Do not ask AI about or enter personal information about friends or family
- Do not share AI-generated content about personal data directly to social media
- AI usage records may be logged or shared with teachers in accordance with school policy
Staff Checklist
A checklist that can be shared with school staff for AI use:
- Use school email account when using AI for work purposes
- Do not enter content containing student names or ID numbers
- Do not enter sensitive information such as grades or counseling records
- Confirm that training opt-out settings are enabled
- Check AI-generated output for personal data before distributing
- Consult the data protection officer when in doubt
Policy Development Recommendations
If your school does not yet have an explicit policy on AI tool usage, recommend establishing one. The policy should include:
- A list of approved AI tools
- Prohibited personal data categories
- Procedures for violations
- Staff training schedule
AI tools are powerful instruments in educational settings, but without proper usage rules, the personal data of students and staff can be put at risk. Balancing convenience and security is a new competency teachers need in the AI era. Even if the rules seem complex, the core is simple: "Do not enter names or information that could identify a person into AI."
Does your school have an explicit policy on AI tool usage? If not, what do you think should be included? Share your thoughts in the comments.