Take Advantage of the Calm Before the School Year Ends: Preventing Confusion Before Final Projects Are Due

DISCOVERING AI: Igniting Human Potential
By Amy D. Love, Founder of DISCOVERING AI and of the Global FAMILY AI GAME PLAN initiative
We are entering the final third of the school year.
Elementary students are finishing research projects. Middle schoolers are drafting longer essays. High school students are preparing capstones and final exams. College students are working on major papers that impact transcripts, scholarships, and academic standing.
There is still time.
Time for schools and families to clarify expectations around AI use before final projects are submitted. Time to prevent confusion before it escalates into conflict.
And the signals are already here.
When AI Tools and AI Checkers Collide
On February 6, 2026, a judge ruled in favor of Orion Newby in his lawsuit against Adelphi University.
His paper had been flagged as 100% AI-generated by Turnitin. He received a zero, was required to attend a plagiarism workshop, and faced the possibility of suspension for repeat violations. He insisted he wrote the paper himself.
The court found the accusation “completely false” and “without valid basis.” The university was ordered to reverse penalties and expunge his record. His family reportedly spent more than $100,000 defending him.
One lawsuit. One shared challenge.
Unclear expectations in a rapidly evolving AI environment.
This is not limited to higher education. AI use is happening in elementary classrooms, middle school homework, and high school college essays. The tools are moving faster than most policies.
Why This Matters Before Final Projects Are Due
When expectations are unclear, several things happen quickly:
Students guess what is acceptable.
Teachers interpret standards differently.
Parents feel surprised when consequences appear.
AI detection tools become lightning rods rather than safeguards.
Some higher education institutions, including the University of California, Berkeley, and Vanderbilt University, have disabled certain AI detection features due to reliability concerns.
The issue is not whether AI exists. The issue is whether students clearly understand how they are expected to use it.
The old homework question was simple: Did you get it done?
The new homework question is more complex: How did you get it done?
That shift requires clarity at every grade level.
Why Blanket Policies Are Not Enough
Many districts are working hard to create comprehensive AI policies. The challenge is speed. AI tools evolve monthly, and classroom applications shift rapidly. What feels appropriate this semester may require refinement next semester.
Schools do not simply need rigid, static policies. They need living guidance.
Guidance that:
Can be updated as technology evolves
Aligns with curriculum standards
Reflects department expectations
Provides teachers with practical, classroom-level clarity
English has different considerations than Algebra. Brainstorming differs from drafting. Guided learning inside a math platform is not the same as generating a full research paper.
Clarity must happen class by class.
When teachers communicate:
What AI use looks like in this course
Where AI is encouraged
Where it requires guidance
Where it is paused
Students understand the guardrails. Parents can reinforce expectations at home. Administrators reduce institutional risk.
With clarity, schools lead.
Why Parents Must Be Part of the Conversation
This is not solely a school issue. Homework is often completed at home. Devices are personal. AI tools are accessible at any hour.
One parent shared their perspective in a video conversation on our site.
What you hear is not panic. It is a desire for alignment.
Parents want to support learning. Teachers want to protect integrity. Students want to succeed without fear of accidental missteps.
When parents and schools share language and expectations, trust increases. When they do not, misunderstandings multiply.
From AI Detection to AI Direction
AI checkers are statistical probability tools. They are not optimized for truth. They can misidentify human writing as AI-generated.
Overreliance creates stress. Clear expectations reduce it.
This is why schools across the country are implementing the FAMILY AI GAME PLAN™ and STUDENT AI GAME PLAN™ from DISCOVERING AI. Not as a compliance form, but as a shared framework that aligns parents, teachers, and students before major assignments are due.
It clarifies what is allowed, what requires guidance, and what is paused. It builds Clarity, Confidence, and Connection.
That is not simply prevention. It is leadership in the Age of AI.
As You Prepare for the Final Stretch
If you teach, lead, or support a school, consider:
Do students clearly understand what AI use looks like in each class?
Have parents been equipped with language to reinforce expectations at home?
Is there a fair and transparent review process if a detector flags a paper?
There is still time to act before final projects are submitted.
DISCOVERING AI provides free tools schools can implement immediately to create clarity and alignment. If you would like to explore how your school can implement the AI Game Plan framework, schedule a free 15–30 minute conversation.
Because the issue is not AI.
It is whether parents and schools choose to lead together with intention.
Join the conversation

In this week’s Facebook Live session of Parenting in the Age of AI, we’re expanding the conversation into real school experiences.
Come curious. Leave steadier.
Discovering AI: Igniting Human Potential
Follow me, 💠 Amy Love 💠, on LinkedIn, and subscribe to the DISCOVERING AI newsletter.

