AI Strategy Research Packet Β· v1.0 Β· April 2026

Strategic Landscape

What the data actually says about AI in K–12 education

Calvary Preparatory Academy | AI in Education Research Packet


This is the longer version of the situation. It synthesizes findings from six concurrent research streams (April 2026) covering: AI tutoring & personalized learning, AI literacy curriculum, Christian school AI integration, teacher AI tools, academic integrity, and the future workforce. Citations are inline. Full reports are in 06_research_appendix.md.


1. AI tutoring works β€” but only when designed correctly

The single most important finding across the entire research base is the sharp empirical distinction between scaffolded AI (Socratic, guided, pedagogically structured) and answer-engine AI (general-purpose ChatGPT-style access). They produce opposite outcomes.

When AI tutoring helps:

  • Google’s LearnLM trial (UK secondary schools). AI + human supervision achieved 66% success on novel problem transfer vs. 61% for human tutors alone. Tutors approved 76% of LearnLM AI responses with minimal edits. (The 74 Million)
  • Nature Scientific Reports RCT (n=316). AI tutor users achieved double the learning gains (median post-score 4.5 vs. 3.5) in less time (49 vs. 60 minutes) compared to in-class active learning. (Nature)
  • CMU year-long study (350+ seventh graders). Human tutoring layered on AI tutoring produced an additional 0.36 grade-level gain vs. AI-only deployment. Effects scale with usage intensity. (CMU Heinz)
  • GarzΓ³n et al. systematic review (155 studies). AI-assisted learning yields ~+65.8% performance gains on average, strongest in STEM and hybrid human-AI models. (IES REL Central)

When AI tutoring harms:

  • Wharton/PNAS study (Bastani et al.). High school math students given open ChatGPT access scored 17% worse on independent exams than students who used no technology β€” despite showing +45% gains during AI-assisted practice. Students believed AI had helped them. (X/@asanwal β€” 4,059 likes)
  • MIT/Oxford/CMU multi-institution study. AI assistance boosts short-term performance but reduces students’ ability to work through problems independently and lowers persistence. (X/@rohanpaul_ai)
  • Education Week analysis of 310 AI-generated lesson plans. AI-generated content systematically favors lower-order thinking (recall, memorization) over higher-order skills (analysis, synthesis, creation). (EdWeek)

The implication

Generic AI access in a school context is not neutral. It is actively counterproductive for the cognitive and metacognitive outcomes Christian schools care most about β€” perseverance, struggle, the slow formation of judgment. Calvary Prep should not adopt “AI access” as a policy. It should adopt specific, scaffolded AI tools with explicit pedagogical design, human teacher supervision, and intentional use boundaries.

The strongest synthesis of the evidence is from Stanford’s SCALE Initiative (scale.stanford.edu) and the Brookings 2026 review (brookings.edu). Both should be read in full by whoever leads Calvary Prep’s AI working group.


2. The AI literacy gap is wider than the headlines

Contradictory data points define this space:

  • EdWeek (March 2026): 80% of high schools claim to teach AI basics or responsible use.
  • Junior Achievement (August 2025, n=1,008 teens): 64% of U.S. teens (13–17) report receiving no AI education at school.
  • Integrative review (Gu, 2025): Only 5.4% of students receive any formal AI literacy training.

Translation: most “AI literacy” in U.S. schools is a one-period mention, a teacher’s verbal warning about ChatGPT, or a hastily added paragraph in a syllabus. Almost no schools have a structured, grade-banded curriculum.

What employers say they need

  • 76% of global business leaders see AI literacy as essential.
  • 66% are unwilling to hire candidates lacking AI skills. (EdWeek citing Microsoft)

The four (or five) dimensions

The OECD/European Commission framework β€” currently the most rigorous international reference β€” defines AI literacy across four domains: understand, use, evaluate, create, plus integrated ethics. (OECD/EC framework) ETS published a parallel grade-9–12 progression. (ETS)

Calvary Prep’s distinctive contribution should be a fifth dimension: steward. What does it mean to use AI as a person made in the image of God, in love of neighbor, with wisdom and restraint? This is the dimension secular frameworks cannot articulate. It is also the one parents in your audience most need to see addressed.

Counter-narrative worth taking seriously

Nat Purser’s widely-shared post β€” “AI literacy for students is often a sham” (X) β€” argues that most current AI literacy programs are superficial and rapidly obsolete. This critique is empirically supported by the 5.4% formal-training figure. It is a warning to schools planning AI literacy work, not a reason to avoid it: design for depth, not optics.


3. Teacher AI adoption is a structural shift, not a fad

The Gallup/Walton data (gallup.com) is unusually granular:

  • Teacher AI usage: 60% of teachers used AI in 2024–25
  • Time saved: weekly users save ~6 hrs/week (~6 weeks per school year)
  • Top use cases: lesson prep (37%), worksheets (33%), assessments (25%), grading (16%)

80% of AI-using teachers report improved work quality. The time isn’t being banked β€” it’s being reinvested in individualized student feedback, the part of the job teachers most want to do.

The honest finding from Stanford SCALE

When Stanford analyzed actual teacher AI prompt logs (rather than asking what teachers say they use AI for), more than 50% of prompts were “doing” tasks β€” generating lesson plans, quizzes, feedback drafts. The most-used tool was an Essay Grading Assistant. (Stanford SCALE)

This matters because vendor marketing portrays AI as a “thinking partner” for teachers. The actual usage data shows teachers are using it as a content generator β€” which is fine, but creates a quality-control responsibility that has to be designed into the workflow.

Public trust is dropping while teacher use is rising

This is the most strategically important divergence in the data. Public support for AI in lesson planning dropped from 62% to 49% in one year, and 70% of parents oppose AI tools accessing student data. (PDK/EdWeek)

For Calvary Prep, this means an AI rollout that doesn’t include transparent parent-facing communication will create trust damage faster than it creates instructional gains. The schools that win this transition will be the ones that show their work.


4. Academic integrity: detection is dead, redesign is the path

This is the area where the data has shifted most sharply in the last six months.

The collapse of detection

  • University of Arizona disabled its AI detection systems entirely. Cornell and Pitt have followed. (KOLD News; Detection Drama)
  • 61% false positive rate for ESL students in current AI detection tools. (Detection Drama, April 2026)
  • The Washington Post and Inside Higher Ed simultaneously published opinion pieces on April 13, 2026, calling detection-first approaches harmful to honest students. The expert consensus has tipped.

The scale of the problem

  • Turnitin (Feb 2026): 15% of submitted essays are now more than 80% AI-generated, up from 3% in 2023: a fivefold increase in 27 months.
  • HEPI (UK, March 2026): 95% of undergraduates use AI; 94% use it for assessed work; 12% submit AI text directly.
  • RAND (March 2026): U.S. student AI use for homework rose from 48% (May 2025) to 62% (Dec 2025). 67% of students themselves say AI harms their critical thinking.

Faculty alarm

  • AAC&U: 95% of college faculty fear student overreliance on AI; 78% report increased cheating; 74% believe AI is degrading the value of degrees.
  • College Board: 92% of faculty concerned about AI-facilitated plagiarism.

What’s working

The expert consensus is converging on assessment redesign rather than detection: oral examinations, in-class writing, portfolio documentation, process-based grading (drafts, screen recordings), and transparent AI-use policies in which students declare and explain their AI use rather than hide it.

For Calvary Prep, this is a significant opportunity. Because it is an online school, every assessment is already mediated digitally. You can run oral defenses over video, build process portfolios in your LMS, and require structured reflections on AI use more easily than a traditional school can. The same infrastructure that made you “behind” a brick-and-mortar school in 2009 now makes you “ahead” in 2026.


5. The workforce is repricing around AI fluency at extraordinary speed

This is the framing every parent in your audience cares about most, even if they don’t say it. The expectation behind their tuition is that Calvary Prep graduates can compete for college admissions and entry-level jobs against peers from the strongest secular high schools. The bar for that competition has moved.

Demand-side data (2026)

  • NACE (April 2026): Demand for AI skills in entry-level postings nearly tripled in 6 months. Now >33% of postings require AI proficiency. 60% of employers assign AI-specific projects to interns; 28% explicitly seek “AI-proficient early talent.” (naceweb.org)
  • Handshake Class of 2026: 70% of hiring managers consider generative AI skills critical for entry-level hires. AI-mention job postings up 5x since 2023.
  • PwC Global AI Jobs Barometer: AI skills carry a 56% wage premium vs. comparable non-AI roles.
  • IMF SDN/2026/001: 1-in-10 advanced-economy job postings now demand skills that did not exist in standard taxonomies five years ago.

Supply-side data

  • Stanford Digital Economy Lab (“Canaries in the Coal Mine”): Employment for 22–25 year-olds in AI-exposed occupations dropped 13% since 2022. Workers aged 35+ in the same fields saw employment increase 6–9%. (digitaleconomy.stanford.edu)
  • WEF (March 2026): U.S. entry-level jobs down 35% over 18 months.
  • The Guardian (April 2026): 42.5% underemployment among recent college graduates β€” the highest rate since 2020.

What employers actually want

Across multiple practitioner posts and institutional research, a consistent technical cluster recurs:

  • RAG (Retrieval-Augmented Generation) β€” building AI systems that ground answers in specific document corpora
  • AI agent frameworks β€” orchestrating multi-step AI workflows
  • LLM inference and deployment β€” getting models into production
  • Vector databases and embeddings β€” semantic search infrastructure
  • Prompt engineering and evaluation β€” designing and measuring AI interactions
  • AI-assisted product building β€” actually shipping things using AI
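To make the first items in that cluster concrete, here is a deliberately minimal sketch of the retrieval step in RAG. It is a toy illustration only: a bag-of-words vector and cosine similarity stand in for a learned embedding model and a vector database, and the corpus, passages, and function names are invented for this example. A student project at this level of ambition is exactly the kind of portfolio artifact employers describe.

```python
# Toy sketch of the retrieval step in RAG (Retrieval-Augmented Generation).
# Production systems use learned embeddings and a vector database; here a
# bag-of-words vector and cosine similarity stand in for both, just to make
# the idea of "grounding answers in a document corpus" concrete.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Bag-of-words stand-in for an embedding: token -> count."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Return the k corpus passages most similar to the query."""
    qv = vectorize(query)
    ranked = sorted(corpus, key=lambda doc: cosine(qv, vectorize(doc)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Augment the user's question with retrieved context before it goes to an LLM."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical school-handbook corpus, invented for this sketch.
corpus = [
    "Chapel attendance is required every Wednesday morning.",
    "The science fair is held each spring in the gymnasium.",
    "Seniors must complete a 20-hour service project to graduate.",
]
print(build_prompt("What do seniors need to graduate?", corpus))
```

The point of the sketch is the architecture, not the math: the model never answers from its own memory alone; it answers from passages retrieved out of a corpus the builder controls. Swapping the toy vectorizer for a real embedding model and the list for a vector database is the step that turns this into the “RAG” employers name in postings.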

Nvidia’s Jensen Huang and Andrew Ng have both said they would hire an AI-fluent new graduate over a 10-year veteran who lacks these skills.

The implication for a Christian high school

A 2026 Calvary Prep senior who graduates with demonstrable, portfolio-evidenced experience using AI to build something real β€” even something small, even something for a Bible class or a service project β€” enters a fundamentally different college admissions and job market than a senior who has only “used ChatGPT” the way 84% of their peers have. The differentiator is not exposure. It is capability and evidence.

This becomes a graduation-portfolio opportunity for Calvary Prep that few schools have considered.


6. The Christian-school-specific landscape

This is the most under-covered space in the broader AI-and-education conversation, and it is where Calvary Prep has the largest potential asymmetric advantage.

Empirical baseline: ACSI / Cardus survey (2024)

The only large-scale survey of actual Christian educator AI adoption:

  • ~1/3 of Christian educators are already using AI in teaching.
  • 1 in 5 lack confidence in doing so ethically.
  • Adoption is outpacing theological preparation. (cardus.ca)

The institutional scaffolding is being built right now

  • W.I.S.E. Framework (IACE, February 2026): Worship, Image, Sent, Equipped β€” the most comprehensive biblically-rooted rubric yet published for evaluating AI tools in Christian schools. Addresses idolatry risk, Imago Dei, mission, and vocational formation. (iace.education)
  • CESA’s “The Question of AI and the Christian School” (cesaschools.org) provides creation/dominion theological framing for board-level conversations.
  • The Gospel Coalition’s “Don’t Hand Education Over to AI” (thegospelcoalition.org) is the clearest articulation of the discipleship-formation objection. It functions as the strongest “loyal opposition” voice in the discourse.
  • Baylor Symposium on Faith and Culture 2026 included dedicated sessions on AI reimagining K–12 discipleship. Elite Christian academic institutions are now treating this as a priority research area.

Christian theology is shaping AI development at the industry level

This is a development almost no Christian school leadership team is yet aware of:

  • Anthropic (the maker of Claude) consulted approximately 15 Christian leaders on Claude’s moral and spiritual development, including the question of whether AI could be considered a “child of God.” (X β€” 1,000+ likes)
  • Tim Hwang’s Institute for Christian Machine Intelligence (ICMI) is producing peer-reviewed work arguing that Christian theological concepts (corrigibility, humility, Imago Dei) offer concrete technical contributions to AI safety and alignment. The “GospelVec” project derives activation steering vectors from the four Gospels. (X/@timhwang)
  • Research suggests 89.5% of explicitly religious content in major LLM training datasets is Christian β€” meaning Claude, ChatGPT, and Gemini are, as a matter of statistical fact, more familiar with biblical theology than any other religious tradition.

The serious objection

The strongest theological pushback comes from a pneumatological argument: that genuine Christian study, prayer, and Scripture-engagement involve the work of the Holy Spirit through human struggle, and AI shortcuts that work in spiritually significant ways. (X/@RunyaMahtan; X/@thetangles)

This objection deserves engagement, not dismissal β€” see 05_risks_and_objections.md for a serious treatment.


7. Synthesis: what’s actually happening

Pull all of the above together and a coherent picture emerges:

  1. AI in education has crossed an adoption threshold. Whether schools choose AI or not, students are using it. The “should we” question is closed.
  2. The technology works when designed correctly and harms when it isn’t. This is a pedagogical design problem, not an AI problem.
  3. Detection-first integrity strategies are collapsing. Assessment redesign is the path forward and online schools are structurally well-positioned for it.
  4. The workforce is repricing around AI fluency faster than schools can respond. The 2026 graduate without AI capability is in a fundamentally worse position than they would have been four years ago.
  5. Christian schools have a genuine theological frame to bring β€” one being actively requested by major AI labs and being articulated by Christian institutions in real time. This is rare cultural leverage.
  6. The window for early-mover advantage among online Christian high schools is open and not infinite. Most schools will follow rather than lead. Few will publish a serious AI strategy in 2026. The ones that do will define the category.

Calvary Prep has a structural advantage that most schools do not have. The question for the board is whether to deploy it.


“And do not be conformed to this world, but be transformed by the renewing of your mind, that you may prove what is that good and acceptable and perfect will of God.” β€” Romans 12:2 (NKJV)

The renewing of the mind is exactly the work AI cannot do for a student. That is the part Calvary Prep keeps as its center. AI is the scaffolding around it, used to free teachers and students for the formation work itself.