12/03/2025

By Dr. Gene Kerns, Chief Academic Officer

It’s notable when a company is so pioneering or dominant within a market that the name of the company or its product becomes synonymous with an entire category.

For some people, all photocopiers are “Xerox machines.” Similarly, if we’re asked to pick up some Reynolds Wrap at the store, we’re probably OK as long as we come home with some brand of aluminum foil. And when I feel a sneeze coming on, I immediately reach for a Kleenex, even though there’s no guarantee it was produced by the Kimberly-Clark corporation. A Puffs tissue (manufactured by Procter & Gamble) will do just as well.

There’s even a name for this phenomenon within the field of branding: genericization. It’s formally defined as “the process in which a trademark or proprietary name becomes widely perceived as a common noun or verb describing the type of product or service.”

Such is quickly becoming the case with generative AI. OpenAI’s ChatGPT entered the market so early and has become so dominant that, when some folks say, “I used ChatGPT to. . .,” there’s no guarantee that they actually have an account with OpenAI. They may just as well have used Google’s Gemini, Microsoft’s Copilot, Anthropic’s Claude, or any other among a host of generative AI tools.

This brings us to a thought-provoking question that has important implications for the use of generative AI in K–12 education. In a world now awash with access to generative AI tools, what differentiates one tool from another? Does it really make a difference which tool you (and, ultimately, your students) use? Or are generative AI tools all basically the same?

In reflecting on this question, I think there are two parts to the answer, one that’s easy to grasp, and another that we should reflect upon more deeply in deciding which generative AI tools to use in schools and classrooms.

Generative AI in education: Following instructional design best practices

Let’s journey to the answer by tracing the development of the new AI Create Lesson Generator in our Nearpod platform. This tool allows educators to prompt Nearpod to create a quality lesson on nearly any topic.

For one of our internal experts with a deep background in instructional design, the AI Create Lesson Generator became a passion project. The result of her collaboration with other colleagues is a tool that stands out in the sea of far more generic K–12 lesson creation tools. 

The first phases of development of this new tool primarily employed “off the shelf” generative AI applications without extensive modification. The lessons that were created by this early prototype covered the basics, but they lacked some essential elements and nuances—details an instructional designer would be far more sensitive to.

This is not uncommon. As one recent analysis found, the quality of lessons and other instructional materials created by generative AI tools varies widely. Another group of researchers pointed out that these tools may “embed outdated educational approaches that limit student agency and classroom dialogue,” and they call for “further refinement of these tools to better serve educational needs.” 

With these details in mind, our instructional design expert stepped in to collaborate with our engineers. They made it a priority to direct the refinement of the prompt engineering so that when a teacher tells Nearpod, “I want a lesson appropriate for high school students on wood shop safety” or “I want a lesson appropriate for fifth grade students on the early Swedish settlers of Delaware,” the generative AI tool we have built receives far more direction about what it needs to create.

What do I mean by this?

Through careful crafting, the AI Create Lesson Generator “hears” the following when prompted for the wood shop safety lesson:

  • Create a lesson appropriate for high school students on wood shop safety
  • Identify key lesson objectives at the outset
  • Apply “backwards design” to ensure the final lesson addresses all major objectives
  • Consider the age of the students and break or “chunk” the content into small pieces of an appropriate length that are interspersed with polls, activities, and discussion opportunities that reinforce the content
  • Identify key vocabulary words
  • Apply the principles of Universal Design for Learning (UDL) in building the lesson
  • Ensure the end product is formatted for easy delivery in Nearpod
  • . . . and more
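The expansion step described in the bullets above can be pictured in a few lines of code. The sketch below is illustrative only, not Nearpod’s actual implementation; the function name and instruction list are assumptions that simply mirror the bullets in this post.

```python
# Illustrative sketch of prompt augmentation: a teacher's one-line request
# is expanded into a detailed instruction set before it reaches the
# generative model. (Not Nearpod's actual code; the instructions below
# mirror the bullets in this post.)

DESIGN_INSTRUCTIONS = [
    "Identify key lesson objectives at the outset.",
    "Apply backwards design so the final lesson addresses all major objectives.",
    "Chunk content into age-appropriate pieces, interspersed with polls, "
    "activities, and discussion opportunities that reinforce the content.",
    "Identify key vocabulary words.",
    "Apply the principles of Universal Design for Learning (UDL).",
    "Format the end product for easy delivery in Nearpod.",
]

def build_lesson_prompt(teacher_request: str) -> str:
    """Wrap a teacher's short request with instructional-design guidance."""
    numbered = "\n".join(
        f"{i}. {step}" for i, step in enumerate(DESIGN_INSTRUCTIONS, 1)
    )
    return (
        f"Create a lesson: {teacher_request}\n"
        f"Follow these instructional-design requirements:\n{numbered}"
    )

prompt = build_lesson_prompt(
    "appropriate for high school students on wood shop safety"
)
print(prompt)
```

The point of the sketch is that the model never sees the bare request alone; every request arrives wrapped in the same expert guidance.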

The end result is a tool that—in the words of one of our instructional design experts—“thinks like your most experienced teaching colleague and embodies the expertise of our curriculum designers, incorporating their deep pedagogical knowledge and instructional wisdom into every lesson it creates.”


AI-created lessons, text complexity, and K–12 student needs

Even at this point, however, we had not yet created something that truly differentiates our AI Create Lesson Generator from other available generative AI tools. In other words, what we’d done up to that point could have been largely replicated by any other technology company willing to hire experienced instructional designers, pair them with skilled engineers, and commit them to the task of refining the prompting of a generative AI tool.

What happened next, however, changed everything.

Our instructional designer identified a weakness in some of the AI-generated lessons, and she reached out to colleagues for input. She had concerns about the level of language used in some of the lessons. Sometimes it seemed either too advanced or too easy for the intended grade level. 

That’s when an expert linguist on our staff suggested that the tool be directed to use Renaissance’s ATOS readability formula. This free resource, which has an externally documented ability to reliably gauge text complexity, would provide the AI Create Lesson Generator with additional guidance around language.

Within hours of this suggestion, the revised AI prompting code, now told to review all language through ATOS ratings, was producing lessons of a higher quality.
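The review step can be pictured as a simple generate-and-gate loop. The sketch below is hypothetical: ATOS is Renaissance’s own formula and is not reproduced here, so a crude stand-in score (average words per sentence) plays its role, and the grade bands are invented purely for illustration.

```python
# Hypothetical sketch of gating generated lesson text on readability.
# ATOS is Renaissance's formula and is not reproduced here; a crude
# stand-in (average words per sentence) substitutes for it, and the
# grade bands are invented for illustration only.
import re

def stand_in_readability(text: str) -> float:
    """Very rough proxy for text complexity: average words per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return sum(len(s.split()) for s in sentences) / len(sentences)

def within_grade_band(text: str, grade: int) -> bool:
    """Check the score against an (invented) acceptable band per grade."""
    low, high = 3 + grade, 8 + 2 * grade  # invented band, not ATOS
    return low <= stand_in_readability(text) <= high

draft = "Wear safety goggles. Keep the work area clean. Never leave a running saw."
if not within_grade_band(draft, grade=9):
    print("Flag for revision: language level is off for the grade.")
else:
    print("Language level looks appropriate.")
```

Here the short, simple sentences score well below the (invented) high school band, so the draft is flagged for regeneration, which is the same “too easy for the intended grade level” concern described above.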

High quality input for high quality AI-generated lessons

This brings us back to our original question: What’s the real difference between one AI-based tool and another?

Perhaps surprisingly, it’s often not the “AI” itself. No one has the corner on generative AI, as the widespread use of ChatGPT, Gemini, Copilot, Claude, and other tools clearly demonstrates. Instead, the difference lies in the information each company feeds into and uses to train its generative AI tools—and the additional resources, such as the ATOS readability formula, that each company can connect to and make part of the generative process.

This is where Renaissance has a significant advantage. We have four decades of experience in K–12 education and classroom technology, and we bring unique resources to our generative AI tools:

  • Text complexity information from ATOS
  • Billions of data points about how students learn from our Star and FastBridge interim assessments, along with our DnA formative assessment tool
  • Knowledge of every single word that appears in every book—now more than 220,000—with an Accelerated Reader quiz
  • Data on skills practice in both ELA and math from millions of students
  • Knowledge of the most essential skills for progress at each grade level, customized to the learning standards of each US state
  • Insights about which of these essential skills are the most challenging for students to learn as they progress in reading and math
  • . . . and more

With this key point in mind, let’s go back to the earlier point about the term “ChatGPT” becoming genericized within the field of AI. How many people know what the name ChatGPT actually stands for?

The “Chat” part is easy. This is the interface, designed to have a conversation with us. But what about the “GPT”? There’s a lot of insight within this acronym:

  • The “G” stands for “generative,” given that the tool generates things for us. 
  • The “P” stands for “pre-trained,” referring to the information the tool was trained on and draws from in creating and refining its output. 
  • The “T” stands for “transformer,” referring to the neural network architecture that allows the tool to do what it does.

From this perspective, generative AI tools tend to be quite similar in terms of both the “G” and the “T” components. It’s the “P” component—the essential pre-training, or where the various tools pull their information from—that creates the true difference:

  • Pull broadly from all of the internet, and you’ll get all of the internet—the good, the bad, and the ugly.
  • In contrast, pull from carefully vetted sources, refine your results with the right tools, and then—critically—inform the process with as much data as possible, and you’ll get results of a much higher quality.

In terms of the elements Renaissance taps to inform and enhance our generative AI features, we’ve already done some amazing things for teachers and students, but we’re also just getting started. We’re currently working on new ways of bringing together the best of generative AI, along with 40 years of expertise in personalized teaching and learning, to further support the work that educators do in the classroom each day.

I can confidently say that, in this moment of rapid technological change, effective instructional practices continue to matter greatly, and that the role of generative AI is not to replace teachers but rather to help them do what they do best. For this reason, I’m reminded of a statement that Renaissance made years ago to describe assessment-driven reading and math practice: “Better data, better learning.”

As we look to the future and the benefits that quality AI tools can bring to teachers and students, we might rephrase this as “AI you can trust, impact you can see.”

Learn More

Connect with an expert to explore the AI Create Lesson Generator and other quality AI-based tools for education.
