No datum left behind: Making good use of every bit of educational data

By Eric Stickney, Director of Educational Research

“Just what do you do with all that data?”

A superintendent asked me this question the other day, and I understand where she’s coming from. The educators in her district have put faith in Renaissance products. She is responsible for the students who complete the assignments, practices, and assessments that populate our databases.

Massive databases. The Data page on our website provides a live count of the data we receive and store from schools across the US, Canada, and the United Kingdom. During the 2014–2015 school year alone, we captured about 70 million achievement tests, feedback on 400+ million books and articles read, and results from 27 million mastery assignments in mathematics. In August, the data ticker will reset for the 2015–2016 school year, and I’m betting we’ll surpass those marks.


[Infographic: Data Learnalytics]

At Renaissance, we encourage teachers to use the data our solutions provide to make instructional decisions. We know that students benefit when their teachers receive timely, accurate information about what kids know and like, as well as about the topics they struggle with or are ready to learn next.

Besides storing and safeguarding this information so educators have access to both current data and historical context, we also work to give it back to educators in another way.

Every bit of information we maintain is put to work. We collect and analyze data with the goal of improving educational outcomes, squeezing out every possible insight about learning and teaching. What we find is shared with educators as actionable insights, and it also shapes our development decisions.

Just as important, Renaissance is deeply committed to the protection of school and student data. In all we do, we go to great lengths to provide aggregated data that is useful to educators, parents, and researchers while stopping well short of releasing information that could be used to identify any district, school, teacher, or student.

Here is just a sample of what we do with all of this data:

  • Learning progressions for reading and math provide a road map of where individual kids have been and the path they need to take next. Using a student’s score, educators can drill down and see which topics a student has mastered, as well as those they still need to work on to become proficient. Differentiating learning experiences in this way is nearly impossible to do manually. Our achievement data supports empirical validation and ongoing fine-tuning of these progressions. With accurate, up-to-date assessment data, every student can be placed at the correct starting point in the learning progressions and move forward from there.

  • Fidelity and best practices. We use data to review the extent to which teachers and students use our programs, and we try to make software changes that encourage even better implementation integrity. Why? Because years of research on implementation in both reading and mathematics have shown that when our solutions are used in a certain way, students are more likely to benefit and grow at an accelerated rate.

  • What Kids Are Reading. This annual report and searchable website represents the world’s largest survey of K–12 student reading behavior. Captured from data on millions of students, this information about popular books read by grade, gender, and book characteristics can help students search for engaging books to read. This data also informs the personalized Discovery Bookshelf inside Renaissance Accelerated Reader®.

  • Renaissance Star Assessments® are profoundly shaped by student data that informs item calibration, score norms, and links from Star to state summative and other assessments. In addition, psychometricians continually evaluate Star results for indicators of technical adequacy, which ensures the assessments are consistently measuring what they are intended to measure. Likewise, student growth percentiles (SGPs), which are reported in Star, require a large amount of historical data to understand what growth looks like by subject, grade, and type of student. SGPs help educators answer key questions such as: How is my student growing relative to academic peers? And how likely is a particular student to catch up to a level of proficiency on the state test?
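
The core idea behind an SGP is to rank a student's current score only against "academic peers" who started from the same place. The sketch below is a deliberately simplified illustration of that idea; the function name and sample data are invented, and production SGP methods condition on a longer score history and use quantile regression rather than a simple peer-group percentile.

```python
# Toy illustration of a student growth percentile (SGP): rank a student's
# current score among historical students who had the same prior score.
# Names and data are hypothetical; real SGP methods are more sophisticated.

def student_growth_percentile(prior, current, cohort):
    """Percentile of `current` among students whose prior score matched `prior`.

    cohort: list of (prior_score, current_score) pairs from historical data.
    """
    # Academic peers: students who started from the same prior score.
    peers = [cur for (pri, cur) in cohort if pri == prior]
    if not peers:
        raise ValueError("no academic peers with this prior score")
    # Percentile = share of peers this student's growth outpaced.
    below = sum(1 for cur in peers if cur < current)
    return round(100 * below / len(peers))

# Example: among peers who also started at 600, how does reaching 640 rank?
history = [(600, 610), (600, 625), (600, 640), (600, 655), (600, 670),
           (550, 580), (650, 660)]
print(student_growth_percentile(600, 640, history))  # 2 of 5 peers scored lower
```

Even this toy version shows why a large historical database matters: without many students at each prior score, the peer groups are too small for the percentile to be meaningful.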

And we’ve really only scratched the surface. Our first priority remains to encourage and assist educators in using their data to make sound instructional decisions and effect change in their classrooms. Beyond that, we strive to practice what we preach. Just as we ask teachers to use their data, we endeavor to use the data we gather to glean as much insight as possible into issues affecting teaching and learning and to further the development of our solutions.

We know the potential that lies in this data, so we continually examine it for other topics to research and share. Is there a research question you’d like to use data to explore? How do you use data to inform instructional decisions? Share your ideas with us below in the comments.

Eric Stickney, Director of Educational Research
Eric Stickney works with external independent researchers who conduct evaluations of Renaissance programs. He specializes in analyzing reading and mathematics data collected from millions of students in North America and the UK.


  1. Matt Renwick says:

    As a principal, I’m interested in how specific activities can help increase student engagement, especially at the intermediate grade levels. Numbers are fine, but they don’t tell me a lot about how motivated students are with their learning. Specifically, what qualitative types of information could be measured regarding engagement? What tools could help assess student dispositions toward learning and student interactions with each other about their reading lives? Accelerated Reader has a very robust system. It could be even better if kids were allowed to interact with each other in this online space. They should be able to not only rate books, but also write reviews, recommend books to others, and share their to-read lists. Think Goodreads for kids.

    I’m also troubled that Accelerated Reader still promotes a point system. How many studies need to come out before Renaissance Learning decides to scrap this external motivation tool, something that can actually decrease a student’s motivation to become a lifelong reader who doesn’t need a point system to pick up that next book?

    Data is great. Quick comprehension checks certainly give us some information about surface-level understanding. This information can certainly aid a teacher in being more responsive in their instruction. But if student information systems only provide a number, then they fail to encourage further learning for the sake of pursuing knowledge and interests. Some things just aren’t quantifiable. This is something I am looking to investigate.

    Matt Renwick

  2. Eric Stickney, Director of Educational Research says:

    Thank you, Matt, for your thoughtful comments. You raise an important point that numbers by themselves, without context, can be unhelpful. We strive to make sure that every metric we collect and communicate to teachers is educationally meaningful. And when we use data to shape our products and professional services, we do so by grounding the data in accepted theories and facts about teaching, learning, and measurement. We have a rich set of data from programs like Accelerated Reader or STAR assessments, which is, as you said, a key piece of the puzzle. A teacher or counselor’s observations about a student provide even more information. We agree that integrating the qualitative with the quantitative could provide even richer data, and we are actively thinking about ways to do that.

    Two other points you raised were also interesting. First, regarding student interaction, AR 360, the latest version of Accelerated Reader, supports peer interaction around informational reading. Technology, of course, supports social interaction, and we are investigating other ways students could interact that would support their learning, while avoiding common social pitfalls. Second, AR points serve as an indicator of practice. Points are a mashup of three factors: volume of text, difficulty of text, and student comprehension of that text. I think we can all agree that all three factors are important for teachers to know. Simply measuring reading by volume (books or words read) or difficulty can provide an incomplete picture of the student’s reading. Where it gets tricky is how metrics such as points are used. Tying points to competitions, prizes, and the like can create an extrinsic motivational system that may be unhelpful for encouraging students to become lifelong readers. We recommend they be used only in setting goals for individual students as an overall measure of those three critical reading practice factors. In the past we advocated using points as a motivational tool, but our recommendations here have evolved along with the research. Instead, if schools want to celebrate reading practice, we encourage celebrating students who meet their personalized goals.
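
    To make the "mashup" of those three factors concrete, here is a toy sketch of how such a metric could combine them. The function, weights, and formula are invented for illustration only; this is not the actual Accelerated Reader points formula.

```python
# Toy practice metric combining the three factors described above: text
# volume, text difficulty, and comprehension. The formula and weights are
# hypothetical, NOT the real Accelerated Reader points calculation.

def practice_points(word_count, difficulty_level, quiz_pct_correct):
    # Volume and difficulty set the value of the text itself...
    base = (10 + difficulty_level) * (word_count / 100_000)
    # ...and comprehension (quiz percent correct) scales what was earned.
    return round(base * (quiz_pct_correct / 100), 1)

# A 50,000-word book at difficulty level 5.0, with 80% quiz comprehension:
print(practice_points(50_000, 5.0, 80))
```

    The point of the sketch is simply that any single number built this way compresses three distinct signals, which is why how the number is used matters so much.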

    Again, I really appreciate your comments. We are always looking for more ways to improve our programs via our Research Panel. To sign up, visit

    • Matt Renwick says:

      Thank you, Eric, for responding. Your openness to suggestions and thoughtful explanations are appreciated. I look forward to observing the improvements in AR.