The Peabody Picture Vocabulary Test-5th Edition and the Expressive Vocabulary Test-3rd Edition – Recent Revisions of Assessments Essential to a Comprehensive Evaluation

Guest Post by Michael F. Shaughnessy, Eastern New Mexico University


Dr. Michael F. Shaughnessy received his Bachelor’s degree from Mercy College in Dobbs Ferry, NY, his master’s degree in counseling and guidance from Bank Street College of Education, and his master’s in School Psychology from the College of New Rochelle. He did his doctoral work at the University of Nebraska-Lincoln and has done post-doctoral study at George Washington University in head injury and brain trauma and additional study at Texas Tech University in Lubbock, Texas, in multiple disabilities.

Dr. Shaughnessy’s research interests are Intelligence Testing and Personality, Gifted Education, Projective Psychology, and Clinical Psychology.


In this blog post, Dr. Shaughnessy reviews revisions of two well-known tests – the Peabody Picture Vocabulary Test-5th Edition and the Expressive Vocabulary Test-3rd Edition. These instruments are often used as part of comprehensive evaluations, and their revisions provide updated norms and easy paper-and-pencil or digital administration.


The Peabody Picture Vocabulary Test-5th Edition

The Peabody Picture Vocabulary Test 5th Edition (PPVT™-5) is a well-known test that has served for decades as an integral part of most comprehensive evaluations. The PPVT-5 is a measure of receptive vocabulary designed for Standard American English; it assesses vocabulary acquisition, strengths and weaknesses in the domain of semantics and word knowledge, and overall general language development. The PPVT-5 can be used for screening for receptive language disorders, screening preschool children, measuring word knowledge, understanding reading delays and difficulties, and measuring growth at reevaluation.

The goals of the revision were to update normative data to provide accurate comparative information; simplify and shorten the administration process; and include item analyses that connect assessment to intervention. The PPVT-5 keeps the basic, well-known format of the PPVT-4 – the examiner says a stimulus word and the respondent selects the matching picture from a four-picture layout – with stimulus words and pictures more applicable to different cultures within the United States.

Examiners convert raw scores to age-based standard scores (M = 100, SD = 15), percentile ranks, normal curve equivalents (NCEs), stanines, and both age and grade equivalents. Douglas Dunn, the author of the PPVT-5, has written a comprehensive manual delving into the statistical aspects of the test. The manual provides evidence of reliability, including internal consistency, standard error of measurement, confidence intervals, and the reliability of test scores. In terms of validity, the manual provides evidence based on test content and response processes. The manual also reports relationships with other variables, specifically correlations with the PPVT-4, EVT-3, CELF Preschool-2, CELF-5, and the KTEA-3 Brief.
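The derived scores listed above are all transformations of the same score distribution. As a rough illustration only – the PPVT-5 itself uses normative lookup tables, and the function name here is hypothetical – the following sketch shows how a standard score (M = 100, SD = 15) maps onto a percentile rank, NCE, and stanine under a pure normal-curve assumption:

```python
from statistics import NormalDist

def derived_scores(standard_score, mean=100.0, sd=15.0):
    """Convert a standard score (M = 100, SD = 15) into common derived scores.

    Illustrative only -- real test kits use empirically built norm tables,
    not a pure normal-curve conversion.
    """
    z = (standard_score - mean) / sd
    percentile = NormalDist().cdf(z) * 100      # percentile rank (0-100)
    nce = 50 + 21.06 * z                        # normal curve equivalent
    stanine = min(9, max(1, round(2 * z + 5)))  # stanines run 1-9
    return round(percentile, 1), round(nce, 1), stanine

print(derived_scores(100))  # mean score -> 50th percentile, NCE 50, stanine 5
print(derived_scores(115))  # one SD above the mean
```

In practice, examiners read these values from the test’s norm tables or from Q-global rather than computing them, but the sketch shows how the different score types relate to one another.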

Special group studies included populations diagnosed with language disorder, language delay, and specific language impairment; autism spectrum disorder; specific learning disability in reading and/or writing; and hearing impairment with cochlear implants.

The publisher’s website and testing manual have additional information about examiner qualifications, age range, administration time, and the paper-and-pencil and digital formats of the assessment.

The Expressive Vocabulary Test-3rd Edition

The EVT-3 is a valid, reliable measure of expressive vocabulary and word retrieval for Standard American English. The EVT-3:

  1. Measures expressive vocabulary
  2. Offers comparisons of receptive and expressive vocabulary with concurrent administration of the PPVT-5
  3. Assesses strengths and weaknesses in the realm of semantics and word knowledge
  4. Offers an assessment of general language development
  5. Generates evidence-based interventions through the online Q-global system

The goals of the updated Expressive Vocabulary Test, Third Edition (EVT™-3) were to update normative data, include item analyses to connect assessment to intervention, and offer digital applications that maintain the basic qualities of the EVT–2 – ease of administration and response capture, brief administration time, and accurate and straightforward scoring.  The stimulus words and pictures were also updated to be culturally relevant.

The EVT-3 will be available in paper and pencil and digital formats. The digital format includes digital scoring and reporting, digital stimulus books and manuals, and digital administration and scoring via Pearson’s Q-interactive® system.

The test yields age-based standard scores (M = 100, SD = 15), percentiles, normal curve equivalents (NCEs), stanines, and age and grade equivalents. Scoring options include digital Q-global scoring, although examiners can also hand score using the manual.

The manual provides evidence of reliability, including test-retest stability, internal consistency, standard error of measurement, and confidence intervals for the normative sample. In terms of validity, the manual provides evidence based on test content, response processes, and relationships with other variables, such as correlations with the EVT-2 and PPVT-5. The PPVT-5 and EVT-3 were co-normed, offering a high degree of confidence in comparisons between the results of the two tests.

The publisher’s website and testing manual have additional information about examiner qualifications, age range, administration time, and the paper-and-pencil and digital formats of the assessment.

Overall Summary

When used together, the PPVT-5 and the EVT-3 offer direct comparisons of receptive and expressive vocabulary. The general idea is that using both tests together lets examiners move directly to evidence-based interventions based on the data gathered from the two tests.

Both instruments will be available via Q-interactive, Pearson’s web- and iPad®-based system for interactive, seamless assessment, scoring, and reporting. With Q-interactive, examiners administer assessments with an intuitive, portable system that employs two iPads connected by Bluetooth. The student or subject views the test stimuli on one iPad while the examiner uses the other to view and control administration directions and verbal stimuli and to record and score responses. New users can receive an orientation or sign up at HelloQ.com for an overview.

Both tests have a long and rich history of use, and their importance cannot be overstated. These two instruments are easy to administer and score, and they provide an excellent baseline and pre-post measure of growth and development to ascertain whether students are responding to intervention. They will provide excellent information for speech-language therapists, school psychologists, educational diagnosticians, and others involved in the evaluation and assessment of preschool and school-aged children, adolescents, and adults.

References

Dunn, D. M. (2019). Peabody Picture Vocabulary Test (5th ed.) [Measurement instrument]. Bloomington, MN: NCS Pearson.

Williams, K. T. (2019). Expressive Vocabulary Test (3rd ed.) [Measurement instrument]. Bloomington, MN: NCS Pearson.

Perceptions that Promote Achievement

Guest post by Del Siegle, PhD


Del Siegle is a professor in gifted education in the Neag School of Education at the University of Connecticut, where he was honored as a teaching fellow. Prior to earning his PhD, Del worked as a gifted and talented coordinator in Montana. He is past president of the National Association of Gifted Children and has served on the board of directors of The Association for the Gifted. He is also past chair of the AERA Research on Giftedness, Creativity, and Talent SIG. He has been co-editor of the Journal of Advanced Academics and Gifted Child Quarterly. He writes a technology column for Gifted Child Today. Del’s research interests include web-based instruction, motivation of gifted students, and teacher bias in the identification of students for gifted programs. Along with Gary Davis and Sylvia Rimm, he is an author of the popular textbook, Education of the Gifted and Talented (6th and 7th ed.). He is the Director of the National Center for Research on Gifted Education (NCRGE), which replaces the former National Research Center on the Gifted and Talented (NRC/GT).


Gifted students, similar to other students, can be at risk for academic failure, and the seeming lack of motivation of many academically gifted students is an area of frustration and concern for many parents and teachers. The underachievement of some of America’s most talented students represents a loss of valuable human resources for the nation, as well as unrealized fulfillment for the individual.

We know that

• Teachers nominate three times as many gifted boys as gifted girls as underachievers. Are more gifted boys underachieving, or are underachieving gifted girls being overlooked?

• Parents and teachers indicate higher rates of inattentive behavior at home and school for underachievers. Is inattention a cause of underachievement or a product of it?

• Developing trusting relationships with gifted underachievers, making school meaningful, and helping students see the importance of school appear to be the strongest strategies for improving gifted underachievers’ grades. How can educators make school more meaningful for gifted and talented students?

One way to understand why students are not motivated to achieve is to study what motivates individuals. Our work has demonstrated a relationship among three key perceptions (self-efficacy, goal/task value, and environmental perceptions) and a resultant behavior (self-regulation). Students must believe they have the skills to do well before they will tackle a task. This self-efficacy appears to be necessary, but not sufficient, for students to achieve.

Students must also find school tasks meaningful and valuable. Even if students believe they have the skills (self-efficacy) to do well in school, they will not complete schoolwork they do not see as meaningful. Many gifted students do not see the work they are doing in school as meaningful for several reasons. They may already know much of what is being presented to them. They also often have specific passion areas they enjoy exploring but are seldom given opportunities to pursue in school. Educators often fail to share with students why the content they are teaching is significant and how it relates to the world in which they live.

Additionally, students’ perceptions of school and home events, the nature of teachers’ and parents’ expectations and support, and the patterns of interaction among students, teachers, and parents have an impact on students’ academic attitudes and behaviors. These environmental perceptions interact with students’ goal/task values and self-efficacy to help them set realistic goals and self-regulate. If any one of these three attitudes is low, individuals can fail to engage and achieve. Self-regulation and study skills are important for academic success; however, they interact with attitudes and are a resultant behavior that enables students to be successful. Helping students recognize and appreciate the abilities they are developing, helping them see the meaningfulness of the tasks we ask them to do, and letting them know we support their efforts as they approach challenges promotes the achievement-oriented attitude necessary for success.

Resources for parents of gifted students

Last week’s post by Dr. Pfeiffer was one of our more popular posts so far. Dr. Pfeiffer wrote about facilitating a presentation for families of gifted children during which the audience discussed answers that they had submitted in response to the question, “What is the one thing that you find most worrisome or challenging as a parent of a gifted child?”

Dr. Pfeiffer wrote about the power of the discussion as participants shared similar concerns and experiences. We know that having a community of families with similar interests and concerns can provide an important source of information and support.

To follow up on that post, we collected a set of resources that we hope will be helpful for parents to learn more about available educational opportunities, research about giftedness, and organizations that support gifted students and families. These resources can also be helpful for educators to share with students and families.

National Association for Gifted Children – Resources for Parents. A great set of resources, publications, links for further information, and tips for parents of gifted children. The site includes brochures and handouts that families can share with schools. Also check the websites of your state-level organization. For example, the Ohio Association for Gifted Children (OAGC) includes information and documents that parents and families could find helpful.

Your state department of education and your local school district should have a page with information for parents specific to your state and local community. The resources on the Ohio Department of Education website include information about giftedness, educational strategies for serving gifted students, and links to state and national organizations.

Of course, we suggest that you look at the Duke TIP website for information. Dr. Pfeiffer, former Executive Director of Duke TIP, and I, a former Research Postdoctoral Fellow there, have great respect for the work, research, advocacy, and resources that Duke TIP offers students, families, and the gifted community.

University centers offer useful research-based resources and services for students. Below is a sample of these centers. You can see that resources are located across the country and in a variety of institutions. These are just a few of the centers that focus their work on gifted students. Check your area to see if a local institution of higher education has a summer program, research center, or program that could be of interest.

These websites are a great place to start your search or to learn about new developments and research about effective educational strategies and programming for gifted students.

Grant Evaluation for Gifted Educators – A Deeper Dive


In the January 16, 2020 post, I discussed grant writing for gifted educators. In today’s post I address how program evaluation aligned with a project’s research agenda can enhance your project, and provide good reasons to include an external evaluator in your project.

Program evaluation: enhance your proposal and your project

Respond to the call for proposals. The first reason to focus on evaluation in your grant proposal is that funders ask for it. We know that the key rule of grant writing is to include all the elements that the funder seeks. Most education-focused grant programs at the National Science Foundation, National Institutes of Health, and US Department of Education require an explanation of how grant activities and outcomes will be evaluated.

Support program implementation and improvement. Evaluation is important not just because the funder asks for it, but also because evaluation can support a team’s work and provide information and feedback that improve the project’s chances of success.

Program evaluators speak of two types of evaluation: formative and summative. Formative evaluation collects information and provides feedback to a project team about progress and perception of activities. This information can help the team understand if they are on track to complete project activities, if changes to the project plan or timeline should be considered, and if the expectations of the project participants and stakeholders are being met. Summative evaluation focuses on the outcomes of grant activities and assesses the impact of a program on participants and the extent to which a program resulted in expected outcomes.

Program evaluation aligned with the research agenda

External evaluation and research activities can focus on similar areas and use similar, or even the same, data to answer different questions. An evaluator might assess the extent to which a curriculum meets teacher needs, provides sufficient support and materials for implementation, and engages the interest of students. Concurrently, the research team can examine the connection between student engagement and motivation.

Research and evaluation plans should be closely aligned to ensure efficient and complete data collection while avoiding repetitive or intrusive data collection. Communicating to grant reviewers, program participants, and stakeholders that the research and evaluation efforts are aligned and coordinated helps build trust and shows respect for participants.

To create an aligned plan, engage an evaluator early in your planning and proposal-writing process. Developing an evaluation plan along with the research agenda can help align and organize data collection activities and ensure that all required information and data will be available by project end.

External evaluation support

No doubt you are thinking, “Since the research team is already studying the project, can’t the research team just collect information about implementation and participant feedback?” Yes, that is possible. However, here are a few reasons a skilled and experienced evaluator should be engaged.

First, an outside perspective is helpful. Evaluators can help shed light on a theory of action that is missing a step or help clarify assumptions at program onset. Discussions with an evaluator often help solidify thinking about project work and can position a project for successful implementation.

Experience is another reason. Experienced evaluators offer broad knowledge of assessment tools, data collection practices, and evaluation methodologies that can improve the work of the team. These skills can improve research efforts as well, allowing you to focus on answering the questions that are of interest to you and your research agenda. Skilled independent evaluators should not come to your project to answer their own questions; they should come with the goal of answering the questions that will help ensure your program runs smoothly and as intended.

And finally, an evaluator can help disseminate results, new practices, or innovative methods to different audiences. Evaluators can share innovations in assessment tools and methods, data collection strategies, analysis methods, and effective data visualization and reporting efforts. Often evaluators are willing to accompany principal investigators and project teams to conferences or professional meetings to help share and promote project outcomes.

With the right partner, external evaluation support can be a benefit to any project. If you have an idea for a grant submission that you would like to discuss, please email me at Tania@CenseoGroup.com. Our team would be excited to work with your team to develop your idea, to support your grant writing and grant evaluation efforts, and to increase research about effective programming for gifted learners.

Censeo Group helps clients set their goals, as well as attain them!

Grantwriting for Gifted Educators – Program Evaluation Support

Since it’s still January, let’s continue discussing our professional goals for the new year. Let’s focus on your goal to start or increase the success of your grant funding efforts – or maybe that is a goal of your superintendent or department chair.

January is a great time to begin working toward that goal, and as a professional program evaluator and frequent contributor to grant applications, I am both equipped and glad to help you. If you are thinking, “Hm, perhaps my organization can finally write for or be awarded one of these grants this year”, read on!


Over the past 15 years, I have led a team of evaluators, statisticians, and data collectors at Censeo Group, an external evaluation consulting firm. We support school districts, state departments of education, faculty, and grantmakers in implementing and measuring the impact of grant-funded activities. We help educators craft evaluation and research sections of grant proposals and ensure that goals, activities, plans, and personnel are aligned. Our team has evaluated programs across the country at the K12 and postsecondary levels in formal and informal education settings. We have worked with educators who are creating curriculum units and teacher professional development aligned with the Next Generation Science Standards (NGSS), Minecraft activities to help middle school boys and girls develop spatial skills, literacy initiatives, mathematics and Earth science teacher professional development, and online learning for gifted students in language arts instruction.

Censeo Group is currently the external evaluator of a newly funded School Climate Transformation Grant (SCTG), and we are familiar with federal evaluation requirements for proposal writing and evaluation reporting. We have extensive experience evaluating US Department of Education awards – from Reading Excellence Act grants in the late 1990s and Reading First grants to Javits and current SCTG grant awards.

The Jacob K. Javits Gifted and Talented Students Education Program at the U.S. Department of Education is by far the largest and most focused grant program for gifted educators. The 2019 appropriation for this program was $12,000,000. As of this writing, there is no information posted about the FY 2020 Javits competition, but if the timeline is consistent with the 2019 competition, the grant application could be available in early May, with a deadline to submit in early June. The FY 2019 competition was open to state educational agencies, local educational agencies, the Bureau of Indian Education, institutions of higher education, and other public and private agencies and organizations. If you are thinking about submitting a Javits grant proposal, we can help by discussing your project ideas and evaluation plan so that perhaps your program can be one of this year’s Javits awardees.

A good deal of Censeo Group’s evaluation work is with Principal Investigators funded by the National Science Foundation (NSF). We are knowledgeable about the formal and informal education grant programs and can help you tailor your proposal for the NSF. A number of completed NSF grants have focused specifically on services for, or the study of, gifted learners in informal settings through the NSF Advancing Informal STEM Learning (AISL) grant program, including projects at the Belin-Blank Center and at the University of Connecticut’s Neag School of Education. Gifted researchers at the University of North Texas received an Innovative Technology Experiences for Students and Teachers (ITEST) program award, and researchers at George Washington University (OSPrI project) and SRI International (iSTEM project) received funding through the Discovery Research PreK-12 (DRK-12) program.

Currently, 24 active National Science Foundation (NSF) grants include gifted students in project activities. The majority of these projects, funded by specific NSF science directorates, focus on basic science research and include gifted students in summer experiences or lab visits. For example, the Division of Astronomical Sciences (AST) funded a study of faint dwarf galaxies at Ohio State University (NSF award number 1615838) that includes, as a small component, a dwarf-galaxy-hunting project for gifted high-school students attending a summer program, scholarships for students to attend the program, and a video blog (vlog) and a guide for students interested in scientific careers.

Censeo Group evaluators could support NSF-funded projects that include gifted learners in several ways:

  • Measure changes in student STEM attitudes, learning, and career interest as a result of their engagement in project activities.
  • Investigate the methods and impact of mentoring provided by undergraduate and graduate students.
  • Study the impact of mentoring on project staff.
  • Measure the impact of the summer learning experience and activities.
  • Support building curriculum materials from summer programs into robust learning experiences for informal or formal science settings.
  • Study the process of implementation, and the factors that supported or interfered with project activities in order to improve implementation in subsequent years.
  • Support and evaluate the effectiveness of dissemination and communication efforts.

If you have an idea for a grant submission that you would like to discuss, please email me at Tania@CenseoGroup.com. We would love to work with you to develop your idea and support your grant writing and grant evaluation efforts to increase research and effective programming for gifted learners. Censeo Group prides itself on helping clients not only set their goals but also attain them!

Mid-school year reflection

The end of the first semester provides a good opportunity to reflect on the successes that you experienced and the work that you hope to accomplish before the end of the school year. It is a good time to start thinking about professional development, consider changes in your practice, and reaffirm, or perhaps create for the first time, your goals for the 2019-2020 school year.

Today’s post provides a downloadable worksheet to help you reflect on your classroom instruction, gifted assessment practices, the support you are providing to gifted learners, policies or procedures that you would like to address, or a problem that has been nagging you.

You will have the chance to reflect on questions such as, “What plans did you make for your classroom, district, or department when the school year started? Have you made progress on those goals? Are there professional development opportunities that you have been thinking about? Have you heard about a book or article that could help with your work? Are there opportunities to share your work with colleagues in your district, state, or professional organization?”

By the end of the exercise, you will have an action plan: ideas for what you will accomplish each month and how you’ll hold yourself accountable for your goal.

Creativity and Design Based Learning – Applications to Gifted Education

One of the domains measured by the Gifted Rating Scale is creativity. When we think about creativity, we often think about its role in artistic endeavors. In today’s blog, I wanted to highlight the importance of creativity in research, engineering design, and design-based practices and offer several resources about these practices in gifted education.

At Censeo Group, the external evaluation firm that I lead, our portfolio includes a number of science, technology, engineering, and mathematics (STEM) evaluation projects. Our evaluation team often supports faculty who are developing and researching the effects of STEM curriculum materials and instructional practices on student learning. We recently completed a project supported by the National Science Foundation’s (NSF) Innovative Technology Experiences for Students and Teachers (ITEST) program. The University of Akron Zip to STEM team developed a curriculum unit that integrated engineering and technology into force and motion science instruction. Middle school students in the Akron Public Schools (APS) used technology, including CAD software and a virtual wind tunnel, to design, test, and race a soap box derby mini car. Through this project, 8th grade APS students had the opportunity to experience hands-on, engaging learning, use technology, and apply science concepts to a practical task.

Activities that incorporate the engineering design process or design-based learning allow students to engage in hands-on, applied projects in which they search for problems, investigate the viewpoints of those who use a product or engage with a system, work with diverse teammates who bring different perspectives to problem solving, analyze their proposed solutions, and iterate to improve them. Design-based learning processes guide students to deep learning around a practical problem for which students propose solutions.

A recent episode of Bonni Stachowiak’s Teaching in Higher Education Podcast with guest Nicola Ulibarri provides a great introduction to design thinking and the importance of creativity in research and design endeavors. Dr. Ulibarri’s recently-published book, Creativity in Research, is a good resource for researchers and for teachers who are supporting student-led research activities. Although the podcast focuses on teaching in higher education, there are many ideas relevant to K12 educators and gifted education.

Below is a list of articles and books that discuss engineering design and design-based thinking in gifted education and provide a good starting point for learning more about this topic.

Guest Post: Alan S. Kaufman, Some Thoughts About the Super-Gifted

We are grateful that Dr. Kaufman agreed to write a guest post on the Gifted Assessment Insights blog. We hope that you enjoy Dr. Kaufman’s post about super-gifted students.


Alan S. Kaufman, Yale University Child Study Center

As a professional, I have scoffed at headlines about a backwoods teenage genius with an IQ of 190 or an insightful and funny columnist who advertises herself as the smartest person in the world with an IQ of 228. As I argued in IQ Testing 101 and elsewhere: The norms just don’t go that high, stratospheric IQs are imaginary numbers, and they have no scientific justification.

I had to struggle to get David Wechsler to extend the WISC-R norms to 160, even though he insisted that going 4 standard deviations above the mean was nothing more than guesswork. Although professionals I respect—such as Linda Silverman (Giftedness 101) or the late great Julian Stanley at Johns Hopkins—have made the study of super-gifted people both an art and a science, I have resisted. Certainly the age-old IQ formula of MA divided by CA × 100 permits the mathematical computation of absurdly high IQs, but that formula should have become extinct eons ago along with the woolly mammoth.

As a professional, with a PhD in psychology mentored by super psychometrician Robert L. Thorndike, don’t get me started about superstars with IQs of 200. But as a parent and grandparent… well, that is a bit different. In that domain, there is Jennie who spoke in full sentences before her first birthday, David the musical theater star, Nicole the Emmy-winning producer and film-maker, and Kate the poet. And there is James, who I still call Jamie, who edits the Psychology 101 series that features my book on IQ and Silverman’s on giftedness, and who is among the leading creativity scholars in the world.

When Nadeen used 5-year-old Jamie as her “test subject” for her assessment course at DePaul University, she administered the McCarthy Scales, or at least tried to. As my teaching assistant at the University of Georgia, Bruce Bracken, discovered two years earlier when he used Jamie as a demonstration subject for the course on preschool assessment, it was nearly impossible to stay a step ahead of Jamie. (Bruce had to use an abundance of creativity—Paul Torrance was chair of the department—and rely on more self-control than he possessed to make it through the class.)

Nadeen faced the same crisis when Jamie decided that he could do a better job of administering the McCarthy Scales than she could. He took the manual and followed the standardized procedures and exact wording as he proceeded to administer two subtests to two different graduate students before Nadeen regained control of the demonstration with as much parental restraint as she could muster. So naturally I used Jamie as my demonstration subject at age 6 for my Wechsler-Binet course.

My TA Leslie was doing a wonderful job administering the 1972 Stanford-Binet to him, even able to keep Jamie under hypnotic control for 45 minutes while managing to pause occasionally to instruct the class on administration subtleties, scoring ambiguities, and techniques for maintaining rapport. She drew on the board Jamie’s abstract design, which he had to draw from memory, and explained that it should be scored 1.5 out of a possible 2. Jamie turned to the board and said, “What??!! That is perfect. Why did I lose half a point? Not fair.” But he was redirected, and the session continued. . . and continued . . . and continued.

On the old Binet, the examiner had to keep testing until the child or adult reached a ceiling, which meant failing all tasks at a given level, typically six different tasks. But Jamie got all tasks right at the 10-year level, and kept getting one or two or three correct through Level XIII. My TA was eyeing me for permission to end the session, which was pushing two hours, even though a legitimate ceiling hadn’t been reached. I avoided eye contact with her and she continued. At Level XIV, he failed the first five tasks, and a visibly relieved Leslie exhaled; one task to go! Then Jamie sailed through “Reconciliation of Opposites” with ease. That meant administering the Average Adult (AA) level to him, the only level with 8 tasks, in order to get a ceiling. Sensing a rebellion by Leslie and about half the class (the others were caught up in “How high can he go? Just how high is his IQ?”), I relented. We violated the standardization procedures and quit testing without reaching a ceiling. I let everyone leave. And I never even bothered to compute the IQ he would have earned if he had failed all 8 AA tasks, though I knew it was through the roof.

I never tired of using my kids as demonstration subjects, because it allowed me and my various TAs—for a couple of years it was Cecil Reynolds—to instruct the class while Jennie, David, or Jamie would (usually) wait patiently for the feedback to be given to the eager and often brilliant graduate students. Yet one of my favorite stories happened in the real world, not in my classroom. My granddaughter was referred for gifted placement as a second grader in San Diego. When the examiner started to test Nicole, the acronym “K-BIT” stared back at her. “That’s my grandparents’ test,” she exclaimed, though she had never seen it before. “Now, Nikki Hendrix, let’s save the tall stories for later,” the examiner chastised her. But with a bit of persistence, my granddaughter managed to get the reluctant woman to retrieve the manual from the kit and find the Dedication, which Nicole knew by heart (“To Nikki, with love, from Nana and Papa”); then the evaluation began. Gifted indeed!

So what exactly do I believe about kids or adults with super high IQs? I guess, deep down inside, despite my scientist’s skepticism, I think that maybe Marilyn vos Savant did earn an IQ of 228, with a mental age of 22 years 10 months, when she was a girl of 10.
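For the curious, the vos Savant figure can be checked against the old ratio formula (MA divided by CA × 100) mentioned earlier. A minimal sketch in Python, with ages converted to months so the partial year counts (the function name is my own):

```python
# Old ratio IQ: mental age (MA) divided by chronological age (CA), times 100.
# Ages are expressed in months so partial years are counted.
def ratio_iq(ma_months, ca_months):
    return ma_months / ca_months * 100

ma = 22 * 12 + 10  # mental age of 22 years 10 months = 274 months
ca = 10 * 12       # chronological age of 10 years = 120 months
print(round(ratio_iq(ma, ca)))  # prints 228
```

The arithmetic does work out to 228, which is exactly the kind of absurdly high number the formula permits.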

#NAGC19 – Social Media at the NAGC 66th Annual Convention

This year I was unable to attend the National Association for Gifted Children (NAGC) conference; however, I followed the hashtags #NAGC19 and #NAGC2019 on social media. My thanks to the 200 conference attendees who shared their learning and impressions of the conference. The summary of hashtag activity (created with the free version of Keyhole) shows the wide reach of the social media posts – more than a half-million impressions from tweets and posts. And this summary does not include posts made on private Instagram accounts!

Social media analysis NAGC19

The overall tone of the posts was positive, with tweets focusing on session content and excitement about being at the conference. The post with the highest positive sentiment – measured through retweets and reposts – was by swanwick_w, featuring the hot air balloon that was part of the Family Day activities. Several people posted photos of this gorgeous balloon and wrote about the success of Family Day.

The post with the highest engagement was from Anne Rinn, congratulating student winners at the NAGC Research Gala.

Although the majority of posts were positive, I was curious about those tagged as having negative sentiments. These posts were related to two issues: 1) improving services, and 2) better understanding student needs. The first was a thread about perfectionism and anxiety in gifted students.

The second set of tweets tagged as negative were discussions about the “Confronting Pseudoscience in Gifted Education” session presented by Kate Snyder (@DressageProf), Bess Wilson, Matt McBee (@TunnelOfFire), Scott J Peters (@realScottPeters), and Frank Worrell.

The result that most surprised me was the infrequency of posts focused on assessment. Only 21 tweets included the words “assessment,” “test,” or “testing.” These tweets, focused on improving the equity and psychometric strength of assessment, clearly aligned with our interest in a valid, reliable, and fair revision of the GRS. We hope that the posts on the Gifted Assessment Insights blog and our Twitter account (@Gifted_Assess) can help support important discussions about assessment in gifted education.

The other top influential tweets celebrated the successful conference.

Thank you to the attendees who posted on social media so that those of us who could not attend could still experience a bit of the conference.

Creativity in Learning

Today we celebrate Halloween in the United States. Halloween provides a good opportunity for children to express themselves and highlight their creativity – costumes, carved pumpkins, scary stories, decorations. Halloween also suggests a great topic for today’s blog post: creativity.

Ronald Beghetto, a creativity researcher, has described creativity as occupying a “conflicted position” in education. Although teachers and parents say that they value creativity, creativity is not often included in educational practice. Creative expression is often limited to art or music classes, writing seminars, class celebrations, or special events.

The status of creativity in gifted education is similarly conflicted. Although included in federal and research conceptions of giftedness, creativity is not included in the guidelines of all state departments of education for identifying gifted students. GRS co-author Steven Pfeiffer reported in 2012 that the number of states that included creativity in their definition of giftedness decreased from 30 states in 2000 to 27 states in 2010. At that time, 45 states included intelligence in their definition. According to the 2014-2015 NAGC State of Gifted Education Report, the number of states that included creativity in their state definition had since declined further, to 21 states.

In some cases, even though creativity is included in a state’s definition for gifted identification, this category does not stand on its own. For example, the Ohio Department of Education’s definition of Creative Thinking Ability requires students to score one standard deviation above the mean on an intelligence test and also attain a qualifying score on a creativity test or checklist.

Ohio’s definition of creative giftedness aligns with Renzulli’s category of “schoolhouse giftedness,” which refers to students who score well on traditional intellectual assessments and perform well in school. This is in contrast to what Renzulli refers to as “creative/productive giftedness,” which refers to students who develop original knowledge or products and employ integrated and problem-oriented thought processes. The Gifted Rating Scale, in its 12-item creativity scale, addresses both types of creativity, highlighting creative thinking, innovative approaches to problem solving and academic activities, and also the production of creative products and activities.

What do restrictive definitions of creativity, or the lack of focus on creativity mean for gifted students and educators? How do we ensure that students who display high creativity as well as those who have the potential for creative thinking are engaged in creative activities?

A research study released just this week can support our work. The Creativity in Learning Gallup study found that creative thinking in classroom assignments supports higher-order cognitive skills and more engaged and confident learners. The study described creative activities as those in which students were able to:

  1. choose what to learn in class;
  2. try different ways of doing things;
  3. come up with their own ways to solve a problem;
  4. discuss topics with no right or wrong answer;
  5. create a project to express what they’ve learned;
  6. work on a multidisciplinary project;
  7. work on a project with real-world applications; and
  8. publish or share projects with people outside the classroom.

Teachers who used these strategies in assignments saw in their students higher levels of critical thinking, stronger connections between subjects, deeper knowledge of subject matter, and more effective retention of knowledge than teachers who did not use these strategies to engage the power of students’ creativity. Teachers who used technology to support creative assignments saw the highest levels of cognitive engagement and learning.

The Gallup study provides a compelling argument to ensure that assignments and lessons allow students, particularly highly creative and productive students, to engage their creativity. This is particularly important considering another key finding of the study, which supports Beghetto’s notion of the conflicted position of creativity: despite the fact that teachers and parents agree on the importance of creativity in learning, students spend little time on activities that foster creativity and connect with real-world applications.

Download a copy of the Creativity in Learning report here to read more about how creativity in learning supports positive outcomes, how technology can support creativity, and the importance of a collaborative and supportive school culture to support creativity in learning. The Gallup study findings can provide evidence to support your work with all students, but particularly with gifted students who might not be experiencing activities that harness and engage their creativity.