334 Transcript


[00:00:00] Dr. Sharp: Hello everyone. Welcome to The Testing Psychologist podcast- the podcast where we talk all about the business and practice of psychological and neuropsychological assessment. I’m your host, Dr. Jeremy Sharp, licensed psychologist, group practice owner, and private practice coach.

This podcast is brought to you by PAR.

Conduct a broad-based assessment of personality and psychopathology with the Gold Standard Personality Assessment Inventory or PAI. The new PAI Spanish Revised Translation retains semantic equivalence while using clearer and more inclusive language. Learn more at parinc.com/pai.

All right, everyone, welcome back to the Testing Psychologist podcast. 

My guests today are quite a team. I’ve got Dr. Joni Lakin and Anna Houseman talking to me all about the CogAT®.

The CogAT®, as some of you may know, [00:01:00] is an ability test primarily administered in schools and used for GT assessment. So we’re going to dig into many aspects of the CogAT® but really focus on strengths-based applications and how to use the CogAT® in schools outside of the typical domain of GT assessment.

We get into the basics and background of the CogAT®: what it is, how it was developed, how it compares to IQ tests, and so forth. We talk about strengths-based applications, of course, and how we can use CogAT® data to identify and perhaps close the ability-achievement gap. We take a little detour into spatial reasoning and how it is an underappreciated skill, and many other topics. So this is a good one. Whether you’re familiar with the CogAT® or not, I think there’s a lot [00:02:00] to take away.

Let me tell you about my guests.

Dr. Joni Lakin is a professor at the University of Alabama and is a co-author of the Cognitive Abilities Test or CogAT® Form 8. She studies educational measurement issues related to test validity and fairness with a particular interest in the accessibility of tests for English learner students. She also studies science, technology, engineering, and math education and interventions that promote STEM retention along the academic journey.

Anna Houseman is the Product Marketing Director at Riverside Insights. Before Riverside, Anna taught elementary and middle school and served as a district assessment director.

As you can tell, they come at this topic from two very important perspectives, and the symbiosis of their opinions is evident during our discussion. Lots to take away from this one.

[00:03:00] If you’re a practice owner and you’re looking for some group coaching, accountability, support, and connection with other practice owners who are running testing practices, I would invite you to check out The Testing Psychologist Mastermind groups. There’s a group for every level of practice development: beginner, intermediate, and advanced. You can get more info and schedule a pre-group call at thetestingpsychologist.com/consulting.

All right, let’s get to my conversation with Anna Houseman and Dr. Joni Lakin.

Joni, Anna, welcome to the podcast. 

Dr. Joni: Thank you.

Anna: Thanks for having us, Jeremy. We’re excited to be here. 

Dr. Sharp: I’m excited to talk with you. Let’s do [00:04:00] just a little orientation for the listeners, which is pretty common when I have multiple guests. Joni, could you just talk a little bit and give a super brief background so people can start to identify your voice? And then Anna, you can do the same thing after that. 

Dr. Joni: Yeah. I’m Dr. Joni Lakin. I’m a professor at the University of Alabama. I have been working on the Cognitive Abilities Test, CogAT®, for probably about 15 years- since before I had my Ph.D. So it’s something that has been part of my research and my practice all along. And so, I’m really excited to talk about the ways that we use that data and how it can help schools.

Dr. Sharp: Great. 

Anna: My name is Anna Houseman. I work directly with Riverside Insights, which is the vendor that distributes and supports educators with the Cognitive Abilities Assessment. I lead some of the product development and marketing initiatives at Riverside for CogAT®. Previously, I was an [00:05:00] elementary and middle school teacher in New Jersey and New York, and then was an assessment coordinator for a number of years and an assessment director. So, I have a bit of experience with how to actually use the data in the classroom. And so right now in my current role, I get to work with districts across the country to help them better utilize the assessment data in the classroom to really drive student growth.

Dr. Sharp: That sounds great. I love having both of you here and bringing those different but complementary perspectives. I’m excited for our conversation and personally excited because I have kids who are right in this zone. I have a 4th and a 5th grader, so we’ve had some CogAT discussions in our home over the last two years. It’s nice to be talking with y’all and going direct to the source of some of this info that we were looking at.

I will start as I always start, which is by asking the question, why is this important to you? Out of all the things that you [00:06:00] could focus on in our field, why this in particular? Joni, I’ll let you go first.

Dr. Joni: I’ve always been really interested in how we plan instruction to help students maximize their potential. So, some of the applications we’ll talk about are gifted education, but I’m interested across the spectrum of how we identify students’ strengths and help them to develop that. So working with tests, working with assessment data, to me is a really great way of informing instruction or differentiating instruction. So, I’m always excited to talk to practitioners about some of the myths or misunderstandings of ability testing and help them to see the value of those tools. 

Dr. Sharp: I love that. Anna, how about you? 

Anna: This topic has been really personal to me honestly since I was a little girl. Many members of my family learn and think very differently. I remember as a kid not really understanding why they were struggling in school and why they hated school. [00:07:00] And then I decided, at the age of 14, that education was what I wanted to commit my life to. That I wanted to help kids figure out their true potential because I saw my siblings and members of my family that just really struggled with that. So I became a teacher.

When I was teaching in some really low-income areas, all of our focus was on getting our kids to pass the state test and on getting them to close that achievement gap. And so what we would do is stop instruction in January and then test prep from January through May.

What we saw over time is that in kindergarten, they were doing really well. 1st grade, they were doing really well. 2nd grade, they were doing really well. In 4th, 5th, 6th, and 7th grade, they were not doing well. As the content got harder, as the content got more rigorous, and as they were forced to critically think and problem-solve, they struggled. And then they got to college and they failed out. And it was because when I looked back in hindsight, all of our [00:08:00] kids were just focused on retaining and regurgitating skills that we were teaching in the classroom and not really learning how to be a thinker and not really learning who they were as a student or what their strengths were.

What is so cool about the work we’re doing with CogAT is that we’re actually trying to flip the script on how we teach students that it’s so much less about retaining and regurgitating knowledge and so much more about how a student thinks, how they problem solve, helping them be confident as a learner, and then long term, how that leads to their own personal growth and success.

So, it is super cool, what we’re doing, and I think it is, especially post covid, a really great opportunity to redefine what instruction looks like in the classroom and how to really support both cognitive growth, but then also, individual understanding and growth as well.

Dr. Sharp: That’s a good point. I feel like people are maybe doing a little bit of a reset now, or at least we’re putting more of a [00:09:00] microscope onto student learning following covid. So it is good timing to think about a different approach. It’s interesting to me to hear you say that component about stopping instruction in January and teaching to the test. You always heard that, or I always heard that over the years, and to know that it’s true is sad and validating. 

Anna: Yeah. And it was unfortunate because, like I said, I was teaching in a very low-income area and all we were trying to do was close the achievement gap for kids. But in hindsight, what I didn’t realize is that the way we were trying to close the achievement gap was getting them to pass an end-of-year exam, and we weren’t really helping them understand who they were as thinkers and learners and teaching them the critical thinking and problem-solving skills that were going to actually help them be successful for life- beyond that stage of the year.

Dr. Sharp: Absolutely.

Dr. Joni: And that’s a big [00:10:00] issue in testing more generally. I like to talk with these bumper sticker ideas, and one of them is having a test worth teaching to. In my whole career in testing, we’ve talked about that. I don’t think that the state tests have ever become that, for a variety of reasons, but that’s one thing to think about: if our states could move to better assessments that were measuring these critical thinking skills and complex problem solving more accurately- there are a lot of problems with that, but to the extent that we keep going back to these basic proficiency measures, we’re not getting to that point where there would be a test worth prepping for because it measured important skills in authentic ways.

Dr. Sharp: That’s a good point. I know we’re already getting a little off topic, or at least off-script, but I think this is important. This is interesting. We talk a lot in my house, or like I said, I have a 4th and a 5th grader, and one of them is very keyed into test performance. [00:11:00] And so we talk a lot about, or he talks a lot about, how he did on the Star tests or the MAP- there are all of these. In your mind, in our current educational system, are there tests that are “better” than others in terms of gauging these critical thinking skills that you deem more important?

Dr. Joni: It’s always hard to say because each state is different- MAP and Star are two of the products that are permeating different states. So there’s some consistency, but a lot of states have their own systems. Like my state of Alabama keeps throwing out their test and reinventing it over and over, and each time it’s not any better than the previous one.

So, I think sometimes these national programs like MAP and some of these others that are shared across states are probably getting closer to [00:12:00] that value, just because every time you start over, people are reinventing the wheel. So, I do like to see…we had Smarter Balanced and we had PARCC with the Common Core assessments, and states have been falling away from those. But those were an attempt to make a test worth teaching to.

So, some of them are more valid, especially if they’re computer adaptive and able to be tailored to student skills, so they don’t have that problem with just measuring basic and below-basic proficiency levels. I think those are better assessments.

Dr. Sharp: Okay. That’s fair. 

Anna: I think… Oh, sorry, Jeremy, go ahead.

Dr. Sharp: No, you’re good. Go for it.

Anna: I do think it’s important on the back end of that question to understand when you’re looking at comparing assessments to understand what they’re measuring. I think that a lot of the state tests and Map and Star and all of those tests, they’re what we call achievement tests. [00:13:00] So they’re measuring predominantly what you’re learning on a day-to-day basis in the classroom. They’re measuring knowledge if you will.

And if you look at it in a simplistic way, if you look at the brain, that’s the back part of the brain- what information are you retaining? Whereas an ability test like the CogAT is measuring more of the problem-solving skills and critical thinking skills- are you able to take new information and apply it to a different scenario? So it’s less about what you’re learning on a day-to-day basis, and it’s more about how your brain thinks, and that’s more the frontal lobe piece of your brain.

So I think it’s important as we compare tests and think about the testing industry as a whole, that we really understand what these tests are measuring and what they’re telling us about students. 

Dr. Sharp: That’s a great point. And hopefully, through this conversation, we can delineate some of that a little more clearly for folks who may want it or be interested in it.

It might be a nice segue to talk more about the [00:14:00] CogAT and what exactly it is. I hear a lot about it just as a pediatric psychologist doing a lot of testing with kids and, of course, having my own kids, and it’s still a little bit of a mystery, for my own lack of research into it, and I’m guessing there are other practitioners who are probably in the same boat. So, let’s tackle that. Tell us what is the CogAT exactly. 

Anna: Perfect. Great segue. The CogAT is an assessment that measures a student’s or anyone’s cognitive ability. Like we were just talking about, if you think about the typical standardized tests that a student takes in the classroom, even the SATs, AP tests and ACTs, those are measuring achievements. It’s measuring what a student has learned. The CogAT on the other end of the spectrum is measuring how a student learns. So it actually looks at a student’s cognitive reasoning [00:15:00] ability, verbal reasoning, quantitative reasoning, nonverbal and figural reasoning, how a student thinks about different types of problems, how a student takes new information and applies that to a new scenario.

So, instead of having a student regurgitate skills or standards or show mastery of knowledge, it’s really looking at how a student thinks, and how they apply that thinking to an entirely different situation. The CogAT full battery has three different parts. There’s a quantitative reasoning section, a verbal reasoning section, and a nonverbal figural reasoning section. Students can take it from kindergarten all the way through 12th grade.

A lot of districts across the country use it as part of their gifted and talented or advanced academic identification processes, but we’re seeing more and more districts start to shift and actually use it for students in the [00:16:00] classroom to develop talent, to develop the cognitive ability and to really push students to a higher level of critical thinking. Joni, keep me honest on that and definitely jump in if there was anything else you’d add.

Dr. Joni: I think one thing your audience will wonder is, this is a group-administered ability test, and so it does have a pretty high correlation with things like the composite from the Woodcock-Johnson and some of the other IQ-focused measures. But the big focus is- and finally I’m in an audience where I can say fluid intelligence and they’ll know what I mean, right?

Dr. Sharp: Yes.

Dr. Joni: It’s that fluid intelligence, the reasoning component of human ability. We measure it using the three different domains of verbal, quantitative, and figural reasoning. And so, the test was originally developed really to help inform instruction alongside achievement data. So helping you to see, like Anna was saying, where do you learn the most easily, the quickest? Where do you have more capacity for complex thinking?

[00:17:00] And so, those three different domains are really important to assess the different areas of reasoning because they align with different aspects of education. So we do find that students with stronger verbal skills relative to quantitative have a different experience across the curriculum than students with other profiles. So that’s another important thing. We use those three battery scores to influence that kind of understanding of the student’s relative strengths and weaknesses in learning.

Dr. Sharp: Yeah. Correct me if I’m wrong. I hope you will. Does it make any sense to conceptualize it as basically a more expanded version of the fluid reasoning index on the WISC? Is it all tapping into fluid reasoning, or- I’m just trying to draw a comparison to what I’m super familiar with. So are we more looking at verbal versus visual and fluid reasoning, or is it an expansion of fluid reasoning?

[00:18:00] Dr. Joni: Yeah, it’s combined. So if you think about the verbal and performance scores from an IQ test and then a fluid reasoning score, the verbal battery of CogAT would correlate with both that verbal component and the reasoning component. So there is some differentiation there. A big difference is that we’re not trying to get into the processes. So all those additional working memory, processing speed, things like that, we’re not tapping into that. That’s something that’s better assessed in a one-on-one environment. So we’re really looking broadly at that verbal and the performance component, but also the fluid component. So all of the above is what I just said. 

Dr. Sharp: Great. That sounds good. That does help me organize it. Actually, I have to classify things and make them as familiar as possible.

You said group administration. Does this mean kids are on laptops in the classroom? Is it all [00:19:00] computer-based or is there a mechanical component or what?

Dr. Joni: Historically, it was paper-based. Slowly, over time, people are moving. The online administration is fully comparable. So we do concordance studies where we show we can adjust for any differences in how difficult it is paper-based versus online. I know a lot of testing programs have had to do that on the fly during Covid when they suddenly started administering virtually. So that’s something that we already had built in because it has been a transition. Some schools really stick it out with the paper-based version, but once they try it, they see how much more user-friendly the online administration is. So yeah, it’s computer-based. They can use different kinds of tablets. We do try to use the technology that students are most familiar with. That increases access to the test.

Dr. Sharp: Of course. That’s great. There may be more questions to ask. [00:20:00] We’ll see where we go as far as the relationship to more standard IQ tests. But I’ll hold off on those for now because I know we have some other exciting material to get into.

I will say this. As far as what the scores look like, can you just do a brief description? Are we talking about standard scores or something else? What does the scoring look like and how can we interpret the results?

Dr. Joni: The test is reported on two different scales. One is a universal scale score, which goes across different levels. Anna mentioned it goes from K to 12. So there’s a level for 5- and 6-year-olds up through 17- and 18-year-olds. That scale is the vertical scale that connects all the different levels so you can compare over time. But then within each age or grade level, we have the standard age score scale, which we call an IQ-like scale. So it’s a mean of 100 and a standard deviation of 16, not 15.

Dr. Sharp: That’s interesting.

[00:21:00] Dr. Joni: Why? Because we felt like it. So it’s just slightly off a traditional IQ scale but similarly interpreted. We try to call it IQ-like because we don’t want folks to think it’s completely interchangeable with a Woodcock-Johnson score, primarily because we just don’t measure those process skills. So if you’re thinking about some of the clinical diagnostic uses, it should not be used as an interchangeable measure. But if you’re interested in ability and how it predicts future learning, then it is very comparable, and that’s why we use that scale that’s so familiar. But we also report things like national percentiles, and a lot of schools use local percentile scores, which is a growing area of interest and use, especially related to gifted identification.

We have verbal, quantitative, and nonverbal. They get their own scales as well as composites. What’s helpful is that there are some meta composites- like the QN composite is the best predictor of science and math outcomes. So not just quant by itself, [00:22:00] but quant combined with figural is a better predictor. So, you can mix and match the three batteries. And then we have the ability profile, which is more like, what’s it called, the profile kind of score you get with the WISC. We have a system that relates those different relative performances of VQN to specific instructional recommendations. So we find value in all the different combinations.
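[To make the scale Joni describes concrete- a mean of 100 and a standard deviation of 16- here is a minimal sketch. It is an illustration only, not from the episode or the CogAT materials; it assumes an approximately normal score distribution, and the example scores are made up.]

```python
# Illustration only (hypothetical scores): convert a score on a mean-100 / SD-16
# scale to an approximate percentile rank under a normality assumption, and
# compare with how the same score would sit on a traditional SD-15 IQ scale.
from statistics import NormalDist

def score_to_percentile(score: float, mean: float = 100.0, sd: float = 16.0) -> float:
    """Return the approximate percentile rank for a score on the given scale."""
    z = (score - mean) / sd
    return NormalDist().cdf(z) * 100

if __name__ == "__main__":
    for score in (100, 116, 132):  # made-up example scores
        p16 = score_to_percentile(score)           # SD-16 scale, as described above
        p15 = score_to_percentile(score, sd=15.0)  # traditional SD-15 scale
        print(f"Score {score}: ~{p16:.0f}th percentile (SD 16) vs ~{p15:.0f}th (SD 15)")
```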

Dr. Sharp: Yeah, that’s fantastic. I’m asking some dumb questions, but hopefully some other individuals out there are benefiting from this. What’s the theoretical framework? Is it a CHC-derived measure or is it something different?

Dr. Joni: Absolutely. The first version of this test was called the Lorge-Thorndike, and that came out in 1950-something. So obviously they were probably more influenced by Spearman and some earlier theorists. But yes, it’s absolutely [00:23:00] designed around the CHC.

When you look at the Cattell-Horn-Carroll model and you look at fluid reasoning, it actually has three components: sequential reasoning, quantitative reasoning, and then inductive reasoning. We find that those map pretty well onto the CogAT batteries. So we use different terms- I don’t think anyone would be excited to take the inductive reasoning test, but if we call it figural reasoning, they’re like, okay. So it does map well onto the way that John Carroll organized fluid reasoning.

Dr. Sharp: Great. Thanks for bearing with some of these basic questions.

Dr. Joni: I’ll say that Riverside also publishes the Woodcock Johnson and Kevin McGrew who’s the current caretaker of CHC theory, he’s part of the Riverside team. I don’t know him personally, but I love to be in any way connected to him and his work.

Dr. Sharp: Yeah, he’s done great work.

Dr. Joni: I’ll name drop him here.

Dr. Sharp: I love that. Totally okay. Well, thanks for doing [00:24:00] some intro. I am really excited to get into some of the more applied components of the CogAT. You mentioned that it’s primarily used, or maybe that’s my word, for GT assessment. Is that fair? Is that an accurate statement at this point?

Anna: Yeah, I think, and Joni keep me honest on this, but I think that even though CogAT was identified or created as an assessment to be used in conjunction with achievement data to support student learning, over time, specifically as state assessments have become more dominant with some of the federal policies that have been enacted over the past decade or so, there’s been a need for additional assessments to be used in states that have gifted and talented identification processes.

And so, because it’s considered an assessment that really assesses how students understand, how students think, [00:25:00] and students’ capacity for learning, CogAT has become used very often as part of the gifted and talented identification process.

We see many districts that have migrated to using CogAT for universal screening if they’re identifying for gifted and talented, frequently in 2nd or 3rd grade, and then there are some districts that are still using it through a referral-based approach for gifted and talented, although from an equity standpoint, we highly recommend doing universal screening as opposed to just referring for gifted and talented identification. But that is where we’ve seen the CogAT administration shift.

What we are starting to see is that as districts understand the value of the information that they’re getting, there are many districts that are using it for far more than just gifted identification now. They’re seeing that it’s actually impacting how [00:26:00] students can learn in the classroom.  And so, we are starting to work with districts across the country that are trying to utilize the data and the information for more than just gifted and talented identification and trying to expand their stories so that they can impact other districts and that folks can start to see the value of how to use this in the classroom.

Dr. Joni: Yeah. I’ll say that if you look at schools that use tests, CogAT is the market leader, they would say. We get a lot of schools- if they do gifted identification with tests like this, a lot of them do use CogAT. But again, that wasn’t the original purpose.

What Anna was saying about the rise of accountability testing, that might be part of it because ability tests used to have a more prominent role. Originally, CogAT was administered alongside the Iowa test which used to be called the Iowa Test of Basic Skills by a lot of folks. I know the state of [00:27:00] Georgia used to use both of those together. And so it was used for instructional planning and since they were already familiar with it, they used it for things like gifted identification.

Over time, I think ability testing has really become just the purview of special education testing- the identifying of learning disabilities or learning needs. And so, ability testing is only for the two tails of the distribution. I think it’s unfortunate because we get so much information, and now that we have universal screening, where schools are administering something like CogAT to their entire 2nd grade classroom, to me it’s a shame to throw out 80% to 90% of what you just collected and not use that.

I had friends in grad school who had been teachers and they would say, oh yeah, we got our CogAT scores and then I put them in a drawer. And that was it. They never used them. Or they’re like, that’s what the gifted person does and no one else in the whole school ever looks at [00:28:00] CogAT scores. And to me, that’s just a huge waste of the time and the money and everything that goes into the ability testing.

So, while it’s really important to do universal screening so every student has an opportunity to demonstrate need, whether it’s for special education services or gifted education or a combination, that huge swath in the middle- there’s so much value there. And I think that’s part of why it’s hard to get general ed classroom teachers bought in- because it doesn’t help them. And so they’re like, why should I spend my time giving something like the CogAT when I already do the achievement test? What is this adding?

And so, that’s my personal mission- to make sure that teachers are more aware of what the implications of CogAT scores are and why they should ask their gifted specialists to give them the classroom reports that they could be using.

Dr. Sharp: Yeah. 

Anna: I also think there’s a [00:29:00] misconception about that. There’s a misunderstanding about what ability data is, and people who are not familiar with ability assessments and ability data hear the words ability data and frequently connect it to IQ tests. And I think that because of how IQ tests were misused and, to be really blunt, the inequitable practices that arose out of it, I think that there’s a lot of fear about using an ability assessment in a similar way.

What we’re trying to demonstrate is that:

1) An ability assessment is not a direct comparison to an IQ test.

2) We’re actually promoting equitable practices to unlock doors for students that wouldn’t necessarily be unlocked through just typical achievement testing, but we are having to try and flip the story on its head about what ability tests can be used for.

[00:30:00] Dr. Sharp: Yeah. I want to dig into some of that. Joni, you phrased the question people will ask: what’s the point of doing universal screening, basically, or what’s the point of administering this to all these kids? How would you answer that question right off the bat? What are the benefits?

Dr. Joni: One of the benefits from the gifted education perspective is that there will be students who have talents for academic areas who are not noticed if you’re basing it on what teachers see in the classroom or what they see from achievement data. So, a lot of gifted students have high ability and do well in the classroom, so their achievement scores will be high. That’s the best-case scenario.

But then there’s this other pool of students who maybe act out or they’re disengaged, they don’t like school, or they’ve become disenfranchised by the classroom. And those are [00:31:00] exactly the students whose achievement scores might not be high. Their relationship with the teacher might not be great. And so, they don’t have that opportunity to have that talent recognized and maybe be put in a situation that’d be better for their talent.

So, just from an equity perspective, there are some hidden gems out there. And so, having multiple measures- ability tests alongside achievement- can help you uncover those students. But then from a broader perspective of what’s the value case for ability scores, it’s really about planning instruction.

So if you have information about a student’s current achievement level and you have information about their abilities, you might be able to do something like small groups and flexible ability grouping where you say, okay, these are students who haven’t mastered the skill, but they are going to learn very quickly. They can be put in maybe a more independent situation or a small group that’s ready to go quickly versus another student who is not meeting an achievement goal but is also going to learn slower in that [00:32:00] domain. They may need more structure and support to achieve those goals. So we know that if you have relatively weak quantitative scores, the big risk there is that you’ll develop math anxiety, which I think is endemic in our society.

This sort of sense of, I’m not a math person. A low quant score does not mean you’re not a math person. It does mean that math will come more slowly or it may not be as natural to you. And so you might need more time studying it, more opportunities to use manipulatives, use different kinds of learning engagement styles, just different kinds of instruction. And so, the two pieces combined help you to make those decisions as a teacher. So that’s really the value proposition for testing everybody with an ability test.

I hope that in another 10 years, gifted ed is just one thing that they happen to use the CogAT for, along with planning instruction. That’s really my focus [00:33:00]- broadening that, like Anna was saying, changing the story about ability tests, and then making sure that people aren’t just using it for one purpose, especially if that one purpose is a cut score where you classify the kid and that’s it. End of identification. That’s the worst case, and that is unfortunately a reality in some districts right now.

Anna: I can give you two examples of what that looks like practically in the classroom.

Dr. Sharp: I’d love that.

Anna: Two kiddos immediately popped to mind that were in my 5th grade class. They were both African American boys. They were held back numerous years in a row because of their state achievement scores yet they were extraordinarily talented in different ways. One of them loved to read and loved science and when looking back, I had a hunch that he was much more capable than his achievement scores were demonstrating. I had no way to prove that.

Another one of my kiddos that was [00:34:00] held back because of his achievement scores drew everything out on paper. He made stories, he made pictures, he made buildings. This was clearly a non-verbal or figural learner that was really strong at spatial understanding. And if I had known that he had a strength in non-verbal reasoning, I would’ve taught him math completely differently. I would’ve pulled out the manipulatives. I would’ve had him draw out pictures on his chart or on his page. I would’ve given him extra time for his assessments so that he could create that mental model that he needed to create to solve the problem. But we didn’t have any of that information.

And so it would have completely changed my way of instruction if I had known that that first little boy was a verbal learner and the second little boy was a nonverbal, figural learner. And then I could have really differentiated instruction to the way that those students learned best in the classroom. And who knows, they may have done much [00:35:00] better on some achievement assessments when they had instruction that was actually aligned to the way that they thought.

Dr. Sharp: Right. Yeah. Those are great examples. I wonder if I could maybe have you clarify a little bit, and maybe we’re just getting wrapped up in semantic differences. We’ll see. When you talk about these different learning styles and/or learning differences, how do you reconcile that or explain that in the context of the research that says that’s not necessarily a distinction we can make? Do you see what I’m saying? Like, learning styles. That’s kind of a myth.

Dr. Joni: I was about to interrupt and say that learning styles are not a thing. It is very complex to talk about. So ability tests, especially the way that we do it, also get into those specific abilities. And I know that that is a huge [00:36:00] debate when you look at school psychology especially. Whether or not those battery scores are meaningful in context, or is it just general ability? Is it all general ability and you’re forced to…?

Dr. Sharp: Oh my gosh, yeah.

Dr. Joni: I read these things. There’s a lot of data where, if you’re looking at a really broad level, you’d just say, oh, specific abilities don’t matter. It’s just general ability. You don’t need those other scores. But when you look at- so I think about it like talent development, or I like to think about it almost in terms of career development, with the idea that career development is a process throughout childhood. I’m not just talking about high school kids, but when you think about the career trajectory or the expertise trajectory of students, general ability is a really powerful predictor for general outcomes. So, you want to know if a student’s probably going to do well in school? That general ability, that composite score from CogAT, something like that will be useful. But [00:37:00] I’m really interested in whether students are going to have a particular talent for engineering or science, or are they going to have a particular skill in the language arts?

So understanding where they might develop expertise and where they could be most successful- learning the quickest, being most comfortable with learning, pushing the boundaries of our knowledge, pursuing education, formal or informal- all of that can be better predicted from those specific abilities. So even though they aren’t as powerful a predictor as general ability in some of those cases, you can’t predict specific areas of talent- future talent development- using general ability, right? It just tells you they’ve got it or they don’t have it. Whereas ability profiles tell you more about where strengths might be. So I do disagree with some of that literature on whether profiles are ever useful. And I think they’re incredibly useful for understanding individual talent trajectories.

Now, getting back to learning styles. Traditionally, learning styles is the theory [00:38:00] that there are certain ways that you learn better and you should be taught in a way that matches that ability. So if you’re a spatial learner, I should only show you pictures. I shouldn’t show you words. The nuance is that you actually want to support them in incongruent ways, I guess that’s a way of putting it. Like Anna’s story about the student who was very visual: they can come up with that visual model and that can be very effective for them. But the difference is that we also want to support them, and this is what Anna was getting at- you want to support them, with those skills they have, in learning math in traditional formats too.

So when I’m being dismissive of learning styles, I say, you can’t dance a math equation. Well, you can, but it’s not a very effective way over time. You can’t continue to be engaged and excel in mathematics if you can’t engage with formal mathematics. So the idea is more about understanding what kinds of learning come [00:39:00] easier and using that to achieve a common outcome.

When I talk about tailoring to a student’s quantitative weakness, we’re never going to say, oh, you’re not a math person, you don’t need to know math. We’re going to say, okay, what skills do you have and how are we going to leverage that to help you succeed in mathematics?

So, it sounds similar to learning styles, but it’s the opposite- we’re not going to give you the thing that already comes easily to you. Spatially gifted kids don’t need a lot of visuals. They can use visuals, they can come up with their own visuals. We need to support them with the words and understanding text. It’s subtle, but I do think there is strong support for it. I know there’s strong support for differentiation, but I think that that use case for ability tests is valid and we have evidence for it, whereas learning styles and the idea of, I’m going to teach you how you learn, that is not supported by the literature.

Anna: Joni, would you almost talk about it as unlocking a strategy [00:40:00] for a student to apply to a new concept? So if a student is spatial- like really prefers to reason spatially- that student could then apply that strategy to new information or a new concept?

Dr. Joni: I think so. I also see that, especially with spatial, a lot of teachers don’t realize that that’s an area of strength for some students. So that’s another area of development where I’m working on a spatial reasoning test to help to differentiate that from just abstract reasoning. And I think it’s really important because these kids don’t know that that’s an asset, and teachers especially may not know that that could be a deficit for other students, so that if you can’t visualize, if you can’t understand a graphical representation, that inhibits your learning as well.

So, I think it’s reaffirming to those students to say, that’s an important skill and you bring it to the classroom, and we’re going to help others to [00:41:00] also build up that skill. And that can maybe make you feel that you’ve got a secret power that you didn’t know was important, but now you do. And so, it is a strategy for you to use in your learning.

Basically, the antidote to all the learning styles thing is multimedia. Don’t provide content just in the format that a student prefers, but provide it in all formats- multimedia, options, different ways of engaging with the content. That is valuable for all students so they can pick and choose what works for them in learning a specific thing. So similarly to what Anna’s saying, providing different strategies and different ways of engaging with content can also be valuable.

Dr. Sharp: I appreciate you diving into that and making some of those fine distinctions. Like I said, there’s a lot of semantics going on and just getting the words right, but that makes sense. I appreciate the deep dive indulging my questions here. [00:42:00] You did say something that’s interesting to me that we might detour for just a second with this spatial reasoning component, and like you said, it can be a little bit of a secret superpower. How else does that show up in day-to-day classroom work? Can you think of any examples of that where a kid with spatial reasoning could really excel that we might not think of?

Let’s take a break to hear from our featured partner.

Conduct a broad-based assessment of personality and psychopathology with the Gold Standard Personality Assessment Inventory or PAI. 22 non-overlapping scales cover a full range of clinical constructs, so you’ll get the information you need to make a diagnosis and formulate a treatment plan. Plus, for your clients who speak Spanish, the new PAI Spanish Revised Translation retains semantic equivalence while updating language to be clearer and more inclusive. Learn more at [00:43:00] parinc.com/pai.

Alright, let’s get back to the podcast.

Dr. Joni: Well, my immediate thought is one place where it can hamper students that we didn’t realize until recently was early mathematics. So if you think about a number line and being able to visualize numbers and understand why negative numbers are on that same continuum, that’s a spatial representation. And so one thing that early childhood folks are learning is that you have to build up spatial thinking skills for those early mathematics concepts.

So you might find if a student has stronger spatial skills, that they’re just naturally… they’ll figure out some things related to mathematics in those early years because the obvious thing is you think about, oh, they’re going to be really good at geometry. Sure. There’s lots of shapes and relationships there. But also even in those early math skills, it’s insidious how it influences early math learning and conceptualization.

Beyond that, [00:44:00] a lot of folks will tell me, oh, my kid’s very spatially gifted. They love Minecraft. We did a study this summer, and it is true- you probably know this with kids- literally every kid plays Minecraft or Roblox or some combination. They like games within the games. I don’t have kids, so I don’t have any idea what the games within games are, but I respect it.

Dr. Sharp: You don’t want to open that box. It’s a deep box.

Dr. Joni: I’ve tried. It’s one of those things you have to hit at a certain age, like Ewoks. If you were young enough to love Ewoks, then you love Ewoks. I wasn’t the right age to learn Minecraft, so I don’t think I’ll ever understand it.

Dr. Sharp: That’s fair.

Dr. Joni: But the way kids engage with it can be really different. My nephew once showed me how you could use the blocks to build a pit and then someone would fall into it and you could cover them up. That is not very spatially loaded. But when he builds castles or he builds mazes and interactive spaces, that’s a very spatially loaded [00:45:00] task.

So if you’re going in and you’re doing things involving building or problem solving with spatial relationships, or it might be a maze, it might even be navigating that space in a video game, that can be a sign of spatial strengths. Versus if you’re going in and you’re playing games that are more like interactions, more like reasoning and problem solving- maybe you’re playing some… I’m thinking for grownups, we play things like solitaire, we play Candy Crush. If it’s something that looks more like that, it’s not really spatially loaded, even if it might also occur within the Minecraft universe.

So that’s one thing- thinking about the ways they engage with the same content in a more spatially nuanced way. And as grownups, obviously spatial reasoning is really important to engineering and the geosciences- there’s a huge field of geospatial information sciences that is growing, and that is spatially loaded. So there’s a lot of careers that require spatial reasoning skills.

Dr. Sharp: Yeah, that’s so true. [00:46:00] It’s been interesting to see, my kids both go to the same school and they teach math the same way. My son went through it last year and now my daughter’s going through it. And to see the differences there in the way that they use spatial representation in math, one of my kids really, really gets it. When they explain it, it makes sense. One of my others doesn’t so much. I don’t know, it’s just an interesting example of how that… I’m sure that scales to other kids in the class.

Anna: In the past decade or so, we’ve completely done a 180 with how we teach math, even starting in the pre-K ages. I think the way that all of us learned multiplication, subtraction, addition, and division is really different than the way kids are learning it now in the classroom. They’re learning it in a much more, like, unpacking-what-the-numbers-actually-mean way. How you look at, what does three [00:47:00] look like when you draw it out, when you model it out? What does 3+4 really mean from a spatial perspective when you add them together?

And so, students are having to rely even from like pre-K on when they’re being introduced to numbers, they’re having to rely on some spatial skills and competencies that historically you didn’t have to necessarily lean as heavily on when you were learning math.

Dr. Sharp: It is so true. The model that they do at my kids’ school is that the kids actually don’t do the homework. They have to teach the grownup how to do the homework. And so, my kids are teaching me and I’m learning these spatial representations on the fly and having to somehow do math that way. It’s been a cool process, but I think it’s good.

Dr. Joni: That feels good. When I see people complaining about it on social media, a lot of times it’s some of those heuristic, shortcut ways of doing math and they’re like, why should I learn how to guesstimate? [00:48:00] Why should I learn how to… I’ve seen some kind of web thing that lets you do multiplication with paper representations. And part of that is that other people were having to invent that on their own, right, to kind of work out some of these problems. So, if you’re really comfortable with math, you figure out ways of guesstimating, right?

Like, I’m comfortable with math and so when I cook and I need to do like two thirds, I have heuristics of how I do that. And basically what they’re doing is trying to take those heuristics and teach them to people which is an odd thing to do, but instead of waiting for kids to maybe come up with these ways of thinking, they’re trying to expose them to a lot of different ways of being comfortable with math and with numbers and concepts like proportion.

So, whether or not every kid should have to learn how to do the math in each of these different ways, I’m not sure that it really makes a lot of sense, but presenting all these different ways is really valuable because then it gives kids other ways of thinking with [00:49:00] numbers. And that’s the kind of reasoning, being comfortable with quantity and with relative amounts and being flexible with quantitative concepts.

That’s exactly what makes reasoning something unique from achievement- knowing your math facts, knowing how to solve problems in traditional, classic ways. And so, it is interesting to see them complicate math to make it more engaging to some students, but it definitely is pretty messy looking if you don’t know what they’re doing and why they’re doing it.

Dr. Sharp: True.

Dr. Joni: I’ve explained it to my mother many times when she works with her grandkids and she still does not understand. 

Dr. Sharp: I can understand.

Let’s talk about some more applications. I really like this idea. We definitely detoured, which was fun, and this idea that screening students is a great way to level the playing field and do away with some of those issues [00:50:00] with inequity and kids who might otherwise be passed over or even thrown into special ed. The research says that kids of color and marginalized kiddos are thrown into special ed or whatever, diagnosed with behavioral issues, at a much higher rate than non-marginalized kids. And so I would love to hear any more of those applications or ways that y’all are finding this can be super helpful for those kinds of kiddos.

Anna: Yeah, absolutely. We’re seeing some districts across the country do some really innovative work using their CogAT data. One in particular, a pretty diverse district- what they’re doing is testing grades 1, 3, 5, 7, and 9, testing every child in those grades with CogAT, and then comparing their CogAT data to their achievement data and using that [00:51:00] comparison to identify if there are any extreme outliers. Who are the students that are relatively high ability and are not performing on an achievement test?

And that’s an indication to them that something else might be going on with that student. Perhaps there is a learning difference with that student. Perhaps there’s a social-emotional issue with that student. Or perhaps- and we see this often- that student is an English language learner who would frequently be diagnosed with a learning disability but is actually a strong verbal reasoner who just doesn’t know English. That’s a common use case for it.

So even using the ability data alongside achievement to identify gaps between potential and performance for students- that’s been a really powerful use case of the CogAT. The other one that we’re seeing that is also very powerful is using CogAT and the ability data to support talent development with students.

We work with one district that has created this program where [00:52:00] they actually take the ability data and give it to students and teach students like, you are strong in quantitative reasoning. And then they have enrichment time throughout the day where that student then actually goes and develops that talent through activities and resources that are research designed to develop that quantitative reasoning even more. And then they have days where that student is then going and developing their areas that are not as strong, like verbal reasoning or figural reasoning.

What’s really cool about watching this play out in action- we actually got to Zoom in and see the kids engaging in these activities the other day- is that they’re able to tell you, like, I’m really strong at verbal reasoning. This is my strength. I am not as strong at quantitative reasoning. This is an area I’m working to develop. And these kids that are 8, 9, 10 years old are able to articulate that in a way that…

I had to hire an executive coach in my 30s to [00:53:00] understand what my reasoning strengths were, but to see the impact of the talent development that these schools are doing, and to see the impact from a social emotional perspective that it’s actually having on kids as they’re able to understand their own strengths and their own relative weaknesses, it’s just really powerful how districts are utilizing this information.

Dr. Sharp: You said two things that are super interesting to me. Going back to that first point you made, I don’t know if it’s too much of a reach, but it almost seems like you could use this CogAT versus achievement discrepancy as a proxy for maybe mental health concerns or social emotional concerns or environmental issues.

To me, that would be really important. Like, why are you doing so well on the ability test and not so well on the academic? Immediately, I’m like, what is getting in the way here? Is this [00:54:00] not being able to do your homework at home? Is it being distracted in the classroom due to some ADHD or something? I don’t know if that’s the direction people are headed, but that’s one thing that jumped up on my radar. That’s a really valuable gap to notice and opens up a lot of possibilities for how to intervene with a student, maybe screen other things.

Anna: Totally. And it doesn’t necessarily give you the answer of why, right? 

Dr. Sharp: Sure.

Dr. Joni: Totally. It definitely doesn’t.

Anna: It doesn’t give you that answer. What it does show you is maybe a student doesn’t have the tutoring resources that another student has. Maybe a student doesn’t have the books at home that another student has. Maybe a student doesn’t have a parent to help him or her with homework that another student has. Maybe a student does have something like they’re getting bullied at school and that’s why they’re not performing. So it doesn’t tell you why but it gives you the opportunity to ensure that [00:55:00] you’re not letting kids slip through the cracks because they’re not achieving on those achievement benchmarks.
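[The comparison Anna describes- lining up ability percentiles against achievement percentiles and flagging large gaps as prompts for follow-up questions rather than as answers- can be pictured with a minimal sketch like the one below. The student records, field names, and the 30-point threshold are hypothetical, made up purely for illustration.]

```python
# Illustration only (hypothetical data and threshold): flag students whose ability
# percentile far outpaces their achievement percentile, as a prompt to ask why --
# tutoring access, engagement, language, bullying, etc. -- not as a diagnosis.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    ability_pct: float      # e.g., an ability-test composite national percentile
    achievement_pct: float  # e.g., a state or interim achievement percentile

def flag_discrepancies(students: list[Student], gap: float = 30.0) -> list[Student]:
    """Return students whose ability exceeds achievement by at least `gap` points."""
    return [s for s in students if s.ability_pct - s.achievement_pct >= gap]

if __name__ == "__main__":
    roster = [
        Student("Student A", ability_pct=92, achievement_pct=45),
        Student("Student B", ability_pct=60, achievement_pct=58),
    ]
    for s in flag_discrepancies(roster):
        print(f"{s.name}: ability {s.ability_pct:.0f}th vs achievement "
              f"{s.achievement_pct:.0f}th percentile -- worth a closer look")
```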

Dr. Sharp: Yeah. It gives you an opportunity to ask more questions, basically. Be curious. That makes a lot of sense to me.

The other thing that you were saying, I love that idea of maybe combating some of that stereotype threat almost by sharing some of these results with kids. I could see that being super helpful with like girls in STEM, for example, or minority kids across the board. If you share that information and say like, Hey, you’re really good at this, and then that starts to build and combat some of those internal messages that might get in the way that they may have learned elsewhere. I don’t know. Again, maybe that’s a reach, but… 

Dr. Joni: No, it’s always a danger, because the growth mindset interventions may or may not be as effective as originally touted, but we know that having a fixed mindset- an entity idea [00:56:00] that I’m smart and that means I shouldn’t have to work hard, I should be good at everything, everything should come easily, and if it doesn’t, then I’m secretly not smart, it was all a mistake- regardless of what that broader literature shows, we know that for students with academic talents, it’s a real risk. Knowing that they have strengths can make them think, oh, I shouldn’t have to make an effort.

So having that growth mindset, that incremental idea that whether I’m good at this or not good at this, I have to work towards accomplishments and expertise, is a really important skillset. If you’re going to talk to students about their ability scores, frame it in the sense that even the smartest, most brilliant person you’ve seen on TV studies hard and works hard for what they’re able to talk about. So yeah, fighting that idea that smart kids don’t work hard is really important.

Dr. Sharp: Yeah. I appreciate that. Throwing that in [00:57:00] there to make sure that we don’t forget.

Anna: And that’s what’s really interesting. Our district that does a lot of this talent development- it was so interesting, it came directly out of her work with understanding growth mindset and seeing how kids that were very able would crumple with a new challenge. And then she saw that kids that weren’t as able, or weren’t necessarily as cognitively gifted, also just thought they weren’t smart.

And so, she created this process of talent development and she framed it in a way that’s so powerful. She framed it as: we all have relative strengths, we all have relative areas of growth, and we’re going to spend some days developing our relative strengths and we’re going to spend some days developing our relative areas of growth.

You could hear kids talking about it in a way that was so impactful. They would go around saying, this is [00:58:00] my relative strength and I’m practicing X, Y, and Z today. This is the area that I need to continue fostering and get better at. So it was really cool how she took these relative strengths and weaknesses and framed them through the lens of a growth mindset to really empower kids to understand what they were naturally, potentially more ready to do, but continued to prompt them to work on those skills.

Dr. Sharp: I love that. And maybe just living it, like I said with my kids and seeing them do different things academically. Some of the most meaningful work that anybody can do, I think is empowering kids. 

Well, I know that we’re getting close time wise. Time always flies by. As we start to wrap up, I wonder, are there other areas, anything that we didn’t cover that was worth diving into here for a few minutes? If not, that’s totally okay. I know there are many cans of worms we could [00:59:00] open. We may choose not to this time around, but yeah, anything big to get on the radar as far as the CogAT?

Dr. Joni: One thing Anna mentioned a while back was about perceptions of IQ and its historical misuse. Absolutely, I agree with that conceptualization. One thing I would want to say is that we really talk about how tests are a mirror to society. And so inequities in education and access to quality education are reflected in our tests. The tests don’t cause those differences. So, I would encourage folks who are talking to parents or advocates who are trying to say all tests are out there to keep kids out, that they’re trying to perpetuate inequities: tests are a mirror. They reflect those inequities.

The reason that I believe in tests and I work in tests and I study how to make them better is because I think they help [01:00:00] reflect those inequities and remind us of the work we need to do. If you think about it, we’re reconceptualizing achievement gaps as educational debts, and I think that’s really true- that some students, by virtue of where they’re born and the tax policies of their school district and their families, their race and other backgrounds, those lead them to be in educational systems that are lower quality, more stressed, less resourced, all these things. And so the tests are going to reflect that. But if you get rid of the test, you don’t get rid of the inequity. You just get rid of the ability to detect it.

I really believe in the value of assessment data cautiously interpreted. I’m not trying to say it like tests are perfect and completely unbiased, but we do a lot of work to ensure that the content is equally accessible and meaningful to all students. And we do a lot of work to ensure [01:01:00] that the scores are comparable over time, across students, different…

People will ask, if I take it online, is it easier or harder? We’ve adjusted for that. We try to take those things into account. And we know that the large test gaps that you might see, I think they’re true. They’re not a figment of the test, they’re a reality of our system. And so, I always want to blame the system and make the system better rather than focus on the students being less than. It’s not the students and it’s not the test, it’s really the system. That’s something I would want to just put out there on behalf of the tests.

Dr. Sharp: Right. Standing up for the tests. 

Dr. Joni: Well, yeah. I’m one of the few people who is like, no.

Anna: I think tests are really a force for good. I honestly don’t think they’re a force for evil. They’re a force for good. But they do get a bad rap for some very good reasons.

Dr. Sharp: Right. Well, it seems clear that you and many [01:02:00] others are working hard to reverse some of the problems that historically were present. 

Dr. Joni: Yes. Test data could be the tool to address those. If you know where the gaps are, if you know where students’ strengths are, can you use that to address these inequities? And, of course, we need to fund our schools better and pay our teachers better. I’ll put it out there. I’m getting a political platform here.

Dr. Sharp: Sure. Yeah, you can have the platform for those statements. 

Dr. Joni: More money to the schools, more money to the teachers.  

Dr. Sharp: That’s great. Well, I’m totally in agreement with you on both of those things. Maybe that’s a nice note to end on.

If folks want to reach out to either of you,

1) Is that okay?

2) If so, what’s the best way to find you?

Dr. Joni: I’ve given up on Twitter, but I still have a professional website and it’s just my name, jonilakin.com. So you can reach out [01:03:00] to me by that. Anna, how about you?  

Anna: Feel free to check out our website, riversideinsights.com. And then I’m happy to take any personal emails: anna.houseman@riversideinsights.com.

Dr. Sharp: Great. I really appreciate y’all’s time. This is fun to dig into a measure that I’ve been adjacent to for many years and probably should have had this conversation a long time ago, but I hope that it’s been beneficial for folks. I know it was really interesting and compelling for me. So, thanks for being here, both of you. 

Dr. Joni: Oh, great. Thanks for having us. 

Dr. Sharp: All right, y’all, thank you so much for tuning into this episode. Always grateful to have you here. I hope that you take away some information that you can implement in your practice and in your life. Any resources that we mentioned during the episode will be listed in the show notes, so make sure to check those out.

If you like what you hear on the podcast, I would be so [01:04:00] grateful if you left a review on iTunes or Spotify or wherever you listen to your podcast.

If you’re a practice owner or aspiring practice owner, I’d invite you to check out The Testing Psychologist mastermind groups. I have mastermind groups at every stage of practice development: beginner, intermediate, and advanced. We have homework, we have accountability, we have support, and we have resources. These groups are amazing. We do a lot of work and a lot of connecting. If that sounds interesting to you, you can check out the details at thetestingpsychologist.com/consulting. You can sign up for a pre-group phone call and we will chat and figure out if a group could be a good fit for you. Thanks so much.

[01:05:00] The information contained in this podcast and on The Testing Psychologist website is intended for informational and educational purposes only. Nothing in this podcast or on the website is intended to be a substitute for professional psychological, psychiatric, or medical advice, diagnosis, or treatment. Please note that no doctor-patient relationship is formed here. And similarly, no supervisory or consultative relationship is formed between the host or guests of this podcast and listeners of this podcast. If you need the qualified advice of any mental health practitioner or medical provider, please seek one in your area. Similarly, if you need supervision on clinical matters, please find a supervisor with expertise that fits your needs.

