Finally Finished Administering the Dynamic Learning Map!


I finally finished administering the Dynamic Learning Maps™ (DLM) test! Well, I found out that I was finished. I actually finished on Thursday last week. It just wasn’t clear that I was done because the interface doesn’t tell you. You just stop receiving tests in your KITE inbox. This made the end of the testing process rather anticlimactic.

The DLM project is pioneered by researchers from the University of Kansas. According to the form letter they wanted us to send home (and have signed by our building principal), the DLM is an “exciting process” with assessments that are “revolutionary and important to the future of educating students with cognitive disabilities.”

According to their website, the DLM project is a research-based “innovative way for all students with significant cognitive disabilities to demonstrate their learning.” Interestingly, exploring the website revealed only three ‘research papers’: one was basically a 9-page definition of text complexity, the second was an attempt to simplify their learning nodes,

You know...Simplify.


and the third was an evaluation of a pilot conducted by the good people at the DLM from which they concluded that everything was awesome.

It seems like the DLM researchers glossed over the teacher evaluation portion of their pilot evaluation. Most teachers (68%!) rated the assessment system a C, D, or F. Apparently, teachers only know how to rank things by grades. This poor response from teachers would indicate to me that another pilot is necessary before implementation.

The researchers used other comments, such as teachers asking for more symbol-supported text, as an opportunity to suggest that teachers simply need to be educated on what the DLM was all about rather than seriously considering that symbol supports might help students to perform on the assessments. In other words, teacher complaints were due to their own ignorance of testing strategies rather than due to the deficiency of test items.

My own experience with the tests was similar to that of the teachers who participated in the pilot. I know that different students received different testlets, so my own reflections will be short and sweet! I administered the tests to a young man with classic autism who uses an AAC device for expressive communication.

Inauthentic—My student understands the difference between pipe cleaners and band-aids. He knows that they are different, can sort them easily, and can use the correct item functionally when necessary (when was the last time you used a pipe cleaner functionally?). He was awfully confused when I asked him to hand me one or the other due to his difficulty with auditory processing. The system likely logged that he does not know those items or the concepts of same/different, when this is not a fair or accurate conclusion about his abilities.

Frustrating—My student learns best with errorless learning. He gets frustrated when he does not know the answer and can sometimes demonstrate that frustration with aggression. Thanks, DLM! Luckily, we have been working on using calming breaks and I had a huge bag of Froot Loops on hand. Unluckily, he has learned, correctly, to ask for help when he doesn’t understand something. It broke my heart that I couldn’t help him. Instead, I praised the heck out of him whenever he gave me any sort of answer–correct, incorrect, building a tower out of the band-aids and pipe cleaners, didn’t matter!

Poorly Designed—When will these companies that make tests for students with significant disabilities, especially autism, learn that you cannot list answer items from top to bottom, or left to right, and expect students to thoughtfully choose what they think is the correct answer? These students will usually choose the last answer when the choices are presented linearly! The answer choices need to be presented in a circle of sorts. To make matters worse, the DLM did not even offset the answer choices, and the questions were often embedded within a text. This made it extremely confusing—the student could not even guess at what the answer might be because he couldn’t visually discern which items on the screen were supposed to be the choices.

Just offsetting the questions from the rest of the text like this would have been an improvement to the on-screen presentation.

Good intentions with accessibility options, still falls short—The testing interface seemed to offer a variety of accessibility options for students. All students are expected to interact directly with the computer. I administered the test on an iPad and didn’t have trouble with accessibility. A coworker, however, had a student who needed enlarged text, and with that setting only a very small part of the screen could be viewed at one time. The teacher eventually gave up, unchecked the visual impairment box on the student profile, and then presented the test items on the SmartBoard to allow for a larger view of the whole screen.

To conclude, I found that the DLM fell short of its own description of being “exciting” or an “innovative way for all students with significant cognitive disabilities to demonstrate their learning.” In the DLM’s defense, however, I don’t have a better alternative for a standardized assessment for our students with the most significant disabilities. The Illinois Alternate Assessment (IAA) fell short in its own way as well. I only hope that this is another stop on the way to the design of a minimally intrusive test that will highlight a student’s abilities in a meaningful way.


10 thoughts on “Finally Finished Administering the Dynamic Learning Map!”

    • Thanks for the comment! Testing won’t change until the business side is removed… I don’t see that coming anytime soon!

      I may do a separate post on DLM scoring. I couldn’t get a straight answer on how scoring worked when I reached out to them.


  1. jeffreymhartman

    Pennsylvania uses a test called the PASA for a similar population. It is based on a set of alternate standards and curricula used in PA. Three levels of the assessment exist for students in different ranges of functioning. The students who take it have no meaningful way of participating in the assessments used with the general population, so the PASA is a default.

    The PASA has questionable worth. Administering it is complicated. Schools (usually specific teachers) have to purchase materials for the test. Most students require two staff members to administer it. After administration, the tests are scored remotely by people who watch videos of the testing session. Schools and parents get a report indicating students scored somewhere between below basic and advanced according to the alternate standards. Slightly more information is available, but only enough to add a sentence or two to the present levels section of an IEP. Little planning can be done based on the results. Finally, a debate has continued for over a decade as to whether or not PASA scores should be included with the scores from the general population of a school (since the same language is used for the scores).

    In contrast to the current nationwide testing climate, the PASA exists largely because parents clamored for it. Advocacy groups called for a way for severely disabled students to be represented fairly along with other students. They wanted inclusion. What they got was the PASA.

    The test has some authentic components and includes performances relevant to a life skills curriculum. However, ongoing programs like Brigance offer the same kind of assessment and more detailed performance data. The PASA ends up being nearly arbitrary.

    For low-functioning populations, daily data collected via performance according to a life skills curriculum will always make more sense than anything meant to mimic summative assessment beyond basic level testing.


    • Great thoughts. It is interesting to learn about how other states administer assessments. At the state level (and nationally as well), the existence of these tests is a result of litigation by parents and disability groups. They want to be treated equally and they want special educators and districts to be held accountable for the education that they provide students with disabilities. I don’t think that these test results can be used meaningfully to hold teachers or schools accountable at this point!

      The way your test is designed is much more intrusive…Maybe I should consider myself lucky that we are a DLM state!


  2. Ugh!! YES! I finished on Friday and it was fine for some of my kiddos on the spectrum but others, not so much. It gave non-readers tests where they have to select words. Then it moved one kiddo down to a test where he imitates me and did really well, so it moved him back to words. You can clap when asked so you’re ready for words?! It tested the same skill with different math manipulatives 4 times on a kiddo who kept failing. If he doesn’t know groups vs separate, he doesn’t know it! Sorry, DLM, but you don’t seem that dynamic to me…my one kiddo was so frustrated by the end of testing that he would get aggressive at the sight of the iPad we used for testing. Poor guy.


  3. MA uses portfolio assessment for S who can’t do PARCC even with accommodation. Teachers identify appropriate entry point for grade level standard in the target areas (MA puts out a resource guide to assist) and gathers evidence to show how S mastered skill over course of year. (So can’t pick something S knew before year started, must master in one year.)
    It’s a paperwork pain in the neck, but you get used to it. And I’ve told many teachers over the years, as much as we complain about it, it’s head and shoulders above any other state’s alternate assessment that I’ve heard of – and you’re reinforcing that belief.


    • I believe Illinois did a similar portfolio before the DLM and the Illinois Alternate Assessment. It sounds like yours might be slightly more work on the teacher end and more subjective, but overall more meaningful. Thanks for sharing!


  4. sevey

    Thank you so much for this review. We have this coming our way, and I saw what the site had up and it got me worried. This just confirms all of that.

