Tuesday, December 11, 2012

Your "assessment" is probably bullshit

Every year, on the first morning of DevLearn, I wake up with some crazy epiphany about learning and training that sticks in my brain for the rest of the conference. This year the revelation was "assessment is meaningless." So I decided to spend the conference (when I wasn't presenting or hosting the Emerging Tech stage) looking for examples of meaningful assessment.

My mission was by and large futile.

Let me start by clarifying: in adult education and training, knowledge might be important for dinner party conversation, and it might even help you in a job interview, but more than likely what you know isn't a fair representation of what you can do. Learning objectives are very different from performance objectives.

Why are we still writing learning objectives and assessing for knowledge acquisition when the only thing the business cares about is what employees can do?

If you're an instructional designer and you're writing learning objectives, please stop. STOP. What you learned in your graduate program or by reading Dick & Carey is meaningless to business objectives, and business leaders don't know what the hell ADDIE is. Executives don't care if employees passed all of their knowledge checks with 100%. The bottom line is not improved by multiple choice questions. WHAT YOU ARE MEASURING AS A BENCHMARK OF LEARNING IS MAKING INSTRUCTIONAL DESIGN OBSOLETE. Stop stroking your own ego. Stop trying to justify your decision to build a click-through elearning module, or to drag the sales force out of the field for two days of training, by showing that they all passed a multiple choice test. NO ONE BELIEVES YOU and NO ONE CARES (yes, I'm yelling).

Businesses care about performance. What are employees doing, and what do they need to do better, to improve how the business functions? Is what you're teaching employees helping the business make more money or save money? SHOW THAT. Use those words. Measure performance. Make what you do meaningful to your organization by setting performance objectives and measuring against them. Provide performance support. Stop making the easy decisions. Make meaningful and relevant design choices.

If you can't assess something meaningful, it might be better to not assess anything at all.

/drops mic. walks off stage./


7 comments:

  1. I agree with this sentiment (no surprise to you), though I think it's a teensy bit simplistic to assume that all businesses actually know what they're doing.

    If that were the case, then Daimler-Chrysler would have been worth at least a little more after the great merger, rather than ultimately less than Daimler had been worth before. Likewise AOL/TimeWarner, Sprint/Nextel, and Quaker Oats / Snapple, which is a favorite because they lost $1.6 million of value per day over the 27 months of that bright idea.

    I know a lot of people who see themselves as instructional designers love to design (and quite a few love to instruct) -- but many of their clients believe just as deeply in the Little Corporate Schoolhouse model: butts in seats (or distance learning on screen), contact hours, completion stats.

  2. I very much agree, Dave, that businesses often don't actually know what they're doing ;) But if we're the learning professionals, at some point it does us a disservice to pander to archaic thinking about learning...even if that's what a client is expecting/paying for. I have not been above creating useless content and knowledge checks in order to "check the box" on training, but I did so after telling them exactly what they were getting and what they were measuring...probably nothing but how many check marks they made.

    My heart and my bank account sometimes just don't see things the same way, as I'm sure many people's don't. But we should be using our expertise for good and not being lazy or simple order takers...presumably we are the experts, no?

  3. Well, I am going to disagree with the original poster. While as an ID I believe that what we develop and implement does in fact affect the bottom line of the organization, we have to realize that not all training does, nor should senior-level executives care what scores learners get in class; that is what we are there for. Just as I am not there to make those senior-level decisions, that is what they are there for. We should always be training to develop someone to perform to a specific level and need to measure accordingly. If we are training a skill, then measure the performance of the skill; if knowledge, then measure the knowledge. In either case, it should be done in the context of performing on the job; if you are not doing that, then you are not meeting the need of the organization. The simple fact is that sometimes we can map training to business results, and sometimes we just need to train Joe to run a machine, take a call, or deliver a package. Does this impact the business? Of course. But can you, or should you, map every training assessment to business outcomes? I don't think you are asked to do so, or need to, to justify the importance of a given training intervention.

    Replies
    1. I think you contradicted yourself...you said "We should always be training to develop someone to perform to a specific level and need to measure accordingly."--key word being PERFORM. Knowledge is only important in the context of performance; knowledge in isolation is rarely important...yet that's how we assess people. I also disagree that skill training isn't mapped to a business result. If Joe can't run a machine effectively, that impacts production, productivity, etc...those are measurable business metrics. Taking a call? Customer satisfaction, repeat customers, time to resolution for customer issues...all measurable business metrics. Delivering a package? Customer satisfaction, delivery time, delivery efficiency...all measurable business metrics. Yes, I think we need to think of our jobs as training professionals in the context of business metrics. If we don't, we undervalue the work that we're doing and we don't show the relevance of training to our counterparts in the business who have expertise in other areas.

  4. I'm chanting for an encore. Get back on the stage.

    I would like to hear what folks from companies like Question Mark, who are deep in the business of multiple choice assessment, have to say. Not that I'll agree with them -- just curious. I'm glad to have your points, because I have been all fired up just by the total lack of question-writing skill. I haven't met a corporate assessment that I can't just rock, because the question writers lacked talent in writing or pedagogy. Or maybe there is no such thing as a well-written or "good" assessment question?

    People, just cover your asses some other way. Don't force people to endure assessments essentially contrived to cover said asses.

    Replies
    1. I'm guessing that companies that focus on multiple choice assessments would argue that they do translate into performance indicators. I'd argue that I've never seen a multiple choice test that was truly an indicator of performance.

      I'm particularly interested in your point about the quality of the question writing...I think there probably ARE ways to frame questions and MC answers to test aptitude (e.g., the SATs) or the ability to synthesize information to determine the best course of action or solution. But instructional designers aren't trained in question writing at that level...it's a specialized skill that our industry just doesn't have.

      So I'm beating up on MC questions, when really the problem is assessment...what are you assessing? How are you assessing it? Why are you assessing it? And are you assessing it in a meaningful way?

  5. Koreen, good points, and I see what you are saying; I think we are on the same page. My point was that while we are training folks so they can perform a specific duty, task, etc. to a specific standard, many times our job stops at just that, and we need to make sure we are training to the optimal level of performance based on what we, as training professionals, have been asked to do. We need to trust that by Joe performing his job at the level we are preparing him to perform, we are impacting the bottom line of the business. However, in many cases the training or PI department does not need to go beyond that and show upper leadership how this is happening. I think most senior managers know that if Joe can do his job well, then the machine makes product, the package is delivered, and the call is taken, and hence the customer is satisfied. I think any time we can tie our efforts to business goals, we should, but I do not think we throw the baby out with the bath water and say that if we cannot show these connections, there is no point in any assessment. I know when I design training, I want to know where learners are succeeding and failing along the way in knowledge and skill acquisition so I can troubleshoot the program and make it better. Being able to assess knowledge acquisition can certainly help me understand why performance is hindered. Bottom line: as learning professionals, I think we do need to assess these things for our own understanding, and focus more of our reporting to higher-ups on performance and business impacts. There is room for both in what we do.
