Score Reports: What would YOU like us to include?

Liberty Munson (Microsoft)

I've been toying around with the idea of making some changes to the score reports because the results from our Exam Satisfaction Survey (yes, I look at this data very carefully and use your feedback to make decisions about how to improve the quality of content and exam experience, so you really should be completing it when you get the email from comScore asking you to do so!) consistently show low levels of satisfaction with the usefulness of the score reports. Honestly, this is a common result for most certification programs. After all, what is likely to be the most helpful--seeing the specific items that you missed--is something that few, if any, certification programs will ever provide. But, I do have some ideas on how I might be able to make some improvements...

Before I spend too much time and resources on my ideas, I'd like to hear from you. Your input will help me shape my ideas and transform them into something better. 

So, here's your opportunity to tell us what you'd like to see included in our score reports. It can be anything from learning resources, to specific ideas around what we should be telling you about your exam performance, to understanding what your score means, to something that I can't even begin to imagine (I hope you have ideas like that!). What would increase the usefulness of the score reports? What would you like to see on them? Have you ever received a score report that you loved or thought was particularly helpful? (If you want to send me a copy, I'd love to see it: mslcd@microsoft.com.) What did it contain? Why did you love it? Why was it useful?

Caveat #1: No promises. I'm thinking big changes in the type of information that is provided, but it is very likely that some of what I'd (or you'd) like may not be possible to implement systematically. Whatever we do has to be scalable and something that can be generated automatically based on your exam performance. Nothing is certain.

Caveat #2: Any changes we make to the score reports (again, I'm thinking big) will take some time to roll out if we can even implement them. Don't expect changes overnight.

Caveat #3: I will never be able to tell you the specific questions that you missed and why (exam security and the integrity of the program make this impossible). Think big, but keep this in mind.

Comments
  • rellufgerg

    Current score reports provide information on objective domain performance. What about breaking them down into bullet-point items within each objective? For example, on 70-680, instead of having:

    Installing, Upgrading, and Migrating to Windows 7

    Deploying Windows 7

    Configuring Hardware and Applications

    Configuring Network Connectivity

    Configuring Access to Resources

    Configuring Mobile Computing

    Monitoring and Maintaining Systems That Run Windows 7

    Configuring Backup and Recovery options

    the exam report would instead list performance on each of the sub-objectives (32 of them, in the case of this exam). That gets around the "why was I wrong on this specific question" problem, but certainly provides a tighter focus on where candidates need to put more work into their revision should they choose to retake the exam.

  • rellufgerg

    How about just telling us which section of the score report a particular question belongs to while we are taking the exam? Sometimes I don't have any idea what a particular question is getting at. Knowing that I need to go study security settings instead of network settings would be pretty helpful without giving away anything too useful.

  • Erno

    Hi, Liberty!

    Thinking big: the incorrect answers on an exam could/should be classified so the report could tell you to read the questions more carefully, study the details better, prioritize better, and so on.

    Feedback at the Objective Domain level can be useful when you flunk an OD big time, but when you missed it by just a single question it doesn't help much.

    Drilling down below the OD level risks pushing students to study areas that are too narrow.

  • rellufgerg

    Give an option (with checkboxes/radio buttons) of, let's say, three profiles:

    1. Pass vs. Fail

    2. Required vs. Collected points

    3. Report on separate Objectives.

  • Lars Helbig

    Since the score report can never tell you exactly what you did wrong, I fear that those who complained about it will still be dissatisfied with any changes you make.

    The best suggestion I could make would be to change the bar diagrams that range from "needs development" to "strong" into something more quantifiable: a percentage, a numeric score, a pass/fail on the section, or something similar. An indication of how the individual sections were weighted would also be nice, but I fear this level of usefulness would come close to being too useful again.

  • suzi.sapphire

    Hi Liberty,

    The more info the better, I think. I was thinking along similar lines to Brad. It would be useful to know which section questions belong to. Also, since questions are weighted, what about showing the percentage of right answers as well as the percentage score (a small sketch of the difference follows below)? That way we would have a clue as to whether we're failing on the hard stuff or the simple stuff (though if I'm failing on the simple stuff, I shouldn't really be taking the exam).

    And not useful at all, but purely as an ego boost or an incentive, how about a "You are in the top x% of exam takers" line? Yeah, yeah, we all know a pass is a pass, but I still remember my NT 4 Enterprise score of 970. It might act as an incentive to some to push that bit harder.
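    To make the weighted-vs-raw difference concrete, here is a minimal sketch in Python. The per-question weights and outcomes are invented purely for illustration; Microsoft doesn't publish how items are actually weighted:

        # Hypothetical per-question results: (weight, answered_correctly).
        # Both the weights and the outcomes are made up.
        results = [(1, True), (3, False), (2, True), (2, True), (4, False)]

        total_weight = sum(w for w, _ in results)
        earned_weight = sum(w for w, ok in results if ok)

        weighted_pct = 100 * earned_weight / total_weight            # percentage score
        raw_pct = 100 * sum(ok for _, ok in results) / len(results)  # percentage of right answers

        print(f"weighted score: {weighted_pct:.0f}%")  # 42% -- the heavily weighted items were missed
        print(f"right answers:  {raw_pct:.0f}%")       # 60% -- three of five questions correct

    The same raw performance can yield very different percentage scores depending on which items carry the weight, which is exactly why seeing both numbers would show whether you are failing on the hard stuff or the simple stuff.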

  • rellufgerg

    Having had results come through that looked like I got 90% in everything and somehow still failed (the Windows 7 Enterprise Support upgrade exam), I think the current method is a bit misleading.

    I think a percentage-type guide for each of the ODs would be best, to give the candidate a better idea of where they are weak.

  • rellufgerg

    @suzi.sapphire: Liberty has mentioned several times that you cannot compare your scores to someone else's because no two exams have exactly the same questions (OK, statistically there are bound to be duplicate exams out there, but I'm speaking practically). Does someone who scores an 800 know more about the topic than someone who scored a 780, or did they just get questions that were more favorable to their knowledge? How do you know? That being the case, bragging rights mean very little, and true comparisons like you're asking for are impossible.

  • Wayne Hoggett
    |

    I don't think they need any changes. I always find that the score report accurately represents my strengths and weaknesses. I have never thought, "Damn, I wish I knew which section that question was from," because it doesn't matter; the topics are the topics. You should know where your strengths and weaknesses are before you sit the exam, and you should be focusing on improving your weaknesses. There are practice exams from MeasureUp and Kaplan for a reason; these tell you exactly the areas you need to focus on prior to the exam.

  • rellufgerg

    Knowing how I compared to average... some of this was suggested in other comments. A better understanding of whether I'm above average based on score, broken down as far as you can break down my own score for each objective. Also, knowing for each objective whether I'm missing the easy questions that other people got right, or missing the same ones other people are missing.

  • cheong00

    To know how well (or poorly) I'm doing, it could be interesting (it may not even be considered helpful) to include an average mark per objective: the average mark of the other candidates on the same questions that I was given.

    That should satisfy the ones who want to show off to the world: "Look! I did better on my exam than most of the others." [j/k]
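    A minimal sketch of that computation in Python. The question IDs, peer averages, and objective names are all hypothetical; this assumes the scoring system tracks the average correctness of all candidates on every question:

        # Hypothetical: average correctness of all candidates per question,
        # plus the questions this candidate actually saw, grouped by objective.
        peer_avg_by_question = {"q1": 0.90, "q2": 0.45, "q3": 0.70, "q4": 0.30}
        my_questions_by_objective = {
            "Configuring Network Connectivity": ["q1", "q3"],
            "Configuring Access to Resources": ["q2", "q4"],
        }

        for objective, qids in my_questions_by_objective.items():
            peer_avg = sum(peer_avg_by_question[q] for q in qids) / len(qids)
            print(f"{objective}: peers averaged {peer_avg:.0%} on the questions you saw")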

  • Zeshan Sattar (Regional Lead – UK)

    Hi Liberty,

    Please bear with me as I get a bit off-topic:

    Our MSITAs deliver the MTA certification using the Certiport platform - I love the fact that it stores the transcript and delivers a PDF copy of it.

    Now imagine if we can do the same with MCP/MCTS/MCITP exams. So when you log into the MCP member site, not only can you access your logos, digital certificates, VBC, but you can also access your score reports.

    Once we have the score reports on the MCP member site, what would be really cool is some kind of system that examines your performance on each Objective Domain and creates a custom learning plan for you (a toy sketch of the matching idea follows at the end of this comment). This learning plan would tap into the myriad of resources available in MS Land, e.g. e-learning courses, MS-Press books*, TechNet articles, learning snacks, webcasts, TechEd recordings, the knowledge base, etc. It would also allow you to "tick off" when you have completed that part of the "recommended learning".

    *With the MS-Press books, it would be awesome if it could pin the recommendation down to a chapter, so that the candidate only has to pay for the chapters they need to read in order to "level up".

    So the transcript wouldn't just tell you what you did in the exam; it would provide a springboard for enhancing your learning regardless of the outcome, and thus inherently become more valuable.

    At the moment, when I pass an exam I rarely look at the bars, so something like this would be really useful to anyone looking towards becoming a "Microsoft Certificationist".

    I'm sure this project sounds like a dream and maybe it is. However, we do have a new Community Marketing Manager (Hello Mark Protus!) and a wonderful team behind him (Liberty, Sarah, Krista and of course Ken) and I'm sure something like this would take months of work. You may even need MCTs/MVPs to help you "tag" each resource so that it can be seen by the Learning Plan.

    If Microsoft could offer this benefit as part of the MCP programme, I can see it remaining the leader of IT certifications for decades to come.
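    For what it's worth, a toy sketch of the matching idea in Python. The resources, tags, scores, and the 70% threshold are all invented; a real system would draw on Microsoft's actual resource catalogue:

        # Hypothetical: learning resources tagged by Objective Domain (the kind
        # of tagging MCTs/MVPs might do), plus one candidate's per-OD scores.
        resources_by_od = {
            "Deploying Windows 7": ["MS Press chapter", "TechNet deployment articles"],
            "Configuring Backup and Recovery options": ["e-learning module", "TechEd recording"],
        }
        my_od_scores = {
            "Deploying Windows 7": 0.55,
            "Configuring Backup and Recovery options": 0.85,
        }

        WEAK_THRESHOLD = 0.70  # invented cut-off for "needs recommended learning"
        learning_plan = {od: resources_by_od[od]
                         for od, score in my_od_scores.items()
                         if score < WEAK_THRESHOLD}
        print(learning_plan)  # only "Deploying Windows 7" makes the plan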

  • Mike Smith

    The only thing I'd like to see added is a summary by question style, e.g. multiple choice, drag and drop, pick one vs. pick three, and order-these-steps. I often wonder how I have done on some of these, especially the "order these steps" questions, as there are often multiple possible orders.

  • Adam Vero

    I think detail by sub-domain would not necessarily be all that helpful, but what I would be interested to know is how many questions were in a particular OD.

    So, the first bar might read 60% (5 questions). This pretty much tells me I got two out of five wrong, which may be down to one I did not know and another that was a poor question, but it still avoids telling me which questions they were. If instead it said 60% (20 questions), I'd know I seriously need to work on that area.

    I know the number of questions in each OD is then 'flattened out' by the weighting, but a percentage score without a sample size is not ideal.

    From a visualisation point of view, widths of bars in proportion to their weighting would make sense. Although the weighting is shown as a number, I think it would be useful for people to see whether the area they got 20% on was 'heavier' than the area they got 80% on, making it even more obvious why their overall score came out as it did.
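    As a quick illustration of why the sample size matters, here is a sketch in Python; the question counts, scores, and weights are all invented:

        # Hypothetical per-OD data: (correct, asked, weight); weights sum to 1.
        ods = {
            "Deploying Windows 7": (3, 5, 0.2),                      # 60%, but only 5 questions
            "Configuring Hardware and Applications": (18, 20, 0.8),  # 90% of 20 questions
        }

        overall = 0.0
        for name, (correct, asked, weight) in ods.items():
            pct = 100 * correct / asked
            overall += weight * pct
            print(f"{name}: {pct:.0f}% ({asked} questions, weight {weight:.0%})")

        print(f"weighted overall: {overall:.0f}%")  # 0.2*60 + 0.8*90 = 84%

    Showing "60% (5 questions)" next to a narrow bar would convey both the noise in the estimate and the weight of the domain at a glance.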

  • rellufgerg

    Hello, I'm Latino and I don't understand why the 70-680 certification is offered only in English. It seems silly and illogical: why release Windows 7 in Spanish but not the certifications in Spanish or other languages? What is the idea behind this? Why put obstacles in the way of ordinary people who don't know English but do know their operating systems?
