That's right... I said it... Microsoft is NOT trying to trick you! I have been working the our content development managers reviewing the comments that you provide after taking one of our certification exams (yes, we do read them!), and one comment that we see on nearly every exam is that an item(s) is "tricky." So, I want to set the record straight, and I say this knowing that I am likely to create a firestorm of comments from you... we are not trying to trick you. As a psychometrician, I'm very concerned about content on our exams that is tricky. Tricky is bad... tricky items ask about nuances and corner cases that only fanatical users of Microsoft's technologies (or the item writer) would know the answer to off the top of their heads during an exam. They require memorization of obscure facts or trivial pieces of information. They are not written clearly. They do not differentiate the best from the rest because anyone, regardless of their ability, has about the same chance of answering the question correctly and usually that chance is not very good.
However, there is a difference between "tricky" and "difficult," and candidates often confuse the two, calling something "tricky" when it's, in fact, "difficult." Difficult is great. I love "difficult" because these are the items that differentiate the best from the rest. We don't expect all, or even many, candidates to know the answers to those questions, but we include them on our exams because we want earning a certification to mean something...to set you apart from those who don't pass the exam or don't hold the certification. Because they are intended to differentiate candidates, by definition, exams must include difficult content that can be used to set you apart from others. These are questions that are unlikely to be something that the average user of our technologies would encounter in their daily jobs; however, these questions should ask about situations that candidates who really know the ins and outs of the technology may have encountered or, if they did not have that particular experience, they would know enough about the technology to be able to extrapolate the correct answer.
So, every exam should have some questions that are difficult; however, exams should not have questions that are tricky. When you don't know the answer or think that it's something that only a few people would experience, you're likely to think the question is "tricky," and it probably is...to you, but it's not to everyone, and it should not be for the expert users of our technologies. In fact, our development process ensures that questions are reviewed many times by many different subject matter experts, meaning that we shouldn't have many truly "tricky" questions on an exam. Did you notice that I said "many" instead of "none?" Well, it turns out that there's no such thing as a perfect item, and stuff slips through the cracks. We may have a truly tricky question on an exam, which is why you need to let us know if you think something is tricky, and the best way to do this is to provide comments after the exam. But you need to be specific about what makes it tricky (e.g., no one does it or does it this way; no one would memorize this or know it off the top of their head). When we see comments like this, we have the question reviewed by subject matter experts (SMEs) who may--or may not--agree with your assessment of the item's trickiness. If you were to retake the exam and saw the same question, odds are that our SMEs didn't agree that it was tricky; rather, they felt it was difficult--something not encountered or done by most candidates but something that differentiates the best from the rest. And those questions are great because when you know the answers, they help set you apart from others--exactly what certification is intended to do.
This is where you tell me what you think... I knew going into this blog that this might be a spirited conversation.
I don't think that I have ever seen "trick" questions or "tricky" questions, from a technical standpoint. What I do see a lot of are poorly written questions (grammatically speaking), or questions that use unclear or imprecise language that requires the candidate to make assumptions about what the goal of the question is or what a particular answer is intended to mean. I flag them and comment on them every time that I see one, but I see far more of these than I would expect to see, even on live exams (as opposed to beta exams).
I recently took and passed a VMware exam, and I was struck by how different the questions were from typical Microsoft exam questions. While Microsoft questions tend to be long and rambling, oftentimes multiple paragraphs long, most of the questions on the VMware exam were a sentence or two at most. The questions were not any easier or any less technical, but the VMware exams definitely embrace an "economy of words" that nicely took the guesswork and assumptions out of the picture.
I have not seen many tricky questions, but I have seen some outside the (stated) scope of the test. Those always throw me off.
Recently took a Cisco exam; same as Kevin mentioned for VMware. Short, to-the-point questions, written in proper English. Some are very difficult and require a lot of thinking, unlike a Microsoft exam, in which you either know the answer or you don't.
The MS questions are vague, badly written, out of scope, and look designed for brain dumping. They are not reviewed by anyone, let alone an expert. For instance, Server 2012 SMEs don't call it 'Server 8', but your exams do, because nobody ever reviewed the exam. Feedback during exams and item challenges are completely ignored. Retaking exams until you memorise the questions and pass is a brain dump mentality that MS should not encourage with second shot.
Not one Microsoft exam question has ever asked me about a situation I've encountered in real life. Real life experience gets in the way for the MS exams.
I've supported Microsoft exams for many years but...
Difficult is when a topic or subject needs a lot more attention, a lot more explaining and is a lot more challenging to understand.
Putting questions into exams which you cannot find a single reference to when searching the course material designed for that exam makes the exams tricky.
Compared with other vendors, Microsoft exams are getting increasingly tricky.
My only argument is that whilst you are free to make the exams as difficult as you possibly can (in fact, I encourage that), please also include all the content in the courses you say are designed for those exams.
Liberty, this is really a "tricky" issue. You are right in your categorization about what is tricky or not from a psychometrician's perspective. But this does not mean that the exams are not tricky. From your post:
"Tricky is bad... tricky items ask about nuances and corner cases that only fanatical users of Microsoft's technologies (or the item writer) would know the answer to off the top of their heads during an exam. They require memorization of obscure facts or trivial pieces of information."
Almost everybody knows that there are questions on the exam that ask about a very specific bit of information that you would not see in the training materials, in your day-to-day operations, maybe never in your job and, even worse, will never need unless you tackle that specific issue.
The 70-246 exam--Monitoring and Operating a Private Cloud with System Center 2012--which I took in January 2013, was an excellent exam. I have sent you an email about this exam and told you that it was the best exam I have taken because it tested the candidate's understanding of the private cloud. It covered everything from basic to advanced concepts, including the information that a candidate needs to know to operate and monitor a private cloud. I did not get a single question testing something hidden, or a piece of information needed only to solve a specific issue that a minority of users may or may not encounter.
However, I wish I could say the same about the Windows Server 2012 exams. You may argue that Windows Server is a much broader topic than the private cloud, which is fine. But I can argue that some of the questions I struggled with had nothing to do with understanding or operating the operating system. I would go so far as to say that the questions were asked specifically to make the candidate fail. They were not difficult; they were tricky at best. If I could see the questions I received on the exam, I could point you to the ones I am talking about. I am that sure about what I am saying.
Plus, from a support engineer's perspective, I want a junior administrator to know what is going on in the operating system (yes, some notion of storage, networking, etc., but I am restricting it to the topic). If, say, he does not know how the network services work, if he cannot ask the right questions to diagnose an issue, what is the point in asking bit-level details about, say, Teredo or 6to4? Again, I am not saying that he does not need to know these transition mechanisms; on the contrary, he needs to, but _after_ he properly understands the underlying network infrastructure.
You also spoke about the best and the rest. By definition, "best" is singular: some /thing/ can be best, not some /things/. Microsoft certification has many earners, and by definition they are not all the "best." Even declining to certify candidates who scored below 1000 would not show that the rest are the "best." The earners are the candidates who proved that they have a level of knowledge above the specified threshold. We should be talking about the correct way of measuring the level _and_ the quality of that knowledge, not some qualitative and subjective quantification of "best."
I fully support Microsoft certifications, but I cannot say that they test people on real-life knowledge. If that really is the case, do not make the exams difficult; make sure that the candidates really *understand* the infrastructure, the "how it works" of the system. If you require a candidate to know a PowerShell command with all its arguments and options, you are pointing him to brain dumps.
<quote>I have been working the our content development managers </quote>
You're trying to trick me, right?
I don't have a problem with difficult questions, and I expect them. I was once unlucky enough to get a sequence of extremely difficult questions in an exam; it took me days afterward to study and review their content (and when I used my second shot, none of them appeared and the exam went smoothly).
But... whenever I see a tricky question on an exam, I do write about it, and I even explain in my comments why it should not be there. In the past, questions where the answers were just silly position combinations really pissed me off. For example, imagine that one method could have two parameters; the options would be:
a) Parameter A + Parameter B
b) Parameter B + Parameter A
c) Parameter A only
d) Parameter B only
Not even in high school would I accept that silliness, and I promptly mark the question (even when I know the correct answer) to be reviewed and then describe it as a typical lazily written question.
Other than that, I disagree with other comments saying that the exams are "targeted" on purpose at brain dumps, or that questions should be around two sentences (two sentences can't describe a SQL/development question at all...). What I disagree with most is the famous "real life" complaint, because there is no such thing as "not real life": every piece of code, every command, every script is real life by definition! It does not matter that one person in particular would never use it during his lifetime; what matters is that they are not imaginary code like "once upon a time in a far, far place...". No! If they exist, they are potential candidates to be on exams. It would be rather different if a Win8 exam had WinXP or Win9 (!) questions. That would be "not real life." Other than that: get used to it; such is life, full of unexpected things.
I do respect you and the hard work you are doing for so many years, I really do. And that's why I am writing here.
I have to agree with all the points mentioned above. Microsoft exams did not use tricky questions in the past (in the WS2003 era). Unfortunately, with the development of the WS2012 exams, you have started to use tricky questions, and increasingly so. I do write comments on every exam where I see a "bad" question, but I also notice that these comments are not taken into consideration (OK, you read them, but maybe because of budgetary restrictions you cannot update the exams soon enough). However, the problem remains.
One other point: there are too many questions that are out of scope. There are questions about System Center components in the 70-410, 70-411, and 70-412 exams; however, those questions should be asked in 70-414 and 70-415. For example, 70-417, which is an upgrade exam for "basic WS components," contains questions about "advanced WS components" like Orchestrator, Windows Intune, Service Manager, and the related PowerShell commands. All of these are out-of-scope questions, reserved for the more advanced exams. These questions are not tricky according to your definition, but they are still "bad" questions. Of course, all of these questions should be asked, but in the context of the proper exam.
Also, Microsoft exams most certainly need to be simplified. Where 2-3 lines are enough to ask a question, 2-3 paragraphs are too much and increase the overall exam time. It is simply impossible to read all the questions completely, answer them, and still have some time for review before you end the exam. I believe you could experience that firsthand if you took a WS2012 exam yourself. Since you are not an IT person, you won't understand the questions, but all you have to do is read them thoroughly, make a guess from the available options, and then check how much time you have remaining for review.
I want to state that I have taken more than 30 MS exams in the past 8 years, and I clearly see the change of direction in the exam content. In your struggle to make Microsoft Certification more valuable, you (your content development managers and/or SMEs) have lost your focus. You have forgotten that questions need to be difficult to answer, but not difficult to read.
In closing, I repeat that exam questions must be clear and to the point, not providing too many unrelated clues as is currently the case (which is tricky by definition). The candidate is not a detective and therefore needs no clues. Either he/she knows the answer or not. I believe that's all we need to measure.
I, too, agree with many of the points made above, especially where some exams contain questions that are not directly associated with the expectations of that particular exam and where some of the questions are not clearly written. As for the latter, I would call these "tricky," but I might concede that they might not be intentionally "tricky."
Still, and this is a matter of semantics, I find some of the questions where there are two "correct" answers to be tricky, mainly because of the verbiage used in the question itself. The types of questions I'm referring to are the ones that usually include a phrase something like this: "...doing this task with minimal effort from the programmer." The problem with this type of question is: how do you really know what is "minimal" for each and every programmer? Sure, a number of SMEs are polled to see what they think, but I submit that unless you have 100% agreement on each answer, and they all agree that this is the "best" way to handle the task at hand, it will be considered "tricky" because there are still two "correct" answers. One just has to decide which one applies to the nuance of the question.
Overall, I have found the exams to improve in both content coverage and difficulty (if the exams are too easy, it would diminish the value of the certification). And yes, as you, Liberty, pointed out, we are all human, so everything is not perfect and a lot of this is still a matter of opinion and semantic understanding.
I too agree with many of the above sentiments, so I don't feel the need to repeat them here. It may be better stated that "Microsoft is trying not to trick you". Unfortunately, Microsoft hasn't been scoring a 700 or better in that realm lately...
"They require memorization of obscure facts or trivial pieces of information."
Well, I failed the 70-689 exam today. Although the actual hands-on testing is great, I think the exams are getting harder. I've done a lot to prepare, including purchasing a Surface Pro, buying a genuine Windows To Go USB drive, building virtual labs, and registering for and evaluating Office 365 and Windows Intune. I've trained every day for 2 months and thought I was ready.
The problem is that the exam is not as balanced as advertised. There were a number of technologies that I trained heavily for but that had no questions at all. Some other technologies were heavily weighted, and others were not clearly identified in the exam objectives.
The issue I have is that the publicised exam objective list is (and always has been) far too vague. I would like to see a more detailed drill-down of the exam requirements. That way I could happily follow my training methodology of read it, watch it, do it. Also, I'd like to see MS Learning look at each question and ask: does the exam objective clearly say to study this item (and at this level of detail)?
Finally, I was unable to answer a question that I could not possibly have known unless I had used the actual infrastructure or "memorized obscure facts or trivial pieces of information."
Hi, I took the 70-483 exam today. I couldn't manage my time well and ended up not being able to answer the last 8 of the 47 questions, and I failed. I agree with Andrew. I had a lot of questions from one or two particular subjects and at least 5 of those time-consuming drag-and-drop code-writing questions. There weren't questions on very important subjects such as multitasking or casting types... I find the exam isn't well balanced, and there are many questions about the smallest details that even a senior developer would look up in his books or on MSDN.
Also, I was looking at the exam retake policies and noticed two things. I could register with the "second shot" option and get the second attempt for free. Also, as English isn't my mother tongue, I could get 30 minutes extra on the exam. I wish this information were located somewhere handy or provided while registering for the exam. I recommend that my fellow test takers look at the special offers and read the exam policies.
As an MCT who has the opportunity to receive feedback from delegates, I am becoming increasingly frustrated with the level of exam failures being experienced. The failures aren't because the delegates don't use the products or don't study hard for their exams; it's because, increasingly, as identified in other posts, the subject matter in the exams is a) not covered by the course material (in many cases it isn't even mentioned anywhere) and b) as someone has also stated, next to impossible to find out about anywhere on the Internet.
When sitting my own exams, I have experienced the same problem. The problem increases when the answers provided don't actually match the scenario in the question. You can discount the obviously wrong answers, but then you are left with an answer that is clearly wrong too--or relates to a method of working that you wouldn't ever choose in the real world and that goes against Microsoft's own guidelines and best practices.
I would concur: make the exams difficult but, at least, give those wishing to take the exams the ability to identify the subject matter that will be covered. With each generation of software, there is more to take on board, as you have to understand the basics plus the additional features of each new product. The courseware is now way off the mark when it comes to covering exam subjects. For example, read the 6292A course book and then try to sit the 70-680 exam.
I just want you to know that I'm reading your feedback and comments, and I will respond. This is not radio silence. I want to get as many of your thoughts as possible before I start responding. But, never fear... I will! So, keep them coming!
@Liberty: I'm trying to schedule an exam with Prometric using the generated voucher number, but when I enter the VN and click "Validate," the system shows the message "Promotion Not Found." I'm trying to schedule the 070-840 exams by paying for the first exam using the Microsoft voucher code so that I can get the next 2 exams for free, which is valid for the certification exams: MCSD Windows Store Apps - HTML5 [Exams 70-480, 70-481, 70-482].