Are tests that even native speakers of a language might not answer correctly really legitimate? This gap-fill quiz for business vocabulary is from this quiz repository. I only scored 16 out of 20, and I have a master's in economics. Does that make me incompetent and shameless? Hardly.
Part of the problem is probably English teachers without subject-specific knowledge asking irrelevant questions, or questions with multiple correct answers or none at all. Another problem may be UK-specific language. The questions I missed were:
"allowance money" vs. "pocket money" ("pocket money" is quite a general word)
"socialist economy" vs. a "mixed economy" ("mixed" pretty vacuous here)
"gold reserves" vs. "gold reserve" ("is" forces a usage that I'm not familiar with)
"multi-use ticket" vs. "season ticket" (no reference to summer, winter, ...)
Nonetheless, such quizzes still have value as a sort of meeting of minds between teacher and student, and it's probably better not to waste a lot of time writing airtight questions for practice material. What matters is that students are not formally assessed with questions like these. Sometimes this type of question actually penalizes the best students in the class. I've thrown out exam questions like this after they confused very competent students.
The bottom line: focus on content rather than on language itself, i.e. Content-Based Instruction (CBI). Setting learning objectives for content that is expressed through vocabulary is more effective and natural than making the vocabulary itself the objective.
(The London Chamber of Commerce and Industry (LCCI) International Qualifications exam curricula seem to be a good basis for content-based instruction.)