Bad surveys

Friday, January 6th, 2006

Since I graduated and got my bachelor’s degree, I’ve regularly been approached to participate in various surveys. Among them have been polls on the impact of my prior education on my most recent one, the quality of that latest education, and various other topics.

To date I have not seen a single poll that was straightforward in its questions. Every single one uses vague terms which I’ve encountered in the past but which bear no relation at all to my education. Questions tend to be incomplete, vague, terribly formulated (to the point that their meaning can only be guessed at), or not in the slightest related to my personal situation (even though the survey assumes they are).

The latest survey I’ve been approached for is one by SEO, which performs various kinds of market research. They’re conducting research into the labour market and the positions of recently graduated higher-educated people. Since my time isn’t worth anything anyway, I decided to participate in the poll. Fifteen percent of the way in, they completely lost me. I had no idea which of the educations I’ve followed they were asking questions about. The poll listed the questions under “Information Technology”. Yeah, nice, but I’ve followed more than one “Information Technology”-related education, so which one am I answering questions for? On top of that, they were also asking questions along the lines of (rough translation): “Have you ever been deselected for an educational level or educational institute?”. What does ‘deselected’ mean? At first that sentence read like ‘selected’, but then the answers did not make any sense. I had no idea how to answer that.

One of the main problems with surveys like these is that they’re linear. By linear, I mean the questions are asked in sequence and there’s no way of telling what the next question will be. As a result, it is impossible to put a given question in perspective relative to the rest. Why are polls structured like this? Why not just list everything on one huge page so I can see what I’m up against?

My biggest gripe about these kinds of surveys is that people actually base statistics and decisions on them. They don’t seem to realize that a large percentage of the answers are simply guesses. This explains why survey results are so often complete bullshit.

Now, I have conducted surveys myself in the past, and I know how hard it is to get one right. Even harder is drawing a meaningful and correct conclusion from the collected answers. But you’d think they would at least take the time to sit down with some people from the target group and walk through the survey with them.

Here are a couple of tips for creating better surveys:

  • Show the complete survey at once; don’t present questions one at a time, linearly, with no way to skip ahead or go back.
  • Explain the terminology you use.
  • Allow people to comment on each question and take these comments into consideration when drawing conclusions. If need be, throw away the results and start over. Better no conclusion than a wrong conclusion.
  • Keep questions simple.
  • Show questions in the right context. Announce the context before each question. Explain the context! This sounds elementary, but you wouldn’t believe some of these surveys.
  • When conducting digital surveys, allow people to stop answering questions and come back to the survey later on (as in days, not minutes) so they can complete it.
  • List an e-mail address or phone number somewhere for questions about the survey.
  • Don’t ask questions in the form of “If such and such or such and such or this and that were the question, or this and that or somesuch was the question, would you say you were inclined to do this or that or such and such or not?” and then list five options with a follow-up question about one of the options. I can just hear you thinking “Come on, nobody conducts polls with questions like that!”. Well, guess again.
  • Less is better.
  • Don’t list multiple-choice options like: A) 0-8. B) 8-15. C) 15-20, because I’m not very psychic and I can’t guess whether 8 belongs under A or B (see the sketch after this list). Again, I’m not making this stuff up. I’ve seen this and many, many other unclear options.
  • Do not assume people in the target group know what every term means. Get somebody from outside the target group to take the poll. If there are questions they don’t understand, there are going to be people in the target group who won’t understand them either.
  • Do not list the conclusion you’re going to draw from the questionnaire in the questionnaire itself! Yes, yes, I know, you think I’m bullshitting you, but once again, I’ve seen it happen. Some time ago I saw a poll which, in its introduction, said: “We’re conducting a survey amongst such and such group to determine to what degree the lack of this and that negatively influences the results of whatever”. Such a description will influence the target demographic.
  • Each question should have a ‘biased’ checkbox. Usually people know when they’re giving a biased answer, and they won’t be too lame to admit it, even if they’re doing it on purpose. (Or maybe I’m just an exception.)
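On the overlapping-ranges point above: here’s a minimal sketch (in Python, with made-up labels and boundaries, not taken from any actual survey) of how half-open intervals give every answer exactly one option:

    # Hypothetical buckets using half-open intervals [low, high):
    # every value falls into exactly one bucket, so an answer of 8
    # unambiguously belongs to option B, not A.
    BUCKETS = [
        ("A", 0, 8),    # covers 0-7
        ("B", 8, 15),   # covers 8-14
        ("C", 15, 21),  # covers 15-20
    ]

    def bucket_for(value):
        for label, low, high in BUCKETS:
            if low <= value < high:
                return label
        return None  # out of range; a real survey should offer an explicit option

    assert bucket_for(8) == "B"   # no guessing required
    assert bucket_for(15) == "C"

Print the options the same way (“A) 0-7. B) 8-14. C) 15-20.”) and nobody has to be psychic.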

Any professional surveyor will probably laugh and scoff at this little rant of mine, but guess what? All your polls suck. I’ve only ever seen one decent survey in my life (except for one-question yes-or-no ones), and that was a survey on instructor competence at my last educational institution.

Oh, and this stuff also goes for the Dutch IRS, with their pathetic “We can’t make it any more fun, but we sure can make it easier!”. Yeah, right. If this is easier, I wouldn’t want to know what it was like before.

Update (I guess I’m not done ranting yet)

Why do they conduct surveys like this anyway? I mean, with the questions and the multiple answers and stuff? Why not just put down the question and a big entry box where people can write their answer? “We’re conducting a survey on the job prospects offered by different educations. Please write the full name of your highest achieved/completed education and your opinion on the job prospects that education offers. Please restrict yourself to your previously held or current job.” Highly unscientific, naturally, just like all the other surveys. At some point you’ve got to conclude that it’s not the way a survey is performed but the validity of its conclusions that matters. Why make something as hard as drawing conclusions even harder by collecting the data in such an ass-backwards way? This way you’re at least basing your conclusions on the opinions of the target group, not those of the surveyor.
