IT Surveys are not the answer

I don't care who you are. I don't care how much or how little education you have. If your profession is in information technology (IT), you and I share the exact same thoughts, questions, and even dread toward one particular topic: user surveys.

What am I talking about, you ask? Ok, picture yourself arriving at work and finding "it" in your inbox at the entrance of your cubicle. Let's go even further: you're the one who placed the survey in everyone's inbox, and now the time has arrived to collect the paperwork, tally the results, and draw conclusions for the project you're working on. At this moment, reflect on your thoughts and frame of mind while you hold those surveys in your hands. We all have that same creepy voice in our heads asking, "Do these user surveys hold any real value?" The voice demands an answer.

Whether you answer that question with a "yes", "no", or "maybe" will depend on your own experiences with surveys. However, if I had to guess, I would say most of us would answer with a "maybe" while knowing full well we believe the answer is "no". Yet if you're responsible for providing support to IT users, you really wish the effort put into surveys would bear fruit and help identify the improvements needed for the products or services you provide.

As I mentioned in an earlier post, Earl Miles, in his quest to improve Drupal's administrative user interface, brings up his frustration with surveys:

This is excellent. They have surveys that they've used to figure out what new users want. This is…well, to my mind, not so excellent.

“But wait,” some say, “how can the surveys be wrong?”

Well, I only have my perspective to give here, but my perspective is that the new users don’t actually know exactly what they want. Oh, they know in broad terms what it is they want to do, but I don’t feel that the questions posed in the surveys are necessarily the right ones.

His assertion that surveys fail both because users don't know what they want and because those conducting the surveys ask the wrong questions is correct. I've drawn some of the same conclusions. More importantly, the research shows it is often true.

A year ago, to complete a Master's degree in Administration with a focus on information systems, I wrote a very dry professional report titled "Analysis and Recommendations for Information Technology User Support Provided to a Government Field Office". (From the title, we can all assume that the report is collecting dust on a shelf at my university's library.) In that report, though, I made an observation similar to the one in Mr. Miles' post. In my report I wrote:

There are likely two reasons why previous surveys and interviews concerning user support did not identify any problems with the user support program. First, those authoring and completing the surveys might not have known or understood what their current needs from the IT user support programs were. Users are unable to provide necessary feedback when they are either not informed of their role or do not understand their role in the activity for which participation is sought (Damodaran, 1996). Another reason might be that the questions asked in the surveys and interviews were not the correct ones for identifying problems and desired solutions.

As an example to illustrate these points: when my organization needed to replace the field office's aging Windows NT servers, a survey was conducted. An advisory team representing the field office IT staff was organized and charged with recommending a new network server for the field office and defining any organizational requirements for the server. Once formed, this "server team" proceeded by sending surveys to the field office IT staff.

Although the first page of the survey indicated that business needs were a concern for the new servers, the survey itself did not ask a single question about business needs. Instead, the survey focused on technical questions, such as whether the servers' operating system should be Windows 2003 or Linux. After the surveys were analyzed, three conclusions were drawn:

  • No organizational requirements for server standardization existed.
  • No recommendation for a server configuration could be provided from the survey.
  • Offices should make their own decisions with regard to the office server configuration, with little input from headquarters.

Yes, that's right: two months of work by five or six dedicated IT professionals gave us a survey that provided no more insight than when they first started. According to the survey, I could have brought in an old Radio Shack TRS-80 and the users would have been happy. Luckily, most of us (including the survey team) recognized that survey results are not a substitute for professional judgment. Something was terribly wrong with the survey!

While the survey in my example focused on software and hardware requirements, it should have focused on the business needs of the field office before the task of recommending a server configuration was assigned. Proper systems analysis, requiring an understanding of organizational goals and strategic planning, should have preceded any hardware and software specifications recommended by the advisory team (Turban, McLean, & Wetherbe, 1999, pp. 547-548). The questionnaire contained no questions about the field office's current operational, administrative, or research needs, nor did it give any consideration to the role the server might play in the organization's strategic plans or local operating plans. Lastly, non-IT managers were not asked to provide input into the team's survey. Simply put, the survey the advisory team provided to the field office treated the server as the end goal rather than as a means for achieving organizational goals.

Surveys are not going to be of value if the project itself isn't understood. I think we sometimes throw surveys at users in the hope that they'll define the project and tasks we should be working on. However, time and time again it has been shown that good surveys cannot be written until the project and the goals to be accomplished have already been defined. Later in the report, my small section on surveys concluded with:

The advisory team's oversight in not relating IT systems to business processes is not unique to organizations. A recent article in a trade publication for IT professionals estimated that anywhere from 50 to 80 percent of all projects fail (Zhen, 2005). Three of the five reasons for project failure given by the same article were:

  • the project's real value isn't understood
  • the true users are not known
  • the requirements aren't clearly understood

IT managers often fail to improve their user support programs not because of a lack of analysis, but because they neglect to remember that systems analysis requires more than surveys and interviews. Organizations need to take a step back and consider improving the methodology they use for analyzing the problems in their user support programs.


In other words, improving something requires letting go of the user surveys and focusing on the ideal system you wish to create. Then determine what the ideal system has that the current system doesn't. If you don't really understand where you want to go with a project, I have doubts the users will be able to help you. After all, their current system is where they thought they wanted to go in the first place, and they didn't get it right. The users are not looking for you to ask them questions through your silly interviews and surveys. The users are wising up and looking for you to answer their questions. My word of advice: draw your conclusions from their questions, not their answers.

References

Damodaran, L. (1996). User involvement in the systems design process - a practical guide for users. [Electronic Version]. Behaviour & Information Technology, 15, 363-377.

Turban, E., McLean, E., & Wetherbe, J. (1999). Information Technology for Management: Making Connections for Strategic Advantage (2nd ed.). New York: John Wiley & Sons, Inc.

Zhen, J. (2005). Why projects fail. Computerworld. Retrieved March 6, 2005, from http://www.computerworld.com/article/2568179/it-project-management/why-…