
Survey Research | Definition, Examples & Methods

Published on August 20, 2019 by Shona McCombes. Revised on June 22, 2023.

Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyze the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research .

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyze the survey results
  • Step 6: Write up the survey results
  • Other interesting articles
  • Frequently asked questions about surveys

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: investigating the experiences and characteristics of different social groups
  • Market research: finding out what customers think about products, services, and companies
  • Health research: collecting data from patients about symptoms and treatments
  • Politics: measuring public opinion about parties and policies
  • Psychology: researching personality traits, preferences, and behaviours

Surveys can be used in both cross-sectional studies , where you collect data just once, and in longitudinal studies , where you survey the same sample several times over an extended period.


Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • US college students
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18-24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalized to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

Several common research biases can arise if your survey is not generalizable, particularly sampling bias and selection bias. The presence of these biases has serious repercussions for the validity of your results.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every college student in the US. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.

There are many sampling methods that allow you to generalize to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions. Again, beware of various types of sampling bias as you design your sample, particularly self-selection bias , nonresponse bias , undercoverage bias , and survivorship bias .
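The calculation behind online sample calculators like the one mentioned above is commonly Cochran's formula with a finite population correction. A minimal Python sketch of that calculation (the 95% confidence level and 5% margin of error are illustrative defaults, not values from this article):

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Estimate the required number of survey responses.

    Uses Cochran's formula for an unlimited population, then applies
    the finite population correction when a population size is given.
    z=1.96 corresponds to a 95% confidence level; p=0.5 is the most
    conservative assumption about the response proportion.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # unlimited-population estimate
    if population is None:
        return math.ceil(n0)
    # The correction shrinks the required sample for smaller populations
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# A finite population needs fewer responses than an unlimited one
print(sample_size(None))    # 385
print(sample_size(10_000))  # 370
```

As the example shows, the required sample grows only slowly with population size, which is why national polls can work with a few hundred respondents.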

There are two main types of survey:

  • A questionnaire , where a list of questions is distributed by mail, online or in person, and respondents fill it out themselves.
  • An interview , where the researcher asks a set of questions by phone or in person and records the responses.

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by mail is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g. residents of a specific region).
  • The response rate is often low, and at risk for biases like self-selection bias .

Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyze.
  • The anonymity and accessibility of online surveys mean you have less control over who responds, which can lead to biases like self-selection bias .

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping mall or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g. the opinions of a store’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations and is at risk for sampling bias .

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyzes the results. But they are more commonly used to collect qualitative data : the interviewees’ full responses are transcribed and analyzed individually to gain a richer understanding of their opinions and feelings.

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g. yes/no or agree/disagree )
  • A scale (e.g. a Likert scale with five points ranging from strongly agree to strongly disagree )
  • A list of options with a single answer possible (e.g. age categories)
  • A list of options with multiple answers possible (e.g. leisure interests)

Closed-ended questions are best for quantitative research . They provide you with numerical data that can be statistically analyzed to find patterns, trends, and correlations .
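For example, a five-point Likert item can be coded numerically before analysis. A minimal sketch in Python (the labels and responses below are hypothetical):

```python
from collections import Counter

# Map Likert labels to numeric codes so responses can be analyzed statistically
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]
codes = [LIKERT[r] for r in responses]

mean_score = sum(codes) / len(codes)  # central tendency of this item
counts = Counter(responses)           # frequency of each answer option

print(mean_score)       # 3.6
print(counts["agree"])  # 2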

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an “other” field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic. Avoid jargon or industry-specific terminology.

Survey questions are at risk for biases like social desirability bias , the Hawthorne effect , or demand characteristics . It’s critical to use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no indication that you’d prefer a particular answer or emotion.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by mail, online, or in person.

There are many methods of analyzing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also clean the data by removing incomplete or incorrectly completed responses.
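Cleaning can be as simple as filtering out responses with missing answers. A sketch using plain Python dicts (the field names are hypothetical):

```python
raw_responses = [
    {"age": "25-34", "satisfaction": "agree"},
    {"age": "18-24", "satisfaction": None},      # incomplete: drop it
    {"age": "35-44", "satisfaction": "neutral"},
]

def is_complete(response, required=("age", "satisfaction")):
    """A response is usable only if every required field was answered."""
    return all(response.get(field) is not None for field in required)

clean = [r for r in raw_responses if is_complete(r)]
print(len(clean))  # 2
```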

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organizing them into categories or themes. You can also use more qualitative methods, such as thematic analysis , which is especially suitable for analyzing interviews.
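Coding open-ended answers is usually done by hand, but a first pass can be automated with simple keyword matching. A hypothetical sketch (the themes and keywords are invented for illustration; a human coder would refine the labels afterwards):

```python
# Keyword lists per theme: a crude first-pass coder for open responses
THEMES = {
    "price": ["expensive", "cheap", "cost", "price"],
    "service": ["staff", "support", "helpful", "rude"],
}

def code_response(text):
    """Return the set of theme labels whose keywords appear in the text."""
    lowered = text.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in lowered for word in words)}

print(code_response("The staff were helpful but it felt expensive"))
# {'price', 'service'} (set order may vary)
```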

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.
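Packages like SPSS and Stata automate common tests such as the chi-square test of independence, but the underlying computation is straightforward. A pure-Python sketch for a 2x2 contingency table (the counts are invented):

```python
# Rows and columns of a 2x2 contingency table, e.g. two groups of
# respondents crossed with a yes/no answer (invented counts).
observed = [[30, 10],
            [20, 40]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

# Chi-square statistic: sum of (observed - expected)^2 / expected
chi2 = 0.0
for i, row in enumerate(observed):
    for j, o in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (o - expected) ** 2 / expected

# Critical value for 1 degree of freedom at alpha = 0.05 is 3.841
print(chi2 > 3.841)  # True -> evidence the two variables are associated
```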

Finally, when you have collected and analyzed all the necessary data, you will write it up as part of your thesis, dissertation , or research paper .

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyze it. In the results section, you summarize the key results from your analysis.

In the discussion and conclusion , you give your explanations and interpretations of these results, answer your research question, and reflect on the implications and limitations of the research.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Student’s t-distribution
  • Normal distribution
  • Null and Alternative Hypotheses
  • Chi square tests
  • Confidence interval
  • Quartiles & Quantiles
  • Cluster sampling
  • Stratified sampling
  • Data cleansing
  • Reproducibility vs Replicability
  • Peer review
  • Prospective cohort study

Research bias

  • Implicit bias
  • Cognitive bias
  • Placebo effect
  • Hawthorne effect
  • Hindsight bias
  • Affect heuristic
  • Social desirability bias

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey , you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data , because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.
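Combining item scores into an overall scale score is typically a sum or average per respondent. A sketch (the respondents and ratings are hypothetical; any reverse-scored items would need recoding first):

```python
# Each respondent rated four Likert items from 1 to 5.
# Summing them yields an overall scale score (range 4-20 here),
# which is often treated as interval data.
ratings = {
    "resp_1": [4, 5, 4, 3],
    "resp_2": [2, 1, 2, 2],
}

scale_scores = {rid: sum(items) for rid, items in ratings.items()}
print(scale_scores)  # {'resp_1': 16, 'resp_2': 7}
```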

The type of data determines what statistical tests you should use to analyze your data.

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative )
  • The type of design you’re using (e.g., a survey , experiment , or case study )
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods (e.g., questionnaires , observations)
  • Your data collection procedures (e.g., operationalization , timing and data management)
  • Your data analysis methods (e.g., statistical tests  or thematic analysis )

Cite this Scribbr article


McCombes, S. (2023, June 22). Survey Research | Definition, Examples & Methods. Scribbr. Retrieved June 26, 2024, from https://www.scribbr.com/methodology/survey-research/


How to make a survey report: A guide to analyzing and detailing insights

  • March 8, 2024

The survey report: Meaning and importance

  • Survey findings
  • Analysis and interpretation
  • Recommendations
  • Use clear and accessible language
  • Structure a report for clarity
  • Provide context
  • Include visual aids
  • Cite sources
  • Proofread and edit
  • Introduction
  • Create a survey with SurveyPlanet

In today’s data-driven world, surveys are indispensable tools for gathering valuable intelligence and making informed decisions. Whether conducting market research, gauging customer satisfaction, or gathering employee feedback, the key to unlocking a survey’s true potential lies in the subsequent analysis and reporting of the collected data.

Whether you’re new to surveys or a seasoned researcher, mastering analysis and reporting is essential. To unlock the full potential of surveys, you must learn how to make a survey report.

Before diving into the process, let’s clarify what a survey report is and why it’s crucial. It is a structured document that presents the findings, analysis, and conclusions derived from survey data. It serves as a means to communicate the insights obtained from the survey to stakeholders, enabling them to make informed decisions.

The importance of a survey report cannot be overstated. It provides a comprehensive overview of collected data, allowing stakeholders to gain a deeper understanding of the subject matter. Additionally, it serves as a reference point for future decision-making and strategy development, ensuring that actions are based on sound evidence rather than assumptions.

Key components of a survey report

Start a survey report with a brief overview of the purpose of the survey, its objectives, and the methodology used for data collection. This sets the context for the rest of the report and helps readers understand the scope of the survey.

The introduction serves as the roadmap that guides readers through the document and provides essential background information. It should answer questions such as why the survey was conducted, who the target audience was, and how the data was collected . By setting clear expectations upfront, the groundwork is laid for a coherent and compelling report.

Present the key findings of the survey in a clear and organized manner. Use charts, graphs, and tables to visualize the data effectively. Ensure that the findings are presented in a logical sequence, making it easy for readers to follow the narrative.

The survey findings section is the heart of the report, where the raw data collected during the survey is presented. It’s essential to organize the findings in a way that is easy to understand and digest. Visual aids such as charts, graphs, and tables can help illustrate trends and patterns in the data, making it easier for readers to grasp the key insights.

Dive deeper into the survey data by analyzing trends, patterns, and correlations. Provide insights into what the data means and why certain trends may be occurring. The findings must be interpreted in the context of the survey’s objectives and any relevant background information.

Analysis and interpretation are where the real value of the survey report lies. This is where you move beyond surface-level findings to uncover the underlying meaning behind the data. By digging deeper to provide meaningful insights, you help stakeholders gain a deeper understanding of the issues at hand and identify potential opportunities for action.

Based on the analysis, offer actionable recommendations or suggestions that address the issues identified in the survey. These recommendations should be practical, feasible, and tied directly to the survey findings.

The recommendations section is where insights are translated into action. It’s not enough to simply present the findings—clear guidance on what steps should be taken next must be provided. Recommendations should be specific, actionable, and backed by evidence from the survey data. Such practical guidance empowers stakeholders to make informed decisions that drive positive change.

Don’t forget to summarize the key findings, insights, and recommendations presented in the report. Reinforce the importance of the survey results and emphasize how they can be used to drive decision-making.

The conclusion serves as a final wrap-up, summarizing the key takeaways and reinforcing the importance of the findings. It’s an opportunity to remind stakeholders of the survey’s value and how the results can be used to inform decision-making and drive positive change. By ending on a strong note, readers have a clear understanding of the significance of the survey and the actions that need to be taken moving forward.

Best practices for survey report writing

In addition to understanding the key components of a survey report, it’s essential to follow best practices when writing and presenting findings. Here are some tips to ensure that a survey report is clear, concise, and impactful.

Avoid technical jargon or overly complex language that may confuse readers. Instead, use clear and straightforward wording that is easily understood by the target audience.

Organize a survey report into clearly defined sections:

  • Introduction
  • Survey findings
  • Analysis and interpretation
  • Recommendations
  • Conclusion

This helps readers navigate the document and find needed information quickly.

Always provide the background of findings by explaining the significance of the survey objectives and how the data relates to the broader goals of the organization or project.

Charts, graphs, and tables can help illustrate key findings and trends in the data. Use them sparingly and ensure they are properly labeled and explained in the text.
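While drafting, even a plain-text bar chart can show whether a proportion is worth turning into a real figure. A minimal sketch (the categories and percentages are invented):

```python
def text_bar_chart(data, width=40):
    """Render a dict of label -> percentage as proportional rows of '#'."""
    lines = []
    for label, pct in data.items():
        bar = "#" * round(pct / 100 * width)
        lines.append(f"{label:<12} {bar} {pct}%")
    return "\n".join(lines)

print(text_bar_chart({"Satisfied": 85, "Neutral": 10, "Unsatisfied": 5}))
```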

When referencing external sources or previous research, be sure to cite them properly. This adds credibility to the findings and allows readers to explore the topic further if they wish.

Before finalizing a survey report, take the time to proofread and edit it for grammar, spelling, and formatting errors. A polished and professional-looking report reflects positively on your work and enhances its credibility.

Following these best practices ensures that a survey report effectively communicates findings and insights to stakeholders, empowering them to make informed decisions based on the data collected.

Short survey report example

To illustrate the process, let’s consider a hypothetical short survey report example:

The purpose of this survey was to gather feedback from customers regarding their satisfaction with our products and services. The survey was conducted online and received responses from 300 participants over a two-week period.

  • 85% of respondents reported being satisfied with the quality of our products.
  • 70% indicated that they found our customer service to be responsive and helpful.
  • The majority of respondents cited price as the primary factor influencing their purchasing decisions.
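Headline figures like those above are simply proportions computed over the cleaned responses. A sketch of how such a percentage might be derived (the data is invented, not the actual survey behind this example):

```python
# Invented responses: True means the respondent was satisfied
# with product quality; 300 respondents in total.
quality_satisfied = [True] * 255 + [False] * 45

pct = 100 * sum(quality_satisfied) / len(quality_satisfied)
print(f"{pct:.0f}% of respondents reported being satisfied")  # 85%
```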

The high satisfaction ratings suggest that our products meet the expectations of our customers. However, the feedback regarding pricing indicates a potential area for improvement. By analyzing the data further, we can identify opportunities to adjust pricing strategies or offer discounts to better meet customer needs.

Based on the survey findings, we recommend conducting further market research to better understand pricing dynamics and competitive positioning. Additionally, we propose exploring initiatives to enhance the overall value proposition for our products and services.

The survey results provide valuable insights into customer perceptions and preferences. By acting on these findings, we can strengthen our competitive position and drive greater customer satisfaction and loyalty .

Creating a survey report involves more than just presenting data; it requires careful analysis, interpretation, and meaningful recommendations. By following the steps outlined in this guide and utilizing the survey report example like the one provided, you can effectively communicate survey findings and empower decision-makers to take action based on valuable insights.

Ready to turn survey insights into actionable results? Try SurveyPlanet, our powerful survey tool designed to streamline survey creation, data collection, and analysis. Sign up now for a free trial and experience the ease and efficiency of gathering valuable feedback with SurveyPlanet. Your journey to informed decision-making starts here!


Have a language expert improve your writing

Run a free plagiarism check in 10 minutes, automatically generate references for free.

  • Knowledge Base
  • Methodology
  • Doing Survey Research | A Step-by-Step Guide & Examples

Doing Survey Research | A Step-by-Step Guide & Examples

Published on 6 May 2022 by Shona McCombes . Revised on 10 October 2022.

Survey research means collecting information about a group of people by asking them questions and analysing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyse the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research .

Table of contents

What are surveys used for, step 1: define the population and sample, step 2: decide on the type of survey, step 3: design the survey questions, step 4: distribute the survey and collect responses, step 5: analyse the survey results, step 6: write up the survey results, frequently asked questions about surveys.

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: Investigating the experiences and characteristics of different social groups
  • Market research: Finding out what customers think about products, services, and companies
  • Health research: Collecting data from patients about symptoms and treatments
  • Politics: Measuring public opinion about parties and policies
  • Psychology: Researching personality traits, preferences, and behaviours

Surveys can be used in both cross-sectional studies , where you collect data just once, and longitudinal studies , where you survey the same sample several times over an extended period.

Prevent plagiarism, run a free check.

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • University students in the UK
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18 to 24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalised to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every university student in the UK. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.

There are many sampling methods that allow you to generalise to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions.

There are two main types of survey:

  • A questionnaire , where a list of questions is distributed by post, online, or in person, and respondents fill it out themselves
  • An interview , where the researcher asks a set of questions by phone or in person and records the responses

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by post is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g., residents of a specific region).
  • The response rate is often low.

Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyse.
  • The anonymity and accessibility of online surveys mean you have less control over who responds.

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping centre or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g., the opinions of a shop’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations.

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data : the researcher records each response as a category or rating and statistically analyses the results. But they are more commonly used to collect qualitative data : the interviewees’ full responses are transcribed and analysed individually to gain a richer understanding of their opinions and feelings.

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g., yes/no or agree/disagree )
  • A scale (e.g., a Likert scale with five points ranging from strongly agree to strongly disagree )
  • A list of options with a single answer possible (e.g., age categories)
  • A list of options with multiple answers possible (e.g., leisure interests)

Closed-ended questions are best for quantitative research . They provide you with numerical data that can be statistically analysed to find patterns, trends, and correlations .

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an ‘other’ field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic.

Use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no bias towards one answer or another.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, the two should be placed directly next to one another.

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by post, online, or in person.

There are many methods of analysing the results of your survey. First, you have to process the data, usually with the help of a computer program to sort all the responses. You should also clean the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organising them into categories or themes. You can also use more qualitative methods, such as thematic analysis, which is especially suitable for analysing interviews.

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.
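As an illustration, the cleaning and tabulating steps described above can be sketched in a few lines of plain Python. This is a minimal sketch with hypothetical response data and field names; in practice you would more likely work in SPSS, Stata, or a spreadsheet.

```python
from collections import Counter

# Hypothetical raw responses; None marks an unanswered question.
raw_responses = [
    {"id": 1, "satisfaction": "agree", "age_group": "18-24"},
    {"id": 2, "satisfaction": None, "age_group": "25-34"},  # incomplete
    {"id": 3, "satisfaction": "strongly agree", "age_group": "18-24"},
    {"id": 4, "satisfaction": "disagree", "age_group": "35-44"},
]

# Cleaning: remove incomplete responses before any analysis.
cleaned = [r for r in raw_responses if all(v is not None for v in r.values())]

# Processing: tabulate a closed-ended question to look for patterns.
counts = Counter(r["satisfaction"] for r in cleaned)
print(len(cleaned), "complete responses:", dict(counts))
```

The same two steps (drop incomplete rows, then count category frequencies) scale up directly once the data is exported from your survey platform.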

Finally, when you have collected and analysed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyse it. In the results section, you summarise the key results from your analysis.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviours. It is made up of four or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements, and a continuum of items, usually with five or seven possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data, because the items have clear rank order, but don't have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyse your data.
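As a sketch of how combined Likert scale scores are typically computed, the snippet below codes each response option as 1-5 and sums the item codes. The response labels and the reverse-keyed item handling are common practice rather than a prescription from any single source.

```python
# Code the five response options as 1-5 (ordinal at the item level).
SCALE = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}

def likert_score(answers, reverse=()):
    """Combine item responses into one scale score.

    Items whose index appears in `reverse` are reverse-keyed: their
    coding is flipped (1<->5, 2<->4) before summing.
    """
    total = 0
    for i, answer in enumerate(answers):
        code = SCALE[answer]
        if i in reverse:
            code = 6 - code  # flip on a 5-point scale
        total += code
    return total

# Four items measuring one trait; the last item is reverse-keyed.
print(likert_score(["agree", "neutral", "strongly agree", "disagree"],
                   reverse=(3,)))  # item codes 4 + 3 + 5 + (6 - 2) -> 16
```

Whether such summed scores may be treated as interval data, and hence analysed with parametric tests, is exactly the judgment call described above.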

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analysing data from people using questionnaires.


McCombes, S. (2022, October 10). Doing Survey Research | A Step-by-Step Guide & Examples. Scribbr. Retrieved 24 June 2024, from https://www.scribbr.co.uk/research-methods/surveys/



9.1 Overview of Survey Research

Learning Objectives

  • Define what survey research is, including its two important characteristics.
  • Describe several different ways that survey research can be used and give some examples.

What Is Survey Research?

Survey research is a quantitative approach that has two important characteristics. First, the variables of interest are measured using self-reports. In essence, survey researchers ask their participants (who are often called respondents in survey research) to report directly on their own thoughts, feelings, and behaviors. Second, considerable attention is paid to the issue of sampling. In particular, survey researchers have a strong preference for large random samples because they provide the most accurate estimates of what is true in the population. In fact, survey research may be the only approach in psychology in which random sampling is routinely used. Beyond these two characteristics, almost anything goes in survey research. Surveys can be long or short. They can be conducted in person, by telephone, through the mail, or over the Internet. They can be about voting intentions, consumer preferences, social attitudes, health, or anything else that it is possible to ask people about and receive meaningful answers.

Most survey research is nonexperimental. It is used to describe single variables (e.g., the percentage of voters who prefer one presidential candidate or another, the prevalence of schizophrenia in the general population) and also to assess statistical relationships between variables (e.g., the relationship between income and health). But surveys can also be experimental. The study by Lerner and her colleagues is a good example. Their use of self-report measures and a large national sample identifies their work as survey research. But their manipulation of an independent variable (anger vs. fear) to assess its effect on a dependent variable (risk judgments) also identifies their work as experimental.

History and Uses of Survey Research

Survey research may have its roots in English and American “social surveys” conducted around the turn of the 20th century by researchers and reformers who wanted to document the extent of social problems such as poverty (Converse, 1987). By the 1930s, the US government was conducting surveys to document economic and social conditions in the country. The need to draw conclusions about the entire population helped spur advances in sampling procedures. At about the same time, several researchers who had already made a name for themselves in market research, studying consumer preferences for American businesses, turned their attention to election polling. A watershed event was the presidential election of 1936 between Alf Landon and Franklin Roosevelt. A magazine called Literary Digest conducted a survey by sending ballots (which were also subscription requests) to millions of Americans. Based on this “straw poll,” the editors predicted that Landon would win in a landslide. At the same time, the new pollsters were using scientific methods with much smaller samples to predict just the opposite—that Roosevelt would win in a landslide. In fact, one of them, George Gallup, publicly criticized the methods of Literary Digest before the election and all but guaranteed that his prediction would be correct. And of course it was. (We will consider the reasons that Gallup was right later in this chapter.)

From market research and election polling, survey research made its way into several academic fields, including political science, sociology, and public health—where it continues to be one of the primary approaches to collecting new data. Beginning in the 1930s, psychologists made important advances in questionnaire design, including techniques that are still used today, such as the Likert scale. (See “What Is a Likert Scale?” in Section 9.2 “Constructing Survey Questionnaires”.) Survey research has a strong historical association with the social psychological study of attitudes, stereotypes, and prejudice. Early attitude researchers were also among the first psychologists to seek larger and more diverse samples than the convenience samples of college students that were routinely used in psychology (and still are).

Survey research continues to be important in psychology today. For example, survey data have been instrumental in estimating the prevalence of various mental disorders and identifying statistical relationships among those disorders and with various other factors. The National Comorbidity Survey is a large-scale mental health survey conducted in the United States (see http://www.hcp.med.harvard.edu/ncs ). In just one part of this survey, nearly 10,000 adults were given a structured mental health interview in their homes in 2002 and 2003. Table 9.1 “Some Lifetime Prevalence Results From the National Comorbidity Survey” presents results on the lifetime prevalence of some anxiety, mood, and substance use disorders. (Lifetime prevalence is the percentage of the population that develops the problem sometime in their lifetime.) Obviously, this kind of information can be of great use both to basic researchers seeking to understand the causes and correlates of mental disorders and also to clinicians and policymakers who need to understand exactly how common these disorders are.

Table 9.1 Some Lifetime Prevalence Results From the National Comorbidity Survey

Lifetime prevalence* (%)

| Disorder | Total | Women | Men |
| --- | --- | --- | --- |
| Generalized anxiety disorder | 5.7 | 7.1 | 4.2 |
| Obsessive-compulsive disorder | 2.3 | 3.1 | 1.6 |
| Major depressive disorder | 16.9 | 20.2 | 13.2 |
| Bipolar disorder | 4.4 | 4.5 | 4.3 |
| Alcohol abuse | 13.2 | 7.5 | 19.6 |
| Drug abuse | 8.0 | 4.8 | 11.6 |
*The lifetime prevalence of a disorder is the percentage of people in the population that develop that disorder at any time in their lives.

And as the opening example makes clear, survey research can even be used to conduct experiments to test specific hypotheses about causal relationships between variables. Such studies, when conducted on large and diverse samples, can be a useful supplement to laboratory studies conducted on college students. Although this is not a typical use of survey research, it certainly illustrates the flexibility of this approach.

Key Takeaways

  • Survey research is a quantitative approach that features the use of self-report measures on carefully selected samples. It is a flexible approach that can be used to study a wide variety of basic and applied research questions.
  • Survey research has its roots in applied social research, market research, and election polling. It has since become an important approach in many academic disciplines, including political science, sociology, public health, and, of course, psychology.

Discussion: Think of a question that each of the following professionals might try to answer using survey research.

  • a social psychologist
  • an educational researcher
  • a market researcher who works for a supermarket chain
  • the mayor of a large city
  • the head of a university police force

Converse, J. M. (1987). Survey research in the United States: Roots and emergence, 1890–1960. Berkeley, CA: University of California Press.

Research Methods in Psychology Copyright © 2016 by University of Minnesota is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.

Survey Software & Market Research Solutions - Sawtooth Software


Survey Research: Definition, Methods, Examples, and More


What is Survey Research?

Survey research, as a key research method of marketing research, is defined as the systematic collection and analysis of data gathered from respondent feedback through questionnaires or interviews. This primary research method is designed to gather information about individuals' opinions, behaviors, or characteristics through a series of questions or statements. 

The evolution of survey research in market research has been profound, transitioning from paper-based questionnaires posted randomly to respondents' homes to sophisticated online platforms that offer much more convenient ways to reach the desired audience. Its importance lies not just in the breadth of data it can collect but in the depth of understanding it provides, allowing researchers and businesses alike to tap into the psyche of their target audience.


Reasons for Conducting Survey Research

The reasons for conducting survey research are as diverse as the questions it seeks to answer, yet they all converge on a common goal: to inform decision-making processes. Here's why survey research is pivotal:

  • Honest Feedback and Insights: Survey research offers a platform for respondents to provide candid feedback on products, services, or policies, providing businesses with critical insights into consumer satisfaction and areas for improvement.
  • Privacy and Anonymity Benefits: By ensuring respondent anonymity, surveys encourage honest and uninhibited responses, leading to more accurate and reliable data.
  • Providing a Platform for Criticism and Improvement Suggestions: Surveys open up a dialogue between businesses and their clientele, offering a structured way for criticism and suggestions to be voiced constructively.
  • Iterative Feedback Loops: The iterative nature of survey research, with its ability to be conducted periodically, helps businesses track changes in consumer behavior and preferences over time, enabling continuous improvement and adaptation. This ongoing dialogue facilitated by survey research not only enriches the business-consumer relationship but also fosters an environment of continuous learning and improvement, ensuring that businesses remain agile and responsive to the evolving needs and expectations of their target audience.


Types of Survey Research Methods & Data Collection Methods

In the world of survey research, a range of methods each offer unique advantages tailored to a researcher's or business's specific research goals.

Email Surveys

Email surveys represent a modern approach to data collection, utilizing email addresses stored on client databases to distribute questionnaires. This method is particularly appealing for its cost-effectiveness and efficiency, as it minimizes the financial expenditure associated with other methods. However, many businesses only hold email addresses relating to their current customer base, meaning that any studies performed using this approach will be limited in scope.

Online Panels

Online panels represent the most convenient form of online research. Panel companies source a wide variety of potential respondents who are available for any company to survey on a cost-per-interview (CPI) basis. However, this convenience comes with drawbacks: online panels are known for potential data quality issues which are likely to impact the results of your survey if not guarded against.

Phone Surveys (CATI)

Computer Assisted Telephone Interviewing (CATI) combines the efficiency of computer-guided surveys with the personal touch of telephone communication. This method is advantageous for its ability to cover wide populations, including those in remote areas, ensuring a broader demographic reach. The direct interaction between the interviewer and respondent can also enhance response rates and clarity on questions. However, personal engagement comes at a cost, making CATI more time-consuming and expensive than online methods. 

Face-to-Face Interviews

The most traditional method, face-to-face interviews, involves direct, in-person interaction between the interviewer and the respondent. This approach is highly valued for its high response rates and the depth of insight it can provide, including non-verbal cues that offer additional layers of understanding. Although this method is resource-intensive, requiring significant investment in trained personnel and logistics, the quality of data obtained can be unmatched. 

Survey Research Timeframe Methods

Longitudinal Survey Research tracks the same group of respondents over time, offering invaluable insights into trends and changes in behaviors or attitudes. This method is ideal for observing long-term patterns, such as the impact of societal changes on individual behaviors. 

Cross-sectional / Ad-hoc Survey Research provides a snapshot of a population at a specific point in time, making it perfect for capturing immediate insights across various demographics. This method's versatility is showcased in applications ranging from consumer satisfaction surveys to public opinion polls, where understanding the current state of affairs is crucial. 

Each of these survey research methods brings its own strengths to the table, allowing researchers to tailor their approach to the specific nuances of their study objectives. By selecting the method that best aligns with their goals, researchers can maximize the effectiveness of their data collection efforts, paving the way for impactful insights and informed decision-making.

Uses and Examples of Survey Research

Survey research's versatility allows it to be applied across a myriad of fields, offering insights that drive decision-making and strategic planning. Its applications range from gauging public opinion and consumer preferences to evaluating the effectiveness of policies and programs.

Marketing Research

In marketing research, survey research is pivotal in understanding consumer behavior, preferences, and satisfaction levels. For example, a retail company may conduct online surveys to determine customer satisfaction with its products and services. The feedback collected can highlight areas of success and identify opportunities for improvement, guiding the company in refining its offerings and enhancing the customer experience.


Political Polling

Political polling represents another significant application of survey research, providing insights into voter attitudes, preferences, and likely behaviors. These surveys can influence campaign strategies, policy development, and understanding of public sentiment on various issues. A notable instance is the use of survey research during electoral campaigns to track the popularity of candidates and the effectiveness of their messages.

Public Health Research

Public health studies frequently utilize survey research to assess health behaviors, awareness of health issues, and the impact of health interventions. For example, a cross-sectional survey might be conducted to evaluate the effectiveness of a public health campaign aimed at reducing smoking rates. The data gathered can inform health officials about the campaign's impact and guide future public health strategies.

Educational Research

Educational research also benefits from survey methods, with studies designed to evaluate educational interventions, student satisfaction, and learning outcomes. For instance, longitudinal surveys can track students' academic progress over time, providing insights into the effectiveness of educational programs and interventions.

These examples underscore the adaptability of survey research, enabling tailored approaches to collecting and analyzing data across various sectors. Its capacity to yield actionable insights makes it an invaluable tool in the pursuit of knowledge and improvement.

Advantages and Disadvantages

Survey research is a powerful tool in the arsenal of researchers, offering numerous advantages while also presenting certain challenges that must be navigated carefully.

Advantages of Survey Research

  • Cost-Effectiveness: Survey research is often more affordable than other data collection methods, especially beneficial when targeting large populations.
  • Large Sample Sizes: It enables the collection of data from a large sample size (audience), enhancing the generalizability of findings.
  • Flexibility in Design: Surveys allow for customization in question formats, delivery methods, and structure, tailoring the approach to specific research needs.
  • Ease of Administration: With options for online, mail, phone, and in-person surveys, administration can be adapted to best reach the target audience.
  • Efficient Data Analysis: The quantitative nature of survey responses facilitates straightforward analysis using statistical software, aiding in the quick identification of trends and insights.

Disadvantages of Survey Research

  • Response Bias: The potential for respondents to provide socially desirable answers rather than truthful ones can lead to biased data.
  • Sampling Issues: Challenges such as non-response bias and difficulty in reaching certain populations can compromise the representativeness of the sample.
  • Questionnaire Design Challenges: Crafting questions that are clear and unbiased while avoiding ambiguity is complex and can impact the validity of the results.
  • Lack of Response Context: Surveys may not capture the nuances behind responses, limiting understanding of the reasons behind certain behaviors or opinions.
  • Time and Resource Constraints: Designing, administering, and analyzing surveys can be resource-intensive, potentially limiting their scope and depth.
  • Data Quality: The rise of survey panels has increased the likelihood of either poor quality responses, or even automated bots, affecting survey results.

Understanding these advantages and disadvantages is crucial for researchers as they design and implement survey research studies. By carefully considering these factors, it is possible to leverage the strengths of survey research while mitigating its limitations, ensuring the collection of valuable and actionable insights.

Survey Research Design Process

The design and execution of survey research involve several critical steps, each contributing to the overall quality and reliability of the findings. By following a structured process, researchers can ensure that their survey research effectively meets its objectives.

  • Define Survey Research Objectives: The first step involves clearly defining what you aim to achieve with your survey. Objectives should be specific, measurable, achievable, relevant, and time-bound (SMART). This clarity guides the subsequent steps of the survey design process.
  • Identify Your Target Audience: Knowing who you need to survey is crucial. The target audience should align with the research objectives, ensuring that the data collected is relevant and insightful.
  • Select the Appropriate Method: Based on the objectives and the target audience, choose the most suitable survey method. Consider factors such as budget, time constraints, and the need for depth vs. breadth of data.
  • Plan and Execute the Study: This involves crafting the survey questionnaire, deciding on the distribution method (online, mail, phone, face-to-face), and determining the timeline for data collection. Ensuring questions are clear, unbiased, and relevant is critical to gathering valuable data.
  • Analyze Data and Make Decisions: Once data collection is complete, analyze the responses to identify trends, patterns, and insights. Use statistical software for quantitative analysis and consider qualitative methods for open-ended responses. The findings should inform decision-making processes, guiding strategic planning and interventions.

By following these steps, researchers can maximize the effectiveness and reliability of their survey research, paving the way for meaningful insights and informed decision-making.

Sampling Methods in Survey Research

A crucial aspect of survey research is selecting a representative sample from the target population. The sampling method plays a significant role in the quality and generalizability of the research findings. There are two main types of sampling methods: probability sampling and non-probability sampling.

  • Probability Sampling: This method ensures every member of the target population has a known and equal chance of being selected. Types of probability sampling include simple random sampling, stratified random sampling, and cluster sampling. This method is preferred for its ability to produce representative samples, allowing for generalizations about the population from the sample data.
  • Non-Probability Sampling: In non-probability sampling, not every member of the population has a known or equal chance of selection. This category includes convenience sampling, quota sampling, and purposive sampling. While less rigorous than probability sampling, non-probability methods are often used when time and resources are limited or when specific, targeted insights are required.

Choosing the right sampling method is critical to the success of survey research. For example, a market research firm aiming to understand consumer preferences across different demographics might use stratified random sampling to ensure that the sample accurately reflects the population's diversity. Conversely, a preliminary study exploring a new phenomenon might opt for convenience sampling to quickly gather initial insights.

Understanding the strengths and limitations of each sampling method allows researchers to make informed choices, balancing rigor with practical constraints to best achieve their research objectives.
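For illustration, simple random sampling and proportionally stratified sampling can be sketched in plain Python. The population, its two regions, and the sample size of 30 are all made up for this example.

```python
import random

random.seed(0)  # reproducible illustration only

# Made-up population of 300 people split across two regions (strata).
population = [{"id": i, "region": "north" if i % 3 else "south"}
              for i in range(300)]

# Simple random sampling: every member has an equal chance of selection.
srs = random.sample(population, k=30)

# Stratified random sampling: sample within each stratum, proportionally.
strata = {}
for person in population:
    strata.setdefault(person["region"], []).append(person)

stratified = []
for group in strata.values():
    k = round(30 * len(group) / len(population))  # proportional allocation
    stratified.extend(random.sample(group, k))

print(len(srs), len(stratified))  # 30 30
```

The stratified draw guarantees that each region appears in the sample in proportion to its share of the population, whereas a simple random draw only achieves that on average.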


Survey research provides invaluable insights across diverse fields, from consumer behavior to public policy. Its flexibility, cost-effectiveness, and broad reach make it an indispensable tool for researchers aiming to gather actionable data. Despite its challenges, such as response bias and sampling complexities, careful design and methodological rigor can mitigate these issues, enhancing the reliability and validity of findings.


Bit Blog

Survey Report: What is it & How to Create it?


So, you’ve conducted a survey and come up with strategies to get the most responses to it.

A survey does not end once you’ve managed to capture a lot of responses. Responses are just plain data.

A survey report is how you convert that data into information and implement the results in your research.

Data left unanalyzed is just a mess of numbers and words, which does no good to anyone. In this article, we’ll show you the ins and outs of writing a fantastic survey report.

Before getting into the step-by-step process, let’s first understand what survey reports are.

What is a Survey Report? (Definition)

A survey report is a document that demonstrates all the important information about the survey in an objective, clear, precise, and fact-based manner.

The report defines:

  • The objective of the survey
  • Number of questions
  • Number of respondents
  • Start and end date of the survey
  • The demographics of the people who have answered (geographical area, gender, age, marital status, etc)
  • Your findings and conclusion from the given data

All this data should be presented using graphs, charts, and tables: anything that makes the data easy to read and understand.

After reading your survey report, the reader should be clear on:

  • Why you conducted this survey.
  • The time period the survey ran.
  • Channels used to promote and fill up the survey.
  • Demographics of the respondents.
  • What you found out after conducting the survey.
  • How the findings can be implemented to create better results.

Importance of Writing a Survey Report

Here are four compelling reasons why you should always write a survey report:

1. Make a powerful impact

Survey reports reveal the hard numbers regarding a scenario. It is always impactful when you include stats with facts.


For instance, saying “80% of women working in the media sector claim to have faced workplace harassment at some time in their life” is more impactful than saying “Many women face workplace harassment.”

Data is much more powerful when we know the exact proportion of it. People connect with issues that come with large numbers, as they stir up the room and demand action.


2. Paves the way for decision-making

When you categorize large data into easy-to-read charts and graphs, it gets converted into useful information. This information is then used by the management to make decisions that will directly affect the company.

For example, let’s say that you performed a product feedback survey to find out the preferences of your target audience.

Based on their answers, the features of your product can be amended.

After all, it’s the people for whom you are making the product, right?

3. Trend is your friend

This is a common saying among surveyors and researchers. It implies that when you ask one question again and again over time, you find a trend of change in the answers.

Thus, survey reports help track trends in the market, which also provides insight into what the future could hold.

For instance, if we talk about fashion, bell-bottom jeans were a huge trend in the ’70s. Then, the ’80s saw carrot pants, followed by ankle-high skinny jeans in the ’90s.

Fashion researchers track these changes and bring back some of the old trends. Bell-bottoms and carrot pants came back and Gen Z is loving it!

4. Helps to gather explicit data

Explicit data in survey terms means first-hand data that is obtained without any tampering in the process. Data often gets contaminated when it is not received first-hand and passes through various mediums.

Survey reports help express information fully and clearly without beating around the bush. However, the possibility of a respondent altering the data, lying, or manipulating their answers should be considered.

First-hand data should not be blindly followed. It must be supported by some underlying facts.

With this thought in mind, let’s move to the actual steps required to write a survey report.


How to Write a Survey Report in 5 Easy Steps?

If you are here trying to learn how to make a survey report, we can assume that you have already taken two steps:

  • Created & distributed a questionnaire/survey
  • Received responses to it

When it’s time to analyze the collected data, take the following steps:

Step 1: Export Data

Whether you used Google Forms, Typeform, SurveyMonkey, or any other survey platform, the results of the survey come in two forms. One is a spreadsheet document filled with data, and the other is the graphical and chart representation of that data.

Export this real-time information in your survey report by either downloading, printing, or simply copy-pasting the graphical results of the survey into your survey report document.

This legitimizes your survey and lets the reader see the exact survey you conducted and its digital results without any edits or data manipulation.


Step 2: Filter, Analyze and Visualize

The next step is to filter the data. You’ll want to look out for corrupt, biased, duplicate, or inaccurate information and filter it out. This step is also called cleaning the data. The cleaned data is then analyzed to infer results.

Analyzing the data includes grouping similar aspects, categorizing, and identifying patterns.

For instance, XYZ company conducted an employee satisfaction survey.

In it, they asked: “How is the relationship between you and your immediate supervisor?”

According to the data, 80% of the workforce said that their supervisor was warm and friendly, while the other 20% gave mixed responses.

On analyzing this, the company determined that most of the workforce works well together and the chain of command is functioning successfully.

So XYZ company will group this information and present it like this: ‘4 out of 5 employees working in XYZ organization claimed to have a warm and friendly relationship with their supervisor.’

With this, it can be concluded that the work environment at XYZ organization is good. They will also use this information to attract more talent to the organization.

It is always better to put the information in statistical terms. This makes the visualization process easier. While visualizing, you present the analyzed data in the form of easily understandable charts, graphs, or figures.

Visualization makes the data come alive!
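As a small illustration of moving from analyzed responses to chart-ready form, here is a sketch that builds counts and a plain text bar chart from the hypothetical 80/20 split in the XYZ example:

```python
from collections import Counter

# Hypothetical cleaned answers matching the 80% / 20% split above.
answers = ["Warm and friendly"] * 4 + ["Mixed"] * 1

counts = Counter(answers)
for label, n in counts.most_common():
    share = n / len(answers)
    # A crude text bar chart; a real report would feed `counts`
    # into a charting tool instead.
    print(f"{label:20s} {'#' * n}  {share:.0%}")
```

The same `counts` mapping is exactly what most charting libraries expect as input for a bar or pie chart.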

Step 3: Interpret Data

At this point, you have all your information ready and baked. It's time to eat!

By eating, we mean inferring what you found out from the collected data. Present your findings with supporting facts and figures, and also think about next steps/recommendations to drive change.


But hey, we can’t eat before presenting the dishes first. Presentation is an important element!

However, not every survey report is presented the same way. This leads us to the next step…

Step 4: Recognise the Type of Survey Report

There are various kinds of survey reports, depending on the nature and objective of the survey. Recognize what type your survey falls into. Some common ones include:

  • Employee Satisfaction Survey: This is done to determine the opinion of the employees regarding the workspace. It helps in increasing employee motivation and makes them feel heard.
  • Customer Satisfaction Survey: You must have come across one of these in your life. Many restaurants, salons, and companies ask their customers to fill out a form giving feedback.
  • Market Research Survey: This type of survey is performed to find out the preferences and demographics of the target audience. It also helps with competitor research.
  • Social Survey: This is a survey conducted to find out stats about social topics such as climate change, waste management, poverty, etc.

Step 5: Structure of the Survey Report

  • Title Page

Add the title of your survey, the name of the organization, the date of submission, the name of the mentor, etc.

  • Background and Objective

Provide a background of the topic at hand, giving insight as to why this survey is conducted. The reason behind it is known as the objective of the report.

  • Methodology

Your methodology should tell the readers about the methods you used to conduct the survey, channels used to spread it, and the ways used to analyze the received data.

  • Conclusion

The conclusion should contain a detailed picture of your findings. It includes listing what you determined from the survey in the form of facts, figures, and statistics.

  • Recommendation

Give the reader the next step or CTA (call to action) in your survey report. After sharing the survey and analyzing the results, offer some insightful recommendations to make a change, amend a process, or implement something new.

For instance, to improve employee satisfaction and communication, we can hold a weekly town hall meeting.

  • Appendix

Include detailed information that is too long or complicated to put in the body of the report but is used or referred to in it. This may include long mathematical calculations, tables of raw data, in-depth charts, etc.

  • References

Make sure to properly credit every source that you extracted information from. This is very important, as the absence of references puts your report at risk of plagiarism.

  • Table of Contents

Usually, it is included at the beginning of any report or project. However, in the case of the survey report, the table of contents comes at the end. It lists everything present in the report divided into sections and sub-sections.

  • Executive Summary

It is a crucial element of your report. Not everybody has the time to go through the entire survey report. The executive summary should be able to summarise the whole survey in one or two pages.

Okay, great! So now you know how to write a survey report. How about we show you the smartest & fastest way to create yours?

Bit.ai : The Ultimate Tool for Creating Survey Reports


Easily weave any type of digital content within your Bit docs like file attachments, visual web links, cloud files, PDF previews, math equations, videos, and much more. Bit integrates with 80+ popular applications like Google Sheets, OneDrive, Tableau, Typeform, Lucidcharts, and much more! Now your content, wherever it may reside, can be part of your survey report.

Bit’s impressive editor is collaborative so that you can work with your team at the same time and create smart documents that help you communicate more effectively together.


Your survey report can contain an automated table of contents, embedded Excel sheets of tables, interactive Tableau charts, PDF presentations, Google Form surveys, and much more! Having this information in one place allows you to easily add context to each element you share. Reduce the amount of scattered data and information and tie it together beautifully in one place.

When you’re ready to share your report with the world, you can invite your team members to view it inside of Bit. You can also share it with a live link, embed it on a website, or share a trackable link to track the engagement levels of your report. If additional security layers are needed, you can invite your audience in as guests, who must log in to view your survey report.

Bit will change the way you communicate and is the ultimate tool necessary to create impressive survey reports!

Our team at  bit.ai  has created a few awesome business templates to make your business processes more efficient. Make sure to check them out before you go; your team might need them!

  • SWOT Analysis Template
  • Business Proposal Template
  • Business Plan Template
  • Competitor Research Template
  • Project Proposal Template
  • Company Fact Sheet
  • Executive Summary Template
  • Operational Plan Template
  • Pitch Deck Template

Conducting a survey isn’t easy.

From preparing the questionnaire to interpreting and presenting the data, it can seem like a heck of a job.

With Bit.ai’s smart document collaboration platform, at least you don’t have to worry about creating an interactive, impressive survey report!

So what are you waiting for?

Try out Bit’s survey report template and let us know how you liked it by tweeting @bit_docs.

Further reads:

How To Create An Effective Status Report?

7 Types of Reports Your Business Certainly Needs!

Incident Report: What is it & How to Write it the Right Way!

Performance Report: What is it & How to Create it? (Steps Included)

Business Report: What is it & How to Write it? (Steps & Format)

Marketing Report: Definition, Types, Benefits & Things to Include!

Sales Report: What is it and How to Create One?





5 Approaching Survey Research

What is Survey Research?

Survey research is a quantitative and qualitative method with two important characteristics. First, the variables of interest are measured using self-reports (using questionnaires or interviews). In essence, survey researchers ask their participants (who are often called respondents in survey research) to report directly on their own thoughts, feelings, and behaviors. Second, considerable attention is paid to the issue of sampling. In particular, survey researchers have a strong preference for large random samples because they provide the most accurate estimates of what is true in the population. Beyond these two characteristics, almost anything goes in survey research. Surveys can be long or short. They can be conducted in person, by telephone, through the mail, or over the Internet. They can be about voting intentions, consumer preferences, social attitudes, health, or anything else that it is possible to ask people about and receive meaningful answers. Although survey data are often analyzed using statistics, there are many questions that lend themselves to more qualitative analysis.

Most survey research is non-experimental. It is used to describe single variables (e.g., the percentage of voters who prefer one presidential candidate or another, the prevalence of schizophrenia in the general population, etc.) and also to assess statistical relationships between variables (e.g., the relationship between income and health). But surveys can also be used within experimental research, as long as there is manipulation of an independent variable (e.g., anger vs. fear) to assess its effect on a dependent variable (e.g., risk judgments).

Chapter 5: Learning Objectives

If your research question(s) center on the experience or perception of a particular phenomenon, process, or practice, utilizing a survey method may help glean useful data. After reading this chapter, you will

  • Identify the purpose of survey research
  • Describe the cognitive processes involved in responding to questions
  • Discuss the importance of context in drafting survey items
  • Contrast the utility of open and closed ended questions
  • Describe the BRUSO method of drafting survey questions
  • Describe the format for survey questionnaires

The heart of any survey research project is the survey itself. Although it is easy to think of interesting questions to ask people, constructing a good survey is not easy at all. The problem is that the answers people give can be influenced in unintended ways by the wording of the items, the order of the items, the response options provided, and many other factors. At best, these influences add noise to the data. At worst, they result in systematic biases and misleading results. In this section, therefore, we consider some principles for constructing surveys to minimize these unintended effects and thereby maximize the reliability and validity of respondents’ answers.

Cognitive Processes of Responses

To best understand how to write a ‘good’ survey question, it is important to frame the act of responding to a survey question as a cognitive process. That is, there are involuntary mechanisms that take place when someone is asked a question. Sudman, Bradburn, & Schwarz (1996, as cited in Jhangiani et al., 2012) illustrate this cognitive process here.

Progression of a cognitive response: first, the respondent must understand the question; then retrieve information from memory; then formulate a tentative response based on a judgment formed from that information. Finally, the respondent must edit the response, depending on the response options provided by the survey.

Framing the formulation of survey questions in this way is extremely helpful to ensure that the questions posed on your survey glean accurate information.

Example of a Poorly Worded Survey Question

How many alcoholic drinks do you consume in a typical day?

  • A lot more than average
  • Somewhat more than average
  • Average number
  • Somewhat fewer than average
  • A lot fewer than average

Although this item at first seems straightforward, it poses several difficulties for respondents. First, they must interpret the question. For example, they must decide whether “alcoholic drinks” include beer and wine (as opposed to just hard liquor) and whether a “typical day” is a typical weekday, typical weekend day, or both. Chang and Krosnick (2003, as cited in Jhangiani et al. 2012) found that asking about “typical” behavior is more valid than asking about “past” behavior, although their study compared a “typical week” to the “past week,” and results may differ when considering typical weekdays versus weekend days. Once respondents have interpreted the question, they must retrieve relevant information from memory to answer it. But what information should they retrieve, and how should they go about retrieving it? They might think vaguely about some recent occasions on which they drank alcohol, they might carefully try to recall and count the number of alcoholic drinks they consumed last week, or they might retrieve some existing beliefs that they have about themselves (e.g., “I am not much of a drinker”). Then they must use this information to arrive at a tentative judgment about how many alcoholic drinks they consume in a typical day. For example, this mental calculation might mean dividing the number of alcoholic drinks they consumed last week by seven to come up with an average number per day. Then they must format this tentative answer in terms of the response options actually provided. In this case, the options pose additional problems of interpretation. For example, what does “average” mean, and what would count as “somewhat more” than average? Finally, they must decide whether they want to report the response they have come up with or whether they want to edit it in some way.
For example, if they believe that they drink a lot more than average, they might not want to report that for fear of looking bad in the eyes of the researcher, so instead, they may opt to select the “somewhat more than average” response option.

From this perspective, what at first appears to be a simple matter of asking people how much they drink (and receiving a straightforward answer from them) turns out to be much more complex.

Context Effects on Survey Responses

Again, this complexity can lead to unintended influences on respondents’ answers. These are often referred to as context effects because they are not related to the content of the item but to the context in which the item appears (Schwarz & Strack, 1990, as cited in Jhangiani et al. 2012). For example, there is an item-order effect when the order in which the items are presented affects people’s responses. One item can change how participants interpret a later item or change the information that they retrieve to respond to later items. For example, researcher Fritz Strack and his colleagues asked college students about both their general life satisfaction and their dating frequency (Strack, Martin, & Schwarz, 1988, as cited in Jhangiani et al. 2012) . When the life satisfaction item came first, the correlation between the two was only −.12, suggesting that the two variables are only weakly related. But when the dating frequency item came first, the correlation between the two was +.66, suggesting that those who date more have a strong tendency to be more satisfied with their lives. Reporting the dating frequency first made that information more accessible in memory so that they were more likely to base their life satisfaction rating on it.

The response options provided can also have unintended effects on people’s responses (Schwarz, 1999, as cited in Jhangiani et al. 2012). For example, when people are asked how often they are “really irritated” and given response options ranging from “less than once a year” to “more than once a month,” they tend to think of major irritations and report being irritated infrequently. But when they are given response options ranging from “less than once a day” to “several times a month,” they tend to think of minor irritations and report being irritated frequently. People also tend to assume that middle response options represent what is normal or typical. So if they think of themselves as normal or typical, they tend to choose middle response options. For example, people are likely to report watching more television when the response options are centered on a middle option of 4 hours than when centered on a middle option of 2 hours. To mitigate order effects, rotate questions and response items when there is no natural order. Counterbalancing or randomizing the order in which questions are presented in online surveys is good practice and can reduce response-order effects. Such effects are real: among undecided voters, the first candidate listed on a ballot receives a 2.5% boost simply by virtue of being listed first!
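One way to randomize item order per respondent is a small helper that shuffles a copy of the item list, leaving the master ordering intact. This is a minimal sketch; the question texts are illustrative and the seed is only there to make a given respondent's ordering reproducible:

```python
import random

questions = [
    "How satisfied are you with your life?",
    "How often do you go on dates?",
    "How many hours of television do you watch per day?",
]

def randomized_order(items, seed=None):
    """Return a shuffled copy of `items`; the original list is untouched."""
    rng = random.Random(seed)
    shuffled = list(items)
    rng.shuffle(shuffled)
    return shuffled

# Each respondent receives an independently randomized ordering.
for respondent_id in range(3):
    print(respondent_id, randomized_order(questions, seed=respondent_id))
```

Full counterbalancing (every ordering used equally often) requires assigning respondents to orderings systematically rather than at random, but randomization is usually a reasonable approximation for online surveys.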

Writing Survey Items

Types of Items

Questionnaire items can be either open-ended or closed-ended. Open-ended items simply ask a question and allow participants to answer in whatever way they choose. The following are examples of open-ended questionnaire items.

  • “What is the most important thing to teach children to prepare them for life?”
  • “Please describe a time when you were discriminated against because of your age.”
  • “Is there anything else you would like to tell us about?”

Open-ended items are useful when researchers do not know how participants might respond or when they want to avoid influencing their responses. Open-ended items are more qualitative in nature, so they tend to be used when researchers have more vaguely defined research questions, often in the early stages of a research project. Open-ended items are relatively easy to write because there are no response options to worry about. However, they take more time and effort on the part of participants, and they are more difficult for the researcher to analyze because the answers must be transcribed, coded, and submitted to some form of qualitative analysis, such as content analysis. Another disadvantage is that respondents are more likely to skip open-ended items because they take longer to answer. It is best to use open-ended questions when the range of possible answers is unknown, or for quantities that can easily be converted to categories later in the analysis.

Closed-ended items ask a question and provide a set of response options for participants to choose from.

Examples of  Closed-Ended Questions

How old are you?

On a scale of 0 (no pain at all) to 10 (the worst pain ever experienced), how much pain are you in right now?

Closed-ended items are used when researchers have a good idea of the different responses that participants might make. They are more quantitative in nature, so they are also used when researchers are interested in a well-defined variable or construct such as participants’ level of agreement with some statement, perceptions of risk, or frequency of a particular behavior. Closed-ended items are more difficult to write because they must include an appropriate set of response options. However, they are relatively quick and easy for participants to complete. They are also much easier for researchers to analyze because the responses can be easily converted to numbers and entered into a spreadsheet. For these reasons, closed-ended items are much more common.

All closed-ended items include a set of response options from which a participant must choose. For categorical variables like sex, race, or political party preference, the categories are usually listed and participants choose the one (or ones) to which they belong. For quantitative variables, a rating scale is typically provided. A rating scale is an ordered set of responses that participants must choose from.

A Likert scale with scaled responses from 1 (strongly disagree) to 5 (strongly agree).

The number of response options on a typical rating scale ranges from three to 11, although five and seven are probably most common. Five-point scales are best for unipolar scales where only one construct is tested, such as frequency (Never, Rarely, Sometimes, Often, Always). Seven-point scales are best for bipolar scales where there is a dichotomous spectrum, such as liking (Like very much, Like somewhat, Like slightly, Neither like nor dislike, Dislike slightly, Dislike somewhat, Dislike very much). For bipolar questions, it is useful to offer an earlier question that branches respondents into an area of the scale; if asking about liking ice cream, first ask “Do you generally like or dislike ice cream?” Once the respondent chooses like or dislike, refine it by offering them relevant choices from the seven-point scale. Branching improves both reliability and validity (Krosnick & Berent, 1993, as cited in Jhangiani et al. 2012). Although you often see scales with numerical labels, it is best to present only verbal labels to the respondents and convert them to numerical values in the analyses. Avoid partial, lengthy, or overly specific labels. In some cases, the verbal labels can be supplemented with (or even replaced by) meaningful graphics.
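The recommendation to show respondents verbal labels but analyze numeric values can be sketched with a simple mapping. The 5-point unipolar frequency scale is the one from the text; the response list is invented for illustration:

```python
# Verbal labels shown to respondents, mapped to the numeric values
# used only at analysis time.
FREQUENCY_SCALE = {
    "Never": 1,
    "Rarely": 2,
    "Sometimes": 3,
    "Often": 4,
    "Always": 5,
}

responses = ["Sometimes", "Often", "Never", "Often"]   # hypothetical data
scores = [FREQUENCY_SCALE[r] for r in responses]

mean_score = sum(scores) / len(scores)
print(scores, mean_score)
```

Keeping the mapping in one place also documents the coding decision, which matters later when you build a codebook.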

Writing Effective Items

We can now consider some principles of writing questionnaire items that minimize unintended context effects and maximize the reliability and validity of participants’ responses. A rough guideline for writing questionnaire items is provided by the BRUSO model (Peterson, 2000, as cited in Jhangiani et al. 2012). An acronym, BRUSO stands for “brief,” “relevant,” “unambiguous,” “specific,” and “objective.” Effective questionnaire items are brief and to the point. They avoid long, overly technical, or unnecessary words. This brevity makes them easier for respondents to understand and faster for them to complete. Effective questionnaire items are also relevant to the research question. If a respondent’s sexual orientation, marital status, or income is not relevant, then items on them should probably not be included. Again, this makes the questionnaire faster to complete, but it also avoids annoying respondents with what they will rightly perceive as irrelevant or even “nosy” questions. Effective questionnaire items are also unambiguous; they can be interpreted in only one way. Part of the problem with the alcohol item presented earlier in this section is that different respondents might have different ideas about what constitutes “an alcoholic drink” or “a typical day.” Effective questionnaire items are also specific, so that it is clear to respondents what their response should be about and clear to researchers what it is about. A common problem here is closed-ended items that are “double-barreled.” They ask about two conceptually separate issues but allow only one response.

Example of a “Double Barreled” question

Please rate the extent to which you have been feeling anxious and depressed.

Note: The issue with this question is that anxiety and depression are two separate constructs and should be asked about in separate items.

Finally, effective questionnaire items are objective in the sense that they do not reveal the researcher’s own opinions or lead participants to answer in a particular way. The best way to know how people interpret the wording of the question is to conduct a pilot test and ask a few people to explain how they interpreted the question. 

The BRUSO method of writing questions: items should be brief, relevant, unambiguous, specific, and objective.

For closed-ended items, it is also important to create an appropriate response scale. For categorical variables, the categories presented should generally be mutually exclusive and exhaustive. Mutually exclusive categories do not overlap. For a religion item, for example, the categories of Christian and Catholic are not mutually exclusive but Protestant and Catholic are mutually exclusive. Exhaustive categories cover all possible responses. Although Protestant and Catholic are mutually exclusive, they are not exhaustive because there are many other religious categories that a respondent might select: Jewish, Hindu, Buddhist, and so on. In many cases, it is not feasible to include every possible category, in which case an ‘Other’ category, with a space for the respondent to fill in a more specific response, is a good solution. If respondents could belong to more than one category (e.g., race), they should be instructed to choose all categories that apply.

For rating scales, five or seven response options generally allow about as much precision as respondents are capable of. However, numerical scales with more options can sometimes be appropriate. For dimensions such as attractiveness, pain, and likelihood, a 0-to-10 scale will be familiar to many respondents and easy for them to use. Regardless of the number of response options, the most extreme ones should generally be “balanced” around a neutral or modal midpoint.

Example of an unbalanced versus balanced rating scale

Unbalanced rating scale measuring perceived likelihood

Unlikely | Somewhat Likely | Likely | Very Likely | Extremely Likely

Balanced rating scale measuring perceived likelihood

Extremely Unlikely | Somewhat Unlikely | As Likely as Not | Somewhat Likely | Extremely Likely

Note, however, that a middle or neutral response option does not have to be included. Researchers sometimes choose to leave it out because they want to encourage respondents to think more deeply about their response and not simply choose the middle option by default. However, including middle alternatives on bipolar dimensions can be used to allow people to choose an option that is neither.

Formatting the Survey

Writing effective items is only one part of constructing a survey. For one thing, every survey should have a written or spoken introduction that serves two basic functions (Peterson, 2000, as cited by Jhangiani et al. 2012). One is to encourage respondents to participate in the survey. In many types of research, such encouragement is not necessary either because participants do not know they are in a study (as in naturalistic observation) or because they are part of a subject pool and have already shown their willingness to participate by signing up and showing up for the study. Survey research usually catches respondents by surprise when they answer their phone, go to their mailbox, or check their e-mail, and the researcher must make a good case for why they should agree to participate. This means that the researcher has only a moment to capture the attention of the respondent and must make it as easy as possible for the respondent to participate. Thus the introduction should briefly explain the purpose of the survey and its importance, provide information about the sponsor of the survey (university-based surveys tend to generate higher response rates), acknowledge the importance of the respondent’s participation, and describe any incentives for participating.

The second function of the introduction is to establish informed consent. Remember that this involves describing to respondents everything that might affect their decision to participate. This includes the topics covered by the survey, the amount of time it is likely to take, the respondent’s option to withdraw at any time, confidentiality issues, and so on. Written consent forms are not always used in survey research (when the research is of minimal risk and completion of the survey instrument is often accepted by the IRB as evidence of consent to participate), so it is important that this part of the introduction be well documented and presented clearly and in its entirety to every respondent.

The introduction should be followed by the substantive questionnaire items. But first, it is important to present clear instructions for completing the questionnaire, including examples of how to use any unusual response scales. Remember that the introduction is the point at which respondents are usually most interested and least fatigued, so it is good practice to start with the most important items for purposes of the research and proceed to less important items. Items should also be grouped by topic or by type. For example, items using the same rating scale (e.g., a 5-point agreement scale) should be grouped together if possible to make things faster and easier for respondents. Demographic items are often presented last because they are least interesting to participants but also easy to answer in the event respondents have become tired or bored. Of course, any survey should end with an expression of appreciation to the respondent.

Coding your survey responses

Once you’ve closed your survey, you’ll need to identify how to quantify the data you’ve collected. Much of this can be done in ways similar to methods described in the previous two chapters. Although there are several ways by which to do this, here are some general tips:

  • Transfer data: Transfer your data to a program which will allow you to organize and ‘clean’ the data. If you’ve used an online tool to gather data, you should be able to download the survey results into a format appropriate for working with the data. If you’ve collected responses by hand, you’ll need to input the data manually.
  • Save: ALWAYS save a copy of your original data. Save changes you make to the data under a different name or version in case you need to refer back to the original data.
  • De-identify: This step will depend on the overall approach that you’ve taken to answer your research question and may not be appropriate for your project.
  • Name the variables: Again, there is no ‘right’ way to do this; however, as you move forward, you will want to be sure you can easily identify what data you are extracting. Many times, when you transfer your data, the program will automatically associate the data collected with the question asked. It is a good idea to name the variable something associated with the data, rather than the question.
  • Code the attributes: Each variable will likely have several different attributes, or layers. You’ll need to come up with a coding method to distinguish the different responses. As discussed in previous chapters, each attribute should have a numeric code associated with it so that you can quantify the data and use descriptive and/or inferential statistical methods to either describe or explore relationships within the dataset.

Most online survey tools will download data into a spreadsheet-type program and organize that data in association with the question asked. Naming the variables so that you can easily identify the information will be helpful as you proceed to analysis.

This is relatively simple to accomplish with closed-ended questions. Because you’ve ‘forced’ the respondent to pick a concrete answer, you can create a code that is associated with each answer. In the picture above, respondents were asked to identify their region, given a list of geographical regions, and instructed to pick one. The researcher then created a code for the regions. In this case, 1 = West; 2 = Midwest; 3 = Northeast; 4 = Southeast; and 5 = Southwest. If you’re working to quantify data that is somewhat qualitative in nature (i.e., open-ended questions), the process is a little more complicated. You’ll need to create themes or categories, classify types of similar responses, and then assign codes to those themes or categories.

6. Create a codebook: This is essential. Once you begin to code the data, you will have somewhat disconnected yourself from the data by translating it from a language that we understand to a language which a computer understands. After you run your statistical methods, you’ll translate it back to the native language and share findings. To stay organized and accurate, it is important that you keep a record of how the data has been translated.

7. Analyze: Once you have the data entered, cleaned, and coded, you should be ready to analyze your data using either descriptive or inferential methods, depending on your approach and overarching goal.

Key Takeaways

  • Surveys are a great method to identify information about perceptions and experiences
  • Question items must be carefully crafted to elicit an appropriate response
  • Surveys are often a mixed-methods approach to research
  • Both descriptive and inferential statistical approaches can be applied to the data gleaned through survey responses
  • Surveys utilize both open- and closed-ended questions; identifying which types of questions will yield specific data will be helpful as you plan your approach to analysis
  • Most surveys will need to include a method of informed consent, and an introduction. The introduction should clearly delineate the purpose of the survey and how the results will be utilized
  • Pilot tests of your survey can save you a lot of time and heartache. Pilot testing helps to catch issues in item development, accessibility, and the type of information derived prior to initiating the survey on a larger scale
  • Survey data can be analyzed much like other types of data; following a systematic approach to coding will help ensure you get the answers you’re looking for
  • This section is attributed to Research Methods in Psychology by Rajiv S. Jhangiani, I-Chant A. Chiang, Carrie Cuttler, & Dana C. Leighton, which is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
  • The majority of content in these sections can be attributed to Research Methods in Psychology by Rajiv S. Jhangiani, I-Chant A. Chiang, Carrie Cuttler, & Dana C. Leighton, which is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.

A mixed methods approach using self-reports of respondents who are sampled using stringent methods

A type of survey question that allows the respondent to insert their own response; typically qualitative in nature

A type of survey question which forces a respondent to select a response from a fixed set of options; little room for subjectivity.

Practical Research: A Basic Guide to Planning, Doing, and Writing Copyright © by megankoster. All Rights Reserved.


Survey data analysis and best practices for reporting.

Data can do beautiful things, but turning your survey results into clear, compelling analysis isn’t always a straightforward task. We’ve collected our tips for survey analysis along with a beginner’s guide to survey data and analysis tools.

What is survey data analysis?

Survey analysis is the process of turning the raw material of your survey data into insights and answers you can use to improve things for your business. It’s an essential part of doing survey-based research .

There are a huge number of survey data analysis methods available, from simple cross-tabulation , where data from your survey responses is arranged into rows and columns that make it easier to understand, to statistical methods for survey data analysis which tell you things you could never work out on your own, such as whether the results you’re seeing have statistical significance.


Types of survey data

Different kinds of survey questions yield data in different forms. Here’s a quick guide to a few of them. Often, survey data will belong to more than one of these categories as they frequently overlap.

Quantitative data vs. qualitative data

What’s the difference between qualitative data and quantitative data?

  • Quantitative data, aka numerical data, involves numerical values and quantities. An example of quantitative data would be the number of times a customer has visited a location, the temperature of a city or the scores achieved in an NPS survey .
  • Qualitative data is information that isn’t numerical. It may be verbal or visual, or consist of spoken audio or video. It’s more likely to be descriptive or subjective, although it doesn’t have to be. Qualitative data highlights the “why” behind the what.


Closed-ended questions

These are questions with a limited range of responses. They could be a ‘yes’ or ‘no’ question such as ‘do you live in Portland, OR?’. Closed-ended questions can also take the form of multiple-choice, ranking, or drop-down menu items. Respondents can’t qualify their choice between the options or explain why they chose which one they did.

This type of question produces structured data that is easy to sort, code and quantify since the responses will fit into a limited number of ‘buckets’. However, its simplicity means you lose out on some of the finer details that respondents could have provided.

Natural language data (open-ended questions)

Answers written in the respondent’s own words are also a form of survey data. This type of response is usually given in open field (text box) question formats. Questions might begin with ‘how,’ ‘why,’ ‘describe…’ or other conversational phrases that encourage the respondent to open up.

This type of data, known as unstructured data , is rich in information. It typically requires advanced tools such as Natural Language Processing and sentiment analysis to extract the full value from how the respondents answered, because of its complexity and volume.

Categorical (nominal) data

This kind of data exists in categories that have no hierarchical relationship to each other. No item is treated as being more or less, better or worse, than the others. Examples would be primary colors (red vs. blue), genders (male vs. female) or brand names (Chrysler vs. Mitsubishi).

Multiple choice questions often produce this kind of data (though not always).

Ordinal data

Unlike categorical data, ordinal data has an intrinsic rank that relates to quantity or quality, such as degrees of preference, or how strongly someone agrees or disagrees with a statement.

Likert scales and ranking scales often serve up this kind of data.
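As an illustration (the labels, 1–5 scheme, and responses below are hypothetical), Likert answers can be mapped to ordered numeric codes, which support rank-based summaries such as the median:

```python
from statistics import median

# Map each Likert label to an ordered code (illustrative 1-5 scheme).
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

answers = ["Agree", "Neutral", "Strongly agree", "Agree", "Disagree"]
codes = sorted(LIKERT[a] for a in answers)

# Ordinal data supports rank-based statistics (median, percentiles),
# but a mean has no strict interpretation on an ordinal scale.
print(codes, median(codes))
```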


Scalar data

Like ordinal data, scalar data deals with quantity and quality on a relative basis, with some items ranking above others. What makes it different is that it uses an established scale, such as age (expressed as a number), test scores (out of 100), or time (in days, hours, minutes etc.)

You might get this kind of data from a drop-down or sliding scale question format, among others.

The type of data you receive affects the kind of survey results analysis you’ll be doing, so it’s very important to consider the type of survey data you will end up with when you’re writing your survey questions and designing survey flows .

Steps to analyze your survey data

Here’s an overview of how you can analyze survey data, identify trends and hopefully draw meaningful conclusions from your research.

1.   Review your research questions

Research questions are the underlying questions your survey seeks to answer. Research questions are not the same as the questions in your questionnaire , although they may cover similar ground.

It’s important to review your research questions before you analyze your survey data, to make sure your analysis stays aligned with what you want to accomplish and find out from your data.

2.   Cross-tabulate your data

Cross-tabulation is a valuable step in sifting through your data and uncovering its meaning. When you cross-tabulate, you’re breaking out your data according to the sub-groups within your research population or your sample, and comparing the relationship between one variable and another. The table you produce will give you an overall picture of how responses vary among your subgroups.

Target the survey questions that best address your research question. For example, if you want to know how many people would be interested in buying from you in the future, cross-tabulating the data will help you see whether some groups were more likely than others to want to return. This gives you an idea of where to focus your efforts when improving your product design or your customer experience .


Cross-tabulation works best for categorical data and other types of structured data. You can cross-tabulate your data in multiple ways across different questions and sub-groups using survey analysis software . Be aware, though, that slicing and dicing your data very finely will give you a smaller sample size, which then affects the reliability of your results.
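A minimal, standard-library-only sketch of cross-tabulation, using hypothetical respondents grouped by region and purchase intent (dedicated survey analysis software would build this table for you):

```python
from collections import Counter

# Hypothetical respondents: (region, would_buy_again)
respondents = [
    ("West", "yes"), ("West", "no"), ("Midwest", "yes"),
    ("Midwest", "yes"), ("West", "yes"), ("Northeast", "no"),
]

# Count each (row, column) combination.
table = Counter(respondents)

regions = sorted({r for r, _ in respondents})
answers = sorted({a for _, a in respondents})

# Print a simple crosstab: rows = region, columns = answer.
print("region     " + "  ".join(f"{a:>4}" for a in answers))
for region in regions:
    counts = [table[(region, a)] for a in answers]
    print(f"{region:<10} " + "  ".join(f"{c:>4}" for c in counts))
```

With a data-analysis library such as pandas, `pandas.crosstab` produces the same kind of table in one call.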

3.   Review and investigate your results

Put your results in context – how have things changed since the last time you researched these kinds of questions? Do your findings tie in to changes in your market or other research done within your company?

Look at how different demographics within your sample or research population have answered, and compare your findings to other data on these groups. For example, does your survey analysis tell you something about why a certain group is purchasing less, or more? Does the data tell you anything about how well your company is meeting strategic goals, such as changing brand perceptions or appealing to a younger market?

Look at quantitative measures too. Which questions were answered the most? Which ones produced the most polarized responses? Were there any questions with very skewed data? This could be a clue to issues with survey design .

4.   Use statistical analysis to check your findings

Statistics give you certainty (or as close to it as you can get) about the results of your survey. Statistical tools like T-test, regression and ANOVA help you make sure that the results you’re seeing have statistical significance and aren’t just there by chance.

Statistical tools can also help you determine which aspects of your data are most important, and what kinds of relationships – if any – they have with one another.
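As a rough illustration of the kind of check described above, a two-group comparison can be sketched with Welch’s t statistic. The satisfaction scores below are hypothetical, and in practice you would use a statistics package (e.g. `scipy.stats.ttest_ind` with `equal_var=False`) to obtain the p-value as well:

```python
import math
from statistics import mean, variance

# Hypothetical satisfaction scores (1-10) from two customer segments.
group_a = [7, 8, 6, 9, 7, 8, 7, 6]
group_b = [5, 6, 5, 7, 6, 5, 6, 5]

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    return (mean(a) - mean(b)) / math.sqrt(
        variance(a) / len(a) + variance(b) / len(b)
    )

t = welch_t(group_a, group_b)
print(round(t, 2))
```

A large t statistic (relative to the t distribution for the appropriate degrees of freedom) suggests the difference between the groups is unlikely to be chance alone.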

Benchmarking your survey data

One of the most powerful aspects of survey data analysis is its ability to build on itself. By repeating market research surveys at different points in time, you can not only use it to uncover insights from your results, but to strengthen those insights over time.

Using consistent types of data and methods of analysis means you can use your initial results as a benchmark for future research . What’s changed year-on-year? Has your survey data followed a steady rise, performed a sudden leap or fallen incrementally? Over time, all these questions become answerable when you listen regularly and analyze your data consistently.

Maintaining your question and data types and your data analysis methods means you achieve a like-for-like measurement of results over time. And if you collect data consistently enough to see patterns and processes emerging, you can use these to make predictions about future events and outcomes.

Another benefit of data analysis over time is that you can compare your results with other people’s, provided you are using the same measurements and metrics. A classic example is NPS (Net Promoter Score) , which has become a standard measurement of customer experience that companies typically track over time.

How to present survey results

Most data isn’t very friendly to the human eye or brain in its raw form. Survey data analysis helps you turn your data into something that’s accessible, intuitive, and even interesting to a wide range of people.

1.   Make it visual

You can present data in a visual form, such as a chart or graph, or put it into a tabular form so it’s easy for people to see the relationships between variables in your crosstab analysis. Choose a graphic format that best suits your data type and clearly shows the results to the untrained eye. There are plenty of options, including linear graphs, bar graphs, Venn diagrams, word clouds and pie charts. If time and budget allows, you can create an infographic or animation.

2.   Keep language human

You can express discoveries in plain language, for example, in phrases like “customers in the USA consistently preferred potato chips to corn chips.” Adding direct quotes from your natural language data (provided respondents have consented to this) can add immediacy and illustrate your points.

3.   Tell the story of your research

Another approach is to express data using the power of storytelling, using a beginning-middle-end or situation-crisis-resolution structure to talk about how trends have emerged or challenges have been overcome. This helps people understand the context of your research and why you did it the way you did.

4.   Include your insights

As well as presenting your data in terms of numbers and proportions, always be sure to share the insights it has produced too. Insights come when you apply knowledge and ideas to the data in the survey, which means they’re often more striking and easier to grasp than the data by itself. Insights may take the form of a recommended action , or examine how two different data points are connected.


Common mistakes in analyzing data and how to avoid them

1.   Being too quick to interpret survey results

It’s easy to get carried away when the data seems to show the results you were expecting or confirms a hypothesis you started with. This is why it’s so important to use statistics to make sure your results are statistically significant, i.e. reflect reality rather than coincidence. Remember that a skewed or coincidental result becomes more likely with a smaller sample size.

2.   Treating correlation like causation

You may have heard the phrase “correlation is not causation” before. It’s well-known for a reason: mistaking a link between two independent variables as a causal relationship between them is a common pitfall in research. Results can correlate without one having a direct effect on the other.

An example is when there is another common variable involved that isn’t measured and acts as a kind of missing link between the correlated variables. Sales of sunscreen might go up in line with the number of ice-creams sold at the beach, but it’s not because there’s something about ice-cream that makes people more vulnerable to getting sunburned. It’s because a third variable – sunshine – affects both sunscreen use and ice-cream sales.

3.   Missing the nuances in qualitative natural language data

Human language is complex, and analyzing survey data in the form of speech or text isn’t as straightforward as mapping vocabulary items to positive or negative codes. The latest AI solutions go further, uncovering meaning, emotion and intent within human language.

Trusting your rich qualitative data to an AI’s interpretation means relying on the software’s ability to understand language in the way a human would, taking into account things like context and conversational dynamics. If you’re investing in software to analyze natural language data in your surveys, make sure it’s capable of sentiment analysis that uses machine learning to get a deeper understanding of what survey respondents are trying to tell you.

Tools for survey analysis

If you’re planning to run an ongoing data insights program (and we recommend that you do), it’s important to have tools on hand that make it easy and efficient to perform your research and extract valuable insights from the results.

It’s even better if those tools help you to share your findings with the right people, at the right time, in a format that works for them. Here are a few attributes to look for in a survey analysis software platform.

  • Easy to use (for non-experts) Look for software that demands minimal training or expertise, and you’ll save time and effort while maximizing the number of people who can pitch in on your experience management program . User-friendly drag-and-drop interfaces, straightforward menus, and automated data analysis are all worth looking out for.
  • Works on any platform Don’t restrict your team to a single place where software is located on a few terminals. Instead, choose a cloud-based platform that’s optimized for mobile, desktop, tablet and more.
  • Integrates with your existing setup Stand-alone analysis tools create additional work you shouldn’t have to do. Why export, convert, paste and print out when you can use a software tool that plugs straight into your existing systems via API?
  • Incorporates statistical analysis Choose a system that gives you the tools to not just process and present your data, but refine your survey results using statistical tools that generate deep insights and future predictions with just a few clicks.
  • Comes with first-class support The best survey data tool is one that scales with you and adapts to your goals and growth. A large part of that is having an expert team on call to answer questions, propose bespoke solutions, and help you get the most out of the service you’ve paid for.

Tips from the team at Qualtrics

We’ve run more than a few survey research programs in our time, and we have some tips to share that you may not find in the average survey data analysis guide. Here are some innovative ways to help make sure your survey analysis hits the mark, grabs attention, and provokes change.

Write the headlines

The #1 way to make your research hit the mark is to start with the end in mind. Before you even write your survey questions, make sample headlines of what the survey will discover. Sample headlines are the main data takeaways from your research. Some sample headlines might be:

  • The #1 concern that travelers have with staying at our hotel is X
  • X% of visitors to our showroom want to be approached by a salesperson within the first 10 minutes
  • Diners are X% more likely to choose our new lunch menu than our old one

You may even want to sketch out mock charts that show how the data will look in your results. If you “write” the results first, those results become a guide to help you design questions that ensure you get the data you want.

Gut Data Gut

We live in a data-driven society. Marketing is a data-driven business function. But don’t be afraid to overlap qualitative research findings onto your quantitative data . Don’t be hesitant to apply what you know in your gut with what you know from the data.

This is called “Gut Data Gut”. Check your gut, check your data, and check your gut. If you have personal experience with the research topic, use it! If you have qualitative research that supports the data, use it!

Your survey is one star in a constellation of information that combines to tell a story. Use every atom of information at your disposal. Just be sure to let your audience know when you are showing them findings from statistically significant research and when it comes from a different source.

Write a mock press release to encourage taking action

One of the biggest challenges of research is acting on it . This is sometimes called the “Knowing / Doing Gap” where an organization has a difficult time implementing truths they know.

One way you can ignite change with your research is to write a press release dated six months into the future that proudly announces all the changes as a result of your research. Maybe it touts the three new features that were added to your product. Perhaps it introduces your new approach to technical support. Maybe it outlines the improvements to your website.

After six months, gather your team and read the press release together to see how well you executed change based on the research.

Focus your research findings

Everyone consumes information differently. Some people want to fly over your findings at 30,000 feet and others want to slog through the weeds in their rubber boots. You should package your research for these different research consumer types.

Package your survey results analysis findings in 5 ways:

  • A 1-page executive summary with key insights
  • A 1-page stat sheet that ticks off the top supporting stats
  • A shareable slide deck with data visuals that can be understood as a stand-alone or by being presented in person
  • Live dashboards with all the survey data that allow team members to filter the data and dig in as deeply as they want on a DIY basis
  • The Mock Press Release (mentioned above)

How to analyze survey data

Reporting on survey results will prove the value of your work. Learn more about statistical analysis types or jump into an analysis type below to see our favorite tools of the trade:

  • Conjoint Analysis
  • CrossTab Analysis
  • Cluster Analysis
  • Factor Analysis
  • Analysis of Variance (ANOVA)



Definition, Templates, and Best Practices of Survey Report


What is a survey report? Purpose and templates

The survey report is a document whose purpose is to convey the information acquired during the survey fully and objectively. The report includes all of the results that were gathered.

The following are included in the full survey report:

  • Completion Rate
  • Total Number of Responses
  • Date of the Most Recent Response
  • Survey Views
  • Breakdown of Survey Respondent Answers
  • Breakdown of Closed-Ended Questions


Completion Rate

In simple terms, the completion rate is the number of questions answered divided by the total number of questions in your survey. This is crucial to understand for a variety of reasons.

For example, if our survey had any items that respondents could skip (were optional), or if they abandoned the survey halfway through.

If we have a survey with 12 questions but most responders only answer six of them, our completion rate is 50%.

The completion rate can indicate a variety of things depending on the survey tool we use. For example, if the majority of respondents were only offered six questions out of a total of twelve because half of the questions were irrelevant and were skipped, it is most likely a completion rate we’ll be happy with.

But what if our 50% completion rate is due to individuals purposefully ignoring questions? It might imply that we need to improve our survey.
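The arithmetic is simple enough to sketch directly (the figures mirror the hypothetical 6-of-12 example above):

```python
def completion_rate(answered, total_questions):
    """Share of survey questions answered, expressed as a percentage."""
    return 100 * answered / total_questions

# The 12-question survey above, with only six questions answered:
print(completion_rate(6, 12))  # 50.0
```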

Number Of Responses

In order to fully assess our survey findings, we must know exactly how many individuals responded. Be cautious: certain survey platforms may not count individual respondents, only their replies to individual questions.

As a result, it’s critical that our survey platform allows us to tally the number of distinct people that replied so that we can assess whether we have a substantial sample size.

How do we figure out how many samples we’ll need?

This depends on the type of data we want to study; however, we may choose to examine data from our whole audience or just a certain group.

For example, if we are a beauty firm that offers face creams exclusively for ladies over the age of thirty-five, our survey may reveal that we also have younger women who use our products because they want to look as youthful as possible for as long as possible.

We may choose to divide these replies into distinct age groups in order to gather the data we need, or data that may provide us with intriguing insight.

So, if we polled them on the efficacy of a new anti-aging lotion, we may find that ladies under thirty had significantly different replies than those in their sixties. This is the type of data that we could have neglected but that can considerably help us with our marketing efforts.
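Splitting responses into age groups and comparing their average ratings might look like the sketch below (the age bands, ratings, and cutoffs are illustrative assumptions, not data from any real survey):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (age, rating) responses to the anti-aging lotion question.
responses = [(24, 3), (29, 4), (31, 8), (45, 9), (62, 7), (67, 6), (26, 2)]

def age_group(age):
    """Bucket an age into an illustrative band."""
    if age < 30:
        return "under 30"
    if age < 60:
        return "30-59"
    return "60+"

# Collect ratings per age band, then compare the group averages.
by_group = defaultdict(list)
for age, rating in responses:
    by_group[age_group(age)].append(rating)

for group, ratings in sorted(by_group.items()):
    print(group, round(mean(ratings), 1))
```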

Date Of Last Response

This may not appear relevant if we are performing a survey for a limited and specified time period. Even so, if we ask clients to complete a customer service evaluation survey once each case is closed, we may have years of data.

A lot may change for our product, staff, and customers over time, therefore it’s critical to determine if the data we’re evaluating is still relevant.

For example, if we redesign our website but keep asking all new customers to fill out the same post-purchase survey without updating it for the redesign, the combined data may be misleading. This is because it takes into account sentiments expressed before the new design was released.

Survey Views

We need to know the overall number of survey views as well as the total number of unique survey views (the number of total views versus the number of different people who viewed the survey, as some people may have viewed it more than once).

If there is a significant difference between these two totals, this might indicate a number of reasons.

First, our survey may be aimed at a big number of people, and the questions may not be relevant enough for all of our respondents to answer.

Respondents may potentially read the survey and opt not to complete it for the following reasons:

  • They just do not have the time.
  • They lack the necessary equipment (things like open-ended questions can be difficult and tedious to answer on a small phone screen)
  • They look at the first few questions and determine that doing the survey is not for them.
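Counting total versus unique views is straightforward once each view is logged against a viewer identifier (the log below is hypothetical):

```python
# Hypothetical view log: one entry per survey page view, keyed by viewer ID.
view_log = ["u1", "u2", "u1", "u3", "u2", "u1", "u4"]

total_views = len(view_log)        # all views, repeats included
unique_views = len(set(view_log))  # distinct people who viewed the survey

print(total_views, unique_views)  # 7 4
```

A large gap between the two numbers is the signal discussed above: many people are opening the survey more than once without completing it.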

Breakdown Of Survey Respondent Answers

We want to see how each respondent answered all of the questions, so that we can examine individual response patterns across the whole survey. This can be useful for identifying trends in particular respondents’ answers.

If we discover a particularly insightful response to one question, we can also look up that respondent's other answers.

Breakdown Of Closed-ended Questions

When you think of a survey report, you probably see graphs and pie charts summarizing the results of closed-ended questions.

This is vital for a successful survey report, since it lets you take in a huge amount of data at a glance and communicate it readily to the people who may find it useful.

The use of graphics in survey analysis makes the results more user-friendly; evaluating them requires little effort or prior knowledge.

With an NPS ® (Net Promoter Score) chart, for example, we can see at a glance that over 75% of respondents are promoters of our brand, that 3.2 percent are detractors (who can and should be addressed to the best of our ability), and that we received 800 total replies. All of this information is plain to see and easy to interpret.
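The NPS calculation itself is simple: the percentage of promoters (ratings of 9-10) minus the percentage of detractors (ratings of 0-6). A sketch with hypothetical ratings:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical ratings from ten respondents
ratings = [10, 9, 9, 10, 8, 7, 9, 10, 3, 9]
print(net_promoter_score(ratings))  # 60.0 (70% promoters - 10% detractors)
```

On the figures quoted above, the same formula would give roughly 75 - 3.2 ≈ 71.8.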

Purpose Of Survey Report

The primary goal of a survey report is to collect all of the data and information gathered during the survey and put it to use for research and other purposes. In the case of weather survey reports, analysts use the reports to assess changes in climate between the current year and the prior year. Marketing research firms use survey reports to examine the behavioral responses of clients and consumers to new services and products. Understanding such data enables the organization to make improvements, implement changes, and raise awareness, all of which contribute to the company's growth.

Best Practices Of Survey Report

Use data visualizations to display information.

Visuals are the most important component to incorporate when writing a survey introduction.

Including a chart in our introduction helps bring it to life and adds emphasis to the story we're telling.

Survey Visualization Examples

Pie charts are ideal for bringing numbers to life; a wedding poll, for example, can show at a glance how respondents split across the answer options.

Write a Brief Survey Summary 

Our survey summary should provide the reader with a thorough overview of the information. However, we don’t want to take up too much room.

Because they are intended to be swiftly digested by decision-makers, survey summaries are also referred to as executive summaries.

We’ll want to eliminate the less important findings and concentrate on what’s crucial.

For most surveys, a one-page summary is sufficient to provide this information.

Short Survey Introduction Examples

A teaser at the beginning of a survey summary is one method to keep it brief: an opening that doesn't reveal all of the results but gives the reader a reason to keep reading.

Write the most important information first

When considering how to write a summary of survey findings, keep in mind that the opening must capture the reader's interest. Leading with the crucial data helps us do exactly that.

This is why it is typically better to write the survey introduction last, once the remainder of the survey report has been produced. That way, we'll be able to identify the key lessons.

This is a simple and effective technique for designing a survey introduction that entices the reader to explore further.

Survey Report Templates

  • Basic sample survey report
  • Labor market survey report
  • Structural survey report
  • Survey research report
  • Salary survey report

Net Promoter ® , NPS ® , NPS Prism ® , and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Satmetrix Systems, Inc., and Fred Reichheld. Net Promoter Score℠ and Net Promoter System℠ are service marks of Bain & Company, Inc., Satmetrix Systems, Inc., and Fred Reichheld.


Research Reports: Definition and How to Write Them

Research reports span a vast horizon of topics, but each is focused on communicating information about a particular subject to a niche target market. The primary motive of research reports is to convey the integral details of a study for marketers to consider while designing new strategies.

Certain events, facts, and other incident-based information need to be relayed to the people in charge, and research reports are the most effective communication tool for this. An ideal research report is extremely accurate in the information it offers, has a clear objective and conclusion, and uses a clean, structured format to relay information effectively.

What are Research Reports?

Research reports are recorded data prepared by researchers or statisticians after analyzing the information gathered by conducting organized research, typically in the form of surveys or qualitative methods .

A research report is a reliable source for recounting the details of a completed study. It is most often considered a true testimony of all the work done to capture the specifics of the research.

The various sections of a research report are:

  • Background/Introduction
  • Implemented Methods
  • Results based on Analysis
  • Deliberation

Components of Research Reports

Research is imperative for launching a new product/service or a new feature. Today's markets are extremely volatile and competitive due to new entrants every day who may or may not provide effective products. An organization needs to make the right decisions at the right time to stay relevant in such a market, with updated products that satisfy customer demands.

The details of a research report may change with the purpose of the research, but its main components remain constant. The market researcher's approach also influences the style of the report. Here are seven main components of a productive research report:

  • Research Report Summary: The overall objective and an overview of the research should be included in a summary that is a couple of paragraphs in length. Each component of the research is explained in brief under the report summary, and it should be interesting enough to capture all the key elements of the report.
  • Research Introduction: There is always a primary goal that the researcher is trying to achieve through a report. In the introduction section, the researcher can address this goal and establish a thesis that the rest of the report will strive to answer in detail. This section should answer one integral question: “What is the current situation of the goal?” Whether the organization has achieved the goal or it is still a work in progress, such details belong in the introduction part of the research report.
  • Research Methodology: This is the most important section of the report, where all of the essential information lies. Readers can use it to assess the quality of the content, and other market researchers can validate the research from it, so this section needs to be highly informative, with each aspect of the research discussed in detail. Information should be expressed in order of its priority and importance, and researchers should include references wherever they draw on existing techniques.
  • Research Results: A short description of the results, along with the calculations conducted to achieve the goal, forms this section. The fuller exposition that follows data analysis is usually carried out in the discussion part of the report.

  • Research Discussion: The results are discussed in detail in this section, along with a comparative analysis of any reports that may exist in the same domain. Any abnormality uncovered during the research should be deliberated here. While writing the report, the researcher has to connect the dots on how the results apply in the real world.
  • Research References and Conclusion: Conclude all of the research findings, and mention every author, article, or other content piece from which references were taken.

15 Tips for Writing Research Reports

Writing research reports in the wrong manner can lead to all of the effort going down the drain. Here are 15 tips for writing impactful research reports:

  • Prepare the context before starting to write, and start from the basics: As we were always taught in school, be well prepared before taking the plunge into a new topic. The order of the survey questions might not be the ideal or most effective order for writing research reports. The idea is to start with a broader topic, work towards a more specific one, and focus on a conclusion that the research supports with facts. The most difficult thing in reporting, without a doubt, is to start. Start with the title and the introduction, then document the first discoveries, and continue from there. Once the information is well documented, the marketers can write a general conclusion.
  • Keep the target audience in mind while selecting a format that is clear, logical and obvious to them: Will the research reports be presented to decision makers or to other researchers? What are the general perceptions around the topic? This requires care and diligence: a researcher needs a significant amount of information before starting to write. Be consistent with the wording, the numbering of the annexes, and so on. Follow the company's approved format for delivering research reports, and align the project with the objectives of the company.
  • Have a clear research objective: A researcher should read the entire proposal again and make sure that the data provided contributes to the objectives that were raised at the beginning. Remember that speculations are for conversations, not for research reports; researchers who speculate directly undermine their own research.
  • Establish a working model: Each study must have an internal logic, which will have to be established in the report and in the evidence. The researcher's worst nightmare is to be required to write a research report and realize that key questions were not included.

  • Gather all the information about the research topic. Who are our customers' competitors? Talk to other researchers who have studied the subject, and learn the language of the industry. Misuse of terms can discourage readers of research reports from reading further.
  • Read aloud while writing. If the researcher stumbles over the words when reading them, the reader surely will too. If an idea can't fit in a single sentence, the sentence is too long and must be changed so that the idea is clear to everyone.
  • Check grammar and spelling. Without a doubt, good practices help readers understand the report. Use verbs in the present tense, which makes the results sound more immediate. Find new words and other ways of saying things, and have fun with the language whenever possible.
  • Discuss only the discoveries that are significant. If some data are not really significant, do not mention them. Remember that not everything is truly important or essential within research reports.

  • Try to stick to the survey questions. For example, do not say that the people surveyed “were worried” about a research issue when there are different degrees of concern.
  • The graphs must be clear enough to stand on their own. Do not let graphs lead the reader to make mistakes: give them a title, and include the legend, the size of the sample, and the correct wording of the question.
  • Be clear with messages. A researcher should always write every section of the report with an accuracy of details and language.
  • Be creative with titles. Particularly in segmentation studies, choose names “that give life to the research”. Such names can survive for a long time after the initial investigation.
  • Create an effective conclusion: The conclusion is the most difficult part of a research report to write, but it is an incredible opportunity to excel. Make a precise summary. Sometimes it helps to start the conclusion with something specific, then describe the most important part of the study, and finally provide the implications of the conclusions.
  • Get another pair of eyes to read the report. Writers have trouble detecting their own mistakes, but they are responsible for what is presented. Ensure the report has been reviewed by colleagues or friends before sending the final draft out.


Research Report – Example, Writing Guide and Types


Definition:

Research Report is a written document that presents the results of a research project or study, including the research question, methodology, results, and conclusions, in a clear and objective manner.

The purpose of a research report is to communicate the findings of the research to the intended audience, which could be other researchers, stakeholders, or the general public.

Components of Research Report

Components of Research Report are as follows:

Introduction

The introduction sets the stage for the research report and provides a brief overview of the research question or problem being investigated. It should include a clear statement of the purpose of the study and its significance or relevance to the field of research. It may also provide background information or a literature review to help contextualize the research.

Literature Review

The literature review provides a critical analysis and synthesis of the existing research and scholarship relevant to the research question or problem. It should identify the gaps, inconsistencies, and contradictions in the literature and show how the current study addresses these issues. The literature review also establishes the theoretical framework or conceptual model that guides the research.

Methodology

The methodology section describes the research design, methods, and procedures used to collect and analyze data. It should include information on the sample or participants, data collection instruments, data collection procedures, and data analysis techniques. The methodology should be clear and detailed enough to allow other researchers to replicate the study.

Results

The results section presents the findings of the study in a clear and objective manner. It should provide a detailed description of the data and statistics used to answer the research question or test the hypothesis. Tables, graphs, and figures may be included to help visualize the data and illustrate the key findings.

Discussion

The discussion section interprets the results of the study and explains their significance or relevance to the research question or problem. It should also compare the current findings with those of previous studies and identify the implications for future research or practice. The discussion should be based on the results presented in the previous section and should avoid speculation or unfounded conclusions.

Conclusion

The conclusion summarizes the key findings of the study and restates the main argument or thesis presented in the introduction. It should also provide a brief overview of the contributions of the study to the field of research and the implications for practice or policy.

References

The references section lists all the sources cited in the research report, following a specific citation style, such as APA or MLA.

Appendices

The appendices section includes any additional material, such as data tables, figures, or instruments used in the study, that could not be included in the main text due to space limitations.

Types of Research Report

Types of Research Report are as follows:

Thesis

A thesis is a type of research report: a long-form research document that presents the findings and conclusions of an original research study conducted by a student as part of a graduate or postgraduate program. It is typically written by a student pursuing a higher degree, such as a Master’s or Doctoral degree, although it can also be written by researchers or scholars in other fields.

Research Paper

A research paper is a type of research report: a document that presents the results of a research study or investigation. Research papers can be written in a variety of fields, including science, social science, humanities, and business. They typically follow a standard format that includes an introduction, literature review, methodology, results, discussion, and conclusion sections.

Technical Report

A technical report is a detailed report that provides information about a specific technical or scientific problem or project. Technical reports are often used in engineering, science, and other technical fields to document research and development work.

Progress Report

A progress report provides an update on the progress of a research project or program over a specific period of time. Progress reports are typically used to communicate the status of a project to stakeholders, funders, or project managers.

Feasibility Report

A feasibility report assesses the feasibility of a proposed project or plan, providing an analysis of the potential risks, benefits, and costs associated with the project. Feasibility reports are often used in business, engineering, and other fields to determine the viability of a project before it is undertaken.

Field Report

A field report documents observations and findings from fieldwork, which is research conducted in the natural environment or setting. Field reports are often used in anthropology, ecology, and other social and natural sciences.

Experimental Report

An experimental report documents the results of a scientific experiment, including the hypothesis, methods, results, and conclusions. Experimental reports are often used in biology, chemistry, and other sciences to communicate the results of laboratory experiments.

Case Study Report

A case study report provides an in-depth analysis of a specific case or situation, often used in psychology, social work, and other fields to document and understand complex cases or phenomena.

Literature Review Report

A literature review report synthesizes and summarizes existing research on a specific topic, providing an overview of the current state of knowledge on the subject. Literature review reports are often used in social sciences, education, and other fields to identify gaps in the literature and guide future research.

Research Report Example

Following is a sample research report example for students:

Title: The Impact of Social Media on Academic Performance among High School Students

Abstract:

This study aims to investigate the relationship between social media use and academic performance among high school students. The study utilized a quantitative research design, which involved a survey questionnaire administered to a sample of 200 high school students. The findings indicate that there is a negative correlation between social media use and academic performance, suggesting that excessive social media use can lead to poor academic performance among high school students. The results of this study have important implications for educators, parents, and policymakers, as they highlight the need for strategies that can help students balance their social media use and academic responsibilities.

Introduction:

Social media has become an integral part of the lives of high school students. With the widespread use of social media platforms such as Facebook, Twitter, Instagram, and Snapchat, students can connect with friends, share photos and videos, and engage in discussions on a range of topics. While social media offers many benefits, concerns have been raised about its impact on academic performance. Many studies have found a negative correlation between social media use and academic performance among high school students (Kirschner & Karpinski, 2010; Paul, Baker, & Cochran, 2012).

Given the growing importance of social media in the lives of high school students, it is important to investigate its impact on academic performance. This study aims to address this gap by examining the relationship between social media use and academic performance among high school students.

Methodology:

The study utilized a quantitative research design, which involved a survey questionnaire administered to a sample of 200 high school students. The questionnaire was developed based on previous studies and was designed to measure the frequency and duration of social media use, as well as academic performance.

The participants were selected using a convenience sampling technique, and the survey questionnaire was distributed in the classroom during regular school hours. The data collected were analyzed using descriptive statistics and correlation analysis.

Results:

The findings indicate that the majority of high school students use social media platforms on a daily basis, with Facebook being the most popular platform. The results also show a negative correlation between social media use and academic performance, suggesting that excessive social media use can lead to poor academic performance among high school students.
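The correlation such a study reports is typically Pearson's r; the sketch below computes it from scratch on hypothetical data (not the study's actual data) to show what a strong negative correlation looks like numerically:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: daily hours on social media vs. grade average (0-100)
hours = [1, 2, 3, 4, 5, 6]
grades = [92, 88, 81, 76, 70, 65]
print(round(pearson_r(hours, grades), 3))  # -0.998: a strong negative correlation
```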

Discussion:

The results of this study have important implications for educators, parents, and policymakers. The negative correlation between social media use and academic performance suggests that strategies should be put in place to help students balance their social media use and academic responsibilities. For example, educators could incorporate social media into their teaching strategies to engage students and enhance learning. Parents could limit their children’s social media use and encourage them to prioritize their academic responsibilities. Policymakers could develop guidelines and policies to regulate social media use among high school students.

Conclusion:

In conclusion, this study provides evidence of the negative impact of social media on academic performance among high school students. The findings highlight the need for strategies that can help students balance their social media use and academic responsibilities. Further research is needed to explore the specific mechanisms by which social media use affects academic performance and to develop effective strategies for addressing this issue.

Limitations:

One limitation of this study is the use of convenience sampling, which limits the generalizability of the findings to other populations. Future studies should use random sampling techniques to increase the representativeness of the sample. Another limitation is the use of self-reported measures, which may be subject to social desirability bias. Future studies could use objective measures of social media use and academic performance, such as tracking software and school records.

Implications:

The findings of this study have important implications for educators, parents, and policymakers. Educators could incorporate social media into their teaching strategies to engage students and enhance learning. For example, teachers could use social media platforms to share relevant educational resources and facilitate online discussions. Parents could limit their children’s social media use and encourage them to prioritize their academic responsibilities. They could also engage in open communication with their children to understand their social media use and its impact on their academic performance. Policymakers could develop guidelines and policies to regulate social media use among high school students. For example, schools could implement social media policies that restrict access during class time and encourage responsible use.

References:

  • Kirschner, P. A., & Karpinski, A. C. (2010). Facebook® and academic performance. Computers in Human Behavior, 26(6), 1237-1245.
  • Paul, J. A., Baker, H. M., & Cochran, J. D. (2012). Effect of online social networking on student academic performance. Journal of the Research Center for Educational Technology, 8(1), 1-19.
  • Pantic, I. (2014). Online social networking and mental health. Cyberpsychology, Behavior, and Social Networking, 17(10), 652-657.
  • Rosen, L. D., Carrier, L. M., & Cheever, N. A. (2013). Facebook and texting made me do it: Media-induced task-switching while studying. Computers in Human Behavior, 29(3), 948-958.

Note: The above example is just a sample to guide students. Do not copy and paste it directly as your college or university assignment; do your own research and write your own report.

Applications of Research Report

Research reports have many applications, including:

  • Communicating research findings: The primary application of a research report is to communicate the results of a study to other researchers, stakeholders, or the general public. The report serves as a way to share new knowledge, insights, and discoveries with others in the field.
  • Informing policy and practice : Research reports can inform policy and practice by providing evidence-based recommendations for decision-makers. For example, a research report on the effectiveness of a new drug could inform regulatory agencies in their decision-making process.
  • Supporting further research: Research reports can provide a foundation for further research in a particular area. Other researchers may use the findings and methodology of a report to develop new research questions or to build on existing research.
  • Evaluating programs and interventions: Research reports can be used to evaluate the effectiveness of programs and interventions in achieving their intended outcomes. For example, a research report on a new educational program could provide evidence of its impact on student performance.
  • Demonstrating impact: Research reports can be used to demonstrate the impact of research funding or to evaluate the success of research projects. By presenting the findings and outcomes of a study, research reports can show the value of research to funders and stakeholders.
  • Enhancing professional development: Research reports can be used to enhance professional development by providing a source of information and learning for researchers and practitioners in a particular field. For example, a research report on a new teaching methodology could provide insights and ideas for educators to incorporate into their own practice.

How to Write a Research Report

Here are some steps you can follow to write a research report:

  • Identify the research question: The first step in writing a research report is to identify your research question. This will help you focus your research and organize your findings.
  • Conduct research: Once you have identified your research question, you will need to conduct research to gather relevant data and information. This can involve conducting experiments, reviewing literature, or analyzing data.
  • Organize your findings: Once you have gathered all of your data, you will need to organize your findings in a way that is clear and understandable. This can involve creating tables, graphs, or charts to illustrate your results.
  • Write the report: Once you have organized your findings, you can begin writing the report. Start with an introduction that provides background information and explains the purpose of your research. Next, provide a detailed description of your research methods and findings. Finally, summarize your results and draw conclusions based on your findings.
  • Proofread and edit: After you have written your report, be sure to proofread and edit it carefully. Check for grammar and spelling errors, and make sure that your report is well-organized and easy to read.
  • Include a reference list: Be sure to include a list of references that you used in your research. This will give credit to your sources and allow readers to further explore the topic if they choose.
  • Format your report: Finally, format your report according to the guidelines provided by your instructor or organization. This may include formatting requirements for headings, margins, fonts, and spacing.
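The "organize your findings" step above often amounts to tabulating responses before building tables or charts. As a minimal sketch (the response data here are hypothetical, not drawn from any study), a frequency table for one closed-ended question can be produced in Python:

```python
from collections import Counter

# Hypothetical responses to a single Likert-type survey item
responses = ["agree", "agree", "neutral", "disagree", "agree", "neutral"]

counts = Counter(responses)
total = len(responses)
for option, n in counts.most_common():
    # Print each option with its count and percentage of all responses
    print(f"{option:<10} {n:>3}  ({100 * n / total:.1f}%)")
```

The same counts feed directly into a bar chart or a results table in the written report.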

Purpose of a Research Report

The purpose of a research report is to communicate the results of a research study to a specific audience, such as peers in the same field, stakeholders, or the general public. The report provides a detailed description of the research methods, findings, and conclusions.

Some common purposes of a research report include:

  • Sharing knowledge: A research report allows researchers to share their findings and knowledge with others in their field. This helps to advance the field and improve the understanding of a particular topic.
  • Identifying trends: A research report can identify trends and patterns in data, which can help guide future research and inform decision-making.
  • Addressing problems: A research report can provide insights into problems or issues and suggest solutions or recommendations for addressing them.
  • Evaluating programs or interventions: A research report can evaluate the effectiveness of programs or interventions, which can inform decision-making about whether to continue, modify, or discontinue them.
  • Meeting regulatory requirements: In some fields, research reports are required to meet regulatory requirements, such as in the case of drug trials or environmental impact studies.

When to Write a Research Report

A research report should be written after completing the research study. This includes collecting data, analyzing the results, and drawing conclusions based on the findings. Once the research is complete, the report should be written in a timely manner while the information is still fresh in the researcher’s mind.

In academic settings, research reports are often required as part of coursework or as part of a thesis or dissertation. In this case, the report should be written according to the guidelines provided by the instructor or institution.

In other settings, such as in industry or government, research reports may be required to inform decision-making or to comply with regulatory requirements. In these cases, the report should be written as soon as possible after the research is completed in order to inform decision-making in a timely manner.

Overall, when to write a research report depends on the purpose of the research, the expectations of the audience, and any regulatory requirements that need to be met. In all cases, the report should be completed promptly, while the information is still fresh in the researcher’s mind.

Characteristics of a Research Report

There are several characteristics of a research report that distinguish it from other types of writing. These characteristics include:

  • Objective: A research report should be written in an objective and unbiased manner. It should present the facts and findings of the research study without any personal opinions or biases.
  • Systematic: A research report should be written in a systematic manner. It should follow a clear and logical structure, and the information should be presented in a way that is easy to understand and follow.
  • Detailed: A research report should be detailed and comprehensive. It should provide a thorough description of the research methods, results, and conclusions.
  • Accurate: A research report should be accurate and based on sound research methods. The findings and conclusions should be supported by data and evidence.
  • Organized: A research report should be well-organized. It should include headings and subheadings to help the reader navigate the report and understand the main points.
  • Clear and concise: A research report should be written in clear and concise language. The information should be presented in a way that is easy to understand, and unnecessary jargon should be avoided.
  • Citations and references: A research report should include citations and references to support the findings and conclusions. This helps to give credit to other researchers and to provide readers with the opportunity to further explore the topic.

Advantages of a Research Report

Research reports have several advantages, including:

  • Communicating research findings: Research reports allow researchers to communicate their findings to a wider audience, including other researchers, stakeholders, and the general public. This helps to disseminate knowledge and advance the understanding of a particular topic.
  • Providing evidence for decision-making: Research reports can provide evidence to inform decision-making, such as in the case of policy-making, program planning, or product development. The findings and conclusions can help guide decisions and improve outcomes.
  • Supporting further research: Research reports can provide a foundation for further research on a particular topic. Other researchers can build on the findings and conclusions of the report, which can lead to further discoveries and advancements in the field.
  • Demonstrating expertise: Research reports can demonstrate the expertise of the researchers and their ability to conduct rigorous and high-quality research. This can be important for securing funding, promotions, and other professional opportunities.
  • Meeting regulatory requirements: In some fields, research reports are required to meet regulatory requirements, such as in the case of drug trials or environmental impact studies. Producing a high-quality research report can help ensure compliance with these requirements.

Limitations of a Research Report

Despite their advantages, research reports also have some limitations, including:

  • Time-consuming: Conducting research and writing a report can be a time-consuming process, particularly for large-scale studies. This can limit the frequency and speed of producing research reports.
  • Expensive: Conducting research and producing a report can be expensive, particularly for studies that require specialized equipment, personnel, or data. This can limit the scope and feasibility of some research studies.
  • Limited generalizability: Research studies often focus on a specific population or context, which can limit the generalizability of the findings to other populations or contexts.
  • Potential bias: Researchers may have biases or conflicts of interest that can influence the findings and conclusions of the research study. Additionally, participants may also have biases or may not be representative of the larger population, which can limit the validity and reliability of the findings.
  • Accessibility: Research reports may be written in technical or academic language, which can limit their accessibility to a wider audience. Additionally, some research may be behind paywalls or require specialized access, which can limit the ability of others to read and use the findings.



Realtor.com Economic Research


Realtor.com Survey Finds 33% of ‘Sandwich Generation’ Say Their Circumstance Has Helped Them Buy a Home

Hannah Jones

Roughly one in six Americans falls into the ‘Sandwich Generation’, meaning they are caring for their children and their parent(s)/grandparent(s) at the same time. Though these responsibilities can be challenging to manage, roughly a third of this generation reports that their arrangement has led to homeownership. 


At an especially challenging time to purchase a home, the net impact of this sort of arrangement seems to either be very positive or quite challenging for the Sandwich Generation. More than half of ‘Sandwich Generation’ adults who receive financial support from family members said their arrangement is helping them afford a home and a little less than half said this support is helping them save for retirement. 

Heightened caretaking responsibilities have impacted the finances of roughly half of the Sandwich Generation, and the consequences seem to be mixed. Almost a third of respondents reported that they have been prevented from buying a home, and another 30% say they have been prevented from paying off their mortgage. However, another third said that their arrangement has helped them financially. It seems the impact of caretaking responsibilities on the Sandwich Generation is highly dependent on what circumstances and arrangements their family has settled on.


The ‘Sandwich Generation’ spans age ranges: Millennials make up 36% of the group, Gen Z 30%, Baby Boomers 17%, and Gen X 16%. Of those who reported being part of the Sandwich Generation, 56% were men and 44% were women, suggesting that men may be slightly more likely to fall into this category.


Generationally, Millennials are especially affected by being part of the Sandwich Generation, and the impact concentrates at the two ends of the spectrum. Almost half (46%) of Millennials in the Sandwich Generation reported that their double-support role is preventing them from buying a home, while 43% said it is in fact helping them afford a home.

Today’s housing market presents a challenge to most buyers, but especially to first-time buyers, many of whom are Millennials or Gen Z. For those in the Sandwich Generation, housing may feel either untenable given family financial needs, or perhaps more approachable due to family support. No matter the circumstance, prospective buyers can use tools such as the Realtor.com Affordability Calculator to get a better idea of how much house they can afford to accommodate their family’s needs.


Three years after police reforms, Black Bostonians report harassment and lack of trust at higher rates than other groups

A survey of Bostonians found wide disparities in the ways different racial groups experience their relationship with law enforcement, and negative interactions are also associated with trauma and chronic health conditions. 

Three years after sweeping law enforcement reforms were enacted in Boston to address long-standing concerns of unequal treatment, there is still a striking difference in the way Bostonians of different races experience their interactions with their city’s police force, according to new findings from a team of Harvard Kennedy School researchers. 

Not only did the research find large racial disparities in reports of police harassment and in trust in law enforcement, but it also showed a strong association between negative interactions with police and trauma and chronic health conditions.

The study was conducted by a research team at the Harvard Kennedy School’s Program in Criminal Justice (PCJ) and was led by Sandra Susan Smith, the Daniel and Florence Guggenheim Professor of Criminal Justice at HKS. Smith is faculty chair of the PCJ and director of the Malcolm Wiener Center for Social Policy.

The team surveyed a representative sample of 1,407 Boston residents—including 286 Black, 245 Latino, 143 Asian American and Pacific Islander, and 667 white residents—about their contact with, and trust in, law enforcement, and about the impacts that those encounters have on their lives and their communities. The survey was conducted in January and February of 2024.

The survey’s key findings include:

Black Bostonians report various types of police harassment at much higher rates than non-Black Bostonians.

In contrast to non-Black Bostonians, Black Bostonians feel a deep distrust towards law enforcement, and their distrust is strongly associated with experiences of police harassment.

More than half of Boston residents report that law enforcement has made their community feel safer, but rates vary by race/ethnicity and are informed by experiences of police harassment and harassment perceived to be racially motivated.

Among Bostonians, police harassment isn’t just predictive of distrust and feelings of community safety, it is also predictive of symptoms of trauma, especially so for Boston’s Black men.

For some Bostonians, most notably AAPI residents, police harassment and associated distrust and trauma symptoms are linked with chronic health conditions.

In June 2020, Boston’s then-mayor Marty Walsh formed a task force to review Boston Police policies and procedures. The move was part of a national reexamination of policing following George Floyd’s murder in May 2020 and the wave of national protests and outrage that followed. The task force recommendations, ultimately accepted by the mayor, included expanding the use of body-worn cameras; diversifying the police force and creating a culture of inclusion and belonging; engaging officers in implicit-bias training; creating an independent oversight review board; and enhancing police use-of-force policies.

“All things considered, are there any signs to suggest that law enforcement officers treat Black residents of Boston the same as people from other racial and ethnic groups?” the report asks. “Based on results of analysis of these survey data, we have little reason to believe that Black Bostonians are treated the same as people from other racial and ethnic groups.

“Racial disparities in police harassment, including harassment perceived to be racially motivated, are large and consistent with police patterns and practices in Boston described by many in the Black community in the years and decades before George Floyd’s murder, during that year of global protest, and in the years since. It is unclear that reforms responding to Boston’s racial reckoning have done much to alter these very troubling and long-standing patterns.”


The survey also sought to measure the extent to which encounters with police were linked with mental health vulnerabilities. Respondents were asked to recall an experience with police and then to indicate the extent to which they agreed with a series of statements that might be indicative of trauma.

“Black Bostonians responded affirmatively to a greater number of these statements,” the report found. On average, Latino, AAPI, and white Bostonians affirmed 1.1, 1.0, and 1.2 statements, respectively, while Black residents affirmed 1.8. “Further, it is not just that a significantly lower percentage of Black Bostonians responded ‘no’ to all the trauma statements—43% relative to 65%, 63%, and 51% of Latino, AAPI, and White residents, respectively—it is also that a significantly higher percentage of Black Bostonians responded ‘yes’ to between 3 and 6 statements—34% relative to 20% of the other racial/ethnic groups.”

“A growing body of research links aggressive policing to poor mental and physical health outcomes in communities targeted for such interventions,” according to the report. “In fact, in addition to mental health vulnerabilities like depression and PTSD-like symptoms, aggressive policing practices have been linked to high blood pressure, diabetes, and obesity/overweight.”

While the analysis “produced several statistically significant findings, they are not always in the direction we would predict, and the strongest associations are not necessarily for the groups we might expect.”

For example, “among Black Bostonians, self-reported high blood pressure is negatively associated with both racially motivated police harassment and distrust; a lower percentage of those who reported racially motivated police harassment and distrust also reported having high blood pressure. The opposite is true for AAPI residents, however; self-reported high blood pressure is positively associated not only with police harassment and racially motivated police harassment but also with trauma symptoms. Among Latino residents, distrust in the police is associated with high blood pressure as well.”

“As with prior research conducted in other cities, findings from this Boston-based study suggest that the social costs associated with police harassment are far greater than we have imagined, extending well beyond penal system outcomes and distrust in law enforcement to include trauma and chronic health conditions,” Smith said. “Thus, even while Boston should be celebrated for the low rates at which its residents die immediately after contact with law enforcement, we should acknowledge and address the extent to which the slow violence of police harassment and the trauma and chronic health conditions it produces diminishes both the quality and likely the length of Bostonians’ lives, especially so for Bostonians of color, and particularly for its Black residents.” 

Photo by Matthew J. Lee/The Boston Globe via Getty Images.



Am J Pharm Educ. v.72(1); 2008 Feb 15

Best Practices for Survey Research Reports: A Synopsis for Authors and Reviewers

Jolaine Reierson Draugalis

a The University of Oklahoma College of Pharmacy

Stephen Joel Coons

b The University of Arizona College of Pharmacy

Cecilia M. Plaza

c American Association of Colleges of Pharmacy

INTRODUCTION

As survey researchers, as well as reviewers, readers, and end users of the survey research literature, we are all too often disheartened by the poor quality of survey research reports published in the peer-reviewed literature. For the most part, poor quality can be attributed to 2 primary problems: (1) ineffective reporting of sufficiently rigorous survey research, or (2) poorly designed and/or executed survey research, regardless of the reporting quality. The standards for rigor in the design, conduct, and reporting of survey research in pharmacy should be no lower than the standards for the creation and dissemination of scientific evidence in any other discipline. This article provides a checklist and recommendations for authors and reviewers to use when submitting or evaluating manuscripts reporting survey research that used a questionnaire as the primary data collection tool.

To place elements of the checklist in context, a systematic review of the Journal was conducted for 2005 (volume 69) and 2006 (volume 70) to identify articles that reported the results of survey research. In 2005 (volume 69), 10 of the 39 total research articles published (26%) used survey research methods, and in 2006 (volume 70), 10 of 29 (35%) did so (not including personal or telephone interviews). As stated by Kerlinger and Lee, “Survey research studies large and small populations (or universes) by selecting and studying samples chosen from the population to discover the relative incidence, distribution, and interrelations of sociological and psychological variables.” 1 This is easier said than done; that is, if done in a methodologically sound way. Although survey research projects may use personal interviews, panels, or telephones to collect data, this paper will only consider mail, e-mail, and Internet-based data collection approaches. For clarity, the term survey should be reserved to describe the research method, whereas a questionnaire or survey instrument is the data collection tool. In other words, the terms survey and questionnaire should not be used interchangeably. As well, data collection instruments are used in many research designs, such as pretest/posttest and experimental designs, and use of the term survey is inappropriate to describe the instrument or the methodology in these cases. In 2005-2006 Journal volumes 69 and 70, 11/68 research articles (16%) used inappropriate terminology. Survey research can be very powerful and may well be the only way to conduct a particular inquiry or ongoing body of research.

There is no shortage of text and reference books, to name but a few of our favorites, Dillman's Mail and Internet Surveys: The Tailored Design Method , 2 Fowler's Survey Research Methods , 3 Salant and Dillman's How to Conduct Your Own Survey , 4 and Aday and Cornelius's Designing and Conducting Health Surveys – A Comprehensive Guide . 5 As well, numerous guidelines, position statements, and best practices are available from a wide variety of associations in the professional literature and via the Internet. We will cite a number of these throughout this paper. Unfortunately, it is apparent from both the published literature and the many requests to contribute data to survey research projects that these materials are not always consulted and applied. In fact, it seems quite evident that there is a false impression that conducting survey research is relatively easy. As an aside to his determination of the effectiveness of follow-up techniques in mail surveys, Stratton found, “the number of articles that fell short of a scholarly level of execution and reporting was surprising.” 6 In addition, Desselle more recently observed that, “Surveys are perhaps the most used, and sometimes misused, methodological tools among academic researchers.” 7

We will structure this paper based on a modified version of the 10 guiding questions established in the Best Practices for Survey and Public Opinion Research by the American Association for Public Opinion Research (AAPOR). 8 The 10 guiding questions are: (1) was there a clearly defined research question? (2) did the authors select samples that well represent the population to be studied? (3) did the authors use designs that balance costs with errors? (4) did the authors describe the research instrument? (5) was the instrument pretested? (6) were quality control measures described? (7) was the response rate sufficient to enable generalizing the results to the target population? (8) were the statistical, analytic, and reporting techniques appropriate to the data collected? (9) was evidence of ethical treatment of human subjects provided? and (10) were the authors transparent to ensure evaluation and replication? These questions can serve as a guide for reviewers and researchers alike for identifying features of quality survey research. A grid addressing the 10 questions and subcategories is provided in Appendix 1 for use in preparing and reviewing submissions to the Journal .
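Question 7 concerns the response rate. AAPOR's Standard Definitions specify several response-rate formulas based on final case dispositions; as one illustration, the minimum response rate (RR1) divides completed questionnaires by all potentially eligible sampled cases. A minimal sketch, assuming the researcher has already tallied case dispositions (the function and parameter names here are ours, not AAPOR's):

```python
def aapor_rr1(complete, partial, refusal, non_contact, other, unknown_eligibility):
    """Minimum response rate (RR1): completed questionnaires divided by
    all sampled cases that are eligible or of unknown eligibility."""
    eligible_or_unknown = (complete + partial + refusal + non_contact
                           + other + unknown_eligibility)
    return complete / eligible_or_unknown

# e.g., 500 completes out of 1,000 potentially eligible sampled cases
rate = aapor_rr1(complete=500, partial=50, refusal=200,
                 non_contact=150, other=25, unknown_eligibility=75)
```

Because RR1 counts every case of unknown eligibility in the denominator, it is the most conservative of AAPOR's rates; reports should state which formula was used.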

Clearly Defined Research Question

Formulating the research questions and study objectives depends on prior work and knowing what is already available either in archived literature, American Association of Colleges of Pharmacy (AACP) institutional research databases, or from various professional organizations and associations. 9 , 10 The article should clearly state why the research is necessary, placing it in context, and drawing upon previous work via a literature review. 9 This is especially pertinent to the measurement of psychological constructs, such as satisfaction (eg, satisfaction with pharmacy services). Too many researchers just put items down on a page that they think measure the construct (and answer the research question); however, they may miss the mark because they have not approached the research question and, subsequently, item selection or development from the perspective of a theoretical framework or existing model that informs the measurement of satisfaction. Another important consideration is whether alternatives to using survey research methods have been considered, in essence asking the question of whether the information could better be obtained using a different methodology. 8

Sampling Considerations

For a number of reasons (eg, time, cost), data are rarely obtained from every member of a population. A census, while appropriate in certain specific cases where responses from an entire population are needed to adequately answer the research question, is not generally required in order to obtain the desired data. In the majority of situations, sampling from the population under study will both answer the research question and save both time and money. Survey research routinely involves gathering data from a subset or sample of individuals intended to represent the population being studied. 11 Therefore, since researchers are relying on data from samples to reflect the characteristics and attributes of interest in the target population, the samples must be properly selected. 12 To enable the proper selection of a sample, the target population has to be clearly identified. The sample frame should closely approximate the full target population; any significant departure from that should be justified. Once the sample frame has been identified, the sample selection process needs to be delineated including the sampling method (eg, probability sampling techniques such as simple random or stratified). Although nonprobability sample selection approaches (eg, convenience, quota, or snowball sampling) are used in certain circumstances, probability sampling is preferred if the survey results are to be credibly generalized to the target population. 13

The required sample size depends on a variety of factors, including whether the purpose of the survey is to simply describe population characteristics or to test for differences in certain attributes of interest by subgroups within the population. Authors of survey research reports should describe the process they used to estimate the necessary sample size including the impact of potential nonresponse. An in-depth discussion of sample size determination is beyond the scope of this paper; readers are encouraged to refer to the excellent existing literature on this topic. 13 , 14
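As one common illustration of the estimation process described above, the sample size needed to estimate a population proportion within a chosen margin of error is often computed with Cochran's formula, optionally adjusted with a finite population correction and inflated for expected nonresponse. A sketch (function and parameter names are ours):

```python
import math

def required_sample_size(margin_of_error=0.05, z=1.96,
                         expected_proportion=0.5, population_size=None,
                         expected_response_rate=1.0):
    """Sample size to estimate a proportion (Cochran's formula)."""
    # Base sample size for an effectively infinite population
    n = (z ** 2) * expected_proportion * (1 - expected_proportion) / margin_of_error ** 2
    if population_size is not None:
        # Finite population correction for small target populations
        n = n / (1 + (n - 1) / population_size)
    # Inflate the number of invitations to offset expected nonresponse
    return math.ceil(n / expected_response_rate)

# 95% confidence, +/-5 points, conservative p = 0.5 -> 385 respondents
print(required_sample_size())
```

With a sampling frame of 1,000 and an anticipated 55% response rate, the same call suggests inviting roughly 505 people; when the goal is testing subgroup differences rather than describing the population, a power analysis is needed instead.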

Balance Between Costs and Errors

Balance between costs and errors deals with a realistic appraisal of the resources needed to carry out the study, including both monetary and human resources. Tradeoffs are necessary but involve more than just numbers of subjects; for example, researchers must weigh pursuing a large sample with insufficient follow-up against a smaller, more targeted representative sample with multiple follow-ups. Seemingly large sample sizes do not necessarily represent a probability sample. When conducting survey research, if follow-ups are not planned and budgeted for, the study should not be initiated. The effectiveness of incentives and approaches to follow-up is discussed in detail elsewhere, 2, 4, 5 but the importance of well-planned follow-up procedures cannot be overstated. In volumes 69 and 70 of the Journal, 11/20 (55%) survey research papers reported the use of at least 1 follow-up to the initial invitation to participate.

Description of the Survey Instrument

The survey instrument or questionnaire used in the research should be described fully. If an existing questionnaire was used, evidence of psychometric properties such as reliability and validity should be provided from the relevant literature. Evidence of reliability indicates that the questionnaire is measuring the variable or variables in a reproducible manner. Evidence supporting a questionnaire's validity indicates that it is measuring what is intended to be measured. 15 In addition, the questionnaire's measurement model (ie, scale structure and scoring system) should be described in sufficient detail to enable the reader to understand the meaning and interpretation of the resulting scores. When open-ended, or qualitative, questions are included in the questionnaire, a clear description must be provided as to how the resulting text data will be summarized and coded, analyzed, and reported.

If a new questionnaire was created, a full description of its development and testing should be provided. This should include discussion of the item generation and selection process, choice of response options/scales, construction of multi-item scales (if included), and initial testing of the questionnaire's psychometric properties. 15 As with an existing questionnaire, evidence supporting the validity and reliability of the new questionnaire should be clearly provided by authors. If researchers are using only selected items from scales in an existing questionnaire, justification for doing so should be provided and their measurement properties in their new context should be properly tested prior to use. In addition, proper attribution of the source of scale items should be provided in the study report. In volumes 69 and 70 of the Journal, 10/20 (50%) survey research papers provided no or insufficient information concerning the reliability and/or validity of the survey instrument used in the study.
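Internal-consistency reliability of a multi-item scale is most often summarized with Cronbach's coefficient alpha, one common form of the reliability evidence discussed above. A minimal sketch of the computation (the data are illustrative; the article itself prescribes no particular software):

```python
def cronbach_alpha(rows):
    """Cronbach's alpha from a list of per-respondent item-score lists."""
    k = len(rows[0])                      # number of items in the scale
    def var(xs):                          # sample variance, n - 1 denominator
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([r[i] for r in rows]) for i in range(k)]
    total_var = var([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four respondents answering a two-item scale (hypothetical scores)
alpha = cronbach_alpha([[1, 2], [2, 1], [3, 4], [4, 3]])
```

Values of roughly 0.70 or higher are conventionally taken as acceptable for group-level comparisons, though the appropriate threshold depends on the scale's purpose.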

Commonly measured phenomena in survey research include frequency, quantity, feelings, evaluations, satisfaction, and agreement. 16 Authors should provide sufficient detail for reviewers to be able to discern that the items and response options are congruent and appropriate for the variables being measured. For instance, a reviewer would question an item asking about the frequency of a symptom with the response options ranging from “excellent” to “poor.” In an extensive review article, Desselle provides an overview of the construction, implementation, and analysis of summated rating attitude scales. 7

Pretesting is often conducted with a focus group to identify ambiguous questions or wording, unclear instructions, or other problems with the instrument prior to widespread dissemination. Pretesting is critical because it provides valuable information about issues related to reliability and validity through identification of potential problems prior to data collection. In volumes 69 and 70 in the Journal , only 8/20 survey research papers (40%) reported that pretesting of the survey instrument was conducted. Authors should clearly describe how a survey instrument was pretested. While pretesting is often conducted with a focus group of peers or others similar to subjects, cognitive interviewing is becoming increasingly important in the development and testing of questionnaires to explore the way in which members of the target population understand, mentally process, and respond to the items on a questionnaire. 17 , 18 Cognitive testing, for example, consists of the use of both verbal probing by the interviewer (eg, “What does the response ‘some of the time’ mean to you?”) and think aloud, in which the interviewer asks the respondent to verbalize whatever comes to mind as he or she answers the question. 16 This technique helps determine whether respondents are interpreting the questions and the response sets as intended by the questionnaire developers. If done with a sufficient number of subjects, the cognitive interviewing process also provides the opportunity to fulfill some of the roles of a pilot test in which length, flow, ease of administration, ease of response, and acceptability to respondents can be assessed. 19

Quality Control Measures

The article should describe in the methods section whether procedures such as omit or skip patterns (procedures that direct respondents to answer only those items relevant to them) were used on the survey instrument. The article should also describe whether a code book was used for data entry and organization and what data verification procedures were used, for example, spot-checking a random 10% of data entries against the original survey instruments. Outliers should be verified, and the procedure for handling missing data should be explained.
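The spot-checking procedure described above can be made reproducible by drawing the verification subset with a fixed random seed, so that the audited records can be re-identified later. A minimal sketch; the record count, seed, and 10% fraction are illustrative:

```python
# Illustrative sketch only: drawing a reproducible random 10% verification
# sample of data-entry records, as in the spot-checking procedure above.
import random

def verification_sample(record_ids, fraction=0.10, seed=None):
    """Randomly select record IDs whose data entry will be re-checked
    against the original survey instruments."""
    rng = random.Random(seed)                     # fixed seed -> reproducible audit trail
    k = max(1, round(len(record_ids) * fraction))
    return sorted(rng.sample(record_ids, k))

all_records = list(range(1, 201))                 # eg, 200 completed questionnaires
to_verify = verification_sample(all_records, seed=42)
print(len(to_verify))                             # 20 (10% of 200)
```

Recording the seed and the selected IDs in the study's code book documents exactly which entries were verified.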

Response Rates

In general, response rate can be defined as the number of respondents divided by the number of eligible subjects in the sample. A review of survey response rates reported in the professional literature found that over a quarter of the articles audited failed to define response rate. 20 As stated by Johnson and Owens, “when a ‘response rate’ is given with no definition, it can mean anything, particularly in the absence of any additional information regarding sample disposition.” 20 Hence, of equal importance to the response rate itself is transparency in its reporting. As with the CONSORT guidelines for randomized controlled trials, the flow of study subjects from initial sample selection and contact through study completion and analysis should be provided. 21 Drop-out or exclusion for any reason should be documented and every individual in the study sample should be accounted for clearly. In addition, there may be a need to distinguish between the overall response rate and item-level response rates. Very low response rates for individual items on a questionnaire can be problematic, particularly if they represent important study variables.
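Under the simple definition above, the calculation itself is trivial; what matters is reporting the sample disposition behind the denominator. A sketch with all counts hypothetical:

```python
# Illustrative sketch only: response rate as respondents / eligible subjects,
# with the sample disposition made explicit. Counts are hypothetical.

def response_rate(completed, sampled, ineligible):
    """'ineligible' covers contacts removed from the denominator
    (eg, undeliverable e-mail addresses, subjects outside the population)."""
    eligible = sampled - ineligible
    return completed / eligible

# Hypothetical disposition: 500 sampled, 40 found ineligible, 276 completed
rate = response_rate(completed=276, sampled=500, ineligible=40)
print(f"{rate:.1%}")  # 60.0%
```

Reporting the three counts alongside the percentage lets readers recompute the rate under an alternative definition, which is precisely the transparency called for above.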

Fowler states that there is no agreed-upon standard for acceptable response rates; however, he indicates that some federal funding agencies ask that survey procedures be used that are likely to result in a response rate of over 75%. 3 Bailey also asserted that the minimal acceptable response rate was 75%. 22 Schutt indicated that below 60% was unacceptable, but Babbie stated that a 50% response rate was adequate. 23 , 24 As noted in the Canadian Medical Association Journal's editorial policy, “Except for in unusual circumstances, surveys are not considered for publication in CMAJ if the response rate is less than 60% of eligible participants.” 10 Fowler states that, “…one occasionally will see reports of mail surveys in which 5% to 20% of the selected sample responded. In such instances, the final sample has little relationship to the original sampling process; those responding are essentially self-selected. It is very unlikely that such procedures will provide any credible statistics about the characteristics of the population as a whole.” 3 Although the literature does not reflect agreement on a minimum acceptable response rate, there is general consensus that at least half of the sample should have completed the survey instrument. In volumes 69 and 70 in the Journal , 7/20 survey research papers (35%) had response rates less than 30%, 6/20 (30%) had response rates between 31% and 60%, and 7/20 (35%) had response rates of 61% or greater. Of the 13 survey research articles in those volumes with less than a 60% response rate, 8/13 (61.5%) mentioned the possibility of response bias.

The lower the response rate, the higher the likelihood of response bias or nonresponse error. 4 , 25 “Nonresponse error occurs when a significant number of subjects in the sample do not respond to the survey and when they differ from respondents in a way that influences, or could influence, the results.” 26 Response bias stems from the survey respondents being somehow different from the nonrespondents and, therefore, not representative of the target population. The article should address both follow-up procedures (timing, method, and quantity) and response rate. While large sample sizes are often deemed desirable, they must be tempered by the consideration that low response rates are more damaging to the credibility of results than a small sample. 12 Most of the time, response bias is very hard to rule out due to lack of sufficient information regarding the nonrespondents. Therefore, it is imperative that researchers design their survey method to optimize response rates. 2 , 27 To be credible, published survey research must meet acceptable levels of scientific rigor, particularly in regard to response rate transparency and the representativeness or generalizability of the study's results.

Statistical, Analytic, and Reporting Techniques

As noted in the Journal's Instructions to Reviewers, there should be a determination of whether the appropriate statistical techniques were used. The article should indicate what statistical package was used and what statistical technique was applied to what variables. Decisions must be made as to how data will be presented, for example, using a pie chart to provide simple summaries of data but not to present linear or relational data. 28 The authors should provide sufficient detail to allow reviewers to match up hypothesis testing and relevant statistical analyses. In addition, if the questionnaire included qualitative components (eg, open-ended questions), a thorough description should be provided as to how and by whom the textual responses were coded for analysis.

Human Subjects Considerations

Even though most journals now require authors to indicate Institutional Review Board (IRB) compliance, there are still many examples of requests to participate, particularly in web-based or e-mail data collection modes, that have obviously not been subjected to IRB scrutiny. Evidence that this is the case includes insufficient verbiage (eg, estimates of time to complete, perceived risks and benefits) in the invitation to participate, “mandatory” items (thereby violating subjects' right to refuse to answer any or all items), and use of listservs for “quick and dirty” data gathering when the ultimate intent is to disseminate the findings. The authors should explicitly list which IRB they received approval from, the IRB designation received (eg, exempt, expedited), and how consent was obtained.

Transparency

The authors should fully specify their methods and report in sufficient detail such that another researcher could replicate the study. This consideration permeates the previous 9 sections. For example, an offer to provide the instrument upon request does not substitute for the provision of reliability and validity evidence in the article itself. Another example related to transparency of methods would be the description of the mode of administration. In volume 69 of the Journal , 3/10 (30%) survey research articles used mixed survey methods (both Internet and first-class mail) but did not provide sufficient detail as to what was collected by each respective method. Also, in volume 69 of the Journal , 1 survey research article simply used the word “sent” without providing any information as to how the instrument was delivered.

Additional Considerations Regarding Internet or Web-based Surveys

The use of Internet- or e-mail-based surveys has grown in popularity as a proposed less expensive and more efficient method of conducting survey research. 2 , 29 - 32 The supposed ease of data collection can give the impression that survey research is easily conducted; however, the good principles established for traditional mail surveys still apply. Authors and reviewers must be aware that the mode of administration does not change any of the design work that must precede it. One potential problem associated with web-based surveys is that they can be forwarded to inappropriate or unintended subjects. 31 Web-based surveys also suffer from potential problems with undeliverable e-mails due to outdated listservs or incorrect e-mail addresses, thus affecting the calculation of the response rate and the determination of the most appropriate denominator. 2 , 30 - 31 The authors should describe specifically how the survey instrument was disseminated (eg, e-mail with a link to the survey) and what web-based survey tool was used.

We have provided 10 guiding questions and recommendations regarding what we consider to be best practices for survey research reports. Although our recommendations are not minimal standards for manuscripts submitted to the Journal , we hope that they provide guidance that will result in an enhancement of the quality of published reports of questionnaire-based survey research. It is important for both researchers/authors and reviewers to seriously consider the rigor that needs to be applied in the design, conduct, and reporting of survey research so that the reported findings credibly reflect the target population and are a true contribution to the scientific literature.

ACKNOWLEDGEMENT

The ideas expressed in this manuscript are those of the authors and do not represent the position of the American Association of Colleges of Pharmacy.

Appendix 1. Criteria for Survey Research Reports

  • Are the study objectives clearly identified?
  • —AACP databases
  • —Readily available literature
  • —Other professional organizations
  • What sampling approaches were used?
  • Did the authors provide a description of how coverage and sampling error were minimized?
  • Did the authors describe the process to estimate the necessary sample size?
  • Did the authors use designs that balance costs with errors? (eg, strive for a census with inadequate follow-up versus smaller sample but aggressive follow-up)
  • Was evidence provided regarding the reliability and validity of an existing instrument?
  • How was a new instrument developed and assessed for reliability and validity?
  • Was the scoring scheme for the instrument sufficiently described?
  • Was the procedure used to pretest the instrument described?
  • Was a code book used?
  • Did the authors discuss what techniques were used for verifying data entry?
  • What was the response rate?
  • How was response rate calculated?
  • Were follow-ups planned for and used?
  • Do authors address potential nonresponse bias?
  • Were the statistical, analytic, and reporting techniques appropriate to the data collected?
  • Did the authors list which IRB they received approval from?
  • Did the authors explain how consent was obtained?
  • Was evidence for validity provided?
  • Was evidence of reliability provided?
  • Were results generalizable?
  • Is replication possible given information provided?

How Americans View National, Local and Personal Energy Choices

Most Americans want more renewable energy, but support has dipped. Interest in electric vehicles has also declined

Pew Research Center conducted this study to understand Americans’ views of energy issues. For this analysis, we surveyed 8,638 U.S. adults from May 13 to 19, 2024.

Everyone who took part in the survey is a member of the Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way, nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the ATP’s methodology .
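The weighting step described above can be illustrated with a one-variable post-stratification sketch: each respondent in a group receives the ratio of the group's population share to its sample share. (The actual ATP weighting adjusts on many variables jointly; the groups, counts, and shares below are hypothetical.)

```python
# Illustrative sketch only: one-variable post-stratification weighting.
# Real panel weighting (eg, the ATP's) rakes on many variables at once.

def poststrat_weights(sample_counts, population_shares):
    """Weight for each group = population share / sample share."""
    n = sum(sample_counts.values())
    return {g: population_shares[g] / (sample_counts[g] / n) for g in sample_counts}

# Hypothetical age groups: respondents per group vs population shares
sample = {"18-29": 150, "30-49": 350, "50+": 500}
population = {"18-29": 0.21, "30-49": 0.33, "50+": 0.46}
weights = poststrat_weights(sample, population)
# Underrepresented groups receive weights above 1 (eg, 18-29 -> 0.21/0.15 = 1.4)
```

After weighting, the weighted group shares match the population shares, which is what "representative of the U.S. adult population" means operationally.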

Here are the questions used for this report, along with responses, and its Methodology.

The planet’s continued streak of record heat has spurred calls for action by scientists and global leaders. Meanwhile, in the United States, energy development policy is being hotly debated on the national and local levels this election year. How do Americans feel about U.S. energy policy options, and what steps are they willing to take in their own lives to reduce carbon emissions? A new Pew Research Center survey takes a look.

Among the major findings:

Chart shows Support for expanding wind, solar power in the U.S. has fallen since 2020

There’s been a decline in the breadth of support for wind and solar power. The shares who favor expanding solar and wind power farms are down 12 percentage points and 11 points, respectively, since 2020, driven by sharp drops in support among Republicans.

Interest in buying an electric vehicle (EV) is lower than a year ago. Today, 29% of Americans say they would consider an EV for their next purchase, down from 38% in 2023.

Still, a majority of Americans (63%) support the goal of the U.S. taking steps to become carbon neutral by 2050. When asked which is the greater priority, far more Americans continue to say the country should focus on developing renewable energy than fossil fuel sources (65% vs. 34%).

The survey, conducted May 13-19 among 8,638 U.S. adults, finds a fairly modest share of U.S. adults (25%) say it’s extremely or very important to them personally to limit their own “carbon footprint.” Far more give this middling or low priority.

These findings illustrate how large shares of Americans back more renewable energy that would decrease overall carbon emissions. Still, this general orientation does not necessarily translate into strong commitment to reducing personal carbon emissions or interest in buying an EV.


What’s behind declines in support for wind and solar?

Declines in public support for renewable energy have been driven by Republicans and Republican-leaning independents, whose support started to fall sharply after President Joe Biden took office in early 2021.

  • 64% of Republicans say they favor more solar panel farms, down from 84% in 2020.
  • 56% of Republicans say they favor more wind turbine farms, a 19-point drop from 2020.

Chart shows Growing partisan divide in support for expanding wind, solar power in the U.S.

Over this same time period, views among Democrats and Democratic leaners on these measures are little changed, with large majorities continuing to support more wind and solar development.

In some cases, gaps between Republicans and Democrats over energy policy now approach the very wide partisan divides seen over the importance of climate change.

In May 2020, Democrats were 26 points more likely than Republicans to say the country’s priority should be developing renewable energy (91% vs. 65%). Four years later, that gap has ballooned to 49 points, due almost entirely to changing views among Republicans – 61% of whom now say developing fossil fuels like oil, coal and natural gas should be the more important priority.


But changes in attitudes about policies that would reduce carbon emissions are not solely the result of more negative views among Republicans. For instance, the share of Democrats who say they are very or somewhat likely to consider an EV for their next car purchase has declined from 56% to 45% in the last year. And the share of Democrats who call climate change a very big problem for the U.S. has declined from 71% in 2021 to 58% today.

Views within each party

Chart shows Young Republicans give priority to developing renewable energy over fossil fuels in the U.S.

Among Republicans, age matters. Younger Republicans express much more support for renewable energy than do older Republicans. For instance, 67% of Republicans ages 18 to 29 say the country should give priority to wind, solar and hydrogen development. The oldest Republicans (ages 65 and older) take the opposite view: 76% give priority to developing oil, coal and natural gas.

By and large, Democrats are more united in their views on energy. Democrats across age groups broadly support steps that would lower carbon emissions and prioritize renewable sources. But differences emerge over how completely to break from fossil fuels: 45% of Democrats say the country should phase out the use of oil, coal and natural gas completely, compared with 53% who say that fossil fuels should remain part of the mix along with renewable sources.

Differences within the two major parties are explored in more detail here.

Views on increasing electric vehicles in the U.S.

Chart shows 58% of Americans oppose rules aimed at dramatically increasing electric vehicle sales in the U.S.

Amid a major policy push at the federal level for electric vehicles, Americans are unenthusiastic about steps that would phase out gas-powered vehicles.

In March of this year, the Biden administration announced a rule aimed at dramatically expanding EV sales . Overall, 58% of Americans say they oppose these rules that would make EVs at least half of all new cars and trucks sold in the U.S. by 2032. Republicans overwhelmingly oppose this policy (83%). Among Democrats, 64% support these rules to expand EV sales, while 35% say they oppose them.

Chart shows Declining share of Americans say they are likely to consider buying an electric vehicle

Americans bought EVs in record numbers last year, but the growth rate is slowing, and interest in EVs has declined. In the current survey, 29% of Americans say they are very or somewhat likely to consider an electric vehicle the next time they purchase a car. Last year, 38% expressed this level of interest in an EV purchase.


Americans’ views on limiting their own ‘carbon footprint’

Discussions about reducing carbon emissions often include the everyday actions people can take to reduce the amount of energy they use. One-in-four Americans say it is extremely or very important to them personally to limit their own “carbon footprint.” Larger shares say this is either somewhat (42%) or not too or not at all (32%) important to them.

Chart shows 1 in 4 Americans say limiting their ‘carbon footprint’ is extremely or very important to them

Even among Democrats – who express broad support for renewable energy – only 39% say reducing their own carbon footprint is extremely or very important to them personally.

These findings align with a previous Center survey that shows a modest share of Americans (23%) expect to make major sacrifices in their own life because of climate change.

Simply put, the shares of Americans who place the highest priority on limiting their own carbon emissions or expect to make big changes to the way they live because of climate change remain relatively small.

Those who place a high priority on reducing their own carbon footprint – or expect major direct impacts from climate change – are far more likely than other Americans to back aggressive steps to reduce carbon emissions.

For instance, 70% of those who place high importance on reducing their own carbon footprint support rules to dramatically boost EV sales in the U.S. by 2032. Much smaller shares of those who say reducing their carbon footprint is somewhat (43%) or not too or not at all (14%) important support this policy.


NTRS - NASA Technical Reports Server

Available downloads, related records.

IMAGES

  1. Survey Research: Definition, Examples & Methods

    survey research report meaning

  2. Types of Research Report

    survey research report meaning

  3. Surveys: What They Are, Characteristics & Examples

    survey research report meaning

  4. Survey research

    survey research report meaning

  5. Survey Research

    survey research report meaning

  6. A Comprehensive Guide to Survey Research Methodologies

    survey research report meaning

VIDEO

  1. Survey Meaning in Hindi

  2. Research Report

  3. Nursing Research Report| Nursing Research| B.Sc. 3rd Year

  4. Meaning of Questionnaire🤔| Statistics|#commerce #statistics #class12 #bcom #education #shorts

  5. Applied Survey Research Survey Fundamentals and Terminology

  6. Report Meaning

COMMENTS

  1. Survey Research

    Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps: Determine who will participate in the survey. Decide the type of survey (mail, online, or in-person) Design the survey questions and layout.

  2. Survey Research: Definition, Examples and Methods

    Survey Research Definition. Survey Research is defined as the process of conducting research using surveys that researchers send to survey respondents. The data collected from surveys is then statistically analyzed to draw meaningful research conclusions. ... It will help you to analyze, predict decisions, and help write the summary report ...

  3. Understanding and Evaluating Survey Research

    Survey research is defined as "the collection of information from a sample of individuals through their responses to questions" ( Check & Schutt, 2012, p. 160 ). This type of research allows for a variety of methods to recruit participants, collect data, and utilize various methods of instrumentation. Survey research can use quantitative ...

  4. Survey Research: Definition, Examples & Methods

    Survey research is the process of collecting data from a predefined group (e.g. customers or potential customers) with the ultimate goal of uncovering insights about your products, services, or brand overall.. As a quantitative data collection method, survey research can provide you with a goldmine of information that can inform crucial business and product decisions.

  5. Survey Research

    Survey Research. Definition: Survey Research is a quantitative research method that involves collecting standardized data from a sample of individuals or groups through the use of structured questionnaires or interviews. The data collected is then analyzed statistically to identify patterns and relationships between variables, and to draw conclusions about the population being studied.

  6. How to Make a Survey Report: Complete Guide

    Present the key findings of the survey in a clear and organized manner. Use charts, graphs, and tables to visualize the data effectively. Ensure that the findings are presented in a logical sequence, making it easy for readers to follow the narrative. The survey findings section is the heart of the report, where the raw data collected during ...

  7. Survey Research: Definition, Types & Methods

    Descriptive research is the most common and conclusive form of survey research due to its quantitative nature. Unlike exploratory research methods, descriptive research utilizes pre-planned, structured surveys with closed-ended questions. It's also deductive, meaning that the survey structure and questions are determined beforehand based on existing theories or areas of inquiry.

  8. Doing Survey Research

    Determine who will participate in the survey. Decide the type of survey (mail, online, or in-person) Design the survey questions and layout. Distribute the survey. Analyse the responses. Write up the results. Surveys are a flexible method of data collection that can be used in many different types of research.

  9. 9.1 Overview of Survey Research

    What Is Survey Research? Survey research is a quantitative approach that has two important characteristics. First, the variables of interest are measured using self-reports. In essence, survey researchers ask their participants (who are often called respondents in survey research) to report directly on their own thoughts, feelings, and behaviors. Second, considerable attention is paid to the ...

  10. Survey Research: Definition, Methods, Examples, and More

    What is Survey Research? Survey research, as a key research method of marketing research, is defined as the systematic collection and analysis of data gathered from respondent feedback through questionnaires or interviews. This primary research method is designed to gather information about individuals' opinions, behaviors, or characteristics ...

  11. A quick guide to survey research

    Survey research is a unique way of gathering information from a large cohort. Advantages of surveys include having a large population and therefore a greater statistical power, the ability to gather large amounts of information and having the availability of validated models. However, surveys are costly, there is sometimes discrepancy in recall ...

  12. Survey Report: What is it & How to Create it?

    (Definition) A survey report is a document that demonstrates all the important information about the survey in an objective, clear, precise, and fact-based manner. ... Market Research Survey: This type of survey is performed to find out the preferences and demographics of the target audience. It also helps with competitor research.

  13. Approaching Survey Research

    What Is Survey Research? Survey research is a quantitative and qualitative method with two important characteristics. First, the variables of interest are measured using self-reports (using questionnaires or interviews). In essence, survey researchers ask their participants (who are often called respondents in survey research) to report directly on their own thoughts, feelings, and behaviors.

  14. (PDF) Understanding and Evaluating Survey Research

    Survey research is defined as. "the collection of information from. a sample of individuals through their. responses to questions" (Check &. Schutt, 2012, p. 160). This type of r e -. search ...

  15. Designing, Conducting, and Reporting Survey Studies: A Primer for

    Burns et al., 2008 12. A guide for the design and conduct of self-administered surveys of clinicians. This guide includes statements on designing, conducting, and reporting web- and non-web-based surveys of clinicians' knowledge, attitude, and practice. The statements are based on a literature review, but not the Delphi method.

  16. Survey data analysis and best practices for reporting

    1. Make it visual. You can present data in a visual form, such as a chart or graph, or put it into a tabular form so it's easy for people to see the relationships between variables in your crosstab analysis. Choose a graphic format that best suits your data type and clearly shows the results to the untrained eye.

  17. Survey Report: Definition, Templates, and Best Pratices

    Definition, Templates, and Best Practices of Survey Report. The survey report is a document whose purpose is to convey the information acquired during the survey in full and objectively. The report includes all of the results that were gathered. The following are included in the full survey report:

  18. Research Reports: Definition and How to Write Them

    Research reports are recorded data prepared by researchers or statisticians after analyzing the information gathered by conducting organized research, typically in the form of surveys or qualitative methods. A research report is a reliable source to recount details about a conducted research. It is most often considered to be a true testimony ...

  19. Survey Glossary, Dictionary of Survey Research Terms

    Second, random sample within each stratum. Survey: The process of conducting research using survey methodology, though in common parlance "survey" is used to mean the survey instrument or questionnaire. Survey Administration: The process of inviting people to take the survey and collect data from them.
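The glossary entry above describes stratified sampling: partition the population into strata, then draw a random sample within each stratum. A minimal Python sketch of that procedure (the function, data, and parameter names are hypothetical illustrations, not from any of the cited sources):

```python
import random
from collections import defaultdict

def stratified_sample(population, stratum_of, fraction, seed=0):
    """Draw a simple random sample of `fraction` of each stratum.

    `population` is a list of units; `stratum_of` maps a unit to its stratum.
    """
    rng = random.Random(seed)
    # First, group units into strata.
    strata = defaultdict(list)
    for unit in population:
        strata[stratum_of(unit)].append(unit)
    # Second, random sample within each stratum.
    sample = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))
        sample.extend(rng.sample(members, k))
    return sample

# Example: 100 respondents stratified by region, sampling 10% from each.
people = [{"id": i, "region": "north" if i < 60 else "south"} for i in range(100)]
sample = stratified_sample(people, lambda p: p["region"], 0.10)
```

Sampling within each stratum guarantees that every stratum is represented in proportion to its size, which a plain simple random sample does not.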

  20. PDF Fundamentals of Survey Research Methodology

    The survey is then constructed to test this model against observations of the phenomena. In contrast to survey research, a "survey" is simply a data collection tool for carrying out survey research. Pinsonneault and Kraemer (1993) defined a survey as a "means for gathering information about the characteristics, actions, or opinions of a ...

  21. PDF Presenting survey results

    Writing for reports. The purpose of report writing is to communicate the findings of the research. The report should tell the whole story — what the objectives of the research were, how the data were collected, what the data say and what the implications of the findings are. Every individual has their own style of writing.

  22. Research Report

    Research Report. Definition: Research Report is a written document that presents the results of a research project or study, including the research question, methodology, results, and conclusions, in a clear and objective manner. ... The study utilized a quantitative research design, which involved a survey questionnaire administered to a ...

  23. Realtor.com Survey Finds 33% of 'Sandwich Generation' Say their

    The 'Sandwich Generation' spans age ranges, with Millennials making up 36% of the group, Gen Z 30%, Baby Boomers 17%, and Gen X 16%.

  24. Political Typology Quiz

    ABOUT PEW RESEARCH CENTER Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions.

  25. Methodology

    AAPOR Task Force on Address-based Sampling. 2016. "AAPOR Report: Address-based Sampling." Postcard notifications are sent to 1) panelists who have been provided with a tablet to take ATP surveys, 2) panelists who were recruited within the last two years, and 3) panelists recruited prior to the last two years who opt to continue receiving postcard ...

  26. Three years after police reforms, Black Bostonians report harassment

    A survey of Bostonians found wide disparities in the ways different racial groups experience their relationship with law enforcement, and negative interactions are also associated with trauma and chronic health conditions. ... Not only did the research find large racial disparities in reports of police harassment and in trust in law enforcement ...

  27. Best Practices for Survey Research Reports: A Synopsis for Authors and

    This article provides a checklist and recommendations for authors and reviewers to use when submitting or evaluating manuscripts reporting survey research that used a questionnaire as the primary data collection tool. To place elements of the checklist in context, a systematic review of the Journal was conducted for 2005 (volume 69) and 2006 ...

  29. How Americans View Energy Policies and Personal Choices

    Pew Research Center conducted this study to understand Americans' views of energy issues. For this analysis, we surveyed 8,638 U.S. adults from May 13 to 19, 2024. Everyone who took part in the survey is a member of the Center's American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of ...
