8.5 Writing Process: Creating an Analytical Report

Learning Outcomes

By the end of this section, you will be able to:

  • Identify the elements of the rhetorical situation for your report.
  • Find and focus a topic to write about.
  • Gather and analyze information from appropriate sources.
  • Distinguish among different kinds of evidence.
  • Draft a thesis and create an organizational plan.
  • Compose a report that develops ideas and integrates evidence from sources.
  • Give and act on productive feedback to works in progress.

You might think that writing comes easily to experienced writers—that they draft stories and college papers all at once, sitting down at the computer and having sentences flow from their fingers like water from a faucet. In reality, most writers engage in a recursive process, pushing forward, stepping back, and repeating steps multiple times as their ideas develop and change. In broad strokes, the steps most writers go through are these:

  • Planning and Organization. You will have an easier time drafting if you devote time at the beginning to consider the rhetorical situation for your report, understand your assignment, gather ideas and information, draft a thesis statement, and create an organizational plan.
  • Drafting. When you have an idea of what you want to say and the order in which you want to say it, you’re ready to draft. As much as possible, keep going until you have a complete first draft of your report, resisting the urge to go back and rewrite. Save rewriting for after you have a complete draft.
  • Reviewing. Now is the time to get feedback from others, whether from your instructor, your classmates, a tutor in the writing center, your roommate, someone in your family, or someone else you trust to read your writing critically and give you honest feedback.
  • Revising. With feedback on your draft, you are ready to revise. You may need to return to an earlier step and make large-scale revisions that involve planning, organizing, and rewriting, or you may need to work mostly on ensuring that your sentences are clear and correct.

Considering the Rhetorical Situation

Like other kinds of writing projects, a report starts with assessing the rhetorical situation—the circumstance in which a writer communicates with an audience of readers about a subject. As the writer of a report, you make choices based on the purpose of your writing, the audience who will read it, the genre of the report, and the expectations of the community and culture in which you are working. A graphic organizer like Table 8.1 can help you begin.

Summary of Assignment

Write an analytical report on a topic that interests you and that you want to know more about. The topic can be contemporary or historical, but it must be one that you can analyze and support with evidence from sources.

The following questions can help you think about a topic suitable for analysis:

  • Why or how did ________ happen?
  • What are the results or effects of ________?
  • Is ________ a problem? If so, why?
  • What are examples of ________ or reasons for ________?
  • How does ________ compare to or contrast with other issues, concerns, or things?

Consult and cite three to five reliable sources. The sources do not have to be scholarly for this assignment, but they must be credible, trustworthy, and unbiased. Possible sources include academic journals, newspapers, magazines, reputable websites, government publications or agency websites, and visual sources such as TED Talks. You may also use the results of an experiment or survey, and you may want to conduct interviews.

Consider whether visuals and media will enhance your report. Can you present data you collect visually? Would a map, photograph, chart, or other graphic provide interesting and relevant support? Would video or audio allow you to present evidence that you would otherwise need to describe in words?

Another Lens. To gain another analytic view on the topic of your report, consider different people affected by it. Say, for example, that you have decided to report on recent high school graduates and the effect of the COVID-19 pandemic on the final months of their senior year. If you are a recent high school graduate, you might naturally gravitate toward writing about yourself and your peers. But you might also consider the adults in the lives of recent high school graduates—for example, teachers, parents, or grandparents—and how they view the same period. Or you might consider the same topic from the perspective of a college admissions department looking at their incoming freshman class.

Quick Launch: Finding and Focusing a Topic

Coming up with a topic for a report can be daunting because you can report on nearly anything. The topic can easily get too broad, trapping you in the realm of generalizations. The trick is to find a topic that interests you and focus on an angle you can analyze in order to say something significant about it. You can use a graphic organizer to generate ideas, or you can use a concept map similar to the one featured in Writing Process: Thinking Critically About a “Text.”

Asking the Journalist’s Questions

One way to generate ideas about a topic is to ask the five W (and one H) questions, also called the journalist’s questions: Who? What? When? Where? Why? How? Try answering the following questions to explore a topic:

Who was or is involved in ________?

What happened/is happening with ________? What were/are the results of ________?

When did ________ happen? Is ________ happening now?

Where did ________ happen, or where is ________ happening?

Why did ________ happen, or why is ________ happening now?

How did ________ happen?

For example, imagine that you have decided to write your analytical report on the effect of the COVID-19 shutdown on high school students by interviewing students on your college campus. Your questions and answers might look something like those in Table 8.2:

Asking Focused Questions

Another way to find a topic is to ask focused questions about it. For example, you might ask the following questions about the effect of the 2020 pandemic shutdown on recent high school graduates:

  • How did the shutdown change students’ feelings about their senior year?
  • How did the shutdown affect their decisions about post-graduation plans, such as work or going to college?
  • How did the shutdown affect their academic performance in high school or in college?
  • How did/do they feel about continuing their education?
  • How did the shutdown affect their social relationships?

Any of these questions might be developed into a thesis for an analytical report. Table 8.3 shows more examples of broad topics and focusing questions.

Gathering Information

Because they are based on information and evidence, most analytical reports require you to do at least some research. Depending on your assignment, you may be able to find reliable information online, or you may need to do primary research by conducting an experiment, a survey, or interviews. For example, if you live among students in their late teens and early twenties, consider what they can tell you about their lives that you might be able to analyze. Returning to or graduating from high school, starting college, or returning to college in the midst of a global pandemic has provided them, for better or worse, with educational and social experiences that are shared widely by people their age and very different from the experiences older adults had at the same age.

Some report assignments will require you to do formal research, an activity that involves finding sources and evaluating them for reliability, reading them carefully, taking notes, and citing all words you quote and ideas you borrow. See Research Process: Accessing and Recording Information and Annotated Bibliography: Gathering, Evaluating, and Documenting Sources for detailed instruction on conducting research.

Whether you conduct in-depth research or not, keep track of the ideas that come to you and the information you learn. You can write or dictate notes using an app on your phone or computer, or you can jot notes in a journal if you prefer pen and paper. Then, when you are ready to begin organizing your report, you will have a record of your thoughts and information. Always track the sources of information you gather, whether from printed or digital material or from a person you interviewed, so that you can return to the sources if you need more information. And always credit the sources in your report.

Kinds of Evidence

Depending on your assignment and the topic of your report, certain kinds of evidence may be more effective than others. Other kinds of evidence may even be required. As a general rule, choose evidence that is rooted in verifiable facts and experience. In addition, select the evidence that best supports the topic and your approach to the topic, be sure the evidence meets your instructor’s requirements, and cite any evidence you use that comes from a source. The following list contains different kinds of frequently used evidence and an example of each.

Definition: An explanation of a key word, idea, or concept.

The U.S. Census Bureau refers to a “young adult” as a person between 18 and 34 years old.

Example: An illustration of an idea or concept.

The college experience in the fall of 2020 was starkly different from that of previous years. Students who lived in residence halls were assigned to small pods. On-campus dining services were limited. Classes were small and physically distanced or conducted online. Parties were banned.

Expert opinion: A statement by a professional in the field whose opinion is respected.

According to Louise Aronson, MD, geriatrician and author of Elderhood, people over the age of 65 are the happiest of any age group, reporting “less stress, depression, worry, and anger, and more enjoyment, happiness, and satisfaction” (255).

Fact: Information that can be proven correct or accurate.

According to data collected by the NCAA, the academic success of Division I college athletes between 2015 and 2019 was consistently high (Hosick).

Interview: An in-person, phone, or remote conversation that involves an interviewer posing questions to another person or people.

During our interview, I asked Betty about living without a cell phone during the pandemic. She said that before the pandemic, she hadn’t needed a cell phone in her daily activities, but she soon realized that she, and people like her, were increasingly at a disadvantage.

Quotation: The exact words of an author or a speaker.

In response to whether she thought she needed a cell phone, Betty said, “I got along just fine without a cell phone when I could go everywhere in person. The shift to needing a phone came suddenly, and I don’t have extra money in my budget to get one.”

Statistics: A numerical fact or item of data.

The Pew Research Center reported that approximately 25 percent of Hispanic Americans and 17 percent of Black Americans relied on smartphones for online access, compared with 12 percent of White people.

Survey: A structured interview in which respondents (the people who answer the survey questions) are all asked the same questions, either in person or through print or electronic means, and their answers are tabulated and interpreted. Surveys discover attitudes, beliefs, or habits of the general public or segments of the population.

A survey of 3,000 mobile phone users in October 2020 showed that 54 percent of respondents used their phones for messaging, while 40 percent used their phones for calls (Steele).

Visuals: Graphs, figures, tables, photographs and other images, diagrams, charts, maps, videos, and audio recordings, among others.

Thesis and Organization

Drafting a Thesis

When you have a grasp of your topic, move on to the next phase: drafting a thesis. The thesis is the central idea that you will explore and support in your report; all paragraphs in your report should relate to it. In an essay-style analytical report, you will likely express this main idea in a thesis statement of one or two sentences toward the end of the introduction.

For example, if you found that the academic performance of student athletes was higher than that of non-athletes, you might write the following thesis statement:

student sample text Although a common stereotype is that college athletes barely pass their classes, an analysis of athletes’ academic performance indicates that athletes drop fewer classes, earn higher grades, and are more likely to be on track to graduate in four years when compared with their non-athlete peers. end student sample text

The thesis statement often previews the organization of your writing. For example, in his report on the U.S. response to the COVID-19 pandemic in 2020, Trevor Garcia wrote the following thesis statement, which detailed the central idea of his report:

student sample text An examination of the U.S. response shows that a reduction of experts in key positions and programs, inaction that led to equipment shortages, and inconsistent policies were three major causes of the spread of the virus and the resulting deaths. end student sample text

After you draft a thesis statement, ask these questions, and examine your thesis as you answer them. Revise your draft as needed.

  • Is it interesting? A thesis for a report should answer a question that is worth asking and piques curiosity.
  • Is it precise and specific? If you are interested in reducing pollution in a nearby lake, explain how to stop the zebra mussel infestation or reduce the frequent algae blooms.
  • Is it manageable? Aim for a scope you can cover within the length of your report: narrow enough that you can support the thesis fully, but broad enough that you have sufficient evidence to work with.

Organizing Your Ideas

As a next step, organize the points you want to make in your report and the evidence to support them. Use an outline, a diagram, or another organizational tool, such as Table 8.4.

Drafting an Analytical Report

With a tentative thesis, an organizational plan, and evidence, you are ready to begin drafting. For this assignment, you will report information, analyze it, and draw conclusions about the cause of something, the effect of something, or the similarities and differences between two things.

Introduction

Some students write the introduction first; others save it for last. Whenever you choose to write the introduction, use it to draw readers into your report. Make the topic of your report clear, and be concise and sincere. End the introduction with your thesis statement. Depending on your topic and the type of report, you can write an effective introduction in several ways. Opening a report with an overview is a tried-and-true strategy, as shown in the following example on the U.S. response to COVID-19 by Trevor Garcia. Notice how he opens the introduction with statistics and a comparison and follows them with a question that leads to the thesis statement (underlined).

student sample text With more than 83 million cases and 1.8 million deaths at the end of 2020, COVID-19 has turned the world upside down. By the end of 2020, the United States led the world in the number of cases, at more than 20 million infections and nearly 350,000 deaths. In comparison, the second-highest number of cases was in India, which at the end of 2020 had less than half the number of COVID-19 cases despite having a population four times greater than the U.S. (“COVID-19 Coronavirus Pandemic,” 2021). How did the United States come to have the world’s worst record in this pandemic? underline An examination of the U.S. response shows that a reduction of experts in key positions and programs, inaction that led to equipment shortages, and inconsistent policies were three major causes of the spread of the virus and the resulting deaths end underline . end student sample text

For a less formal report, you might want to open with a question, quotation, or brief story. The following example opens with an anecdote that leads to the thesis statement (underlined).

student sample text Betty stood outside the salon, wondering how to get in. It was June of 2020, and the door was locked. A sign posted on the door provided a phone number for her to call to be let in, but at 81, Betty had lived her life without a cell phone. Betty’s day-to-day life had been hard during the pandemic, but she had planned for this haircut and was looking forward to it; she had a mask on and hand sanitizer in her car. Now she couldn’t get in the door, and she was discouraged. In that moment, Betty realized how much Americans’ dependence on cell phones had grown in the months since the pandemic began. underline Betty and thousands of other senior citizens who could not afford cell phones or did not have the technological skills and support they needed were being left behind in a society that was increasingly reliant on technology end underline . end student sample text

Body Paragraphs: Point, Evidence, Analysis

Use the body paragraphs of your report to present evidence that supports your thesis. A reliable pattern to keep in mind for developing the body paragraphs of a report is point, evidence, and analysis:

  • The point is the central idea of the paragraph, usually given in a topic sentence stated in your own words at or toward the beginning of the paragraph. Each topic sentence should relate to the thesis.
  • The evidence you provide develops the paragraph and supports the point made in the topic sentence. Include details, examples, quotations, paraphrases, and summaries from sources if you conducted formal research. Synthesize the evidence you include by showing in your sentences the connections between sources.
  • The analysis comes at the end of the paragraph. In your own words, draw a conclusion about the evidence you have provided and how it relates to the topic sentence.

The paragraph below illustrates the point, evidence, and analysis pattern. Drawn from a report about concussions among football players, the paragraph opens with a topic sentence about the NCAA and NFL and their responses to studies about concussions. The paragraph is developed with evidence from three sources. It concludes with a statement about helmets and players’ safety.

student sample text The NCAA and NFL have taken steps forward and backward to respond to studies about the danger of concussions among players. Responding to the deaths of athletes, documented brain damage, lawsuits, and public outcry (Buckley et al., 2017), the NCAA instituted protocols to reduce potentially dangerous hits during football games and to diagnose traumatic head injuries more quickly and effectively. Still, it has allowed players to wear more than one style of helmet during a season, raising the risk of injury because of imperfect fit. At the professional level, the NFL developed a helmet-rating system in 2011 in an effort to reduce concussions, but it continued to allow players to wear helmets with a wide range of safety ratings. The NFL’s decision created an opportunity for researchers to look at the relationship between helmet safety ratings and concussions. Cocello et al. (2016) reported that players who wore helmets with a lower safety rating had more concussions than players who wore helmets with a higher safety rating, and they concluded that safer helmets are a key factor in reducing concussions. end student sample text

Developing Paragraph Content

In the body paragraphs of your report, you will likely use examples, draw comparisons, show contrasts, or analyze causes and effects to develop your topic.

Paragraphs developed with examples are common in reports. The paragraph below, adapted from a report by student John Zwick on the mental health of soldiers deployed during wartime, draws examples from three sources.

student sample text Throughout the Vietnam War, military leaders claimed that the mental health of soldiers was stable and that men who suffered from combat fatigue, now known as PTSD, were getting the help they needed. For example, the New York Times (1966) quoted military leaders who claimed that mental fatigue among enlisted men had “virtually ceased to be a problem,” occurring at a rate far below that of World War II. Ayres (1969) reported that Brigadier General Spurgeon Neel, chief American medical officer in Vietnam, explained that soldiers experiencing combat fatigue were admitted to the psychiatric ward, sedated for up to 36 hours, and given a counseling session with a doctor who reassured them that the rest was well deserved and that they were ready to return to their units. Although experts outside the military saw profound damage to soldiers’ psyches when they returned home (Halloran, 1970), the military stayed the course, treating acute cases expediently and showing little concern for the cumulative effect of combat stress on individual soldiers. end student sample text

When you analyze causes and effects, you explain the reasons that certain things happened and/or their results. The report by Trevor Garcia on the U.S. response to the COVID-19 pandemic in 2020 is an example: his report examines the reasons the United States failed to control the coronavirus. The paragraph below, adapted from another student’s report written for an environmental policy course, explains the effect of white settlers’ views of forest management on New England.

student sample text The early colonists’ European ideas about forest management dramatically changed the New England landscape. White settlers saw the New World as virgin, unused land, even though indigenous people had been drawing on its resources for generations by using fire subtly to improve hunting, employing construction techniques that left ancient trees intact, and farming small, efficient fields that left the surrounding landscape largely unaltered. White settlers’ desire to develop wood-built and wood-burning homesteads surrounded by large farm fields led to forestry practices and techniques that resulted in the removal of old-growth trees. These practices defined the way the forests look today. end student sample text

Compare and contrast paragraphs are useful when you wish to examine similarities and differences. You can use both comparison and contrast in a single paragraph, or you can use one or the other. The paragraph below, adapted from a student report on the rise of populist politicians, compares the rhetorical styles of populist politicians Huey Long and Donald Trump.

student sample text A key similarity among populist politicians is their rejection of carefully crafted sound bites and erudite vocabulary typically associated with candidates for high office. Huey Long and Donald Trump are two examples. When he ran for president, Long captured attention through his wild gesticulations on almost every word, dramatically varying volume, and heavily accented, folksy expressions, such as “The only way to be able to feed the balance of the people is to make that man come back and bring back some of that grub that he ain’t got no business with!” In addition, Long’s down-home persona made him a credible voice to represent the common people against the country’s rich, and his buffoonish style allowed him to express his radical ideas without sounding anti-communist alarm bells. Similarly, Donald Trump chose to speak informally in his campaign appearances, but the persona he projected was that of a fast-talking, domineering salesman. His frequent use of personal anecdotes, rhetorical questions, brief asides, jokes, personal attacks, and false claims made his speeches disjointed, but they gave the feeling of a running conversation between him and his audience. For example, in a 2015 speech, Trump said, “They just built a hotel in Syria. Can you believe this? They built a hotel. When I have to build a hotel, I pay interest. They don’t have to pay interest, because they took the oil that, when we left Iraq, I said we should’ve taken” (“Our Country Needs” 2020). While very different in substance, Long and Trump adopted similar styles that positioned them as the antithesis of typical politicians and their worldviews. end student sample text

Conclusion

The conclusion should draw the threads of your report together and make its significance clear to readers. You may wish to review the introduction, restate the thesis, recommend a course of action, point to the future, or use some combination of these. Whichever way you approach it, the conclusion should not head in a new direction. The following example is the conclusion from a student’s report on the effect of a book about environmental movements in the United States.

student sample text Since its publication in 1949, Aldo Leopold’s A Sand County Almanac has offered wisdom and inspiration to environmental activists of various movements. These audiences included Leopold’s conservationist contemporaries, environmentalists of the 1960s and 1970s, and the environmental justice activists who rose in the 1980s and continue to make their voices heard today. These audiences have read the work differently: conservationists looked to the author as a leader, environmentalists applied his wisdom to their movement, and environmental justice advocates have pointed out the flaws in Leopold’s thinking. Even so, like those before them, environmental justice activists recognize the book’s value as a testament to taking the long view and eliminating biases that may cloud an objective assessment of humanity’s interdependent relationship with the environment. end student sample text

Citing Sources

You must cite the sources of information and data included in your report. Citations must appear in both the text and a bibliography at the end of the report.

The sample paragraphs in the previous section include examples of in-text citation using APA documentation style. Trevor Garcia’s report on the U.S. response to COVID-19 in 2020 also uses APA documentation style for citations in the text of the report and the list of references at the end. Your instructor may require another documentation style, such as MLA or Chicago.

Peer Review: Getting Feedback from Readers

You will likely engage in peer review with other students in your class by sharing drafts and providing feedback to help spot strengths and weaknesses in your reports. For peer review within a class, your instructor may provide assignment-specific questions or a form for you to complete as you work together.

If you have a writing center on your campus, it is well worth your time to make an online or in-person appointment with a tutor. You’ll receive valuable feedback and improve your ability to review not only your report but your overall writing.

Another way to receive feedback on your report is to ask a friend or family member to read your draft. Provide a list of questions or a form such as the one in Table 8.5 for them to complete as they read.

Revising: Using Reviewers’ Responses to Revise Your Work

When you receive comments from readers, including your instructor, read each comment carefully to understand what is being asked. Try not to get defensive, even though this response is completely natural. Remember that readers are like coaches who want you to succeed. They are looking at your writing from outside your own head, and they can identify strengths and weaknesses that you may not have noticed. Keep track of the strengths and weaknesses your readers point out. Pay special attention to those that more than one reader identifies, and use this information to improve your report and later assignments.

As you analyze each response, be open to suggestions for improvement, and be willing to make significant revisions to improve your writing. Perhaps you need to revise your thesis statement to better reflect the content of your draft. Maybe you need to return to your sources to better understand a point you’re trying to make in order to develop a paragraph more fully. Perhaps you need to rethink the organization, move paragraphs around, and add transition sentences.

Below is an early draft of part of Trevor Garcia’s report with comments from a peer reviewer:

student sample text To truly understand what happened, it’s important first to look back to the years leading up to the pandemic. Epidemiologists and public health officials had long known that a global pandemic was possible. In 2016, the U.S. National Security Council (NSC) published a 69-page document with the intimidating title Playbook for Early Response to High-Consequence Emerging Infectious Disease Threats and Biological Incidents. The document’s two sections address responses to “emerging disease threats that start or are circulating in another country but not yet confirmed within U.S. territorial borders” and to “emerging disease threats within our nation’s borders.” On 13 January 2017, the joint Obama-Trump transition teams performed a pandemic preparedness exercise; however, the playbook was never adopted by the incoming administration. end student sample text

annotated text Peer Review Comment: Do the words in quotation marks need to be a direct quotation? It seems like a paraphrase would work here. end annotated text

annotated text Peer Review Comment: I’m getting lost in the details about the playbook. What’s the Obama-Trump transition team? end annotated text

student sample text In February 2018, the administration began to cut funding for the Prevention and Public Health Fund at the Centers for Disease Control and Prevention; cuts to other health agencies continued throughout 2018, with funds diverted to unrelated projects such as housing for detained immigrant children. end student sample text

annotated text Peer Review Comment: This paragraph has only one sentence, and it’s more like an example. It needs a topic sentence and more development. end annotated text

student sample text Three months later, Luciana Borio, director of medical and biodefense preparedness at the NSC, spoke at a symposium marking the centennial of the 1918 influenza pandemic. “The threat of pandemic flu is the number one health security concern,” she said. “Are we ready to respond? I fear the answer is no.” end student sample text

annotated text Peer Review Comment: This paragraph is very short and a lot like the previous paragraph in that it’s a single example. It needs a topic sentence. Maybe you can combine them? end annotated text

annotated text Peer Review Comment: Be sure to cite the quotation. end annotated text

Reading these comments and those of others, Trevor decided to combine the three short paragraphs into one paragraph focusing on the fact that the United States knew a pandemic was possible but was unprepared for it. He developed the paragraph, using the short paragraphs as evidence and connecting the sentences and evidence with transitional words and phrases. Finally, he added in-text citations in APA documentation style to credit his sources. The revised paragraph is below:

student sample text Epidemiologists and public health officials in the United States had long known that a global pandemic was possible. In 2016, the National Security Council (NSC) published Playbook for Early Response to High-Consequence Emerging Infectious Disease Threats and Biological Incidents, a 69-page document on responding to diseases spreading within and outside of the United States. On January 13, 2017, the joint transition teams of outgoing president Barack Obama and then president-elect Donald Trump performed a pandemic preparedness exercise based on the playbook; however, it was never adopted by the incoming administration (Goodman & Schulkin, 2020). A year later, in February 2018, the Trump administration began to cut funding for the Prevention and Public Health Fund at the Centers for Disease Control and Prevention, leaving key positions unfilled. Other individuals who were fired or resigned in 2018 were the homeland security adviser, whose portfolio included global pandemics; the director for medical and biodefense preparedness; and the top official in charge of a pandemic response. None of them were replaced, leaving the White House with no senior person who had experience in public health (Goodman & Schulkin, 2020). Experts voiced concerns, among them Luciana Borio, director of medical and biodefense preparedness at the NSC, who spoke at a symposium marking the centennial of the 1918 influenza pandemic in May 2018: “The threat of pandemic flu is the number one health security concern,” she said. “Are we ready to respond? I fear the answer is no” (Sun, 2018, final para.). end student sample text

A final word on working with reviewers’ comments: as you consider your readers’ suggestions, remember, too, that you remain the author. You are free to disregard suggestions that you think will not improve your writing. If you choose to disregard comments from your instructor, consider submitting a note explaining your reasons with the final draft of your report.



Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Access for free at https://openstax.org/books/writing-guide/pages/1-unit-introduction
  • Authors: Michelle Bachelor Robinson, Maria Jerskey, featuring Toby Fulwiler
  • Publisher/website: OpenStax
  • Book title: Writing Guide with Handbook
  • Publication date: Dec 21, 2021
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/writing-guide/pages/1-unit-introduction
  • Section URL: https://openstax.org/books/writing-guide/pages/8-5-writing-process-creating-an-analytical-report

© Dec 19, 2023 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License . The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.

Overview of Analytic Studies

Introduction

We search for the determinants of health outcomes, first, by relying on descriptive epidemiology to generate hypotheses about associations between exposures and outcomes. Analytic studies are then undertaken to test specific hypotheses. Samples of subjects are identified and information about exposure status and outcome is collected. The essence of an analytic study is that groups of subjects are compared in order to estimate the magnitude of association between exposures and outcomes.

In their book Epidemiology Matters, Katherine Keyes and Sandro Galea discuss three fundamental options for studying samples from a population, as illustrated in the video below (duration 8:30).

Learning Objectives

After successfully completing this section, the student will be able to:

  • Describe the difference between descriptive and scientific/analytic epidemiologic studies in terms of information/evidence provided for medicine and public health.
  • Define and explain the distinguishing features of a cohort study.
  • Describe and identify the types of epidemiologic questions that can be addressed by cohort studies.
  • Define and distinguish between prospective and retrospective cohort studies, using the investigator as the point of reference.
  • Define and explain the distinguishing features of a case-control study.
  • Explain the distinguishing features of an intervention study.
  • Identify the study design when reading an article or abstract.

Cohort Type Studies

A cohort is a "group." In epidemiology a cohort is a group of individuals who are followed over a period of time, primarily to assess what happens to them, i.e., their health outcomes. In cohort type studies one identifies individuals who do not have the outcome of interest initially and groups them in subsets that differ in their exposure to some factor, e.g., smokers and non-smokers. The different exposure groups are then followed over time in order to compare the incidence of health outcomes, such as lung cancer or heart disease. As an example, the Framingham Heart Study enrolled a cohort of 5,209 residents of Framingham, MA, who were between the ages of 30 and 62 and who did not have cardiovascular disease when they were enrolled. These subjects differed from one another in many ways: whether they smoked, how much they smoked, body mass index, eating habits, exercise habits, gender, family history of heart disease, etc. The researchers assessed these and many other characteristics or "exposures" soon after the subjects had been enrolled and before any of them had developed cardiovascular disease. The many "baseline characteristics" were assessed in a number of ways, including questionnaires, physical exams, laboratory tests, and imaging studies (e.g., x-rays). They then began "following" the cohort, meaning that they kept in contact with the subjects by phone, mail, or clinic visits in order to determine if and when any of the subjects developed any of the "outcomes of interest," such as myocardial infarction (heart attack), angina, congestive heart failure, stroke, diabetes, and many other cardiovascular outcomes.

Over time some subjects eventually began to develop some of the outcomes of interest. Once the cohort had been followed in this fashion, it was eventually possible to use the information collected to evaluate many hypotheses about which characteristics were associated with an increased risk of heart disease. For example, if one hypothesized that smoking increased the risk of heart attacks, the subjects in the cohort could be sorted based on their smoking habits, and the subset of the cohort that smoked could be compared to the subset who had never smoked. For each comparison one wanted to make, the cohort could be grouped according to whether they had a given exposure or not, and one could measure and compare the frequency of heart attacks (i.e., the incidence) between the groups. Incidence provides an estimate of risk, so if the incidence of heart attacks is 3 times greater in smokers than in non-smokers, it suggests an association between smoking and the risk of developing a heart attack. (Various biases might also explain an apparent association; we will learn about these later in the course.) The hallmark of analytic studies, then, is that they collect information about both exposure status and outcome status, and they compare groups to identify whether there appears to be an association or link.
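The cohort comparison just described can be sketched in a few lines of Python. The counts below are purely illustrative (not actual Framingham data): they simply show incidence being computed in each exposure group and then compared as a ratio.

```python
# A sketch of the cohort comparison described above. The counts are
# hypothetical, not actual Framingham data.

def incidence(cases, at_risk):
    """Cumulative incidence: new cases divided by the population at risk."""
    return cases / at_risk

# Hypothetical follow-up results: heart attacks among smokers vs. non-smokers.
smokers = {"cases": 30, "at_risk": 1000}
non_smokers = {"cases": 10, "at_risk": 1000}

risk_exposed = incidence(**smokers)        # 0.03
risk_unexposed = incidence(**non_smokers)  # 0.01

risk_ratio = risk_exposed / risk_unexposed
print(f"Risk ratio: {risk_ratio:.1f}")  # prints "Risk ratio: 3.0"
```

With these made-up numbers, smokers have 3 times the incidence of non-smokers, the kind of association a real cohort analysis would then probe for bias and confounding.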

The Population "At Risk"

From the discussion above, it should be obvious that one of the basic requirements of a cohort type study is that none of the subjects have the outcome of interest at the beginning of the follow-up period, and time must pass in order to determine the frequency of developing the outcome.

  • For example, if one wanted to compare the risk of developing uterine cancer between postmenopausal women receiving hormone-replacement therapy and those not receiving hormones, one would consider certain eligibility criteria for the members prior to the start of the study: 1) they should be female, 2) they should be post-menopausal, and 3) they should have a uterus. Among post-menopausal women there might be a number who had had a hysterectomy already, perhaps for persistent bleeding problems or endometriosis. Since these women no longer have a uterus, one would want to exclude them from the cohort, because they are no longer at risk of developing this particular type of cancer.
  • Similarly, if one wanted to compare the risk of developing diabetes among nursing home residents who exercised and those who did not, it would be important to test the subjects for diabetes at the beginning of the follow-up period in order to exclude all subjects who already had diabetes and therefore were not "at risk" of developing diabetes.

Eligible subjects have to meet certain criteria to be included in a study (inclusion criteria). One of these is that they do not have any of the diseases or conditions that the investigators want to study; i.e., the subjects must be "at risk" of developing the outcome of interest. The members of the cohort to be followed are sometimes referred to as "the population at risk."

However, at times decisions about who is "at risk" and eligible get complicated.

Example #1: Suppose the outcome of interest is development of measles. There may be subjects who:

  • Already were known to have had clinically apparent measles and are immune to subsequent measles infection
  • Had sub-clinical cases of measles that went undetected (but the subject may still be immune)
  • Had a measles vaccination that conferred immunity
  • Had a measles vaccination that failed to confer immunity

In this case the eligibility criteria would be shaped by the specific scientific questions being asked. One might want to compare subjects known to have had clinically apparent measles to those who had neither had clinical measles nor received a measles vaccination. Alternatively, one could take a blood sample from all potential subjects in order to measure their antibody titers (levels) to the measles virus.

Example #2: Suppose you are studying an event that can occur more than once, such as a heart attack. Again, the eligibility criteria should be shaped to fit the scientific questions being answered. If one were interested in the risk of a first myocardial infarction, then obviously subjects who had already had a heart attack would not be eligible for the study. On the other hand, if one were interested in tertiary prevention of heart attacks, the study cohort would include people who had had heart attacks or other clinical manifestations of heart disease, and the outcome of interest would be subsequent significant cardiac events or death.

Prospective and Retrospective Cohort Studies

Cohort studies can be classified as prospective or retrospective based on when outcomes occurred in relation to the enrollment of the cohort.

Prospective Cohort Studies

Figure: Summary of the sequence of events in a hypothetical prospective cohort study from the Nurses' Health Study.

In a prospective study like the Nurses' Health Study, baseline information is collected from all subjects in the same way, using exactly the same questions and data collection methods for all subjects. The investigators design the questions and data collection procedures carefully in order to obtain accurate information about exposures before disease develops in any of the subjects. After baseline information is collected, subjects in a prospective cohort study are then followed "longitudinally," i.e., over a period of time, usually for years, to determine if and when they become diseased and whether their exposure status changes. In this way, investigators can eventually use the data to answer many questions about the associations between "risk factors" and disease outcomes. For example, one could identify smokers and non-smokers at baseline and compare their subsequent incidence of developing heart disease. Alternatively, one could group subjects based on their body mass index (BMI) and compare their risk of developing heart disease or cancer.

Examples of Prospective Cohort Studies

  • The Framingham Heart Study Home Page
  • The Nurses Health Study Home Page


Pitfall: Note that in these prospective cohort studies a comparison of incidence between the groups can only take place after enough time has elapsed for some subjects to develop the outcomes of interest. Since the data analysis occurs after some outcomes have occurred, some students mistakenly call this a retrospective study, but this is incorrect. The analysis always occurs after a certain number of events have taken place. The characteristic that distinguishes a study as prospective is that the subjects were enrolled and baseline data were collected before any subjects developed an outcome of interest.

Retrospective Cohort Studies

In contrast, retrospective studies are conceived after some people have already developed the outcomes of interest. The investigators jump back in time to identify a cohort of individuals at a point in time before they have developed the outcomes of interest, and they try to establish their exposure status at that point in time. They then determine whether the subject subsequently developed the outcome of interest.

Figure: Summary of a retrospective cohort study, in which the investigator initiates the study after the outcome of interest has already taken place in some subjects.

Suppose investigators wanted to test the hypothesis that working with the chemicals involved in tire manufacturing increases the risk of death. Since this is a fairly rare exposure, it would be advantageous to use a special exposure cohort such as employees of a large tire manufacturing factory. The employees who actually worked with chemicals used in the manufacturing process would be the exposed group, while clerical workers and management might constitute the "unexposed" group. However, rather than following these subjects for decades, it would be more efficient to use employee health and employment records over the past two or three decades as a source of data. In essence, the investigators are jumping back in time to identify the study cohort at a point in time before the outcome of interest (death) occurred. They can classify them as "exposed" or "unexposed" based on their employment records, and they can use a number of sources to determine subsequent outcome status, such as death (e.g., using health records, next of kin, National Death Index, etc.).

Retrospective cohort studies like the one described above are very efficient for studying rare or unusual exposures, but there are many potential problems here. Sometimes exposure status is not clear when it is necessary to go back in time and use whatever data is available, especially because the data being used was not designed to answer a health question. Even if it was clear who was exposed to tire manufacturing chemicals based on employee records, it would also be important to take into account (or adjust for) other differences that could have influenced mortality, i.e., confounding factors. For example, it might be important to know whether the subjects smoked, or drank, or what kind of diet they ate. However, it is unlikely that a retrospective cohort study would have accurate information on these many other risk factors.

The video below provides a brief (7:31) explanation of the distinction between retrospective and prospective cohort studies.

Link to a transcript of the video

Intervention Studies (Clinical Trials)

Intervention studies (clinical trials) are experimental research studies that compare the effectiveness of medical treatments, management strategies, prevention strategies, and other medical or public health interventions. Their design is very similar to that of a prospective cohort study. However, in cohort studies exposure status is determined by genetics, self-selection, or life circumstances, and the investigators simply observe differences in outcome between those who have a given exposure and those who do not. In clinical trials, exposure status (the treatment type) is assigned by the investigators. Ideally, subjects should be assigned to the comparison groups randomly in order to produce equal distributions of potentially confounding factors. Sometimes a group receiving a new treatment is compared to an untreated group or to a group receiving a placebo or sham treatment; at other times, a new treatment is compared to a group receiving an established treatment. For more on this topic see the module on Intervention Studies.
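Random assignment can be sketched as follows. The subject IDs and group sizes here are hypothetical, and real trials use formal randomization schemes (blocked, stratified, etc.); the point is only that shuffling before splitting makes assignment independent of every subject characteristic, which tends to balance confounders across groups.

```python
# A minimal sketch of randomization in an intervention study.
# Subject IDs and group sizes are hypothetical.
import random

subjects = [f"subject_{i:03d}" for i in range(200)]

random.seed(42)           # fixed seed only to make the sketch repeatable
random.shuffle(subjects)  # random order, independent of any subject trait

treatment = subjects[:100]  # e.g., receives the new treatment
control = subjects[100:]    # e.g., receives placebo or standard care

print(len(treatment), len(control))  # 100 100
```

Because the split ignores who the subjects are, smokers, heavy drinkers, and the genetically susceptible all tend to end up in both groups in similar proportions.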

In summary, the characteristic that distinguishes a clinical trial from a cohort study is that the investigator assigns the exposure status in a clinical trial, while subjects' genetics, behaviors, and life circumstances determine their exposures in a cohort study.

Summarizing Data in a Cohort Study

Investigators often use contingency tables to summarize data. In essence, the table is a matrix that displays the combinations of exposure and outcome status. If one were summarizing the results of a study with two possible exposure categories and two possible outcomes, one would use a "two by two" table in which the numbers in the four cells indicate the number of subjects within each of the 4 possible categories of risk and disease status.

For example, consider data from a retrospective cohort study conducted by the Massachusetts Department of Public Health (MDPH) during an investigation of an outbreak of Giardia lamblia in Milton, MA in 2003. The descriptive epidemiology indicated that almost all of the cases belonged to a country club in Milton. The club had an adult swimming pool and a wading pool for toddlers, and the investigators suspected that the outbreak may have occurred when an infected child with a dirty diaper contaminated the water in the kiddy pool. This hypothesis was tested by conducting a retrospective cohort study. The cases of Giardia lamblia had already occurred and had been reported to MDPH via the infectious disease surveillance system (for more information on surveillance, see the Surveillance module). The investigation focused on an obvious cohort - 479 members of the country club who agreed to answer the MDPH questionnaire. The questionnaire asked, among many other things, whether the subject had been exposed to the kiddy pool. The incidence of subsequent Giardia infection was then compared between subjects who had been exposed to the kiddy pool and those who had not.

The table below summarizes the findings. A total of 479 subjects completed the questionnaire, and 124 of them indicated that they had been exposed to the kiddy pool. Of these, 16 subsequently developed Giardia infection, but 108 did not. Among the 355 subjects who denied kiddy pool exposure, 14 developed Giardia infection, and the other 341 did not.

                          Giardia infection    No infection    Total
Kiddy pool exposure              16                108          124
No kiddy pool exposure           14                341          355
Total                            30                449          479

Organizing the data this way makes it easier to compute the cumulative incidence in each group (12.9% and 3.9%, respectively). The incidence in each group provides an estimate of risk, and the groups can be compared in order to estimate the magnitude of association. (This will be addressed in much greater detail in the module on Measures of Association.) One way of quantifying the association is to calculate the relative risk, i.e., to divide the incidence in the exposed group by the incidence in the unexposed group. In this case, the risk ratio is 12.9% / 3.9% = 3.3. This suggests that subjects who swam in the kiddy pool had 3.3 times the risk of getting Giardia infection compared to those who did not, pointing to the kiddy pool as the source.
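The incidence and risk ratio calculations for this outbreak can be reproduced directly from the reported counts:

```python
# The Giardia contingency-table calculations, computed directly from
# the counts reported in the text.
exposed_cases, exposed_well = 16, 108      # 124 exposed to the kiddy pool
unexposed_cases, unexposed_well = 14, 341  # 355 not exposed

incidence_exposed = exposed_cases / (exposed_cases + exposed_well)
incidence_unexposed = unexposed_cases / (unexposed_cases + unexposed_well)
print(f"{incidence_exposed:.1%} vs {incidence_unexposed:.1%}")  # 12.9% vs 3.9%

risk_ratio = incidence_exposed / incidence_unexposed
print(f"Risk ratio: {risk_ratio:.1f}")  # 3.3
```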

Unanswered Questions

If the kiddy pool was the source of contamination responsible for this outbreak, why was it that:

  • Only 16 people exposed to the kiddy pool developed the infection?
  • There were 14 Giardia cases among people who denied exposure to the kiddy pool?

Before you look at the answer, think about it and try to come up with a possible explanation.

Likely Explanation

Optional Links of Potential Interest

Link to the 2003 Giardia outbreak

Link to CDC page on Organizing Data


Possible Pitfall: Contingency tables can be oriented in several ways, and this can cause confusion when calculating measures of association.

There is no standard rule about how to set up contingency tables, and you will see them set up in different ways.

  • With exposure status in rows and outcome status in columns
  • With exposure status in columns and outcome status in rows
  • With exposed group first followed by non-exposed group
  • With non-exposed group first followed by exposed group

If you aren't careful, these different orientations can result in errors when calculating measures of association. One way to avoid confusion is to always set up your contingency tables in the same way. For example, in these learning modules the contingency tables almost always indicate outcome status in the columns, with subjects who have the outcome of interest to the left of those who do not; exposure status is indicated in the rows, with the exposed (or most exposed) group above those who are unexposed (or less exposed).

The table below illustrates this arrangement.

                 Outcome present    Outcome absent
Exposed                 a                 b
Non-exposed             c                 d
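A small helper that hard-codes this convention makes the pitfall concrete: feeding it the same numbers in a transposed orientation silently produces a different, wrong answer. The counts reuse the Giardia example from earlier in this module.

```python
# Helper that assumes the convention used in these modules:
# rows = exposure (exposed first), columns = outcome (cases first).
def risk_ratio(table):
    (a, b), (c, d) = table  # a, b = exposed cases/non-cases; c, d = unexposed
    return (a / (a + b)) / (c / (c + d))

kiddy_pool = [[16, 108],
              [14, 341]]
print(round(risk_ratio(kiddy_pool), 1))  # 3.3

# The same numbers transposed (outcome in rows) silently give a
# different, wrong answer -- hence the advice to fix one orientation.
transposed = [[16, 14],
              [108, 341]]
print(round(risk_ratio(transposed), 1))  # 2.2 (wrong)
```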

Case-Control Studies

Cohort studies have an intuitive logic to them, but they can be very problematic when:

  • The outcomes being investigated are rare;
  • There is a long time period between the exposure of interest and the development of the disease; or
  • It is expensive or very difficult to obtain exposure information from a cohort.

In the first case, the rarity of the disease requires enrollment of very large numbers of people. In the second case, the long period of follow-up requires efforts to keep contact with and collect outcome information from individuals. In all three situations, cost and feasibility become an important concern.

A case-control design offers an alternative that is much more efficient. The goal of a case-control study is the same as that of a cohort study, i.e., to estimate the magnitude of association between an exposure and an outcome. However, case-control studies employ a different sampling strategy that gives them greater efficiency. As with a cohort study, a case-control study attempts to identify all people who have developed the disease of interest in the defined population. This is not because cases are inherently more important to estimating an association, but because they are almost always rarer than non-diseased individuals, and accurate estimation of the association requires reasonable numbers of people in both the numerators (cases) and the denominators (people or person-time) of the measures of disease frequency for both exposed and reference groups. However, because most of the denominator is made up of people who do not develop disease, the case-control design avoids the need to collect information on the entire population by selecting a sample of the underlying population.

Rothman describes the case-control strategy as follows: 

To illustrate this consider the following hypothetical scenario in which the source population is Plymouth County in Massachusetts, which has a total population of 6,647 (hypothetical). Thirteen people in the county have been diagnosed with an unusual disease and seven of them have a particular exposure that is suspected of being an important contributing factor. The chief problem here is that the disease is quite rare.

Figure: Map of Plymouth County showing icons marking the people who developed the disease in the outbreak.

If I somehow had exposure and outcome information on all of the subjects in the source population and looked at the association using a cohort design, it might look like this:

                 Diseased    Non-diseased    Total
Exposed              7          1,000        1,007
Non-exposed          6          5,634        5,640
Total               13          6,634        6,647

Therefore, the incidence in the exposed individuals would be 7/1,007 = 0.70%, and the incidence in the non-exposed individuals would be 6/5,640 = 0.11%. Consequently, the risk ratio would be (7/1,007) / (6/5,640) = 6.53, suggesting that those who had the risk factor (exposure) had about 6.5 times the risk of getting the disease compared to those without the risk factor. This is a strong association.

In this hypothetical example, I had data on all 6,647 people in the source population, and I could compute the probability of disease (i.e., the risk or incidence) in both the exposed group and the non-exposed group, because I had the denominators for both the exposed and non-exposed groups.

The problem, of course, is that I usually don't have the resources to get the data on all subjects in the population. If I took a random sample of even 5-10% of the population, I might not have any diseased people in my sample.

An alternative approach would be to use surveillance databases or administrative databases to find most or all 13 of the cases in the source population and determine their exposure status. However, instead of enrolling all of the other 6,634 residents, suppose I were to take only a 1% sample of the non-diseased people and then determine their exposure status. The data might look something like this:

                 Cases    Controls (1% sample of non-diseased)
Exposed              7         10
Non-exposed          6         56
Total               13         66

With this sampling approach I can no longer compute the probability of disease in each exposure group, because I no longer have the denominators in the last column. In other words, I don't know the exposure distribution for the entire source population. However, the small control sample of non-diseased subjects gives me a way to estimate the exposure distribution in the source population. So, I can't compute the probability of disease in each exposure group, but I can compute the odds of disease in the case-control sample.

The Odds Ratio

The odds of disease among the exposed sample are 7/10, and the odds of disease in the non-exposed sample are 6/56. If I compute the odds ratio, I get (7/10) / (6/56) = 6.53, very close to the risk ratio that I computed from data for the entire population. We will consider odds ratios and case-control studies in much greater depth in a later module. However, for the time being the key things to remember are that:

  • The sampling strategy for a case-control study is very different from that of cohort studies, despite the fact that both have the goal of estimating the magnitude of association between the exposure and the outcome.
  • In a case-control study there is no "follow-up" period. One starts by identifying diseased subjects and determines their exposure distribution; one then takes a sample of the source population that produced those cases in order to estimate the exposure distribution in the overall source population that produced the cases. [In cohort studies none of the subjects have the outcome at the beginning of the follow-up period.]
  • In a case-control study, you cannot measure incidence, because you start with diseased people and non-diseased people, so you cannot calculate relative risk.
  • The case-control design is very efficient. In the example above, the case-control study of only 79 subjects produced an odds ratio (6.53) that was nearly identical to the risk ratio (6.53) obtained from the data for the entire population.
  • Case-control studies are particularly useful when the outcome is uncommon in both exposed and non-exposed people.
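As a check on the numbers in this example, the full-cohort risk ratio and the case-control odds ratio can be computed side by side:

```python
# The hypothetical Plymouth County numbers: risk ratio from the full
# (rarely obtainable) population data vs. odds ratio from the small
# case-control sample of 13 cases and 66 controls.

# Full source population (cohort view):
risk_ratio = (7 / 1007) / (6 / 5640)

# Case-control sample: all 13 cases plus a 1% sample of the non-diseased.
cases = {"exposed": 7, "unexposed": 6}
controls = {"exposed": 10, "unexposed": 56}
odds_ratio = (cases["exposed"] / controls["exposed"]) / (
    cases["unexposed"] / controls["unexposed"])

print(f"Risk ratio (whole population): {risk_ratio:.2f}")  # ~6.53
print(f"Odds ratio (79 subjects):      {odds_ratio:.2f}")  # ~6.53
```

The two estimates agree to two decimal places, illustrating why the odds ratio from a well-sampled case-control study is a good approximation of the risk ratio when the disease is rare.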

What Is the Difference Between "Probability" and "Odds"?


  • The odds are defined as the probability that the event will occur divided by the probability that the event will not occur.

If the probability of an event occurring is Y, then the probability of the event not occurring is 1-Y. (Example: if the probability of an event is 0.80, or 80%, then the probability that the event will not occur is 1-0.80 = 0.20, or 20%.)

The odds of an event represent the ratio of the (probability that the event will occur) / (probability that the event will not occur). This could be expressed as follows:

Odds of event = Y / (1-Y)

So, in this example, if the probability of the event occurring = 0.80, then the odds are 0.80 / (1-0.80) = 0.80/0.20 = 4 (i.e., 4 to 1).

  • If a race horse runs 100 races and wins 25 times and loses the other 75 times, the probability of winning is 25/100 = 0.25 or 25%, but the odds of the horse winning are 25/75 = 0.333, or 1 win to 3 losses.
  • If the horse runs 100 races and wins 5 and loses the other 95 times, the probability of winning is 0.05 or 5%, and the odds of the horse winning are 5/95 = 0.0526.
  • If the horse runs 100 races and wins 50, the probability of winning is 50/100 = 0.50 or 50%, and the odds of winning are 50/50 = 1 (even odds).
  • If the horse runs 100 races and wins 80, the probability of winning is 80/100 = 0.80 or 80%, and the odds of winning are 80/20 = 4 to 1.

NOTE that when the probability is low, the odds and the probability are very similar.
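The probability-to-odds conversion can be written as a one-line function; the inputs below are the horse-race examples from the text.

```python
# Probability vs. odds, using the horse-race examples from the text.

def odds(p):
    """Odds of an event = P(event) / P(no event) = p / (1 - p)."""
    return p / (1 - p)

print(round(odds(0.25), 3))  # 0.333  (1 win to 3 losses)
print(round(odds(0.05), 4))  # 0.0526
print(round(odds(0.50), 3))  # 1.0    (even odds)
print(round(odds(0.80), 3))  # 4.0    (4 to 1)

# When the probability is small, odds and probability nearly coincide:
print(round(odds(0.05), 2) == round(0.05, 2))  # True
```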

On Sept. 8, 2011 the New York Times ran an article on the economy in which the writer began by saying "If history is a guide, the odds that the American economy is falling into a double-dip recession have risen sharply in recent weeks and may even have reached 50 percent." Further down in the article the author quoted the economist who had been interviewed for the story. What the economist had actually said was, "Whether we reach the technical definition [of a double-dip recession] I think is probably close to 50-50."

Question: Was the author correct in saying that the "odds" of a double-dip recession may have reached 50 percent?

Which Study Design Is Best?

Decisions regarding which study design to use rest on a number of factors, including:

  • Uncommon Outcome: If the outcome of interest is uncommon or rare, a case-control study would usually be best.
  • Uncommon Exposure: When studying an uncommon exposure, the investigators need to enroll an adequate number of subjects who have that exposure. In this situation a cohort study is best.
  • Ethics of Assigning Subjects to an Exposure: If you wanted to study the association between smoking and lung cancer, it wouldn't be ethical to conduct a clinical trial in which you randomly assigned half of the subjects to smoking.
  • Resources: If you have limited time, money, and personnel to gather data, it is unlikely that you will be able to conduct a prospective cohort study. A case-control study or a retrospective cohort study would be better options. The best one to choose would be dictated by whether the outcome was rare or the exposure of interest was rare.

There are some situations in which more than one study design could be used.

Smoking and Lung Cancer: For example, when investigators first sought to establish whether there was a link between smoking and lung cancer, they conducted a case-control study: they identified hospital patients who had lung cancer and a comparison group of hospital patients who had diseases other than cancer, and then compared their prior exposure histories with respect to smoking and many other factors. They found that past smoking was much more common among the lung cancer cases, and they concluded that there was an association. The advantage of this approach was that they were able to collect the data they wanted relatively quickly and inexpensively, because they started with people who already had the disease of interest.

The short video below provides a nice overview of epidemiological studies.


However, there were several limitations to the study they had done. The study design did not allow them to measure the incidence of lung cancer in smokers and non-smokers, so they couldn't measure the absolute risk of smoking. They also didn't know what other diseases smoking might be associated with, and, finally, they were concerned about some of the biases that can creep into this type of study.

As a result, these investigators then initiated another study. They invited all of the male physicians in the United Kingdom to fill out questionnaires regarding their health status and their smoking status. They then focused on the healthy physicians who were willing to participate, and the investigators mailed follow-up questionnaires to them every few years. They also had a way of finding out the cause of death for any subjects who became ill and died. The study continued for about 50 years. Along the way the investigators periodically compared the incidence of death among non-smoking physicians and physicians who smoked small, moderate or heavy amounts of tobacco.

These studies were useful because they demonstrated that smokers had an increased risk of death from more than 20 different causes. They were also able to measure the incidence of death in each category, so they knew the absolute risk for each cause of death. Of course, the downside to this approach was that it took a long time and was very costly. So, both a case-control study and a prospective cohort study provided useful information about the association between smoking and lung cancer and other diseases, but each approach had distinct advantages and limitations.

Hepatitis Outbreak in Marshfield, MA

In 2004 there was an outbreak of hepatitis A on the South Shore of Massachusetts. Over a period of a few weeks, 20 cases of hepatitis A were reported to the Massachusetts Department of Public Health (MDPH), and most of the infected persons were residents of Marshfield, MA. Marshfield's health department requested help from MDPH in identifying the source. The investigators quickly performed descriptive epidemiology. The epidemic curve indicated a point-source epidemic, and most of the cases lived in the Marshfield area, although some lived as far away as Boston. They conducted hypothesis-generating interviews, and taken together, the descriptive epidemiology suggested that the source was one of five or six food establishments in the Marshfield area, but it wasn't clear which one. Consequently, the investigators wanted to conduct an analytic study to determine which establishment was the source. Which study design should they have used?


Case-control studies are particularly efficient for rare diseases because they begin by identifying enough diseased people (or people who have some "outcome" of interest) to enable an analysis that tests associations. Case-control studies can be done in just about any circumstance, but they are particularly useful for rare diseases or diseases with a very long latent period, i.e., a long time between the causative exposure and the eventual development of disease.

BMJ Open Access
Analytical studies: a framework for quality improvement design and analysis

Conducting studies for learning is fundamental to improvement. Deming emphasised that the reason for conducting a study is to provide a basis for action on the system of interest. He classified studies into two types depending on the intended target for action. An enumerative study is one in which action will be taken on the universe that was studied. An analytical study is one in which action will be taken on a cause system to improve the future performance of the system of interest. The aim of an enumerative study is estimation, while an analytical study focuses on prediction. Because of the temporal nature of improvement, the theory and methods for analytical studies are a critical component of the science of improvement.

Introduction: enumerative and analytical studies

Designing studies that make it possible to learn from experience and take action to improve future performance is an essential element of quality improvement. These studies use the now-traditional theory established through the work of Fisher,1 Cox,2 Campbell and Stanley,3 and others that is widely used in biomedical research. These designs are used to discover new phenomena that lead to hypothesis generation, to explore causal mechanisms,4 and to evaluate efficacy and effectiveness. They include observational, retrospective, prospective, pre-experimental, quasi-experimental, blocking, factorial and time-series designs.

In addition to these classifications of studies, Deming5 drew a distinction between analytical and enumerative studies that has proven fundamental to the science of improvement. Deming based his insight on a distinction between these two approaches that Walter Shewhart had made in 1939 as he helped develop measurement strategies for the then-emerging science of 'quality control.'6 The difference between the two concepts lies in the intended extrapolation of the results and in the target for action based on the inferences that are drawn.

A useful way to appreciate that difference is to contrast the inferences that can be made about the water sampled from two different natural sources ( figure 1 ). The enumerative approach is like the study of water from a pond. Because conditions in the bounded universe of the pond are essentially static over time, analyses of random samples taken from the pond at a given time can be used to estimate the makeup of the entire pond. Statistical methods, such as hypothesis testing and CIs, can be used to make decisions and define the precision of the estimates.
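The pond analogy can be made concrete: given a random sample from a fixed, bounded frame, standard statistics estimate the frame's mean and attach a confidence interval. The measurements below are hypothetical.

```python
import math
import statistics

# Hypothetical concentrations (mg/L) from a random sample of the pond.
sample = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.3, 3.7, 4.1, 4.0]

n = len(sample)
mean = statistics.mean(sample)
se = statistics.stdev(sample) / math.sqrt(n)  # standard error of the mean

# 95% CI using the t critical value for n - 1 = 9 degrees of freedom.
t_crit = 2.262
ci_low, ci_high = mean - t_crit * se, mean + t_crit * se

# Because the frame is static, this interval estimates the whole pond.
print(f"estimate {mean:.2f} mg/L, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

The inference is valid for the pond that was sampled; nothing in the calculation speaks to what the pond will contain next year.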

Figure 1: Environment in enumerative and analytical studies. Internal validity diagram from Fletcher et al.7

The analytical approach, in contrast, is like the study of water from a river. The river is constantly moving, and its physical properties are changing (eg, due to snow melt, changes in rainfall, dumping of pollutants). The properties of water in a sample from the river at any given time may not describe the river after the samples are taken and analysed. In fact, without repeated sampling over time, it is difficult to make predictions about water quality, since the river will not be the same river in the future as it was at the time of the sampling.

Deming first discussed these concepts in a 1942 paper,8 as well as in his 1950 textbook,9 and in a 1975 paper used the enumerative/analytical terminology to characterise specific study designs.5 While most books on experimental design describe methods for the design and analysis of enumerative studies, Moen et al10 describe methods for designing and learning from analytical studies. These methods are graphical and focus on prediction of future performance. The concept of analytical studies became a key element in Deming's 'system of profound knowledge' that serves as the intellectual foundation for improvement science.11 The knowledge framework for the science of improvement, which combines elements of psychology, the Shewhart view of variation, the concept of systems, and the theory of knowledge, informs a number of key principles for the design and analysis of improvement studies:

  • Knowledge about improvement begins and ends in experimental data but does not end in the data in which it begins.
  • Observations, by themselves, do not constitute knowledge.
  • Prediction requires theory regarding mechanisms of change and understanding of context.
  • Random sampling from a population or universe (assumed by most statistical methods) is not possible when the population of interest is in the future.
  • The conditions during studies for improvement will be different from the conditions under which the results will be used. The major source of uncertainty concerning their use is the difficulty of extrapolating study results to different contexts and under different conditions in the future.
  • The wider the range of conditions included in an improvement study, the greater the degree of belief in the validity and generalisation of the conclusions.

The classification of studies into enumerative and analytical categories depends on the intended target for action as the result of the study:

  • Enumerative studies assume that when actions are taken as the result of a study, they will be taken on the material in the study population or ‘frame’ that was sampled.

More specifically, the study universe in an enumerative study is the bounded group of items (eg, patients, clinics, providers, etc) possessing certain properties of interest. The universe is defined by a frame, a list of identifiable, tangible units that may be sampled and studied. Random selection methods are assumed in the statistical methods used for estimation, decision-making and drawing inferences in enumerative studies. Their aim is estimation about some aspect of the frame (such as a description, comparison or the existence of a cause–effect relationship) and the resulting actions taken on this particular frame. One feature of an enumerative study is that a 100% sample of the frame provides the complete answer to the questions posed by the study (given the methods of investigation and measurement). Statistical methods such as hypothesis tests, CIs and probability statements are appropriate to analyse and report data from enumerative studies. Estimating the infection rate in an intensive care unit for the last month is an example of a simple enumerative study.

  • Analytical studies assume that the actions taken as a result of the study will be on the process or causal system that produced the frame studied, rather than the initial frame itself. The aim is to improve future performance.

In contrast to enumerative studies, an analytical study accepts as a given that when actions are taken on a system based on the results of a study, the conditions in that system will inevitably have changed. The aim of an analytical study is to enable prediction about how a change in a system will affect that system's future performance, or prediction as to which plans or strategies for future action on the system will be superior. For example, the task may be to choose among several different treatments for future patients, methods of collecting information or procedures for cleaning an operating room. Because the population of interest is open and continually shifts over time, random samples from that population cannot be obtained in analytical studies, and traditional statistical methods are therefore not useful. Rather, graphical methods of analysis and summary of the repeated samples reveal the trajectory of system behaviour over time, making it possible to predict future behaviour. Use of a Shewhart control chart to monitor and create learning to reduce infection rates in an intensive care unit is an example of a simple analytical study.
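The arithmetic behind a simple Shewhart chart (an individuals, or XmR, chart; all data here hypothetical) can be sketched as follows: the centre line is the mean of the repeated samples, and the control limits sit at 2.66 times the mean moving range on either side of it.

```python
# Hypothetical monthly infection rates per 1000 patient-days.
rates = [3.2, 2.8, 3.5, 3.0, 2.9, 3.3, 3.1, 2.7, 3.4, 3.0, 2.8, 3.2]

center = sum(rates) / len(rates)

# Mean moving range between consecutive observations.
moving_ranges = [abs(b - a) for a, b in zip(rates, rates[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard individuals-chart limits: center +/- 2.66 * mean moving range.
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

# Points outside the limits signal special causes worth investigating;
# a stable chart supports prediction of future performance.
special_causes = [i for i, r in enumerate(rates) if not lcl <= r <= ucl]
print(f"center {center:.2f}, limits ({lcl:.2f}, {ucl:.2f}), "
      f"special causes at months {special_causes}")
```

A chart with no special causes, as here, gives a rational basis for predicting that next month's rate will fall within the same limits, provided the underlying system is not changed.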

The following scenarios give examples to clarify the nature of these two types of studies.

Scenario 1: enumerative study—observation

To estimate how many days it takes new patients to see all primary care physicians contracted with a health plan, a researcher selected a random sample of 150 such physicians from the current active list and called each of their offices to schedule an appointment. The time to the next available appointment ranged from 0 to 180 days, with a mean of 38 days (95% CI 35.6 to 39.6).

This is an enumerative study, since results are intended to be used to estimate the waiting time for appointments with the plan's current population of primary care physicians.

Scenario 2: enumerative study—hypothesis generation

The researcher in scenario 1 noted that on occasion, she was offered an earlier visit with a nurse practitioner (NP) who worked with the physician being called. Additional information revealed that 20 of the 150 physicians in the study worked with one or more NPs. The next available appointment for the 130 physicians without an NP averaged 41 days (95% CI 39 to 43 days) and was 18 days (95% CI 18 to 26 days) for the 20 practices with NPs, a difference of 23 days (a 56% shorter mean waiting time).

This subgroup analysis suggested that the involvement of NPs helps to shorten waiting times, although it does not establish a cause–effect relationship; that is, it was a 'hypothesis-generating' study. In any event, this was clearly an enumerative study, since its results were used to understand the impact of NPs on waiting times in this particular population of practices. Its results suggested that NPs might influence waiting times, but only for practices in this health plan during the time of the study. The study treated the conditions in the health plan as static, like those in a pond.

Scenario 3: enumerative study—comparison

To find out if administrative changes in a health plan had increased member satisfaction with access to care, the customer service manager replicated a phone survey he had conducted a year previously, using a random sample of 300 members. The percentage of members who were satisfied with access had increased from 48.7% to 60.7% (Fisher exact test, p<0.004).

This enumerative comparison study was used to estimate the impact of the improvement work during the last year on the members in the plan. Attributing the increase in satisfaction to the improvement work assumes that other conditions in the study frame were static.
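A Fisher exact test like the one in scenario 3 can be computed from the hypergeometric distribution using only the standard library. The counts below are reconstructed from the reported percentages (146/300 is approximately 48.7%, 182/300 approximately 60.7%) and are therefore approximate.

```python
from math import comb

# Reconstructed (approximate) counts: satisfied vs not, by survey year.
a, b = 146, 154   # last year: satisfied, not satisfied
c, d = 182, 118   # this year: satisfied, not satisfied

row1, row2 = a + b, c + d   # 300 members per survey
col1 = a + c                # total satisfied across both surveys

def pmf(k):
    """Hypergeometric probability of k satisfied in last year's survey,
    with all table margins held fixed."""
    return comb(row1, k) * comb(row2, col1 - k) / comb(row1 + row2, col1)

p_obs = pmf(a)

# Two-sided Fisher exact p-value: sum the probabilities of all tables
# no more likely than the observed one.
lo, hi = max(0, col1 - row2), min(row1, col1)
p_value = sum(p for k in range(lo, hi + 1)
              if (p := pmf(k)) <= p_obs * (1 + 1e-9))

print(f"two-sided Fisher exact p = {p_value:.4f}")
```

With these reconstructed counts the p-value comes out near the p<0.004 the scenario reports, leading to the same conclusion about the increase in satisfaction.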

Scenario 4: analytical study—learning with a Shewhart chart

Each primary care clinic in a health plan reported its ‘time until the third available appointment’ twice a month, which allowed the quality manager to plot the mean waiting time for all of the clinics on Shewhart charts. Waiting times had been stable for a 12-month period through August, but the manager then noted a special cause (increase in waiting time) in September. On stratifying the data by region, she found that the special cause resulted from increases in waiting time in the Northeast region. Discussion with the regional manager revealed a shortage of primary care physicians in this region, which was predicted to become worse in the next quarter. Making some temporary assignments and increasing physician recruiting efforts resulted in stabilisation of this measure.

Documenting common and special cause variation in measures of interest through the use of Shewhart charts and run charts based on judgement samples is probably the simplest and commonest type of analytical study in healthcare. Such charts, when stable, provide a rational basis for predicting future performance.

Scenario 5: analytical study—establishing a cause–effect relationship

The researcher mentioned in scenarios 1 and 2 planned a study to test the existence of a cause–effect relationship between the inclusion of NPs in primary care offices and waiting time for new patient appointments. The variation in patient characteristics in this health plan appeared to be great enough to make the study results useful to other organisations. For the study, she recruited about 100 of the plan's practices that currently did not use NPs, and obtained funding to facilitate hiring NPs in up to 50 of those practices.

The researcher first explored the theories on mechanisms by which the incorporation of NPs into primary care clinics could reduce waiting times. Using important contextual variables relevant to these mechanisms (practice size, complexity, use of information technology and urban vs rural location), she then developed a randomised block, time-series study design. The study had the power to detect an effect of a mean waiting time of 5 days or more overall, and 10 days for the major subgroups defined by levels of the contextual variables. Since the baseline waiting time for appointments varied substantially across practices, she used the baseline as a covariate.
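The blocking step in this design can be sketched as follows: practices are grouped into blocks defined by the contextual variables, and randomisation to intervention or control happens within each block. All names, values, and counts here are hypothetical.

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical recruited practices with two contextual variables.
practices = [
    {"id": i,
     "size": random.choice(["small", "large"]),
     "location": random.choice(["urban", "rural"])}
    for i in range(100)
]

# Group practices into blocks defined by the contextual variables.
blocks = {}
for p in practices:
    blocks.setdefault((p["size"], p["location"]), []).append(p)

# Randomise within each block: half the block hires an NP, half is control.
assignment = {}
for members in blocks.values():
    random.shuffle(members)
    half = len(members) // 2
    for p in members[:half]:
        assignment[p["id"]] = "NP"
    for p in members[half:]:
        assignment[p["id"]] = "control"

n_np = sum(1 for arm in assignment.values() if arm == "NP")
print(f"{n_np} practices assigned to hire NPs, {100 - n_np} to control")
```

Randomising within blocks keeps the comparison of waiting times balanced across the contextual variables, which is what makes the subgroup analyses described in this scenario possible.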

After completing the study, she analysed data from baseline and postintervention periods using stratified run charts and Shewhart charts, including the raw measures and measures adjusted for important covariates and effects of contextual variables. Overall waiting times decreased 12 days more in practices that included NPs than they did in control practices. Importantly, the subgroup analyses according to contextual variables revealed conditions under which the use of NPs would not be predicted to lead to reductions in waiting times. For example, practices with short baseline waiting times showed little or no improvement by employing NPs. She published the results in a leading health research journal.

This was an analytical study because the intent was to apply the learning from the study to future staffing plans in the health plan. She also published the study, so its results would be useful to primary care practices outside the health plan.

Scenario 6: analytical study—implementing improvement

The quality-improvement manager in another health plan wanted to expand the use of NPs in the plan's primary care practices, because published research had shown a reduction in waiting times for practices with NPs. Two practices in his plan already employed NPs. In one of these practices, Shewhart charts of waiting time by month showed a stable process averaging 10 days during the last 2 years. Waiting time averaged less than 7 days in the second practice, but a period when one of the physicians left the practice was associated with special causes.

The quality manager created a collaborative among the plan's primary care practices to learn how to optimise the use of NPs. Physicians in the two sites that employed NPs served as subject matter experts for the collaborative. In addition to making NPs part of their care teams, participating practices monitored appointment supply and demand, and tested other changes designed to optimise response to patient needs. Thirty sites in the plan voluntarily joined the collaborative and hired NPs. After 6 months, Shewhart charts indicated that waiting times in 25 of the 30 sites had been reduced to less than 7 days. Because waiting times in these practices had been stable over a considerable period of time, the manager predicted that future patients would continue to experience reduced times for appointments. The quality manager began to focus on a follow-up collaborative among the backlog of 70 practices that wanted to join.

This project was clearly an analytical study, since its aim was specifically to improve future waiting-time performance for participating sites and other primary care offices in the plan. Moreover, it focused on learning about the mechanisms through which contextual factors affect the impact of NPs on primary care office functions, under practice conditions that (like those in a river) will inevitably change over time.

Statistical theory in enumerative studies is used to describe the precision of estimates and the validity of hypotheses for the population studied. But since these statistical methods provide no support for extrapolation of the results outside the population that was studied, the subject experts must rely on their understanding of the mechanisms in place to extend results outside the population.

In analytical studies, the standard error of a statistic does not address the most important source of uncertainty, namely, the change in study conditions in the future. Although analytical studies need to take into account the uncertainty due to sampling, as in enumerative studies, the attributes of the study design and analysis of the data primarily deal with the uncertainty resulting from extrapolation to the future (generalisation to the conditions in future time periods). The methods used in analytical studies encourage the exploration of mechanisms through multifactor designs, contextual variables introduced through blocking, and replication over time.

Prior stability of a system (as observed in graphic displays of repeated sampling over time, according to Shewhart's methods) increases belief in the results of an analytical study, but stable processes in the past do not guarantee constant system behaviour in the future. The next data point from the future is the most important on a graph of performance. Extrapolation of system behaviour to future times therefore still depends on input from subject experts who are familiar with mechanisms of the system of interest, as well as the important contextual issues. Generalisation is inherently difficult in all studies because ‘whereas the problems of internal validity are solvable within the limits of the logic of probability statistics, the problems of external validity are not logically solvable in any neat, conclusive way’ 3 (p. 17).

The diverse activities commonly referred to as healthcare improvement 12 are all designed to change the behaviour of systems over time, as reflected in the principle that ‘not all change is improvement, but all improvement is change.’ The conditions in the unbounded systems into which improvement interventions are introduced will therefore be different in the future from those in effect at the time the intervention is studied. Since the results of improvement studies are used to predict future system behaviour, such studies clearly belong to the Deming category of analytical studies. Quality improvement studies therefore need to incorporate repeated measurements over time, as well as testing under a wide range of conditions (2, 3 and 10). The ‘gold standard’ of analytical studies is satisfactory prediction over time.

Conclusions and recommendations

In light of these considerations, some important principles for drawing inferences from improvement studies include10:

  • The analysis of data, interpretation of that analysis and actions taken as a result of the study should be closely tied to the current knowledge of experts about mechanisms of change in the relevant area. They can often use the study to discover, understand and evaluate the underlying mechanisms.
  • The conditions of the study will be different from the future conditions under which the results will be used. Assessment by experts of the magnitude of this difference and its potential impact on future events should be an integral part of the interpretation of the results of the intervention.
  • Show all the data before aggregation or summary.
  • Plot the outcome data in the order in which the tests of change were conducted and annotate with information on the interventions.
  • Use graphical displays to assess how much of the variation in the data can be explained by factors that were deliberately changed.
  • Rearrange and subgroup the data to study other sources of variation (background and contextual variables).
  • Summarise the results of the study with appropriate graphical displays.

Because these principles reflect the fundamental nature of improvement—taking action to change performance, over time, and under changing conditions—their application helps to bring clarity and rigour to improvement science.

Acknowledgments

The author is grateful to F Davidoff and P Batalden for their input to earlier versions of this paper.

Competing interests: None.

Provenance and peer review: Not commissioned; externally peer reviewed.

Writing theoretical frameworks, analytical frameworks and conceptual frameworks

Three of the most challenging concepts for me to explain are the interrelated ideas of a theoretical framework, a conceptual framework, and an analytical framework. All three of these tend to be used interchangeably. While I find these concepts somewhat fuzzy and I struggle sometimes to explain the differences between them and clarify their usage for my students (and clearly I am not alone in this challenge), this blog post is an attempt to help discern these analytical categories more clearly.

A lot of people (my own students included) have asked me whether the theoretical framework is their literature review. That's actually not the case. A theoretical framework, the way I define it, comprises the different theories and theoretical constructs that help explain a phenomenon. A theoretical framework sets out the various expectations that a theory posits, how they would apply to a specific case under analysis, and how one would use the theory to explain a particular phenomenon. I like how theoretical frameworks are defined in this blog post. Dr. Cyrus Samii offers an explanation of what a good theoretical framework does for students.

For example, you can use framing theory to help you explain how different actors perceive the world. Your theoretical framework may be based on theories of framing, but it can also include others. For example, in this paper, Zeitoun and Allan explain their theoretical framework, aptly named hydro-hegemony. In doing so, they explain the role of each theoretical construct (Power, Hydro-Hegemony, Political Economy) and how each applies to transboundary water conflict. Another good example of a theoretical framework is the one posited by Dr. Michael J. Bloomfield in his book Dirty Gold, as I mention in this tweet:

In Chapter 2, @mj_bloomfield nicely sets his theoretical framework borrowing from sociology, IR, and business-strategy scholarship pic.twitter.com/jTGF4PPymn — Dr Raul Pacheco-Vega (@raulpacheco) December 24, 2017

An analytical framework is, the way I see it, a model that helps explain how a certain type of analysis will be conducted. For example, in this paper, Franks and Cleaver develop an analytical framework that draws on scholarship on poverty measurement to help us understand how water governance and poverty are interrelated. Other authors describe an analytical framework as a "conceptual framework that helps analyse particular phenomena", as posited here (an ungated version can be read here).

I think it’s easy to conflate analytical frameworks with theoretical and conceptual ones because of the way in which concepts, theories and ideas are harnessed to explain a phenomenon. But I believe the most important element of an analytical framework is instrumental: its purpose is to help undertake analyses. You use the elements of an analytical framework to deconstruct a specific concept, set of concepts, or phenomenon. For example, in this paper, Bodde et al develop an analytical framework to characterise sources of uncertainty in strategic environmental assessments.

A robust conceptual framework describes the different concepts one would need to know to understand a particular phenomenon, without pretending to create causal links across variables and outcomes. In my view, theoretical frameworks set expectations, because theories are constructs that help explain relationships between variables and specific outcomes and responses. Conceptual frameworks, the way I see them, are like lenses through which you can see a particular phenomenon.

A conceptual framework should serve to illuminate and clarify fuzzy ideas, and to fill lacunae. Viewed this way, a conceptual framework offers insight that would not otherwise be gained without a more profound understanding of the concepts explained in the framework. For example, in this article, Beck offers social movement theory as a conceptual framework that can help us understand terrorism. As I explained in my metaphor above, social movement theory is the lens through which you see terrorism, and you get a clearer understanding of how it operates precisely because you used this particular theory.

Dan Kaminsky offered a really interesting explanation connecting these topics to time; read his tweet below.

I think this maps to time. Theoretical frameworks talk about how we got here. Conceptual frameworks discuss what we have. Analytical frameworks discuss where we can go with this. See also legislative/executive/judicial. — Dan Kaminsky (@dakami) September 28, 2018

One of my CIDE students, Andres Ruiz, reminded me of this article on conceptual frameworks in the International Journal of Qualitative Methods. I’ll also be adding resources as I get them via Twitter or email. Hopefully this blog post will help clarify this idea!


By Raul Pacheco-Vega – September 28, 2018


Descriptive Analytics – Methods, Tools and Examples
Definition:

Descriptive analytics focuses on describing or summarizing raw data and making it interpretable. This type of analytics provides insight into what has happened in the past. It involves the analysis of historical data to identify patterns, trends, and insights. Descriptive analytics often uses visualization tools to represent the data in a way that is easy to interpret.

Descriptive Analytics in Research

Descriptive analytics plays a crucial role in research, helping investigators understand and describe the data collected in their studies. Here’s how descriptive analytics is typically used in a research setting:

  • Descriptive Statistics: In research, descriptive analytics often takes the form of descriptive statistics . This includes calculating measures of central tendency (like mean, median, and mode), measures of dispersion (like range, variance, and standard deviation), and measures of frequency (like counts and percentages). These calculations help researchers summarize and understand their data.
  • Visualizing Data: Descriptive analytics also involves creating visual representations of data to better understand and communicate research findings . This might involve creating bar graphs, line graphs, pie charts, scatter plots, box plots, and other visualizations.
  • Exploratory Data Analysis: Before conducting any formal statistical tests, researchers often conduct an exploratory data analysis, which is a form of descriptive analytics. This might involve looking at distributions of variables, checking for outliers, and exploring relationships between variables.
  • Initial Findings: Descriptive analytics are often reported in the results section of a research study to provide readers with an overview of the data. For example, a researcher might report average scores, demographic breakdowns, or the percentage of participants who endorsed each response on a survey.
  • Establishing Patterns and Relationships: Descriptive analytics helps in identifying patterns, trends, or relationships in the data, which can guide subsequent analysis or future research. For instance, researchers might look at the correlation between variables as a part of descriptive analytics.

Descriptive Analytics Techniques

Descriptive analytics involves a variety of techniques to summarize, interpret, and visualize historical data. Some commonly used techniques include:

Statistical Analysis

This includes basic statistical methods like mean, median, mode (central tendency), standard deviation, variance (dispersion), correlation, and regression (relationships between variables).
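As a minimal sketch using only the Python standard library (the data values are invented for illustration), these measures can be computed with the statistics module:

```python
# Basic descriptive statistics with the Python standard library.
# The data values below are invented for illustration.
import statistics

values = [2, 4, 4, 4, 5, 5, 7, 9]

mean = statistics.mean(values)           # central tendency: 5
median = statistics.median(values)       # middle value: 4.5
mode = statistics.mode(values)           # most frequent value: 4
variance = statistics.pvariance(values)  # population variance: 4
stdev = statistics.pstdev(values)        # population standard deviation: 2.0
```

Note that statistics.pvariance/pstdev divide by n (population formulas), while statistics.variance/stdev divide by n - 1 (sample formulas); which one a study reports should be stated explicitly.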

Data Aggregation

It is the process of compiling and summarizing data to obtain a general perspective. It can involve methods like sum, count, average, min, max, etc., often applied to a group of data.
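A hedged stdlib-only sketch of group-wise aggregation (the sales records and region names are invented):

```python
# Data aggregation: group-wise sum, count, and average.
# The sales records below are invented for illustration.
from collections import defaultdict

sales = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "North", "amount": 150},
    {"region": "South", "amount": 50},
]

totals = defaultdict(float)  # sum of amounts per group
counts = defaultdict(int)    # number of records per group
for row in sales:
    totals[row["region"]] += row["amount"]
    counts[row["region"]] += 1

averages = {region: totals[region] / counts[region] for region in totals}
# totals   -> {"North": 250.0, "South": 300.0}
# averages -> {"North": 125.0, "South": 150.0}
```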

Data Mining

This involves analyzing large volumes of data to discover patterns, trends, and insights. Techniques used in data mining can include clustering (grouping similar data), classification (assigning data into categories), association rules (finding relationships between variables), and anomaly detection (identifying outliers).

Data Visualization

This involves presenting data in a graphical or pictorial format to provide clear and easy understanding of the data patterns, trends, and insights. Common data visualization methods include bar charts, line graphs, pie charts, scatter plots, histograms, and more complex forms like heat maps and interactive dashboards.

Reporting

This involves organizing data into informational summaries to monitor how different areas of a business are performing. Reports can be generated manually or automatically and can be presented in tables, graphs, or dashboards.

Cross-tabulation (or Pivot Tables)

It involves displaying the relationship between two or more variables in a tabular form. It can provide a deeper understanding of the data by allowing comparisons and revealing patterns and correlations that may not be readily apparent in raw data.
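As a minimal sketch, a cross-tabulation can be built with the standard library's Counter (the gender/job-type records are invented):

```python
# Cross-tabulation (contingency table) built with collections.Counter.
# The records below are invented for illustration.
from collections import Counter

records = [
    ("female", "skilled"), ("female", "unskilled"),
    ("male", "skilled"), ("male", "skilled"),
    ("female", "skilled"), ("male", "unskilled"),
]

crosstab = Counter(records)  # cell counts keyed by (row value, column value)
# crosstab[("female", "skilled")] -> 2
# crosstab[("male", "skilled")]   -> 2
```

In practice a library such as pandas (crosstab or pivot_table) would produce the same table with row and column totals included.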

Descriptive Modeling

Some techniques use complex algorithms to interpret data. Examples include decision tree analysis, which provides a graphical representation of decision-making situations, and neural networks, which are used to identify correlations and patterns in large data sets.

Descriptive Analytics Tools

Some common Descriptive Analytics Tools are as follows:

Excel: Microsoft Excel is a widely used tool that can be used for simple descriptive analytics. It has powerful statistical and data visualization capabilities. Pivot tables are a particularly useful feature for summarizing and analyzing large data sets.

Tableau: Tableau is a data visualization tool that is used to represent data in a graphical or pictorial format. It can handle large data sets and allows for real-time data analysis.

Power BI: Power BI, another product from Microsoft, is a business analytics tool that provides interactive visualizations with self-service business intelligence capabilities.

QlikView: QlikView is a data visualization and discovery tool. It allows users to analyze data and use this data to support decision-making.

SAS: SAS is a software suite that can mine, alter, manage and retrieve data from a variety of sources and perform statistical analysis on it.

SPSS: SPSS (Statistical Package for the Social Sciences) is a software package used for statistical analysis. It’s widely used in social sciences research but also in other industries.

Google Analytics: For web data, Google Analytics is a popular tool. It allows businesses to analyze in-depth detail about the visitors on their website, providing valuable insights that can help shape the success strategy of a business.

R and Python: Both are programming languages that have robust capabilities for statistical analysis and data visualization. With packages like pandas, matplotlib, seaborn in Python and ggplot2, dplyr in R, these languages are powerful tools for descriptive analytics.

Looker: Looker is a modern data platform that can take data from any database and let you start exploring and visualizing.

When to use Descriptive Analytics

Descriptive analytics forms the base of the data analysis workflow and is typically the first step in understanding your business or organization’s data. Here are some situations when you might use descriptive analytics:

Understanding Past Behavior: Descriptive analytics is essential for understanding what has happened in the past. If you need to understand past sales trends, customer behavior, or operational performance, descriptive analytics is the tool you’d use.

Reporting Key Metrics: Descriptive analytics is used to establish and report key performance indicators (KPIs). It can help in tracking and presenting these KPIs in dashboards or regular reports.

Identifying Patterns and Trends: If you need to identify patterns or trends in your data, descriptive analytics can provide these insights. This might include identifying seasonality in sales data, understanding peak operational times, or spotting trends in customer behavior.

Informing Business Decisions: The insights provided by descriptive analytics can inform business strategy and decision-making. By understanding what has happened in the past, you can make more informed decisions about what steps to take in the future.

Benchmarking Performance: Descriptive analytics can be used to compare current performance against historical data. This can be used for benchmarking and setting performance goals.

Auditing and Regulatory Compliance: In sectors where compliance and auditing are essential, descriptive analytics can provide the necessary data and trends over specific periods.

Initial Data Exploration: When you first acquire a dataset, descriptive analytics is useful to understand the structure of the data, the relationships between variables, and any apparent anomalies or outliers.

Examples of Descriptive Analytics

Examples of Descriptive Analytics are as follows:

Retail Industry: A retail company might use descriptive analytics to analyze sales data from the past year. They could break down sales by month to identify any seasonality trends. For example, they might find that sales increase in November and December due to holiday shopping. They could also break down sales by product to identify which items are the most popular. This analysis could inform their purchasing and stocking decisions for the next year. Additionally, data on customer demographics could be analyzed to understand who their primary customers are, guiding their marketing strategies.

Healthcare Industry: In healthcare, descriptive analytics could be used to analyze patient data over time. For instance, a hospital might analyze data on patient admissions to identify trends in admission rates. They might find that admissions for certain conditions are higher at certain times of the year. This could help them allocate resources more effectively. Also, analyzing patient outcomes data can help identify the most effective treatments or highlight areas where improvement is needed.

Finance Industry: A financial firm might use descriptive analytics to analyze historical market data. They could look at trends in stock prices, trading volume, or economic indicators to inform their investment decisions. For example, analyzing the price-earnings ratios of stocks in a certain sector over time could reveal patterns that suggest whether the sector is currently overvalued or undervalued. Similarly, credit card companies can analyze transaction data to detect any unusual patterns, which could be signs of fraud.

Advantages of Descriptive Analytics

Descriptive analytics plays a vital role in the world of data analysis, providing numerous advantages:

  • Understanding the Past: Descriptive analytics provides an understanding of what has happened in the past, offering valuable context for future decision-making.
  • Data Summarization: Descriptive analytics is used to simplify and summarize complex datasets, which can make the information more understandable and accessible.
  • Identifying Patterns and Trends: With descriptive analytics, organizations can identify patterns, trends, and correlations in their data, which can provide valuable insights.
  • Inform Decision-Making: The insights generated through descriptive analytics can inform strategic decisions and help organizations to react more quickly to events or changes in behavior.
  • Basis for Further Analysis: Descriptive analytics lays the groundwork for further analytical activities. It’s the first necessary step before moving on to more advanced forms of analytics like predictive analytics (forecasting future events) or prescriptive analytics (advising on possible outcomes).
  • Performance Evaluation: It allows organizations to evaluate their performance by comparing current results with past results, enabling them to see where improvements have been made and where further improvements can be targeted.
  • Enhanced Reporting and Dashboards: Through the use of visualization techniques, descriptive analytics can improve the quality of reports and dashboards, making the data more understandable and easier to interpret for stakeholders at all levels of the organization.
  • Immediate Value: Unlike some other types of analytics, descriptive analytics can provide immediate insights, as it doesn’t require complex models or deep analytical capabilities to provide value.

Disadvantages of Descriptive Analytics

While descriptive analytics offers numerous benefits, it also has certain limitations or disadvantages. Here are a few to consider:

  • Limited to Past Data: Descriptive analytics primarily deals with historical data and provides insights about past events. It does not predict future events or trends and can’t help you understand possible future outcomes on its own.
  • Lack of Deep Insights: While descriptive analytics helps in identifying what happened, it does not answer why it happened. For deeper insights, you would need to use diagnostic analytics, which analyzes data to understand the root cause of a particular outcome.
  • Can Be Misleading: If not properly executed, descriptive analytics can sometimes lead to incorrect conclusions. For example, correlation does not imply causation, but descriptive analytics might tempt one to make such an inference.
  • Data Quality Issues: The accuracy and usefulness of descriptive analytics are heavily reliant on the quality of the underlying data. If the data is incomplete, incorrect, or biased, the results of the descriptive analytics will be too.
  • Over-reliance on Descriptive Analytics: Businesses may rely too much on descriptive analytics and not enough on predictive and prescriptive analytics. While understanding past and present data is important, it’s equally vital to forecast future trends and make data-driven decisions based on those predictions.
  • Doesn’t Provide Actionable Insights: Descriptive analytics is used to interpret historical data and identify patterns and trends, but it doesn’t provide recommendations or courses of action. For that, prescriptive analytics is needed.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


Child Care and Early Education Research Connections

Data Analysis

This page describes the statistics and methods used to describe the characteristics of the members of a sample or population, to explore the relationships between variables, to test research hypotheses, and to visually represent data. Terms relating to the topics covered are defined in the  Research Glossary .

  • Descriptive Statistics
  • Tests of Significance
  • Graphical/Pictorial Methods
  • Analytical Techniques

Descriptive statistics can be useful for two purposes:

To provide basic information about the characteristics of a sample or population. These characteristics are represented by variables in a research study dataset.

To highlight potential relationships between these characteristics, or the relationships among the variables in the dataset.

The four most common descriptive statistics are:

  • Proportions, Percentages and Ratios
  • Measures of Central Tendency
  • Measures of Dispersion
  • Measures of Association

One of the most basic ways of describing the characteristics of a sample or population is to classify its individual members into mutually exclusive categories and count the number of cases in each category. In research, variables with discrete, qualitative categories are called nominal or categorical variables. The categories can be given numerical codes, but they cannot be ranked, added, or multiplied. Examples of nominal variables include gender (male, female), preschool program attendance (yes, no), and race/ethnicity (White, African American, Hispanic, Asian, American Indian). Researchers calculate proportions, percentages and ratios in order to summarize the data from nominal or categorical variables and to allow for comparisons to be made between groups.

Proportion —The number of cases in a category divided by the total number of cases across all categories of a variable.

Percentage —The proportion multiplied by 100 (or the number of cases in a category divided by the total number of cases across all categories of a value times 100).

Ratio —The number of cases in one category to the number of cases in a second category.

A researcher selects a sample of 100 students from a Head Start program. The sample includes 20 White children, 30 African American children, 40 Hispanic children and 10 children of mixed-race/ethnicity.

Proportion of Hispanic children in the program = 40 / (20+30+40+10) = .40.

Percentage of Hispanic children in the program = .40 x 100 = 40%.

Ratio of Hispanic children to White children in the program = 40/20 = 2.0, or the ratio of Hispanic to White children enrolled in the Head Start program is 2 to 1.
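The worked example above translates directly into code:

```python
# The Head Start sample example above, computed directly.
counts = {"White": 20, "African American": 30, "Hispanic": 40, "Mixed": 10}

total = sum(counts.values())                  # 100 children in the sample
proportion = counts["Hispanic"] / total       # 0.40
percentage = proportion * 100                 # 40.0
ratio = counts["Hispanic"] / counts["White"]  # 2.0, i.e., 2 to 1
```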

Proportions, percentages and ratios are used to summarize the characteristics of a sample or population that fall into discrete categories. Measures of central tendency are the most basic and, often, the most informative description of a population's characteristics, when those characteristics are measured using an interval scale. The values of an interval variable are ordered, the distance between any two adjacent values is the same, and the zero point is arbitrary. Values on an interval scale can be added and subtracted. Examples of interval scales or interval variables include household income, years of schooling, hours a child spends in child care and the cost of child care.

Measures of central tendency describe the "average" member of the sample or population of interest. There are three measures of central tendency:

Mean —The arithmetic average of the values of a variable. To calculate the mean, all the values of a variable are summed and divided by the total number of cases.

Median —The value within a set of values that divides the values in half (i.e. 50% of the variable's values lie above the median, and 50% lie below the median).

Mode —The value of a variable that occurs most often.

The annual incomes of five randomly selected people in the United States are $10,000, $10,000, $45,000, $60,000, and $1,000,000.

Mean Income = (10,000 + 10,000 + 45,000 + 60,000 + 1,000,000) / 5 = $225,000.

Median Income = $45,000.

Modal Income = $10,000.
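The income example above, computed with Python's statistics module:

```python
# The five-person income example above.
import statistics

incomes = [10_000, 10_000, 45_000, 60_000, 1_000_000]

mean_income = statistics.mean(incomes)      # 225,000: pulled up by the outlier
median_income = statistics.median(incomes)  # 45,000: robust to the outlier
modal_income = statistics.mode(incomes)     # 10,000: the most frequent value
```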

The mean is the most commonly used measure of central tendency. Medians are generally used when a few values are extremely different from the rest of the values (this is called a skewed distribution). For example, the median income is often the best measure of the average income because, while most individuals earn between $0 and $200,000 annually, a handful of individuals earn millions.

Measures of dispersion provide information about the spread of a variable's values. There are three key measures of dispersion:

Range  is simply the difference between the smallest and largest values in the data. Researchers often simply report the smallest and largest values themselves (e.g., 75 – 100).

Variance  is a commonly used measure of dispersion, or how spread out a set of values are around the mean. It is calculated by taking the average of the squared differences between each value and the mean. The variance is the standard deviation squared.

Standard deviation , like variance, is a measure of the spread of a set of values around the mean of the values. The wider the spread, the greater the standard deviation and the greater the range of the values from their mean. A small standard deviation indicates that most of the values are close to the mean. A large standard deviation on the other hand indicates that the values are more spread out. The standard deviation is the square root of the variance.

Five randomly selected children were administered a standardized reading assessment. Their scores on the assessment were 50, 50, 60, 75, and 90, with a mean score of 65.

Range = 90 - 50 = 40.

Variance = [(50 - 65)² + (50 - 65)² + (60 - 65)² + (75 - 65)² + (90 - 65)²] / (5 - 1) = 1,200 / 4 = 300 (using the sample variance formula, which divides by n - 1).

Standard Deviation = Square Root (300) = 17.32.
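The reading-score example can be checked with the statistics module, whose variance and stdev functions use the sample (n - 1) formulas:

```python
# The reading-score example: range, sample variance, sample standard deviation.
import statistics

scores = [50, 50, 60, 75, 90]

value_range = max(scores) - min(scores)  # 90 - 50 = 40
variance = statistics.variance(scores)   # 1,200 / 4 = 300
stdev = statistics.stdev(scores)         # sqrt(300), approximately 17.32
```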

Skewness and Kurtosis

The range, variance and standard deviation are measures of dispersion and provide information about the spread of the values of a variable. Two additional measures provide information about the shape of the distribution of values.

Skew  is a measure of whether some values of a variable are extremely different from the majority of the values. Skewness refers to the tendency of the values of a variable to depart from symmetry. A distribution is symmetric if one half of the distribution is exactly equal to the other half. For example, the distribution of annual income in the U.S. is skewed because most people make between $0 and $200,000 a year, but a handful of people earn millions. A variable is positively skewed (skewed to the right) if the extreme values are higher than the majority of values. A variable is negatively skewed (skewed to the left) if the extreme values are lower than the majority of values. In the example of students' standardized test scores, the distribution is slightly positively skewed.

Kurtosis  measures how outlier-prone a distribution is. Outliers are values of a variable that are much smaller or larger than most of the values found in a dataset. The excess kurtosis of a normal distribution (its kurtosis minus 3) is 0. If the excess kurtosis is different from 0, then the distribution produces outliers that are either more extreme (positive excess kurtosis) or less extreme (negative excess kurtosis) than those produced by the normal distribution.
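As a rough stdlib-only sketch, skew and excess kurtosis can be computed from the data's central moments. These are the simple population (moment) formulas, without the small-sample corrections some statistics packages apply:

```python
def skewness(values):
    # Third standardized moment: positive -> right skew, negative -> left skew.
    n = len(values)
    mean = sum(values) / n
    m2 = sum((x - mean) ** 2 for x in values) / n
    m3 = sum((x - mean) ** 3 for x in values) / n
    return m3 / m2 ** 1.5

def excess_kurtosis(values):
    # Fourth standardized moment minus 3; 0 for a normal distribution.
    n = len(values)
    mean = sum(values) / n
    m2 = sum((x - mean) ** 2 for x in values) / n
    m4 = sum((x - mean) ** 4 for x in values) / n
    return m4 / m2 ** 2 - 3

scores = [50, 50, 60, 75, 90]  # the reading scores used earlier
# skewness(scores) is positive, i.e., the distribution is skewed to the right
```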

Measures of association indicate whether two variables are related. Two measures are commonly used:

Chi-square test of independence

Correlation

Chi-Square test of independence  is used to evaluate whether there is an association between two variables. (The chi-square test can also be used as a measure of goodness of fit, to test if data from a sample come from a population with a specific distribution, as an alternative to Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests.)

It is most often used with nominal data (i.e., data that are put into discrete categories: e.g., gender [male, female] and type of job [unskilled, semi-skilled, skilled]) to determine whether they are associated. However, it can also be used with ordinal data.

The test assumes that the samples being compared (e.g., males, females) are independent.

It tests the null hypothesis of no association between the two variables (i.e., that type of job is not related to gender).

To test for associations, a chi-square is calculated in the following way: Suppose a researcher wants to know whether there is a relationship between gender and two types of jobs, construction worker and administrative assistant. To perform a chi-square test, the researcher counts the number of female administrative assistants, the number of female construction workers, the number of male administrative assistants, and the number of male construction workers in the data. These counts are compared with the number that would be expected in each category if there were no association between job type and gender (this expected count is based on statistical calculations). The association between the two variables is determined to be significant (the null hypothesis is rejected), if the value of the chi-square test is greater than or equal to the critical value for a given significance level (typically .05) and the degrees of freedom associated with the test found in a chi-square table. The degrees of freedom for the chi-square are calculated using the following formula:  df  = (r-1)(c-1) where r is the number of rows and c is the number of columns in a contingency or cross-tabulation table. For example, the critical value for a 2 x 2 table with 1 degree of freedom ([2-1][2-1]=1) is 3.841.
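The procedure just described can be sketched for a 2 x 2 table using only the standard library. The observed counts are invented for illustration; 3.841 is the critical value for df = 1 at the .05 significance level:

```python
# Chi-square test of independence for a 2 x 2 gender-by-job-type table.
# The observed counts below are invented for illustration.
rows, cols = ["female", "male"], ["admin", "construction"]
observed = {("female", "admin"): 40, ("female", "construction"): 10,
            ("male", "admin"): 20, ("male", "construction"): 30}

total = sum(observed.values())
row_tot = {r: sum(observed[r, c] for c in cols) for r in rows}
col_tot = {c: sum(observed[r, c] for r in rows) for c in cols}

chi_sq = 0.0
for r in rows:
    for c in cols:
        expected = row_tot[r] * col_tot[c] / total  # count expected if independent
        chi_sq += (observed[r, c] - expected) ** 2 / expected

df = (len(rows) - 1) * (len(cols) - 1)  # (r - 1)(c - 1) = 1
reject_null = chi_sq >= 3.841           # critical value for df = 1, alpha = .05
```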

Correlation coefficient  is used to measure the strength and direction of the relationship between numeric variables (e.g., weight and height).

The most common correlation coefficient is the Pearson's product-moment correlation coefficient (or simply  Pearson's r ), which can range from -1 to +1.

Values closer to 1 (either positive or negative) indicate that a stronger association exists between the two variables.

A positive coefficient (values between 0 and 1) suggests that larger values of one of the variables are accompanied by larger values of the other variable. For example, height and weight are usually positively correlated because taller people tend to weigh more.

A negative association (values between 0 and -1) suggests that larger values of one of the variables are accompanied by smaller values of the other variable. For example, age and hours slept per night are often negatively correlated because older people usually sleep fewer hours per night than younger people.

The findings reported by researchers are typically based on data collected from a single sample that was drawn from the population of interest (e.g., a sample of children selected from the population of children enrolled in Head Start or Early Head Start). If additional random samples of the same size were drawn from this population, the estimated percentages and means calculated using the data from each of these other samples might differ by chance somewhat from the estimates produced from one sample. Researchers use one of several tests to evaluate whether their findings are statistically significant.

Statistical significance refers to the probability or likelihood that the difference between groups or the relationship between variables observed in statistical analyses is not due to random chance (e.g., that differences between the average scores on a measure of language development between 3- and 4-year-olds are likely to be “real” rather than just observed in this sample by chance). If there is a very small probability that an observed difference or relationship is due to chance, the results are said to reach statistical significance. This means that the researcher concludes that there is a real difference between two groups or a real relationship between the observed variables.

Significance tests and the associated  p- value only tell us how likely it is that a statistical result (e.g., a difference between the means of two or more groups, or a correlation between two variables) is due to chance. The p-value is the probability that the results of a statistical test are due to chance. In the social and behavioral sciences, a p-value less than or equal to .05 is usually interpreted to mean that the results are statistically significant (that the statistical results would occur by chance 5 times or fewer out of 100), although sometimes researchers use a p-value of .10 to indicate whether a result is statistically significant. The lower the p-value, the less likely it is that a statistical result is due to chance, and a lower p-value is therefore a more rigorous criterion for concluding significance.

Researchers use a variety of approaches to test whether their findings are statistically significant. The choice depends on several factors, including the number of groups being compared, whether the groups are independent of one another, and the type of variables used in the analysis. Three of the more widely used tests (the chi-square test, the t-test, and the F-test) are described briefly below.

Chi-Square test  is used when testing for associations between categorical variables (e.g., differences in whether a child has been diagnosed as having a cognitive disability by gender or race/ethnicity). It is also used as a goodness-of-fit test to determine whether data from a sample come from a population with a specific distribution.

t-test  is used to compare the means of two independent samples (independent t-test), the means of one sample at different times (paired sample t-test) or the mean of one sample against a known mean (one sample t-test). For example, when comparing the mean assessment scores of boys and girls or the mean scores of 3- and 4-year-old children, an independent t-test would be used. When comparing the mean assessment scores of girls only at two time points (e.g., fall and spring of the program year) a paired t-test would be used. A one sample t-test would be used when comparing the mean scores of a sample of children to the mean score of a population of children. The t-test is appropriate for small sample sizes (less than 30), although it is often used when testing group differences for larger samples. It is also used to test whether correlation and regression coefficients are significantly different from zero.

F-test  is an extension of the t-test and is used to compare the means of three or more independent samples (groups). The F-test is used in Analysis of Variance (ANOVA) to examine the ratio of the between groups to within groups variance. It is also used to test the significance of the total variance explained by a regression model with multiple independent variables.
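As a sketch, the independent-samples t statistic can be computed by hand. This is the pooled, equal-variance form, and the group scores are invented for illustration:

```python
# Pooled-variance t statistic for two independent samples.
import statistics

def t_statistic(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    pooled = ((na - 1) * statistics.variance(group_a)
              + (nb - 1) * statistics.variance(group_b)) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5  # standard error of the difference
    return (statistics.mean(group_a) - statistics.mean(group_b)) / se

boys = [50, 55, 60, 65, 70]   # invented assessment scores
girls = [60, 65, 70, 75, 80]  # invented assessment scores
# t_statistic(boys, girls) is -2.0 here; compare |t| with the critical value
# for na + nb - 2 = 8 degrees of freedom to judge significance
```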

Significance tests alone do not tell us anything about the size of the difference between groups or the strength of the association between variables. Because significance test results are sensitive to sample size, studies with different sample sizes with the same means and standard deviations would have different t statistics and p values. It is therefore important that researchers provide additional information about the size of the difference between groups or the association and whether the difference/association is substantively meaningful.

See the following for additional information about descriptive statistics and tests of significance:

Descriptive analysis in education: A guide for researchers  (PDF)

Basic Statistics

Effect Sizes and Statistical Significance

Summarizing and Presenting Data

There are several graphical and pictorial methods that enhance understanding of individual variables and the relationships between variables. Graphical and pictorial methods provide a visual representation of the data. Some of these methods include:

  • Bar charts
  • Pie charts
  • Line graphs
  • Scatter plots
  • Geographic Information Systems (GIS)

Bar charts visually represent the frequencies or percentages with which different categories of a variable occur.

Bar charts are most often used when describing the percentages of different groups with a specific characteristic, for example, the percentages of boys and girls who participate in team sports. However, they may also be used when describing averages, such as the average number of hours boys and girls spend per week participating in team sports.

Each category of a variable (e.g., gender [boys and girls], children's age [3, 4, and 5]) is displayed along the bottom (or horizontal or X axis) of a bar chart.

The vertical axis (or Y axis) shows the values of the statistic on which the groups are being compared (e.g., percentage participating in team sports).

A bar is drawn for each of the categories along the horizontal axis and the height of the bar corresponds to the frequency or percentage with which that value occurs.

A pie chart (or a circle chart) is one of the most commonly used methods for graphically presenting statistical data.

As its name suggests, it is a circular graphic, which is divided into slices to illustrate the proportion or percentage of a sample or population that belong to each of the categories of a variable.

The size of each slice represents the proportion or percentage of the total sample or population with a specific characteristic (found in a specific category). For example, the percentage of children enrolled in Early Head Start who are members of different racial/ethnic groups would be represented by different slices with the size of each slice proportionate to the group's representation in the total population of children enrolled in the Early Head Start program.

A line graph is a type of chart that displays information as a series of data points connected by straight line segments.

Line graphs are often used to show changes in a characteristic over time.

It has an X-axis (horizontal axis) and a Y axis (vertical axis). The time segments of interest are displayed on the X-axis (e.g., years, months). The range of values that the characteristic of interest can take are displayed along the Y-axis (e.g., annual household income, mean years of schooling, average cost of child care). A data point is plotted coinciding with the value of the Y variable plotted for each of the values of the X variable, and a line is drawn connecting the points.

Scatter plots display the relationship between two quantitative or numeric variables by plotting one variable against the value of the other.

The values of one of the two variables are displayed on the horizontal axis (x axis) and the values of the other variable are displayed on the vertical axis (y axis).

Each person or subject in a study would receive one data point on the scatter plot that corresponds to his or her values on the two variables. For example, a scatter plot could be used to show the relationship between income and children's scores on a math assessment. A data point for each child in the study showing his or her math score and family income would be shown on the scatter plot. Thus, the number of data points would equal the total number of children in the study.

Geographic Information Systems (GIS)

A Geographic Information System is computer software capable of capturing, storing, analyzing, and displaying geographically referenced information; that is, data identified according to location.

Using a GIS program, a researcher can create a map to represent data relationships visually. For example, the National Center for Education Statistics creates maps showing the characteristics of school districts across the United States, such as the percentage of children living in married-couple households, median family income, and the percentage of the population that speaks a language other than English. The data linked to school district locations come from the American Community Survey.

Graphs can also display networks of relationships among variables, enabling researchers to identify the nature of relationships that would otherwise be too complex to conceptualize.

See the following for additional information about different graphic methods:

Graphical Analytic Techniques

Geographic Information Systems

Researchers use different analytical techniques to examine complex relationships between variables. There are three basic types of analytical techniques:

Regression analysis

Grouping methods

Multiple equation models

Regression analysis assumes that the dependent, or outcome, variable is directly affected by one or more independent variables. There are four important types of regression analyses:

Ordinary least squares (OLS) regression

OLS regression (also known as linear regression) is used to determine the relationship between a dependent variable and one or more independent variables.

OLS regression is used when the dependent variable is continuous. Continuous variables, in theory, can take on any value within a range. For example, family child care expenses, measured in dollars, is a continuous variable.

Independent variables may be nominal, ordinal, or continuous. Nominal variables, also referred to as categorical variables, have two or more non-numeric or qualitative categories. Examples of nominal variables are children's gender (male, female), their parents' marital status (single, married, separated, divorced), and the type of child care children receive (center-based, home-based care). Ordinal variables are similar to nominal variables except that their categories can be ordered and the order has meaning. For example, children's families' socioeconomic status may be grouped as low, middle, and high.

When OLS regression is used to estimate the associations between two or more independent variables and a single dependent variable, it is called multiple linear regression.

In multiple regression, the coefficient (i.e., the standardized or unstandardized regression coefficient for each independent variable) tells you how much the dependent variable is expected to change when that independent variable increases by one unit, holding all the other independent variables constant.

Logistic regression

Logistic regression (or logit regression) is a special form of regression analysis used to examine the associations between a set of independent or predictor variables and a dichotomous outcome variable. A dichotomous variable is a variable with only two possible values, e.g., whether a child receives child care before or after the Head Start program day (yes, no).

As in linear regression, the independent variables may be interval, ordinal, or nominal. A researcher might use logistic regression to study the relationships between parental education, household income, and parental employment and whether children receive child care from someone other than their parents (receives nonparent care/does not receive nonparent care).
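
The mechanism that ties the predictors to a yes/no outcome is the logistic (sigmoid) function, which maps any linear combination of predictors to a probability between 0 and 1. A sketch with invented coefficients (the variable names and values are hypothetical, not estimates from real data):

```python
import math

def logistic(z):
    """Map a linear predictor z to a probability in (0, 1)."""
    return 1 / (1 + math.exp(-z))

# Invented coefficients: intercept, parental education (years),
# household income (in $10,000s).
b0, b_educ, b_income = -4.0, 0.2, 0.1

def p_nonparent_care(educ_years, income_10k):
    # Linear combination of predictors passed through the logistic link.
    return logistic(b0 + b_educ * educ_years + b_income * income_10k)

print(f"{p_nonparent_care(16, 6):.2f}")  # predicted probability for one family
```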

Hierarchical linear modeling (HLM)

Used when data are nested. Nested data occur when several individuals belong to the same group under study. For example, in child care research, children enrolled in a center-based child care program are grouped into classrooms with several classrooms in a center. Thus, the children are nested within classrooms and classrooms are nested within centers.

HLM allows researchers to determine the effects of characteristics at each level of the nested data (e.g., classrooms and centers) on the outcome variables. HLM is also used to study growth (e.g., growth in children's reading and math knowledge and skills over time).
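
Fitting an HLM takes specialized software, but the nested structure itself is easy to sketch: every child's record carries identifiers for its classroom and center. All names and scores below are invented:

```python
from collections import defaultdict

# Invented records: (center, classroom, child's assessment score).
records = [
    ("Center A", "Room 1", 88), ("Center A", "Room 1", 92),
    ("Center A", "Room 2", 75), ("Center B", "Room 1", 81),
    ("Center B", "Room 1", 79), ("Center B", "Room 2", 90),
]

# Children nested in classrooms nested in centers: group the scores
# by (center, classroom) to see the classroom level of the hierarchy.
by_classroom = defaultdict(list)
for center, room, score in records:
    by_classroom[(center, room)].append(score)

classroom_means = {k: sum(v) / len(v) for k, v in by_classroom.items()}
print(classroom_means)
# An HLM would then estimate how much of the score variation sits at
# each level: child, classroom, and center.
```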

Duration models

Used to estimate the length of time before a given event occurs or the length of time spent in a state. For example, in child care policy research, duration models have been used to estimate the length of time that families receive child care subsidies.

Sometimes referred to as survival analysis or event history analysis.

Grouping methods are techniques for classifying observations into meaningful categories. Two of the most common grouping methods are discriminant analysis and cluster analysis.

Discriminant analysis

Identifies characteristics that distinguish between groups. For example, a researcher could use discriminant analysis to determine which characteristics identify families that seek child care subsidies and which identify families that do not.

It is used when the dependent variable is a categorical variable (e.g., family receives child care subsidies [yes, no], child enrolled in family care [yes, no], type of child care child receives [relative care, non-relative care, center-based care]). The independent variables are interval variables (e.g., years of schooling, family income).

Cluster analysis

Used to classify similar individuals together. It uses a set of measured variables to classify a sample of individuals (or organizations) into a number of groups such that individuals with similar values on the variables are placed in the same group. For example, cluster analysis would be used to group together parents who hold similar views of child care or children who are suspended from school.

Its goal is to sort individuals into groups in such a way that individuals in the same group (cluster) are more similar to each other than to individuals in other groups.

The variables used in cluster analysis may be nominal, ordinal or interval.
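
A toy one-dimensional k-means run, one common clustering algorithm, shows the idea: individuals end up in the cluster whose mean they are closest to. The attitude-scale values are invented:

```python
# Invented scores for ten parents on a single attitude-toward-child-care scale.
values = [1.0, 1.2, 0.8, 1.1, 0.9, 5.0, 5.2, 4.8, 5.1, 4.9]

# Tiny k-means with k=2: alternately assign each point to the nearest
# center, then recompute each center as the mean of its cluster.
centers = [values[0], values[5]]  # crude initialization from the data
for _ in range(10):
    clusters = [[], []]
    for v in values:
        nearest = min(range(2), key=lambda i: abs(v - centers[i]))
        clusters[nearest].append(v)
    centers = [sum(c) / len(c) for c in clusters]

print(sorted(round(c, 2) for c in centers))  # -> [1.0, 5.0]
```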

Multiple equation modeling, which is an extension of regression, is used to examine the causal pathways from independent variables to the dependent variable. For example, what are the variables that link (or explain) the relationship between maternal education (independent variable) and children's early reading skills (dependent variable)? These variables might include the nature and quality of mother-child interactions or the frequency and quality of shared book reading.

There are two main types of multiple equation models:

Path analysis

Structural equation modeling

Path analysis is an extension of multiple regression that allows researchers to examine multiple direct and indirect effects of a set of variables on a dependent, or outcome, variable. In path analysis, a direct effect measures the extent to which the dependent variable is influenced by an independent variable. An indirect effect measures the extent to which an independent variable's influence on the dependent variable is due to another variable.

A path diagram is created that identifies the relationships (paths) between all the variables and the direction of the influence between them.

The paths can run directly from an independent variable to a dependent variable (e.g., X→Y), or they can run indirectly from an independent variable, through an intermediary, or mediating, variable, to the dependent variable (e.g. X1→X2→Y).

The paths in the model are tested to determine the relative importance of each.
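
With standardized coefficients on each path, the indirect effect along X1→X2→Y is the product of the two path coefficients, and the total effect adds the direct path. A sketch with invented coefficients for maternal education (X1), shared book reading (X2), and early reading skills (Y):

```python
# Invented standardized path coefficients (not estimates from real data).
a = 0.40  # X1 -> X2: maternal education -> shared book reading
b = 0.50  # X2 -> Y:  shared book reading -> early reading skills
c = 0.15  # X1 -> Y:  direct path

indirect = a * b      # effect of X1 on Y routed through X2
total = c + indirect  # direct effect plus indirect effect
print(f"indirect = {indirect:.2f}, total = {total:.2f}")
```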

Because the relationships between variables in a path model can become complex, researchers often avoid labeling the variables in the model as independent and dependent variables. Instead, two types of variables are found in these models:

Exogenous variables are not affected by other variables in the model. They have straight arrows emerging from them and none pointing to them.

Endogenous variables are influenced by at least one other variable in the model. They have at least one straight arrow pointing to them.

Structural equation modeling (SEM)

Structural equation modeling expands path analysis by allowing for multiple indicators of unobserved (or latent) variables in the model. Latent variables are variables that are not directly observed (measured), but instead are inferred from other variables that are observed or directly measured. For example, children's school readiness is a latent variable with multiple indicators of children's development across multiple domains (e.g., children's scores on standardized assessments of early math and literacy, language, scores based on teacher reports of children's social skills and problem behaviors).

There are two parts to a SEM analysis. First, the measurement model is tested. This involves examining the relationships between the latent variables and their measures (indicators). Second, the structural model is tested in order to examine how the latent variables are related to one another. For example, a researcher might use SEM to investigate the relationships between different types of executive functions and word reading and reading comprehension for elementary school children. In this example, the latent variables word reading and reading comprehension might be inferred from a set of standardized reading assessments and the latent variables cognitive flexibility and inhibitory control from a set of executive function tasks. The measurement model of SEM allows the researcher to evaluate how well children's scores on the standardized reading assessments combine to identify children's word reading and reading comprehension. Assuming that the results of these analyses are acceptable, the researcher would move on to an evaluation of the structural model, examining the predicted relationships between two types of executive functions and two dimensions of reading.

SEM has several advantages over traditional path analysis:

Use of multiple indicators for key variables reduces measurement error.

Can test whether the effects of variables in the model and the relationships depicted in the entire model are the same for different groups (e.g., are the direct and indirect effects of parent investments on children's school readiness the same for White, Hispanic, and African American children?).

Can test models with multiple dependent variables (e.g., models predicting several domains of child development).

See the following for additional information about multiple equation models:

Finding Our Way: An Introduction to Path Analysis (Streiner)

An Introduction to Structural Equation Modeling (Hox & Bechger) (PDF)

Written by Pitch N Hire

Mon Sep 18 2023

Everything You Need to Know About Analytical Research and Its Essential Uses


Research is vital in any field. It is a systematic process of collecting data, documenting critical information, and analyzing and interpreting that data, and it employs different methodologies to do so. Its main task is to collect, organize, and analyze data on the subject matter. Research can be defined as the process of creating new knowledge, or of applying existing knowledge to develop new concepts.

Research methods are classified into different categories based on method, the nature of the study, purpose, and research design. Based on the nature of the study, research falls into two broad types: descriptive research and analytical research. This article covers analytical research. So what is analytical research? It is research in which secondary data are used to critically examine a question: researchers draw on already existing information for their analysis. Different analytical research designs are used to critically evaluate the information extracted from existing research.


Effect of Analytical Studies on Education

Students, research scholars, doctors, and psychologists use analytical research to extract important information for their research studies. It helps add new concepts and ideas to already produced material, and various analytical research designs are used to add value to the study material. Analytical research is conducted using methods such as literary research, public opinion research, meta-analysis, and scientific trials.

Analytical research can also be defined as a tool for adding reliability to a body of work. It is generally conducted to support an idea or hypothesis, and it employs critical thinking to extract small details that help build larger inferences about the subject matter or study material. It emphasizes comprehending the cause-and-effect relationship between variables.

Analytical Research Designs


Analytical research includes critical assessment and critical thinking, and hence it is important. It creates new ideas about the data and proves or disproves a hypothesis. Asked what analytical research is used for, one can say that it is used to establish an association between an exposure and an outcome. This association is studied with two types of analytical research design: cohort studies and case-control studies. In cohort studies, groups of people with different levels of exposure are observed over time to analyze the occurrence of an outcome. A cohort study is a forward-looking, prospective kind of study, which makes it easier to determine the outcome risk among exposed and unexposed groups.

It resembles an experimental design. In case-control studies, by contrast, researchers enlist two groups, cases and controls, and then trace each group's history of exposure. A case-control study is a backward-looking, retrospective study; it consumes less time and is comparatively cheaper than a cohort study. It is the primary study design used to examine the relationship between a particular exposure and an outcome.


Methods of Conducting Analytical Research 

Analytical research saves time, money, and lives, and helps achieve objectives effectively. It can be conducted using the following methods:

Literary Research 

Literary research is one method of conducting analytical research. It means finding new ideas and concepts in already existing literary work; it requires you to invent something new, a new way of interpreting the available information in order to discuss it. Literary research is the backbone of many research studies. Its function is to locate literary information, preserve it with appropriate methodologies, and analyze it. It supplies hypotheses grounded in existing research, supports the analysis of modern-day research, and helps examine unresolved or doubtful theories.


Meta-Analysis Research

Meta-analysis is an epidemiological, formal, and quantitative research design that systematically assesses previous research results to develop a conclusion about a body of research. It is a subset of systematic reviews: it analyzes the strength of the evidence, examines variability or heterogeneity across studies, and includes a quantitative review of the body of literature. Meta-analyses are commonly reported following the PRISMA guidelines. Their aim is to identify whether effects exist and whether they are positive or negative, and their results can improve the accuracy of effect estimates.

Scientific Trials

Scientific trials research is conducted on people. It is of two types: observational studies and clinical trials. It identifies new possibilities for clinical trials and aims to evaluate medical strategies, helping determine whether medical treatments and devices are safe or not. It searches for better ways of screening for, diagnosing, and treating disease. A clinical trial is a scientific study that typically involves four phases and is conducted to find out whether a new treatment method is safe, effective, and efficient in people.


Scientific trials aim to examine or analyze surgical, medical, and behavioral interventions. There are different types of trials, such as cohort studies, case-control studies, treatment trials, cross-sectional studies, screening trials, pilot trials, and prevention trials.

Analytical research, then, is research that utilizes already available data to extract information. Its main aim is to divide a topic or concept into smaller pieces in order to understand it better, and then to reassemble those parts in a way that is understandable to you. You can conduct analytical research using the methods discussed in this article. It involves ex-ante research, that is, analyzing the phenomenon in advance.

Analytical research comes in different types, such as historical research, philosophical research, research synthesis, and reviews. It intends to comprehend the causal relation between phenomena, works within a limited set of variables, and involves in-depth research and analysis of the available data. It is therefore crucial because it adds relevance and authenticity to data, supports and validates hypotheses, and helps companies make quick, effective decisions about the products and services they provide.


PW Skills | Blog

Analyze Report: How to Write the Best Analytical Report (+ 6 Examples!)

By Varun Saharawat | March 1, 2024

Organizations use analytical reports to improve performance by identifying areas of strength and weakness, understanding customer needs and preferences, optimizing business processes, and making data-driven decisions.


Picture a heap of bricks scattered on the ground. Individually, they lack purpose until meticulously assembled into a cohesive structure. A house, perhaps?

In the realm of business intelligence, data serves as the fundamental building material, and a well-crafted data analysis report is the ultimate desired outcome.

However, if you’ve ever attempted to harness collected data and transform it into an insightful report, you understand the inherent challenges. Bridging the gap between raw, unprocessed data and a coherent narrative capable of informing actionable strategies is no simple feat.


What Is an Analytical Report?

An analytical report serves as a crucial tool for stakeholders to make informed decisions and determine the most effective course of action. For instance, a Chief Marketing Officer (CMO) might refer to a business executive analytical report to identify specific issues caused by the pandemic before adapting an existing marketing strategy.

Marketers often utilize business intelligence tools to generate these informative reports. They vary in layout, ranging from text-heavy documents (such as those created in Google Docs with screenshots or Excel spreadsheets) to visually engaging presentations.

A quick search on Google reveals that many marketers opt for text-heavy documents with a formal writing style, often featuring a table of contents on the first page. In some instances, such as the analytical report example provided below, these reports may consist of spreadsheets filled with numbers and screenshots, providing a comprehensive overview of the data.


How to Write an Analytical Report

Writing an analytical report requires careful planning, data analysis, and clear communication of findings. Here is a step-by-step guide to help you write an effective analytical report:

Step 1: Define the Purpose:

  • Clearly define the objective and purpose of the report. Determine what problem or question the report aims to address.
  • Consider the audience for the report and what information they need to make informed decisions.

Step 2: Gather Data:

  • Identify relevant sources of data that can provide insights into the topic.
  • Collect data from primary sources (e.g., surveys, interviews) and secondary sources (e.g., research studies, industry reports).
  • Ensure that the data collected is accurate, reliable, and up-to-date.

Step 3: Analyze the Data:

  • Use analytical tools and techniques to analyze the data effectively. This may include statistical analysis, qualitative coding, or data visualization.
  • Look for patterns, trends, correlations, and outliers in the data that may provide insights into the topic.
  • Consider the context in which the data was collected and any limitations that may affect the analysis.

Step 4: Organize the Information:

  • Structure the report in a logical and coherent manner. Divide the report into sections, such as an introduction, methodology, findings, analysis, and conclusion.
  • Ensure that each section flows logically into the next and that there is a clear progression of ideas throughout the report.

Step 5: Write the Introduction:

  • Start with an introduction that provides background information on the topic and outlines the scope of the report.
  • Clearly state the purpose and objectives of the analysis.
  • Provide context for the analysis and explain why it is relevant and important.

Step 6: Present the Methodology:

  • Describe the methods and techniques used to gather and analyze the data.
  • Explain any assumptions made and the rationale behind your approach.
  • Provide sufficient detail so that the reader can understand how the analysis was conducted.

Step 7: Present the Findings:

  • Present the findings of your analysis in a clear and concise manner.
  • Use charts, graphs, tables, and other visual aids to illustrate key points and make the data easier to understand.
  • Provide context for the findings and explain their significance.

Step 8: Interpret the Results:

  • Interpret the findings and analyze their implications.
  • Discuss any patterns, trends, or insights uncovered by the analysis and explain their significance.
  • Consider alternative explanations or interpretations of the data.

Step 9: Draw Conclusions:

  • Draw conclusions based on the analysis and findings.
  • Summarize the main points and insights of the report.
  • Reiterate the key takeaways and their implications for decision-making.

Step 10: Make Recommendations:

  • Finally, make recommendations based on your conclusions.
  • Suggest actionable steps that can be taken to address any issues identified or capitalize on any opportunities uncovered by the analysis.
  • Provide specific, practical recommendations that are feasible and aligned with the objectives of the report.

Step 11: Proofread and Revise:

  • Review the report for accuracy, clarity, and coherence.
  • Ensure that the writing is clear, concise, and free of errors.
  • Make any necessary revisions before finalizing the report.

Step 12: Write the Executive Summary:

  • Write a brief executive summary that provides an overview of the report’s key findings, conclusions, and recommendations.
  • This summary should be concise and easy to understand for busy stakeholders who may not have time to read the entire report.
  • Include only the most important information and avoid unnecessary details.

By following these steps, you can write an analytical report that effectively communicates your findings and insights to your audience.
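
The sections named in the steps above fall into a predictable skeleton; a small sketch that turns that outline into a numbered table of contents (the executive summary is written last but placed first in the finished report):

```python
# Report sections drawn from the steps above.
sections = [
    "Executive Summary", "Introduction", "Methodology",
    "Findings", "Analysis", "Conclusions", "Recommendations",
]

toc = [f"{i}. {title}" for i, title in enumerate(sections, start=1)]
print("\n".join(toc))
```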


Analytical Report Examples

Analytical reports play a crucial role in providing valuable insights to businesses, enabling informed decision-making and strategic planning. Here are some examples of analytical reports, along with detailed descriptions:

1) Executive Report Template:

An executive report serves as a comprehensive overview of a company’s performance, specifically tailored for C-suite executives. This report typically includes key metrics and KPIs that provide insights into the organization’s financial health and operational efficiency. For example, the Highlights tab may showcase total revenue for a specific period, along with the breakdown of transactions and associated costs. 

Additionally, the report may feature visualizations such as cost vs. revenue comparison charts, allowing executives to quickly identify trends and make data-driven decisions. With easy-to-understand graphs and charts, executives can expedite decision-making processes and adapt business strategies for effective cost containment and revenue growth.

2) Digital Marketing Report Template:

In today’s digital age, businesses rely heavily on digital marketing channels to reach their target audience and drive engagement. A digital marketing report provides insights into the performance of various marketing channels and campaigns, helping businesses optimize their marketing strategies for maximum impact. 

This report typically includes key metrics such as website traffic, conversion rates, and ROI for each marketing channel. By analyzing these KPIs, businesses can identify their best-performing channels and allocate resources accordingly. For example, the report may reveal that certain channels, such as social media or email marketing, yield higher response rates than others. Armed with this information, businesses can refine their digital marketing efforts to enhance the user experience, attract more customers, and ultimately drive growth.

3) Sales Performance Report:

A sales performance report provides a detailed analysis of sales activities, including revenue generated, sales volume, customer acquisition, and sales team performance. This report typically includes visualizations such as sales trend charts, pipeline analysis, and territory-wise sales comparisons. By analyzing these metrics, sales managers can identify top-performing products or services, track sales targets, and identify areas for improvement.

4) Customer Satisfaction Report:

A customer satisfaction report evaluates customer feedback and sentiment to measure overall satisfaction levels with products or services. This report may include metrics such as Net Promoter Score (NPS), customer survey results, and customer support ticket data. By analyzing these metrics, businesses can identify areas where they excel and areas where they need to improve to enhance the overall customer experience.

5) Financial Performance Report:

A financial performance report provides an in-depth analysis of an organization's financial health, including revenue, expenses, profitability, and cash flow. This report may include financial ratios, trend analysis, and variance reports to assess performance against budgeted targets or industry benchmarks. By analyzing these metrics, financial managers can identify areas of strength and weakness and make strategic decisions to improve financial performance.

6) Inventory Management Report:

An inventory management report tracks inventory levels, turnover rates, stockouts, and inventory costs to optimize inventory management processes. This report may include metrics such as inventory turnover ratio, carrying costs, and stock-to-sales ratios. By analyzing these metrics, inventory managers can ensure optimal inventory levels, minimize stockouts, and reduce carrying costs to improve overall operational efficiency.

7) Employee Performance Report:

An employee performance report evaluates individual and team performance based on key performance indicators (KPIs) such as sales targets, customer satisfaction scores, productivity metrics, and attendance records. This report may include visualizations such as performance scorecards, heatmaps, and trend analysis charts to identify top performers, areas for improvement, and training needs.


Why Are Analytical Reports Important?

Analytical reports are important for several reasons:

  • Informed Decision Making: Analytical reports provide valuable insights and data-driven analysis that enable businesses to make informed decisions. By presenting relevant information in a structured format, these reports help stakeholders understand trends, identify patterns, and evaluate potential courses of action.
  • Problem Solving: Analytical reports help organizations identify and address challenges or issues within their operations. Whether it’s identifying inefficiencies in processes, addressing customer complaints, or mitigating risks, these reports provide a framework for problem-solving and decision-making.
  • Business Opportunities: Analytical reports can uncover new business opportunities by analyzing market trends, customer behavior, and competitor activities. By identifying emerging trends or unmet customer needs, businesses can capitalize on opportunities for growth and innovation.
  • Performance Evaluation: Analytical reports are instrumental in evaluating the performance of various aspects of a business, such as sales, marketing campaigns, and financial metrics. By tracking key performance indicators (KPIs) and metrics, organizations can assess their progress towards goals and objectives.
  • Accountability and Transparency: Analytical reports promote accountability and transparency within an organization by providing objective data and analysis. By sharing insights and findings with stakeholders, businesses can foster trust and confidence in their decision-making processes.

Overall, analytical reports serve as valuable tools for businesses to gain insights, solve problems, identify opportunities, evaluate performance, and enhance decision-making processes.

Types of Analytical Reports

  • Financial Analysis Reports: These reports analyze the financial performance of an organization, including revenue, expenses, profitability, and cash flow. They help stakeholders understand the financial health of the business and make informed decisions about investments, budgeting, and strategic planning.
  • Market Research Reports: Market research reports analyze market trends, consumer behavior, the competitive landscape, and other factors affecting a particular industry or market segment. They provide valuable insights for businesses looking to launch new products, enter new markets, or refine their marketing strategies.
  • Performance Analysis Reports: These reports evaluate the performance of various aspects of an organization, such as sales performance, operational efficiency, employee productivity, and customer satisfaction. They help identify areas of improvement and inform decision-making to enhance overall performance.
  • Risk Assessment Reports: Risk assessment reports analyze potential risks and vulnerabilities within an organization, such as financial risks, operational risks, cybersecurity risks, and regulatory compliance risks. They help stakeholders understand and mitigate risks to protect the organization’s assets and reputation.
  • SWOT Analysis Reports: SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis reports assess the internal strengths and weaknesses of an organization, as well as external opportunities and threats in the business environment. They provide a comprehensive overview of the organization’s strategic position and guide decision-making.
  • Customer Analysis Reports: Customer analysis reports examine customer demographics, purchasing behavior, satisfaction levels, and preferences. They help businesses understand their target audience better, tailor products and services to meet customer needs, and improve customer retention and loyalty.
  • Operational Efficiency Reports: These reports evaluate the efficiency and effectiveness of operational processes within an organization, such as production, logistics, and supply chain management. They identify bottlenecks, inefficiencies, and areas for improvement to optimize operations and reduce costs.
  • Compliance and Regulatory Reports: Compliance and regulatory reports assess an organization’s adherence to industry regulations, legal requirements, and internal policies. They ensure that the organization operates ethically and legally, mitigating the risk of fines, penalties, and reputational damage.


Analytical Report FAQs

What is an analytical report?

An analytical report is a document that presents data, analysis, and insights on a specific topic or problem. It provides a detailed examination of information to support decision-making and problem-solving within an organization.

Why are analytical reports important?

Analytical reports are important because they help organizations make informed decisions, solve problems, and identify opportunities for improvement. By analyzing data and providing insights, these reports enable stakeholders to understand trends, patterns, and relationships within their business operations.

What types of data are typically included in analytical reports?

Analytical reports may include various types of data, such as financial data, sales data, customer feedback, market research, and operational metrics. The specific data included depends on the purpose of the report and the information needed to address the topic or problem being analyzed.

How are analytical reports different from other types of reports?

Analytical reports differ from other types of reports, such as descriptive reports or summary reports, in that they go beyond presenting raw data or summarizing information. Instead, analytical reports analyze data in-depth, draw conclusions, and provide recommendations based on the analysis.

What are the key components of an analytical report?

Key components of an analytical report typically include an introduction, methodology, findings, analysis, conclusions, and recommendations. The introduction provides background information on the topic, the methodology outlines the approach used to analyze the data, the findings present the results of the analysis, the analysis interprets the findings, and the conclusions and recommendations offer insights and actionable steps.
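As a rough illustration of these components, the outline of an analytical report can be sketched in a few lines of Python. The section names follow the list above; the example title and the one-line descriptions are placeholders, not a prescribed template.

```python
# Illustrative sketch: the standard sections of an analytical report,
# assembled into a plain-text outline. Section names follow the list
# above; descriptions and the example title are placeholders.

SECTIONS = [
    ("Introduction", "Background on the topic and the question the report answers."),
    ("Methodology", "The approach used to gather and analyse the data."),
    ("Findings", "The results of the analysis, stated objectively."),
    ("Analysis", "Interpretation of the findings: trends, patterns, causes."),
    ("Conclusions", "What the analysis implies for the original question."),
    ("Recommendations", "Actionable steps based on the conclusions."),
]

def report_skeleton(title):
    # Build a simple underlined-heading outline, one block per section.
    lines = [title, "=" * len(title), ""]
    for name, note in SECTIONS:
        lines += [name, "-" * len(name), note, ""]
    return "\n".join(lines)

print(report_skeleton("Q3 Sales Performance Analysis"))
```

In practice the headings vary by organization and report type, but keeping the findings (what the data shows) separate from the analysis (what it means) is the part that makes a report analytical rather than merely descriptive.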

Grad Coach

Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods, one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.

What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers”. In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes, yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative data isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here.

So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.

In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses. We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication, such as a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes, summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.
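That small splash of quantitative thinking can be sketched in a few lines of Python: coded segments of text are rolled up into broader categories and their frequencies tabulated. The sources, codes and categories below are invented purely for illustration.

```python
# Illustrative sketch of tabulation in content analysis: each coded
# segment is a (source, code) pair, codes roll up into categories,
# and Counter tallies the frequencies. All labels are invented.
from collections import Counter

coded_segments = [
    ("article_1", "ancient heritage"),
    ("article_1", "spirituality"),
    ("article_2", "ancient heritage"),
    ("article_3", "natural beauty"),
    ("article_3", "ancient heritage"),
]

# Codes grouped into broader categories
categories = {
    "ancient heritage": "history",
    "spirituality": "culture",
    "natural beauty": "landscape",
}

code_freq = Counter(code for _, code in coded_segments)
category_freq = Counter(categories[code] for _, code in coded_segments)

print(code_freq.most_common())      # most frequent codes first
print(category_freq.most_common())  # frequencies rolled up by category
```

The interpretive work – deciding which codes to assign to which passages – stays with the researcher; the tabulation step is simply what lets you report frequencies afterwards.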

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the main issues with content analysis is that it can be very time-consuming, as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing, though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them!

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means. Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives. Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses, too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate. So, discourse analysis is all about analysing language within its social context – in other words, analysing language, such as a conversation or a speech, within the culture and society it takes place in. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture, history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast. Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might end up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming, as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method.
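Sampling to the point of saturation can be illustrated with a toy sketch: process the data sources one at a time and track how many new codes each one contributes – a run of zeroes suggests coding has plausibly saturated. The transcripts and codes below are invented for illustration; real saturation judgements are interpretive, not purely mechanical.

```python
# Toy sketch of tracking saturation: for each transcript, count how
# many codes appear that haven't been seen before. A tail of zeroes
# suggests no new insights are emerging. Codes are invented.

transcripts = [
    {"power", "politeness"},   # codes found in transcript 1
    {"power", "humour"},       # transcript 2 adds "humour"
    {"politeness", "humour"},  # nothing new from here on
    {"power"},
    {"humour"},
]

seen = set()
new_per_transcript = []
for codes in transcripts:
    fresh = codes - seen            # codes not seen in earlier transcripts
    new_per_transcript.append(len(fresh))
    seen |= fresh

print(new_per_transcript)  # [2, 1, 0, 0, 0] – tail of zeroes suggests saturation
```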

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes. These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.
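A toy version of this sushi-restaurant example can be sketched in Python, tagging each review with themes via simple keyword matching and counting how often each theme recurs. The themes, keywords and reviews are invented, and real thematic analysis is interpretive rather than purely keyword-driven – this only shows the counting step.

```python
# Toy sketch: tag reviews with themes by keyword matching, then count
# how often each theme recurs. Themes, keywords and reviews are
# invented; real thematic coding is done by a human reader.
from collections import Counter

THEMES = {
    "fresh ingredients": ["fresh", "quality fish"],
    "friendly wait staff": ["friendly", "welcoming", "attentive"],
    "long waits": ["slow", "waited", "queue"],
}

reviews = [
    "Incredibly fresh fish and a friendly server.",
    "We waited 40 minutes, but the fish was fresh.",
    "Attentive staff, fair prices.",
]

theme_counts = Counter()
for review in reviews:
    text = review.lower()
    for theme, keywords in THEMES.items():
        if any(kw in text for kw in keywords):
            theme_counts[theme] += 1  # count each theme at most once per review

print(theme_counts.most_common())
```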

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences, views, and opinions. Therefore, if your research aims and objectives involve understanding people’s experience or view of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop, or even change, as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage, as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.

QDA Method #5: Grounded Theory (GT)

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “tests” and “revisions”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using grounded theory, you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to watch a video about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop. As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature. In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up.

QDA Method #6: Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation. This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay, as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias. While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.

How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “How do I choose the right one?”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions. In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people who have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims is distinctly different, and therefore a different analysis method would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant.

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect. So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation). Keep in mind, though, that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims, objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.

Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis, a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis, which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we looked at grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.


Research: How Different Fields Are Using GenAI to Redefine Roles

  • Maryam Alavi

Examples from customer support, management consulting, professional writing, legal analysis, and software and technology.

The interactive, conversational, analytical, and generative features of GenAI offer support for creativity, problem-solving, and processing and digestion of large bodies of information. Therefore, these features can act as cognitive resources for knowledge workers. Moreover, the capabilities of GenAI can mitigate various hindrances to effective performance that knowledge workers may encounter in their jobs, including time pressure, gaps in knowledge and skills, and negative feelings (such as boredom stemming from repetitive tasks or frustration arising from interactions with dissatisfied customers). Empirical research and field observations have already begun to reveal the value of GenAI capabilities and their potential for job crafting.

There is an expectation that implementing new and emerging Generative AI (GenAI) tools enhances the effectiveness and competitiveness of organizations. This belief is evidenced by current and planned investments in GenAI tools, especially by firms in knowledge-intensive industries such as finance, healthcare, and entertainment, among others. According to forecasts, enterprise spending on GenAI will double in 2024 and grow to $151.1 billion by 2027.

  • Maryam Alavi is the Elizabeth D. & Thomas M. Holder Chair & Professor of IT Management at the Scheller College of Business, Georgia Institute of Technology.



IMAGES

  1. How to Write an Analytical Essay (with Samples)

  2. Define Analytical Research: How to Write an Analytical Research Essay

  3. 45 Analysis Examples (2024)

  4. Analytical Essay Examples to Score Well in Academics

  5. How to Write an Analytical Research Paper Guide

  6. Learn How to Write an Analytical Essay on Trust My Paper

VIDEO

  1. Descriptive and Analytical Research

  2. Analytical Reasoning Networking Games Example 1

  3. Analytical Research Report

  4. Analytical Reasoning Networking Games Example 2

  5. DATA INTERPRETATION FACTS

  6. Analytical Reasoning: An example of Analytical Game

COMMENTS

  1. Analytical Research: What is it, Importance + Examples

    For example, it can look into why the value of the Japanese Yen has decreased; an analytical study considers "how" and "why" questions. As another example, someone might conduct analytical research to identify a gap in an existing study, offering a fresh perspective on the data.

  2. Analytical Research: What is it, Importance + Examples

    Examples of analytical research. Analytical research does not take a single measurement in isolation. Instead, you would consider the causes of and changes in the trade imbalance. Detailed statistics and statistical checks help guarantee that the findings are significant: a detailed examination of anything complex in order to understand its nature or to determine its ...

  3. Analytical Research: Examples and Advantages

    Analytical Research: Examples and Advantages. Analytical research is a methodical investigation approach that delves deep into complex subjects through data analysis. It aids in understanding, problem-solving, and informed decision-making in diverse fields. A retail company is using analytical research to enhance its marketing strategies.

  4. Descriptive and Analytical Research: What's the Difference?

    Descriptive research classifies, describes, compares, and measures data. Meanwhile, analytical research focuses on cause and effect. For example, take numbers on the changing trade deficits between the United States and the rest of the world in 2015-2018. This is descriptive research.

  5. 8.5 Writing Process: Creating an Analytical Report

    Whenever you choose to write the introduction, use it to draw readers into your report. Make the topic of your report clear, and be concise and sincere. End the introduction with your thesis statement. Depending on your topic and the type of report, you can write an effective introduction in several ways.

  6. Study designs: Part 1

    Research study design is a framework, or the set of methods and procedures used to collect and analyze data on variables specified in a particular research problem. Research study designs are of many types, each with its advantages and limitations. The type of study design used to answer a particular research question is determined by the ...

  7. Overview of Analytic Studies

    Introduction. We search for the determinants of health outcomes, first, by relying on descriptive epidemiology to generate hypotheses about associations between exposures and outcomes. Analytic studies are then undertaken to test specific hypotheses. Samples of subjects are identified and information about exposure status and outcome is collected.

  8. Analytical studies: a framework for quality improvement design and

    An analytical study is one in which action will be taken on a cause system to improve the future performance of the system of interest. The aim of an enumerative study is estimation, while an analytical study focuses on prediction. Because of the temporal nature of improvement, the theory and methods for analytical studies are a critical ...

  9. What are Analytical Study Designs?

    When are analytical study designs used? A study design is a systematic plan, developed so you can carry out your research study effectively and efficiently. Having a design is important because it will determine the right methodologies for your study. Using the right study design makes your results more credible, valid, and coherent.

  10. Research Methods

    Research methods are specific procedures for collecting and analyzing data. Developing your research methods is an integral part of your research design. When planning your methods, there are two key decisions you will make. First, decide how you will collect data. Your methods depend on what type of data you need to answer your research question:

  11. PDF Tips for Writing Analytic Research Papers

    Communications Program. 79 John F. Kennedy Street Cambridge, Massachusetts 02138. TIPS FOR WRITING ANALYTIC RESEARCH PAPERS. • Papers require analysis, not just description. When you describe an existing situation (e.g., a policy, organization, or problem), use that description for some analytic purpose: respond to it, evaluate it according ...

  12. Research Design

    Table of contents. Step 1: Consider your aims and approach. Step 2: Choose a type of research design. Step 3: Identify your population and sampling method. Step 4: Choose your data collection methods. Step 5: Plan your data collection procedures. Step 6: Decide on your data analysis strategies.

  13. Writing theoretical frameworks, analytical frameworks and conceptual

    An analytical framework is, the way I see it, a model that helps explain how a certain type of analysis will be conducted. For example, in this paper, Franks and Cleaver develop an analytical framework that includes scholarship on poverty measurement to help us understand how water governance and poverty are interrelated.

  14. Descriptive Analytics

    Descriptive Analytics. Definition: Descriptive analytics focused on describing or summarizing raw data and making it interpretable. This type of analytics provides insight into what has happened in the past. It involves the analysis of historical data to identify patterns, trends, and insights. Descriptive analytics often uses visualization ...
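As a minimal sketch of what such a summary can look like in practice (the monthly sales figures below are made up for illustration), descriptive analytics reduces historical data to a few interpretable numbers:

```python
import statistics

# Hypothetical historical data: monthly sales for one year (made-up figures)
monthly_sales = [120, 135, 128, 150, 160, 155, 170, 165, 180, 175, 190, 200]

# Descriptive analytics summarizes what has happened in the past
mean_sales = statistics.mean(monthly_sales)      # central tendency
median_sales = statistics.median(monthly_sales)  # robust center
spread = statistics.stdev(monthly_sales)         # variability
change = monthly_sales[-1] - monthly_sales[0]    # naive first-to-last change

print(f"mean={mean_sales:.1f}, median={median_sales:.1f}, "
      f"stdev={spread:.1f}, overall change={change:+d}")
```

Note that these statistics only describe the past; they make no claim about why sales changed, which is where analytical research takes over.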

  15. Data Analysis

    Data Analysis. Different statistics and methods used to describe the characteristics of the members of a sample or population, explore the relationships between variables, to test research hypotheses, and to visually represent data are described. Terms relating to the topics covered are defined in the Research Glossary. Descriptive Statistics.

  16. Analytical Research: What is it, Importance + Examples

    Analytical research is a type of research that requires critical thinking skills and the examination of relevant facts and details. ... selecting the most appropriate approach, or combination of approaches, is crucial to conducting analytical research. Examples of analytical research.

  17. Everything You Need To Know About Analytical Research

    Research is vital in any field. It helps in finding out information about various subjects. It is a systematic process of collecting data, documenting critical information, analyzing data, and interpreting it. It employs different methodologies to perform various tasks. Its main task is to collect, compose and analyze data on the subject matter.

  18. What Is a Research Design

    A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about: Your overall research objectives and approach. Whether you'll rely on primary research or secondary research. Your sampling methods or criteria for selecting subjects. Your data collection methods.

  19. Analyze Report: How To Write The Best Analytical Report (+ 6 Examples!)

    Collect data from primary sources (e.g., surveys, interviews) and secondary sources (e.g., research studies, industry reports). Ensure that the data collected is accurate, reliable, and up-to-date. Step 3: Analyze the Data: Use analytical tools and techniques to analyze the data effectively.

  20. Descriptive vs Analytical Research: Understanding the Difference

    Descriptive employs observation and surveys; analytical uses statistical, mathematical, or computational techniques. Descriptive aims to identify patterns or trends, while analytical aims to establish causation. Descriptive research is often qualitative, whereas analytical can be both qualitative and quantitative.

  21. Qualitative Data Analysis Methods: Top 6 + Examples

    QDA Method #3: Discourse Analysis. Discourse is simply a fancy word for written or spoken language or debate. So, discourse analysis is all about analysing language within its social context. In other words, analysing language - such as a conversation, a speech, etc - within the culture and society it takes place.

  22. What Are Analytical Skills? Definition, Examples and Tips

    Key takeaways: Analytical skills are soft skills that help you identify and solve complex problems. Many jobs require analytical skills, like critical thinking, research and data literacy. Demonstrating analytical skills on your resume and in interviews can help you be a competitive job candidate.

  23. Research: How Different Fields Are Using GenAI to Redefine Roles

    Research: How Different Fields Are Using GenAI to Redefine Roles. Summary. The interactive, conversational, analytical, and generative features of GenAI offer support for creativity, problem ...

  24. Global Spectrometry Market Projects Robust Growth Through

    The Global Spectrometry Market has exhibited remarkable growth, valued at USD 12.78 billion in 2023 and expected to continue with a robust CAGR of 6.56% until 2029. Pivotal in industries such as ...

  25. segment-anything/notebooks/automatic_mask_generator_example ...

    The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.