In the first three installments of this series, we explored how generative AI tools, including ChatGPT and Claude, can be employed to onboard research projects, refine research objectives, develop hypotheses, determine a research methodology, develop an airtight screener, and optimize both qual and quant data collection instruments.

In this article, we’ll explore ways GenAI can assist with performing quality control and gaining a preliminary understanding of the results. As always, before putting survey or transcript data into a GenAI tool, make sure you understand your company’s data policies and know how to use the tool securely.

Your Quality Control Co-Pilot

GenAI has a wide variety of capabilities for checking data quality. It can review large amounts of data at once and suggest approaches to a range of data quality activities. For example, it can:

    • Outline best practices for data standardization and propose frameworks and metrics for measuring data quality.
    • Suggest a methodology for data monitoring, write scripts or instructions for data cleaning, and recommend corrective actions.
    • Create documentation and scorecards for the QA process.
    • Recommend practices for data privacy, security, and governance.
    • Create strategies for ongoing data quality improvement.

Beyond making suggestions, GenAI can also act on the data directly. It can validate data to uncover inconsistencies, errors, or missing information; examine and summarize it; add context that helps users understand where the data comes from; reformat or rescale values; and flag anomalies that deserve a closer look.

 

Respondents were asked the percentage of workloads that are currently hosted on the cloud, and the percentage of workloads expected to be on the cloud in two years. Are there any respondents that appear to be outliers based on their answers to these questions? Are there any suspicious patterns in these responses that might indicate low-quality data?

 

    • Respondent 908 has equal current and expected adoption after two years (55%). This is unusual if cloud adoption is expected to increase over time.
    • Respondent 375 currently has 74% adoption but expects only 15% in two years. This is a very large and unusual decrease.
    • Respondent 292 has no data for current adoption and expects 41% in two years. There may be missing data.
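
If you want to double-check these kinds of flags yourself, the same rules are easy to script (or to have GenAI script for you). Here is a minimal sketch in Python with pandas, assuming a hypothetical survey export with respondent_id, cloud_now_pct, and cloud_2yr_pct columns; the column names, file name, and thresholds are illustrative, not taken from the actual study.

    import pandas as pd

    # Hypothetical survey export; column names are illustrative
    cols = ["cloud_now_pct", "cloud_2yr_pct"]
    df = pd.read_csv("survey_responses.csv")  # respondent_id plus the two percentage columns

    missing = df[df[cols].isna().any(axis=1)]  # e.g., respondent 292
    complete = df.dropna(subset=cols)

    # Values outside 0-100 are impossible for a percentage question
    out_of_range = complete[~complete[cols].apply(lambda c: c.between(0, 100)).all(axis=1)]

    # Suspicious patterns: no expected change, or an unusually large expected drop
    flat = complete[complete["cloud_now_pct"] == complete["cloud_2yr_pct"]]  # e.g., 908
    big_drop = complete[complete["cloud_now_pct"] - complete["cloud_2yr_pct"] >= 30]  # e.g., 375; threshold is arbitrary

    for label, subset in [("missing", missing), ("out of range", out_of_range),
                          ("flat adoption", flat), ("large expected decrease", big_drop)]:
        print(label, subset["respondent_id"].tolist())

Flagged respondents are candidates for follow-up or removal, not automatic deletions; a flat or declining answer can be legitimate, which is why a human review of the flags still matters.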

Your Thought Partner for Preliminary Insights

With its ability to parse large amounts of data, GenAI can be a partner in summarizing data for preliminary insights. This can be especially helpful for highlighting findings while data collection is still in progress, so you can pivot as needed to fill information gaps.

At the most basic level, GenAI can distill data from a large data set, such as survey responses or transcripts from in-depth interviews or focus groups. This is especially powerful for qualitative data, which you can interrogate conversationally.

GenAI can help you begin to craft a narrative for your final report by reviewing multiple transcripts and summarizing recurring themes across interviewees.

 

 Based on the attached interview transcripts, what are the top three concerns or fears about using Generative AI?

Based on my review of the interview transcripts, the top three concerns or fears about using generative AI that were mentioned are:

      • Security and privacy risks – Several interviewees expressed concerns around potential data privacy issues and security risks, especially when using hosted generative AI tools like ChatGPT. There is fear about proprietary or sensitive data being exposed or misused when fed into generative AI systems.
      • Biases and inaccuracies – Multiple interviewees noted worries about biases, inaccuracies or “hallucinations” in generative AI content. There is concern this could be misleading or negatively impact brand reputation if not carefully monitored.
      • Job displacement – A couple interviewees discussed concerns about generative AI eventually replacing human roles and skills, especially in areas like content creation and production. There seems to be some fear of AI making certain marketing jobs obsolete.

Other concerns mentioned include lack of understanding of capabilities/best uses, unethical use/misuse of generative AI, loss of IP, and the need for governance and clear policies around use.
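
If you have a large number of transcripts, the same kind of question can also be asked through an API rather than the chat interface. Here is a minimal sketch using the Anthropic Python SDK, assuming the transcripts are plain-text files in a local transcripts/ folder; the folder layout, model name, and prompt wording are illustrative, and the equivalent call to ChatGPT’s API would look very similar.

    from pathlib import Path
    import anthropic

    # Load the interview transcripts (assumed to be plain-text files)
    transcripts = [p.read_text() for p in sorted(Path("transcripts").glob("*.txt"))]

    prompt = (
        "Based on the following interview transcripts, what are the top three "
        "concerns or fears about using generative AI?\n\n"
        + "\n\n---\n\n".join(transcripts)
    )

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",  # substitute whichever model you have access to
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.content[0].text)

As noted at the start of this article, confirm your company’s policies before sending transcripts to any hosted model.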

You can also use GenAI to summarize answers to specific research questions, identify gaps in the research, or test hypotheses based on interview transcripts.

One of our research hypotheses is that AI has not yet reached maturity as a technology. Do these interviews confirm or disprove that hypothesis?

[edited for brevity] These interviews generally confirm the hypothesis that AI overall has not yet reached full maturity as a technology:

      • Most of the companies represented are still in experimental phases with generative AI and have not broadly integrated it into workflows, indicating immaturity.
      • Several mentioned the technology is still evolving, that capabilities and limitations are not yet well understood, and best practices still need to be developed.
      • The tools being used are viewed as supplemental at this stage, not mature enough to fully displace existing human skills and roles.
      • Some noted more time is needed to properly evaluate ROI and business value as the technology matures.

The prevailing sentiment seems to be that there is still a long way to go before it is a mature technology ready for ubiquitous adoption and integration. The hypothesis that AI remains an emerging, immature technology aligns with the perspectives shared in these interviews.

Open-ended questions in online surveys can provide rich input into understanding the ‘why-behind-the-what.’ But analyzing text responses can be cumbersome and time-consuming. With GenAI, you can feed the open-ended responses into the tool and ask for a summary of key findings, an assessment of general sentiment, or even a segmentation of the responses.

This data includes open-ended responses to the question asking “Why or why not?” [Q11: Do you agree with the statement: My cloud provider’s built-in backup tools offer complete data protection (backup and recovery) for my public cloud workloads.]

Could you compare and contrast the themes between the respondents who agree and the respondents who disagree with that statement?

 

Respondents who agree with the statement tend to cite the reliability and trustworthiness of their cloud provider’s backup services. They find the built-in tools easy to use and sufficiently meet their current data protection needs. The security of cloud backups over on-prem solutions is noted. Some also mention the cost-effectiveness and affordability of using their provider’s native backup tools.

On the other hand, respondents who disagree point out missing features or capabilities in the built-in tools that prevent complete data protection. The reliance on a single provider makes them uneasy about risk, and they desire redundancy across providers or with on-prem systems. Integration with on-premises systems is called out as a weakness or gap that needs improvement. Compliance with regulations or meeting contractual requirements is another limitation cited. While the tools may serve current needs, some express concern that they will not meet future needs as workloads and needs evolve. A lack of customization options is highlighted by some as well.
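
One way to set up this kind of comparison is to segment the verbatims before prompting, so the model sees the agree and disagree groups side by side. A minimal sketch, assuming hypothetical column names ("q11_agree" for the agree/disagree answer and "q11_why" for the open end):

    import pandas as pd

    # Hypothetical column names for the agree/disagree answer and the open end
    df = pd.read_csv("survey_responses.csv")

    def verbatims(group: str) -> str:
        rows = df.loc[df["q11_agree"] == group, "q11_why"].dropna()
        return "\n".join(f"- {text}" for text in rows)

    prompt = (
        "Below are open-ended responses explaining why respondents agree or disagree "
        "that their cloud provider's built-in backup tools offer complete data protection. "
        "Compare and contrast the themes between the two groups.\n\n"
        f"AGREE:\n{verbatims('Agree')}\n\nDISAGREE:\n{verbatims('Disagree')}"
    )
    # Send `prompt` to the model exactly as in the transcript example above

Segmenting first keeps the comparison explicit and makes it easy to rerun the same prompt as new responses come in.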

GenAI can also help by providing outside context, comparing findings in your custom research to historical data or trends. It can recommend an outline to logically structure your report, ensuring that it highlights the findings that will be most interesting to your target audience, and can even suggest data visualization techniques to create compelling visuals. With this kind of assistance from GenAI, you can develop more thorough preliminary findings that will resonate with your stakeholders.

Polishing a Final Deliverable

Researchers can make GenAI an integral part of their projects, benefiting from data quality control capabilities and assistance in shaping preliminary insights. GenAI tools not only save time but can also enhance the overall quality and depth of research findings.

With a more efficient data collection process and a clear view of where your research is heading, GenAI can elevate your final deliverables with both content and format support. This will be the topic of the next installment of this series.

Reach out to us today at [email protected] to learn more about how GenAI can help optimize your research projects.

Read more in our AI for Market Research series

Part 1: Objectives, Onboarding, Methodology

Part 2: Methodology Selection + Participant Screening

Part 3: Data Collection Instruments

Part 4: Quality Control + Preliminary Findings

Part 5: Final Deliverables 

Part 6: The Future + Synth Data (Coming Soon)