In the first four episodes of our AI in UX research series, we ask industry experts questions about the relationship between AI and UX research. Make sure not to miss the upcoming episodes and the final report from our AI in UX research survey.
Our series covers the following topics:
- Episode 1: Is the rise of AI use a benefit or a detriment to UX research?
- Episode 2: What would be the one aspect of UX research that is best compatible with using AI?
- Episode 3: Can UX researchers remain market viable if they don’t choose to adopt AI?
- Episode 4: Thoughts on AI-generated responses / AI-based users
- Final report: Results of the AI in UX research survey
In this episode, we will look at the answers of industry experts to the question:
“Do you think a UX researcher can remain market viable down the road if they don’t choose to adopt AI into their research process? Please explain why yes or why not.”
One of the touchiest subjects for most people’s egos is their own replaceability. We all sometimes believe that no one, and nothing, could replace us. So when someone… or something with that potential appears, we tend to get defensive and immediately view it as a threat. This often leads to a more negative stance toward anything connected to that person or thing.
This could be the point of view of many UXers who see AI, above all else, as a threat to their job security. In this day and age, job-security anxiety is a real thing that can easily take a toll on anyone’s mental health, so this view is easy to understand.
However, we believe the situation is not that straightforward and that AI most likely won’t replace us in the near future. We asked some renowned UX experts what they think about this touchy topic, to learn whether the concerns any of us might have are warranted.
Here are the industry experts and thought leaders we asked for their opinions:
- Debbie Levitt, MBA
- Darren Hood, MSUXD, MSIM, UXC
- Caitlin D. Sullivan
- Joel Barr
- Dr Gyles Morrison MBBS MSc
- Stéphanie Walter
- Kevin Liang
- Nikki Anderson-Stanier, MA
- Julian Della Mattia
- Kelly Jura
- Ben Levin
In the paragraphs below, we list the answers we managed to gather, and at the end, we will let you in on our own stance as well.
Debbie Levitt, MBA
I actually think that UX Researchers will be more market viable if they don’t adopt AI into their process. The more you adopt AI into your process, the more anybody is going to look at UX research and say, “Well I guess I can do this too. I guess I can go talk to a bunch of people and feed those recordings into AI, and it will tell me what the important points were, and I’ll just report that out. Wow. Maybe I don’t need these Researchers at all.”
Our talent and skill focuses on the full arc of the research. We start with planning research. Can AI do that? I haven’t been impressed when I’ve seen people say, “I asked AI to write questions for my research, and here’s what it gave me.” They were pretty proud. I found the questions to be poor. I found them to be examples of questions we shouldn’t ask in UX research. Planning questions and making sure the questions really match the goals; to me, that’s something a human should still do.
UX research is all about the talent and skill, and to some extent experience, that I bring to the full arc of your project: the planning, recruiting the right number of people from the right population, holding the sessions the correct way, asking the right questions the right ways, not biasing the participant based on something I say or do, not asking ridiculous questions, etc. Then there’s all of the analysis and synthesis that I do, which leads to all of the mapping, documentation, and cataloging problem statements I create.
If you say, “Hey, you know what? I’m going to have AI write our questions and I’m going to have AI answer those questions. It will pretend to be the user. It will write the questions. It will answer the questions. Then AI is going to take that data and it’s going to analyze and synthesize it. And AI is going to tell us what the most important points were and guide our company on what it should do next.” OK. Well, hypothetically, if that is your adoption of AI into your CX or UX or research practice, are you still a Researcher? Are you still doing research? Is it good research?
Ultimately, we have to think about the chain and its weakest link: the moment that any of these goes wrong in a small or large way. The moment that any of these is flawed or low quality in small or large ways, we probably set the rest of the research up for small and large failure.
Researchers will be more marketable when they present themselves as the talent, skill, and expertise that you can’t get from AI, because currently, you can’t. AI just doesn’t replace Researchers, even where people think it can. AI can write questions for me, but that doesn’t mean it wrote good questions. Hey, it can analyze and synthesize for me. That doesn’t mean it did that well. There is a difference between getting something done and getting something done really well.
A lot of our companies expect world-class work from us, and our customers expect five-star quality. Why would we deliver them anything less? We can always be faster and cheaper. But how about we work on being better? Our customers need us to be better. Our customers know we suck. Why aren’t we working on being better? Why is it always “be faster and cheaper”? That’s what the company might want, but it’s not what your customers want.
As AI improves, there are opportunities for Researchers to bring it into their practice. But if someone isn’t using AI right now, I think that’s great. If someone is using AI right now, it could be great. But we have to compare it to what a human would have done. If it’s not as good as or better, or if it is a downgrade, then I would say that person has made themselves less marketable.
Darren Hood, MSUXD, MSIM, UXC
UX researchers can definitely remain viable because actual skill trumps AI usage. Not adopting AI into one’s research process is not a deal-breaker, because it doesn’t decrease the amount of value we bring (as long as the practitioner is truly skilled).
Caitlin D. Sullivan
Founder of User Research Consultancy and UX Research Advisor. You can find Caitlin on her LinkedIn.
I do think it will become more challenging, just as it has become a bit more challenging over the years to be only a qualitative or a quantitative researcher, for example. One can certainly specialize, and there have still been many jobs for people with one of those focus areas. But having an understanding of quantitative methods as a qualitative researcher, for example, often makes a person more competitive in the market. I see AI skills as adding a similar potential value and breadth that might be more competitive, depending on the company you’re hoping will hire you.
Joel Barr
Joel is a Lead User Researcher. You can find him on his LinkedIn.
Yes, I do. Until AI reaches that truly generative stage (able to think and program itself, and to advise researchers accurately and predictively about human behavior and actions while preventing human error), there will always be a need for UXRs who do not rely on or use AI in its present iteration.
In the realm of UX jobs, it may be the safest place for someone to end up as far as job security goes. AI is nothing more than a tool. A co-pilot. A personal assistant. It can no more do my job for me than I could for it. Humans will always, always prefer to talk to another human about what they want/need/desire for physical and digital interfaces and cognitive engagement. Is talking to an AI that looks and sounds human fun? Yup. Does it leave the human feeling validated that another human has heard and understood their plight with the poor functionality of an interface? Nope. And let’s be honest, interacting with an AI is as analog as interactions get. Basic input/output exchange without much variability in the task. You give an assignment or query, it spits out a response. Rinse, repeat.
Dr Gyles Morrison MBBS MSc
Clinical UX Strategist and UX Mentor. You can find Gyles on his LinkedIn.
I think a UX Researcher can still compete in the marketplace even if they don’t use AI, as long as they continue to show business value for their efforts. But it will be hard to convince clients/employers to accept analysis taking weeks when it can be done in hours to days with AI.
Stéphanie Walter
Again, it depends on what type of AI tool we are talking about. I don’t think AI will replace UX researchers, but I think that the ones who know how to leverage some AI tools can be more efficient.
And, in today’s capitalistic society, efficiency means money. So it might be tricky at some point if you don’t know how to use a couple of those tools. Also, because it’s a trend, recruiters are going to look for the keywords on resumes. Which, again, is sad, but that’s the way recruitment works.
Kevin Liang
In 2019, radiologist Curtis Langlotz from Stanford University was quoted as saying, “AI won’t replace radiologists, but radiologists who use AI will replace radiologists who don’t.”
UX research-wise, the smart people at MeasuringU ran an empirical study comparing the reliability of human coders vs. ChatGPT. ChatGPT was only slightly less reliable, and it will only get better. Plus, it can handle thousands of pieces of data in a short time, whereas a human can’t. But the narrative is the same: it’s all about people. AI won’t replace humans, but people who know AI will.
So prompt engineering will be a key skill to learn; there are a couple of free courses on Coursera to check out. In my conversations with hiring managers using AI in their teams, many service roles are being replaced with AI. Luckily, depending on who you’re working with, the human might not be removed but rather moved to a different function where their time or expertise is more valuable.
On a societal scale, another concern is the erosion of human creativity. Overreliance breeds subjugation, and we forget how to think for ourselves. We start thinking along the average line, or along whatever the algorithms output to us. I hope we create environments conducive to exercising critical thinking, which begins and continues with scientific thinking and its built-in self-corrective procedures. We don’t want to end up like the humans in WALL-E, right?
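As an aside, the MeasuringU-style comparison of human coders and ChatGPT hinges on inter-coder reliability. A standard way to quantify it is Cohen’s kappa, which corrects raw agreement for chance. Here is a minimal Python sketch; the theme labels are invented purely for illustration:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labelling the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labelled identically.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    pe = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical theme codes assigned to six interview quotes.
human = ["nav", "nav", "search", "checkout", "nav", "search"]
ai    = ["nav", "search", "search", "checkout", "nav", "search"]
print(round(cohens_kappa(human, ai), 3))  # → 0.739
```

Values above roughly 0.7 are commonly read as substantial agreement, which is the kind of threshold a study like MeasuringU’s would be checking an AI coder against.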
Nikki Anderson-Stanier, MA
Absolutely! There are all kinds of user researchers out there who have chosen to adopt certain approaches/frameworks and those who haven’t. There are many specializations within the field of user research, and not all of them lend themselves to using AI.
For instance, a user research team of one might use AI for certain tasks such as collecting and organizing large amounts of data, but a larger research team that has people dedicated to that task might not. I think it really depends on the environment you are in and your context.
I would say, there are certain tasks that it will make sense to automate using AI and that user researchers should ideally adopt those because it will ultimately help clear up time to do more important tasks, like more research!
Julian Della Mattia
As I said, AI is just another tool. You can choose to implement it or not, but in the end, what really matters is the outcome of your work: whether you’re helping create value for the company, the business, and the users. Our work is, and should always be, tool agnostic.
Kelly Jura
UX researchers should adopt AI in ways that make them more productive and that optimize the research process. As AI matures, UX researchers will employ novel ways to design and improve user experiences more efficiently. That said, any researcher who relies too heavily on AI to conduct and analyze their research will certainly produce poor results, and a negative impact on UX research.
Ben Levin
UX Researcher & Strategist and Managing Partner at Chamjari. You can find Ben on his LinkedIn.
For a UX researcher to remain viable in this market, she’s going to have to make use of whatever tools make her job more efficient. But this doesn’t necessarily mean diving headfirst into using a large language model to analyze her research sessions.
In fact, that could be quite counterproductive. Many of the current models show strong biases, lack contextual awareness, and are prone to creating summaries of submitted text that are semantically logical but contextually incoherent.
As a result, you may end up spending more time “debugging” any AI-generated analysis than you would spend just performing it on your own.
Having said that, there are areas where, especially for individual, independent practitioners, automation tools can make a team of one far more efficient.
Participant recruitment and compensation, screening, scheduling, panel management, transcription, and the generation of “highlight reels” are all areas of UX research for which automation tools exist and can perform quite reliably. AI might make them more efficient, but that has to be balanced against the quality check a Researcher will have to perform on the results.
Knowing the difference between a tool that will enhance and expand your capacity, and one which will inhibit it, isn’t easy. But the UX researchers who master the ability to make that discernment will certainly go further than those who don’t.
Ben’s opinion is quoted from his article Can UX Researchers Remain Viable Without Integrating AI Into Their Practice? You can read his full opinion there.
Our two cents
We fully respect everyone’s right to their own feelings, and we understand that the anxiety caused by topics like this can be very real and hard to beat. From our point of view, the best way to combat it is to embrace AI as a tool and companion, not as competition.
We fully agree with the opinion that AI lacks the human touch. This includes creativity, thinking outside the box, intuition, and also the quirks and occasional wrong decisions that are all so very human. AI will not be able to reproduce, or even approximate, these in the near future. If UXers focus on these aspects, there is no way AI can replace them. We believe that the experience and intuition of a seasoned UX researcher (or other specialist) are incredibly important and valuable, and the rise of AI won’t diminish their “market value”.
However, it is important that we don’t ignore AI, assuming it can never replace us. It won’t, at least not directly. We also agree that the UXers who understand how to utilize AI to its fullest potential will be the best positioned in the market.
Yes, experience and talent will still be the most important factors. But if you consider two UXRs with comparable experience and talent, the one who can make their work more efficient will be the more successful one. AI, if grasped correctly, is a highway to efficiency.
Therefore, for future job security, it is paramount to familiarize yourself with the options AI provides. Spend some time honing your prompt engineering skills, learn how to integrate AI into your data analysis pipeline, and so on. Don’t let yourself be left behind just because AI isn’t fully there yet. It isn’t, but it will keep getting better every year.
What to look forward to?
We hope you found the insights we gathered as exciting as we did. If you haven’t yet, we recommend reading the first episode, Is AI a benefit or a detriment to UX research?, and the second episode, What aspects of UX research are best compatible with AI? The next episode will focus on the fourth question:
“What are your general thoughts on AI-generated responses / AI-based users? What are the biggest advantages? What should they never be used for?”
After all the episodes are out, we will bring you a comprehensive report containing the results of the survey on how the UX community views the current state of AI. Stay tuned!
Let us know your answer to our question in the comments on our LinkedIn!