Spotlight: Michael Machado on why legacy support tools aren’t built for AI
12 min read
2023 was the year of AI in customer support, with an influx of automation tools and AI bots promising to revolutionize service interactions. While basic chatbots are now ubiquitous, truly harnessing the power of artificial intelligence requires rethinking tools, workflows, and data at a fundamental level.
I sat down with Michael Machado, our Head of Product at DevRev, to understand the role AI will play in support at all complexity levels, why legacy support tools are not capable of harnessing the full potential of AI, and why SaaS companies cannot rely on generic support solutions.
Here are some of the key takeaways from our conversation:
- Spinning up a knowledge-base-informed chatbot will become a commodity in Level 1 support, but to fully automate L1 deflection, companies should consider fine-tuning, hallucination mitigation, and reinforcement learning mechanisms.
- Level 2 support is stuck in outdated rule-based frameworks. The future of CX will be defined by AI assistants that can handle multi-turn conversations to resolve complex, personalized customer requests. This requires migrating to skill- and goal-oriented bots that seamlessly leverage the function-calling frameworks of large language models.
- To enable AI-assisted Level 3 and 4 support, companies need to unify all customer, product, and user data into a single platform. Without this consolidation, AI lacks the full context needed for technical troubleshooting.
On support for SaaS
Aditi Mishra: In the last year, we have seen almost every support tool rebrand itself as an AI-first solution. But are all AI solutions created equal?
Michael Machado: SaaS needs a whole new level of collaboration and automation that existing support tools are not capable of providing. A doctor's office, for example, is going to have completely different needs than a tech company. And that’s why you need a support solution purpose-built for digital-first software companies. Our vision is to create a purpose-built support solution for tech companies that optimizes for L1 and L2 automation and streamlines the workflows for L3 and L4, from escalation to resolution. Right now, you’ve probably bought a ticketing system that is built for L1 and a little bit of L2. The past was L1 and L2 call centers, right? Everyone was building call center software for support. And that's gone. What the future looks like is not what the past looked like. Your ‘human’ team will be primarily responsible for tech support. That's where you're going to be able to drive ROI for your team and customers, informing your team on the needs of the customer to solve problems faster and build a better product.
At DevRev, we want to help you do the hard things. We provide the food, shelter, and clothing for support teams: an omnichannel communication and ticketing platform, SLAs, CSAT, NPS, and the rest of it. But as a tech company, you need to evaluate your software on more than just these parameters and start thinking about higher-order needs. That’s not to say that ticketing doesn't matter; it's just that platforms that only deliver on your ticketing needs don’t give you the opportunity to go beyond vanity support metrics. You are not a legacy company. You're an innovation company. You're a growth company. You’re thinking about scale and operations, and you're thinking of customer success as equal to customer support. So you need to be asking if your support solution has access to product data. Not only do your teams operate better with that data, but AI needs that data to reach its full potential.
That’s how I can confidently say that DevRev exceeds the benchmarks of any of the commodity-level solutions; we have a scalable, easy-to-configure, easy-to-tune L1-L2 support automation solution and our L3/L4 workflow is enriched through product and customer data.
On L1 support
Aditi Mishra: Let’s start with L1 support. That’s probably the most obvious use case for AI. How are you thinking of deflection in L1 support?
Michael Machado: L1 deflection is going to be commoditized over the next year; it’s already on that path. Every company in the world is going to have a question-and-answer bot of some sort. And those bots are going to claim that you can increase your percentage of L1 support deflections with AI. But you need to peek under the hood and examine how these bots are really doing deflection, handling hallucinations, delivering personalized answers that are on-brand, and, of course, self-healing.
What sets a powerful AI bot apart is better feedback loops, continuous training, and the ability to customize the model. With Turing Answers (our in-app chatbot powered by Gen AI), we’re saying that you don’t need to spend all these hours writing knowledge base articles, auditing them to keep them up to date, building workflows to answer queries, and so on. All you need to do is enter your website or help center URL, and Turing AI can scrape your website and auto-create its knowledge base. But even beyond that, semantic search allows your users to self-serve as much as possible. And with human-in-the-loop reinforcement, the chatbot fine-tunes itself and doesn’t need your support ops team to manually configure it.
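To make the mechanics concrete, here is a minimal sketch of the general pattern behind knowledge-base-informed deflection: articles are embedded, the closest match to a question is retrieved, and low-confidence questions are escalated to a human whose answer can feed back into training. This is an illustration of the pattern, not Turing's implementation; the toy hashing `embed()` is a stand-in for whatever embedding model is actually used.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Article:
    title: str
    text: str
    vector: np.ndarray  # embedding of the article text


def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy bag-of-words hashing embedding; swap in a real model in practice."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    return vec


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def deflect(question: str, kb: list[Article], threshold: float = 0.3):
    """Return the best-matching article, or None to escalate to a human."""
    q = embed(question)
    best = max(kb, key=lambda art: cosine(q, art.vector))
    if cosine(q, best.vector) < threshold:
        return None  # low confidence: route to an agent and log it for retraining
    return best


# Knowledge base (in practice, built by scraping your help center).
kb = [
    Article("Reset password", "How to reset your password from the login page.",
            embed("How to reset your password from the login page.")),
    Article("Add a user", "Admins can add users from the settings page.",
            embed("Admins can add users from the settings page.")),
]
print(deflect("How do I reset my password?", kb))
```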
On L2 support
Aditi Mishra: What about L2 support? That’s beyond just article-based deflection. You need your support agents to collect more information from customers and do some more analysis on queries. How does AI help there?
Michael Machado: L2 is an interesting story. With L2, we have situations where you can't just look up a knowledge article to solve the problem; you need business data around the product and user, the context of an individual customer, to serve the request. The old way required customers to build extremely complex rules to ask the user’s intent, prompt for more information, fill slots, and then perform an action. It improved efficiency, but it was painful to manage and shifted the responsibility to users (like a painful IVR system), and that is what every other support tool is offering right now. The problem is that it doesn’t scale, and these tools just don’t have the ability to completely do away with the old way and start afresh. Customers are afraid to touch them, as any change breaks too much hard-coded, regex-defined logic, and for providers, there is no incentive to push through the change management requirements to modernize them.
This is why DevRev says we have a blank slate. We have the ability to start from scratch and say that the idea of rule-based intents and slot-filling prompts to handle L2 support queries is not working and is going to go away.
The new version of that is Gen-AI-based and uses skills and functions. Your ‘skill’ is what you probably used to call an ‘intent,’ and your ‘function’ is the functional requirement to complete the task. These are usually goal-oriented. For example, let’s say your goal is to help a customer add a new user. Right now, the function to do that is:
→ I need to authenticate you
→ I need to know the email of the user you want to add
The old way meant you had to hardcode all these things. You either gave the customer a form, or you asked them a series of hard-coded questions, and you went back and forth, back and forth, back and forth regardless of the customer's response (the antithesis of a conversation). L2, for the past 10 years, has been no better than IVR; we just replaced the 1-800 number with a series of forms and modals.
We're saying let AI solve that problem for you. Just tell us what good looks like. What API call do you have to make? We can pull all the parameters from that, and then we will prompt the user for the right inputs. But it will be Gen-AI powered, guide the customer without hard-coded rules, and provide the flexibility to solve multiple customer needs in a single conversation. That's going to be a much more scalable support automation solution, require less administration, and provide an optimal customer experience. We've been at the cutting edge of defining this modern framework and working with the best foundation model companies to create LLMs purpose-built for support organizations.
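As a rough illustration of the skills-and-functions idea (a generic sketch, not DevRev's actual framework): the "add a user" goal can be described as a function schema of the kind LLM function-calling interfaces expect, and the bot can derive which parameters still need to be collected from the conversation instead of walking a hard-coded decision tree. The `ADD_USER_SKILL` schema and `missing_slots()` helper below are hypothetical names for illustration; in a real system, the LLM would phrase the follow-up question and extract the answer from the customer's reply.

```python
ADD_USER_SKILL = {
    "name": "add_user",
    "description": "Add a new user to the customer's account.",
    "parameters": {
        "type": "object",
        "properties": {
            "account_id": {"type": "string", "description": "Authenticated account ID"},
            "email": {"type": "string", "description": "Email of the user to add"},
        },
        "required": ["account_id", "email"],
    },
}


def missing_slots(skill: dict, collected: dict) -> list[str]:
    """Parameters the assistant still needs before it can call the API."""
    required = skill["parameters"]["required"]
    return [name for name in required if name not in collected]


# After authentication, only the email is still missing, so the assistant asks
# for it conversationally rather than forcing the customer through fixed forms.
print(missing_slots(ADD_USER_SKILL, {"account_id": "acct_42"}))  # ['email']
```

Because the goal is described declaratively, adding another skill means adding another schema, not another tree of rules.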
On L3 support
Aditi Mishra: Okay, L3 support. That’s the level most unique to SaaS companies and is probably a founder’s or CCO’s biggest priority. It requires support engineers to work with developers, product teams, and more. What role do you see AI playing in L3 support?
Michael Machado: You’re right in that L3 requires support to work with multiple departments. But the fact is your legacy support tools don’t allow for this. If your agent needs to switch from their Zendesk to log a ticket on Jira and then follow up with the developer on Slack, then that’s a collaboration problem. A superficial AI solution isn’t going to solve this. The fact is, AI does not care about your departments. You are a SaaS company. You need to be thinking about what it takes to deliver data to make your AI as performant as possible on everything that's sourced from your product, from your services, from your teams, from your roadmap, from your customer data, from your session recording data. That is how we are going to drive better L3 productivity and L4 resolution.
With DevRev, you bring all your customer, product, and people data into a single platform. Our extensible object model allows you to speak to your developers working in Jira without ever stepping out of DevRev. And Turing AI is more powerful because it’s built on this huge knowledge graph. Surfacing the latest Git event, the observability signals we're getting from Datadog, the session recordings, the user behavior, the crash analytics, and so on is seamless. Plus, support teams have visibility into the feature roadmap, the prioritization of work, and all the ongoing work connected to engineering and product, so agents are fully empowered to answer more technical, product-related questions.
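As a simplified illustration of why that consolidation matters (a generic sketch, not DevRev's actual object model): if the ticket, the engineering issue, and the product signals live as linked nodes in one graph, an assistant can walk the links from a ticket and assemble the full context for its prompt, instead of an agent hopping between a ticketing tool, Jira, and Datadog.

```python
from dataclasses import dataclass, field


@dataclass
class Node:
    kind: str                      # "ticket", "issue", "git_event", "signal", ...
    data: dict
    links: list["Node"] = field(default_factory=list)


def context_for(ticket: Node) -> list[dict]:
    """Collect everything reachable from a ticket for the AI's context window."""
    seen, stack, context = set(), [ticket], []
    while stack:
        node = stack.pop()
        if id(node) in seen:
            continue
        seen.add(id(node))
        context.append({"kind": node.kind, **node.data})
        stack.extend(node.links)
    return context


# A ticket linked to the issue it escalated to, and the commit that fixed it.
commit = Node("git_event", {"sha": "a1b2c3", "message": "Fix null session token"})
issue = Node("issue", {"key": "PLAT-101", "status": "in review"}, links=[commit])
ticket = Node("ticket", {"id": "TKT-9", "summary": "Login fails after update"}, links=[issue])
print(context_for(ticket))
```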
On the platform
Aditi Mishra: A lot of our differentiation also comes down to our platform. Can you explain how DevRev’s platform lays the groundwork for all the powerful capabilities we have built in?
Michael Machado: Being founded post-2020 helped us build a platform that is truly unparalleled. Outside of AI, in the past five years we’ve seen that the best companies leveraged the latest advancements in cloud services, mobile, and browser technology. Twenty years ago, you were really limited in your ability to build analytics and automations into the platform. That’s why Salesforce has had to acquire rather than build much of its collaboration and automation toolsets. That's why Zendesk doesn't have a workflow engine that can connect to webhooks across third-party apps and instead makes you wait hours for a job to run.
We often use the term ‘micro-tenancy’ when talking about our platform. It drives our object model, our workflow engine, our analytics, and our ability to give you a scalable solution. We give you feedback loops and the ability to curate and improve the LLM. You can even bring your own LLM, for example an open-source model like Llama, because we understand how dynamic the AI landscape is today. Our ability to provide in-browser analytics, semantic search, RAG, workflow engines, and serverless architecture ultimately boils down to our platform capabilities.
On the roadmap
Aditi Mishra: Give us a glimpse into what’s coming up for DevRev. What are you most excited about?
Michael Machado: We’re just continuing to find new ways to redefine what support looks like for tech companies. We want to make user observability available in-app so it’s easier for L3 and L4 to perform root cause analysis and reproduce customer problems. We want to collect qualitative, multi-modal data and provide LLM interfaces for you to understand what customers are telling you. Over the next six months, I'd love for our users to be able to search and query screen recordings in natural language at a journey level. For example, it would be like being able to ask: Which customers drop off before they perform this action during onboarding? It's a hard question for us to answer, but we have to make it a simple one. In the end, I think SaaS apps need to work for their users: providing mechanisms to ask questions and get answers, offering interfaces to command the application rather than the application probing for updates, and, in general, bringing the same beautiful design experiences we see in B2C to the apps we use every day at work.
When we started building DevRev, we understood the importance of low-latency analytics, a powerful workflow engine, and built-in collaboration features. That's what our blank canvas approach has afforded us.
As Michael outlines, realizing AI's full potential in customer support requires rethinking existing frameworks around skills and knowledge graphs and consolidating data silos. Legacy support systems lack the capabilities to enable next-gen AI, underscoring why purpose-built solutions like DevRev are critical for software companies to deliver unmatched customer support.