Chad Johnson: Hello, you’re listening to [00:00:30] another episode of Cloud and Clear. I am your host Chad Johnson, director of AI Development at Insight, and today we are pleased to welcome another Chad. Chad Brothers, welcome. Welcome to the show.
Chad Brothers: Thanks, Chad. Appreciate you having me.
Chad Johnson: Before we jump in, why don’t you give us a little bit of an overview about Viiz Communications and what you guys do?
Chad Brothers: So Viiz Communications is an emergency services solutions provider, offering call center and non-emergency solutions for the public safety space, [00:01:00] in addition to providing a large portion of directory services and operator assistance programs for large telecommunications companies across the North America region.
Chad Johnson: That’s awesome. That’s such a dynamic space and so important to many of us. When you and I originally met, we were going through some of the challenges that you guys face in that industry. Why don’t you give us a little bit of an overview of [00:01:30] why what we’re going to talk about in a second was so important?
Chad Brothers: In our role within the public safety space, we provide a lot of fallback call assistance through our call centers, supporting large telecommunications companies in their effort to ensure that their customers are able to get to public safety 911 centers, PSAPs, or Emergency Communication Centers. And [00:02:00] through that process and our work within the industry, we, along with others, noticed that one of the larger portions of work that is ostensibly farmed out to these 911 centers is non-emergency calls. Over time, because they’re open 24 hours a day, the PSAPs have really become the place where local [00:02:30] communities direct these calls as they reduce their hours or the availability of folks who can answer the phones. That starts to distract from the centers’ primary mission, which is first responder support: connecting callers to first responders and getting those requests for assistance addressed.
So that was really where [00:03:00] we started to look at options to provide solutions within the space. How can we help these centers do more with the resources that they have on the ground? I think that, combined with a significant staffing crisis within the 911 space, really started to move us in the direction of finding technology solutions to address these [00:03:30] critical issues that we’re facing in the industry.
Chad Johnson: So why did gen AI jump to mind? If I can interject: I think that natural, real-time communication is always a challenging place to introduce technology. What did you start to see in gen AI that seemed like it would be a hope for the future?
Chad Brothers: Well, I think it was a real crossroads within [00:04:00] the space. The industry itself started to see the staffing crisis arise in the late teens, the 2017-2018 timeframe, when people were raising the flag. And the industry really started looking at, well, how do we process our way out of this? How do we change policy within hiring practices and things? It really wasn’t moving the needle. And so as AI started to come on the scene and become [00:04:30] more available, and especially as we started to see it tied with contact center solutions, it really felt like our sweet spot. We do contact center work, we have call centers, and when you start to marry the two, gen AI and contact center solutions, you start to see a picture form of the type of work that we can take off the plate for these public safety spaces that [00:05:00] are dealing with calls that aren’t necessarily emergencies. Maybe there’s an application, a service, a suite that we can put in front of them that would help that workflow, reduce the strain on their resources, and provide them equivalent outcomes.
Chad Johnson: Why don’t we get a little more specific? So let’s take a little scenario. Let’s say one day I [00:05:30] look outside and there’s something going on on my street: there’s a cat stuck in a tree. So I call my center and need some help with that, but it’s not urgent. What does the solution do in that situation?
Chad Brothers: Well, let’s maybe take a step back and say, well, what happens without that solution, right? So without that solution, that person in their community is likely going to call a standard 10-digit number, [00:06:00] right? They’re going to pick up their phone, they’re going to call a non-emergency line, and ostensibly that’s going to arrive at a 911 center, because 911 centers have 911 lines and then they have non-emergency lines, sometimes called administrative lines. And so they broadcast these, and they do a good job of telling their communities, if it’s not an emergency, don’t tie up a 911 line; call this 10-digit administrative line and we will determine [00:06:30] whether or not we can support you with that service. And so in your use case, someone would call that 10-digit administrative line and they would get a dispatcher, who’s generally on the same floor as the 911 call-takers, and they would find the appropriate service, whether it’s a dogcatcher or animal control, or whether it’s the fire department coming out with a ladder truck to climb up that tree. The same person’s answering [00:07:00] both calls.
Now, with gen AI, what we can do is take those calls from those administrative lines, move them into a different queue, and match them up with the services available within the community that don’t necessarily need a 911 dispatcher or telecommunicator to answer that call. So for example, I could direct that call directly to the [00:07:30] animal control office and allow them to address the caller’s question, because the call is generally going to end up there anyway. So if we can cut that step out, one, we can get the caller to the help that they need more directly, and two, we can keep that 911 telecommunicator focused on emergency issues. Because a cat in a tree, well, it may be an [00:08:00] emergency to the cat owner, but it’s not technically an emergency that needs a first responder, right? Otherwise we’re taking resources away from someone who is perhaps having a cardiac arrest in order to address the cat, and we don’t want that.
Chad Johnson: Are you starting to notice some quality of life improvement for the operators? Does this make them more content in their job to not have to deal with so many of those less urgent calls and get to focus on the ones that are maybe [00:08:30] a little more impactful?
Chad Brothers: That’s a great question, and I would say in general, yes. One, it just drops the volume of their calls, right? So if you imagine a call center that gets a thousand calls in a day, 60% of those are generally non-emergency calls. That’s a significant number of calls that you can take off their plate. I’m not suggesting that we would [00:09:00] address all 60%, but if we can address half of those, that is still a very significant impact on the volume that these call-takers have to address on a day-to-day basis. So one, you don’t have as much context switching, right? They can stay focused on their primary mission, what they’ve been trained for: dealing with emergencies.
So I do think that there [00:09:30] are some positive quality of life impacts that folks are seeing, right? When we talk to the centers that we’re working with today, they immediately say, “Well, it’s in the background and I just don’t notice all of the calls that we were used to getting.” People asking about where do I go to get my license plate tags or can I park on this street, [00:10:00] things of that nature that just tend to draw them away from the important stuff.
Chad Johnson: So I’m assuming that you had to go through a little bit of work to train, or maybe tune is the more precise word, the system to discriminate between these different types of calls, to understand where to route different types of requests. Was that process onerous and problematic, or did it go really smoothly? [00:10:30] How well was the gen AI able to learn the types of things you wanted it to do?
Chad Brothers: It went a lot faster than I thought it would. Of course, there’s tuning that needs to happen after you’ve taken that initial dive into the pool, if you will. So in our implementation, we chose to use what are called playbooks within the GCP platform, using conversational agents. What we wanted to do was really to [00:11:00] avoid, to the extent that we could, being rigid within a deterministic-type flow approach.
Chad Johnson: Right. I don’t want to have to list every possible situation.
Chad Brothers: And while I think there are pros and cons to both of those sides, and there may be some aspects where a deterministic flow works better, the approach that we really wanted to take was to see if we could get to a higher percentage of calls being addressed by the [00:11:30] AI through the playbook approach. The playbook approach allowed us to not be rigid in how we address the call, along with the architecture that we married it up to with data stores. So the approach we wanted to take was using the data that was available, publicly in a lot of cases, and in some cases not necessarily [00:12:00] publicly, to allow these conversational agents to reach back into it and take a RAG approach to the response: I’m going to give you the answers to this large subset of questions that is available in the data, and I want you to formulate a response that appropriately addresses the inquiry.
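The playbook-plus-data-store pattern described here is essentially retrieval-augmented generation: pull grounded content out of a data store, then have the model compose the reply from it. A minimal sketch, with a toy keyword retriever and a templated response standing in for the real vector data store and the conversational agent (the documents and scoring below are illustrative assumptions, not Viiz’s implementation):

```python
# Minimal RAG-style sketch: retrieve grounded documents for an inquiry,
# then compose a response from the retrieved context.

def tokenize(text: str) -> set[str]:
    """Crude word tokenizer; strips common punctuation and lowercases."""
    return {w.strip(".,?;").lower() for w in text.split()}

# Hypothetical grounded data store, e.g. scraped from a county website.
DATA_STORE = [
    "Animal control handles stray animals; call during business hours 8-5.",
    "Bail and bond can be posted online or in person at the county courthouse.",
    "Report power outages to the utility provider, not the 911 center.",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents with the highest keyword overlap with the query."""
    q = tokenize(query)
    ranked = sorted(DATA_STORE, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:k]

def answer(query: str) -> str:
    """Formulate a grounded response. In the real system the retrieved text
    would be handed to the generative agent; here we surface it directly."""
    context = retrieve(query)[0]
    return f"Here is what I found: {context} Is there anything else?"

print(answer("How do I post bail or bond?"))
```

A production setup would replace `DATA_STORE` and the overlap score with an indexed data store and embedding search, but the shape of the flow is the same.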
Chad Johnson: This is data that I assume was theoretically available to the human agent, [00:12:30] but the generative AI playbook can scan through probably way more potential data sources simultaneously, looking for phone numbers, who to call, what department; all of these different situations were documented.
Chad Brothers: Exactly, and where it’s not, there’s an opportunity to build and add on to a dataset. So a good example might be if I were to go scrape a community’s [00:13:00] website. I’ve got some options: I can use my own web scraping tool, or I can use the embedded Google tool that allows us to stay married to the platform. If I were to use that tool, I could go out and scrape the entire county’s website for all the services and information that they have. And if I do that, then I don’t have to go create a deterministic flow for every one of those outcomes. Rather, I just have to create a few high-level playbooks [00:13:30] that allow me to reach back and provide a generative response.
So if someone calls and they say, “Well, I have a question about posting bail or bond,” instead of me having to create a deterministic flow that directly addresses that and knowing exactly all the permutations of how someone might phrase it, I can just write a simple inquiry playbook that says, “Go reach out into this dataset that I’ve provided you and formulate a response [00:14:00] to this person’s question. You can post bond through this website, these are the hours, these are the times. If you want to do it in person, this is where you need to go. Do you have any questions? Yes, no. Do you want this information in a text? I can send it to you in a text. Thank you very much. Is there anything else? Nope. That’s good.” And the conversation is over.
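The “simple inquiry playbook” idea can be illustrated as a small goal-plus-instructions definition. The field names below are hypothetical, loosely inspired by how GCP conversational-agent playbooks pair a goal with free-form steps, and are not Viiz’s actual configuration:

```python
# Illustrative playbook definition: one high-level goal plus free-form
# instructions, rather than a deterministic flow that enumerates every
# possible phrasing of the caller's question.

INQUIRY_PLAYBOOK = {
    "name": "general_inquiry",
    "goal": "Answer non-emergency questions using the county data store.",
    "instructions": [
        "Search the provided data store for content matching the caller's question.",
        "Formulate a spoken response with the website, hours, and in-person option.",
        "Offer to send the details by text message.",
        "Ask whether the caller needs anything else, then close the call.",
    ],
}

def render_prompt(playbook: dict) -> str:
    """Flatten a playbook into the instruction prompt handed to the agent."""
    steps = "\n".join(f"- {s}" for s in playbook["instructions"])
    return f"Goal: {playbook['goal']}\nSteps:\n{steps}"

print(render_prompt(INQUIRY_PLAYBOOK))
```

One such playbook covers every phrasing of a bail-or-bond question, which is the contrast with a deterministic flow that would need a branch per permutation.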
Whereas without that, without that AI, if I were to have a 911 [00:14:30] dispatcher that’s trying to address that, they’re either going to have to do one of two things. They’re going to say, “That’s not our department. You need to call the courts,” or they’re going to have to go look on the website, the same place that I just did using gen AI.
Chad Johnson: This is my favorite part of the podcast when I get to ask you, how’s it going? What do we know so far? You’re live with this in a couple of places, is it working? Is it accurate enough? Is it fast enough? What’s the feedback been?
Chad Brothers: [00:15:00] It’s definitely fast enough, and it works. It does work, which is great. I think what we find now is that the bottleneck tends to be in the datasets that the community has. The way that I want to describe this is that when we are being very specific about the data that the agent can go reach, we [00:15:30] have to ensure that the quality of that data is good. So one of the things that we did at the beginning, as a function of our project, was we wanted a tool that would give us a readout of how well the agents are going to perform based on the data that we provide. So we created a tool with which we can scrape the data, analyze the data that we’re going to train the bot with and make [00:16:00] available to it to answer inquiries, and give us an assessment.
So we could go to our customer and say, “Hey, we can start with your data, but we’re going to have to build on it, because we can answer about 30% of the questions that would normally come up as topics of conversation with your callers, and these are the gaps. You don’t have enough on parking, you don’t have enough on road obstructions, you don’t have enough on [00:16:30] power outages or animal control. We need to build some additional data so that we can train the bot.” And I think that’s been one of the really amazing things to see out of this: the bot’s hungry for data, it wants to answer the question. The way I look at it, it’s like a puppy. It’s just waiting to be trained and it wants to please us. It wants to give an answer to a question, and when it can’t, [00:17:00] then we need to give it something to do to keep it occupied.
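The data-readiness assessment described above can be sketched as a simple coverage report: compare the topics callers actually ask about against the topics the scraped dataset covers, then surface the gaps. The topic labels are invented for illustration; the real tool analyzes scraped content rather than clean labels:

```python
# Sketch of a data-readiness report: what fraction of common caller topics
# does the customer's dataset cover, and which topics need new data?

CALLER_TOPICS = [
    "parking", "road obstructions", "power outages", "animal control",
    "license plates", "noise complaints", "records requests",
    "bail and bond", "burn permits", "court dates",
]

def assess_coverage(dataset_topics: set[str]) -> tuple[float, list[str]]:
    """Return (fraction of caller topics covered, list of gaps to fill)."""
    covered = [t for t in CALLER_TOPICS if t in dataset_topics]
    gaps = [t for t in CALLER_TOPICS if t not in dataset_topics]
    return len(covered) / len(CALLER_TOPICS), gaps

# A customer dataset covering only 3 of the 10 common topics.
pct, gaps = assess_coverage({"parking", "license plates", "court dates"})
print(f"Coverage: {pct:.0%}; gaps: {gaps}")
```

With this shape of readout, “we can answer about 30% of these questions, and here are the gaps” falls directly out of the assessment.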
Chad Johnson: You have a system in place to catch those times when it’s short of information so that you can recommend that the municipality put in some extra data in those places?
Chad Brothers: Right. So as a part of our ongoing evolution of this product, we’ve been building, along with what’s available [00:17:30] within the GCP platform like Conversational Insights, an analytical set that allows us to see which types of calls we don’t have good data for, and which types of calls we need to create new playbooks for when we see them come up repeatedly. So our initial focus in the build-out, especially with using the playbooks, was to apply Pareto’s Law: [00:18:00] let’s find the 10 things that will get us 80% of the responses back. And now our objective is, okay, let’s go in and fill these little gaps for you. Let’s find the areas where we can add a simple playbook that will give a response and cut maybe 10 or 15 calls a week, as opposed to things like records requests or filing a police report, which we’re seeing from [00:18:30] maybe 50% or 60% of the callers.
Chad Johnson: That makes a lot of sense. It sounds like a long-tail kind of situation, where you’ve really hit the highest-volume calls at the beginning and you’re going to keep working your way through. Is it getting, and I don’t mean to anthropomorphize it too much, but is it getting better? I’m assuming the more information you give it when you [00:19:00] fill in these gaps, the better it gets. It is kind of a whack-a-mole; you just keep removing as many gaps as there are. Will it ever be perfect? I don’t mean to be philosophical, but what does the future look like here? What’s the end state for a system like this?
Chad Brothers: I don’t think it will ever be perfect, and that’s okay, right? Because we’re not perfect as humans either. People make mistakes on calls, and I think that’s one of the reasons why, [00:19:30] collectively within the industry, it feels like doing this with non-emergency calls is not a third-rail topic. Now, it’s not that emergency calls don’t come through these administrative lines, because they do. And that’s also part of the process: being able to weed out what is actually an emergency and what is a non-emergency, even on a non-911 line, and we’ve taken appropriate steps to [00:20:00] do that.
Chad Johnson: Do you actually want to say for just a minute, in case anyone’s worried, what does happen if someone calls in with something truly emergent?
Chad Brothers: Yeah, no, that’s actually a great question. So as a part of that training process, and as I mentioned at the onset, we took an approach where we wanted to focus most of our work on playbooks as opposed to deterministic flows. But one of the hard-and-fast rules that we had at the very onset was that calls that come in that [00:20:30] would be classified as an emergency if they had gone to 911 get a deterministic flow. There are no ifs, ands, or buts about that. So if someone calls in and says they’re having chest pains, someone calls in and says they’re being attacked, someone calls in and says there’s been an accident, or there’s a burglary alarm or something of that nature, that’s automatically getting filtered out at the very onset. It goes to a deterministic [00:21:00] flow and gets driven directly to a 911 agent. So it’s helpful to be able to carve that out, to have that sense of assurance that it does this, and it does it very well, in terms of determining whether the caller’s intent is an emergency or a non-emergency situation.
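The hard-and-fast triage rule might look like the sketch below: scan the caller’s opening statement for emergency indicators before any playbook runs, and route any hit straight to the deterministic flow and a live 911 agent. The keyword list is purely illustrative, not the production classifier:

```python
# Sketch of the triage rule: emergency indicators bypass the generative
# playbook path entirely and route to a deterministic flow.

EMERGENCY_INDICATORS = [
    "chest pain", "attack", "accident",
    "burglary alarm", "fire", "not breathing",
]

def route_call(opening_statement: str) -> str:
    """Classify the opening statement; emergencies never reach a playbook."""
    text = opening_statement.lower()
    if any(indicator in text for indicator in EMERGENCY_INDICATORS):
        return "deterministic_flow:transfer_to_911_agent"
    return "playbook:general_inquiry"

print(route_call("I'm having chest pains"))     # routes to the 911 agent
print(route_call("Can I park on this street?")) # routes to a playbook
```

The key design point is the ordering: this check runs first, so no generative step can ever decide the fate of an emergency call.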
Chad Johnson: [00:21:30] What’s on your to-do list for the future? Is there anything you want to focus on next? What does the future look like for the solution?
Chad Brothers: That’s a great question. We’ve got a few irons in the fire right now that we’re focused on. Within the 911 center, things tend to be a little bit segmented. I don’t want to go too deep into the whys; that’s a really long conversation. But when I say segmented, I mean the technology. [00:22:00] The next big focus that we have as we work with our agencies is to ensure that we’re not just answering the caller’s intent. On those calls that do actually need a first responder, or where there’s a policy or process within the agency to still send out a first responder even though it’s not an emergency (maybe it’s a noise call, maybe it’s fireworks, maybe it’s [00:22:30] something of that nature where the policy would still be to dispatch a law enforcement officer), the next step is to automate the process into their CAD system, their computer-aided dispatch system. And that’s where things get a little segmented.
We’re working directly with CAD companies right now to bridge that gap. So it’s not just being able to take the call and answer the caller’s query [00:23:00] so that it never needs to reach dispatch. The next step is, on those calls that do need it, let’s get the data to dispatch, have dispatch create a ticket, and then someone can triage whether or not someone needs to go. So that’s another step within the workflow of a call. And the way that I look at a call tends to be kind of like a manufacturing line. If you’re looking at operational excellence, what can I remove in terms of variation to speed [00:23:30] the process up? So if I can remove some of those steps along the way and all they need to do is go look at their CAD, boom, that’s great. I’ve just saved time and energy for those call takers and gotten a quicker outcome for the caller.
The other thing that we’re hyper-focused on right now is the analytics. So we have portals for our customers so that they can go in and look at [00:24:00] the calls, see all the recordings, see the segments of the recordings, and see all the parameters that we’ve created on the call. The beauty of this system is that I can ask any question I want of the caller, and I can capture that data, display that data, push that data. And so we’ve spent a lot of time building parameters into the call flows and into the playbooks, so that we’re asking the questions that would be necessary [00:24:30] to create a CAD incident, and we display that. We provide that to our customers so that they can see what is happening in realtime. And now we want to do some analytics on that: provide them with analytics so they can see how well the bot’s performing and its system health. That’s actively underway, and we’re rolling those out as we build new metrics on a daily basis.
And then [00:25:00] my stretch goal, really where I want to go, is to integrate spatial data. I think spatial data will provide us a significant improvement in being able to service communities where 911 centers support more than one city, township, or borough. Sometimes we deal with consolidated agencies that may support 40 different communities, and they all have their own law enforcement, fire, and [00:25:30] EMS and other ancillary services. So if I can use spatial data to understand where the caller is, I can best shape the conversation and not get bogged down with things like disambiguation on the call.
Chad Johnson: That’s awesome, Chad. Thank you so much for the partnership with SADA. Thanks for helping me learn about this space. It’s so interesting and dynamic and I love hearing you talk passionately [00:26:00] about it. Anything else that you want to wrap with before we end the conversation today, anything that we missed?
Chad Brothers: I don’t think we missed anything. With the space that we’re in, and the direction the industry is going with the adoption of AI, not just in the call-taking space but in a host of other applications that can help improve outcomes in our communities and improve the quality [00:26:30] of life and the service and support that our telecommunicators and our 911 centers provide, I think we’re just really scratching the surface. And it’s going to be a really interesting couple of years as these things become more normalized in workflows.
Chad Johnson: For our listeners, if you enjoy these conversations about technology and transformation, we invite you to check out Insight On, the podcast from SADA’s parent company, Insight. You can find it on Apple Podcasts, [00:27:00] on Spotify, or anywhere you get your podcasts. Thanks for listening to Cloud and Clear. Hope everyone has a great day.
Chad, thank you again for your time today.
Chad Brothers: Thanks, Chad.