Amy Clark (00:05):
It was really impressive to see how SADA helped us build a program that would take a video and then select clips from that video based on prompts. So you can ask for certain characters or certain emotions or certain visual elements. It took this process of selecting clips and stills from an entire day’s worth of work down to minutes.
Simon Margolis (00:36):
You are listening to another episode of Cloud and Clear, SADA’s cloud transformation podcast. I am your host, Simon Margolis, and today we are pleased to welcome Amy Clark, director of content and product ops at The Sherlock Company.
Amy Clark (00:49):
Hi, it’s great to be here.
Simon Margolis (00:51):
Awesome. We are excited to have you. We know that The Sherlock Company is doing some really, really awesome work in the AI space, particularly in the entertainment industry. For those of our listeners who are coming to Google Cloud Next in April in Las Vegas, we’re excited to showcase Sherlock and talk a little bit more about what y’all are doing.
(01:10):
But first things first, I want to talk a little bit about what The Sherlock Company is and what you all are doing. So Amy, do you want to give us a little bit of that background?
Amy Clark (01:18):
Yeah, I’m going to give you a short answer and then a little bit of a longer answer. I think it will set the stage for this conversation. So the short answer is that Sherlock is a creative operations agency. We help brands create, manage, localize, and deliver their global content at scale.
(01:37):
The longer answer is, what does that look like? So imagine that you work in marketing for a company that makes speakers and you’re launching a new speaker in all of the regions and markets you currently sell in, and you want to launch and announce on the same day. So you’re going to be creating a lot of marketing assets. You’re doing a multichannel campaign, everything from social, display, retail, and video ads; you’re running the gamut. You’re also doing that in every country that you’re currently selling in. So you’re already working with a lot of assets, and each of those destinations might have different rules and guidelines for the kinds of assets you need and how they’re delivered.
(02:22):
And then you also have your brand guidelines to protect. As companies grow, it becomes even more important and even more difficult to stay on brand. So you’re dealing with a lot of assets and a lot of rules. You’re also putting work out to various agencies and freelancers, you have regional stakeholders, you have translators. You have to make sure that your localized content resonates with local audiences. And all of that can very quickly become a logistical nightmare. I think people tend to feel that chaos is just baked in, and at The Sherlock Company we say it doesn’t have to be.
(03:01):
So we offer services and custom software for creative automation, finding ways to really boost the ROI on your creative production and save time and resources. And we work with our clients as partners. We’re not only producing and localizing assets, we’re really managing their entire workflows. We have in-house software that we customize for our clients because we know that all workflows are different.
(03:31):
We got our start in the world of streaming. We actually started right at the onset of the streaming wars. So if you go back to the example that I just gave about launching a speaker, imagine you’re launching a streaming service. You’re not only launching a product, you’re taking your entire catalog, and in some of our clients’ cases that catalog goes back to when film was invented. You’re digitizing that content, you’re localizing it, you’re making sure that all of the artwork is formatted correctly for where it’s going to appear in the product. You end up with, I mean, sometimes tens of thousands of assets that need to be delivered into often fairly complex legacy systems. And so that’s really where we cut our teeth: helping large entertainment studios manage, and to this day continuing to manage, their global content on their streaming platforms. That’s the long answer for what we do.
Simon Margolis (04:30):
No, that’s awesome. I can only imagine how impactful that is for your customers because we know all about automation here in the technology world, and that’s a huge game changer, especially with the number of moving pieces and different parties you just mentioned. So really, really exciting. And obviously I had a little bit of a sneak peek because I have had the pleasure of working with you all for a long time and I’ve got to see some of the impact this has had, which is super amazing.
(04:54):
But for the rest of our audience, let’s rewind a little bit and talk about what the initial reason was that you started to explore the world of AI. What led you to reach out to SADA? And I will call out to the audience, I’ll give you guys the brag here, but I know you all were talking about AI before it was cool. This is something that has obviously come into vogue recently, but these are conversations that we’ve been having with The Sherlock Company for a long time. So tell me a little bit about what led to that, why AI came up in your vernacular and what led you to working with us.
Amy Clark (05:28):
Yeah, absolutely. So, Simon, you know Sherlock, our CEO; you’ve been working with him for a while, and you know that he is someone who does not want to leave anything on the table. I mean, when the company was founded, he jumped ship from working in the print DVD space to streaming. So I think we’re always keeping an eye on technology innovation. And I think also, in the space that we’re in, like I said, the creative ops and creative production world sometimes tends to get overlooked in the grand scheme of marketing. We just felt like there are so many opportunities for automation here. And when we look at AI, it feels like there’s this entire world of opportunities and game changers out there. And we touch so many aspects of the creative production process with our clients that, I mean, we are just seeing so many use cases.
(06:33):
I think we have had a relationship with SADA for a long time, and you have always been really helpful in helping us leverage Google Cloud and making sure that we are using it as effectively as possible. And obviously, we don’t have a lot of experience with AI, and it’s moving at such a dizzying pace, so you were the first people we reached out to. It really was amazing to see not only how quickly we could get something off the ground, but also just how surprised I was at where AI is and what it’s capable of right now, not just looking forward and imagining what it could be capable of in a few years.
Simon Margolis (07:22):
Exactly. Yeah, that’s what was a lot of fun about working with you all, and really what’s a lot of fun about working with all my clients: this is such a rapidly evolving space, and we love being able to have the impact that it sounds like we had with you. There is so much happening so quickly that it’s really hard for anybody to stay on top of it all, especially given that you have a day job that you’re responsible for. So I love to hear that.
(07:48):
I think you kind of teased it a little bit, but I want to see if you can double-click on it. We started talking about this stuff, a lot of the art of the possible when it comes to AI, a while ago. Do you recall what that first aha moment was for you, where you were like, oh my God, this AI stuff can really do things that maybe I hadn’t anticipated or didn’t think were possible yet?
Amy Clark (08:09):
Yes, definitely. So the way that we were interacting with AI at first and doing these explorations was really text-based. I mean, that’s what we were all using. And our work is obviously incredibly visual; we work with images and videos, that’s the bulk of our work. So I think the big aha moment for us was that a lot of the use cases we were hoping to see become reality were based in images and video. Our assumption, I think, was, oh, we can kind of get something going, but there’s going to be a lot of manual help needed here. And that really wasn’t the case. Where multimodal AI is right now, we used Google Vision, it is really amazing. I think we’ll talk about the video project specifically and how that worked, but we were really blown away: oh, this is not something where we still need human intervention, which was amazing.
Simon Margolis (09:16):
Yeah, exactly. That’s so cool. I mean, it’s fun for me to experience that through you all, the game changer and the fact that, oh, this can actually do what we were dreaming it could do, and obviously that’s awesome. But yeah, let’s talk a little bit about the specific implementation that you all had as that first mover. I know when we were prepping for this episode, we were talking a little bit about the scene picker that you were able to implement. Can you talk a little bit about what that is and what problem it’s solving for your clients?
Amy Clark (09:51):
Yeah, absolutely. So one thing that has become a real need, especially in entertainment but really with all of our clients, is hyper-personalization in marketing. You see this in streaming platforms a lot; I think Netflix is known for showing different people a totally different homepage. And on streaming services, different people are seeing different artwork and different marketing for each title. So depending on what you like, you might be shown pictures of a certain actor for a new title that’s coming out. And for some of our clients, that process of actually taking a title and generating enough content, so clips and stills from it, to really personalize the experience down to different sentiments, different scenes, different actors, was very manual.
(10:51):
And actually, fun fact, when I started at Sherlock six years ago, one of my first tasks was to go through and pick stills from the show Lost and send them to somebody. I can tell you it’s pretty grueling work because-
Simon Margolis (11:07):
I bet, yeah.
Amy Clark (11:09):
… they’ll have stills from the show, they’ll have a photographer on set, and you’re kind of just going on subjectivity: this one looks not blurry and interesting. So it’s something that takes a really long time. And we had our clients reach out to us and ask, “Is there something we can do to solve this problem?” So that’s a really great use case for AI.
(11:37):
It was really impressive to see how SADA helped us build a program that would take a video and then select clips from that video based on prompts; you can filter by different kinds of prompts here. So you can ask for certain characters or certain emotions or certain visual elements. And then from there, it selects stills that are really high quality, so not blurry, no one’s eyes are closed, it’s clear what’s happening in those scenes. You get this incredible library of stills, then you can go in and choose the ones you like; they’re stored and you can download them. And it took this process of selecting clips and stills from an entire day’s worth of work down to minutes. It really was just the processing time. And that is just such a game changer.
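[Editor’s note: For readers curious how a pipeline like the one Amy describes might be wired up, here is a minimal sketch. It is not the actual Sherlock/SADA implementation, which isn’t detailed in this episode; it assumes the Google Cloud Video Intelligence API for shot and label detection plus a simple OpenCV sharpness check for stills, and the label matching and thresholds are placeholders. The real prompt-based filtering for characters and emotions would more likely lean on multimodal models; this only shows the general shape of clip selection followed by still scoring.]

```python
# Illustrative sketch only: not the actual Sherlock/SADA scene picker.
# Assumes the Google Cloud Video Intelligence API for shot/label detection
# and OpenCV for a sharpness check; names and thresholds are placeholders.
from google.cloud import videointelligence_v1 as vi
import cv2


def find_matching_shots(gcs_uri: str, wanted: str):
    """Return (start_sec, end_sec) pairs for shots whose labels match a prompt term."""
    client = vi.VideoIntelligenceServiceClient()
    operation = client.annotate_video(
        request={
            "input_uri": gcs_uri,
            "features": [
                vi.Feature.SHOT_CHANGE_DETECTION,
                vi.Feature.LABEL_DETECTION,
            ],
        }
    )
    result = operation.result(timeout=600).annotation_results[0]

    matches = []
    for label in result.shot_label_annotations:
        if wanted.lower() in label.entity.description.lower():
            for seg in label.segments:
                start = seg.segment.start_time_offset.total_seconds()
                end = seg.segment.end_time_offset.total_seconds()
                matches.append((start, end))
    return matches


def pick_sharpest_still(video_path: str, start_sec: float, end_sec: float,
                        step_sec: float = 0.5, min_sharpness: float = 100.0):
    """Sample frames within a clip and keep the sharpest one above a blur threshold."""
    cap = cv2.VideoCapture(video_path)
    best_frame, best_score = None, min_sharpness
    t = start_sec
    while t < end_sec:
        cap.set(cv2.CAP_PROP_POS_MSEC, t * 1000)
        ok, frame = cap.read()
        if ok:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()  # higher = less blurry
            if sharpness > best_score:
                best_frame, best_score = frame, sharpness
        t += step_sec
    cap.release()
    return best_frame


# Hypothetical usage: find shots labeled "concert", then grab a crisp still from each.
# for start, end in find_matching_shots("gs://my-bucket/episode.mp4", "concert"):
#     still = pick_sharpest_still("episode.mp4", start, end)
```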
Simon Margolis (12:34):
Yeah, no. I mean, that’s awesome. And not that our audience doesn’t care about the technical details, I mean, look, the technology behind the scenes is really awesome, and I’m glad that we were able to come in with the SADA engineers and our expertise in the space and make it so that, instead of having Amy pick the still that looks the best because it feels right (not that that was wrong), we’re now able to take these models that do this AI work and consistently feed them information, right? Hey, we didn’t get the reaction from our audience that we wanted, so we can tell the model that and it can change the way it behaves to give us those better outcomes.
(13:14):
So obviously we could go on and on, or I could totally geek out on the technology; it is really, really cool stuff. But I think what our audience cares most about is that for a long time, when this AI boom, especially the generative AI boom, really blew up, a lot of people did a lot of things in the AI space that didn’t matter. They didn’t do anything for the customer. It’d be like, “Hey, look, I created this really cool picture of a corgi on a surfboard, but how does that help my business?” But what you’re describing is taking a lot of manual work and making it automated. You’re talking about taking things that had to be subjective or based on intuition and making them a little bit more metrics-based.
(13:54):
Can you just dive into what impact that actually makes for your customer besides just being faster? Is it better? Why does your customer care?
Amy Clark (14:03):
Yeah, so in addition to being faster, it provides, I think, a much richer experience. You’re able to go much wider; there’s a breadth of imagery you can grab that, again, it’s not like a human couldn’t do it, but it would just take a long time and it’s really complex. I think also, the way that the tool works, and like you said, the really cool thing about AI, is that there’s a much lower barrier to entry for someone who wants to come in and use this tool. It’s so user-friendly because so much of it is just using natural language: I’m asking to see this kind of thing, and that’s what it’s going to give me. So it’s like working with a person. You’re not having to go in and do complex stuff in a backend or wait for some developer to come in and change things for you. You can do it instantly yourself. And I think that gives a lot of people the freedom to do that work, as opposed to it being siloed off in a corner for someone really technical.
Simon Margolis (15:17):
For sure. Yeah, I’ve been talking about this for a long time. This is what I hope to see more and more of in the AI space is this democratization of the work that we get to do. It doesn’t have to be that you have to go to the one person that has the technical skills to do a task. You can now chat with your virtual agent that’s AI based, and you talk to it just like you would a coworker and this agent’s able to unlock all that value. And so I love to hear that that’s happening. I love this use case.
(15:45):
But what you’re describing I think spans beyond entertainment. This isn’t unique to the entertainment space. So I was wondering if you wanted to give a little bit of your opinion about how this may scale outside of the entertainment industry to other spaces.
Amy Clark (15:59):
Yeah, absolutely. I mean, I think we saw immediately that this is really useful for large studios, but also for smaller content creators, gamers, and streamers. Finding a thumbnail is always such a pain, and tools like this allow you to experiment so much faster than you ever could have before. And for marketing teams in general, we are all having to put out so much content these days that it’s so important to be able to repurpose content in a fresh way. So tools like this, where you can iterate so fast, let you get so much more out of your content than if you were manually choosing frames or finding clips. So that’s really exciting for us as well.
Simon Margolis (16:53):
That’s awesome. So going back to The Sherlock Company, maybe looking forward, what makes you most excited about what the future might look like for you all?
Amy Clark (17:03):
Well, I think that our work with SADA really showed us that we can build products really quickly, especially with AI. I mean, I think that’s something that’s really exciting as well. And we have always had a really collaborative way of working with our clients, and so they’ll come to us with ideas and we work together on products and just being able to do that a lot faster is so exciting for us.
(17:34):
And also, looking outward, we are taking this product that SADA helped us build and we’ve added it to our in-house product, but we’re also working on our first foray into building this out into a SaaS platform that anyone can use. So that’s really exciting for us as well, the opportunity to make more public-facing software and knowing that we can do this. We’re not a huge team; we operate a little bit like a startup. So we love the ability here to be nimble and try things out and experiment, and that’s what we do as a company anyway. We iterate really quickly for our clients. And so being able to do that on a larger scale with AI is super cool. We are just constantly thinking of exciting new projects that we want to try.
Simon Margolis (18:32):
I love that. I love that. This is why it’s such a great relationship; we’ve worked together for so long. Don’t just stop here, there’s so much more potential. I can’t wait to explore more of that with your team and see what else y’all cook up for us to talk about on the next episode of Cloud and Clear.
(18:54):
Any last thoughts, Amy, on what our listeners should take away about Sherlock and the work you all do?
Amy Clark (19:03):
I would just say that creative production doesn’t have to be a painful experience.
Simon Margolis (19:09):
I love it. I love it. It does not, and I think it’s only getting better, right?
Amy Clark (19:12):
Exactly.
Simon Margolis (19:13):
Thanks in large part to the work that you all are doing. So that’s awesome stuff. This has been an amazing look into the world of AI-powered creative operations. Amy, thank you so much for sharing your insights with us.
(19:26):
And for our listeners, don’t forget: if you’re attending Google Cloud Next, make sure to catch our spotlight session featuring The Sherlock Company on April 10th. For more information, you can visit SADA.com/next.
(19:38):
Listeners, you can learn more about The Sherlock Company at TheSherlockCompany.com. Thanks everyone for listening to Cloud and Clear, and we will see you very soon.