They discuss the challenges of explaining the outputs of AI models in a way that provides customers with actionable data, and how they've had to quickly develop product offerings from initial proof of concepts. Charlene shares her technical perspective as a data scientist, and Jeff speaks through his lens as Pattern89's Chief Product Officer.
You can find more information about this podcast at sep.com/podcast and subscribe wherever you get your podcasts. Thanks for listening!
Zac Darnell: Welcome to Behind The Product, a podcast by SEP, where we believe it takes more than a great idea to make a great product. We've been around for over 30 years building software that matters more. And, we've set out to explore the people, practices and philosophies to try and capture what's behind great software products. So, join us on this journey of conversation with the folks that bring ideas to life. Hey, everybody, welcome to Behind The Product. As always, I am your host, Zac Darnell. And, joining me as my guest host for the show, Jordan Thayer. Jordan, how are you, buddy?
Jordan Thayer: I'm doing well. Thanks, Zac. How are you?
Zac Darnell: Doing well. So, you've been with SEP for about two years now, yeah?
Jordan Thayer: Yeah. It's been about two years.
Zac Darnell: What's your primary role here?
Jordan Thayer: So, I'm one of our AI practice leads at SEP. And, what that means is that I evaluate client's problems and look for ways that AI might help them fill some of their needs as part of a product that we're building for them. So, that might mean evaluating a dataset to see whether or not we can use machine learning to draw some insight from it, or evaluating a process to see if we could plan and schedule that process better for a client.
Zac Darnell: I know that I've been learning a lot from you over the last couple of years. And, one of the reasons why I thought you'd be a perfect guest for this is because your background, having a PhD in computer science with a focus on artificial intelligence, kind of applies to our guests, and the company that they work for today. We talked to Jeff Cunning and Charlene Tay. Jeff is one of the co-founders and Chief Product Officer of Pattern89. And then, Charlene Tay is their principal data scientist. I really enjoyed this conversation and I'm kind of curious, what was your favorite part of our talk with them?
Jordan Thayer: So, I think the thing that was really my favorite, just because it resonated with me personally, was when Charlene put together a report saying, "Hey, here's a thing we might be able to do, and here's an example of doing that." And then, the next day she had to backfill, and figure out how to support this as a product offering for her company. I feel like that's a thing that a lot of engineers have to deal with from the sales world. And, it's exciting to see people get fired up about what you're working on, but it can be a little intimidating the first few times around too.
Zac Darnell: Yeah, that's a good point. I love how Jeff talked about their product development philosophy. He didn't use these words, but this idea of nailing it before you scale it. And also, this common philosophy of painkillers over vitamins. One of the first products that they built, this constellation scorecard, they thought, "Oh man, we're going to have all of this data, and that's going to be really, really valuable." And, they got a lot of feedback from customers of, "I don't know what to do with this. This is great, but I need more actionable intelligence." And, that's where Charlene and the world of artificial intelligence really came to light for them. So, I just love hearing that journey, that evolution. And, it was really also interesting to me. I don't know if this resonated with you, but one of Charlene's responsibilities is this idea of explainability, explaining to their clients what some of the AI magic that she and her team are building actually means. Is that a common problem in the AI space?
Jordan Thayer: Yeah. In a couple of different ways, but I think where people are applying machine learning to generate business insights, it's a big thing, right. People want to understand why it is the system made a particular decision, so that they can understand why they're doing the thing that they're doing. I was particularly surprised to hear, maybe not surprised, but encouraged to hear that they actually selected the algorithms they use for machine learning on the back-end so that they would have a greater ability to explain what the system was doing, rather than using extremely high-end black-box techniques.
Zac Darnell: Well, I would imagine if we have the time in the conversation to dig in a little bit deeper, there's probably more conversation behind the scenes with some of their clients that helped to influence some of those decisions. And, it's always interesting to hear the, not tension necessarily, but the push and pull of what technology could do and is capable of, and what clients need to see or hear. So, it's just amazing to hear some of these stories. And yeah, without further ado, let's dive into the show. Hey everybody, welcome to the show. Today, we've got Jeff Cunning and Charlene Tay. Am I pronouncing those right?
Jeff Cunning: Yup.
Charlene Tay: Yes.
Zac Darnell: I'm like 10 for 10 this year on pronouncing names. Not that you guys have hard ones, but I like to set the bar right. So, you guys are with Pattern89, which is a relatively new company spawned out of High Alpha. Correct?
Jeff Cunning: Yep.
Charlene Tay: Yeah, that's right.
Zac Darnell: And Jeff, looking a little bit around your background, you were with ExactTarget a few years ago, and it looks like you went through that acquisition into Salesforce. Is that right?
Jeff Cunning: Yeah, exactly. I started with ExactTarget. Got in there just at a really lucky time to be able to ride the coattails of the expansion, both products, markets and eventually to IPO into acquisition. So, that was quite the ride.
Zac Darnell: Awesome. And then, you spent a little time at Twitter. What was that like?
Jeff Cunning: Twitter was an amazing place. I absolutely loved working there. At the end of my time at ExactTarget and Salesforce, I was working on some products related to, how do you take email data and other first-party data that Salesforce had access to, from their mobile products, from their social products, and actually take those audiences and be able to reach them through social advertising on Twitter, on Facebook, through the advent of some of the products that were new at that time back in 2013, 2014. Like Facebook custom audiences and Twitter tailored audiences, which are really heavily used today. And, I had the opportunity to go right to the heart of that with Twitter, and work on how they were rolling out their Twitter audience platform, moving from just on Twitter to this extension of mobile apps. And, that was an acquisition they made of a company called MoPub. So, MoPub was this super fast-growing, exciting startup, amazing culture. I mean, they kind of had the dream acquisition. They were probably around for three or four years and sold for multiple hundreds of millions of dollars. And, I come to Twitter and I meet this team of people who are amazing, so fun to work with. They have this incredible culture. And I said to myself, within the first couple of weeks, "I want to do that next. I want to have that feeling that this team has next." And so, my time at Twitter was amazing. I loved working there. The company, I just have nothing but good things to say about it. But, I always had kind of my eye on what I want to do next, which is, I want to go to a startup, I want to go build something from the ground up, and be able to have an amazing culture like that team did.
Zac Darnell: That's really cool. Well, it sounds like you're on that journey right now, so that's fantastic. And Charlene, looking at your background before joining High Alpha, looks like you were more on the research side and data science side, like research academic. I don't know if that's the right phrasing and right way to look at that.
Charlene Tay: Yeah. I first got into data science after I graduated from IU in my undergrad. I was working with researchers in developmental cognitive science. So basically, these were researchers who were interested in how humans, and specifically children, learn. And, we focused on the intersection between human learning and machine learning. So, I was doing stuff like studying visual learning in humans using computer vision. And then, using human and neural inspiration to build machines that learn like children. But, that really got me excited about data science and machine learning, and led me to train as a data scientist.
Zac Darnell: Wow. That's way more over my head than I anticipated, but sounds very, very interesting. Jordan's going to have to give me a crash course in what you just said when we get done.
Jordan Thayer: So, I mean, just to dig in a little bit more there. Was the goal of that research to build computational models that you could study to better understand the children, or to improve the outcome of the vision processes for use in some other product?
Charlene Tay: A little bit of both. So, it was studying humans to build better machine learning algorithms, as well as studying human learning and representing that in repeatable processes and models.
Zac Darnell: So Jordan, you have, obviously a background in data science. I forget what your PhD is in.
Jordan Thayer: Combinatorial optimization.
Charlene Tay: Oh.
Zac Darnell: I would imagine you and Charlene could go back and forth on some things here.
Jordan Thayer: You would actually be surprised. It's one of those things where my whole world focuses on something that you've likely never heard of outside of maybe one or two lectures in an AI course, and vice versa, right. Like, I know enough data science and machine learning to fake it for a few bars, but it's really not where I'm super deep, even though I know how to apply the techniques.
Zac Darnell: So, let's dive into Pattern89 and see it in real life. Jeff, tell me a little bit about how long you guys have been around, how the company came to be, and where you guys are at right now?
Jeff Cunning: So, we founded the company in early 2017. And, there were five of us at the beginning, and we were working on solving a problem that has plagued marketers for years. Which is, I don't know what creative is going to work. I have campaigns to launch. I have initiatives that I want to drive. I have messaging that I need to get across, but I don't know how to build a creative that's going to resonate. I have so many questions about it. Should it be just basics, like should it be an image or a video? But, what about when we dive into that image or video, should there be people in it? What should the background color be? What should the hue be? Should it be wide or tall? What objects should be present in it? Should it be indoors or outdoors? And, you can go on and on with imagery and with video content, as well as with your copy that goes into the ad. So, what words are most important? How long should my copy be? The most popular question, what emojis should I have in my headline to get people to engage? And so, that was really the problem that we set out to solve. And, when we started Pattern89, we actually started solving it through experimentation, and we wanted to help marketers experiment with more options faster, so that they could get real answers to these questions. And, we had all kinds of interest in this platform. And, we started growing really quickly from the start, because it was that challenge that every marketer can really identify with. And they're like, "Yeah, I wish I could get the answers to these questions." As we started to grow the company, there were really two, I guess, big trajectories that happened. One was, we started building a really large dataset, and we created what we called our constellation scorecard. And, that allowed any marketer to sign up and get access to this free report. And by doing so, they're also joining our data co-op.
And, that data co-op has now grown to over 1600 brands and over 170 billion impressions. There's over $5 billion in ad revenue that is attributed back to the ads that run through our system. And so, that one trajectory, since the beginning, has been the growth of that dataset. The other has been that marketers have told us that they don't really want to go through the work of building the experiment. So, as much as they loved this idea of experimentation, it was too time consuming. It added a step to their workflow. Instead of helping them, it actually just layered on more work. So, even if they got good results from that experiment, they still had to take a lot of time to put it together, to run it. And then, hope that the results that they got were going to just be true forever, or else they'd have to run another experiment. And so, there was this constant feedback of, "I just want the answer." And really, thanks to Charlene and the work of the data science team at Pattern89, we were able to turn that on its head and say, "Well, what if we could predict what will happen? What if we could use this dataset that we've amassed to be able to understand what are the elements of creatives that drive performance? What are the patterns that are driving success or failure?" And so, that really drove the future of our business and where we are today, which is centered around this idea of predicting creative performance. So that a marketer can come to us, and if they have an objective they're trying to accomplish, a KPI that they'll use to measure success and an audience they want to go after, we can tell them what creative is going to get the job done and what creative to avoid. So, that's really the core of the platform today. And, we've grown to about 25 employees. We've had some really exciting growth. Actually, our biggest growth has been in these most recent quarters.
So, we're really excited that, to be growing during this pandemic, we feel really fortunate. And, we're excited to continue that into next year.
Jordan Thayer: So, there were a couple of really cool things in there, but I just want to start with a term of art. For the woefully uninformed like me, what's a creative?
Jeff Cunning: We talk about a creative as any type of content that goes into an ad. So, I should probably have mentioned that Pattern89 today works specifically with Facebook advertisers. And so, an ad is really comprised of an image or a video along with a headline, a set of body copy and a call to action, which is like a button that might say, "Learn more," "Shop now," "Book travel," something like that. And so, all four of those elements are kind of what combine to form an ad or a creative.
Jordan Thayer: I heard you describing something almost like a design space for these things, right. You have a bunch of knobs that you can twist. Are we inside, are we outside, what's the age and gender of the people acting in the video or in the image, things like that. How do you come up with the set of knobs to twist?
Jeff Cunning: Charlene, do you want to talk us through kind of how we go through that analysis and what options are available to us?
Charlene Tay: So, we break down all the ads that we get into many different parts. Specifically for the image, we put that through a third-party service right now that comes back with all these tags, using computer vision to do object recognition, to identify all of the different things that are in the image or the video, as well as image-related features such as the hue, or the color schemes and stuff like that. And then, we do the same for the language as well. So, we end up with all these different, what we call, image tags that we use as features to our model.
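To make that idea concrete, here's a rough sketch of turning the kind of object tags and image attributes Charlene mentions into model features. The tag names, feature names and the hue convention are all invented for illustration; Pattern89's actual pipeline relies on a third-party vision service.

```python
def image_features(tags: list[str], dominant_hue: int) -> dict:
    """Flatten a vision service's object tags plus image-level
    attributes into one feature dict for a model."""
    features = {f"tag_{t}": 1 for t in tags}   # one-hot object tags
    features["dominant_hue"] = dominant_hue    # 0-360 on the color wheel
    features["has_person"] = int("person" in tags)
    return features

# One hypothetical image, as a tagging service might describe it
feats = image_features(["person", "beach", "dog"], dominant_hue=200)
```

A real system would merge these with the language features discussed next before handing everything to the model.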
Jordan Thayer: Do you use a third party for semantic annotation of the copy as well?
Charlene Tay: The semantic annotation, the NLP stuff, we do in-house.
Zac Darnell: What is semantic annotation?
Charlene Tay: So basically, that is breaking down the text in your copy into machine-readable language. So, things like analyzing the sentiment of the text, figuring out how many emojis are in there, how the emojis are being used, things like named entity recognition. So, if there are nouns, verbs or certain keywords in there that are important, things like that.
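A minimal sketch of the kind of semantic annotation Charlene is describing, turning ad copy into machine-readable features. The feature names and the emoji character ranges are illustrative assumptions, not Pattern89's actual code.

```python
import re

# Rough emoji/symbol Unicode ranges, purely for illustration
EMOJI_RE = re.compile("[\U0001F300-\U0001FAFF\u2600-\u27BF]")

def annotate_copy(text: str) -> dict:
    """Extract simple machine-readable features from ad copy."""
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "word_count": len(words),
        "emoji_count": len(EMOJI_RE.findall(text)),
        "has_question": "?" in text,
        "exclamations": text.count("!"),
    }

features = annotate_copy("Big summer sale! \U0001F31E Shop now, save 20%?")
```

A production system would add the sentiment scores and named entity recognition she mentions on top of simple counts like these.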
Zac Darnell: So, if I'm understanding this at a 30,000-foot view. If I'm a marketer and I want to do a Facebook ad, the idea here is that by putting together what I think I want to serve to my audience, you can actually break down those four areas that you talked through, and kind of predict based on all of the data that you've collected so far, how successful that's going to be, and maybe make recommendations on what they should change. Is that kind of the idea?
Charlene Tay: Yeah, that's the idea.
Zac Darnell: Oh, wow. Okay. I'm a little proud of myself there. I'm glad that I was following to some degree. So, as you go back to your first client, and some of the assumptions that went into this constellation scorecard, what did you think you were going to be solving for marketers first, and what did you end up learning that caused this pivot towards applying AI to this problem?
Jeff Cunning: I mean, it was really all about experimentation. We had so much interest in solving that problem of experimentation at scale, and there's a market out there. I mean, there's competitors out there in this space of mass-producing experiments. And so, we really thought, "We're going to build the best experimentation platform there is." And, all with that same goal in mind of helping them understand what creative is going to work, we saw that as the biggest opportunity that is still out there in digital marketing. There's so many tools to help you build and manage digital marketing campaigns, to optimize your targeting, to optimize, from an ad tech perspective, your bidding, your budget management. Really, every element of that journey is optimized except for creative. There's not a platform out there, and there's not really been technology in the past that's enabled this. Computer vision and what Charlene was just talking about, what we do to actually understand more of the science behind a creative, is really brand new within the last couple of years. And so, that has left creative as something that people solve through, "I think this is going to work," "This worked last time, so let's use it again," or "Let's run a test." And so, we really saw creative as this big area that we wanted to dive into. And yeah, with our early customers, we actually launched what we call our Founder Circle. So we had, I want to say, six or seven brands that we started working with, who all agreed to join us in developing this at a discounted contract value that they'd get longer term. We thought, "Hey, if we can start growing our dataset, we might be able to help coach them along the way. As they build out their experiment, we could give them advice on what to do." And, that was what really led us at the start to start building this larger dataset.
And yeah, as we continued to go, I mean, there's a few points of feedback that I really look to in our past history that have guided a lot. One was around one of our customers, and actually, this person was in that initial Founder Circle, saying, "Hey, it's awesome that you have all this data, but I don't have time to look through it. I need you to tell me what to do." And so, this idea of how do we be very prescriptive with customers, instead of just giving them the world, right. It seems valuable to have access to all this data, but you're so busy in your day-to-day, you don't need another problem that you need to go solve. And so, that was one big piece of advice. And then the second was, "I already know what's worked in the past, I want to know what's going to work next time. I want to know what I need to build in my next campaign." And, those were two really pointed pieces of feedback that I think helped guide us along the way. As well as seeing customers saying, "I want to experiment," and then not doing it. And then, getting that feedback of, "Hey, this is adding a step to my workflow. Again, just tell me what to do and I'll do that." And, that really drove, "Hey, well, we have this dataset, let's figure out if we can predict that." And then, Charlene and team were able to produce all kinds of different models that could accomplish that goal.
Zac Darnell: So, I would imagine with the growth you talked about this last quarter in the midst of this pandemic, e-commerce has kind of blown up a little bit, maybe just a hair. Do you feel like there are other contributing factors? You talk about customers wanting predictive action and results rather than just information as part of what they're actually getting. Not that the information isn't maybe somewhat valuable. But really, if I'm hearing you right, it's about the action that you recommend they take, which then drives healthy results for them. Do you think there's anything else inside the last few months, just given where we're at right now in the midst of this pandemic? We're what, eight months into this now, give or take. What else do you think is contributing to some of that growth?
Jeff Cunning: Yeah. A message of having more certainty has definitely been one that's resonated a lot with our customer base, a message of just eliminating waste, resources are limited, how can you avoid what's not working, before putting it into market. Those are definitely messages that have really resonated. But then, in addition, we do work really closely with... E-commerce is a big, big vertical for us. And so, that has been a place where we've been able to benefit thanks to those industries benefiting.
Charlene Tay: It was also in the past few months that we've actually productionized and built a product around those predictions. So before this, we had people who had to build the reports for the customers, because we only had the models. But, the engineering team has built a great product around it, and now the customers can go into the platform and do it themselves. So, I think that's another driving factor.
Jordan Thayer: You talked a little bit there about certainty, sort of in results and whatnot for folks. So, there's a lot to be said for, you've achieved results in the past, and so people trust you when you say you can do a thing. But, a lot of these techniques allow for sort of probabilistic models. You can say, with 95 percent confidence, I believe the following things will happen, given the inputs. How do you communicate that to customers, if at all? Do you find that you have to do a lot of education in telling them what the outputs of the models mean?
Jeff Cunning: Yeah. I mean, I'm sure we can both speak to this, Charlene, but I would say that is the biggest challenge of building a business that is centered around AI, this whole explainability notion. And, it's easy when the AI says something that the marketer agrees with, and they're like, "Oh wow, their platform is really good." But, as soon as it suggests something that kind of goes against what their gut would have been, what their instinct would have been, then it calls everything into question. And, we've really started to see different personas from our customers. Some folks are of the mindset of, "I know that AI is going to be a big part of digital marketing in the future. I want to figure out how to build it into my marketing stack effectively, and I'm open to it, I'm going to believe it." And then, there's another persona of, "I'm skeptical, and I need to understand all the ins and outs of it in order to rationalize it for myself." And in that world, it's really hard, right. Because, explaining what a model does, that's really hard. Or why does a model change? And, we've spent a lot of our engineering and data science time in working to better explain what's happening, because I think to reach the broadest market, we want to be able to serve all personas and be able to justify what's happening. I don't know. What would you add to that, Charlene? You're kind of the brains behind explaining it.
Charlene Tay: Yeah. We've been focusing a lot on building explainability tools around the models. And, that's a growing field in data science now. And, it's really difficult, because even with the explainability tools, the features that might be important to the model might not resonate with the customer, right. And so, even if the model says, "Hey, because you are so and so, you're going to perform this well," the customer might not be able to do anything with that information, right. Because, they can't change who they are. So, it's a big problem, but we are working hard on serving that customer.
Jordan Thayer: So, if I hear you guys right then, you've got a couple of knobs that you can turn there, right. You could choose different modeling techniques. You could use a decision tree, or something not as powerful, but that people can wrap their minds around pretty quickly. But, it sounds like you've chosen to use the most powerful representations you can find, and then work on the explainability, adopting some of those new research techniques, or perhaps developing some yourself, so that you can help people understand the more powerful techniques you're using?
Charlene Tay: For the explainability part, we are trying to keep it simple. So, you're right, we're using decision trees to try to make it more explainable. And, we can use products like [inaudible] or the What-If Tool to kind of toggle the weights and see how the different features are affecting the predicted outcomes, to show and recommend to the customer what they can change to improve their performance.
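The "toggle a feature and see how the prediction moves" workflow Charlene describes can be sketched in a few lines. Here the scoring model is a hand-written stand-in for a trained decision tree, purely for illustration; a tool like the What-If Tool does this interactively against real models.

```python
def predict_ctr(ad: dict) -> float:
    """Stand-in model: predicted click-through rate from a few ad features."""
    score = 0.01
    if ad["has_person"]:
        score += 0.015
    if ad["emoji_count"] > 0:
        score += 0.005
    if ad["is_video"]:
        score += 0.02
    return round(score, 4)

def what_if(ad: dict, feature: str, new_value) -> float:
    """Re-run the prediction with one feature toggled (a counterfactual),
    and return how much the prediction changed."""
    variant = dict(ad, **{feature: new_value})
    return predict_ctr(variant) - predict_ctr(ad)

base = {"has_person": False, "emoji_count": 0, "is_video": False}
delta = what_if(base, "has_person", True)  # effect of adding a person
```

Reported to a marketer, a positive `delta` becomes an actionable recommendation: "adding a person to this creative is predicted to lift performance."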
Jordan Thayer: It's always wild to me how the way that the consumer, I mean, your clients are the consumers in this case, ingests the information drives the technology decisions, right. Because in a vacuum, you might choose an entirely different technique.
Jeff Cunning: Yeah.
Charlene Tay: Oh, for sure. Yeah. As a data scientist, it's really hard, because you see all these new technologies and methods come out, and we want to try them all, but we just have to make sure we stick to what's best for the customer and what gives them the best results.
Zac Darnell: I'm sure for the technologist in you, that's a tough struggle. I feel like I hear that a lot from a lot of different maker groups, engineers, designers, data scientists, product folks. I'm sure it's hard. Oh, I want to go do that new shiny thing, that sounds really, really cool, but maybe it's not the right thing. Not right now, soon, one day.
Charlene Tay: If only we had all the time in the world.
Zac Darnell: So a minute ago, you mentioned that here recently, you kind of had an evolution in your product around reports given to customers, and automating those. Kind of got me thinking about, over the last couple of years, the evolution of the technology. Could you talk through some of the distinct phases of that? How baked was some of the tech for your first client versus today, and maybe what did that journey look like?
Jeff Cunning: So, what Charlene was saying is that our UI and self-service ability to predict creative is something that we launched in June of this year. And, for really the previous 12 months, it was a lot of innovating alongside our customers. And, it was using the same models that are now in production, but doing it kind of offline, in a way where it was highly manual to produce outcomes for customers. And then, it was building reports to try to explain what the outcomes were. And, that's really been the process for us with everything that we've developed. It's, how do we take the smallest step first, and then work alongside our customers to validate, validate, validate. So that we're not actually building software until we have that trust, or that learning, that knowledge, that our customers really want this, that it's an important problem that they need to have solved, and what we're doing is solving it for them. And, that it's something important enough that they're willing to pay for it too. There's a common product management saying that you'll hear: "Build the painkiller, don't build a vitamin." Because, if it's just a nice-to-have, nobody takes their vitamins, but if they hurt, they're going to take a painkiller. And so, those are things that we want to test alongside. So, I mean, early in the phase of predict, Charlene would be better at explaining how baked the models were, but certainly the user experience was non-existent, and it was all handholding in those early days. And then, slowly building processes around being able to, okay, now we can do this manual work at higher scale because we sped a few things up internally. And then eventually, we built our first version of the UI, which had a big majority of the things that we wanted: the ability to load in all of your assets, set your prediction, run it self-service.
But, a really key missing ingredient to that was, hey, once you've predicted all these ads, I want to launch them. I want to put them into market. And so, over the following three to four months, we built out the ability to actually publish those directly. Which was a big feature of, hey, great, you predicted my ads, now I know what's going to work, now I'm going to set it live, instead of having to then go rebuild that in another system. So that's, I guess, maybe the experience from the front end and from what a customer would experience. But yeah, Charlene could talk more about the model and how that's evolved.
Charlene Tay: It started out last year as a POC. Our CEO, R.J., just asked me, "Could you do something like this?" And, I built it in a Google Colab notebook in a few days. And then, I showed it to him. And then, the next day he was selling it to customers. So, I had to move really quickly to make sure it was in the right place. But yeah, really, what it was doing was basically generating the results into CSV files, and then we had to get other people to help us create the reports based on all the outputs for the combinations of all the different ad features and their predictions from that CSV file. But, automating it and building a product around it really helped. So, as R.J. brought it out to customers, we got a lot of great feedback, and it started off as more of a, "Hey, I have all these ads, tell me which is the best one," so it was more of a ranking thing in the beginning. And then, it kind of evolved into a combination thing, like, "Here are all the different things I want to try, create the best combination for me and tell me the best one." So, it evolved into that.
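The shift Charlene describes, from ranking a fixed set of ads to generating and ranking every combination of creative elements, can be sketched like this. The scoring function and the creative options are placeholders, not Pattern89's model.

```python
from itertools import product

def score(combo: dict) -> float:
    """Placeholder for a model's predicted performance of one combination."""
    return len(combo["headline"]) * 0.1 + (5.0 if combo["is_video"] else 3.0)

# Each knob the marketer wants to try, with its candidate values
options = {
    "headline": ["Shop now", "Summer sale is here"],
    "is_video": [True, False],
}

# Enumerate every combination of creative elements, then rank by prediction
combos = [dict(zip(options, values)) for values in product(*options.values())]
ranked = sorted(combos, key=score, reverse=True)
best = ranked[0]
```

The earlier "ranking" product corresponds to scoring only the ads a customer already has; the combination product enumerates the whole design space first, which grows multiplicatively with each added knob.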
Jeff Cunning: And then, I feel like there was another phase too, where it went from, "Don't just tell me the ranking of my assets, but what if all my assets are bad?", right. I want to know if that best option is actually going to be good, or if it's just the best of this set that I've provided to you. And then, it also really expanded just in terms of what types of objectives people were running. I remember at the very start, we built a model for a single objective or a single metric. And then, it started to get way more specific, spreading out to a greater and greater variety of needs.
Zac Darnell: Well, it sounds like you guys had a great time to hit the market with some of this, just kind of given what's going on. And, I love the idea, I don't know that I've ever heard anybody else talk about painkillers over vitamins. I feel like that's something I've always preached. I don't remember where I heard it from, but I love the sentiment and the idea. This idea of nailing it before you scale it, that's a common phrase around our office. So, this idea of making a small bet, and very methodically making sure that you make a great product customers are willing to pay for, and the technology progression and evolution really supported that journey. It built on itself. It sounds like you kind of hit the nail on the head from the beginning. So, your assumptions were pretty spot on, post the constellation scorecard. That's really, really cool to hear. I'm sure it's been a fun ride.
Jeff Cunning: It has. I mean, I feel like if you talked to anybody in a startup, it's a roller coaster, and that's absolutely what it's been. And the last three months, we never would've had this without the previous four years, and all the learnings, and just all of the time that's gone into this, and just persistence, and relentlessness to kind of get to this outcome. And so, we are feeling really excited about where we are right now, and how well this is resonating with the market.
Zac Darnell: I'm kind of curious a little bit about your teams. What does your engineering, product, and data science team makeup look like? What do you think makes the Pattern89 team successful on a daily basis?
Jeff Cunning: So from a high level, we have myself and one other product manager and a head of design. And on the engineering side, we have James, who's our VP of engineering. He's also been with us from the very start, and we have seven engineers, a couple of whom just recently started as well. And then, from the early days of Pattern89, we worked alongside High Alpha from a data science perspective. And, we worked together. High Alpha was growing that practice, and I think we were, by far, the biggest customer of it. And so, we worked with a few just super intelligent data science folks, and then we were lucky enough to be able to bring Charlene on full-time. And I mean, maybe the one anecdote I would give is that, and I think Charlene can speak better to how the engineering and data science teams work together, but I think our team is really, really big on context, and on having shared understandings of what problem are we solving? Why are we solving it? And, that challenge of explainability with AI, it's so key to the customer, but it starts with Charlene. Like, Charlene has to be able to describe what's happening and how it works to us, to even have a chance at explaining it back to the customer. And, the same is true with the engineering team: being able to provide the context around what they're building, how they're building it, why they're building it that way, is so key on the product and product marketing side. To be able to then message that out to users, and for the designers to be able to build the user experience that brings that to life. So, I think that context, and communication, and understanding the why behind everything has been really key in trying to solve that challenge of explainability. And, what would you add, Charlene, around how the engineers and data science work together?
Charlene Tay: I agree with you 100%. And, not just with engineering and data science, but also with the customer service and the sales teams as well.
Zac Darnell: I feel like so many people don't... You mentioned someone taking your proof of concept and selling it the next day. I feel like I hear that all the time, and it's, I'm sure, both frustrating, exciting, and a little scary, all at the same time. Because it's like, "Oh, I built a thing and oh, people want it, but I didn't build it to go sell yet. Oh-ho, I got to go figure that part out." But I mean, that's kind of a testament to making a product that serves a need in the market. That's just awesome to hear. So, I love the story. You guys sound like you are very much on, maybe, the early precipice of heading up the roller coaster. I would imagine there's a fun future here. What's coming up soon for Pattern89, that you could maybe give a small preview of?
Charlene Tay: Well, because of the growth, we've been working very hard on scaling up our tech, and making sure that everything works fine with an increased customer base. And, we're also continuing to work on the explainability stuff, hoping to make a tool for customers. So that when they get recommendations and predictions, they can also see what they can change in order to get even better results.
Zac Darnell: So, even like adding onto that self-service capability, that's really cool. I'm sure that's not an easy thing to solve. Explainability, that sounds like a very complex problem.
Charlene Tay: We're also going to keep working on building in the feedback loop. So now that customers are able to publish the ads from their predictions, we can track how those ads perform, and use that to feed into our models, to learn and improve the models as well.
Zac Darnell: Wow. Well, I'm excited. We might have to do a follow-up, maybe sometime in 2021, just to hear what you're seeing when some of these things come out into the market. So, I really appreciate you guys joining us for the show. Jeff and Charlene, you guys have been awesome. And Jordan, thank you for being my guest host. I appreciate you. So yeah, have a great rest of your day, and we look forward to catching up with you guys here in the future.
Jeff Cunning: Awesome. Thanks for having us.
Charlene Tay: Thank you.