Ashwin Singhania 0:00
It really helps to be selling a product that solves a pain point that you personally had, right? I think, like, early on, our ability to tell a story about the pain points that Ryan, my co-founder, and I both had as product managers, at startups and at Alexa, and how those pain points have shaped the product that we build, and why we think that product is then right for other product managers, to whom we're oftentimes selling. I think that has helped us build a rapport with the potential customers we're talking to, and I think it's also allowed us to understand what are the pain points that most resonate with them. So as we start to build out our sales team, making sure that a lot of those learnings, our story, the pain points that we faced and why we've decided to focus on those, that those are part of the sales pitch; that everyone on the sales team really understands the product and what kind of value it can provide, where it's strong, where it still needs work. I think all of those things are super important.
Max Matson 1:15
Welcome to the Future of Product podcast, where I, Max Matson, interview founders and product leaders at the most exciting AI startups to give you an exclusive glimpse into the workflows, philosophies, and product journeys that are shaping the current and future AI landscape. This week, I sit down with Unwrap co-founder and former Amazon Alexa product manager Ashwin Singhania to learn more about how he and his co-founder went from product managers to founders. With all that said, let's dive right in.
Welcome, everybody, to the Future of Product. Today I've got an awesome guest, Ashwin Singhania. He is the co-founder of Unwrap AI, a very exciting startup that I actually just recently discovered while doing some snooping. Ashwin, you've got a really interesting background. Would you mind telling me a little bit more about your background as a product manager?
Ashwin Singhania 2:01
Happy to. So I've been in product management or related roles for about 10 years now, and I'll kind of give you the trajectory of it. After graduating with a degree in computer science, I was debating between: do I go and be a full-time engineer, or do I learn what this whole product management thing is about? And my interests were always gravitating more toward a diverse set of day-to-day experiences, right? Like, I enjoyed developing, I enjoyed building products and actually writing code, but a lot of what I really enjoyed was spending time thinking about: what are the problems that we should tackle for customers? What do the prototypes look like? What does version A, B, C look like? And so I started off at a pretty small startup here in Santa Barbara. The company was called FindTheBest, later rebranded to Graphiq. It was a company that was essentially trying to reinvent vertical search. So think about what Yelp is for restaurants, or what Zillow is for homes. Our concept was, all of those share the same underlying technology platform, but each of the data assets is different, right?
And so we were able to build 500 to 1,000 vertical search engines. So I started off there, and working at a small startup allows you to wear a bunch of hats, so I eventually veered into product management roles and have been doing that since. The Graphiq story turned into the Amazon story: in 2017, Graphiq was acquired by Amazon, essentially to be the brains of Alexa's question-and-answer product. Something that we had built up over the many years at Graphiq was this large knowledge graph, right? To be able to make data-driven comparisons in any product, you amass a large volume of data, and that became particularly attractive to the Amazon team. So I spent five years at Alexa leading question-and-answer product teams. If you asked Alexa a question like, what is the score of the San Francisco Giants game? Or what is the GDP of France? Or who was the president of France 25 years ago? Any kind of question that piques your curiosity: I was part of the teams there building the solutions to power those experiences.
Max Matson 4:35
Very cool, very cool. Would you mind telling me just a little bit more about what that day-to-day looked like for you as a product manager working on the product?
Ashwin Singhania 4:44
Yeah, sure. So the simplest way to describe it: if you think about the broad problem of answering every question, it kind of seems fairly never-ending. But really what we tried to do is say, how do we break our product teams into categories? So how do we tackle questions around sports, or questions around politics, or questions around shopping products? My job as the day-to-day leader of a product team there was to, one, try and understand: what are the types of questions, and what are the types of information, that customers are going to ask Alexa about in those various categories? Two, how do I measure how well we're doing against the types of questions that they're asking? So how do I get a sense of where we're performing well and where we have big opportunities? And then once you've identified those opportunities, thinking about, okay, what are the ways we can go and answer those categories of questions even better? So thinking about where we need to source data, what the answers need to sound like and read like in various categories. And then the kind of day-to-day transactional stuff: ensuring that our engineering teams are actually building toward the right solutions, and then tracking the impact of those. On a week-over-week, month-over-month, quarter-over-quarter basis, how much better are we getting at providing a Q&A experience to customers in each of those categories?
Max Matson 6:18
I see. Gotcha, gotcha. So, you know, for the product managers out there, I would imagine working on just about any product, you are kind of inundated with feedback. For you guys, I can only imagine what that looked like with Alexa. Would you mind talking a little bit more about that?
Ashwin Singhania 6:36
Yeah, and it's super relevant to what I do today. It was kind of an interesting challenge; there were two sides to tackle here. On one side, we were inundated with just huge volumes of feedback, both in terms of customer support tickets and in-app feedback that customers would leave us, which is going to be typical of any large consumer product, and then all the stuff on your social channels, so Reddit and Twitter and all the discussion that's happening about Alexa in general. But then on the flip side, we oftentimes had a lack of specific feedback in areas that we were working on. So we'd launch a new feature, and we'd want to get a ton of feedback about that feature, and you're sometimes limited to a small case study with a handful of people, or what you can gather from friends and family testing your product. To get a large, data-driven sample of feedback specifically about a new experience that we launched in Alexa was at times challenging. So those seem like opposite problems, but they were both kind of messy, and that's one of the things that really motivated my co-founder and me to try and solve that problem.
Max Matson 7:57
Got it, got it. So, yeah, let's get right into that. We'll get to what motivated you specifically to form Unwrap, but I really like your guys' tagline, which is "discover what your customers need in half the time." So, not to put you in the hot seat, but how do you do that?
Ashwin Singhania 8:15
Yeah, so maybe I'll first explain what a process of customer feedback analysis oftentimes looks like, and then how we try and tackle those problems. In my experience at Alexa, the first problem is getting your hands on some set of feedback relevant to what you're building, and that can be really difficult. I'll give you an example. For one of the products I worked on at Alexa, we had lots of customers ask us, how do I do X? How do I tie a tie? How do I change a tire? That type of stuff. And that's a complicated experience to try and deliver voice-forward. We played around with various experiences with videos and in the app, and we really wanted to gather feedback on that. So first, my process would be to dig through Reddit and customer support tickets and maybe do some basic keyword searches to try and get my hands on that feedback. Next, once you have some feedback that you think is relevant, trying to say, okay, what are the actual meaningful, actionable patterns in there, was really manual. What that would look like for us was downloading that feedback into Excel, creating some sort of taxonomy in your head, like this piece of feedback maps to this problem that a customer is facing: for example, "answers are too long" or "lack of visual elements" or something like that. And once you've got that taxonomy, you go through 100 to 500 pieces of feedback, you tag those, you put them into a pivot table, and you've got some sort of stack rank of what your opportunities are.
Okay, right. That works well if you've got a one-time analysis of maybe a few hundred pieces of feedback. But the challenges we faced were, one, that just became overly time-consuming, particularly if you needed to do it on a monthly or quarterly basis to track these things over time; the manual cost started to add up. Two, it didn't really scale to the hundreds of decisions we needed to make, so the cost associated with trying to do that analysis became prohibitive. And three, because it was so ad hoc and so manual, it became really difficult for the analysis to persist over time. Someone would do that analysis one time, it would live on their laptop, maybe they would change teams, and suddenly a new person comes in and asks, hey, have we improved on this type of problem over the last six months? And the analysis from six months ago is kind of lost. So, fast forward to how we're trying to solve that problem. Over the last several years, the NLP space, natural language processing, has advanced far enough along that a lot of the manual work we'd been doing to identify patterns or problems, and then tag feedback against those patterns, can be done programmatically. And so Unwrap essentially tries to solve a few different problems.
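The manual workflow Ashwin describes (hand-tag each piece of feedback against a taxonomy, then pivot to a stack rank) can be sketched in a few lines. The feedback items and theme names below are hypothetical examples, not real Alexa data:

```python
from collections import Counter

# Hypothetical feedback items, hand-tagged against a taxonomy the way a PM
# would in a spreadsheet ("this piece of feedback maps to this problem").
tagged_feedback = [
    ("The answer went on forever, way too long", "answers too long"),
    ("I wanted to see a diagram, not just hear it", "lacks visual elements"),
    ("Answers are too long to listen to", "answers too long"),
    ("Couldn't see any visual steps in the app", "lacks visual elements"),
    ("The response took ages to finish speaking", "answers too long"),
]

# The pivot-table step: count feedback per theme and sort descending,
# producing the "stack rank of opportunities".
theme_counts = Counter(theme for _, theme in tagged_feedback)
stack_rank = theme_counts.most_common()
print(stack_rank)
# [('answers too long', 3), ('lacks visual elements', 2)]
```

The counting is trivial; as Ashwin notes, the expensive part is the hand-tagging step, which is exactly what modern NLP automates.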
First is making it really easy to get your hands on all your relevant feedback and have it easily cleaned up and aggregated in one platform. So, integrating with any sources, whether that's support tickets like Zendesk or Intercom, NPS feedback, or public forums like Reddit or Discourse, and bringing it all into one place. Then two, issue identification. By clustering all that feedback, and clustering all the actionable statements in that feedback, you can very quickly see what patterns bubble up to the top. So you may take 2,000 pieces of feedback and, within a couple of minutes, see the top five things customers are asking for or complaining about: that they get far too many notifications that aren't relevant to them, that the application is far too slow, etc., down the list. So, one, you've cut down the time to identify those in the first place; two, you've now got a data-driven stack rank of those opportunities; and three, as you start to make improvements to your product, you get a real-time tracker of how the customer feedback is changing around those things. So you can actually measure: have I really solved the problem? Maybe my internal team feels I've solved the problem, but externally my customers are still complaining about that same thing. So it makes each of those steps substantially easier.
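As an illustration of the clustering idea (and only an illustration, not Unwrap's actual pipeline), here is a toy grouping of feedback statements by word overlap. A production system would use embeddings and a proper clustering algorithm, but the final step, ranking clusters by size so the biggest complaints bubble up, is the same:

```python
# Toy feedback clustering: group statements whose word overlap (Jaccard
# similarity) exceeds a threshold, then rank clusters by size.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

def cluster_feedback(statements, threshold=0.3):
    clusters = []  # each cluster is a list of statements
    for text in statements:
        words = set(text.lower().split())
        for cluster in clusters:
            # Compare against the cluster's first member as a representative.
            rep = set(cluster[0].lower().split())
            if jaccard(words, rep) >= threshold:
                cluster.append(text)
                break
        else:
            clusters.append([text])
    # Largest cluster first: a data-driven stack rank of complaints.
    return sorted(clusters, key=len, reverse=True)

feedback = [  # hypothetical feedback items
    "too many notifications every day",
    "way too many notifications",
    "too many notifications that are irrelevant",
    "the app is far too slow",
    "app is too slow to load",
]
for cluster in cluster_feedback(feedback):
    print(len(cluster), "->", cluster[0])
```

Here the notifications complaint surfaces as the top cluster (three items) and slowness as the second (two items), which is the "top five things customers are complaining about" view at miniature scale.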
Max Matson 12:46
Okay, gotcha, that makes a ton of sense. It's kind of a trend that I've seen with this recent wave of AI: the data has become kind of ubiquitous, right? So it's a matter of how you use it.
Ashwin Singhania 12:58
Yeah, I think for most companies that we talk to today, the data is there, and they have a ton of it. So it's less of a concern for them that they need to generate a ton more data from their customers; oftentimes, they're overwhelmed by it, and because of that, they don't make much use of it. And what we feel pretty strongly about is that that's actually a big problem. When you're not making use of your customer feedback, it drives a cycle of your customers feeling like their feedback is going into the void, which is an experience any consumer can empathize with, and that can be really frustrating.
Max Matson 13:39
Absolutely. I like that a lot. Alongside that, though, can you tell me about any of the unique challenges that come with handling that much data: taking data from publicly available sources and your private data and kind of aggregating it all?
Ashwin Singhania 13:53
Yeah, I mean, no shortage of them. Maybe first: as volumes start to scale up, that obviously has some interesting challenges. The first one is really, how do you pull out the most actionable and meaningful feedback from large blocks of text? Typically, what we see is that there's a spectrum of the types of feedback you can collect and the cleanliness around them. If you're collecting an NPS survey and you ask customers, "what are a few things we can do to improve our product offering?", you're probably going to get two to three sentences of super clean, actionable feedback. Customers are going to tell you verbatim what they want, and you can use that to understand their pain points; parsing through that is very simple. It gets much messier as you go to customer support tickets, where they may be describing a ton of background context, and you need to pull out which part of that is the actual pain point. It gets even messier when you have a back-and-forth chat with a customer support operator. Let's say we're trying to analyze feedback from Intercom: you may have a five- or ten-minute conversation.
How do you understand what the initial reason or pain point the customer presented was, before the rest of the conversation? How do you allow a product manager who's analyzing that to understand what the root pains are? And then it gets even messier as you go to a source like Reddit or Twitter, where you're inundated with much more noise: a lot of people kind of shouting "hey, this product sucks" without really providing actionable detail. Mining through that and understanding what the actual actionable statements are is challenging. So we spend a lot of time thinking about what classifiers we can build to take these large transcripts or large blocks of text and pull out the actionable stuff, and, where that doesn't work, how we build some sort of rules-based process to pull it out on a case-by-case basis.
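A rules-based pass like the one Ashwin mentions as a fallback to learned classifiers might look something like this; the phrase patterns and chat lines are invented purely for illustration:

```python
import re

# Toy rules-based filter: keep only lines of a noisy transcript that match
# a phrase pattern suggesting an actionable pain point or request.
# These patterns are illustrative, not Unwrap's actual rules.
ACTIONABLE_PATTERNS = [
    r"\bi wish\b",
    r"\bit would be (great|nice) if\b",
    r"\btoo (slow|long|many|expensive)\b",
    r"\bcan'?t\b|\bdoesn'?t work\b",
]

def extract_actionable(transcript_lines):
    """Keep only lines that match at least one actionable pattern."""
    keep = []
    for line in transcript_lines:
        lowered = line.lower()
        if any(re.search(p, lowered) for p in ACTIONABLE_PATTERNS):
            keep.append(line)
    return keep

chat = [  # hypothetical support-chat transcript
    "Hi, thanks for reaching out!",
    "I was trying to export my report yesterday",
    "The export is too slow and times out",
    "I wish I could schedule exports overnight",
    "Anyway, have a good one",
]
print(extract_actionable(chat))
```

The greeting, background context, and sign-off are dropped; the complaint and the feature request survive. Real pipelines need the learned classifiers Ashwin describes precisely because fixed phrase lists like this miss anything worded differently.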
Max Matson 15:57
I see. Yeah, I would imagine from a mental health perspective, it's probably not the greatest, crawling through Reddit comments.
Ashwin Singhania 16:04
I mean, it's one of those things. Our customers say that it's an incredibly valuable source for them, because it's an unencumbered, honest viewpoint on a lot of their customers' experiences of their product. But sifting through that noise can be extremely painful, right? You read through a lot of stuff that's just kind of rant to get to the gold. And so we're trying hard to make that process a lot easier for them.
Max Matson 16:43
Yeah, no, that makes a ton of sense. I've had this kind of theory that the way people interact with software is actually a lot more negative than we in the product world make it out to be. And I only have that theory based on my own experience, which is that every time I download an app and it doesn't do exactly what I expect it to do the first time, I delete it. So I would imagine that having that kind of unfiltered, unbiased view of all these different streams of data gives you a much more holistic picture overall.
Ashwin Singhania 17:11
Yeah, I mean, one of the fun things, and I think it's been a learning of ours too, is that initially, when we started building Unwrap and producing the initial dashboards for customers, a lot of what we focused on was, what are the top 10 things your customers hate about your product? But on the flip side, we've seen a lot of feedback from our customers saying, hey, we really want to see some of the stuff that they love as well. When you sift through that, you can give them a little bit better picture: look, we've also been able to tell you the things that customers love about your product, things you can promote, things you can use internally to make your engineering team feel good. One of the challenges we hear a lot is that engineers are so removed from day-to-day customer interactions that they oftentimes don't hear the pain points, but they also don't hear about the joys their customers experience. At Amazon, we would spend a lot of time trying to gather anecdotes, both positive and negative, and share those with the team. And that's something we really want to carry over here, too: how do you get not just anecdotes, but an actually holistic, data-driven analysis of what things people like and what things people don't like, and help our customers socialize those across their teams?
Max Matson 18:32
Totally. So I have a question, and I ask it because over at PlayerZero we are kind of really big fans of using our product on ourselves, right? So when our product breaks, we're the first to know. Do you guys use Unwrap for Unwrap, I would imagine?
Ashwin Singhania 18:49
So we use it a little bit; we throw all the feedback into our system. One of the things that's interesting about Unwrap is that our value proposition really starts to become meaningful once you're getting thousands of pieces of feedback a month. If you're getting like 100 to 200 pieces of feedback, it's not meaningfully different, I think, in terms of efficiency, than quickly scanning through it. So I guess my answer is, I really hope for us to grow fast enough that this becomes a pain point where Unwrap is an obvious use case for us. I don't think we're there yet. I'm at the point now where I'm still reading through every piece of feedback that our customers leave us individually, but I certainly want to get to that volume soon.
Max Matson 19:34
Makes sense, makes sense. From a GTM perspective, can you talk a little bit about the kind of unique challenge of serving an enterprise customer at this early stage?
Ashwin Singhania 19:46
Yeah. So I'd say there are a few. Obviously the first one is that there are challenges around being an external vendor to large enterprises: you get all the challenges around data security, integrations, how you get through compliance, and all that type of stuff. As we work with larger customers, getting them comfortable with sharing data and understanding all of our security policies and how we guard their data has been a learning curve. As we get deeper and deeper into the experience, we've gotten pretty good at understanding how to have those conversations and what materials to have up front to share with their InfoSec teams and the like. And then, separately, one of the things we've learned, which I don't think is unique to us, I think the concept holds more broadly, is understanding the difference between your day-to-day user at some of these larger enterprise companies versus the buyer.
And so what we've learned is that while our day-to-day user may be an individual product manager who manages one of what are many product lines, the buyers at those companies are oftentimes folks in product operations roles, where they hold the function of making sure that customer feedback is properly disseminated to each of the teams and that the insights are properly circulated. So for us, it's been a journey of learning, through customer interviews and through the sales process: who are the decision makers and the buyers? How are those different from the day-to-day users? How do we make sure our messaging is correct for the different parts of the conversation, and how do we find those internal champions? Which, I think, is a common problem for any startup trying to build into enterprise sales: understanding who your buyer is, understanding who your champion is, and how to cut down on navigating the sales process.
Max Matson 21:53
Yeah, absolutely. So can I ask, do you guys have a sales team?
Ashwin Singhania 21:56
We have a small sales team, yeah. So initially, as with most startups, it was all founder-led sales for a while, and in the last several months we started to build a small sales team.
Max Matson 22:09
Gotcha, gotcha. You know, we're actually at kind of a similar stage at PlayerZero, so as a founder, I'd just love to get your perspective: are there any lessons that you've learned as you've incorporated a sales motion into your company? On culture, on just generally how you go to market, all that stuff?
Ashwin Singhania 22:32
I'm trying to think how to answer that. I think we're still pretty early in the sales-team experiment. I do think there are a few things that we've learned, at least from my journey doing sales. It really helps to be selling a product that solves a pain point that you personally had, right? I think, early on, our ability to tell a story about the pain points that Ryan, my co-founder, and I both had as product managers, at startups and at Alexa, and how those pain points have shaped the product that we build, and why we think that product is then right for other product managers, to whom we're oftentimes selling. I think that has helped us build a rapport with the potential customers we're talking to.
And I think it's also allowed us to understand which pain points most resonate with them. So as we start to build out our sales team, making sure that a lot of those learnings, our story, the pain points that we faced and why we've decided to focus on those, are part of the sales pitch; that everyone on the sales team really understands the product, what kind of value it can provide, where it's strong, and where it still needs work. I think all of those things are super important. The way we've handled that right now: one, it's much easier when your sales team is tiny, right? But something we've tried hard to do is make sure that anybody who is doing outward-facing demos has used our product. In fact, they've built the demos for the customers they're demoing to, so they understand exactly how our product works. Making sure our sales team dogfoods our own product before they get out on calls, I think, is pretty important.
Max Matson 24:35
Makes sense, makes total sense. So, just changing gears a little bit: I'm really interested in the fact that both of you are product managers, and you've made this tool for product managers. On that topic, I'm wondering how you, from your kind of unique vantage point, see that role. Because it feels like it's one that's going through a period of a lot of change right now. How do you see that role changing now, in the next year, in the next five years?
Ashwin Singhania 25:04
My experience as a product manager is that when people have asked me to define what a product manager is, a lot of people will give you various short definitions, but I think it's pretty difficult to come up with a one-size-fits-all explanation. What I've found is that depending on the stage of the company, the stage of the product you're in, and the size of the organization, it can be many different things at different times. I think what we're going to see, at least from my experience, is that product managers will hopefully spend a lot less time doing the project management side of things. So, how do you make sure that you have the right status updates, that the status updates are properly polished so they can be consumed at different levels of the organization? I think artificial intelligence, you know, things like GPT, will make a lot of that fairly easy; you'll be able to spit out reports and the like from any of your data stores. And then, two, the process of understanding what your customer pain points are will hopefully become a lot more efficient. A lot of what Unwrap is trying to do is say: today, if a product manager wants to tackle a particular problem, they may start by saying, okay, I've got three or four hypotheses, I need to interview a bunch of customers, I need to understand their pain points, I need to validate or invalidate my hypotheses.
And then from there, I need to go and work with my engineering team to scope solutions. Hopefully, a lot of that initial validation and research phase can become a lot more efficient: one, by making it really easy to identify which customers are best suited to give you feedback; two, by making a lot of that asynchronous through automated tools; and three, by cutting down the analysis process, making it much less painstaking. So I think that's going to open up a lot of efficiencies, and that's hopefully going to allow product managers to spend more time thinking about ideal long-term solutions with their engineering teams and really doing a lot of that hard design thinking. In my experience, I got to do less and less of that the more time I spent at Amazon. Part of that is that as my team grew, I spent more time just day-to-day managing people, but also, with the number of streams and things I was constantly juggling, a lot of the project management burden took the place of hard design thinking. That's both a frustration of mine and something I imagine most product managers share: they would love to spend less time doing status updates and more time thinking about what the optimal solutions for their products are.
Max Matson 28:09
Yeah, absolutely, absolutely. It's a trend that I've kind of seen, and in my personal opinion, product managers make some of the best founders, because you have this holistic approach to the product, this kind of eagle-eyed view where you can really see from the customer's perspective. But all that being said, and I can imagine this has been a much different challenge from working at Amazon, what has been your biggest product challenge specifically when it comes to building Unwrap?
Ashwin Singhania 28:40
Yeah, I mean, obviously we have a ton of engineering problems and hard NLP problems. Maybe I'll talk about it in two ways. One is the NLP side. There are all kinds of challenges around how you spit out the type of analysis; assuming you can get your classifiers accurate, you can kind of solve some of the basic science problems. But how do you spit out an analysis of feedback that is going to be most useful for different product managers, maybe at different levels of decision making at an organization? I'll give you an example. There's a customer of ours: they've got a product team of about 40 product managers, and they build a product that's got tens or hundreds of thousands of paying customers. And so the analysis looks different for a frontline product manager who's working on one or two features than it does for a group product manager who's trying to think about what their quarterly or six-month roadmap looks like, versus what the CPO cares about, right?
At that level, how do they think about what the next six to 12 months of investment look like? Are there major changes we need to make in terms of headcount? So we think about how our analysis plays to each of those levels of decision making. And then two, taking a step back, part of what our product tries to do is say: given thousands of pieces of feedback, how do we turn that into, almost like, cards that describe a customer pain point, the volume of that pain point, how it's changed over time, and then all the raw anecdotes? So you can look at them and assess: what are the top pain points for me to tackle, and when do I tackle them? But as any product manager can probably attest, it's much more complicated than that. Once you look at the analysis, you need to think about which pain points are actually related, which ones you can solve together, say four pain points with one solution, so you want to tackle those together. All of those taxonomy decisions become difficult to automatically solve. Even the best AI today is going to spit out some taxonomy decision that a product manager may look at and say, no, that's not how we think about it. So those are some of our challenges.
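The "card" idea could be modeled as a small record per pain point; the field names below are hypothetical, purely to make the structure concrete:

```python
from dataclasses import dataclass, field

# Sketch of a pain-point "card": one record per issue, carrying its volume,
# its trend over time, and the raw anecdotes behind it, so related cards
# can later be grouped under one solution. Field names are hypothetical.
@dataclass
class PainPointCard:
    title: str
    volume: int                # total pieces of feedback mapped to this pain point
    monthly_counts: list       # trend data, e.g. feedback volume per month
    anecdotes: list = field(default_factory=list)  # raw customer quotes

    def trend(self) -> int:
        """Change in volume, latest month vs. the previous one."""
        return self.monthly_counts[-1] - self.monthly_counts[-2]

card = PainPointCard(
    title="notifications are irrelevant",
    volume=412,
    monthly_counts=[90, 130, 192],
    anecdotes=["I get pinged for things I don't care about"],
)
print(card.title, card.trend())  # a positive trend means the complaint is growing
```

The hard part Ashwin describes, deciding which cards are really the same issue or solvable together, is exactly what a flat structure like this cannot capture on its own; that grouping is the taxonomy judgment a PM still has to make or correct.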
How do we fit their mental model of what's right? Or if we're not going to, how do we give them controls to really curate that? If they disagree with our outputs, how do they make corrections, and then how does our system learn from that? That's one big product side of it. And then the other is really just thinking about all the product management tools out there, right? In your whole product manager workflow, you've got tools to manage your roadmap, whether that's Jira or Asana or Productboard, tools to do customer interviews, tools to do monitoring, like Mixpanel for engagement tracking, bug monitoring software, all that type of stuff. So another challenge for us is thinking about how far into those areas we go. Do we try and build integrations, or do we try and build that experience into our tool, versus just focusing today on being a feedback analysis tool that slots in with everything else in a product manager's stack?
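[Editor's note: the "cards" pipeline described above, condensing thousands of feedback items into per-pain-point summaries with volume, trend, and raw anecdotes, can be sketched roughly as below. This is a minimal illustration, not Unwrap's actual implementation; in practice the pain-point label would come from a trained classifier, and all the data here is invented.]

```python
from collections import defaultdict
from datetime import date

# Hypothetical feedback items: (date, pain-point label, raw text).
# The labels would come from a classifier; they are hard-coded here
# so the card-building step itself is easy to follow.
feedback = [
    (date(2023, 5, 1), "slow-search", "Search takes forever to load"),
    (date(2023, 5, 3), "slow-search", "Results are really sluggish"),
    (date(2023, 6, 2), "slow-search", "Search is still slow this month"),
    (date(2023, 6, 4), "export-csv", "Please let me export to CSV"),
]

def build_cards(items):
    """Condense labeled feedback into one 'card' per pain point:
    total volume, per-month trend, and the raw anecdotes."""
    cards = defaultdict(
        lambda: {"volume": 0, "by_month": defaultdict(int), "anecdotes": []}
    )
    for day, label, text in items:
        card = cards[label]
        card["volume"] += 1
        card["by_month"][(day.year, day.month)] += 1
        card["anecdotes"].append(text)
    # Rank by volume so the biggest pain points surface first.
    return sorted(cards.items(), key=lambda kv: -kv[1]["volume"])

cards = build_cards(feedback)
for label, card in cards:
    print(label, card["volume"], dict(card["by_month"]))
```

The hard part Ashwin points to, deciding whether two cards are really one pain point, is exactly what this sketch leaves to the labels.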
Max Matson 32:37
Totally. Yeah, that's an interesting quandary you raise there. It's something that I've been thinking and talking about with Matt, our head of product: at what point can you stop stacking software? Because there's so much different tooling that product managers use just to do their daily job. One thing we've been talking through that I think aligns with what we're discussing here is the possibility of going outside of data silos, right, having one large pool of data and then having tools that essentially plug in to give you whatever vantage point you need. Do you see Unwrap as a tool like that, where it's enabled by all this data and you're able to get specifically the output that you need in order to address customer feedback?
Ashwin Singhania 33:30
Yeah, I mean, I think so. We've seen that with some of our customers who are more sophisticated in terms of how they build their data infrastructure. What we'll see is that within their data warehouse, they've got all of their quantitative analytics about their customers, their engagement, their revenue, all of that stuff, and all their feedback is flowing in as well. And then Unwrap becomes kind of like a tool on top of that: we set up an integration with their Redshift instance or something like that, and we're able to pull in all of that feedback. What's really valuable about having all of their data in one place is that it actually allows for much more actionable analysis. So for example, if we're pulling feedback from an app store, directly from that portal, we don't know much about that customer, right? It's an anonymous piece of feedback, and we can tell you, hey, here are the patterns in the text of the feedback. If we're pulling from a data lake that has all their other analytics attached, we can say: of this group of feedback we found, of all these customers who are complaining about pain point A.
On average, those customers are exhibiting much less engagement than your average customer. Or: this is the total ARR of accounts associated with this pain point. Or: these are the top pain points exhibited only by the segment of customers who have churned, right? And so, to your point, when our customers can aggregate all that data together in one place, and we can pull piecemeal what we think is most valuable from there, it makes our offering a lot more powerful. But I do think there's a ways to go for most companies, from what we've seen, to really get their data into a state that allows us to reach that reality.
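[Editor's note: once feedback is joined to warehouse data on a shared account ID, weighting a pain point by revenue and engagement is a simple aggregation, as in the toy sketch below. All account names, figures, and field names are invented for illustration.]

```python
# Hypothetical account-level analytics from a data warehouse,
# keyed by the same account_id that appears on feedback rows.
accounts = {
    "acme":    {"arr": 50_000,  "weekly_sessions": 2},
    "globex":  {"arr": 120_000, "weekly_sessions": 9},
    "initech": {"arr": 30_000,  "weekly_sessions": 1},
}

feedback = [
    {"account_id": "acme",    "pain_point": "slow-search"},
    {"account_id": "initech", "pain_point": "slow-search"},
    {"account_id": "globex",  "pain_point": "export-csv"},
]

def pain_point_impact(pain_point):
    """Total ARR and average engagement for accounts reporting a pain point."""
    ids = {f["account_id"] for f in feedback if f["pain_point"] == pain_point}
    rows = [accounts[i] for i in ids]
    total_arr = sum(r["arr"] for r in rows)
    avg_sessions = sum(r["weekly_sessions"] for r in rows) / len(rows)
    return total_arr, avg_sessions

arr, sessions = pain_point_impact("slow-search")
print(f"slow-search: ${arr:,} ARR affected, {sessions:.1f} avg weekly sessions")
```

The same join, restricted to churned accounts, gives the "top pain points among churned customers" cut Ashwin mentions.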
Max Matson 35:49
Right, right. So when we do reach that point, or if we do, hypothetically, and that becomes the standard, how do you think tools like your own, AI tools, will actually be able to evolve and grow, given that mass access to data?
Ashwin Singhania 36:09
Yeah, I mean, one thing that we're seeing right now, I think one of the trends in AI, is that it's becoming substantially easier to parse through a lot of this data. Maybe two years ago you'd say, okay, they have this stuff all thrown in there, but it's still going to be a lot of work for a data engineering team to structure it in a way that's usable, or to get it into the schema that you need. Even just to integrate, let's say, their account data from Salesforce with all the feedback we're pulling from an Intercom or Zendesk instance is weeks or months of work, right? And that's an impediment to our tool providing value for them. But as all this data becomes a lot easier to parse through with AI solutions, you can say, hey, give me sales volume by account from this particular data warehouse, and it can do that on their behalf without requiring data engineers to do a ton of parsing. I think that's going to open up our ability to provide a lot more insight a lot more quickly. So hopefully, for us, that's a tailwind that allows us to go from onboarding to fully data-saturated much faster.
Max Matson 37:34
Gotcha, gotcha. It just enables you to do it that much quicker.
Ashwin Singhania 37:39
Yeah. Yeah, I think what we hear from all of our customers is, hey, look, they would love to integrate seven or eight different types of data sources with us, but there's so much internal engineering work that oftentimes needs to happen right now that those things inevitably get pushed down in the priority stack when you have other customer-facing fires to fight.
Max Matson 38:06
Totally, totally. Alright, so we're getting close to time, but I did want to ask you just a couple of questions regarding AI in general. First question: where do you see the world in 2030, you know, societally, from a business perspective, with it?
Ashwin Singhania 38:27
Say it again, with AI?
Max Matson 38:29
Yeah, with all the advancements that are going on, where do you see the world changing, for better or worse, by 2030?
Ashwin Singhania 38:38
Yeah, I think most people that try to predict 2030 right now, especially, are probably going to get it wildly wrong, so I'll caveat with that. You know, I don't think that even five or six months ago I would have predicted how fast really just slapping a UI layer on top of GPT-3 would change the landscape of technology. That made it so widely accessible, and suddenly everyone's like, hey, I can build on top of this, and you're already seeing how fast things have evolved. My personal philosophy around these things is that, like with most technology, things that seem rote or frustrating to us today will hopefully become substantially less prevalent. A lot of the things we do today that aren't actually super creative, that aren't generating net new ideas or innovation but are really just grinding through a lot of mechanical stuff, will get automated away. And in turn, that'll free up people's time to do a lot more creative things. For example, I maybe wouldn't have described writing code as particularly rote, but as the growth of Copilot and tools like it has shown, there's actually a lot about writing code on a day-to-day basis that is really regurgitating the same thing over and over again. No great engineer actually enjoys writing that little method to parse X and turn it into a Y.
Right, that's not the hard part of their job; the hard part is thinking about broader designs. So I'm optimistic that obviously becomes a lot easier for everybody, and that opens up space for a lot more creative thinking. And then from there, you'll start to see really cool new products, particularly created by much smaller organizations, or even individuals, that couldn't be created before. Twenty years ago, if you were a teenager who had never made a video before, you probably couldn't build a following of millions of people, create a brand for yourself, and become an entertainer. Today, with TikTok and tools like that that have lowered that technical barrier, that's allowed for a lot of creativity and an entirely new industry, and you see a lot of great content produced by people who 20 years ago would have been roadblocked or gatekept from that. So hopefully what we see is that people will be able to create a lot more: one or two person teams will be able to create really, really impressive pieces of software, videos, all that type of stuff. Those are the things I feel pretty strongly will happen. Beyond that, I think it's kind of a roll of the dice.
Max Matson 41:54
Yeah, totally. No, I like that vision of it. And I think we're seeing that bear out, right? In a similar way to how social media gave small creators the tools to just build something and get it out to the world, AI is doing that for small creators when it comes to the business side, right?
Ashwin Singhania 42:12
Exactly, right. So, for example, a one or two person company will hopefully be able to build a website that's highly optimized; they'll be able to spend substantially less on legal costs, because all those things have been automated away; they'll have to spend a lot less time experimenting with how to ad-target, because those tools have gotten so much better. All these types of things that today are impediments to them delivering the actual product that they're building, all the administrative stuff around it, hopefully become a lot lower cost. And then from there we'll see what kind of cool things they create, but the cost of creating will go down substantially.
Max Matson 42:57
Yeah, yeah, absolutely. We agree. And then final question: for you guys over at Unwrap, are there any processes you have internally, any tooling you're using yourselves, to get an advantage in the whole build process and GTM?
Ashwin Singhania 43:15
Yeah, I'll give a plug. So one of the NLP engineers on my team has been building some really interesting tooling that we'd actually love to potentially release out there. One of the challenges that I think a lot of people are going to face today with this whole prompt engineering thing is: how do you very quickly benchmark how well a particular prompt does for a task? Or rather, given a task, how do you benchmark 10 or 20 prompts, maybe against four or five different models and a few different inputs? I'll give you an example. One of the things we've used large language models like GPT-4 for is to basically create our own training data. If we want to create a sentiment analysis model for each of our customers, you can use something off the shelf, or you can train your own model, but oftentimes gathering the ground truth data can be really expensive. We're able to use GPT and other tools to create a lot of that data, either synthetically or by having them evaluate it and make kind of human decisions on our behalf, and then train our own models against that. But we need to understand what the F1 score is, the quality measure of the AI's decisions, for those 10 or 20 prompts.
And so we have tooling internally where you say: here are 20 prompts, here are your 20 or 50 ground truth annotations that say what the right answer should be, and then you can plug in and instantly fan those prompts out against GPT, against Google's offering, against anybody else, and instantly get back which one has the best F1. That's cut down our prompt engineering lifecycle quite a bit, and we're excited to try and provide that type of framework to anybody else out there that's trying to solve the same problem. Because again, it's one of those things that's just rote work, right, plugging everything in and calculating the analysis. We're going to continue to find other ways to speed up the testing process, but that's allowed us to take five or seven ideas, like, hey, we really want to figure out what the best prompt is here, and get the answer within a matter of minutes, versus what used to be two or three days per prompt hypothesis.
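[Editor's note: the benchmarking harness described above, fanning many prompts across examples and ranking them by F1 against ground-truth annotations, can be approximated in a few dozen lines. This sketch is hypothetical: where the real tool would call live model APIs, `run_prompt` here is a stubbed keyword heuristic so the example runs offline. The F1 computation and the fan-out-and-rank structure are the point.]

```python
def f1_score(preds, truth):
    """Binary F1 over parallel lists of 0/1 labels."""
    tp = sum(p == t == 1 for p, t in zip(preds, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, truth))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Ground-truth sentiment annotations (1 = negative), invented here.
examples = ["love it", "crashes a lot", "slow but fine", "great update"]
truth =    [0,         1,               1,               0]

def run_prompt(prompt_id, text):
    """Stand-in for an LLM call: each 'prompt' is simulated by a
    keyword heuristic so the harness runs without network access."""
    keywords = {"p1": ("crash", "slow"), "p2": ("crash",)}[prompt_id]
    return int(any(k in text for k in keywords))

def benchmark(prompt_ids):
    """Fan every prompt across the examples and rank by F1."""
    scores = {}
    for pid in prompt_ids:
        preds = [run_prompt(pid, ex) for ex in examples]
        scores[pid] = f1_score(preds, truth)
    return max(scores, key=scores.get), scores

best, scores = benchmark(["p1", "p2"])
print(best, scores)
```

Swapping `run_prompt` for real API calls to several providers turns this into the "20 prompts against GPT, Google, and anybody else" workflow Ashwin describes.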
Max Matson 46:01
That's a fantastic application. As soon as it's available, let me know; I'll hop on for sure.
Ashwin Singhania 46:07
His name is Jackson. He's very excited about it, and I'll shoot it your way.
Max Matson 46:14
Yeah, please do. That's fantastic. Well, I really appreciate you being on. It's been a fantastic session; it's been great to learn a little bit more about Unwrap, your background, and your thoughts on product management. Is there anything you want to leave the people with, any plugs or anything like that?
Ashwin Singhania 46:31
Yeah, I mean, I think the main one is: if you're building products out there and you're struggling to really understand what your customers' pain points are, or you're spending too much time digging through feedback to try and get to those answers, we'd love to try and help. I used to face those exact same problems. I've been in your shoes, and the solutions we're building are inspired by that. So give us a shout at unwrap.ai.
Max Matson 47:02
Fantastic. You heard it, guys. Thank you for listening to another episode of the Future of Product podcast, and a special thanks to my amazing guest Ashwin. If you enjoyed this episode and want to learn more about what I do over at PlayerZero, you can find us at playerzero.ai. And if you're looking to go even deeper on the subjects we talked about in the pod, subscribe to Future of Product on Substack. Be sure not to miss this Thursday's newsletter, in which I break down the biggest takeaways from my conversation with Ashwin and explore in depth how product people can collect meaningful product feedback at light speed. Look forward to seeing you there.