[00:00:00] Announcer: Hello, everyone. You're listening to Cloud Next, your go-to source for cloud innovation and leader insights, brought to you by GlobalDots.
[00:00:16] Ganesh: Businesses today stand on the precipice of transformation. [00:00:20] Gazing into the cloud-filled horizon, the dream is clear: a boundless digital expanse where data flows freely, scalability is a mere thought away, and innovation thrives in the fertile grounds of unlimited computational power.
[00:00:34] Yet beneath this gleaming promise lies a shadowed reality fraught with challenges. [00:00:40] As organizations embark on their cloud journey, they soon encounter the complexities of cost management, where expenses spiral unpredictably. The vast oceans of data, while rich with insight, also bring overwhelming complexity.
[00:00:52] Privacy concerns mount and the fear of data breaches becomes ever present. Our guest today is harnessing the power of AI to solve [00:01:00] these very challenges. I'm Ganesh The Awesome, and joining me today is Tarang Vaish, co-founder and CTO of Granica. Tarang stands at the forefront of this revolution, pioneering AI-driven solutions that navigate the complexities of the cloud.
[00:01:14] Tarang, before we start, what should people know about you?
[00:01:16] Tarang: Thank you, Ganesh. Um, a bit about me: [00:01:20] I am an immigrant here in the US. I love, you know, working on any challenging problem that's in front of me. I am a gamer by nature. As a kid, I grew up playing all sorts of video games, and I love any good technical challenge.
[00:01:33] Uh, my passion has always been, you know, learning new things and going deep into subjects, [00:01:40] figuring out what makes things tick, and how we can solve those technical challenges and build business solutions that are actually solving customer problems.
[00:01:53] So about me, I've worked, uh, in different startups, starting from working as a [00:02:00] GPU software architect, designing GPUs and working with NVIDIA CUDA SDKs, and thinking about how we can accelerate applications using the GPU. Uh, next I worked at a company called Cohesity, where I worked on the distributed storage platform that they have.[00:02:20]
[00:02:20] And this gave me an experience of how people that are not deploying in the cloud, the on-prem world, are basically solving the challenges, or the competition, that has arrived with the cloud, so to speak. So in short, what I learned is: how do you make on-prem systems cloud-like?[00:02:40]
[00:02:40] Like, you know, basically blur the gap, or the big chasm, between the on-prem experience and the cloud experience. So it was great having that experience of building in more of a CapEx model, and the transition that I saw there. Uh, and then I worked at a SaaS [00:03:00] company, uh, dealing with email security, building more of an enhanced email security solution.
[00:03:07] And I learned all the challenges that come with deploying a SaaS product in the cloud, and how difficult it is, you know, to control your costs and to provide real value with AI. [00:03:20] After that, my journey at Granica has been really awesome, with a really great team that I have worked with. Our mission has always been about helping customers with their journey in the cloud and being very data focused along with that.
[00:03:36] So we understand the bits and bytes of how data is being used [00:03:40] and think of it not just in volumes of data, gigabytes, petabytes, and so on, but also in terms of the information that is really stored inside that data. Right, like very simply: in the cloud you have this IoT data or machine-generated data that has become exponentially larger in [00:04:00] size than what it used to be on prem.
[00:04:02] And the visibility into that is even worse than what you had on prem. So these kinds of challenges really excited me. That's why we started Granica, and we were thinking about how we tackle this data problem, which is actually like 10x worse in the cloud than the [00:04:20] on-prem world where I came from.
[00:04:21] Ganesh: And that's a very solid set of background credentials you just provided there, everything from being a programmer to an on-prem genius to a SaaS platform guru; you really ticked all the boxes along the way. And actually, you know, I came across a [00:04:40] term a good few years ago now, back when the cloud was still quite young, and that expression was cloud sprawl.
[00:04:50] And I thought, that's actually something even more present now than ever before, like you said. Data is just pouring in [00:05:00] everywhere, and it's mostly that people are too scared to tidy it up, really. Um, it would be great to get your insight: on the back of all of those credentials, you've then started up Granica.
[00:05:13] Can you tell us some of the valuable lessons you've learned building your own AI startup?
[00:05:18] Tarang: Yes. So, um, [00:05:20] we thought about AI as an investment that we could not ignore. We basically wanted to make sure that we have a very solid business case, something that I can explain to anybody walking on the street and they are able to understand the business value of the product.[00:05:40]
[00:05:40] So we started from the core business value, which is cloud cost efficiency. You have your data in the cloud, and it is growing exponentially at about 40 percent year over year. This was before AI, and now it's probably even crazier. Right? So there is always going to be a need [00:06:00] to make sure that you grow sustainably.
[00:06:02] And there are so many pain points and horror stories around how the cloud often doesn't work for you from a cost perspective. So we started with a strong business use case, followed by understanding that the only way we can push the boundaries will be, one, from a [00:06:20] systems angle, where we build really good distributed system solutions and a platform that is very efficient at dealing with this petabyte-scale data, followed by AI research from two angles.
[00:06:35] One is that we build our own solutions, which are more AI [00:06:40] enhanced or AI first, so to speak. And then also, all the applications that we see today are AI centric, so how do we understand them best and how do we add value for them? So it's kind of going bottom up, building solutions with systems and AI conjoined together, and as well [00:07:00] going top down, looking at what the AI applications are doing differently with their data.
[00:07:05] Uh, and then how we can give people real value with that from our platform.
[00:07:14] Ganesh: And you're doing it at obviously a very interesting time. I think [00:07:20] I was involved with the Granica story quite early and knew what you guys were up to, and this was, I would say, prior to the ChatGPT hype that we are now experiencing.
[00:07:34] So I would, in a flattering way, say that you were doing [00:07:40] AI before AI was cool, and now everybody's all over it. But, um, what were the challenges you had in creating a team around that? Because it's a completely new idea; I'm guessing that the skill sets didn't exist.
[00:07:54] You know, you can't hire for AI engineers because they just sort of don't exist, and [00:08:00] certainly didn't when you were building Granica. So how did you approach that?
[00:08:03] Tarang: Yeah, I would actually give credit to, you know, our chief scientist, Andrea Montanari for that. He is a professor at Stanford and he was an early advisor at the company.
[00:08:14] And we used to talk to him about information theory and some of the [00:08:20] challenges with, let's say, deduplication on petabyte-scale data sets. So we had been working in that space for a while. And under his leadership we were able to bring in a lot of AI research and build an AI research lab, so to speak, within the company.[00:08:40]
[00:08:40] Uh, that was a big catalyst, because typically the building blocks are not there if you are just trying to transition to an AI company; it's not easy to just hire 10 great people and do that. So we had that initial kindling with Andrea coming on board, and then the team [00:09:00] slowly started getting built around AI research.
[00:09:03] And now we had these in-house teams with engineering, with cloud expertise, and now with AI researchers coming together. And they were able to talk to each other because they are, let's say, in the same room, so to speak, and able to bounce ideas off each other. [00:09:20] For example, we had early presentations a year back, when ChatGPT had just come out, where our AI researchers were talking about what it means for us, etc.
[00:09:29] So that actually was a great way for everybody to transition towards an AI stack, and it also made it easier for us to attract more AI talent. [00:09:40] So we always focused on AI research, and that is how we were able to grow our team. And to have everybody talk to each other, we would have these Ninja Talks every Wednesday, and we would also invest in projects which are not always customer facing, investments that are, like, two quarters down the line, etc.
[00:10:02] Those things made it easier for us to be more flexible. But if 100 percent of your team is working on the current product line, it's very hard for you to pull people out and train them, transition them towards an AI product. If you have a smaller team that is working two quarters ahead on [00:10:20] those problems, then once they have reached a certain milestone of maturity, you can bring in more people and they can get trained from where things are.
[00:10:29] So it's been a great vision for us to focus on AI research. It was a bet that we had even long before there was ChatGPT, because we always [00:10:40] wanted to invest in the big applications that are going to keep exponentially growing your cloud adoption. It used to be machine-generated data, your logs and audits, etcetera, that was growing.
[00:10:52] Then there was a lot of video content, there was genomics data, and we knew AI was the next big thing. So [00:11:00] that's why we always had our bets in that direction. But this is something that shook everybody when it came.
[00:11:09] Ganesh: And I really hope that your meetings were actually called Ninja Talks, were they actually called Ninja Talks?
[00:11:15] Tarang: They were called Ninja Talks. So we call ourselves ninjas within the company [00:11:20] and, uh, yeah, the talks are called Ninja Talks. And it's pretty open: every Wednesday, you know, there's a lunch hour, people can have their food, and one or two folks can present anything that they like.
[00:11:31] Ganesh: I like that a lot.
[00:11:32] We used to call it the town hall meeting when I was in the technical [00:11:40] operations team, and I always thought town hall meetings sounded very official and quite stuffy, but Ninja Talks, I like that quite a lot. You talked about Andrea and the way he helped shape that journey.
[00:11:53] And I know the listener probably still doesn't fully understand what Granica does, but [00:12:00] one example that's very easy for listeners to understand is that you built a new standard for compressing data at petabyte scale, doing deduplication at a level that hasn't been done before by finding patterns within data, so that you can dedupe on patterns of [00:12:20] data rather than whole files, which is pretty groundbreaking. You know, I remember when I first saw the technology, I thought, wow. People were just used to zip, basically; 7-Zip or PKZIP was invented and that was it, that was what we did to make files smaller. And [00:12:40] you know, I'm always amazed at the new innovations that come out.
[00:12:50] And following on from that, about that rate of innovation: you're obviously going to keep growing products on the back of what you do. You're not a file compression company; you're a company that's solving those problems. But how are you maintaining your innovation as you grow?
[00:13:02] Tarang: Yeah, I think on the compression side, just for, you know, viewers to understand, compression is a technology that works at the kilobytes-to-megabytes level.
[00:13:12] If you want something like 7-Zip to handle anything bigger than that, you'd better give it, you know, 10x more memory [00:13:20] to consume. And it's just not scalable, because if you want to read only a part of your compressed data, you don't want to undo your whole compression and run into gigabytes.
[00:13:30] So there will be a lot of waiting periods, and you'll have to have big systems to compress anything larger. So there was a need, at petabyte scale, to build [00:13:40] new compressors, because very unique sets of data are being generated. Think of your images and videos: they are not zipped the same way as your regular text data, because there are specific applications that are using that data.
[00:13:56] And there is such a volume of data of type [00:14:00] images that there is a need to build a dedicated compressor that can focus just on images. In fact, there are hundreds of those compressors that focus just on images, and they would be pretty bad, or just don't work, for anything else.
[00:14:14] So that was one of the intuitions: there is no global compressor. And also, [00:14:20] compression only works at the kilobyte-to-megabyte level; above that, it becomes a systems challenge. How do you build a scalable system that can use this technology called deduplication to [00:14:34] do at petabyte scale what compression does at a miniature scale? So that was pretty cool.
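To make that contrast concrete, here is a minimal Python sketch of block-level deduplication: each unique chunk is stored once, and a "recipe" of hashes can rebuild the original stream. It is only an illustration of the general technique described above, not Granica's actual algorithm; the fixed chunk size, the in-memory dictionary store, and the function names are assumptions made for this example.

```python
import hashlib
import io

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks; real systems often use content-defined chunking

def dedupe(stream, store):
    """Split a byte stream into fixed-size chunks, store each unique chunk once,
    and return a 'recipe' of chunk hashes that can rebuild the stream."""
    recipe = []
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:           # only new content gets written to the store
            store[digest] = chunk
        recipe.append(digest)
    return recipe

def rebuild(recipe, store):
    """Reassemble the original bytes from the recipe of hashes."""
    return b"".join(store[digest] for digest in recipe)

if __name__ == "__main__":
    store = {}                            # stand-in for a chunk store (e.g. object storage)
    block = b"x" * CHUNK_SIZE
    data = block * 4                      # the same 4 MiB block repeated four times
    recipe = dedupe(io.BytesIO(data), store)
    assert rebuild(recipe, store) == data
    print(f"chunks referenced: {len(recipe)}, unique chunks stored: {len(store)}")
```

In a real deployment the chunk store would live in object storage rather than memory, but the core idea, paying for each unique block only once, is the same.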
[00:14:40] Um, we have a pipeline of innovation, so to speak, and it starts from two sources. One is from our customers. Granica as a vision always had a platform story, so we are not a point product that is just going to be, let's say, a cost savings [00:15:00] product for your object storage.
[00:15:01] It is also going to be improving your whole data pipeline efficiency, starting from the application and how you are consuming your data, or how you are processing and curating your data, et cetera. So from a business use case perspective, we are always looking at how our product can work in the market [00:15:20] and how it can work for adjacent business needs around that solution.
[00:15:25] And the second is our research, where we are looking at and investing in how AI is shaping the market, but also what the technical challenges around data are that are going to [00:15:40] become even worse as we go. So for example, right now, training on your data, you know, for fine-tuning your models, is pretty expensive.
[00:15:50] And if you think of it, you want to say, hey, I want a better model, let me train on double the data. Right? So what would that entail? It will end up almost [00:16:00] doubling your cost of training, and it's pretty linear, and you need more data over time. So maybe you spend some effort and train on a certain set of data sets.
[00:16:13] But in order to double that data to improve your model, your cost will double, which is not something that you [00:16:20] would appreciate if you're a CFO or a CIO, right? Where for every 5 percent, or every few percent, incremental update to your model efficiency, your cost doubles, right?
[00:16:32] So the ROI is something that I always think about as the CTO of a company, and I have a lot of empathy for our [00:16:40] customers, who have to go through that. So we have to bend that curve. And that is one of the pieces of research that we have done: you don't actually incur double the cost if you have to train on double the data, for example.
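As a rough illustration of the curve being described, here is a small Python sketch. It only demonstrates the generic idea that a linear tokens-times-price cost model can be bent by removing duplicate samples before training; the price per token, the toy corpus, and the exact-duplicate filter are invented for this example and are not Granica's method or numbers.

```python
import hashlib

COST_PER_TOKEN = 2e-8  # assumed training cost in dollars per token, purely illustrative

def training_cost(samples):
    """Naive cost model: training cost grows linearly with the number of tokens."""
    tokens = sum(len(text.split()) for text in samples)
    return tokens * COST_PER_TOKEN

def drop_exact_duplicates(samples):
    """One generic way to bend the curve: don't pay to train on the same text twice."""
    seen, unique = set(), []
    for text in samples:
        digest = hashlib.sha256(text.strip().lower().encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(text)
    return unique

if __name__ == "__main__":
    corpus = ["the quick brown fox jumps over the lazy dog"] * 800_000 \
        + [f"unique log line number {i}" for i in range(200_000)]
    doubled = corpus * 2                  # "train on double the data"
    print(f"naive cost on the doubled corpus: ${training_cost(doubled):.2f}")
    print(f"cost after dropping duplicates:   ${training_cost(drop_exact_duplicates(doubled)):.2f}")
```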
[00:16:51] So there are a lot of these research projects which are forward looking, on how we can enhance the entire [00:17:00] AI pipeline: from bringing in your data, to cleaning your data, to training your models on that data. And then also in the LLM space, how do we improve privacy around using LLMs with your data? So, [00:17:20] in short, there is no dearth of projects and business use cases that we have come up with.
[00:17:26] It's just that things are a little bit more uncertain right now in AI, and that is actually a good opportunity: it may not be very clear yet, but you really start seeing that there are a lot of pain points for people transitioning [00:17:40] into the AI stack.
[00:17:42] Ganesh: Sounds like you probably have, um, a problem of too many projects rather than not being able to think of innovations.
[00:17:51] It's which bit of innovation you want to focus on that would appear to be the problem, more than not having enough. Um, it's interesting you talk about the state of the [00:18:00] AI market, because we don't often get to speak to people like you on the show. You are an AI leader, so your thoughts would be very interesting, because with the impact of AI, people are aware of the obvious things, particularly the use cases that we have today; they're quite well known.
[00:18:19] [00:18:20] I don't know, I'm pretty sure nobody today is writing formulas for Excel cells anymore; I think that's a solved problem. Um, but what do you think are the more unexpected or unusual effects of AI?
[00:18:33] Tarang: Yeah. So, how I have seen it, I think across the markets, [00:18:40] we used to have a FOMO about missing the cloud, and everybody said, okay, let's just transition to the cloud and your life would be much better than in the on-prem world.
[00:18:52] I think in some way we are having the same FOMO with AI across the market. So that is amazing to see, [00:19:00] but obviously, just like the cloud, AI is there, it's really tangible, and people are getting those early results. So it's very exciting for people. I think this is a discovery for everybody; they are all expecting that things are going to get much cheaper, more efficient, and more productive as we go.
[00:19:19] And [00:19:20] people are trying to build on top of some AI stack, and they are looking to continue to make it flexible and grow on that stack. So maybe the unusual part is, I don't see that the majority of people have really experienced value yet with AI, but [00:19:40] there is a lot of investment happening there at this point.
[00:19:43] But this is basically the FOMO of not being too far behind in the AI journey, for everybody. So how I think of it is, people are going from having an AI tag on the product to [00:20:00] actually saying, okay, if we don't do this, we are going to be killed by new competition.
[00:20:05] We ourselves see that between an AI-first company and an AI-enabled company there's a huge difference in the speed of innovation: if you're an AI-first company, you are running much faster than an AI-enabled company, so to speak. [00:20:20] When I say AI enabled, it's like you're just maybe adding a chatbot around your product, or you are just using it in some decision making, but your product is not built with AI as its core engine.
[00:20:32] And we are building that way ourselves, and we see that customers who are building it that way are moving much faster than others. [00:20:40] So that is pretty amazing and unusual to see, because it's like, okay, you have this magic wand and if you use it, then you jump ahead of the competition much faster.
[00:20:52] So that is how I'm seeing the market right now. In terms of how we are [00:21:00] utilizing AI, I mean, it's very interesting how quickly you can build products around AI versus how traditionally you had to do software engineering. We are seeing these early signs of code completion and static checks; GitHub Copilot is awesome, et cetera.
[00:21:19] But if [00:21:20] you look at building an end-to-end AI program, prototyping is very quick; it's a matter of weeks for people to build a product that could actually beat traditional software in certain aspects. But now, as we mature and people want [00:21:40] to deploy AI-first products completely in production, you will slowly see that, okay, you need to build the rest of the 80 percent that you traditionally have to build: your QA pipelines, your data quality engines, you have to be privacy aware, you have to think about licenses, you also have to think about whether you are going to [00:22:00] be dependent on some third-party APIs, etc.
[00:22:03] So there are a lot of unknowns right now. But actually, the speed of innovation, and just prototyping and productionization, has become exponentially faster, which is amazing.
[00:22:13] Ganesh: I definitely agree with the not wanting to miss out on the hype. Um, [00:22:20] and I have seen it quite a few times before as well, because prior to this, with big data a good few years ago, everyone was convinced that they must have some information in their data.
[00:22:32] I worked with a few companies that spent a lot of money on big data projects, only to find out that they had no real value inside [00:22:40] their big data, but they had to do it just in case, because otherwise somebody else might do it. Um, yeah. And it's very interesting, as you mentioned, being an AI-led company versus an AI-enabled one. You know, I work with a lot of software vendors.
[00:22:57] We see an unbelievable [00:23:00] amount of plugins now, which are nice things, like, um, automatic pull requests where the code has been written for you, something like Copilot, or just ways of helping people around SaaS platforms and things like that. But they're not really game changers, in all honesty.
[00:23:17] Um, so it's quite interesting to see where [00:23:20] that will go.
[00:23:21] Tarang: If you think about it, when people started building applications in the cloud, they were far more scalable and they had a global reach, versus the applications that you build on prem. So even if that on-prem technology was maybe built better and more mature, it could not have the scale and reach of [00:23:40] applications that were built in the cloud.
[00:23:41] So AI is actually a similar example again: building with AI, you get so much as standard in building your application that you can basically be far ahead of a traditional software company.
[00:23:54] Ganesh: That's a very nice example, actually. And it's funny how these [00:24:00] things repeat themselves; you could say the same about hardware and virtualization, the people who virtualized their stacks, and then it was moving into the cloud, and then it was containers or Kubernetes, and then, and then.
[00:24:14] It's this: you have to keep running at the speed of light in order to stay in the same place. [00:24:20] Um, so we understand a bit about Granica and what you're doing, and it's a cost saving tool and a compression tool. But can you also unpack for us how it affects data privacy? Because I know that's an aspect of what you look at as well.
[00:24:36] Tarang: So, Granica today provides two product [00:24:40] lines: one is called Granica Crunch and the second is called Granica Screen. Granica Screen is a product where we preserve the privacy of data from training to inference; we enable privacy-preserving training and inference of LLMs and other models. Um, how does it work?
[00:24:58] You deploy our platform. [00:25:00] It comes with a container that contains our pre-trained models, which are specifically focused on privacy detection, and they support a hundred-plus languages. And it's API driven: you point your data at our container and you can [00:25:20] basically get a redacted data set back.
[00:25:22] So you send it textual data, you get back redacted data, and you also get the metadata required to unredact it later on.
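To make that flow concrete, here is a minimal, hypothetical Python sketch of calling a self-hosted redaction container over HTTP. The endpoint path, payload shape, and response field names are assumptions made for illustration; they are not Granica's documented API.

```python
import json
from urllib import request

REDACTION_ENDPOINT = "http://localhost:8080/v1/redact"  # assumed address of the deployed container

def redact(texts):
    """Send raw text records to the (hypothetical) redaction service and return
    the redacted records plus the metadata needed to unredact them later."""
    payload = json.dumps({"records": texts}).encode("utf-8")
    req = request.Request(
        REDACTION_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # Assumed response shape: redacted text plus a mapping used for later unredaction.
    return body["redacted_records"], body["redaction_metadata"]

if __name__ == "__main__":
    records = ["Contact Jane Doe at jane.doe@example.com about invoice 4471."]
    redacted, metadata = redact(records)
    print(redacted)   # e.g. ["Contact [NAME] at [EMAIL] about invoice 4471."]
    print(metadata)   # kept aside so the redaction can be reversed downstream
```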
[00:25:31] Ganesh: I guess we have quite a few players in that industry at the moment. Um, a specific [00:25:40] example I could think of would be Dig Security, which got bought by Palo Alto recently. But generally, the market, DSPM, data security posture management, I just think of as, um, sweeping your cloud for privacy.
[00:25:57] Basically, it's definitely a [00:26:00] huge growing market, because people don't want to leak anything in there. Um, is there something specific about Granica Screen that gives it an edge, and what's the USP over other products?
[00:26:13] Tarang: Yeah. So the USP is that it's built completely with AI models in mind, [00:26:20] and it has been proven to work at petabyte scale.
[00:26:24] Usually the data privacy products that you traditionally work with are working on data sets that are in the order of gigabytes to a few terabytes, and the functionality, the cost models, and the efficiency of the [00:26:40] software systems are not that advanced. What we have been able to build is a petabyte-scale system: say you want to train your model on a petabyte of data.
[00:26:52] You can screen that whole petabyte of data much more easily and quicker than the competition, and then [00:27:00] you can actually detect sensitive information in multiple different languages, and you can customize the models to be more context aware on the data set, so it's going to be more meaningful. It's more advanced.
[00:27:15] It has 100-plus language support, and it really [00:27:20] works at the petabyte scale where we are now deploying. This is a use case that has probably become much more relevant now than before AI, where you need to do this sweeping of your data set at that scale. On top of that, we also provide Screen to [00:27:40] give you privacy protection around your LLMs. So think: you are using an external LLM provider and you send sensitive information to them.
[00:27:50] You need to make sure that your security protocols and compliance rules are adhered to, so you don't want any sensitive information to leave your premises. You can [00:28:00] use Granica Screen to basically desensitize your data before it goes to the LLM, and when the response comes back, we can unredact all that information.
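As an illustration of that redact-then-unredact round trip, here is a small, self-contained Python sketch. The regex-based email masking and the stubbed call_llm function are stand-ins invented for this example, not how Granica Screen actually works; they just show the shape of the pattern: mask locally, send placeholders to the external provider, restore locally.

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact(text):
    """Replace each email address with a placeholder and keep a reverse mapping."""
    mapping = {}
    def _mask(match):
        token = f"[EMAIL_{len(mapping)}]"
        mapping[token] = match.group(0)
        return token
    return EMAIL.sub(_mask, text), mapping

def unredact(text, mapping):
    """Restore the original values once the response comes back."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

def call_llm(prompt):
    """Stub standing in for an external LLM provider; it only echoes the prompt."""
    return f"Summary of request: {prompt}"

if __name__ == "__main__":
    prompt = "Draft a reply to alice@example.com about the delayed shipment."
    safe_prompt, mapping = redact(prompt)   # nothing sensitive leaves the premises
    response = call_llm(safe_prompt)        # external provider only sees placeholders
    print(unredact(response, mapping))      # placeholders restored locally
```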
[00:28:10] Ganesh: It's, um, an AI LLM there to help people, and then another AI model [00:28:20] that is watching to make sure that the data is not breaching privacy. It's like stacking one AI tool on top of the other. Interestingly, you talked about some of those use cases before as well; I sort of forgot to mention the more obvious use cases like image optimization.
[00:28:39] We [00:28:40] actually see some very cool things coming out of there at the moment. I'm not sure if you're familiar with SpeedSize, but it's an Israeli tech firm that, using AI, somehow worked out what human eyes are actually able to pick up from an image, [00:29:00] in order to make a highly compressed image that's lossless to the human eye.
[00:29:06] So it's not actually lossless in terms of its pixels or anything like that, but the human eye can't pick it up, and I, you know, I'm sort of constantly, this is...
[00:29:17] Tarang: ...an age-old, you know, [00:29:20] optimization. Before I worked on GPUs, I used to work in computer graphics, so I know that domain very well.
[00:29:28] Like, I remember there used to be a company with a cricket game, and the only thing they made sure of was that the cricket ball has high quality, and the rest of the [00:29:40] scene could be very blurred.
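For listeners who want to see the general idea in code, here is a small Python sketch of perceptually guided compression: it searches for the lowest JPEG quality whose structural similarity (SSIM) to the original stays above a threshold, so the pixels change but the image still looks the same. This is a generic illustration of the concept, not SpeedSize's or Granica's method; it assumes Pillow, NumPy, and scikit-image are installed, and the input file name is hypothetical.

```python
import io

import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

def compress_perceptually(path, min_ssim=0.98):
    """Return JPEG bytes at the lowest quality that still looks (nearly) identical,
    judged by SSIM on a grayscale version rather than by exact pixel equality."""
    original = Image.open(path).convert("RGB")
    reference = np.asarray(original.convert("L"))  # grayscale proxy for perceived detail

    best = None
    for quality in range(95, 10, -5):              # try progressively harsher compression
        buffer = io.BytesIO()
        original.save(buffer, format="JPEG", quality=quality)
        buffer.seek(0)
        candidate = np.asarray(Image.open(buffer).convert("L"))
        score = structural_similarity(reference, candidate, data_range=255)
        if score >= min_ssim:
            best = buffer.getvalue()               # still visually close enough; keep going
        else:
            break                                  # quality dropped below the perceptual bar
    return best

if __name__ == "__main__":
    jpeg_bytes = compress_perceptually("photo.png")  # hypothetical input file
    if jpeg_bytes:
        print(f"compressed size: {len(jpeg_bytes)} bytes")
```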
[00:29:42] Ganesh: Yeah, that's genius. And that's, yeah, it's definitely the old made new, basically.
[00:29:53] Yeah, makes perfect sense. Um, you've been in the industry for a little while now, Tarang. [00:30:00] On the show, we love a good war story. What misery can you share with our audience that you've had along your way?
[00:30:06] Tarang: Um, yeah, I think, I mean, it's a great question, because, you know, there are so many miseries along the way.
[00:30:15] And luckily you forget about those miseries unless somebody tries to remind [00:30:20] you. I guess the opening misery was, you know, when we started the company late in 2019, we had just formed a small team and were like, okay, let's get started. And then COVID hit.
[00:30:37] So it was just weird [00:30:40] to abandon your office and be working completely remotely, and not be able to see your co-workers for more than a year. I even had an employee whom I did not see through his whole employment; he worked for us for a year and I never met him face to face, which is very weird.
[00:30:58] You know, before [00:31:00] COVID times, this was a very weird thing to have, right? Like, okay, you work with somebody and you have never seen him. But in terms of war miseries, one of the challenges is that bigger companies tend to take much longer to go through their journey with us and all the POCs, [00:31:20] and during those times, anything can happen.
[00:31:22] So one of these stories was where a customer had literally hundreds of petabytes of data. This was a very big company and they were very excited to work with us. And then things suddenly started shaking in that company, and every time we would [00:31:40] have a meeting, the folks would be saying, okay, we are here, don't worry, nothing's going bad.
[00:31:45] And it kept on happening for a few weeks, and then the folks were gone. We came to know later that the company got acquired, and they basically deleted 90 percent of their data.
[00:32:05] We don't know how to tackle, you know, our challenges with the cloud expenses. We just have to do something dramatic. So instead of actually making value of the data, they just deleted 90 percent of the data.
[00:32:16] Ganesh: That is a crazy story. And, you know, [00:32:20] knowing the kind of companies you work with, if you think of people who are at multi-petabyte scale, to be deleting data of this size, it boggles the mind.
[00:32:33] Um, yeah, no words beyond that. [00:32:40] The other thing we like to ask people when they come on the show is: if you could go back in time and give yourself one piece of professional advice, what would it be? Or, if you'd like, you can have three or five tips or something like that. Yeah, something to share with the audience.
[00:32:56] Tarang: Yeah, sure. Um, I think [00:33:00] this is especially professional advice for a company at the stage where we were before, where the company is just getting started and you're looking to hire people. There was a time when you are competing with the big companies, like the FAANG companies, and you're in the Bay Area, where things are [00:33:20] more expensive.
[00:33:21] So there was a big trend to just hire remote workers. I think that became a little bit too difficult; it was not a great solution for early-stage companies where you have your core team members working completely remotely. There is a [00:33:40] discovery phase at the initial stage of the company where you don't have anything very clear. You are learning, it's a chicken-and-egg problem, you're solving a lot of hard problems, and every day things could be different, or different problems become very critical for you.
[00:33:55] So you really want that team to work together closely and solve these [00:34:00] technical problems. Investing in remote teams early on in your company is not a great idea, I would say; you should have some critical mass, or people that you're familiar with, and only in those cases is it easier to work with remote folks.
[00:34:15] So I would Like, you know, recommend not hiring too many remote folks. [00:34:20] Yeah, unless you have certain business, once you have a business model running and, you know, you have a sustainable, you know, like a predictable pipeline and processes, until then hiring too many remote folks could be difficult, you know, to scale.
[00:34:35] In terms of professional advice, I think the biggest thing I feel is: [00:34:40] don't give up on the problems that you are solving. I remember the first time I had to learn Kubernetes, it sounded weird. I had to go read an illustrated guide on Kubernetes and other things to try to understand what it is, and I had this fear that this is too complicated and [00:35:00] I can't pick it up. But then I was kind of forced by my CEO: hey, you know, you have to do this. So it was two weeks of just having a coffee cup always by my side so that I could power through it. But after I powered through that, I really enjoyed Kubernetes, and now it's always [00:35:20] intuitive; even if I'm not developing with Kubernetes every day, I know intuitively what it means.
[00:35:25] The same thing happened with SQL and a lot of other technologies. With AI now, you really have to push yourself to do things hands-on and learn how these systems work, [00:35:40] especially for older folks; AI was non-existent in our time, so you have to really go hands-on, play with these technologies, and that will bring you intuition.
[00:35:52] Even if you are higher up in rank in your company, it's important to go play with these technologies, because they are [00:36:00] fundamentally a different vertical from where things used to be before. This whole software stack is going to look like an old-world software stack to you very soon. So it's important to be hands-on with the technologies that you have. And also, the other insight I had:
[00:36:08] So it's important to be hands on, uh, you know, with the technologies that you have and yeah, and also the other insight I had. Like running a company is like every generation, like the team [00:36:20] sizes are shrinking, you know, so we used to have these large companies with hundreds of people, you know, now when you have companies hit a billion dollars, I'm sure like, you know, they are sometimes they're not even a hundred people working in that company and that's trend is going to keep going and going.
[00:36:35] So the idea, or the onus, on everybody is to be more broad [00:36:40] in terms of your knowledge base and experience. In startups, we love generalists more than specialists. So you have to become a good generalist or a full-stack developer, or be a very good specialist; kind of pick your lane, but you have to continue to learn and [00:37:00] adapt as we go.
[00:37:02] Otherwise the young folks are going to eat your cake.
[00:37:07] Ganesh: That's definitely great advice. And yeah, as someone who's been in the tech industry for some 17 years, as I get further and further away from the [00:37:20] frontline, I do have to keep pushing myself to go back in there and play with technology so that I don't become completely out of touch with it.
[00:37:27] So that's definitely great advice. And interesting, what you said about the billion-dollar company: I was actually listening to a Tim Ferriss podcast the other day, and they were talking about [00:37:40] that shrinking number, and the prediction is that because of AI, we will soon have a one-person, 1 billion dollar company, which is completely mind-blowing to think about. They'll be backed by AI, um, and I think maybe it'll be no...
[00:37:58] Tarang: no people, it'll be an AI company that [00:38:00]
[00:38:01] Ganesh: even better.
[00:38:04] Um, but bearing in mind all of those things: you are an industry leader, and it would be very interesting to get your thoughts. What future trends do you see? What do you see coming in the world of...
[00:38:16] Tarang: Yes, so I think there is a lot of rush [00:38:20] right now to be AI ready or AI enabled or AI first.
[00:38:26] So I do see there will be a rush to value, which is really awesome; it's like a gold rush, everybody's rushing and figuring out how they can transition to this. But with every technology, the problems kind of [00:38:40] slowly catch up. So one of the things I feel is there will be tens of different problems that will now come when you have more AI in your product.
[00:38:50] Like, data privacy is a huge problem. One of the open problems I see is, like GDPR: if your AI model knows [00:39:00] some personal information, how do you unlearn that information? Right? Or how do you retrain your model without that information? It's not simply just deleting the database entry anymore.
[00:39:12] So there are going to be challenges around data lineage, because everybody will have to prove, when an AI [00:39:20] model is returning some data, just like Wikipedia or your own company would: where is this data coming from? What's the data provenance? So there will be challenges around data governance, and there will be challenges around using third-party AI.
[00:39:35] There are systems challenges around where you run these [00:39:40] models, government compliance challenges, cost challenges, and just being able to provide clean data and a trained model that is not subject to lawsuits. Those are all things that we are seeing right now, and that's [00:40:00] just going to continue to grow.
[00:40:01] And another big area is AI security, where you will have problems where the adversaries are now also AI enabled. So that makes it doubly challenging to track and solve, [00:40:20] or just handle, those kinds of cyber attacks and security issues.
[00:40:26] So you have to be really prepared for that new world. It'll be a new urban warfare, cyber warfare I should say, that you have to deal with.
[00:40:36] Ganesh: Yeah. The hacker [00:40:40] aspect of it is definitely something where I don't feel like we've properly seen the impact yet.
[00:40:48] And it's going to be disastrous for a lot of people. You know, the world of thinking that you can just have things unpatched [00:41:00] and not get seen, it's over, basically; everybody can see you with your pants down now. And not only that, they will come for you with sophisticated AI bots.
[00:41:13] So yeah, it's a very, very scary thing, but if you're in the business of it, it's like a...
[00:41:18] Tarang: ...Black...
[00:41:19] Ganesh: ...Mirror episode, [00:41:20] we are right in the middle of a Black Mirror episode. But at the end of this episode, um, Tarang, it was a real pleasure having you on the show. Really great having you.
[00:41:34] Thank you so much. Um, any parting words?
[00:41:36] Tarang: Thank you, Ganesh. It was wonderful talking to you. Um, [00:41:40] I'm very thankful to our team for where we are, we are thankful to our customers, and if you want to know more about our data privacy and data efficiency solutions, please check out granica.ai.
[00:41:53] Ganesh: Perfect. Thank you so much and, uh, enjoy the rest of your day.
[00:41:57] Thank you. This episode was produced [00:42:00] and edited by Daniel O'Hana and Tomer Mouviton. Sound editing and mix by Bren Russell. I'm Ganesh The Awesome, a Senior Solutions Architect. And if you're ready to deep dive and start transforming the way you approach security, then the team and myself at GlobalDots are at your disposal.
[00:42:19] It's [00:42:20] what we do, and if I do say so myself, we do it pretty well. So have a word with the experts, don't be shy, and remember that conversations are always free. Find us at GlobalDots.com.