Breaking the Mold with Sandbox Semiconductors
Aired: January 25, 2023
In episode one of Circuit Talk: Funders and Founders, John Cole is joined by Meghali Chopra, CEO of Sandbox Semiconductor. A benevolent disruptor in the semiconductor ecosystem, Sandbox Semiconductor seeks to revolutionize the way process engineers run experiments and innovate within the field. Chopra and Cole discuss the difficulties of breaking into a field so cautiously protective of IP, as well as the benefits of producing anonymized case studies.
A window into the future of innovation for the semiconductor industry, this conversation with Meghali Chopra is an inspiring listen for anyone interested in how an innovator changes the game and defines new paths to success in this critical technology.
• 0:00 - 0:32
I'm joined today on Circuit Talk: Funders and Founders by Dr. Meghali Chopra, the founder and CEO of Sandbox Semiconductor. Dr. Chopra is an engineer and entrepreneur. She earned a PhD in Chemical Engineering from UT Austin, worked in industry, and launched Sandbox Semiconductor in 2016. Sandbox Semiconductor accelerates chip manufacturing process development with its suite of software for fabs. Well, can you start by just telling me a little bit about what you did before you started Sandbox?
• 0:32 - 0:34
What led to the startup?
• 0:35 - 1:09
When I completed my undergrad at Stanford, I became really interested in nanomanufacturing and how to scale technologies from the lab to market. And when I came to grad school at UT Austin, I was really lucky: they had just started an engineering research center called NASCENT, funded by the National Science Foundation and dedicated to nanomanufacturing. So I started doing research in that area, focusing on process development issues that were important.
• 1:09 - 1:20
That was pretty core to my PhD: I wanted to work on something that people would use, an applicable problem. And so that's how I first got introduced to this area.
• 1:21 - 1:32
Really interesting. So then how did you make the leap from what you were doing in the lab? You founded Sandbox at some point; you met with customers and understood what they were working on and how this was applicable?
• 1:33 - 2:07
Yeah. So I started doing my research in essentially plasma processing. And one of the first things I wanted to figure out was, okay, how are people modeling plasma processes right now? When I talked to people in industry, which I was able to do as a graduate student because of the engineering research center, they basically said: we do some fundamental plasma modeling, but most process engineers really rely on their experience.
• 2:07 - 2:21
They've been doing recipe development for years. They have a good intuitive understanding of how process controls work. And that was devastating, because I just told you my goal was to build something that people would use <laugh>. Yeah. And they were saying, ah, that'd be a really hard tool to use.
• 2:21 - 2:52
So then I asked, okay, why aren't people using these kinds of computational tools? Well, for plasma modeling (and plasma is core to etch processes and deposition processes), the models have really large numbers of parameters. The process spaces are large and multidimensional, and there are a million non-linearities. That makes it what we call a computationally intractable problem, just impossible. And so most of the models people used were really simple.
• 2:53 - 3:16
So essentially what we set out to do, working with my PhD advisor at the time and now co-founder, Roger Bonnecaze, was try to tackle some of these problems. Our goal was to develop a solution where engineers could use very limited data to calibrate a model, and then use that model to make process decisions: to help reduce the experiments and give them a better idea of the process space.
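The loop Chopra describes, calibrating a model from a handful of experiments and then using it to pick the next process setting, can be sketched roughly as below. This is an illustrative stand-in only: Sandbox's actual models are proprietary physics-based models, whereas this sketch uses a simple polynomial surrogate, and the etch-rate numbers are hypothetical.

```python
# Sketch of "calibrate from limited data, then guide the next experiment."
# A polynomial surrogate stands in for the real process model (an assumption).
import numpy as np

def fit_surrogate(x, y, degree=2):
    """Calibrate a polynomial surrogate to the measured process points."""
    return np.polyfit(x, y, degree)

def suggest_next(coeffs, target, candidates):
    """Pick the candidate setting whose predicted outcome is closest to target."""
    preds = np.polyval(coeffs, candidates)
    return candidates[int(np.argmin(np.abs(preds - target)))]

# Hypothetical data: etch rate (nm/min) measured at three power settings (W).
power = np.array([100.0, 200.0, 300.0])
rate = np.array([20.0, 55.0, 70.0])

coeffs = fit_surrogate(power, rate)
candidates = np.linspace(100.0, 300.0, 201)   # settings the tool allows
next_power = suggest_next(coeffs, target=60.0, candidates=candidates)
print(round(next_power))  # suggested next experiment, in W
```

Three measured points calibrate the surrogate; the surrogate then nominates the single most promising next setting instead of a brute-force sweep, which is where the reduction in experiment count comes from.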
• 3:17 - 3:48
And as part of the engineering research center, I got to hear feedback. People were like, oh, that sounds interesting. And then I did a program called NSF I-Corps. The goal of I-Corps is essentially this: if you're a graduate student, or I think even a bachelor's student, with a technology you think has potential, they push you through a Shark Tank-like scenario where you do a hundred interviews with customers, buyers, users, anyone who might be remotely interested in your technology.
• 3:49 - 4:03
Which for me was really hard, because, one, as a graduate student you're kind of alone in a closet <laugh> working on a modeling problem, and then you're getting out there and talking to people. But the other thing is that just getting feedback on solutions in the semiconductor industry is pretty challenging.
• 4:03 - 4:34
It's really secretive. People working on process development can't really share a lot about their processes, or even about their challenges. So other people in my program were interviewing teachers and other users of their technology, whereas I was showing up at manufacturing sites trying to interview process engineers about whatever they could tell me. But through that program, I was basically able to quantify a need for what we were developing. I essentially had product-market fit, and that gave me at least enough confidence that this was something I should pursue and keep pushing forward.
• 4:34 - 5:01
That's great. Yeah. We've heard a lot about the I-Corps program and how it's an essential bridge for getting technical founders out of the lab and talking to potential customers, to actually find where their problem sits in the commercial space. But when you think of the problems you're solving right now, are you helping fabs improve yield, discover new processes, or troubleshoot processes they're spinning up? How are you seeing the tool used right now?
• 5:02 - 5:33
Yeah. At Sandbox, what we focus on is developing software tools that help process engineers reduce the number of experiments they need to achieve a given process. We take kind of a unit-process approach. Just to give you an idea of the scope: for a given chip, there are around 1,200 processing steps, right? They can be etch, deposition, lithography. Each of those individual processing steps depends on the others, and they all have to be optimized.
• 5:33 - 5:52
And so what we do is take those individual unit steps and model them by reducing these really large multidimensional spaces. They're large because engineers have to figure out which chemistries to use and how those chemistries interact with the materials they're using. They're optimizing tens of different conditions.
• 5:52 - 6:28
The recipes are 10 to 100 steps long. We essentially take the most important parameters from that, model it for them, help them visualize the trends in that space, and then help them find an optimal process window to hit their recipe target. This can help with all the things you mentioned. By having a better process window, you can improve process yields. Mm-hmm. <affirmative> By just showing a recipe window, you can sometimes find process solutions that people didn't think were possible. And sometimes our biggest value is actually telling an engineer who's been working really hard on a recipe that the space he's looking in just isn't going to work.
• 6:28 - 6:36
Like, <laugh>, you don't need to do any more experiments; you need to try something else. But that's how we help reduce the total cost of that process development.
• 6:36 - 6:46
Yeah, that's great. So closing doors, opening doors, and helping engineers find their way through the process. That sounds really valuable. Is this similar to digital twinning? Is it sort of the same idea?
• 6:46 - 6:54
Yeah, it's pretty similar. Essentially, you're virtually simulating process outcomes before the experiment is performed, emulating an experimental space.
• 6:55 - 7:15
You mentioned earlier that when you're talking to fabs, or anybody working in this space, they're very protective of their IP, and therefore probably very protective of their data. I assume you have to somehow work with that data. Do you take it offsite, or do you have to show up and keep it all onsite? How did you convince people to let you have access to it?
• 7:16 - 7:47
Yeah. Like in your first question, about how you get out of the lab and into the market: that was honestly one of the biggest barriers. You need access to data to prove out your technology, right? But you also need to be able to share data so you can show early customer adopters some successful solutions. Both of those things are hard in this space. We would overcome the barrier with one customer, have a success story, and then not be able to share it with another customer. Our solution has kind of been a customized fit.
• 7:47 - 8:15
Every company we've worked with has its own data security requirements. Some companies are only comfortable with on-premise solutions, so we install at their facilities, on their servers, and meet their security requirements. Some companies are open to the cloud, so we can deploy on the cloud, which makes things a little easier in terms of pushing software updates and so on. But yeah, it remains a challenge; there's no one-size-fits-all solution yet.
• 8:15 - 8:49
Yeah. Speaking of showing customers what you can do, I noticed on your website you have a lot of great white papers that look like anonymized case studies, right? Protecting their data, but really trying to show how you can improve their process and provide value. That's really awesome. So you talked a little about that leap, how I-Corps taught you to make the really hard move from being an engineer in the lab to being an entrepreneur who can go out, talk to customers, understand their problems, and take your technology and apply it to them.
• 8:50 - 9:06
Is there anything else the ecosystem is missing right now to help more technical founders? I mean, you're an amazing unicorn that made it out of the lab, but for every one of you there are multiple founders sitting in there with great ideas who maybe don't have the courage or the skills to make that leap.
• 9:06 - 9:37
Yeah. You know, I had the fortune to be part of a few accelerators. I have a PhD in chemical engineering, but until starting the company I had never built a financial model or a revenue model before, anything like that. Just getting the tool sets to do that early on really helped us, because you have to write a business plan. We also participated in a few business plan competitions; just explaining your technology to investors is always helpful for getting a sense of the path forward.
• 9:38 - 9:49
When I think about that time in Sandbox's growth, I feel pretty lucky that we weren't a capital-intensive company, that we weren't doing hardware, because I can see that the barriers would have been pretty hard in terms of raising enough capital as a hardware company.
• 9:51 - 10:01
I-Corps helps with the customer discovery. And then if you're a startup that has a capital requirement, like you said, where do you go to learn that aspect? Or is that not covered yet?
• 10:02 - 10:34
Gosh, yeah. As a graduate student, we didn't cover that type of thing. I mean, UT Austin had a bunch of really nice programs that could help you find those routes. But even raising funding as a pre-seed semiconductor company is challenging, because there aren't a lot of investors with the technical expertise to get comfortable with it. And the semiconductor market is highly consolidated. It's big, but it's consolidated. So inevitably you get, well, how many customers could you have? Who would you sell to? These are all questions that would leave a more traditional investor not very satisfied.
• 10:34 - 10:58
So the routes you can take are through strategic investors, but the options are far more limited. Yeah. We ended up winning several SBIR grants, so we were supported by the National Science Foundation and the Department of Energy. That really helped us overcome the initial barriers: get our proof of concept going, get our technology built enough that we could build more confidence to get investment.
• 10:58 - 11:16
Yeah, that's great. So on your team, you mentioned your PhD advisor was a co-founder of yours, right? It takes more than just technical folks to build a team; you've got all sorts of other people on board. How did you start to connect with and find other members of your team, and at what point did you bring them on?
• 11:17 - 11:47
We started off, you know, probably with our biases as technical people: all of our first hires were highly technical <laugh>. We got pretty lucky in that we found some highly creative, relentless engineers who weren't daunted by the challenge of building a product from start to finish, and we managed to keep them on board. We connected with them honestly through pretty conventional routes, through job postings. We also recruited several engineers directly from UT Austin, so we have that direct access to talent here.
• 11:48 - 12:13
Well, one thing you and I have talked about before, and we've heard this from other startups too, is how challenging it can be to hire, especially technical people, right? You're out there competing with other semiconductor companies and startups, and with software companies as well; a lot of the talent you're seeking overlaps. How do you go up against the likes of the Googles and Amazons of the world when you're scouting talent?
• 12:13 - 12:47
Yeah, that's a good question. I like to think that if you join a company like Sandbox, it's a far more exciting work experience, because anything you touch, we will be using, and everybody on the team is invested in what you're doing. So if you're excited by that kind of thrill, which I think a lot of engineers are, a real technical problem where you'll actually be implementing the solution and helping make sure it meets the customer requirements. And the fact is, the people who need the most help with recipe development are the people working on the bleeding-edge processes.
• 12:47 - 12:58
I mean, you're working on the cutting-edge stuff here. I think that helps. It's hard to back away from a challenge like that, especially if the types of numerical modeling and machine learning problems we work on excite you.
• 12:58 - 13:19
Yeah, that's fantastic. So lead with the challenge and make sure folks know they're doing meaningful work. That's always a winning combination <laugh>. You mentioned you went to school in Austin, at UT Austin, and Sandbox is based out of Austin now. What's the semiconductor scene like in Austin, and why are you in Austin and not the Bay Area?
• 13:19 - 13:44
Yeah, the semiconductor scene is growing. So we have Samsung over here; we also have Tokyo Electron. We have a few big fabs being built nearby. We also have a pretty cool startup scene, a bunch of incubators around, and we have access to one of the best engineering schools in the nation. So I feel like it has a lot of the advantages the Bay Area has. I mean, it's been a good home for us.
• 13:44 - 13:46
Good barbecue too, I guess.
• 13:47 - 13:48
<laugh> Yeah, that's true. <laugh>
• 13:48 - 13:56
Are you recruiting mostly at UT Austin, or is there a wide enough ecosystem that you can draw from there and also bring in people from outside?
• 13:56 - 14:13
Yeah, I would say almost half of our employees have come from UT Austin, but we recruit from everywhere. Across the board, what we look for in our engineers are people with strong numerical modeling backgrounds, some data science backgrounds, that type of thing. So we recruit from all the strongest schools in those areas.
• 14:14 - 14:21
So if you could wave a magic wand at any other problem in the semi industry right now, what would you wave it at?
• 14:22 - 14:41
Ooh. I think one of the biggest problems people are talking about right now is metrology: just being able to measure the things that we're building. It would make our models better, and it would help engineers do their jobs better. If I could remove one problem we have to face, it would be having a perfect metrology solution.
• 14:42 - 14:49
That's fantastic. So would it just be gathering more data, or what would you change or improve about metrology right now?
• 14:49 - 15:19
Oh gosh. Okay, so if we take, for example, a stack for a NAND application: the metrology right now is incredibly expensive. You can etch a channel through the stack, but then you have to measure each individual layer to get a sense of the profile. And engineers right now are dealing with things like twisting and tilting, things we don't quite understand very well yet. They're being asked to figure out how to troubleshoot this profile control, but they're wondering, am I even measuring it right?
• 15:19 - 15:48
And even measuring it is so expensive, so they have to be really selective: I'm going to run this set of experiments, and then I'm going to measure maybe half of them, because the measuring is also expensive. Yeah. For us, when we develop our models, we can give our users a good indication of what's causing behaviors like twisting or clogging, whatever they're seeing in their profile control. But we do need the data input to build that model. So if I could just remove that barrier, it would solve a lot of the hurdles that engineers have to face.
• 15:49 - 15:59
Great. Yeah. How do you go about telling a customer that maybe it's time to upgrade their tools <laugh>, or that this would work a lot better if we had some more data?
• 15:59 - 16:31
Yeah, that's always a challenge. Good question, right? More data is better. We always try to help our customers get the minimum amount of data they need to be successful, so we try to take an iterative approach. But even then, with the problems they're solving, it's kind of crazy what they can get away with in terms of the metrology they do and the number of parameters they're optimizing. That's always part of the feedback loop: if you want to build a model, you need data; if you want to optimize a process, you need to figure out what's going on. It's just part of the life cycle.
• 16:32 - 16:45
Well, thank you so much. You're building some amazing stuff, and this is really the next generation of technology that's going to improve the industry and leapfrog us forward. So thanks for taking the time to speak with us today. We look forward to having you back soon.
• 16:45 - 16:46
Yeah, thank you. I appreciate it.
• 19:20 - 19:26
Tell me a little bit more about your vision at Sandbox. What kind of problem are you solving for your customers?
• 19:27 - 19:58
Oh yeah, thanks. Well, our vision is essentially to put high-performance computing in the hands of process engineers. Our ultimate solution would basically be for a process engineer to use the computing toolbox we're giving them to inform their process development. That would solve a lot of problems today. One of them is workforce development, in the sense that newer engineers would now have a toolbox to aid their recipe development. And a computing toolbox that can predict process outcomes would reduce total recipe development costs.
• 19:58 - 20:19
It would accelerate time to market, and it would enable new processes, processes that aren't yet known. With our type of solution, we could help process engineers essentially have more time for themselves. I don't know if you've ever talked to a process engineer, but they work 24/7. So our ultimate goal is to just make their lives a little bit easier.
• 20:19 - 20:34
What's the ideal time in the development of their process? Are they using your product up front, or are they coming to you after they've failed and are stuck on a problem they can't solve? At what point are people coming to you and engaging with your software?
• 20:35 - 21:08
Yeah. We can enter at many points. Usually when a team reaches out to us, they've essentially hit a wall; they're kind of stuck in their process development, trying to figure out what's going on with the recipe, what's causing a certain type of behavior. But our software tool, Sandbox Studio, is meant to be used iteratively with the process development: you perform an experiment, the model updates. It's kind of like holding your hand through your recipe development, but giving you data-driven moves. I've talked a little bit about Sandbox Studio, but the other product we have is called Weave.
• 21:09 - 21:27
It goes back to the metrology question you were asking about earlier. Weave essentially automates image processing for engineers. Engineers have to measure tons and tons of images just to see if their process is working correctly. Weave aims to eliminate all of that by providing the measurements automatically.
• 21:27 - 21:29
How does it do the measurements automatically?
• 21:30 - 22:01
So, using machine learning, we essentially segment the images. This is a really, really hard problem. You're taking grayscale SEM images in which, for example, material layers are visibly indistinguishable to the eye, and we segment those images. Users can define what they want measured and where, and we'll extract those measurements. This is super important for modeling, because we need accurate measurements to build the model, but also for process validation: engineers have to measure the uniformity of their process outcomes.
• 22:01 - 22:12
Now with Weave, you can actually get very quantitative, statistical results on how uniform your features are, and pretty accurate measurements of what your material dimensions are and what your profiles look like.
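The pipeline Chopra describes, segment the image, then read measurements off the labeled regions, can be sketched as below. This is only an illustration: Weave's actual segmentation uses machine learning on real SEM images, whereas here a simple intensity threshold stands in for the model, and the tiny synthetic "image" is hypothetical.

```python
# Sketch of segment-then-measure. A brightness threshold stands in for the
# ML segmentation model (an assumption for illustration only).
import numpy as np

def segment(image, threshold=0.5):
    """Label each pixel as material 1 (bright) or material 0 (dark)."""
    return (image >= threshold).astype(int)

def layer_thickness(labels, column):
    """Count material pixels down one column: the layer thickness in pixels."""
    return int(labels[:, column].sum())

# Hypothetical 'SEM image': a 10-row strip where rows 3..6 are a brighter layer.
img = np.zeros((10, 8))
img[3:7, :] = 0.9

labels = segment(img)
print(layer_thickness(labels, column=0))  # 4 pixel rows of bright material
```

Once the pixels are labeled, the measurement itself is just counting and statistics, which is why automating the segmentation step automates the metrology: per-column thicknesses across the image give exactly the kind of uniformity statistics mentioned above.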
• 22:13 - 22:17
So have you deployed this tool yet, like at a fab, or at scale?
• 22:17 - 22:29
Yeah, we're deploying it this February. We've done a few projects in beta with some customers to essentially show that it's working, prove out the concept, but we're pretty excited to launch it this year. Our team is getting ready.
• 22:30 - 22:35
That's great. So with your initial customers, how did you know it was successful? What kind of value did it bring them?
• 22:36 - 23:04
The biggest value you can bring, honestly, is helping them see different material layers that are otherwise hard to distinguish. In a grayscale image, it's really hard to see, for an etch process, how you're etching different features. With Weave, all those layers are segmented, so you can actually see them a lot better and make better decisions. And Weave feeds directly into our other software, Sandbox Studio, so the whole thing is essentially an automation loop where the metrology and the modeling can work together.
• 23:05 - 23:12
That's great. Wow. What kind of feedback do you get from engineers on this? Changing lives, making them better? <laugh>
• 23:13 - 23:29
<laugh> I think making their lives a little bit easier. For people who have to do a lot of image analysis, it's pretty straining, and honestly it's pretty boring work. I think people really appreciate just being able to have a tool do it for them, more or less completely automated.
• 23:29 - 23:38
That's great. So saving money, saving eyes, and saving process engineers' sanity too, huh? That's a huge mission. Awesome.
• 24:52 - 25:15
So in the startup world, you often talk about how it's much more valuable and easier to make a sale if you're solving a pain point than if you're just making somebody's life better, right? Yeah. So I imagine your sales pitch is more around pain points, like, we're going to make your life easier or solve this problem for you, than it is, we're going to make you, I don't know, richer or faster?
• 25:16 - 25:49
Yeah, that's a good question. I mean, on the pain points, we essentially reduce costs and accelerate time to market. But it does become kind of a different conversation: the process engineers who use our software care about different things than the directors and vice presidents purchasing it. The VPs care about ROI; the process engineers really care about what they're going to be able to learn. Will our software illuminate something they didn't know? So it is a fine balance. You want that advocate, but ultimately we always try to show that we're going to reduce total cost and provide a good ROI for our users.
• 25:50 - 26:10
Yeah. So it sounds like your entry point is with engineers, and then they sell it back up the chain to their executive channels, right? How do you go about arming them to have that conversation? You're clearly solving their pain point, but the next challenge is, let's take it up to your executive, and here's how you make the sale to them.
• 26:11 - 26:29
Oh, it's always the data, right? We're a data-driven company, so we try to arm them with the data: look, we solved your process, and this is how fast we did it. You never want to just say it; the technology has to show its value, right? That's essentially what we show: this is working, it's worth the investment, it's worth bringing on a new vendor, this will help you solve your challenges.
• 26:29 - 26:39
Yeah, but I guess you don't want the process engineer just saying, my eyes hurt from looking at all these pictures <laugh>. You want, you know, actually we caught this many more issues: show the value that way.
• 26:41 - 26:56
I could frame it more like this: process engineers spend something like 20% of their time measuring stuff, and a lot of that is low-hanging fruit. So if we can automate that, we increase productivity. So there's definitely still an ROI there as well.
• 26:57 - 27:02
Yeah, no, that's great. Good to think about that. The data basically sells itself, right? That's great. <laugh>
• 27:04 - 27:04
Fantastic. Well, thank you so much. It seems like you're building some interesting stuff, and this is really the next generation of technology that's going to leapfrog us forward. We look forward to having you back soon!
Yeah, thank you!