I feel like I am misunderstanding something. Defect rate seems like the inverse of yield? 90% defective = 10% yield, which, no matter what “context” is missing, sounds abysmal. 10 chips at 10% yield (i.e. a 90% defect rate) means 1 good chip. 1000 chips, 900 defective. No matter how you slice it, this seems terrible? Regardless of die size or chips per wafer, if you are only getting 10% out of it, that’s a lot of waste?
Suppose a wafer picks up 20 defects, each landing on a different die. If we have a small chip design where we can fit 100 into this wafer, only 20 will be harmed by defects, and the remaining 80 will be fine. That’s an 80% yield rate. If we build a larger chip and only 25 can fit into a wafer, the same 20 defects give us 20 defective dies and 5 good dies. That’s a yield rate of 20%.
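The arithmetic above (a fixed number of defects per wafer, each assumed to kill exactly one die) can be sketched as:

```python
# Toy model: a fixed number of defects scattered across the wafer,
# each assumed to land on a different die and kill it.
def yield_rate(dies_per_wafer: int, defects_per_wafer: int) -> float:
    good = max(dies_per_wafer - defects_per_wafer, 0)
    return good / dies_per_wafer

print(yield_rate(100, 20))  # small die: 80 of 100 good -> 0.8
print(yield_rate(25, 20))   # large die:  5 of  25 good -> 0.2
```

Same wafer, same defects, wildly different yield, purely because of die size.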
I think the article’s implying that the 10% figure was for a huge chip, and the defect density is low enough that normal-sized chips get a much better yield. If you make a big enough chip even on a really mature process, you’ll get a terrible yield. Sometimes you might need a really big chip, though, and be willing to spend a ludicrous amount of money on it.
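The usual first-order way to state that relationship is the Poisson yield model, Y = exp(-D·A), where D is defect density and A is die area. The article gives neither number, so the values below are purely illustrative:

```python
import math

# Poisson yield model: Y = exp(-D * A).
# D = defect density (defects per cm^2), A = die area (cm^2).
# These numbers are made up for illustration, not from the article.
D = 0.1  # a fairly mature process

for area in (0.5, 1.0, 8.0, 23.0):  # die area in cm^2
    print(f"{area:5.1f} cm^2 die -> {math.exp(-D * area):.0%} yield")
```

With that (assumed) defect density, a 1 cm² die yields around 90% while a 23 cm² monster yields around 10%. So a 10% yield figure and a perfectly healthy process can coexist if the chip is big enough.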
The article doesn’t state the size of the chip the 10% figure was for, though, and just lists examples of things that wouldn’t have happened if the figure for a typical chip was that abysmal.
They’re saying that defect percentage isn’t as important as “good chips per wafer.”
If you can fit 100 chips on a single wafer with a 20% defect rate, you get 80 good chips.
If you can fit 1000 chips on a single wafer with a 50% defect rate, you get 500 good chips.
Yeah, I feel like we are saying the same thing, but the only figure we have is 10% yield, and I don’t understand how the article disputes it?
Yeah, the article seems to say “don’t trust that number, but we don’t have any other numbers”
Very unsatisfying.