NVIDIA KEEPS HOWLING that it will have GT300 out this year, so we did a little digging on the topic. All we can say is that it seems willing to fritter away large sums of its own and TSMC’s cash to meet a PR goal.
The short story is that GT300 taped out in week 3 or 4 of July 2009. It is a huge chip: we had heard about 23mm X 23mm, but we are now hearing it is 23.something X 23.something millimeters, so 530mm^2 might be a slight underestimate. In any case, TSMC runs about 8 weeks for hot lots, 2 more weeks for bringup, debug, and new masks, and finally another 10-12 weeks for production silicon.
GT300 wafers went in at TSMC on August 3, give or take a couple of days, so if you add 22 (8 + 2 + 12) weeks to that, you are basically into 2010 before TSMC gives you the wafers back with pretty pictures inscribed on them. Like babies, you just can’t rush the process with more, um, hands.
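A quick sanity check of that math, a minimal sketch assuming the August 3 wafer start and the 8 + 2 + 12 week figures above:

```python
from datetime import date, timedelta

# Back-of-envelope schedule check, assuming the figures above:
# wafers in on August 3, 2009, then 8 weeks for hot lots, 2 weeks
# for bringup/debug/new masks, and 12 weeks for production silicon.
wafer_start = date(2009, 8, 3)
weeks_total = 8 + 2 + 12
production_silicon = wafer_start + timedelta(weeks=weeks_total)
print(production_silicon)  # 2010-01-04, squarely into 2010
```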
While it is not rushable, you can do some of it in parallel, and that uses what are called risk wafers. Those wafers are put in at the same time the first silicon hot lots are, so they have been in the oven about 2 weeks now. Just before the final layers are put on, the risk wafers are parked, unfinished, off to the side of the TSMC line.
The idea is that with the hot lots, Nvidia gets chips back and debugs them. Any changes that are necessary can hopefully be retrofitted to the risk wafers, which are then finished off. Basically, the risk wafers are a bet that there won’t be anything major wrong with the GT300 design, that is, that any changes are minor enough to be literally patched over.
Risk wafers are called risk wafers for a reason. If you do need a bigger change, one that touches layers already patterned, the risk wafers become what are called scrap wafers. You need to make damn sure your design is perfect, or nearly perfect, before you risk it. Given Nvidia’s abysmal execution recently, that is one gutsy move.
Just looking at the cost, TSMC 40nm wafers are priced at about $5,000 each right now, and we hear Nvidia put in about 9,000 risk wafers and is paying a premium on each of them. 9,000 X ($5,000 + premium) is about $50 million. If there is a hiccup, you have some very expensive coasters for the company’s non-denominational winter festivity party. Then again, putting them in the oven this early makes dear leader less angry, so in they go.
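For those playing along at home, a minimal sketch of the wafer bill; the $5,000 price and the 9,000 wafer count come from above, while the per-wafer premium is a placeholder since the real figure is not public:

```python
# Rough risk-wafer bill. Wafer count and base price are the figures
# above; the premium is a hypothetical placeholder.
wafers = 9_000
base_price = 5_000   # USD per 40nm wafer
premium = 500        # hypothetical, actual premium unknown
total = wafers * (base_price + premium)
print(f"${total:,}")  # $49,500,000, i.e. roughly $50 million
```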
Costs for this are pretty high, but here again, Nvidia has an out. The rumor is that TSMC, as a way to slide Nvidia money through backchannels to keep it from hopping to GlobalFoundries, is charging Nvidia per good chip while yields are below 60%. Some knowledgeable folks who design semiconductors for a living estimate the initial yields at 20-25% after repair. For reference, early yields for GT200 on a known 65nm process were about 62% across the GT280 and the yield-salvaged GT260.
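To put those yield numbers in perspective, here is a rough sketch using the common die-per-wafer approximation; the 300mm wafer and the ~530mm^2 die size from above are our assumptions, and scribe lines and edge exclusion will shift the gross count somewhat:

```python
import math

# Rough good-die estimate: 300mm wafer, ~530mm^2 die (figures above),
# using the common die-per-wafer approximation. Scribe lines and
# edge exclusion will move the real gross count a bit.
wafer_diameter = 300.0   # mm
die_area = 530.0         # mm^2

gross_die = (math.pi * (wafer_diameter / 2) ** 2 / die_area
             - math.pi * wafer_diameter / math.sqrt(2 * die_area))
print(round(gross_die))  # ~104 candidate die per wafer

for label, y in [("20% yield", 0.20), ("25% yield", 0.25), ("GT200-era 62%", 0.62)]:
    print(f"{label}: ~{gross_die * y:.0f} good die per wafer")
# roughly 21-26 good die per wafer at 20-25%, versus ~65 at 62%
```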
What does this mean? One of two things: Nvidia is either trying to make a seriously downscaled version just to get something out the door, or it is eating wafers in the hope of getting enough parts to pretend it is not lost. Think absurdly expensive press parts.
The downscaled chip would be something like a GT260 ‘minus’ instead of a ‘++’, possibly downclocked as well. The GT260 had two clusters disabled; you could do the same with GT300 and disable more to get effective yields up. We doubt this is the way it will go, though. Cypress will slaughter Nvidia if it takes this route.
The other option is to cherry-pick the few good parts that come out and pretend Nvidia is not eating a few hundred dollars per card sold, while praying that yields come up fast. Then again, if the rumors of TSMC shouldering the cost of bad chips are true, it isn’t Nvidia’s problem now, is it?
Which way will it go? We will know in Q1. There will either be an updated spec card with a higher stepping or there won’t be. Nvidia will have cards out in real, not PR, quantities or it won’t. Nvidia will either have a hidden charge on its balance sheet or TSMC will. When you are talking about this much money, though, the numbers are hard to hide.
In any case, Nvidia made a huge bet for ego purposes. The real question is who pays for it in the end, Nvidia or TSMC? Betting people say that TSMC will eat the costs and Nvidia will be a GlobalFoundries customer at the earliest opportunity anyway. S|A
Charlie Demerjian