
Question about fault coverage?


bearpetty


In general, when doing DFT, how much fault coverage do we require: 90%, 95%, or 100%?

Thanks!
 

Hi

I think you are new to DFT. Generally, fault coverage is simply the ratio of the number of detected faults to the total number of faults, so higher fault coverage means more of the faults in the chip are detected. But higher fault coverage also requires more test vectors, and more vectors mean more tester time. So it is better to aim for 95+% fault coverage.
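In symbols, that definition is:

```latex
% Fault coverage as commonly reported by ATPG tools
\text{fault coverage} = \frac{\text{detected faults}}{\text{total faults}}
```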
Regards
Ramesh.S
 

Yes, typically over 95% is acceptable.

Part of the art of DFT is to increase the number of testable nodes and thereby the fault coverage, which often requires changing your logic.
 

I searched some papers at IEEE and found that there are equations relating fault coverage to reject ratio, such as the Williams-Brown model, the Agrawal model, and the Seth-Agrawal model.
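For reference, the Williams-Brown model expresses the defect level (reject ratio) DL in terms of the process yield Y and the fault coverage T:

```latex
% Williams-Brown model: defect level as a function of yield and fault coverage
DL = 1 - Y^{(1 - T)}
```

so once Y is known, a target reject ratio can be translated into a required fault coverage.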

But:
1. These models make some assumptions when deriving the reject ratio vs. fault coverage relationship, and these assumptions may not be exact.

2. These models use parameters such as the average number of faults on a chip and a clustering-effect parameter to get the relationship between reject ratio and fault coverage. I wonder where we can get these parameters. From the foundry? Or from lots of experiments?

I am curious how other companies define their fault coverage requirement. Based on these equations, or based on experience?

Thanks very much!
 

Hi,

I don't agree with the discussion here. I think the fault coverage must be better than 99%, perhaps better than 99.9%. Let me explain.

1. Assume the yield is 90% (which is a very good yield). If the fault coverage is only 95%, then, even counting only dies with a single fault, there will be about 5 bad chips in every 1000 chips you deliver to your customer (see the quick check after point 2)!

2. Be careful how you calculate fault coverage. Logic such as boundary scan is not in the scan chain and thus shows up as a "loss in fault coverage", but it is actually tested by other patterns. So while your scan-chain fault coverage is 95% or more, the actual coverage is higher.
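A quick numeric check of point 1, using the Williams-Brown model mentioned earlier in the thread (a minimal sketch; 0.90 and 0.95 are simply the figures from the example):

```python
# Quick check of point 1 via the Williams-Brown model: DL = 1 - Y**(1 - T)
Y = 0.90  # process yield (the "very good yield" from the example)
T = 0.95  # fault coverage

DL = 1 - Y ** (1 - T)  # fraction of shipped chips that are defective
print(f"defect level        : {DL:.5f}")         # ~0.00525
print(f"bad chips per 1,000 : {DL * 1000:.1f}")  # ~5.3
```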

Regards,
Eng Han
www.eda-utilities.com
 

Hi All,
I agree with leeenghan's notes, but we must keep one thing in mind: the highest priority goes to the customer. If the customer is ready to pay a high amount for tester time, then everyone's target must be 99.9+%; but as a compromise between tester cost and good coverage, 95+% is a good figure. I think you understand my concept.
regards
Ramesh.S
 

Hi,
When the fault coverage is around 95+%, the test coverage will be higher than that, for the obvious reason that some portion of the logic will be no-faulted, like the TAP logic and some macros such as PLLs, which are tested by other means. Some coverage loss will also come from ATPG constraints and non-scan elements.
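In other words (using the definitions common in ATPG reports, where "undetectable" covers no-faulted and untestable logic), test coverage shrinks the denominator:

```latex
% Test coverage excludes undetectable faults from the denominator
\text{test coverage} = \frac{\text{detected faults}}{\text{total faults} - \text{undetectable faults}}
```

which is why it comes out higher than the raw fault coverage.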

So I feel a fault coverage of 95+% is good enough.

Thanks & Regards
Chandhramohan
 

You can achieve 99% on a small design, but it is very difficult to get more than 96% on a big chip design.
 

DFT fault coverage has been a controversial subject for the last 20 years. Some companies, like IBM and Sun, give very high importance to DFT and make their designs as DFT-friendly as possible. This of course adds some complexity and some area, and possibly a loss of peak performance (due to additional muxing logic in the critical path), but their view is that it is all worth it. Many other companies, especially fabless ASIC companies in the Bay Area, rely more on functional testing to validate silicon and not so much on DFT. Depending on where you work, the fault coverage requirements differ.
 

Hi,

I have to disagree again with a few points mentioned above.

Firstly, as long as we insert scan chains and the designers take simple steps to improve fault coverage, it is not difficult to get more than 99% for the logic that can be tested by scan.

Secondly, we cannot just stop at achieving 95% DFT coverage. We have to look at the 5% that is not covered and determine whether it is tested by other vectors or is a coverage escape.

Thirdly, while test cost is high, the test structures do not add a lot to it if you implement them properly. Running a million more vectors takes only 0.1 s more if you are running at just 10 MHz (see the arithmetic below). With the new compression techniques this is even much less than the time for the overall test.
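The vector-time arithmetic in the third point, spelled out (a trivial check; 10 MHz and one million vectors are just the numbers from the example, and one clock cycle per vector is assumed for simplicity):

```python
# Extra tester time for 1M additional vectors at a 10 MHz test clock
# (assumes, for simplicity, one clock cycle per vector)
vectors = 1_000_000
clock_hz = 10_000_000  # 10 MHz

extra_seconds = vectors / clock_hz
print(f"extra test time: {extra_seconds:.1f} s")  # 0.1 s
```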

The important thing is that the design must be made test-friendly. The next important thing is that the mindset should aim for more than 99.9% overall test coverage. It is much more costly for your customer to return a bad chip to you than for you to remove it in the first place.

Regards,
Eng Han
www.eda-utilities.com
 

Yes, this topic of the fault coverage number is debatable, because it actually depends very much on the application the chip is going to be used in.

Say, for example, the chip is going to be part of an automobile (car) and is going to control the airbags; then it is in a life-critical application, so it is necessary to have >99% coverage.

At the same time, if the chip is going to be used in a lighting control system, it doesn't have to be that high; 90% is fair enough.

Another example: say it goes into a spacecraft, which is a mission-critical operation; there 99.9% is a must, and beyond that, multi-level redundancy is employed to take care of the worst-case scenario (in case that 0.1% becomes a reality).

So the numbers get decided by the application.

Best Regards,
Harish
https://hdlplanet.tripod.com
https://groups.yahoo.com/group/hdlplanet
 

hys said:
Say, for example, the chip is going to be part of an automobile (car) and is going to control the airbags; then it is in a life-critical application, so it is necessary to have >99% coverage.
This is what always happens in our company. It's a really, really hard job.
 

I think a fault coverage of 96%+ is accepted for large designs. We have to trade off fault coverage against test time, the overhead of the test structures, the cost of ATE, etc.

Added after 1 minute:

You should take a look at the many works proposed in the DFT domain.
 

I think it depends on the fault model you use. For stuck-at faults we need 95%+ coverage; however, for some other fault models, such as path delay faults, the coverage can't be that high.
 

Fault coverage should be more than 95%.
 

Hi,

I am surprised to hear that most people here think 95% is acceptable. To me, 5% is a large number of untested nodes. If your design has 200K cells, there will be about 400K nodes, and 5% of that means 20K nodes.

Can I ask the question from another angle: what are the "5%" that you cannot test?

Regards,
Eng Han
 

I think Eng Han's points are good ones. The fault coverage should not be one standard value; it differs from project to project. If the ATPG patterns do not cover all the faults, you should examine the remaining faults that the ATPG patterns leave uncovered.
 

It is related to the application of the chip. If it is something pretty critical, then it must be very high, almost 99.9%.
 

hi

I agree with the point that the required fault coverage depends on the application of the chip.

Nothing is absolute.

regards
drrizle
 
