Electronic Manufacturing PCBA Quality Process Audit — In-Circuit Test

1. Operator and Work Instructions
1.1 Is there a revision-controlled Test Instruction which contains unique details for the specific product being tested? (Score 0 if there are any unsigned/undated handwritten instructions, or any handwritten instructions more than 48 hours old.)
1.2 Are Work Instructions readily available to the test operator and are they followed?
1.3 Is the Fixture ID specified on Work Instructions?
1.4 Is the Fixture ID traceable to a specific PCBA part number and revision level and to the Unit Under Test?
1.5 Is the Batch File specified on Work Instructions?
1.6 Is the Batch File traceable to a specific PCBA part number and revision level and to the Unit Under Test?
1.7 Are the Vacuum setting and range specified on Work Instructions?
1.8 Is the PCB orientation to the Fixture identified in the Work Instructions or on the Fixture?
1.9 Does the test Operator have the Standard Operating Procedure (SOP) for the tester available to them at all times?
1.10 Is there evidence that the Operator has been trained and certified against the Standard Operating Procedure for the Tester?
1.11 Does the Operator know the content of the Standard Operating Procedure for the Tester, and do they follow it?
1.12 Are Operators required to log in at the Test station and does this provide an automatic verification of training status?
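
Item 1.12 above asks for automatic verification of training status at login. A minimal sketch of how such a gate could work, assuming a hypothetical training-records table keyed by operator ID (the table, field names, and revision scheme are illustrative, not part of the audit):

    from datetime import date

    # Hypothetical training records: operator ID -> certified SOP revision
    # and certification expiry. In practice this would come from the site's
    # training database.
    TRAINING_RECORDS = {
        "OP1042": {"sop_rev": "C", "expires": date(2026, 3, 31)},
    }

    def login_allowed(operator_id, required_sop_rev, today):
        """Block the test station unless the operator holds a current
        certification against the required SOP revision (item 1.12)."""
        record = TRAINING_RECORDS.get(operator_id)
        if record is None:
            return False                     # no training record on file
        if record["sop_rev"] != required_sop_rev:
            return False                     # certified to an outdated SOP
        return record["expires"] >= today    # certification must be current

    assert login_allowed("OP1042", "C", date(2025, 1, 10))
    assert not login_allowed("OP9999", "C", date(2025, 1, 10))

Blocking at login, rather than auditing afterwards, is what makes the verification "automatic" in the sense of item 1.12.
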
2. ICT Fixture
2.1 Is the ICT fixture identified with a name or number?
2.2 Is the Preventative Maintenance / Calibration sticker on the ICT fixture current? (A due-date check sketch follows this section.)
2.3 Is there a Preventative Maintenance procedure and schedule for ICT fixtures?
2.4 Is there evidence to demonstrate that Preventative Maintenance records are up-to-date?
2.5 Are spare test fixture parts (excluding probes) stocked?
2.6 Are spare test fixture probes stocked for each probe design required to support Dell fixtures?
2.7 Is the inventory of spare test fixture parts adequately controlled?
2.8 Are ICT fixtures adequately stored in such a way that the tester interface is protected?
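
Items 2.2 and 2.4 hinge on maintenance and calibration being demonstrably current. A minimal sketch of a due-date check, assuming a hypothetical fixture register that records the next scheduled PM date per Fixture ID:

    from datetime import date

    # Hypothetical fixture register: Fixture ID -> next scheduled PM date.
    FIXTURE_PM_DUE = {
        "FIX-0017": date(2025, 6, 1),
        "FIX-0023": date(2024, 11, 15),
    }

    def overdue_fixtures(today):
        """List fixtures whose preventative maintenance has lapsed
        (items 2.2 and 2.4: stickers and records must be current)."""
        return [fid for fid, due in FIXTURE_PM_DUE.items() if due < today]

    print(overdue_fixtures(date(2025, 1, 10)))   # -> ['FIX-0023']
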
3. ICT System Hardware
3.1 Is there a vacuum gauge on the line connected to the test system?
3.2 Is a calibrated vacuum gauge visible to the test operator, and is it at the correct setting within an acceptable range? (See the gating sketch after this section.)
3.3 Is the Preventative Maintenance / Calibration sticker on the Test System current?
3.4 Is there a Preventative Maintenance procedure and schedule for ICT Test Systems?
3.5 Are the test system's pin electronics verified daily using the internal self-test?
3.6 Is there evidence to demonstrate that Preventative Maintenance records are up-to-date?
3.7 Are spare test system parts stocked?
3.8 Are spare test system pin electronics boards stocked?
3.9 Is the inventory of spare test system parts adequately controlled?
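
Items 3.2 and 3.5 describe conditions that can gate every test run. A minimal sketch of such a pre-test gate, assuming the station software can read the calibrated gauge and the date of the last pin-electronics self-test (the limits, units, and names are illustrative):

    from datetime import date

    def ready_to_test(vacuum_in_hg, last_self_test, today,
                      vac_min=20.0, vac_max=25.0):
        """Gate a test run: the vacuum reading must sit inside the specified
        range (item 3.2) and the pin-electronics self-test must have run
        today (item 3.5)."""
        if not (vac_min <= vacuum_in_hg <= vac_max):
            return False                 # gauge outside the acceptable range
        return last_self_test == today   # daily internal test not yet run

    assert ready_to_test(22.5, date(2025, 1, 10), date(2025, 1, 10))
    assert not ready_to_test(18.0, date(2025, 1, 10), date(2025, 1, 10))
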
4. ICT Software
4.1 Are all ICT test systems networked to a server?
4.2 Is the ICT software downloaded from a central server when the program is called up, or at least recompiled once per day?
4.3 Is the ICT software revision controlled for program changes?
4.4 Are all changes to an ICT program, no matter how insignificant the change is considered, logged either in the program itself or elsewhere?
4.5 Are controls in place so that unapproved ICT software changes cannot remain on the test system for longer than 24 hours before the program is recompiled? (A checksum-verification sketch follows this section.)
4.6 When changes are made, is there evidence that change details are sent to Dell for approval?
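
Items 4.2 and 4.5 together imply that a local program copy should never drift from the revision-controlled master on the server. A minimal sketch of one enforcement approach, comparing file hashes before a program is allowed to run (the compare-against-master scheme is an assumption, not a stated requirement):

    import hashlib
    from pathlib import Path

    def file_sha256(path):
        """Hash a program file so local and server copies can be compared."""
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def program_is_approved(local_copy, server_master):
        """Allow a program to run only if it matches the revision-controlled
        master on the central server (items 4.2 and 4.5): any unapproved
        local edit changes the hash and forces a fresh download/recompile."""
        return file_sha256(local_copy) == file_sha256(server_master)

Run at program load and again at the daily recompile, a check like this makes the 24-hour limit in item 4.5 enforceable by the system rather than by procedure.
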
5. Test Operation
5.1 Is there an automated method of loading test programs (i.e. the batch file)?
5.2 Is the Fixture ID used to select and automatically load the correct ICT program for the unit under test? (A lookup sketch follows this section.)
5.3 If the same ICT fixture is used for different PCBA part numbers, does the Batch file automatically differentiate? NA may be used.
5.4 Is there a foolproof method to ensure that product A will not pass ICT if Batch File B is used? NA may be used.
5.5 Is there a foolproof method to ensure that product B will not pass ICT if Batch File A is used? NA may be used.
5.6 Is the PCBA orientation to the fixture identified, or is it required to be checked before board loading?
5.7 Is the test program uniquely identified on the test system display after the test program has been loaded?
5.8 Is there a documented and agreed convention outlining the storage of untested, failed, and passed boards?
5.9 Are boards marked in some way to facilitate the implementation of this convention? (Forced Routing is acceptable.)
5.10 Are boards awaiting test identified and stored separately according to the convention OR routed via a dedicated conveyor?
5.11 Are passing boards identified and stored separately according to the convention OR routed via a dedicated conveyor?
5.12 Are failing boards identified and stored separately according to the convention OR routed via a dedicated conveyor?
5.13 Are passed and failed boards stored on different storage carts OR routed via unique conveyors?
5.14 Is SPC data collected and effectively used at this process point?
5.15 Is the content of the SPC data chart up-to-date?
5.16 Are out-of-control SPC data points effectively actioned?
5.17 Are ICT buffer trigger points established to ensure process shutdown should the limits be exceeded?
5.18 Has a Gauge R&R study been completed in accordance with Dell GR&R Procedures?
5.19 Has test coverage been calculated using the Dell Metrics for ICT coverage, and is this readily available and known?
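
Items 5.2, 5.4, and 5.5 call for the Fixture ID to drive program selection and for a foolproof block on product/batch-file mismatches. A minimal sketch of such an interlock, assuming a revision-controlled lookup table from Fixture ID to the approved batch file (table contents are illustrative):

    # Hypothetical revision-controlled lookup: Fixture ID -> the only batch
    # file approved for that fixture / PCBA part number combination.
    APPROVED_BATCH = {
        "FIX-0017": "batch_A",
        "FIX-0023": "batch_B",
    }

    def load_program(fixture_id, requested_batch):
        """Select the ICT program from the Fixture ID (item 5.2) and refuse
        to run on any mismatch (items 5.4/5.5), so product A can never be
        tested, let alone passed, against Batch File B."""
        approved = APPROVED_BATCH.get(fixture_id)
        if approved is None:
            raise RuntimeError(f"Unknown fixture {fixture_id}: test blocked")
        if requested_batch != approved:
            raise RuntimeError(f"Batch file {requested_batch} is not "
                               f"approved for fixture {fixture_id}")
        return approved   # the batch file the tester is allowed to execute

Failing closed on an unknown fixture, rather than defaulting to any program, is what makes the method foolproof in the sense of items 5.4 and 5.5.
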
6. ICT WIP Tracking
6.1 Are a Forced Board Routing system and a WIP Tracking system fully deployed throughout the test process?
6.2 Does the system verify the ‘last’ step processed and compare it to the expected ‘last’ step?
6.3 Do boards that PASS have an identifying mark to indicate their pass status? (Forced Routing is acceptable.)
6.4 Do boards that FAIL have an identifying mark to indicate their fail status? (Forced Routing is acceptable.)
6.5 Do failed boards have ICT fail listings attached for debug purposes? (Paperless repair is acceptable)
6.6 Do all debugged boards have an identifying mark to indicate a debug status? (Forced Routing is acceptable.)
6.7 Is there a software link between ICT Test results & the Forced Routing/Quality Data System?
6.8 Does this software link eliminate manual intervention to indicate the test result status?
6.9 Is this link fully automated and used to log Test Yield/First Pass Yield & Board Yield data?
6.10 Are FPY and BY readily known, and have they been calculated in accordance with Dell definitions and specified retry conditions? (A worked calculation follows this section.)
6.11 For boards tested in panel format, is the failed board identified before the panel is removed from the fixture?
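
Item 6.10 asks whether FPY and BY follow the agreed definitions. As a worked sketch only, using common industry definitions rather than the Dell ones referenced above (FPY counts boards passing on the first attempt; BY also counts boards that pass within the allowed retries):

    def yields(first_pass, passed_after_retry, total_tested):
        """Illustrative yield definitions for item 6.10: FPY counts boards
        passing on the first attempt; Board Yield (BY) also counts boards
        that pass within the specified retry conditions."""
        fpy = first_pass / total_tested
        by = (first_pass + passed_after_retry) / total_tested
        return fpy, by

    fpy, by = yields(first_pass=940, passed_after_retry=35, total_tested=1000)
    print(f"FPY = {fpy:.1%}, BY = {by:.1%}")   # FPY = 94.0%, BY = 97.5%
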
7. Debug and FA Capability
7.1 Are boards awaiting debug identified and stored to one side of the operator, or routed via a dedicated conveyor?
7.2 Are debug buffer trigger points established to ensure process shutdown should the limits be exceeded?
7.3 Can it be demonstrated that a technician qualification or formal training is required for debug and failure analysis activity?
7.4 Can it be demonstrated that all debug personnel meet the above requirements?
7.5 Is there adequate debug equipment available at each debug station?
7.6 Is ICT debug and repair conducted real-time (on-line) whenever possible?
7.7 Can it be demonstrated that a tool is used to capture fail codes, their fixes, and then recommend the most common fix?
7.8 Does the ICT debug technician access the ICT system via a display unit to better understand the root cause of failure?
7.9 Is a test point location map available to the repair/failure analysis operator?
7.10 Are there detailed instructions on determining false failures?
7.11 Are false failure rates tracked, monitored, and documented? (A rate-calculation sketch follows this section.)
7.12 Are there goals for false failure reduction (i.e. goals to increase FPY)?
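
Items 7.10 through 7.12 concern false failures. One common working definition, assumed here for illustration rather than taken from the audit, is a board that fails ICT but retests good with no repair performed. A minimal tracking sketch:

    def false_failure_rate(records):
        """Fraction of ICT failures that retested good with no repair
        performed (items 7.11/7.12); this definition is an assumption
        made for illustration."""
        failures = [r for r in records if r["failed_ict"]]
        if not failures:
            return 0.0
        false_fails = [r for r in failures
                       if r["retest_pass"] and not r["repaired"]]
        return len(false_fails) / len(failures)

    log = [
        {"failed_ict": True,  "retest_pass": True,  "repaired": False},  # false fail
        {"failed_ict": True,  "retest_pass": True,  "repaired": True},   # true fail, fixed
        {"failed_ict": False, "retest_pass": False, "repaired": False},  # first-pass good
    ]
    print(false_failure_rate(log))   # -> 0.5
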
