Electronic Manufacturing PCBA Quality Process Audit — In-Circuit Test (ICT)

1. Operator and Work Instructions
1.1 Is there a revision-controlled Test Instruction which contains unique details for the specific product being tested? (Score 0 if there are any unsigned/undated handwritten instructions or any handwritten instructions more than 48 hours old.)
1.2 Are Work Instructions readily available to the test operator, and are they followed?
1.3 Is the Fixture ID specified on Work Instructions?
1.4 Is the Fixture ID traceable to a specific PCBA part number and revision level and to the Unit Under Test?
1.5 Is the Batch File specified on Work Instructions?
1.6 Is the Batch File traceable to a specific PCBA part number and revision level and to the Unit Under Test?
1.7 Are the Vacuum setting and range specified on Work Instructions?
1.8 Is the PCB orientation to the Fixture identified in the Work Instructions or on the Fixture?
1.9 Does the test Operator have the Standard Operating Procedure (SOP) for the tester available to them at all times?
1.10 Is there evidence that the Operator has been trained and certified against the Standard Operating Procedure for the Tester?
1.11 Does the Operator know the content of the Standard Operating Procedure for the Tester, and do they follow it?
1.12 Are Operators required to log in at the Test station, and does this provide an automatic verification of training status? (See the sketch below.)
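
Note: Item 1.12 implies an automated gate between operator login and training records. The sketch below is one minimal way to implement such a check, assuming a hypothetical training-records lookup; the record layout, SOP identifier, and expiry rule are illustrative only, not requirements of this audit.

    from datetime import date

    # Hypothetical in-memory training records; in practice this would be a
    # query against the site's training/certification database.
    TRAINING_RECORDS = {
        "op-1042": {"sop": "SOP-ICT-007", "certified_until": date(2099, 1, 1)},
    }

    def operator_may_test(operator_id: str, required_sop: str, today: date) -> bool:
        """Return True only if the operator holds a current certification for
        the tester SOP; otherwise the station should refuse to start."""
        record = TRAINING_RECORDS.get(operator_id)
        if record is None:
            return False
        return record["sop"] == required_sop and record["certified_until"] >= today

    # Called by the test executive at operator login.
    ok = operator_may_test("op-1042", "SOP-ICT-007", date.today())
    print("login permitted" if ok else "login refused: certification missing or expired")
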
2. ICT Fixture
2.1 Is the ICT fixture identified with a name or number?
2.2 Is the Preventative Maintenance / Calibration sticker on the ICT fixture current and up to date?
2.3 Is there a Preventative Maintenance procedure and schedule for ICT fixtures?
2.4 Is there evidence to demonstrate that Preventative Maintenance records are up to date?
2.5 Are spare test fixture parts (excluding probes) stocked?
2.6 Are spare test fixture probes stocked for each design required to support Dell fixtures?
2.7 Is the inventory of spare test fixture parts adequately controlled?
2.8 Are ICT fixtures adequately stored in such a way that the tester interface is protected?
3. ICT System Hardware
3.1 Is there a vacuum gauge on the line connected to the test system?
3.2 Is a calibrated vacuum gauge visible to the test operator, and is it at the correct setting within an acceptable range?
3.3 Is the Preventative Maintenance / Calibration sticker on the Test System current and up to date?
3.4 Is there a Preventative Maintenance procedure and schedule for ICT Test Systems?
3.5 Are the test system's pin electronics verified daily using the internal test?
3.6 Is there evidence to demonstrate that Preventative Maintenance records are up to date?
3.7 Are spare test system parts stocked?
3.8 Are spare test system pin electronics boards stocked?
3.9 Is the inventory of spare test system parts adequately controlled?
4. ICT Software
4.1 Are all ICT test systems networked to a server?
4.2 Is the ICT software downloaded from a central server when the program is called up, or at least compiled once per day?
4.3 Is the ICT software revision-controlled for program changes?
4.4 Are all changes to an ICT program, however insignificant the change is considered, logged in the program or elsewhere?
4.5 Is it impossible for unapproved ICT software changes to remain on the test system for longer than 24 hours before the program is recompiled? (See the sketch after this section.)
4.6 When changes are made, is there evidence that change details are sent to Dell for approval?
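
Note: Items 4.2 and 4.5 together require that only the revision-controlled master copy on the server can persist on a tester. The sketch below shows one way to detect unapproved local edits by comparing file digests, assuming the test programs exist as files the station can read; the paths and naming are illustrative only.

    import hashlib
    from pathlib import Path

    def file_digest(path: Path) -> str:
        """SHA-256 digest of a test program file, used to compare copies."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def local_copy_is_approved(local_program: Path, server_master: Path) -> bool:
        """True only if the program on the tester matches the revision-controlled
        master on the server; a mismatch indicates an unapproved local edit."""
        return file_digest(local_program) == file_digest(server_master)

    # Run at each program load (or at least once per day); on a mismatch the
    # station should re-download the master and recompile before testing.
    # The paths below are illustrative placeholders:
    #   local = Path("/ict/local/PN12345_revB.obj")
    #   master = Path("/mnt/ict_server/PN12345_revB.obj")
    #   if not local_copy_is_approved(local, master):
    #       ...  # pull the master copy, recompile, and log the event
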
5. Test Operation
5.1 Is there an automated method of loading test programs (i.e., the batch file)?
5.2 Is the Fixture ID used to select and automatically load the correct ICT program for the unit under test? (See the selection sketch at the end of this section.)
5.3 If the same ICT fixture is used for different PCBA part numbers, does the Batch file automatically differentiate? NA may be used.
5.4 Is there a foolproof method to ensure that product A will not pass ICT if Batch File B is used? NA may be used.
5.5 Is there a foolproof method to ensure that product B will not pass ICT if Batch File A is used? NA may be used.
5.6 Is the PCBA orientation to the fixture identified, or is it required to be checked before board loading?
5.7 Is the test program uniquely identified on the test system display after the test program has been loaded?
5.8 Is there a documented and agreed convention outlining the storage of untested, failed, and passed boards?
5.9 Are boards marked in some way to facilitate the implementation of this convention? (Forced Routing is acceptable.)
5.10 Are boards awaiting test identified and stored separately according to the convention OR routed via a dedicated conveyor?
5.11 Are passing boards identified and stored separately according to the convention OR routed via a dedicated conveyor?
5.12 Are failing boards identified and stored separately according to the convention OR routed via a dedicated conveyor?
5.13 Are passed and failed boards stored on different storage carts OR routed via a unique conveyor?
5.14 Is SPC data collected and effectively used at this process point?
5.15 Is the content of the SPC data chart up to date?
5.16 Are out-of-control SPC data points effectively actioned?
5.17 Are ICT buffer trigger points established to ensure process shutdown should the limits be exceeded?
5.18 Has a Gauge R&R study been completed in accordance with Dell GR&R Procedures?
5.19 Has test coverage been calculated using the Dell Metrics for ICT coverage, and is this readily available and known?
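
Note: Items 5.2 through 5.5 describe foolproof program selection keyed on the Fixture ID. The sketch below shows one way to make a wrong pairing impossible to load, assuming a registry that pairs each fixture with the PCBA part numbers it is approved for; the fixture IDs, part numbers, and batch file names are illustrative only.

    # Hypothetical batch-file registry keyed by fixture ID and PCBA part number;
    # in practice the mapping comes from the revision-controlled batch files.
    BATCH_FILES = {
        ("FIX-0071", "PN12345-A02"): "batch_PN12345_A02.bat",
        ("FIX-0071", "PN67890-A01"): "batch_PN67890_A01.bat",
    }

    def select_batch_file(fixture_id: str, pcba_part_number: str) -> str:
        """Return the only batch file registered for this fixture/part pairing.
        Any unregistered pairing is rejected, so product A cannot be run
        against product B's batch file."""
        try:
            return BATCH_FILES[(fixture_id, pcba_part_number)]
        except KeyError:
            raise LookupError(
                f"No approved batch file for {pcba_part_number} on {fixture_id}"
            ) from None

    print(select_batch_file("FIX-0071", "PN12345-A02"))  # batch_PN12345_A02.bat
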
6. ICT WIP Tracking
6.1 Are a Forced Board Routing system and a WIP Tracking system fully deployed throughout the test process?
6.2 Does the system verify the ‘last’ step processed and compare it to the expected ‘last’ step?
6.3 Do boards that PASS have an identifying mark to indicate their pass status? (Forced Routing is acceptable.)
6.4 Do boards that FAIL have an identifying mark to indicate their fail status? (Forced Routing is acceptable.)
6.5 Do failed boards have ICT fail listings attached for debug purposes? (Paperless repair is acceptable.)
6.6 Do all debugged boards have an identifying mark to indicate a debug status? (Forced Routing is acceptable.)
6.7 Is there a software link between ICT Test results & the Forced Routing/Quality Data System?
6.8 Does this software link eliminate manual intervention to indicate the test result status?
6.9 Is this link fully automated and used to log Test Yield/First Pass Yield & Board Yield data?
6.10 Are FPY and BY readily known, and have they been calculated in accordance with Dell definitions and specified retry conditions? (See the yield sketch after this section.)
6.11 For boards tested in panel format, is the failed board identified before the panel is removed from the fixture?
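
Note: Item 6.10 asks whether FPY and Board Yield are calculated to the Dell definitions. The sketch below uses a common definition (first-attempt passes over boards tested, and eventual passes over boards tested); the Dell definitions and specified retry conditions take precedence wherever they differ, and the result-log layout is illustrative only.

    def first_pass_yield(results: list[dict]) -> float:
        """Fraction of boards that pass on their first test attempt.
        The results list is assumed to be ordered by test time."""
        first_attempts = {}
        for r in results:
            first_attempts.setdefault(r["serial"], r["passed"])
        return sum(first_attempts.values()) / len(first_attempts)

    def board_yield(results: list[dict]) -> float:
        """Fraction of boards that eventually pass, counting retests."""
        final = {}
        for r in results:
            final[r["serial"]] = final.get(r["serial"], False) or r["passed"]
        return sum(final.values()) / len(final)

    # Example log as it might be pulled from the ICT-results/Forced Routing link.
    log = [
        {"serial": "SN001", "passed": True},
        {"serial": "SN002", "passed": False},
        {"serial": "SN002", "passed": True},   # retest after debug
    ]
    print(first_pass_yield(log), board_yield(log))  # 0.5 1.0
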
7. Debug and FA Capability
7.1 Are boards awaiting debug identified and stored to one side of the operator, or routed via a dedicated conveyor?
7.2 Are debug buffer trigger points established to ensure process shutdown should the limits be exceeded?
7.3 Can it be demonstrated that a technician qualification or formal training is required for debug and failure analysis activity?
7.4 Can it be demonstrated that all debug personnel meet the above requirements?
7.5 Is there adequate debug equipment available at each debug station?
7.6 Is ICT debug and repair conducted real-time (on-line) whenever possible?
7.7 Can it be demonstrated that a tool is used to capture fail codes and their fixes, and then recommend the most common fix? (See the sketch at the end of this section.)
7.8 Does the ICT debug technician access the ICT system via a display unit to better understand the root cause of failure?
7.9 Is a test point location map available to the repair/failure analysis operator?
7.10 Are there detailed instructions on determining false failures?
7.11 Are false failure rates tracked, monitored, and documented?
7.12 Are there goals for false failure reduction (i.e., goals to increase FPY)?
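
Note: Item 7.7 describes a tool that captures fail codes with their fixes and recommends the most common fix. The sketch below is a minimal version of that idea; the fail codes and fix descriptions are illustrative only.

    from collections import Counter, defaultdict

    class FixRecommender:
        """Record each fail code with the fix that cleared it, then recommend
        the historically most common fix for a new occurrence of that code."""

        def __init__(self):
            self._fixes = defaultdict(Counter)

        def record(self, fail_code: str, fix: str) -> None:
            self._fixes[fail_code][fix] += 1

        def recommend(self, fail_code: str):
            counts = self._fixes.get(fail_code)
            return counts.most_common(1)[0][0] if counts else None

    tool = FixRecommender()
    tool.record("R104_OPEN", "reflow R104")
    tool.record("R104_OPEN", "replace R104")
    tool.record("R104_OPEN", "reflow R104")
    print(tool.recommend("R104_OPEN"))  # reflow R104
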