Category Archives: Instrument

Flow Visualization in PCB Testing: Part 3

Part 3: Best Wind Tunnels for Flow Visualization

In the final post of this three-part blog series, we discuss lab equipment that allows for optimal flow visualization when testing PCBs.

Our first two posts showed the benefits of air and liquid flow visualization in PCB testing. To successfully evaluate the thermal performance of a component and its PCB, the proper test environment must exist:

A test system that:

  • Accommodates both single components and multiple PCBs
  • Reflects the actual system
  • Can simulate elevated temperature conditions with a controlled flow, if required

 

ATS offers a family of wind tunnels that provide an automated facility for thermal characterization, testing, and optimization of PCBs, components, and heat sinks.

ATS Wind Tunnels

 

All ATS wind tunnels have a large Plexiglas test section for optimal flow visualization. Wind tunnel controllers, such as the WTC-100 and CLWTC-1000, automate testing, while accessories such as the HP-97 simulate heat dissipation. Most wind tunnels are easily portable and can be operated vertically or horizontally, maximizing compact lab space.

Rework of PCB layout using flow visualization techniques

Flow visualization is an easy and effective way to understand the flow conditions around a component. It allows the PCB layout to be changed at the design stage, reducing the development costs of reworking the board layout after the design has been solidified. It minimizes thermal component costs, increases system reliability, and speeds up a product’s time to market.

To learn more about the wide range of ATS Wind Tunnels, please visit http://www.qats.com/Products/Wind-Tunnels or email ats-hq@qats.com.

Click here for part 1 of this three part series
Click here for part 2 of this three part series

Flow Visualization in PCB Testing: Part 2

In part 2 of our blog series on flow visualization, we discuss the benefits of liquid flow visualization, along with several primary methods for creating successful flow visualization.

Part 2: Liquid Flow Visualization

Liquid Flow Visualization

Flow visualization is usually easier to perform in water than in air and yields results of better quality. As with smoke visualization, dye entrainment is successful mostly in laminar flow. The enhanced mixing in turbulent flow causes the dye streaks to diffuse too rapidly to be of value as tracers. Compared to smoke visualization in air, dye entrainment in liquids is helped by the fact that the mixing between most dyes and water is less intense than that between smoke and air. As a result, water-flow tunnels are frequently used to study air flows by testing scaled models at lower velocity. This often provides a better description of the flow.
A liquid flow model is scaled with a different working fluid than air using dimensional analysis. For a treatment of the principles of dimensional analysis and similitude that should be used in applying flow visualization in model experiments, reference can be made to any standard textbook on fluid mechanics. For the flow conditions around a model to be completely similar to those of the prototype, all relevant dimensionless parameters must have the same corresponding values: the model and prototype are then said to possess geometric, kinematic, dynamic and thermal similarity.
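For the isothermal flows discussed here, matching the Reynolds number is the key similitude requirement, and it explains why a water tunnel can run the same test at much lower velocity. The sketch below illustrates the scaling; the viscosity values are nominal room-temperature assumptions, not figures from this article.

```python
# Sketch: matching the Reynolds number to reproduce an air flow in a
# water tunnel. Viscosities are nominal 20 C values (assumptions).

NU_AIR = 1.5e-5    # kinematic viscosity of air, m^2/s (approx.)
NU_WATER = 1.0e-6  # kinematic viscosity of water, m^2/s (approx.)

def matched_water_velocity(v_air, l_air, l_model):
    """Water velocity giving the same Reynolds number Re = V*L/nu as an
    air flow at v_air over characteristic length l_air, for a model of
    characteristic length l_model (all lengths in meters)."""
    re_air = v_air * l_air / NU_AIR
    return re_air * NU_WATER / l_model

# Example: a 3 m/s air flow over a 20 mm component, full-scale model.
v_w = matched_water_velocity(3.0, 0.020, 0.020)
print(f"Matched water velocity: {v_w:.2f} m/s")  # 0.20 m/s, 15x slower
```

Because water's kinematic viscosity is roughly 15 times lower than air's, a full-scale model needs only about one fifteenth of the air velocity, which is what makes dye filaments easier to follow.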
To visualize the flow, water-soluble dyes such as food coloring, potassium permanganate, methylene blue, ink and fluorescent ink may be injected using hypodermic needles or entrained from holes or slots in the walls of a test section. It is important that the velocity and density of the injected dye equal those of the surrounding fluid to maintain a stable dye filament and reduce disturbance of the surrounding flow.

In summary, fluid flow visualization is a powerful and unique technique for quickly identifying the flow distribution and approach air velocity to thermally challenging components in a complex PCB structure. By using this technique, one can attain the following:
- Examine the PCB layout at the design stage for expected thermal performance, e.g., determine flow stagnation areas.
- Make component layout recommendations that provide a thermally optimum board.
- Identify approach air velocities necessary for component thermal management and the choice of cooling system.

Click here for part 1 of this three part series
Click here for part 3 of this three part series

Flow Visualization in PCB Testing: Part 1

In our first post about flow visualization, we discuss the benefits of flow visualization, along with several primary methods for creating successful flow visualization.

Part 1: Best Techniques for Air Flow Visualization

PCBs support a multitude of components with varied geometries, electrical functions, power dissipation and thermal performance needs. For a PCB to work properly, a component’s thermal requirements must be met locally or at the system level. Regardless of the type of housing that surrounds a PCB, its cooling system must be designed so that diverse components are electrically functional and run at temperatures that help them reach their expected life spans.

Much effort is needed to meet a component’s thermal requirements, whether by enhanced fluid flow (liquid or air) or by adding a cooling solution, e.g., a heat sink, onto the component. Except for a conduction cooled PCB, where a cold plate extracts heat from the board, electronics are typically in contact with some sort of cooling fluid. In many cases, the PCB is in direct contact with the coolant. This creates a very complex problem along with a unique opportunity.

The problem stems from the intricate topology of the PCB. Highly complex flows are observed on PCBs due to their three dimensional protrusions, i.e., components. A typical PCB sees every imaginable flow structure. These include laminar, turbulent, separated flow, reversed flow, pulsating, locally transient and others. Flow visualization has the potential to yield more insight into a fluid flow or convection cooling problem than any other single method. Many misconceptions can usually be cleared up by flow visualization. However, it is important to use the technique most suited to a given problem.

Air Flow Visualization

Smoke entrainment is the most common visualization technique for laminar air flows. But it has somewhat limited use in turbulent flows due to its rapid diffusion by turbulent mixing. Smoke can be produced from many sources, but it is essentially generated by either the smoke-tube or the smoke-wire method.

In the smoke-tube method, vaporized oil is used to form a visible whitish cloud of small particles as the hot oil vapor condenses. Consideration must be given to the vortices shed by the smoke probe itself, since the most visible small-scale features often arise from the probe’s own wake. This effect is important for probe diameter-based Reynolds numbers exceeding about 15. Because it is impractical to reduce the probe diameter beyond a point and still get a reasonable amount of smoke flow, the smoke can be injected upstream of a convergence section to eliminate wake effects. A key disadvantage of the smoke-tube method is that the smoke is produced hot and rises due to its own buoyancy, so it doesn’t follow the local flow faithfully. To reduce buoyancy effects, the smoke can be cooled in a long length of tubing from the point of generation before its introduction into the flow.
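The Reynolds number criterion above can be checked with a quick calculation. The sketch below, using a nominal room-temperature viscosity for air (an assumption, not a value from the article), shows why shrinking the probe is impractical: at typical test velocities the diameter needed to stay below Re = 15 is a fraction of a millimeter.

```python
# Sketch: checking whether a smoke probe's own wake will contaminate the
# visualization. Per the text, wake effects matter once the
# probe-diameter Reynolds number exceeds about 15.

NU_AIR = 1.5e-5  # kinematic viscosity of air, m^2/s (approx., assumed)

def probe_reynolds(velocity, diameter):
    """Reynolds number based on probe diameter: Re = V * d / nu."""
    return velocity * diameter / NU_AIR

def max_probe_diameter(velocity, re_limit=15.0):
    """Largest probe diameter (m) keeping Re below re_limit."""
    return re_limit * NU_AIR / velocity

# At a 1 m/s test-section velocity:
print(probe_reynolds(1.0, 0.003))     # a 3 mm probe gives Re = 200
print(max_probe_diameter(1.0) * 1e3)  # ~0.23 mm keeps Re below 15
```

A 3 mm probe is more than an order of magnitude over the threshold, which is why injecting the smoke upstream of the convergence section is the more practical fix.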

In the smoke-wire method, smoke is generated as a sheet by coating a thin wire with oil, stretching it across the flow, and heating it with a pulse of current. Almost any wire and power supply can be used in this technique. The oil should be chosen carefully to have a broad boiling plateau, rather than a single temperature, in order to generate good smoke. Model train oil is suitable for this method.

The end result of board level flow visualization is PCBs that are thermally optimized and require no re-spin because of thermal constraints. If the board layout is thermally optimized, heat sinks and other cooling solutions are often not required.

Click here for part 2 of this three part series
Click here for part 3 of this three part series

Why Use Research Quality Instruments?

The life expectancy of most products is estimated at some point prior to their introduction. Reliability analyses are an integral part of the design cycle of a product. In all reliability calculations, temperature is the key driver. The predicted life span from these calculations is often the deciding factor for introducing the product or investing more resources in redesign.

The questions that linger are: to what level of accuracy can we determine the temperature magnitude, and what is the impact of temperature uncertainty on the predicted reliability (i.e., the expected life of the product)?

When a system is operating, it incessantly experiences temperature and power cycling. Such fluctuations, resulting from system design and operation or from complex thermal transport in electronic systems, create large bandwidths in temperature response. Whether in the course of an analysis or compliance/stress testing, we often overlook the accuracy with which temperature is measured or calculated. Yet to truly obtain an adequate measure of a system’s reliability in the field, such temperature data is essential.

The CLWT-115 wind tunnel produces warm air flows for thermal studies

To demonstrate the impact of temperature on reliability, consider the two models commonly used in practice. The Arrhenius model [1], often referred to as Erroneous, is perhaps the most broadly used model in the field. Equation 1 shows the reaction rate (failure rate) k and the acceleration factor AT. kB is the Boltzmann constant (8.617 × 10^-5 eV/K) and Ea is the activation energy. All temperatures are in Kelvin. Activation energy depends on the failure mechanism and the materials (for example, 0.3–0.5 eV for oxide defects, and 1.0 eV for contamination).

k = A exp(-Ea / kB T)

AT = k(T2) / k(T1) = exp[(Ea / kB)(1/T1 - 1/T2)]   [1]

The second model, Eyring, often referred to as More Erroneous, is shown by Equation 2.

k = A T exp(-Ea / kB T)

AT = (T2 / T1) exp[(Ea / kB)(1/T1 - 1/T2)]   [2]

The data shows that the uncertainty band is between 7% and 51%. These numbers by themselves are alarming, yet they are commonly encountered in the field. In either case, Stand-Alone or Device-In-System, being able to accurately determine the temperature or air velocity in a highly three-dimensional thermal transport environment is not a task to be treated casually.

To gauge the impact of such uncertainty on the reliability prediction, it’s best to calculate its effect on the acceleration factor AT.

Let us consider the case when:

T1 = 40°C

T2 = 150°C

Ea = 0.4 eV

kB = 8.617 × 10^-5 eV/K

This results in AT = 48. Now, let us impose a 10% and 35% uncertainty on the temperature measurement of T2. Table 1 shows the result of this error on the acceleration factor.

[Table 1: Effect of temperature measurement error on the acceleration factor AT]

Table 1 clearly demonstrates how a small degree of uncertainty in temperature measurement can negatively impact the acceleration factor and, thus, the reliability predictions where AT is often used. The first row shows the correct temperature. The second row shows the result of a 10% error in temperature measurement (i.e., 165°C instead of 150°C). The last row shows the impact of a 35% error (i.e., 202°C vs. the 158.6°C that the device is actually experiencing). The end result of this error in measurement is a 230% error in the acceleration factor.
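The sensitivity described above is easy to reproduce. The sketch below evaluates the Arrhenius acceleration factor of Equation 1 for the stated conditions and for 10% and 35% errors in T2; the function name is mine, but the constants and temperatures come from the text.

```python
import math

# Sketch: Arrhenius acceleration factor (Equation 1) and its
# sensitivity to errors in the measured temperature T2.

KB = 8.617e-5  # Boltzmann constant, eV/K

def accel_factor(t1_c, t2_c, ea=0.4):
    """Acceleration factor AT between temperatures given in Celsius,
    with activation energy ea in eV."""
    t1, t2 = t1_c + 273.15, t2_c + 273.15
    return math.exp((ea / KB) * (1.0 / t1 - 1.0 / t2))

at_true = accel_factor(40.0, 150.0)  # ~48, as stated in the text
at_10 = accel_factor(40.0, 165.0)    # 10% error in T2
at_35 = accel_factor(40.0, 202.0)    # 35% error in T2

print(f"AT correct:        {at_true:.0f}")
print(f"AT with 10% error: {at_10:.0f}")
print(f"AT with 35% error: {at_35:.0f} "
      f"({(at_35 / at_true - 1) * 100:.0f}% error in AT)")
```

Because T2 sits inside an exponential, the 35% temperature error inflates AT by roughly 230%, matching the figure quoted above.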

One may think such an error is rare, but the contrary is true! In a simple device case-temperature measurement, the temperature gradient from the die to the edge of the device could exceed 20°C. Or the air temperature variation in a channel formed by two PCBs could exceed 30°C. Of course, there are variations due to geometry, material and power dissipation in any electronics system. If we add to these the effects of improperly designed instruments, the combination of physical variation and instrument error could certainly be detrimental to a product’s launch.

Longevity and life cycle in the market are keys to a product’s success. Therefore, to determine system performance, a reliability analysis must be performed. Since time is of the essence, and first-to-market is advantageous, the quickest reliability prediction models (and analyses in general) will continue to be popular. To make such models meaningful, the use of Equations 1 and 2, or of more sophisticated alternatives, must include accurate component and fluid temperature data. Measurement is heavily relied upon for temperature and air velocity determination. It is imperative to employ instruments designed for use in electronics systems with the highest level of accuracy and repeatability. High-grade instruments with quality output will enhance the reliability of the product you are working on.

SUMMARY

Small errors in temperature and air flow measurements can have a significant effect on reliability predictions. The origin of these errors lies in the measurement process or in the use of inaccurate instruments. The former depends on the knowledge base of the experimenter; that is why a good experimentalist is an even better analyst. You must know where to measure and what variations exist in the field of measurement. Electronics system environments are notorious for such variations. It is repeatedly seen that, in one square centimeter of air flow passage between two PCBs, temperature variations can exceed 30°C. Therefore, measurement practices and instrument selection must address these changes and not introduce further errors through inferior design. Beyond its design, an instrument’s construction and calibration should not introduce additional errors either. Accurate, high-quality instruments are not only essential for sound engineering practice; their absence will adversely impact the reliability predictions of the product at hand. No company wants to have its products returned, especially because of thermally induced failures.

References:

1. Klinger, D., Nakada, Y., and Menendez, M., AT&T Reliability Manual, Van Nostrand Reinhold, 1990.

2. Azar, K., The Effect of Uncertainty Analysis on Temperature Prediction, Therminic Conference, 2002.

Touch Screen Scanner Simplifies Temperature and Air Velocity Measurement for Thermal Characterization

The ATVS-NxT hot wire anemometer is a fully portable scanner that provides rapid and highly precise temperature and air velocity measurements for the thermal characterization of electronic packages. The ATVS-NxT is a member of our family of temperature and velocity measurement instruments, along with the ATVS-2020 and eATVS-8, but its touch screen and embedded PC give it truly unique capability.

The unique touch screen feature allows users to control the stageVIEW software at their fingertips, quickly acquiring temperature and air velocity data, analytics, and reports. It includes an 8 GB hard drive and a CD/RW drive. Ethernet connections allow the scanner to be operated over an intranet or the Internet.
Watch the quick 30 second video showing the ATVS-NxT touch screen feature

The ATVS-NxT supports up to 32 sensors for simultaneous air velocity and temperature measurements in environments where single-point or multipoint measurements are required. The sensors are calibrated for both low (natural convection) and high velocity flow rates.

The easy-to-use ATVS-NxT collects heat and airflow data useful to heat sink manufacturers, IC houses, board designers, and other electronics manufacturers. The scanner’s single-point measurement of temperature and air velocity provides faster, more precise data, enabling more efficient thermal management solutions. Visit qats.com to learn more or to request a quote.