Top 5 laser diffraction data quality challenges and how Data Quality Guidance is here to help

Introduction

Accurate particle size analysis is crucial in all scientific disciplines and industrial processes. Whether you work in pharmaceutical formulation or environmental monitoring, the quality of your data directly impacts how smoothly your work plan progresses. With poor background and/or sample data, there is an increased likelihood of recording an incorrect particle size result. This can compromise product or process quality, increase the time and effort spent troubleshooting your analysis, and add business and environmental costs through the dispersant and sample wasted on repeat measurements.

Data Quality Guidance is a tool available for use on the Mastersizer 3000 to make it easy for anyone to achieve accurate and reliable data. This add-on feature immediately flags if you’re experiencing issues with your data. It also provides clear information on the potential cause and outlines the next steps you should take to resolve these issues. And all this feedback is provided live within the measurement window, so no time is wasted on analyzing the quality of your data. With this tool, you can spend less time troubleshooting issues and more time using your advanced knowledge working with results you are confident in.

Top 5 data quality challenges

To achieve meaningful results, a laser diffraction measurement requires meticulous attention to detail throughout the background and sample measurements, as each step demands a careful evaluation of data quality. Some of the most common challenges include:

  1. Sample preparation - inconsistent or poor sample preparation can lead to inconsistent and inaccurate particle size measurements.
  2. Contamination - contaminants in the dispersion, such as residual particles left from a previously measured sample or air bubbles, can lead to inaccurate results.
  3. Optical alignment issues - misalignment of the laser, for example due to dirt on the cell windows, can distort measurement results. Consistent alignment checks during background measurements are necessary to maintain data quality.
  4. Dispersant instability - fluctuations in the dispersant environment, such as temperature, can impact data quality. Ensuring stability of the dispersant by verifying that the background is stable, without large fluctuations, is essential.
  5. Sample instability - particles suspended in the dispersant might not be stable. For example, during the measurement particles may aggregate or settle out, leading to an unstable measurement. Adequate dispersion and measurement parameters chosen during method development are crucial to mitigate this challenge.

How to avoid poor data

Fortunately, it is often possible to detect potential issues early in the measurement. Achieving good data requires two main components: a clean and stable background, showing a progressive decrease across the detector range, as shown in Figure 1, and a stable sample measurement.

[AN231108 Fig1.png] AN231108 Fig1.png
Figure 1. Example of a good background on the Mastersizer 3000

To achieve a clean and stable background, we must verify that:

1. No hump is observed in the data, by ensuring a clean optical path:

  • No contamination on the cell windows or optical protection window.
  • The laser is well aligned.

2. No intermittent peaks appear across the detectors, by ensuring the dispersant is free from contaminants, e.g., particles or bubbles.

3. No unexpected large fluctuations occur, by ensuring a stable dispersant, e.g., with the dispersant temperature equilibrated to the system temperature.

To achieve a stable sample measurement, we must verify that:

1. The laser obscuration is suitable for the measured particle size:

  • Enough sample is added to achieve a good signal-to-noise ratio.
  • Not too much sample is added, as this can cause multiple scattering.

2. Minimal negative data is recorded.

3. Dataset variability is minimal:

  • The sample is not dissolving, dispersing, agglomerating or sedimenting.
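The background criteria above can be sketched as a few simple checks on repeated background snapshots. This is an illustrative sketch only: the detector counts, thresholds (`SPIKE_FACTOR`, `MAX_DRIFT_PCT`) and warning texts are assumptions for demonstration, not the Mastersizer 3000's actual algorithms.

```python
# Illustrative thresholds -- assumed for this sketch, not the instrument's criteria.
SPIKE_FACTOR = 3.0      # a detector this many times its neighbours' level counts as a spike
MAX_DRIFT_PCT = 20.0    # max allowed % change in total background between snapshots

def check_background(snapshots):
    """Run simple quality checks on repeated background snapshots.

    `snapshots` is a list of lists: each inner list holds the background
    light energy per detector (inner detectors first).
    Returns a list of human-readable warnings (empty = background looks OK).
    """
    warnings = []
    latest = snapshots[-1]

    # 1. A clean background decreases progressively across the detector range;
    #    a sustained rise ("hump") suggests dirty cell windows or misalignment.
    rises = sum(1 for a, b in zip(latest, latest[1:]) if b > a)
    if rises > len(latest) // 4:
        warnings.append("Hump in background: check cell windows and laser alignment.")

    # 2. Intermittent peaks suggest contamination (particles or bubbles).
    for i in range(1, len(latest) - 1):
        neighbours = (latest[i - 1] + latest[i + 1]) / 2
        if neighbours > 0 and latest[i] > SPIKE_FACTOR * neighbours:
            warnings.append(f"Spike at detector {i}: dispersant may be contaminated.")
            break

    # 3. Large fluctuations between snapshots suggest an unstable dispersant
    #    (e.g. its temperature has not equilibrated with the system).
    totals = [sum(s) for s in snapshots]
    drift = 100.0 * (max(totals) - min(totals)) / (sum(totals) / len(totals))
    if drift > MAX_DRIFT_PCT:
        warnings.append("Background unstable: let the dispersant temperature equilibrate.")

    return warnings
```

A clean, stable, progressively decreasing background returns no warnings; a snapshot with a single contaminated detector triggers the spike check.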

However, even with the best knowledge and experience, data quality issues can easily go undetected. If this happens, the wrong particle size distribution might be obtained for a sample, which could impact batch release and sample specification, or cause inconsistent results due to lower accuracy and/or precision.

In this application note, we will explore how the Mastersizer 3000 Data Quality Guidance tool ensures poor-quality data is detected and resolved promptly, simplifying or eliminating the challenges faced by the user.

Case study 1: Poor background due to contamination from previous sample

One very common issue that often goes undetected is large fluctuations in the background due to contamination from a previous sample or air bubbles. Samples prone to this include metal powders, paints and lactose. A user can easily miss such an issue, as detecting it requires vigilance: keeping a close eye on the background fluctuations and on the dispersant within the accessory unit during measurement.

In this example, a background measurement was completed. Upon completion, the data quality tab turned yellow, as shown in Figure 2, indicating that a new message was available to view.

[AN231108 Fig2.png] AN231108 Fig2.png
Figure 2. The Data Quality tab's yellow indicator during measurement.

When the data quality tab is selected, a message indicating a poor background is shown. The tool uses a range of algorithms to analyze the light scattering data, allowing it to detect a comprehensive range of common background data quality issues. If an issue is detected, the user is presented with a list of potential causes, ordered from most to least likely, and given guidance on how to resolve it, as shown in Figure 3.

[AN231108 Fig3.png] AN231108 Fig3.png
Figure 3. The Data Quality tab following background measurement, showing the message for an unstable, poor background.

Case study 2: Poor sample data

For each individual sample measurement, the tool performs data quality checks for obscuration level, alignment, negative data, data fit, optical model and fine powder mode. As with the background quality check, once an issue is detected, the user is informed of the cause and given guidance on how to resolve it.

In this case study, we measured a yellow ink sample where too much sample had been added to the dispersion unit, with a laser obscuration of 11.42% and a mean Dv50 of 0.043 μm across 6 measurement records. For this sample, the Data Quality Guidance tool detected high obscuration, as shown in Figure 4. During measurement, the obscuration and particle size are checked to determine the appropriate sample concentration, and a warning is displayed if too much or too little sample is added.

[AN231108 Fig4.png] AN231108 Fig4.png
Figure 4. Data quality tab during measurement, showing the message for high obscuration.
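The concentration check described above amounts to testing whether the obscuration falls in a range suited to the particle size. The sketch below illustrates the idea; the ranges and the 10 μm fine-sample cut-off are illustrative assumptions, not the tool's actual thresholds.

```python
# Illustrative obscuration limits -- assumed for this sketch, not the
# Data Quality Guidance tool's actual thresholds.
FINE_LIMIT_UM = 10.0   # below this size, multiple scattering has a larger impact

def check_obscuration(obscuration_pct, dv50_um):
    """Flag an unsuitable sample concentration from the laser obscuration.

    For fine samples (Dv50 below ~10 um), a lower upper limit is applied,
    since multiple scattering exaggerates fines in the result.
    Returns None if the concentration looks suitable, else a warning string.
    """
    low, high = (0.5, 5.0) if dv50_um < FINE_LIMIT_UM else (5.0, 20.0)
    if obscuration_pct < low:
        return ("Low obscuration: add more sample to improve the "
                "signal-to-noise ratio.")
    if obscuration_pct > high:
        return ("High obscuration: too much sample; multiple scattering "
                "may skew the distribution towards fines.")
    return None

# The yellow ink from this case study: 11.42% obscuration at Dv50 = 0.043 um
print(check_obscuration(11.42, 0.043))   # flags high obscuration
print(check_obscuration(2.45, 0.043))    # within the assumed range -> None
```

Under these assumed limits, the 11.42% measurement is flagged while the 2.45% remeasurement passes, matching the behavior reported in the case study.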

Impact of the high obscuration message on our data

The Data Quality Guidance advice was followed, and the yellow ink sample was remeasured at a lower obscuration of 2.45%. This time, the tool did not detect any issues. To understand the impact that the change in obscuration had on our particle size distribution results, we can compare the two datasets. Figure 5 shows that a smaller particle size distribution is measured at the higher obscuration: the results graph (a) shows a skew in the distribution, and lower mean Dv10, Dv50 and Dv90 values are measured at 11.42% obscuration compared to 2.45% (b). With too much sample, multiple scattering causes exaggerated fines to be interpreted within the data. The impact of multiple scattering is more significant for samples smaller than 10 μm.

(a)
[AN231108 Fig5-a.png] AN231108 Fig5-a.png
(b)
[AN231108 Fig5-b.png] AN231108 Fig5-b.png
Figure 5. Comparison between the higher (11.42%) and lower (2.45%) obscuration measurement results on (a) the particle size distribution graph and (b) the measurement records' Dv10, Dv50 and Dv90 values.

Case study 3: Poor sample stability

Once the full sequence of sample measurements is completed, the Data Quality Guidance tool performs dataset variability checks in accordance with ISO and USP standards. If the percentage relative standard deviation (%RSD) values are outside the ISO and/or USP limits, this is indicated within the data quality tab, as shown in Figure 6.

[AN231108 Fig6.png] AN231108 Fig6.png
Figure 6. Data quality guidance dataset variability analysis
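The variability check boils down to computing the %RSD of the repeat Dv values and comparing against the standard's limits. As a rough sketch, the limits below follow ISO 13320's repeatability criteria (3% RSD for Dv50, 5% for Dv10/Dv90, doubled when the median size is below 10 μm); check the current edition of the standard, and USP <429>, before relying on these numbers.

```python
from statistics import mean, stdev

def rsd_percent(values):
    """Percentage relative standard deviation of repeat measurements."""
    return 100.0 * stdev(values) / mean(values)

def check_variability(dv10s, dv50s, dv90s):
    """Compare the %RSD of repeat Dv values against ISO 13320-style limits.

    Limits assumed here: 3% RSD for Dv50 and 5% for Dv10/Dv90, doubled
    when the median size is below 10 um (verify against the standard).
    Returns {percentile: (rsd, limit, passed)}.
    """
    limits = {"Dv10": 5.0, "Dv50": 3.0, "Dv90": 5.0}
    if mean(dv50s) < 10.0:                      # fine sample: limits are doubled
        limits = {k: 2 * v for k, v in limits.items()}
    results = {}
    for name, values in (("Dv10", dv10s), ("Dv50", dv50s), ("Dv90", dv90s)):
        rsd = rsd_percent(values)
        results[name] = (round(rsd, 2), limits[name], rsd <= limits[name])
    return results
```

For example, three repeats with Dv50 values of 50.0, 50.5 and 49.5 μm give a 1% RSD, comfortably inside the assumed 3% limit.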

For additional sample data stability checks, the user can also input their own manual %RSD limits, which are available within the Data Quality Guidance report tab post-measurement (Figure 7).

[AN231108 Fig7.png] AN231108 Fig7.png
Figure 7. Manual %RSD limits check available post-measurement within the data quality report tab

Conclusion

With prompt data quality feedback available during the measurement, you will be the first to know if something isn't right with your data, so you can fix it early and avoid mistakes in your results. You will spend less time manually identifying or troubleshooting issues, and more time working with results you're confident in. Not only will you save time and effort, but you will also waste less sample and dispersant by avoiding multiple repeat measurements.

For an expert user, the tool supports you in verifying your data quality; for a novice user, it guides you through the measurement like a built-in training session – your results will be more accurate this time, and you'll know the pitfalls to avoid next time!

With Data Quality Guidance, it is now easier than ever to achieve accurate and reliable data.
