Choosing the Best Device Sample Size for Verification and Validation - Webinar CD/Transcript

$287.00
Quantity Discounts
  1 - 2:   $287.00 each
  3 - 4:   $258.00 each
  5 - 6:   $244.00 each
  7 - 9:   $230.00 each
  10+:     $215.00 each

Choosing the Best Device Sample Size for Verification and Validation: Tools to Safely Speed Your Device to Market

Today's FDA inspectors focus like lasers on your testing methods … how you justify them statistically … and your results.

They want to make sure you're applying the right statistical tools and methods, and that you're using them correctly.

Unclear on what you need to do? The Choosing the Best Device Sample Size for Verification and Validation webinar CD and transcript set will show you.

As a device manufacturer, you know you must answer the two key design verification and validation questions: Did we make the product right? And did we make the right product?

And you know that testing is the only way to effectively answer them.

Finally, your test samples must demonstrate that the results can reasonably be extended to larger production runs without compromising safety.

But how many units should you test? And when you decide, can you statistically justify that decision?

These questions have baffled devicemakers for years. The answer is … well, it depends.

And that's why design control statistical expert Steve Walfish leads a session to help you understand exactly what it does depend on. During his presentation, you will learn:

  • The requirements for statistical techniques and how they affect design control processes (21 CFR 820.30(f) and (g))
  • What types of requirements lend themselves to statistics in verification and validation (hypothesis testing, confidence intervals, design of experiments)
  • How variance in the population can impact the sample size necessary to establish objective evidence (see the sketch following this list)
  • The relationship between risk and sample size (i.e., risk to patient: critical, major, minor)
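
To make the variance point concrete, here is a minimal sketch, assuming a normal population with a known (or pilot-estimated) standard deviation, of the textbook sample-size formula for estimating a mean to within a margin of error E. The 95% confidence level and the numbers are illustrative, not from the webinar; because n grows with the square of sigma, doubling the variability roughly quadruples the required sample size.

```python
# Sample size to estimate a mean within +/- `margin` at a given confidence,
# assuming sigma is known from a pilot study: n = (z * sigma / E)^2.
from math import ceil
from statistics import NormalDist

def sample_size_for_mean(sigma: float, margin: float, confidence: float = 0.95) -> int:
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided z critical value
    return ceil((z * sigma / margin) ** 2)

# Doubling sigma quadruples n (before rounding up):
print(sample_size_for_mean(sigma=0.5, margin=0.25))  # 16
print(sample_size_for_mean(sigma=1.0, margin=0.25))  # 62
```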

You will also gain the fundamental knowledge you need to determine sample size in statistical testing. (For example, a sample size of 3 is not sufficient without justification.)
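
For a quick feel for why n = 3 is weak evidence, the sketch below (an illustration assumed here, not part of Mr. Walfish's materials) computes the half-width of a two-sided 95% t-interval for the mean, expressed as a multiple of the sample standard deviation s. At n = 3 the interval spans nearly +/- 2.5s; by n = 30 it has shrunk to about +/- 0.37s.

```python
# Half-width of the two-sided 95% t-interval for a mean, in units of s.
from math import sqrt
from scipy.stats import t

def ci_half_width_in_sd_units(n: int, confidence: float = 0.95) -> float:
    t_crit = t.ppf(1 - (1 - confidence) / 2, df=n - 1)
    return t_crit / sqrt(n)

for n in (3, 10, 30):
    print(n, round(ci_half_width_in_sd_units(n), 2))
# 3  -> 2.48 (the interval spans nearly +/- 2.5 standard deviations)
# 10 -> 0.72
# 30 -> 0.37
```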

In addition, Mr. Walfish covers the following:

  • Why it is critical to understand how the compliance requirements for design verification and design validation differ
  • Leveraging the statistical methods that work best to satisfy the FDA's requirement for defensible methods
  • How to use proven methodologies to avoid sample sizes that are too small or too large (too small, and you may not accurately determine risk; too large, and you waste time and money)
  • How sample size should optimally be proportional to risk, both business and patient (a sketch of one risk-based approach follows this list)
  • Why it's pointless to try to predict the personal focus of individual auditors, and why the real foundation for a successful audit is the ability to produce a defensible program based on visible standards
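
One widely used way to make an attribute sample size proportional to risk is the zero-failure "success-run" plan, n = ln(1 - C) / ln(R), where R is the reliability to be demonstrated and C the confidence level. The mapping of risk classes to R and C below is a hypothetical policy table for illustration, not a rule from the webinar; each firm must justify its own targets.

```python
# Success-run (zero-failure) attribute plan: test n units, accept on 0 failures
# to demonstrate `reliability` at `confidence`: n = ln(1 - C) / ln(R).
from math import ceil, log

def success_run_n(reliability: float, confidence: float) -> int:
    return ceil(log(1 - confidence) / log(reliability))

risk_levels = {              # hypothetical risk-to-target mapping
    "critical": (0.99, 0.95),   # 99% reliability at 95% confidence -> 299 units
    "major":    (0.95, 0.95),   # -> 59 units
    "minor":    (0.90, 0.90),   # -> 22 units
}
for level, (r, c) in risk_levels.items():
    print(f"{level}: test {success_run_n(r, c)} units, accept on 0 failures")
```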

You'll learn proven tactics and tools to develop a strong statistical methods program, plus a thorough understanding of what FDA auditors look for when they inspect your facility.

Who will benefit:

  • Validation and verification professionals
  • Quality engineering
  • Regulatory Affairs
  • QA/QC
  • Software development, programming, documentation, testing
  • R&D
  • Engineering
  • Production
  • Operations

Steven Walfish is the president of Statistical Outsourcing Services, a consulting firm that provides statistical analysis and training to a variety of industries. Before founding Statistical Outsourcing Services, he was Senior Manager of Biostatistics, Non-clinical, at Human Genome Sciences in Rockville, MD, and before joining HGS he was a senior associate at PricewaterhouseCoopers specializing in the pharmaceutical industry.

Mr. Walfish brings more than 18 years of industrial expertise in developing and applying statistical methods to solve complex business issues, including data collection, analysis and reporting. He has held positions with Johnson & Johnson and Chiron Diagnostics, where he worked with large data sets for process monitoring.