What Happens in the Shure Product Validation Lab

Shure has a longstanding reputation for rigorous product testing dating back to World War II. In 2014, we added another quality control function to our arsenal: the Product Validation Lab. Read on to find out what they do (and why they need a SweatBot).

How is the Product Validation Lab different from Corporate Quality Engineering?

Ahren Hartman: CQE is designed to test and ensure long-term reliability. Our products are used in the field for decades at a time, so that's really important. By contrast, we're focused on the typical workflows of the users: what do they do with the product? What are their experiences? In what environments are they using the product? What's it like to set up the product? How does it perform? What's it like to tear it down and ship it on a regular basis?

Mark Gilbert: Our CQE tests came to us many years ago from the military, from user experiences on ships. There were standards they had to pass in order to be qualified for use in that environment. What we're looking at is how users today interact with products, which is a little different from how it was during World War II on ships. So, we're like the leading edge of product testing, from the research and development side. But if we come up with something that we think is a really good test that should be done on a lot of products, then it'll become part of the CQE testing process, and we'll move on to something new and wacky.

AH: Another big differentiator is that a new product usually gets CQE approval at the end of the project. We're involved at the beginning of projects, and during the design, so we have an opportunity to provide feedback to the product design team and influence the design.

Are your tests Pass/Fail like theirs?

MG: Unlike CQE, we don't have a Pass/Fail system. We'll tell you how the product behaved when used in a certain manner, but we don't say whether it passed or failed. We just say what happened.

AH: We focus more on putting ourselves in the shoes of the users, whereas CQE is looking at long-term environmental wear. We're always asking the question, "What if the user does this to the product?"

Rex Balcita: We also make sure that the data we gather is validated and repeatable so that if anybody on a product design team has questions, we can explain exactly what happened when we used the product a certain way.

AH: A good example of this is the SweatBot test. Degradation from sweat generally happens over a long time. You don't typically damage a product with sweat overnight. So we test how a product performs over many, many wet/dry cycles with acidic sweat. We study how the product materials hold up, quantify that, look at the trends in the data, and compare that to competitor products. Pass/Fail tests don't tell you all that. They only tell you whether your product passes or fails. They don't tell you why it failed, when it failed, or under what conditions it failed.
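
To make that contrast concrete, here is a minimal sketch, in Python, of the kind of cycle-by-cycle trend analysis described above: log a wear metric at every wet/dry cycle and look at its slope rather than a single threshold. The metric, cycle counts, and numbers are all invented for illustration; this is not Shure's actual SweatBot protocol.

```python
# Illustrative only: hypothetical wet/dry-cycle trend analysis.
# The wear metric (contact resistance) and every value below are
# invented; they are not real SweatBot measurements.
from dataclasses import dataclass
from statistics import linear_regression  # Python 3.10+

@dataclass
class CycleReading:
    cycle: int             # wet/dry cycle number
    resistance_ohm: float  # hypothetical material-wear metric

def degradation_slope(readings: list[CycleReading]) -> float:
    """Drift of the metric per cycle: the trend that a bare
    Pass/Fail result can't report."""
    slope, _intercept = linear_regression(
        [r.cycle for r in readings],
        [r.resistance_ohm for r in readings],
    )
    return slope

# Compare a made-up product trend against a made-up competitor.
ours = [CycleReading(c, 0.50 + 0.002 * c) for c in range(0, 200, 10)]
theirs = [CycleReading(c, 0.50 + 0.007 * c) for c in range(0, 200, 10)]
print(f"ours:   {degradation_slope(ours):.4f} ohm/cycle")
print(f"theirs: {degradation_slope(theirs):.4f} ohm/cycle")
```

A slope, and where it starts to steepen, tells a design team when and how a part degrades, which is exactly what a binary result hides.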

How did the need for the PVL become apparent?

Eric Miedema: We're producing more products and more complex products than we used to. If we set up use cases with all the moving parts, then we'll discover things to fix during the design process, before they become issues in the field. That's part of the justification for a lab where you can put all those products through a full user setup and run through as many use cases as possible to really stress the products the way a user would.

AH: Another thing that's really driving the need for this lab is that audio products are no longer stand-alone things. They have to work with other products. Say you have an array mic that connects to a corporate network, or you have wireless systems that connect to each other on a big digital audio network. In those cases, the design engineers might not even be the right people to test interoperability. You're no longer just testing a microphone. You're testing systems of interconnected products, which is a whole other level of user experience.
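
As a toy illustration of why that is a different job: a rig with n devices has n(n-1)/2 pairings, and the engineer who designed one box typically exercises only a few of them. Below is a minimal Python sketch, with invented device names and an invented failure, of walking every pairing in a setup.

```python
# Hypothetical systems-level interop pass: exercise every pairing in
# a rig instead of testing each product in isolation. Device names
# and the failing pair are invented for illustration; real networked
# audio gear would be driven through its own control protocols.
from itertools import combinations

RIG = ["array mic", "wireless receiver", "network switch", "DSP"]

def link_check(a: str, b: str) -> bool:
    """Stand-in for a real check (discovery, audio streaming, clock
    sync). Simulated here so the sketch is self-contained."""
    # Pretend the DSP and the wireless receiver fail to sync clocks.
    return {a, b} != {"DSP", "wireless receiver"}

# Walk every pairing, the way a user's full setup would exercise them.
for a, b in combinations(RIG, 2):
    status = "ok" if link_check(a, b) else "FAILED"
    print(f"{a} <-> {b}: {status}")
```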

How do products end up in the PVL? What's the process?

MG: The more complex a product is, and the newer its technology, the more likely it is to end up in the PVL.

AH: Often, teams will walk in with a test they're not sure how to perform and ask for help. We'll say, "Absolutely, let's talk about what you're trying to learn and what you need."

We're a small team, so we can't touch everything, but we hope to be involved in more projects going forward, and we aim to help anyone who asks. And if CQE needs help with something, we'll partner with them. Sometimes we'll work with sales, product support, or market development if they've got a field issue they can't debug.

Is it only internal teams that come to you with requests, or do you work directly with customers?

AH: Occasionally, the service department will come to us with an issue from an external customer, especially if it's something they've seen multiple times. We'll test it out and report back to them, and they'll share the findings with the customer.

RB: We get emails from marketing, often about earphones connecting to iPhones. Recently we got one about a customer who wanted to make sure that our VP mics would work with a system he had, so we set it up, tested it, and let the customer know the results. So, occasionally, we'll interact directly with outside customers, but those folks usually come to us through our market development team.

AH: Yeah, the general public doesn't know about us.

AW: Yet…

What are you most proud of that the lab has accomplished since opening?

Anas Lakhal: I would say the SweatBot, because it's allowed us to study exactly what happens to earphones and microphones when users sweat, which is very hard to quantify. We're using this information in the next generation of earphone and microphone designs.

EM: The project teams are approaching us unprompted with requests for help. I'm proud to have built a reputation that they can trust us to help. It's most rewarding for me to see our work integrated into design changes and ultimately influencing the final product.

What's the weirdest testing machine in the PVL?

Everyone: The SweatBot!

MG: We also used to have a rain test, with a showerhead and everything, until the flood. The hose broke one weekend. So, we don't do the rain test in here anymore. We relocated that.

RB: The stage was cool too. We had a whole setup with drums and guitar and amps. People used to ask us, "What, are you guys just jamming out in here?" I also like the truss. You can hang 12,000 lbs on it. It lets us hang lights and test for wireless interference.

Anything else you want people to know about the lab that we haven't covered?

AH: We're here as another voice of the customer. Whether it's engineering, marketing, sales, applications, or service, internal or external, anywhere around the globe, bring us the challenge. We'll bring you some data.

RB: Yeah, mechanical engineering tests, electrical engineering tests, systems tests—we have experience in all of that, so no matter the question, we'll find a way to test it and give you data.