Originally published as a Consultant's Connection
column in Pro AV Magazine
  August 2002

Does Testing Make the Grade?

Wouldn't it be great if consultants, integrators and owners all knew what testing and documentation to expect on every project?

By Tim Cape, CTS-D

The consultant-led design-bid-build process can be a long and perilous journey. Along the way, countless issues arise, both large and small, that require attention and resolution before the job is done. Though consultants have plenty to do early on, getting the base building infrastructure designed and creating system drawings and specifications, it's the construction administration/installation/checkout phase of the project that usually brings the most headaches. One of the challenges in this part of the project is component and system testing. Testing is almost always required in the specifications; the problem is whether it actually gets done, and whether anyone notices.

To test, or not to test?

In the relatively short history of AV consulting and integration, consultant specifications have been based on the architectural model used to specify building materials. This doesn't quite fit our industry, but consultants have tried hard over the years to make it work. Testing usually falls under Part 3 of the specifications (the aptly named “Execution” section), but it has never quite fit. Compared to bricks-and-mortar product testing requirements, there's simply a lot more to be said about AV testing, and there isn't a concise list of reference standards like there is for many building materials and assemblies.

In addition, about ten years ago, when most of the projects consultants were designing were smaller (in the hundreds of thousands of dollars, or less, rather than the millions), it was humanly possible to test everything: every wire, every component, every input to every output at every step of the way. But was it all needed? Well, according to the spec it was. Maybe it was all done and maybe not, but it could be done. Many of today's specifications haven't been seriously re-evaluated and updated for multimillion-dollar projects, and the testing requirements of a few years ago aren't feasible at that scale.

Worse yet, some consultants are inconsistent and don't always enforce the testing required in the specification document, and some contractors know it. So what's an integrator to do at bid time? One bidder knows that a particular consultant never enforces the elaborate testing called for in the spec and lowers his bid. Another doesn't know this and pads his bid heavily to cover it. If the low bidder gets the job and he's right, the other “naïve” bidders lose the job, the owner saves some money, the consultant has less paperwork to do, and maybe a few problem components get missed. If the low bidder is wrong and the specified testing is demanded, the integrator loses money and/or a lot of heated discussion ensues. Either way, this is no way to run a railroad.

The Need to Know

The intent of contract-mandated testing is multifaceted. Consultants, as the owner's representative, want to know that the cabling is installed without damage, that the equipment arrived at the integrator's shop in operable condition, that the racks are fully functional and meet spec before shipping, and that the system is completely operational before the final check-out begins. That can be a lot of testing. Does all of it ever happen when it's specified on a larger project? Probably not.

These days there is a lot to know about a system, particularly one that costs more than $1 million. The question is, how much do we need to know? What tests are the ones that should be required in a specification? Ultimately, what we really want is the system to perform to suit the purposes determined initially in the program or needs analysis phase.

At one end of the spectrum is all-encompassing testing of every element of the system at various points during the installation phase. If that isn't feasible on a given project, why specify it? The other end of the spectrum (short of no testing at all) is to walk in at the end of the job and make a subjective judgment: Does it sound good enough? Does it look good enough? Maybe that isn't enough, and it certainly isn't something that can be enforced in a contract with the owner. So what's a reasonable “minimum” set of tests that is actually workable?

One Opinion (at least for now)

At our firm, we have been reconsidering system testing requirements over the past few years. Our current approach is to require a report of a basic set of tests from every system input to every system output once the system is assembled, which in theory exercises all the cabling and equipment in between. Only if a problem is found at this point does the contract call for more detailed component testing. Of course, even though it's not required by contract, it would behoove the prudent contractor to do more cable and component testing before the end of the job.
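The input-to-output sweep described above can be sketched in a few lines of code. This is only an illustration, not any firm's actual procedure; the `check_path` stub and the device names are hypothetical stand-ins for whatever signal verification (test tone, test pattern, measured level) the specification actually requires:

```python
from itertools import product

def check_path(source, destination):
    """Hypothetical stand-in for a real end-to-end signal check, e.g.
    routing a test tone or test pattern from source to destination and
    verifying it arrives within tolerance. Here it simply passes."""
    return True  # a real check would measure the received signal

def run_io_matrix(inputs, outputs, check=check_path):
    """Exercise every input-to-output path once and log pass/fail.
    Returns the failed paths, so detailed component testing can be
    limited to the paths that actually showed a problem."""
    failures = []
    for src, dst in product(inputs, outputs):
        ok = check(src, dst)
        print(f"{src} -> {dst}: {'PASS' if ok else 'FAIL'}")
        if not ok:
            failures.append((src, dst))
    return failures

failures = run_io_matrix(["DVD", "PC", "Doc Cam"],
                         ["Projector", "Program Audio"])
```

The logged matrix doubles as the test report, and the returned failure list is the trigger for the deeper, contract-mandated component testing.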

It will take all of us working within the industry to make this part of the AV integration process better. We need a coherent set of standards, much like the Data/Telecom industry has. We could use new AV test equipment that runs tests and logs the results automatically, making testing more efficient and standardized. It would also help to have some agreement among contractors and consultants on what testing should be done and how.

Wouldn't it be great if consultants, integrators and owners all knew what testing and documentation to expect on every project? That's the way it is for much of the Data/Telecom world, but not for us.

© 2014 Technitect, LLC All Rights Reserved