TRIPLETT CTT42 Coating Thickness Tester User Manual
- June 5, 2024
CTT42 Coating Thickness Tester, 5 Commonwealth Ave, Woburn, MA 01801
Phone 781-665-1400
Toll-Free 1-800-517-8431
Introduction
Congratulations on your purchase of the Triplett CTT42 Coating Thickness Tester. The CTT42 is a portable meter designed for non-invasive coating thickness measurements. The meter uses two measurement methods: magnetic induction (for ferrous metal substrates) and eddy current (for non-ferrous metal substrates). Proper use and care of this meter will provide many years of reliable service.
Description
Meter Description
- USB jack for PC interface
- Measurement probe
- Backlit LCD display
- SET /OK /YES /MENU /SELECT button
- UP /LEFT ARROW button
- Power ON/OFF button
- CANCEL /ESC /NO /BACK button (menu) and Backlight ON/OFF (normal mode)
- DOWN ARROW /RIGHT button
- ZERO calibration button
Note: Battery compartment on rear of meter.
Display Icon Description
Icon | Description
---|---
NFE | Non-ferrous metals
Fe | Ferrous metals
AUTO | Automatic substrate recognition
F or N | Calibration icons
DIR | DIRECT mode
GRO1…4 | GROUP mode
µm | Micrometers unit of measure
mils | Mils unit of measure (1 mil = 0.0254 mm = 25.4 µm)
mm | Millimeters unit of measure
AVG | Average reading
MIN | Minimum reading
MAX | Maximum reading
SDEV | Standard deviation reading
NO | Number of data points
Battery icon | Low battery
USB icon | USB connection
Quick Start Instructions
Meter Power
Press the power button to switch the meter ON. The display will switch ON. If the display does not switch ON, replace the batteries.
Measurements
Use the supplied film references and zero reference metal substrates to
learn how the meter operates before moving to a professional application. The
round metal substrate is the ferrous (magnetic) substrate and the rectangular-
shaped metal substrate is the non-ferrous (non-magnetic) substrate. The meter
automatically senses ferrous or non-ferrous substrates.
- Place a reference film (250 µm, for example) on the round, ferrous substrate.
- Place the spring-loaded meter sensor against the reference film.
- In the single mode, the meter will emit an audible tone indicating that the measurement has been taken.
- In the continuous mode, the display will continually measure and update the display
- The LCD will show the reading (250 µm) at the center of the display area.
A typical display will also show the substrate type (Fe or NFE), the working mode, and the unit-of-measure icons described earlier.
Experiment with the remaining reference films and the substrate before using
the meter professionally.
Automatic Power OFF
In order to conserve battery life, the meter will automatically turn off
after approximately 3 minutes. To defeat this feature, use the programming
menu detailed in the next section (menu parameter AUTO POWER OFF under
OPTIONS).
LCD Backlight Button
The LCD is equipped with backlighting for easier viewing, especially in
dimly lit areas. Press the backlight button to turn the backlight on. Press
again to turn the backlight off. To enable/disable the LCD backlighting, use
the BACKLIGHT parameter under OPTIONS in the programming menu.
Factory Default Reset
To restore the meter to its original factory default settings:
- Turn the meter OFF.
- Press and hold the ZERO button while switching the meter ON.
- When the meter powers up, the display will prompt for a YES or NO.
- Press the SET button for YES or the CANCEL button for NO.
- Note that all data in all memory locations will be erased, in addition to statistics, calibration values, and alarm limits.
Programming Menu
The meter can be configured and calibrated through simple button presses in the programming menu. Press the MENU button to access the menu and refer to the menu ‘tree’ below. The menu uses UP/DOWN, SELECT, BACK, and ESC presses for navigation and selection. In the table below, the factory default settings are marked with an asterisk (*). Each parameter is explained in detail in the subsequent sections.
Top-level | Sublevel 1 | Sublevel 2 | Notes
---|---|---|---
STATISTICAL VIEW | AVG | | Average of a series of readings
 | MIN | | Lowest of a series of readings
 | MAX | | Highest of a series of readings
 | NO | | Number of sampled readings
 | SDEV | | The standard deviation of a series
OPTIONS | Measure mode | Single* | One reading at a time
 | | Continuous | Continuous measurements
 | Working mode | Direct* | Readings are not stored in groups
 | | Group 1…4 | Store readings in groups
 | Probe used | Auto* | Meter automatically selects a mode
 | | Fe | Ferrous measurement mode
 | | No Fe | Non-ferrous measurement mode
 | Unit settings | µm* | Micrometers
 | | mils | Mils (1 mil = 0.0254 mm)
 | | mm | Millimeters
 | Backlight | ON* | Enables backlight operation
 | | OFF | Disables backlight operation
 | LCD Statistics | AVG* | Average of a series of readings
 | | MAX | Highest of a series of readings
 | | MIN | Lowest of a series of readings
 | | SDEV | The standard deviation of a series
 | Auto Power OFF | Enable* | Allows auto power off to operate
 | | Disable | Defeats the auto power-off feature
LIMIT | Limit settings | High Limit | High Alarm alerts the user when reached
 | | Low Limit | Low Alarm alerts the user when reached
 | Delete Limits | | Clear the alarm limit values
DELETE | Current Data | | Delete current data
 | All Data | | Delete all stored data
 | Group Data | | Delete stored group data plus alarm and calibration data
MEASUREMENT VIEW | | | View stored data in all groups
CALIBRATION | Enable | | Allow calibration access
 | Disable | | Lock out the calibration mode
 | Delete Zero N | | Clear the zero cal. data (non-ferrous)
 | Delete Zero F | | Clear the zero cal. data (ferrous)
NOTE: Disable the Auto Power OFF feature before lengthy programming to
avoid inconvenient automatic power down while programming.
Note: The unit of measure icon flashes when in the CONTINUOUS mode of
operation. The unit of measure icon is stable when in the SINGLE mode of
operation.
STATISTICAL VIEW Menu
- Press the MENU button to access the programming menu
- Press SELECT to choose STATISTICAL VIEW
- Use the UP and DOWN buttons to scroll through the AVERAGE, MINIMUM, MAXIMUM, NUMBER OF DATA, and SDEV (Standard Deviation) values for the stored readings
- ‘NO DATA’ will display if no readings are available for the meter to analyze. Stored readings will clear when the meter powers down unless the GROUP feature is used (refer to GROUP function explanation later in this section).
- Press the BACK and then the ESC soft keys to return to normal operation.
OPTIONS Menu
- Press the MENU button to access the programming menu
- Use the DOWN ARROW button to scroll down to OPTIONS
- Press SELECT to choose OPTIONS
- Use the UP and DOWN buttons to scroll to the MEASURE MODE, WORKING MODE, PROBE USED, UNIT SETTINGS, BACKLIGHT, LCD STATISTICS, and AUTO POWER OFF parameters. Use the SELECT soft-key to select the desired parameter. Each parameter is detailed below:
Measure Modes
Select CONTINUOUS or SINGLE under MEASURE MODES in the OPTIONS menu using the arrow buttons and the SELECT soft-key.
In the CONTINUOUS measurement mode, the meter displays a running average of readings as they are taken. Note that the audible measurement ‘beep’ is not active in this mode.
In the SINGLE measurement mode, measurements are taken one at a time. Single-mode measurement readings are accompanied by an audible tone.
Working Modes
Select DIRECT or GROUP 1, 2, 3, or 4 under WORKING MODES in the OPTIONS Menu using the arrow buttons and the SELECT soft-key.
In DIRECT mode, individual readings are logged to memory. When power is switched off or if the meter is switched to GROUP mode, all DIRECT readings will be cleared; however, the statistical analysis data will remain. The statistical analysis utility can evaluate up to 80 readings. When the memory is filled, new readings will replace old readings. This mode has its own calibration and alarm limit values.
In GROUP mode, each group memory can store a maximum of 80 readings and 5 statistical values. Calibration and alarm limit values can be individually set and stored for each group. When the memory is filled, measurements will continue to be taken but readings will no longer be logged (previously logged readings are not affected); in addition, statistical data will no longer update. If desired, the group data, statistical values, calibration data, and alarm limit values can be deleted using DELETE in the programming menu.
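The storage rules above can be summarized in a short illustrative sketch. This is only a model of the documented behavior (80-reading capacity, DIRECT overwriting the oldest reading, GROUP stopping when full), not the meter’s actual firmware; the class name and values are hypothetical.

```python
# Illustrative model of the documented DIRECT/GROUP storage rules
# (not the meter's firmware; capacity of 80 readings per the manual).

class ReadingStore:
    CAPACITY = 80

    def __init__(self, mode):
        self.mode = mode          # "DIRECT" or "GROUP"
        self.readings = []

    def add(self, value):
        if len(self.readings) < self.CAPACITY:
            self.readings.append(value)
        elif self.mode == "DIRECT":
            # DIRECT mode: the newest reading replaces the oldest one
            self.readings.pop(0)
            self.readings.append(value)
        # GROUP mode: memory full, the reading is taken but not logged,
        # and the statistics are no longer updated.

direct = ReadingStore("DIRECT")
for i in range(85):
    direct.add(float(i))
print(len(direct.readings))  # 80 -> the oldest 5 readings were overwritten
```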
Probe Used
Select AUTO, Fe, or No Fe under PROBE USED in the OPTIONS menu using the arrow buttons and the SELECT soft-key.
In the AUTO mode, the meter automatically activates the probe measurement method (ferrous or non-ferrous) based on the metal substrate being measured. When the probe is placed on a magnetic substrate, it will work in the magnetic induction mode. If the probe is placed on a non-ferrous metal, it will work in the eddy current mode.
In the Ferrous (Fe) Mode, the Magnetic induction measurement mode is activated.
In the Non-Ferrous (No Fe) Mode, the eddy current measurement mode is activated.
Units of measure selections
Select mm, µm, or mils under UNIT SETTINGS in the OPTIONS menu using the arrow buttons and the SELECT soft-key (mm = millimeters; µm = micrometers; 1 mil = 0.0254 mm = 25.4 µm).
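As a quick worked example of the unit relationships above (the 250 µm value is just the supplied reference film thickness used for illustration), a minimal conversion sketch; the function names are illustrative only:

```python
# Unit conversions used by the meter (1 mil = 25.4 µm = 0.0254 mm).
def um_to_mm(um):
    return um / 1000.0

def um_to_mils(um):
    return um / 25.4

reading_um = 250.0                       # e.g., the 250 µm reference film
print(um_to_mm(reading_um))              # 0.25 mm
print(round(um_to_mils(reading_um), 2))  # 9.84 mils
```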
Backlight
Select ON or OFF under BACKLIGHT in the OPTIONS menu using the arrow buttons and the SELECT soft-key. If OFF is selected, the LCD backlighting will be completely disabled. If ON is selected, the user can turn the light on or off using the backlight button (CANCEL button).
LCD Statistics
Select AVERAGE, MINIMUM, MAXIMUM, or SDEV (Standard Deviation) under LCD STATISTICS in the OPTIONS menu using the arrow buttons and the SELECT soft-key. This selection determines which statistic is shown by default on the LCD display.
Auto Power OFF
Select ENABLE or DISABLE under AUTO POWER OFF in the OPTIONS menu using the arrow buttons and the SELECT soft-key. When enabled, the meter automatically switches OFF after 3 minutes of inactivity. When disabled, the meter will only switch OFF with a button press or when the battery power is weak.
LIMIT Menu
High and Low Alarm Limits can be set. When an Alarm Limit is reached, the meter displays an Alarm Alert icon (H for High Alarm and L for Low Alarm). An illustrative check follows the steps below.
- Press the MENU button to access the programming menu
- Use the DOWN ARROW button to scroll down to LIMIT and press SELECT
- Press SELECT again to choose LIMIT SETTING
- Press SELECT again to choose HIGH LIMIT
- Use the ARROW buttons to set a High Alarm value
- Press OK to store the limit and press BACK to return to the menu
- Perform the same steps for the LOW LIMIT
- Use the DELETE LIMIT parameter to clear Alarm Limit values
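The sketch below illustrates how a reading relates to the High/Low alarm icons. The limit values are hypothetical examples, and the exact comparison the meter uses at the boundary (≥ vs. >) is an assumption.

```python
# Illustrative check of a reading against the alarm limits
# (limit values below are hypothetical examples, in µm).
HIGH_LIMIT = 300.0
LOW_LIMIT = 100.0

def alarm_flag(reading_um):
    if reading_um >= HIGH_LIMIT:
        return "H"   # High Alarm icon
    if reading_um <= LOW_LIMIT:
        return "L"   # Low Alarm icon
    return ""        # no alarm

print(alarm_flag(350.0))  # H
print(alarm_flag(250.0))  # (no alarm)
print(alarm_flag(80.0))   # L
```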
DELETE Menu
The DELETE menu allows for deleting current data, all data, and group data. The following parameters are available in the DELETE menu:
Delete Current Data: Deletes the current reading and updates the statistics (AVG, MIN, MAX, etc.).
Delete All Data: Deletes all readings and statistical data.
Delete Group Data: Duplicates the “Delete All Data” function with additional deletion of the High alarm, Low alarm, and one- and two-point calibration values.
- Press the MENU button to access the programming menu.
- Use the DOWN ARROW button to scroll down to DELETE.
- Press SELECT to open the DELETE function.
- Use the ARROW buttons to scroll to CURRENT, ALL, or GROUP.
- Press SELECT again to choose CURRENT, ALL, or GROUP.
- The meter will prompt with an “Are you sure?” confirmation.
- Press YES or NO as desired.
MEASUREMENT VIEW Menu
The Measurement View menu allows for scrolling through the readings in all of
the groups. Press the MENU button to access the programming menu.
Use the DOWN ARROW button to scroll down to MEASUREMENT VIEW.
Press SELECT to open the MEASUREMENT VIEW parameter. Use the ARROW buttons to scroll through the stored readings.
CALIBRATION Menu
The Calibration menu allows the user to enable/disable the calibration
utility. The Calibration menu also
allows the user to delete Zero calibration data for both ferrous (Zero F) and
non-ferrous (Zero N) modes.
- Press the MENU button to access the programming menu.
- Use the DOWN ARROW button to scroll down to CALIBRATION.
- Press SELECT to open the CALIBRATION parameter.
- Use the ARROW buttons to scroll through the available parameters detailed below.
ENABLE: Enable the calibration mode.
DISABLE: Disable the calibration mode.
DELETE ZERO N: Delete ZERO calibration data for the non-ferrous probe.
DELETE ZERO F: Delete ZERO calibration data for the ferrous probe.
Measurement Considerations
- After calibration, measurements should meet the published accuracy specifications.
- Strong magnetic fields can affect the measurement.
- When using the statistical analysis functions for obtaining a mean value, take several readings of the same measurement area. False readings or outliers can then be removed (deleted) using the programming menu.
- The final reading is derived from a statistical calculation and the meter’s published accuracy specification.
Calibration
Calibration Types
Perform a zero calibration and a multi-point calibration before any
critical measurements are taken. The calibration options are listed below.
Read the description for each and select the best match for a given
application.
- Zero Point Calibration: Perform before each measurement
- One Point Calibration: Use for high accuracy with repeated tests on a constant coating thickness.
- Multi-Point Calibration: Use for high accuracy within a known range of coating thickness.
- Calibration for shot-blasted surfaces: Use for rough, shot-blasted substrates (see Calibrating for Shot-blasted surfaces below).
Calibration Considerations
The calibration sample must correspond to the product sample in the following ways:
- Curvature radius
- Substrate material properties
- Substrate thickness
- Size of measurement area
The point at which the calibration is made on the calibration sample must always coincide with the point of measurement on the product itself, especially in the case of corners and edges of small parts.
To achieve the highest measurement accuracy, perform several calibrations in
succession (for zero values and calibration film values).
Preparing for calibration
- Clean the probe tip (grease, oil, metal scrap, and the slightest impurity will affect the measurement and distort readings).
- Switch the meter ON (at least 10 cm [4 in.] from any metal).
- Ready the supplied metal substrate samples and necessary films (supplied calibration reference films).
- Set the meter to:
- DIR: (MENU-OPTIONS-Working Mode-Direct)
- CONTINUOUS: (MENU-OPTIONS-Measure Mode-Continuous [flashing units indicator])
- MAX: (MENU-OPTIONS-LCD Statistics-Maximum)
- µm: (MENU-OPTIONS-Unit Settings-µm)
- The meter is now ready for calibration.
- You should perform a Zero calibration prior to a thickness calibration.
Zero calibration
- Clean the tip of the meter before calibration.
- Place the meter on an uncoated section of the material to be measured or on the reference substrate. Use either the Ferrous or Non-Ferrous reference as required by the measurement application. If the non-ferrous reference substrate is used, place it on a non-ferrous surface.
- Place the probe on the uncoated substrate and watch the LCD readings.
- When the readings appear stable, lift the meter off the substrate.
- Press and hold the ZERO button for 2 seconds (2 beeps).
- Repeat this procedure at least 5 times to create an accurate mean zero value.
- The meter is now ready for thickness calibration or measurements.
Calibration
The meter’s accuracy can be enhanced by performing calibration using the
reference films supplied.
- Perform a zero calibration as described above.
- Perform a one-point calibration using a reference film:
- Press MENU and then scroll up to CALIBRATION.
- Press SELECT, highlight the ENABLE function, and then press SELECT.
- Press ESC and the meter will enter the calibration mode.
- Place one of the reference films on the substrate used to zero the meter.
- Place the meter on the calibration reference film and wait for the meter to beep.
- Lift the meter off the film.
- Press the UP or DOWN arrow to set the displayed value to the reference film thickness.
- Place the meter back down on the reference film and wait for the meter to beep.
- Lift the meter off the film.
- Press the UP or DOWN arrow to set the displayed value to the reference film thickness.
- Repeat up to 5 times to establish a mean average reading.
- Turn the meter OFF to exit calibration mode and save the calibration data.
- The meter is now calibrated to the base material and thickness film.
- For a two-point calibration, refer to that section below.
Notes:
- During calibration, the UP and DOWN arrow buttons adjust the internal calibration factors with high resolution; it may take up to 10 presses to see a 1-digit change on the display.
- Calibration data is stored in memory and is not erased when the meter is turned off.
Two-point calibration
This method requires the use of two films. The thicker film should be, if
possible, 1.5 times as thick as the thinner one. For best results, the
expected thickness should be between the two calibration values.
This method is especially suitable for taking high-precision readings. It is advisable to repeat the mean measurement several times; this considerably reduces the effect of scattering which occurs during calibration of the upper and lower values.
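The idea behind a two-point calibration can be illustrated with the simple linear correction below. This is only a sketch of the underlying arithmetic, not the meter’s internal algorithm; the film thicknesses and raw readings are hypothetical.

```python
# Two-point linear correction sketch (hypothetical values, in µm):
# map the meter's mean raw readings on two films to their known thicknesses.
thin_known, thin_raw = 125.0, 121.0     # known film value, mean raw reading
thick_known, thick_raw = 250.0, 243.0   # thicker film roughly 1.5-2x the thinner one

slope = (thick_known - thin_known) / (thick_raw - thin_raw)
offset = thin_known - slope * thin_raw

def corrected(raw_um):
    return slope * raw_um + offset

print(round(corrected(180.0), 1))  # corrected value for a raw 180 µm reading
```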
- Perform a Zero-point calibration (previously described).
- Enable calibration mode in the programming menu (CALIBRATION-Enable).
- Perform a One-point calibration (as previously described) on the thinner calibration reference film and make 5 measurements.
- Turn off the meter to save the measured values.
- Turn on the meter and enable calibration mode.
- Perform a One-point calibration using the thicker calibration film and make 5 measurements.
- Turn off the meter to save the measured values.
- Turn on the meter; it is now ready to make measurements.
Calibration Notes:
- The thickness of the calibration films should be roughly equivalent to the estimated coating thickness to be measured.
- Calibration can be performed as often as necessary. Old calibration data will be overwritten but the ZERO calibration data remains in memory until another zero point calibration is made.
Calibrating for Shot-blasted Surfaces
The physical nature of shot-blasted surfaces results in higher than normal coating thickness readings. The mean thickness over the peaks can be determined as follows:
- The meter should be calibrated according to the calibration instructions. Use a smooth calibration sample with the same curvature radius and the same substrate as the device to be tested.
- Take approximately 10 readings on the uncoated, shot-blasted sample to produce the mean value Xo.
- Take an additional 10 readings on the coated, shot-blasted test sample to produce the mean value Xm.
- The difference between the two mean values is the mean coating thickness Xeff over the peaks. The greater standard deviation S of the two series (Xm and Xo) should also be taken into consideration: Xeff = (Xm – Xo) ± S. A worked numeric example follows below.
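A minimal worked example of this calculation (all readings are hypothetical, in µm):

```python
# Worked example of the shot-blasted correction (hypothetical readings, in µm).
from statistics import mean, stdev

uncoated = [28, 31, 26, 30, 29, 27, 32, 28, 30, 29]   # ~10 readings on the bare sample
coated   = [182, 178, 185, 180, 176, 184, 181, 179, 183, 177]

x0, xm = mean(uncoated), mean(coated)
s = max(stdev(uncoated), stdev(coated))   # the greater of the two standard deviations
print(f"Xeff = {xm - x0:.1f} ± {s:.1f} µm")   # Xeff = 151.5 ± 3.0 µm
```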
NOTE: For coatings thicker than 300 µm, the influence of roughness
generally is of no importance, and therefore it is not necessary to apply the
above calibration methods.
Measuring metallic coatings
This meter can measure non-magnetic metal coatings (e.g., zinc) on a magnetic (ferrous) substrate, and non-metal coatings on a metal substrate (ferrous or non-ferrous).
Statistical Analysis Considerations
The meter calculates statistics from a maximum of 80 readings (For Group 1 through Group 4, a maximum of 400 readings can be stored). Note that readings cannot be stored when in DIRECT mode. However, statistics on these readings can still be calculated. When the meter is powered off or if the working mode is changed (in the programming menu), the DIRECT mode statistics will be lost. The following statistical values can be calculated:
- NO.: Number of readings
- AVG: Average value
- SDEV: Standard deviation (square root of a data set’s variance)
- MAX: Maximum reading
- MIN: Minimum reading
Statistical Terms
The average value (X̄) is the sum of the readings divided by the number of readings: X̄ = (x1 + x2 + … + xn) / n.
Standard Deviation (SDEV)
The sample standard deviation is a statistic that measures how the sample values are distributed around the sample mean. The standard deviation of a set of numbers is the square root of the variance S².
The variance of a list is the square of the standard deviation of the list, that is, the sum of the squared deviations of the numbers in the list from their mean divided by (number of readings – 1): S² = Σ(xi – X̄)² / (n – 1).
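A short sketch of these statistics on a hypothetical set of readings, using the sample (n – 1) standard deviation described above:

```python
# Sketch of the meter's statistics on a set of readings (hypothetical values, µm).
# statistics.stdev() is the sample standard deviation with the (n - 1) divisor.
import statistics

readings = [248.0, 252.0, 250.0, 247.0, 253.0]

print("NO.  :", len(readings))
print("AVG  :", statistics.mean(readings))            # 250.0
print("MIN  :", min(readings))                         # 247.0
print("MAX  :", max(readings))                         # 253.0
print("SDEV :", round(statistics.stdev(readings), 2))  # 2.55
```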
NOTE: Use the DELETE parameter from the programming menu immediately
after an outlier or erratic reading has been taken. Refer to the Delete
function in Programming Menu.
Storage Capacity Overflow
In GROUP mode, if the storage capacity is exceeded, statistics will not be
updated, although measurements can still be made. If the memory is full,
subsequent readings will not be included in the statistics. The meter’s
display will show “FULL” (in the SINGLE measurement mode). In DIRECT mode,
when the memory is full, the newest reading will replace the oldest reading
and the statistics will be updated.
Error Messages
The following error messages will appear on the meter’s LCD if a problem
arises.
Err1: Eddy current probe error
Err2: Magnetic induction probe error
Err3: Eddy current and Magnetic induction errors
Err4, 5, 6: Unused error displays
Err7: Thickness error
Please contact Triplett if a problem exists.
Maintenance
Cleaning and Storage
Periodically wipe the meter housing with a damp cloth and mild detergent; do
not use abrasives or solvents. If the meter is not to be used for 60 days or
more, remove the batteries and store them separately.
Battery Replacement/Installation instructions
- Remove the Phillips head screw that secures the rear battery door
- Open the battery compartment
- Replace/install the two 1.5V ‘AAA’ batteries
- Secure the battery compartment
You, as the end-user, are legally bound (Battery ordinance) to return all used
batteries and accumulators; disposal in the household garbage is prohibited!
You can hand over your used batteries/accumulators at collection points in
your community or wherever batteries/accumulators are sold!
Disposal: Follow the valid legal stipulations in respect of the disposal of the device at the end of its lifecycle.
Specifications
Sensor probe | Ferrous | Non-Ferrous
---|---|---
Measurement principle | Magnetic induction | Eddy current
Measuring range of coating thickness | 0–1250 µm (0–49.21 mils) | 0–1250 µm (0–49.21 mils)
Accuracy* (% of reading) | 0–850 µm: ±(3% + 1 µm); 850–1250 µm: ±5% (0–33.46 mils: ±(3% + 0.039 mils); 33.46–49.21 mils: ±5%) | 0–850 µm: ±(3% + 1.5 µm); 850–1250 µm: ±5% (0–33.46 mils: ±(3% + 0.059 mils); 33.46–49.21 mils: ±5%)
Resolution | 0–50 µm: 0.1 µm; 50–850 µm: 1 µm; 850–1250 µm: 0.01 mm (0–1.968 mils: 0.001 mils; 1.968–33.46 mils: 0.01 mils; 33.46–49.21 mils: 0.1 mils) | 0–50 µm: 0.1 µm; 50–850 µm: 1 µm; 850–1250 µm: 0.01 mm (0–1.968 mils: 0.001 mils; 1.968–33.46 mils: 0.01 mils; 33.46–49.21 mils: 0.1 mils)
Min. curvature radius of substrate | 1.5 mm (59.06 mils) | 3 mm (118.1 mils)
Diameter of min. measurement area | 7 mm (275.6 mils) | 5 mm (196.9 mils)
Basic critical thickness of the substrate | 0.5 mm (19.69 mils) | 0.3 mm (11.81 mils)

Industrial Standards | Conforms to GB/T 4956-1985, GB/T 4957-1985, JB/T 8393-1996, JJG 889-95, and JJG 818-93
---|---
Operating Temperature | 32°F to 104°F (0°C to 40°C)
Operating Relative Humidity (R.H.) | 20% to 90% Relative Humidity
Dimensions | 4.3 x 2.0 x 1.0 in. (110 x 50 x 23 mm)
Weight | 3.9 oz. (100 g)
* Accuracy statement applies on a flat surface, with a zero and a calibration performed near the thickness of the film to be measured, with identical base metal, and with the meter stabilized at ambient temperature. The accuracy of the reference films or any reference standards should be added to the measurement results.
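As a worked example of applying the ferrous-probe accuracy specification (the 250 µm reading is hypothetical):

```python
# Worked example of the accuracy specification for a ferrous-mode reading
# (the 250 µm reading is a hypothetical example).
reading_um = 250.0
if reading_um <= 850.0:
    tolerance = 0.03 * reading_um + 1.0   # ±(3% of reading + 1 µm)
else:
    tolerance = 0.05 * reading_um         # ±5% of reading
print(f"{reading_um} µm ± {tolerance:.1f} µm")  # 250.0 µm ± 8.5 µm
```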
Warranty
Triplett / Jewell Instruments extends the following warranty to the original
purchaser of these goods for use. Triplett warrants to the original purchaser
for use that the products sold by it will be free from defects in workmanship
and material for a period of (1) one year from the date of purchase. This
warranty does not apply to any of our products that have been repaired or
altered by unauthorized persons in any way or purchased from unauthorized
distributors so as, in our sole judgment, to injure their stability or
reliability, or which have been subject to misuse, abuse, misapplication,
negligence, accident or which have had the serial numbers altered, defaced, or
removed. Accessories, including batteries, are not covered by this warranty.
Copyright © 2021 Triplett