Frequently Asked Questions (FAQ)


FAQ – Infrared Thermometers eT650D and eT1050D

The two laser beams cross at a distance of 8 inches (203 mm), which is the recommended target distance for most measurements. Since the distance-to-spot ratio of the eT650D is 10:1, the spot diameter at this distance is 0.8 inches (20 mm). To measure a smaller spot area, move the thermometer closer to the target and use the 10:1 distance-to-spot ratio to estimate the spot size. For instance, at a target distance of 5 inches, the spot size is 0.5 inch (13 mm). You can also measure surfaces at distances greater than 8 inches; the eT650D and eT1050D can easily detect the temperature of targets several feet away. The farther away the target, the larger the spot size, and the two laser dots approximately mark the edges of the circular measurement spot. So make sure the target area you want to measure is larger than the spot size, so that both laser dots fall within the target area.
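The distance-to-spot relationship above reduces to a one-line calculation. The following is an illustrative sketch (the function name is ours, not part of the product); the 10:1 ratio is the eT650D specification quoted above.

```python
# Estimate the measurement spot diameter of an IR thermometer with a given
# distance-to-spot (D:S) ratio. For the eT650D the ratio is 10:1, so the
# spot is roughly one tenth of the target distance.

def spot_diameter(distance_in: float, ds_ratio: float = 10.0) -> float:
    """Approximate spot diameter (inches) at the given distance (inches)."""
    return distance_in / ds_ratio

for d in (5, 8, 24):
    print(f"At {d} in: spot is about {spot_diameter(d):.1f} in across")
```

At the 8-inch crossing distance this gives the 0.8-inch spot quoted above; at 24 inches the spot grows to about 2.4 inches, which is why distant targets need to be correspondingly larger.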


To measure the grilling grate surface temperature, hold the temperature gun at a shallow angle and shoot the laser across the top of the grilling surface. The lasers should be visible on the grate of the grill. If your angle is too steep, you may be sensing the heat of the coals beneath the grate, which will not give an accurate reading of the grilling surface.

You will have to open the oven door to measure the temperature of the pizza stone; you cannot measure it through the glass of the oven door. Open the oven and shoot the laser across the stone at an oblique (shallow) angle. You can hold the trigger down and scan the stone to get a series of temperature readings across the whole pizza stone surface. Note that the air in an oven typically reaches the set baking temperature well before the pizza stone does, so simply hearing the oven beep that it has come up to temperature does not mean the pizza stone has reached the target temperature.

A black cast iron or dark non-stick coated pan will give you accurate readings at the standard emissivity of 0.95. However, a shiny stainless steel sauté pan will give you inaccurate readings due to the shine unless you adjust the emissivity of your eT650D infrared thermometer. You can find detailed information about that in the manual which comes with your thermometer. One more tip: you can use the scan feature of the thermometer to discover how evenly your pan is heating and how the temperature changes when different foods are added to the pan.

No, the eT650D is not designed to measure body temperature. It will only measure skin surface temperature which is not an accurate reflection of your internal body temperature.

For information on our NIST certification and calibration services, please visit the NIST FAQ section.

FAQ – Digital Meat Thermometer eT820F

The best way to measure poultry is to insert the thermometer into the bird at the inner thigh, near where the thigh meets the drumstick. Make certain not to touch the bone with the tip of the thermometer, and don’t insert it too far; the tip just needs to be fully in the meat. When the inner thigh reads 165°F, the bird is ready.

To measure thin cuts of meat, like pork chops or steaks, you’ll want to insert the thermometer into the side of the cut of meat. Try to position the thermometer towards the top and angle it slightly downward so that the measuring tip is as close to the center of the meat as possible.

Ground meats should be measured at the thickest part. For meatloaf, this is the very center of the loaf. Do not insert the thermometer too deeply, or it will be measuring the bottom half of the loaf, which is closer to the pan and the heat. Instead, insert the thermometer only far enough that the tip sits in the center of the loaf, which is furthest from the heat.

No! This thermometer should not be left in food which is cooking, either in the oven, on the grill or on the stove top. It is an instant read thermometer, so you can take a reading very quickly, and then remove it.

You can insert the thermometer for long enough to take a good reading – which is about 5 seconds. You can even leave it in the meat for 10, 15 or 20 seconds if you are unsure of the reading; but that really is not necessary. It will give an accurate reading in about 5 seconds. Then press the ‘HOLD’ button and remove the thermometer and it will still show the temperature reading.

Yes! Simply hold the thermometer so that your hand is not in the steam, so you do not burn yourself. The thermometer is sealed to prevent steam from entering the sensitive electronics or battery compartment. The digital display may fog up from the outside, but it will not fog up on the inside and you can easily wipe it clean. Remember to press the ‘HOLD’ button when the five seconds are up, and then remove the thermometer for safer, easier reading.

Light, Sound and Weather Meters

FAQ – Light Meter eL200K

This port provides an alternate way to power the instrument. To use it you will need a USB A-Male to Mini-B cable to power the unit from a USB power adapter. The cable and power adapter are not included with the light meter.

The “DC” and “AC” Output ports provide an analog signal that can be used to record the measured light level with a data logger (200,000 lux = 3300 mV). A short adapter cable that ends in two bare wires is included for that purpose.
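If you log that analog output, the recorded voltage can be converted back to a light level. The sketch below assumes the output scales linearly up to the full-scale point quoted above (200,000 lux at 3300 mV); the function name is ours, not part of the product documentation.

```python
# Convert a logged analog output voltage (in mV) back to lux, assuming a
# linear scale with full scale at 200,000 lux = 3300 mV (per the spec above).

FULL_SCALE_LUX = 200_000
FULL_SCALE_MV = 3300

def mv_to_lux(millivolts: float) -> float:
    return millivolts * FULL_SCALE_LUX / FULL_SCALE_MV

print(f"{mv_to_lux(1650):.0f} lux")  # half scale -> 100000 lux
```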

The light meter turns itself off after 5 minutes of inactivity. You can disable this feature by holding the “Units” button down while turning the unit on. You will see the Power Symbol appear on the LCD, indicating that the auto-power-off feature is disabled.

The ennoLogic light meter is NOT for photography. It does NOT provide a way for you to determine the proper exposure for a photo. Its purpose is to show you light intensity in lux or foot candles at any given location.

FAQ – Anemometer eA980R

The only batteries that will work in your anemometer are LIR2032 rechargeable lithium-ion batteries. Do NOT use standard CR2032 batteries in the eA980R; it will NOT work with them. Although CR2032 cells are also lithium batteries, they supply a lower nominal voltage (3.0 V) than the LIR2032 (3.6 V) and are not rechargeable. You can recharge LIR2032 batteries with a suitable charger that you can buy online.

With the anemometer turned on, press and hold the “Set” button for about 2 seconds, until the backlight comes on.

The temperature sensor is located behind the little hole on the left, just below the impeller. You can decrease the response time by increasing the air flow through this hole, for instance by holding the anemometer into the wind or air flow, or by moving the anemometer back and forth. When air is continuously moving through the anemometer, the response time is about 30 seconds. Without air flow, the response is much slower (several minutes) because the temperature sensor is located inside the unit.

The eA980R anemometer does not measure absolute altitude like a GPS device; it measures pressure altitude instead. Pressure altitude is used in aviation planning. It is a relative altitude measurement based on barometric pressure, defined as ‘the altitude above or below a theoretical 29.92 inHg standard datum plane.’ For every 1,000 feet of altitude gain, barometric pressure decreases by about 1 inHg. The barometer’s reading at a given altitude, compared to the 29.92 inHg standard pressure, can be used to calculate pressure altitude: Pressure Altitude = (29.92 – Current Pressure) x 1,000 + Field Elevation, where field elevation is the elevation of the airport location.
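The formula above can be sketched directly in code. This is an illustrative snippet only; the sample pressure and elevation values are invented for the example.

```python
# Pressure altitude per the formula above:
#   PA = (29.92 - current pressure in inHg) * 1000 + field elevation (ft)

STANDARD_PRESSURE_INHG = 29.92

def pressure_altitude(current_inhg: float, field_elevation_ft: float) -> float:
    """Pressure altitude in feet, given barometric pressure in inHg."""
    return (STANDARD_PRESSURE_INHG - current_inhg) * 1000 + field_elevation_ft

# Example: barometer reads 29.42 inHg at an airfield at 1,200 ft elevation
print(f"{pressure_altitude(29.42, 1200):.0f} ft")  # 1700 ft
```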

Yes, the anemometer can be calibrated for all parameters. The instructions for that are included in the user manual. Some parameters, like barometric pressure, are fairly easy to calibrate: you can use a reliable online weather service that reports atmospheric pressure for your location. Temperature calibration requires an accurate reference thermometer. Calibrating wind speed is more complicated, since most of us don’t have access to a wind tunnel. Unless you have the equipment needed to calibrate wind speed, we recommend leaving the factory wind speed calibration on your anemometer unchanged.

Moisture Meters

FAQ – Moisture Meter eH710T

Properly seasoned firewood should have a moisture content of 15-20% to ensure optimal burning for high heat and low smoke output.

Press the arrow button to choose the desired material. The material number (1-7) will be displayed along with either the wood or building material symbol.

We recommend material setting 1 (see manual). To determine moisture content, the meter measures the electrical impedance of the test material. Different settings are required to account for the differences in impedance between materials; this meter has settings for wood, carpet, drywall, concrete, etc.
Because there is such a wide range of carpet types and pads, it is not possible to provide moisture-to-impedance correlations for all of them. This is why we recommend setting 1, which has the widest moisture range.
So, while you will not be able to get an exact moisture percentage, you will be able to distinguish dry, moist, and wet conditions. This is also why we recommend first testing a known dry area of the carpet to establish a reference. Used this way, the meter can help you determine the extent of flood damage to carpets, discover foundation cracks causing water seepage, or locate pet accidents.

It depends. If the moisture content of the material you’re measuring is higher than the highest value of the selected range (for example, 47.9% for setting 2), the meter will read “Hi” and stay there. However, the meter may also read “Hi” momentarily when measuring very moist materials whose moisture content is still within range; in that case it will settle on a value within a few seconds (typically less than 5).

Laser Levels and Distance Meters

FAQ – Laser Distance Meter eD560L

Yes, there is. Simply press and HOLD the red “MEAS” button. The laser distance meter will switch to continuous mode and will beep each time it takes a new measurement. The distance displayed will update accordingly. To exit this scanning mode, press the “MEAS” button again briefly.

Press the bottom-left button and watch the symbol in the top left corner of the LCD display. Every button press toggles between the two options. The symbol on the display will indicate which edge is used as reference edge.

The Pythagorean modes (there are three of them) can be used to indirectly measure places that are either hard to access or difficult to measure directly with the meter. They are based on the fact that measuring two sides of a right triangle allows you to calculate the length of the third side. For example, you can measure the height of a building by taking one horizontal measurement close to the ground and a second measurement at an angle to the top edge of the building. The meter then calculates and displays the height of the building from your two measurements. To learn more about the different Pythagorean modes, check out the user manual; it explains these and other modes, such as “Area” and “Volume”, in detail.
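The single-measurement case described above reduces to one line of math: with the horizontal distance as one leg of a right triangle and the slanted measurement as the hypotenuse, the height is the remaining leg. A minimal sketch (our own illustration, not the meter’s firmware):

```python
import math

def indirect_height(horizontal: float, slant: float) -> float:
    """Height = sqrt(slant^2 - horizontal^2); both distances in the same unit."""
    if slant <= horizontal:
        raise ValueError("slant distance must exceed the horizontal distance")
    return math.sqrt(slant**2 - horizontal**2)

# Example: 30 m horizontally to the wall, 50 m to the top edge
print(indirect_height(30.0, 50.0))  # 40.0 (a 3-4-5 right triangle, scaled)
```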


FAQ – TRMS Multimeter eM860T

This depends on the function that has been selected, as well as the range. To give you a general idea of the accuracy of the most commonly used functions: most DC voltage ranges provide an accuracy of 0.8%, AC voltage ranges 1.0%, DC current ranges 1.5%, and resistance ranges 0.5%. For more detailed and accurate specifications, please consult the user’s manual included with your unit.

To turn the backlight on or off, press and hold the “HOLD” button for 2 seconds. The backlight will turn off automatically after 15 seconds to conserve power.

TRMS (also called True RMS) stands for True Root Mean Square. The TRMS value of an AC waveform is the equivalent DC value of this waveform, or in other words, its DC heating value. When taking current measurements, it is important to take accurate readings to protect conductors from exceeding their insulators’ ability to withstand heat, or to assure devices under power work properly. A TRMS meter is able to provide this information accurately and directly.
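The “DC heating value” idea can be illustrated in a few lines: the RMS of a sampled waveform is the square root of the mean of the squared samples. For a sine wave with a peak of 1.0 this comes out to about 0.707, the DC voltage with the same heating effect. (This is our illustrative sketch, not how the meter itself computes it.)

```python
import math

def true_rms(samples):
    """Root mean square: sqrt(mean of the squared samples)."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

# One full period of a unit-amplitude sine wave, sampled 1000 times
sine = [math.sin(2 * math.pi * n / 1000) for n in range(1000)]
print(f"{true_rms(sine):.4f}")  # 0.7071, i.e. peak / sqrt(2)
```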

Yes, the meter automatically powers off after 15 minutes of idle time. If needed, the auto-shutoff can be disabled by pressing and holding the SELECT button when you turn the meter on.

The shunt resistance for uA ranges is 100 Ohms, and for mA ranges it is 1 Ohm.
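Those shunt values determine the meter’s burden voltage, the small voltage the meter itself drops across the circuit while measuring current (V = I × R). A quick sketch using the shunt values stated above (the function name and example current are ours):

```python
# Burden voltage across the meter's current shunt: V = I * R_shunt.
SHUNT_UA_OHMS = 100.0  # shunt for the uA ranges (per the answer above)
SHUNT_MA_OHMS = 1.0    # shunt for the mA ranges

def burden_voltage_v(current_a: float, shunt_ohms: float) -> float:
    """Voltage (V) dropped across the shunt at the given current (A)."""
    return current_a * shunt_ohms

# Example: measuring 200 uA on a uA range drops 20 mV across the shunt
print(f"{burden_voltage_v(200e-6, SHUNT_UA_OHMS) * 1000:.1f} mV")  # 20.0 mV
```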

FAQ – Multimeter with Battery Tester eM530S

1) Plug the black test lead into the COM jack at the bottom of the multimeter.
2) Plug the red test lead into the “BATT uA mA” jack, right next to (left of) the COM jack.
3) Press the ON/OFF button of the multimeter to turn it on.
4) Set the rotary switch to “1.5V” (one of the three battery test settings: 9V, 3V, and 1.5V).
5) Using the test leads, touch the tip of the black test lead to the negative pole of the AA or AAA battery, and touch the tip of the red test lead to the positive pole of the AA or AAA battery.
6) Read the voltage displayed on the LCD display of the meter. It will tell you the voltage of the AA or AAA battery “under load”. Under load means that the meter is closing the circuit and draws a small current out of the battery, which makes it behave as if it was in use. This is a more accurate test than simply measuring the voltage without placing it under load, in which case the voltage will often be higher and misleading.
7) Now that you know what voltage the AA or AAA battery produces under load, you can use that information to estimate approximately how much battery life is left. Simple battery testers usually give you a “green/orange/red” result. Your multimeter gives you an accurate voltage reading instead, but you have to know what it means. A fresh AA or AAA battery measures 1.55 to 1.65 V. A used battery that still has some usable juice left will read between 1.25 V and 1.55 V. If the reading is below 1.25 V, the battery is typically more than half empty, and below 1.1 V it is pretty much done and not worth keeping. Time to recycle. Under most conditions, the discharge of the battery between 1.5 V and 1.0 V is roughly linear, so at 1.25 V you can assume about 50% is left. True remaining battery life, however, depends on many other factors, such as the battery’s age, temperature, load current, and brand.
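The rule of thumb in step 7 (roughly linear discharge between 1.5 V and 1.0 V under load) can be turned into a simple estimator. This is an approximation sketched for illustration only; as noted above, actual remaining capacity depends on age, temperature, load current, and brand.

```python
# Rough state-of-charge estimate for an alkaline AA/AAA cell, assuming a
# linear discharge from 1.5 V (full) down to 1.0 V (empty) under load.

def battery_percent(volts_under_load: float) -> float:
    full, empty = 1.5, 1.0
    pct = (volts_under_load - empty) / (full - empty) * 100
    return max(0.0, min(100.0, pct))  # clamp to 0..100%

print(battery_percent(1.25))  # 50.0
```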

Yes, the meter automatically powers off after 15 minutes of idle time. One minute before powering off, the multimeter will beep 5 times, and the beeper will sound once more just before the meter shuts off.

The eM530S multimeter runs on 3 alkaline AAA batteries.


NIST is the National Institute of Standards and Technology, a physical science laboratory that is part of the U.S. Department of Commerce. It was founded in 1901 and is one of the nation’s oldest physical science laboratories. Its main purpose and mission is to develop and maintain measurement standards. As part of that mission, NIST provides certified standard reference materials (SRMs) allowing science and industry to achieve more accurate measurements.

A NIST certificate is a document that certifies that your product has been tested against reference standards and/or tested with NIST-traceable instruments and that it meets or exceeds its specifications. A NIST certificate states that the product was found to be within its stated tolerance of accuracy. It also documents the environmental conditions at the time of testing, which equipment or standard reference materials were used, and some NIST certificates provide the actual test data as well.

A NIST traceable product is a product that was tested with equipment that has an unbroken chain of traceability to NIST standards. That means that every calibration instrument used for NIST certification has to have a valid NIST certificate itself, which may have been obtained by comparing its calibration against another NIST traceable instrument and so on, eventually leading back to a standard reference material certified by NIST. Instruments used for NIST certification must get tested and re-certified at regular intervals, typically annually.

Comparing measurements of a test unit against a reference material is the preferred method of verification since it eliminates the measurement uncertainty that comes with a chain of traceable instruments. An example is the ice point of 0°C, established with a temperature controlled ice bath and used for calibrating or testing thermometers. However, the practical use of SRMs is often limited, and NIST traceable instruments have to be used instead to characterize the test unit at multiple points across its measurement range.


Most often, NIST certificates are requested for regulatory compliance purposes. For example, restaurants or food banks may be required to have a NIST certificate for the thermometer they use to test the temperature of food.

Some industries may require NIST certificates for their instruments to meet the documentation requirements of ISO compliance. Or scientists may wish to maximize the measurement accuracy for a critical experiment, and having a NIST certificate with test data available gives them more detailed information on the measurement error of their instrument.

You may also want a NIST certificate to simply know that your unit was tested and found to be accurate before it was shipped.

No, we are not. However, we are well aware of the structural, process, and management requirements outlined in the ISO 17025 documentation and follow its guidelines to make sure our test and quality control systems produce accurate and consistent results and that all data and records are stored securely and backed up. Also, all of our NIST traceable test instruments are calibrated by an ISO 17025 accredited laboratory.

In some cases, regulations may require you to obtain a NIST certificate from an ISO 17025 accredited laboratory. If that applies to you, you need to contact a third party calibration lab that is accredited under ISO 17025 to get your product NIST certified.

Currently, we only provide NIST certificates for our infrared thermometers.

No, you have to purchase the product that explicitly states “…with NIST certificate”. The standard units we sell do not come with NIST certificates.

Yes, we can. Send an email to [email protected] and request a pdf version of your NIST certificate. Your email needs to include the serial number of your unit so we can find the corresponding certificate in our database. The serial number is located on the unit itself, not on the packaging.

Unfortunately, we cannot do that. We can only issue NIST certificates for units that have gone through the process of testing required to issue the certificates. If you intended to purchase a NIST certified unit and purchased a standard unit by accident, please contact us to discuss an upgraded replacement.

Yes, you can download a sample certificate here.

The short answer is: typically for one year, after which you need to get your unit re-certified and recalibrated, if needed (we do provide this service). Depending on your application and requirements, you may choose to have your unit checked and re-certified by us more frequently. The time between re-certifications is called the calibration interval.

Keep in mind that this calibration interval, the period for which your NIST certificate is valid, does not start with the date on the NIST certificate. It starts on the date the unit is placed into service. Our thermometers typically do not undergo changes in accuracy between the time of calibration and the time they are placed into service (i.e. during shipment and storage).

Also, please note that although the measurement results provided on the NIST certificate can be considered to be traceable to NIST reference standards at the time the measurements were performed, we cannot certify that those measurement results are valid after the instrument has been returned to you and placed into service. You or the responsible party in your organization must have an appropriate internal measurement assurance program in place to assure the continued validity of these measurement results. This program should also specify a calibration interval appropriate for your application. The recommended calibration interval for our thermometers for general industrial and laboratory use is one year.

Should you discover, through your internal measurement assurance program, that your thermometer seems to give inaccurate readings, please contact us to get your unit recalibrated!

Our IR thermometers can be calibrated, but this calibration needs to be performed by us. Calibration of IR thermometers requires the proper equipment, good knowledge and experience of how to use it and its limitations, and the exact sequence and methodology of the calibration process itself. We currently do not provide calibration instructions to customers or third party laboratories.

We use a temperature-controlled crushed ice bath to test the accuracy at 0°C (32°F). For tests at higher temperatures up to 500°C (932°F), we use infrared calibrators (IR-500 and BR-M400) and probe-type reference thermometers (DTU6005-002-N from QTI Sensing Solutions and Precision Plus Thermometer by Thermoworks).

We charge $50 per unit for re-certification/re-calibration. You pay to ship your unit to us, and we cover the return shipping costs.
