
  • High Voltage Testing: Guide to Safe Methods & Compliance | CISCAL

Learn safe high voltage testing methods, Aussie standards, and step-by-step controls to prevent shocks and downtime.

Guide to Safe High Voltage Testing Methods

High voltage testing checks whether cables, switchgear, transformers, motors and lab equipment can safely withstand service voltages and surges. The biggest risk is electric shock. Do three things every time: plan the test, isolate and prove de-energised, and use the right method, PPE and earthing/discharge controls.

Safe HV Testing in Australia

Applying controlled stress (AC/DC/VLF/surge) proves dielectric withstand, detects insulation resistance (IR) issues, and finds defects (e.g., partial discharge or tan delta loss). It is used in utilities, manufacturing, mining, pharma and research labs during commissioning, maintenance and after repairs.

Top Three Controls
- Plan: written test plan, risk assessment, drawings.
- Isolate & prove de-energised: lockout/tagout, test for dead, set approach distances.
- Use the right method & PPE: method per standard/OEM, barricades, observers, earthing, discharge rods.

What is High Voltage Testing?

High voltage (HV) testing uses elevated test voltages to check whether insulation systems can withstand normal and abnormal stresses (steady-state, switching surge) without breakdown. It covers cables, switchgear, motors, transformers, lab HV supplies and more. The IEC/AS-NZS 60060 family is the technical backbone for HV test techniques.

Methods at a Glance

| Method | What it checks | Typical voltage & dwell | Best for | Notes |
| --- | --- | --- | --- | --- |
| Dielectric Withstand (Hipot) | Pass/fail withstand of insulation to elevated AC/DC | kV level; dwell typically minutes per standard/OEM | Commissioning, after repair | Always discharge and earth the DUT before disconnecting. |
| Insulation Resistance (IR) | DC resistance (MΩ/GΩ); trends over time | e.g., 500 V–5 kV; 1-min value, PI/DAR ratios | Baseline health checks | Good for routine checks without high stress; PI = 10-min/1-min. |
| VLF AC | AC withstand at very low frequency | e.g., 0.1–0.01 Hz; minutes | MV polymeric cables | Lower stress than 50/60 Hz; combine with diagnostics. |
| Tan δ (dissipation factor) | Dielectric loss/ageing | Paired with VLF; trending | MV cables | Rising tan δ = ageing/moisture; use limits/criteria. |
| Partial Discharge (PD) | Defect activity (pC); inception/extinction | Online/offline | Cables, terminations, motors | Calibrate the PD circuit to IEC 60270 before testing. |
| Surge/Impulse | Turn-to-turn integrity | Fast impulses; waveform compare | Motors/windings | Detects faults that IR/hipot may miss. |

When to use each: commissioning (withstand plus diagnostics), maintenance (IR trend, VLF + tan δ/PD), post-repair (targeted hipot/surge).

Australian Standards & Legal Duties (Know the Rules)
- AS/NZS 60060 (IEC 60060 series): sets definitions, measuring systems and on-site test requirements for HV test techniques. Recent IEC updates (e.g., IEC 60060-1:2025) clarify scope for AC, DC and impulse testing above 1 kV. Use these standards to select test voltages, durations and measuring systems.
- WHS duties & Codes of Practice: the Model Code of Practice, Managing electrical risks, is an approved code under the WHS Act. Following an approved code assists with compliance; an equivalent or higher method is acceptable. States publish their own approved versions (e.g., NSW 2019, QLD 2021, varied 2025).
- Victoria (Blue Book): if operating in Victoria, use the Blue Book 2022 for work on/near HV apparatus, approach distances, permits, roles, sanctioning and more. It is referenced under Victorian regulations and sets minimum safety requirements.

3-Step "Prove Competence"
1. Engage a competent person (typically a licensed/registered electrician or inspector with HV competency).
2. Use a documented procedure aligned to the Code/standard.
3. Keep records: plans, permits, isolation tests, results, and calibration traceability.

Quick Reference Links
- SafeWork NSW: Managing electrical risks (Code of Practice).
- WorkSafe Victoria: Electrical safety guidance.
- WorkSafe QLD: Managing electrical risks (Code of Practice).
- Energy Safe Victoria: Blue Book 2022.

Core Methods & How to Run Them Safely

Dielectric Withstand (Hipot)

Purpose: prove withstand capability under elevated AC or DC voltage; go/no-go. Typical setup: kV output, defined ramp/dwell, leakage monitored. Always discharge with a rated rod and earth before removing leads.

Safe sequence (7 steps)
1. Confirm isolation, LOTO, permits and approach distances (Vic sites: Blue Book).
2. Bond the test set earth first; attach the return/guard as per OEM.
3. Post barricades/signage; nominate a dedicated observer.
4. Ramp to the specified test voltage; hold for the dwell (often minutes per OEM/standard).
5. Record voltage, time, leakage current and ambient conditions. (Use the test plan template below.)
6. Lower to zero and allow the internal discharge cycle to complete.
7. Apply the discharge rod to the DUT until confirmed de-energised; earth/short the DUT.

AC vs DC: DC can over-stress aged polymeric cables; VLF AC with diagnostics is preferred for service-aged MV cables.

Insulation Resistance (IR)

What you get: a 1-minute IR value (MΩ/GΩ) and optional PI (10-min/1-min) or DAR ratios. Great for baselining and trending.

When IR beats hipot: routine checks where you don't want to apply high stress; a first look after maintenance; a quick pre-commissioning screen before withstand tests.

VLF AC for Cables

When to use: commissioning/maintenance of modern polymeric MV cables; safer on capacitive loads than 50/60 Hz. Post-test, discharge and earth the circuit and maintain signage until proven safe.

Tan Delta (Dielectric Loss)

What it shows: changes in dielectric loss indicate ageing or moisture (water treeing). Pair with VLF to plan repair/replacement windows.

Partial Discharge (PD)

Why run it: finds defects (voids, sharp edges, bad joints) before failure. Combine with VLF and tan δ for a fuller picture.
Calibrate the PD measuring system to IEC 60270 before testing and document PD inception/extinction voltages.

Surge/Impulse (For Motors & Windings)

Use case: detect turn-to-turn and phase-to-phase weaknesses early; these are faults that IR/hipot may miss. Compare waveforms between phases; a left-shift or amplitude change flags a winding issue.

Safety Controls & Site Setup (Zero-Harm Checklist)
- Plan the test: write a test plan with drawings, switching schedule, permits, required competencies and emergency steps.
- Isolate, LOTO, test for dead: de-energise; lock out/tag out; prove dead; define approach distances (Blue Book for Vic).
- Barricade & signage: set exclusion zones; assign a dedicated observer with radio.
- PPE & insulated tools: arc-rated clothing, dielectric gloves/boots, hot sticks per the site risk assessment.
- Earthing/grounding: earth the DUT and adjacent equipment; keep ground sticks and discharge rods rated for the job.
- Post-test discharge: lower voltage to zero, wait for the internal discharge, then apply the discharge rod; for DC tests, hold grounds for at least 4× the test duration on long cables.
- Incident reporting: notifiable electric shock events must be reported promptly (e.g., SA guidance notes enforcement). SafeWork SA highlighted 331 notifiable shocks in one period, and under-reporting penalties apply.

Who should perform HV testing? A competent person should perform and interpret electrical tests: usually a licensed/registered electrician or a licensed electrical inspector with HV competency.

Calibration matters: keep HV instruments calibrated by ISO/IEC 17025 labs and maintain uncertainty statements with your results. Many enterprises set 6-monthly cycles for critical HV gear.

Evidence & Why It Matters (Stats)
- Electrical fatalities: 11 in AU/NZ for 2023–24 (10 in Australia), 0.34 deaths per million. Most network-related deaths involved overhead conductors.
- All-cause WHS fatalities: 188 worker deaths in 2024; 1.3 per 100,000 workers.
- Shock notifications & enforcement (SA): hundreds of shock injuries were reported, with regulators warning on under-reporting and issuing more enforcement notices.

Choosing Test Equipment (Buyer's Quick Tips)
- Match method to asset: IR for trend baselines; VLF for MV cables; tan δ/PD for diagnostics; surge for motors; hipot for withstand.
- Confirm standard alignment: check that equipment and procedures align to AS/NZS 60060/IEC guidance for test voltages, dwell times and measurement.
- Safety by design: look for integrated discharge features and proper earthing points; ensure suitability for capacitive loads (VLF).
- Insist on calibration certificates: use ISO/IEC 17025 labs and retain uncertainty with results for audits and decisions. (Best practice reinforced by WHS Code record-keeping.)

Downloadable Tools
- HV test plan template (.doc): sections for assets, standards, roles, approach distances, isolation and acceptance criteria.
- Pre-start checklist (.pdf): PPE, barricades, earthing points, observer, comms test.
- Risk matrix (.png): quick consequence/likelihood chart for on-site use.
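The PI and DAR ratios mentioned in the IR section are simple quotients of timed readings. A minimal sketch of the arithmetic (the readings below are invented for illustration; acceptance limits come from your own standard, e.g. IEEE 43 for rotating machines):

```python
def polarization_index(ir_1min_mohm: float, ir_10min_mohm: float) -> float:
    """PI = 10-minute IR reading divided by the 1-minute reading."""
    return ir_10min_mohm / ir_1min_mohm

def dar(ir_30s_mohm: float, ir_60s_mohm: float) -> float:
    """DAR = 60-second IR reading divided by the 30-second reading."""
    return ir_60s_mohm / ir_30s_mohm

# Invented example: 1-min reading 500 MΩ, 10-min reading 1500 MΩ
pi = polarization_index(500, 1500)   # 3.0: IR rising steadily over the test
```

A rising ratio indicates absorption current decaying over healthy insulation; a flat or falling ratio is a prompt to investigate before any withstand test.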

  • RESOURCES | CISCAL

Find practical calibration guides, product highlights, and CISCAL news and events. Get the latest and book a service when you're ready.

RESOURCES

CISCAL's Resource Hub brings together guides, product spotlights and news from our team, serving industry since 1963. Get practical advice on calibration, validation and repair, plus updates on equipment and standards. Read the latest and get in touch to book a service.

NEWS AND EVENTS

Stay up to date with CISCAL's latest announcements, industry developments and milestones, and join upcoming events where innovation and expertise come together.
- Excellence in Extrusion: Pure Performance
- COLLIN: From Lab to Production

PRODUCT HIGHLIGHT

From advanced instruments to industry essentials, explore solutions trusted by professionals. Our product highlights help you choose equipment that ensures safety and performance.
- Steroglass Flash2: One Platform for Multisector Automated Titration
- Optimising Kiln Temperature with Keller PK-11 Series
- Tartaric Stability with Steroglass EasyCheck

BLOG

Discover expert perspectives and practical tips on calibration, compliance and industry best practices. Our blogs deliver insights to help your business stay accurate, efficient and audit-ready.

  • Steroglass Flash2: One Platform for Multisector Titration | CISCAL

Automatic titrator Flash2 from Steroglass: compact lab system for precise chemical, food, water and wine analysis. Find details at CISCAL Resources.

Steroglass Flash2: One Platform for Multisector Automated Titration

Why Automated Titration Matters for Australian Labs

Australian laboratories in wine, food and beverage, water and environmental testing, and chemical production are under steady pressure. Export markets expect tight process control, domestic regulators expect traceable data, and many labs are trying to do more work with the same or fewer people.

Manual titration still works, but it is slow, operator-dependent and difficult to standardise between shifts, sites and seasons. Reading burettes, judging endpoints by eye and handwriting results into logbooks all add variation and admin load. When labs run hundreds of wine, dairy or water samples per week, that variation can turn into repeat work, release delays and stressful audits.

The Steroglass Flash2 automatic titrator is designed as a compact, multisector platform that automates these routine titrations, improves reproducibility and creates digital records that stand up in NATA, ISO/IEC 17025 and food safety audits.

What is the Steroglass Flash2 Automatic Titrator?

The Steroglass Flash2 is a fifth-generation automatic titration system that replaces manual glass burettes and colour-change endpoints with an automated, sensor-based process. It is built to perform routine titrations across oenological, food, environmental and chemical matrices on a single platform.

Flash2 is a compact benchtop titrator with a 7-inch high-brightness touchscreen. The interface guides users step by step through method selection, sample information, titration, result review and data export. This is a shift from "remember the method and write it down" to "follow the on-screen recipe", which suits mixed-experience teams and seasonal lab staff.
The system can be configured with one or two precision burettes and up to three peristaltic pumps for auxiliary reagents. An AS Plus autosampler (14, 18 or 30 positions, depending on beaker size) can be added for batch workflows, so the same automatic titrator can handle both single urgent samples and production runs.

Flash2 is designed as a true multisector titrator. On one instrument, laboratories can run wine analysis (pH, titratable acidity and free/total SO₂); dairy acidity and chlorides; acidity and peroxides in edible oils; vitamin C in juices and sauces; alkalinity and hardness in water; and acid–base and redox titrations for chemical and galvanic baths.

Key Features and Specifications of Steroglass Flash2

Compact, Multisector Platform

Flash2 measures approximately 25.5 × 20.5 × 44 cm and weighs around 10 kg, so it fits comfortably on a standard lab bench next to a balance or pH meter. In shared laboratories, contract facilities and winery labs that already host a mix of instruments, this footprint helps avoid yet another trolley or crowded corner.

The compact housing still supports up to two burettes and multiple peristaltic pumps, giving enough flexibility to run acid–base, redox and complexometric titrations across food, water, wine and chemical samples on a single system. Instead of buying different titrators for wine, dairy and process water, Australian labs can standardise on one automatic titration system with methods configured for each matrix.

Automation and Throughput

Flash2 automates all the main titration steps: dosing titrant via precision burettes, dispensing auxiliary reagents with peristaltic pumps, stirring, monitoring electrode responses and detecting endpoints. The instrument records the titration curve in real time, calculates results and stores them in an internal database.
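Potentiometric endpoint detection of this kind is commonly implemented by locating where the first derivative of the recorded titration curve (ΔpH/ΔV) peaks. The sketch below illustrates the general idea only; it is not Steroglass's actual algorithm, and the data points are invented:

```python
def endpoint_volume(volumes_ml, ph_values):
    """Estimate the equivalence point of a titration curve as the
    midpoint of the interval where pH changes fastest (max |ΔpH/ΔV|)."""
    best_slope, best_v = 0.0, None
    for i in range(1, len(volumes_ml)):
        dv = volumes_ml[i] - volumes_ml[i - 1]
        slope = abs(ph_values[i] - ph_values[i - 1]) / dv
        if slope > best_slope:
            best_slope = slope
            best_v = (volumes_ml[i] + volumes_ml[i - 1]) / 2
    return best_v

# Invented strong-acid curve: the pH jump sits between 10.0 and 10.5 mL
vols = [8.0, 9.0, 9.5, 10.0, 10.5, 11.0, 12.0]
phs  = [3.0, 3.3, 3.6, 4.2, 10.1, 11.0, 11.4]
endpoint_volume(vols, phs)  # 10.25 mL
```

Automating this step is what removes the operator's colour-change judgement from the result.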
According to Steroglass, automated sampling, degassing and auto-levelling systems mean Flash2 can cut analysis and sample preparation time by up to 90% compared with manual methods. When combined with the autosampler (14, 18 or 30 positions), labs can load a batch, start the run and focus on other work while the titrator processes each sample in sequence.

For high-throughput contract labs and busy winery labs during vintage, this batch style reduces manual pipetting and burette reading. Fewer manual steps mean fewer transcription errors, fewer repeats and faster release decisions for production.

Data Handling, GLP Compliance and Connectivity

Flash2 is designed with GLP-style operation in mind. The instrument can store up to 30 user-editable methods and around 300 sets of results, calibrations and titrant data, along with titration curves. User accounts support an administrator plus up to eight secondary users, providing basic access control for regulated environments.

Results can be exported via USB as CSV files, printed or transferred to a PC using Flash2Data software. From there, data can be integrated into LIMS or QA databases to support NATA-accredited operations and GMP or ISO/IEC 17025 requirements. This replaces handwritten logbooks and spreadsheets with traceable electronic records, helping laboratories reduce transcription errors and prepare for audits with less manual collation.

How Steroglass Flash2 Supports Key Australian Industries

Flash2 is built as a multisector titrator, so the same instrument can sit in a winery lab, a regional dairy plant, a council water lab or a chemical works. Below are examples of how the platform fits into typical Australian workflows.

Wine and Oenology

In winery labs, Flash2 automates routine wine analysis (pH, titratable acidity and free/total SO₂) across harvest, fermentation, maturation and bottling.
These parameters drive taste, mouthfeel and preservation, and they are central to decisions on acid additions, sulphur dosing and blending. With wine exports worth around $1.9 billion and about 60% of production shipped overseas, consistent titration data helps maintain brand and country reputation in crowded export markets. Automated titration also supports larger wineries and groups that operate multiple sites, as the same methods can run on identical instruments in different regions.

A mid-sized Australian winery, for example, could use Flash2 with an autosampler to run morning and afternoon batches during vintage: musts and ferments in the morning, and barrel or tank samples in the afternoon. The instrument handles dosing and endpoint detection while staff focus on interpreting trends and advising winemakers.

Food and Beverage Manufacturing

Flash2 supports a wide range of food and beverage quality control tests. In dairy, titrations for acidity and chlorides help check milk freshness, monitor processing and control salt levels in cheese and other products. In juices, sauces and preserves, acidity and vitamin C titrations protect shelf life, taste and label claims. Edible oils can be checked for acidity and peroxide values to monitor oxidation and storage stability.

These capabilities line up with the scale of Australian food and beverage manufacturing, where around 87% of firms are SMEs and the sector generated about $31.8 billion GVA in 2022–23, with almost $36 billion in exports. With 14.8 million tonnes of food and non-alcoholic beverages sold in 2022–23, even small efficiency gains in routine QA can free staff to work on process improvement and incident investigation rather than repeating manual titrations.

Water and Environmental Testing

Water and environmental labs must show that drinking water, process water and wastewater meet guideline values for pH, alkalinity, hardness and related parameters.
Flash2 methods can cover titrimetric alkalinity and hardness testing alongside pH and conductivity, supporting compliance with Australian drinking water and environmental guidelines while providing traceable digital records. For regional councils or industrial sites that run modest sample numbers, the compact footprint means Flash2 can share a bench with other water quality instruments rather than needing a dedicated titration bench.

Chemical and Galvanic Industries

In chemical manufacturing and galvanic or electroplating plants, Flash2 can handle acid–base titrations, TAN/TBN measurements and titrations for hydrogen peroxide, active chlorine, alkali and other bath components. Keeping bath composition within tight ranges helps maintain coating quality, avoid corrosion and reduce rework.

Automated titration is especially helpful where baths are aggressive or hot. Flash2's automation reduces direct contact with reagents, supporting safer operation and more consistent process monitoring.

Manual vs Automated Titration – Benefits of Steroglass Flash2

How does Steroglass Flash2 improve accuracy and reproducibility?

Manual titration relies on the operator's judgement to detect endpoints, read burettes and record values. Even experienced analysts can differ slightly in how they see a colour change or handle a busy bench. Those small differences add up across batches, shifts and sites.

Flash2 standardises dosing and endpoint detection by using precision burettes, electrodes and automated algorithms rather than subjective colour changes. Distributors describe the system as providing fast, accurate and repeatable titrations across food, wine, pharmaceutical and chemical matrices, which helps multi-site organisations align methods and acceptance criteria.

How does Steroglass Flash2 save time and reduce risk?
Steroglass reports that Flash2 can reduce analysis and sample preparation time by up to 90% compared with manual titration, thanks to automated sampling, degassing and levelling systems. This reduction in manual workload lowers overtime pressure during peak periods such as vintage or seasonal production and helps keep turnaround targets realistic without constant "all hands on deck" titration sessions.

Less manual handling of strong acids, bases and oxidants improves safety, and more efficient reagent use with no single-use plastic burettes or tips reduces waste. With fewer manual steps, the risk of sample swaps, transcription errors or missed logbook entries also drops, supporting cleaner audit trails.

Choosing and Deploying Flash2 in Your Lab

Assessing Samples, Methods and Configuration

Before selecting a configuration, it helps to map out how your laboratory currently uses titration:
- List all routine titrations (wine, dairy, sauces, oils, water, chemical baths and others).
- Estimate weekly and seasonal sample volumes for each test.
- Identify which tests run singly and which run in batches.
- Note reporting requirements, such as certificates for customers, release reports for production or records for auditors.

Labs with a small menu of tests and moderate sample numbers may be well served by a single-burette Flash2 with one or two pumps. Sites with high throughput or more complex methods (for example, where two titrants are needed) often benefit from a dual-burette setup.

Next, assess whether an autosampler is justified. As a rough guide, if staff are regularly queuing more than 10–15 samples per run or working extended hours during peak periods, an autosampler with 14, 18 or 30 positions can significantly reduce manual handling time.
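That rough guide can be turned into a back-of-envelope sizing check. The thresholds below simply restate the 10–15-samples heuristic above; they are not a Steroglass or CISCAL recommendation:

```python
import math

def batches_per_week(samples_per_week: int, tray_positions: int) -> int:
    """Number of autosampler runs needed to clear the weekly load."""
    return math.ceil(samples_per_week / tray_positions)

def autosampler_worthwhile(peak_samples_per_run: int) -> bool:
    """Heuristic from the text: queues beyond roughly 10-15 samples
    per run suggest an autosampler will pay for itself."""
    return peak_samples_per_run > 15

batches_per_week(120, 30)     # 4 full trays per week on a 30-position AS Plus
autosampler_worthwhile(25)    # True
```

Plugging in typical and vintage-peak numbers gives a quick, defensible basis for the configuration discussion.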
A simple checklist for internal discussions could include:
- Sample types and matrices (wine, dairy, sauces, oils, water, chemical baths)
- Required parameters (pH, TA, SO₂, chlorides, alkalinity, hardness, TAN/TBN and others)
- Typical and peak sample volumes per week
- Desired turnaround times
- Data and reporting needs (LIMS, ERP, certificates, audit reports)

Calibration, Maintenance and Compliance in Australia

An automatic titrator is only as reliable as its sensors, balances and volumetric systems. Routine calibration of pH electrodes, temperature probes, balances and volumetric hardware is key to maintaining traceable measurements. For many labs it makes sense to combine Flash2 installation with a broader calibration review (including reference buffers, thermometers and balances), so the full measurement chain is documented when the system goes live.

Why Partner with CISCAL for Steroglass Flash2 in Australia?

CISCAL is more than an equipment reseller. The team supports clients through instrument selection, method setup, installation, operator training and ongoing calibration and service. For Steroglass Flash2 users, this can include:
- Helping scope the right configuration (burettes, pumps, autosampler and accessories) for your mix of wine, food, water and chemical analyses.
- Integrating Flash2 into existing QA systems and data workflows, including LIMS or certificate templates.
- Providing on-site installation and training so staff are confident running routine and validation methods.
- Delivering NATA-accredited calibration and preventive maintenance for supporting equipment, with documentation ready for audits.
- Offering access to tools such as CISCAL SMART for asset management and certificate storage, where applicable, so QA teams can access calibration records quickly during audits.

Working with a local calibration and service partner reduces the load on internal QA, engineering and maintenance teams and supports the long-term reliability of the titration system.
Steroglass Flash2: Multisector Titration on One Platform

Steroglass Flash2 brings multisector titration for the wine, food and beverage, water and chemical industries onto a single compact, automatic platform. It streamlines manual titration steps, improves reproducibility, provides GLP-style data handling and supports audit-ready operation with digital records and LIMS connectivity.
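The titratable acidity (TA) results referred to throughout are derived from the titrant volume at the endpoint. As background, the standard calculation, expressed as tartaric acid (the usual convention for wine) and independent of any particular instrument, is:

```python
def titratable_acidity_g_per_l(naoh_ml: float, naoh_normality: float,
                               sample_ml: float,
                               eq_weight: float = 75.04) -> float:
    """TA expressed as tartaric acid (equivalent weight 150.09 / 2):
    TA (g/L) = mL NaOH x normality x equivalent weight / mL sample."""
    return naoh_ml * naoh_normality * eq_weight / sample_ml

# 7.5 mL of 0.1 N NaOH to neutralise a 10 mL wine sample:
round(titratable_acidity_g_per_l(7.5, 0.1, 10), 2)  # 5.63 g/L as tartaric
```

An automatic titrator performs the same arithmetic internally once the endpoint volume is known; changing `eq_weight` gives TA expressed as other acids (e.g. citric or lactic) where that convention applies.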

  • Optimising Kiln Temperature with Keller PK-11 Series | CISCAL

Improve kiln temperature control using Keller PK-11 Series pyrometers and CISCAL's support for setup and NATA-accredited calibration.

Optimising Kiln Temperature with Keller PK-11 Series

Kiln temperature is simply the temperature inside the kiln along its length and over time. When that kiln temperature drifts or swings around, product quality, fuel bills and even refractory life all take a hit. Across cement, lime, brick, tile and ceramics plants in NSW, VIC, QLD, WA and other states, stable kiln temperature control is now a major lever for cost and quality.

This guide explains how infrared pyrometers measure kiln temperature, why the Keller PK-11 Series fits industrial kilns, and how CISCAL supports supply, setup and NATA-accredited calibration.

Common Kiln Temperature Problems

Many sites across Australia see the same kiln temperature issues:
- Uneven temperature zones: one side of the tunnel kiln is hotter, or the burning zone in a rotary kiln is narrow and unstable.
- Hot spots and cold spots: refractory hot spots on a rotary shell, or cold channels through a load of bricks or tiles.
- Relying only on thermocouples: a few fixed thermocouples can miss what is happening on the product surface or in the load, especially in long tunnel kilns.
- Impact on scrap, rework and downtime: when kiln temperature is not monitored properly, scrap rates rise, more product needs refiring and unplanned shutdowns become more common.

Infrared pyrometers give a direct view of the actual surface temperature of product or refractory, filling in the gaps between thermocouples.

How Infrared Pyrometers Measure Kiln Temperature

Non-contact temperature basics

An infrared pyrometer is an infrared thermometer that measures temperature from the thermal radiation a surface emits, without touching it. In simple terms:
- Every hot surface gives off infrared radiation.
- The pyrometer's lens focuses that radiation onto a detector.
- Electronics convert the signal into a kiln temperature reading.
Because no probe or thermocouple sits in the hot zone, non-contact kiln temperature measurement:
- Works where access is limited or the load is moving
- Avoids wear on probes in abrasive or high-velocity gas streams
- Responds very quickly to changes in kiln temperature

Typical Kiln Applications

Infrared pyrometers are used on many kiln types:
- Tunnel kiln temperature: measure load or refractory temperatures at pre-heat, firing and cooling zones.
- Rotary kiln temperature: monitor clinker, lime or refractory temperatures through kiln viewing ports.
- Shuttle/batch kilns: check soak temperature at set points in the chamber.

Common mounting points include:
- Firing zone ports on rotary cement kilns and lime kilns
- Side or roof ports in tunnel kilns above the main firing and soak zones
- Observation ports in cooling zones to control the cooling rate

Introducing the Keller PK-11 Series for Kilns

The Keller CellaTemp PK 11 BF 2 is a compact infrared pyrometer well suited to kiln temperature monitoring in harsh industrial environments. It combines non-contact measurement with simple setup and integration into PLC or SCADA systems.

Core Features of CellaTemp PK 11 BF 2

Key features relevant to tunnel kiln, rotary kiln and batch kiln temperature monitoring include:
- Measuring range: 0–1000 °C, suited to ceramics, cement kiln and lime kiln temperatures on product or refractory
- Spectral range: 8–14 μm, optimised for non-metal surfaces
- Fast response: t90 ≤ 60 ms, so control loops see changes quickly
- Stainless steel body with IP65 protection, suitable for dusty kiln areas
- Analogue 0/4–20 mA output plus IO-Link, for easy connection to PLCs and SCADA
- Clear LED display and keypad on the sensor, so parameters can be adjusted on site

These features let maintenance and process teams in NSW, VIC, QLD, WA and other regions mount PK 11 units close to the kiln, wire them into existing IO and read kiln temperature locally at the same time.
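On the PLC side, an analogue loop current is normally scaled linearly across the configured measuring span. A minimal sketch, assuming the 4–20 mA output is mapped to the full 0–1000 °C range (always check the span actually configured on your instrument):

```python
def current_to_temp_c(current_ma: float,
                      span_low_c: float = 0.0,
                      span_high_c: float = 1000.0) -> float:
    """Linear 4-20 mA scaling: 4 mA -> span low, 20 mA -> span high."""
    if not 4.0 <= current_ma <= 20.0:
        # Out-of-range current usually means a wiring or sensor fault
        raise ValueError("loop current out of range; check wiring/sensor")
    fraction = (current_ma - 4.0) / 16.0
    return span_low_c + fraction * (span_high_c - span_low_c)

current_to_temp_c(12.0)   # 500.0 °C at mid-scale
current_to_temp_c(20.0)   # 1000.0 °C at full scale
```

Most PLCs do this scaling in a standard analogue-input block; the function above is just the same arithmetic made explicit for commissioning checks.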
Smart Functions for Process Stability

Beyond basic temperature output, the CellaTemp PK 11 includes smart functions that support stable kiln temperature control:
- Two PNP switching outputs: can be set as alarms or limits, for example over-temperature on a tunnel kiln roof or low temperature in a pre-heat zone.
- Emissivity and transmission factor adjustment: lets you tune readings for different products (e.g. dark bricks vs lighter tiles) and for any protective window between the pyrometer and the kiln.
- "Vitality" indicators and diagnostics: the vitality function monitors internal status and can flag when the device may need maintenance or checking, helping with preventive maintenance.

Choosing Between Keller PK 11-K001, K002, K003 and K004

The PK-11 measuring systems combine the PK 11 BF 2 sensor with different optics and mounting hardware. All have a 0–1000 °C measuring range and an 8–14 μm spectral range.

Quick Comparison Table

| Model | Measuring range | Target size | Focus distance | Mounting set | Key feature |
| --- | --- | --- | --- | --- | --- |
| PK 11-K001 | 0–1000 °C | 11 mm | 0.3 m | PK 01-027 + cable VK 02/L AF 1 | Small spot for close-range kiln ports |
| PK 11-K002 | 0–1000 °C | 33 mm | 0.9 m | PK 01-007 + cable VK 02/L AF 1 | Standard spot for longer viewing ports |
| PK 11-K003 | 0–1000 °C | 33 mm | 0.9 m | PK 01-024 + cable VK 02/L AF 1 | Bayonet lock for quick removal |
| PK 11-K004 | 0–1000 °C | 33 mm | 0.9 m | PK 11-006 + cable VK 02/L AF 1 | Bayonet lock + kiln-suited mounting combination |

Each mounting set includes an air-purged fitting with flow-optimised air to keep the lens clear with minimal air consumption.

Matching PK-11 Variants to Kiln Layouts
- Close-range ports or pilot kilns → PK 11-K001: good where the pyrometer can be mounted close to the product, such as small shuttle kilns, pilot kilns or inspection doors with limited space.
- Standard tunnel or rotary kiln viewing ports → PK 11-K002: suits many cement kiln and lime kiln applications where there is a longer standoff distance and a standard port size.
- Dusty, high-maintenance points → PK 11-K003 or PK 11-K004: the bayonet lock makes it easy to remove the measuring head for cleaning protective windows or lenses without disturbing brackets. PK 11-K004 adds a kiln-optimised mounting combination for heavy-duty use.

Practical Tips for Reliable Kiln Temperature Readings

Spot Size, Distance and Aiming

For accurate kiln temperature monitoring:
- Make sure the measuring spot is fully filled by product or refractory; do not let the spot "see" frame edges or steelwork.
- Select PK 11 optics so that the spot size suits the kiln window or product size (e.g. 11 mm at 0.3 m vs 33 mm at 0.9 m).
- Align the pyrometer so it looks at a stable part of the load, not at holes or conveyors.

Dealing with Dust, Scale and Flames

In cement and lime kilns, dust and scale are a daily reality:
- Use the air-purged mounts supplied with PK 11 measuring systems to keep dust off the lens.
- Where possible, avoid looking directly at flames; aim at the product bed or the refractory opposite the burner.
- Use bayonet-lock mounts (K003, K004) to remove the sensor quickly, clean protective windows and refit without re-aiming.

Integrating Keller PK-11 with Your Control System

The Keller PK 11 Series is designed for easy integration:
- The 0/4–20 mA analogue output can feed directly into PID loops in your PLC or standalone controllers for kiln temperature control.
- IO-Link supports parameter changes, diagnostics and remote monitoring from control rooms or SCADA.
- Switching outputs can be used for over-temperature alarms, burner interlocks or cooling air control.

Steps to set up a Keller PK-11 on a kiln inspection port:
1. Confirm the required measuring point (e.g. firing zone roof port on a tunnel kiln).
2. Choose the PK 11-Kxxx system that matches the distance and port size.
3. Install the mounting set on the port, including the air purge and any protective window.
4. Screw in the PK 11 BF 2 sensor and roughly aim it at the target area.
Wire the 0/4–20 mA output to the chosen PLC or controller channel and connect IO-Link if used. Set emissivity and measuring range via the keypad or IO-Link. Check readings against existing thermocouples or reference data and fine-tune aiming as needed. Explore CISCAL’s Temperature Services Calibration and Compliance for Kiln Temperature in Australia Why calibration of kiln pyrometers matters Infrared pyrometers used for kiln temperature control sit inside quality systems and energy audits. Uncalibrated instruments can give biased readings that: Lead to incorrect firing curves Reduce confidence during customer or regulatory audits Affect energy and emission reporting In Australia, calibration traceability links back to NMI standards and is typically provided through ISO/IEC 17025 NATA-accredited laboratories. How CISCAL supports Keller PK-11 users CISCAL has held ISO/IEC 17025 accreditation with NATA (Accreditation No. 411) since 1963, with a scope that includes temperature meters, thermocouples, digital thermometers and temperature enclosures such as ovens and furnaces. Support for kiln operators across Australia (NSW, VIC, QLD and other states) includes: NATA-accredited calibration of temperature instruments in the lab and onsite, supporting traceable kiln temperature measurement. Documentation for audits and OEM requirements, aligned with ISO/IEC 17025 and Australian measurement system expectations. Assistance with setup, function checks and maintenance plans for Keller PK 11 pyrometers, including advice on mounting, emissivity and control loop integration. Getting the Most Value from Your CISCAL Services Stable kiln temperature is one of the strongest levers for quality, throughput and energy efficiency in cement, lime, brick, tile and ceramics plants across Australia.
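The 0/4–20 mA wiring and range-setting steps above come down to a linear scaling in the PLC or controller. A minimal sketch, assuming a 4–20 mA loop configured for the PK 11's full 0–1000 °C span (an assumption — use whatever range you actually set on the device):

```python
def ma_to_temperature(current_ma, t_min=0.0, t_max=1000.0,
                      i_min=4.0, i_max=20.0):
    """Linearly scale a 4-20 mA loop current to a temperature reading.

    Assumes the pyrometer's analogue output is configured for
    t_min..t_max over i_min..i_max; the full 0-1000 C span used
    here is illustrative, not a fixed property of the device.
    """
    if not i_min <= current_ma <= i_max:
        # Out-of-band current usually means a wiring fault or
        # a sensor error state, not a valid temperature.
        raise ValueError(f"{current_ma} mA outside {i_min}-{i_max} mA loop")
    return t_min + (current_ma - i_min) * (t_max - t_min) / (i_max - i_min)

# 12 mA is mid-scale, i.e. 500 C on a 0-1000 C range
print(ma_to_temperature(12.0))  # 500.0
```

A reading below 4 mA flags a broken loop, which is one practical reason live-zero (4 mA) scaling is preferred over 0–20 mA.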
The Keller PK-11 Series provides reliable, non-contact kiln temperature monitoring with fast response, IO-Link connectivity and smart diagnostics, while the different PK 11-K001 to K004 systems handle a wide range of kiln layouts. CISCAL ties this hardware into a complete solution by supplying Keller PK-11 pyrometers, supporting correct installation and integration, and providing NATA-accredited, ISO/IEC 17025-compliant calibration for ongoing confidence in every kiln temperature reading.

  • Pressure Gauge Calibration Tips for Accuracy | CISCAL

Learn expert tips for accurate pressure gauge calibration. Ensure compliance, reliability, and safety in Australian industries. Pressure Gauge Calibration Tips for Accuracy Pressure gauge calibration means checking a gauge against a more accurate, traceable reference so you can quantify (and if allowed, adjust) error for safe, compliant operation in Australia. In practice you compare readings with a deadweight tester (piston gauge) or a pressure comparator + digital reference, with results traceable to national standards via NATA-recognised chains to the National Measurement Institute (NMI). Fast how-to: Isolate the gauge, connect it to a reference, apply pressure at defined points up and down, record “as-found” results, adjust if permitted, re-test “as-left”, and report with uncertainty and traceability. Stay compliant and precise with CISCAL’s NATA-accredited pressure gauge calibration services. We cover NSW, VIC, QLD, WA, SA, TAS and NT with fast turnaround and digital certificates. Book your service today. What is Pressure Gauge Calibration? Calibration is a comparison: your pressure gauge ( Bourdon tube gauge, differential, diaphragm, digital gauge or pressure transducer ) is checked against a reference whose performance is known and traceable to national standards. In Australia, traceability requirements are set out by NATA and rely on chains linking to NMI reference standards, typically documented on your certificate. Labs demonstrating competence do so under ISO/IEC 17025. Why it matters: Better calibration means safer plant, fewer deviations, cleaner audits, and less downtime. NMI’s service scope spans high vacuum to 500 MPa with uncertainties as low as 0.0010%, which sets a realistic ceiling for what’s achievable in Australia.
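The "fast how-to" above reduces to simple arithmetic at each test point: error relative to the reference on the up and down runs, plus hysteresis. A minimal sketch of the as-found bookkeeping; the gauge range, readings and Class 1.0 tolerance are illustrative values, not from any standard:

```python
def as_found_errors(points):
    """Compute error and hysteresis at each calibration test point.

    points: list of (applied_ref, reading_up, reading_down) tuples,
    all in the same unit (e.g. kPa). Error convention here is
    reading minus reference; hysteresis is the up/down difference.
    """
    return [{
        "applied": ref,
        "error_up": up - ref,
        "error_down": down - ref,
        "hysteresis": abs(up - down),
    } for ref, up, down in points]

# Illustrative 0/25/50/75/100% FS run on a 0-1000 kPa gauge
run = [(0, 1, 0), (250, 252, 253), (500, 503, 505),
       (750, 754, 756), (1000, 1006, 1006)]
tol = 10.0  # Class 1.0 of 1000 kPa FS = +/-10 kPa (illustrative)
worst = max(max(abs(r["error_up"]), abs(r["error_down"]))
            for r in as_found_errors(run))
print("pass" if worst <= tol else "fail")  # worst error is 6 kPa -> pass
```

Trending these as-found errors from one calibration to the next is the evidence that justifies lengthening or shortening intervals.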
Ensure accuracy and compliance: get your pressure gauges calibrated by CISCAL Australian Standards & Regulatory Context ISO/IEC 17025 & NATA accreditation: Organisations choose NATA-accredited labs so results are recognised and defensible in audits. NATA explains how ISO/IEC 17025 underpins reliable calibration and reporting across industries. Metrological traceability in Australia: NATA’s policy explains how labs must establish and maintain traceability ( usually back to NMI ) and document the chain. WHS angle ( NSW example ): The WHS Regulation ( NSW ) requires pressure equipment to be regularly inspected by a competent person; approved Codes of Practice are a recognised pathway to achieving compliance. AS 1349 ( Bourdon tube pressure & vacuum gauges ): Sets requirements and accuracy classes often called up in utility and water specs ( e.g., Sydney Water ). AS/NZS 3788 ( in-service inspection ): If your gauges sit on pressure vessels/receivers, align your inspection regime with AS/NZS 3788 guidance and your regulator’s expectations (see SafeWork SA). When Should Gauges Be Calibrated? Use a risk-based interval: set periods that reflect criticality, process conditions ( vibration, clean steam/CIP/SIP, temperature cycles ), required accuracy, historical gauge drift, and audit expectations. NATA doesn’t set one fixed interval for all gear; instead, it provides guidance to help facilities justify intervals ( ISO/IEC 17025 expects you to control and verify the equipment you rely on ). Illustrative Examples ( Not Prescriptive ): Critical pharma CIP/SIP line ( Class 0.6 gauge, hot cycles ): 6 to 12 months. Benign utility air header ( Class 1.6 gauge, stable temp ): 12 to 24 months. Portable test gauge used as a reference: Match interval to required uncertainty and usage rate; shorten if drift trends up. WHS law expects plant to be maintained and tested per manufacturer instructions by a competent person; calibrated gauges are part of that control.
Equipment & Reference Standards Reference options: Deadweight tester / piston gauge: Lowest uncertainty; needs local gravity and environmental corrections. Pressure comparator + digital reference gauge: Fast and portable; ideal for onsite rounds. Liquid column/manometer or high-stability reference gauges: Used where appropriate (e.g., low pressures), provided they’re traceable. Guidance documents discuss using manometers and pressure balances as references. Accuracy ratio: Aim for ≥4:1 ( reference uncertainty ≤¼ of the DUT tolerance ). The MSA Test Method 2 makes this explicit for mechanical gauges. Traceability note: Certificates from your reference instruments should show traceability to NMI ( or an equivalent national metrology institute ) and current calibration dates. NMI’s pressure labs cover vacuum to 500 MPa with 0.0010% capability, useful context when selecting references. Setup essentials: Clean fittings, appropriate media ( gas vs oil/water ), leak-free connections, match orientation to service, and allow stabilisation at each point. Calibration Methods (How-To) Method 1: Deadweight Tester (bench, lowest uncertainty) Use when: You need the tightest uncertainty ( e.g., master test gauges, critical ranges ). Principle: Pressure = mass × gravity / effective area of the piston-cylinder; you float the piston and compare. Correct for local gravity, temperature, and other influence factors. Steps ( bench ): Visual & safety checks : Condition, rating, cleanliness; verify media compatibility. Warm-up/stabilise : Control temperature; level the DWT. Mount vertically as in service : Keep the gauge’s dial vertical; ensure proper head height. Select points : At least 0, 25, 50, 75, 100% FS, up and down; add more for Class ≤0.3 gauges. Apply masses & float the piston : Use the screw press to reach the float; hold steady; log the DWT value. Record: “as-found” errors, repeatability and hysteresis; adjust if allowed; repeat “as-left”. 
Corrections & uncertainty: Apply local gravity and any environmental/head corrections; include them in the uncertainty budget. Tip: Suppliers request your local gravity so weights can be adjusted; if not specified, instruments may be set for standard gravity. Method 2: Pressure Comparator + Digital Reference Use when: You need speed and portability ( onsite rounds, multiple ranges ). Principle: DUT and reference are in parallel on a comparator; apply pressure with a hand pump/controller; read the reference as the true value. Steps: Connect DUT and reference gauge to the comparator; match DUT orientation to service. For hydraulic comparators, prime to remove bubbles; for gas, use fine adjust. Step through 0, 25, 50, 75, 100% FS ( up and down ); stabilise at each point. Log corrections ( DUT minus reference ), temperatures and any head height differences. If permitted, adjust, then rerun for as-left data. Acceptance Criteria, Accuracy Classes & Decision Rules Tie acceptance to the accuracy class on the dial ( AS 1349 conventions; typical classes include 0.1, 0.25, 0.6, 1.0, 1.6, 2.5, 4 ). For labs, apply a decision rule consistent with ISO/IEC 17025/ILAC practice; MSA Test Method 2 describes a pragmatic rule for mechanical gauges: a gauge complies if both the correction and the uncertainty are each within the tolerance at all points ( unless your contract specifies another rule ). Where gauges are used for compliance testing, make sure your decision rule is documented and agreed with users/auditors. Errors, Drift & Uncertainty Common contributors: Zero shift & span error ( pointer slippage, movement wear ). Hysteresis & elastic fatigue ( Bourdon tube, diaphragm ). Temperature & media effects ( oil-filled vs dry; gas vs liquid ). Head height and local gravity ( especially for deadweight methods ). Resolution/readability, repeatability, and leaks.
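The deadweight principle quoted in Method 1 (pressure = mass × gravity / effective area) and the local-gravity correction above can be illustrated directly; the gravity figures and piston area here are illustrative assumptions, and temperature correction of the effective area is omitted for brevity:

```python
def deadweight_pressure(mass_kg, local_gravity, effective_area_m2):
    """Pressure generated by floated masses on a piston-cylinder.

    P = m * g_local / A_eff, in pascals. A real uncertainty budget
    must also include temperature correction of the effective area
    and head-height corrections.
    """
    return mass_kg * local_gravity / effective_area_m2

g_standard = 9.80665   # standard gravity, m/s^2
g_local = 9.7968       # assumed local value: measure or obtain yours

# Illustrative: 10 kg floated on a 1 cm^2 (1e-4 m^2) piston
p_std = deadweight_pressure(10.0, g_standard, 1e-4)
p_loc = deadweight_pressure(10.0, g_local, 1e-4)
print(round(p_std - p_loc, 1))  # ~985 Pa difference from gravity alone
```

That difference is roughly 0.1% of the ~0.98 MPa generated, which is why suppliers ask for local gravity: left uncorrected, it can dwarf the uncertainty of a good reference.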
Fluke and DKD guidance list environmental and correction factors ( like local gravity ) as significant influence quantities in the uncertainty budget. Trend your as-found data to refine intervals. Training resources: NMI runs pressure measurement and uncertainty courses that help teams manage error sources and reporting. Documentation: What Your Certificate Must Include Use this checklist to reduce queries during audits: Unique ID, make/model/serial, range/units, accuracy class ( if marked ). Method used ( deadweight tester or comparator ), test points ( up/down ). As-found/as-left results and corrections; any adjustments made. Environmental conditions ( temperature, media ), head height/gravity notes ( if relevant ). Measurement uncertainty ( coverage factor ), and the decision rule applied. Reference standards used ( IDs, calibration dates ). Traceability statement to national standards ( NMI ). NATA accreditation no. and scope ( if applicable ). Safety & Compliance Notes Isolate/depressurise before removing any gauge. Confirm relief and isolation valves function before re-pressurising. Air receivers and pressure vessels need in-service inspection by a competent person; align with AS/NZS 3788 and your state regulator’s guidance. For compressed air systems, see Safe Work Australia’s information sheet; air receivers can explode if neglected. Why this matters: Safe Work Australia’s latest report shows 188 worker fatalities in 2024 ( 1.3 per 100,000 ). Keeping gauges accurate is one small, visible part of a larger plant safety system. Industry-Specific Considerations Pharma/biotech: GMP requires audit trails and clear decision rules; validate ranges for CIP/SIP lines and maintain NATA-traceable evidence. Food & beverage: Hot wash-downs and vibration accelerate gauge drift; use stainless wetted parts and sanitary seals; review intervals after the first year. 
Research & engineering labs: Wide ranges, occasional vacuum work, mixed media; ensure reference capability covers both vacuum and positive pressure; NMI’s scope informs what’s realistic. How CISCAL Helps What you get: NATA-accredited, ISO/IEC 17025 calibration for industrial gauges ( Bourdon tube, diaphragm, differential, digital/test gauges, manometers ) and pressure sensors/transducers. Nationwide coverage ( NSW, VIC, QLD, WA, SA, TAS, NT ) with onsite comparator calibrations and lab deadweight options for tight uncertainties. Digital certificates & asset portal ( searchable history, traceability to NMI, decision rules, uncertainty ). Sample uncertainties by range provided on scope/quote. Fast turnaround and emergency slots. Get precise, NATA-accredited pressure calibration — book with CISCAL today

  • Laser Calibration: When and How to Do It | CISCAL

Learn when and how to perform laser calibration. Ensure compliance, precision, and safety for Australian industries and labs. Laser Calibration: When and How to Do It Use a risk-based interval; typical practice is 6 to 12 months for regulated work, shorter if critical, heavily used, or in harsh environments. A fixed number isn’t mandated by ISO/IEC 17025; intervals must be justified and records kept. What triggers a calibration? On installation/commissioning, after any impact/repair, after major software/firmware changes, when drift is detected, and at your defined interval. Audit-ready results in Australia: Use labs with NATA-endorsed certificates showing SI traceability via Australia’s National Measurement Institute ( NMI ), with measurement uncertainty reported. Learn more about CISCAL services What is Laser Calibration? Laser calibration is a comparison of your instrument’s readings against a more accurate, traceable reference to quantify error and report expanded uncertainty ( 95% confidence ). In Australia, NATA requires metrological traceability to SI units, typically through NMI, and ISO/IEC 17025 sets the competence framework labs are assessed against. Common Categories: Dimensional: Laser interferometry for machine tools/CMM axes; generates compensation tables to correct positioning errors. Radiometric: Laser power/energy meters verified against NMI-traceable standards; checks responsivity and linearity. Spectral: Wavelength checks of lasers/wavelength meters against stabilised references or transfer standards; uncertainty stated in nanometres per the lab’s scope. Beam diagnostics: Beam profile/divergence/M² checks to ensure process or research performance matches spec. Construction lasers ( levels ): Practical level/line checks and, if out, full lab calibration. Compliance in Australia NATA & ISO/IEC 17025: NATA accredits labs to ISO/IEC 17025, providing independent assurance that methods, uncertainty, and traceability are sound.
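The "expanded uncertainty (95% confidence)" that certificates report is typically a combined standard uncertainty (root-sum-square of independent components) multiplied by a coverage factor k ≈ 2. A minimal sketch with made-up component values:

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Combine independent standard-uncertainty components by
    root-sum-square, then apply coverage factor k (k=2 is ~95%
    confidence for a roughly normal distribution)."""
    combined = math.sqrt(sum(u ** 2 for u in components))
    return k * combined

# Illustrative budget for one power-meter point, all in % of reading:
# reference standard, repeatability, display resolution (assumed values)
u_parts = [0.30, 0.40, 0.12]
print(round(expanded_uncertainty(u_parts), 2))  # 1.03 (%)
```

Note the dominant term (repeatability here) controls the result; shrinking a minor component barely moves the combined figure, which is why budgets focus on the largest contributors.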
NATA-endorsed certificates are widely recognised, including via ILAC. Traceability & uncertainty: NATA’s Metrological Traceability Policy explains how results must be linked to national standards ( commonly NMI ) and how uncertainty is established and reported. Laser safety labelling/classification: Follow ARPANSA guidance and the AS/NZS IEC 60825 series ( equipment classification, user guidance ). Workplace controls ( construction ): Safe Work Australia states Class 3B and 4 lasers must not be used for construction work. Use Class 1/1M/1C/2/2M/3R only. Sector drivers: TGA adopts PIC/S GMP for medicines ( calibrated, traceable instruments and records ); FSANZ requires at least one thermometer accurate to ±1 °C in food businesses ( handy for instrument verification in HACCP ). When to Calibrate: By Risk & Use Case Set intervals with evidence. Consider safety/quality risk, usage hours, environment ( heat, vibration ), historical drift, firmware changes, and audit expectations. Document the rationale in your SOP.
Application | Typical triggers | Suggested interval (guide only) | Standard/driver
Machine tools / CMM axes | Commissioning; after crash or ball-screw work; tolerance changes | 6–12 months for production; shorter if tight tolerances | ISO/IEC 17025 conformity; OEM specs; NATA traceability; laser interferometer methods per vendor guidance
Laser power/energy meters | Before validation/R&D campaigns; after sensor replacement/impact | 6–12 months; verify at operating wavelengths and expected ranges | NATA traceability via NMI optical services; lab scopes list ranges/uncertainties
Wavelength meters/spectrometers | Before critical experiments; after firmware/hardware change | ≈12 months for regulated labs; risk-based in research | NMI optical standards; NATA-endorsed certificates show SI traceability and uncertainty
Construction laser levels | After drops/shock; if site check fails | Site check monthly; lab calibration as per contract/spec | Field check per RedBack method; if out, book NATA calibration
How to Calibrate: Procedures and Checklists A. Laser Interferometry (Machine Positioning) What you’re doing: Using a laser interferometer ( or tracker with interferometry ) to measure linear errors, backlash, straightness, squareness, pitch/yaw/roll and then generating axis compensation tables in the controller. Set-up essentials ( checklist ): Stable environment ( temp, air flow ); warm-up machine and optics. Align optical path; use a retroreflector/SMR or plane mirror targets. Log environmentals ( air temp/pressure/humidity ) for refractive index compensation. Verify laser reference status and traceability; check beam quality. Run-through ( summary ): Baseline sweep on each axis ( up/down ) for linear error and reversal. Cross-tests for straightness and squareness. Rotary/axis tests if applicable. Upload compensation tables; re-run for as-left verification; issue uncertainty-backed report. Many Australian shops use systems like Renishaw XL-80 or API trackers; both depend on interferometry with traceable wavelength standards. B. Laser power/energy meters Aim: Compare DUT readings to a NATA-traceable reference at relevant wavelengths and power/energy levels; check linearity and responsivity; report expanded uncertainty ( k≈2 ). Use NMI-traceable standards or transfer artefacts. Steps ( bench ): Inspect sensor head; confirm damage/contamination-free. Stabilise source; set wavelength compensation. Apply points across the working range ( up/down ); hold steady; record as-found. If allowed, adjust cal factors; repeat for as-left; capture ambient conditions and drift notes. Include traceability and uncertainty budget on the certificate. C.
Wavelength ( Lasers/Wavelength Meters ) Aim: Validate wavelength accuracy against stabilised references ( e.g., iodine-stabilised He-Ne or frequency-comb-derived transfer standards ) or accredited transfer standards; verify across your working range; report uncertainty in nm. Use a lab with appropriate scope. Steps: Warm up the DUT; set to nominal lines ( e.g., 632.8 nm ). Compare to reference; note offsets; repeat across range. Report as-found/as-left, stability, and uncertainty with full traceability chain. D. Field Check for Construction Laser Levels ( Quick Site Method ) Use when you need a fast go/no-go on site. 5-step check ( horizontal line ): Set the laser ~10 m from a wall; mark the beam. Rotate 90°; mark again; repeat for 180° and 270°. All marks should align within the maker’s tolerance. If out, don’t “tweak” in the field; book a NATA calibration. After knocks/drops, re-check before use. Laser calibration with CISCAL Documentation Auditors Expect Have these items on every certificate/SOP checklist: NATA-endorsed certificate and scope reference ( ranges and CMCs ). SI traceability statement ( chain to NMI or another national metrology institute via ILAC ). Method ( interferometry, radiometry, spectral ), as-found/as-left data, and environmental conditions. Expanded uncertainty ( coverage factor ) and the decision rule used. Technician and reviewer sign-off; due date/next interval; digital record retention. Safety & legal obligations in AU Laser safety classes: Follow AS/NZS IEC 60825 classification and ARPANSA guidance. Label devices with class, power, wavelength, use signs, and implement controls per class. Construction work: Do not use Class 3B or 4 lasers for construction tasks; they present significant eye/skin hazards and require strict controls. Training: Consider Laser Safety Officer/Supervisor training and consult your state/territory regulator for local requirements.
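The 5-step field check above is a go/no-go comparison of the four marks against the maker's tolerance. A minimal sketch; the tolerance figure and mark heights are illustrative:

```python
def field_check(marks_mm, tolerance_mm):
    """Go/no-go for a rotating-laser site check.

    marks_mm: heights of the beam marks at 0/90/180/270 degrees,
    measured against the same wall at the same distance. If the
    spread exceeds the manufacturer's tolerance, book a lab
    calibration rather than adjusting in the field.
    """
    spread = max(marks_mm) - min(marks_mm)
    return spread <= tolerance_mm, spread

# Illustrative: maker's tolerance of +/-1.5 mm at 10 m -> 3 mm window
ok, spread = field_check([100.0, 101.0, 100.5, 102.0], 3.0)
print(ok, spread)  # True 2.0
```

Logging the spread (not just pass/fail) each month gives the drift trend that justifies your lab calibration interval.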
Sector Call-outs Pharma/biotech: The TGA adopts PIC/S GMP; keep periodicity risk-based and show it in your validation/CAPA trail. Reference NATA-endorsed calibration in your VMP/SOPs. Food & beverage: FSANZ requires at least one probe thermometer accurate to ±1 °C; if you use IR “laser” thermometers for checks, validate against a probe and document. Research & engineering labs: Mixed dimensional/spectral/power work—ensure the lab’s scope actually covers your range and uncertainty needs. NMI optical and length services are the national reference. Choosing a Provider Quick checklist: NATA-accredited for the optical/laser scope you need ( check the lab’s Scope of Accreditation ). Traceability to NMI stated on certificates. Fit-for-purpose uncertainty at your wavelength/power/range. On-site vs lab capability ( e.g., on-site interferometry; lab-grade radiometry ). Turnaround & logistics that suit validation windows. Digital certificates/asset portal for audits. Common Drift Causes & Troubleshooting Heat and air turbulence shifting interferometer paths: control HVAC and allow warm-up. Vibration and transport shock: use isolation mounts; re-check after moves/impacts. Optics contamination: clean lenses/windows per OEM. Fibre connector wear: inspect ferrules; replace worn leads. Detector ageing ( power meters ): trend responsivity over time; adjust intervals if drift grows. Firmware changes: treat as a calibration trigger with as-found/as-left records. Glossary Traceability: An unbroken chain of comparisons to standards, with stated uncertainties, up to SI units ( usually via NMI in Australia ). Expanded uncertainty ( 95% CL ): Reported uncertainty multiplied by a coverage factor, often k≈2, giving ~95% confidence. Responsivity: Ratio of detector output to incident optical power ( e.g., V/W ). Linearity: How constant responsivity is across the operating range. Beam profile: Intensity distribution across the beam cross-section.
Compensation table: Controller file that corrects axis errors at positions. MPE: Maximum Permissible Exposure; a safety concept defined in the laser standards/guides. How CISCAL Helps NATA-accredited, ISO/IEC 17025 calibration for laser interferometers, laser power/energy meters, wavelength meters/spectrometers, construction laser levels, and optical instruments. Nationwide support ( NSW, VIC, QLD, WA, SA, TAS, NT ), onsite and lab options. Advanced optical tools and SI traceability via NMI; digital certificates with uncertainty and decision rules. Fast turnaround aligned to qualification/validation windows.

  • Common Errors in Gas Detector Calibration | CISCAL

Discover the most common errors in gas detector calibration and how they impact safety and compliance in Australian industries. Common Errors in Gas Detector Calibration Gas detector calibration is the process of adjusting a detector’s readings by comparing them to a more accurate, traceable reference gas so results are trustworthy for safety and compliance. In Australia, auditors expect ISO/IEC 17025 traceability on certificates from NATA-accredited providers. What is Gas Detector Calibration? Calibrating a gas detector sets its zero and span so it reads the right value when exposed to a known gas. In Australia that means using the correct gas, flow and procedure, recording results, and keeping NATA-endorsed evidence that’s traceable to national standards. Many sites require a daily bump test and monthly-to-quarterly calibrations, driven by risk and manufacturer instructions. Why Calibration Matters in Australia WHS laws place a duty on PCBUs to manage risks. Approved Codes of Practice ( confined spaces; hazardous chemicals ) outline practical testing steps and sampling methods; following them is an accepted way to meet the Regulations. Confined-space mis-testing can be fatal. Safe Work Australia’s 2025 release confirms 188 traumatic injury fatalities in 2024; regulators scrutinise plant safety records and evidence of control. The Most Common Calibration Errors Below are the failure modes we see across labs, plants and field teams in Australia, and how to fix them. 1) Skipping bump tests before use Symptom: Detector “works on paper” but doesn’t alarm to gas on shift. Cause: No functional challenge before entry. Fix: Enforce a pre-use bump test ( or before each shift ). Use docking stations to automate tests, logs and certificates. 2) Using expired or incorrect calibration gas Symptom: Readings drift after “successful” calibration. Cause: Expired cylinders; wrong balance gas ( air vs N₂ ) or wrong concentration.
Reactive mixes ( e.g., H₂S, Cl₂ ) can change over time. Fix: Track expiry and lot; match gas matrix and set-point to the sensor spec; store cylinders correctly. 3) Wrong flow rate or regulator type Symptom: Slow or unstable response; calibration won’t settle. Cause: Using a fixed-flow regulator on a pumped instrument ( or vice-versa ), or using the wrong flow. Fix: For pumped instruments use a demand-flow regulator; for diffusion instruments, a fixed-flow set to the manufacturer-specified rate. Verify flow with a calibrator. 4) Calibrating in unsuitable environments Symptom: Results vary between benches or between days. Cause: Wind, heat, humidity; silicone/solvent vapours; nearby sources of contaminants. Fix: Calibrate in a clean, ventilated spot; allow temperature stabilisation; keep silicones, solvents and aerosols away; these can poison catalytic LEL sensors. 5) Poor zero/span procedure Symptom: Zero offsets; overshoot; inconsistent span. Cause: No warm-up; skipping fresh-air zero; not waiting for stable readings. Fix: Standardise the SOP: warm-up, fresh-air zero, span at the correct flow until stable, document acceptance ranges. Docking systems help make steps repeatable. 6) Confusing %LEL with ppm Symptom: Wrong alarm set-points on multi-gas units; flammables checked in ppm charts, toxics in %LEL by mistake. Cause: Unit mix-ups: %LEL is for flammability; ppm is typical for toxic gases. Fix: Put the units on the work instruction and validate alarms after calibration. 7) Ignoring cross-sensitivity & sensor poisoning Symptom: CO alarms in battery rooms; PID VOC readings in solvent-rich air are “too high.” Cause: Non-target gases affect the sensor ( e.g., H₂ interferes with CO ); silicones/lead/sulphur can poison pellistors. Fix: Check the maker’s cross-sensitivity tables; choose filtered or H₂-compensated sensors; verify with the correct target gas.
8) Not updating intervals after sensor replacement or harsh exposure Symptom: Fresh sensor drifts early; over-range exposure followed by quiet failures. Cause: Intervals remain “business as usual” after a change-out, shock, poisoning or over-range event. Fix: Shorten intervals temporarily and re-establish stability; record the trigger in the asset system. 9) Inadequate record-keeping & traceability Symptom: Audit failures; “no evidence” of calibration, gas lot, or uncertainty. Cause: Paper logbooks only; no NATA-endorsed reports; missing gas details. Fix: Keep certificates with as-found/as-left, uncertainty, reference IDs and NATA traceability; use docking stations/portals for automated logs. 10) Relying on non-accredited providers Symptom: Certificates rejected by clients or regulators. Cause: Results not issued under ISO/IEC 17025; no evidence of SI traceability. Fix: Use a NATA-accredited lab; check the scope and the endorsement. 11) No confined-space pre-entry test plan Symptom: Atmosphere tested only at head height or after entry. Cause: No plan for remote sampling, top-middle-bottom stratification, and continuous monitoring. Fix: Follow the Model Code of Practice Confined Spaces and your jurisdiction’s code. Consequences of Calibration Errors Safety: False negatives or false positives in confined spaces expose workers to toxic or flammable atmospheres. The Confined Spaces Code sets expectations for testing and monitoring methods. Compliance: Poor practice can trigger improvement notices or stop-work orders; inspectors expect traceable, competent calibrations and pre-entry testing. Operations: Downtime and rework add cost. National WHS statistics underline the scale of harm and the scrutiny on plant-related risks. Australian Standards, Codes & Frequency Guidance AS/NZS 60079.29.2: selection, installation, use and maintenance of flammable-gas and oxygen detectors ( your go-to maintenance reference ).
AS/NZS 60079.29.1: performance requirements for flammable-gas detectors ( equipment performance ). Model Code of Practice Confined Spaces: test from outside, sample different levels, and keep monitoring while occupied ( updated Nov 2024 ). Managing risks of hazardous chemicals: approved code that explains how codes support WHS duties. Frequency: standards and codes set methods and duties, not fixed intervals. Intervals are risk-based and guided by manufacturer instructions and use conditions. How to get Calibration Right ( Step-by-step ) Check environment & gas: Choose a clean area; confirm cylinder concentration, balance gas, lot & expiry; consider temperature. Inspect the instrument: Battery, filters, sample lines, pump ( if fitted ). Fresh-air zero & warm-up: Stabilise, then zero in clean air. Bump test: Challenge all sensors; confirm alarms/response time before calibration. Apply span gas correctly: Use the right regulator type/flow; wait for a stable span at each point. Save results: Store as-found/as-left, uncertainty, gas lot/expiry, and technician ID on a NATA-endorsed certificate ( or in your dock ). Verify: Run a post-cal bump or check. Set next due date: Base it on risk, use and any recent events ( over-range, shock, replacement ). Get your gas calibration done right! Book with CISCAL today Choosing a NATA-accredited Provider Look for: a NATA scope covering gas detectors; ISO/IEC 17025 endorsement on reports; uncertainties on the certificate; practical turnaround; onsite vs lab options; digital records/portal. CISCAL holds continuous ISO/IEC 17025 accreditation through NATA ( Acc. No. 411 ), covers NSW/VIC/QLD with reach across Australia and the Pacific, and provides the SMART Portal for real-time job tracking and asset/certificate management. Book NATA-accredited gas calibration with CISCAL
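Two of the numeric traps above — the %LEL/ppm mix-up and span acceptance — can be made concrete. A sketch assuming methane's LEL is taken as 5.0 % volume (some references use 4.4 %) and an illustrative ±10% acceptance band, neither of which comes from a standard:

```python
METHANE_LEL_VOL_PCT = 5.0  # assumed figure; some references use 4.4 % vol

def lel_pct_to_ppm(lel_pct, lel_vol_pct=METHANE_LEL_VOL_PCT):
    """Convert a %LEL reading to ppm by volume.
    100 %LEL of a 5 %vol gas = 50,000 ppm, not 100 ppm."""
    return lel_pct / 100.0 * lel_vol_pct / 100.0 * 1_000_000

def span_ok(reading, cylinder_conc, band_pct=10.0):
    """Accept a span check if the reading is within band_pct of the
    certified cylinder concentration (the band is illustrative --
    use your SOP's acceptance criterion)."""
    return abs(reading - cylinder_conc) <= cylinder_conc * band_pct / 100.0

print(lel_pct_to_ppm(50))   # 25000.0 ppm -- a "50 %LEL" alarm is not 50 ppm
print(span_ok(47.0, 50.0))  # True: within 10% of a 50 %LEL cylinder
```

Putting both the unit and the conversion on the work instruction, as the fix for error 6 suggests, removes the most common alarm set-point mistake.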

  • How Torque Wrench Calibration Is Done | CISCAL

Learn how torque wrench calibration supports compliance. Step-by-step guide tailored for Australian labs, pharma, and food industries. How Torque Wrench Calibration Is Done Calibration sets a torque wrench’s indicated value against a more accurate reference standard and reports the measurement uncertainty; verification is a quicker in-house check between calibrations. In Australia, choose NATA-accredited labs working to ISO/IEC 17025, with SI traceability (typically via the National Measurement Institute, NMI). The calibration method is defined in ISO 6789-2:2017; design/conformance requirements live in ISO 6789-1:2017. Standards That Apply in Australia ISO 6789-1:2017 covers design & quality conformance (Type I indicating, Type II setting tools). ISO 6789-2:2017 sets the calibration method and how to calculate measurement uncertainty ( the lab’s certificate should reference this ). Australia’s former AS 4115 was withdrawn ( Oct 2016 ); calibration now follows ISO 6789-2. ISO/IEC 17025 via NATA: auditors expect NATA-endorsed certificates with traceability; NATA’s Metrological Traceability Policy explains how labs demonstrate SI links. You can search NATA’s directory for torque scopes. Terminology tip: ISO 6789 uses “maximum permissible ( relative ) deviation” ( MPD ) instead of a loose “accuracy” label. How Often to Calibrate Principle: set intervals by risk and usage ( criticality, environment, transport, history ). ISO 6789-2 itself suggests 12 months or 5,000 cycles ( whichever comes first ) if you don’t run your own control procedure; then adapt based on successive results. Industry guidance ( OEM ): Norbar ( AU ) commonly advises every 12 months, with shorter intervals for heavy use/critical tasks; 5,000 cycles is widely cited as a default.
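The "12 months or 5,000 cycles, whichever comes first" default above can be encoded as a simple due check. A sketch; treat the interval and cycle limit as placeholders for your own documented QMS rationale:

```python
from datetime import date, timedelta

def calibration_due(last_cal, cycles_since_cal,
                    interval_days=365, cycle_limit=5000, today=None):
    """True if the wrench is due: past the time interval OR past the
    cycle limit, whichever trips first. The 365-day / 5,000-cycle
    defaults mirror ISO 6789-2's suggestion for labs with no in-house
    control procedure; adapt them from your own as-found history.
    """
    today = today or date.today()
    overdue_time = today >= last_cal + timedelta(days=interval_days)
    overdue_cycles = cycles_since_cal >= cycle_limit
    return overdue_time or overdue_cycles

# Heavy use trips the cycle limit well before the 12 months are up
print(calibration_due(date(2025, 1, 10), 5200, today=date(2025, 6, 1)))  # True
print(calibration_due(date(2025, 1, 10), 800, today=date(2025, 6, 1)))   # False
```

An after-shock or overload event should bypass this check entirely and send the tool for immediate calibration, as the mini-table that follows notes.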
Decision Mini-table ( Illustrative, not Prescriptive ): Situation Suggested interval Critical process / high use / harsh environment 6 months or ≤5,000 cycles Routine production / moderate use 12 months After shock, overload, transport damage Immediately, then shorten temporarily ( Record the rationale in your QMS; ISO doesn’t mandate a single number. ) Equipment Used Torque tester / transducer with known uncertainty ( calibrated and traceable ). Under ISO 6789, the measurement device uncertainty must be suitably small relative to the tool’s expected uncertainty ( often expressed as ≤¼ of the tool’s expected uncertainty/MPD ). Loader/arm & fixtures to apply torque horizontally and support the wrench at the handle load point; good systems minimise parasitic forces ( e.g., floating supports/counter-balance ). Adaptors to align square/hex drives; environmental control ( temperature, etc. ) and a data system to compute uncertainty per ISO 6789-2. Step-by-step: How Torque Wrench Calibration is Done ( ISO 6789-2 ) The steps below reflect ISO 6789-2:2017 concepts used by accredited labs. Your certificate should list method, as-found/as-left, uncertainty, traceability, and equipment IDs. Pre-checks Identify tool type ( Type I indicating vs Type II setting ) and inspect ratchet/drive, scale and handle. Record tool ID. Exercise the wrench Operate the wrench several times near the target value to settle components ( per lab procedure ). Set-up Mount the wrench horizontally; align at the handle load point; use correct adaptors; minimise side loads; record ambient conditions. Select test points Calibrate from the lowest marked value to the top of range; many labs test at minimum, ~60%, and 100% of the specified range, in each direction if applicable. ( ISO 6789-2 requires coverage down to the lowest marked value.
) Apply load at the correct rate For Type II (setting) tools, increase smoothly to ~80%, then reach the target within a short, controlled window ( commonly 0.5 to 4 s from 80% to target; refer to the ISO tables by range ). This avoids overshoot and improves repeatability. Repeat readings Take repeated applications per point ( per ISO class ), capturing indicated vs reference values. Compute error & uncertainty ISO 6789-2 defines how to calculate relative measurement error and expanded uncertainty for the tool and to confirm the measurement device is suitable ( its uncertainty interval ≤¼ of the tool’s expected uncertainty interval ). Adjust ( Type II ) & re-test If the tool is adjustable and out of tolerance, adjust and repeat the points to produce as-left results. Issue certificate Include as-found/as-left, uncertainty, method = ISO 6789-2:2017, ambient conditions, equipment IDs, traceability ( NMI/ILAC chain ), technician sign-off, and next due date ( your risk-based choice ). Pass/Fail Criteria & Accuracy MPD ( maximum permissible relative deviation ) is the ISO term; tools must meet the MPD for their type/class. ( Manufacturers may specify tighter. ) In practice, many hand wrenches work to ±4% or ±6% classes ( depending on type/class and torque level ). Use the tool datasheet and your quality procedure to select the rule. Worked Example ( Illustrative ): Target = 100 N·m; average indicated = 96.0 N·m; relative error = ( 96.0−100 )/100 = −4.0%. Expanded uncertainty ( k≈2 ) on the tool at this point = ±1.2%. Decision rule ( per ISO/IEC 17025 QMS ): if MPD = ±4%, this result just meets the limit at the point estimate; if your lab applies guard banding, uncertainty may influence the pass decision. ( Your certificate should state the decision rule used. ) Compliance in Regulated Industries (Australia) Pharma ( TGA / PIC/S GMP ): Calibrated, qualified equipment with records is expected under the PIC/S Guide to GMP adopted by the TGA.
( TGA currently references the PIC/S Guide; version updates are in progress with transition communications. ) Food & beverage ( FSANZ ): Food safety standards require reliable measurements under documented controls; calibrated devices support HACCP and verification of critical fasteners on processing equipment. Maintenance Tips That Extend Calibration Stability Store at minimum load; avoid shock and over-range. Handle at the marked centre of the handle; don’t use extensions that aren’t accounted for in the calibration. User verification between lab calibrations using a torque checker helps spot drift early (not a substitute for a full ISO 6789-2 calibration). Transport in a padded case; record cycles to refine intervals. Choosing a Provider (What to Look for) NATA accreditation for torque under ISO/IEC 17025 (check the Scope of Accreditation for ranges & CMCs). Certificates showing ISO 6789-2 method, uncertainty, and SI traceability (via NMI or another ILAC-recognised NMI). Turnaround & logistics, on-site options, and digital record access. CISCAL proof points: NATA Acc. No. 411; torque scope 1.25–1,500 N·m (CMC ±1.2%), multi-state presence, operating since 1963, and the SMART portal for real-time certificates and asset tracking. FAQs Previous Next
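The worked example's arithmetic (relative error plus a guard-banded decision) can be sketched as follows. The guard-band rule shown here, shrinking the acceptance limit by the expanded uncertainty, is just one common ISO/IEC 17025 decision rule, and all names are illustrative; your lab's certificate should state the rule it actually applies.

```python
# Relative error and a simple guard-banded pass/fail decision for a torque
# wrench reading. Guard banding (acceptance limit reduced by the expanded
# uncertainty U) is one common decision rule; labs may use others.

def relative_error_pct(indicated, target):
    return (indicated - target) / target * 100.0

def decide(indicated, target, mpd_pct, u_exp_pct, guard_band=False):
    err = relative_error_pct(indicated, target)
    limit = mpd_pct - u_exp_pct if guard_band else mpd_pct
    return {"error_pct": round(err, 2), "pass": abs(err) <= limit}

# Article's example: target 100 N·m, indicated 96.0 N·m, MPD ±4%, U ±1.2% (k≈2).
simple = decide(96.0, 100.0, 4.0, 1.2)                    # passes at the point estimate
guarded = decide(96.0, 100.0, 4.0, 1.2, guard_band=True)  # fails once guard-banded
```

The contrast between the two calls is the point: a reading exactly at the MPD passes under a simple acceptance rule but fails once the ±1.2% uncertainty is guard-banded off the limit.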

  • Understanding UV Meter Readings Easily | CISCAL

Learn how UV meter readings work, plus calibration and compliance in Australian labs and industries. A simple guide for accuracy and safety. < Back Understanding UV Meter Readings UV Index vs irradiance UV Index ( UVI ) is a skin-effect-weighted scale for solar UV. It’s what ARPANSA and Cancer Council publish for cities and what SunSmart messaging uses. ≥3 UVI generally means “use protection.” Irradiance is physical power per area ( e.g., W/m², mW/cm², µW/cm² ). Lab meters for disinfection and validation report irradiance (by spectral band) rather than UVI. Dose is the time-integral of irradiance. Spectral bands UVA: 315–400 nm ( passes through much window glass to varying degrees ). UVB: 280–315 nm ( main erythema driver; most glass blocks it ). UVC: 100–280 nm ( germicidal; common disinfection lamps around 254 nm ). Tip: Don’t “convert” a UVI reading into a UVC dose. UVI is a solar, skin-weighted index; UVC meters use different filters and response curves. Keep like-with-like. UV Index at a Glance (Australia) Low: 1–2 Moderate: 3–5 High: 6–7 Very high: 8–10 Extreme: 11+ SunSmart threshold: take sun protection when UVI ≥ 3. Step-by-step: Taking Accurate UV Meter Readings Outdoor UV Index Reading ( Quick Field Method ) Hold vertically: sensor up, at arm’s length; avoid your body shading the sensor. Stand in unshaded sun: press/hold to read; repeat within a few minutes because clouds change UVI quickly. Try comparisons you can replicate: full sun vs under a shade sail, and inside by a window ( see “typical numbers” below ). ARPANSA provides a user guide and real-time UVI charts for Australian cities. Indoor/UVC Disinfection Reading ( Lab and Field ) Use a UVC-specific probe with the right spectral response; verify the meter’s range will not saturate. Set distance/angle per your method; log exposure time to get dose. Never look at UVC sources; follow guarding/PPE per your lab policy. ( See UV safety below.
) Recording & QA Log instrument ID, serial, calibration date, ambient conditions, distance, time, and location. Cross-check with ARPANSA’s UVI network when you’re outdoors. Interpreting Readings: Intensity vs Dose Intensity ( irradiance ) is instantaneous. Dose ( energy per area ) accumulates: Dose = Intensity × Time. For solar exposure, ARPANSA reports dose in SED ( Standard Erythema Dose ). As a rule-of-thumb, ~1 SED/day is often cited as a safe planning reference for most people, noting skin type matters. For outdoor work, UVI ≥ 3 triggers protection. Typical Numbers in Australia ( Sanity Check Benchmarks ) Under a good shade sail: UVI typically lower than full sun ( substantial reduction ), but not zero. Under dense canvas shade: often lower again than shade sails. Under a leafy tree: UVI commonly reduced, yet scattered light means a measurable UVI can persist. Through clear window glass: UVA can pass, so UVI may still be measurable indoors ( varies by glass/film ). Through car windscreens: laminated windscreens block almost all UVB and ~98% of UVA, so UVI is near zero; side windows ( tempered glass ) allow more UVA unless laminated/film-treated. Your readings will vary with cloud, haze, altitude, and glazing. Calibration & Traceability (NATA / ISO/IEC 17025) Why calibrate? UV meters drift with detector ageing, filter changes, temperature, and spectral mismatch; without calibration you can over- or under-estimate dose. Australia has NATA-accredited optical/photometric labs and NMI services for high-accuracy optical metrology. Local Options & Notes ( Examples ): LightLab International runs a NATA-accredited photometric lab ( incl. UVA detectors/photometric meters ). Kingfisher International operates a NATA-accredited optical calibration lab; check the NATA listing for scope. UVC meter kits supplied in AU often come with ISO/IEC 17025-accredited, traceable calibration.
Weathering testers (QUV/Q-SUN): use the specified calibration radiometer ( CR10/UC10 for fluorescent UV, CR20/UC20 for xenon ). Do not cross-calibrate with generic meters due to spectral mismatch. Maintenance Cadence ( Practical ): follow your QMS and the manufacturer; many labs choose annual checks for portable UV meters and create a certificate library for audits. Include uncertainty and traceability statements. Compliance Context (Australia) AS/NZS 2243.5 ( Safety in laboratories, Non-ionising radiations ): the lab safety reference that covers UV. ARPANSA: provides regulatory guides for UV sources ( e.g., when a device is “controlled apparatus” ) and the occupational UV exposure standard ( RPS 12 ). ARPANSA/NMI: run the national UV monitoring and dose services for Australian cities ( useful for QA checks and training ). Always align with your site WHS risk assessment and procedures (e.g., engineering controls, PPE, and exposure time limits). Troubleshooting Bad Readings Saturation/clipping ( the number doesn’t increase near a very bright source ): You’re at/above the range; pick a probe with the right dynamic range. Wrong band ( UVA probe on a UVC lamp ): Expect under-reads or nonsense. Use the correct detector. Geometry errors ( angle/distance/shading ): Standardise fixtures/jigs; use cosine-corrected probes for wide-field measurements. Dirty/damaged sensor window: Clean per OEM; re-verify on a stable source. Ambient effects ( temperature/humidity ): Note in the log; allow the sensor to stabilise. Comparisons disagree with the network: Check time and cloud; ARPANSA’s city charts shift minute-by-minute. UV Safety Quick Guide ( Workplaces & Labs ) Outdoors: Protect when UVI ≥ 3 ( hat, long sleeves, shade, sunglasses, SPF ). Use local UVI forecasts or ARPANSA’s real-time charts. Indoors with UVC: Never stare at UVC sources; fit interlocks/guards, post signage, and control exposure time per your lab SOP and safety training.
RPS 12 is the AU exposure standard for occupational UV. Choosing the Right UV Meter Match the instrument to the job: Band & purpose: UVI for solar exposure; UVC radiometer ( ~254 nm or specified LED band ) for disinfection. Dynamic range & saturation: Ensure the meter won’t clip at your brightest point. Cosine correction & geometry: For wide-field or off-axis work, use devices with cosine response. Data logging & QA: Prefer meters with timestamped logs for audits. Calibration support: Choose suppliers with ISO/IEC 17025 calibration and NATA-recognised pathways in Australia. Measure light right, choose and calibrate your UV meter with CISCAL How CISCAL Helps (Calibration & Validation) NATA-accredited calibration for UV meters and optical instruments with ISO/IEC 17025-traceable results (SI-linked via national standards). Pickup/onsite options, fast turnaround, and digital certificates with uncertainty, as-found/as-left, and asset ID. SMART portal for reminders, history and downloadable certs across sites and states. Book NATA-accredited calibration with CISCAL FAQs Previous Next
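The Dose = Intensity × Time relationship described earlier is easy to apply to a logged UVC measurement. A minimal sketch; the units, function name and sample log are illustrative, and any required dose target must come from your validated method, not from this example.

```python
# Stepwise dose accumulation from an irradiance log: each entry is
# (irradiance in mW/cm², duration in seconds), giving dose in mJ/cm².

def uv_dose_mj_per_cm2(readings):
    return sum(irradiance * seconds for irradiance, seconds in readings)

# Three logged 1-minute intervals around 0.5 mW/cm² accumulate ~90 mJ/cm².
log = [(0.5, 60), (0.48, 60), (0.52, 60)]
dose = uv_dose_mj_per_cm2(log)
```

Because dose accumulates over time, a lamp whose irradiance has halved through ageing needs twice the exposure time for the same dose, which is exactly why calibrated irradiance readings matter.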

  • Humidity meter: monitor indoor air the smart way | CISCAL

Learn how a humidity meter (hygrometer) keeps indoor air healthy, prevents mould and improves comfort. Tips, placement, ranges and calibration for Australia. < Back How a Humidity Meter Helps Monitor Indoor Air A humidity meter (hygrometer) measures indoor relative humidity (RH), allowing you to maintain it within a healthy range of about 30 to 50% RH for most homes and workplaces, which helps reduce mould, dust mites, and discomfort. Place meters in a representative spot (not by windows, vents or steam) and check them regularly. For dependable records, get meters calibrated and keep certificates traceable to Australian standards. In Australia, managing indoor air is particularly important given the mix of humid coastal climates and dry inland conditions. Reports from the CSIRO indicate that excessive humidity can lead to structural issues in homes, while low humidity during the winter months often exacerbates respiratory illnesses. Why Indoor Humidity Matters in Australia Too-high RH supports mould and dust mites, which trigger asthma and allergies. NSW Health advises addressing moisture sources and ventilation to prevent mould growth and protect health. At the same time, newer, more airtight homes in Australia are more comfortable and energy-efficient. Still, they require adequate ventilation to control condensation and minimise the risk of mould. Monitoring RH helps you spot problems early. Healthy Indoor Humidity Range For most dwellings and offices, aim for 30 to 50% RH ( many people find 30 to 60% still comfortable ). In specialist spaces like archives, museums or regulated labs, follow your SOP or standard. Space Target RH Living areas / general offices 30–50% RH Archives, collections, labs Per SOP/standard Australian workplace guidance also recommends keeping humidity between 30 and 50% where possible. Types of Humidity Meters Analogue hygrometers ( hair/coil ): simple, low-cost; slower response; need regular checks.
Digital thermo-hygrometers: quick, readable; often ±2–5 %RH accuracy; many include min/max, alarms and dew point. Data loggers: record RH and temperature over time for compliance and diagnostics. Smart/Wi-Fi meters: push alerts to apps; handy for remote sites and homes. Psychrometers ( wet-bulb/dry-bulb ): classic HVAC method; good for cross-checks and challenging environments. HVAC/transmitter probes: fixed installations for building control and large facilities. For selection and use in Australian homes and facilities, RS Australia’s guide covers features, maintenance and regular calibration. Where to Place Your Meter Put it at head height in a representative location with free airflow. Keep away from windows, direct sun, kitchens, bathrooms, heaters and supply vents. Avoid corners, exterior walls and damp micro-climates unless that’s what you’re investigating. Use one per level/zone, plus extras for problem rooms. For fixed sensors, ensure unobstructed airflow and periodic verification. How to Use and Read a Humidity Meter Unbox & power: switch the meter on; select °C and %RH. Place: put it in your chosen spot and allow 15 to 30 minutes to stabilise. Log a baseline: note RH/temperature, time, and location. Check at key times: ( morning/evening; before/after showers or cooking ) to see patterns. Set alerts: below 30% ( too dry ) and above 50% ( start managing moisture ) in homes and many workplaces. Act on readings: ventilate, use extraction, reduce indoor moisture generation; consider a dehumidifier if RH stays high. Record weekly: RH trend, actions taken ( e.g., increased ventilation ), and any issues ( condensation, odours ). Re-site or add meters: for large floors or where readings vary widely; book annual calibration if you rely on the data for maintenance or audits. Tip for teams: store readings in a simple spreadsheet or your facilities platform so trends are easy to spot and share.
Preventing Condensation & Mould Ventilate wet areas ( showers, laundries, kitchens ) with ducted exhaust to outside; maintain flow rates. Control sources: use lids when cooking; vent clothes dryers; fix leaks quickly. Insulate: cold surfaces or thermal bridges to reduce condensation. Dry out: after rain events; open windows when outdoor air is dry; use heating + ventilation to speed drying. Use dehumidifiers: when RH remains high. These actions align with NSW Health advice and the ABCB’s Condensation in Buildings handbook. Troubleshooting Readings Sudden spikes near showers or kettles: placement issue; move the meter or add a second unit. Sensor lag ( slow response ): allow stabilisation time; check filters/vents. Wrong room “story”: take a one-week log in multiple locations, then refine placement. Meter vs dehumidifier disagreement: built-in humidity sensors read locally and can be off; use an independent meter and validate after moving units or changing filters. Unstable readings: check for drafts, direct sun, or proximity to vents; consider a small stand or wall mount. Unusual swings with temperature: remember RH is temperature-dependent; dew point stays constant while RH shifts as air warms/cools. Accuracy, Calibration & Documentation Most quality digital meters specify ±2 to 5 %RH accuracy. Sensors drift with age, contamination and harsh conditions. For trusted results, especially in audits or multi-site programs, follow manufacturer instructions and calibrate regularly, keeping certificates and traceability statements. NATA’s metrological traceability policy explains how results should link to SI units ( ISO/IEC 17025 ), typically through standards maintained by Australia’s National Measurement Institute ( NMI ). Practical tips for consumer and facility meters: clean sensors, update firmware ( if applicable ), and schedule annual checks. Need defensible records for QA?
Use a NATA-accredited lab for calibration and store certificates alongside your maintenance logs. Sector-specific Notes Healthcare & sterile stock: monitor RH per hospital policy; escalate excursions ( e.g., sterile stores, theatres ) and document corrective actions. Workplaces: Queensland WHS guidance recommends RH around 30 to 50%, with moisture and mould managed via maintenance and ventilation. Homes after floods: dry quickly, remove water-damaged porous materials, and ventilate; check RH frequently during recovery. Choosing a Humidity Meter Accuracy & range: look for specs that meet your use ( e.g., ±2 %RH for QA; wider is fine for home awareness ). Response time & display: faster sensors help with real-time decisions; ensure clear units and alarms. Data logging & connectivity: onboard memory, Wi-Fi/app alerts, and export features simplify compliance and team workflows. Calibration access: confirm you can obtain ISO/IEC 17025 ( NATA-endorsed ) certificates. Environment: operating temp/RH limits; suitable housings, wall/desk mounts, or probes for ducts/cabinets. Spot checks: a psychrometer is handy for validation and HVAC commissioning. Previous Next
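The temperature dependence noted in the troubleshooting tips (RH shifts as air warms or cools while dew point stays put) can be illustrated with the Magnus approximation. The constants below are the commonly used Magnus-Tetens values and the whole function is a rough sketch for ordinary indoor conditions, not a calibration reference.

```python
import math

# Dew point from temperature and relative humidity via the Magnus
# approximation (roughly +/-0.4 °C over typical indoor ranges).

def dew_point_c(temp_c, rh_pct, a=17.62, b=243.12):
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Air at 25 °C and 50 %RH has a dew point near 14 °C: if that same air
# meets a window colder than that, condensation forms with no added moisture.
indoor_dew_point = dew_point_c(25.0, 50.0)
```

This is why a room can read a comfortable 50 %RH by the meter yet still stream condensation on single-glazed windows on a cold morning.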

  • Sound Meter Basics: What You Should Know | CISCAL

Learn how sound meter calibration ensures compliance, safety, and reliable operation in Australian industries. < Back Sound Meter Basics: What You Should Know What is a Sound Meter? A sound ( noise ) level meter measures sound pressure level in decibels ( dB ) using frequency weightings ( A/C/Z ) and time weightings ( Fast/Slow ). It’s used for workplace health and safety checks, environmental licence compliance and lab work. In Australia, the exposure standards are LAeq,8h 85 dB(A) and LC,peak 140 dB(C). How a Sound Meter Works A condenser microphone converts air pressure into voltage, a pre-amplifier and analogue-to-digital converter ( ADC ) digitise it, and onboard DSP computes descriptors like LAeq, LAFmax, and statistical levels ( LAF10/LAF90 ). Many meters add octave/third-octave analysis and data logging. Meters are built to Class 1 (higher precision) or Class 2 tolerances; Class 1 is the usual choice for compliance and environmental monitoring in NSW. Explore how sound meters ensure compliance Australian Standards & Regulatory Context Work health & safety exposure standards ( WHS ): LAeq,8h 85 dB(A) and LC,peak 140 dB(C). The Code shows dose logic ( e.g., 88 dB(A) for 4 h ≈ 85 dB(A) for 8 h ). Environmental compliance ( NSW ): NSW EPA Approved Methods ( 2022 ) require a Class 1 sound level meter conforming to AS/NZS IEC 61672.1:2019. The acoustic calibrator must comply with IEC 60942:2017 and be the same class as the meter. Policy framework: The Noise Policy for Industry ( 2017 ) is the key guideline for industrial assessments in NSW. Class 1 vs Class 2: Which Do You Need? Class 1: tighter tolerance; required for NSW EPA licence/consent compliance and most formal environmental surveys. Class 2: suitable for internal screening or preliminary OH&S checks; not acceptable for NSW EPA compliance submissions. Core Measurements & Metrics LAeq,T – time-averaged A-weighted level over period T. LAFmax – highest A-weighted level with Fast time weighting.
LAF90 / LAF10 – A-weighted levels exceeded for 90%/10% of T ( background vs. “loud” events ). Where a licence doesn’t specify descriptors, the Approved Methods require at least LAeq,T, LAFmax, LAF90, LAF10 with 15-minute Fast as the default for statistical descriptors. Quick lab example: If a packaging line returns LAeq,15min 82 dB(A), LAF90 78 dB(A) and LAFmax 93 dB(A), you’re seeing steady background around 78 dB(A) with intermittent peaks from events like capping. Calibration & Field Checks ( What Good Practice Looks Like ) Field checks: Perform an acoustic calibrator check immediately before and after measurements. If the post-check differs by > 1.0 dB from the pre-check, disregard the intervening measurements and repeat. Class-matched calibrator: Use a calibrator that meets IEC 60942:2017 and is the same class as your meter ( Class 1 with Class 1 ). Traceable lab calibration: The reference sound source ( and other relevant instrumentation ) must be calibrated by a NATA-accredited facility at least once every two years for environmental work. Keep certificates. Metrological traceability: Follow NATA’s Metrological Traceability Policy under ISO/IEC 17025; certificates should state uncertainty and traceability. Choosing a Sound Meter for Australian Use Checklist Class: Class 1 ( AS/NZS IEC 61672.1:2019 ) for EPA/consent/licence work. Frequency & dynamic range: Cover your sources ( low-frequency plant noise to impulsive peaks ). Logging & descriptors: LAeq, LAFmax, LAF10/90; 1/3-octave option for tones/low-frequency checks. Accessories: Class-matched IEC 60942:2017 calibrator, large windshield, tripod, weather kit, GPS/time-sync, and reporting software aligned to the Approved Methods. Compliance note: If measuring for NSW EPA licence compliance, confirm your meter and calibrator match the standard editions explicitly named in the Approved Methods.
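Two of the checks above lend themselves to a quick script: energy-averaging equal-duration LAeq intervals into a longer LAeq,T, and the pre/post calibrator comparison (discard the run if the checks differ by more than 1.0 dB). A minimal sketch; the function names and example levels are illustrative only.

```python
import math

# Energy-average equal-duration LAeq values. Decibels are logarithmic, so
# a plain arithmetic mean understates the combined level.

def laeq_combine(levels_db):
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

def field_check_ok(pre_db, post_db, limit_db=1.0):
    """Pre/post calibrator rule: >1.0 dB drift invalidates the measurements."""
    return abs(post_db - pre_db) <= limit_db

# Four 15-min intervals combine to ~83.8 dB(A), above their 83.5 arithmetic mean.
laeq_hour = laeq_combine([82.0, 82.0, 85.0, 85.0])
drift_ok = field_check_ok(94.0, 94.4)   # 0.4 dB drift: measurements stand
```

The louder intervals dominate the energy average, which is why a few noisy 15-minute periods can pull an hourly LAeq above what eyeballing the numbers suggests.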
Safety & Compliance Examples for Labs and Manufacturing WHS exposure planning: The exposure standards are LAeq,8h 85 dB(A) and LC,peak 140 dB(C). Every +3 dB roughly halves allowable time (e.g., ~88 dB(A) for 4 h ≈ 85 dB(A) for 8 h). Use this for shift design and hearing protection programs. Typical cases: Cleanrooms/biotech filling: mid-70s to low-80s dB(A) → verify LAeq against task duration; ensure staff rotation if close to 85 dB(A). Tablet presses: mid-80s to low-90s dB(A) at operator position → check daily patterns; confirm hearing protection class under AS/NZS 1269 program guidance referenced by the WHS Code. Beverage bottling halls: mid-90s dB(A) with impulsive peaks → measure LC,peak and verify it stays < 140 dB(C). When to Call an Accredited Cal Lab ( And What You’ll Get ) Use a NATA-accredited ( ISO/IEC 17025 ) facility for periodic instrument calibration ( at least every two years for environmental-noise reference sources/instrumentation, or more often if your QMS requires ). Expect a certificate with measurement uncertainty, traceability, equipment IDs, and results that regulators recognise. CISCAL is NATA-accredited ( No. 411 ) with national coverage ( NSW, VIC, QLD ). We calibrate sound level meters, calibrators and related accessories; our SMART Portal gives you asset histories, reminders and downloadable certificates. Schedule Your Calibration Now FAQs Previous Next

  • What Is a Data Logger and How Does It Work? | CISCAL

    Learn what a data logger is, how it works, types, accuracy, and Australian compliance uses (cold chain, HACCP, GMP). Practical picks, FAQs, and examples. < Back What Is a Data Logger and How Does It Work? A data logger is a small, battery-powered device that automatically samples one or more sensors at set intervals and stores timestamped readings, then makes them available via USB, Bluetooth, Wi-Fi or the cloud. Core parts are sensor(s), signal conditioning, ADC, microcontroller, memory, power and communications. In Australia, they’re used for vaccine cold chain ( Strive for 5 ), HACCP food safety, labs, logistics and HVAC to create audit-ready records. A data logger ( also called a data recorder or DDL ) is a portable instrument designed for unattended monitoring. It wakes up on a schedule, records, and sleeps to save battery. Unlike SCADA/DAQ systems ( always-on, networked, operator-driven ), loggers are stand-alone and optimised for long-term field use. How a Data Logger Works Signal path: Sensor, conditioning, ADC, microcontroller, timestamped memory, local/remote download. Key settings: sampling interval, start/stop, alarm thresholds, units, logging mode ( wrap/stop ). Data access: USB/BLE apps for quick offloads; Wi-Fi/LTE/cloud for live dashboards and alerts. 
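The wrap/stop logging modes listed in the key settings can be pictured with a toy model. Everything here (the class name, methods and sample values) is illustrative, not any real logger's firmware or API.

```python
from collections import deque

# Toy logger: "wrap" mode overwrites the oldest samples when memory fills;
# "stop" mode refuses new samples once capacity is reached.

class TinyLogger:
    def __init__(self, capacity, mode="wrap"):
        self.capacity = capacity
        self.mode = mode
        self.samples = deque(maxlen=capacity if mode == "wrap" else None)

    def record(self, timestamp, value):
        if self.mode == "stop" and len(self.samples) >= self.capacity:
            return False              # memory full: logging has stopped
        self.samples.append((timestamp, value))
        return True

wrap = TinyLogger(capacity=3, mode="wrap")
for t, temp in enumerate([4.1, 4.3, 4.0, 4.2]):
    wrap.record(t, temp)
# wrap mode now holds only the newest three samples
```

Wrap mode suits continuous monitoring with regular downloads; stop mode preserves the readings leading up to an event (say, a transport excursion) at the cost of losing whatever comes after memory fills.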
Common Types By Parameter Temperature / Humidity / Temp-Humidity ( fridges, rooms, transport ) Voltage/Current/Power ( energy checks, PQ events ) CO₂/Pressure/Light ( IAQ, packaging, photometrics ) Shock/Vibration ( transport validation ) By Form Factor USB “stick”: cheapest, plug-in downloads Bluetooth: phone app, on-site checks Cloud-connected: Wi-Fi/LTE, alerts & dashboards Multi-channel bench/industrial: thermocouples/RTDs, mapping studies, wide ranges When to Choose Which (quick picks) Vaccine fridge: buffered temp probe + 5-min logging + alarms Food coolroom: multiple TH loggers at warm spots for HACCP Pharma warehouse: multi-point mapping then continuous monitoring Core Specs to Compare Accuracy vs resolution: pick accuracy aligned to your tolerance ( e.g., vaccine work needs tight accuracy and verifiable calibration ). Sensor & range: thermistor/RTD ( high accuracy ), thermocouple ( wide range ); confirm probe interchangeability. Sampling & memory: ensure interval suits the risk ( 5 to 15 min is common for fridges ). Power: replaceable vs rechargeable; battery life at chosen interval. Ingress protection ( IP ) & operating temp: match environment. Alarms: local buzzers/LEDs plus SMS/email for after-hours. Calibration & certificates: request NATA-endorsed certificates when records must be SI-traceable for audits. Australian Use Cases & Standards Vaccines & healthcare ( Strive for 5 ): keep +2 °C to +8 °C ( aim +5 °C ), use a data logger or automated monitoring, and download/review data to assess any breach. Victorian guidance specifies 5-minute intervals for vaccine fridges and weekly review of automated systems. Food & beverage ( HACCP ): continuous logging provides evidence at CCPs. FSANZ notes ~4.7 million cases of foodborne illness/year in Australia, with ~47,900 hospitalisations and 38 deaths, underlining the value of reliable records. 
Pharma logistics (GDP) & mapping: temperature mapping of storage areas and warehouses is expected under WHO Annex 9-aligned programs adopted in Australian practice; local guidance highlights mapping and ongoing monitoring to verify hot/cold zones before placing permanent sensors. Getting Reliable Data (How-To) Probe placement Put probes at centre, corners, near doors/warm spots, and at multiple heights ( warehouse ). Avoid direct contact with walls, coils, or fans; allow equilibration after moving sensors. Configure Sampling: start at 5 to 15 min for fridges; faster for unstable environments. Alarms: set pre-alarm cushions ( e.g., ±0.5 °C from limits ) and escalation contacts. Time sync: align to local time/AEDT; check daylight-saving rollover. Verify Quick ice-point/boiling-point checks where appropriate; schedule accredited calibration. For new fridges/rooms, run a 24 to 72 h mapping with multiple loggers ( empty and operational, seasonally if possible ). Maintenance & Calibration Sensors drift with time, shock and environment. Regulated sites commonly use a risk-based 6 to 12 month cadence; lower-risk applications may extend further with evidence. Choose NATA-accredited ( ISO/IEC 17025 ) labs reporting uncertainty and traceability per NATA’s Metrological Traceability Policy; this is what auditors recognise across Australia ( via ILAC ). CISCAL note: Our NATA-accredited capability ( Accreditation No. 411, Site 404 ) includes multi-channel thermocouple data recorders and digital temperature systems. See the CTA below. Buyer’s Checklist Accuracy, Channels & sensor type, Logging interval & memory, Alarms & remote alerts, Battery life, Operating range & IP rating, Software/export formats, NATA-calibration support, Local service & turnaround. Example Configurations GP vaccine fridge 1x temp logger with buffered probe, 5-min sampling, daily min/max check, weekly review/download, and back-to-base alerts for deviations.
Food coolroom 2 to 4 TH loggers ( door, centre, warm spot, return-air side ), monthly reports for HACCP verification; investigate excursions with timestamped logs. Pharma warehouse Run mapping ( using multiple loggers at various heights and locations ); identify hot/cold zones, then install permanent, monitored probes with alarms; remap after HVAC/layout changes, or as required seasonally. Glossary Accuracy: closeness to the true value ( not the same as resolution ). Resolution: smallest display increment. Sampling interval: time between measurements. CMC: lab’s Calibration and Measurement Capability ( uncertainty ). IP rating: ingress protection ( dust/water ). Mapping: multi-point temperature study to characterise a space. Validation: documented evidence the system consistently performs as intended. GDP/GMP: Good Distribution/Manufacturing Practice. NATA: National Association of Testing Authorities ( ISO/IEC 17025 accreditation in AU ). FAQs Previous Next
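The excursion investigation mentioned in the configurations above reduces to a scan of the timestamped log against the storage band. A sketch using the Strive for 5 band of +2 °C to +8 °C; the function name and sample data are illustrative.

```python
# Flag readings outside the +2 °C to +8 °C vaccine storage band so a breach
# can be assessed for extent and duration during the weekly review.

def find_excursions(log, low=2.0, high=8.0):
    """log: list of (timestamp, temp_c); returns readings outside [low, high]."""
    return [(t, temp) for t, temp in log if not (low <= temp <= high)]

fridge_log = [("09:00", 4.8), ("09:05", 5.1), ("09:10", 8.6), ("09:15", 5.0)]
breaches = find_excursions(fridge_log)   # one reading, 8.6 °C at 09:10
```

With 5-minute sampling, the number of consecutive flagged readings also bounds the excursion's duration, which is the detail a breach assessment needs.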

  • Noise Level Meter: How to Use It Effectively | CISCAL

Learn how to use a noise level meter for compliance, workplace safety, and accurate sound monitoring. Expert tips from Australia’s calibration specialists. < Back How to Use a Noise Level Meter Effectively Monitoring noise levels isn’t just a technical requirement; it’s a legal and safety obligation across many Australian industries. From factory floors to research labs, noise can impact both compliance and the wellbeing of workers. Under the Work Health and Safety Act, employers must manage risks associated with occupational noise exposure. In addition, environmental noise surveys are often mandatory for industrial operations. That’s where a noise level meter comes in. With over 60 years of NATA-accredited calibration expertise, CISCAL has supported organisations across pharma, food production, biotech, and engineering to keep their equipment accurate and audit-ready. What is a noise level meter? A noise level meter, sometimes called a sound level meter or decibel meter, measures sound pressure levels in decibels (dB). It captures real-time sound intensity to assess whether environments comply with safety and environmental standards. Industries use them in different ways: Pharmaceutical cleanrooms: ensuring HVAC systems meet sterility requirements without exceeding safe noise thresholds. Food production lines: monitoring machinery noise for operator safety. Research labs and universities: protecting sensitive experiments from disruptive noise. Manufacturing plants: assessing machine noise levels against regulatory limits. Accurate readings are critical not just for compliance with ISO/IEC 17025 and GMP, but also for reducing risks of hearing damage and workplace disruption. Standards & compliance requirements Noise monitoring isn’t optional; it’s tied directly to Australian workplace laws and industry standards. AS/NZS 1269.1 sets the framework for occupational noise management.
Safe Work Australia mandates an exposure limit of 85 dB(A) averaged over 8 hours, and peak sound pressure should not exceed 140 dB(C). Pharmaceutical and biotech facilities must demonstrate compliance with TGA, ISO, and GMP requirements during audits, while food production and manufacturing companies must maintain a safe workplace under WHS regulations. Choosing a NATA-accredited calibration provider ensures that your measurements stand up in audits and meet both national and international compliance requirements.

Types of noise level meters
Not all meters are created equal; the right instrument depends on your application.
- Class 1 vs Class 2 (IEC 61672): Class 1 meters are more accurate and suitable for research, regulatory compliance, and legal cases; Class 2 meters are less precise but acceptable for general workplace monitoring.
- Portable handheld meters: ideal for spot checks on the factory floor.
- Integrated logging meters: used for long-term monitoring, often required in environmental and industrial settings.
For example, a factory might use a portable Class 2 meter for quick daily checks, while a university research centre would rely on a Class 1 logging meter for controlled studies.

Calibration and setup
Even the most advanced noise level meter is only as reliable as its calibration.
- Daily verification: use an acoustic calibrator before each session to check accuracy.
- Scheduled calibration: meters should undergo full calibration at a NATA-accredited lab at least once a year.
- Audit readiness: calibration certificates provide traceable evidence of compliance.
Since 1963, CISCAL has been accredited to ISO/IEC 17025 by NATA, giving clients confidence that their instruments will perform with precision when it matters most.

How to use a noise level meter indoors
Indoor noise monitoring requires attention to setup for reliable results:
- Position the microphone at ear height where employees are normally stationed.
- Avoid placing meters near reflective surfaces such as glass or walls, which can distort readings.
- Minimise background interference (air conditioning, unrelated machinery).
In pharma labs, this ensures HVAC systems do not compromise sterile conditions. In food QC rooms, it helps ensure a safe working environment for quality staff.

Outdoor & industrial use
Outdoor monitoring adds another layer of complexity due to weather and environmental variables.
- Always use a windscreen on the microphone to reduce wind noise.
- Mount the meter on a tripod for stability and accuracy.
- Use logging functions to capture changes over extended periods (e.g., during construction or plant operations).
Environmental noise monitoring must also comply with local council and environmental regulations, which often specify acceptable dB levels for industrial zones versus residential areas.

Recording & interpreting results
A noise level meter provides raw data, but knowing how to interpret it is critical:
- <70 dB: generally safe for long exposure.
- >85 dB: requires risk assessment and potentially hearing protection.
- >100 dB: harmful, even for short durations.
For workplaces, this data is used to perform noise dose assessments, which calculate an employee’s overall daily exposure. The resulting reports can be integrated into compliance documentation for audits and risk management.

Case studies across industries
- Pharma labs: HVAC systems kept within safe dB ranges to avoid contamination risks while protecting technicians.
- Food & beverage factories: monitoring bottling lines and mixers to maintain compliance with Safe Work standards.
- Biotech research centres: reducing background noise that could interfere with sensitive genetic analysis.
- Manufacturing plants: long-term monitoring of heavy machinery to prevent unsafe exposure levels.
These real-world applications show how noise monitoring is essential for both compliance and operational efficiency.
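The noise dose assessment described above boils down to combining each task’s measured level and duration into a single 8-hour equivalent continuous level, LAeq,8h, which is then compared against the 85 dB(A) exposure standard. A minimal sketch, with purely hypothetical shift data:

```python
import math

def eight_hour_equivalent(exposures):
    """Combine (level_dBA, hours) pairs into an 8-hour equivalent
    continuous A-weighted level, LAeq,8h."""
    # Sum the sound energy contributed by each task, then
    # normalise to an 8-hour working day.
    energy = sum(hours * 10 ** (level / 10) for level, hours in exposures)
    return 10 * math.log10(energy / 8)

# Hypothetical shift: 4 h at 88 dB(A), 2 h at 82 dB(A), 2 h at 75 dB(A)
shift = [(88, 4), (82, 2), (75, 2)]
laeq = eight_hour_equivalent(shift)
print(f"LAeq,8h = {laeq:.1f} dB(A)")
print("Exceeds 85 dB(A) exposure standard:", laeq > 85)
```

Note how the logarithmic scale works: the 4 hours at 88 dB(A) dominate the result, so reducing the loudest task gives far more benefit than trimming the quieter ones.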
Getting the most value from your CISCAL services
Noise monitoring isn’t just about buying the right instrument; it’s about keeping it reliable year after year. With CISCAL’s NATA-accredited calibration, validation, and equipment solutions, you can:
- Ensure traceable compliance with AS/NZS 1269.1 and Safe Work Australia standards.
- Reduce downtime by detecting and resolving calibration issues early.
- Protect employees’ hearing while maintaining productivity.
Whether you need one-off calibration, long-term asset management through the CISCAL SMART Portal, or full-service compliance support, CISCAL helps you stay accurate and audit-ready. Ensure accuracy and compliance with CISCAL’s NATA-accredited calibration services. From industrial sound meters to laboratory precision instruments, our experts keep your equipment compliant and your operations safe. Contact CISCAL today.

  • DIY vs Lab Multimeter Calibration Explained | CISCAL

Discover the key differences between DIY and lab multimeter calibration. Ensure accuracy, compliance, and reliability with the right method.

DIY vs Professional Multimeter Calibration: What You Need to Know

Multimeters are indispensable tools for technicians, engineers, and researchers. They provide critical electrical measurements such as voltage, current, and resistance. However, like all precision instruments, multimeters can drift out of specification over time due to regular use, environmental conditions, or component wear. This drift can compromise accuracy, safety, and compliance with industry standards. Calibration is the process of comparing a multimeter’s readings to a known reference standard and adjusting it if necessary. In this guide, we’ll explore the difference between DIY calibration and professional laboratory calibration, helping you decide which approach suits your needs.

What is Multimeter Calibration?
Multimeter calibration ensures that readings from the device are accurate and traceable to recognised standards. The process typically involves applying known electrical signals (voltage, current, resistance) and comparing the multimeter’s readings against a reference standard. In industrial and research environments, calibration must align with ISO/IEC 17025 and NATA accreditation to guarantee reliability and compliance. Calibration certificates document traceability, measurement uncertainty, and test conditions, making them essential for audits and regulatory checks.

Why Calibration Matters for Multimeters
Accurate electrical measurements are vital in industries such as manufacturing, pharmaceuticals, and energy. A miscalibrated multimeter could:
- Lead to incorrect electrical readings, causing faulty designs or unsafe conditions.
- Fail compliance audits under NATA and ISO standards.
- Result in financial losses from product recalls or downtime.
For example, in pharmaceutical production, precise readings ensure compliance with safety and quality standards, while in energy systems, incorrect measurements can increase the risk of failure or hazards.

DIY Multimeter Calibration
DIY calibration involves comparing your multimeter against a reference source, such as a precision voltage source or a calibrated device. The process typically requires:
- A precision reference (calibrated power supply, resistor, or voltage standard).
- Adjustment tools or software (physical trim pots for analog meters; firmware settings for digital ones).
Pros of DIY calibration:
- Cost-effective for hobbyists and small-scale use.
- Quick checks are possible without sending equipment away.
Cons of DIY calibration:
- Limited accuracy, depending on the reference used.
- No traceability to recognised standards.
- Not audit-compliant in regulated industries.
DIY calibration is suitable for personal use or non-critical applications, but it falls short when regulatory compliance is required.

Laboratory Multimeter Calibration
Professional calibration labs provide controlled environments and traceable reference standards, ensuring maximum accuracy and compliance. These labs operate under ISO/IEC 17025 and are accredited by organisations such as NATA.
What happens in a lab calibration?
- The multimeter is tested using reference instruments with uncertainties much lower than those of the device under test.
- Environmental conditions (temperature, humidity) are controlled.
- Measurements are documented, and a calibration certificate is issued.
Benefits of lab calibration:
- High accuracy and traceability.
- Compliance with audits and industry standards.
- Reduced measurement uncertainty.
- Support for industries where errors could have major consequences (e.g., aerospace, pharma, energy).
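The DIY comparison described above amounts to a pass/fail check of each reading against the reference, using the meter’s accuracy specification. A minimal sketch follows; the ±(% of reading + counts) spec figures and the check points are illustrative assumptions, not taken from any particular meter’s datasheet:

```python
def within_spec(reading, reference, pct_of_reading=0.5, counts=2, resolution=0.001):
    """Pass/fail a DMM reading against a calibrated reference using a
    typical handheld accuracy spec of +/-(% of reading + counts).
    The 0.5% / 2-count / 1 mV figures are illustrative only."""
    tolerance = abs(reference) * pct_of_reading / 100 + counts * resolution
    return abs(reading - reference) <= tolerance

# Hypothetical DC voltage check points: (meter reading, reference value)
checks = [(5.002, 5.000), (9.980, 10.000), (10.100, 10.000)]
for reading, reference in checks:
    status = "PASS" if within_spec(reading, reference) else "FAIL"
    print(f"ref {reference:7.3f} V -> read {reading:7.3f} V: {status}")
```

This kind of check can flag a drifting meter, but without a traceable reference and documented uncertainty it is not a substitute for an accredited calibration.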
DIY vs Lab Calibration: Side-by-Side Comparison

Factor       | DIY Calibration               | Lab Calibration
Accuracy     | Limited, depends on user      | High, traceable to standards
Cost         | Low upfront                   | Higher, but ensures compliance
Compliance   | Not compliant                 | Meets ISO/IEC 17025, NATA
Time         | Quick for simple checks       | Scheduled, but ensures precision
Suitable for | Hobbyists, non-critical tools | Industrial, research, regulated industries

When Should You Choose Lab Calibration?
Certain situations demand professional calibration:
- Regulatory compliance: industries requiring ISO- or NATA-certified measurements.
- Safety-critical environments: manufacturing, energy, and pharmaceuticals.
- High-accuracy research: scientific labs where precise data is crucial.
CISCAL, for example, provides NATA-accredited calibration services across Australia, ensuring traceability and compliance. With over 60 years of experience, its services cover a wide range of electrical equipment.

Risks of Skipping Professional Calibration
Ignoring lab calibration can lead to:
- Audit failures: non-compliance with ISO/IEC 17025 or NATA requirements.
- Safety hazards: faulty measurements can put lives at risk.
- Downtime and costs: inaccurate results cause rework, recalls, and financial losses.
Real-world example: in food manufacturing, inaccurate electrical readings from poorly calibrated meters can result in temperature control failures, leading to spoiled products and costly recalls.

How Often Should Multimeters Be Calibrated?
Calibration frequency depends on usage and environment:
- General recommendation: annually.
- High-use environments: every 6 months, or per the manufacturer’s guidance.
- Harsh conditions (e.g., high humidity, electrical noise): more frequent checks.
For enterprises, scheduling calibration with providers like CISCAL ensures consistent compliance and accuracy.

Choosing the Right Calibration Partner
When selecting a calibration provider, consider:
- Accreditation: NATA accreditation and ISO/IEC 17025 compliance.
- Turnaround time: minimal downtime for equipment.
- Coverage: nationwide support.
- Digital tools: CISCAL offers the SMART Portal, giving real-time access to calibration certificates.
DIY calibration may be sufficient for hobbyists or basic troubleshooting. However, in professional, regulated, and safety-critical industries, lab calibration is non-negotiable: it ensures compliance, reliability, and confidence in every measurement. Ensure your instruments meet compliance and performance standards. Contact CISCAL now!

  • Why Pipette Calibration Is Vital in Research | CISCAL

Understand why pipette calibration is essential for research accuracy, reproducibility, compliance, and resource efficiency in scientific labs.

Why Pipette Calibration Is Vital in Research Labs

Pipettes are among the most fundamental tools in modern laboratories. From medical research to biotechnology and pharmaceutical development, they allow scientists to handle microlitre to millilitre volumes with a high degree of precision. The accuracy of these small-volume liquid transfers directly influences the integrity of experimental results; even minor deviations can distort data, compromise reproducibility, and lead to wasted resources. Pipette calibration is the process that safeguards against these risks. By ensuring pipettes dispense the intended volume, calibration supports the reliability of scientific work, helps laboratories maintain compliance with international standards, and minimises costly errors. Without routine calibration, pipettes may drift over time, resulting in inaccuracies that go unnoticed until they affect experiments or regulatory audits.

What Is Pipette Calibration?
Pipette calibration is the process of checking and adjusting a pipette’s performance to ensure it delivers the intended liquid volume within defined tolerances. The most widely accepted method is gravimetric calibration, in which the dispensed liquid is weighed on an analytical balance. Because water has a predictable density under controlled conditions, weight can be converted into volume with high precision. Calibration is not only about detecting errors but also about correcting them: if a pipette consistently delivers too much or too little liquid, it can be adjusted back within acceptable performance limits. Globally, ISO 8655 defines the standards for piston-operated volumetric devices, including pipettes. This standard specifies test methods, accuracy limits, and acceptable tolerances.
Compliance with ISO 8655 ensures pipettes are tested and maintained to the same rigorous benchmarks across laboratories worldwide.

Why Pipette Calibration Matters in Research

Accuracy and Precision
Scientific research depends on accurate measurements. A pipette that dispenses even slightly more or less than intended can alter reagent concentrations, disrupt chemical reactions, and invalidate results. Precision is equally important: if a pipette varies significantly from one use to another, reproducibility suffers. Calibration helps maintain both accuracy and precision, providing confidence in every transfer.

Reproducibility Across Experiments
Reproducibility is the cornerstone of credible science. Other researchers must be able to replicate results using the same methods. If the pipettes in one lab are not calibrated properly, its results may differ from another lab’s, even when all other conditions are identical. Regular calibration ensures consistency across time and across institutions.

Regulatory and Quality Compliance
In regulated industries such as pharmaceuticals, biotechnology, and clinical diagnostics, calibration is more than best practice; it is a compliance requirement. Good Laboratory Practice (GLP) and Good Manufacturing Practice (GMP) guidelines require documentation of calibration activities, and regulatory agencies such as the FDA may request pipette calibration records during inspections. Maintaining compliant calibration schedules reduces audit risk and protects the credibility of laboratory results.

Cost Efficiency
Uncalibrated pipettes can cause failed experiments, wasted reagents, and unnecessary repetition of work. For high-value reagents such as antibodies, enzymes, or cell culture media, these losses add up quickly. Routine calibration prevents waste and extends the lifespan of pipettes, making it a cost-effective practice.
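As a rough illustration of the gravimetric method described under "What Is Pipette Calibration?", the sketch below converts balance readings to volumes and reports systematic error (inaccuracy) and coefficient of variation (imprecision), in the style of an ISO 8655 check. The Z-factor and the weighing data are illustrative assumptions, not certificate figures:

```python
from statistics import mean, stdev

# Z-factor (uL per mg) for distilled water near 21.5 degC and
# 101.3 kPa; an illustrative value, not a certified one.
Z_UL_PER_MG = 1.0029

def gravimetric_results(masses_mg, nominal_ul):
    """Convert balance readings (mg) to volumes (uL) and report the
    mean volume, systematic error (%) and CV (%) for one test volume."""
    volumes = [m * Z_UL_PER_MG for m in masses_mg]
    mean_vol = mean(volumes)
    systematic_pct = (mean_vol - nominal_ul) / nominal_ul * 100
    cv_pct = stdev(volumes) / mean_vol * 100  # random error (imprecision)
    return mean_vol, systematic_pct, cv_pct

# Ten hypothetical weighings of a pipette set to 100 uL
weighings = [99.6, 99.8, 99.5, 99.9, 99.7, 99.6, 99.8, 99.4, 99.7, 99.6]
vol, sys_err, cv = gravimetric_results(weighings, 100.0)
print(f"Mean volume {vol:.2f} uL, systematic error {sys_err:+.2f}%, CV {cv:.2f}%")
```

In a full check, this calculation would be repeated at several volumes across the pipette’s range and compared against the tolerance limits for that pipette class.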
Data Integrity
The accuracy of data is critical not only for publishing research but also for making informed decisions in industries like pharmaceuticals and diagnostics. Calibration ensures that pipettes do not introduce hidden biases or drift into experimental results, preserving data integrity across projects.

Calibration Frequency and Practical Guidelines
The ideal calibration frequency depends on usage intensity, pipette type, and regulatory environment. For general academic or research use, many laboratories adopt a semi-annual calibration schedule. Pipettes in high-precision or regulated environments are often calibrated quarterly or even monthly. Multi-channel pipettes, because of their complexity and higher wear rates, may require more frequent checks. Some labs also monitor performance between scheduled calibrations: in-house checks can identify pipettes drifting out of specification before they affect critical experiments. User feedback plays a role too; if a pipette feels inconsistent or delivers visibly uneven volumes, it should be recalibrated immediately.

Best Practices for Effective Pipette Calibration
To ensure calibration produces reliable results, laboratories must adopt best practices that address technique, environment, and equipment.
- Use the right tips: pipette tips from different manufacturers may not fit properly, leading to leaks or volume deviations. Always use tips designed for the pipette model in use.
- Control the environment: temperature and humidity influence gravimetric calibration by affecting evaporation rates and water density. Calibration should be performed in stable, controlled conditions.
- Consistent technique: operator technique affects pipetting accuracy. Immersion depth, plunger speed, and pipetting angle should be standardised during calibration.
- Gravimetric methods: analytical balances must be used, with repeated measurements at different volumes to confirm accuracy across the pipette’s range.
- Equipment traceability: calibration is only as reliable as the instruments used to perform it. Balances, weights, and reference thermometers must themselves be calibrated and traceable to recognised standards.
By adopting these practices, laboratories can significantly reduce variability and improve the dependability of calibration results.

Traceability and Accreditation Significance
Calibration is only meaningful when it can be traced back to recognised standards. Traceability creates an unbroken chain of comparisons linking laboratory measurements to international SI units. For pipette calibration, this means the balance, weights, and environmental monitors used must also be calibrated against certified standards. Accredited laboratories operating under ISO/IEC 17025 are recognised as meeting both technical and quality requirements; calibration certificates from such labs include documented uncertainties, reference standards, and traceability details. This documentation is essential for audit readiness and provides confidence that calibration results are defensible in regulated environments. Traceability also enhances collaboration: research across multiple labs can only be compared meaningfully if all results come from instruments calibrated against recognised standards.

In-House vs Professional Calibration Services
Some laboratories perform pipette calibration in-house, while others rely on accredited service providers. Each approach has advantages and limitations.
In-house calibration: cost-effective for labs with a large pipette inventory and frequent usage. Trained personnel can perform routine gravimetric checks, identifying pipettes that are drifting out of tolerance.
However, in-house calibration may not always meet accreditation or traceability requirements for regulated environments.
Professional calibration services: accredited providers offer ISO/IEC 17025 calibration, complete with certification, detailed reports, and traceable documentation. They may also provide preventive maintenance, repairs, and validation services. This option is more costly but ensures full compliance with GLP, GMP, and regulatory audits.
Most laboratories find value in a hybrid approach, conducting routine in-house checks between scheduled professional calibrations.

Stay Compliant and Efficient
Pipette calibration is not optional; it is a vital safeguard for the integrity of research. It ensures accuracy, reproducibility, compliance, and cost efficiency. By maintaining regular calibration schedules, following best practices, and leveraging accredited services when necessary, laboratories can protect their data quality and regulatory standing. Guarantee precision and reproducibility in your research: schedule accredited pipette calibration now and stay compliant and efficient with CISCAL.

© 2021 CISCAL. All rights reserved.
