The Secure Power knowledge base aims to give you the information you need to navigate the jargon and deliver straightforward advice around the terminology used within uninterruptible power systems.
Voltage is defined as the electrical potential difference between two points. The international unit for measuring this potential is the volt (V), and the quantity itself is universally referred to as voltage.
When we talk about input voltage, the first thing to remember is that voltage is expected to be constant. For a domestic single-phase mains power supply this is the 240V we all know and love; it can, however, be 415V, as found in the three-phase power supplies commonly used in industrial applications.
So what does this mean when we talk about voltage in the context of a UPS input? When the input of a UPS is connected to a utility power supply, part of its function is to monitor that utility power to determine whether the connected equipment should instead be powered from energy stored within the batteries.
Each UPS, as part of its specification, has a tolerance that defines how far above and below the expected nominal voltage the connected utility supply can vary; this is referred to as the voltage input window. It can be given either as specific min/max values or as a percentage variance from the expected 240V. When the input voltage moves outside of this window (or utility power fails), the UPS can no longer draw power from the mains supply to run the connected devices; instead, it uses electrical energy stored within its batteries to produce mains power.
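To make the voltage input window concrete, here is a minimal sketch of the logic described above. The 240V nominal and the 10% tolerance are illustrative assumptions, not figures from any particular UPS specification, and the function names are hypothetical:

```python
# Illustrative sketch of a UPS voltage input window check.
# NOMINAL_VOLTAGE and TOLERANCE_PERCENT are example values only.

NOMINAL_VOLTAGE = 240.0    # expected single-phase utility voltage (V)
TOLERANCE_PERCENT = 10.0   # example variance above/below nominal

def voltage_window(nominal, tolerance_percent):
    """Convert a percentage tolerance into explicit min/max limits."""
    delta = nominal * tolerance_percent / 100.0
    return nominal - delta, nominal + delta

def on_battery(measured_voltage, nominal=NOMINAL_VOLTAGE,
               tolerance_percent=TOLERANCE_PERCENT):
    """Return True if the measured input voltage falls outside the
    window, meaning the UPS must supply power from its batteries."""
    low, high = voltage_window(nominal, tolerance_percent)
    return not (low <= measured_voltage <= high)

print(voltage_window(240.0, 10.0))  # (216.0, 264.0)
print(on_battery(230.0))            # False: within the window
print(on_battery(200.0))            # True: sag below the window
```

The same check covers a spec given as explicit min/max values: simply skip the percentage conversion and compare the measured voltage against the published limits directly.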
Utility power isn’t always what it should be, and the input of the UPS is where that is measured.