An instrument used to measure current is called an ammeter. To measure the current in a wire, you usually have to break the wire and insert the ammeter so that the current to be measured passes through the meter. An ammeter always measures the current passing through it. An ideal ammeter would have zero resistance, so including it in a branch of a circuit would not affect the current in that branch. Real ammeters always have some finite resistance, but it is desirable for an ammeter's resistance to be as small as possible.
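The effect of the ammeter's finite resistance can be seen with a quick calculation. The following sketch uses hypothetical values (a 10 V source and a 100 ohm circuit) to show how inserting a 1 ohm ammeter in series slightly reduces the current it is trying to measure:

```python
def current_with_ammeter(v_source, r_circuit, r_ammeter):
    """Current flowing once the ammeter is inserted in series (Ohm's law)."""
    return v_source / (r_circuit + r_ammeter)

V = 10.0    # hypothetical source voltage (V)
R = 100.0   # hypothetical circuit resistance (ohms)

ideal = current_with_ammeter(V, R, 0.0)   # ideal ammeter: 0.1 A flows
real = current_with_ammeter(V, R, 1.0)    # real 1-ohm ammeter: slightly less

# Relative error introduced by the meter itself, in percent
error_pct = 100.0 * (ideal - real) / ideal
```

Here the 1 ohm meter causes an error of about 1%, which is why ammeter resistance is kept as low as practical relative to the circuit under test.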
Ammeters – Principle of Operation
The principle of operation of ammeters is based on the interaction between an electric current and a magnetic field. An ammeter is typically designed as a low-resistance device placed in series with the circuit being measured. When an electric current flows through the ammeter, it generates a magnetic field around the current-carrying conductor inside the meter.
The magnetic field generated by the current interacts with a permanent magnet or a coil of wire within the ammeter, causing a mechanical force that deflects a pointer on a scale. The amount of deflection is proportional to the current flowing through the ammeter, and the scale is calibrated in units of amperes (A).
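Since the deflection is proportional to the current, the pointer position is a simple linear mapping of current onto the scale. The sketch below illustrates that mapping with hypothetical values: a movement with 50 microampere full-scale deflection swinging a pointer through 90 degrees:

```python
def pointer_angle(current_a, full_scale_a=50e-6, max_angle_deg=90.0):
    """Pointer deflection for a linear meter movement.

    full_scale_a and max_angle_deg are hypothetical values chosen
    only to illustrate the proportionality, not a real instrument.
    """
    return max_angle_deg * current_a / full_scale_a

# Half of full-scale current deflects the pointer to half the arc
angle = pointer_angle(25e-6)  # 45.0 degrees
```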
In analog ammeters, the mechanical force is transferred to a pointer or a needle that moves along a graduated scale to indicate the current value. Digital ammeters, on the other hand, use an electronic circuit to convert the current measurement into a numerical value that is displayed on a digital display.
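A common way a digital ammeter obtains its numerical value is to measure the voltage that the current develops across a known low-value shunt resistance, then apply Ohm's law. The sketch below assumes a hypothetical 10 milliohm shunt:

```python
def digital_reading(v_shunt, r_shunt=0.01):
    """Current inferred from the voltage measured across a known shunt.

    r_shunt = 0.01 ohm is a hypothetical shunt value; a real meter's
    ADC would digitize v_shunt before this division is performed.
    """
    return v_shunt / r_shunt

# 5 mV across a 10 milliohm shunt corresponds to 0.5 A
current = digital_reading(0.005)
```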
To ensure accurate readings, an ammeter must have a very low resistance compared with the circuit being measured. This is typically achieved by using a shunt resistor, a low-resistance resistor placed in parallel with the meter movement. Most of the current flows through the shunt, while only a small, fixed fraction flows through the movement itself; because that fraction is known, the scale can still be calibrated to indicate the total current, and the combined meter presents a very low resistance to the circuit.
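The required shunt value follows from the fact that the shunt and the movement share the same voltage. This sketch sizes a shunt for hypothetical movement parameters (50 microampere full scale, 1000 ohm coil resistance) so the meter reads full scale at 1 A:

```python
def shunt_resistance(i_total, i_fullscale, r_movement):
    """Shunt value so the movement reads full scale when i_total flows.

    The shunt and movement are in parallel, so they see the same
    voltage: i_shunt * R_shunt = i_fullscale * r_movement.
    """
    i_shunt = i_total - i_fullscale   # current diverted through the shunt
    return i_fullscale * r_movement / i_shunt

# Hypothetical movement: 50 uA full scale, 1000 ohm coil, extended to 1 A
r_shunt = shunt_resistance(1.0, 50e-6, 1000.0)  # about 0.05 ohm
```

Note how small the shunt is compared with the 1000 ohm movement: nearly all of the 1 A passes through the shunt, and the parallel combination keeps the ammeter's overall resistance low.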