Ohm's Law definition:
Ohm's Law states that the current through a conductor between two points is directly proportional to the voltage across those two points.
Ohm's Law is an important rule in electrical engineering that's also easy to learn and apply. It describes how electrical current is related to voltage and resistance:
Ohm's Law's basic equation, V = I × R, can be rearranged to find any of the three values:
The easiest way to remember this relationship between voltage, current, and resistance is by arranging them in a triangle, with V on top and I and R side by side below it, like this:
When you want to find one value, cover it up and look at the remaining values; they will show you what operation to perform. For example, when we cover R, we see V over I, which tells us that R = V / I.
And we can repeat this process for the remaining values: covering I leaves V over R, so I = V / R, and covering V leaves I beside R, so V = I × R.
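The three rearrangements above are all the same equation, so any two of the values determine the third. A quick sketch in Python (with made-up example values of 9 V and 450 Ω) checks that they stay consistent:

```python
# Ohm's Law: V = I * R, plus its two rearrangements.
V = 9.0    # voltage in volts (example value)
R = 450.0  # resistance in ohms (example value)
I = V / R  # current in amperes, from the I = V / R form

# Each rearranged form agrees with the others:
assert abs(I * R - V) < 1e-9  # V = I * R
assert abs(V / I - R) < 1e-9  # R = V / I

print(I)  # 0.02 (that is, 20 mA)
```

Whichever corner of the triangle you solve for, plugging the result back into the other two forms returns the values you started with.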
One of the most practical applications of Ohm's Law is finding the resistance needed to pass a specific amount of current at a specific voltage. This is handy when you want to provide the correct amount of current to an LED.
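As a sketch of that LED calculation: an LED drops a roughly fixed forward voltage, so the series resistor only has to handle the supply voltage left over after that drop. The function name and the example values below (5 V supply, a red LED with about a 2 V forward drop, 20 mA target current) are illustrative assumptions, not fixed specifications:

```python
def led_resistor(v_supply, v_forward, i_led):
    """Series resistor for an LED, via Ohm's Law: R = V / I applied
    to the voltage remaining after the LED's forward drop."""
    return (v_supply - v_forward) / i_led

# Hypothetical example: 5 V supply, ~2 V forward drop, 20 mA target.
print(led_resistor(5.0, 2.0, 0.020))  # 150.0 (ohms)
```

In practice you would round up to the nearest standard resistor value at or above the result, so the LED draws slightly less current rather than slightly more.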
Ohm's law was probably the most important of the early quantitative descriptions of the physics of electricity. We consider it almost obvious today. When Ohm first published his work (1827), this was not the case; critics reacted to his treatment of the subject with hostility. They called his work a "web of naked fancies" and the German Minister of Education proclaimed that "a professor who preached such heresies was unworthy to teach science." The prevailing scientific philosophy in Germany at the time asserted that experiments need not be performed to develop an understanding of nature because nature is so well ordered, and that scientific truths may be deduced through reasoning alone. Also, Ohm's brother Martin, a mathematician, was battling the German educational system. These factors hindered the acceptance of Ohm's work, and his work did not become widely accepted until the 1840s. However, Ohm received recognition for his contributions to science well before he died.