Newton-meter (N·m)
Definition
The Newton-meter (N·m) is the SI unit of torque, the turning effect of a force applied at a distance from an axis of rotation. It quantifies the moment of force: 1 N·m is the torque produced by a 1 N force acting perpendicular to a 1 m lever arm, where 1 Newton is the force required to accelerate a 1 kg mass at 1 m/s².
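For a force that is not perpendicular to the lever arm, the standard general relation (written here in LaTeX) is:

```latex
% Torque as the cross product of lever arm and force
\vec{\tau} = \vec{r} \times \vec{F},
\qquad
|\vec{\tau}| = r F \sin\theta
% r: distance from the axis (m); F: applied force (N);
% theta: angle between the lever arm and the force
```

With r = 1 m, F = 1 N, and θ = 90°, this reduces to the 1 N·m definition above.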
History
The concept of torque dates back to ancient mechanics, notably Archimedes' study of the lever, while the "Newton" in the name honors Sir Isaac Newton for his 17th-century laws of motion. The unit became standardized with the adoption of the International System of Units (SI) in 1960.
Uses
Newton-meters are prevalent in engineering, physics, and everyday applications, such as rating torque wrenches, specifying fastener tightening torques, and quoting engine output. They are essential in automotive engineering, construction, and many fields of scientific research.
Conversions
- 1 N·m ≈ 0.73756 ft·lb (foot-pound)
- 1 N·m ≈ 10.197 kgf·cm (kilogram-force centimeter)
- 1 N·m ≈ 8.8507 in·lb (inch-pound)
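A minimal Python sketch of these conversions; the constant and function names (convert_torque and the *_PER_NM constants) are illustrative, not from any standard library, and the factors are the rounded values listed above:

```python
# Torque unit conversions from Newton-meters (N·m).
# Hypothetical helper; constants match the conversion list above.

FT_LB_PER_NM = 0.73756    # foot-pounds per Newton-meter
IN_LB_PER_NM = 8.8507     # inch-pounds per Newton-meter
KGF_CM_PER_NM = 10.197    # kilogram-force centimeters per Newton-meter

def convert_torque(newton_meters: float) -> dict[str, float]:
    """Return the equivalent torque in common non-SI units."""
    return {
        "ft·lb": newton_meters * FT_LB_PER_NM,
        "in·lb": newton_meters * IN_LB_PER_NM,
        "kgf·cm": newton_meters * KGF_CM_PER_NM,
    }

if __name__ == "__main__":
    # Example: a typical lug-nut tightening torque of 110 N·m.
    for unit, value in convert_torque(110).items():
        print(f"110 N·m ≈ {value:.2f} {unit}")
```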
Fun Facts
- A common misconception is that torque and force are the same; in fact, torque is the product of a force and its perpendicular distance (lever arm) from the axis of rotation.
- The Newton-meter is dimensionally equivalent to the joule, the SI unit of work and energy (1 J = 1 N·m); by convention, torque is quoted in N·m and energy in joules to keep the two quantities distinct.
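On that last point: the work done by a constant torque is the torque times the angle turned through, in radians, which comes out in joules. As a short LaTeX statement:

```latex
% Work done by a constant torque over an angular displacement
W = \tau \, \theta
% Example: \tau = 10 N·m over one full revolution (\theta = 2\pi rad)
% gives W = 10 \times 2\pi \approx 62.8 J
```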