However, when the electric motor inertia is greater than the load inertia, the motor will need more power than is otherwise necessary for the application. This increases costs, both because a larger-than-necessary motor costs more to buy and because its higher power draw raises operating costs. The solution is to use a gearhead to match the inertia of the motor to the inertia of the load.
Recall that inertia is a measure of an object's resistance to change in its motion and is a function of the object's mass and shape. The greater an object's inertia, the more torque is needed to accelerate or decelerate it. This means that when the load inertia is much greater than the motor inertia, the mismatch can cause excessive overshoot or increase settling times. Both conditions reduce production line throughput.
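The torque-inertia relationship above is just Newton's second law for rotation, T = J·α. A minimal sketch with illustrative numbers (not from the article):

```python
# Torque needed to accelerate a rotating load: T = J * alpha.
# The inertia and acceleration values below are assumed for illustration.
J = 0.002        # load inertia, kg*m^2
alpha = 500.0    # desired angular acceleration, rad/s^2
T = J * alpha    # required torque, N*m
print(T)         # doubling J doubles the torque demand
```

Doubling either the inertia or the required acceleration doubles the torque the motor must supply.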
Inertia Matching: Today's servo motors produce more torque relative to frame size, thanks to dense copper windings, lightweight materials, and high-energy magnets. This creates larger inertial mismatches between servo motors and the loads they are trying to move. Using a gearhead to better match the inertia of the motor to the inertia of the load allows a smaller motor to be used and results in a more responsive system that is easier to tune. Again, this is achieved through the gearhead's ratio: the inertia of the load reflected back to the motor is reduced by a factor of 1/ratio².
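The 1/ratio² reduction can be sketched numerically. The inertia values below are assumed for illustration, not taken from the article:

```python
# Reflected load inertia through a gearhead: J_reflected = J_load / ratio**2.
# All numeric values here are illustrative assumptions.
J_load = 0.01           # load inertia, kg*m^2
J_motor = 0.0001        # motor rotor inertia, kg*m^2
ratio = 10.0            # 10:1 gearhead

mismatch_direct = J_load / J_motor        # 100:1 mismatch driving the load directly
J_reflected = J_load / ratio**2           # inertia the motor "sees" through the gearhead
mismatch_geared = J_reflected / J_motor   # 1:1 with the gearhead
print(mismatch_direct, mismatch_geared)
```

With the assumed numbers, a 100:1 mismatch becomes a 1:1 match, which is why a modest ratio makes the system far easier to tune.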
As servo technology has evolved, with manufacturers generating smaller, yet more powerful motors, gearheads have become increasingly essential partners in motion control. Finding the optimal pairing must take into account many engineering considerations.
So how does a gearhead provide the performance required by today's more demanding applications? It goes back to the fundamentals of gears and their ability to change the magnitude or direction of an applied force.
The number of teeth on each gear creates a ratio. If a motor can generate 20 in-lbs of torque and a 10:1 gearhead is mounted on its output, the resulting torque will be close to 200 in-lbs. With the ongoing emphasis on smaller footprints for motors and the equipment they drive, the ability to pair a smaller motor with a gearhead to achieve the desired torque output is invaluable.
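The torque multiplication works out as follows. The 90% efficiency figure is an assumption chosen to reflect the article's "close to 200 in-lbs" (real gearhead efficiencies vary by type):

```python
# Gearhead output torque: T_out ~= T_in * ratio * efficiency.
# The efficiency value is an illustrative assumption.
T_motor = 20.0      # motor torque, in-lbs
ratio = 10.0        # 10:1 gearhead
efficiency = 0.90   # assumed gearhead efficiency
T_out = T_motor * ratio * efficiency
print(T_out)        # somewhat below the ideal 200 in-lbs due to losses
```

The ideal (lossless) output would be exactly 200 in-lbs; friction in the gear train accounts for the difference.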
A motor may be rated at 2,000 rpm, but your application may require only 50 rpm. Trying to run the motor at 50 rpm may not be optimal, for the following reasons:
If you are operating at a very low speed, such as 50 rpm, and your motor feedback resolution is not high enough, the update rate of the electronic drive may cause a velocity ripple in the application. For example, with a motor feedback resolution of 1,000 counts/rev, you have a measurable count every 0.36 degrees of shaft rotation. If the digital drive controlling the motor has a velocity loop of 0.125 milliseconds, then at 50 rpm (300 deg/sec) it will look for a measurable count every 0.0375 degrees of shaft rotation. When it does not find that count, it speeds up the motor rotation to find it. By the time it finds the next measurable count, the rpm is too fast for the application, so the drive slows the motor back down to 50 rpm, and the whole process starts over again. This continuous increase and decrease in rpm is what causes velocity ripple in an application.
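The arithmetic in that example can be reproduced directly:

```python
# Velocity-ripple arithmetic from the example above.
counts_per_rev = 1000
deg_per_count = 360.0 / counts_per_rev       # 0.36 deg between measurable counts

rpm = 50
deg_per_sec = rpm * 360.0 / 60.0             # 300 deg/s at 50 rpm
loop_time_s = 0.125e-3                       # 0.125 ms velocity-loop update
deg_per_update = deg_per_sec * loop_time_s   # shaft travel per loop update

# The drive checks for a new count every 0.0375 deg, but a count only
# arrives every 0.36 deg -- many updates pass with no new feedback,
# which drives the speed-up/slow-down hunting described above.
updates_per_count = deg_per_count / deg_per_update
print(deg_per_count, deg_per_update, updates_per_count)
```

At 50 rpm the drive runs roughly ten velocity-loop updates between feedback counts, so it is frequently reacting to stale position data.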
A servo motor running at low rpm operates inefficiently. Eddy currents are loops of electrical current induced within the motor during operation. These eddy currents produce a drag force within the motor and have a greater negative effect on motor performance at lower rpm.
An off-the-shelf motor's parameters may not be well suited to running at low rpm. When an application runs such a motor at 50 rpm, it is essentially not using all of its available speed range. Because the voltage constant (V/Krpm) of the motor is set for a higher rpm, the torque constant (Nm/amp), which is directly related to it, is lower than it needs to be. As a result, the application draws more current than it would with a motor designed specifically for 50 rpm.
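The current penalty follows from I = T / Kt. The torque-constant values below are illustrative assumptions, not figures from the article:

```python
# Required current for a given torque: I = T / Kt.
# A motor wound for a high top speed has a lower torque constant,
# so it draws more current for the same torque demand.
# Both Kt values are illustrative assumptions.
T_required = 1.0       # torque demand, N*m
Kt_high_speed = 0.05   # Nm/A, motor wound for ~2,000 rpm
Kt_low_speed = 0.20    # Nm/A, hypothetical motor wound for ~50 rpm

I_high = T_required / Kt_high_speed   # current with the high-speed winding
I_low = T_required / Kt_low_speed     # current with the low-speed winding
print(I_high, I_low)
```

With these assumed windings, the high-speed motor needs four times the current to deliver the same torque, which means more heat and larger drive electronics.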
A gearhead's ratio reduces the motor rpm, which is why gearheads are sometimes called gear reducers. With a 40:1 gearhead, a motor rpm of 2,000 at the gearhead input becomes 50 rpm at the output. Running the motor at the higher rpm lets you avoid the first two concerns described above. As for the third, it allows the design to draw less torque and current from the motor, thanks to the mechanical advantage of the gearhead.
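The 40:1 example can be summarized in a few lines: speed divides by the ratio while torque (ideally, before losses) multiplies by it. The motor torque value is an assumption for illustration:

```python
# A 40:1 gear reducer: speed divides by the ratio, torque multiplies by it.
# The motor torque figure is an illustrative assumption.
ratio = 40.0
rpm_in = 2000.0
rpm_out = rpm_in / ratio         # gearhead output speed

T_motor = 2.0                    # assumed motor torque, N*m
T_out_ideal = T_motor * ratio    # ideal output torque, before gear losses
print(rpm_out, T_out_ideal)
```

So the motor stays in its efficient 2,000 rpm operating range while the load sees the required 50 rpm, with a 40x torque advantage available at the output.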