The difference between 50 Hz and 60 Hz is, well, that 60 Hz is 20% higher in frequency. For a generator or motor this means (in simple terms) 1500/3000 RPM at 50 Hz, or 1800/3600 RPM at 60 Hz. Such magnetic machines are designed to be one or the other. Equipment may work on the wrong frequency in some cases, but not always, and changing between supply frequencies will certainly affect efficiency and may make de-rating necessary.
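The speeds above follow from the standard synchronous-speed relation Ns = 120 × f / p, where f is the supply frequency and p the number of poles; a minimal sketch:

```python
# Synchronous speed of an AC machine: Ns (RPM) = 120 * f / p,
# where f is supply frequency in Hz and p is the number of poles.
def synchronous_speed_rpm(frequency_hz, poles):
    return 120 * frequency_hz / poles

for f in (50, 60):
    for p in (2, 4):
        print(f"{f} Hz, {p}-pole: {synchronous_speed_rpm(f, p):.0f} RPM")
# 50 Hz gives 3000/1500 RPM, 60 Hz gives 3600/1800 RPM
```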
Transmission loss in distribution..
The Steinmetz equation describes magnetic losses in power transformer cores. Since large power transformers are on the order of 98% efficient, whatever these losses are, they are small. At full load, most power transformer losses come from winding resistance; indeed, even better transformers have been built with superconducting windings. Magnetic losses may represent 98% of the losses at very low load (not a normal situation in a grid), and maybe that is where that answer (taken from Google Answers) comes from. So cross off that equation, and transformers, as major issues.
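The Steinmetz equation mentioned above estimates core loss density as P = k · f^α · B^β. A minimal sketch, with purely illustrative coefficients (not data for any real core material), showing how the loss scales going from 50 Hz to 60 Hz at the same flux density:

```python
# Steinmetz equation for core-loss density: P = k * f**alpha * B**beta.
# k, alpha, beta are material constants; the values below are purely
# illustrative assumptions, not real core-material data.
def steinmetz_core_loss(f_hz, b_peak_tesla, k=1.0, alpha=1.6, beta=2.0):
    return k * f_hz**alpha * b_peak_tesla**beta

# At the same flux density, core loss scales as (60/50)**alpha ~ 1.34.
ratio = steinmetz_core_loss(60, 1.5) / steinmetz_core_loss(50, 1.5)
print(f"60 Hz / 50 Hz core-loss ratio: {ratio:.2f}")
```

In practice 60 Hz transformers are designed with this in mind (core material, lamination, flux density), which is why the overall efficiency ends up similar at either frequency.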
In a distribution system it is the power lines that represent the losses. Intuitively, 20% higher frequency seems like it should lead to more series inductive and shunt capacitive losses. However, it is resistive loss at high voltage, and corona loss at extreme voltage, that dominate. Probably both 50 and 60 Hz are below the frequency where the inductive and capacitive reactances would dominate. The second link has "Transmission and distribution losses in the USA were estimated at 7.2% in 1995, and in the UK at 7.4% in 1998". These are small differences, despite the different conditions and frequencies. To me, this shows that proper design leads to a suitable efficiency at either frequency.
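As a sanity check on the reactance argument: series inductive reactance scales linearly with frequency (X = 2πfL), so 60 Hz gives exactly 20% more reactance than 50 Hz, but reactance stores and returns energy rather than dissipating it. The real dissipation is still I²R, which is frequency-independent. A minimal sketch with made-up line parameters:

```python
import math

# Illustrative (made-up) per-km line parameters, not real conductor data.
R_PER_KM = 0.03   # ohms/km, series resistance
L_PER_KM = 1e-3   # H/km, series inductance

def line_values(length_km, f_hz, current_a):
    r = R_PER_KM * length_km
    x = 2 * math.pi * f_hz * L_PER_KM * length_km  # reactance: no real loss
    real_loss_w = current_a**2 * r                 # I^2 R heating
    return x, real_loss_w

for f in (50, 60):
    x, loss = line_values(100, f, 500)
    print(f"{f} Hz: X = {x:.1f} ohm, I^2R loss = {loss/1e3:.0f} kW")
```

The printed I²R loss is identical at both frequencies; only the reactance changes, affecting voltage drop and reactive power rather than dissipation.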
When it comes to the consumption of electricity, there are industrial, domestic, and urban needs. Lamp flicker is not really perceptible above about 40 Hz, so it is not significant at either 50 or 60 Hz. Other frequency-dependent equipment, such as tape-recorder motors (almost a thing of the past now), has moved to electronic speed control and become frequency-independent. Frankly, it is hard to see any real-world difference, and it would be nit-picking to find one.
The more significant difference is that 60 Hz systems usually use about 120 V for the domestic supply, while 50 Hz systems tend to use about 230 V. The impact is that house wiring in a 120 V system needs roughly twice the cross-section for the same power. The optimum is generally accepted as around 230 V (balancing wire size and power delivered against safety). In most of the US the 120 V system runs in tandem with a 240 V supply that feeds higher-powered appliances like stoves and clothes dryers, while 120 V serves wall outlets and lights. Hardly an issue nowadays.
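The wiring-size point follows directly from I = P/V: roughly halving the voltage doubles the current for the same power, and conductor cross-section scales roughly with current for the same temperature rise. A minimal sketch:

```python
# Current drawn for the same power at typical domestic voltages.
# Conductor cross-section scales roughly with current, hence the
# thicker wiring needed on a 120 V system.
def current_amps(power_w, volts):
    return power_w / volts

for v in (120, 230):
    print(f"1500 W at {v} V draws {current_amps(1500, v):.1f} A")
# 1500 W draws 12.5 A at 120 V but only about 6.5 A at 230 V
```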
As for early history, I disbelieve the story about rotational speeds. Early generators ran at higher and lower RPM for various reasons, and their frequencies varied accordingly; Europe and the US each had a wide range. Things generally settled on either 50 or 60 Hz by the end of WW2, driven by international trade and the increasing use of motors in domestic appliances, which required greater standardization. There are a plethora of cute stories and myths, but for 50 or more years many frequencies coexisted, with none gaining ascendancy for any good technical reason.
Overall conclusion..
There is little real difference between 50 and 60 Hz systems, as long as the equipment is designed appropriately for the frequency. It is more important to have a standard and stick with it. See the last link for lots of history.
RPM (revolutions per minute) is a measure of rotational speed, while Hertz is a measure of frequency. RPM is commonly used to measure the rotational speed of mechanical components, such as motors or fans. Hertz, on the other hand, is used to measure the frequency of electrical signals or electromagnetic waves.
The beat frequency is the difference between the two frequencies, so 359 - 352 = 7 hertz.
The main difference between a 15K RPM (Revolutions Per Minute) and a 7.2K RPM drive is the speed at which the disk spins. A 15K RPM drive spins at a faster rate than a 7.2K RPM drive, resulting in quicker data access and transfer speeds. However, a 15K RPM drive may generate more noise and heat compared to a 7.2K RPM drive.
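One concrete effect of spindle speed is average rotational latency: on average the head waits half a revolution for the right sector, so latency = 0.5 × (60 / RPM). A minimal sketch:

```python
# Average rotational latency of a disk: half a revolution on average.
def avg_rotational_latency_ms(rpm):
    seconds_per_rev = 60.0 / rpm
    return 0.5 * seconds_per_rev * 1000  # milliseconds

for rpm in (7200, 15000):
    print(f"{rpm} RPM: {avg_rotational_latency_ms(rpm):.2f} ms")
# 7200 RPM waits about 4.17 ms on average; 15000 RPM about 2.00 ms
```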
Hertz is a unit of frequency, while watts are a unit of power. There is no direct conversion between the two.
Audible sound is sound that can be heard by the human ear, typically falling within the frequency range of 20 Hz to 20,000 Hz. Inaudible sound, on the other hand, refers to sound that falls below or above this range and cannot be heard by the human ear without special equipment, such as ultrasound or infrasound.
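The nominal 20 Hz to 20,000 Hz hearing range can be expressed as a simple range check; a minimal sketch:

```python
# Classify a frequency against the nominal human hearing range
# (20 Hz to 20,000 Hz).
def classify_sound(freq_hz):
    if freq_hz < 20:
        return "infrasound (inaudible)"
    if freq_hz > 20_000:
        return "ultrasound (inaudible)"
    return "audible"

print(classify_sound(10))      # infrasound (inaudible)
print(classify_sound(440))     # audible
print(classify_sound(40_000))  # ultrasound (inaudible)
```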
CPM stands for Cost Per Mille, and it represents the cost per thousand impressions of an advertisement. RPM stands for Revenue Per Mille, and it represents the revenue generated per thousand impressions for publishers. CPM is a metric used by advertisers to determine cost, while RPM is used by publishers to measure revenue.
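Both metrics are per-thousand ("mille") rates, computed the same way from opposite sides of the transaction; a minimal sketch with made-up figures:

```python
# CPM: advertiser's cost per 1000 impressions.
# RPM: publisher's revenue per 1000 impressions.
# The dollar amounts below are illustrative, not real campaign data.
def cpm(total_cost, impressions):
    return total_cost / impressions * 1000

def rpm(total_revenue, impressions):
    return total_revenue / impressions * 1000

print(f"CPM: ${cpm(50.0, 20_000):.2f}")  # CPM: $2.50
print(f"RPM: ${rpm(12.0, 8_000):.2f}")   # RPM: $1.50
```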
Bytes are units of memory; hertz is a unit of frequency.
The main difference is the package format and the distributions that use it: .rpm packages are used by Red Hat-based distributions (such as Fedora and RHEL) and managed with tools like rpm and dnf, while .deb packages are used by Debian-based distributions (such as Debian and Ubuntu) and managed with dpkg and apt.
A Hertz antenna is a half-wave (lambda/2) antenna, while a Marconi antenna is a quarter-wave (lambda/4) antenna.
The speed at 60 Hz is 600 RPM; that corresponds to a 12-pole synchronous machine, since Ns = 120 × 60 / 12 = 600.
RPM stands for Revolutions-Per-Minute, and is counted at the crankshaft. At 3000 RPM the engine is spinning twice as fast as at 1500 RPM.
Rated RPM is the maximum number of complete rotations a rotor can make per minute.
I believe it is hertz. Hz can be obtained by dividing RPM by 60.
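The conversion works in both directions, since one hertz is one revolution per second; a minimal sketch:

```python
# Convert rotational speed between RPM and Hz (revolutions per second).
def rpm_to_hz(rpm):
    return rpm / 60

def hz_to_rpm(hz):
    return hz * 60

print(rpm_to_hz(3600))  # 60.0
print(hz_to_rpm(50))    # 3000
```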
None at all.
Yes, you will see a difference in performance in both read and write tests. As a rule, 7,200 RPM hard drives are faster and perform better than 5,400 RPM drives.
20 Hertz