It is: 180/4 = 45 mph
The average speed if an airplane travels 1364 miles in 5.5 hours is 248 miles/hr.
100 miles / 4 hours = 25 miles per hour.
The average speed of a car can be calculated using the formula: average speed = total distance / total time. In this case, the car travels 100 miles in 4 hours. Thus, the average speed is 100 miles ÷ 4 hours = 25 miles per hour.
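The formula above (average speed = total distance / total time) can be sketched as a small helper; the function name `average_speed_mph` is a hypothetical choice for illustration, not from the original.

```python
def average_speed_mph(distance_miles: float, time_hours: float) -> float:
    """Average speed = total distance / total time."""
    return distance_miles / time_hours

# Car example from above: 100 miles in 4 hours
print(average_speed_mph(100, 4))  # 25.0
```

The same helper reproduces the airplane answer: 1364 miles in 5.5 hours gives 248 mph.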
It travels 288 miles in 4 hours 30 minutes at that speed.
42 mph
40 mph
Average speed = (total distance) divided by (total time) = (150 + 150) / (2.5 + 3.5) = 300/6 = 50 miles per hour.
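The two-leg calculation above generalizes to any number of legs: sum the distances, sum the times, then divide once. This sketch (with the hypothetical name `average_speed_over_legs`) makes the point that average speed is total distance over total time, not the mean of the per-leg speeds.

```python
def average_speed_over_legs(legs: list[tuple[float, float]]) -> float:
    """legs: (distance_miles, time_hours) pairs.
    Average speed = sum of distances / sum of times."""
    total_distance = sum(d for d, _ in legs)
    total_time = sum(t for _, t in legs)
    return total_distance / total_time

# Two 150-mile legs taking 2.5 h and 3.5 h: 300 / 6 = 50 mph
print(average_speed_over_legs([(150, 2.5), (150, 3.5)]))  # 50.0
```

Note that averaging the per-leg speeds (60 mph and about 42.9 mph) would give roughly 51.4 mph, which is wrong because the legs take different amounts of time.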
60 mph
If that's really what the question is, the answer is 1400 mph. If the question is actually "A truck travels 1400 miles in 20 hours, what is its average speed?" then the answer is 70 mph.
If a car travels 330 miles in 5.5 hours, its average speed is 330 miles ÷ 5.5 hours = 60 mph.
To find the average speed, divide the total distance traveled by the total time taken. In this case, the car travels 160 miles in 1.5 hours (1 hour and 30 minutes). The average speed is calculated as 160 miles ÷ 1.5 hours, which equals approximately 106.67 miles per hour.