8.52 seconds.
To travel 500 feet at 40 mph, it would take about 8.5 seconds. This is calculated by converting 40 mph to feet per second (40 mph ≈ 58.67 fps) and dividing 500 feet by that speed: 500 ÷ 58.67 ≈ 8.52 seconds.
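The calculation above can be sketched in a few lines of Python; the constants and function name here are illustrative, not from any particular library:

```python
# Time to cover a distance at constant speed: t = d / v.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def travel_time_seconds(distance_ft, speed_mph):
    # Convert mph to feet per second, then divide distance by speed.
    speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # 40 mph ~ 58.67 fps
    return distance_ft / speed_fps

print(round(travel_time_seconds(500, 40), 2))  # ~8.52 seconds
```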
An airplane flying at 500 miles per hour will take 4 hours to travel 2000 miles (2000 miles ÷ 500 mph = 4 hours).
1,500 feet long and 500 feet wide
At 45 mph (about 20.12 meters per second), it will take approximately 25 seconds to travel 500 meters (500 ÷ 20.12 ≈ 24.9 seconds).
There are 0.3048 meters in one foot. To convert feet to meters, multiply the value in feet by 0.3048: 500 feet × 0.3048 = 152.4 meters.
There are 152.4 meters in 500 feet.
500 feet × 0.3048 meters/foot = 152.4 meters
(1 foot = 0.3048 meters)
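As a quick check, the feet-to-meters conversion is a single multiplication; this helper is just a sketch using the standard 0.3048 m/ft factor:

```python
METERS_PER_FOOT = 0.3048  # exact definition of the international foot

def feet_to_meters(feet):
    # Multiply the length in feet by the meters-per-foot factor.
    return feet * METERS_PER_FOOT

print(feet_to_meters(500))  # ~152.4 meters
```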