No matter how you look at it, from either side, an accident, even one without an injury, can make your life miserable. If you're struck by an uninsured driver, you will be 100% on your own for repairs, loss of use, and any medical charges. You could always sue, but if the person who hit you is broke, it would be like trying to squeeze blood from a turnip. If you're the uninsured driver and damage someone else's car, the cost of repairs can be crippling. Insurance is a good idea no matter what.

*The above answer basically says that auto insurance should be mandatory because it is good for us and the alternative leads to misery, but a lot of decisions we all make can result in misery. Eating too much or too little can make our lives miserable. The same goes for not getting enough exercise, smoking, or consuming too much alcohol or medication, yet generally there is no mandate against partaking in any of these activities, other than some age limitations. Not making wise financial decisions can have equally miserable results, and operating an uninsured vehicle is basically a financial decision. It is certain that insuring your vehicle is a "good idea," but the question is: why should it be mandatory, especially when all auto insurance companies offer policies to cover accident losses involving uninsured motorists?

*While having insurance is a wise choice, I agree with the second viewpoint. When you get right down to it, who is mandatory insurance benefiting the most? That's right: the already rich and powerful insurance companies. On the other hand, who is mandatory insurance hurting the most? Right again: the poor, who either struggle to pay for it or simply don't have insurance because they genuinely cannot squeeze the cost out of their budgets. While I stress that I believe having auto insurance is a wise choice and that it is best to have it if you are going to drive, doesn't the government already mandate our lives enough? Having insurance should be an individual choice.
Yes, auto insurance is mandatory in the state of Illinois. To learn what the minimums are, visit www.dmv.org/il-illinois.
Auto insurance is not mandatory in the United States at the federal level, but in most states it is required by law.
No. Mandatory auto insurance is a state law in Texas.
"It is not mandatory, but it is very strongly recommended because it helps pay in the event of an accident. You can get liability insurance or collision coverage."
In Florida, auto insurance requirements differ greatly from other states. The required coverages are Property Damage Liability and Personal Injury Protection. Coverages such as Collision and Comprehensive are not mandatory in Florida.
After reviewing Florida auto inspection laws: it is mandatory to have your vehicle inspected prior to getting your registration sticker, and auto insurance is required to do so.
Yes, carrying auto insurance is mandatory in nearly all states, including Georgia. Driving without insurance can result in fines and/or jail time.
I do not know what you mean. Did the insurance company cancel the policy? Did you lose the card? New Hampshire and Wisconsin do not have mandatory auto insurance laws. If you have an accident and you are to blame, you still should pay for the damage, either out of your own pocket or through an insurance company. Even if you have insurance, you still end up paying for the damage through increased rates. I am opposed to mandatory auto insurance laws, since they hurt poor people, and even the insurance industry is opposed. Go to http://www.centspermilenow.org/715oppos.htm
There is always a way to satisfy a mandatory insurance requirement: an auto insurance policy, self-insurance, a certificate of deposit, or a liability bond.
Every state but Wisconsin and Tennessee requires it, and both of those states have bills that would require insurance in 2009.
Yes. Auto insurance is mandatory if you live in Hawaii. You can read more about it here: http://hawaii.gov/dcca/ins/consumer/consumer_information/mvi