Over and over you read threads where the warranty is a prominent part of the ownership experience for some. You read of folks paying a shop to go over the car with a fine-tooth comb before the warranty runs out so they can hand the dealer a list of things to fix. Nearly every mod question is prefaced with "does it void my warranty?" I've known people who trade in cars the moment the warranty expires and tell me, "I'd never own a car that wasn't under warranty."

Why? Is it the financial risk of having a car you don't own outright, where any cost beyond the payment (which could already be a stretch) is a deal-breaker? Is it peace of mind? I'm just curious, as I've never cared much about warranties on any car I've owned. I've never had a major repair done under warranty, only nickel-and-dime stuff, and even that has been few and far between across the Honda, Ford, and VW models I've owned that had a new-car warranty (or a portion of one left when purchased used).

Every major repair I've had in 30 years of driving has been out of warranty and would have occurred after any reasonable extended warranty ended (north of 100K miles); I keep my cars (both new and used) a long time, so I don't trade them in after a few years like many do. Both of my current VWs have the 6-year/72K warranty, and while it's nice, it really wasn't a purchasing factor for me; I've also never bought extended warranties or service contracts.

Just a discussion b/c I'm bored today.