24 fps is required to get the proper cinema look that people associate with high-end, movie-theater quality. 60 fps makes everything look like TV, and consequently cheap. Try watching a movie on your flat screen with frame interpolation turned on (likely giving you 60 or 120 fps), and with it turned off (true 24 fps): it will probably look much more like a “real” movie at a true 24 fps. There were some complaints over The Hobbit’s high frame rate for this reason.
The argument against 24 fps, in the era of CRTs, was that it didn’t divide evenly into NTSC’s 60 fields per second, so 3:2 pulldown was required. In the era of 120 Hz LCD and plasma TVs, you don’t have to worry about 3:2 pulldown judder.
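For what it’s worth, here’s a quick sketch of how the 3:2 cadence works, just to make the mismatch concrete (Python, names and structure are mine, not from any real video pipeline):

```python
# Rough sketch of the 3:2 pulldown cadence: 24 film frames -> 60 interlaced fields per second.
# Illustrative only; real hardware also has to deal with field parity, 59.94 Hz timing, etc.

def three_two_pulldown(frames):
    """Yield (frame, field_parity) pairs; frames alternately span 3 and 2 fields."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2      # A=3 fields, B=2, C=3, D=2, ...
        for _ in range(repeat):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# One second of film: 24 frames become 60 fields (12*3 + 12*2 = 60).
fields = three_two_pulldown([f"F{i}" for i in range(24)])
print(len(fields))  # 60
```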
If I put a typical Blu-ray Disc from my collection into my player, it’s going to be 1080p24. I can either watch it at 60 Hz, in which case a good many frames will be slightly blurred by the 3:2 pulldown, or at 48 Hz, in which case all the frames will be sharp.
There’s no option to see a version produced at 60 fps. Sure, I can view a couple of other discs made for 1080i60, but odds are it won’t be the same content. A direct comparison is impractical.
Most TVs have a mode where they will interpolate intermediate frames up to 60, 120 or 240 fps. This “feature” goes by different names, such as MotionFlow, ClearScan, TruMotion, etc. On most models such frame interpolation is enabled by default, and it can even be difficult to turn off (even when the TV is set to film mode).
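The basic idea is just to synthesize new frames between the originals. Here’s a deliberately crude sketch using a plain linear blend (real sets use motion-compensated interpolation, which is far more involved; the function name and shapes here are made up for illustration):

```python
import numpy as np

def naive_interpolate(frame_a, frame_b, n_new):
    """Crude stand-in for TV motion interpolation: linearly blend between two
    consecutive frames to synthesize n_new intermediate frames.
    Real TVs estimate motion vectors instead of simply cross-fading."""
    return [
        ((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype)
        for t in np.linspace(0, 1, n_new + 2)[1:-1]
    ]

# Going from 24 fps to 120 fps means inserting 4 synthetic frames
# between each pair of original frames.
a = np.zeros((1080, 1920), dtype=np.float32)
b = np.ones((1080, 1920), dtype=np.float32)
mids = naive_interpolate(a, b, 4)
print(len(mids))  # 4
```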
Outputting a 24p signal to a panel display should never produce blurred or interlaced frames, unless your TV simply does 3:2 pulldown wrong; it should be displaying fully progressive frames. Going to 48 Hz should reduce judder, but it has no effect on how sharp the frames are.
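To put numbers on that (a quick back-of-the-envelope sketch, assuming an ideal display with no processing delays):

```python
# Frame hold times for 24 fps content: uneven holds cause judder, even holds don't.

refresh_60 = 1000 / 60   # ~16.7 ms per refresh at 60 Hz
refresh_48 = 1000 / 48   # ~20.8 ms per refresh at 48 Hz

# 60 Hz with 3:2 pulldown: frames alternate between 3 and 2 refresh periods.
hold_60 = [3 * refresh_60, 2 * refresh_60]   # ~50.0 ms, ~33.3 ms -> uneven, judder
# 48 Hz: every frame is held for exactly 2 refresh periods.
hold_48 = [2 * refresh_48, 2 * refresh_48]   # ~41.7 ms, ~41.7 ms -> even, no judder

print([round(t, 1) for t in hold_60])  # [50.0, 33.3]
print([round(t, 1) for t in hold_48])  # [41.7, 41.7]
```

Either way, each individual frame is shown whole and progressive; only the timing changes, which is why it affects judder and not sharpness.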