Would it be correct to say that a 3.5/5 movie rating on a 5-point scale with half-star increments isn’t exactly the same as a 7/10 on a 10-point scale with half-point increments, even though the two are mathematically equal? The reason is that half a star on the 5-point scale visually reads as a smaller step than a full point on the 10-point scale. So while a great scene might earn a half-star bump on the 5-point scale, that same scene wouldn’t necessarily add a full point on the 10-point scale. Rated out of 10, the movie would probably land closer to a 6.5, which converts to 3.25/5, or simply 3/5 once you round down to a half star. This suggests that converting ratings between scales doesn’t always preserve the rating you actually intended.
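To make the rounding effect concrete, here’s a quick Python sketch (my own illustration, with made-up helpers `rescale` and `snap`, not anything from an actual rating site): a 6.5/10 converts to 3.25/5, which doesn’t land on a half-star step, so it gets snapped down to 3/5.

```python
# Rough sketch of the conversion described above: rescale a rating linearly,
# then snap it to the target scale's allowed increment.

def rescale(rating, from_max, to_max):
    """Linear conversion from a 0..from_max scale to a 0..to_max scale."""
    return rating * to_max / from_max

def snap(rating, increment):
    """Round a rating down to the nearest allowed increment (e.g. half a star)."""
    return (rating // increment) * increment

print(rescale(3.5, 5, 10))              # 7.0  -> the "clean" equivalence
print(rescale(6.5, 10, 5))              # 3.25 -> doesn't land on a half-star step
print(snap(rescale(6.5, 10, 5), 0.5))   # 3.0  -> what I'd call "simply 3/5"
```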
Would I be right in claiming this?
I think I’d probably claim that too. And yet I instinctively tend to think of 70% as worse than 7/10, even though that makes no sense.