When humans make different calls for the same situation, they call it "flexibility". When models do so, they call it "hallucination". Doesn't seem so fair.
Haha this is so true!
Definitely something here to ponder more deeply. With machines, we are always solving for consistency, even if it's customised for a segment. What defines consistency can be nuanced. From a data perspective, it means ensuring that information signals are propagated in the same way. But from a human perspective, consistency can in fact lead to different decisions, because of the need to maintain consistent "principles" and "values".