Discussion about this post

charipol:

When humans make different calls in the same situation, we call it "flexibility". When models do so, we call it "hallucination". That doesn't seem fair.

Eric Sandosham:

Definitely something here to ponder more deeply. With machines, we are always solving for consistency, even if it is customised for a segment. What defines consistency can be nuanced. From a data perspective, it means ensuring that information signals are propagated in the same way. But from a human perspective, consistency can in fact lead to different decisions, because of the need to maintain consistent "principles" and "values".
