@[email protected] to [email protected]English • 1 year agoWe have to stop ignoring AI’s hallucination problemwww.theverge.comexternal-linkmessage-square207fedilinkarrow-up1533
arrow-up1533external-linkWe have to stop ignoring AI’s hallucination problemwww.theverge.com@[email protected] to [email protected]English • 1 year agomessage-square207fedilink
minus-square@[email protected]linkfedilinkEnglish4•1 year agoI think you are spot on. I tend to think the problems may begin to outnumber the potentials.
KillingTimeItself • English • 5 points • 1 year ago
and we haven’t even gotten into the problem of what happens when you have no more data to feed it, do you make more? That’s an impossible task.
Natanael • English • 2 points • 1 year ago
There are already attempts to create synthetic data to train on.