
I've seen this sort of thing a few times: "Yes, I'm sure AI can do that other job over there that's not mine." Now maybe foot doctors work more closely with radiologists than I'm aware of. But the radiologists that I've talked to aren't impressed with the work AI has managed to do in their field. Apparently there are one or two incredibly easy tasks that it can sort of do, but they comprise a very small fraction of the job of an actual radiologist.


> But the radiologists that I've talked to aren't impressed with the work AI has managed to do in their field.

Just so I understand correctly: is it over-reporting problems that aren't there, or is it missing blindingly obvious ones? The latter is obviously a problem and, I agree, would completely invalidate it as a useful tool. The former sounded, the way it was explained to me, more like a matter of degree.


I'm afraid I don't have the details. I was reading about certain lung issues the AI was doing a good job on and thought, "oh well, that's it for radiology." But the radiologist chimed in with, "yeah, that's the easiest thing we do and the rates are still not acceptable; meanwhile we keep trying to get it to do anything harder and the success rates are completely unworkable."



