Hacker News
koalala on Dec 26, 2022 | on: What ChatGPT can't do
From the comments on the blog: couldn't this issue be solved by prime-prompting it with instructions to always lay out its steps before answering? That way the answer wouldn't be 'tainted' by the first few words being the wrong answer.
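The idea in the comment can be sketched in code. This is a minimal illustration, not a known-good prompt: `build_primed_prompt` and `extract_answer` are hypothetical helpers, and the instruction wording is an assumption about what "prime-prompting" might look like.

```python
# Sketch of the "lay out steps first" priming idea: instruct the model to
# reason before committing to an answer, then parse the answer out of the
# step-by-step output. Helper names and prompt wording are illustrative.

def build_primed_prompt(question: str) -> str:
    """Prefix a question with an instruction to reason step by step first,
    so the answer is not fixed by the first few words of the response."""
    instruction = (
        "Lay out your reasoning step by step first. "
        "Only after the steps, state your final answer on a line "
        "beginning with 'Answer:'."
    )
    return f"{instruction}\n\nQuestion: {question}"

def extract_answer(model_output: str) -> str:
    """Pull the final answer line out of a step-by-step response."""
    for line in model_output.splitlines():
        if line.startswith("Answer:"):
            return line[len("Answer:"):].strip()
    return model_output.strip()  # fall back to the raw output
```

The prompt would be sent to whatever model is in use; `extract_answer` then recovers only the final line, discarding the intermediate steps.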