Wolfram Alpha and ChatGPT might look similar on the surface, but they are very different on the inside.
ChatGPT is really a generator of text that reads well. The text doesn't have to be true, it just has to read fine. The problem is that the end result can be text that reads fine but describes something that is simply untrue.
Wolfram Alpha, on the other hand, only produces statements that are true with respect to its underlying symbolic representation of math. The problem with this is that you might get something that is true, but not what you were actually interested in.
In the end, combining the two might produce very interesting results.
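Conceptually, the combination is just a routing step: the language model hands the computational part of a question to Wolfram Alpha, takes back a verified result, and only does the phrasing. Here is a minimal sketch of that idea, not how the actual plugin is implemented; it assumes Wolfram's public Short Answers API, a WOLFRAM_APPID environment variable, and a `phrase_answer` stub standing in for the language-model step:

```python
import os

import requests

# Hypothetical setup: you need your own Wolfram Alpha app id in this env var.
WOLFRAM_APPID = os.environ["WOLFRAM_APPID"]


def wolfram_short_answer(query: str) -> str:
    """Send a question to Wolfram Alpha's Short Answers API and return
    the single-line computed result."""
    resp = requests.get(
        "https://api.wolframalpha.com/v1/result",
        params={"appid": WOLFRAM_APPID, "i": query},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.text


def phrase_answer(question: str, computed: str) -> str:
    """Stand-in for the language-model step: in a real setup the model
    gets the computed result back and writes the prose itself."""
    return f"For '{question}', Wolfram Alpha computes: {computed}"


if __name__ == "__main__":
    q = "population of France divided by population of Norway"
    print(phrase_answer(q, wolfram_short_answer(q)))
```

The point of the split is that the part that has to be true comes from the symbolic engine, and the part that has to read well comes from the text generator.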
I know that.
Looks like you made this comment before reading up on how the Wolfram Alpha plugin for ChatGPT works.