
DrCMS

Yes, ChatGPT is lying. You should never use ChatGPT to answer technical questions, because you need enough knowledge to answer the question yourself to realise that ChatGPT is a bullshit generator that sounds all too plausible to non-experts. The "genius programmers" behind it should have built fact-checking into it, but did not bother because that is actually hard work and they want easy money for a shitty, half-arsed job.

Dehydrating malic acid with sulfuric acid gives coumalic acid, as per this ref: [https://orgsyn.org/demo.aspx?prep=CV4P0201](https://orgsyn.org/demo.aspx?prep=CV4P0201). There is probably some fumaric acid (and some maleic anhydride) formed in the reaction, but the isolated yield of coumalic acid was 65-70%, so I would not bother trying it myself.
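For what it's worth, the overall stoichiometry of that Org Syn prep can be sketched roughly as below. This assumes the commonly cited route (decarbonylation of the α-hydroxy acid to formylacetic acid, which then self-condenses and lactonises); the prep itself only reports the 65-70% yield:

$$2\,\mathrm{C_4H_6O_5} \;\xrightarrow{\ \mathrm{H_2SO_4},\ \Delta\ }\; \underbrace{\mathrm{C_6H_4O_4}}_{\text{coumalic acid}} \;+\; 2\,\mathrm{CO} \;+\; 4\,\mathrm{H_2O}$$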


Thaumius

Use SciFinder to find procedures, not AI.


-Rano

As a general rule, don't use GPT for anything chemistry-related; it can barely adjust a reaction.


Viridono

ChatGPT cannot reliably recapitulate technical information like this, and it’s extremely important to understand why: it synthesizes its responses word-by-word based on the statistical likelihood of the next word according to the ENTIRE INTERNET as a database.

More straightforward questions have probably been asked (in some form) hundreds of thousands of times on the internet, and have usually gotten the same answer (e.g. ‘What are the four essential macromolecules of life?’, ‘What are the 20 canonical amino acids?’, etc.), so ChatGPT is more likely to consistently recapitulate the correct answer. More technical questions (or even questions with more words in them), like this one, usually require highly specific answers, and you’re just not going to get that from a language model that was trained on, and is essentially the average of, the entire written internet, non-chem stuff included.

It’s not designed to know facts. It’s designed to respond to questions the way it’s learned humans do. It obviously can’t critically think, so any answer it gives you is a weighted average of the writing samples it ‘recalls’ from its entire dataset, which again is far too general for a question like this.

All of that being said, ChatGPT can still be an enormously powerful and helpful tool if used correctly. It’s a language model. Stick to using it for semantic tasks, like summarizing information, or achieving a conceptual understanding of written-about phenomena. I’ve just started thinking of ChatGPT as a slightly more sophisticated Google search that summarizes the relevant bits of the resulting pages. But again, nothing that requires any sort of train of reasoning.
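To make the ‘statistical likelihood of the next word’ point concrete, here is a deliberately tiny toy sketch in Python. It is nothing like GPT’s actual architecture (real models use neural networks over tokens, not literal word-frequency tables), but it shows why this kind of generator produces fluent-sounding text with no built-in notion of truth:

```python
# Toy next-word generator: a bigram frequency table built from a tiny corpus.
# Illustrative only -- not how ChatGPT is implemented.
import random
from collections import Counter, defaultdict

corpus = (
    "malic acid is a dicarboxylic acid . "
    "fumaric acid is the trans isomer . "
    "maleic acid is the cis isomer ."
).split()

# Count how often each word follows each other word.
next_words = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev][nxt] += 1

def generate(start, length=8):
    """Pick each next word in proportion to how often it followed the
    previous word in the corpus -- plausible-sounding, but with no check
    on whether the resulting sentence is chemically true."""
    words = [start]
    for _ in range(length):
        counts = next_words.get(words[-1])
        if not counts:
            break
        choices, weights = zip(*counts.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("malic"))
```

Depending on the random draw, this will happily emit lines like ‘malic acid is the cis isomer’: fluent but false, which is the same failure mode ChatGPT has at a vastly larger scale.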


No-Watercress880

ChatGPT is most definitely lying.


randomnonexpert

My recollection is dodgy af, but on paper it's possible to produce fumaric acid from malic acid. It would involve a couple of reactions and conversions. Theoretical yield: high. Practical yield: low. TL;DR: you can, but you shouldn't.
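My guess at those steps (just a sketch from memory, not a procedure) would be dehydration to the cis diacid, then acid-catalysed isomerisation to the trans one:

$$\text{malic acid} \;\xrightarrow{\ -\,\mathrm{H_2O}\ }\; \text{maleic acid (cis)} \;\xrightarrow{\ \mathrm{H^+},\ \Delta\ }\; \text{fumaric acid (trans)}$$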


Glum_Refrigerator

It’s partially lying. You can isomerize maleic acid into fumaric acid using a strong acid, but that works because the acid protonates the double bond, turning it into a single bond that can rotate before the double bond re-forms in the trans configuration; it is not a dehydration. In general, ChatGPT isn’t recommended for things like this.
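As a rough sketch of that isomerization (my paraphrase, not a full arrow-pushing treatment):

$$\text{maleic acid (cis)} \;\xrightarrow{\ +\,\mathrm{H^+}\ }\; \text{cation with a rotatable C–C bond} \;\xrightarrow{\ -\,\mathrm{H^+}\ }\; \text{fumaric acid (trans)}$$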


rextrem

Someone has likely come up with a dehydration reaction: phosphoric anhydride with heat. If there's an R-CO-CH2-CH(OH)-R motif, the CH2 is acidic and the OH can become a leaving group -> dehydration.