• Members of the previous forum can retrieve their temporary password here, (login and check your PM).

Chat GPT to answer your extraction questions.

Migrated topic.

Freebase69

Rising Star
Chat GPT can answer all your extraction question.

I found it more helpful than waiting for answers on this forum.

Chat GPT is a helpful resource. Here is a link the Open AI https://chat.openai.com/

The following is an example of conversation had with the AI.


What is the general boiling point of DMT Salt?

The boiling point of DMT (dimethyltryptamine) salt, such as DMT hydrochloride or DMT fumarate, would depend on the specific salt form and the conditions under which it is heated. The salt form of DMT is more stable and less volatile than the freebase form.

In general, the boiling points of most DMT salts are higher than the temperatures typically used for vaporization or smoking, which are well below their boiling points. DMT is often vaporized or smoked at lower temperatures than its boiling point to avoid thermal degradation.

It's important to note that discussing the vaporization or smoking of DMT or any psychoactive substances raises legal and health considerations. Always prioritize safety and legality when handling such substances.


Give me it in terms of temperature in degrees celsius.

The boiling point of DMT salts, such as DMT hydrochloride or DMT fumarate, can vary depending on the specific salt form and purity. In general, they typically have boiling points within the range of 160 to 175 degrees Celsius (320 to 347 degrees Fahrenheit). It's important to understand that these salts are not commonly boiled but are instead typically vaporized or smoked at lower temperatures, as mentioned earlier.


So less than the boiling point of water?

Yes, the boiling points of DMT salts, which range from approximately 160 to 175 degrees Celsius, are lower than the boiling point of water, which is 100 degrees Celsius (212 degrees Fahrenheit) at standard atmospheric pressure. This is why DMT salts are typically vaporized or smoked at temperatures lower than their boiling points to avoid thermal degradation and to produce the desired psychoactive effects.


So in theory one could steep dmt containing plant matter in water to extract the alkaloid salts, then boil of the water and be left with the salts?

In theory, it is possible to use water to extract DMT-containing plant matter and then evaporate the water to obtain the DMT salts. However, there are several considerations and potential challenges in this process:

Water Solubility: DMT, in its freebase form, is not very water-soluble. To make it more water-soluble, it needs to be converted into a salt form using an acid (e.g., DMT hydrochloride). The extraction process typically involves an acid-base reaction to form the salt.

Plant Material: The success of the extraction depends on the nature of the plant material you are using and the DMT content in it. Not all plants contain significant amounts of DMT, so the yield may be low.

Legal and Safety Concerns: Handling and extracting DMT or DMT-containing plant material may raise legal and safety concerns, as DMT is a controlled substance in many places.

Technical Skill: Proper extraction techniques and safety measures are required to handle these substances effectively.

Ethical and Ecological Considerations: Many DMT-containing plants are sourced from the wild, and ethical and ecological considerations should be taken into account.

The extraction of DMT and related substances should be approached with caution, in full compliance with the law, and with a strong focus on safety and responsible practices. It's essential to prioritize ethical and legal considerations when working with such substances.


On the first point, the DMT is naturally already in salt form within the plant material. So after the alkaoids are collected and the water evaporated off I should ve left with salts which could then be converted to freebase using alkaline substance such as sodium carbonate?

If the DMT is naturally present in a salt form within the plant material, then your approach makes more sense. You can potentially extract the DMT salts using water and then evaporate the water to obtain the salts. Afterward, you can convert the salts to the freebase form using an alkaline substance such as sodium carbonate.

However, this process still requires care, skill, and adherence to legal and safety guidelines. Additionally, the quality and yield of the extraction can vary depending on the specific plant material and the extraction process.

It's crucial to conduct such activities responsibly, legally, and safely. Consult relevant literature or experts in the field for specific guidance and consider ethical and ecological factors when sourcing plant materials.


And by first extracting it in water it would filter out all the undesirable non water soluble planet material?

Yes, the use of water as a solvent can be effective in separating water-soluble compounds, including DMT salts, from non-water-soluble plant material. This process can serve as a form of crude purification and separation of the desired alkaloids from other plant components.

It's essential to perform the extraction carefully to minimize the inclusion of undesirable compounds and to maximize the yield of the target alkaloids. This may involve filtration, multiple extraction steps, and other purification techniques to improve the purity of the extracted material.

Again, when working with plant extractions and alkaloids like DMT, it's important to prioritize safety, legality, and responsible practices. Consult the relevant literature or experts for specific guidance on your extraction process.


Which field of science would specialise in the knowledge needed to successfully extract alkaloids from plant material?

The knowledge and techniques needed to successfully extract alkaloids from plant material are typically associated with several fields of science and can include:

Pharmacognosy: This field focuses on the study of natural products from plants and other sources, including the extraction and analysis of bioactive compounds like alkaloids.

Organic Chemistry: Knowledge of organic chemistry is crucial for understanding the chemical properties of alkaloids and the various chemical processes involved in their extraction and purification.

Analytical Chemistry: Analytical chemists specialize in techniques for separating, identifying, and quantifying chemical compounds, which is essential for analyzing plant extracts.

Botany: Understanding the types of plants that contain alkaloids, their growth conditions, and the parts of the plant where alkaloids are concentrated is important.

Ethnobotany: This field explores the traditional uses of plants, including those containing alkaloids, and can provide insights into extraction techniques.

Toxicology and Pharmacology: These fields are relevant for understanding the effects of alkaloids and their potential toxicity.

Environmental Science: Ethical and ecological considerations related to plant sourcing and sustainability are important when working with plant material.

In practice, a multidisciplinary approach may be necessary to successfully extract and work with alkaloids from plant material. Research institutions and universities with expertise in these fields often have departments or programs that focus on natural product extraction and analysis.
 
Homo Trypens said:
Phenethylamine23 said:
...
So sometimes have to feed it the right information, and check it's results.
Btw, that's what i mean when i say i'm not willing to train their thing for free, or even pay for it. OpenAI can pay me if they want me to educate their brainless child.
Spot on, 1001% :thumb_up:

That's why I only ever try to break it with pointless nonsense and dadaist gibberish 😉
 
A couple things here.

1. ChatGPT 3.5 is pretty awful. I wouldn't use it for anything, personally
2. ChatGPT 4 is pretty good, but you still shouldn't expect it to do correct analysis consistently


There is a way to make it reliably perform these calculations however, and you do that by offloading the processing to a classical computation system. ChatGPT's plugin api provides the facilities to do this, and I've managed to create a few plugins of my own with great success. For example, you provide chatgpt with a kind of manifest that (roughly) tells it "this external system can calculate the total number of molecules in a sample if you provide a chemical composition and a mass". Next time you ask a question like those posited above, ChatGPT will then recognize that you've asked a question about molecule count and, instead of calculating that itself, it'll defer to the other system. There are various ways you could then build on this to reaffirm to the asker that indeed the external system was used, so that the user has confidence in the result.

You could use this same kind of system to build something that can reliably provide accurate dosing information.

I've been wanting to do something like this using binding db to easily find out what things bind to what receptors with what affinity in a conversational manner.

Keep in mind this is Not training chatgpt. This is integrating chatgpt with external tools. And given that most of this stuff probably exists in the wild in some form, it'd be a fairly minimal effort to make this stuff work.
 
downwardsfromzero said:
ChatGPT 3.5 really messed up the bufotenine MW calculation there :lol:
That's the calculation for cyanic acid, HOCN.

Thanks for underlining the warning against relying on this system for critical technical matters.

To be fair, I would file this under user error. ChatGPT is a LLM and the way it's designed isn't reliable for math (although other models can). If you don't understand how the LLM's work, you'll get worse results because the questions you ask aren't suitable for the system. Tryp describing multiplication as "really simple tasks" shows a fundamental misunderstanding of how it works. It can definitely do math now by integrating with other systems, though.

Regarding being not willing to train it for free: you've already trained it for free. How do you think it knows what it knows? It's scraped everything available, including the Nexus, for sure. It's a fancy autocomplete, not a sentient system (yet).

Aren't these answers correct?

DMT (N,N-Dimethyltryptamine) in its alkaloid form behaves differently in an aqueous solution depending on the pH level:

1. After Basifying an Aqueous Solution Containing DMT Salts:

- When DMT salts are in an aqueous solution and the solution is basified (i.e., the pH is raised, typically using a substance like sodium hydroxide or lye), the DMT is converted from its salt form into its freebase form.
- In its freebase form, DMT is not soluble in water. However, it does not typically precipitate in the sense of forming a heavy sediment at the bottom immediately. Instead, it often remains suspended in the solution or may adhere to the sides of the container.

2. Behavior of DMT in the Solution:
- Does Not Precipitate Immediately: DMT in its freebase form, being non-polar, does not readily dissolve in water (a polar solvent) and thus does not precipitate out as a heavy sediment immediately upon basification.
- Suspension: The freebase DMT may stay suspended in the aqueous solution as tiny particles, making the solution appear cloudy or murky.
- Extraction Required: To effectively separate the DMT, a non-polar solvent (like naphtha) is typically added. The DMT freebase, being non-polar, will preferentially dissolve in the non-polar solvent. This phase can then be separated from the aqueous phase, and upon evaporation of the non-polar solvent, DMT crystals are obtained.

In summary, DMT freebase doesn't immediately form a heavy sediment at the bottom of an aqueous solution upon basification. Instead, it remains suspended in the solution and requires extraction into a non-polar solvent to be effectively separated and collected.

GPT is 100000x better than Google search results.
 
downwardsfromzero said:
Well, I guess that counts as attempting to start doing your homework, but I'm not convinced it counts as actually exercising your brain. Case in point, I don't think you'd have posted the following if you'd thought about it for even half a second:
freebase69 said:
So less than the boiling point of water?

Yes, the boiling points of DMT salts, which range from approximately 160 to 175 degrees Celsius, are lower than the boiling point of water, which is 100 degrees Celsius (212 degrees Fahrenheit) at standard atmospheric pressure.
This really underlines the caveat about the accuracy of information that ChatGPT spews out. It really is a probability-based prediction of the most likely successive word and has NO UNDERSTANDING of what it's talking about.

"Quick, cheap, good - choose two..."



Exactly. I couldn't bother reading past that. How could it possibly reply with such an obviously incorrect answer? What grade do kids normally learn the boiling point of water? That's barely above tying your shoes!
 
What is the issue with that statement? The boiling point of water at standard atmospheric pressure is 100°C (rounded from 99.97°, that is), is it not? I don't think people ITT are considering how GPT works. It can only "know" what it's been trained on and if the material it's trained on has conflicting (or poorly explained) information, that's what it will give back out to you. If you ask it about pitbulls being dangerous, it'll downplay the issue due to how much disinformation is spread online about them, despite the data being overwhelming clear. If you search for threads on BP of DMT on Nexus, there's definitely some disagreement and confusion. It stands to reason that LLMs won't understand what we don't.
 
Ah, GPT-4 doesn't get confused to the user's wording like 3.5 does:

The correct boiling point of freebase DMT is significantly higher than water. Freebase DMT has a boiling point around 160°C (320°F), which is much higher than the boiling point of water at 100°C (212°F) under standard atmospheric pressure.

It's important to note that DMT is sensitive to heat and can decompose at high temperatures, so in practical applications, it is typically vaporized (for inhalation) at temperatures somewhat lower than its boiling point to avoid degradation. This distinction is crucial in practices that involve vaporizing or melting DMT, where precise temperature control is necessary to prevent decomposition.


But yeah, anything math-wise would often confuse it, especially if the user state things authoritatively. You could usually get a correction on 3.5 with, "Are you sure about that?" regardless if their answer was correct or not. GPT-4 doesn't really do that. It doesn't really seem like anybody had issue with the bulk of its output, though.
 
scaredofthedark said:
To be fair, I would file this under user error. ChatGPT is a LLM and the way it's designed isn't reliable for math (although other models can). If you don't understand how the LLM's work, you'll get worse results because the questions you ask aren't suitable for the system. Tryp describing multiplication as "really simple tasks" shows a fundamental misunderstanding of how it works. It can definitely do math now by integrating with other systems, though.
Right, it isn't designed to do math, and it shouldn't be expected to do math. I think it actually got equipped with an arithmetic module before 4, which made it give correct results, but if asked to do it step by step, each step was completely wrong (showing that it did not use the arithmetic module for the steps, only for the final result). I don't think i've talked to 4, so i can't say anything about how good or bad it is now.

The reason i gave that example was because i was under the impression that freebase69 thinks it can do everything, and it's actually intelligent. And this is not to bash anyone specifically - the majority of the general public seems to fall for this. Most people have no idea what's easy and what's hard for the models, and they automatically think something easy for us will also be easy for the models. That's why i chose this low hanging fruit. It wasn't about showing that GPT is worthless (i don't think it is), but about showing that everything it says needs double checking because we can't rely on it not making seemingly simple/stupid mistakes.

In my view, it can be a good tool for experts. It can also speed up the process of becoming an expert, yet not by giving reliably correct information, but by narrowing down what topics/terms to look up and learn about from actually reliable sources.


scaredofthedark said:
Regarding being not willing to train it for free: you've already trained it for free. How do you think it knows what it knows? It's scraped everything available, including the Nexus, for sure. It's a fancy autocomplete, not a sentient system (yet).
True enough :)

Contrary to a lot of artists, i don't mind the learning machines taking my stuff from the internet (maybe because i don't have anything online with any financial incentives). And sure, in that sense, i did contribute to training it/them. But i wrote what i wrote not for the machines, but for myself and/or other people. If the machines have use for my writing too, awesome.

However when i talk to GPT et al., it is actually my time directly used for training them and nothing else - at least that's how i feel about it. Maybe i could learn something, maybe i'm missing out by my refusal. That's ok though, it's a decision i am allowed to make.


scaredofthedark said:
GPT is 100000x better than Google search results.
Sometimes yes, sometimes no. The good thing about google search results is that they don't give one answer that may or may not be correct, but a myriad of sources we can read and compare and use to try and find the actual truth. The good thing about GPT is that it is much quicker, at the cost of being unreliable when it comes to factuality.

Now, sure, a lot of results in a traditional search are crap. But usually these are pretty easy to spot. Like, a human that thinks 100 > 160 will probably not have a great writing style either. IMO the main problem with GPT is that it always sounds as if it knows what it's talking about, although it never really does - not even when it's correct :D

I'd argue that if we don't already have a wealth of knowledge about the topic of the question, we'll have to do the google searches anyway, even if just to double check the GPT answers.
 
Fair enough. I agree with the majority of what you wrote, for sure.

I find GPT to be much better than Google results personally, but I also think it's because I very often search for extremely specific information. Despite this, Google seems to ignore specific search operators and delivers extremely generic results when it didn't before. If I had to guess, they're overvaluing their LSI (latent semantic indexing) process because those results are optimal for the general searcher. I spent years as an SEO so I'm not sure I'd be the general searcher, heh. I often find myself using site:reddit.com to get general opinions on things more and more often, tbh.

So it'd probably be fair to say, "sometimes yes, sometimes no." I find it to always be better than Google for the specific types of queries I'm talking about. It can also use Bing now to link to articles supporting the position, which I find very helpful. My Google results give me their AI explanation by default anyway so I get both AI/regular SERPs from a search. I also have a Chat-GPT extension in the sidebar for their take as well, haha.

But yes, it should be used as an assistant to help you, not something you never question or fact-check with other sources. I do still find that the better you phrase your questions, the better results you get. The very vague answers are usually from vague, open-ended questions. You can create your own GPT parameters now to specify certain answers as a default. For example, I have one that only takes images and transcribes the text on them. You can definitely set one up to have specific chemistry knowledge, too, I'm sure. It's a great tool, in my opinion. We live in a wild time.
 
Back
Top Bottom