Antimatter experiment validates Einstein's theory of relativity

Experiments with the new ALPHA-g apparatus at CERN offer support for Einstein's general theory of relativity from 1915. Physicists are diving into the mysterious interaction between gravity and antimatter, and their findings on these intriguing particles finally answer the question of whether antiparticles behave differently from matter particles in gravitational fields. 

Antimatter has all the same properties as regular matter, but with opposite charge. You may have heard of it mostly in sci-fi literature and films, but it is nevertheless a real phenomenon. The ALPHA-g experiment focuses on producing and studying antiparticles, especially antihydrogen atoms. The “g” stands for gravity, as the experiment investigates how the antiparticles respond to gravity.  

Einstein's general theory of relativity has taught us almost everything we know about gravity, but there is one question it could not anticipate: how do antiparticles respond to gravitational fields? About 15 years after Einstein's publication, the counterpart of the electron, the positron, was predicted by Dirac and soon observed experimentally. This new finding raised new questions and, may I say, even challenged Einstein's famous theory of gravity. Will Einstein's theory hold for something that was discovered long after its publication? 

In the general theory of relativity, Einstein states that regardless of a particle's inner structure, all masses react identically to gravity. Antimatter particles have the same mass as their matter counterparts and shouldn't be an exception to this theory. 
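In equation form, this is the familiar weak equivalence principle (a standard textbook statement rather than anything specific to the ALPHA-g paper): the inertial mass that resists acceleration and the gravitational mass that gravity pulls on are the same, so the acceleration of free fall does not depend on what the falling object is made of,

$$ m_i\,a = \frac{G M m_g}{r^2}, \qquad m_i = m_g \;\Rightarrow\; a = \frac{G M}{r^2} \approx 9.8\ \mathrm{m/s^2}\ \text{at Earth's surface,} $$

for matter and antimatter alike.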

The research team at CERN made antihydrogen atoms by combining antiprotons and positrons. This resulted in antihydrogen atoms with no charge, which could be sent up into the vertical ALPHA-g machine, where they were held in a magnetic “atom trap” and then released. Depending on the direction the atoms moved in, the scientists could see how the antimatter particles reacted to Earth's gravitational field. 

The outcomes of the experiment were the same as those observed with regular hydrogen atoms, with around 80% of both moving downward after release. This reinforces Einstein's prediction that antimatter and matter particles react the same way to gravity, regardless of how exotic antiparticles may sound. 

The results not only confirm the attractive gravitational force between Earth and the antiparticles, but also rule out the possibility that the force could be repulsive, which has been a hot topic ever since the discovery of the first positron. Repulsive gravity would mean that antiparticles move up instead of down like regular particles, and with this experiment we can now confidently say that this is not the case. Unfortunately, it also challenges some cosmological models that rely on repulsive matter–antimatter gravitation.
 

Anderson, E.K., Baker, C.J., Bertsche, W. et al. Observation of the effect of gravity on the motion of antimatter. Nature 621, 716–722 (2023). https://doi.org/10.1038/s41586-023-06527-1
 

 

Plastic-eating Bacteria: a Natural Solution for Plastic Pollution

Over the span of the last century, plastic has become an indispensable part of our daily lives. Have you ever wondered what happens to all that plastic we use every day? Plastic, and more specifically PET, which is short for polyethylene terephthalate, is a useful material due to its flexibility, low moisture absorption and dimensional stability. However, it also has downsides, mostly when it comes to recycling and environmental impact. Plastic waste accumulation has become a global threat that requires immediate action. If we ignore this plastic problem, it could damage our health and harm all living things. Due to the material's non-biodegradable nature, finding a solution to the plastic pollution crisis is a tricky matter. The method with possibly the most potential as of now could be biodegradation using plastic-eating bacteria.

Polyethylene terephthalate is a petroleum-based polymer composed of monomeric residues of terephthalic acid (TPA) and ethylene glycol (EG) bonded via ester linkages, which greatly contribute to the stability of the molecule. This property of PET makes it a complex material to recycle; it's not as easy as tossing it in the blue bin. A few methods have been developed for that purpose. There is physical recycling, which involves processes like grinding and melting, and there is also chemical recycling, which employs techniques like hydrolysis or glycolysis. However, these methods have major drawbacks, as they consume great amounts of energy. They also involve the use of harsh chemicals and are not suitable for use on a larger scale. Moreover, these processes can be harsh on our planet, sparking eco-concerns. That's why we're on the hunt for friendlier and greener alternatives.

In response to the limitations of traditional methods, biodegradation emerges as a cleaner and gentler alternative. It utilizes the ability of certain bacteria to enzymatically degrade plastic, offering a more sustainable solution for plastic pollution. It's essentially nature's way of breaking down plastic into tiny, recyclable bits. The bacterium discovered to possess plastic-eating abilities is Ideonella sakaiensis, and it opens new possibilities for harnessing biodegradation as an effective means of plastic recycling. Ideonella sakaiensis produces two enzymes crucial for this process, IsPETase and MHETase. They break down PET into smaller pieces, so-called monomeric forms, through enzymatic hydrolysis. The bacterium, when coming into contact with the PET surface, binds to it, which triggers the release of the enzymes. The enzymes are believed to be transported through the extracellular appendages of the bacterium. IsPETase breaks PET into MHET and TPA, and during the second step of the process MHETase further decomposes MHET into TPA and EG (ethylene glycol). These major products can be further processed to make new plastic polymers, which makes biodegradation a very advantageous method of recycling.
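Written as a simplified scheme (leaving out minor intermediates and the exact stoichiometry), the two-step enzymatic hydrolysis looks roughly like this:

PET + water --IsPETase--> MHET (mono(2-hydroxyethyl) terephthalate) + TPA
MHET + water --MHETase--> TPA + EG

The two end products, TPA and EG, are exactly the building blocks needed to polymerize fresh PET.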

Imagine a world where plastic is recycled by tiny creatures instead of harsh methods. That's the potential of biodegradation compared to traditional recycling! It stands out for its ecological compatibility. The method doesn't require harsh chemicals or great amounts of energy; instead, it utilizes the natural ability of bacteria to metabolize plastic. Using naturally occurring organisms can offer cost advantages in terms of raw materials and process simplicity. However, the limitations of this recycling method must also be addressed. Research still needs to be done on scaling up biological processes, including microbial culture limitations and optimizing conditions for large-scale applications. There are some challenges on the horizon, but diving into this research is like unlocking a more eco-friendly and energy-saving way to manage our waste. Seems totally worth it, right?

References:

Kaur, K. et al. (2023). Recent Advancements and Mechanism of Plastics Biodegradation Promoted by Bacteria: A Key for Sustainable Remediation for Plastic Wastes. Biosciences Biotechnology Research Asia, 20(1), 1–12.

Maja Kubik

The Colourful Chemistry of Synthetic Dyes

In a world awash with synthetic dyes, a hidden reality lies beneath the enchanting spectrum of colours. Recently, synthetic dyes have claimed the spotlight across various industries, owing to their cost-effectiveness and superior efficiency compared to their natural alternatives. These artificial compounds stealthily make their way into unexpected places such as cosmetics, food and pharmaceuticals, and inevitably find their way into our waters. 

The careless disposal of liquid waste into our waterways wreaks havoc on aquatic life, degrades water quality and hinders plant growth. Our water bodies face an increasing cocktail of chemicals, but fortunately, there's a glimmer of hope in the form of ionising radiation. 

High-energy ionising radiation presents a groundbreaking solution, wielding the power to dismantle the resilient molecular structures of synthetic dyes. This advanced oxidation process (AOP) distinguishes itself as one of the select methods capable of effectively neutralising the conjugated bonds within dye molecules. 

Conjugation takes place when molecules contain adjacent double bonds, leading to the delocalisation of electrons throughout the chain. These liberated electrons gain the freedom to move among elements and bonds, spreading and stabilising the energy distribution of the molecule. 

These conjugated systems lay the groundwork for chromophores, the light-absorbing constituents responsible for the distinctive colours of dyes. But why are these robust systems problematic? The longer the conjugation chain, the higher the stability of the molecule, complicating the chemical and biological treatment of wastewater.  

How, then, can radiation make a difference? Radiolysis, the breaking of chemical bonds by ionising radiation, becomes a crucial player in water treatment. It generates hydroxyl radicals that oxidise contaminants and reduce their molecular weight, gradually decreasing the concentration of pollutants in wastewater. 
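As a rough sketch (the full radiation chemistry is more involved than this), irradiating water splits it into a mix of short-lived reactive species, of which the hydroxyl radical is the main oxidant:

H2O --ionising radiation--> OH• (hydroxyl radical) + H• + hydrated electrons + H2 + H2O2

It is these OH• radicals that go on to attack the conjugated bonds of the dye molecules.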

A research team from the Russian Academy of Sciences delved into the impact of hydroxyl radicals on dye discoloration. With a LINS-03-350-EURF linear accelerator, they irradiated four distinct groups of universal dyes. The visible colour of the solutions, or optical absorption, was measured using a spectrophotometer. The results revealed that even with the slight addition of both hydroxyl and hydrogen radicals, significant discoloration occurred across a range of dye solutions. 
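The link between colour and concentration here is the textbook Beer–Lambert law (not derived in the paper itself): absorbance A = ε·l·c, where ε is the dye's molar absorptivity, l the path length of the sample cell and c the dye concentration. A drop in absorbance at the dye's characteristic wavelength therefore signals that the dye itself is disappearing.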

Hydroxyl radicals prove to be effective tools in dismantling the complex conjugation of vulnerable dye molecules. The addition of a hydroxyl radical induces stress on the conformation of the dye molecule, ultimately damaging intramolecular conjugate bonds. 

Despite radiolysis presenting itself as a viable option for large-scale wastewater treatment, its implementation faces hurdles due to high costs and inherent complications. It was found that at this stage in the development of radiolysis, wastewater simply could not be irradiated quickly enough, leading to low yields of actual dye decomposition. Yet, even with these flaws, the potential role of radiation and radiochemistry in shaping our future remains a compelling prospect. 

Sofia Vilkman

Kholodkova, E.M., & Ponomarev, A.V. (2022). Degradation of the Chromophore Functions of Dyes in Irradiated Solutions. High Energy Chemistry, 57(2), 146–150.

Genetic scissors: will they cure cancer?

 

Our genes hold the secret to everything in life. One revolutionary breakthrough in genetic research is CRISPR/Cas9. Imagine a tool that can cut our DNA with impeccable precision and edit our genes. That’s CRISPR/Cas in a nutshell. But how does it work? And why is it used in cancer research?  

CRISPR, which stands for Clustered Regularly Interspaced Short Palindromic Repeats, was originally found to be a natural defense mechanism against viral attacks in bacteria. The bacteria cut the viral DNA and place it between these short palindromic DNA sequences. It works the same way as taking a screenshot and storing it on a hard drive. This way the bacteria remember the attack and can act quicker if infected by the same virus. 

Alongside CRISPR, bacteria also have special Cas proteins, short for CRISPR-associated proteins. Such a protein, together with crRNA (transcribed from the viral DNA) and tracrRNA (complementary to the palindromic repeats), makes up the effector complex. If the bacteria encounter the same virus, the effector complex finds and cuts the viral DNA so it no longer works. This prevents the virus from doing further damage and spreading. 

This phenomenon inspired a new gene-editing tool: CRISPR/Cas9. We can synthesize and glue together the tracrRNA and the crRNA. This new piece of RNA is called sgRNA, and together with the special Cas9 protein it can edit any gene. The protein uses the sgRNA to find the right place in the DNA, and then cuts it with its molecular scissors. So, in theory, you could edit any gene if you just know its location. It works just like a tiny pair of scissors.  
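To make the “find the right place” step concrete, here is a small Python sketch (a toy illustration, not a real guide-design tool): Cas9 needs an NGG motif, called a PAM, right next to the roughly 20-nucleotide stretch that the sgRNA matches, so candidate target sites can be found by scanning a DNA sequence for exactly that pattern.

import re

def find_cas9_targets(dna):
    """Return (position, protospacer, PAM) for every 20-nt stretch followed by an NGG PAM."""
    dna = dna.upper()
    targets = []
    # The lookahead lets overlapping candidate sites be reported as well.
    for match in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", dna):
        protospacer, pam = match.group(1), match.group(2)
        targets.append((match.start(), protospacer, pam))
    return targets

# Made-up example sequence, purely for illustration.
sequence = "ATGCGTACGTTAGCATCGATCGGCTAGCTAGGAGGCTTAA"
for pos, protospacer, pam in find_cas9_targets(sequence):
    print(f"candidate site at {pos}: guide would match {protospacer}, PAM {pam}")

Real guide design also weighs off-target matches elsewhere in the genome, which is far beyond this little scan.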

We can cut the DNA so that a gene loses its function, or insert new information; these are the knock-out and knock-in methods. When the Cas protein cuts the DNA, it can also remove a few nucleotides (the basic building blocks of DNA and RNA), which results in a permanent loss of function in the gene. When the cell tries to repair the cut, it uses the NHEJ (non-homologous end joining) repair mechanism. The broken ends can usually be glued together with ease, but with too few building blocks the end product is incorrect. An incorrect DNA sequence causes the gene to lose its function; it's like trying to repair a jigsaw puzzle without all the pieces. This method is called knock-out. With knock-in, the cell uses another repair mechanism called HDR (homology-directed repair), which means that the cell uses a template as a guide to repair itself. It's like building a jigsaw puzzle using the picture on the box as a guide. If we provide the cell with another template, we can add new information.  

We have also found a way to use the Cas9 protein to stop the transcriptional process, the process where the cell reads the DNA recipe and “bakes” proteins. This means the DNA isn't altered, but the Cas9 protein prevents the cell from producing proteins. With the same principle, we can also activate genes that might be “underperforming”. This is called transcriptional activation. 

Recent research has found that CRISPR/Cas could be used to find a cure for cancer. We can study our genes and analyze how medication affects them. By understanding how a specific gene works, we can create target-specific drugs that could be used as a cancer treatment. We can use the previously mentioned methods to enhance the effectiveness of the medication.  

Compared to its alternatives (for example, ZFN and TALEN), CRISPR is the most popular editing tool and seems to be the best in many ways, but there are a few things to consider. One of them is time. Changing genes using CRISPR is not a fast or easy task. It takes time to make all the necessary proteins and molecules, it takes time and precision to add them into the cells, and it takes time for the DNA to be altered. It's also an expensive process. 

We also need to consider ethics. We need to study human genes to understand how cancer affects them. Human reproductive cells or embryonic cells could give us much information helpful for cancer research, but scientists are debating whether using them is ethical. If we are allowed to edit embryonic cells to prevent diseases, could we also be allowed to edit them to alter our appearance? CRISPR is a useful tool for understanding our genes and curing diseases, which is why scientists are trying to make it as good and as safe as possible. It's a time-consuming and expensive process, but it's the best one we have. 

Yan, R., Wang, J., Liu, M., & Zhou, K. (2022). CRISPR accelerates the cancer drug discovery. BIOCELL, 46(10), 2159–2165. https://doi.org/10.32604/biocell.2022.021107

Linnea Flytström

 

 

Exploring Galactic Cosmic Rays

When you hear the words “Galactic Cosmic Rays”, it sounds like something straight out of a science fiction book, but let me tell you that they are real. Yet, what exactly are they? This is a question that even scientists still struggle to answer, even though the rays were first observed back in 1912. What we do know is that they are particles produced outside of the solar system that “rain” down on Earth at nearly the speed of light. But why are these particles important?

Galactic Cosmic Rays (GCRs for short) are very energetic particles that, when they come into contact with the Earth's atmosphere, create a cascade of nucleonic, muonic and electromagnetic processes. At the same time, these cascades that the GCRs form are what allow us to detect them. Scientists have been using ground-based detectors called neutron monitors (NMs) since 1951, and these have been able to measure variations in GCR flux over periods ranging from hours to solar cycles. The count rate of an NM is caused by the local flux of secondary nucleons. It can be presented as follows:
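In its standard form (as used in the modulation literature; the exact notation varies from paper to paper), the count rate is an integral of the primary GCR flux weighted by the detector's yield function:

$$ N(t) \;=\; \sum_i \int_{E_{c,i}}^{\infty} J_i(E, t)\, Y_i(E)\, \mathrm{d}E $$

Here J_i(E, t) is the flux of GCR species i (protons, helium and heavier nuclei) arriving at Earth, Y_i(E) is the yield function describing how efficiently a primary particle of energy E produces counts in the monitor, and E_c,i is the lowest energy that can reach the detector through the geomagnetic field.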

I won't go into detail about this complex equation, but it allows us to calculate the flux of the secondary nucleons, and from that we can look back and get information about Galactic Cosmic Rays. Expanding our cosmic toolkit, scientists also utilise radiocarbon, ¹⁴C (yes, the one used to work out how old things are), and the cosmogenic isotope ¹⁰Be to explore Galactic Cosmic Ray variability. Radiocarbon is produced in the atmosphere when nitrogen nuclei capture thermal neutrons, with its production directly linked to the GCR flux. This is just one proof of the connection between cosmic processes and Earth's atmospheric dynamics.

Similarly, ¹⁰Be is born from the break-up (spallation) of nitrogen and oxygen nuclei in the atmosphere. Unlike radiocarbon, ¹⁰Be does not take part in the global carbon cycle, settling rapidly onto the Earth's surface. This isotope's behavior is meticulously modeled using atmospheric dynamical models, offering valuable insights into the dynamics of GCR interactions.

A very important concept is effective energy, which emerges as a pivotal tool in understanding the capabilities of cosmic ray detectors. Describing the energy-integrating capacity of a detector, the effective energy provides a measure of the energy at which the flux of GCR protons varies proportionally to the detector's count rate.     
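Loosely speaking (this is a paraphrase of the idea rather than the paper's exact definition), the effective energy E_eff is the energy at which relative changes in the count rate N track relative changes in the proton flux J one-to-one: ΔN/N ≈ ΔJ(E_eff)/J(E_eff). That single number lets data from very different detectors be compared on an equal footing.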

The figure above vividly illustrates the temporal profile of data from the Oulu Neutron Monitor since 1993, alongside the variability of GCR protons at fixed energies. Notably, the effective energy for the Oulu NM is identified as approximately 12 GeV/nucleon, demonstrating its close alignment with the variability of GCR protons at this energy level.

In conclusion, the exploration of Galactic Cosmic Rays unveils a captivating interplay of particles and processes. Neutron monitors, radiocarbon and other cosmogenic isotopes serve as our cosmic detectives, unraveling the mysteries of Galactic Cosmic Ray variability over time. The introduction of effective energy provides a streamlined approach to interpreting data obtained through different measurement methods, offering a universal language for researchers in the field.

As we continue to probe the depths of cosmic phenomena, ground-based detectors and effective energy calculations stand as our compass, guiding us through the intricate dance of Galactic Cosmic Rays. The insights gained not only deepen our understanding of the cosmos but also pave the way for future discoveries and advancements in the field of astrophysics.

Aleksandra

References:

  • Gil, A., Asvestari, E., Kovaltsov, G. A., & Usoskin, I. (2017). Heliospheric modulation of galactic cosmic rays: Effective energy of ground-based detectors. PoS (Proceedings of Science), 301, 032. https://pos.sissa.it/301/
  • The first figure was taken from NASA archives: https://apod.nasa.gov/apod/ap060814.html
  • The rest of the figures have been extracted from the article cited.

Stage Fright — Electrons Have It Too!

We all feel blue every now and then. Sometimes it feels like everything is out of control, as if our actions, or the lack of them, do not really make that big of a difference. If we are not capable of affecting our own lives, we surely cannot influence physical phenomena, right? Actually, what if I told you that just by observing we are able to affect reality… in the quantum world!

A study conducted by researchers at the Weizmann Institute of Science demonstrated that the distribution of electrons passing through a double slit was affected by the act of having been observed. The most remarkable outcome of the study is the demonstration that “the greater the amount of ‘watching’, the greater the observer's influence on what actually takes place”.

Let us have a glance at how it all began. Up until the beginning of the 20th century, our world was explained by so-called realist theories, which included classical mechanics, statistical mechanics, and special and general relativity. The basis of these realist theories is that we are able to observe physical phenomena, and our observations do not have an impact on the results of the observation. It is this assumption that makes the theories realist, i.e. it allows any phenomenon to be predicted either exactly or probabilistically, using for instance the chaos theory of classical statistical mechanics. It all changed when quantum physics came to be. Quantum physics is well described by N. Bohr, who stated that each observation in quantum physics becomes a singular, each time unique act of creation of a quantum phenomenon, rather than a mere observation of a pre-existing property of a quantum object.

Now that we have gotten a sufficient insight, let us see how the experiment by the Weizmann Institute of Science proceeded in 1998. The experiment was designed the following way. Researchers used a very small measuring device (less than a micron in size), which could spot passing electrons. Its precision was calibrated by varying the current passing through the device. The device was then set to detect the electrons passing through the openings. The electrons seemed to be passing through the slits like particles, and no interference pattern was observed. After that, the device was switched off, and the electrons exhibited wave-like behaviour, showing an interference pattern, i.e. passing through the two slits at the same time. To the researchers' surprise, when the detector was plugged in again but with lower precision (it could no longer detect every electron passing through the slits), the beam of electrons still produced an interference pattern, but this interference would get weaker the more precise the detector was set to be.
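One way to put a number on “weaker interference” (a standard textbook measure, not something taken from the Weizmann paper itself) is the fringe visibility V = (I_max − I_min)/(I_max + I_min), where I_max and I_min are the brightest and darkest points of the pattern. With the detector off the visibility is high; the more reliably the detector can tell which slit each electron used, the closer V falls towards zero.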

What does this all mean, and how would it affect us, ordinary popular science blog readers? In fact, such unique properties of the quantum world are likely to be widely used in many spheres. One of the more promising scenarios is using the quantum effect described above to ensure the safety of information transfer. This can be achieved by encoding information in a way that requires the interference of electron paths to decipher it. That way, the interference will not occur if the transmission is being observed by someone, that is, if the secrecy of the information transfer has been compromised.

Artem Mkrtychyan

Citations:

ScienceDaily. (1998, February 27). Quantum theory demonstrated: Observation affects reality. ScienceDaily. https://www.sciencedaily.com/releases/1998/02/980227055013.htm

Plotnitsky, A. (2023). ‘The agency of observation not to be neglected’: complementarity, causality and the arrow of events in quantum and quantum-like theories. Phil. Trans. R. Soc. A, 381, 20220295. https://doi.org/10.1098/rsta.2022.0295

Unveiling Uranus’ Invisible Light Show: Infrared Auroras, Magnetic Mysteries, and Exoplanetary Insights

University of Leicester astronomers have confirmed the existence of infrared auroras on the second-to-last planet of the Solar System – Uranus. This achievement lets scientists discover more about the magnetic fields and atmospheres of far-away planets. Uranus' peculiar magnetic field (by peculiar I mean that it's off-centre, tilted from the rotation axis by a whole 59 degrees, and effectively changes orientation almost every Uranian day!) allows astronomers to study how exactly magnetospheres are created on different types of planets. The discovery gives us more tools to rate exoplanets on how habitable they are and whether they're suitable for future colonisation.

In the article published in Nature Astronomy, the researchers conducted an analysis of the long-speculated infrared aurora (the phenomenon commonly known on Earth as the northern lights). Auroras are created by the interaction between the planet's atmosphere and highly energetic charged particles emitted during solar flares, which slip through the planet's magnetic field near the magnetic poles. Those collisions cause the particles of the atmosphere to get rid of the excess energy in the form of light. Earth's atmosphere, which is made up of mostly nitrogen, emits the energy in the visible part of the spectrum. That is not the case on Uranus, since its atmosphere is mostly hydrogen and helium, which emits less energetic electromagnetic radiation that falls in the infrared spectrum.

The analysis of auroral data from Uranus took almost 30 years. Thanks to data collected by the Keck II telescope in Hawaii, the researchers were able to conduct a spectral analysis. In spectral analysis, scientists analyse the light received from the object they are observing: each element emits (or absorbs) unique wavelengths of light that act as its identifier. In this case, the brightness of the emission line of H3+ (a positive ion of a molecule composed of three hydrogen atoms) depends on the temperature of the ion, making it possible to measure the temperature of Uranus' atmosphere. Researchers found higher amounts of H3+ in places where an infrared aurora was present, but the temperature of those particles didn't change.

These results will help scientists in future analyses of exoplanets, many of which are ice giants like Uranus, to determine whether they are suitable for human life. The results also call into question previous results obtained at Neptune, which has a magnetic field similar to Uranus', so its auroras should behave similarly. They may also explain how ice giants like Uranus and Neptune are much hotter than models predict they would be if their only heat source were the Sun: infrared auroras may deliver heat that is later moved towards the magnetic equator, effectively warming up the planet. Finally, Uranus is a great opportunity to study the effects that a changing magnetic field may have on a planet and to discover mechanisms of creating a planetary magnetosphere that differ from what we know about this process on Earth. That research would have to be conducted on both Uranus and Earth to get the most accurate results. Faraway planets still hide many mysteries from us, and unveiling them will help us not only in deep space travel but also in tackling issues in our own backyard.

Thomas, E. M., Melin, H., Stallard, T. S., Chowdhury, M. N., Wang, R., Knowles, K., & Miller, S. (2023). Detection of the Infrared Aurora at Uranus with Keck-NIRSPEC. Nature Astronomy. https://doi.org/10.1038/s41550-023-02096-5

Gene Editing Might Save Your Rice Bowl from Drought Drama

Every year, drought affects everyone from individual consumers to whole industries. Personally, I have had to cut some products out of my diet since the prices have been unreasonable. But seriously, the ones suffering most are the people in less privileged situations, who are driven to hunt for basic products, such as rice, and may face malnutrition. 

As the globe warms and population sizes reach new records, food products are needed more than ever and no one can afford to lose crops to drought. This may sound scary, but don't worry: scientists take this as a challenge and an opportunity to do the unimaginable (in the hope of winning a Nobel Prize at the end 😉 ). Solving this global issue has pushed the scientific community to its creative limit in the search for innovative solutions that are not only effective but also sustainable in the long term and minimally invasive. One creative and head-turning solution that has caught many eyes is the use of gene editing with the CRISPR-Cas9 method to breed drought-resistant crops. In this post, I will focus on many households' beloved staple, rice.  

What is CRISPR-Cas9? 

Gene editing may sound like a plot from a sci-fi movie or a futuristic novel, but it is far from that. The discovery of a gene-editing technique called CRISPR-Cas9, which was originally found occurring within microorganisms, has changed the game for scientists. This technique allows scientists to edit a gene at a specific location, or locus, with higher accuracy and lower cost. Now scientists are taking advantage of this to improve agricultural traits, enhance stress tolerance, and promote desired characteristics of crops. 

How can we use CRISPR-Cas9 to fight drought?

The particular study I looked at gives a great example of how CRISPR technology can be utilized to get closer to developing drought-tolerant rice varieties. 

The researchers focused on one specific gene in rice called OsSAP. They compared the effects of drought stress on the growth of regular, unmanipulated rice and rice that had been edited using CRISPR-Cas9 at the location of this particular gene. And as you guessed, they observed differences in growth. They found that OsSAP is what they call a “negative regulator”, meaning that it suppresses the expression of other genes or proteins. Under drought stress, the OsSAP gene was found to induce cellular senescence, a state of permanent growth arrest of the plant, and apoptosis, programmed cell death. This means that if we overexpress this gene, these processes are induced even further, thus improving the drought tolerance of rice. And that is exactly what the researchers did. This discovery suggests that this particular gene, OsSAP, is a potential candidate for creating drought-tolerant rice. It is an example of how CRISPR is being utilized in the fight against drought, and it brings us a step closer to addressing food shortages. 

What’s next?

This opens the question of whether there are more genes we can edit to further enhance the desired characteristics. The answer is yes. Evolution has been kind enough to produce plants that are highly sophisticated in dealing with drought. By looking at these plants and tracing their characteristics back to a specific gene or genes, researchers can potentially use them as a guide for enhancing the drought tolerance of other plants. This may be a challenging process, but research such as the study discussed in this post is bringing us closer to permanent solutions and significant discoveries. 

Fella Mekki

 

Reference

Park, J.-R., Kim, E.-G., Jang, Y.-H., Jan, R., Farooq, M., Ubaidillah, M., & Kim, K.-M. (2022). Applications of CRISPR/Cas9 as new strategies for short breeding to drought gene in Rice. Frontiers in Plant Science, 13. https://doi.org/10.3389/fpls.2022.850441 

 

Please! Do not use Chat-GPT to answer your chemistry homework for you!

Author: Johanna Salo, Date: 17/11/2023, Finland

Nowadays, it is impossible to go through your week without mentions of AI or this little program called Chat-GPT, unless you are living under a rock on Mars. GPTs and large language models (LLMs) have taken over the internet in recent years in the form of chatbots and different types of generated content in the media. These programs can answer (output) user-presented questions or prompts (input) by relying on their training material or by navigating the internet for an appropriate answer. There are controversies regarding the legal ownership and copyright of said training material, but that topic deserves its own blog post, and we will not be discussing it here. Instead, we will discuss the utilization of GPTs in science!

As university students, we have all at some point in our study careers come across that one person who uses Chat-GPT to answer their homework questions and assignments. This post is a review of sorts on using Chat-GPT as a form of “tutoring” in your chemistry studies. How reliable are the answers? Do they make any sense? Is AI going to take over the chemical industry??

Behind this post is the research paper by Kan Hatakeyama-Sato et al., “Prompt engineering of GPT-4 for chemical research: what can/cannot be done?”, published in Japan in 2023. The research group inputted different chemistry-related prompts into Chat-GPT and recorded the findings. The level of difficulty varied from basic information on the chemical and physical properties of input compounds all the way to designing synthesis mechanisms for block-chain polymerization and predicting the properties of the unknown products. The GPT's outputs were compared to the outputs of different established algorithms in terms of accuracy and the number of trials needed to obtain an answer.

Chat-GPT showed university-textbook-level knowledge in organic chemistry and other established and well-researched fields of chemistry, but it either gave confidently wrong answers or refused to answer at all when prompted with tasks centering on polymer chemistry or kinetics and rate laws. Generally, the GPT demonstrated that it is book smart and can connect simple concepts together without separate instructions in the input, but it failed to apply the knowledge on a larger scale. Therefore, using Chat-GPT to answer your homework questions and assignments is redundant, as you would spend the same amount of time correcting the GPT's answer as you would studying and answering the assignment on your own, for now.

The main limitations of GPTs currently are the lack of access to modern research that is behind a paywall and the limit on the amount of information one can include in the prompt. Another issue is the lack of understanding of non-verbal information such as pictures and audio, as a GPT is a language model. In chemistry, this mainly shows up as the GPT not understanding what molecules look like, the bonding between atoms, or their three-dimensional structure. In the paper, Chat-GPT adds and removes atoms in ways that would be impossible in real life, which is a huge problem if you want to sketch a reaction pathway using a GPT.

But why are we obsessed with GPTs? The answer is simple: accessibility. Being able to give natural-language instructions to a program to solve complex tasks is a huge selling point for the improvement of GPTs, as this would lower the entry barrier for chemists who might not be code wizards and robotics experts. The aforementioned limitations could be minimized in the short term by creating plug-ins to already established algorithms, as is already done in mathematics with calculator plug-ins for Chat-GPT, allowing it to accurately perform more complex computations.

So, for now, we will still have to vigorously study theory and learn to apply concepts from micro to macro, but there could be a day when we can just give prompt requests for AI and reap the results of their labor. But until that day, please do not use Chat-GPT to answer your homework questions for you.

(This was written by a real flesh-and-bone human being. Trust me.)

 

Hatakeyama-Sato, Kan, et al. “Prompt Engineering of GPT-4 for Chemical Research: What Can/Cannot Be Done?” 2023, Japan.

The Optimisation Adventure: Simulated Annealing Unveiled

Hello, algorithm enthusiasts! If you want to find the easiest route to glory, Simulated Annealing (SA) is just for you. If you’re up against a big and complex problem, this mind-blowing algorithm finds the easiest way to solve it. Just like magic! Visualise SA as a huge bouncing ball on a quest to find the most efficient solution, navigating a landscape of peaks and valleys to reach the treasure at the lowest point.

Our 40-year-old trusty sidekick, SA, was born in 1983 to tackle nonlinear problems, inspired by the annealing process in metallurgy. At the start of every journey to find the lowest point, SA, the huge bouncing ball, jumps through different altitudes and obstacles. Everything is high at the beginning for our daring adventurer: the altitude, the energy, and the expectations. With its high energy, corresponding to a high temperature, it can leap over immense mountains. As the altitude drops, the temperature drops as well, and instead of those intense jumps our adventurer becomes more selective as a clearer path emerges in the valleys…

The annealing schedule that controls the temperature determines how much uphill movement SA allows. Choosing the right schedule is crucial: start with a high initial temperature, “melting” the system, and gradually reduce it as the search progresses. Finding this balance ensures SA explores extensively at the start but narrows down to the optimal solution.

The algorithm involves four key components: a representation of possible solutions, a generator of random changes, an evaluator for problem functions, and an annealing schedule which is a road map for temperature reduction during the search.
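As a minimal sketch of those four components in code (the cost function, starting point, and cooling rate below are made-up illustrations, not values from the referenced overview), a simulated annealing run for a one-dimensional problem might look like this:

import math
import random

def simulated_annealing(cost, start, t_initial=10.0, t_final=1e-3, cooling=0.95, steps_per_temp=100):
    """Minimise `cost` using the four classic SA ingredients: a solution
    representation, a random-change generator, a cost evaluator, and an
    annealing (cooling) schedule."""
    current = start
    best = current
    temperature = t_initial
    while temperature > t_final:
        for _ in range(steps_per_temp):
            # Random change: propose a nearby candidate solution.
            candidate = current + random.uniform(-1.0, 1.0)
            delta = cost(candidate) - cost(current)
            # Metropolis rule: always accept improvements, and sometimes accept
            # uphill moves, with probability exp(-delta / temperature).
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                current = candidate
            if cost(current) < cost(best):
                best = current
        # Annealing schedule: geometric cooling.
        temperature *= cooling
    return best

# Toy example: a bumpy function with many local minima, made up for illustration.
bumpy = lambda x: x**2 + 10 * math.sin(3 * x)
print(simulated_annealing(bumpy, start=8.0))

The random uphill acceptances early on are exactly the “leaps over immense mountains” from the metaphor above; as the temperature falls, the exponential term shrinks and the search settles into the deepest valley it has found.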

SA shows its brilliance in chaotic data because of its random search capability. One of the biggest issues that the more traditional friends of SA face is getting stuck in a local minimum, which means falling into a hole other than the lowest point, and failing to get out of it. However, our daring adventurer randomly accepts challenges and doesn’t shy away from climbing uphill. In other words, since SA doesn’t rely on strict model properties, it has the power of bypassing the local minima.

Even with these impressive features, SA has some ambitious competitors in the optimisation arena, such as Neural Nets and Genetic Algorithms. Unlike Neural Nets, which learn by strictly following one function, SA is a smart random searcher, which, as we saw with local minima, is an advantage. When pitted against Genetic Algorithms, SA often emerges victorious, offering a global convergence guarantee.

SA is a probabilistic optimisation algorithm, which gives it versatility in problem solving. On the other hand, this means that a lot of time and precision in the inputs is required to get a high-quality solution. Implementing SA requires defining solutions, generating random changes, evaluating functions, and setting up the annealing schedule. Another vital part of these phases is the use of covariance matrices, which show the magnitude and direction of the spread of multivariate data in a multidimensional space. (Multivariate data refers to datasets that involve more than one variable.)

Now, you might wonder, “Why does SA matter to me?” Well, if you're dealing with financial instruments, SA is becoming the go-to algorithm for hybrid securities and intricate trading strategies, as its flexibility stands out and it can navigate the complexities of multivariate systems, blending continuous and discrete sets seamlessly.

In a nutshell, SA is a flexible and robust optimisation tool that excels in navigating complex landscapes. While it may be computation-intensive, its prowess in tackling nonlinear problems makes it a hero in the world of optimisation.

So, optimisation enthusiasts, whether you’re crunching numbers in finance or exploring intricate models, consider adding Simulated Annealing to your toolkit. It might be just the adventure you need to overcome your optimisation challenges by diving to the lowest point.

Stay optimised, stay curious!

Prepared By: Ali Onur Özkan

Article Reference: Busetti, F. (2018). Simulated annealing overview. ResearchGate. https://www.researchgate.net/publication/238690391_Simulated_annealing_overview