Views: 1059 | Replies: 6

【AIP Advances】Second law of information dynamics

Posted on 2022-7-23 15:47:34
Last edited by avigan on 2022-7-23 16:45

Second law of information dynamics: AIP Advances: Vol 12, No 7 (scitation.org)

Second law of information dynamics

Submitted: 23 May 2022 | Accepted: 10 June 2022 | Published Online: 11 July 2022
AIP Advances 12, 075310 (2022); https://doi.org/10.1063/5.0100358
Melvin M. Vopson1,a) and S. Lepadatu2

1School of Mathematics and Physics, University of Portsmouth, PO1 3QL Portsmouth, United Kingdom
2Jeremiah Horrocks Institute for Mathematics, Physics and Astronomy, University of Central Lancashire, Preston PR1 2HE, United Kingdom
a)Author to whom correspondence should be addressed: melvin.vopson@port.ac.uk

ABSTRACT
One of the most powerful laws in physics is the second law of thermodynamics, which states that the entropy of any system remains constant or increases over time. In fact, the second law is applicable to the evolution of the entire universe and Clausius stated, “The entropy of the universe tends to a maximum.” Here, we examine the time evolution of information systems, defined as physical systems containing information states within Shannon’s information theory framework. Our observations allow the introduction of the second law of information dynamics (infodynamics). Using two different information systems, digital data storage and a biological RNA genome, we demonstrate that the second law of infodynamics requires the information entropy to remain constant or to decrease over time. This is exactly the opposite to the evolution of the physical entropy, as dictated by the second law of thermodynamics. The surprising result obtained here has massive implications for future developments in genomic research, evolutionary biology, computing, big data, physics, and cosmology.


I. INTRODUCTION

The research field of information dynamics (infodynamics) has its origins in a few significant scientific developments: the seminal information theory developed by Shannon in 1948,1 the pioneering work of Brillouin in 19532 and Landauer in 19613 on information physics, and, more recently, the mass-energy-information (M-E-I) equivalence principle formulated by Vopson in 2019.4 Using thermodynamic considerations, Landauer introduced his 1961 principle stating that information, as defined in Shannon's framework, is not just a mathematical construct but is physical, having a small amount of energy associated with it that becomes detectable upon information erasure. Backed by multiple experimental confirmations reported in the literature,5–8 Landauer's principle long ago moved beyond the purely theoretical realm, and the scientific community today broadly accepts it as valid. The M-E-I equivalence principle proposed in 2019 extends Landauer's principle: if information is equivalent to energy, according to Landauer, and energy is equivalent to mass, according to Einstein's special relativity, then mass, energy, and information must all be equivalent, too (i.e., if M = E and E = I, then M = E = I). The M-E-I equivalence principle has generated a number of interesting ramifications in physics,9–11 but it still awaits experimental confirmation.12

The Landauer and M-E-I equivalence principles are necessary in order to fulfill the thermodynamic laws of physics. They were initially proposed in the context of digital information and computing technologies, because any computational or memory device is essentially a physical system, part of the universe, and must obey the universal laws of physics, including thermodynamics. On these grounds, Landauer argued that logical irreversibility must be equivalent to physical irreversibility. Because irreversible processes are dissipative, i.e., they take place with dissipation of energy, and the erase operation that deletes a bit of information is irreversible, erasure must dissipate a small amount of energy that comes from the memory bit itself. Hence, Landauer deduced that a bit of information is physical or, more generally, that any form of information as defined in Shannon's framework is physical. The M-E-I equivalence principle proposes that the Landauer energy of an information bit condenses into its equivalent mass-energy when the information is stored at equilibrium. These fundamental ideas have created a bridge between pure mathematics and physics, essentially "physicalizing" mathematics. Physicalizing mathematics has profound implications for the way we think about the whole universe, because it suggests that the universe is fundamentally mathematical and can be seen as emerging from information, i.e., "it from bit," a concept coined by the legendary physicist Wheeler.13
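For a sense of scale, the Landauer energy mentioned above is the well-known bound of k_b·T·ln 2 per erased bit; a minimal Python check (our sketch, not code from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy(temperature_kelvin):
    """Minimum heat dissipated when one bit of information is erased."""
    return K_B * temperature_kelvin * math.log(2)

print(f"{landauer_energy(300):.3e} J per erased bit at 300 K")  # ~2.87e-21 J
```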
Here, we examine the entropy and the time dynamics of information systems and, in analogy to the second law of thermodynamics, we formulate the second law of infodynamics.

II. ENTROPY OF INFORMATION

Let us assume a physical system is in its virgin state with no information stored in it [Fig. 1(a)]. We now assume that the system encodes digital bits of information via some process of digital information storage. The specific technology used to encode digital information is irrelevant to our discussion, but we will demonstrate the argument using a magnetic data storage system. The total entropy of the system is a measure of all its possible physical microstates compatible with the macrostate, and we call this the physical entropy of the system, Sphys. The physical entropy is characteristic of the non-information-bearing microstates within the system. We now assume that N digital bits of information are created within the physical body. This is equivalent to the "write" operation of a digital data storage device. The additional N bits of information created within our test system represent N additional microstates superimposed onto the existing physical microstates.
FIG. 1. (a) Schematics of a material in virgin state with no information stored in it; (b) the word INFORMATION is written on the material in binary code using magnetic recording; and (c) the grid of 0 and 1 information states created in the process of information recording.


These additional microstates are information bearing states, and the additional entropy associated with them is called the entropy of information, Sinf.
The total entropy of the system is now the sum of the initial physical entropy and the newly created entropy of information, Stot = Sphys + Sinf. Hence, an important observation is that the process of creating information increases the overall entropy of a given system. In our example, we write digitally onto our hypothetical system the word INFORMATION using magnetic data recording, so a digital 0 is blue (magnetization up) and a digital 1 is red (magnetization down) [Figs. 1(b) and 1(c)].
In binary code, this results in 11 bytes, so N = 88 bits of 0 and 1 states are encoded [Fig. 1(c)]. The evolution of the physical entropy and the total entropy of our test system are both governed by the second law of thermodynamics. The second law has many alternative formulations, but in this context we use the one stating that the entropy of an isolated system undergoing any transformation always remains constant or increases over time. Applied to the whole universe, Clausius's formulation states, "The entropy of the universe tends to a maximum." Mathematically, this formulation of the second law is written as ∂S/∂t ≥ 0, where S is the total entropy and t is time.
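As an aside, the byte count, and the 49/39 split of 0s and 1s quoted below, are reproduced if the "binary code" is assumed to be standard 8-bit ASCII (our assumption; the paper does not name the encoding):

```python
# Encode the word in 8-bit ASCII and count the 0 and 1 states.
# Assumption: the paper's "binary code" is standard 8-bit ASCII,
# which reproduces the 49 zeros and 39 ones quoted in Sec. II.
word = "INFORMATION"
bits = "".join(f"{ord(ch):08b}" for ch in word)

print(len(bits))        # 88 bits (11 bytes)
print(bits.count("0"))  # 49
print(bits.count("1"))  # 39
```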
Let us now examine the applicability of the second law to the entropy of the information-bearing states. To do so, we use Shannon's information theory, developed in the 1940s.1 Shannon established the mathematical framework of classical information theory by linking the occurrence probability of an event to its information content. According to Shannon, for an event whose probability of occurring is p, the information extracted from observing the event is a continuous function of its probability:
$I(p) = \log_b(1/p)$,  (1)
where I is the information value and its units are given by the choice of the base, b.
Units of bits are obtained when b = 2, trits when b = 3, and nats when b = e (Euler's number). The choice b = 2, yielding bits, is used throughout this article, a convenient choice dictated by current digital technologies. For an arbitrary base b, the information function can be expressed in different units using the logarithm base-change formula, $I(p) = \log_b(1/p) = \log_a(1/p)/\log_a(b)$. For example, to convert information expressed in nats into bits, b = 2 and a = e, so $I(p) = \ln(1/p)$ (nats) $= \ln(2)\cdot\log_2(1/p)$ (bits). When we observe a set of n independent and distinctive events X = {x1, x2, …, xn} having a discrete probability distribution P = {p1, p2, …, pn} on X, the average bit information content per event that can be extracted when observing the set of events X once is
$H(X) = \sum_{j=1}^{n} p_j \log_2\frac{1}{p_j}$.  (2)


The function H(X) is called the Shannon information entropy, and it is maximal when the events xj have equal probabilities of occurring, pj = 1/n, so that H(X) = log2 n. The Shannon information entropy function returns a bit content value per event. When observing N sets of events X, or N times the set of events X, the number of bits of information extracted from the observation is N · H(X). The Shannon information entropy H(X) is closely linked to the entropy of the information-bearing states, Sinf. If N digital bits are created within the physical body, then the additional possible states, also known as distinct messages in Shannon's original formalism, are equivalent to the number of information-bearing microstates Ω compatible with the macrostate:4
$\Omega = 2^{N \cdot H(X)}$.  (3)
Using (2) and (3), we can deduce the entropy of the information-bearing states from the Boltzmann relation,

$S_{inf} = k_b \ln\Omega = k_b \ln\!\left(2^{N \cdot H(X)}\right) = N k_b \ln(2) \sum_{j=1}^{n} p_j \log_2\frac{1}{p_j}$,  (4)

where $k_b = 1.380\,649 \times 10^{-23}$ J/K is the Boltzmann constant.
In the context of Shannon's information theory, our test example shown in Fig. 1 has N = 88, n = 2, and X = {0, 1}. Counting the occurrences of 0s and 1s, we get 49 and 39, respectively, giving P = {p1, p2} = {49/88, 39/88}. If the two events had equal probabilities, i.e., P = {p1, p2} = {44/88, 44/88} = {1/2, 1/2}, then using (2) it can be shown that H(X) = 1, i.e., an average of one bit of information is encoded per state. For our example, however, the Shannon entropy function is just under 1 bit,
$H(X) = \sum_{j=1}^{n} p_j \log_2\frac{1}{p_j} = \frac{49}{88}\log_2\frac{88}{49} + \frac{39}{88}\log_2\frac{88}{39} = 0.991$.  (5)
Our objective in this study is to examine the time evolution of Sinf. According to (4), only two variables, N and H(X), can drive changes in Sinf. The Shannon function is bounded above by its maximum value, which is 1 bit in our two-state case and which H(X) approaches for large N.
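Equations (2), (4), and (5) are straightforward to reproduce; a minimal Python sketch (function names are ours) recovers H(X) ≈ 0.991 and the corresponding Sinf:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon information entropy of Eq. (2), in bits per state."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def info_entropy(n_bits, probs):
    """Entropy of the information-bearing states, Eq. (4): N k_b ln2 H(X)."""
    return n_bits * K_B * math.log(2) * shannon_entropy(probs)

# Worked example of Fig. 1: N = 88 states, 49 zeros and 39 ones
probs = [49 / 88, 39 / 88]
print(f"H(X)  = {shannon_entropy(probs):.3f} bits/state")  # 0.991, Eq. (5)
print(f"S_inf = {info_entropy(88, probs):.3e} J/K")        # ~8.3e-22 J/K
```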

III. TIME EVOLUTION OF DIGITAL INFORMATION STATES

Let us assume that H(X) → 1, and examine how N evolves over time. In our example, the N information microstates are physically realized by magnetization states in the material [see Figs. 1(b) and 1(c)]. For any non-zero Kelvin temperature T, these magnetic states undergo magnetic relaxation processes with a relaxation time τ given by the well-known Arrhenius–Néel equation,

$1/\tau = \nu_0 \exp\!\left(-\frac{K_a V}{k_b T}\right)$,  (6)

where ν0 is the attempt frequency for the magnetization to overcome the energy barrier, approximately ν0 ≈ 10^9 Hz, Ka is the anisotropy constant of the magnetic material, V is the magnetic grain volume, and kb is the Boltzmann constant.
This relaxation time is the average time it takes a magnetic grain of volume V within a magnetic bit state to undergo a spontaneous magnetization flip due to thermal activation. Hence, after a sufficiently long time, we expect magnetic grains to lose their magnetization state, leading to magnetic bit states undergoing self-erasure and, therefore, to a reduction of the number of information states N. The implication of this analysis is that the entropy of the information-bearing states tends to decrease over time.
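As a quick numerical check of Eq. (6), the sketch below uses the anisotropy, cell volumes, and temperature of the simulation described in the next paragraph (our own code, not the authors' micromagnetic solver):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
NU_0 = 1e9          # attempt frequency, Hz

def relaxation_time(Ka, V, T):
    """Arrhenius-Neel time of Eq. (6): tau = exp(Ka*V / (k_b*T)) / nu_0."""
    return math.exp(Ka * V / (K_B * T)) / NU_0

Ka, T = 8.75e7, 300.0  # anisotropy (J/m^3) and temperature (K) used below

# Simulation cell: tau ~ 1.5 s, i.e. one Monte Carlo iteration
print(f"tau(V = 1.0e-27 m^3) = {relaxation_time(Ka, 1.0e-27, T):.2f} s")

# Recording-grade cell (Ka*V/(k_b*T) ~ 40): tau on the order of 10 years
print(f"tau(V = 1.9e-27 m^3) = {relaxation_time(Ka, 1.9e-27, T):.2e} s")
```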
To demonstrate this, we simulated a granular magnetic thin film structure with perpendicular uniaxial anisotropy Ka = 8.75 × 10^7 J/m^3, saturation magnetization Ms = 1710 kA/m, and average (cubic) unit cell volume V = 10^-27 m^3, at room temperature T = 300 K. A cell volume suitable for magnetic recording should be large enough to keep the magnetization of the cell thermally stable for ∼10 years [i.e., τ in Eq. (6) is 3.15 × 10^8 s]. Under this condition, the ratio of magnetocrystalline energy to thermal energy at T = 300 K should be around 40, $K_a V / k_b T \approx 40$, giving a cell volume of V ≈ 1.9 × 10^-27 m^3. The unit cell volume in our simulations was intentionally taken 1.9 times smaller in order to speed up the computation. These values result in a relaxation time of 1.5 s, which corresponds to a single iteration of the Monte Carlo algorithm. The simulated thin film sample size was 400 × 550 × 2 nm^3, giving a bit size of 50 × 50 nm^2. Starting from a thermalized random state [Fig. 2(a)], INFORMATION is suddenly written in digital binary code [Figs. 1(b) and 2(b)]. Using a micromagnetic Monte Carlo algorithm,14 we tracked the information loss as the system was allowed to thermalize over a period of time [Figs. 2(c)–2(h)]. The data indicate that the system evolves over time in a way that the second law of thermodynamics is indeed fulfilled by the physical entropy and the total entropy of the system. However, when the entropy of the information-bearing states is examined independently, we find that the second law manifests in reverse: the information entropy stays constant or decreases. We call this the second law of information dynamics (infodynamics), written mathematically as

$\partial S_{inf}/\partial t \leq 0$.  (7)

FIG. 2. Time evolution of the digital magnetic recording information states simulated using Micromagnetic Monte Carlo. Over time, the information states gradually vanish due to self-erasure, reducing the information entropy of the system. Red denotes magnetization pointing out of the plane and blue is magnetization pointing into the plane. (a) Initial random state; (b) INFORMATION is written (t = 0 s); (c) Iteration 140 (t = 210 s); (d) Iteration 460 (t = 690 s); (e) Iteration 590 (t = 885 s); (f) Iteration 930 (t = 1395 s); (g) Iteration 1100 (t = 1650 s); and (h) Iteration 1990 (t = 2985 s).

This new law of infodynamics must not violate the second law of thermodynamics, so the entropy reduction in the information states must be compensated by an entropy increase in the physical states via a dissipation mechanism. The same rationale underlies Landauer's principle that information is physical, which was derived in a similar way and later expanded into the mass-energy-information equivalence principle by Vopson.4
The simulation performed on our test sample resulted in a simultaneous reduction of the magnetization of all the magnetic information bit states until N = 0. In reality, however, this process can take place gradually, with N decreasing in steps until it eventually reaches zero. Re-examining relation (4), it is easily seen that a reduction in N leads to a reduction of the information entropy, indeed confirming the second law of infodynamics (7). A toy illustration of this stepwise case is sketched below.
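The following is our own minimal sketch, not the authors' micromagnetic simulation: each surviving bit self-erases per time step with probability 1 − exp(−Δt/τ), and Sinf from Eq. (4), with H(X) held fixed for simplicity, can then only stay constant or fall as N shrinks.

```python
import math
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K
H_X = 0.991         # Shannon entropy per state, held fixed for simplicity
TAU, DT = 1.5, 1.5  # relaxation time and time step, s

random.seed(1)
p_flip = 1.0 - math.exp(-DT / TAU)  # per-step self-erasure probability
n = 88                              # surviving information-bearing states

for step in range(8):
    s_inf = n * K_B * math.log(2) * H_X  # Eq. (4)
    print(f"t = {step * DT:4.1f} s   N = {n:2d}   S_inf = {s_inf:.2e} J/K")
    n -= sum(random.random() < p_flip for _ in range(n))  # stepwise decay of N
```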

IV. TIME EVOLUTION OF BIOLOGICAL INFORMATION STATES

In order to verify the universal validity of the second law of infodynamics, we need to examine the time evolution of the information entropy of a system in which the number of information states N remains constant, so that any reduction of the information entropy comes from the Shannon information entropy function itself.
A natural information-coding system that fulfills this requirement is the genetic DNA/RNA code, because the information is encoded in the sequence of nucleotides and its time evolution is described by genetic mutations. Genetic mutations are changes in the nucleotide sequence, and they can take place via three mechanisms: (i) single nucleotide polymorphisms (SNPs), in which the number of nucleotides remains constant; (ii) deletions, in which N decreases; and (iii) insertions, in which N increases. Of the three, only SNP mutations are of interest here, because they keep N constant.
The ideal test system is a virus genome that undergoes frequent mutations over a short period of time. In this study, we examined the RNA sequence of the novel SARS-CoV-2 virus, which emerged in December 2019, resulting in the COVID-19 pandemic. A DNA sequence can be represented as a long string of the letters A, C, G, and T, representing the four nucleotides: adenine (A), cytosine (C), guanine (G), and thymine (T) [replaced by uracil (U) in RNA sequences]. Within Shannon's information theory framework, a typical genome sequence can therefore be represented as a 4-state probabilistic system with n = 4 distinctive events, X = {A, C, G, T}, and probabilities P = {pA, pC, pG, pT}. Using digital information units and Eq. (2), for n = 4 the maximum Shannon information entropy is H = log2 4 = 2, so each nucleotide can encode at most 2 bits: A = 00, C = 01, G = 10, and T = 11. For a genomic sequence containing N nucleotides, the total Shannon information content can therefore be at most 2N bits.
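The entropy computation itself reduces to Eq. (2) over the four nucleotide frequencies; a minimal sketch follows (the authors used the GENIES software for the actual analysis, so this only illustrates the underlying formula):

```python
import math
from collections import Counter

def genome_entropy(seq):
    """4-state Shannon entropy of Eq. (2), in bits per nucleotide (max 2)."""
    counts = Counter(seq)
    total = len(seq)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# Hypothetical toy fragment, for illustration only (not a real viral sequence)
seq = "ATGGCGTACGTTAGC"
print(f"H           = {genome_entropy(seq):.4f} bits/nucleotide")

# An SNP keeps N constant but shifts the probabilities, and hence H(X)
mutated = seq[:5] + "T" + seq[6:]  # single-nucleotide substitution G -> T
print(f"H after SNP = {genome_entropy(mutated):.4f} bits/nucleotide")
```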
The reference RNA sequence of the SARS-CoV-2, representing a sample of the virus collected early in the pandemic in Wuhan, China in December 2019 (MN908947),15 has 29 903 nucleotides, so N = 29 903. For this reference sequence, we computed the Shannon information entropy using relation (2).
The value obtained represents the reference Shannon information entropy at time zero, before any mutations took place. Using the National Center for Biotechnology Information (NCBI) database, we searched for and extracted a number of SARS-CoV-2 variants sequenced at various locations around the globe at different times, from January 2020 to October 2021 (Table I).
TABLE I. Results of the analysis performed on selected SARS-CoV-2 variants sequenced at various locations around the globe over a period of 22 months.

Genome       Reference   SNPs   Time (months)   Location      Shannon IE
MN908947     15          0      0               China         1.957 024 3
LC542809     19          4      3               Japan         1.956 919 7
MT956915     20          7      5               Spain         1.956 923 0
MW466798     21          9      7               South Korea   1.956 932 7
MW294011     22          19     10              Ecuador       1.956 705 8
MW679505     23          25     14              USA           1.956 663 0
MW735975     24          26     14              USA           1.956 571 4
OK546282.1   25          32     16              USA           1.956 567 5
OK104651.1   26          40     20              Egypt         1.956 459 1
OL351371.1   27          49     22              Egypt         1.956 261 4

By searching for complete genome sequences containing the same number of nucleotides as the reference sequence, we carefully selected variants displaying an incrementally increasing number of SNP mutations over time, and we computed the Shannon information entropy for each variant. The calculations were performed using previously developed software, GENIES,16,17 designed to study genetic mutations using Shannon's information theory.18
The full dataset, including genome data references/links, collection times, numbers of SNP mutations, and the Shannon information entropy value of each genome, is shown in Table I. Figure 3 shows the time evolution of the number of SARS-CoV-2 SNP mutations and of each variant's information entropy Sinf, computed using (4) and normalized to kb. The data indicate that, as expected, the number of mutations increases linearly as a function of time [Fig. 3, bottom graph, coefficient of determination (COD) = 99%]. Remarkably, for the same dataset, the Shannon information entropy (see Table I) and the overall information entropy of the SARS-CoV-2 variants (Sinf) computed using (4) decrease roughly linearly over time (Fig. 3, top graph, COD = 97%). The observed correlation between the information entropy and the time dynamics of the genetic mutations is striking: it reconfirms the second law of infodynamics, but it also points to a possible deterministic approach to genetic mutations, currently believed to be purely random events. The possible existence of an entropic force that governs genetic mutations, rather than pure randomness, is a powerful idea, and it could lead to the future development of algorithms that predict genetic mutations before they occur.
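The two linear trends of Fig. 3 can be reproduced directly from Table I; a sketch with NumPy (the CODs obtained this way should land close to the quoted 99% and 97%, up to rounding of the tabulated values):

```python
import numpy as np

# Table I: months since the reference sequence, SNP count, and Shannon IE
months  = np.array([0, 3, 5, 7, 10, 14, 14, 16, 20, 22], dtype=float)
snps    = np.array([0, 4, 7, 9, 19, 25, 26, 32, 40, 49], dtype=float)
shannon = np.array([1.9570243, 1.9569197, 1.9569230, 1.9569327, 1.9567058,
                    1.9566630, 1.9565714, 1.9565675, 1.9564591, 1.9562614])

def linear_trend(x, y):
    slope, intercept = np.polyfit(x, y, 1)
    cod = np.corrcoef(x, y)[0, 1] ** 2  # coefficient of determination (R^2)
    return slope, cod

for label, y in [("SNPs vs time", snps), ("Shannon IE vs time", shannon)]:
    slope, cod = linear_trend(months, y)
    print(f"{label}: slope = {slope:+.3e} per month, COD = {cod:.0%}")
```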


FIG. 3. Time evolution of the number of genetic mutations (bottom graph) and the entropy of information bearing states, Sinf normalized to kb (top graph), of selected sequences of SARS-CoV-2 virus. Covid-19 virus photo by Centers for Disease Control (CDC) and imported from unsplash.com.



V. CONCLUSIONS

In this study, we introduced the second law of infodynamics, which is universally applicable to any information system, including biological systems in which the number of information states remains constant. We demonstrated that the information-bearing states evolve over time in such a way that their associated entropy remains constant or decreases. Hence, all physical systems containing information states should obey not only the second law of thermodynamics but also the second law of infodynamics, as demonstrated in this article. The introduction of the second law of infodynamics is of fundamental importance because it will aid future studies and developments in a diverse range of sciences, including genetics, evolutionary biology, virology, computing, big data, physics, and cosmology. In this article, however, we do not address the implications of the second law of infodynamics for fundamental issues such as the evolution of information in the universe, the overall balance of physical and information entropies in the universe, or the growth of biological information in the terrestrial biosphere and beyond. We also do not explain how the second law of infodynamics relates to the relaxation times of the information states and the observation time, nor do we address the possible existence of fluctuations of information states when the minimal information entropy state is reached. We hope that these open questions will be addressed in future studies stimulated by this work.

ACKNOWLEDGMENTS
M.M.V. acknowledges the financial support received for this research from the School of Mathematics and Physics, University of Portsmouth. S.L. also acknowledges the support received for this research from the Jeremiah Horrocks Institute for Mathematics, Physics, and Astronomy, University of Central Lancashire.

AUTHOR DECLARATIONS

Conflict of Interest
The authors have no conflicts to disclose.


DATA AVAILABILITY
The numerical data associated with this work are available within the article. The RNA sequences used in this study are freely available from Refs. 15 and 19–27.

REFERENCES

1. C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J. 27, 379–423 (1948). https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
2. L. Brillouin, "The negentropy principle of information," J. Appl. Phys. 24, 1152–1163 (1953). https://doi.org/10.1063/1.1721463
3. R. Landauer, "Irreversibility and heat generation in the computing process," IBM J. Res. Dev. 5(3), 183–191 (1961). https://doi.org/10.1147/rd.53.0183
4. M. M. Vopson, "The mass-energy-information equivalence principle," AIP Adv. 9(9), 095206 (2019). https://doi.org/10.1063/1.5123794
5. J. Hong, B. Lambson, S. Dhuey, and J. Bokor, "Experimental test of Landauer's principle in single-bit operations on nanomagnetic memory bits," Sci. Adv. 2(3), e1501492 (2016). https://doi.org/10.1126/sciadv.1501492
6. R. Gaudenzi, E. Burzurí, S. Maegawa, H. van der Zant, and F. Luis, "Quantum Landauer erasure with a molecular nanomagnet," Nat. Phys. 14, 565–568 (2018). https://doi.org/10.1038/s41567-018-0070-7
7. A. Bérut, A. Arakelyan, A. Petrosyan, S. Ciliberto, R. Dillenschneider, and E. Lutz, "Experimental verification of Landauer's principle linking information and thermodynamics," Nature 483, 187–189 (2012). https://doi.org/10.1038/nature10872
8. Y. Jun, M. Gavrilov, and J. Bechhoefer, "High-precision test of Landauer's principle in a feedback trap," Phys. Rev. Lett. 113(19), 190601 (2014). https://doi.org/10.1103/physrevlett.113.190601
9. M. M. Vopson, "The information catastrophe," AIP Adv. 10(8), 085014 (2020). https://doi.org/10.1063/5.0019941
10. M. M. Vopson, "The information content of the universe and the implications for the missing dark matter" (June 2019).
11. M. M. Vopson, "Estimation of the information contained in the visible matter of the universe," AIP Adv. 11(10), 105317 (2021). https://doi.org/10.1063/5.0064475
12. M. M. Vopson, "Experimental protocol for testing the mass–energy–information equivalence principle," AIP Adv. 12(3), 035311 (2022). https://doi.org/10.1063/5.0087175
13. J. A. Wheeler, "Information, physics, quantum: The search for links," in Complexity, Entropy, and the Physics of Information, edited by W. H. Zurek (Addison-Wesley, Redwood City, 1990), p. 3.
14. S. Lepadatu, "Micromagnetic Monte Carlo method with variable magnetization length based on the Landau–Lifshitz–Bloch equation for computation of large-scale thermodynamic equilibrium states," J. Appl. Phys. 130, 163902 (2021). https://doi.org/10.1063/5.0059745
15. See https://www.ncbi.nlm.nih.gov/nuccore/MN908947 for the full genome sequence.
16. GENIES software, free download: https://sourceforge.net/projects/information-entropy-spectrum/
17. Genetic Information Entropy Spectrum (GENIES) user manual, 10 December 2020, https://doi.org/10.13140/RG.2.2.36557.46569
18. M. M. Vopson and S. C. Robson, "A new method to study genome mutations using the information entropy," Physica A 584, 126383 (2021). https://doi.org/10.1016/j.physa.2021.126383
19. See https://www.ncbi.nlm.nih.gov/nuccore/LC542809 for the full genome sequence.
20. See https://www.ncbi.nlm.nih.gov/nuccore/MT956915 for the full genome sequence.
21. See https://www.ncbi.nlm.nih.gov/nuccore/MW466798 for the full genome sequence.
22. See https://www.ncbi.nlm.nih.gov/nuccore/MW294011 for the full genome sequence.
23. See https://www.ncbi.nlm.nih.gov/nuccore/MW679505 for the full genome sequence.
24. See https://www.ncbi.nlm.nih.gov/nuccore/MW735975 for the full genome sequence.
25. See https://www.ncbi.nlm.nih.gov/nuccore/OK546282 for the full genome sequence.
26. See https://www.ncbi.nlm.nih.gov/nuccore/OK104651 for the full genome sequence.
27. See https://www.ncbi.nlm.nih.gov/nuccore/OL351371 for the full genome sequence.
© 2022 Author(s). All article content, except where otherwise noted, is licensed under a Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).










OP | Posted on 2022-7-23 15:53:19
Questioning the randomness of genetics could preemptively predict mutations: Scilight: Vol 2022, No 28 (scitation.org)

Questioning the randomness of genetics could preemptively predict mutations
Alane Lim

Digital data storage and COVID-19 RNA genomes show second law of information dynamics acts opposite to the second law of thermodynamics.



The second law of thermodynamics, which states that the entropy of a system must either remain constant or increase over time, is fundamental to understanding the physics of the entire universe. Vopson and Lepadatu introduced the second law of information dynamics, or infodynamics, which states just the opposite.
Using digital data storage and a COVID-19 RNA genome as examples, the authors found information states are a unique exception to the second law of thermodynamics. Instead of entropy increasing over time, the entropy of information decreases over time.
To demonstrate the second law of infodynamics, the authors discussed how the time evolution of a digital information system requires a decrease of entropy due to thermal activation and backed this up with Monte Carlo micromagnetic simulations. The team also analyzed real COVID-19 genomes and showed their information entropy decreased over time.
“Our observed correlation between the information entropy and the time dynamics of genetic mutations is truly unique because it reconfirms the second law of infodynamics, but it also points to a possible deterministic approach to genetic mutations, currently believed to be just random events,” author Melvin Vopson said. “The evidence of the existence of an information entropic force that governs genetic mutations instead of randomness is very powerful … and it could lead to the future development of predictive algorithms of genetic mutations before they occur.”
Vopson said there are many more questions that warrant further investigation, including the implications of the second law of infodynamics for the evolution of information in the universe and for non-equilibrium states.


Source: “Second law of information dynamics,” by Melvin M. Vopson and S. Lepadatu, AIP Advances (2022). The article can be accessed at https://doi.org/10.1063/5.0100358.
© 2022 Author(s). Published by AIP Publishing (https://publishing.aip.org/authors/rights-and-permissions).



OP | Posted on 2022-7-23 15:58:08
Reading this paper gives me that "one day in the mountains, a thousand years in the world" feeling. The second law of information dynamics has burst onto the scene, and I am not even clear on what the first law is. Time to study this properly.

OP | Posted on 2022-7-23 16:46:31
Equations are hard to typeset on this forum, so there may be some discrepancies; it is best to read the original paper.
OP | Posted on 2022-7-23 17:42:03
Last edited by avigan on 2022-7-23 17:43

Quoting avigan (2022-7-23 15:58): "Reading this paper gives me that 'one day in the mountains, a thousand years in the world' feeling. The second law of information dynamics has burst onto the scene, and I am not even clear on what the first law ..."

Quite an interesting paper. It argues that, at the macroscopic level, genetic mutations are not entirely random but are guided by internal controlling factors.
The news coverage is a bit of a gimmick, though. Drawn in by the headline this morning, I thought a predictive theory of genetic mutation had arrived. On closer reading, it is still an information-science result; the paper is better described as a validation of information-system theory in the domain of the life sciences.
For now this is theoretical work, with no practical guidance for actual clinical cases.
OP | Posted on 2022-7-23 21:46:13
In January this year, Nature published a paper on genetic mutation bias and natural selection.

That paper carried out an extensive study of accumulated genetic mutations and epigenomic features and proposed that genetic mutations are biased, i.e., selective: during the mutation process, epigenetic changes induced by external factors influence the eventual mutations. In other words, an organism senses its environment and then mutates to adapt to it. This conclusion challenges the cause-and-effect ordering of Darwinian evolution, which posits random mutation first and natural selection afterwards.

The AIP Advances paper also discusses mutation bias, but it is almost entirely analytical and theoretical; it combines information science, physics, and the life sciences to argue that genetic mutations are biased. Its second law of information dynamics is derived from Shannon's information theory, and for now the theory feels rather abstract to me.

The Nature paper, by contrast, is based on a large body of experimental data and analysis, studying the association between the epigenome and genetic mutations, and its results make a great deal of sense within my own understanding. Adaptive mutation is an organism making long-term or permanent adjustments to itself based on long-term perception of external change; a stress response is an organism making temporary adjustments based on immediate perception of external change, and stress-driven adaptation is already well established at the functional level. For consistency, organisms should have adaptive mechanisms not only for transient changes but also for long-term ones. This finding will be of major guiding significance for many lines of research, for example on how environmental or dietary changes affect human health.

The two papers propose a bias in biological genetic variation from different angles. Personally I prefer the Nature paper. The AIP Advances paper explains mutation bias from a completely different perspective, very much breaking with tradition, but whether it can withstand debate remains to be tested.

Both papers study mutations arising in the course of evolution, one in Arabidopsis and one in COVID. A couple of days ago I received a review invitation that also happened to concern evolution. Although I know the research field well, and the genes listed in the abstract too, I have not gone deeply into evolution, so I declined to weigh in. Seeing Monroe's paper today, linking the epigenome and genetic mutations, suddenly reminded me of that invitation, and I find it very interesting.

Here is the link to the Nature paper: https://www.nature.com/articles/s41586-021-04269-6

Mutation bias reflects natural selection in Arabidopsis thaliana

J. Grey Monroe*, Thanvi Srikant, Pablo Carbonell-Bejerano, Claude Becker, Mariele Lensink, Moises Exposito-Alonso, Marie Klein, Julia Hildebrandt, Manuela Neumann, Daniel Kliebenstein, Mao-Lun Weng, Eric Imbert, Jon Ågren, Matthew T. Rutter, Charles B. Fenster & Detlef Weigel*

Nature volume 602, pages 101–105 (2022)

Abstract
Since the first half of the twentieth century, evolutionary theory has been dominated by the idea that mutations occur randomly with respect to their consequences1. Here we test this assumption with large surveys of de novo mutations in the plant Arabidopsis thaliana. In contrast to expectations, we find that mutations occur less often in functionally constrained regions of the genome—mutation frequency is reduced by half inside gene bodies and by two-thirds in essential genes. With independent genomic mutation datasets, including from the largest Arabidopsis mutation accumulation experiment conducted to date, we demonstrate that epigenomic and physical features explain over 90% of variance in the genome-wide pattern of mutation bias surrounding genes. Observed mutation frequencies around genes in turn accurately predict patterns of genetic polymorphisms in natural Arabidopsis accessions (r = 0.96). That mutation bias is the primary force behind patterns of sequence evolution around genes in natural accessions is supported by analyses of allele frequencies. Finally, we find that genes subject to stronger purifying selection have a lower mutation rate. We conclude that epigenome-associated mutation bias2 reduces the occurrence of deleterious mutations in Arabidopsis, challenging the prevailing paradigm that mutation is a directionless force in evolution.

Posted on 2022-7-24 16:54:35
A warm welcome back to our senior scientist~ Most impressive~
"Second law of information dynamics" sounds formidable just from the name.
Both are viruses, so why did the Spanish flu virus disappear on its own, while the COVID virus is still alive and kicking and refuses to leave?