1900: A physics genius wandering around Europe

Chapter 572 The Essence of Information! Matter or Energy? Information Entropy! Pioneering Information Theory!

If you were asked, "What is information?"
I'm afraid many people would be stumped at first.

"Information is information, what else can it be?"

Information is a very abstract concept.

Write a word, a sentence, or an article, and it contains information.

Furthermore, information can be conveyed even without the use of words.

For example, ancient China's beacon towers relied on smoke signals to transmit military intelligence without using any written text.

So everyone has an intuitive sense of what information means.

However, if you were asked to give a strict definition of information and how to describe it, most people would probably be at a loss.

This is not only a problem for ordinary people, but also a problem for scientists.

In real history, the birth and development of informatics are closely related to physics.

In the early 20th century, with the booming development of electronics, communications and other technologies, engineers at the time encountered a problem.

“How do we characterize the amount of information?”

For example, you want to send a gift to a distant relative.

You can say in the letter that there are five kinds of cakes, six kinds of fruits, a small bronze mirror, etc. in the box.

These numbers are accurate information.

Although your relative cannot see you in person, he can verify this information for himself once the box arrives.

This seems like a very simple question.

However, this very question troubled communications engineers.

They need to encode this text information into electronic information first.

Then, through translation, compression, control and other means, it can be transmitted through the wires at the speed of light.

In this process, whether for the machine's computational requirements or the business's billing requirements, the amount of information must be known precisely.

So which statement contains more information: “five kinds of cakes” or “six kinds of mangoes”?

Or, let's take a more extreme example: which one contains more information: a piece of white paper with only 5 words on it or a piece of white paper with only 500 words on it?
When the early telegraph was first put into operation, charges were based directly on the number of words sent.

Five words cost more than one.

This raised a question.

"Does sending five words use more power than sending one?"

This involves the issue of the amount of information.

This problem seems very simple to ordinary people, but it is very difficult to answer in a rigorous, scientific way.

Later, with the emergence of computer theory, scientists' demand for quantification of information became increasingly urgent.

Thus, information theory was born.

It is a theory that specifically studies information.

In real history, in 1948, the American mathematician and cryptographer Shannon published a paper that shocked the academic community: "A Mathematical Theory of Communication."

In this paper, Shannon first proposed the concept of "information entropy".

He borrowed the concept of entropy from thermodynamics and called the average amount of information after excluding redundant information "information entropy."

In thermodynamics, entropy is related to the number of possible microscopic states of a system, that is, to its degree of disorder.

Information entropy, by analogy, is determined by the probabilities of the possible outcomes and measures the average uncertainty of an event.

In the paper, Shannon gave the mathematical expression for information entropy: H(X) = -∑ p_i log p_i (where p_i is the probability of each possible event).

At this point, the problem of quantitative measurement of information has been solved, and scientists can now study information in a quantitative way.

At this point, we can give a more rigorous definition of information:

“Information is that which is used to eliminate uncertainty. The greater the uncertainty eliminated, the greater the amount of information.”

But there is one thing to note here.

Information entropy is a strict mathematical concept rooted in computer communications, binary (0/1) coding, and the like.

It is a theoretical science created specifically for computers.

At the operational and computational level, information entropy can be understood as the minimum number of bits required to store information.
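For readers who want to see the idea in action, here is a minimal illustrative sketch in Python (an addition of this text, with assumed example distributions, not anything from the story) showing how the formula above turns probabilities into a bit count:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Outcomes with zero probability contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A heavily biased coin is more predictable, so each flip carries less information.
print(shannon_entropy([0.9, 0.1]))      # ~0.47 bits

# Eight equally likely symbols need at least 3 bits each to store.
print(shannon_entropy([1/8] * 8))       # 3.0
```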

However, calculating information entropy in the real world is extremely complicated, and it may even be impossible to calculate it.

Generally, the problem is solved by measuring the change in information entropy.

In addition, different people may get different amounts of information from the same sentence.

For example, for the same face, some people may interpret it as "beautiful" while others may interpret it as "ugly".

This kind of abstract information with associative and personal experience characteristics is not included in the scope of information entropy proposed by Shannon.

In short, Shannon published this paper and proposed the concept of information entropy, marking the birth of information theory.

He himself was naturally hailed as the "Father of Information Theory."

After the emergence of information theory, computer theory and technology developed by leaps and bounds.

And they gave birth to the brilliant information age.

At this point, someone may have discovered the problem.

"Huh? What does that information have to do with Maxwell's demon?"

"What does it even have to do with physics?"

In the early days of information theory, no one could clearly explain the relationship between information and the real physical world.

Information seems to be the third form of existence besides matter and energy.

For example, an apple is placed there, and there is a lot of information about it, such as its taste, mass, color and so on.

If the apple disappears at this moment, then both matter and energy disappear, but the information about the apple does not disappear.

Even if we can't see the apple, we still know its taste.

But from another perspective, information cannot exist independently from matter and energy.

If all matter and energy in the universe suddenly disappears in the next moment, will there still be information in the universe?

In other words, does the information still have "meaning" at this point?

In short, the relationship between information, matter and energy is very vague.

At this time, someone suddenly had an idea:
"If information and the physical world are unified, then the amount of information should also be considered a physical quantity."

But there are two problems with this view.

First, in physics, various physical quantities are needed to describe the state of a physical system, such as speed, energy, etc.

These physical quantities all have their own dimensions, which means they are obtained through measurement.

For example, thermodynamic entropy is measured in J/K, a unit whose dimension is energy divided by temperature.

However, information entropy only has units and no dimensions.

Because information entropy is not measured; it is calculated mathematically from the probabilities of events.

Dimension is the property that characterizes the nature of a physical quantity, while unit simply indicates the size of the physical quantity.

There is an essential difference between the two.

Therefore, it is questionable whether the amount of information can be regarded as a physical quantity.

Of course, this problem can be solved by expanding the definition of physical quantities.

But the second problem is not easy to solve.

In physics, a change in the state of a physical system generally means that work has been done on it (or energy has otherwise been exchanged with it).

The work done represents the change in the energy of the system.

But does the change in information state, or the change in information entropy, require energy consumption?

For example, which piece of paper has greater energy, a piece of paper with 5 words written on it or a piece of paper with 100 words written on it?

Or maybe you are typing on your computer, but you are not satisfied with what you have typed, so you delete all the words.

In this process, if we do not consider factors such as resistance, does the change of information consume energy?

This is a very interesting question.

The essence of the problem can be summarized as: Is there a relationship between information and energy? What is the relationship?
The key to solving this problem is actually "entropy" in thermodynamics. Do you remember the formula proposed by Clausius?
"dS = dQ/T."

In this formula, he linked entropy and heat.

To generalize further, a change in entropy corresponds to a change in energy.

This is a remarkable discovery.
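To make the relation concrete, here is a small illustrative calculation in Python (the numbers are assumptions chosen for this sketch, not from the story) of how a heat transfer translates into an entropy change:

```python
# Clausius's relation dS = dQ / T, with illustrative values.
Q = 300.0   # heat transferred reversibly to the system, in joules
T = 300.0   # absolute temperature at which the transfer happens, in kelvin

dS = Q / T  # resulting entropy change, in J/K
print(dS)   # 1.0 J/K: adding heat at a given temperature raises the entropy
```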

However, Clausius did not take this formula seriously at the time.

He even thought that the concept of entropy was dispensable.

It was like an auxiliary line in geometry, introduced merely for the convenience of understanding a problem.

By combining Clausius's entropy formula with Boltzmann's microscopic entropy formula (S = k ln W), entropy and energy can be manipulated together in a rigorous way.

In real history, another great figure in information theory, Landauer, linked information entropy with thermal entropy.

He put forward the view that "all information requires a physical carrier."

“When processing information, these physical carriers must be operated.”

"And the physical carrier needs to be constrained by the laws of physics."

For example, all kinds of information in a computer are stored on the hard disk.

When we modify information, we are actually operating on the materials that make up the hard disk at a microscopic level.

In this way, Landauer connected the information world and the physical world.

It was under the guidance of this idea that he proposed the famous Landauer principle in 1961.

"To erase 1 bit of information, at least kTln2 of energy must be dissipated into the environment."

Where k is the Boltzmann constant and T is the ambient temperature.

Through formula conversion, this principle can be expressed in another way:
"Erasing 1 bit of information will cause the entropy of the environment to increase by at least kln2."

Landauer's principle means that information no longer exists outside the physical world, but has a deep connection with physical entities.

Later, a big shot even boasted: "The universe is a quantum computer."

Another big shot said: "Everything comes from quantum bits."

However, these views seem too vague at present.

But soon, physicists found an example of how information theory was changing physics.

That is, based on Shannon and Landauer's work, the Maxwell's demon conjecture can be perfectly explained from the perspective of information!

But right now, let alone the Landauer principle, even Shannon is still just a kid.

There is not even a shadow of information theory yet.

Therefore, everyone present, including the physics giants, knew very little about what "information" really meant.

When they heard the word "information", they were all shocked.

Then they were filled with doubts again.

Because the word "information" seems to have transcended the scope of physics.

“What angle is the information from?”

"How is Maxwell's demon related to information?"

"Isn't there a physical quantity called information in physics?"

Everyone started talking immediately.

Oppenheimer, Wang Dezhao and others widened their eyes.

Is Professor Bruce going to come up with a completely new theory?

“No one had ever thought about physics in terms of information.”

“This idea is so innovative!”

“It’s hard to imagine how information can be related to physics.”

Big names like Langevin and de Broglie looked excited, their eyes sharp.

In their view, this view is very interesting.

People did not yet understand information only because the theory had not appeared and no one had studied it systematically.

But that did not mean they had no sense of what information meant.

The mass, charge, speed of movement, etc. of an electron are all forms of information.

The collection of states of physical existence is information.

These were rough, intuitive summaries of what information is.

But how does information affect physical existence itself?

The big names were extremely curious about how Professor Bruce would interpret information.

Under the expectation of everyone, Li Qiwei smiled slightly and continued:

"If you know anything about the industrial world, you should know that the invention of electronic components such as diodes and transistors has brought about innovation in the field of communications."

“But new developments often bring new problems.”

"Not long ago, a communications engineer from Bruce Group raised a very interesting question."

“That is: How do we measure information?”

"While researching the product, the engineer discovered that it was very important to know exactly how big and how much information there was."

"But currently there is no theory in physics specifically for information."

"So, he could only let it go."

“However, when I came across this question by chance, I suddenly became interested.”

“I think this issue is worth further study.”

Wow!
Everyone was shocked!
On the one hand, everyone marveled at the Bruce Group's innovation in product research and development.

On the other hand, they were even more surprised that Professor Bruce attached so much importance to an engineer's question.

Given Professor Bruce's current status, the gap between him and an ordinary engineer was enormous.

But Professor Bruce is still full of curiosity about the unknown.

Everyone was immediately in awe.

But this still doesn’t dispel everyone’s doubts:

"What does information have to do with Maxwell's demon?"

At this time, Li Qiwei continued:

"In thermodynamics, entropy is a physical quantity that characterizes the degree of disorder in a system."

"The entropy of a crystal with atoms arranged regularly inside must be smaller than that of glass of the same size."

"So let's think divergently, is there a similar phenomenon with information?"

"For example, some people give speeches for half a day without any key message, it's all nonsense."

"My speech, Bruce, was full of valuable information, and I won everyone's applause every few minutes."

“This shows that my speech contained a lot of information, while other people’s speeches contained little information.”

Everyone laughed. Professor Bruce was still as confident and humorous as ever.

"If you think about it a little more, a speech that is all nonsense has a confusing message, while a speech that is all substance has a coherent message."

"How similar is this to chaos in physical systems?"

“So, I boldly put forward a concept: information entropy!”

“It indicates the total amount of valid information contained in a message.”

"This is analogous to the concept of entropy in thermodynamics."

“I think information is the means to eliminate uncertainty.”

"Since I don't know much about the field of communications, I can't give a strict mathematical expression."

"However, based on information entropy, we can continue to deduce through conceptual logic."

Wow!
The whole audience was shocked!

(End of this chapter)
