This time in English, because of the terminology.
Don't be afraid of science. All of these are models and, by definition, not the truth. However, it can be fun to understand the stuff all the "intelligent" people talk about. Even better, we can use it to understand (our own) biology.
Entropy is...well
Entropy, first, means energy
Of course this is the best-known context in which the concept of entropy is used. It is the tendency of the universe (if it is a closed system...) towards thermodynamic equilibrium. Spray a gas into a room and observe the molecules: after a while they will be equally distributed. There will be no more net interaction on the particle level. It's called heat death or entropy death, a kind of absolute end point: the end of all change (except the particle motion induced by quantum mechanics and zero-point energy). So a system has a macrostate, for example its temperature, and it has microstates, the movements of the individual molecules, that is, their kinetic energies. The macrostate is composed of the microstates. Equal distribution = no net interaction = no macroscopic change = no temperature differences left.
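Written as a formula, this is the Gibbs form of the thermodynamic entropy:

S = -k \sum_i p_i \ln p_i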
[You don't need to understand it right now: the p_i are the probabilities that the system is found in microstate i, and all the p_i are evaluated for the same macrostate. In thermodynamics, k is the Boltzmann constant.]
We would live in a one-shot universe: an early phase where everything is evolving, turbulence lets galaxies form, and then matter slowly disperses.
Fortunately it is neither obvious nor trivial that the "universe" is a closed thermodynamic system (maybe it is open, maybe closed), and talking about that kind of entropy when the system is not in equilibrium is rather vague. All those vague theories lead us to Boltzmann's box and throw up many problems. So let's leave this popular definition of entropy for now.
Entropy also means (dis-)order or randomness
We as observers of our world have a sense of time only because there is change. Without change, no time and of course no life. Entropy is claimed to be the tendency of the universe towards disorder and death. Really harsh, isn't it? We have our time because the universe only moves from one state of relatively high order to another state of relatively lower order, or from lower entropy to higher entropy. So entropy also means absence of order, aka disorder.
Well, but why? Imagine you open a brand-new, factory-ordered card deck and throw it on the ground: with high probability it is now less ordered. With every throw it gets less and less ordered, or more and more disordered. This is because there are far more states of higher disorder than states of higher order. This is why time is a one-way street.
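To put rough numbers on that claim (assuming a standard 52-card deck): exactly one arrangement matches the factory order, while the total number of arrangements is

52! \approx 8.07 \times 10^{67},

so a random throw lands in one of the countless disordered arrangements with overwhelming probability.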
...is it? When you spill your 10th beer over the floor, how probable is it that the beer will reassemble itself and jump back into the glass? Not very probable, right? But it is not impossible! The probability, even within a finite sequence or within finite time, never goes to zero, and with a sample size large enough such things will happen almost surely (strong law of large numbers). It's called a Poincaré recurrence.
Therefore the second law of thermodynamics ("entropy in a closed system can only increase") is only a statistical law, not a strict physical law. It is based on the law of large numbers, one of the results of probability theory on which all of statistics is built.
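A minimal sketch of this statistical character, as a toy model in Python (the two-sided box, the particle numbers and the helper prob_all_left are illustrative assumptions, not physics): each particle sits in the left half of a box with probability 1/2, and we estimate how often the "ordered" state with all particles on one side recurs.

import random

def prob_all_left(n_particles, trials=100_000):
    # Toy model: each particle is in the left half with probability 1/2,
    # independently of all others. Estimate how often ALL of them happen
    # to be on the left, i.e. how often the low-entropy state recurs.
    hits = 0
    for _ in range(trials):
        if all(random.random() < 0.5 for _ in range(n_particles)):
            hits += 1
    return hits / trials

for n in (2, 5, 10, 20):
    print(n, "particles  simulated:", prob_all_left(n), " exact:", 0.5 ** n)

With a handful of particles the ordered state recurs all the time; with 20 it is already rare, and for anything like 10^23 gas molecules the expected waiting time dwarfs the age of the universe. That is what it means to call the second law statistical.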
Information-theoretic entropy
Then there are some different contexts, like information-theoretic entropy: the Shannon entropy. Imagine I toss a coin; the information the coin holds is the set of distinguishable states it can be in. If it is a standard coin, that is heads or tails, 1 or 0. This is one bit of information. Now imagine I don't show you the result. Then you have no information about the state of the system; its entropy is 1 bit.
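Written as a formula, this is Shannon's definition:

H = -\sum_i p(i) \log_2 p(i)

For a fair coin, p(head) = p(tail) = 1/2, and the sum comes out to exactly 1 bit.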
[p(i) is the probability of receiving message i]
So entropy is also information. It's a special kind of information: information you don't have. And getting this information means reducing the entropy of the system. Now imagine there are six people observing the coin toss. What happens now to the 1 bit of information/entropy?
Does every person now hold one bit? No, because the system holds only 1 bit. They all share the same bit. This is so-called mutual information.
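As a small worked example (assuming every observer sees the coin perfectly, so any two records A and B are identical copies of the result):

I(A;B) = H(A) + H(B) - H(A,B) = 1 + 1 - 1 = 1 bit

However many observers you add, together they still hold only the single bit the coin carries.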
If entropy is information you don't have... then a hash or a password which is not yours has high entropy. The higher the entropy, the harder it is to guess. High-entropy passwords and hashes are better than low-entropy ones, right? In this context it also has to do with disorder as the absence of guessable patterns.
Ingeborg123 vs. gn2Io3gr1e ... OK? Both are built from roughly the same characters, but since sequence matters in a password, the second one, with its higher disorder compared to the ordered and guessable string #1, is preferable. Better, of course, would be a purely random string like: 8!!dP7&qM3 ?D@
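A minimal sketch in Python of why a naive, frequency-based entropy misses this (char_entropy is an illustrative helper, not a real password-strength estimator): it scores the first two strings almost identically, because it only counts how often each character occurs and ignores the ordering that makes Ingeborg123 guessable.

import math
from collections import Counter

def char_entropy(s):
    # Shannon entropy of the character-frequency distribution of s,
    # in bits per character; the order of the characters is ignored entirely.
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

for pw in ["Ingeborg123", "gn2Io3gr1e", "8!!dP7&qM3 ?D@"]:
    print(pw, round(char_entropy(pw), 2), "bits per character")

Real strength estimators therefore model dictionary words, sequences and keyboard patterns on top of plain character statistics; the frequency number alone overstates the strength of anything with a predictable structure.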
There are even more versions of entropy once we go into quantum theory, but let's stick with the ones mentioned.
[Did you know that you can't see a hollow face? This gives us a hint for a later article, so keep it in mind. We will go from all this abstract shit to the biological/neurological foundations of perception, knowledge and good decisions in complex problems.]
Sources
T.M. Cover and J.A. Thomas 1991: Elements of Information Theory.
Susskind and Brown 2017: The Second Law of Quantum Complexity.
Lathia et al. 2015: Heat Death (The Ultimate Fate of the Universe)