One thing that is often said about "machines" (anything from algorithms to AI or robots) is that they don't have emotions. Over time, various sci-fi movies have debated whether an emotion-driven artificial intelligence is better than one that coldly calculates the odds and decides based on them. I don't think we know yet. I don't think we are ready for artificial intelligence with emotions, even though today's AI models try to mimic them.
But humans are emotional creatures. Even more so in recent years: since social media became such a big influence on society, emotional intelligence plays a more important role than IQ. The reason that matters, if you haven't guessed it, is that social media offers a perfect wall behind which emotionally immature people can feel empowered or even become predators, something that may not happen to the same degree in real life. Emotions are not as easy to read or interpret without seeing the other person, and even then, some people are better at hiding their emotions than others.
That's between people. But even your own emotions toward a situation and how you handle them may decide the outcome.
I read somewhere, some time ago, that decisions are justified logically but triggered emotionally. That particular statement referred to investments, but it may apply to other domains as well.
I found that statement very powerful. At first I disagreed with it, but later I came to the conclusion that it is true in most cases. My initial reaction was: "What do you mean, triggered emotionally? You mean I can't act on a decision I made logically without being emotionally involved?" The statement didn't make sense to me.
Ideogram's work ^^
It's not by chance that both FOMO and FUD have "fear", a basic emotion (amygdala-driven, unconscious), as their first word.
What does the animal brain do when it feels uncertainty (i.e. insecurity), when it is in doubt?
Is a lion behind those bushes? Seems like it is, but I'm not sure... Well, I'd better run before I find out!
In the case of an investor, that translates into selling, running away from the danger in the market.
Sure, in the wild, animals may not do that until the threat becomes imminent, because the energy spent on running is difficult to recover.
This may be useful for investors too: don't sell at every wave of FUD, and consider selling when there is FOMO (for example, to stop chasing a mirage).
But this extends far beyond FUD and FOMO.
Let's say there is neither, but you logically decide it is a good time to sell for the time being. Does your logic provide the signal to sell? Potentially. Does it make you act on it? I don't think so. Sticking to basic feelings, what pulls the trigger is either your fear of losing if your logic is sound (protecting your gains), or your greed to make more once prices go down and you can reenter the market at a lower level (aggressive growth). So nope... it's not the logic. Logic provides the justification; the emotional side pulls the trigger.
Being a logical guy myself, I had a hard time accepting that for a good while. But I tend to believe it is true.
Would it be better if no emotions were involved? I don't know. Trading bots don't have feelings, and sometimes they do very well in the market while other times they lose. It's also good to know that the majority of trading participants are already non-human, so while some of them take the predictable human emotions into account, they have no emotions of their own.