A friend of mine said:
“… Concerning self-correction, can we say that the system corrects itself toward a state of homeostasis (feedback → correction → goal) OR toward maintaining the integrity of the closed system? Is it o.k. to look at this in terms of homeostasis?
Reason why I’m asking is that I’ve been looking at the socialization behavior of horses in the herd, and how they “process” emotions. The processing looks very much like a feedback loop with a self-correcting mechanism – I see it as an infinity symbol. The interaction produces an action which takes the herd back to a state of “stability” or I call it homeostasis.
Anyway, this was a good experience for me because I read in that way – when I don’t get something I have a piece of paper and I draw to try to put the words into some visual.”
In the obvious case of the thermostat, there is a ‘goal’ – a temperature that is set by the second-order system (the human being). But now with horses you get something more obviously complexificated (i like coining ridiculous words). The horse is a system, the herd another, the environment another, and perhaps even another may be the system of “Horse” (i.e. ALL horses) thought about over a vast time period (that is, thinking about time from a second-order perspective – i think).
In this case, when we want to talk about homeostasis, what variables are tending towards sameness? If there are closed systems within closed systems within closed systems (as mitochondria are closed systems within each cell, which is another closed system…), then do we need to speak of a goal for each closed system? And is there only one goal per system (as in the thermostat), or can there be multiple goals which overlap from system to system? Or even multiple conflicting goals within a single system? Or is this not even possible from a cybernetic standpoint… or would it just be a double bind? Can you create a thermostat which simultaneously is tending towards TWO temperatures? How would it behave – oscillate at some frequency set by the speed of the feedback loops? Tend towards some intermediate value? Runaway?
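The two-setpoint thermostat question can actually be played with in a few lines. Here is a minimal sketch – every name, gain, and the bang-bang control rule are my own assumptions, not anything from control theory canon – where two independent thermostats, each with its own setpoint, both push on the same room temperature:

```python
def simulate(setpoints, t0=15.0, steps=200, gain=0.5, deadband=0.25):
    """Each setpoint gets its own bang-bang controller; the room
    temperature integrates the sum of their heat/cool actions."""
    temps = []
    t = t0
    for _ in range(steps):
        action = 0.0
        for sp in setpoints:
            if t < sp - deadband:
                action += gain   # this controller calls for heat
            elif t > sp + deadband:
                action -= gain   # this controller calls for cooling
        t += action
        temps.append(t)
    return temps

# Two conflicting goals: one thermostat wants 18°, the other wants 22°.
trace = simulate([18.0, 22.0])
```

With these symmetric gains, the answer (under my toy assumptions) turns out to be the "intermediate value" option: once the temperature enters the zone between the two setpoints, one controller heats while the other cools, the actions cancel, and the system freezes in a deadlock between its goals. Break the symmetry – unequal gains, or a lag in one feedback loop – and it might instead oscillate or drift, which is rather suggestive of the double-bind reading.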
Maybe symbiosis (which is an interesting concept to think about cybernetically) is the holding of multiple goals, at least one of which overlaps with a goal of another system? I’m just spitballing here.
Individual horse behavior —> herd qualities —> individual horse behavior (your infinity sign)
So if you can detect the processing of emotion in an individual horse, do you also see a higher-order processing of emotion that does NOT take place at the individual horse level but ONLY at the herd level? Is this a distinct process? Or is the herd’s processing so completely bound up with the individual horse’s processing that it is impossible to tell them apart?