Review

Introducing Entropy into Organizational Psychology: An Entropy-Based Proactive Control Model

by Haozhe Jia and Lei Wang *
School of Psychological and Cognitive Sciences and Beijing Key Lab for Behavior and Mental Health, Peking University, Beijing 100871, China
* Author to whom correspondence should be addressed.
Behav. Sci. 2024, 14(1), 54; https://doi.org/10.3390/bs14010054
Submission received: 16 November 2023 / Revised: 30 December 2023 / Accepted: 12 January 2024 / Published: 15 January 2024
(This article belongs to the Section Organizational Behaviors)

Abstract

This paper provides a systematic review of the transfer and quantification of the concept of entropy across multiple disciplines and explores its future applications and research directions in organizational management psychology based on its core characteristics. We first review the conceptual evolution of entropy in disciplines such as physics, information theory, and psychology, revealing its complexity and diversity as an interdisciplinary concept. We then analyze the quantification methods of entropy in a multidisciplinary context and show that these calculation methods exhibit both discipline-specific features and cross-disciplinary commonalities. Next, we review research on how individuals cope with the uncertainty that accompanies entropy increase, redefine psychological entropy from the perspective of organizational management psychology, and propose an “entropy-based proactive control model” at the individual level. This model is built around the core connotation of entropy and covers four dimensions: learning orientation, goal orientation, change orientation, and risk taking. We argue that psychological entropy, as an individual-level meta-structure, can simulate, explain, and predict from a dynamic perspective how individuals manage and control “entropy” in an organizational environment. This understanding enables psychological entropy to integrate a series of positive psychological constructs (e.g., lean spirit), providing extensive predictive and explanatory power for various behaviors of individuals in organizations. This paper provides a new direction for the application of the concept of entropy in psychology, especially for theoretical development and practical application in the field of organizational management.

1. Introduction

In the VUCA (volatility, uncertainty, complexity, ambiguity) era, organizations face unprecedented changes and challenges [1]. In such an environment, effectively responding to and managing instability and uncertainty from the external environment, as well as the resulting complexity and ambiguity, is crucial for the sustainable development of organizations.
Entropy, as a concept describing the ambiguity, uncertainty, and degree of chaos in systems, may offer a novel perspective in understanding the internal and external complexities of organizations and assist in exploring potential pathways to maintain organizational dynamism and systemic stability [2]. At the micro level, humans, as organic life systems, inherently engage in an entropy-resisting process in their survival and development, characterized by an innate tendency toward entropy reduction [3]. At the macro level, organizations as a whole need to proactively face continuously changing external challenges through resource allocation and process optimization to maintain competitiveness and innovation capability in a dynamic environment. Whether at the individual or organizational level, to achieve high-quality survival and development, the effective management of entropy (possessing the laws/processes of entropy reduction) is essential. Therefore, this article attempts to guide individuals and organizations in maintaining and enhancing adaptability and innovative capacity in ambiguous, disordered, complex, and uncertain environments by understanding and applying the concept of entropy.
Clausius first introduced the concept of entropy within the field of thermodynamics in 1865 [4]. He posited that entropy reflects the degree to which energy is evenly distributed. As the scientific community’s understanding of and focus on entropy deepened, the concept was introduced into the field of information theory. In this domain, entropy is an information-theoretic measure of uncertainty, based on a set of known event probabilities, that is used to quantify the amount of information (complexity) or uncertainty [5,6]. High information entropy signifies greater uncertainty, implying that the probabilities of all events tend to be uniform. Conversely, when the probability of a subset of events becomes greater than that of other events, information entropy correspondingly decreases. For instance, it is challenging to predict the outcome of a dice roll, which possesses a high degree of uncertainty, and hence its information entropy is high. In contrast, the natural law of the sun rising in the east and setting in the west has a fixed and unique orientation, thereby resulting in lower information entropy [7,8]. The introduction of information entropy not only brought the concepts of disorder and uncertainty into the understanding of entropy for the first time but also marked the interdisciplinary expansion of the concept of thermodynamic entropy [9,10]. This development paved the way for the application of entropy in various disciplinary fields, especially in providing new perspectives and theoretical frameworks for understanding and addressing the complexity and uncertainty of the VUCA era.
Although the concept of entropy has been transferred in various forms across multiple fields, its interdisciplinary application still faces numerous challenges. Taking psychology (specifically, organizational management psychology) as an example, first, there remains an incomplete understanding of key concepts and their ambiguities and subtleties. This partial comprehension may lead to vague interpretations of a range of psychological phenomena, as psychological phenomena themselves often exhibit an inherent complexity that is difficult to clearly explain with external concepts [11]. This ambiguity leads to theoretical uncertainty and poses difficulties for empirical research [12]. Second, the operation of organizations reveals a complex duality between order and disorder: they both disintegrate and organize; they are simultaneously complementary and competitive; and they constrain and promote each other. Therefore, both entropy increase (systems tending toward disorder) and entropy reduction (tending toward order) are fundamental to the existence and survival of organizations [13]. However, existing entropy theoretical frameworks have not fully addressed the synergistic transformation between order and chaos. These frameworks often adopt a deterministic perspective, which simplistically categorizes disorder and order, as well as chaos and organization, as completely contrasting concepts. Third, the organizational decision-making process involves diverse individual behaviors and complex organizational dynamics, encompassing entropy reduction processes at both the individual and organizational levels.
Traditional research has predominantly approached these issues from the perspective of uncertainty, such as social uncertainty, perceptual uncertainty, action uncertainty, and outcome uncertainty [14,15,16,17,18]. However, decision making by individuals in dynamic organizational contexts is often a complex process, and uncertainty reflects only one aspect of the aforementioned challenges. Therefore, a research perspective based on uncertainty/ambiguity in decision making has limitations in predicting individual efficacy in organizational change within dynamically changing environments. The concept of entropy offers a more comprehensive analytical perspective. However, describing individual behavioral outcomes in dynamically changing environments from the theoretical viewpoint of entropy remains an unresolved issue. This also renders the quantification and interpretation of entropy in psychology and organizational behavior exceptionally complex [6,19,20,21,22].
Finally, current interdisciplinary applications of entropy mainly focus on theoretical transfer and construction based on its core concepts. However, the lack of empirical research restricts the further validation and development of these theories [23]. For instance, although entropy can be used to explain organizational responses to ambiguity, complexity, and uncertainty, operationalizing these concepts in practice (at macro or micro levels), as well as examining the related processes through empirical methods, remains a challenge.
Based on the considerations above, this review first interprets the concept and theory of entropy and provides a comprehensive overview of the conceptual transfer of entropy across various disciplines, aiming to capture the core essence of this ultimate law governing the universe. Second, we delineate methods for quantifying entropy in interdisciplinary and multi-contextual backgrounds. Finally, considering entropy as a fundamental concept describing disorder or randomness within complex systems, we propose that there exists a psychological structure at the individual level for proactively controlling increases in entropy. Accordingly, this article proposes an entropy-based proactive control model at the individual level and redefines “psychological entropy”. We posit that psychological entropy reflects the meta-mindset of individuals in proactively adapting to, managing, regulating, and controlling entropy changes within and outside an organization. This meta-mindset, acting as a meta-structural characteristic of the individual, can not only explain and predict various positive organizational behaviors but also integrate, to some extent, the behavioral outcomes of individual decisions made in situations of uncertainty, ambiguity, and complexity. For instance, the explanatory power of psychological entropy can extend the TU (tolerance of uncertainty) spectrum toward the positive end. More importantly, psychological entropy can also integrate a series of psychological constructs that promote sustainable individual development, such as lean spirit, and provide theoretical support for the proposition of new constructs.
In summary, the concept of psychological entropy not only enriches our understanding of individual behaviors within organizations but also offers new perspectives and tools for management practice.

2. The Conceptual Development of Entropy in the Context of Various Disciplines

2.1. Physical Perspective

Clausius [4] first introduced the concept of entropy within the context of thermodynamics in 1865. He emphasized that, in the absence of external influences, heat always flows from a hotter body to a cooler one. However, there is always a loss in energy conversion, such as a generator never being able to achieve 100% efficiency. Clausius regarded the portion of energy that could not be converted into electrical energy as entropy. Thus, thermal entropy can be understood as a form of energy “residue,” that is, the energy within a heat system that cannot be utilized for work. Entropy also reflects the degree of uniform energy distribution within a system. Clausius proposed that, in a state of thermodynamic equilibrium, the distribution of energy within a system is most uniform, and there exists no cyclic process that can continuously and independently extract energy from a heat source and completely convert it into useful work [24]. At this point, the thermodynamic entropy is at its maximum. For example, when a cup of hot water and a cup of cold water are thoroughly mixed, the temperature becomes uniform, with no further flow of heat, thus reaching a state of thermodynamic equilibrium.
Subsequently, in 1877, Boltzmann proposed a probabilistic equation related to entropy (also known as the Boltzmann–Planck equation) and reinterpreted entropy from the perspective of statistical mechanics [25]. He emphasized that the entropy of a system is proportional to the number of microstates in a closed system and the probability of these microstates occurring. Consider a box filled with gas molecules; these molecules can be arranged and move in many different ways. Each specific arrangement is referred to as a “microstate”. The macrostate of this box, such as its total energy, volume, and the total number of gas molecules, is actually manifested by the collection of these microstates. Therefore, if a system has a large number of possible microstates, it becomes more difficult to accurately determine its current state, thereby increasing its uncertainty and entropy [26,27].
Subsequently, the concept of entropy was applied to explain the physical basis of living organisms [28]. Schrödinger [28] proposed that living systems are capable of reducing their own entropy by absorbing energy from their external environment, thereby maintaining their structure and function. This view serves as a complement to the Second Law of Thermodynamics, which states that the entropy of a closed system only increases. However, living systems have the ability to absorb energy from their external environment to maintain internal order. For example, plants absorb solar energy through photosynthesis, and animals obtain energy by consuming food. Both processes involve absorbing energy from the natural environment to sustain life functions, thereby helping organisms maintain or increase their internal state of order. This exemplifies the process of entropy reduction.

2.2. Computational Science and Information Theory Perspective

Shannon [5] introduced the concept of entropy from thermodynamics into information theory; thus, it is also known as Shannon entropy. In information theory, Shannon entropy is a measure of the novelty and uncertainty of information. The core idea is that, the greater the uncertainty of an event, the more information we obtain from its outcome and, consequently, the higher the information entropy. For example, consider tossing a coin that has a head and a tail. When flipping the coin, the probability of each outcome is 50%, making the result uncertain. Therefore, when the coin lands, we receive information that was previously uncertain. However, if we replace this coin with a double-headed coin, the outcome of the toss is certain, and thus, the result holds no informational value for us. In these examples, the toss of a regular coin has higher information entropy because its outcome is more uncertain and can provide us with new knowledge that we did not previously possess. In contrast, the toss of the double-headed coin is entirely certain; hence, it has low information entropy.
Wiener [29] proposed that, in cybernetic information systems, entropy represents the degree of disorder within the system [30]. He believed that, when discussing the “organization” of a system, the presence of information becomes crucial, as it forms the basis for defining and characterizing the system. From this perspective, changes in entropy are inextricably linked to changes in the organization of the system, that is, changes in its structure.
Gell-Mann emphasized that entropy is closely related to information. In fact, entropy can be seen as a measure of our degree of ignorance about unknown entities [8]. Gell-Mann viewed entropy as a measure of the uncertainty generated by an individual’s lack of understanding of the microstates within a macroscopic system. For instance, consider walking into a library rich in books. Initially, we know nothing about the variety and distribution of books in the library, not even how to find a specific book we need. At this point, we face significant uncertainty because of our ignorance of the library (the microsystem), which is a manifestation of high entropy. However, as we start using the library’s indexing system and gradually become familiar with the library’s layout and the classification of books and their specific locations, our understanding of the library improves. At this stage, by acquiring more information about the microstates (such as the distribution of books), we reduce uncertainty, which is indicative of a low entropy state.

2.3. Dynamic Theory Perspective

Dynamical systems are systems that evolve over time. Unlike discrete systems, whose states are fixed at specific moments, dynamical systems are chaotic and unordered, and the relationships between their elements are uncertain. In dynamical systems, entropy is often used to quantify the uncertainty of the system’s state. The higher the entropy of the system, the greater the uncertainty in predicting its future state [31]. For example, we can imagine a dynamical system as a flock of birds flying in the sky. The group flight of birds is highly complex, with each bird’s position and speed constantly changing relative to the others. These birds constitute the elements of a dynamical system, and the relationships and interactions between them are highly uncertain. Questions arise such as how information is rapidly transmitted throughout the flock, how they can change formation so swiftly, how their speeds and accelerations are distributed, and how they manage to turn together without colliding. Therefore, this complex collective behavior makes it difficult to predict the flight pattern of this flock of birds (the dynamical system) at any given moment, exemplifying high entropy.
In the theory of dynamical systems, uncertainty is often related to the initial conditions of the system, the dynamical laws of the system’s evolution, and the system’s sensitivity to initial conditions. This means that even minor changes in the initial state can lead to significant differences in the system’s behavior over time [32].

2.4. Understanding Entropy in the Nervous System

Entropy is used as a measure of the information-processing capacity of the nervous system [33,34,35], and it serves as a powerful tool for quantifying brain function (complexity and unpredictability) and its information-processing capabilities [36,37]. High neural entropy indicates that brain activity patterns are more complex and irregular, potentially offering greater adaptability in processing diverse information and the ability to make effective decisions in complex tasks. On the other hand, low neural entropy suggests more ordered or repetitive neural activity patterns. In such cases, the brain may not be as well-suited for processing diverse information but could be more efficient in performing certain specific, repetitive tasks [36,38].
A balanced neural entropy, which is the equilibrium between entropy and redundancy in neural activity, might represent the most efficient state for the brain to process information. This is because the brain’s capacity to process information depends not only on entropy (i.e., the diversity of information) but also on reliability, which is the balance between entropy and redundancy [39].

2.5. Understanding Entropy from a Psychological Perspective

Psychological entropy has been used to describe the uncertainty and disorder in an individual’s mental state [22]. For instance, conflicting beliefs, unclear self-concepts, or unresolved decision-making difficulties all signify higher psychological entropy. This is often accompanied by cognitive and emotional turmoil [40]. Research indicates that, during problem solving, when initial strategies fail, a significant increase in behavioral entropy is observed, manifesting as irregularity and unpredictability in behavior [41,42]. The increase in entropy prompts individuals to seek new strategies and solutions, marking a shift in their approach. Thus, problem solving is essentially a process of reducing chaos or, in other words, lowering psychological entropy. Once a new effective strategy is formed, behavioral patterns tend to return to a predictable, stable state of low entropy.

2.6. Understanding Entropy from a Sociological Perspective

In the construction of social systems, the maintenance of social order is closely linked to the criteria for classifying social roles. These criteria are diverse, encompassing aspects such as social class, educational background, abilities, and talents, collectively determining an individual’s role and status in society. When the aforementioned order and classification fail to sustain the normal functioning of social mechanisms, the social system can descend into chaos [43]. Therefore, in the field of sociology, entropy is often defined as a key indicator for measuring the degree of order, stability, and chaos/disorder within a social system, essentially reflecting the dispersion or unorganized state of social elements. In other words, entropy is also used to gauge the lack or abundance of diversity within a system [44].
Dinga, Tănăsescu, and Ionescu [45] propose that entropy and order are opportunity costs of each other and, based on the concept of social order, have developed a novel theoretical framework for social entropy. They argue that social entropy fundamentally rests on social norms and must be related to social order. Specifically, social entropy is inversely proportional to social order. A society that is orderly and adheres to rules exhibits lower social entropy. However, social entropy is not merely a representation of a society’s state of disorder. Dinga and colleagues [45] identified three core structures essential for an individual’s fit within society: self-esteem, freedom, and democracy. When these three core needs are not met within a society, it leads to a deviation from social order, resulting in increased social entropy and heightened societal chaos. In summary, their concept of social entropy is largely based on the values and demands of social justice [45].

2.7. Organizational System Perspective

Organizations are often conceived of as systems, typically described as collections of interconnected or interacting elements. Testa and Kier [46] suggest that a system can be characterized in three aspects. Firstly, a system needs to have a structure (form) that can be formally described. Secondly, the system must exhibit functional behavioral patterns, meaning that the behaviors among individuals are interrelated, focusing on their characteristic properties rather than the dimension of time. Thirdly, the form and function of a system are not static but change over time, which can be described as complex system fluctuations. Therefore, organizational entropy often quantifies the level of chaos or disorder within an organizational system. This disorder may arise from a combination of factors within the organization, such as decision making, communication, technology, or culture. Such a state of high entropy not only consumes resources and reduces efficiency but may also hinder an organization’s innovation and adaptability [47,48,49]. Assessing organizational entropy considers the organization’s ability to maintain a differentiated state, which is relevant for fostering the long-term sustainable development of the organization [48].

2.8. Entropy from the Perspective of Management

Management entropy is used to describe the chaotic and unsustainable state within an organization caused by factors such as information asymmetry, unclear objectives, inefficient workflow, and resource misallocation [5,50,51]. The greater the chaos within an organization, the higher the management entropy. Kast and Rosenzweig [50] proposed that organizational systems can import resources from their environment, that is, by maintaining a continuous flow of matter, energy, and information to achieve a dynamic equilibrium state, thereby reducing management entropy. Therefore, addressing management entropy is an inevitable challenge for every organization, rooted in organizational complexity and human diversity. Reducing management entropy not only enhances the operational efficiency of an organization but also contributes to creating a more harmonious working environment. This implies that the introduction of the concepts and methodologies of thermodynamic entropy into management is crucial, as they allow us to address management issues from new perspectives [52].

3. Quantification and Application of Entropy in the Context of Various Disciplines

In various academic fields, quantifying the degree of uncertainty in events involving stochastic processes is a pervasive challenge. This uncertainty often implies disorder, ambiguity, and a lack of predictability, making the prediction of stochastic processes extremely difficult. Against this backdrop, the concept of entropy becomes a key tool for understanding and quantifying the uncertainty, ambiguity, and disorder of systems. The quantification of entropy spans multiple domains, and its diversity is reflected in different types of entropy. Whether it is measuring the distribution of energy in thermodynamic systems as thermodynamic entropy, quantifying the richness of information in messages in information systems as information entropy, describing the degree of uncertainty in mental states as psychological entropy, or measuring the order and stability in social systems as social entropy, the concept of entropy provides a reliable and consistent method of quantification. This is particularly important for analyses in fields involving probability and uncertainty. Next, we will introduce a series of common and easily understandable entropy quantification concepts. This work enables us to more accurately understand and predict complex events involving stochastic processes.

3.1. Thermodynamic Entropy

In the Second Law of Thermodynamics, entropy is primarily used to describe energy changes and can be represented by Equation (1). Here, dS represents the change in entropy, T is the thermodynamic temperature of the system, and dQ is the heat change in a reversible process. Taking the melting of ice into water as an example, when ice is heated to 0 degrees Celsius, it melts into water. We can then calculate ΔS_melting = Q_melting/T, where Q_melting is the heat required for the ice to melt and T is the melting temperature (in Kelvin). However, this formula describes the relationship between the change in entropy of a system and the heat absorbed or released by the system in a reversible process. In practical applications, since most natural processes are irreversible, this formula is usually used for idealized analysis.
dS = dQ/T (1)
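As a minimal numerical sketch of Equation (1), the entropy change for melting a given mass of ice can be computed directly; the latent heat of fusion (roughly 334 kJ/kg) and the melting temperature (273.15 K) used below are standard illustrative values supplied here, not figures taken from the text.
```python
# Entropy change for melting ice via Equation (1): dS = dQ/T at constant temperature.
# Assumed constants (illustrative): latent heat of fusion ~334 kJ/kg, melting point 273.15 K.

LATENT_HEAT_FUSION = 334_000.0  # J/kg, approximate value for water ice
T_MELT = 273.15                 # K, melting temperature at standard pressure

def melting_entropy(mass_kg: float) -> float:
    """Entropy change (J/K) when mass_kg of ice melts reversibly at 0 degrees Celsius."""
    q_melting = mass_kg * LATENT_HEAT_FUSION  # heat absorbed by the ice, Q_melting
    return q_melting / T_MELT                 # dS = dQ/T, since T is constant during melting

print(f"dS for 1 kg of ice: {melting_entropy(1.0):.1f} J/K")  # roughly 1222.8 J/K
```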

3.2. Entropy Quantification in Statistical Mechanics

In statistical mechanics, the quantification formula for entropy provides a method to understand entropy from a microscopic perspective, which can be described by Equation (2). Here, S represents entropy, and kB is the Boltzmann constant, which provides a conversion from microscopic energy units (such as electron volts) into macroscopic energy units (such as joules). W is the number of microstates of the system, which is the number of possible microscopic arrangements of the system under given macroscopic conditions. The core idea of this formula is that entropy is directly proportional to the natural logarithm of the number of possible microstates in the system. The greater the number of microstates, the higher the entropy of the system, indicating a higher degree of disorder. We can imagine a system composed of an ideal gas (consisting of non-interacting, structureless particles). The number of microstates, W, for an ideal gas system with a given energy, volume, and number of particles can be estimated using Maxwell–Boltzmann statistics. Entropy, S, can then be calculated by substituting into the formula. However, although this calculation provides a powerful framework for understanding entropy from a microscopic perspective, this formula mainly explores how entropy arises from behavior at the atomic and molecular levels and typically involves complex integrals and knowledge of statistical physics, making it difficult to transfer and apply.
S = kB ln(W) (2)
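To illustrate Equation (2) without the full machinery of Maxwell–Boltzmann statistics, the toy sketch below uses hypothetical microstate counts simply to show how entropy scales with the logarithm of the number of accessible microstates.
```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(n_microstates: float) -> float:
    """S = kB * ln(W) for a system with W equally probable microstates (Equation (2))."""
    return K_B * math.log(n_microstates)

# Hypothetical microstate counts: doubling W adds only kB*ln(2) to the entropy,
# while raising W by many orders of magnitude raises S roughly in proportion to ln(W).
for w in (1e20, 2e20, 1e40):
    print(f"W = {w:.0e}  ->  S = {boltzmann_entropy(w):.3e} J/K")
```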

3.3. Information Entropy

Shannon [5] introduced the concept of entropy into information theory and proposed a method for measuring the amount of information based on the aforementioned formula, as shown in Equation (3). This formula is very important in both statistical mechanics and information theory. It indicates that entropy is the negative sum of the probabilities of all possible states multiplied by their logarithms. In information theory, it measures the uncertainty of information or the average amount of information. In statistical mechanics, it describes the uncertainty or disorder of the system’s microstates. In the formula, S represents entropy, K is a positive constant (such as K = 1), and pi represents the probability of the i-th microstate. The summation is over all possible microstates. This also means that we must determine the potential probability of an event occurring in a random process as accurately as possible.
Take the result of a coin toss as an example. Assuming the coin toss is fair, ideally, the probability of getting heads or tails is 0.5 each. Applying Shannon’s entropy formula, we can calculate the entropy: I = −(0.5log(0.5) + 0.5log(0.5)). If we use logarithms to the base 2, substituting into the formula, we obtain I = 1 bit (the unit of entropy is bits). This means that, on average, each coin toss provides 1 bit of information.
Shannon entropy can also describe the richness of information. Take the string “0001000100010001…” as an example. Based on this string, we can calculate the probabilities of 0 and 1 appearing in the string. We find P(0) = 0.75, P(1) = 0.25. Still using logarithms to the base 2, we have I = −(0.75log0.75 + 0.25log0.25) ≈ 0.811. It is worth noting that some studies have provided more precise formulas for the value of K, such as depending on the length (b) of a finite alphabet, A, and considering K = 1/log2b.
I = −K ∑_{i=1}^{n} pi log pi (3)
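The two worked examples above (the fair coin and the string “0001000100010001…”) can be reproduced with a short script; the sketch below assumes K = 1 and base-2 logarithms, matching the examples in the text.
```python
import math
from collections import Counter

def shannon_entropy(symbols: str, base: float = 2.0, k: float = 1.0) -> float:
    """I = -K * sum(p_i * log(p_i)) over the empirical symbol probabilities (Equation (3))."""
    counts = Counter(symbols)
    total = len(symbols)
    h = -k * sum((c / total) * math.log(c / total, base) for c in counts.values())
    return h + 0.0  # normalize -0.0 to 0.0 for the degenerate single-symbol case

print(shannon_entropy("HT"))        # fair coin (heads/tails equally likely): 1.0 bit
print(shannon_entropy("HH"))        # double-headed coin: 0.0 bits (no new information)
print(shannon_entropy("0001" * 4))  # P(0) = 0.75, P(1) = 0.25: about 0.811 bits
```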

3.4. The Quantification and Application of Entropy in Social Science

Given the universality and stability of Shannon entropy in measuring uncertainty and complexity, it has been widely applied in the field of social sciences.

3.4.1. Psychological Entropy

As mentioned earlier, psychological entropy is used to describe the uncertainty and disarray in an individual’s mental state. Hirsh, Mar, and Peterson [22], drawing on Shannon entropy, have developed a method for calculating psychological entropy. Their entropy of uncertainty model (EUM) conceptualizes an individual’s perceptual and behavioral processes as a probability distribution. The perceptual process is understood as an individual’s interpretation of sensory input based on expectations, motivations, and prior experiences. Thus, it is possible to quantify a probability distribution of potential meanings and perceptual experiences from any given sensory input. Concurrently, an individual’s potential actions also follow a probability distribution [53]. Therefore, Hirsh et al. [22] propose that the uncertainty associated with a given perceptual or behavioral experience can be quantified using Shannon entropy, as shown in Equation (4). This formula reflects the negative logarithmic sum of the probabilities of each possible outcome. For instance, in a scenario with four potential outcomes, X1, X2, X3, and X4, if the probability of one outcome is significantly higher than the others, it implies a lower level of psychological entropy. Conversely, if the probabilities of all four outcomes are evenly distributed, it indicates a higher level of psychological entropy. FeldmanHall and Shenhav [6] also suggested that this method can be used to quantify an individual’s social uncertainty.
Entropy = −∑_{i=1}^{n} P(xi) log2 P(xi) (4)
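As an illustration of Equation (4), the sketch below contrasts the two cases described above: four outcomes with one dominant probability versus four equiprobable outcomes (the specific probability values are hypothetical).
```python
import math

def psychological_entropy(probabilities) -> float:
    """Shannon entropy (in bits) over a distribution of possible outcomes, per Equation (4)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One dominant outcome (X1 far more likely than X2-X4): lower psychological entropy.
print(psychological_entropy([0.85, 0.05, 0.05, 0.05]))  # about 0.85 bits
# Four evenly distributed outcomes: maximal psychological entropy for four options.
print(psychological_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```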

3.4.2. Organizational Entropy

To assess the sustainability of an organization, the concept of entropy can be utilized to quantify the level of understanding of the organizational system. Martínez-Berumen et al. [48] provide an approach for this. Initially, it is necessary to identify the organizational system to be evaluated. Subsequently, a range of organizational scenarios that can describe potential risk levels for the organization’s long-term sustainable development should be determined. Martínez-Berumen et al. [48] suggest considering up to 11 scenarios, ranging from Scenario 0 (indicating high risk) to Scenario 10 (indicating low risk). The next step involves identifying variables within each scenario that may contribute to uncertainty (e.g., innovation, talent, culture, leadership, structure, etc.). An assessment based on a specific scenario, such as innovation, is then conducted to obtain a probability distribution. This distribution is subsequently used in the calculation of Shannon entropy.
Martínez-Berumen et al. [48] also propose a quantitative indicator of organizational entropy, where K represents the number of defined scenarios (11 in this case): when (3/4)ln K < S ≤ ln K, the organizational system is highly chaotic; when (1/2)ln K < S ≤ (3/4)ln K, the organizational system is orderly, and organizations are advised to monitor the trend of “organizational sustainability” and determine whether any factors need strengthening; when 0 < S ≤ (1/2)ln K, the organizational system is highly orderly. Therefore, the entropy of an organizational system can be used to assess the risks faced by the organization and its long-term sustainability [48].
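Following this scenario-based approach, a minimal sketch could compute S from a probability distribution over the K = 11 scenarios and compare it against the reported thresholds; the distribution below is hypothetical and not data from the original study.
```python
import math

def organizational_entropy(scenario_probs) -> float:
    """Shannon entropy (natural logarithm) of a probability distribution over scenarios."""
    return -sum(p * math.log(p) for p in scenario_probs if p > 0)

def classify(s: float, k: int) -> str:
    """Compare S against the thresholds reported by Martínez-Berumen et al. [48]."""
    ln_k = math.log(k)
    if s > 0.75 * ln_k:
        return "highly chaotic"
    if s > 0.50 * ln_k:
        return "orderly (monitor sustainability trends)"
    return "highly orderly"

K = 11  # number of defined scenarios, from Scenario 0 (high risk) to Scenario 10 (low risk)
# Hypothetical assessment with most probability mass on low-risk scenarios.
probs = [0.01, 0.01, 0.01, 0.01, 0.01, 0.02, 0.03, 0.05, 0.10, 0.25, 0.50]
s = organizational_entropy(probs)
print(f"S = {s:.3f} nats, ln(K) = {math.log(K):.3f} -> {classify(s, K)}")
# S is about 1.487 nats, which falls between (1/2)ln K and (3/4)ln K, i.e., "orderly".
```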

3.4.3. Social Entropy

Social Entropy Theory (SET), proposed by Bailey [54,55], offers a framework for understanding and quantifying the disorder and uncertainty in social systems. Bailey suggests that social entropy can be assessed using a framework known as PILOTS [54,55]. Within the PILOTS framework, society is viewed as a bounded spatial region (S), characterized by its population (P) and various informational elements such as information (I) and technology (T). These variables collectively form a complex network and, through self-organization (O), achieve a level of entropy minimization, thereby optimizing the quality of life (L).
In the PILOTS framework, the elements do not directly involve specific information at the individual level but describe the attributes of the entire society through a series of macro variables. For instance, population (P) is subdivided into individuals with immutable characteristics, including gender (G), race (R), and age (A), collectively referred to as GRA. Therefore, we can assess a system’s social entropy based on this framework. For example, to evaluate the social entropy index of city A, we can quantify social entropy by assessing the diversity and complexity of different social groups, economic activities, cultural activities, the distribution of city resources (such as education, healthcare, and housing), and the city’s response and resilience to external shocks like economic crises and natural disasters. When city A possesses strong economic stability and adaptability, it often indicates lower social entropy; conversely, an uneven distribution of resources in the city can lead to an increase in social entropy.
Additionally, Takaguchi et al. [56] proposed using the information entropy method to predict sequences of conversations among individuals. Peng et al. [57] utilized Shannon entropy to examine how the topical focus of individual Twitter users compares with that of the entire system. Kulisiewicz et al. [58] suggested that entropy (calculated as first-order, second-order, and third-order entropy) can describe the dynamics of human communication mechanisms in social networks, which can help us observe and understand sociological processes in dynamic communities. Westbury et al. [59] used Shannon entropy to predict the humorous responses generated by meaningless strings (non-word strings, NWs). The results showed that Shannon entropy correctly predicts human judgments of NW funniness, demonstrating that perceived humor is a quantifiable function of how far the NWs are from being words.
In summary, the current approach to entropy calculation across various disciplines is primarily based on the concept of information entropy. This involves striving to ascertain the latent probability of a specific event occurring within a random process to facilitate the computation of entropy. However, quantifying the latent probability of an event’s occurrence is undoubtedly not a trivial task. This challenge hinders the quantification and calculation of entropy in certain disciplinary contexts. Consequently, integrating different disciplinary characteristics to adopt varied methods for entropy quantification is a complex and nuanced process. It necessitates a profound understanding of the nature of entropy and the inherent uncertainties involved.

4. How Do Individuals Cope with and Manage Uncertainty in Entropy Increase?

As mentioned, entropy is a fundamental concept used to describe ambiguity, disorder, complexity, and uncertainty within complex systems, and individuals possess an innate ability to reduce entropy in such environments. However, why can some individuals effectively cope with increasing entropy to achieve sustainable development while others are gradually “consumed” by it? We believe that the difference in outcomes depends on the individual’s ability to control entropy. In traditional research, psychologists and management scientists have attempted to answer this question by studying “uncertainty”. Although the disorder, complexity, and randomness inherent to entropy can trigger an individual’s perception of uncertainty [60], fundamentally, we consider the process of controlling and managing uncertainty, whether originating internally or externally, to be part of entropy management.
In the traditional field of uncertainty research, psychology offers integrative concepts and mid-level generalizations [18]. Uncertainty implies a lack of reliability, credibility, or adequacy [61]. Information characterized by uncertainty can lead to self-doubt in individuals and have a detrimental impact on their thoughts and behaviors [62]. Therefore, at the individual level, social (organizational) behavior has a critical and potent motivator, namely, the desire to reduce uncertainty [6].
In many behavioral theories, psychological uncertainty is considered an important mediator in human responses to unknown outcomes [63]. Psychological uncertainty is defined as a psychological structure that includes a variety of potential positive or negative psychological effects [64,65]. However, in most cases, uncertainty is seen as a negative influence, for example, inducing worry, anxiety sensitivity, fear of negative evaluation, perceptions of vulnerability, and avoidance of decision making in individuals [15,17]. Faced with these effects, individuals adopt different strategies and responses to mitigate the negative effects of uncertainty. Some individuals tend to use passive coping mechanisms, such as attention diversion to ignore uncertainty, thereby achieving emotional regulation [15], while others, although bravely acknowledging and confronting uncertainty, experience reduced action efficacy because of the resulting fear, anxiety, and disempowerment, accompanied by emotional dysregulation [66,67]. Further, some research indicates that an individual’s experiential permeability (EP) is a key factor determining whether they can positively cope with uncertain situations. In other words, if a person’s knowledge is complete, it is difficult for them to experience uncertainty [68], which, in turn, prompts them to discover and benefit from the positive effects hidden in uncertainty [69]. Therefore, individuals exhibit different responses to uncertainty depending on the specific environment, their personality traits, and their strategies for coping with uncertainty [70].
In the realm of decision-making research, Herbert A. Simon, in his early work [71], delved into the behavioral responses of decision-makers faced with uncertainty. He suggested that decision-makers should be viewed as having bounded rationality, largely because individuals cannot know all alternatives, hold uncertain attitudes toward exogenous events, and lack the ability to estimate outcomes. Kahneman and Tversky proposed that individuals’ rules of perception and intuitive judgments significantly affect their decision making in the face of uncertainty. They explored how individuals use heuristics in uncertain situations and the biases they are prone to in various judgment tasks, such as predictions and evaluations of evidence [72,73,74]. They also studied individuals’ loss aversion in riskless choices [75,76] and how estimates of the probability of uncertain outcomes in prospects become a determining factor in decision making (prospect theory) [76]. Subsequently, Kahneman [18] proposed that intuition and reasoning are alternative methods of problem solving and described the role of prototype heuristics in uncertain decision-making tasks. In recent years, FeldmanHall and Shenhav [6] drew on Bayesian thinking to propose three methods of reducing uncertainty: automatic inference, controlled inference, and social learning. Moreover, emotional regulation methods also determine individuals’ decision-making responses in situations of uncertainty (including adaptive or maladaptive strategies [14,16,66,77]).
Although the aforementioned studies focus on exploring what kind of irrational behaviors individuals exhibit in scenarios of uncertainty or what cognitive strategies and emotional regulation methods they can use to reduce uncertainty, they overlook the important capacity of individuals to consider prognostic activity as a meaningful variable, as well as the related goal setting and thinking processes [64]. In other words, discussions of uncertainty in decision-making contexts are mainly conducted within the frameworks of cognitive psychology and organizational decision research, lacking an examination of differences between decision-makers and research into related traits and abilities.
Therefore, the academic community has begun to focus on the important role of an individual’s psychological states/traits in influencing their decisions and responses to uncertainty. Since the 1990s, some scholars have identified the difficulty in handling uncertainty as a distinguishable personality trait, that is, a predisposition [78,79,80]. Consequently, many studies have turned to exploring individual (in)tolerance of uncertainty, which has received more extensive exploration within the discipline of clinical psychology [81,82].
Intolerance of uncertainty (IU) refers to the negative emotions or beliefs triggered in individuals because of the perception of lacking significant, critical, or sufficient information [83,84]. This tendency toward negative responses may manifest at the emotional, cognitive, and behavioral levels and is maintained by related perceptions of uncertainty [85]. Tolerance to uncertainty tends to describe an individual’s emotional response to their orientation toward an undetermined future [86]. Individuals with higher levels of IU view uncertainty as a source of stress, discomfort, fear, and conflict [87,88,89] and find it difficult to tolerate aversive experiences related to uncertainty [90]. Research shows that higher levels of IU are transdiagnostic risk factors for many clinical disorders, including anxiety, depression, obsessive–compulsive disorder, and eating disorders [91,92].
Tolerance and intolerance of uncertainty are key variables in the overall system of individual choices and decision-making regulation under conditions of uncertainty. The concept of tolerance for uncertainty proposed by Frenkel-Brunswik [93] is subject to substantial definitional heterogeneity. Although IU and TU were initially studied as traits in the fields of cognition and personality, over time they have gone from being viewed as two poles of the same conceptual continuum to being treated as partly independent constructs and dimensions of personality [94]. Tolerance of uncertainty (TU) emphasizes “tolerance”. Hillen et al. [15] note that to “tolerate” means “to allow (something that is bad, unpleasant, etc.) to exist, happen, or be done” or “to experience (something harmful or unpleasant) without being harmed”. This means that, in TU, the most an individual can do is to remain unaffected by negative events. Furthermore, where does the boundary of TU begin and end? Which responses to uncertain situations should be considered to constitute the phenomenon of TU itself, rather than merely being produced by TU, also remains a matter of debate.
Furthermore, there is debate over whether TU and IU can represent a stable personality trait that predisposes individuals to specific psychological responses [70,79,95]. Some of the literature suggests that TU is predominantly a psychological trait [79]; thus, these studies typically view TU as a measurable and stable construct and often omit exploring context-specific manifestations of uncertainty [15,96]. Where TU is explored as a modifiable state, the state of TU is influenced by either contextual or situational factors that may alter the individual’s TU condition [70,80]. Hillen et al. [70] developed a contemporary and comprehensive integrative model of uncertainty tolerance (IMUT) and suggested that exploring TU as either a trait or a state is appropriate.
In summary, Simon and Kahneman primarily studied individual decision-making behavior in uncertain situations in a cognitive framework [18,71]. Furthermore, both IU and TU reflect the emotional response dimension of those experiencing uncertainty, embodying an individual’s anticipation and interpretation of future outcomes under uncertain conditions. Therefore, although previous research has explored individual co** strategies and behavioral responses from the perspective of uncertainty, it does not explain which personality traits and co** methods enable individuals to proactively face uncertainty in organizational development/change. From this perspective, research on organizational entropy change can provide clearer, more comprehensive answers.
On the other hand, Hirsh et al. proposed a concept of psychological entropy at the individual level, used to describe the uncertainty and chaos of an individual’s mental state [22]. However, this research suggests that an increase in entropy is a sign prompting individuals to seek new strategies and solutions (problem solving). Like the aforementioned uncertainty research, it does not address under what conditions people seek more rather than less uncertainty, nor whether individual differences in uncertainty seeking reflect a positive feeling toward uncertainty itself or a desire for information and/or solutions to aversion to uncertainty.
Therefore, this paper posits that entropy change occurs at various stages of life and societal development, with entropy reduction being an innate tendency of living organisms that determines orderly individual development. We attempt to propose a meta-mindset at the individual level by analyzing and understanding the theoretical content of entropy, combining existing entropy research in psychology, and building on perceptual research on uncertainty. After redefining psychological entropy, we propose an entropy-based proactive control model. Through this model, we aim to deepen the understanding of how individuals with certain mental models can better face disorder, ambiguity, complexity, and uncertainty in situations such as organizational change and possess the ability to predict positive organizational outcomes.

5. Entropy-Based Proactive Control Model

Entropy and energy form the foundation of all natural processes, including human activities. Despite the fact that thermodynamics has been established for over a century and a half, no amount of technological advancement or theoretical innovation has been able to undermine its principles. This holds true even for the forward-thinking and revolutionary quantum theory [97]. Many physicists agree that the most convincing and encompassing laws in physics are embodied within the laws of thermodynamics. All interacting natural forces and processes adhere to the laws of energy and entropy. Therefore, entropy can serve as a focal point for the interdisciplinary unification of knowledge and, to some extent, embodies characteristics of the “Grand Unified Theory” that Einstein pursued throughout his life [98]. From the perspective of life development, the entropy reduction phenomenon inherent to human biological instincts may have the capacity to generalize entropy control tendencies and personality traits in organizational management contexts. This implies that individuals within organizations have an inherent motivation to actively reduce uncertainty (entropy reduction). Han et al. [60] regard uncertainty as a fundamental metacognitive state consisting of the conscious awareness of ignorance [99]. It arises from unconscious brain mechanisms, functioning independently of rational thought [100]. Therefore, we attempt to propose a meta-concept from the perspective of entropy to describe the proactive control of disorder and uncertainty within a system by individuals. This includes the mindset of individuals actively controlling entropy in organizational contexts and explains how individuals can proactively deal with uncertainty to achieve organizational success. We believe that psychological entropy has the rich connotation of integrating various organizational management constructs and can broadly predict and explain a variety of behaviors of individuals within organizations. To achieve this, it is imperative to first delineate and understand several key concepts.
Firstly, it is essential to properly understand increases and reductions in entropy. In the field of psychology, Hirsh et al. [22] proposed the entropy of uncertainty model (EUM). This model conceptualizes the realms of perceptual and behavioral uncertainty as probability distributions, revealing how individuals interpret sensory input based on expectations, motivations, and past experiences. The EUM emphasizes that individuals strive to reduce uncertainty to a manageable level, thereby alleviating psychological discomfort caused by the uncertainty of perception and behavior. However, this model appears to reflect the essence of determinism, exemplified by Newtonian thought: order dominates everything. The model underscores the individual’s motivation to actively seek to change the state of disorder [13,101]. However, humans are complex organisms, and maintaining equilibrium can involve two distinct types of activities: “preventing entropy increase” and “facilitating entropy reduction”. For example, in physical exercise, human muscle cells accelerate the breakdown of carbohydrates and fats to generate motion and energy. This process produces waste. If this waste is not expelled from the body through an open system, it is impossible to “prevent entropy increase”, leading to the collapse of the organic system. Simultaneously, physical exercise is also a process of “facilitating entropy reduction”. Through exercise, the muscle structure of the human body becomes more ordered and efficient. Muscle cells and the nervous system gradually adapt through repeated movement training, enhancing the coordination and efficiency of movement. Hence, the continuous attainment of coordination and proficiency in muscle tissues represents an individual’s effort to achieve entropy reduction. The life system encompasses both “preventing entropy increase” and “facilitating entropy reduction”, two complementary processes [102].
Secondly, dissipative structures, open systems, and entropy reduction are critical to the continuation of life. Prigogine introduced the concept of dissipative structures based on Bénard convection experiments, thereby revealing how structures, organizations, and order emerge in the face of anomalies, turbulence, disorder, and dissipation [103,104]. A dissipative structure refers to a complex and ordered structure that spontaneously forms in an open system far from thermodynamic equilibrium when the system reaches certain critical conditions through the exchange of matter and energy with its surroundings [105]. In fact, as hypothesized in thermodynamics, in an absolutely closed system, entropy tends toward its maximum because the system becomes increasingly disordered and incapable of coping. However, both life systems and organizational systems formed from numerous life systems are open systems composed of dissipative structures [106]. They continuously exchange with the external environment, allowing the system to adopt measures, such as actively drawing inputs (like energy) from the external environment, to reorganize internal disorder, thereby achieving entropy reduction. In other words, to survive and sustain development, it is first necessary to maintain an open state. The system provides conditions for the evolution of its complexity by constantly resisting, absorbing, or even transforming disorder. It can be said that, for a system’s order and organization to sustain development, it must possess the capacity to tolerate, utilize, and proactively regulate states of disorder [107].
Thirdly, emergence often occurs in the interplay between order and disorder, and it is key in advancing order and achieving entropy reduction within open systems [108]. Emergence is a unique phenomenon in open systems, referring to new and holistic properties or behaviors that arise from the interactions of the system’s various parts. These properties do not exist within the individual components of the system. Emergence can be understood from the perspective of the transition between order and disorder. Diversity embodies disorder, and disorder generates diversity; unity represents order, and thus, the unification of diversity is emergence. For instance, the large-scale aggregation of mass in the universe to form black holes is an example of emergence—a new structure. Similarly, a group of musicians playing randomly is akin to chaos and disorder, like grains of sand being scattered. However, under the conductor’s unified organization, they form an organic harmony, playing the same piece in a structured and regulated manner, resulting in a high degree of order—this is emergence. In this process, the active organization of the conductor (akin to doing work) is crucial. Hence, it is evident that emergence (the unification of diversity) is a core characteristic of the continuous development of an organization.
In summary, we believe that achieving the sustainable survival and development of organizations requires individuals to actively exercise their agency. This necessitates that they not only guard against the emergence of disorder but also respond to disorder in a manner consistent with the organization’s survival needs. This ensures that the system neither disintegrates because of disorder nor becomes rigid because of order. Ultimately, by managing both order and disorder, continuous emergence can be achieved [109]. Based on this premise, this paper introduces an entropy-based proactive control model.
Indeed, we believe that existing discussions based on entropy predominantly focus on the perspective of information entropy, that is, the disorder and chaos of the system, to explain specific issues. However, a precise understanding of the Second Law of Thermodynamics and the essence of entropy aids in better explaining the complex evolutionary processes of life systems and organizational systems. This paper attempts to draw upon these foundational principles and carefully grounds its theoretical transfer in the core essence of “entropy change”. First, entropy possesses the following characteristics [45,110,111,112,113]:
  • Entropy is a concept of maximum generality, applicable to any of the three worlds in Popper’s framework.
  • Entropy can be formalized as a state variable, a state function, or a state vector.
  • The magnitude of entropy’s change depends solely on the initial and final states.
  • Entropy is a parameter whose magnitude is inversely related to the degree of order.
  • Entropy is non-static: in a closed system, entropy spontaneously increases and never decreases.
  • The increase of global entropy (i.e., the entropy within a closed system) is irreversible.
  • Entropy is a macroscopic variable obtained by integrating over microstates, and it exhibits macroscopic irreversibility.
  • According to statistical thermodynamics formulas, entropy is a statistical quantity.
  • Entropy is an additive variable.
After a detailed clarification of the fundamental characteristics of entropy, we can now examine the entropy change process in the context of individuals and organizations based on the four core implications of entropy change:
(a) Entropy reduction occurs in open systems/dissipative systems.
(b) The higher the concentration of high-quality energy, the lower the entropy.
(c) When a system is in equilibrium, energy is most dispersed (the most configurations), resulting in higher entropy.
(d) The complexity of critical states/self-organization states is highest, leading to higher entropy.
Building upon these principles and an accurate grasp of entropy change, this paper proposes a model for explaining how organizations can achieve sustainable development by addressing disorder/uncertainty in transformational contexts, namely, an entropy-based proactive control model. This model integrates four core concepts from psychology and organizational behavior, each corresponding to one of the four key connotations of entropy change (learning orientation, goal orientation, change orientation, and risk taking), thereby playing a descriptive and predictive role in how individuals within organizations cope with the process of increasing entropy.
We believe that psychological entropy reflects the meta-mindset of individuals in proactively adapting, managing, regulating, and controlling the “entropy changes” within and outside an organization. Psychological entropy, through the adjustment of individual agency, drives the organizational system to break relative equilibrium, enhance organizational functional complexity, and achieve dynamic stability and high-level development at the organizational level.
Specifically, individuals within the organization exhibit a strong ability for continuous learning and active adaptation to new information, strategies, and methods. This meta-mindset motivates them to actively set strategic goals, continually advance in uncertain situations, and courageously take risks to facilitate adaptive evolution and innovation within the organizational system. When individuals possess a high level of psychological entropy, it often signifies their strong abilities in entropy adaptation and control. Next, we will introduce each of the four components of the entropy-based proactive control model.

5.1. Dissipative System and Learning Orientation

Prigogine [104] first introduced and detailed dissipative systems, which are the subject of research on how open systems interact with their environments [114]. As mentioned earlier, the entropy of isolated and closed systems only increases and never decreases. However, dissipative systems break away from the traditional closed system model, providing a framework for understanding how open systems generate order and structure from non-equilibrium conditions [10].
Unlike traditional closed systems, open systems can exchange energy, matter, and information with the external environment, allowing them to maintain non-equilibrium states. These states are variable and dynamic and can generate new ordered structures, known as dissipative structures. Prigogine and Stengers [115] further extended this “entropy reduction” framework to biological organisms, where individuals can also be viewed as open dissipative systems. Therefore, to maintain their stability and organizational structure, individuals must effectively transfer internal entropy to the external environment. This offers an important insight: individuals need to proactively construct an efficient dissipative system by continuously exchanging energy, matter, and information, maintaining non-equilibrium states and generating new ordered structures, and thus better managing changes in entropy.
Based on the fundamental properties of entropy and dissipative systems, we propose that psychological entropy should include the core component of learning orientation.
Learning orientation is a set of values that influences the degree to which proactive learning occurs [116]. Individuals with a learning orientation often possess an open mindset and a commitment to learning. They do not confine themselves to existing and fixed thought patterns; instead, they proactively embrace new knowledge and new experiences. Through the exchange of information between new and old knowledge, they break through tradition and generate creative thinking [116].
Furthermore, learning orientation encourages individuals to respond and adapt quickly to organizational contexts. It equips them with the ability to continuously enhance their competitive advantage within the organization through knowledge sharing, exchange, and absorption [117]. Previous research on individual responses to uncertainty has suggested that an individual’s tolerance of uncertainty is highly correlated with openness (i.e., experiential permeability [69,118,119]). Additionally, uncertainty reduction theory also posits that individuals have a motivation to actively acquire external information and resources to reduce uncertainty [120], particularly in continuously developing and changing scenarios, where information acquisition is crucial. Therefore, we propose that learning orientation is a crucial capability for individuals to proactively adapt to the uncertain organizational environment, making them a form of dissipative structure.
We believe that cognitive–behavioral systems fundamentally adhere to the same basic principles as other dissipative systems, and the sustainability of cognitive–behavioral systems depends on the ability of dissipative systems to reduce entropy. Individuals with a high learning orientation are precisely those who can promote knowledge absorption and information exchange both internally and externally through open thinking and a commitment to learning. They reshape their neural connections and activity patterns to respond to environmental challenges and uncertainties, ultimately maintaining their functionality and stability [121,122], thereby promoting entropy reduction. In conclusion, we believe that the entropy-based proactive control model should include an individual-level dissipative process, namely, learning orientation.

5.2. Concentrated Energy and Goal Orientation

According to the principles of thermodynamics, entropy (a measure of disorder) in a closed system is always increasing. However, based on the second core feature of entropy change, in an open system, the more concentrated the high-quality energy, the lower the entropy [123]. Prigogine [124] and Doll [125] also proposed that, as a system is injected with increasing amounts of energy, it will “transform” into a state far from thermodynamic equilibrium. Similarly, the activities of individuals/organizations should also be goal-oriented, focusing energy and effort more effectively, thereby achieving a more efficient “transformation,” i.e., entropy reduction. We believe that goal orientation offers a method of realizing this approach.
Goals are specific cognitive representations of an individual’s desires and can also be understood as states that intentionally guide behavior [126]. To realize a desire, individuals must set clear goals that gather focused energy toward the goal until it is ultimately achieved [127]. In organizational behavior research, goal orientation is typically conceptualized as a stable, trait-like personality disposition that varies among individuals and is measured as an individual difference variable [128]. From this trait perspective, goal orientation can initiate purposeful goal striving [129]. Goals often influence an individual’s perception and behavior by affecting the processing of goal-relevant information and the selection of behaviors [130,131,132]. Previous studies have shown that the higher an individual’s intolerance of uncertainty (IU), the lower their level of self-control, making it more difficult to anticipate future scenarios and thus undermining goal setting from a future time perspective [133]. By contrast, when individuals possess a high level of psychological entropy, the goal orientation it includes enables them to focus their activities more sharply, with greater purpose and with higher efficiency in terms of survival.
Individual differences in the proactive selection, determination, and pursuit of future goals directly influence organizational achievement [129]. When individuals are in a “goal-deficient” state, such as failing to set clear goals or abandoning existing goals without new goals to replace them, they experience high levels of entropy and the wastage of information resources [134]. In such instances, a clear goal framework as a behavioral guide is crucial. Although the process of establishing goals can introduce some uncertainty in the short term, as it requires mobilizing cognitive resources to identify new potential behavioral paths, once a new decision is perceived as promoting goal achievement, it becomes the individual’s dominant choice, thereby reducing entropy to a level lower than before.
Moreover, when individuals are in a state of “goal masking”, where their established goals are disrupted or obscured by uncertain organizational/environmental cues, they also experience heightened decision ambiguity and behavioral uncertainty, resulting in a high entropy state. In such situations, individuals with high goal orientation can proactively break down the currently obscured goals into a series of more specific sub-goals using dynamic programming techniques [135,136]. Subsequently, by tapping into a wealth of information resources (stemming from a learning orientation), they direct their focus toward these more specific sub-goals, thereby gaining localized and focused psychological energy. In summary, individuals can proactively manage and regulate uncertainty within an organization through goal orientation, adapting to the continuously changing organizational context. Therefore, we have incorporated goal orientation into the entropy-based proactive control model presented in this paper. We believe that having goal orientation enables individuals to proactively set or break down goals based on real situations, thereby stimulating stronger motivational drives. This leads to the acquisition of high-quality energy directed toward behaviors, facilitating the achievement of goals and the attainment of a state of entropy reduction.

5.3. Thermodynamic Equilibrium and Change Orientation

According to the third core feature of entropy, when a system is in equilibrium, it implies a more dispersed energy distribution, with the most configurations and the highest entropy, and the system undergoes no macroscopic changes [137]. To achieve entropy reduction, it is necessary to break this state of equilibrium, which is the process of emergence. Non-equilibrium states and nonlinear interactions within these states act as catalysts (key mechanisms) for emergence. Metaphorically, emergence refers to a phenomenon that exists in one dimension but not in another. Under non-equilibrium conditions, systems far from stable states may, through interactions between components, lead to the emergence of new structures and patterns [10]. Therefore, creating and maintaining a dis-equilibrium state in an organization is a requisite aspect of emergence [125,138,139,140]. Studies show that emergence is often triggered by “unconventional” activities/events (events occurring “outside the norm”), pushing the system into a highly dynamic state [125,141]. Lichtenstein and Plowman [21], analyzing three empirical studies on emergence within organizations [142,143,144], identified four constructs of emergence at successive organizational levels. These include a dis-equilibrium state, amplifying actions, recombination/self-organization, and stabilizing feedback. They argue that these four structures are necessary conditions for the emergence of a new order but not sufficient conditions. Dis-equilibrium could be caused by the proactive pursuit of new opportunities, threats, or crises from within the environment/system, or fluctuations that alter the entire organizational system. From this perspective, this paper proposes that individuals and organizations seeking high-quality evolution and development need to maintain and manage non-equilibrium states within the system, constantly driving emergence through changes. In other words, in an organizational context, individuals need to possess a “change orientation” trait. Therefore, we propose that change orientation constitutes a core component of our entropy-based proactive control model.
Organizational change refers to the change (reform) in an organization from its current state to a more optimized form [145]. The result of this is emergence within the organization. By definition, we find that change and emergence have many conceptual similarities; therefore, we believe that change orientation is a crucial factor in driving organizational emergence. Individuals with higher change orientation tend to exhibit greater adaptability [146], innovativeness [147], and an active approach to managing change [148]. Specifically, change orientation prompts individuals to proactively seek and drive systemic changes, regulate themselves to adapt to disorderly situations, and even proactively explore new possibilities from disorder [149,150,151]. Therefore, we incorporate change orientation into the entropy-based proactive control model proposed in this article. We believe that individuals with a change orientation can proactively embrace and manage change. In the context of change, individuals demonstrate greater adaptability, innovativeness, and openness by strengthening the clarity of their cognitive maps and goal structures, ultimately leading to the “process of emergence”.

5.4. Criticality and Risk Taking

According to the fourth major feature of entropy change, when a system is in a state of criticality/self-organized criticality, it exhibits the highest level of complexity and can rapidly evolve into new patterns. Criticality is seen as a kind of edge structure, which is neither completely ordered nor completely disordered. Emergence often occurs in states of criticality [152].
Criticality is often used to describe the critical points of phase transitions [153]. A phase transition describes the process of a material transforming between different states of matter. When the nature of the dominant feedback in a system changes, a phase transition occurs. Phase transitions are ubiquitous in nature and society, such as the transformation between ice and water or the succession of historical dynasties.
The characteristics of a system at its critical point are especially complex, manifested in the uncertainties of phase transitions, the incompleteness of information, and nonlinear interrelationships between elements [154]. For instance, in a sustained 0-degree Celsius environment, whether water freezes or ice melts is uncertain. In organizational management, even planned organizational changes often have randomness and uncertainty in their direction and outcomes [155,156], meaning that organizational change essentially involves risk. When organizations are in a critical state, they exhibit characteristics such as complexity, disorder, and uncertainty. In this context, an individual’s organizational behavior often involves significant risks. Therefore, how individuals balance anticipated returns and risks will determine their behavioral performance in change scenarios [157]. Risk taking is an important form of human behavior, long used to explain the adaptability of human actions and the rationality behind them [158]. Risk taking is defined as engagement in behaviors that are associated with some probability of undesirable results [159]. Burnes et al. [160] proposed that the pursuit of a goal-directed option that could result in multiple outcomes, including some that are undesirable or potentially hazardous, should be considered an instance of an individual’s risk-taking behavior [161]. Theories about risk taking can be broadly categorized into three types: the first explains which personality traits frequently lead to risk taking [162]; the second explains the differences between risk seeking and risk aversion (prospect theory [163]); and the third explains why some individuals take risks only in specific situations, because they value and believe in success in those scenarios [158,164].
Risk taking is a multi-dimensional concept, and its outcomes are not always positively oriented. Zinn [165] proposed that risk taking includes key dimensions like motivation, control, reflexivity, and develo** and protecting identity, which together influence the likelihood of an individual engaging in risk taking. It is noteworthy that risk taking can either be adaptive or maladaptive. When the benefits of certain activities are far outweighed by the potential harm, it is maladaptive. Conversely, it is adaptive as long as the opposite holds true [158]. Individuals can adapt successfully by systematically pursuing certain risks while avoiding others [166,167]. Therefore, we believe that individuals in an organization’s “critical state” need proactive risk taking to achieve a positive phase transition [168]. This prompts individuals to take initiative when facing uncertainty and potential negative consequences, actively responding to challenges.
Therefore, we include risk taking as the last core component of psychological entropy in our entropy-based proactive control model. We believe that individuals with a higher propensity for risk taking can fully assess and undertake risks in the uncertain environment of organizational change and hold the belief of “risk as value”. They engage in risk taking to protect or regain control over the organization, thus driving the organization’s critical state toward a desired positive direction [169,170]. Risk taking drives individuals to proactively adapt to situational changes, confront the organization’s critical state, strive to balance order and disorder to maintain the overall structure and function, and ultimately achieve a positive organizational phase transition, thereby realizing entropy reduction.
In summary, psychological entropy is a meta-mindset that reflects an individual’s proactive adaptation, management, regulation, and control of “entropy changes” within and outside an organization. The entropy-based proactive control model proposed in this article comprises four components: learning orientation, goal orientation, change orientation, and risk taking. We do not intend to redefine these four concepts; rather, based on the characteristics of entropy change and referencing the construction process of psychological capital [171], we conceptually integrate individual capabilities and tendencies for entropy control in change situations and propose a meta-concept that is richer in content and more broadly applicable. These four components constitute a higher-order structure that can predict an individual’s ability to proactively control entropy. They are sequential in time and together constitute an active, dynamic process of entropy management (as shown in Table 1). This theory not only explains how individuals within an organization can promote high-quality development by actively regulating, controlling, and adapting to uncertainties, but its rich content may also integrate existing constructs in organizational behavior.

6. Future Directions in Organizational Psychology of Entropy Research

The Second Law of Thermodynamics has been widely described as “one of the deepest and most perfect laws in physics” [172]. Entropy plays a central role in this law and has been crucial in interdisciplinary transfer and application, because it provides a framework for understanding and quantifying disorder and uncertainty in systems. However, the application and expression of entropy in different disciplines are shaped by the attributes of each discipline, and interdisciplinary research approaches themselves add complexity and disorder to knowledge [173]. For example, scholars often introduce too many subjective descriptions when attempting to transfer knowledge from another discipline into their own field, because different scholars’ understandings of the same concept are filtered through their interpretative frameworks and knowledge backgrounds, which can lead to contradictory concept definitions [174]. Indeed, whenever interdisciplinary concepts are introduced, regardless of the field, high entropy and uncertainty are introduced along with them. Under such a high-entropy research approach, many authoritative but conflicting views have emerged [174,175], further exacerbating confusion and disorder and allowing already high knowledge entropy to grow further. Therefore, we hope that this article prompts more scholars toward a comprehensive understanding of, and deep reflection on, this “dominant” concept, so that its interdisciplinary impact and practical applications can be assessed comprehensively. Entropy and energy carry universal significance across disciplines, which gives them value in every field [176]. Thus, attempting to explain the complex world through the concept of entropy, with its shared interdisciplinary value, may help reduce such knowledge confusion.
This article aims to integrate the concept of entropy with organizational psychology on the basis of a systematic analysis of entropy and an accurate grasp of its essence. First, based on the core characteristics of entropy, we outline the four major attributes of entropy change [45,110,111,112]. We then propose a meta-mindset, psychological entropy, which may integrate multiple organizational or psychological concepts, in order to explore a possible entropy control mechanism, similar to a “Maxwell’s demon”, that can drive individuals to proactively achieve entropy reduction [177]. We construct the entropy-based proactive control model from a dynamic perspective—which can simulate and predict how individuals manage and control entropy in organizational environments and proactively respond to uncertainties brought about by organizational changes—in order to achieve high-quality sustainable development at both the individual and organizational levels.
More specifically, our entropy-based proactive control model encompasses four dimensions: learning orientation, goal orientation, change orientation, and risk taking. Firstly, individuals with a learning orientation can be viewed as dissipative systems, equipped to proactively absorb new knowledge and exchange information. The essence of learning lies in guiding actions, and here a higher level of goal orientation becomes vitally important. It enables individuals to proactively organize and integrate information resources, thereby enhancing the efficiency with which their cognitive interpretive frameworks utilize information (entropy reduction). During this process, goals provide motivation for individual behavior. Moreover, high-quality organizational development requires individuals with a strong change orientation. Change orientation encourages individuals to proactively adapt to, manage, and regulate changing circumstances (entropy reduction) with change goals in mind, ultimately leading to emergence. However, organizational change often results in a state of organizational criticality, and criticality implies complexity and disorder. Therefore, risk taking enables individuals to bravely confront the risks associated with organizational phase transitions, proactively adapting to situational changes to address the challenges of criticality and thus fostering high-quality organizational development. The process of regulating psychological entropy is a prerequisite for survival and sustained development, and it is a key factor in individuals actively adapting to organizational environments, optimizing decisions, and facilitating personal growth. In this process, psychological entropy and its four dimensions can, to some extent, integrate and explain the behavioral responses of decision-makers in uncertain scenarios. Moreover, although the entropy-based proactive control model includes only four dimensions, we believe that the rich connotation of psychological entropy is sufficient to integrate more organizational behavior variables and to predict and explain a series of potential outcome variables (as shown in Figure 1). For example, psychological entropy reflects an individual’s ability and tendency to actively control uncertainty and thus has broad predictive utility and integration capacity for other organizational management variables that can reduce uncertainty, such as lean spirit (reflecting an individual’s autonomous motivation to reduce resource wastage, improve work efficiency, and continuously enhance work quality). This awaits further empirical research for substantiation.
Although this article proposes the concept of psychological entropy and its four dimensions, it does not delve into the methods of measuring psychological entropy. Clearly, when we rely on the concept of entropy to create a new construct in organizational behavior studies, traditional scale development and self-reporting can be utilized for measurement [178,179]. This is a highly stable and reliable quantitative path and can even be used to verify whether the proposal of these four dimensions has statistical justification. Therefore, in the future, this method can be used to develop a set of measurement tools.
Furthermore, Shannon’s quantitative formula for entropy also offers a series of new ideas for quantification [5,22]. In previous research, uncertainty about whether an event would occur was quantified mainly through the variance of the predictions themselves (known as risk when predicting possible rewards) [180,181], but FeldmanHall and Shenhav [6] suggested using Shannon entropy to compute (non)social uncertainty [5] and quantifying total uncertainty (nonsocial + social) with the conditional entropy method. As mentioned above, the calculation of Shannon entropy is based on the probabilities of the events occurring within a system. Similarly, we can quantify the entropy of a variable within an organization based on the concept of probability. Taking learning orientation as an example, suppose we surveyed 100 employees and obtained a score for each employee on the learning orientation dimension.
For instance, using the Likert scale method, we obtain a dataset consisting of a series of ratings, and this paper offers two measurement approaches. Firstly, we can calculate the probability distribution of participants across the 1–7-point rating scale and then input this into the aforementioned formula to obtain the corresponding entropy value. Alternatively, we can set a categorization threshold to classify the dataset into “high” and “low” categories. The threshold can be set based on the data distribution and research purposes, for example using percentiles, the median, the mean, the standard deviation, natural data segmentation points, or the extreme grouping method (27%); Zou et al. also proposed a sampling-based threshold auto-tuning method (machine learning) for imbalanced classification [182]. Suppose that, according to the classification threshold, we distinguish 60 people with higher learning orientation and 40 with lower learning orientation; their probabilities are then 0.6 and 0.4, respectively, and can be used in the Shannon entropy formula. The same method can be applied to calculate the entropy of learning orientation, goal orientation, change orientation, and risk taking. Since these four variables are conceptually independent and together constitute a higher-order concept, this paper proposes a possible calculation method: joint entropy.
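To make the two single-dimension approaches concrete before turning to joint entropy, the following Python sketch computes the Shannon entropy of learning orientation scores, first over the full 1–7 rating distribution and then after a high/low split. The data are randomly generated for illustration only, and the function and variable names are ours rather than any established measure:

import numpy as np

def shannon_entropy(probabilities, base=2):
    # H = -sum(p * log p) over the non-zero probabilities of a discrete distribution
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p) / np.log(base)))

# Hypothetical 1-7 Likert ratings of learning orientation for 100 employees
rng = np.random.default_rng(42)
scores = rng.integers(1, 8, size=100)

# Approach 1: entropy of the full distribution over the seven rating categories
counts = np.bincount(scores, minlength=8)[1:]        # frequencies of ratings 1..7
h_full = shannon_entropy(counts / counts.sum())

# Approach 2: dichotomize at a threshold (here the median) into "high" vs. "low"
p_high = np.mean(scores > np.median(scores))
h_binary = shannon_entropy([p_high, 1.0 - p_high])   # a 60/40 split gives about 0.971 bits

print(f"Entropy over 7 rating categories: {h_full:.3f} bits")
print(f"Entropy of the high/low split:    {h_binary:.3f} bits")

With the 60/40 split described above, the binary entropy is approximately 0.971 bits, whereas a perfectly even 50/50 split would yield the maximum of 1 bit.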
Joint entropy measures the total entropy of multiple variables and provides information about the uncertainty of the entire system [5]. Suppose there are two events, X and Y, and let p(i, j) be the probability that the first event takes value i and the second event takes value j simultaneously. The joint entropy can then be expressed as in Equation (5). Since the four dimensions of psychological entropy are conceptually independent, we can calculate their joint entropy by adding the information entropy of each dimension and then subtracting the interference of their mutual information [183]. Let us denote learning orientation, goal orientation, change orientation, and risk taking as A, B, C, and D, respectively. The calculation of psychological entropy then proceeds as follows: H(A, B, C, D) = H(A) + H(B) + H(C) + H(D) − I(A, B) − I(A, C) − I(A, D) − I(B, C) − I(B, D) − I(C, D). Here, H(A), H(B), H(C), and H(D) represent the entropies of the four dimensions, respectively, while I denotes the mutual information between variables.
Mutual information (MI) quantifies the degree of dependency between two variables; it indicates how much information about one variable reduces the uncertainty of the other [183]. In information theory, mutual information measures the amount of information shared between two random variables, as shown in Equation (6). Here, P(A, B) represents the joint probability distribution of A and B, while P(A) and P(B) are their marginal probability distributions. Taking I(A, B) as an example: since A and B are relatively independent and each can be classified into a “high” or “low” state, we can calculate joint probabilities such as P(A = high, B = high), P(A = high, B = low), P(A = low, B = high), and P(A = low, B = low). The marginal probabilities are obtained as, for example, P(A = high) = P(A = high, B = high) + P(A = high, B = low). By calculating the joint and marginal probabilities for each pair of variables in the same way, we can complete the calculation with the formula above (a sketch of this procedure follows Equations (5) and (6) below).
H(X, Y) = −Σ_{i,j} p(i, j) log p(i, j)        (5)
I(X, Y) = Σ_{x∈X} Σ_{y∈Y} P(x, y) log [ P(x, y) / (P(x) P(y)) ]        (6)
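As a complement to Equations (5) and (6), the sketch below illustrates one way the combination rule proposed above (summing the four entropies and subtracting the pairwise mutual information terms) could be computed from binarized high/low scores. The synthetic data and the A–D labels are purely illustrative, and the code follows the formula proposed in this paper rather than a full four-way joint entropy:

import numpy as np
from itertools import combinations

def entropy(probabilities, base=2):
    # Shannon entropy of a discrete probability distribution
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p) / np.log(base)))

def binary_entropy(x):
    # Entropy H(X) of a binary high (1) / low (0) indicator variable
    p = np.mean(x)
    return entropy([p, 1.0 - p])

def mutual_information(x, y):
    # I(X, Y) = sum over (a, b) of P(a, b) * log2[P(a, b) / (P(a) P(b))], as in Equation (6)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_xy = np.mean((x == a) & (y == b))          # joint probability
            p_x, p_y = np.mean(x == a), np.mean(y == b)  # marginal probabilities
            if p_xy > 0:
                mi += p_xy * np.log2(p_xy / (p_x * p_y))
    return mi

# Hypothetical high/low classifications of 100 employees on the four dimensions:
# A = learning orientation, B = goal orientation, C = change orientation, D = risk taking
rng = np.random.default_rng(7)
dims = {name: rng.integers(0, 2, size=100) for name in "ABCD"}

h_sum = sum(binary_entropy(x) for x in dims.values())
mi_sum = sum(mutual_information(dims[a], dims[b]) for a, b in combinations("ABCD", 2))

# H(A, B, C, D) = H(A) + H(B) + H(C) + H(D) - all pairwise mutual information terms
psychological_entropy = h_sum - mi_sum
print(f"Psychological entropy (proposed combination): {psychological_entropy:.3f} bits")

In this sketch, the more the four high/low indicators co-vary, the larger the mutual information terms become and the more the combined value falls below the simple sum of the four entropies.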
Measuring the Shannon entropy of a concept within an organization can provide insights into the distribution and diversity of that concept in the organization [113]. For example, by quantifying the psychological entropy of individuals, we can reveal how psychological entropy is distributed across different teams or departments. This can serve as a basis for optimizing resource allocation, implementing changes, and developing plans within an organization. However, calculating Shannon entropy requires defining and quantifying the “states” or “levels” of these abstract concepts, which is a challenge in itself and offers a direction for subsequent research. Although we have proposed two methods for quantifying the psychological entropy introduced in this paper, we encourage the use of more statistical measurement methods to quantify the entropy of certain constructs in organizational psychology, thereby providing powerful quantitative tools for future empirical research.
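As a final illustration of this team-level reading, the small sketch below uses entirely hypothetical department data to show how the high/low distribution of a construct could be compared across units; the department names and values are invented for demonstration:

import numpy as np

def shannon_entropy(probabilities, base=2):
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p) / np.log(base)))

# Hypothetical high (1) / low (0) psychological entropy classifications per department
departments = {
    "R&D":       [1, 1, 1, 0, 1, 1, 0, 1, 1, 1],
    "Sales":     [1, 0, 1, 0, 0, 1, 0, 1, 0, 0],
    "Logistics": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
}

for name, flags in departments.items():
    p_high = float(np.mean(flags))
    h = shannon_entropy([p_high, 1.0 - p_high])
    print(f"{name:10s} P(high) = {p_high:.2f}, entropy = {h:.3f} bits")

Under this reading, a department whose members are uniformly high (or uniformly low) on a construct shows low entropy, whereas a department with a mixed profile shows high entropy, which could inform where targeted interventions or resource reallocation are most relevant.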

7. Concluding Remarks

Since Clausius first introduced the concept of entropy in 1865, the concept has been transferred to and applied in a wide range of disciplines. However, in psychology and organizational management psychology, there is still no comprehensive definition or quantification method for entropy and the entropy change process. This gap hinders the in-depth study of entropy and entropy theory at the psychological level and in organizational contexts. This article adopts a positive and proactive perspective, suggesting that individuals can act as dissipative structures, proactively controlling and regulating perceived entropy, which has significant positive implications for the development of both individuals and organizations.
Specifically, we propose a four-dimensional, entropy-based proactive control model, as these four dimensions can be explained by entropy, and, simultaneously, these dimensions can explain all human organizational behaviors. In detail, first, all human activities can be understood as learning processes; humans must grow through learning, thereby achieving entropy reduction. Second, human activities must be goal-oriented; goals focus energy, maintain order, and achieve entropy reduction. Third, the realization of goals is accomplished through change, making change orientation a concrete path to achieving goals and reducing entropy. Fourth, implementing change, causing phase transitions, and fostering emergence inevitably involve facing uncertainties and encountering risks. Hence, individuals must be capable of risk taking, motivating them to pursue certain risks while avoiding others to achieve positive adaptive outcomes (as shown in Figure 2). These four dimensions collectively embody how individuals adapt to uncertain scenarios proactively, aiming to regulate and control internal entropy.
In summary, theoretically, the proposition of psychological entropy provides a new framework for understanding disorder, chaos, and uncertainty (entropy increase) within individual situations. This also offers a new theoretical method for studying dynamic changes in individuals and organizations. Moreover, the introduction of psychological entropy reflects the process of interdisciplinary integration, showcasing the potential of interdisciplinary research in explaining complex human behavior. Practically, the proposition of an entropy-based proactive control model can guide organizations in better understanding and managing employees’ behaviors and attitudes during transformational implementations, thereby motivating active participation in the organizational change process. Additionally, we believe that psychological entropy is not only a key determinant of individual potential for sustained development in organizational changes but also an important indicator for predicting various positive organizational behaviors. We believe the study and practice of psychological entropy have great potential to help engender a more orderly and harmonious world for all.

Author Contributions

Conceptualization, L.W. and H.J.; methodology, L.W. and H.J.; resources, L.W.; writing—original draft preparation, L.W. and H.J.; writing—review and editing, L.W. and H.J.; supervision, L.W.; project administration, L.W. All authors have read and agreed to the published version of the manuscript.

Funding

We acknowledge the funding from National Natural Science Foundation of China grant #31971013 and a Beijing Well-being Foundation grant #0020344 to L. Wang.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bennett, N.; Lemoine, G.J. What a difference a word makes: Understanding threats to performance in a VUCA world. Bus. Horiz. 2014, 57, 311–317. [Google Scholar] [CrossRef]
  2. Schneider, M.; Somers, M. Organizations as complex adaptive systems: Implications of complexity theory for leadership research. Leadersh. Q. 2006, 17, 351–365. [Google Scholar] [CrossRef]
  3. Aoki, I. Entropy principle for human development, growth and aging. J. Theor. Biol. 1991, 50, 215–223. [Google Scholar] [CrossRef] [PubMed]
  4. Clausius, R. Ueber Verschiedene für die Anwendung Bequeme Formen der Hauptgleichungen der Mechanischen Wärmetheorie: Vorgetragen in der Naturforsch. Gesellschaft den 24. April 1865; Verlag Nicht Ermittelbar: München, Germany, 1865. [Google Scholar]
  5. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  6. FeldmanHall, O.; Shenhav, A. Resolving uncertainty in a social world. Nat. Hum. Behav. 2019, 3, 426–435. [Google Scholar] [CrossRef]
  7. Bynum, T.W. Norbert Wiener and the rise of information ethics. In Information Technology and Moral Philosophy; Cambridge University Press: Cambridge, UK, 2008; pp. 8–25. [Google Scholar] [CrossRef]
  8. Gell-Mann, M.; Tsallis, C. (Eds.) Nonextensive Entropy: Interdisciplinary Applications; Oxford University Press: Oxford, UK, 2004. [Google Scholar] [CrossRef]
  9. Landsberg, P.T. Is equilibrium always an entropy maximum? J. Stat. Phys. 1984, 35, 159–169. [Google Scholar] [CrossRef]
  10. Prigogine, I.; Stengers, I. Order out of Chaos: Man’s New Dialogue with Nature; Verso Books: New York, NY, USA, 2018. [Google Scholar]
  11. O’Connor, J. On the two contradictions of capitalism. Capital. Nat. Social. 1991, 2, 107–109. [Google Scholar] [CrossRef]
  12. Smith, R.C. Uncertainty Quantification: Theory, Implementation, and Applications; Siam: Philadelphia, PA, USA, 2013; Volume 12. [Google Scholar]
  13. Thietart, R.A.; Forgues, B. Complexity science and organization. In The Sage Handbook of Complexity and Management; Sage Publications: Thousand Oaks, CA, USA, 2011; Volume 2, pp. 53–64. [Google Scholar]
  14. Rogier, G.; Garofalo, C.; Velotti, P. Is emotional suppression always bad? A matter of flexibility and gender differences. Curr. Psychol. J. Divers. Perspect. Divers. Psychol. Issues 2019, 38, 411–420. [Google Scholar] [CrossRef]
  15. Hillen, M.A.; Gutheil, C.M.; Strout, T.D.; Smets, E.M.A.; Han, P.K.J. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare. Soc. Sci. Med. 2017, 180, 62–75. [Google Scholar] [CrossRef]
  16. Peña-Sarrionandia, A.; Mikolajczak, M.; Gross, J.J. Integrating emotion regulation and emotional intelligence traditions: A meta-analysis. Front. Psychol. 2015, 6, 160. [Google Scholar] [CrossRef]
  17. Hong, R.Y.; Cheung, M.W.-L. The structure of cognitive vulnerabilities to depression and anxiety: Evidence for a common core etiologic process based on a meta-analytic review. Clin. Psychol. Sci. 2015, 3, 892–912. [Google Scholar] [CrossRef]
  18. Kahneman, D. Maps of bounded rationality: Psychology for behavioral economics. Am. Econ. Rev. 2003, 93, 1449–1475. [Google Scholar] [CrossRef]
  19. Arrow, K.J. Information and Economic Behavior; Federation of Swedish Industries: Stockholm, Sweden, 1973; Volume 28. [Google Scholar]
  20. March, J.G. How decisions happen in organizations. Hum. Comput. Interact. 1991, 6, 95–117. [Google Scholar] [CrossRef]
  21. Lichtenstein, B.B.; Plowman, D.A. The Leadership of Emergence: A Complex Systems Leadership Theory of Emergence at Successive Organizational Levels. Leadersh. Q. 2009, 20, 617–630. [Google Scholar] [CrossRef]
  22. Hirsh, J.B.; Mar, R.A.; Peterson, J.B. Psychological entropy: A framework for understanding uncertainty-related anxiety. Psychol. Rev. 2012, 119, 304. [Google Scholar] [CrossRef] [PubMed]
  23. Peterson, J.; Bomberg, E. Decision-Making in the European Union; Bloomsbury Publishing: London, UK, 1999. [Google Scholar]
  24. Collins, M.W.; Dougal, R.C.; Koenig, C.; Ruddock, I. (Eds.) Kelvin, Thermodynamics and the Natural World; WIT Press: Southampton, UK, 2015; Volume 10. [Google Scholar]
  25. Broda, E. Ludwig Boltzmann; Deutscher Verlag d. Wiss: Berlin, Germany, 1957. [Google Scholar]
  26. Gibbs, J.W. Elementary Principles in Statistical Mechanics: Developed with Especial Reference to the Rational Foundations of Thermodynamics; C. Scribner’s Sons: New York, NY, USA, 1902. [Google Scholar]
  27. Frenkel, D.; Warren, P.B. Gibbs, Boltzmann, and negative temperatures. Am. J. Phys. 2015, 83, 163–170. [Google Scholar] [CrossRef]
  28. Schrödinger, E. What Is Life? The Physical Aspect of the Living Cell and Mind; Cambridge University Press: Cambridge, UK, 1944. [Google Scholar]
  29. Wiener, N. Cybernetics or Control and Communication in the Animal and the Machine; MIT Press: New York, NY, USA, 1961; Volume 25. [Google Scholar]
  30. Kailath, T. Norbert Wiener and the development of mathematical engineering. Curr. Sci. 1996, 71, 261–274. [Google Scholar]
  31. Downarowicz, T. Entropy in Dynamical Systems; Cambridge University Press: Cambridge, UK, 2011; Volume 18. [Google Scholar]
  32. Lorenz, H.W. Nonlinear Dynamical Economics and Chaotic Motion; Springer: Berlin/Heidelberg, Germany, 1993; Volume 334. [Google Scholar]
  33. Bergström, R.M.; Nevanlinna, O. An entropy model of primitive neural systems. Int. J. Neurosci. 1972, 4, 171–173. [Google Scholar] [CrossRef]
  34. Fagerholm, E.D.; Dezhina, Z.; Moran, R.J.; Friston, K.J.; Turkheimer, F.; Leech, R. Selection entropy: The information hidden within neuronal patterns. Phys. Rev. Res. 2023, 5, 023197. [Google Scholar] [CrossRef]
  35. Hancock, F.; Rosas, F.E.; Mediano, P.A.; Luppi, A.I.; Cabral, J.; Dipasquale, O.; Turkheimer, F.E. May the 4C’s be with you: An overview of complexity-inspired frameworks for analysing resting-state neuroimaging data. J. R. Soc. Interface 2022, 19, 20220214. [Google Scholar] [CrossRef]
  36. Keshmiri, S. Entropy and the brain: An overview. Entropy 2020, 22, 917. [Google Scholar] [CrossRef]
  37. Mišić, B.; Betzel, R.F.; Nematzadeh, A.; Goni, J.; Griffa, A.; Hagmann, P.; Flammini, A.; Ahn, Y.Y.; Sporns, O. Cooperative and competitive spreading dynamics on the human connectome. Neuron 2015, 86, 1518–1529. [Google Scholar] [CrossRef] [PubMed]
  38. Saxe, G.N.; Calderone, D.; Morales, L.J. Brain entropy and human intelligence: A resting-state fMRI study. PLoS ONE 2018, 13, e0191582. [Google Scholar] [CrossRef]
  39. Breakspear, M. Dynamic models of large-scale brain activity. Nat. Neurosci. 2017, 20, 340–352. [Google Scholar] [CrossRef] [PubMed]
  40. Kruglanski, A.W.; Pierro, A.; Mannetti, L.; De Grada, E. Groups as epistemic providers: Need for closure and the unfolding of group-centrism. Psychol. Rev. 2006, 113, 84. [Google Scholar] [CrossRef] [PubMed]
  41. Stephen, D.G.; Boncoddo, R.A.; Magnuson, J.S.; Dixon, J.A. The dynamics of insight: Mathematical discovery as a phase transition. Mem. Cogn. 2009, 37, 1132–1149. [Google Scholar] [CrossRef]
  42. Stephen, D.G.; Dixon, J.A.; Isenhower, R.W. Dynamics of representational change: Entropy, action, and cognition. J. Exp. Psychol. Hum. Percept. Perform. 2009, 35, 1811. [Google Scholar] [CrossRef] [PubMed]
  43. Excerpt from Robert Alun Jones. In Emile Durkheim: An Introduction to Four Major Works; Sage Publications, Inc.: Beverly Hills, CA, USA, 1986; pp. 24–59.
  44. Bailey, K.D. Social Entropy Theory; State University of New York (SUNY) Press: New York, NY, USA, 1990. [Google Scholar]
  45. Dinga, E.; Tănăsescu, C.R.; Ionescu, G.M. Social entropy and normative network. Entropy 2020, 22, 1051. [Google Scholar] [CrossRef]
  46. Testa, B.; Kier, L.B. Emergence and Dissolvence in the Self-organisation of Complex Systems. Entropy 2000, 2, 1–25. [Google Scholar] [CrossRef]
  47. Webb, E.; Weick, K.E. Unobtrusive measures in organizational theory: A reminder. Adm. Sci. Q. 1979, 24, 650–659. [Google Scholar] [CrossRef]
  48. Martínez-Berumen, H.A.; López-Torres, G.C.; Romo-Rojas, L. Developing a method to evaluate entropy in organizational systems. Procedia Comput. Sci. 2014, 28, 389–397. [Google Scholar] [CrossRef]
  49. Neves, A.; Godina, R.; Azevedo, S.G.; Pimentel, C.; Matias, J.C.O. The potential of industrial symbiosis: Case analysis and main drivers and barriers to its implementation. Sustainability 2019, 11, 7095. [Google Scholar] [CrossRef]
  50. Kast, F.E.; Rosenzweig, J.E. General Systems Theory: Applications for Organization and Management. Acad. Manag. J. 1972, 15, 447–466. [Google Scholar] [CrossRef]
  51. Bondar, A.; Bushuyev, S.; Bushuieva, V.; Onyshchenko, S. Complementary Strategic Model for Managing Entropy of the Organization. In CEUR Workshop Proceedings; CEUR: Aachen, Germany, 2021; pp. 293–302. [Google Scholar]
  52. Lawrence, P.R.; Lorsch, J.W. Differentiation and integration in complex organizations. Adm. Sci. Q. 1967, 12, 1–47. [Google Scholar] [CrossRef]
  53. Cisek, P.; Kalaska, J.F. Neural mechanisms for interacting with a world full of action choices. Annu. Rev. Neurosci. 2010, 33, 269–298. [Google Scholar] [CrossRef] [PubMed]
  54. Bailey, K.D. System entropy analysis. Kybernetes 1997, 26, 674–688. [Google Scholar] [CrossRef]
  55. Bailey, K.D. Boundary maintenance in living systems theory and social entropy theory. Syst. Res. Behav. Sci. Off. J. Int. Fed. Syst. Res. 2008, 25, 587–597. [Google Scholar] [CrossRef]
  56. Takaguchi, T.; Nakamura, M.; Sato, N.; Yano, K.; Masuda, N. Predictability of conversation partners. Phys. Rev. X 2011, 1, 011008. [Google Scholar] [CrossRef]
  57. Peng, S.; Li, J.; Yang, A. Entropy-based social influence evaluation in mobile social networks. In Algorithms and Architectures for Parallel Processing: 15th International Conference, ICA3PP 2015, Zhangjiajie, China, 18–20 November 2015, Proceedings, Part I 15; Springer: Berlin/Heidelberg, Germany, 2015; pp. 637–647. [Google Scholar]
  58. Kulisiewicz, M.; Kazienko, P.; Szymanski, B.K.; Michalski, R. Entropy measures of human communication dynamics. Sci. Rep. 2018, 8, 15697. [Google Scholar] [CrossRef]
  59. Westbury, C.; Shaoul, C.; Moroschan, G.; Ramscar, M. Telling the world’s least funny jokes: On the quantification of humor as entropy. J. Mem. Lang. 2016, 86, 141–156. [Google Scholar] [CrossRef]
  60. Han, P.K.; Klein, W.M.; Arora, N.K. Varieties of uncertainty in health care: A conceptual taxonomy. Med. Decis. Mak. 2011, 31, 828–838. [Google Scholar] [CrossRef]
  61. Ellsberg, D. Risk, ambiguity, and the Savage axioms. Q. J. Econ. 1961, 75, 643–669. [Google Scholar] [CrossRef]
  62. Matta, F.K.; Scott, B.A.; Colquitt, J.A.; Koopman, J.; Passantino, L.G. Is consistently unfair better than sporadically fair? An investigation of justice variability and stress. Acad. Manag. J. 2017, 60, 743–770. [Google Scholar] [CrossRef]
  63. Fehr, E.; Camerer, C.F. Social neuroeconomics: The neural circuitry of social preferences. Trends Cogn. Sci. 2007, 11, 419–427. [Google Scholar] [CrossRef]
  64. Kornilova, T.V.; Chumakova, M.A.; Kornilov, S.A. Tolerance and intolerance for uncertainty as predictors of decision making and risk acceptance in gaming strategies of the Iowa gambling task. Psychol. Russ. 2018, 11, 86. [Google Scholar] [CrossRef]
  65. Osman, M. Controlling uncertainty: A review of human behavior in complex dynamic environments. Psychol. Bull. 2010, 136, 65. [Google Scholar] [CrossRef]
  66. Aldao, A.; Nolen-Hoeksema, S.; Schweizer, S. Emotion-regulation strategies across psychopathology: A meta-analytic review. Clin. Psychol. Rev. 2010, 30, 217–237. [Google Scholar] [CrossRef]
  67. Anderson, E.C.; Carleton, R.N.; Diefenbach, M.; Han, P.K. The relationship between uncertainty and affect. Front. Psychol. 2019, 10, 2504. [Google Scholar] [CrossRef]
  68. Windschitl, P.D.; Wells, G.L. Measuring psychological uncertainty: Verbal versus numeric methods. J. Exp. Psychol. Appl. 1996, 2, 343. [Google Scholar] [CrossRef]
  69. Fergus, T.A.; Rowatt, W.C. Intolerance of uncertainty and personality: Experiential permeability is associated with difficulties tolerating uncertainty. Personal. Individ. Differ. 2014, 58, 128–131. [Google Scholar] [CrossRef]
  70. Durrheim, K.; Foster, D. Tolerance of ambiguity as a content specific construct. Personal. Individ. Differ. 1997, 22, 741–750. [Google Scholar] [CrossRef]
  71. Simon, H.A. A behavioral model of rational choice. Q. J. Econ. 1955, 69, 99–118. [Google Scholar] [CrossRef]
  72. Tversky, A.; Kahneman, D. Availability: A heuristic for judging frequency and probability. Cogn. Psychol. 1973, 5, 207–232. [Google Scholar] [CrossRef]
  73. Tversky, A.; Kahneman, D. Judgment under Uncertainty: Heuristics and Biases: Biases in judgments reveal some heuristics of thinking under uncertainty. Science 1974, 185, 1124–1131. [Google Scholar] [CrossRef]
  74. Kahneman, D.; Slovic, P.; Tversky, A. Judgment under Uncertainty: Heuristics and Biases; Cambridge University Press: Cambridge, UK, 1982. [Google Scholar]
  75. Kahneman, D.; Knetsch, J.L.; Thaler, R.H. Anomalies: The endowment effect, loss aversion, and status quo bias. J. Econ. Perspect. 1991, 5, 193–206. [Google Scholar] [CrossRef]
  76. Tversky, A.; Kahneman, D. Advances in prospect theory: Cumulative representation of uncertainty. J. Risk Uncertain. 1992, 5, 297–323. [Google Scholar] [CrossRef]
  77. Gross, J.J. The emerging field of emotion regulation: An integrative review. Rev. Gen. Psychol. 1998, 2, 271–299. [Google Scholar] [CrossRef]
  78. Hodson, G.; Sorrentino, R.M. Uncertainty orientation and the Big Five personality structure. J. Res. Personal. 1999, 33, 253–261. [Google Scholar] [CrossRef]
  79. Koerner, N.; Dugas, M.J. An investigation of appraisals in individuals vulnerable to excessive worry: The role of intolerance of uncertainty. Cogn. Ther. Res. 2008, 32, 619–638. [Google Scholar] [CrossRef]
  80. Herman, J.L.; Stevens, M.J.; Bird, A.; Mendenhall, M.; Oddou, G. The tolerance for ambiguity scale: Towards a more refined measure for international management research. Int. J. Intercult. Relat. 2010, 34, 58–65. [Google Scholar] [CrossRef]
  81. Rosen, N.O.; Ivanova, E.; Knäuper, B. Differentiating intolerance of uncertainty from three related but distinct constructs. Anxiety Stress Coping 2014, 27, 55–73. [Google Scholar] [CrossRef]
  82. Tobin, S.J.; Loxton, N.J.; Neighbors, C. Coping with causal uncertainty through alcohol use. Addict. Behav. 2014, 39, 580–585. [Google Scholar] [CrossRef]
  83. Dugas, M.J.; Buhr, K.; Ladouceur, R. The Role of Intolerance of Uncertainty in Etiology and Maintenance. In Generalized Anxiety Disorder: Advances in Research and Practice; Heimberg, R.G., Turk, C.L., Mennin, D.S., Eds.; The Guilford Press: New York, NY, USA, 2004; pp. 143–163. [Google Scholar]
  84. Dugas, M.J.; Robichaud, M. Cognitive-Behavioral Treatment for Generalized Anxiety Disorder: From Science to Practice; Routledge/Taylor & Francis Group: London, UK, 2007. [Google Scholar]
  85. Carleton, R.N. Fear of the unknown: One fear to rule them all? J. Anxiety Disord. 2016, 41, 5–21. [Google Scholar] [CrossRef] [PubMed]
  86. Furnham, A.; Marks, J. Tolerance of ambiguity: A review of the recent literature. Psychology 2013, 4, 717–728. [Google Scholar] [CrossRef]
  87. Dugas, M.J.; Gagnon, F.; Ladouceur, R.; Freeston, M.H. Generalized anxiety disorder: A preliminary test of a conceptual model. Behav. Res. Ther. 1998, 36, 215–226. [Google Scholar] [CrossRef] [PubMed]
  88. Furnham, A. A content, correlational and factor analytic study of four tolerance of ambiguity questionnaires. Personal. Individ. Differ. 1994, 16, 403–410. [Google Scholar] [CrossRef]
  89. Ladouceur, R.; Gosselin, P.; Dugas, M.J. Experimental manipulation of intolerance of uncertainty: A study of a theoretical model of worry. Behav. Res. Ther. 2000, 38, 933–941. [Google Scholar] [CrossRef] [PubMed]
  90. Buhr, K.; Dugas, M.J. The intolerance of uncertainty scale: Psychometric properties of the English version. Behav. Res. Ther. 2002, 40, 931–945. [Google Scholar] [CrossRef]
  91. Carleton, R.N.; Mulvogue, M.K.; Thibodeau, M.A.; McCabe, R.E.; Antony, M.M.; Asmundson, G.J. Increasingly certain about uncertainty: Intolerance of uncertainty across anxiety and depression. J. Anxiety Disord. 2012, 26, 468–479. [Google Scholar] [CrossRef]
  92. Renjan, V.; McEvoy, P.M.; Handley, A.K.; Fursland, A. Stomaching uncertainty: Relationships among intolerance of uncertainty, eating disorder pathology, and comorbid emotional symptoms. J. Anxiety Disord. 2016, 41, 88–95. [Google Scholar] [CrossRef]
  93. Frenkel-Brunswik, E. Intolerance of ambiguity as an emotional and perceptual personality variable. J. Personal. 1949, 18, 108–143. [Google Scholar] [CrossRef]
  94. Kornilova, T.V.; Kornilov, S.A. Intelligence and tolerance/intolerance for uncertainty as predictors of creativity. Psychol. Russ. State Art 2010, 3, 240–256. [Google Scholar] [CrossRef]
  95. Yap, A.; Johanesen, P.; Walsh, C. Moderators uncertainty tolerance (UT) in healthcare: A systematic review. Adv. Health Sci. Educ. 2023, 28, 1409–1440. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Correlation variables of psychological entropy.
Figure 2. Dynamic mechanism of entropy-based proactive control model.
Table 1. Four dimensions of an entropy-based proactive control model.
Four Characteristics of Entropy Change | Psychological Entropy Dimension
Entropy reduction occurs in open (dissipative) systems | Learning orientation
The more concentrated the energy, the lower the entropy | Goal orientation
The equilibrium state has the highest entropy and the most dispersed energy | Change orientation
Complexity is highest at the critical state, where entropy is also relatively high | Risk taking
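To make the second row of Table 1 more concrete, the minimal sketch below illustrates one way the "more concentrated energy, lower entropy" characteristic could be operationalized at the individual level, using Shannon entropy over a person's allocation of working time across tasks. The task labels and counts are invented for illustration only and are not a measure proposed in this paper; any real operationalization of psychological entropy would require validated indicators.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a discrete distribution given raw counts."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical time allocations (hours per week) across four work activities.
# A dispersed allocation spreads effort evenly; a goal-focused allocation
# concentrates effort on a single task.
dispersed = {"email": 5, "report": 5, "meetings": 5, "planning": 5}
focused   = {"email": 1, "report": 17, "meetings": 1, "planning": 1}

print(f"Dispersed allocation: {shannon_entropy(dispersed.values()):.2f} bits")  # 2.00 bits
print(f"Focused allocation:   {shannon_entropy(focused.values()):.2f} bits")    # about 0.85 bits
```

Read this way, the goal-oriented (concentrated) allocation yields a lower entropy value than the evenly dispersed one, which is the intuition the second row of Table 1 is meant to capture.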
