By Donna Hooft

1 Introduction 

Artificial Intelligence (AI) is one of the fastest-rising tools in applied medicine and may well revolutionize healthcare by providing efficient and effective solutions to challenges in the medical field (King, 2023). The human organism is an integrated network of physiological systems that collaborate across multiple temporal and spatial scales (Ivanov, 2021). Neurovascular coupling, for instance, can be described as an interaction between the vascular and the neuronal systems (Girouard & Iadecola, 2006).


Image from Ivanov, 2021

For AI, and neural networks in particular, to be of help with the biggest medical challenges, human physiology has to be learned. This gives rise to a fundamental question: Can a Neural Network Learn Physiology?

Or more specifically: Does a Neural Network have to be able to learn physiology in order to be useful in these medical challenges?

Answering these fundamental questions is crucial for advancing the AI revolution in medicine. Therefore, in search of answers, the following sections will thoroughly examine the human physiolome and the use of AI in physiology.


2 Human Physiology

Physiology is the branch of biology that focuses on the functions of and mechanisms within an organism (Physiology | definition & bodily function | Britannica, 2024), in contrast to anatomy, in which structure and organization are central (Anatomy | definition, history, & biology | Britannica, 2024).

Anatomy and physiology are closely intertwined, especially in the human body. Take the vascular system, for example. Blood is pumped to the lungs for oxygen absorption, then circulated to muscles and tissues to supply oxygen before returning to the lungs. This demonstrates how the function of a red blood cell (RBC) depends on its environment and location in the body. In summary, besides taking dynamic system states into account, physiology must also consider the anatomical context of bodily functions.

So far, it has been shown that neural networks can effectively learn and detect anatomy in medical imaging, even with scarcely annotated data (Menkovski, Aleksovski, Saalbach, & Nickisch, 2015). The TotalSegmentator of Wasserthal et al. (2023), for instance, automatically segments 104 different anatomical structures in CT images with high robustness and accuracy. However, it has not yet been demonstrated that neural networks can effectively learn physiology (Ivanov, 2021), which raises the question of whether this is actually possible.

The field of Network Physiology has emerged to address these fundamental questions. It emphasizes the importance of coordination and interactions among organ systems (Bashan et al., 2012; Ivanov and Bartsch, 2014). A physiological system can also be referred to as an organ system; however, the field aims to shift the focus from single organ systems toward understanding how networks of systems generate global behavior at the organism level (Ivanov, 2021).

2.1 Network Physiology

Humans are complex systems which, like any other system, can be modeled in terms of nodes and edges. For example, consider nodes A, B, and C, connected by edges e, f, and g. Nodes represent entities, and edges their interactions.

This system with three nodes can, for example, represent three friends (nodes A-C) and their moods, and their interactions (edges e-g). If A and B gossip (e) about C, it negatively affects C's mood. However, if B and C engage in a friendly conversation (g) it positively impacts both. The important thing to take away from this example is that both the nodes and the connections are dynamic, affecting each other reciprocally.

Let's apply the network model to neurovascular coupling. Consider A as a neuron, B as an astrocyte, and C as a pericyte, a contractile cell wrapped around the endothelial lining of the brain's small vessels. When the neuron's activity rises, it signals to the astrocyte that it will need a higher supply of oxygen and glucose. The astrocyte, in turn, releases signaling molecules that make the pericyte relax and widen the vessel, resulting in a higher blood supply to the neuron.

Conversely, when neural activity or blood pressure drops, the pericyte constricts the vessel, reducing the blood supply to the neurons (Girouard & Iadecola, 2006). Here, again, both the interactions between the three entities (the connections) and the cell states (the nodes) are dynamic.

This is exactly what matters in learning physiology: it is not just about grasping causal relationships between individual entities, but also about understanding both the dynamic interactions and the dynamic states of those entities. Depending on the physiological scale considered, the entities can represent cells, tissues, organs, or organ systems. Such approaches must consider: 1) the dynamic nature of individual systems (network nodes), 2) the dynamical aspects of network links that depict real-time organ communications, 3) the evolution of organ interactions over time, and 4) the development of collective network behavior in response to changes in physiological states and conditions (Ivanov, 2021).
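To make these four requirements a bit more tangible, here is a minimal, purely illustrative sketch (not taken from any of the cited works) of a three-node network in which both the node states and the link strengths evolve over time. The update rules and parameter values are invented for illustration only.

```python
import numpy as np

# Toy adaptive network: 3 nodes (A, B, C) with scalar states and weighted links.
# Both the states (nodes) and the coupling strengths (links) change over time,
# which is the defining property of an adaptive dynamical network.
rng = np.random.default_rng(0)
states = rng.uniform(-1, 1, size=3)      # node states, e.g. activity or "mood"
weights = np.full((3, 3), 0.2)           # link strengths between the nodes
np.fill_diagonal(weights, 0.0)           # no self-links

dt, steps = 0.1, 200
for _ in range(steps):
    # Node update: each state is pulled toward its neighbours' states, weighted
    # by the current link strengths; tanh keeps the states bounded.
    states = np.tanh(states + dt * (-0.5 * states + weights @ states))

    # Link update: links between nodes with similar states strengthen, links
    # between dissimilar nodes weaken (a Hebbian-like, purely illustrative rule).
    diff = np.abs(np.subtract.outer(states, states))
    weights = np.clip(weights + dt * 0.1 * ((1.0 - diff) - weights), 0.0, 1.0)
    np.fill_diagonal(weights, 0.0)

print("final node states:   ", np.round(states, 3))
print("final link strengths:\n", np.round(weights, 3))
```

Even in this toy example, the final link strengths depend on how the node states evolved and vice versa, which is precisely the reciprocal dynamic highlighted above.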



2.2 Major Challenges  

Although systems biology and integrative physiology have made notable progress, our comprehension of how various systems dynamically interact to produce physiological states remains limited (Bashan et al., 2012; Ivanov and Bartsch, 2014).

There are several reasons for this, some of which are described below.

Physiological systems in the human body have complex nonlinear relationships; any network that learns their connections therefore has to be adaptable. The underlying physiological interactions also occur on different spatiotemporal scales, which makes it harder to establish a single network that describes the various spatial and temporal dynamics.

On the one hand, the human body is a huge data generator, producing continuous streams of data on scales from milliseconds to years (Stankovski et al., 2015); think of heart rate, hormone levels, and blood pressure. This makes it difficult to determine which data segments to focus on, or on what scale the physiology should be studied, for the result to be useful. Essentially, physiology is not limited to a single scale of study; it can be investigated across various scales depending on the specific application or area of interest.

On the other hand, continuous data streams from human physiological systems are not yet available across different physiological states and conditions (Finazzi et al., 2018). The main reason for this is the difficulty of recording and storing such data in clinical settings (Ivanov, 2021).

Finally, the major challenge to learning physiology is that there is not yet a well-established analytical or computational tool in which such a framework for human physiology can be built (Ivanov, 2021). It would have to handle multiple continuous data streams, on different spatiotemporal scales and from system levels ranging from cells to organs. Could AI, or more specifically neural networks, provide a solution for this?

Ultimately, it comes down to the fact that the human body is incredibly complex, and modeling physiology will therefore remain challenging. However, if physiological subsystems can be modeled, perhaps in the future the entire human physiology can be modeled as well.

2.3 Current progress

Multiple methods are already in development to quantify nonlinear interactions between pairs of dynamical systems, among them mutual information, phase synchronization, coherence, complex wavelets, transfer entropy, and Granger causality (Ivanov, 2021).
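As a small illustration of one of these measures, the sketch below estimates the mutual information between two synthetic "physiological" signals by binning them into histograms. The signals and the number of bins are arbitrary choices here; real studies use more careful estimators.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of the mutual information (in bits) of two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

# Two synthetic signals: y is a noisy, delayed copy of x, so they share information.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * 0.5 * t) + 0.2 * rng.standard_normal(t.size)
y = np.roll(x, 50) + 0.2 * rng.standard_normal(t.size)

print(f"MI(x, y)     = {mutual_information(x, y):.3f} bits")
print(f"MI(x, noise) = {mutual_information(x, rng.standard_normal(t.size)):.3f} bits")
```

The coupled pair shares far more information than the unrelated pair, which is the kind of signature these measures look for between organ systems.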

Additionally, the time delay stability (TDS) method can be used to detect and measure physiological interactions between diverse organ systems with different output signals (Bashan et al., 2012). TDS has already proven useful for determining cortico-muscular connectivity (Rizzo, Garcia-Retortillo, & Ivanov, 2022; Rizzo et al., 2023).

Adaptive dynamical networks could also provide answers to the big challenges surrounding physiology, as they are able to incorporate nonlinear, adapting connections on different temporal scales in combination with adaptable network states, or nodes (Berner, Gross, Kuehn, Kurths, & Yanchuk, 2023).

Thanks to the methods described above and other advances in Network Physiology, the understanding of physiological states and functions has increased drastically. By adopting a network perspective, studies have investigated interactions on multiple levels, from cells to tissues to organ systems, working toward the ultimate goal: the Human Physiolome (Ivanov, 2021).

The term "human physiolome" merges "physiology" and "-ome" to encompass all bodily functions and processes and how they collaborate to maintain a functioning organism. But to return to our initial question, can a neural network learn physiology? We first have to become acquainted with AI, neural networks in particular, and with how these can be applied to learning physiology.



3 Neural Networks in Physiology/Medicine  

3.1 What are Neural Networks? 

Before diving into whether a neural network can learn human physiology, or be of help in developing the human physiolome, it is important to understand what a neural network is.

Simply put, an artificial neural network (ANN) is a computer algorithm based on how human nerve cells process information (Agatonovic-Kustrin & Beresford, 2000; Stern, 1996).

Note that the term "artificial neural network" is used, rather than "neural network" (NN), to avoid confusion with the neural networks in the human brain. Within the scope of this blog post, however, the term "neural network" will refer to an ANN, i.e. the computer algorithm, not a biological neural network. The table below shows the analogies between artificial neural networks and biological neural networks in the (human) brain (Guresen & Kayakutlu, 2011).

An ANN consists of multiple layers of neurons, also known as processing units or elements within the computer algorithm. Each neuron receives input, processes it, and then produces an output passed on to the next layer. The input layer is the first layer of the network, responsible for receiving input data, while the output layer is the final layer, producing the network's output or prediction  (Guresen & Kayakutlu, 2011).

For a visual representation, please refer to the image below, which illustrates a simple neural network with an input layer, an output layer, and one hidden layer. In practice, NNs can include thousands of 'neurons' and hundreds to thousands of hidden layers, although increased complexity does not always result in improved performance. The connections between layers, depicted as black lines in the example, can represent various mathematical operations such as convolutions, dropout layers, and more (Aloysius & Geetha, 2017).

Table from: (Guresen & Kayakutlu, 2011)

Neural networks are trained with optimization methods that use a loss function to measure how much the output deviates from the desired output; the parameters of the hidden layers are then adjusted to reduce this deviation on future inputs. Unsupervised deep learning networks, however, do not have a desired output but aim to identify patterns within the data (Baldi, 2012).
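To make this concrete, here is a minimal sketch of a network with one hidden layer, trained with gradient descent and backpropagation on a toy task (learning XOR). The architecture, task, and hyperparameters are chosen purely for illustration.

```python
import numpy as np

# Toy dataset: XOR, a classic example that a network without a hidden layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # input layer  -> hidden layer
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # hidden layer -> output layer
lr = 0.5                                          # learning rate

for step in range(5000):
    # Forward pass: each layer transforms its input and passes it on.
    h = np.tanh(X @ W1 + b1)                      # hidden activations
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))        # sigmoid output in (0, 1)

    # Loss: mean squared deviation from the desired output.
    loss = np.mean((out - y) ** 2)

    # Backpropagation: gradients of the loss with respect to each parameter.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # Gradient descent: nudge the parameters in the direction that lowers the loss.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
print("predictions:", out.round(2).ravel())
```

After training, the predictions approach the desired outputs 0, 1, 1, 0 precisely because the loss was driven down step by step.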


Applications of neural networks include computer vision, which encompasses tasks such as object identification and image segmentation, as well as natural language processing, evident in widely used translators such as DeepL (DeepL, [25-05-2024]) and chatbots such as ChatGPT (ChatGPT, [25-05-2024]). Both have the potential to be useful in medicine.

3.2 NNs for physiology

Now that neural networks are clarified, let's revisit the initial question: Can they effectively learn physiology?

The primary challenges in learning physiology stem from the dynamic interactions of diverse networks across varying spatiotemporal scales.  Additionally, there are vast amounts of data streams from different physiological networks within the body, originating from different scales. 

To address the issue of dynamic interactions, adaptive dynamical networks have been proposed. Adaptive networks are dynamical networks whose topologies and states coevolve, and they appear across complex systems such as neural and biological networks (Sayama et al., 2013). In other words, they have dynamic states (nodes) as well as dynamic interactions (connections) (Berner, Gross, Kuehn, Kurths, & Yanchuk, 2023).

Additionally, it is possible to embed multiple networks within one another, often referred to as hierarchical networks (Clauset, Moore, & Newman, 2008); this could be used to account for physiological interactions on different spatiotemporal scales. This kind of subnetworking also makes it possible to analyze different aspects of the input data, for example data on different temporal scales originating from different physiological networks, which makes hierarchical networks more reliable and applicable in physiological settings (Mavrovouniotis & Chang, 1992).
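As a rough illustration of this idea (not based on any specific published architecture), the sketch below embeds two small sub-networks, one for a fast signal and one for a slow signal, inside a parent network that combines their outputs. All layer sizes, names, and signal interpretations are invented.

```python
import torch
import torch.nn as nn

class SubNet(nn.Module):
    """A small sub-network dedicated to one physiological signal / time scale."""
    def __init__(self, in_features: int, hidden: int = 16):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )

    def forward(self, x):
        return self.layers(x)

class HierarchicalNet(nn.Module):
    """Parent network that embeds the two sub-networks and fuses their features."""
    def __init__(self):
        super().__init__()
        self.fast_branch = SubNet(in_features=100)  # e.g. a 100-sample heart-rate window
        self.slow_branch = SubNet(in_features=10)   # e.g. 10 hourly hormone measurements
        self.head = nn.Linear(16 + 16, 1)           # combine both scales into one prediction

    def forward(self, fast_signal, slow_signal):
        features = torch.cat([self.fast_branch(fast_signal),
                              self.slow_branch(slow_signal)], dim=-1)
        return self.head(features)

model = HierarchicalNet()
fast = torch.randn(4, 100)     # batch of 4 fast-scale windows
slow = torch.randn(4, 10)      # batch of 4 slow-scale windows
print(model(fast, slow).shape)  # torch.Size([4, 1])
```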

3.2.1 Hierarchical and Adaptive Dynamics in Neural Networks

When modeling physiological systems, employing hierarchical and adaptive dynamic networks makes sense. Hierarchical systems are suitable because physiological networks often consist of multiple spatial and temporal subnetworks. Additionally, adaptive dynamic networks are also appropriate since physiology involves changing states and interactions.

Berner et al. note that neural networks are inherently adaptive dynamical networks because they self-adjust during the learning phase through gradient descent and backpropagation.1 

Therefore, we should not ask whether a neural network can implement an adaptive dynamical network; it inherently functions as one thanks to its adaptive states and connections. Hierarchical architectures, too, have already been implemented in neural networks (Clauset, Moore, & Newman, 2008). We can therefore infer that neural networks already embody several solutions to the challenges that come with learning physiology, and that, in theory, they should be capable of learning it.

Why, then, are we still asking this question? Primarily because it has not been demonstrated extensively yet. This can partly be explained by the challenges described in Section 2.2, but also by the fact that networks employed in physiological contexts do not always learn physiology itself. The following sections explore several studies in which physiological systems are examined.

1 Gradient descent adjusts the parameters to minimize the error, while backpropagation computes how each parameter contributed to the error detected during training; this is repeated iteratively until the network learns to make accurate predictions or classifications.

3.2.2 Modeling Sepsis with a Two-Layer Network Model

It is possible to learn the physiology behind a certain disease or condition, as has been demonstrated in the case of sepsis (Berner, Sawicki, Thiele, Löser, & Schöll, 2022). In this study, the authors implemented an adaptive dynamical two-layer network consisting of a parenchymal and an immune cell layer. The physiology is modeled through a system of coupled oscillators representing the interactions and synchronization between the layers.

The interactions between these layers are mediated by cytokines, with adaptive coupling representing the dynamic changes in the system. The model captures the transition from healthy (synchronized) states to pathological (desynchronized) states as the disease progresses.

The model parameters are informed by existing physiological and clinical data, but the study primarily focuses on theoretical and computational insights rather than direct empirical validation against a specific ground truth.  However, this model successfully captures the interactions and dysregulation seen in sepsis on a cellular scale, demonstrating that an adaptive dynamic network can learn physiology. 

The network architecture is specifically designed for modeling sepsis; however, the general approach of using hierarchical and adaptive dynamical networks can potentially be adapted to other physiological conditions involving complex interactions between different cell types and signaling molecules.
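A heavily simplified sketch of this kind of model is shown below: two layers of phase oscillators, standing in for parenchymal and immune cells, with slowly adapting intra-layer coupling and fixed inter-layer coupling. The equations follow the general form of adaptive phase-oscillator models; the specific parameters and update rules here are illustrative and do not reproduce the published model.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 50                                      # oscillators per layer
phi = rng.uniform(0, 2 * np.pi, (2, N))     # phases: layer 0 = parenchymal, layer 1 = immune
kappa = rng.uniform(-1, 1, (2, N, N))       # adaptive intra-layer coupling weights
sigma, eps, dt = 1.0, 0.01, 0.01            # inter-layer coupling, adaptation rate, time step
# Natural frequencies are set to zero (identical oscillators) for simplicity.

def order_parameter(phases):
    """Kuramoto order parameter: 1 = fully synchronized, 0 = incoherent."""
    return np.abs(np.mean(np.exp(1j * phases)))

for _ in range(10000):
    for layer in range(2):
        other = 1 - layer
        diff = np.subtract.outer(phi[layer], phi[layer])      # pairwise phase differences
        intra = -(kappa[layer] * np.sin(diff)).mean(axis=1)   # adaptive intra-layer forcing
        inter = -sigma * np.sin(phi[layer] - phi[other])      # coupling to the other layer
        phi[layer] = phi[layer] + dt * (intra + inter)
        # The coupling weights themselves evolve, much more slowly than the phases.
        kappa[layer] = kappa[layer] - dt * eps * (kappa[layer] + np.sin(diff))

print("synchronization, parenchymal layer:", round(order_parameter(phi[0]), 3))
print("synchronization, immune layer:     ", round(order_parameter(phi[1]), 3))
```

In the published model, shifts between synchronized and desynchronized regimes of such layers are what represent the transition between healthy and septic states.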


Image from (Berner, Sawicki, Thiele, Löser, & Schöll, 2022).

3.2.3 Learning Network Connectivity Dynamics using Time Delay Stability

Rizzo et al. investigated dynamic physiological interactions between brain waves and muscle activity in both control groups and Parkinson's patients using the time delay stability (TDS) method (Rizzo, Garcia-Retortillo, & Ivanov, 2022; Rizzo et al., 2023). TDS does not rely on neural networks or other machine learning techniques; instead, it analyzes the stability of interactions among different physiological systems based on time delays (Bashan et al., 2012).

Both works use TDS to model the dynamic network connectivity between brain waves and muscular activity. Specifically, the connectivity is modeled by examining the synchronization and coordination of the cortico-muscular interactions. The studies capture how these interactions change across different states, such as sleep stages or neurodegenerative conditions. 

These studies are closely interrelated, as the first provides a reference for the second. In the first paper, Rizzo et al. (2022) establish a baseline for cortico-muscular connectivity across different sleep stages. In the second, this baseline is compared to the connectivity in Parkinson's patients. They found that the connectivity patterns between brain and muscles differ across sleep stages, and that these patterns of change differ between healthy people and Parkinson's patients, possibly explaining sleep issues in the latter.

In these studies, the dynamic interactions between two organ systems were investigated without the use of neural networks or adaptive dynamical networks, showing that dynamic networks can be studied without relying on those methods. Moreover, TDS is versatile: it can be used to study a wide range of physiological interactions and can be applied to various conditions (Ivanov, 2021). Integrating TDS with neural networks could potentially accelerate its application in network physiology.
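A rough sketch of the idea behind TDS is shown below, simplified from the description in Bashan et al. (2012): the signals are split into short windows, the time delay that maximizes their cross-correlation is found in each window, and a link is considered "stable" in stretches where this delay stays approximately constant. The window length, thresholds, and test signals used here are arbitrary.

```python
import numpy as np

def window_delay(a, b, max_lag):
    """Lag (in samples) at which the cross-correlation of two windows is largest."""
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(a[max_lag + lag: len(a) - max_lag + lag],
                         b[max_lag: len(b) - max_lag])[0, 1] for lag in lags]
    return lags[int(np.argmax(corrs))]

def tds_fraction(x, y, win=200, max_lag=20, tol=2, run=4):
    """Fraction of windows lying in runs of at least `run` consecutive windows
    whose delays differ by at most `tol` samples (the 'stable' stretches)."""
    n_win = len(x) // win
    delays = [window_delay(x[i * win:(i + 1) * win], y[i * win:(i + 1) * win], max_lag)
              for i in range(n_win)]
    stable = np.zeros(n_win, dtype=bool)
    for i in range(n_win - run + 1):
        segment = delays[i:i + run]
        if max(segment) - min(segment) <= tol:
            stable[i:i + run] = True
    return stable.mean()

# Test signals: y follows x with a roughly constant delay, plus noise.
rng = np.random.default_rng(3)
t = np.arange(6000)
x = np.sin(2 * np.pi * t / 120) + 0.3 * rng.standard_normal(t.size)
y = np.roll(x, 10) + 0.3 * rng.standard_normal(t.size)

print(f"%TDS coupled pair:   {100 * tds_fraction(x, y):.1f}%")
print(f"%TDS unrelated pair: {100 * tds_fraction(x, rng.standard_normal(t.size)):.1f}%")
```

The coupled pair yields a much higher fraction of stable windows than the unrelated pair, which is how TDS distinguishes genuine physiological links from noise.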

 




Image from Rizzo, Garcia-Retortillo, & Ivanov, 2022



4 Learning Potential vs. Practical Utility 

4.1 Tackling physiological challenges without learning physiology

Image from (Mambetsariev et al., 2023).


AI is already used widely in medical applications. AI-driven networks are used for drug matching and treatment outcome predictions (Mambetsariev et al., 2023). Deep neural networks have shown promising results in cancer diagnosis through intelligent image analysis, outperforming traditional methods (Munir, Elahi, Ayub, Frezza, & Rizzi, 2019). 

Furthermore, neural networks can detect and diagnose seizures in EEG data (Acharya, Oh, Hagiwara, Tan, & Adeli, 2018). All in all, neural networks are widely employed in complex medical diagnosis systems, facilitating disease detection, classification, and drug compatibility testing (Lin, Vasilakos, Tang, & Yao, 2016).

In these cases, neural networks learn patterns from data to help with specific tasks, such as disease detection and diagnostic support, but we cannot conclude that the physiology behind these diseases has been learned. This leads to the question: must neural networks learn physiology to be useful in medical contexts, or for building the human physiolome?

4.2 What has to be learned?

The examples in the previous section (Acharya, Oh, Hagiwara, Tan, & Adeli, 2018; Lin, Vasilakos, Tang, & Yao, 2016; Mambetsariev et al., 2023; Munir, Elahi, Ayub, Frezza, & Rizzi, 2019) demonstrate that neural networks can mimic physiological behavior and support physiological tasks, even if their learning process does not amount to learning physiology itself.

Hence, the final part of this blog post shifts focus from whether neural networks can learn physiology to whether this is essential for their use in physiological questions. Generally, during the learning phase of a neural network, we do not inspect which features of the input are represented in the hidden layers. Sometimes, however, it can be useful to visualize these features in order to see what a neural network bases its decisions on.

In image classification, neural networks start by detecting basic features like gradients and edges in the initial layers, progress to identifying shapes and contours in mid-level features, and ultimately recognize object-like structures in the high-level features (LeCun, Bengio, & Hinton, 2015). In this case it is quite useful to verify what the network bases its classifications on, and evidently it 'learns' these features in a way rather similar to humans.

Image from LeCun, Bengio, & Hinton, 2015
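As a small illustration of inspecting what a trained network has picked up, the sketch below pulls out the first-layer convolution kernels of a pretrained ResNet-18 (assuming torchvision and its downloadable ImageNet weights are available). These early kernels typically resemble the edge and color-gradient detectors described above.

```python
import torch
import torchvision

# Load a network pretrained on ImageNet (downloads the weights on first use).
model = torchvision.models.resnet18(weights=torchvision.models.ResNet18_Weights.DEFAULT)
model.eval()

# The very first layer: 64 kernels of shape 3x7x7 (RGB x height x width).
kernels = model.conv1.weight.detach()
print("first-layer kernel tensor:", tuple(kernels.shape))

# A crude summary: kernels whose weights vary strongly along one axis behave like
# oriented edge detectors, matching the "low-level features" picture above.
horizontal_variation = kernels.diff(dim=-1).abs().mean(dim=(1, 2, 3))
vertical_variation = kernels.diff(dim=-2).abs().mean(dim=(1, 2, 3))
print("kernel with strongest horizontal structure:", int(horizontal_variation.argmax()))
print("kernel with strongest vertical structure:  ", int(vertical_variation.argmax()))
```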

In other types of neural network structures, such as autoencoders, lower-level features often correspond to specific aspects of the output. Autoencoders compress the input data into a latent space and then reconstruct it to match the original input (Kingma & Welling, 2019). Some variants, such as the variational autoencoder, can also be used to generate new outputs.
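Below is a minimal autoencoder sketch with illustrative sizes and random data (not any published architecture): the encoder compresses each input into a small latent vector, and the decoder tries to reconstruct the original from it.

```python
import torch
import torch.nn as nn

# Toy data: 256 random "inputs" of 64 values each (real use would be images or signals).
x = torch.rand(256, 64)

encoder = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 4))   # 64 -> 4-dim latent
decoder = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 64))   # 4 -> 64 reconstruction
optimizer = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

for step in range(500):
    latent = encoder(x)                 # compress the input into the latent space
    reconstruction = decoder(latent)    # try to rebuild the original input
    loss = nn.functional.mse_loss(reconstruction, x)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"reconstruction error after training: {loss.item():.4f}")
print("latent code of the first sample:", latent[0].detach())
```

A variational or conditional variant adds structure to this latent space, which is what gives it the semantic meaning discussed next.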



In the case of a conditional VAE (CVAE), the latent space has a semantic meaning for the output. In the CVAE of Yan, Yang, Sohn, & Lee (2016), used to generate faces, values in the latent space correspond to attributes like gender, hair color, or age.

Image from Yan, Yang, Sohn, & Lee (2016)

In these cases related to image classification and image generation, we could argue that having 'sensible' features in the hidden layers or latent space is useful. However, in plenty of neural networks the hidden features become too abstract to be interpretable. If such networks still yield satisfactory performance, should we then worry about what these features mean?

This raises concerns about AI identifying patterns in input data that humans cannot understand, as well as ethical and transparency concerns. Understanding the features and patterns identified by AI can be crucial for ensuring accountability, avoiding biases, and making informed decisions based on AI outputs. While performance is important, interpretability and understanding of the underlying mechanisms should also be considered in order to build trust and reliability in AI systems, especially in healthcare.


5 Discussion 

5.1 Conclusion

To answer the question, can a NN learn physiology? The answer is yes.

However, given the complexity of the human physiolome, neural networks might not be the only tool for studying human physiology, and other methods, such as TDS, might provide solutions to physiological questions as well.

We can conclude that a neural network is, in theory, able to learn human physiology, but that it does not need this ability in order to be useful in building the human physiolome. Nevertheless, to ensure real-world applicability and accountability, it should be capable of learning interpretable physiological patterns.

5.2 Review

The results of this brief review on neural networks regarding physiology are promising, but not surprising. Artificial intelligence is one of the fastest-evolving fields of the last decades and considering its quickly rising performance, applications to human physiology do not seem far out of reach. 

One thing we should always keep in mind when modeling biological systems is that a certain degree of reductionism is applied. For instance, in reality a tissue consists of thousands of cells, but in some models it will be represented by a single entity.

Here, George Box’s saying is applicable:

“All models are wrong, but some are useful” (1976). 

We could use AI, more particularly neural networks, as a map or blueprint for human physiology, but we should not forget that such a map always deviates from the actual system it describes. On the other hand, building the human physiolome using AI, even a reductionist version of it, could offer valuable solutions to physiological issues and challenges.

I think the upcoming decades will be an exciting time in AI for medical applications, and I am curious to see what kind of physiological breakthroughs will emerge with the use of Neural Networks. 

5.2.1 Upon Request

In the paper below, Bodein, Chapleur, Droit, and Lê Cao propose a generic, data-driven analytical framework for integrating different types of longitudinal data with microbial community data. The goal is to study relationships between molecular mechanisms and microbial community structures, or host-microbiota interactions, over time. The framework addresses challenges such as uneven time points, small sample sizes, high individual variability, and the unique characteristics of microbial data (e.g., sparsity and compositionality). This type of framework could also offer solutions for the data issues in physiology.

Bodein, A., Chapleur, O., Droit, A., & Lê Cao, K.-A. (2019). A generic multivariate framework for the integration of microbiome longitudinal studies with other data types. Frontiers in Genetics, 10. https://doi.org/10.3389/fgene.2019.00963

6 References

  • Acharya, U. R., Oh, S. L., Hagiwara, Y., Tan, J. H., & Adeli, H. (2018). Deep convolutional neural network for the automated detection and diagnosis of seizure using EEG signals. Computers in Biology and Medicine, 100, 270-278. https://doi.org/10.1016/j.compbiomed.2017.09.017
  • Agatonovic-Kustrin, S., & Beresford, R. (2000). Basic concepts of artificial neural network (Ann) modeling and its application in pharmaceutical research. Journal of Pharmaceutical and Biomedical Analysis, 22(5), 717-727. https://doi.org/10.1016/S0731-7085(99)00272-1
  • Aloysius, N., & Geetha, M. (2017). A review on deep convolutional neural networks. 2017 International Conference on Communication and Signal Processing (ICCSP), 0588-0592. https://doi.org/10.1109/ICCSP.2017.8286426
  • Anatomy | definition, history, & biology | Britannica. (2024, May 6). https://www.britannica.com/science/anatomy
  • Baldi, P. (2012). Autoencoders, unsupervised learning, and deep architectures. Proceedings of ICML Workshop on Unsupervised and Transfer Learning, 37-49. https://proceedings.mlr.press/v27/baldi12a.html
  • Bartsch, R. P., Schumann, A. Y., Kantelhardt, J. W., Penzel, T., & Ivanov, P. Ch. (2012). Phase transitions in physiologic coupling. Proceedings of the National Academy of Sciences, 109(26), 10181-10186. https://doi.org/10.1073/pnas.1204568109
  • Bashan, A., Bartsch, R. P., Kantelhardt, J. W., Havlin, S., & Ivanov, P. C. (2012). Network physiology reveals relations between network topology and physiological function. Nature Communications, 3(1), 702. https://doi.org/10.1038/ncomms1705
  • Berner, R., Gross, T., Kuehn, C., Kurths, J., & Yanchuk, S. (2023). Adaptive dynamical networks. Physics Reports, 1031, 1-59. https://doi.org/10.1016/j.physrep.2023.08.001
  • Berner, R., Sawicki, J., Thiele, M., Löser, T., & Schöll, E. (2022). Critical parameters in dynamic network modeling of sepsis. Frontiers in Network Physiology, 2, 904480. https://doi.org/10.3389/fnetp.2022.904480
  • Box, G. E. P. (1976). Science and statistics. Journal of the American Statistical Association, 71(356), 791-799. https://doi.org/10.1080/01621459.1976.10480949
  • Clauset, A., Moore, C., & Newman, M. E. J. (2008). Hierarchical structure and the prediction of missing links in networks. Nature, 453(7191), 98-101. https://doi.org/10.1038/nature06830
  • Finazzi, S., Mandelli, G., Garbero, E., Mondini, M., Trussardi, G., Giardino, M., Tavola, M., & Bertolini, G. (2018). Data collection and research with MargheritaTre. Physiological Measurement, 39(8), 084004. https://doi.org/10.1088/1361-6579/aad10f
  • Girouard, H., & Iadecola, C. (2006). Neurovascular coupling in the normal brain and in hypertension, stroke, and Alzheimer disease. Journal of Applied Physiology, 100(1), 328-335. https://doi.org/10.1152/japplphysiol.00966.2005
  • Guresen, E., & Kayakutlu, G. (2011). Definition of artificial neural networks with comparison to other networks. Procedia Computer Science, 3, 426-433. https://doi.org/10.1016/j.procs.2010.12.071
  • Islam, M., Chen, G., & Jin, S. (2019). An overview of neural network. American Journal of Neural Networks and Applications, 5(1), 7-11. https://doi.org/10.11648/j.ajnna.20190501.12
  • Iten, R., Metger, T., Wilming, H., Del Rio, L., & Renner, R. (2020). Discovering physical concepts with neural networks. Physical Review Letters, 124(1), 010508. https://doi.org/10.1103/PhysRevLett.124.010508
  • Ivanov, P. C. (2021). The new field of network physiology: Building the human physiolome. Frontiers in Network Physiology, 1. https://doi.org/10.3389/fnetp.2021.711778
  • Ivanov, P. Ch., & Bartsch, R. P. (2014). Network physiology: Mapping interactions between networks of physiologic networks. In G. D’Agostino & A. Scala (Eds.), Networks of Networks: The Last Frontier of Complexity (pp. 203-222). Springer International Publishing. https://doi.org/10.1007/978-3-319-03518-5_10
  • King, M. R. (2023). The future of ai in medicine: A perspective from a chatbot. Annals of Biomedical Engineering, 51(2), 291-295. https://doi.org/10.1007/s10439-022-03121-w
  • Kingma, D. P., & Welling, M. (2019). An introduction to variational autoencoders. Foundations and Trends® in Machine Learning, 12(4), 307-392. https://doi.org/10.1561/2200000056
  • LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436-444. https://doi.org/10.1038/nature14539
  • Lin, D., Vasilakos, A. V., Tang, Y., & Yao, Y. (2016). Neural networks for computer-aided diagnosis in medicine: A review. Neurocomputing, 216, 700-708. https://doi.org/10.1016/j.neucom.2016.08.039
  • Mambetsariev, I., Fricke, J., Gruber, S. B., Tan, T., Babikian, R., Kim, P., Vishnubhotla, P., Chen, J., Kulkarni, P., & Salgia, R. (2023). Clinical network systems biology: Traversing the cancer multiverse. Journal of Clinical Medicine, 12(13), 4535. https://doi.org/10.3390/jcm12134535
  • Mavrovouniotis, M. L., & Chang, S. (1992). Hierarchical neural networks. Computers & Chemical Engineering, 16(4), 347-369. https://doi.org/10.1016/0098-1354(92)80053-C
  • McDonnell, K. J. (2023). Leveraging the academic artificial intelligence silecosystem to advance the community oncology enterprise. Journal of Clinical Medicine, 12(14), 4830. https://doi.org/10.3390/jcm12144830
  • Menkovski, V., Aleksovski, Z., Saalbach, A., & Nickisch, H. (2015). Can pretrained neural networks detect anatomy? (arXiv:1512.05986). arXiv. http://arxiv.org/abs/1512.05986
  • Munir, K., Elahi, H., Ayub, A., Frezza, F., & Rizzi, A. (2019). Cancer diagnosis using deep learning: A bibliographic review. Cancers, 11(9), 1235. https://doi.org/10.3390/cancers11091235
  • Physiology | definition & bodily function | Britannica. (2024, May 1). https://www.britannica.com/science/physiology
  • Rizzo, R., Garcia-Retortillo, S., & Ivanov, P. Ch. (2022). Dynamic networks of physiologic interactions of brain waves and rhythms in muscle activity. Human Movement Science, 84, 102971. https://doi.org/10.1016/j.humov.2022.102971
  • Rizzo, R., Wang, J. W. J. L., DePold Hohler, A., Holsapple, J. W., Vaou, O. E., & Ivanov, P. C. (2023). Dynamic networks of cortico-muscular interactions in sleep and neurodegenerative disorders. Frontiers in Network Physiology, 3. https://doi.org/10.3389/fnetp.2023.1168677
  • Sayama, H., Pestov, I., Schmidt, J., Bush, B. J., Wong, C., Yamanoi, J., & Gross, T. (2013). Modeling complex systems with adaptive networks. Computers & Mathematics with Applications, 65(10), 1645-1664. https://doi.org/10.1016/j.camwa.2012.12.005
  • Sherwood, L. (2010). Human physiology: From cells to systems (7th ed). Brooks/Cole, Cengage Learning.
  • Siddique, N., Paheding, S., Elkin, C. P., & Devabhaktuni, V. (2021). U-net and its variants for medical image segmentation: A review of theory and applications. IEEE Access, 9, 82031-82057. https://doi.org/10.1109/ACCESS.2021.3086020
  • Stankovski, T., Ticcinelli, V., McClintock, P. V. E., & Stefanovska, A. (2015). Coupling functions in networks of oscillators. New Journal of Physics, 17(3), 035002. https://doi.org/10.1088/1367-2630/17/3/035002
  • Stern, H. S. (1996). Neural networks in applied statistics. Technometrics, 38(3), 205-214. https://doi.org/10.1080/00401706.1996.10484497
  • Translate with DeepL Translate—the world's most accurate translator. (n.d.). Retrieved May 25, 2024, from https://www.deepl.com/translator
  • Wasserthal, J., Breit, H.-C., Meyer, M. T., Pradella, M., Hinck, D., Sauter, A. W., Heye, T., Boll, D. T., Cyriac, J., Yang, S., Bach, M., & Segeroth, M. (2023). Totalsegmentator: Robust segmentation of 104 anatomic structures in ct images. Radiology: Artificial Intelligence, 5(5), e230024. https://doi.org/10.1148/ryai.230024
  • Yan, X., Yang, J., Sohn, K., & Lee, H. (2016). Attribute2image: Conditional image generation from visual attributes. In B. Leibe, J. Matas, N. Sebe, & M. Welling (Eds.), Computer Vision – ECCV 2016 (Vol. 9908, pp. 776-791). Springer International Publishing. https://doi.org/10.1007/978-3-319-46493-0_47
  • Yegnanarayana, B. (2005). Artificial neural networks (11. print). Prentice Hall of India.

