Segmentation based tracking of cells in 2D+time microscopy images of macrophages

Seol Ah Park1, Tamara Sipka2, Zuzana Kriva1, George Lutfalla2, Mai Nguyen-Chi2, and Karol Mikula1

1 Department of Mathematics and Descriptive Geometry, Slovak University of Technology in Bratislava, Slovakia
2 DIMNP, CNRS, Univ. Montpellier, Montpellier, France

Abstract

The automated segmentation and tracking of macrophages during their migration are challenging tasks due to their dynamically changing shapes and motions. This paper proposes a new algorithm to achieve automatic cell tracking in time-lapse microscopy macrophage data. First, we design a segmentation method employing space-time filtering, local Otsu's thresholding, and the SUBSURF (subjective surface segmentation) method. Next, partial trajectories are extracted in the segmented images for cells overlapping in the temporal direction. Finally, the extracted trajectories are linked by considering their direction of movement. The segmented images and the obtained trajectories from the proposed method are compared with those of semi-automatic segmentation and manual tracking. The proposed tracking achieved 97.4% accuracy for macrophage data under challenging conditions: feeble fluorescent intensity and irregular shapes and motion of the macrophages. We expect that the automatically extracted trajectories of macrophages can provide evidence of how macrophages migrate depending on their polarization modes in situations such as wound healing.

1 Introduction

Since the 17th century and the first microscopes, biologists have dedicated enormous efforts to understanding cellular behaviors within living animals [1].
Embryologists first described how cellular movements shape embryonic development, but immunologists soon realized that, by using microscopy, they could gain access to the behavior of specialized, highly mobile cells that play crucial roles in immunity [2]. With the recent development of video microscopy, the diversification of confocal microscopy techniques, and the constant improvement of the sensitivity, resolution, and acquisition speed of microscopes [3,4], biologists are now generating huge sets of data that need automated processing to extract significant information, describe the integrated process, and understand the underlying rules. Thanks to the contributions of theoreticians and modelers, biologists can now integrate these imaging data with biochemical and genetic data to propose integrated models of cellular behaviors and even to offer integrated models of the development of organisms as complex as vertebrates [5,6]. Identifying (segmenting) and tracking individual cells is challenging because cells divide, move, and change their shapes during their journey in the developing embryo. Many efforts have been dedicated to developing software to track cells during embryonic development, and robust solutions are now available [7]. Some of these solutions are compatible with the study of other situations where cells are either moving in an organism (the heart) or in a moving organism (neurons in foraging worms [8]), but some specific cellular populations, due to their particular behaviors, are challenging to identify and track during their journey within a living animal. This is the case of macrophages, one of the fastest-moving cellular populations, with highly irregular shapes and movements. Macrophages have protective roles in immune defense, homeostasis, and tissue repair, but they also contribute to the progression of many pathologies like cancers, inflammatory diseases, and infections [9]. The key feature of macrophages is their remarkable dynamic plasticity. They respond to changing environments by constantly adopting specific phenotypes and functions defined as M1 and M2, which are the two extremes of a continuum of polarization states [10].
In the early stage of inflammation, M1 macrophages have been shown to accumulate at the wound/infection site, where they initiate a pro-inflammatory response, are highly phagocytic, and remove any pathogens or debris [11–14]. During the resolution of inflammation, they switch to M2 macrophages, which mediate the anti-inflammatory response and participate in tissue remodeling and repair [14–17]. Some studies have reported that the different functions of M1/M2 macrophages seem to be related to their shapes and migration [18–21]. M1 macrophages have more rounded and flat shapes than M2 macrophages, which exhibit elongated shapes [19,21]. In addition to this variable morphology, macrophages are known to have two migration modes: amoeboid and mesenchymal. Amoeboid migration is fast and largely adhesion-independent and is mainly observed for M1 macrophages. In contrast, mesenchymal migration is slower and more directional in the presence of strong adhesion and is mainly observed for M2 macrophages [18–20]. So far, the relationship between macrophage activation and migration modes involving the change of macrophages' shapes in vivo is still unclear. Image segmentation and cell tracking in macrophage data can be the first steps to analyzing the characteristics of macrophages [22,23].

Related works and contribution to macrophage segmentation

Segmentation of macrophages has previously been studied using a filter-based method [24], image-based machine learning [25], anglegram analysis [26], etc. Also, deep learning-based segmentation methods have been developed for various types of cells [27–32].
U-Net [27], Cellpose [31], and Splinedist [32] are designed for segmentation of general shapes of cells in microscopy data and have shown high performance. However, it is still challenging to segment macrophages due to their varying nature, the extreme irregularity of their shapes, and the variability of image intensity inside macrophages. In [33], we proposed a macrophage segmentation method that combines thresholding methods with the SUBSURF approach, requiring no cell nuclei centers or other reference information. However, a problem occurs when attempting to segment macrophages in time-lapse data, since the segmentation parameters are not always suitable for macrophages in all time frames. In this paper, first, we improve the ability to detect macrophages with low image intensity by applying space-time filtering, which considers the temporal coherence of time-lapse data [34]. Second, Otsu's method is implemented in local windows to deal with cases where individual macrophages have substantially different image intensity ranges. Similarly as in [33], the SUBSURF method [35] is applied to eliminate the remaining noise and to smooth the boundaries of the macrophages resulting from space-time filtering and the thresholding method (Fig. 1).

Related works in cell tracking

Automatic cell tracking in microscopy images has been investigated and various methods [23, 30, 36–42] have been proposed. The tracking algorithm based on the linear assignment problem (LAP) [36,41] is computationally efficient and has shown good performance, especially for Brownian motion. However, it can be less accurate if many cells are densely distributed or if some cells suddenly move toward other nearby cells. The studies [38,40] performed cell tracking during zebrafish embryogenesis by finding a centered path in the spatio-temporal segmented structure.
In [39], a workflow was designed, from image acquisition to cell tracking, and applied to 3D+time microscopy data of zebrafish embryos. These methods show outstanding performance in the case of embryogenesis. The keyhole tracking algorithm, which anticipates the most probable position of a cell in the next time slice, has been proposed and applied to red blood cells, neutrophils, and macrophages [43–45]. Moreover, deep learning-based motion tracking in microscopy images has been studied for various types of biological objects with different learning approaches [46–55]. For instance, the method in [48] trains the networks using semi-supervised learning to predict cell division. Usiigaci [50] segments individual cells and assigns each a unique ID with a Mask R-CNN model; the method then links the cells by the given IDs. Methods that train on image sequences with LSTM (long short-term memory) networks have shown their performance for tracing nuclear proteins [51] and bacteria [53]. In [52], the algorithm that solves linear assignment problems in tracking is trained with a deep reinforcement learning (DRL)-based method.

Contribution to macrophage tracking and outline

Although various cell tracking methods have been studied, there is still a need for more accurate tracking of erratic movements, such as those of macrophages. The cell tracking studied in this paper deals with macrophages, which undergo fast and complicated motion. This results in cells that do not overlap in the time direction, and in many cases one can observe a "random movement". This paper proposes a tracking method that covers the situations of a large number of macrophages and their complex motion. The first step is to extract the cell trajectories from their shapes overlapping in time.
By this approach, we often obtain only partial trajectories, because a segmented macrophage does not always overlap with its corresponding cell in the next/previous frame of the video. Next, we connect endpoints of partial trajectories corresponding to macrophages that do not overlap in time. For this, a tangent calculation is used to estimate the direction of the macrophages at the endpoints of the partial trajectories. Fig. 1 briefly illustrates all steps of the proposed method yielding macrophage tracking. The mathematical description of the proposed method is given in Materials and methods. The performance of the macrophage segmentation and tracking is shown in Results. In that section, the proposed segmentation method provides the approximate shapes of macrophages, indicating that it can reasonably serve as the first step of tracking. Also, the proposed tracking shows that connecting the centers of macrophages by considering the direction of movement works properly for tracing fast-moving macrophages. The tracking performance is validated visually and quantitatively, showing how close the obtained trajectories are to the manually extracted trajectories. In Discussion, we summarize the results of the proposed method and discuss limitations, future work, and possible applications.

Fig. 1: Procedure of macrophage tracking in 2D+time data: data acquisition; image segmentation (space-time filtering, local Otsu's method + SUBSURF); macrophage tracking (extraction of partial trajectories, connection of the trajectories).

2 Materials and methods

2.1 Image acquisition and preparation

The proposed method is applied to two representative datasets.
In both datasets, a three-day-old transgenic zebrafish larva (Tg(mpeg1:Gal4/UAS:Kaede)) is used and imaged with a spinning disk confocal microscope. The green fluorescent protein Kaede is indirectly expressed under the control of the macrophage-specific promoter mpeg1, so that macrophages produce the green fluorescent protein in their cytoplasm. In the first dataset, migrating macrophages are imaged from 1 hour to 6 hours post-amputation (1–6 hpA) of the caudal fin fold with a time step of 4 minutes and a z step of 4 µm. In the second dataset, macrophages are imaged from 30 minutes to 6 hours post-amputation (0.5–6 hpA) with an imaging time step of 2 minutes and a z step of 1 µm. The pixel size is 0.326 µm and 0.347 µm in the first and second datasets, respectively. For the numerical experiments, we used 2D+time projection images, where the three-dimensional (3D) microscopy images are projected onto a plane with the maximum intensity of the 3D dataset selected in every pixel.
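The maximum intensity projection itself is a one-line operation; the sketch below assumes the 3D stack is already loaded as a NumPy array with axes (z, y, x), which is an assumption of this illustration rather than a statement about the original acquisition files.

```python
import numpy as np

def max_intensity_projection(stack):
    """Project a 3D stack with axes (z, y, x) onto a 2D image by keeping,
    in every pixel, the maximum intensity over the z direction."""
    return np.max(stack, axis=0)

# Example: project every time point of a hypothetical (T, Z, Y, X) dataset.
# video_3d = ...  # loaded elsewhere; shape and loading are not specified here
# video_2d = np.stack([max_intensity_projection(frame) for frame in video_3d])
```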
Due to the image acquisition speed in the second dataset, the exposure time and fluorescence intensity are reduced, resulting in a low signal-to-noise ratio. We perform a histogram crop on the acquired images to ignore the noise effects of very high image intensity concentrated in a small pixel area. For this type of noise, the number of affected pixels is very small compared with the image size; therefore, in the histogram, a tiny peak positioned at the highest image intensities corresponds to this noise. To ignore it, this tiny peak is cropped from the histogram. The steps of the histogram crop are the following (a minimal sketch is given after the list).

1. The first step is the estimation of the noise size relative to the image size. Let us consider that the noise accounts for a fraction pnoise of the total number of pixels Ntot. Then, the number of pixels belonging to the noise, Nnoise, equals Ntot × pnoise.

2. In the histogram, the number of pixels is counted from the maximum intensity in descending order, because we want to remove the small noise having the highest image intensity. Let us denote the counted number of pixels by Ndes(I). For example, Ndes(Imax) for the maximum image intensity Imax equals the number of pixels with intensity Imax. Likewise, for the minimum image intensity Imin, the number of pixels counted from Imax down to Imin is Ndes(Imin) = Ntot.

3. Finally, when Ndes(I∗) = Nnoise is satisfied, counting is stopped. The new maximum intensity Inew;max is set by searching for the maximum intensity smaller than I∗. The image intensities ranging from I∗ to Imax are changed to Inew;max.
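A minimal sketch of the three steps above follows, assuming an integer-valued input image; the function name, the interpretation of pnoise as a fraction, and stopping as soon as the count reaches Nnoise are choices of this sketch.

```python
import numpy as np

def histogram_crop(img, p_noise):
    """Crop the tiny high-intensity peak of the histogram, following steps 1-3.
    img: integer-valued image (or image sequence) as a NumPy array.
    p_noise: assumed fraction of noisy pixels (e.g. 0.001 for the second dataset)."""
    n_noise = img.size * p_noise                    # step 1: N_noise = N_tot * p_noise
    intensities = np.sort(np.unique(img))[::-1]     # occurring intensities, descending
    counted = 0
    i_star = intensities[0]
    for intensity in intensities:                   # step 2: count pixels from I_max downwards
        counted += np.count_nonzero(img == intensity)
        i_star = intensity
        if counted >= n_noise:                      # step 3: stop once N_des(I*) reaches N_noise
            break
    smaller = intensities[intensities < i_star]
    i_new_max = smaller[0] if smaller.size > 0 else i_star   # new maximum intensity I_new;max
    cropped = img.copy()
    cropped[cropped >= i_star] = i_new_max          # intensities in [I*, I_max] -> I_new;max
    return cropped
```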
In the supplementary material (https://doi.org/10.1016/j.compbiomed.2022.106499), an example of histograms in the presence of the spot noise and after the histogram crop is depicted. We apply the histogram crop only to the second dataset, with pnoise = 0.001. After the histogram crop, the image intensity is scaled to the interval [0, 1] for applying space-time filtering. Then, the images obtained from space-time filtering are rescaled to the interval [0, 255] to simply perform the local Otsu's method, since histograms of images are usually described by a discrete distribution on a finite interval. To apply the SUBSURF method, two types of images are used: one is the original images after the histogram crop with intensities in [0, 1], and the other is the output of the local Otsu's method.
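The two rescalings can be illustrated as follows; linear min–max scaling and rounding to 8-bit integers are assumptions of this sketch, since the text only specifies the target intervals [0, 1] and [0, 255].

```python
import numpy as np

def to_unit_interval(img):
    """Linearly rescale intensities to [0, 1] before space-time filtering."""
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min())

def to_8bit_range(img01):
    """Rescale a [0, 1] image to the discrete range [0, 255] used by the
    local Otsu's method (rounding to integers is a choice of this sketch)."""
    return np.rint(img01 * 255.0).astype(np.uint8)
```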
2.2 Segmentation of macrophages in microscopy videos

2.2.1 Space-time filtering

In the datasets processed by the methods presented in this paper, the macrophages do not always have similar image intensities. Some can hardly be recognized due to their weak image intensity in static images, but they can be recognized in videos, as human eyes use temporal information to distinguish objects. However, traditional segmentation methods dealing with static images do not use temporal information. Therefore, it is difficult to detect and segment macrophages whose image intensity is similar to that of the background. A filtering method that utilizes temporal coherence was introduced in [34], where the regularized Perona–Malik model and a scalar function clt, measuring the coherence of objects across time slices, are combined. The term clt stands for the "curvature of Lambertian trajectory" [56–58] and vanishes for points (of an object) that preserve their intensity and move on a smooth trajectory in the temporal direction. In the following, let a sequence of time slices be given on the interval [0, θF], and let θ denote a particular time slice. The PDE representing nonlinear diffusion filtering is written as

\[
\frac{\partial u}{\partial t} = c_{lt}(u)\,\nabla\cdot\big(g(|\nabla G_\sigma * u|)\,\nabla u\big), \tag{1}
\]

where t denotes the scale, the amount of filtering, and u(t, x1, x2, θ) is the unknown real function defined on [0, TF] × Ω × [0, θF], x = (x1, x2) ∈ Ω ⊂ R². In |∇Gσ ∗ u|, the "∗" stands for the convolution operator. The initial condition, given by

\[
u(0, x, \theta) = u_0(x, \theta), \tag{2}
\]

represents the processed 2D+time video. The clt(u) function is defined as in [34,56] by the formula

\[
c_{lt}(u) = \min_{w_1, w_2} \frac{1}{(\Delta\theta)^2}\Big( |\langle \nabla u,\, w_1 - w_2\rangle| + |u(x - w_1, \theta - \Delta\theta) - u(x, \theta)| + |u(x + w_2, \theta + \Delta\theta) - u(x, \theta)| \Big), \tag{3}
\]

where w1, w2 are arbitrary vectors in 2D space and Δθ is the time increment between discrete time slices. Here, ⟨a, b⟩ denotes the Euclidean scalar product of a and b. The function g is the so-called edge detector function and is defined by

\[
g(s) = \frac{1}{1 + K s^2}, \qquad K > 0, \tag{4}
\]

where K is a constant that controls the sensitivity to s [59]. Finally, Gσ is a Gaussian function with variance σ, which is used for pre-smoothing by convolution.
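The diffusion coefficient g(|∇Gσ ∗ u|) appearing in Equation 1 can be sketched as below; the use of scipy's gaussian_filter for the convolution with Gσ (note that scipy's sigma is a standard deviation, while the paper's σ denotes a variance) and of central differences for the gradient are choices of this illustration, not prescriptions of the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def diffusion_coefficient(u, sigma, K, h=1.0):
    """Perona-Malik edge detector g(|grad(G_sigma * u)|) of Eq. (4) with
    Gaussian pre-smoothing of one frame u."""
    u_smooth = gaussian_filter(u, sigma=sigma)       # G_sigma * u
    du_dx1, du_dx2 = np.gradient(u_smooth, h)        # central differences with spacing h
    grad_norm = np.sqrt(du_dx1**2 + du_dx2**2)
    return 1.0 / (1.0 + K * grad_norm**2)            # g(s) = 1 / (1 + K s^2)
```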
Let us denote by u^n_k the numerical solution in the kth frame of the image sequence at the nth discrete filtering (scale) step nτF with step size τF, i.e.,

\[
u^n_k(x) = u(n\tau_F, x, k\Delta\theta). \tag{5}
\]

By using the semi-implicit scheme [34], Equation 1 is discretized as follows:

\[
\frac{u^{n+1}_k - u^n_k}{\tau_F} = c_{lt}(u^n_k)\,\nabla\cdot\Big(g\big(|\nabla u^{\sigma;n}_k|\big)\,\nabla u^{n+1}_k\Big), \tag{6}
\]

where g(|∇u^{σ;n}_k|) = g(|∇Gσ ∗ u^n_k|). From Equation 3, the discretization of clt(u^n_k) at a point x ∈ Ω can be written as

\[
c_{lt}(u^n_k) = \min_{w_1, w_2} \frac{1}{(\Delta\theta)^2}\Big( |\langle \nabla u^n_k,\, w_1 - w_2\rangle| + |u^n_{k-1}(x - w_1) - u^n_k(x)| + |u^n_{k+1}(x + w_2) - u^n_k(x)| \Big). \tag{7}
\]
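As an illustration of Equation 7, the following sketch evaluates the discrete clt term at one interior pixel; restricting the minimization over w1 and w2 to integer pixel offsets in a small search window is an assumption made here for readability (the admissible vectors in [34,56] may be defined differently), and the offsets are assumed to stay inside the image.

```python
import numpy as np

def clt_discrete(u_prev, u_curr, u_next, i, j, dtheta, radius=2):
    """Discrete clt term of Eq. (7) at an interior pixel (i, j).
    u_prev, u_curr, u_next: 2D arrays for frames k-1, k, k+1.
    The search-window radius and the pixel-unit offsets are assumptions
    of this sketch."""
    du_di, du_dj = np.gradient(u_curr)               # gradient of u^n_k (pixel units)
    offsets = [(a, b) for a in range(-radius, radius + 1)
                      for b in range(-radius, radius + 1)]
    best = np.inf
    for a1, b1 in offsets:                           # candidate w1
        backward = abs(u_prev[i - a1, j - b1] - u_curr[i, j])
        for a2, b2 in offsets:                       # candidate w2
            forward = abs(u_next[i + a2, j + b2] - u_curr[i, j])
            inner = abs(du_di[i, j] * (a1 - a2) + du_dj[i, j] * (b1 - b2))
            best = min(best, inner + backward + forward)
    return best / dtheta**2
```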
For the space discretization, we use the finite volume method with finite volume (pixel) side h. Let us consider that a point x is the center of a pixel (i, j) and let us denote by Vi,j the finite volume corresponding to pixel (i, j), i = 1, ..., M, j = 1, ..., N. The quantity clt(u^n_k) is considered constant in each finite volume. Then, Equation 6 is integrated over the finite volume Vi,j, and by using Green's theorem, we get

\[
\int_{V_{i,j}} \frac{u^{n+1}_k - u^n_k}{\tau_F}\, dx = c_{lt}(u^n_k) \int_{\partial V_{i,j}} g\big(|\nabla u^{\sigma;n}_k|\big)\, \nabla u^{n+1}_k \cdot n_{i,j}\, dS, \tag{8}
\]

where ni,j is the unit outward normal vector to the boundary of Vi,j. The gradient of u on the pixel edges can be approximated by computing the average values of neighboring pixels. By using the diamond cell approach [60], we compute the averages of neighboring pixel values in the corners of the pixel (i, j) as follows (see also Figure S2 in the supplementary material, https://doi.org/10.1016/j.compbiomed.2022.106499):

\[
\begin{aligned}
u^{1,1}_{i,j,k} &= \tfrac{1}{4}\big(u^n_{i,j,k} + u^n_{i,j+1,k} + u^n_{i+1,j,k} + u^n_{i+1,j+1,k}\big), \\
u^{1,-1}_{i,j,k} &= \tfrac{1}{4}\big(u^n_{i,j,k} + u^n_{i+1,j,k} + u^n_{i,j-1,k} + u^n_{i+1,j-1,k}\big), \\
u^{-1,-1}_{i,j,k} &= \tfrac{1}{4}\big(u^n_{i,j,k} + u^n_{i-1,j,k} + u^n_{i,j-1,k} + u^n_{i-1,j-1,k}\big), \\
u^{-1,1}_{i,j,k} &= \tfrac{1}{4}\big(u^n_{i,j,k} + u^n_{i,j+1,k} + u^n_{i-1,j,k} + u^n_{i-1,j+1,k}\big).
\end{aligned} \tag{9}
\]

The gradient of u^n_{i,j,k} in the nth filtering step, for a pixel (i, j) in the kth frame of the image sequence, is computed at the centers of the edges of the pixel [60]:

\[
\begin{aligned}
\nabla_{1,0}\, u^n_{i,j,k} &= \tfrac{1}{h}\big(u^n_{i+1,j,k} - u^n_{i,j,k},\; u^{1,1}_{i,j,k} - u^{1,-1}_{i,j,k}\big), \\
\nabla_{0,-1}\, u^n_{i,j,k} &= \tfrac{1}{h}\big(u^{1,-1}_{i,j,k} - u^{-1,-1}_{i,j,k},\; u^n_{i,j-1,k} - u^n_{i,j,k}\big), \\
\nabla_{-1,0}\, u^n_{i,j,k} &= \tfrac{1}{h}\big(u^n_{i-1,j,k} - u^n_{i,j,k},\; u^{-1,1}_{i,j,k} - u^{-1,-1}_{i,j,k}\big), \\
\nabla_{0,1}\, u^n_{i,j,k} &= \tfrac{1}{h}\big(u^{1,1}_{i,j,k} - u^{-1,1}_{i,j,k},\; u^n_{i,j+1,k} - u^n_{i,j,k}\big),
\end{aligned} \tag{10}
\]

where h denotes the pixel size.
With the set of grid neighbors Ni,j consisting of all pairs (l, m) with l, m ∈ {−1, 0, 1} and |l| + |m| = 1, the final discretized form of Equation 1 is written as

\[
u^{n+1}_{i,j,k} = u^n_{i,j,k} + \frac{\tau_F}{h^2}\, c_{lt}(u^n_{i,j,k}) \sum_{|l|+|m|=1} g\big(|\nabla_{l,m}\, u^{\sigma;n}_{i,j,k}|\big)\big(u^{n+1}_{i+l,j+m,k} - u^{n+1}_{i,j,k}\big). \tag{11}
\]

For solving Equation 11, the successive over-relaxation (SOR) method is used. The SOR method is an iterative method for solving a linear system of equations, a variant of the Gauss–Seidel method [61]. In our simulations, the relaxation factor of the SOR method was set to 1.8, and the calculation was stopped when \(\sum_{i=1}^{M}\sum_{j=1}^{N} |u^{n+1}_{i,j,k} - u^n_{i,j,k}| < 0.001\) for every k.
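A minimal sketch of one filtering step of Equation 11 for a single frame, solved by SOR with ω = 1.8, is given below. The treatment of boundary pixels (kept fixed), the representation of the edge coefficients as precomputed arrays, and the use of the difference between successive SOR sweeps as the stopping test are assumptions of this sketch.

```python
import numpy as np

def filtering_step_sor(u_n, clt, g_edge, tau_f, h, omega=1.8, tol=1e-3, max_sweeps=100):
    """One semi-implicit filtering step of Eq. (11) for a single frame k, solved by SOR.
    u_n:    2D array, the frame at filtering step n (right-hand side).
    clt:    2D array of precomputed clt values per pixel.
    g_edge: dict {(l, m): 2D array} of coefficients g(|grad_{l,m} u^{sigma;n}|)
            for the four neighbours (l, m) with |l| + |m| = 1."""
    u = u_n.copy()                                   # iterate approximating u^{n+1}
    M, N = u_n.shape
    coeff = (tau_f / h**2) * clt
    neighbours = ((1, 0), (-1, 0), (0, 1), (0, -1))
    for _ in range(max_sweeps):
        change = 0.0
        for i in range(1, M - 1):
            for j in range(1, N - 1):
                num, den = u_n[i, j], 1.0
                for (l, m) in neighbours:
                    w = coeff[i, j] * g_edge[(l, m)][i, j]
                    num += w * u[i + l, j + m]       # latest available values (Gauss-Seidel)
                    den += w
                gauss_seidel = num / den
                new_val = (1.0 - omega) * u[i, j] + omega * gauss_seidel
                change += abs(new_val - u[i, j])
                u[i, j] = new_val
        if change < tol:                             # successive-sweep test (an assumption here)
            break
    return u
```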
2.2.2 Local Otsu thresholding

It has been shown in [33] that the traditional Otsu thresholding technique, which selects a threshold globally (global Otsu's method), works well for some shapes of macrophages. However, the global Otsu's method does not work for all macrophages if there is a wide range of macrophage image intensities. When cells have a huge variability of shapes, sizes, and intensities, local thresholding techniques can be a powerful segmentation tool [62]. We therefore apply Otsu's method in local windows to realize the benefits of both Otsu's method [63] and local thresholding techniques [62,64]. In the global Otsu's method, the two classes representing objects and background are first defined with the help of a general threshold value Tr. Then the optimal threshold is obtained by finding the particular threshold value T∗r that maximizes the between-class variance of the two classes. For the local Otsu's method, we calculate the optimal threshold in a window of size s × s centered at (i, j), for every pixel. In the local window Wi,j, the gray-level histogram is normalized and regarded as a probability distribution,

\[
p_r = \frac{n_r}{N}, \qquad \sum_{r=0}^{L} p_r = 1, \tag{12}
\]

where n_r is the number of pixels of intensity r in Wi,j, N = s², and L is the maximum image intensity. Then, the probabilities of background and foreground in Wi,j are given by

\[
\omega_0(T_{i,j}) = \sum_{r=0}^{T_{i,j}} p_r, \qquad \omega_1(T_{i,j}) = \sum_{r=T_{i,j}+1}^{L} p_r = 1 - \omega_0(T_{i,j}), \tag{13}
\]

and the means of background and foreground are given by

\[
\mu_0(T_{i,j}) = \frac{1}{\omega_0(T_{i,j})}\sum_{r=0}^{T_{i,j}} r\, p_r, \qquad \mu_1(T_{i,j}) = \frac{1}{\omega_1(T_{i,j})}\sum_{r=T_{i,j}+1}^{L} r\, p_r = \frac{\mu_{tot} - \mu_0(T_{i,j})\,\omega_0(T_{i,j})}{1 - \omega_0(T_{i,j})}, \tag{14}
\]

where \(\mu_{tot} = \sum_{r=0}^{L} r\, p_r\). Finally, the between-class variance, i.e., the variance between the foreground and background classes, related to the pixel (i, j) is defined as [63]

\[
\sigma_B^2(T_{i,j}) = \omega_0(T_{i,j})\big(\mu_0(T_{i,j}) - \mu_{tot}\big)^2 + \omega_1(T_{i,j})\big(\mu_1(T_{i,j}) - \mu_{tot}\big)^2, \tag{15}
\]

which simplifies to

\[
\sigma_B^2(T_{i,j}) = \frac{\big(\mu_{tot}\,\omega_0(T_{i,j}) - \mu_0(T_{i,j})\,\omega_0(T_{i,j})\big)^2}{\omega_0(T_{i,j})\big(1 - \omega_0(T_{i,j})\big)}, \tag{16}
\]

and the optimal threshold T∗_{i,j} is given by \(\sigma_B^2(T^*_{i,j}) = \max_{0 \le T_{i,j} < L} \sigma_B^2(T_{i,j})\).
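Equations 12–16 translate directly into the cumulative-sum form of Otsu's criterion; the sketch below computes the optimal threshold T∗_{i,j} for one local window, assuming an integer image already rescaled to [0, L] and clipping the window at the image border (a choice of this sketch).

```python
import numpy as np

def local_otsu_threshold(img, i, j, s, L=255):
    """Optimal Otsu threshold T*_{i,j} in an s x s window centred at (i, j),
    following Eqs. (12)-(16). img is an integer image with values in [0, L]."""
    r = s // 2
    window = img[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
    hist = np.bincount(window.ravel(), minlength=L + 1).astype(float)
    p = hist / hist.sum()                            # Eq. (12)
    omega0 = np.cumsum(p)                            # omega_0(T), Eq. (13)
    cum_mean = np.cumsum(np.arange(L + 1) * p)       # omega_0(T) * mu_0(T)
    mu_tot = cum_mean[-1]
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b2 = (mu_tot * omega0 - cum_mean) ** 2 / (omega0 * (1.0 - omega0))  # Eq. (16)
    sigma_b2[~np.isfinite(sigma_b2)] = 0.0           # thresholds where one class is empty
    return int(np.argmax(sigma_b2))                  # T*_{i,j}
```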