  • Copula Entropy: Theory and Applications

    Subjects: Mathematics >> Statistics and Probability Subjects: Statistics >> Mathematical Statistics Subjects: Information Science and Systems Science >> Basic Disciplines of Information Science and Systems Science submitted time 2024-05-22

    Abstract: Statistical independence is a core concept in statistics and machine learning. Representing and measuring independence are of fundamental importance in related fields. Copula theory provides the tool for representing statistical independence, while Copula Entropy (CE) provides the tool for measuring it. This paper first introduces the theory of CE, including its definition, theorem, properties, and estimation method. The theoretical applications of CE to structure learning, association discovery, variable selection, causal discovery, system identification, time lag estimation, domain adaptation, multivariate normality testing, two-sample testing, and change point detection are reviewed. The relationships between these theoretical applications and their connection to correlation and causality are discussed. The frameworks based on CE, the kernel method, and distance correlation for measuring statistical independence and conditional independence are compared. The advantage of CE-based methods over the other comparable methods is evaluated with simulated and real data. The applications of CE in theoretical physics, astrophysics, geophysics, theoretical chemistry, cheminformatics, materials science, hydrology, climatology, meteorology, environmental science, ecology, animal morphology, agronomy, cognitive neuroscience, motor neuroscience, computational neuroscience, psychology, systems biology, bioinformatics, clinical diagnostics, geriatrics, psychiatry, public health, economics, management, sociology, pedagogy, computational linguistics, mass media, law, political science, military science, informatics, energy, food engineering, architecture, civil engineering, transportation, manufacturing, reliability, metallurgy, chemical engineering, aeronautics and astronautics, weaponry, automobiles, electronics, communication, high performance computing, cybersecurity, remote sensing, oceanography, and finance are briefly introduced.
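
    To make the estimation step concrete, the sketch below illustrates the two-step nonparametric idea usually described for CE estimation: transform each margin to pseudo-observations by ranks (the empirical copula), then estimate CE as the negative mutual information of the transformed variables. The function name and the use of scikit-learn's kNN mutual-information estimator are illustrative choices made here, not the estimator implementation referenced in the paper.

    ```python
    import numpy as np
    from scipy.stats import rankdata
    from sklearn.feature_selection import mutual_info_regression

    def copula_entropy(x, y, random_state=0):
        """Rough bivariate CE estimate: rank-transform the margins to the
        empirical copula, then take CE = -MI of the pseudo-observations."""
        n = len(x)
        u = rankdata(x) / (n + 1.0)   # pseudo-observations in (0, 1)
        v = rankdata(y) / (n + 1.0)
        mi = mutual_info_regression(u.reshape(-1, 1), v,
                                    random_state=random_state)[0]
        return -mi                    # CE <= 0, with CE = 0 iff independent

    rng = np.random.default_rng(1)
    x = rng.standard_normal(2000)
    y = 0.8 * x + 0.6 * rng.standard_normal(2000)  # dependent pair, so CE < 0
    print(copula_entropy(x, y))
    ```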

  • Electromagnetic Fields of Moving Point Sources in the Vacuum

    Subjects: Information Science and Systems Science >> Basic Disciplines of Information Science and Systems Science Subjects: Physics >> Electromagnetism, Optics, Acoustics, Heat Transfer, Classical Mechanics, and Fluid Dynamics Subjects: Astronomy >> Astrophysical processes submitted time 2024-05-22

    Abstract: The electromagnetic fields of point sources with time-varying charges moving in the vacuum are derived using the Liénard-Wiechert potentials. The properties of the propagation velocities and the Doppler effect are discussed based on their far fields. The results show that the velocity of the electromagnetic waves and the velocity of the sources cannot be added like vectors; the velocity of the electromagnetic waves from moving sources is anisotropic in the vacuum; and the transverse Doppler shift is intrinsically included in the fields of the moving sources and is not a purely relativistic effect caused by time dilation. Since the fields are rigorous solutions of Maxwell's equations, the findings can help us abandon long-standing misinterpretations concerning classical mechanics and classical electromagnetic theory. Although it may violate the theory of special relativity, we show mathematically that, when the sources move faster than light in the vacuum, electromagnetic barriers and electromagnetic shock waves can be clearly predicted using the exact solutions. Since such sources cannot be detected by observers in the region outside their shock-wave zones, an intuitive and reasonable hypothesis can be made that superluminal sources may be considered a kind of electromagnetic black hole.
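
    For reference, the standard Liénard-Wiechert potentials and far (radiation) field of a point charge of constant magnitude $q$, which the preprint generalizes to time-varying charges, read

    $$\varphi(\mathbf r, t) = \frac{1}{4\pi\varepsilon_0}\left[\frac{q}{(1-\mathbf n\cdot\boldsymbol\beta)\,R}\right]_{\rm ret}, \qquad \mathbf A(\mathbf r, t) = \frac{\mu_0 c}{4\pi}\left[\frac{q\,\boldsymbol\beta}{(1-\mathbf n\cdot\boldsymbol\beta)\,R}\right]_{\rm ret},$$

    $$\mathbf E_{\rm rad} = \frac{q}{4\pi\varepsilon_0 c}\left[\frac{\mathbf n\times\big((\mathbf n-\boldsymbol\beta)\times\dot{\boldsymbol\beta}\big)}{(1-\mathbf n\cdot\boldsymbol\beta)^{3}\,R}\right]_{\rm ret}, \qquad \mathbf B_{\rm rad} = \frac{1}{c}\,\mathbf n\times\mathbf E_{\rm rad},$$

    where $R = |\mathbf r - \mathbf r_s(t_{\rm ret})|$, $\mathbf n = (\mathbf r - \mathbf r_s)/R$, $\boldsymbol\beta = \mathbf v/c$, and the bracketed quantities are evaluated at the retarded time $t_{\rm ret} = t - R/c$. These constant-charge expressions are only the familiar baseline; the fields derived in the preprint contain additional terms arising from the time-varying charge.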

  • The transition to compulsion in addiction: insights from personality traits, social factors, and neurobiology

    Subjects: Psychology >> Medical Psychology Subjects: Psychology >> Physiological Psychology submitted time 2024-05-20

    Abstract: Compulsion stands as a central symptom of drug addiction; however, only a small fraction of drug users exhibit compulsive characteristics. Differences observed between Sign-trackers (ST) and Goal-trackers (GT) during Pavlovian conditioning may shed light on individual variance in drug addiction. Here, we focus on the behavioral attributes, formation processes, and neural mechanisms underlying ST and how they drive addiction towards compulsivity in humans. We will explore addiction at three interconnected levels: individual personality traits, social factors, and neurobiology. Furthermore, we distinguish between the processes of sensitization and habituation within ST. These nuanced distinctions across various aspects of addiction will contribute to our understanding of the addiction development process and the formulation of targeted preventive strategies.

  • Analysis of Multi-collaboration Model of Large-scale Reading Promotion Projects in the United Kingdom

    Subjects: Library Science,Information Science >> Readers Work Subjects: Library Science,Information Science >> Library Science all over the world submitted time 2024-05-20

    Abstract: The multi-collaboration implementation model is one of the main factors in the success of large-scale reading promotion projects in the UK. Multiple subjects such as government, social organizations, public libraries, enterprises, and the public establish cooperative relationships and networks with different roles and functions, and form characteristic and effective cooperative mechanisms. The model has the characteristics and advantages of cross-border cooperation, partnership, dual core, professionalism, extensiveness and adaptability, as well as potential problems of looseness and instability. China can learn from its experience in innovating and improving the government’s guidance and support methods, establishing the core status of public libraries’ multi-cooperation, actively absorbing social funds and making full use of social resources, promoting cross-domain cooperation, and establishing a standardized and stable cooperation mechanism.

  • Non-perturbative corrections to the planetary perturbation equation

    Subjects: Astronomy >> Celestial Mechanics Subjects: Physics >> Geophysics, Astronomy, and Astrophysics submitted time 2024-05-16

    Abstract: This paper presents a systematic improvement in celestial dynamics theory by introducing a new symmetric form of the particle dynamics equation. For open multi-body systems, the new symmetric equation can be applied in any translational reference frame, thus avoiding the need for inertial-reference-frame approximations and enhancing the accuracy of theoretical predictions. In the case of bound multi-body systems, applying the new symmetric equation allows for an extremely simplified one-step derivation of the planetary perturbation equation. Furthermore, by considering temporary thrust or impact forces acting on planets, or even arbitrary external forces acting on the bound system, to further enhance the computational precision, a new correction equation is established for the planetary perturbation equation that can accommodate non-perturbative interactions. This will assist in predicting the trajectories of asteroids affected by external forces and in accurately calculating the orbits of satellites.
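
    For context, the classical planetary perturbation equation (heliocentric coordinates, a single perturbing body of mass $m'$) that the preprint re-derives and then corrects is

    $$\ddot{\mathbf r} = -\frac{G(M_\odot + m)}{r^{3}}\,\mathbf r + G m'\left(\frac{\mathbf r' - \mathbf r}{|\mathbf r' - \mathbf r|^{3}} - \frac{\mathbf r'}{r'^{3}}\right),$$

    where $\mathbf r$ and $\mathbf r'$ are the heliocentric positions of the planet and the perturber; the first term in the parentheses is the direct attraction and the second is the indirect term from the perturber's pull on the Sun. The non-perturbative correction terms for thrust, impact, or other external forces introduced in the preprint are additions to this baseline and are not reproduced here.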

  • Turing’s thinking machine and ’t Hooft’s principle of superposition of states

    Subjects: Physics >> General Physics: Statistical and Quantum Mechanics, Quantum Information, etc. Subjects: Computer Science >> Other Disciplines of Computer Science submitted time 2024-05-14

    Abstract: In his 1950 paper \cite{Turing1950}, Turing proposed the notion of a thinking machine, namely a machine that can think. But a thinking machine has to follow a certain law of physics, provided it is realized physically. In this paper, we show that Turing's thinking machine necessarily obeys 't Hooft's principle of superposition of states, presented by 't Hooft \cite{Hooft2016} in 2016, which goes beyond the usual principle as described by Dirac \cite{Dirac1958} in conventional quantum mechanics. Precisely, Turing's thinking machine must be a quantum machine, while 't Hooft's principle characterizes its thinking behavior in a probabilistic way.

  • "Decision on public goods can only be made by a public authority" is a corollary of Arrow's Impossibility Theorem

    Subjects: Management Science >> Management Theory Subjects: Library Science,Information Science >> Utilization of Information Subjects: Other Disciplines >> Synthetic discipline submitted time 2024-05-12

    Abstract: Purpose/Significance Arrow's Impossibility Theorem (AIT) was stated and proved by Kenneth Joseph Arrow, one of the winners of the Nobel Prize in Economics in 1972. The New Economic and Financial Dictionary defines the statement that a reasonable strategic decision on public goods can only be made by a competent public authority as a corollary of AIT. This corollary relationship has been widely circulated on the Internet and taught in classrooms, and has been established as a correct conclusion alongside AIT. However, this corollary relationship is not rigorous, or even wrong, and would lead the study of public choice, public economics, welfare economics, administrative jurisprudence, and other disciplines in our country astray. Therefore, it is necessary to clarify the misdirection arising from the theory and steer the relevant disciplines in the right direction. Method/Process The literature examination method was employed to review the origin and development of this corollary relationship and to clarify the connotation and extension of the related concepts in different disciplines, as well as the evolution logic of the relationships among them. Result/Conclusion The relevant literature shows that this statement was first associated with AIT in the article Administrative Compulsion and the Realization of the Public Interest. Later, it was excerpted as a definition of AIT in the New Economic and Financial Dictionary and became a corollary of AIT. If the statement were a corollary of AIT, then, as long as AIT holds, decisions on public goods could only be imposed or dictated by a public authority. But in this evolution process, the connotations of public goods and public authority have changed across disciplines, and the relation between them has evolved, so taking the statement as a corollary of AIT is not rigorous. AIT indeed causes public-goods decision-making difficulties, but studies in the field of public choice do not conclude that the dilemma of decision-making on the supply of public goods, arising from AIT, should be left to the imposition or dictatorship of public authorities. On the contrary, they are committed to the institutional design of voting mechanisms in order to ensure that the supply of public goods is decided by the population.

  • FXR activation remodels hepatic and intestinal transcriptional landscapes in non-alcoholic steatohepatitis

    Subjects: Pharmaceutical Science >> Other Disciplines Subjects: Medicine, Pharmacy >> Pharmacology submitted time 2024-05-10

    Abstract: The progression of simple steatosis to non-alcoholic steatohepatitis (NASH) has emerged as a significant health concern. The activation of FXR shows promise in countering this transition and its detrimental consequences. However, the specific alterations within the NASH-related transcriptional network remain elusive, hindering the development of more precise and effective therapeutic strategies. Through a comprehensive analysis of liver RNA-seq data from human and mouse NASH samples, we identified central perturbations within the NASH-associated transcriptional network, including disrupted cellular metabolism and mitochondrial function, decreased tissue repair capability, and increased inflammation and fibrosis, thus shedding light on the complex molecular mechanisms underlying NASH progression. By employing integrated transcriptome profiling of mice treated with diverse FXR agonists, FXR liver-specific knockout mice, and publicly available human datasets, we determined that hepatic FXR activation effectively ameliorated NASH by reversing the dysregulated metabolic and inflammatory networks implicated in NASH pathogenesis. This mitigation encompassed resolving fibrosis, reducing immune infiltration, and creating an immune microenvironment that mirrors the positive trends observed in clinical disease progression. By understanding the core regulatory network of FXR, which is directly correlated with disease severity and treatment response, we identified approximately one-third of the patients who could potentially benefit from FXR agonist therapy. A similar analysis involving intestinal RNA-seq data from FXR agonist-treated mice and FXR intestine-specific knockout mice revealed that intestinal FXR activation attenuates intestinal inflammation and shows promise in attenuating hepatic inflammation and fibrosis. Collectively, our study uncovers the intricate pathophysiological features of NASH at a transcriptional level and highlights the complex interplay between FXR activation and both NASH progression and regression. These findings contribute to precise drug development, utilization, and efficacy evaluation, ultimately aiming to improve patient outcomes.

  • Ultra-low-noise transimpedance amplifier with a single HEMT in pre-amplifier for measuring shot noise in cryogenic STM

    Subjects: Engineering and technical science >> Physics Related Engineering and Technology Subjects: Engineering and technical science >> Technology of Instrument and Meter Subjects: Electronics and Communication Technology >> Electron Technology Subjects: Physics >> Interdisciplinary Physics and Related Areas of Science and Technology submitted time 2024-05-06

    Abstract: In this work, a design of a transimpedance amplifier (TIA) for a cryogenic scanning tunneling microscope (CryoSTM) is proposed. The TIA together with the tip-sample component in the CryoSTM is referred to as the CryoSTM-TIA. With a transimpedance gain of 1 Gohm, the bandwidth of the CryoSTM-TIA is larger than 200 kHz. The distinctive feature of the proposed CryoSTM-TIA is that its pre-amplifier is made of a single cryogenic high electron mobility transistor (HEMT), so the apparatus equivalent input noise current power spectral density at 100 kHz is lower than 6 (fA)2/Hz. In addition, a bias-cooling method can be used to control in situ the density of the frozen DX- centers in the HEMT doping area, changing its structure to reduce the device noise. With this apparatus, fast scanning tunneling spectroscopy measurements with high energy resolution can be performed. It is also capable of measuring scanning tunneling shot noise spectra (STSNS) at the atomic scale for various quantum systems, even if the shot noise is very low. It provides a powerful tool to investigate novel quantum states by measuring STSNS, such as detecting the existence of Majorana bound states in topological quantum systems.

  • Low-noise large-bandwidth high-gain transimpedance amplifier for cryogenic STM at 77 K

    Subjects: Electronics and Communication Technology >> Electron Technology Subjects: Engineering and technical science >> Technology of Instrument and Meter Subjects: Physics >> Interdisciplinary Physics and Related Areas of Science and Technology Subjects: Physics >> Condensed Matter: Electronic Structure, Electrical, Magnetic, and Optical Properties submitted time 2024-05-06

    Abstract: In this work, we design and fabricate a transimpedance amplifier (TIA) following the design described in Ref. 1. In the TIA, the pre-amplifier (Pre-Amp) is made of a junction field effect transistor (JFET) that can work at 77 K. The post-amplifier (Post-Amp) is made of an operational amplifier. The Pre-Amp and Post-Amp are cascaded to form the inverting amplifier. With a 1.13 Gohm feedback network, the gain of the TIA is 1.13 Gohm and its bandwidth is about 97 kHz. The equivalent input noise voltage power spectral density of the TIA is not more than 9 (nV)2/Hz at 10 kHz and 4 (nV)2/Hz at 50 kHz, and its equivalent input noise current power spectral density is about 26 (fA)2/Hz at 10 kHz and 240 (fA)2/Hz at 50 kHz. The measured transport and noise performance of the TIA are consistent with the simulations and calculations. As an example, the realization of the TIA in this work verifies the design method and analytical calculations for the low-noise large-bandwidth high-gain TIA proposed in Refs. 1 and 2. The TIA in this work is well suited for a cryogenic STM working at liquid nitrogen temperature.
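
    As a rough consistency check of the quoted numbers (not the authors' full analysis, which also involves the open-loop gain and input capacitance of the amplifier), a single-pole estimate of the bandwidth set by the feedback network alone implies a sub-femtofarad feedback capacitance:

    ```python
    import math

    # Single-pole estimate: f_3dB ~ 1 / (2*pi*Rf*Cf).
    # Rf and f_3dB are the values quoted in the abstract; Cf is inferred here
    # and is not a value reported by the authors.
    Rf = 1.13e9      # feedback resistance, ohm (1.13 Gohm)
    f_3db = 97e3     # reported bandwidth, Hz

    Cf = 1.0 / (2.0 * math.pi * Rf * f_3db)
    print(f"implied total feedback capacitance ~ {Cf * 1e15:.2f} fF")  # ~1.45 fF
    ```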

  • Protective Effect of IGFBP-3 Protein on Heavy Ion Radiation Induced Injury in Mice

    Subjects: Physics >> Nuclear Physics Subjects: Biology >> Radiobiology submitted time 2024-05-06

    Abstract: Manned spaceflight and nuclear technology applications are developing rapidly in China today, and radiation and nuclear safety will remain a major national demand in the long term. Thus, the continued identification of new molecular targets for radiation protection and of related drugs is of great value. Our previous study found that circulating Insulin-like Growth Factor Binding Protein 3 (IGFBP-3) showed a significant increase after total-body exposure of mice to ionizing radiation. However, the function of IGFBP-3 and the effects of its level change on radiation-induced damage are still unclear. In this study, we set up Igfbp3 gene overexpression and knock-down cell models in mouse Kupffer cells (MKC). The CCK-8 assay, EdU assay, clone formation assay and microsphere phagocytosis experiment were performed to investigate the proliferation activity, DNA replication activity and phagocytic ability of the different cell models after carbon-ion irradiation. Moreover, mice were injected via the tail vein with recombinant IGFBP-3 protein 2 hours before 5 Gy carbon-ion irradiation, and survival curves of the mice were drawn. The results showed that overexpression of IGFBP-3 protein significantly alleviated the radiation-induced decrease in DNA replication activity, cell viability, clone formation rate, and phagocytic ability of MKC cells. On the contrary, knock-down of IGFBP-3 protein expression further reduced these measures. Injection of IGFBP-3 protein before carbon-ion exposure significantly delayed the time of death in mice. Our results indicate, at the cellular and animal levels, that IGFBP-3 protein has the potential to reduce radiation-induced damage and serve as a target for radiation protection. Enhancing the radiation resistance and phagocytic ability of Kupffer cells in mice, thereby reducing the risk of infection after radiation exposure, might be the underlying mechanism of the radioprotective effects of IGFBP-3.

  • The statistical analysis methods for extremely unbalanced data in GWAS

    Subjects: Medicine, Pharmacy >> Preventive Medicine and Hygienics Subjects: Statistics >> Biomedical Statistics submitted time 2024-05-06

    Abstract: Extremely unbalanced data here refers to datasets where the values of independent or dependent variables exhibit severe imbalances in proportions, such as extremely unbalanced case-control ratios, very low disease incidence rates, heavily censored survival data, and low-frequency or rare genetic variants. In such scenarios, test statistics in classical statistical methods, such as logistic regression and Cox proportional hazards models, may deviate from the normal or chi-square assumptions, leading to difficulties in controlling type I error. With the increasing availability and exploration of resources from large-scale population cohorts in genome-wide association studies, there is a growing demand for efficient and accurate statistical approaches to handle extremely unbalanced data in independent and non-independent samples. To address this need, this paper provides a systematic methodological overview. Firstly, it derives the test statistics of the classical statistical methods. Secondly, it elucidates the impact of extremely unbalanced data on the distribution of the test statistics. Thirdly, it introduces two widely used methods for correcting the statistics in genome-wide association studies: the Firth correction and the saddlepoint approximation. Finally, it briefly introduces commonly used software for extremely unbalanced genomic data. This paper provides theoretical references and application recommendations for the statistical analysis of extremely unbalanced data.
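
    As an illustration of the first correction mentioned above, the sketch below implements Firth-penalized logistic regression by Newton-Raphson with the hat-matrix adjustment to the score. It is a minimal teaching version of the standard Firth formulation, not one of the production tools the paper surveys (packages such as logistf in R, or SPAtest/SAIGE for saddlepoint-based tests, are examples of the kind of software meant by the last sentence).

    ```python
    import numpy as np

    def firth_logistic(X, y, n_iter=100, tol=1e-8):
        """Firth-penalized logistic regression via Newton-Raphson.

        X : (n, p) design matrix (include an intercept column yourself).
        y : (n,) array of 0/1 outcomes.
        """
        n, p = X.shape
        beta = np.zeros(p)
        for _ in range(n_iter):
            pi = 1.0 / (1.0 + np.exp(-(X @ beta)))   # fitted probabilities
            W = pi * (1.0 - pi)                      # working weights
            XtWX = X.T @ (X * W[:, None])            # Fisher information
            XtWX_inv = np.linalg.inv(XtWX)
            # hat-matrix diagonal: h_i = w_i * x_i' (X'WX)^{-1} x_i
            h = W * np.einsum('ij,jk,ik->i', X, XtWX_inv, X)
            # Firth-modified score adds h_i * (1/2 - pi_i) to each residual
            U = X.T @ (y - pi + h * (0.5 - pi))
            step = XtWX_inv @ U
            beta = beta + step
            if np.max(np.abs(step)) < tol:
                break
        return beta

    # Toy extremely unbalanced example: 5000 controls, 25 cases.
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(5025), rng.standard_normal(5025)])
    y = np.concatenate([np.zeros(5000), np.ones(25)])
    print(firth_logistic(X, y))
    ```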

  • Construction and performance test of charged particle detector array for MATE

    Subjects: Physics >> Nuclear Physics Subjects: Nuclear Science and Technology >> Nuclear Detection Technology and Nuclear Electronics submitted time 2024-04-28

    Abstract: A charged particle array, named MATE-PA, which serves as an auxiliary detector system to the Multi-purpose Active-target Time projection chamber for nuclear astrophysical and exotic beam Experiments (MATE), has been constructed. The array is composed of twenty single-sided strip silicon detectors, covering around 10% of the solid angle. It is dedicated to the detection of reaction-induced charged particles which penetrate the MATE active volume. The performance of MATE-PA has been experimentally studied using an alpha source, and a 36-MeV 14N beam injected into the chamber of MATE, filled with a gas mixture of 95% 4He and 5% CO2 at a pressure of 500 mbar, at the Radioactive Ion Beam Line in Lanzhou (RIBLL). The results demonstrate good separation of light charged particles with the forward double-layer silicon detectors of MATE-PA. The energy resolution of the Si detectors was deduced to be about 1% (σ) for an energy loss of about 10 MeV by the α particles. The inclusion of MATE-PA helps improve particle identification and increases the dynamic range for the kinetic energy of charged particles, in particular that of α particles up to about 15 MeV.

  • Radio AGN Selection and Characterization in Three Deep-Drilling Fields of the Vera C. Rubin Observatory Legacy Survey of Space and Time

    Subjects: Other Disciplines submitted time 2024-04-25

    Abstract: The Australia Telescope Large Area Survey (ATLAS) and the VLA survey in the XMM-LSS/VIDEO deep field provide deep ($\approx 15$ ${\mu}$Jy beam$^{-1}$) and high-resolution ($\approx$ 4.5--8 arcsec) radio coverage of the three XMM-SERVS fields (W-CDF-S, ELAIS-S1, and XMM-LSS). These data cover a total sky area of 11.3 deg$^2$ and contain $\approx 11000$ radio components. Furthermore, about 3~deg$^2$ of the XMM-LSS field also has deeper MIGHTEE data that achieve a median RMS of 5.6 ${\mu}$Jy beam$^{-1}$ and detect more than 20000 radio sources. We analyze all these radio data and find source counterparts at other wavebands utilizing deep optical and IR surveys. The nature of these radio sources is studied using radio-band properties (spectral slope and morphology) and the IR-radio correlation. Radio AGNs are selected and compared with those selected using other methods (e.g. X-ray). We found 1656 new AGNs that were not selected using X-ray and/or MIR methods. We constrain the FIR-to-UV SEDs of radio AGNs using {\sc cigale} and investigate the dependence of the radio AGN fraction upon galaxy stellar mass and star-formation rate.

  • The X-ray spectral and variability properties of typical radio-loud quasars

    Subjects: Other Disciplines submitted time 2024-04-25

    Abstract: We present X-ray spectral and long-term variability analyses of an unbiased sample of 361 optically selected radio-loud quasars (RLQs) utilizing sensitive serendipitous X-ray data from the Chandra and XMM-Newton archives. The spectral and temporal properties of RLQs are compared with those of radio-quiet quasars (RQQs) matched in $L_\mathrm{2500A}$ and $z$. The median power-law photon index ($\Gamma$) of RLQs is $1.84^{+0.01}_{-0.01}$, which is close to that of matched RQQs ($1.90^{+0.02}_{-0.01}$). No significant correlations between $\Gamma$ and radio-loudness, $L_\mathrm{x}/L_\mathrm{x,rqq}$ (the X-ray luminosity over that expected from the $L_\mathrm{x}$-$L_\mathrm{uv}$ relation for RQQs), redshift, or Eddington ratio are found for our RLQs. The stacked X-ray spectra of our RLQs show strong iron-line emission and a possible Compton-reflection hump. The intrinsic X-ray variability amplitude is $\approx40$% for RLQs on timescales of months-to-years in the rest frame, which is somewhat smaller than for the matched RQQs ($\approx60$%) on similar timescales, perhaps due to the larger black-hole masses and lower Eddington ratios in our RLQ sample. The X-ray spectral and variability results for our RLQs generally support the idea that the X-ray emission of typical RLQs is dominated by the disk/corona, as is also indicated by a recent luminosity correlation study.

  • The $\alpha_{\rm ox}$--HeII EW Connection in Radio-Loud Quasars

    Subjects: Other Disciplines submitted time 2024-04-25

    Abstract: Radio-loud quasars (RLQs) are known to produce excess X-ray emission, compared to radio-quiet quasars (RQQs) of the same luminosity, commonly attributed to jet-related emission. Recently, we found that the HeII EW and $\alpha_{\rm{ox}}$ in RQQs are strongly correlated, which suggests that their extreme-ultraviolet (EUV) and X-ray emission mechanisms are tightly related. Using 48 RLQs, we show that steep-spectrum radio quasars (SSRQs) and low radio-luminosity ($L_{\rm R}$) flat-spectrum radio quasars (FSRQs) follow the $\alpha_{\rm ox}$--HeII EW relation of RQQs. This suggests that the X-ray and EUV emission mechanisms in these types of RLQs are the same as in RQQs and are not jet related. High-$L_{\rm R}$ FSRQs show excess X-ray emission given their HeII EW by a factor of $\approx$ 3.5, which suggests that only in this type of RLQ is the X-ray production likely jet related.

  • The $L_\mathrm{x}$-$L_\mathrm{uv}$-$L_\mathrm{radio}$ relation and corona-disk-jet connection in optically selected radio-loud quasars

    Subjects: Other Disciplines submitted time 2024-04-25

    Abstract: Radio-loud quasars (RLQs) are more X-ray luminous than predicted by the X-ray-optical/UV relation (i.e. $L_\mathrm{x}\propto L_\mathrm{uv}^\gamma$) for radio-quiet quasars (RQQs). The excess X-ray emission depends on the radio-loudness parameter ($R$) and radio spectral slope ($\alpha_\mathrm{r}$). We construct a uniform sample of 729 optically selected RLQs with high fractions of X-ray detections and $\alpha_\mathrm{r}$ measurements. We find that steep-spectrum radio quasars (SSRQs; $\alpha_\mathrm{r}\le-0.5$) follow a quantitatively similar $L_\mathrm{x}\propto L_\mathrm{uv}^\gamma$ relation to that for RQQs, suggesting a common coronal origin for the X-ray emission of both SSRQs and RQQs. However, the corresponding intercept of SSRQs is larger than that for RQQs and increases with $R$, suggesting a connection between the radio jets and the configuration of the accretion flow. Flat-spectrum radio quasars (FSRQs; $\alpha_\mathrm{r}>-0.5$) are generally more X-ray luminous than SSRQs at given $L_\mathrm{uv}$ and $R$, likely involving more physical processes. The emergent picture is different from that commonly assumed, where the excess X-ray emission of RLQs is attributed to the jets. We thus perform model selection to critically compare these different interpretations, which prefers the coronal scenario with a corona-jet connection. A distinct jet component is likely important for only a small portion of FSRQs. The corona-jet, disk-corona, and disk-jet connections of RLQs are likely driven by independent physical processes. Furthermore, the corona-jet connection implies that small-scale processes in the vicinity of SMBHs, probably associated with the magnetic flux/topology instead of black-hole spin, are controlling the radio-loudness of quasars.
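
    For readers unfamiliar with the notation, the relation in question is log-linear, $\log L_\mathrm{x} = \gamma\,\log L_\mathrm{uv} + \mathrm{const}$. The toy fit below uses synthetic numbers and ordinary least squares purely to illustrate what the slope and intercept mean here; the paper's analysis additionally handles X-ray non-detections and the dependence on $R$ and $\alpha_\mathrm{r}$.

    ```python
    import numpy as np

    # Synthetic toy fit of log Lx = gamma * log Luv + const (all values invented).
    rng = np.random.default_rng(2)
    log_luv = rng.uniform(29.5, 32.5, 500)                      # fake log L_uv
    log_lx = 0.6 * log_luv + 8.0 + rng.normal(0.0, 0.3, 500)    # fake log L_x

    gamma, const = np.polyfit(log_luv, log_lx, 1)
    print(f"gamma = {gamma:.2f}, intercept = {const:.2f}")
    ```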

  • Using Leaked Power to Measure Intrinsic AGN Power Spectra of Red-Noise Time Series

    Subjects: Other Disciplines submitted time 2024-04-25

    Abstract: Fluxes emitted at different wavebands from active galactic nuclei (AGNs) fluctuate on both long and short timescales. The variation can typically be characterized by a broadband power spectrum, which exhibits a red-noise process at high frequencies. The standard method of estimating the power spectral density (PSD) of AGN variability is easily affected by systematic biases such as red-noise leakage and aliasing, in particular when the observation spans a relatively short period and is gapped. Focusing on the high-frequency PSD, which is strongly distorted by red-noise leakage and usually not significantly affected by aliasing, we develop a novel and observable normalized leakage spectrum (NLS), which sensitively describes the effects of leaked red-noise power on the PSD at different temporal frequencies. Using Monte Carlo simulations, we demonstrate how an AGN's underlying PSD sensitively determines the NLS when there is severe red-noise leakage and thereby how the NLS can be used to effectively constrain the underlying PSD.
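
    A minimal simulation of the leakage effect itself (not the authors' NLS construction): generate a long red-noise light curve with the Timmer & Koenig (1995) recipe, then compare the periodogram of the full series with that of a short segment, whose high-frequency slope is flattened by leaked low-frequency power. The PSD slope and segment length below are arbitrary illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def timmer_koenig(n, dt, slope=-2.5):
        """Evenly sampled red-noise series with PSD ~ f**slope (Timmer & Koenig 1995)."""
        freqs = np.fft.rfftfreq(n, d=dt)
        psd = np.zeros_like(freqs)
        psd[1:] = freqs[1:] ** slope
        re = rng.standard_normal(len(freqs)) * np.sqrt(0.5 * psd)
        im = rng.standard_normal(len(freqs)) * np.sqrt(0.5 * psd)
        im[0] = 0.0
        if n % 2 == 0:
            im[-1] = 0.0               # Nyquist component must be real
        return np.fft.irfft(re + 1j * im, n=n)

    def periodogram(x, dt):
        """Unnormalized periodogram of an evenly sampled series."""
        freqs = np.fft.rfftfreq(len(x), d=dt)
        power = np.abs(np.fft.rfft(x - x.mean())) ** 2
        return freqs[1:], power[1:]

    lc = timmer_koenig(2 ** 16, dt=1.0)
    f_long, p_long = periodogram(lc, 1.0)          # recovers roughly the input slope
    f_short, p_short = periodogram(lc[:256], 1.0)  # short segment: leakage flattens it
    ```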

  • The role of executive functioning components in the relationship between family socioeconomic status and mathematical abilities: A longitudinal study

    Subjects: Psychology >> Developmental Psychology Subjects: Psychology >> Educational Psychology submitted time 2024-04-23

    Abstract: As a crucial component of cognitive function, mathematical ability plays an essential role in an individual's future development. Previous studies have highlighted significant differences in this ability between children from high and low family socioeconomic backgrounds. Executive functioning is the most reliable factor in explaining this disparity. However, fundamental questions remain about the mediating role of executive functioning in this relationship: (1) the role of specific subcomponents of executive functioning in this relationship; (2) the differences in the impact of socioeconomic status on various mathematical abilities through these executive functioning subcomponents; and (3) the variations in the role of executive functioning in the relationship between socioeconomic status and both current and future mathematical abilities. Accordingly, our study explored the impact of socioeconomic status on second and third graders' mathematical operations, logical reasoning, and spatial imagination abilities, as well as the mediating roles of interference inhibition, response inhibition, and working memory. A total of 185 second-grade students were followed for 20 months across two assessments. At the beginning of second grade, children were assessed on their working memory through forward and backward digit span tasks; their interference inhibition was assessed with the Stroop task; and their response inhibition was assessed via the Go/No-go task. Children's socioeconomic status was assessed using their parents' educational levels, occupational status, and the Family Affluence Scale. The Chinese Rating Scale of Pupil's Mathematics Abilities, which included subtests for addition, subtraction, number sequence, length estimation, and cube counting, was used to assess the children's mathematical calculation, logical thinking, and spatial imagination abilities. We explored the main effects of socioeconomic status on children's current and future mathematical abilities using structural equation modeling and simultaneously developed multiple mediation models to investigate how executive functioning components mediate these relationships. The results indicated that the three types of mathematical abilities in second graders showed significant improvement over the 20-month period; socioeconomic status in the second grade directly predicted mathematical abilities at the same grade level; and socioeconomic status could indirectly predict mathematical calculation abilities in the second grade and logical thinking abilities in the third grade through the mediating role of working memory. The present study extends previous research on the mediating role of executive functioning between socioeconomic status and mathematical ability, demonstrating that working memory is a crucial cognitive factor contributing to the explanation of this mechanism. It provides a scientific basis for educational and research professionals to develop interventions aimed at enhancing the mathematical abilities of children from lower socioeconomic backgrounds.

  • SteganoDDPM: A high-quality image steganography self-learning method using diffusion model

    Subjects: Computer Science >> Information Security Subjects: Computer Science >> Computer Application Technology submitted time 2024-04-23

    Abstract: Image steganography has become a focal point of interest for researchers due to its capacity for the covert transmission of sensitive data. Traditional diffusion models often struggle with image steganography tasks involving paired data, as their core principle of gradually removing noise is not directly suited for maintaining the correspondence between carrier and secret information. To address this challenge, this paper conducts an in-depth analysis of the principles behind diffusion models and proposes a novel framework for an image steganography diffusion model. The study begins by mathematically representing the steganography tasks of paired images, introducing two optimization objectives: minimizing the secrecy leakage function and embedding distortion function. Subsequently, it identifies three key issues that need to be addressed in paired image steganography tasks and, through specific constraint mechanisms and optimization strategies, enables the diffusion model to effectively handle paired data. This enhances the quality of the generated stego-images and resolves issues such as image clarity. Finally, on public datasets like CelebA, the proposed model is compared with existing generation model-based image steganography techniques, analyzing its implementation effects and performance parameters. Experimental results indicate that, compared to current technologies, the model framework proposed in this study not only improves image quality but also achieves significant enhancements in multiple performance metrics, including the imperceptibility and anti-detection capabilities of the images. Specifically, the PSNR of its stego-images reaches 93.14 dB, and the extracted images' PSNR reaches 91.23 dB, an approximate improvement of 30% over existing technologies; the attack success rate is reduced to 2.4×10^-38. These experimental outcomes validate the efficacy and superiority of the method in image steganography tasks.
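
    Since the quality numbers above are PSNR values, the following lines show how that metric is computed between a cover/stego or secret/extracted image pair; this is only the standard definition, not the paper's evaluation pipeline.

    ```python
    import numpy as np

    def psnr(a, b, max_val=255.0):
        """Peak signal-to-noise ratio (dB) between two images of the same shape."""
        a = np.asarray(a, dtype=np.float64)
        b = np.asarray(b, dtype=np.float64)
        mse = np.mean((a - b) ** 2)
        return np.inf if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)
    ```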