Fernando Ramiro-Manzano
The wave is considered a paradigm in dance and connects bodily expression with nature. Although wave concepts such as propagation and phase have proven to be powerful tools for dance analysis, many aspects of bodily expression, including partner dance, have been investigated using numerical approaches and neural networks. Complementarily, compact analytical models have been especially successful for describing human motion, particularly gait. Here, we leverage wave-physics concepts to provide a comprehensive wave-based and oscillatory analytical characterization of expressive motion in partner dance. We apply this framework to Bachata Sensual, a dance style in which the wave is the leitmotif. We analyse three dance couples (Phase I) performing five movement sequences and one composite. The sequences exhibit multiple wave phenomena, from time-dependent interference to the generation-like emergence of harmonics. Within this wave-physics perspective, the formalism can be viewed as a choreographic motion notation. As an illustrative acoustic analogy, harmonic components extracted under boundary conditions can be mapped to audible frequencies, forming musical dyads. Within certain limits and not rigidly constrained by body morphology, modal response can be tuned to underpin fluid motion, adapting across musical timescales and movement patterns. Overall, this wave-physics notation highlights connections between partner-dance expressivity and harmonic nature.
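As a rough rendering of the acoustic analogy, the Python sketch below maps a slow body-wave fundamental and one harmonic component into the audible range by octave doubling and reads off the resulting dyad. The 1 Hz fundamental, the 3:2 component, and the octave-mapping rule are illustrative assumptions, not the paper's actual procedure.

```python
def to_audible(f_hz, lo=220.0):
    """Shift a frequency up by octaves until it is audible; octave shifts
    preserve pitch class, so the dyad's character survives (up to inversion)."""
    while f_hz < lo:
        f_hz *= 2.0
    return f_hz

# Hypothetical body-wave components: 1 Hz fundamental plus a 3:2 harmonic.
f0, f1 = 1.0, 1.5
a0, a1 = to_audible(f0), to_audible(f1)
print(f"{a0:.0f} Hz + {a1:.0f} Hz -> ratio {a1 / a0:.2f} (a 3:2 perfect-fifth dyad)")
```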
Konrad Szocik, Abraham Loeb
Recent work on the Loeb Scale has provided astronomy with a structured framework for assessing anomalous interstellar objects, including a quantitative classification ranking, its evolution as data accumulate, and a broader observational strategy for firming up its verdict. What remains unclear is the epistemic and methodological meaning of the threshold built into that framework. Here we argue that the central philosophical issue is no longer whether astronomy can define such a threshold, but how a threshold already in place should regulate scientific inquiry under uncertainty. We suggest that candidate technosignature status, such as Level 4 on the Loeb Scale, should be understood as an intermediate epistemic status: stronger than permissive openness, weaker than confirmation, yet sufficient to justify methodological escalation. The argument proceeds in three steps. First, it reconstructs the recent philosophical debate through the work of Lomas, Lane, and Cowie. Second, it turns to historical cases discussed by Kaplan (2026) to show that important discoveries are often delayed not only by weak evidence, but also by paradigms, prestige, and institutional filtering. Third, it interprets candidate status as a form of structured scientific commitment under uncertainty, one that justifies intensified observation, broader hypothesis management, and more deliberate allocation of attention and resources without licensing belief in artificial origin. The paper concludes by arguing that AI should not be the arbiter in inferring an extraterrestrial origin, but can support the detection, comparison, and prioritization of anomalies once candidate status has been formally recognized.
Liangzhu Leon Wang, Huiheng Liu, Honghao Fu, Zhipeng Deng, Bing Dong, Naiping Gao
Quantum computing is a new approach to computation that utilizes superposition, entanglement, interference, and tunneling to solve problems too complex for classical computers. This paper discusses the basic concepts and development of quantum computing, exploring its potential applications in the built environment and urban microclimate research. In buildings, quantum computing may help optimize energy management, control HVAC systems, and plan electric vehicle charging networks more efficiently. For urban microclimates, it could accelerate renewable energy planning and support multi-objective design, making it easier to balance urban building performance with climate conditions. Since current quantum hardware is still in the Noisy Intermediate-Scale Quantum (NISQ) stage, we propose the "BITE" principle to guide researchers in choosing suitable problems for quantum acceleration: B (Big search), I (Input-light), T (Tiny computation), and E (Evaluation polish). Although quantum computing still faces challenges such as noise and hardware limits, it offers great potential for developing more climate-resilient, sustainable, and energy-efficient cities of the future.
Celia Blanco, Jacob Haqq-Misra, George Profitiliotis
How long a technological civilization remains active, and what determines whether it collapses or persists, is a central question for both projecting humanity's future and assessing the prevalence of detectable intelligence in the galaxy. We model collapse-recovery dynamics across ten plausible futures for Earth-originating civilization using a hybrid deterministic-stochastic simulation over a 1000-year window. The duty cycle, defined as the fraction of its total lifespan that a civilization is technologically active, ranges from ~0.38 to 1.00, with trajectory outcomes shaped by the interplay of governance structure, resource pressure, and hazard exposure. Several model parameters map onto actionable resilience levers, and modest improvements can qualitatively alter long-term trajectories. Sensitivity analysis reveals that the resource depletion rate and the post-collapse recovery fraction are consistently the most impactful levers across scenarios, suggesting that reducing resource consumption may be at least as important as mitigating existential hazards for avoiding civilizational collapse. We discuss implications for Earth's civilizational resilience and for the search for extraterrestrial technosignatures. We also derive an effective detectability duration that accounts for intermittent civilizational activity, and show that the apparent absence of extraterrestrial signals may reflect the prevalence of low-duty-cycle civilizations rather than the rarity of intelligent life.
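A minimal sketch of how a duty cycle and an effective detectability duration could be computed, assuming a toy two-state (active/collapsed) Markov model; the per-year probabilities below are placeholders, not the paper's calibrated hybrid deterministic-stochastic model.

```python
import random

def simulate_duty_cycle(years=1000, p_collapse=0.005, p_recover=0.02, seed=0):
    """Toy two-state (active/collapsed) model of a civilization. The paper's
    model couples governance, resource pressure, and hazard exposure; here two
    per-year probabilities stand in for all of that."""
    rng = random.Random(seed)
    active, active_years = True, 0
    for _ in range(years):
        if active:
            active_years += 1
            if rng.random() < p_collapse:
                active = False
        elif rng.random() < p_recover:
            active = True
    return active_years / years

# Effective detectability duration: nominal lifespan scaled by the duty cycle.
duty = simulate_duty_cycle()
print(f"duty cycle ~ {duty:.2f}; effective detectable span ~ {1000 * duty:.0f} yr")
```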
J. Antonio del Río Portilla, Argelia Balbuena-Ortega, Anabel López-Ortiz, Jorge Alberto Tenorio, Nicté Yasmín Luna-Medina, Patricio Javier Valadés-Pelayo, Federico del Río-Portilla, Mayra León-Santiago, Alfonso Valiente-Banuete
This paper explores the electrification of mezcal distilling in Oaxaca, Mexico, as a sustainable alternative to traditional firewood methods. We investigate the mezcal process, including cooking, grinding, fermentation, and distillation, and propose a photovoltaic system for distillation. The research also includes scientific outreach activities in the producing communities. We, in collaboration with the communities, proposed novel uses of renewable energies. The results of chemical analysis (chromatography and FTIR) and sensory data for distillation using firewood and electricity are presented to compare the mezcal produced with solar energy and traditional mezcal. Our studies conclude that electrical distillation can reduce environmental impact and improve energy efficiency without compromising product quality.
Naoki Seto
We present the first observational test of the hybrid ring strategy, a general coordinated signaling scheme proposed by Seto (2025), which provides a practical Schelling-point realization for interstellar signaling. We use the exceptionally bright GRB 221009A as the anchoring flash for the scheme, together with the accurately measured distance to the Galactic center. This combination provides a high-precision relation linking sky position to a tightly constrained arrival-time window. TESS observed the region around the GRB nearly continuously for $\sim$50 days in 2024, providing survey light curves that enable a direct test of this scheme with sharply predicted arrival-time windows of $\sim$3.4 days. Among 58 carefully selected stars, we identify two that show noticeable single-time-bin brightenings inside their predicted windows (where each time bin corresponds to a 200 s integrated TESS exposure). In both cases the brightenings coincide with excursions in at least one nearby star and are therefore most consistent with instrumental origins. This test demonstrates that the hybrid ring strategy is practical with existing survey data and could serve as a promising basis for future technosignature searches.
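For intuition, the sketch below computes the generic SETI-ellipsoid arrival delay in the distant-source (plane-wave) approximation; the hybrid ring refinement of Seto (2025), which folds in the Galactic-center distance, is not reproduced here, and the star distance, angle, and fractional distance uncertainty are illustrative numbers only.

```python
import math

LY_PER_DAY = 1.0 / 365.25  # light travels 1/365.25 ly per day

def ellipsoid_delay_days(d_star_ly, theta_deg):
    """Delay (after the GRB flash reaches Earth) of a signal from a star that
    re-transmits immediately upon seeing the GRB, in the distant-source
    (plane-wave) approximation: delay = (d/c) * (1 - cos(theta))."""
    return d_star_ly * (1.0 - math.cos(math.radians(theta_deg))) / LY_PER_DAY

# The fractional distance uncertainty sets the arrival-time window width.
d_ly, theta, frac_err = 300.0, 10.0, 0.01
delay = ellipsoid_delay_days(d_ly, theta)
print(f"delay ~ {delay:.0f} d, window ~ +/-{delay * frac_err:.1f} d")
```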
William Yicheng Zhu, Lei Zhu
The recent, super-exponential scaling of autonomous Large Language Model (LLM) agents signals a broader, fundamental paradigm shift: from machines replacing human hands (manual labor and mechanical processing) to machines standing in for human minds (cognition, reasoning, and intention). The uncontrolled offloading and scaling of "thinking" itself, beyond humanity's limited but efficient biological capacity, has profound consequences for humanity's heat balance sheet, since thinking, or intelligence, carries thermodynamic weight. The Earth has already surpassed the heat dissipation threshold required for long-term ecological stability, and projections based on empirical data reveal a concerning trajectory: without radical structural intervention, anthropogenic heat accumulation will breach critical planetary ecological thresholds in less than 6.5 years, even under the most ideal scenario in which the Earth Energy Imbalance (EEI) holds constant. In this work, we identify six factors from artificial intelligence that influence the global heat dissipation rate and delineate how their interplay drives society toward one of four broad macroscopic trajectories. We propose that the integration of artificial intelligence and its heat dissipation into the planetary system constitutes the tenth planetary boundary (9+1). The core empirical measurement of this boundary is the net-new waste heat generated by exponential AI growth, balanced against AI's impact on reducing economic and societal inefficiencies and thus baseline anthropogenic waste heat emissions. We demonstrate that managing AI scaling lacks a moderate middle ground: it will either accelerate the breach of critical planetary thermodynamic thresholds, or it will serve as the single most effective lever for stabilizing the other nine planetary boundaries, thereby safeguarding human civilization's survival.
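A back-of-the-envelope sketch of the kind of projection described, assuming simple exponential growth of waste heat; the initial power, growth rate, and threshold below are hypothetical placeholders, not the paper's empirical calibration.

```python
import math

def years_to_breach(w0, growth, threshold):
    """Years until waste heat w0 * (1 + growth)^t crosses a threshold,
    assuming simple exponential growth."""
    return math.log(threshold / w0) / math.log(1.0 + growth)

# Placeholder figures (current waste heat in TW, annual AI-driven growth rate,
# assumed critical dissipation threshold in TW); they only illustrate the
# shape of the projection.
print(f"breach in ~{years_to_breach(w0=50.0, growth=0.26, threshold=200.0):.1f} yr")
```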
Yue Li, Xu Pan, Kaiyuan Guo
Project Daedalus (1973--1978), the most detailed interstellar probe design study ever conducted, specified a 9 mm beryllium erosion shield to protect the spacecraft payload during its 5.9 light-year cruise to Barnard's Star at 12% of the speed of light. This design, however, predated both the isolation of two-dimensional materials and the development of graph neural network (GNN) property predictors. Here, we systematically screen 20 candidate materials--spanning conventional aerospace metals, transition metal dichalcogenides, and ultra-high-temperature ceramics--using density functional theory (DFT) data from the JARVIS database (76,000 materials) with independent validation by the Atomistic Line Graph Neural Network (ALIGNN). We evaluate candidates across four criteria: specific mechanical stiffness (KV/rho), sputtering resistance, thermal neutron absorption cross-section, and thermodynamic stability. Our screening identifies hexagonal boron nitride (h-BN) and boron carbide (B4C) as dual-function materials offering simultaneous mechanical protection and neutron radiation shielding, and we propose a graphene/h-BN/polymer layered heterostructure shield design that achieves an estimated 47% mass reduction relative to the original beryllium specification. These findings will become immediately actionable upon the successful development of fusion pulse propulsion, which we note remains an outstanding engineering challenge.
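A minimal sketch of the multi-criteria screen as a pass/fail filter; the property values and thresholds below are illustrative stand-ins, not JARVIS DFT data or ALIGNN predictions.

```python
# Property values are placeholders, not JARVIS DFT data or ALIGNN
# predictions; thresholds are likewise illustrative.
candidates = [
    {"name": "Be",   "KV_over_rho": 70, "sputter_ok": False, "sigma_n_barn": 0.008, "stable": True},
    {"name": "h-BN", "KV_over_rho": 95, "sputter_ok": True,  "sigma_n_barn": 767.0, "stable": True},
    {"name": "B4C",  "KV_over_rho": 90, "sputter_ok": True,  "sigma_n_barn": 600.0, "stable": True},
]

def passes(m, min_stiffness=80, min_sigma=None):
    """Accept materials that are stiff per unit mass, sputter-resistant, and
    thermodynamically stable; optionally require strong thermal-neutron
    absorption (a high cross-section is desirable for a shield)."""
    ok = m["KV_over_rho"] >= min_stiffness and m["sputter_ok"] and m["stable"]
    if min_sigma is not None:
        ok = ok and m["sigma_n_barn"] >= min_sigma
    return ok

print([m["name"] for m in candidates if passes(m, min_sigma=100.0)])  # ['h-BN', 'B4C']
```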
Nan Li, Shiyin Shen
The cosmological principle states that the universe is uniform and does not favor any specific position or direction. However, research by \cite{Shen2025} has revealed that the universe demonstrates a notable inclination towards parity-odd states. Furthermore, it remains uncertain whether the universe also favors prime numbers. In this study, we examine the largest available catalogs of galaxy groups to investigate this hypothesis. Specifically, we assess whether the number of galaxies within a galaxy group or cluster is more likely to be a prime number. Our results strongly suggest that the universe does indeed have a preference for prime numbers, with findings exceeding the 4.1 sigma significance threshold. This insight explains why the Primes consistently triumph over Unicorn. Consequently, it may be necessary to consider revising the cosmological principle in the context of a higher-dimensional feature space. Moreover, our research pioneers a connection between the Riemann Zeta function and cosmology, paving the way for the development of Cosmozetaology.
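In the spirit of the exercise, a toy version of the statistical test: count prime-richness groups and compare against a crude resampling null. The richness list and the ±1 perturbation null are inventions for illustration; the paper's catalogs and null model are not reproduced.

```python
import math, random

def is_prime(n):
    return n >= 2 and all(n % k for k in range(2, math.isqrt(n) + 1))

def prime_excess_sigma(richness, trials=1000, seed=42):
    """Observed fraction of prime-richness groups vs. a crude null that
    perturbs each group's richness by -1, 0, or +1."""
    rng = random.Random(seed)
    frac = lambda xs: sum(map(is_prime, xs)) / len(xs)
    obs = frac(richness)
    null = [frac([n + rng.choice((-1, 0, 1)) for n in richness]) for _ in range(trials)]
    mu = sum(null) / trials
    sd = (sum((x - mu) ** 2 for x in null) / trials) ** 0.5
    return (obs - mu) / sd

# Invented group richness values -- not the catalog data.
richness = [3, 5, 8, 7, 11, 4, 13, 9, 17, 6, 19, 23, 12, 29, 5, 7, 31, 10, 37, 41]
print(f"prime excess ~ {prime_excess_sigma(richness):.1f} sigma")
```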
Michael B. Lund
The high frequency of satellite launches, particularly over the last few years, has been a subject of significant concern, especially regarding the future of observational astronomy, the stability of low Earth orbits, and environmental impacts. We call attention to the insufficiently addressed silver lining of this looming satellite cloud. If the high rates of satellite launches continue as we model, we can expect the solar flux received by the Earth to decrease significantly in the relatively near future. We address how this decrease in flux could provide a solution to another major problem, anthropogenic climate change. This would allow us to solve one problem with another problem as early as late March 2031.
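For scale, a crude geometric sketch of the claimed dimming, assuming the total satellite cross-section is spread over Earth's sunlit disk and a hypothetical doubling-per-year launch model; none of the numbers are the paper's.

```python
import math

R_EARTH = 6.371e6  # m

def blocked_fraction(n_sats, area_m2=25.0):
    """Crude geometric bound: total satellite cross-section over the area of
    Earth's sunlit disk (shell altitude and orbital geometry ignored)."""
    return n_sats * area_m2 / (math.pi * R_EARTH ** 2)

# Hypothetical launch model: population doubling yearly from 10,000 satellites.
n, year = 1e4, 2025
while blocked_fraction(n) < 0.01:  # 1% dimming, an arbitrary threshold
    n, year = n * 2.0, year + 1
print(f"~1% solar dimming around {year}, with {n:.1e} satellites")
```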
Abhinna Sundar Samantaray, Finnja Annika Fluhrer, Dhruv Saini, Omkar Charaple, Anish Kumar Singh, Dhruv Vansraj Rathore
Astrology has long been used to interpret human personality, estimate compatibility, and guide social decision-making. Zodiac-based systems in particular remain culturally influential across much of the world, including in South Asian societies where astrological reasoning can shape marriage matching, naming conventions, ritual timing, and broader life planning. Despite this persistence, astrology has never established either a physically plausible mechanism or a statistically reliable predictive foundation. In this work, we examine zodiac-based personality prediction using a controlled machine-learning framework. We construct a synthetic dataset in which individuals are assigned zodiac signs and personality labels drawn from a shared pool of 100 broadly human traits. Each sign is associated with a subset of 10 common descriptors, intentionally overlapping with those assigned to other signs, thereby reproducing the ambiguity characteristic of practical astrological systems. We then train Logistic Regression, Random Forest, and neural-network classifiers to infer personality labels from zodiac-based features and nuisance covariates. Across all experiments, predictive performance remains at or near random expectation, while shuffled-label controls yield comparable accuracies. We argue that the apparent success of astrology arises not from measurable predictive structure, but from trait universality, category overlap, cognitive biases such as the Barnum effect and confirmation bias, and the interpretive flexibility of astrologers and pundits. We conclude that zodiac-based systems do not provide reliable information for predicting human behavior and instead function as culturally durable narrative frameworks. This paper is intended as a humorous academic exercise.
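A compressed sketch of the experimental design, assuming scikit-learn and a reduced label space (10 classes instead of the full 100-trait pool); only one of the three model families is shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 3000
zodiac = rng.integers(0, 12, n)        # one of 12 signs per individual
X = np.eye(12)[zodiac]                 # one-hot zodiac feature
y = rng.integers(0, 10, n)             # personality label, independent of sign
# By construction the feature carries no information about the label, mirroring
# traits drawn from a pool shared across signs.

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=5).mean()
acc_shuffled = cross_val_score(clf, X, rng.permutation(y), cv=5).mean()
print(f"accuracy {acc:.3f} vs shuffled-label control {acc_shuffled:.3f} (chance 0.100)")
```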
David R. Rice, Michael J. Radke
Exoplanet atmospheres are usually discussed as tracers of climate, chemistry, and habitability, but they may also preserve signatures of planetary defense. We consider three folklore-motivated deterrents against monsters: reduced organosulfur gases as anti-hematophage repellents, argentiferous reflective aerosols as anti-lycanthropic countermeasures, and haline aerosols as a counting problem for specters. We show that globally-mixed garlic-smelly levels of DMS/DMDS could produce observable mid-infrared transmission features, that silver hazes would show up as anomalous optical brightening, and that sea-salt lofting sustained by strong near-surface winds appears as muted spectra. None of these signatures is unique, which is precisely the observational challenge. A defended world may first appear merely sulfur-rich, bright, or hazy. Therefore, some atmospheres may encode not only biosignatures, but also evidence that the local biosphere has stopped being afraid of the dark.
Sindhunil Barman Roy
Classical Marxism and the algebra of revolution were formulated within the ontological constraints of 19th-century Newtonian materialism: a world of discrete, predictable, billiard-ball interactions. However, the 20th-century transitions in physics, from Thomas Kuhn's paradigm shifts to Phil Anderson's philosophy of emergence, have dismantled the reductionist foundations of this mechanical worldview. This paper proposes a New Manifesto for Scientific Socialism by synthesizing modern condensed matter physics with the non-dual philosophy of Advaita Vedanta. By examining the concepts of Geometric Frustration and Competing Interactions through the lens of Spin-Glasses and Mott Insulators, we argue that social stasis and synthesis are emergent properties of a universal consciousness field rather than mechanical inevitabilities. We further explore how this quantum-informed dialectic resolves the essential tension between the individual and the collective, echoing the intuitions of Schrödinger and Heisenberg regarding the foundational unity of reality.
Jacob Haqq-Misra
This paper develops a two-parameter matrix that can be used to describe four general strategies in the search for technosignatures. The first parameter is domain accessibility: can the technosignature be accessed within the spatial domain accessible to us today? The second parameter is recognizability: would the technosignature be recognizable to us if discovered today? This yields a matrix with four options that each comprise different search strategies. "Exploration" is the strategy for technosignatures that are accessible within our domain and recognizable, which includes radio and optical signals that have reached Earth and any artifacts that might be identifiable within the solar system. "Expansion" is the strategy for technosignatures that are recognizable but beyond our spatial domain, which includes diffuse technology elements that may exist in nearby systems but could not be remotely observed from Earth. "Evolution" is the strategy for technosignatures that are accessible within our domain but unrecognizable; this would require advances in sensory perception, technological or biological, before such technosignatures could be discovered. Finally, "Existence" is the strategy for technosignatures that are neither within our domain nor recognizable. The implications of these four options are discussed with relevance to the Fermi paradox and strategies for searching for technosignatures.
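The two-parameter matrix reduces to a four-entry lookup; a minimal rendering, with naming taken directly from the abstract:

```python
# (accessible within our spatial domain, recognizable today) -> strategy.
STRATEGY = {
    (True,  True):  "Exploration",  # accessible and recognizable
    (False, True):  "Expansion",    # recognizable but beyond our domain
    (True,  False): "Evolution",    # accessible but not yet recognizable
    (False, False): "Existence",    # neither
}

print(STRATEGY[(True, False)])  # -> Evolution
```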
David Awad
We present an empirical argument against the existence of single-timeline backward time travel, using the price behavior of prediction markets. If rational agents could travel backward in time, binary prediction contracts would converge to degenerate prices (0 or 1) immediately upon market formation. We observe no such behavior across large datasets of resolved contracts. This yields a directly falsifiable prediction and sharpens prior economic arguments while avoiding reliance on physical experimentation. The argument requires only the existence of a single profit-motivated agent in the future capable of interacting with markets along a closed timelike curve intersecting the market's spacetime location. We further argue that such agents would have no incentive to conceal trades in causally inert events, where outcomes are independent of market prices, implying that any such activity would be visible in aggregate price behavior. While many-worlds interpretations evade this test, we argue that only single-timeline models are empirically falsifiable, and prediction-market evidence is inconsistent with their existence.
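A minimal sketch of the falsifiable prediction as a test statistic, run here on synthetic stand-in data (no real market prices are used): the opening-price degeneracy fraction should be near 1 under single-timeline backward travel and near 0 otherwise.

```python
import random

def degenerate_at_open(price_paths, eps=0.01):
    """Fraction of contracts whose opening price already sits at 0 or 1. A
    backward-traveling, profit-motivated agent would push prices to the
    resolved outcome immediately, so this should be ~1 under the hypothesis."""
    return sum(p[0] <= eps or p[0] >= 1 - eps for p in price_paths) / len(price_paths)

# Synthetic stand-in for resolved binary contracts: opening prices near 0.5.
rng = random.Random(1)
paths = [[min(1.0, max(0.0, 0.5 + rng.gauss(0, 0.02)))] for _ in range(10_000)]
print(f"degenerate at open: {degenerate_at_open(paths):.4f}")  # ~0, as observed
```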
Maria Gritsevich, Marcin Wesołowski, Josep M. Trigo-Rodríguez, Alberto J. Castro-Tirado, Jorma Ryske, Markku Nissinen, Peter Carson
A quantitative understanding of cometary outbursts requires robust constraints on the size distribution of ejected particles, which governs outburst dynamics and underpins estimates of released gas and dust. In the absence of direct measurements of particle sizes, assumptions about the size distribution play a central role in modelling dust-trail formation, their dynamical evolution and observability, and the potential production of meteor showers following encounters with Earth. We analyse brightness amplitude variations associated with outbursts of comet 17P/Holmes from 1892 to 2021, with particular emphasis on the exceptional 2007 mega-outburst. During this event the comet underwent a rapid and substantial brightening: at its peak, the expanding coma reached a diameter exceeding that of the Sun and briefly became the largest object in the Solar System visible to the naked eye. We constrain the size distribution and total mass of porous agglomerates composed of ice, organics, and dust ejected during the outburst. The inferred particle size distribution is consistent with a power law of index q, yielding effective particle sizes between 10^-6 m for q = 4 and 5 x 10^-3 m for q = 2. Accounting for effective particle size, sublimation flux, and bulk density, we find that the total number of ejected particles increases with both q and sublimation flux. These results place constraints on the physical properties of outburst ejecta and provide physically motivated initial conditions for long-term dust-trail evolution modelling. They further indicate that cometary outburst brightness is determined primarily by the number of particles and their size distribution, rather than by the total ejected mass alone, with direct implications for the origin and evolution of meteoroid streams and the interplanetary dust population.
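A sketch of how an effective particle size can follow from a power-law size distribution dn/da proportional to a^(-q). The cross-section-weighted definition and the integration bounds (10^-7 to 10^-2 m) are assumptions chosen for illustration; with them, the q = 2 and q = 4 endpoints land near the quoted values.

```python
import math

def effective_size(q, a_min, a_max):
    """Cross-section-weighted mean radius for dn/da proportional to a^(-q):
    a_eff = integral of a^(3-q) da / integral of a^(2-q) da over [a_min, a_max]."""
    def moment(p):
        if abs(p + 1) < 1e-12:         # p = -1: the integral is logarithmic
            return math.log(a_max / a_min)
        return (a_max ** (p + 1) - a_min ** (p + 1)) / (p + 1)
    return moment(3 - q) / moment(2 - q)

for q in (2.0, 3.0, 4.0):
    print(f"q = {q}: a_eff ~ {effective_size(q, 1e-7, 1e-2):.1e} m")
```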
Pertti O. Tikkanen
I present a reanalysis of temperature data from a publicly available certified laboratory report that documented the self-discharging behavior of an energy-storage device over 10 days. Graphs of the temperature variations of both the tested device and the test chamber (fume hood) were given mainly for monitoring, without further analysis, and variations in the ambient temperature signal were attributed to "other cells being cycled simultaneously in the same fume hood". I show that the ambient temperature signal alone -- together with some quite mild and reasonable assumptions -- allows us to extract previously unpublished information about the test run simultaneously on the other cells: 1) the number of charge/discharge cycles, 2) the cycle period, 3) the charge/discharge half-cycle asymmetry, and -- most significantly -- 4) evidence that the mentioned "other device" completed 338 full charge/discharge cycles at a 3C rate at room temperature without any detectable thermal degradation signature.
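A sketch of the kind of signal-recovery step involved, assuming the cycling imprints a periodic component on the ambient record; the synthetic signal, sampling rate, and period below are hypothetical, not the report's data.

```python
import numpy as np

def dominant_cycles(temps, dt_s):
    """Recover the dominant period of a periodic ambient-temperature signal
    from its FFT peak, then count cycles over the record length."""
    x = np.asarray(temps) - np.mean(temps)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=dt_s)
    f_peak = freqs[1:][np.argmax(spec[1:])]  # skip the DC bin
    period = 1.0 / f_peak
    return period, len(x) * dt_s / period

# Synthetic 10-day record, 60 s sampling; the 2556 s period is chosen so the
# recovered count lands near the reported 338 cycles -- not the report's data.
t = np.arange(0, 10 * 86400, 60.0)
temps = 0.3 * np.sin(2 * np.pi * t / 2556.0) + np.random.default_rng(0).normal(0, 0.05, t.size)
period, n_cycles = dominant_cycles(temps, 60.0)
print(f"period ~ {period:.0f} s, ~{n_cycles:.0f} cycles in 10 days")
```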
Alice Gasparini, Florian Stern, Marine Delaval, Andreas Müller
According to the literature, mobile devices as experimental tools (MDET) can offer educational benefits by creating authentic, real-life contexts for physics learning, enhancing student motivation through the use of familiar technology, and supporting cognitive processes by providing multiple representations of phenomena. However, concerns have been raised about potential distractions and cognitive overload. Given these conflicting perspectives, few empirical studies are available on the impact of MDET in real classroom settings of regular, full-length physics courses with a non-specialized high-school target group. We present a study of a mechanics course in such a setting, addressing the tight curricular, material, and practical constraints inherent to it. A quasi-experimental pre-post design comparing a treatment group using MDET with a control group without (same content, lesson plan, and teachers) was used. The 19-week teaching sequence focused on conceptual learning and motivational outcomes, controlled for several predictor variables. Findings reveal substantial pre-post learning gains for both groups (Cohen's d = 0.9) and small gains in perceived relation to reality (d = 0.29), but no significant differences between treatments were found, indicating that MDET do not outperform conventional teaching under the given constraints. Moreover, no evidence of negative effects such as distraction or cognitive overload was observed, and little to no interaction with predictors such as gender or prior knowledge was found. In conclusion, MDET show considerable potential as an effective option for integrating technology into teaching, offering learning outcomes comparable to those of successful conventional teaching, but not better.
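For reference, one common way to compute the reported effect size, assuming the pooled-SD convention; the scores below are hypothetical and merely chosen to land near d = 0.9.

```python
import statistics as st

def cohens_d(pre, post):
    """Pre-post effect size: mean gain over the pooled SD of the two
    measurements (one common convention; the study's exact estimator is not
    stated in the abstract)."""
    sd_pooled = ((st.stdev(pre) ** 2 + st.stdev(post) ** 2) / 2) ** 0.5
    return (st.mean(post) - st.mean(pre)) / sd_pooled

# Hypothetical concept-test scores, chosen only to land near d = 0.9.
pre  = [6, 8, 7, 9, 5, 10, 8, 7, 6, 9]
post = [7, 9, 8, 10, 7, 11, 9, 9, 8, 11]
print(f"Cohen's d ~ {cohens_d(pre, post):.2f}")
```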
B. Zuckerman
One of the most interesting questions that astronomy can hope to answer is: are we alone in our Milky Way galaxy? A detection of an electromagnetic (EM) signal generated by an extraterrestrial technological intelligence (ETI), or the presence in our solar system of an alien probe, would answer this question in the negative. Purposeful interstellar communication is a two-way street: the transmitting and receiving technological intelligences (TI) both need to do their part. As the receiving TI, our EM search programs should incorporate a model of what a transmitting TI is likely to be doing. Published searches for extraterrestrial technological intelligence (SETI) have generally not done so and, thus, have often been sub-optimally designed. We propose an improved search technique that corresponds more closely to astronomical surveys undertaken for reasons unrelated to SETI. Published non-SETI radio and optical surveys are sufficiently extensive that they already supply meaningful constraints on the prevalence of nearby, purposely communicative alien civilizations. Purposeful communication can also include the sending of spaceships (probes). The absence of evidence for alien probes in the solar system suggests that no alien civilization has passed within 100 light-years of Earth during the past few billion years.
Gilles Montambaux
On April 27, 1900, William Thomson, better known as Lord Kelvin, delivered a visionary speech before the Royal Institution of Great Britain. In it, he presented two unresolved problems which, to him, appeared fundamental and unavoidable at the turn of the 20th century. He compared them to two clouds obscuring our understanding of physics. Dissipating these two clouds would eventually require the development of special relativity and quantum mechanics. This article revisits the second cloud which, contrary to what is often claimed in the literature, did not concern black-body radiation, but rather the specific heat of polyatomic molecules. To clarify this, the article aims to place Kelvin's speech within the historical context of the time and to situate it within the sequence of developments, from Kirchhoff to the first Solvay Conference in 1911, that marked the path of the extraordinary intellectual adventure that led to the birth of quantum mechanics. It will also be shown that Max Planck's initial motivation was not to solve the problem of the so-called "ultraviolet catastrophe."