Xiaodan Shao, Xiaoming Chen, Rundong Jia
Grant-free random access is a promising protocol to support massive access in beyond fifth-generation (B5G) cellular Internet-of-Things (IoT) networks with sporadic traffic. Specifically, in each coherence interval, the base station (BS) performs joint activity detection and channel estimation (JADCE) before data transmission. Due to the deployment of a large-scale antenna array and the presence of a huge number of IoT devices, JADCE usually has high computational complexity and requires long pilot sequences. To address these challenges, this paper proposes a dimension reduction method, which projects the original device state matrix onto a low-dimensional space by exploiting its sparse and low-rank structure. Then, we develop an optimized design framework with a coupled full-column-rank constraint for JADCE that reduces the size of the search space. However, the resulting problem is non-convex and highly intractable, and conventional convex relaxation approaches are inapplicable. To this end, we propose a logarithmic smoothing method for the non-smooth objective function and transform the matrix of interest into a positive semidefinite matrix, followed by a Riemannian trust-region algorithm to solve the problem in the complex field. Simulation results show that the proposed algorithm is efficient for large-scale JADCE problems and requires shorter pilot sequences than state-of-the-art algorithms that only exploit the sparsity of the device state matrix.
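As a hedged illustration of the sparsity side of this problem (this is not the paper's Riemannian algorithm; all sizes, names, and parameters below are invented for the sketch), a row-sparse device state matrix can be recovered from short pilot measurements with a simple group-thresholded iterative scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
n_devices, n_pilots, n_antennas = 100, 40, 8   # illustrative sizes only
n_active = 5

# Row-sparse device state matrix: a row is nonzero iff that device is active
X = np.zeros((n_devices, n_antennas))
active = rng.choice(n_devices, n_active, replace=False)
X[active] = rng.standard_normal((n_active, n_antennas))

A = rng.standard_normal((n_pilots, n_devices)) / np.sqrt(n_pilots)  # pilot matrix
Y = A @ X                                                           # noiseless received signal

# ISTA with row-wise (group) soft thresholding exploits the joint sparsity
step = 1.0 / np.linalg.norm(A, 2) ** 2
lam = 0.05
Xh = np.zeros_like(X)
for _ in range(1000):
    G = Xh - step * A.T @ (A @ Xh - Y)                   # gradient step
    norms = np.linalg.norm(G, axis=1, keepdims=True)
    Xh = np.maximum(1 - step * lam / np.maximum(norms, 1e-12), 0) * G  # shrink rows

detected = np.where(np.linalg.norm(Xh, axis=1) > 0.5)[0]
print(sorted(detected) == sorted(active))
```

In this easy noiseless regime the recovered row support matches the true active set, which is exactly the activity-detection half of JADCE; the nonzero rows themselves are the channel estimates.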
Xiaodan Shao, Xiaoming Chen, Caijun Zhong, Zhaoyang Zhang
Millimeter-wave (mmW)/Terahertz (THz) wideband communication employing a large-scale antenna array is a promising technique for the sixth-generation (6G) wireless network to realize massive machine-type communications (mMTC). To reduce the access latency and the signaling overhead, we design a grant-free random access scheme based on joint active device detection and channel estimation (JADCE) for mmW/THz wideband massive access. In particular, by exploiting the simultaneously sparse and low-rank structure of mmW/THz channels with spreads in the delay-angular domain, we propose two multi-rank aware JADCE algorithms by applying the quotient geometry of the product of complex rank-$L$ matrices, where $L$ is the number of clusters. It is proved that the proposed algorithms require a smaller number of measurements than the currently known bounds on measurements for conventional simultaneously sparse and low-rank recovery algorithms. Statistical analysis also shows that the proposed algorithms converge linearly to the ground truth with low computational complexity. Finally, extensive simulation results confirm the superiority of the proposed algorithms in terms of the accuracy of both activity detection and channel estimation.
Xiaoming Chen, Chau Yuen, Zhaoyang Zhang
In this paper, we consider a multi-antenna system where the receiver harvests energy from the transmitter via wireless energy transfer to support its wireless information transmission. In order to maximize the harvested energy, we propose to perform adaptive energy beamforming according to the instantaneous channel state information (CSI). To help the transmitter obtain the CSI for energy beamforming, we further propose a win-win CSI quantization feedback strategy that improves the efficiency of both power and information transmission. The focus of this paper is the tradeoff between wireless energy and information transfer, controlled by adjusting the transfer durations under a total duration constraint. By revealing the relationship among transmit power, transfer duration, and feedback amount, we derive two wireless energy and information transfer tradeoff schemes that maximize an upper bound and an approximate lower bound of the average information transmission rate, respectively. Moreover, the impact of imperfect CSI at the receiver is investigated, and the corresponding wireless energy and information transfer tradeoff scheme is also given. Finally, numerical results validate the effectiveness of the proposed schemes.
Jian Chen, Xiaoming Chen, Tao Liu, Lei Lei
In this paper, we address the problem of energy-efficient power allocation for secure communications in an amplify-and-forward (AF) large-scale multiple-input multiple-output (LS-MIMO) relaying system in the presence of a passive eavesdropper. The benefits of an AF LS-MIMO relay are exploited to significantly improve the secrecy performance, especially the secrecy energy efficiency (bits per Joule). We first analyze the impact of the transmit power at the relay on the secrecy outage capacity, and prove that the secrecy outage capacity is a concave function of the transmit power under very practical assumptions, i.e., no eavesdropper channel state information (CSI) and imperfect legitimate CSI. Then, we propose an energy-efficient power allocation scheme to maximize the secrecy energy efficiency. Finally, simulation results validate the advantage of the proposed energy-efficient scheme over the capacity maximization scheme.
Xiaoming Chen, Zhaoyang Zhang, Caijun Zhong, Derrick Wing Kwan Ng
This paper aims to provide a comprehensive solution for the design, analysis, and optimization of a multiple-antenna non-orthogonal multiple access (NOMA) system for multiuser downlink communication in both time division duplex (TDD) and frequency division duplex (FDD) modes. First, we design a new framework for multiple-antenna NOMA, including user clustering, channel state information (CSI) acquisition, superposition coding, transmit beamforming, and successive interference cancellation (SIC). Then, we analyze the performance of the considered system, and derive exact closed-form expressions for average transmission rates in terms of transmit power, CSI accuracy, transmission mode, and channel conditions. To further enhance the system performance, we optimize three key parameters, i.e., transmit power, feedback bits, and transmission mode. In particular, we propose a low-complexity joint optimization scheme, so as to fully exploit the potential of multiple-antenna techniques in NOMA. Moreover, through asymptotic analysis, we reveal the impact of system parameters on average transmission rates, and hence present some guidelines for the design of multiple-antenna NOMA. Finally, simulation results validate our theoretical analysis, and show that a substantial performance gain can be obtained over traditional orthogonal multiple access (OMA) technology under practical conditions.
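The superposition-coding and SIC steps mentioned above follow a standard power-domain pattern, which the following toy sketch illustrates (not the paper's beamformed multi-antenna scheme; a noiseless two-user BPSK example with an invented power split):

```python
import numpy as np

# Two-user power-domain NOMA: superpose BPSK symbols with unequal power;
# the near (strong-channel) user applies successive interference cancellation (SIC).
rng = np.random.default_rng(1)
bits_far = rng.integers(0, 2, 8)    # weak-channel user is allocated more power
bits_near = rng.integers(0, 2, 8)
p_far, p_near = 0.8, 0.2            # power split (sums to 1)

x = np.sqrt(p_far) * (2 * bits_far - 1) + np.sqrt(p_near) * (2 * bits_near - 1)

# Near user: decode the far user's dominant layer first, subtract it, then decode its own
far_hat = (x > 0).astype(int)                       # sign decision on the stronger layer
residual = x - np.sqrt(p_far) * (2 * far_hat - 1)   # cancel the decoded layer
near_hat = (residual > 0).astype(int)

print(np.array_equal(far_hat, bits_far), np.array_equal(near_hat, bits_near))
```

Because sqrt(0.8) > sqrt(0.2), the sign of the superposed symbol is always decided by the far user's layer, so both decisions are exact in this noiseless setting; with noise and fading, the power split and decoding order are what the paper's optimization tunes.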
Xiaoming Chen, Xiumin Wang
In this paper, we present a space-time-frequency joint block coding (STFBC) scheme to exploit the essential space-time-frequency degrees of freedom of multiuser MISO-MC-CDMA systems. Specifically, we use a series of orthogonal random codes to spread the space-time code over several sub-carriers to obtain multi-diversity gains, while multiuser parallel transmission is applied over the same sub-carriers by making use of multiple orthogonal code channels. Furthermore, to improve the system performance, we apply linear precoding to the predetermined orthogonal STFBC, including transmit direction selection and power allocation over these directions. We propose a precoder design method that makes use of channel statistical information in the time domain based on the Kronecker correlation model for the channels, so that the feedback amount can be greatly reduced in multi-carrier systems. In addition, we give performance analyses from the perspectives of diversity order and coding gain, respectively. Moreover, through asymptotic analysis, we derive some simple precoder design methods that still guarantee good performance. Finally, numerical results validate our theoretical claims.
Qi Wang, Xiaoming Chen, Qiao Qi, Mili Li, Wolfgang Gerstacker
Integrated sensing and communication (ISAC) and ubiquitous connectivity are two usage scenarios of sixth-generation (6G) networks. In this context, low earth orbit (LEO) satellite constellations, as an important component of 6G networks, are expected to provide ISAC services across the globe. In this paper, we propose a novel dual-function LEO satellite constellation framework that simultaneously realizes information communication for multiple user equipments (UEs) and location sensing for a target of interest with the same hardware and spectrum. In order to improve both the information transmission rate and the location sensing accuracy within limited wireless resources in a dynamic environment, we design a multi-satellite cooperative information communication and location sensing algorithm that jointly optimizes the communication beamforming and the sensing waveform according to the characteristics of the LEO satellite constellation. Finally, extensive simulation results are presented to demonstrate the competitive performance of the proposed algorithm.
Jianhang Chu, Xiaoming Chen, Caijun Zhong, Zhaoyang Zhang
In this paper, we investigate the issue of massive access in a beyond fifth-generation (B5G) multi-beam low earth orbit (LEO) satellite internet of things (IoT) network in the presence of channel phase uncertainty due to channel state information (CSI) conveyance from the devices to the satellite via the gateway. Rather than time division multiple access (TDMA) or frequency division multiple access (FDMA) with a multi-color pattern, a new non-orthogonal multiple access (NOMA) scheme is adopted to support massive IoT devices distributed over a very wide area. Considering the limited energy on the LEO satellite, two robust beamforming algorithms against channel phase uncertainty are proposed for minimizing the total power consumption in the scenarios of noncritical IoT applications and critical IoT applications, respectively. Both theoretical analysis and simulation results validate the effectiveness and robustness of the proposed algorithms for supporting massive access in satellite IoT.
Xiaodan Shao, Lei Cheng, Xiaoming Chen, Chongwen Huang, Derrick Wing Kwan Ng
This paper investigates the problem of joint massive device separation and channel estimation for a reconfigurable intelligent surface (RIS)-aided unsourced random access (URA) scheme in the sixth-generation (6G) wireless networks. In particular, by associating the data sequences with a rank-one tensor and exploiting the angular sparsity of the channel, the detection problem is cast as a high-order coupled tensor decomposition problem. However, the coupling among multiple device-to-RIS (device-RIS) channels together with their sparse structure makes the problem intractable. By devising novel priors to incorporate the problem structure, we design a novel probabilistic model to capture both the element-wise sparsity from the angular channel model and the low-rank property due to the sporadic nature of URA. Based on this probabilistic model, we develop a coupled tensor-based automatic detection (CTAD) algorithm under the framework of variational inference with fast convergence and low computational complexity. Moreover, the proposed algorithm can automatically learn the number of active devices and thus effectively avoid noise overfitting. Extensive simulation results confirm the effectiveness and improvements of the proposed URA algorithm in the large-scale RIS regime.
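To illustrate why associating data with a rank-one tensor is useful (this sketch is not the paper's CTAD algorithm; the dimensions and recovery method are invented for illustration), the factors of a rank-one 3-way tensor can be recovered, up to scaling, from the leading singular vectors of its unfoldings:

```python
import numpy as np

# A rank-one 3-way tensor T = a ∘ b ∘ c; each mode-n unfolding of T is a
# rank-one matrix, so its leading left singular vector recovers that factor.
rng = np.random.default_rng(2)
a, b, c = rng.standard_normal(4), rng.standard_normal(5), rng.standard_normal(6)
T = np.einsum('i,j,k->ijk', a, b, c)

def leading_left_sv(M):
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, 0]

a_hat = leading_left_sv(T.reshape(4, -1))                     # mode-1 unfolding
b_hat = leading_left_sv(T.transpose(1, 0, 2).reshape(5, -1))  # mode-2 unfolding
c_hat = leading_left_sv(T.transpose(2, 0, 1).reshape(6, -1))  # mode-3 unfolding

def cos_sim(u, v):
    # absolute cosine similarity: factors are recovered up to sign and scale
    return abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cos_sim(a_hat, a) > 0.999, cos_sim(b_hat, b) > 0.999, cos_sim(c_hat, c) > 0.999)
```

This identifiability of tensor factors is the property that lets the paper separate the devices' data sequences and channels from one coupled observation.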
Feiyan Tian, Xiaoming Chen, Lei Liu, Derrick Wing Kwan Ng
In this paper, we investigate an unsourced random access scheme for massive machine-type communications (mMTC) in the sixth-generation (6G) wireless networks with sporadic data traffic. Firstly, we establish a general framework for massive unsourced random access based on a two-layer signal coding, i.e., an outer code and an inner code. In particular, considering Rician fading in the mMTC scenario, we design a novel codeword activity detection algorithm for the inner code of unsourced random access based on the distribution of the received signals, exploiting the maximum likelihood (ML) method. Then, we analyze the performance of the proposed codeword activity detection algorithm by exploiting the Fisher information matrix, which facilitates the derivation of the approximate distribution of the estimation error of the codeword activity vector when the number of base station (BS) antennas is sufficiently large. Furthermore, for the outer code, we propose an optimization algorithm to allocate the lengths of message bits and parity check bits, so as to strike a balance between the error probability and the complexity required for outer decoding. Finally, extensive simulation results validate the effectiveness of the proposed detection algorithm and the optimized length allocation scheme compared with an existing detection algorithm and a fixed length allocation scheme.
Weiqing You, Guozhen Shi, Xiaoming Chen, Jian Qi, Chuang Qing
The rapid development of computer technology and the widespread application of instant messaging bring great convenience to people's lives, but privacy protection has become an increasingly significant problem, and it is hard for ordinary users to equip themselves with a dedicated cipher machine. In this paper, through an in-depth study of the elliptic curve cryptosystem (ECC) and the Advanced Encryption Standard (AES), and exploiting the characteristics of public-key cryptography, we design a hybrid cryptosystem with perfect forward secrecy by establishing an elliptic-curve version of the Diffie-Hellman key exchange protocol combined with AES encryption. The system guarantees the security of communication, is easy to implement, runs fast, and has low cost. Finally, the security of the system is analyzed under common network attacks.
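The key-agreement half of such a hybrid design can be sketched with a toy elliptic-curve Diffie-Hellman exchange (for illustration only; the curve, base point, and private keys below are textbook toy values, and a real system would use a standardized curve such as P-256 or Curve25519 through a vetted crypto library, feeding the shared secret into AES via a key derivation function):

```python
# Toy ECDH over the tiny curve y^2 = x^3 + 2x + 2 (mod 17)
# with base point G = (5, 1) of order 19.
P, A = 17, 2          # prime modulus and curve coefficient a
G = (5, 1)

def ec_add(p1, p2):
    # Group law on the curve; None represents the point at infinity.
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P         # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    # Double-and-add scalar multiplication.
    result, addend = None, point
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result

alice_priv, bob_priv = 3, 7                              # secret scalars
alice_pub, bob_pub = ec_mul(alice_priv, G), ec_mul(bob_priv, G)
shared_a = ec_mul(alice_priv, bob_pub)                   # each side combines its secret
shared_b = ec_mul(bob_priv, alice_pub)                   # with the other's public point
print(shared_a == shared_b)                              # identical secret -> AES key material
```

Forward secrecy comes from generating fresh ephemeral scalars per session, so compromising a long-term key never reveals past session keys.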
Ming Ying, Xiaoming Chen, Xiaodan Shao
With the rapid development of the Internet of Things (IoT), low earth orbit (LEO) satellite IoT is expected to provide low-power, massive-connectivity, and wide-coverage IoT applications. In this context, this paper provides a massive grant-free random access (GF-RA) scheme for LEO satellite IoT. This scheme does not need to change the transceiver, but transforms the received signal into a tensor decomposition form. By exploiting the characteristics of the tensor structure, a Bayesian learning algorithm for joint active device detection and channel estimation during massive GF-RA is designed. Theoretical analysis shows that the proposed algorithm has fast convergence and low complexity. Finally, extensive simulation results confirm its superior performance in terms of the error probability of active device detection and the normalized mean square error of channel estimation compared with baseline algorithms in LEO satellite IoT. In particular, it is found that the proposed algorithm requires only short preamble sequences and supports massive connectivity at low power, which is appealing for LEO satellite IoT.
Qiao Qi, Xiaoming Chen, Caijun Zhong, Chau Yuen, Zhaoyang Zhang
In this paper, we investigate the issue of uplink integrated sensing and communication (ISAC) in 6G wireless networks, where the sensing echo signal and the communication signal are received simultaneously at the base station (BS). To effectively mitigate the mutual interference between sensing and communication caused by the sharing of spectrum and hardware resources, we provide a joint sensing transmit waveform and communication receive beamforming design with the objective of maximizing the weighted sum of the normalized sensing rate and the normalized communication rate. It is formulated as a computationally complicated non-convex optimization problem, which is quite difficult to solve with conventional optimization methods. To this end, we first make a series of equivalent transformations on the optimization problem to reduce the design complexity, and then develop a deep learning (DL)-based scheme to enhance the overall performance of ISAC. Both theoretical analysis and simulation results confirm the effectiveness and robustness of the proposed DL-based scheme for ISAC in 6G wireless networks.
Kaiwei Xiong, Xiaoming Chen, Ming Ying
In order to provide wireless services for a wide sea area, this paper designs an integrated satellite-terrestrial maritime communication framework. Specifically, the terrestrial base station (TBS) serves near-shore users, while the low earth orbit (LEO) satellite communicates with off-shore users. We aim to improve the overall performance of the integrated satellite-terrestrial maritime communication system, so it makes sense to jointly optimize the transmit beamforming at the TBS and the LEO satellite. Due to sea wave fluctuation, the obtained channel state information (CSI) is often imperfect. In this context, a robust beamforming design algorithm is proposed with the goal of minimizing the total power consumption of the integrated satellite-terrestrial maritime communication system while satisfying quality of service (QoS) requirements. Both theoretical analysis and simulation results confirm the effectiveness of the proposed algorithm for maritime communications.
Ming Ying, Xiaoming Chen, Qiao Qi, Wolfgang Gerstacker
Low earth orbit (LEO) satellite internet of things (IoT) is a promising way of achieving the global Internet of Everything, and has thus been widely recognized as an important component of sixth-generation (6G) wireless networks. Yet, due to the high-speed movement of the LEO satellite, it is challenging to acquire timely channel state information (CSI) and design effective multibeam precoding for various IoT applications. To this end, this paper provides a deep learning (DL)-based joint channel prediction and multibeam precoding scheme for adverse environments, e.g., high Doppler shift, long propagation delay, and low satellite payload. Specifically, this paper first designs a DL-based channel prediction scheme using convolutional neural networks (CNN) and long short-term memory (LSTM), which predicts the CSI of the current time slot from that of previous time slots. With the predicted CSI, this paper then designs a DL-based robust multibeam precoding scheme using a channel augmentation method based on a variational auto-encoder (VAE). Finally, extensive simulation results confirm the effectiveness and robustness of the proposed scheme in LEO satellite IoT.
Feiyan Tian, Xiaoming Chen, Yong Liang Guan, Chau Yuen
In this paper, we investigate unsourced random access for massive machine-type communications (mMTC) in the sixth-generation (6G) wireless networks. Firstly, we establish a high-efficiency uncoupled framework for massive unsourced random access without extra parity check bits. Then, we design a low-complexity Bayesian joint decoding algorithm, including codeword detection and stitching. In particular, we present a Bayesian codeword detection approach by exploiting Bayes-optimal divergence-free orthogonal approximate message passing in the case of unknown priors. The long-term channel statistics output by the detector are then leveraged to stitch codewords for recovering the original message. Thus, the spectral efficiency is improved by avoiding the use of parity bits. Moreover, we analyze the performance of the proposed Bayesian joint decoding-based massive uncoupled unsourced random access scheme in terms of computational complexity and decoding error probability. Furthermore, through asymptotic analysis, we obtain some useful insights for the design of massive unsourced random access. Finally, extensive simulation results confirm the effectiveness of the proposed scheme in 6G wireless networks.
Xiaoming Chen
This paper introduces CKTSO (abbreviation of "circuit solver"), a novel sparse linear solver specially designed for the simulation program with integrated circuit emphasis (SPICE). CKTSO is a parallel solver and can be run on a multi-core, shared-memory computer. The algorithms of CKTSO are designed by considering the features of the matrices involved in SPICE simulations. CKTSO is superior to existing similar solvers mainly in the following three aspects. First, the matrix ordering step of CKTSO combines different types of ordering algorithms such that it can generally obtain the fewest fill-ins for a wide range of circuit matrices. Second, CKTSO provides a parallel fast LU factorization algorithm with pivot check, which achieves good performance, scalability, and numerical stability. Third, CKTSO provides a structure-adaptive hybrid parallel triangular solving algorithm, which can adapt to various circuit matrices. Experiments including both benchmark tests and SPICE simulations demonstrate the superior performance of CKTSO. The libraries of CKTSO are available at https://github.com/chenxm1986/cktso.
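The pivot check mentioned above guards numerical stability during factorization. As a hedged illustration of that idea only (CKTSO itself is a sparse, parallel C/C++ solver; this is a dense, serial toy in NumPy with invented names), partial pivoting picks the largest remaining entry in each column before elimination:

```python
import numpy as np

def lu_pivot(A):
    """Dense LU factorization with partial pivoting: PA = LU."""
    A = A.astype(float).copy()
    n = A.shape[0]
    perm = np.arange(n)
    for k in range(n):
        p = k + np.argmax(np.abs(A[k:, k]))   # pivot check: pick the largest entry
        if p != k:
            A[[k, p]] = A[[p, k]]             # row swap keeps multipliers <= 1
            perm[[k, p]] = perm[[p, k]]
        A[k+1:, k] /= A[k, k]                 # store multipliers in place (L part)
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])  # Schur complement update
    L = np.tril(A, -1) + np.eye(n)
    U = np.triu(A)
    return perm, L, U

rng = np.random.default_rng(3)
M = rng.standard_normal((6, 6))
perm, L, U = lu_pivot(M)
print(np.allclose(L @ U, M[perm]))            # verifies PA = LU
```

A sparse circuit solver must additionally weigh each pivot choice against the fill-in it creates, which is why CKTSO combines the pivot check with fill-reducing ordering.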
Xiaoming Chen, Chau Yuen
In this paper, we give a unified performance analysis of interference alignment (IA) over MIMO interference channels. Rather than the asymptotic characterization, i.e., degrees of freedom (DoF) at high signal-to-noise ratio (SNR), we focus on other practical performance metrics, namely outage probability, ergodic rate, and symbol error rate (SER). In particular, we consider imperfect IA due to the fact that the transmitters usually have only imperfect channel state information (CSI) in practical scenarios. By characterizing the impact of imperfect CSI, we derive exact closed-form expressions for the outage probability, ergodic rate, and SER in terms of CSI accuracy, transmit SNR, channel condition, number of antennas, and the number of data streams of each communication pair. Furthermore, we obtain some important guidelines for performance optimization of IA under imperfect CSI by minimizing the performance loss relative to IA with perfect CSI. Finally, our theoretical claims are validated by simulation results.
Xiaoming Chen, Zhaoyang Zhang, Chau Yuen
In this paper, we design a resource allocation framework for the delay-sensitive Multi-User MIMO (MU-MIMO) broadcast system with limited feedback. Considering the scarcity and interrelation of the transmit power and feedback bandwidth, it is imperative to optimize the two resources in a joint and efficient manner while meeting the delay-QoS requirement. Based on the effective bandwidth theory, we first obtain a closed-form expression of average violation probability with respect to a given delay requirement as a function of transmit power and codebook size of feedback channel. By minimizing the total resource cost, we derive an optimal joint resource allocation scheme, which can flexibly adjust the transmit power and feedback bandwidth according to the characteristics of the system. Moreover, through asymptotic analysis, some simple resource allocation schemes are presented. Finally, the theoretical claims are validated by numerical results.
Jian Chen, Xiaoming Chen, Xiumin Wang, Lei Lei
In this paper, we address the problem of optimal power allocation at the relay in two-hop secure communications. In order to solve the challenging issue of short-distance interception in secure communications, the benefit of large-scale MIMO (LS-MIMO) relaying techniques is exploited to significantly improve the secrecy performance, even in the case without eavesdropper channel state information (CSI). The focus of this paper is the analysis and design of optimal power allocation for the relay, so as to maximize the secrecy outage capacity. We reveal the condition under which the secrecy outage capacity is positive, prove that there is one and only one optimal power, and present an optimal power allocation scheme. Moreover, an asymptotic analysis of the secrecy outage capacity is carried out to provide some clear insights for secrecy performance optimization. Finally, simulation results validate the effectiveness of the proposed scheme.