Inter-Radio Access Technology Coexistence: State of the art 


LTE/Wi-Fi Coexistence in Unlicensed Spectrum

As a first step towards identifying the challenges facing LTE/Wi-Fi coexistence in unlicensed spectrum, research communities evaluated the performance of LTE and Wi-Fi without any coexistence technique for sharing channel access between the two technologies. Without a coexistence technique, LTE keeps its usual schedule-based channel access scheme while coexisting with Wi-Fi. The performance of both LTE and Wi-Fi has been evaluated in [29] for different coexistence scenarios using simulations. The results show that LTE performance is only slightly affected whereas Wi-Fi performance is significantly degraded. Indeed, since LTE does not adopt any coexistence technique to share the channel with Wi-Fi, LTE completely dominates the channel access, especially when the LTE traffic load is high as shown in [102], and Wi-Fi is blocked from accessing the channel most of the time. The reason is obvious: LTE adopts a schedule-based channel access scheme that grants access to the channel regardless of the channel state, i.e. whether it is idle or occupied by a Wi-Fi transmission. In contrast, Wi-Fi adopts the DCF protocol, an LBT-based protocol, in which Wi-Fi is allowed to access the channel only when the channel is idle, i.e. free of LTE transmissions. As a result, channel access is fully controlled by LTE, and Wi-Fi can access the channel only when LTE decides to vacate it, i.e. Wi-Fi behaves as an LTE slave.
The same conclusion was drawn using an experimental testbed in [42], where real LTE and Wi-Fi systems were deployed in a typical indoor office scenario, and using a simulation study in [54] for outdoor scenarios. In addition, in sparse LTE/Wi-Fi deployment scenarios, the results show that Wi-Fi performance improves. In fact, increasing the distance between the LTE and Wi-Fi systems helps Wi-Fi find the channel free of LTE transmissions; as a result, Wi-Fi can grab channel access more often than in dense deployment scenarios. However, increasing the channel access probability of Wi-Fi by increasing the distance between the LTE and Wi-Fi systems does not necessarily improve Wi-Fi performance, as shown in [56]. Indeed, over a certain range of distances between the LTE and Wi-Fi systems, Wi-Fi may frequently find the channel free of LTE transmissions, but this does not guarantee that Wi-Fi transmissions will not collide with LTE transmissions. This problem is known as LTE/Wi-Fi coexistence below the energy detection threshold, as we illustrate in Section 2.1.5. Accordingly, we emphasize that Wi-Fi performance is not directly proportional to the distance between the LTE and Wi-Fi systems, as shown in both [56] and [72].
The authors in [56] measure the negative impact of LTE on Wi-Fi for different LTE system configuration parameters, such as LTE bandwidth, central frequency and transmission power, using an experimental platform. Another testbed in [84] also measures the mutual impact of LTE and Wi-Fi in different coexistence scenarios where LTE coexists with Wi-Fi over its primary and/or secondary channel with full or partial overlap. All the results show that the performance of both technologies is strongly tied to the coexistence conditions and that Wi-Fi is the technology that suffers the most when coexisting with LTE. Another experimental analysis, presented in [45], confirms that LTE degrades Wi-Fi performance, with LTE downlink traffic being more harmful to Wi-Fi performance than LTE uplink traffic. As a result, for different LTE time division duplex (LTE-TDD) configurations, Wi-Fi performance improves as the number of uplink subframes within the LTE frame increases [102]. To further enhance Wi-Fi performance, the authors in [32] and [31] propose an interference-aware coexistence scheme based on controlling LTE uplink transmission power. LTE uplink performance has also been examined in [100] for different Wi-Fi traffic loads in different coexistence scenarios. In addition, LTE can leverage its beamforming capability, as shown in [78], to avoid causing noticeable interference to Wi-Fi. An analytical framework characterizing LTE/Wi-Fi mutual interference has been presented in [106] for dense deployment scenarios, where Wi-Fi throughput is degraded by up to 97% due to LTE interference. The authors in [51] developed a new interference analysis technique to quantitatively analyze the mutual interference, showing that LTE has to increase its cell radius to reduce the interference with Wi-Fi. The authors in [107] propose a statistical model for LTE indoor planning based on an interference model such that LTE avoids interfering with Wi-Fi as much as possible.
These research results confirm that LTE has to adopt a coexistence technique to share channel access with Wi-Fi, otherwise Wi-Fi performance will be severely degraded. In fact, the solutions proposed above only aim to mitigate the interference between LTE and Wi-Fi without ensuring any fair channel sharing. In other words, these solutions should be considered complementary means of enhancing LTE/Wi-Fi coexistence. Thus, LTE still needs to adopt a new MAC layer protocol that allows better sharing of channel access with Wi-Fi. Consequently, a full understanding of the operation of each new LTE MAC layer protocol is needed. In Table 2.1, we classify the main new MAC layer protocols proposed by 3GPP for LTE in unlicensed spectrum. Our classification is mainly based on the answers to several questions related to the MAC layer protocol operation, and more precisely its behaviour with regard to channel access. We list below the most important questions that can guide such a classification:
• Is the MAC layer protocol non-LBT or LBT-based?
• What is the temporal granularity allowed by the MAC layer protocol to contend for channel access?
• Does the MAC layer protocol apply a back-off period before accessing the channel?
• What rules does the MAC layer protocol follow to configure its back-off period?
• What constraints are imposed on the MAC layer configuration parameters?

Table 2.1: Classification of the main MAC layer protocols proposed for LTE in unlicensed spectrum.

Protocol | Channel access type | Main configuration parameters
TDM-based | non-LBT (without CCA) | Duty cycle period; ON & OFF periods; number of ON & OFF periods per duty cycle; order of ON & OFF periods
FBE | LBT-based (with CCA, without back-off) | ON & OFF periods; fixed frame period; sensing period
LBE | LBT-based (with CCA, fixed CW) | COT; CW size; back-off slot duration
Cat4 | LBT-based (with CCA, binary exponential CW) | COT; minimum CW size; maximum CW size; retransmission limit
LBT-based protocols are a family of protocols comprising the FBE, LBE and Cat4 protocols. As the name indicates, the communication system has to “Listen” to the channel by performing a clear channel assessment (CCA) “Before” accessing and “Talking” over the channel, i.e. before transmitting a frame. After performing CCA, if the channel is free of any transmission, the communication system may transmit its frame; otherwise the channel is busy and the system refrains from accessing it. In contrast, non-LBT MAC layer protocols, such as the TDM-based protocol, do not perform any CCA prior to channel access, so the communication system accesses the channel whether it is idle or busy. For this reason, LBT-based protocols are well known for their “politeness” with regard to channel access, whereas non-LBT protocols are considered “aggressive” protocols.
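To make the LBT principle concrete, the following minimal Python sketch shows the generic listen-before-talk decision. It is our own illustration: the threshold value and function names are assumptions, not taken from any standard implementation.

```python
# Minimal sketch of the generic LBT principle: sense the channel (CCA)
# and transmit only if the measured energy is below a busy threshold.
# The -62 dBm default is an illustrative assumption.

def cca_idle(measured_power_dbm: float, busy_threshold_dbm: float = -62.0) -> bool:
    """Clear channel assessment: idle if sensed power is below the threshold."""
    return measured_power_dbm < busy_threshold_dbm

def lbt_try_transmit(measured_power_dbm: float) -> str:
    # "Listen Before Talk": CCA first, then talk only on an idle channel.
    if cca_idle(measured_power_dbm):
        return "transmit frame"
    return "defer (channel busy)"

print(lbt_try_transmit(-75.0))  # idle channel -> transmit
print(lbt_try_transmit(-50.0))  # busy channel -> defer
```

A non-LBT protocol, in contrast, would skip the `cca_idle` check entirely and transmit unconditionally.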

TDM-based MAC layer Protocol

The TDM-based MAC layer protocol was the first proposal for LTE to coexist with Wi-Fi in unlicensed spectrum [8], [38]. 3GPP has also proposed the same protocol under a different name, the Category 1 protocol [7]. Several telecommunication companies formed an industry alliance called the LTE-U Forum [3], [9], showing keen interest in adopting the TDM-based protocol as a new MAC layer protocol for LTE. The main reason is the simplicity of adopting this protocol in today’s LTE systems, since the TDM-based protocol exploits an existing key LTE feature called Small Cell activation/deactivation [108]. Accordingly, LTE in unlicensed spectrum could be deployed quickly based on the latest LTE Release 12 [12] without waiting for a new LTE standard, so that mobile operators could rapidly enhance their LTE ecosystems. The main idea behind the TDM-based protocol is sharing between LTE and Wi-Fi in a time division multiplexing fashion, where LTE transmission is activated for certain periods called LTE-ON periods, i.e. several LTE frames or subframes, interposed between LTE-OFF periods allocated for Wi-Fi channel access, see Figure 2.1. Accordingly, LTE adopts an ON/OFF pattern forming a duty cycle period that is repeated over time. This modified LTE is therefore called Duty Cycled LTE.
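As a rough illustration of the duty cycle pattern of Figure 2.1, the following Python sketch generates the LTE-ON/LTE-OFF subframe schedule of Duty Cycled LTE. The ON/OFF durations are hypothetical example values, not standardized ones.

```python
# Illustrative sketch of the Duty Cycled LTE pattern: an ON period
# (LTE transmits) followed by an OFF period (channel left to Wi-Fi),
# repeated over time. One subframe = 1 ms.

def duty_cycle_schedule(on_subframes: int, off_subframes: int, n_cycles: int):
    """Return a list of 'LTE-ON' / 'LTE-OFF' labels, one per subframe."""
    one_cycle = ["LTE-ON"] * on_subframes + ["LTE-OFF"] * off_subframes
    return one_cycle * n_cycles

# Example: 50% duty cycle with a 10 ms duty cycle period (5 ON, 5 OFF).
schedule = duty_cycle_schedule(on_subframes=5, off_subframes=5, n_cycles=2)
print(schedule)
```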
The TDM-based protocol offers synchronous operation of the channel, where LTE frame transmissions can only start at subframe boundaries, i.e. at the beginning of subframes. Synchronous LTE operation is a must for LTE user equipments to be able to receive and decode the LTE signal. If LTE adopts a MAC layer protocol such as the LBE or Cat4 protocols, it cannot ensure synchronous operation of the channel, since it may grab channel access at any time. To solve this problem, an LTE reservation signal [20] is used to prevent Wi-Fi from grabbing channel access until the next LTE subframe boundary, where LTE starts its frame transmission. As a result, LTE throughput suffers from the reservation signal overhead, which is almost equal to one subframe. However, the price of this benefit of the TDM-based protocol is its well-known negative impact on Wi-Fi performance in terms of collisions, as we will see in chapters 3 and 4. With the TDM-based protocol, LTE accesses the channel aggressively since, unlike Wi-Fi, it does not perform CCA before transmitting its frames. As a result, collisions between LTE and Wi-Fi frames occur, leading to Wi-Fi performance degradation. For this reason, the TDM-based protocol has been dismissed in regions such as Europe, Japan and India, where accessing unlicensed spectrum requires that LTE adopt an LBT-based protocol. The usage of the TDM-based protocol is currently limited to regions such as the USA and Korea that allow non-LBT protocols. The TDM-based protocol nevertheless remains a good candidate for other inter-radio access technology coexistence scenarios, as in [134] and [138].
The authors of [89] provide simulation studies where LTE coexists with Wi-Fi in indoor deployment scenarios. There, both LTE and Wi-Fi use the channel only for downlink data traffic, and the main result is that, under a relatively low traffic load for both technologies, both are able to sustain acceptable performance. However, increasing the traffic load severely degrades Wi-Fi performance relative to LTE performance. In fact, a higher traffic load means that LTE and Wi-Fi are in full competition for channel access; as a result, more collisions are encountered than under low traffic load. Besides, LTE can still recover its frames even in the case of collisions thanks to its PHY layer, which is more robust than the Wi-Fi PHY layer. The authors of [11] provide a new technique where LTE applies a short duty cycle period using its almost blank subframe allocation technique. Almost blank subframes were introduced in LTE Release 10, allowing LTE to mute certain subframes within an LTE frame so that Wi-Fi can access the channel during the muted subframes. Since this procedure alone does not avoid Wi-Fi performance degradation, the authors propose to improve Wi-Fi performance by increasing the number of muted subframes allocated to Wi-Fi. The results show that Wi-Fi performance is sensitive not only to the number of muted subframes but also to the way the muted subframes are distributed along the LTE frame.
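To illustrate the last observation, the sketch below builds two muting patterns with the same number of almost blank subframes but different distributions along the 10-subframe LTE frame. It is a simplified model of our own; the indices and patterns are examples, not those used in [11].

```python
# Sketch of the almost-blank-subframe (ABS) idea: muting k subframes of a
# 10-subframe LTE frame frees them for Wi-Fi. The two patterns below have
# the same muting ratio but distribute it differently, which is the
# sensitivity highlighted by the cited results.

FRAME_LEN = 10  # one LTE frame = 10 subframes of 1 ms

def frame_with_muted(muted_indices):
    return ["muted" if i in set(muted_indices) else "LTE" for i in range(FRAME_LEN)]

contiguous = frame_with_muted({6, 7, 8, 9})   # muted subframes grouped at the end
spread     = frame_with_muted({1, 3, 5, 7})   # same count, interleaved with LTE

print(contiguous)
print(spread)
```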


Load Based Equipment MAC Layer Protocol

Another LBT-based protocol is the Load Based Equipment (LBE) protocol, standardized by ETSI in [2] and considered by 3GPP as the Category 3 protocol for LTE [7]. To cope with the expected LTE performance degradation under the FBE protocol, the LBE protocol attracted many companies during the same aforementioned 3GPP workshop [7]. This is because LTE is allowed to contend for channel access with Wi-Fi at any time instant. However, to ensure reasonably fair coexistence with the DCF protocol, which adopts a back-off period prior to channel access, the LBE protocol also implements a back-off period before any frame transmission over the channel. The LBE back-off period is randomly selected from a fixed CW in the range {1,…,q}, where the value of q is selected by the manufacturer in the range {4,…,32}, see Figure 2.3. The main difference between LBE and DCF is that when a collision occurs over the channel, the Wi-Fi CW size follows a binary exponential increase to reduce contention for channel access, whereas the LTE CW is kept fixed regardless of the number of collisions. For this reason, the standard imposes a limitation on the channel occupancy time of the LBE protocol, which must be less than (13/32) × q ms. In this way, if the manufacturer decides to use a small CW size to increase the channel access probability, the allowed channel occupancy time is reduced as compensation to ensure fair coexistence with other communication systems; still, the fixed CW size of the LBE protocol poses a heterogeneity problem with respect to the DCF protocol. In fact, there are two more main differences between the LBE and DCF protocols (illustrated in the sketch after the list below):
• The LBE slot duration that forms the LBE back-off period may be fixed to 20 µs or even larger, while the DCF slot duration is 9 µs for most Wi-Fi (IEEE 802.11) standards. As a result, LTE performance is degraded, since the evolution of its back-off period is longer than the Wi-Fi back-off period, as noticed in [68]. In [83], the authors propose a new analytical framework that takes into consideration such heterogeneous sensing and back-off slot durations between the two technologies. Based on a novel Markov chain approach, the authors were able to identify and measure the impact of this heterogeneity. In the context of Wi-Fi/Wi-Fi coexistence, the problem of heterogeneous back-off slot durations has also been observed in [22] between different commercial Wi-Fi network cards, clearly showing a negative impact on performance.
• After the channel becomes idle, LBE immediately starts decrementing the frozen back-off counter that forms the back-off period, while Wi-Fi waits an additional time before decrementing, known as the initial CCA time or simply the DIFS period. As a result, we expect LTE to have more chances to grab channel access than Wi-Fi. It is worth mentioning here that some works, such as [37] and [63], do not consider this lack of an equivalent DIFS period in the LBE protocol, likely for simplification reasons. Nevertheless, the ETSI standard [2] clearly states that the LBE Option B protocol does not include any initial CCA.
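The following Python sketch summarizes the LBE channel access rules discussed above, i.e. the fixed contention window, the (13/32) × q ms channel occupancy limit, and the heterogeneous slot durations. It is a simplified illustration under our own naming, not an implementation of the ETSI standard.

```python
import random

# Sketch of the LBE rules: the back-off counter is drawn from a FIXED
# contention window {1, ..., q} (no exponential growth), the slot may be
# 20 us instead of Wi-Fi's 9 us, and ETSI caps the channel occupancy
# time at (13/32) * q ms.

def lbe_backoff_slots(q: int) -> int:
    """Draw N uniformly from {1, ..., q}; q is manufacturer-chosen in {4..32}."""
    assert 4 <= q <= 32
    return random.randint(1, q)

def lbe_max_cot_ms(q: int) -> float:
    """Maximum channel occupancy time allowed for LBE: (13/32) * q ms."""
    return (13 / 32) * q

LBE_SLOT_US, DCF_SLOT_US = 20, 9  # heterogeneous slot durations noted in [68]

q = 32
n = lbe_backoff_slots(q)
print(f"back-off: {n} slots = {n * LBE_SLOT_US} us (DCF would take {n * DCF_SLOT_US} us)")
print(f"max COT for q={q}: {lbe_max_cot_ms(q):.2f} ms")  # 13 ms for q = 32
```

Note how a small q buys a higher channel access probability at the cost of a proportionally shorter allowed occupancy, which is exactly the compensation mechanism the standard intends.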
To sum up, with the LBE protocol, LTE may have a higher probability of grabbing channel access than Wi-Fi due to its fixed CW size and the lack of an additional sensing period, equivalent to the Wi-Fi DIFS, before the back-off period. On the contrary, LTE may have a lower probability of grabbing channel access than Wi-Fi due to its longer slot duration. Finally, with the LBE protocol, LTE is no longer able to ensure synchronous operation of the channel, as was the case for both the TDM-based and FBE protocols, since LTE can grab channel access at any time instant. In Figure 2.3, an LTE reservation signal is used to prevent Wi-Fi from grabbing channel access until the next LTE subframe boundary, where LTE starts its frame transmission. As a result, LTE throughput suffers from a reservation signal overhead of almost one subframe. For example, if the LTE Transmission Opportunity (TXOP) is equal to 3 subframes, LTE can only use 2 of the 3 subframes for data transmission, incurring a reduction of almost 33% in LTE throughput.
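A quick back-of-the-envelope computation confirms the overhead figure above, assuming the reservation signal consumes roughly one full subframe of each TXOP:

```python
# Reservation-signal overhead: roughly one subframe of each TXOP is lost,
# so a 3-subframe TXOP leaves 2 subframes for data (~33% loss).

def reservation_overhead(txop_subframes: int, reserved_subframes: int = 1) -> float:
    """Fraction of the TXOP consumed by the reservation signal."""
    return reserved_subframes / txop_subframes

for txop in (2, 3, 5, 10):
    print(f"TXOP = {txop} subframes -> overhead ~ {reservation_overhead(txop):.0%}")
```

As the loop shows, the overhead shrinks as the TXOP grows, which is why longer transmission opportunities partially amortize the cost of asynchronous channel access.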
In [50], LBE was tested with its maximum CW size, i.e. q = 32, coexisting with Wi-Fi in indoor and outdoor small cell deployments. The simulation results show that Wi-Fi performance is not impacted by LTE whatever the traffic load: low, medium or high. We cannot draw a reliable conclusion from this work because LTE adopts the maximum CW size. A different LBE CW size could lead to another conclusion, especially since the minimum contention window size of the DCF protocol for Wi-Fi is 15, which is almost half the LBE CW size adopted in that work. In fact, the minimum CW size of any LBT-based protocol is one of the most important parameters determining the protocol's performance. In [88], the authors confirm this remark, since Wi-Fi performance degrades when LTE adopts a CW smaller than 32.
A theoretical analysis of the downlink performance of both LTE and Wi-Fi in a coexistence scenario was presented in [41] and [33] based on Markov chains. Even though their analytical models have not been validated by simulations, their numerical results interestingly show that the transmission probability of LTE over the channel is higher than that of Wi-Fi. As mentioned previously, the main reason is the fixed CW size of the LBE protocol. As a result, when the LTE traffic load is very high, LTE outperforms Wi-Fi, but at the price of degraded Wi-Fi performance. For this reason, an adaptation of the LBE CW size was presented in [116] to enhance Wi-Fi performance by achieving fair channel access. In that work, LTE adapts the LBE CW size based on quality of service metrics collected from Wi-Fi neighbors. The simulation results confirm the need for LBE CW size adaptation to enhance both LTE and Wi-Fi performance in terms of throughput and delay.
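As a purely illustrative sketch, and not the actual scheme of [116], the following loop shows the kind of CW adaptation such a mechanism could perform: widening the LBE CW when neighboring Wi-Fi reports poor quality of service and shrinking it otherwise. The metric names and thresholds are hypothetical.

```python
# Hypothetical CW adaptation loop (not the scheme of [116]): widen the
# LBE CW when Wi-Fi neighbors report a high collision rate, shrink it
# when Wi-Fi is healthy. Thresholds are illustrative assumptions.

CW_MIN, CW_MAX = 4, 32  # ETSI-allowed range for q

def adapt_cw(q: int, wifi_collision_rate: float) -> int:
    """One adaptation step driven by a QoS metric collected from Wi-Fi."""
    if wifi_collision_rate > 0.2:        # Wi-Fi suffering -> be more polite
        return min(q * 2, CW_MAX)
    if wifi_collision_rate < 0.05:       # Wi-Fi healthy -> contend harder
        return max(q // 2, CW_MIN)
    return q

q = 8
for rate in (0.30, 0.30, 0.02):
    q = adapt_cw(q, rate)
    print(f"observed Wi-Fi collision rate {rate:.2f} -> new q = {q}")
```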

Enhancing LTE/Wi-Fi Coexistence

Ensuring fair coexistence between LTE and Wi-Fi is not only a question of both adopting the same MAC layer protocol. Indeed, even in the extreme case where LTE adopts the Cat4 protocol, which is similar to the DCF protocol adopted by Wi-Fi, fair coexistence is still not ensured, only approached. This is especially true since Wi-Fi does not really coexist fairly even with itself, although Wi-Fi networks use the same protocol stack [117], [95], [85]. Regarding LTE/Wi-Fi coexistence, Wi-Fi suffers significant interference from LTE due to Wi-Fi's poor channel assessment of LTE activity over the channel. Indeed, in many coexistence scenarios, Wi-Fi considers the LTE signal as background noise and does not refrain from channel access while LTE is transmitting. As a result, Wi-Fi transmissions experience several collisions with LTE transmissions, which degrades Wi-Fi performance. The origin of this problem is that Wi-Fi has two different CCA mechanisms:
• Carrier Sense CCA (CS-CCA): defines the ability of Wi-Fi to detect another Wi-Fi frame transmission by decoding its frame preamble. The carrier sense CCA is also called Preamble Detection CCA (PD-CCA).
• Energy Detection CCA (ED-CCA): defines the ability of Wi-Fi to detect the transmission of any other communication system, such as LTE.
For both CCA mechanisms, the Wi-Fi PHY layer measures the total received signal power and compares it with a certain detection threshold. Whenever the signal power is above this detection threshold, the Wi-Fi PHY declares the channel busy; otherwise it is idle. In fact, the Wi-Fi detection thresholds for the carrier sense and energy detection mechanisms differ by about 20 dB: the CS threshold is -82 dBm and the ED threshold is -62 dBm. The difference comes from the fact that the Wi-Fi receiver network card can easily detect another Wi-Fi signal thanks to its a priori knowledge of the Wi-Fi frame preamble symbols [98]. Accordingly, in Wi-Fi/Wi-Fi coexistence scenarios, Wi-Fi easily detects the presence of another Wi-Fi network, declares the channel busy, and the DCF protocol refrains from channel access. In contrast, in LTE/Wi-Fi coexistence scenarios, Wi-Fi may treat the LTE signal as background noise and consider the medium idle during LTE activity, leading to collisions. Accordingly, below the Wi-Fi ED-CCA threshold, LTE acts as a hidden node for Wi-Fi, and the problem is usually referred to as LTE/Wi-Fi coexistence below the energy detection threshold. The hidden node problem [90] is not a new issue in wireless communications, but in the context of LTE/Wi-Fi coexistence it is more destructive.

Table of contents:

1 Introduction 
1.1 Context and Motivations
1.2 LTE in Unlicensed Spectrum
1.2.1 Network deployment scenarios
1.3 Research Challenges and Problem Statements
1.4 Thesis Contributions and Research Approaches
1.5 Thesis Outline
2 Inter-Radio Access Technology Coexistence: State of the art 
2.1 LTE/Wi-Fi Coexistence in Unlicensed Spectrum
2.1.1 TDM-based MAC layer Protocol
2.1.2 Frame Based Equipment MAC Layer Protocol
2.1.3 Load Based Equipment MAC Layer Protocol
2.1.4 Category 4 MAC Layer Protocol
2.1.5 Enhancing LTE/Wi-Fi Coexistence
2.2 WiMAX/Wi-Fi Coexistence
2.3 Synthesis
3 TDM-based Protocol: Studying the Impact of LTE on Wi-Fi Downlink Performance 
3.1 LTE/Wi-Fi Interaction
3.2 Wi-Fi Analytical Model
3.2.1 Transition Probabilities and Probability of Collision
3.2.2 Downlink Wi-Fi Throughput
3.3 Model Validation and Simulation Results
3.3.1 Comparison With Simulation Results
3.3.2 Analyzing LTE Impact on Wi-Fi
3.4 Summary
4 TDM-based Protocol: Modeling and Performance Analysis of LTE/Wi- Fi Coexistence 
4.1 Revisiting Wi-Fi Analytical Models
4.2 New Wi-Fi Analytical Models
4.2.1 Slot-by-Slot Random Walk Model
4.2.2 Exponential Model for LTE
4.2.3 Frame-by-Frame Random Walk Model
4.2.4 Linking the two analytical models
4.2.5 Wi-Fi Model with Capture Effect
4.3 LTE Analytical Model
4.4 Models Validation and Simulation Results
4.4.1 Validation Through Simulation and Observations
4.4.2 Wi-Fi/LTE Coexistence Model Analysis and Application
4.4.3 Comparison Between Different LTE Configurations
4.4.4 Controlling Wi-Fi/LTE Coexistence Using the Model
4.4.5 LTE Analytical Model Validation
4.5 Summary
5 FBE Protocol: Modeling and Performance Analysis of LTE/Wi-Fi Coexistence
5.1 LTE/Wi-Fi Interaction
5.2 LTE/Wi-Fi Analytical Models
5.2.1 LTE Markov Chain Model
5.2.2 Transition Probabilities and LTE Throughput
5.2.3 LTE Renewal Process
5.2.4 Wi-Fi Throughput
5.3 Models Validation and Simulation Results
5.4 Summary
6 Category 4 Protocol: Adaptive Energy Threshold for Improved Coexistence Between Licensed Assisted Access and Wi-Fi 
6.1 Coexistence Below Energy Detection Threshold
6.2 Adaptive Energy Detection Threshold Scheme
6.2.1 Scheme at the Wi-Fi side
6.2.2 Scheme at the LAA side
6.3 System Model
6.4 Performance Comparison
6.4.1 Single UE per technology located both at the cell-edge
6.4.2 Single UE per technology located both at the cell-centre
6.4.3 Multiple UE per technology uniformly distributed
6.5 Summary
7 Conclusions and Future work 
7.1 TDM-based MAC layer protocol
7.2 Frame Based Equipment MAC layer protocol
7.3 Insights towards a New Hybrid MAC layer protocol
7.4 Enhancing LTE/Wi-Fi Coexistence
References 
