Simulation Environment


Long Term Evolution (LTE)

Long Term Evolution, or LTE [7] [8] [9] as it is commonly known, is meant to bring a unified approach to communication, something all previously deployed networks lack.
For cellular access we use HSDPA/HSUPA (the latest 3.5G) or, in some countries, 2G networks, and WiFi – IEEE 802.11a/b/g – for Internet access. Each of these technologies requires different equipment and frequency bands. LTE unifies all of this into a single communication entity [7], saving the end user money and improving convenience. WiMAX is a serious competitor to this technology, but it loses to LTE on all grounds except deployment, which has been completed in a few countries.
LTE is regarded as a pre-4G technology, as it does not fulfill the International Telecommunication Union (ITU-R) requirements for data rate and heterogeneity of networks.
LTE can operate in the frequency range from 900 MHz to 2.6 GHz [7]. It aims to provide a high-data-rate, low-latency, packet-optimized radio access technology supporting flexible bandwidth deployment, with bandwidths ranging from 1.25 MHz to 20 MHz. The 20 MHz bandwidth gives a peak data rate of 326 Mbps using 4×4 Multiple Input Multiple Output (MIMO). For the uplink, MIMO is not yet implemented, so the uplink data rate is limited to 86 Mbps [7]. LTE uses Orthogonal Frequency Division Multiple Access (OFDMA), which gives high spectral efficiency and robustness against multipath fading [9].
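The quoted 326 Mbps peak can be sanity-checked with a rough calculation. The constants below are standard LTE numerology assumed here (100 resource blocks at 20 MHz, 14 OFDM symbols per 1 ms sub-frame), not figures stated in the text, and coding/control overhead is ignored:

```python
# Rough sanity check of the quoted 326 Mbps downlink peak. The constants are
# standard LTE numerology assumed here (not stated in the text); coding and
# control overhead are ignored, so the result is a raw upper bound.
RBS_AT_20MHZ = 100            # resource blocks in a 20 MHz carrier
RB_SUBCARRIERS = 12           # sub-carriers per resource block
SYMBOLS_PER_MS = 14           # OFDM symbols per 1 ms sub-frame (normal CP)
BITS_PER_64QAM_SYMBOL = 6     # 64-QAM carries 6 bits per symbol
MIMO_LAYERS = 4               # 4x4 MIMO spatial multiplexing

raw_bps = (RBS_AT_20MHZ * RB_SUBCARRIERS * SYMBOLS_PER_MS * 1000
           * BITS_PER_64QAM_SYMBOL * MIMO_LAYERS)
print(f"{raw_bps / 1e6:.1f} Mbps")   # ~403 Mbps raw, before overhead
```

The raw figure of about 403 Mbps, reduced by roughly 20% for control and reference-signal overhead, is consistent with the cited 326 Mbps peak.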
Compared to HSPA, LTE provides two to four times higher spectral efficiency. Moreover, the LTE radio interface provides a low packet-transmission latency of 10 ms from the network to the User Equipment (UE). There is also some improvement in cell-edge performance using the same macro network. LTE supports both unicast and multicast traffic, in microcells of up to hundreds of meters and in macro cells of more than 10 km in radius. The system supports both FDD (Frequency Division Duplex) and TDD (Time Division Duplex); in half-duplex FDD, the UE is not required to transmit and receive at the same time, which avoids the need for a costly duplexer in the UE. It is optimized for speeds up to 15 km/h but can be used up to 350 km/h with some tolerable performance degradation [9]. For the uplink it uses the Single Carrier FDMA (SC-FDMA) access technique, which gives greater uplink coverage owing to its low Peak-to-Average Power Ratio (PAPR).
For this purpose, a new network architecture has been designed with the aim of supporting packet-switched traffic with seamless mobility, low latency and high quality of service (QoS). Some basic LTE air-interface parameters are summarized in Table 1.

OFDM & MIMO Systems

LTE uses Orthogonal Frequency Division Multiplexing – OFDM [8] [9]. For modulation it employs 64-QAM (Quadrature Amplitude Modulation), which combines ASK (Amplitude Shift Keying) and PSK (Phase Shift Keying), enabling several bits to be transmitted per symbol. Each symbol is the result of a particular amplitude and phase. With the adjacent symbols spaced more widely, the Bit Error Ratio is brought down [9]. In LTE a 64-point QAM constellation is used [Figure 3-2]. The data is multiplexed over several low-rate channels, which together carry the aggregate data. These sinusoidal sub-carriers are spaced closely together, orthogonally, without interfering with each other. Hence the available spectrum is efficiently used, i.e. more bits per second per Hz (spectral efficiency). Each sub-carrier is modulated using the QAM technique. The OFDM signal is thus a composite of several low-rate data streams, each of which sees a flat-fading channel [10].
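A minimal sketch of the 64-QAM idea described above: each of the 64 constellation points is one amplitude/phase combination carrying 6 bits. The levels below are illustrative only; the actual LTE mapping additionally uses Gray coding and power normalization:

```python
import math

# Minimal 64-QAM constellation sketch: each symbol is one of 8 levels on the
# in-phase (I) axis combined with one of 8 on the quadrature (Q) axis.
# Illustrative only: LTE's actual mapping adds Gray coding and normalization.
levels = [-7, -5, -3, -1, 1, 3, 5, 7]                # 8 PAM levels per axis
constellation = [complex(i, q) for i in levels for q in levels]

bits_per_symbol = int(math.log2(len(constellation)))
print(len(constellation), bits_per_symbol)           # 64 points, 6 bits/symbol
```

Each point's magnitude and angle are exactly the amplitude and phase of the transmitted sinusoid for that symbol.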
Time Domain – Orthogonality is ensured by requiring that an integer number of cycles of each sub-carrier occurs within the symbol time window [9] [10].
Frequency Domain – Ensured by aligning the peak of any one carrier wave with the nulls of all the others [Figure 3-3].
The benefit of such spacing is that the demodulator can easily and distinctly distinguish between the frequencies, so the receiver extracts the correct ones. This is essential in a terrestrial environment with many sources of distortion [9] [10].
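The orthogonality described above can be checked numerically: over one symbol window, the inner product of two distinct sub-carriers vanishes. This is a discrete-time sketch with hypothetical carrier indices 3, 4 and 5:

```python
import cmath

# Discrete-time check of sub-carrier orthogonality over one symbol window of
# N samples: distinct carriers (hypothetical indices 3 and 5) have zero inner
# product, while a carrier correlated with itself gives the full energy N.
N = 64

def inner(k, m):
    return sum(cmath.exp(2j * cmath.pi * k * n / N)
               * cmath.exp(-2j * cmath.pi * m * n / N)
               for n in range(N))

assert abs(inner(3, 5)) < 1e-9        # distinct carriers: orthogonal
assert abs(inner(4, 4) - N) < 1e-9    # same carrier: full correlation
print("orthogonality verified")
```

The integer number of cycles per window is what makes the cross-term sum to zero; a fractional offset would break the cancellation.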
Since the dispersive channel effects are eliminated in OFDM, MIMO – Multiple Input Multiple Output – radio antennas can also be used effectively. OFDM divides the channel into several sub-frequencies, making it an effective scheme to combine with MIMO. This boosts throughput, as several independent data streams are transmitted in parallel via different antennas.
These sub-frequencies are orthogonal, meaning they do not correlate with each other. The frequency of the k-th subcarrier is given by f_k = k · Δf (Eq. 2), where Δf is the subcarrier spacing.
Each subcarrier is first modulated with a data symbol of either 1 or 0; the resulting OFDM symbol is then formed by simply adding the modulated carrier signals. This OFDM symbol has a larger magnitude than any individual subcarrier and thus a high peak value, which is characteristic of the OFDMA technique [10].
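This composition can be sketched numerically, assuming the 0/1 data symbols mentioned above: summing the modulated sub-carriers yields a composite signal whose peak power sits far above its average, i.e. the high PAPR that characterizes OFDM(A):

```python
import cmath

# Sketch of OFDM symbol formation with the 0/1 data symbols described above.
# Each sub-carrier k is weighted by its data symbol and the results are summed.
N = 16                      # number of sub-carriers (illustrative)
data = [1, 0] * (N // 2)    # alternating 0/1 data symbols (hypothetical)

samples = []
for n in range(N):          # one sample per instant over the symbol period
    s = sum(d * cmath.exp(2j * cmath.pi * k * n / N)
            for k, d in enumerate(data)) / N
    samples.append(s)

peak = max(abs(s) ** 2 for s in samples)
avg = sum(abs(s) ** 2 for s in samples) / N
print(f"PAPR = {peak / avg:.1f}")   # composite peak well above the average power
```

The inner sum is exactly an inverse DFT, which is how practical OFDM transmitters form the composite symbol via an IFFT.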

Network Architecture

System Architecture Evolution (SAE) [13] is the Core Network (CN) architecture of modern cellular networks. An important factor of this architecture is that it is heterogeneous: it enables other wireless networks (2.5G, 3G) to co-exist, with LTE merely an extension. SAE evolved from the GPRS core network. This makes implementing LTE networks cheap, since the existing network infrastructure can be reused, and makes LTE a much better choice in comparison with other wireless technologies.
The SAE architecture is an all-IP network. Every service is in the PS (Packet-Switched) domain rather than the CS (Circuit-Switched) domain. The architecture also supports mobility to other systems such as WiMAX, 2G and 3G.
In addition to the GPRS core network, the Evolved Packet Core (EPC) is added to facilitate LTE [13]. This provides IP-based voice and data services simultaneously, and improves the scalability of service applications and user capacity. Once it is embedded into the network, only minimal upgrades are needed, as the services become IP-based. The control-plane functions are made simpler and easier to implement [10] [14].

Frame/Sub-Frame

Each frame consists of 10 sub-frames, each containing 14 OFDM symbols [15]. The frame is 10 ms in duration and each sub-frame is 1 ms. Each sub-frame is further divided into time slots containing 6-7 OFDM symbols each, depending on the length of the cyclic prefix. The LTE downlink physical channels consist of: Physical Downlink Shared Channel (PDSCH) – 3 OFDM symbols per sub-frame, about 21% of the symbols in the sub-frame, are given up as overhead for control.
Physical Downlink Control Channel (PDCCH) [15] – consumes 3 OFDM symbols. It carries mobile-device-specific information to optimize the communication.
Common Control Physical Channel (CCPCH) – carries the cell-wide control information. It is transmitted near the centre frequency [9] [15].
The cyclic prefixes used can be seen in the diagram. They can also serve as pilot signals, providing a reference for MIMO. Every resource block spans 12 sub-carriers × 14 symbols, and the mapping is based on scheduling – localized or distributed (distributed preferred) [10].
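The frame arithmetic above can be tied together in a few lines (normal cyclic prefix assumed, giving 7 symbols per slot):

```python
# Resource-grid arithmetic implied by the text, assuming the normal cyclic
# prefix (7 symbols per slot; the extended prefix would give 6).
FRAME_MS = 10
SUBFRAMES_PER_FRAME = 10
SLOTS_PER_SUBFRAME = 2
SYMBOLS_PER_SLOT = 7
RB_SUBCARRIERS = 12

subframe_ms = FRAME_MS / SUBFRAMES_PER_FRAME           # 1 ms per sub-frame
symbols_per_subframe = SLOTS_PER_SUBFRAME * SYMBOLS_PER_SLOT
re_per_rb_pair = RB_SUBCARRIERS * symbols_per_subframe
print(symbols_per_subframe, re_per_rb_pair)   # 14 symbols, 168 resource elements
```

Each of those 168 resource elements is one sub-carrier over one symbol period, and the scheduler's job is to map them to channels and users.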


Sub-Carrier Allocation and User Scheduling

OFDMA (Orthogonal Frequency Division Multiple Access) characterizes every user by a set of sub-carriers. Within a single channel spectrum, several sub-carriers are allocated to each user based on their requirements. This scheme is particularly useful in the downlink when the number of users is high. If the required data rate is low, the scheme is well suited, as it consumes fewer resources and effectively reduces delay. The mobile users can be synchronized in both the time and frequency domains, which makes the uplink orthogonal and in sync [16].
In LTE the data is sent in resource blocks, each encompassing 12 sub-carriers in a single time slot. The resource block size is kept constant [15].

Time and Freq Domain – user scheduling [17]

The channel quality can also be taken into account, wherein a user is assigned a particular time and frequency slot according to the channel quality there. The channel-quality information has to be transmitted so that the base station can effectively slot the user into that particular time and frequency region. LTE reports the channel quality every 1 ms [17]. At low mobility speeds, frequency-selective scheduling performs as well as or even better than in the stationary case: the channel quality varies, so there is more chance to transmit when the channel quality is good. The mobile device's performance varies over different portions of the spectrum, since the wide bandwidth (20 MHz) causes frequency-selective fading. The best sub-frequencies can be determined from feedback (channel-quality information) from the mobile device. There are two ways to select the better sub-carriers – a sequential scheduler and a matrix-based scheduler. The matrix-based scheduler performs slightly better [14] [10].
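The two scheduler flavours can be illustrated on a small hypothetical CQI matrix (rows are users, columns are resource blocks). The greedy "matrix-based" variant below is only a sketch of the idea of scoring all (user, RB) pairs jointly, not the exact algorithm of [14]:

```python
# Hedged sketch of the two scheduler flavours over a hypothetical CQI matrix.
cqi = [
    [9, 3, 5],   # user 0's reported quality on RBs 0..2
    [4, 8, 2],   # user 1
    [6, 7, 1],   # user 2
]

def sequential(cqi):
    # Walk the RBs in order; each RB goes to the user with the best CQI on it.
    return {rb: max(range(len(cqi)), key=lambda u: cqi[u][rb])
            for rb in range(len(cqi[0]))}

def matrix_based(cqi):
    # Score all (user, RB) pairs jointly and allocate the best pairs first,
    # giving each user at most one RB (a simple global greedy).
    pairs = sorted(((cqi[u][rb], u, rb)
                    for u in range(len(cqi)) for rb in range(len(cqi[0]))),
                   reverse=True)
    alloc, used_users, used_rbs = {}, set(), set()
    for q, u, rb in pairs:
        if u not in used_users and rb not in used_rbs:
            alloc[rb] = u
            used_users.add(u)
            used_rbs.add(rb)
    return alloc

print(sequential(cqi))    # {0: 0, 1: 1, 2: 0} - user 2 gets nothing
print(matrix_based(cqi))  # {0: 0, 1: 1, 2: 2} - every user is served
```

The joint view lets the matrix-based variant trade a weak assignment on one RB for a better overall allocation, which is the intuition behind its slight performance edge.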
Two types of signals are primarily used to assign the resource elements to the UE. Reference signals [10] [17]: a reference signal is the product of an orthogonal sequence and a pseudorandom numerical (PRN) sequence. 3GPP defines 510 unique reference signals; each can be assigned to a cell to identify it in the network.
Synchronization signals [10] [17]: there are two synchronization signals, primary and secondary. They are used in the cell search by the mobile device and are transmitted in the 0th and 10th slots of the frame.

LTE Benefits [10] [9]

 LTE can provide practical data rates of 100 Mbps for the downlink and 50 Mbps for the uplink (20 MHz spectrum), with a very low latency of less than 10 ms.
 It has high spectral efficiency and scalable bandwidth ranging from 1.25 MHz to 20 MHz.

Indoor Solutions

As mentioned in Chapter 1, the increase in overall cellular traffic is mainly due to indoor users. To cope with this ever-increasing demand for bandwidth, there is a dire need for a solution. WiFi and femto cells both present attractive alternatives; we look at the pros and cons of each separately.

WiFi

WiFi refers to the wireless communication standard IEEE 802.11 [18]. The most recent standard, IEEE 802.11n [19], can provide a peak data rate of up to 75 Mbps. Given the high penetration of WiFi access points and WiFi-enabled devices, WiFi can be an attractive indoor traffic-offload solution for mobile operators.
WiFi works in unlicensed spectrum but has no independent voice service, which can result in higher latencies for voice packets. Moreover, WiFi does not ensure secure communication, owing to vulnerabilities in its protocols such as WPA [20].

Femto Cells

A femtocell can be defined as "a personal mobile network in a box" [21]. A femtocell uses a low-power Femto Access Point (FAP) that routes femtocell traffic to the cellular network over a fixed broadband connection. Moreover, it is part of a Self-Organizing Network (SON) with zero-touch installation. It supports a limited number of active connections, 3 or 4, extensible to between 8 and 32. It is used to improve indoor signal strength, as it avoids wall penetration loss.

Interference Management

With the deployment of femto cells within macro cells, the role of interference management becomes extremely important. The idea is to optimize the macro-network behavior with respect to the interference-capacity relationship. Macro-macro, macro-femto and femto-macro interference are considered. For simplicity, the interfering impact of a femto cell on neighboring femto cells (femto-femto) is not considered, mainly because femto cells are low-powered devices and the added penetration loss of the indoor environment would make the impact insignificant.
As a starting point, the Shared-Spectrum technique, in which macro and femto cells use the same frequency band, is applied. It is then compared against the Split-Spectrum technique, in which femto and macro cells have dedicated, non-overlapping bandwidth, to evaluate the impact of the two techniques on macro-network performance.
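The trade-off between the two techniques can be illustrated with a toy Shannon-capacity calculation. The bandwidth, SNR and interference figures below are hypothetical, chosen for illustration, and are not parameters from this thesis's simulations:

```python
import math

# Toy Shannon-capacity comparison of the two spectrum techniques for one
# macro user. All numbers (bandwidth, SNR, interference) are hypothetical.
B = 20e6      # total system bandwidth, Hz
snr = 100.0   # linear macro-user SNR with no cross-tier interference
inr = 30.0    # linear femto-to-macro interference-to-noise ratio (assumed)

# Shared spectrum: the macro user keeps the full band but its SINR drops.
shared_bps = B * math.log2(1 + snr / (1 + inr))
# Split spectrum: the macro user keeps half the band, free of femto interference.
split_bps = (B / 2) * math.log2(1 + snr)

print(f"shared: {shared_bps / 1e6:.1f} Mbps, split: {split_bps / 1e6:.1f} Mbps")
```

Which technique wins depends on the interference level: under heavy cross-tier interference the cleaner half-band can outperform the full shared band, while with little interference the shared band's extra spectrum dominates. That dependence is precisely what comparing the two techniques is meant to expose.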

Table of contents:

List of Abbreviations
1 Introduction
1.1 Motivation
1.2 Goal
1.3 Thesis Layout
2 Cellular technologies
2.1 Evolution of Networks
2.2 Related Work
3 Long Term Evolution (LTE)
3.1 Description
3.2 OFDM & MIMO Systems
3.2.1 OFDM
3.2.2 MIMO
3.3 Network Architecture
3.4 Frame Structure
3.4.1 Frame/Sub-Frame
3.5 Sub-Carrier Allocation and User Scheduling
3.5.1 Time and Freq Domain – user scheduling [17]
3.6 LTE Benefits [10] [9]
3.7 Indoor Solutions
3.7.1 WiFi
3.7.2 Femto Cells
3.7.3 Choice of solution
3.8 Interference Management
4 Simulation Environment
4.1 Simulation tool
4.1.1 Matlab-based System Level Simulator
4.2 Simulation Scenarios
4.2.1 Scenario 1
4.2.2 Scenario 2
4.2.3 Scenario 3
4.3 Common Simulation Parameters [15]
4.3.1 eNodeB Parameters
4.3.2 Femto cell parameters
5 Performance Evaluation
5.1 Without Femto cell deployment (Scenario 1)
5.2 With deployment of Femto Cells (Scenario 2)
5.3 Effect of Interference Management (Scenario 3)
6 Conclusion and Future Work
7 References
