Understanding 5G NB-IoT NTN System Performance

Matching satellite configurations to 5G NB-IoT capacity needs

Every satellite network has a different system set-up, and it can be a challenge to figure out how a given configuration pairs with 5G NB-IoT NTN to achieve adequate network performance and link conditions.

In this webinar, we shed light on common questions that satellite operators and service providers need answered: link-level and system-level performance, obtainable capacity, throughput, latency, number of supported devices, antenna design, and coverage. We discuss how to answer these questions while accounting for details such as the individual hardware platform and the choice of fading model.

Enjoy this free webinar in which Rene Brandborg Sørensen and Juline Hilsch discuss: 

  • How 5G NB-IoT performs in individual satellite systems 
  • How to investigate trade-offs in NB-IoT performance and satellite configurations 
  • Concrete simulation examples comparing a non-GEO and a GEO system 

On demand

FAQ

How do you simulate real-life scenarios in your feasibility study?

In essence, the realism of the feasibility study depends on the configuration of the scenario (the input parameters). The results are generally approximations or worst-/best-case figures, and where applicable they have been compared against similar state-of-the-art (SoTA) results. All modelling is an attempt to deconstruct or approximate reality in a way that is easier to deal with. In our feasibility study, we have divided the RAN (radio access network) into three major parts: the fading channel, the link level, and the system level. Fading channels can be developed from ray tracing, which is very realistic, from a more abstract/generalized model, or from 3GPP standardized models, depending on the choice. On the link level, we run extensive Monte Carlo simulations to find the link performance under the chosen fading model. On the system level, we have rigorous analytical models that account for many protocol aspects and signaling overheads (e.g. the various message sequences); this level relies on the realism of the two layers below.
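
To illustrate the link-level step, here is a minimal Monte Carlo sketch in Python, assuming simple BPSK over a flat Rayleigh fading channel; the actual study simulates the full NB-IoT waveform against the chosen fading model, so every parameter below is an illustrative assumption.

```python
import numpy as np

def monte_carlo_bler(snr_db, n_bits=100_000, block=256, seed=0):
    """Toy link-level Monte Carlo: BPSK over flat Rayleigh fading.

    Illustrative only; the real study simulates the full NB-IoT waveform
    (NPDSCH/NPUSCH) against the chosen fading model.
    """
    rng = np.random.default_rng(seed)
    snr_lin = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1.0 - 2.0 * bits                                # BPSK: 0 -> +1, 1 -> -1
    h = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
    noise = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2 * snr_lin)
    rx = h * symbols + noise
    detected = (np.real(np.conj(h) * rx) < 0).astype(int)     # coherent detection
    errors = detected != bits
    # A "block" fails if any bit in it is wrong (crude stand-in for a transport block).
    blocks = errors[: n_bits - n_bits % block].reshape(-1, block)
    return float(blocks.any(axis=1).mean())

for snr in (0, 5, 10, 15):
    print(f"SNR {snr:2d} dB -> BLER ~ {monte_carlo_bler(snr):.3f}")
```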

Read more about our 5G NTN Feasibility Study 

Which use cases for 5G NB-IoT would work best for GEO vs non-GEO sats?

Delay-tolerant applications suit both LEO and GEO. GEO has the advantage of providing terrestrial-like cells, while LEO provides global, albeit discontinuous, coverage and a lower propagation delay. It is cheaper to launch satellites into LEO than into GEO, so a GEO payload can typically be more expensive and justify a larger power budget than a LEO payload. The new space race with cubesats is especially enabling low-cost LEO payloads to be launched.

Can your feasibility study be used to determine the coverage and capacity of a satellite within a specific geographic area?

The feasibility study allows for ascertaining system-level KPIs: system capacity, UE QoS (throughput, latency), and UE energy consumption. This is done on the basis of the scenario definition, so it is indeed possible to define a specific geographic area, say the Himalayas, and ascertain the performance of a cell or a UE in that location.
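
As a purely hypothetical illustration of what such a scenario definition might contain (every field name and value below is ours for this example, not the tool's actual interface):

```python
# Hypothetical scenario definition for a geographic-area study; all field
# names and values are illustrative assumptions, not the tool's interface.
scenario = {
    "area": {"name": "Himalayas", "lat_deg": 28.0, "lon_deg": 84.0, "radius_km": 250},
    "constellation": {"type": "LEO", "altitude_km": 600, "inclination_deg": 97.6},
    "payload": {"eirp_dbw": 23.0, "band": "S", "beam_3db_width_deg": 2.0},
    "traffic": {"devices": 50_000, "payload_bytes": 32, "period_s": 7200},
    "kpis": ["system_capacity", "ue_throughput", "ue_latency", "ue_energy"],
}
```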

What bandwidth can be reached (in bits per second)?

The peak throughput is a bit lower than for terrestrial NB-IoT: around 127 kbit/s on the PDSCH (DL) and the same on the PUSCH (UL) at the link level, without accounting for propagation time. In reality, the obtainable throughput depends heavily on the link budget throughout the cell, which is a function of the satellite payload. In our feasibility study we can take this evaluation one step further and account for overhead in terms of static signaling and dynamic message exchanges (an application payload is embedded in a larger message exchange, e.g. RA+).
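
To show why the link budget dominates, here is a minimal free-space downlink budget sketch; the EIRP, G/T, and threshold values are assumed example numbers, not a specific payload.

```python
import math

def link_margin_db(eirp_dbw, g_over_t_db, freq_hz, slant_range_m,
                   bandwidth_hz, required_snr_db):
    """Toy link budget: C/N = EIRP + G/T - FSPL + 228.6 - 10*log10(B).

    All inputs are illustrative assumptions, not a specific satellite payload.
    """
    fspl_db = 20 * math.log10(4 * math.pi * slant_range_m * freq_hz / 3e8)
    cn_db = eirp_dbw + g_over_t_db - fspl_db + 228.6 - 10 * math.log10(bandwidth_hz)
    return cn_db - required_snr_db

# Assumed LEO at 600 km (zenith), S-band, 180 kHz NB-IoT carrier,
# handheld UE with an omnidirectional antenna (G/T ~ -24.6 dB/K).
margin = link_margin_db(eirp_dbw=23.0, g_over_t_db=-24.6, freq_hz=2e9,
                        slant_range_m=600e3, bandwidth_hz=180e3,
                        required_snr_db=4.0)
print(f"Downlink margin ~ {margin:.1f} dB")  # shrinks fast at low elevation
```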

Does the beam center move with the satellite in NGSO, or does it “track” the location of the NB-IoT devices in the FOV?

There are two scenarios defined by 3GPP in the NGSO case: 1) Earth-fixed cells, where the NGSO satellite steers its beams such that the cell projected on the ground does not move, and 2) Earth-moving cells, where the satellite has a fixed beam direction, such that the cell moves along with the satellite.

Why is NB-IoT based on 5G? Is there a technical limitation that prevented NB-IoT from working with 4G standards?

5G is a set of requirements for networks, as was 4G. One of the use cases targeted in 5G is massive machine-type communications (mMTC). The requirement for a 5G mMTC technology is that it must be able to serve 1 million devices per km², each sending 32 bytes of L2 data every 2 hours. After the requirements had been set, the development of new technologies for 5G started, and it was quickly found that NB-IoT and eMTC were sufficient for this requirement (terrestrially), given enough channels. These radio access networks are therefore 5G compliant and hence now called 5G. In the backbone of the network there is a core network; the 5G variant is called 5GC (5G Core) and the 4G variant is called EPC (Evolved Packet Core). Even though the RAN remains largely the same (though it has evolved over the 3GPP releases), there are some differences in the base station depending on whether it interfaces with the 5GC or the EPC.
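
For a sense of scale, the mMTC target above translates into a modest aggregate bit rate; the difficulty lies in the number of uncoordinated short transmissions rather than the raw load:

```python
# Back-of-envelope check of the 5G mMTC target: 1,000,000 devices per km^2,
# each sending 32 bytes of L2 data every 2 hours.
devices_per_km2 = 1_000_000
payload_bits = 32 * 8
period_s = 2 * 3600
offered_load_bps = devices_per_km2 * payload_bits / period_s
print(f"Aggregate L2 load ~ {offered_load_bps / 1e3:.1f} kbit/s per km^2")  # ~35.6
```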

What kinds of waveforms have you developed?

We have developed waveforms ranging from GMR-1 to DAMA protocols, to Inmarsat BGAN, to 5G NB-IoT – for military purposes as well as commercial services. If you'd like more details on a specific waveform, just let us know.

Has Gatehouse Satcom done any study concerning potential interference between the terrestrial component and the NTN component within the same network?

We have not studied interference between TN and NTN. The networks should be separated in frequency, with appropriate guard bands handling the Doppler shift in the NGSO case. The bands and channels allocated for NTN and TN are determined by standardization organisations such as the ITU, 3GPP, and ETSI. As a general rule, you can count on interference not being allowed.
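
For a feel for the guard bands involved, here is a rough worst-case Doppler estimate for a LEO satellite at S-band; the speed and carrier are assumed example values, and since only the radial velocity component actually contributes, this is an upper bound.

```python
# Upper-bound Doppler shift for an assumed LEO pass at S-band.
orbital_speed_m_s = 7.56e3  # assumed orbital speed at ~600 km altitude
carrier_hz = 2e9            # assumed S-band carrier
c_m_s = 3e8
max_doppler_hz = orbital_speed_m_s / c_m_s * carrier_hz
print(f"Max Doppler ~ +/-{max_doppler_hz / 1e3:.0f} kHz")  # ~ +/-50 kHz
```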

Is it possible to emulate 5G NB-IoT network links?

Yes, real-life testing is possible with our in-orbit emulator. Gatehouse Satcom offers tools for both off-air and in-orbit testing that can create controlled and fully configurable environments to emulate various real-world scenarios.

How is your system coping with finding satellites when both devices and satellites are moving?

3GPP has defined functionality with respect to the channel raster such that UEs will always be able to search for, find, and correctly identify any available channel. The trick is to find an available cell by searching for that particular channel while in coverage of a serving satellite. This can be helped by satellite assistance information, a feature included in Rel-17.

What has been done to minimize signaling overhead?

NB-IoT is an LPWAN, that is, a low-power wide-area network; such protocols are optimized for long-range transmission of small data packets. NB-IoT therefore already has comparatively little signaling overhead compared to other protocols (which is also why its feature set is minimized). Further, Gatehouse is implementing DoNAS (Data over NAS) in its waveform, and it is already accounted for in the analysis.

How are your simulations handling dynamics of moving satellites?

In the case of a GEO satellite, the cell has a static link budget, and the elevation angle toward the satellite does not change over time. In the case of an NGSO earth-fixed cell, the cell has a fixed position, and so would a stationary UE within it, but the link budget and elevation angles are dynamic and change over time, so we compute these across a satellite pass. In the case of an NGSO earth-moving cell, the link budget and elevation angles are static within the cell, but the cell moves over the UE. This is equivalent to a UE travelling within a GEO cell (at approximately 7.3 km/s).
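
As a sketch of what computing these quantities across a pass can look like, the snippet below evaluates the elevation angle over an assumed 600 km zenith pass using standard spherical geometry; the altitude and orbital speed are illustrative assumptions.

```python
import math

R_E = 6371.0  # mean Earth radius, km

def elevation_deg(alt_km, central_angle_deg):
    """Elevation toward a satellite whose sub-satellite point is offset from
    the UE by the given great-circle angle (standard spherical geometry)."""
    psi = math.radians(central_angle_deg)
    rho = R_E / (R_E + alt_km)
    return math.degrees(math.atan2(math.cos(psi) - rho, math.sin(psi)))

# Assumed 600 km LEO pass directly over a stationary UE; the central angle
# grows at the orbital angular rate v / (R_E + h).
omega_deg_s = math.degrees(7.56 / (R_E + 600))
for t_s in (-120, -60, 0, 60, 120):  # seconds relative to zenith
    angle = abs(t_s) * omega_deg_s
    print(f"t = {t_s:+4d} s  elevation ~ {elevation_deg(600, angle):5.1f} deg")
```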

Does 5G NB-IoT work on Ka/Ku band?

Rel-17 will work in the S-band, but preliminary work has already started on the Ka-band, and it is likely that higher bands will be supported in future releases. The higher frequencies are a source of wider spectrum/bandwidth for NTN networks, but they involve major challenges – in particular the increased propagation loss. It could well be unfeasible to fly Ka/Ku-band payloads on cubesats due to the limited power budget.
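
To quantify the propagation challenge: free-space path loss grows with 20·log10(f), so the penalty for moving up in frequency depends only on the frequency ratio (the carriers below are assumed example values).

```python
import math

# Extra free-space path loss going from an assumed 2 GHz S-band carrier
# to an assumed 20 GHz Ka-band carrier, at the same distance.
delta_db = 20 * math.log10(20e9 / 2e9)
print(f"Ka-band costs ~{delta_db:.0f} dB more path loss than S-band")  # ~20 dB
```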

Apart from UEs and satlinks, is there any need for ground infrastructure to establish 5G IoT communication?

Indeed, the radio access network (RAN) – NB-IoT, LTE, LoRaWAN, etc. – is just the communication link between UEs and satellites. To make this link useful, a link to the core network on Earth must be established. This latter link is known as the feeder link in satcom terminology and is established between the satellite and large ground stations. The feeder link must provide sufficient capacity for the cumulative RAN traffic (plus other telemetry) to be exchanged, which is why ground stations typically have large steerable antennas and high transmission power.

What are the latency considerations for IoT use cases?

The latency in NTN is larger than in terrestrial networks due to the longer propagation delay. In some satellite constellations, coverage cannot be provided continuously on the ground either, so IoT devices for NTN must be delay tolerant.
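
The zenith one-way propagation delays below give a feel for the numbers (illustrative altitudes; slant paths at low elevation are longer, and NB-IoT repetitions and HARQ add delay on top):

```python
# One-way zenith propagation delay for typical orbit altitudes.
C_M_S = 299_792_458
for name, alt_km in (("LEO (600 km)", 600),
                     ("MEO (10,000 km)", 10_000),
                     ("GEO (35,786 km)", 35_786)):
    delay_ms = alt_km * 1e3 / C_M_S * 1e3
    print(f"{name:16s} ~ {delay_ms:6.1f} ms")  # LEO ~2 ms, GEO ~119 ms
```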

Are there any satellite crosslink capabilities providing global coverage and connectivity within one satellite footprint?

Yes, inter-satellite links (ISLs) can be used for networking and routing between satellites. However, in Rel-17 the focus has been on bent-pipe satellite payloads, i.e. satellites that act as relays where the ground station is the actual base station. The focus in a future release first needs to switch to regenerative payloads, i.e. base stations onboard the satellite, and then to ISLs later. Nothing hinders ISLs at the moment – they are just not standardized.

What are the typical message lengths (in kilobytes) that can be sent and received via satellite NB-IoT? How does it compare with cellular NB-IoT?

The transport block sizes in NTN NB-IoT are the same as in terrestrial NB-IoT, so the difference lies in the fading model and the link budget. Provided that the link budget of a satellite payload is comparable to that of a terrestrial cell, the typical message lengths will be comparable between TN and NTN. Basically, in most cases you should be able to expect TN-like performance if the satellite payload is well designed.

How does the signaling overhead compare for the satellite assistance SIB in LEO vs GEO configurations?

In short, GEO will have little overhead, while NGSO – and especially LEO – will see more, but we expect at most a few percent overhead on the anchor channel. Two SIBs are defined for NTN IoT: the first is for uplink synchronization, and the second (to be defined in May) is for helping UEs predict coverage in discontinuous-coverage scenarios, to better enable mobile-originated (MO) traffic. The first SIB has a fixed size regardless of the use case, but in LEO it may need to be transmitted, for example, once per second (depending on the orbit, the satellite payload, GNSS, and the band of interest), whereas in GEO a UE need only receive it once. Overall, this SIB should take up at most a few percent of the anchor channel. The SIB for satellite assistance information (SAI) is not defined yet, but we expect it to have a variable size with plenty of optional parameters. The SIB-SAI is optional and should not be an overhead in GEO; it should be expected as overhead in discontinuous NGSO coverage only. The SIB-SAI need only be received by UEs once, but the overhead will again be larger for LEO, where the satellite moves faster – a rate of once per 5 or 10 seconds should be feasible.
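
To make the "few percent" figure concrete, here is a back-of-envelope calculation; the SIB size, broadcast period, and anchor-channel rate are all assumed numbers, not values from the specification.

```python
# Assumed numbers only -- not from the 3GPP specification.
sib_bits = 400            # assumed over-the-air size of the uplink-sync SIB
broadcast_period_s = 1.0  # assumed LEO broadcast period (once per second)
anchor_rate_bps = 20_000  # assumed usable downlink rate on the anchor channel
overhead = sib_bits / broadcast_period_s / anchor_rate_bps
print(f"SIB overhead ~ {overhead:.0%} of the anchor channel")  # ~2%
```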

How will the beamforming be applied from the satellite?

In NTN IoT the goal is to reuse the hardware platforms of terrestrial cellular, so the UEs are essentially similar to handheld devices with an omnidirectional antenna. Beamforming may be applied on the satellite side to orient the beam toward a specific geolocation in the “earth-fixed cell” scenario.

Is there a difference between cell size and beam size?

Yes. A ‘beam’ refers to the RF or ‘physical’ power from the TX side, which is a continuous function. A ‘cell’ is a logical entity on the RX side in a cellular network, determined as the area within the ‘beam’ where certain criteria are met: synchronization, and SNR above a threshold.
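
An illustrative way to see the difference: with an assumed parabolic beam roll-off of 12·(θ/θ3dB)² dB, the cell edge sits wherever the SNR crosses the threshold, which need not coincide with the beam's 3 dB contour. All numbers below are example assumptions.

```python
import math

def cell_edge_offset_deg(peak_snr_db, snr_threshold_db, beam_3db_width_deg):
    """Offset from boresight where SNR falls to the cell threshold, assuming
    the common parabolic beam approximation loss_dB = 12*(theta/theta_3dB)^2.
    Purely illustrative of beam (RF) vs cell (logical) sizing."""
    allowed_loss_db = peak_snr_db - snr_threshold_db
    return beam_3db_width_deg * math.sqrt(allowed_loss_db / 12)

# Assumed 10 dB SNR at boresight, 0 dB cell threshold, 2 deg 3dB beamwidth:
print(f"Cell edge ~ {cell_edge_offset_deg(10, 0, 2):.2f} deg off boresight")
```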

Are there standardized models for fading simulation made by 3GPP as well?

Yes, 3GPP has standardized CDL and TDL fading models for NTN based on the IST-WINNER II model.