Hardware Implementation of NB-IoT Downlink Receiver Synchronization Blocks

PhdSA
Junior Member level 1
Joined Mar 18, 2024
I am currently working on the implementation of an NB-IoT downlink receiver, programming all the blocks in the receiver chain. Specifically, I want to implement the time and frequency synchronization blocks in VHDL for hardware implementation on an FPGA. I would greatly appreciate any projects, documents, or resources related to this area that could assist me. Your support and guidance would be invaluable as I navigate this complex and exciting project.

Best Regards
 
I have implemented an NB-IoT receiver on an FPGA, but I am not clear on what you mean by time and frequency synchronisation.
The first step is extraction of the NB-IoT channel in the time domain and DC centering, followed by CP removal and then an FFT to get the preamble in the frequency domain.
The FFT size required was 8K in my case. That is far too large, so we decimated the signal with a chain of FIR decimators, which let us apply a much smaller FFT such as 512.
The tones were extracted (48 of them in most configurations; other options contained fewer or more tones) and sent to the software team for further processing, including delay measurement (timing advance, TA).
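The CP-removal/FFT/tone-extraction steps described above can be modelled in a few lines of NumPy before committing anything to VHDL. All numbers below (512-point FFT, CP length 36, 48 tones split around DC) are illustrative stand-ins, not taken from the 3GPP spec or from the poster's actual design:

```python
import numpy as np

def extract_tones(rx_symbol, cp_len, fft_size, tone_bins):
    """Drop the cyclic prefix from one OFDM symbol, transform to the
    frequency domain, and return the values on the occupied tone bins."""
    body = rx_symbol[cp_len:cp_len + fft_size]   # CP removal
    spectrum = np.fft.fft(body)                  # time -> frequency domain
    return spectrum[tone_bins]

# Illustrative parameters only: 512-point FFT after decimation, 48 tones
# placed symmetrically around DC (bin 0 itself left empty).
fft_size, cp_len = 512, 36
tone_bins = np.r_[fft_size - 24:fft_size, 1:25]  # 24 tones each side of DC
rx_symbol = (np.random.randn(cp_len + fft_size)
             + 1j * np.random.randn(cp_len + fft_size))
tones = extract_tones(rx_symbol, cp_len, fft_size, tone_bins)
print(tones.shape)   # 48 complex tone values per symbol
```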
 
Hello,
Thank you for your reply and for sharing your experience with the NB-IoT receiver on FPGA.

To clarify, the block diagram of the NB-IoT receiver we are using includes a synchronization and CP removal block as the first step.
[Image: block diagram of the NB-IoT receiver (block.png)]


In this block, we need to estimate the timing and the frequency offset of the received NB-IoT signal. This is achieved by performing a cross-correlation between the received signal and the known sequence of the Primary Synchronization Signal (PSS). The equation used for this process is shown below.
[Image: cross-correlation equation (cross_correlation.png)]
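The equation itself is only in the image, but the usual form of this metric is a sliding correlation R(n) = Σₖ r(n+k)·s*(k), peak-picked over the lag n. The NumPy sketch below uses a random unit-modulus sequence as a stand-in for the real PSS/NPSS, just to show the timing-peak behaviour; it is a model, not the VHDL architecture:

```python
import numpy as np

def sync_correlate(rx, ref):
    """Sliding cross-correlation of the received samples against a
    known local replica; the magnitude peak gives the coarse timing."""
    L = len(ref)
    n_lags = len(rx) - L + 1
    metric = np.empty(n_lags)
    for n in range(n_lags):
        # np.vdot conjugates its first argument: sum of r[n+k] * conj(s[k])
        metric[n] = abs(np.vdot(ref, rx[n:n + L]))
    return metric

# Toy check: embed the reference at a known offset (300) in weak noise.
rng = np.random.default_rng(0)
ref = np.exp(1j * np.pi * rng.random(132))   # stand-in sequence, NOT the real PSS
rx = 0.1 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
rx[300:300 + len(ref)] += ref
metric = sync_correlate(rx, ref)
peak = int(np.argmax(metric))
print(peak)
```

In hardware this inner product is typically a multiply-accumulate pipeline rather than a loop, but the metric is the same.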

I would be very grateful if you could share any projects or documents related to this area that could assist me in my implementation. Your support and guidance would be invaluable.

Thank you again for your help.

Best regards
 
Your block diagram and terminology relate to LTE OFDM, and no NB-IoT channel is indicated. I think you need to verify the topic: PSS and SSS are part of the LTE signal.

The LTE receiver is what you actually want to design. This is a major task, usually shared between a team of designers.
For more details you can try this forum: https://www.telecomhall.net/
That forum is dominated by software engineers working at the protocol layers, and there is little on the PHY layer for FPGA/ASIC platforms, but at least you will see the enormity of LTE and 5G.

If the LTE system supports NB-IoT, then two channels are allocated: nPrach for the preamble and nNBIoT for data. Once the LTE receiver locks to the LTE signal, the NB-IoT channels need not worry about such locking tasks; they just need extraction, DC centering, and FFT.
 
I am working on NB-IoT, which supports different deployment modes within an LTE carrier: in-band, guard-band, and standalone. For synchronization, the received signal is cross-correlated directly with the NPSS in the time domain, as proposed years ago for LTE, to extract the NB-IoT channel. I am also interested in DC centering, which corrects any DC offset so the signal is centered around zero frequency, and in the FFT stage. I would be grateful for any documents or projects that would help with the hardware implementation of the NB-IoT receiver.

Thank you again for your help.

Best regards
 
Yes, I am trying this approach, and I have started coding the primary synchronisation with the NPSS in VHDL. The sampling frequency is fs = 1.92 MHz, giving 19200 samples per frame, which makes the correlation computationally expensive. I therefore need to decimate: downsampling by a factor of d = 8 leaves 2400 samples per frame. However, the additional processing required for the Finite Impulse Response (FIR) filter needs to be taken into account. How can I design this filter? I would also be grateful for help with DC centering, which corrects any DC offset so the signal is centered around zero frequency.
 
DC centering should be done before the decimation filters. You need to know the centre frequency Fc of the NB-IoT channel within your captured band, then apply an NCO at -Fc to mix it down to DC.
Additionally, you may need to manage CFO (carrier frequency offset). You can check how this is done for LTE and adapt it to NB-IoT.
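The NCO mix described here can be modelled first in NumPy; in VHDL the same operation is typically a phase accumulator driving a sine/cosine lookup table (or CORDIC) feeding a complex multiplier. The 100 kHz offset and 1.92 MHz rate below are example numbers only:

```python
import numpy as np

def mix_to_dc(rx, f_offset_hz, fs_hz):
    """Multiply by a complex NCO at -f_offset so a channel originally
    centred at f_offset within the captured band lands at DC."""
    n = np.arange(len(rx))
    nco = np.exp(-2j * np.pi * f_offset_hz / fs_hz * n)
    return rx * nco

# Toy check: a tone at +100 kHz in a 1.92 MHz capture moves to 0 Hz.
fs = 1.92e6
n = np.arange(1920)
tone = np.exp(2j * np.pi * 100e3 / fs * n)
centred = mix_to_dc(tone, 100e3, fs)
print(np.allclose(centred, np.ones_like(centred)))
```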

For decimation by 8, I suggest you use a cascade of three halfband filters; I used 19 taps each for PRACH. You may also consider more decimation. The idea is that if the number of tones is 48, then even a 64-point FFT will do, provided Fs/N gives the right tone spacing, which I think is 3.75 kHz for NB-IoT.

In any case, I would model my work in Matlab or Octave before committing to VHDL.
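In that spirit, here is a NumPy model of the decimate-by-8 cascade built from three 19-tap halfband stages. The windowed-sinc design below is a generic stand-in for whatever coefficients are finally chosen; the reason halfband filters are attractive in hardware is that every second tap away from the centre is (near) zero, so those multipliers disappear:

```python
import numpy as np

def halfband_taps(n_taps=19):
    """Windowed-sinc lowpass with cutoff at a quarter of the input
    rate. Taps at even offsets from the centre are zero by design."""
    n = np.arange(n_taps) - (n_taps - 1) / 2
    h = 0.5 * np.sinc(0.5 * n) * np.hamming(n_taps)
    return h / h.sum()                      # normalise for unity DC gain

def halfband_decimate(x):
    h = halfband_taps()
    y = np.convolve(x, h)[:len(x)]          # filter (keep causal part)
    return y[::2]                           # keep every other sample

def decimate_by_8(x):
    # Three stages: 1.92 MHz -> 960 kHz -> 480 kHz -> 240 kHz
    for _ in range(3):
        x = halfband_decimate(x)
    return x

frame = np.ones(19200)                      # one 10 ms frame at 1.92 MHz
print(decimate_by_8(frame).shape)           # 2400 samples remain
```

A fixed-point version of these taps, with the zero taps skipped, maps directly onto a polyphase structure in VHDL.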
 
Thank you for your answer. I am working on the downlink NB-IoT receiver, so the tone spacing is 15 kHz. I would be grateful for any documents on the implementation of the filters and on carrier frequency offset; even a Matlab model would help. Thank you very much for your help and support.
Best Regards
 
Notice that you may not need to worry about CFO if your local oscillator error is small relative to the 15 kHz spacing. I mean that if 15 kHz is wide enough relative to the RF/oscillator error, there is no need to correct CFO.
 
Thank you very much for your assistance. Could you please guide me on how to select the appropriate USRP data rate to minimize oscillator error when detecting an NB-IoT signal at a center frequency of 900 MHz?
 
Assuming an oscillator accuracy of 2.5 ppm, the possible error is 900 MHz × 2.5 ppm = 2.25 kHz, so I assume it is OK for 15 kHz spacing.
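That arithmetic is easy to sanity-check (2.5 ppm is the assumed oscillator spec from the post above, not a measured value):

```python
f_carrier_hz = 900e6          # NB-IoT carrier frequency
osc_ppm = 2.5                 # assumed oscillator accuracy
cfo_hz = f_carrier_hz * osc_ppm * 1e-6
print(cfo_hz)                 # about 2.25 kHz, well under the 15 kHz tone spacing
```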
 