How to measure the Packet Error Rate during the BLE connection

This blog shows a method to measure the packet error rate (PER) during a BLE connection.

Basically, if you run the DTM (Direct Test Mode), it has an option to measure the Packet Error Rate (PER).

For example, by using a Python script to send UART commands that control the DTM firmware, the device can count how many packets are received correctly and report the PER.

Setting up a production test using DTM (nAN-34 white paper)

The computer (Upper Tester) software is written in the Python programming language and consists of a DTM library, an example script, and a readme file. The software is enclosed in the zip-file together with this application note. The software has been run and tested on a Windows 7 computer. The software can be ported to other platforms that support Python. The DTM library can set up different DTM modes through UART commands as described in the Bluetooth specifications Ver 4.0, Vol 6, part F. The example script uses the DTM library to control both Tester and DUT over their respective COM ports on the computer. The DTM library supports both receive and transmit test modes, and will output Packet Error Rate (PER) for both modes. Typically a PER of > 30% in either mode can be used to detect a failing DUT. Refer to the readme file included in the attached software for documentation on the requirements and usage of the software.

Click to access nan_34.pdf
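The 2-wire UART command format that the script sends is defined in the Bluetooth Core Specification v4.0, Vol 6, Part F. Below is a minimal C sketch of how such a command word can be packed and how the PER can be derived from the packet report returned after a TEST_END command. The helper names are illustrative only and are not the API of the nAN-34 Python library.

#include <stdint.h>

/* 16-bit DTM command word: CMD[15:14] | FREQ[13:8] | LENGTH[7:2] | PKT[1:0] */
#define DTM_CMD_RX_TEST    0x1
#define DTM_CMD_TX_TEST    0x2
#define DTM_CMD_TEST_END   0x3

static uint16_t dtm_cmd_pack(uint8_t cmd, uint8_t freq, uint8_t len, uint8_t pkt)
{
    return (uint16_t)(((cmd  & 0x03) << 14) |
                      ((freq & 0x3F) << 8)  |
                      ((len  & 0x3F) << 2)  |
                       (pkt  & 0x03));
}

/* A TEST_END command returns an LE Packet Report event: bit 15 is set and
 * bits 14:0 hold the number of packets received with CRC OK. */
static uint32_t dtm_per_percent(uint16_t packet_report, uint32_t packets_sent)
{
    uint16_t rx_ok = packet_report & 0x7FFF;

    if (packets_sent == 0)
    {
        return 0;
    }
    return ((packets_sent - rx_ok) * 100) / packets_sent;
}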


Is it possible to measure the Packet Error Rate during a BLE connection?

To answer this question, let's review the link layer of BLE.

When the BLE link is established, the master and slave keep exchanging packets on each connection interval, no matter whether the payload is empty or not.

In my GitHub repository (https://github.com/jimmywong2003/nrf5-packet-error-rate-measurement-on-ble-connection), I use the RADIO TX ready and RX CRC OK events to measure the PER.
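As a rough sketch of this idea (the exact timers and PPI channels below are my assumptions, not necessarily what the repository uses), the nRF52 RADIO events can be routed through PPI to timers running in counter mode, which keeps counting even while the SoftDevice owns the radio:

#include "nrf.h"

/* TIMER3 counts transmitted packets (TXREADY), TIMER4 counts CRC-OK receptions.
 * PPI channels 0 and 1 are assumed to be free for the application. */
static void radio_counters_init(void)
{
    NRF_TIMER3->MODE        = TIMER_MODE_MODE_LowPowerCounter << TIMER_MODE_MODE_Pos;
    NRF_TIMER3->BITMODE     = TIMER_BITMODE_BITMODE_32Bit << TIMER_BITMODE_BITMODE_Pos;
    NRF_TIMER3->TASKS_CLEAR = 1;
    NRF_TIMER3->TASKS_START = 1;

    NRF_TIMER4->MODE        = TIMER_MODE_MODE_LowPowerCounter << TIMER_MODE_MODE_Pos;
    NRF_TIMER4->BITMODE     = TIMER_BITMODE_BITMODE_32Bit << TIMER_BITMODE_BITMODE_Pos;
    NRF_TIMER4->TASKS_CLEAR = 1;
    NRF_TIMER4->TASKS_START = 1;

    /* RADIO TXREADY -> TIMER3 COUNT, RADIO CRCOK -> TIMER4 COUNT */
    NRF_PPI->CH[0].EEP = (uint32_t)&NRF_RADIO->EVENTS_TXREADY;
    NRF_PPI->CH[0].TEP = (uint32_t)&NRF_TIMER3->TASKS_COUNT;
    NRF_PPI->CH[1].EEP = (uint32_t)&NRF_RADIO->EVENTS_CRCOK;
    NRF_PPI->CH[1].TEP = (uint32_t)&NRF_TIMER4->TASKS_COUNT;
    NRF_PPI->CHENSET   = (1UL << 0) | (1UL << 1);
}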

The packet error ratio (PER) is the number of incorrectly received data packets divided by the total number of received packets. A packet is declared incorrect if at least one bit is erroneous. In this example, the difference between the number of TX packets sent and the number of RX packets with CRC OK is used to estimate the PER.
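Continuing the sketch above (still under the same assumptions about which timers hold the counters), the PER estimate can be read out like this:

static uint32_t per_percent_get(void)
{
    NRF_TIMER3->TASKS_CAPTURE[0] = 1;   /* latch number of TX packets sent */
    NRF_TIMER4->TASKS_CAPTURE[0] = 1;   /* latch number of RX packets with CRC OK */

    uint32_t tx_cnt = NRF_TIMER3->CC[0];
    uint32_t rx_ok  = NRF_TIMER4->CC[0];

    if (tx_cnt == 0)
    {
        return 0;
    }
    /* PER (%) = (TX packets sent - RX packets with CRC OK) / TX packets sent */
    return ((tx_cnt - rx_ok) * 100) / tx_cnt;
}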

This example also shows the throughput.
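A minimal throughput sketch (not the repository's implementation) simply counts the received application bytes and converts them to kbps once per second:

#include "nrf_log.h"

static volatile uint32_t m_bytes_received = 0;

/* Call from the data handler for every received notification. */
static void throughput_bytes_add(uint16_t len)
{
    m_bytes_received += len;
}

/* Call once per second, e.g. from an app_timer handler. */
static void throughput_print(void)
{
    NRF_LOG_INFO("Throughput = %u kbps", (m_bytes_received * 8) / 1000);
    m_bytes_received = 0;
}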

How to add the packet error rate module to the code

Note: the packet_error_rate module only works for the central role with an active connection.

Step 1:

Include the header file.
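For example (assuming the module's header keeps the name used in the repository; check the actual file name in the project):

#include "packet_error_rate.h"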

Step 2:

Reset the packet error rate counter and enable it.

(Also start the RSSI measurement on each connection event.)
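A sketch of this step is shown below; the packet_error_rate_* names are my assumptions about the module's API, while sd_ble_gap_rssi_start() is the standard SoftDevice call. Call it from the BLE_GAP_EVT_CONNECTED handler.

static void per_measurement_start(uint16_t conn_handle)
{
    packet_error_rate_reset_counter();   /* clear the TX / CRC-OK counters (assumed API) */
    packet_error_rate_detect_enable();   /* start counting radio events (assumed API) */

    /* Request RSSI reports from the SoftDevice: threshold 1 dBm, skip count 0,
     * so BLE_GAP_EVT_RSSI_CHANGED is generated whenever the level changes. */
    ret_code_t err_code = sd_ble_gap_rssi_start(conn_handle, 1, 0);
    APP_ERROR_CHECK(err_code);
}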

Step 3:

Print the RSSI and the packet success rate on each BLE_GAP_EVT_RSSI_CHANGED event.
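For example, inside the BLE event handler (packet_success_rate_get() is an assumed name for the module's query function):

static void ble_evt_handler(ble_evt_t const * p_ble_evt, void * p_context)
{
    switch (p_ble_evt->header.evt_id)
    {
        case BLE_GAP_EVT_RSSI_CHANGED:
        {
            int8_t  rssi         = p_ble_evt->evt.gap_evt.params.rssi_changed.rssi;
            uint8_t success_rate = packet_success_rate_get();   /* 100 - PER, assumed API */
            NRF_LOG_INFO("RSSI = %d dBm, packet success rate = %u %%", rssi, success_rate);
        } break;

        default:
            break;
    }
}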

Step 4:

The result (screen capture) is shown on the UART terminal or RTT terminal.

The example code can be found at

https://github.com/jimmywong2003/nrf52-ble5-long-range-demo

This example doesn't require any LCD display; all the results are printed on the UART terminal.

Comments and feedback on this blog are welcome. Please remember to give a "Like" if you think it is useful.
