An analogue sensor has a bandwidth which extends from very low frequencies up to a maximum of 14.5 kHz. Using the Sampling Theorem, what is the minimum sampling rate (number of samples per second) required to convert the sensor signal into a digital representation?

 

If each sample is now quantised into 2048 levels, what will be the resulting transmitted bitrate in kbps?

 

Give your answer in scientific notation to 1 decimal place. 
 

Hint: you firstly need to determine the number of bits per sample that produces 2048 quantisation levels.

 Feb 7, 2021
 #1

The Sampling Theorem requires sampling at least twice the maximum frequency: 2 × 14.5 kHz = 29 kHz minimum sampling rate.

 

2048 can be represented as 2^11, so 11 bits will be able to encode 2048 levels.

 

29,000 samples/sec × 11 bits/sample = 319,000 bits/sec = 319 kbps ≈ 3.2 × 10² kbps in scientific notation to 1 decimal place.
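A quick Python sketch to verify the arithmetic (variable names are my own, just for illustration):

import math

f_max = 14.5e3                        # maximum signal frequency, Hz
levels = 2048                         # number of quantisation levels

fs = 2 * f_max                        # Nyquist: sample at >= 2 x f_max -> 29000.0 samples/s
bits_per_sample = math.log2(levels)   # 2**11 = 2048 -> 11.0 bits
bitrate = fs * bits_per_sample        # 319000.0 bits/s

print(f"Sampling rate: {fs:.0f} samples/s")
print(f"Bits per sample: {bits_per_sample:.0f}")
print(f"Bitrate: {bitrate/1000:.0f} kbps = {bitrate:.1e} bit/s")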

 Feb 7, 2021
