An analog sensor device has a bandwidth that extends from very low frequencies up to 7.25 kHz. Using the Sampling Theorem, what is the minimum sampling rate (number of samples per second) required to convert the output signal from the device into a digital representation?

 

If each sample is now quantized into 2048 levels, what will be the resulting device output bitrate in kbps?

 

Give your answer in scientific notation to 1 decimal place.

 Feb 12, 2022
 #1

You need to sample at least 2x the highest frequency to avoid losing information; higher rates are better.

 

2 x 7.25 kHz = 14.5 kHz sampling rate
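A quick sketch of this Nyquist-rate calculation in Python (the variable names are just for illustration):

```python
# Sampling Theorem: sample at a rate of at least 2x the highest
# frequency component in the signal.
f_max_hz = 7250                 # highest frequency of the sensor output, in Hz
f_sample_hz = 2 * f_max_hz      # minimum sampling rate

print(f_sample_hz)              # 14500 samples per second (14.5 kHz)
```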

 

14 500 samples per second, and each sample is quantized into 2048 levels. Since 2048 = 2^11, you need 11 bits per sample to represent 2048 levels.

 

14500 * 11 = 159500 bits per second = 159.5 kbps, or approximately 1.6 x 10^2 kbps
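The whole bitrate calculation can be checked with a short Python snippet (names are illustrative, not from the original problem):

```python
import math

levels = 2048
bits_per_sample = math.ceil(math.log2(levels))   # 2048 = 2**11 -> 11 bits
f_sample_hz = 14500                              # Nyquist rate from above

bitrate_bps = f_sample_hz * bits_per_sample      # 159500 bits per second
bitrate_kbps = bitrate_bps / 1000                # 159.5 kbps

# Scientific notation to 1 decimal place, as the question asks:
print(f"{bitrate_kbps:.1e} kbps")                # 1.6e+02 kbps
```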

 Feb 12, 2022
