An analogue signal has a bandwidth extending from very low frequencies up to 4 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples/sec) required to convert this signal into a digital representation?

 

If each sample is now quantised into 4096 levels, what is the resulting transmitted bit rate? Express your answer in scientific notation.

Hint: you first need to calculate the number of bits per sample that will produce 4096 quantisation levels.

 

Thanks for any help

 Mar 12, 2019
 #1

I believe, if I remember correctly, the sampling rate should be TWICE the highest frequency (the Nyquist rate)  ...    so an 8 kHz sampling rate.

I'll have to review the second part of your question...

   but maybe each sample will need to satisfy  2^x = 4096,  where x is the number of bits, so each sample is 12 bits long

      12 bits/sample  x  8000 samples/sec  =  96000 bits/sec  =  9.6 x 10^4 bits/sec
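
If it helps, here is a quick sketch in Python to check the arithmetic. The 4 kHz bandwidth and the 4096 levels come straight from the question; everything else is just the standard Nyquist and quantisation formulas:

    import math

    f_max = 4000           # highest signal frequency in Hz (from the question)
    levels = 4096          # quantisation levels (from the question)

    # Nyquist: sample at least twice the highest frequency
    f_s = 2 * f_max        # 8000 samples/sec

    # bits per sample: smallest x with 2**x >= levels
    bits_per_sample = math.ceil(math.log2(levels))   # 12 bits

    bit_rate = f_s * bits_per_sample                 # 96000 bits/sec

    print(f"sampling rate   = {f_s} samples/sec")
    print(f"bits per sample = {bits_per_sample}")
    print(f"bit rate        = {bit_rate:.1e} bits/sec")   # prints 9.6e+04

Running it prints a sampling rate of 8000 samples/sec, 12 bits per sample, and a bit rate of 9.6e+04 bits/sec, matching the working above.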

 Mar 12, 2019
