An analogue signal has a bandwidth extending from very low frequencies up to 12 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?

If each sample is now quantised into 512 levels, what will be the resulting transmitted bit-rate, expressing your answer in scientific notation to 2 decimal places?

 Feb 4, 2020
 #1

By the sampling theorem, the minimum sampling rate is twice the highest frequency present:  2 × 12 kHz = 24 000 samples/sec.

 

With 512 quantisation levels, each sample needs log₂(512) = 9 bits (since 2⁹ = 512), so the bit rate is 24 000 × 9 = 216 000 bit/s = 2.16 × 10⁵ bit/s.   (Note: multiplying 24 000 by 512 directly would be wrong, because 512 is the number of levels, not the number of bits per sample.)
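A short Python sketch of the calculation, for anyone who wants to check the arithmetic:

```python
import math

f_max = 12_000   # highest signal frequency, in Hz
levels = 512     # number of quantisation levels

# Nyquist criterion: sample at least twice the highest frequency
sample_rate = 2 * f_max                   # 24 000 samples/s

# log2(levels) bits are needed to label each quantisation level
bits_per_sample = int(math.log2(levels))  # 9 bits

bit_rate = sample_rate * bits_per_sample  # 216 000 bit/s
print(f"{bit_rate:.2e} bit/s")            # prints "2.16e+05 bit/s"
```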

