
An analogue signal has a bandwidth which extends from very low frequencies up to 12 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?

If each sample is now quantised into 512 levels, what will be the resulting transmitted bit-rate, expressing your answer in scientific notation to 2 decimal places?

Hint: you will need to work out the number of bits per sample that gives 512 quantisation levels.

 Feb 8, 2020
 #1

I answered this question the other day, but my answer may have been incorrect.

 

I believe the sampling frequency to be 2 × 12 kHz = 24 kHz.

 

To represent 512 levels requires 9 bits per sample: 2^9 = 512.

 

So each of the 24 000 samples per second requires 9 bits: 24 000 × 9 = 2.16 × 10^5 bits/s.
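
If it helps to check the arithmetic, here is a minimal sketch (my own addition, not part of the original question; the variable names are just illustrative):

```python
import math

bandwidth_hz = 12_000   # highest frequency in the analogue signal (12 kHz)
levels = 512            # number of quantisation levels

# Sampling theorem: sample at least twice the highest frequency
sample_rate = 2 * bandwidth_hz                    # 24 000 samples/s

# Bits needed so that 2^bits covers all quantisation levels
bits_per_sample = math.ceil(math.log2(levels))    # 2^9 = 512 -> 9 bits

# Transmitted bit rate = samples per second * bits per sample
bit_rate = sample_rate * bits_per_sample          # 216 000 bit/s

print(f"Sampling rate:   {sample_rate} samples/s")
print(f"Bits per sample: {bits_per_sample}")
print(f"Bit rate:        {bit_rate:.2e} bit/s")   # prints 2.16e+05
```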

 Feb 8, 2020
