An analogue signal has a bandwidth that extends from very low frequencies up to 12 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?

If each sample is now quantised into 512 levels, what will be the resulting transmitted bit rate? Express your answer in scientific notation to 2 decimal places.

 

Hint: you will need to calculate the number of bits per sample needed to produce 512 quantization levels.
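
For what it's worth, here is a minimal sketch of the arithmetic the hint points at (Python; it assumes an ideal Nyquist-rate sampler with no guard band):

import math

bandwidth_hz = 12_000          # signal bandwidth: 12 kHz
levels = 512                   # number of quantization levels

# Sampling theorem: sample at least twice the highest frequency.
min_sample_rate = 2 * bandwidth_hz           # 24,000 samples/s

# 512 levels require log2(512) = 9 bits per sample.
bits_per_sample = math.log2(levels)          # 9.0

# Bit rate = samples per second x bits per sample.
bit_rate = min_sample_rate * bits_per_sample
print(f"{min_sample_rate} samples/s, {bit_rate:.2e} bit/s")
# prints: 24000 samples/s, 2.16e+05 bit/s

So the minimum sampling rate works out to 24,000 samples per second, and the transmitted bit rate to 2.16 x 10^5 bit/s.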

 Feb 8, 2020
 #1

https://web2.0calc.com/questions/can-anyone-please-solve-this-mathematics-question

 Feb 8, 2020
