An analogue signal has a bandwidth that extends from very low frequencies up to 12 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?

If each sample is now quantised into 512 levels, what will be the resulting transmitted bit-rate, expressing your answer in scientific notation to 2 decimal places?

Hint: you will first need to calculate the number of bits per sample that produces 512 quantisation levels.

Rob93old Feb 8, 2020

#1

https://web2.0calc.com/questions/can-anyone-please-solve-this-mathematics-question

ElectricPavlov Feb 8, 2020
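The arithmetic behind the question can be sketched as a short Python calculation (variable names are illustrative, not from the thread): the Nyquist criterion gives the minimum sampling rate as twice the highest signal frequency, and the number of bits per sample follows from 2^n = 512.

```python
import math

# Nyquist criterion: sample at least twice the highest frequency present.
bandwidth_hz = 12_000                 # signal extends up to 12 kHz
min_sample_rate = 2 * bandwidth_hz    # minimum samples per second

# Bits per sample for 512 quantisation levels: 2**n = 512, so n = log2(512).
levels = 512
bits_per_sample = int(math.log2(levels))

# Transmitted bit-rate = samples/second * bits/sample.
bit_rate = min_sample_rate * bits_per_sample

print(min_sample_rate)      # 24000 samples per second
print(bits_per_sample)      # 9 bits
print(f"{bit_rate:.2e}")    # 2.16e+05 bits per second
```

So the minimum sampling rate is 24 kHz, each sample needs 9 bits, and the resulting bit-rate is 2.16 x 10^5 bit/s.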