An analogue signal has a bandwidth which extends from very low frequencies up to 12 kHz. Using the sampling theorem what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?
If each sample is now quantised into 512 levels, what will be the resulting transmitted bit-rate, expressing your answer in scientific notation to 2 decimal places?
Not POSITIVE, but pretty sure the minimum sampling rate would be 2 × 12 kHz = 24 000 samples/sec (the Nyquist rate: sample at at least twice the highest frequency).
For 512 levels you first convert levels to bits: log₂(512) = 9 bits per sample (since 2⁹ = 512). The bit rate is then 24 000 × 9 = 216 000 bit/s = 2.16 × 10⁵ bit/s. (Multiplying 24 000 × 512 = 12.288 Mbit/s is a mistake — you multiply by bits per sample, not by the number of levels.)
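The arithmetic can be sanity-checked with a few lines of Python (a minimal sketch; the variable names are mine, not from the question):

```python
import math

f_max = 12_000                       # highest signal frequency, Hz
levels = 512                         # quantisation levels

fs = 2 * f_max                       # Nyquist minimum sampling rate, samples/s
bits_per_sample = math.log2(levels)  # 9 bits, because 2**9 == 512
bit_rate = fs * bits_per_sample      # transmitted bit rate, bit/s

print(fs)                            # 24000
print(bit_rate)                      # 216000.0, i.e. 2.16e5 bit/s
```

Note that `log2(levels)` only gives a whole number of bits when the level count is a power of two; otherwise you would round up with `math.ceil`.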