An analogue sensor has a bandwidth that extends from very low frequencies up to a maximum of 14.5 kHz. Using the Sampling Theorem, what is the minimum sampling rate (in samples per second) required to convert the sensor signal into a digital representation?
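As a quick check on your working, here is a minimal Python sketch of the Nyquist calculation; the 14.5 kHz figure comes from the question, and the factor of 2 from the Sampling Theorem:

```python
# Sampling Theorem: sample at no less than twice the highest frequency present.
f_max_hz = 14_500        # maximum sensor frequency, from the question
f_s_min = 2 * f_max_hz   # minimum sampling rate (Nyquist rate)
print(f"Minimum sampling rate: {f_s_min} samples/s")  # 29000 samples/s
```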
If each sample is now quantised into 2048 levels, what will be the resulting transmitted bitrate in kbps?
Give the bitrate in scientific notation to 1 decimal place.
Hint: first determine the number of bits per sample that produces 2048 quantisation levels.
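For reference, a minimal Python sketch of the full calculation the hint points to, assuming the minimum sampling rate found in the first part:

```python
import math

levels = 2048
bits_per_sample = int(math.log2(levels))  # 2**11 == 2048, so 11 bits per sample
f_s_min = 2 * 14_500                      # minimum sampling rate in samples/s

bitrate_bps = f_s_min * bits_per_sample   # 29000 * 11 = 319000 bits/s
bitrate_kbps = bitrate_bps / 1000         # 319.0 kbps
print(f"{bitrate_kbps:.1e} kbps")         # 3.2e+02 kbps, i.e. 3.2 x 10^2 kbps
```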