An analogue sensor has a bandwidth that extends from very low frequencies up to 8.75 kHz. Using the Sampling Theorem (Section 3.3.1), what is the minimum sampling rate, in samples per second, required to convert the sensor output signal into a digital representation without aliasing?
If each sample is now quantised into 512 levels, what will be the resulting sensor output bitrate in kbps?
Give your answer in scientific notation to one decimal place.
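The following is a minimal Python sketch (not part of the original question) that can be used to check the arithmetic; the variable names are illustrative and the values are taken directly from the problem statement.

```python
from math import log2

# Values given in the question
bandwidth_hz = 8.75e3        # highest frequency component of the sensor output
quantisation_levels = 512    # number of quantisation levels per sample

# Sampling Theorem: sample at no less than twice the highest signal frequency
min_sampling_rate_hz = 2 * bandwidth_hz          # samples per second

# Each sample requires log2(levels) bits
bits_per_sample = log2(quantisation_levels)      # 512 levels -> 9 bits

# Bitrate = sampling rate x bits per sample
bitrate_bps = min_sampling_rate_hz * bits_per_sample
bitrate_kbps = bitrate_bps / 1e3

print(f"Minimum sampling rate: {min_sampling_rate_hz:.0f} samples/s")
print(f"Bitrate: {bitrate_kbps:.1e} kbps")  # scientific notation, one decimal place
```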