An analogue signal has a bandwidth that extends from very low frequencies up to 12 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?
I got 2.16 × 10^5. Is this correct?
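
For reference, here is a minimal check using the Nyquist criterion, assuming the 12 kHz figure is the highest frequency component in the signal:

$$
f_s \ge 2 f_{\max} = 2 \times 12\,\text{kHz} = 24\,\text{kHz} = 2.4 \times 10^{4}\ \text{samples per second}
$$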