An analogue signal has a bandwidth extending from very low frequencies up to 12 kHz. Using the sampling theorem, what is the minimum sampling rate (number of samples per second) required to convert this signal into a digital representation?

I got 2.16x10. Is this correct?

 
 Mar 3, 2020
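The sampling theorem (Nyquist criterion) says the sampling rate must be at least twice the highest frequency in the signal. A quick check in Python, assuming 12 kHz is the highest frequency component:

```python
# Nyquist criterion: f_s >= 2 * f_max
f_max_hz = 12_000                 # highest frequency in the analogue signal (12 kHz)
min_sampling_rate = 2 * f_max_hz  # minimum samples per second
print(min_sampling_rate)          # 24000, i.e. 2.4 x 10^4 samples per second
```

So the minimum rate works out to 24,000 samples per second, which does not match 2.16x10.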
