The average time it takes a computer to execute one instruction is measured in picoseconds. There are 3.6 x 10^15 picoseconds per hour. What fraction of a second is a picosecond?
3.6 x 10^15 picoseconds per hour / 3,600 seconds per hour = 1 x 10^12 picoseconds per second (one trillion)
So a picosecond is one-trillionth of a second, i.e. 1/10^12 of a second.
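As a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names are just illustrative):

# Convert picoseconds per hour to picoseconds per second,
# then express one picosecond as a fraction of a second.
picoseconds_per_hour = 3.6e15
seconds_per_hour = 3600

picoseconds_per_second = picoseconds_per_hour / seconds_per_hour
print(picoseconds_per_second)   # 1e12 -> one trillion picoseconds per second

fraction_of_second = 1 / picoseconds_per_second
print(fraction_of_second)       # 1e-12 -> a picosecond is one-trillionth of a second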