Suppose ten distinct, positive integers have a median of 100. ("Distinct integers" means that no two integers are the same.) What is the smallest the average of those ten integers could be?
Definition: The median is the middle value in a set of numbers when they are arranged in order; it divides the dataset in half, with half of the values falling at or above it and half at or below it. When there is an even number of values, as here with ten, the median is the average of the two middle values.
Here's my submission: 1 2 3 4 99 101 102 103 104 105
Median = (99 + 101)/2 = 100
Average = 624/10 = 62.4
This is as small as it gets: the first four values are the smallest distinct positive integers, the middle pair must sum to 200 with its larger member as small as possible (99 and 101), and the last four are then the smallest distinct integers above 101.
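If you want to double-check the arithmetic, here's a quick sanity check in Python using the standard statistics module (the variable name values is just for this snippet):

```python
# Verify the submission: ten distinct positive integers, median 100, average 62.4
from statistics import mean, median

values = [1, 2, 3, 4, 99, 101, 102, 103, 104, 105]

assert len(values) == len(set(values)) == 10   # ten distinct integers
assert all(v > 0 for v in values)              # all positive

print("median :", median(values))   # 100 (average of the two middle values, 99 and 101)
print("average:", mean(values))     # 62.4
```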