A computer takes 3x^2 + 2 milliseconds to process a certain program. If the program has 4 lines of static code (this will always be required for the code to run) and x variable lines, what is the average amount of time it takes to process each line?

Answer:

The average time taken to process each line can be calculated as follows:
average time = total time / number of lines

For this program:
the total time = 3x^2 + 2 milliseconds
number of lines = 4 + x
(the 4 static lines plus the x variable lines)

Therefore:
average time per line = (3x^2 + 2) / (4 + x) milliseconds
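
A minimal sketch in Python makes the formula concrete; the function name and the sample value x = 6 are illustrative assumptions, not part of the question:

```python
def average_time_per_line(x: int) -> float:
    """Return the average processing time per line, in milliseconds."""
    total_time = 3 * x**2 + 2   # total processing time: 3x^2 + 2 ms
    number_of_lines = 4 + x     # 4 static lines plus x variable lines
    return total_time / number_of_lines

# Example (assumed value): with x = 6 variable lines, the total time is
# 3*36 + 2 = 110 ms spread over 4 + 6 = 10 lines, so the average is 11 ms.
print(average_time_per_line(6))  # 11.0
```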