As computer words get larger and larger, there is a law of diminishing returns: the speed of execution of real application programs does not increase and may, in fact, decrease. Why do you suppose that this is so?

Answer:

Explanation:

One reason among several is software. Programs are typically written and tuned for a particular word size, and software designed around a narrow word does not automatically benefit from a wider one: integers and pointers that grow to fill the larger word inflate data structures without carrying any more useful information, so the extra bits mostly go to waste. Every design has a ceiling of that kind, and once the word width passes what the application actually needs, execution stops getting faster and can even get slower.

On the hardware side, using the same underlying technology, basic arithmetic on larger operands takes longer; for example, the carry in an adder has to propagate across more bits as the word gets wider.
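As a rough illustration (a hypothetical sketch, not part of the original answer), the C function below emulates 128-bit addition using 64-bit words: the high word cannot be computed until the carry out of the low word is known, which mirrors how a wider adder built from the same technology has a longer carry-propagation path.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative only: a 128-bit add built from 64-bit word adds.
   Each extra word adds another dependent step in the carry chain. */
typedef struct { uint64_t lo, hi; } u128;

static u128 add128(u128 a, u128 b) {
    u128 r;
    r.lo = a.lo + b.lo;              /* low word first                 */
    uint64_t carry = (r.lo < a.lo);  /* unsigned wraparound => carry   */
    r.hi = a.hi + b.hi + carry;      /* high word must wait for carry  */
    return r;
}

int main(void) {
    u128 a = { .lo = UINT64_MAX, .hi = 0 };
    u128 b = { .lo = 1,          .hi = 0 };
    u128 s = add128(a, b);
    printf("hi=%llu lo=%llu\n",
           (unsigned long long)s.hi, (unsigned long long)s.lo);
    return 0;
}
```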

There is also a capacity effect. When a system is running well below its limits it has slack, and the cost of wider words is hidden; but wider words enlarge a program's working set, and as that working set approaches the cache and memory-bandwidth limits, the machine spends more of its time moving data around and everything slows down.
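To make the bandwidth point concrete, here is a minimal, illustrative C benchmark (a sketch; the array size N is an assumption, and the timings will vary by machine and compiler). Both loops perform the same number of additions, but the 64-bit array occupies twice the memory, so it streams twice as many bytes through the caches and the memory bus.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)  /* 16M elements; size chosen only for illustration */

int main(void) {
    uint32_t *a32 = malloc(N * sizeof *a32);  /*  64 MB */
    uint64_t *a64 = malloc(N * sizeof *a64);  /* 128 MB */
    if (!a32 || !a64) return 1;
    for (size_t i = 0; i < N; i++) { a32[i] = 1; a64[i] = 1; }

    clock_t t0 = clock();
    uint64_t s32 = 0;
    for (size_t i = 0; i < N; i++) s32 += a32[i];  /* same add count... */
    clock_t t1 = clock();
    uint64_t s64 = 0;
    for (size_t i = 0; i < N; i++) s64 += a64[i];  /* ...twice the bytes */
    clock_t t2 = clock();

    printf("32-bit: sum=%llu, %.3fs\n", (unsigned long long)s32,
           (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("64-bit: sum=%llu, %.3fs\n", (unsigned long long)s64,
           (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(a32); free(a64);
    return 0;
}
```

On a typical machine the 64-bit loop is no faster and, once the arrays exceed the caches, often measurably slower, even though the arithmetic itself is identical.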
