Chapter 8: Problem 35
Find a formula for \(g\) by scaling the output of \(f\). Let \(f(n)\) give the average time in seconds required for a computer to process \(n\) megabytes (MB) of data, and let \(g(n)\) give that same time in microseconds. Use the fact that \(1\ \mathrm{s}\) equals \(1{,}000{,}000\ \mu\mathrm{s}\).
Short Answer
\(g(n) = 1{,}000{,}000\, f(n)\), where \(f(n)\) is measured in seconds and \(g(n)\) in microseconds.
Step by step solution
Step 1: Identify the units. The function \(f(n)\) reports the processing time in seconds, while \(g(n)\) reports the same processing time in microseconds.
Step 2: Convert seconds to microseconds. Since \(1\ \mathrm{s} = 1{,}000{,}000\ \mu\mathrm{s}\), a time of \(f(n)\) seconds equals \(1{,}000{,}000 \cdot f(n)\) microseconds.
Step 3: Write the formula. Scaling the output of \(f\) by \(1{,}000{,}000\) gives \(g(n) = 1{,}000{,}000\, f(n)\).
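The same formula follows from a unit-cancellation check; here is a brief worked sketch using only the conversion factor given in the problem:
\[
g(n)\ \mu\mathrm{s} \;=\; f(n)\ \mathrm{s} \times \frac{1{,}000{,}000\ \mu\mathrm{s}}{1\ \mathrm{s}} \quad\Longrightarrow\quad g(n) = 1{,}000{,}000\, f(n).
\]
For example, if processing \(n\) MB were to take \(f(n) = 0.003\) seconds (a hypothetical value for illustration), then \(g(n) = 1{,}000{,}000 \times 0.003 = 3{,}000\) microseconds.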
Key Concepts
Scaling a function's output: converting the units of a function's output multiplies every output value by a fixed constant. Here the conversion factor of \(1{,}000{,}000\ \mu\mathrm{s}\) per second scales the output of \(f\), which corresponds to a vertical stretch of the graph of \(f\) by a factor of \(1{,}000{,}000\).