Chapter 4: Problem 9
Small differences in large numbers can lead to nonsense. Using the results from Problem 8, show that the propagated error is larger than the difference itself for \(f(x, y)=x-y\), with \(x=20 \pm 2\) and \(y=19 \pm 2\).
Short Answer
The propagated error (\(\approx 2.83\)) is larger than the difference itself (\(1\)).
Step by step solution
Step 1: Understand the Problem
We need to find the propagated error for the function \(f(x, y) = x - y\) and compare it with the value of the difference itself. The given values are \(x = 20 \pm 2\) and \(y = 19 \pm 2\), so the uncertainties are \(\sigma_x = \sigma_y = 2\).
Step 2: Calculate the Difference
First, calculate the difference \(f(x, y) = x - y\). For the given values, this is \(f(20, 19) = 20 - 19 = 1\).
Step 3: Determine the Partial Derivatives
Find the partial derivatives of \(f(x, y)\) with respect to \(x\) and \(y\). \(\frac{\partial f}{\partial x} = 1\) and \(\frac{\partial f}{\partial y} = -1\).
Step 4: Calculate the Propagated Error
Use the formula for propagated error from Problem 8: \[ \sigma_f = \sqrt{ \left( \frac{\partial f}{\partial x} \sigma_x \right)^2 + \left( \frac{\partial f}{\partial y} \sigma_y \right)^2 } \] Substitute the given values: \[ \sigma_f = \sqrt{(1 \cdot 2)^2 + (-1 \cdot 2)^2} = \sqrt{4 + 4} = \sqrt{8} = 2\sqrt{2} \approx 2.83 \]
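A quick numeric check of this step (a minimal Python sketch; the variable names are illustrative and not part of the original solution):

```python
import math

# Given values and their uncertainties
x, sigma_x = 20.0, 2.0
y, sigma_y = 19.0, 2.0

# f(x, y) = x - y, with partial derivatives df/dx = 1 and df/dy = -1
difference = x - y

# Propagated error: sigma_f = sqrt((df/dx * sigma_x)^2 + (df/dy * sigma_y)^2)
sigma_f = math.sqrt((1 * sigma_x) ** 2 + (-1 * sigma_y) ** 2)

print(f"difference = {difference}")   # 1.0
print(f"sigma_f    = {sigma_f:.2f}")  # 2.83
```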
Step 5: Compare the Propagated Error with the Difference
The difference is \(f(20, 19) = 1\), while the propagated error is approximately \(2.83\). The uncertainty in the result is therefore larger than the result itself, even though \(x\) and \(y\) are each known to about \(10\%\).
Step 6: Conclusion
Therefore, the propagated error is larger than the difference itself for \(f(x, y) = x - y\) with the given values and uncertainties: taking a small difference of two large, uncertain numbers can produce a result that is dominated by its own error.
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Partial Derivatives
In scientific computations, especially when dealing with functions of multiple variables, understanding partial derivatives is crucial. Partial derivatives measure how a function changes as each variable changes, independently of the others. For example, consider the function given in the exercise, \(f(x, y) = x - y\).
To understand this concept better:
- The partial derivative of \(f\) with respect to \(x\), written \(\frac{\partial f}{\partial x}\), tells us how the function \(f\) changes when only \(x\) is varied.
- The partial derivative of \(f\) with respect to \(y\), written \(\frac{\partial f}{\partial y}\), tells us how the function \(f\) changes when only \(y\) is varied.
For \(f(x, y) = x - y\) these are \(\frac{\partial f}{\partial x} = 1\) and \(\frac{\partial f}{\partial y} = -1\). These values tell us how the overall function value changes with small changes in \(x\) and \(y\) (a quick numerical check is sketched below).
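The partial derivatives can also be estimated numerically with central finite differences (a Python sketch; the step size `h` and function names are illustrative choices, not part of the original solution):

```python
def f(x, y):
    """The function from the exercise, f(x, y) = x - y."""
    return x - y

def partial_x(f, x, y, h=1e-6):
    """Central-difference estimate of df/dx at (x, y)."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-6):
    """Central-difference estimate of df/dy at (x, y)."""
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

print(partial_x(f, 20, 19))  # ~  1.0
print(partial_y(f, 20, 19))  # ~ -1.0
```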
Propagated Error
When performing calculations with measurements that have uncertainties, the inaccuracies can propagate through the computations. Propagated error gives us an understanding of how the uncertainties in the input variables affect the result of the function.
To quantify this, we use the following formula:
\[ \sigma_f = \sqrt{ \left( \frac{\partial f}{\partial x} \sigma_x \right)^2 + \left( \frac{\partial f}{\partial y} \sigma_y \right)^2 } \]
Here,
- \(\sigma_x\) and \(\sigma_y\) are the uncertainties in the variables \(x\) and \(y\).
- The partial derivatives \(\frac{\partial f}{\partial x}\) and \(\frac{\partial f}{\partial y}\) show how sensitive the function is to changes in \(x\) and \(y\).
With \(\sigma_x = 2\) and \(\sigma_y = 2\), we calculated \( \sigma_f = \sqrt{ (1 \cdot 2)^2 + (-1 \cdot 2)^2 } = \sqrt{8} \approx 2.83 \). This example shows that the uncertainty in the result (\(\sigma_f\)) is larger than the calculated difference, highlighting the significance of propagated error in scientific computations (a reusable helper is sketched below).
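The same first-order formula applies to any differentiable \(f(x, y)\). The sketch below is a minimal plain-Python illustration that estimates the partial derivatives numerically; the function and parameter names are assumptions for illustration, not part of the textbook solution.

```python
import math

def propagate(f, x, y, sigma_x, sigma_y, h=1e-6):
    """First-order propagated uncertainty of f(x, y), with the partial
    derivatives estimated by central finite differences."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return math.sqrt((dfdx * sigma_x) ** 2 + (dfdy * sigma_y) ** 2)

# The exercise: f(x, y) = x - y with x = 20 +/- 2 and y = 19 +/- 2
sigma_f = propagate(lambda x, y: x - y, 20.0, 19.0, 2.0, 2.0)
print(f"sigma_f = {sigma_f:.2f}")  # ~2.83, larger than the difference of 1
```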
Statistical Thermodynamics
Statistical thermodynamics often involves complex functions of multiple variables, where errors and uncertainties can strongly affect the results. Statistical mechanics provides a bridge between the microscopic properties of individual atoms and molecules and the macroscopic properties of materials.
Concepts in statistical thermodynamics include:
- Ensemble averages, where we calculate average properties over a large collection of microstates.
- Partition functions, which summarize the statistical properties of a system in thermodynamic equilibrium.
- Fluctuations and response functions, where uncertainties and their propagation play a crucial role (a toy example follows below).
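As a toy illustration of how error propagation enters such quantities, the sketch below pushes an assumed temperature uncertainty through the ensemble-average energy of a two-level system. The energy spacing, temperature, and uncertainty values are made-up illustrative numbers, not from the text.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def mean_energy(T, eps=1.0e-21):
    """Ensemble-average energy of a two-level system with level spacing eps (J),
    computed from its partition function q = 1 + exp(-eps / (k_B * T))."""
    boltz = math.exp(-eps / (k_B * T))
    q = 1.0 + boltz
    return eps * boltz / q

# Propagate an uncertainty in T through <E>: sigma_E = |d<E>/dT| * sigma_T
T, sigma_T = 300.0, 5.0   # temperature known only to +/- 5 K (assumed values)
h = 1e-3                  # step for the numerical derivative
dE_dT = (mean_energy(T + h) - mean_energy(T - h)) / (2 * h)
sigma_E = abs(dE_dT) * sigma_T

print(f"<E> = {mean_energy(T):.3e} J, sigma = {sigma_E:.1e} J")
```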