An 800-kHz radio signal is detected at a point 8.5 km distant from a transmitter tower. The electric field amplitude of the signal at that point is 0.90 V/m. Assume that the signal power is radiated uniformly in all directions and that radio waves incident upon the ground are completely absorbed. What is the average electromagnetic energy density at that point? (c = 3.0 × 10⁸ m/s, μ₀ = 4π × 10⁻⁷ T·m/A, ε₀ = 8.85 × 10⁻¹² C²/N·m²)

A. 7.2 pJ/m3
B. 10 pJ/m3
C. 3.6 pJ/m3
D. 14 pJ/m3
E. 5.1 pJ/m3

Answer:

To solve this problem we apply the concept of average electromagnetic energy density, which for an electromagnetic wave is given by

[tex]U = \frac{1}{2}\epsilon_0 E^2[/tex]
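For context, the factor of ½ comes from time averaging. In a plane wave the electric and magnetic contributions to the energy density are equal, so the instantaneous total is [tex]u = \epsilon_0 E(t)^2[/tex]; with [tex]E(t) = E_0\cos(\omega t)[/tex] and [tex]\langle \cos^2(\omega t)\rangle = \frac{1}{2}[/tex], the time average is

[tex]U = \epsilon_0 \langle E(t)^2 \rangle = \frac{1}{2}\epsilon_0 E_0^2[/tex]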

where

[tex]\epsilon_0[/tex] = permittivity of free space

E = electric field amplitude

Since the average energy density depends on the square of the electric field amplitude, substituting the given values gives

[tex]U = \frac{1}{2} (8.85\times 10^{-12}\,C^2/N\cdot m^2)(0.90\,V/m)^2[/tex]

[tex]U = 3.6\times 10^{-12}\,J/m^3[/tex]

[tex]U = 3.6\,pJ/m^3[/tex]
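As a quick sanity check, here is a minimal Python sketch that plugs in the given values (variable names are my own):

epsilon_0 = 8.85e-12   # permittivity of free space, C^2/(N*m^2)
E0 = 0.90              # electric field amplitude, V/m

u_avg = 0.5 * epsilon_0 * E0 ** 2    # average EM energy density, J/m^3
print(f"u_avg = {u_avg:.2e} J/m^3")  # prints u_avg = 3.58e-12 J/m^3, i.e. about 3.6 pJ/m^3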

Therefore, the correct answer is C.
