In music, two notes are said to be an octave apart when one note is exactly twice the frequency of the other. Suppose you have a guitar string playing frequency f₀. To raise the frequency one octave, to 2f₀, by what factor should you decrease the length?


Answer:

[tex]\frac{1}{2}[/tex]

Explanation:

Let's see how we can approach the problem:

Using a simple pattern:

One octave up = 2 × the frequency = 1/2 the length

Two octaves up = 4 × the frequency = 1/4 the length

Three octaves up = 8 × the frequency = 1/8 the length

Four octaves up = 16 × the frequency = 1/16 the length

and so on...

As we can see, for each octave the length decreases by a factor of [tex]\frac{1}{2}[/tex]
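
A minimal Python sketch (the frequency value and variable names are illustrative, not part of the question) that reproduces this halving pattern:

```python
# Each octave up doubles the frequency, so the required string length halves.
f0 = 440.0  # starting frequency in Hz (illustrative value)

for octave in range(1, 5):
    frequency = f0 * 2 ** octave   # 2x, 4x, 8x, 16x the original frequency
    factor = (1 / 2) ** octave     # 1/2, 1/4, 1/8, 1/16 of the original length
    print(f"{octave} octave(s) up: f = {frequency} Hz, length factor = {factor}")
```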

To raise the frequency one octave, to 2f₀, the length must be decreased by a factor of [tex]\frac{1}{2}[/tex].

The given parameters:

  • initial frequency = f₀
  • final frequency = 2f₀

Let the initial length of the wave = λ₁

Let the final length of the wave = λ₂

The relationship between frequency and wavelength is given as;

[tex]v = f\lambda\\\\f_1 \lambda _1 = f_2 \lambda _2[/tex]

The decrease in the wave's length needed to double the frequency is calculated as follows:

[tex]\lambda _2 = \frac{\lambda _1 f_1}{f_2} \\\\\lambda _2 = \frac{\lambda _1 f_0}{2f_0} \\\\\lambda_2 = \frac{1}{2} \ \lambda_1[/tex]
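
As a quick numerical check of the same rearrangement (f₀ here is an arbitrary example value, and the initial length is normalized to 1):

```python
f0 = 440.0                     # initial frequency (illustrative value)
lambda_1 = 1.0                 # initial length, normalized to 1

f1, f2 = f0, 2 * f0            # one octave up doubles the frequency
lambda_2 = lambda_1 * f1 / f2  # rearranged from f1 * lambda_1 = f2 * lambda_2

print(lambda_2 / lambda_1)     # prints 0.5 -> the length is halved
```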

Thus, to raise the frequency one octave, to 2f₀, the length must be decreased by a factor of [tex]\frac{1}{2}[/tex].

Learn more here: https://brainly.com/question/4386945