a. Expectation is linear, so that
[tex]E[27X_1+125X_2+512X_3]=27E[X_1]+125E[X_2]+512E[X_3]=98,630[/tex]
For independent random variables, the variance of a linear combination is the weighted sum of the individual variances, with weights equal to the squares of the coefficients of the [tex]X_i[/tex]:
[tex]V[27X_1+125X_2+512X_3]=27^2V[X_1]+125^2V[X_2]+512^2V[X_3]=2,306,838[/tex]
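As a quick sanity check, here is a minimal simulation sketch of part (a). The means and variances below are hypothetical placeholders (the problem's actual parameter values aren't restated here); the point is only that the empirical mean and variance of [tex]27X_1+125X_2+512X_3[/tex] match the two formulas above when the [tex]X_i[/tex] are independent.
[code]
import numpy as np

# Hypothetical parameters for illustration only; the problem's actual
# means and variances are not restated here.
means = np.array([10.0, 20.0, 5.0])
variances = np.array([4.0, 9.0, 1.0])
coeffs = np.array([27.0, 125.0, 512.0])

rng = np.random.default_rng(0)
n = 1_000_000

# Independent normal draws, one column per X_i.
X = rng.normal(loc=means, scale=np.sqrt(variances), size=(n, 3))
Y = X @ coeffs  # Y = 27*X1 + 125*X2 + 512*X3 for each sample

# Theory: linearity of expectation and the independent-variance formula.
expected_mean = coeffs @ means
expected_var = (coeffs ** 2) @ variances

print(expected_mean, Y.mean())  # should agree closely
print(expected_var, Y.var())    # should agree closely
[/code]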
b. If the [tex]X_i[/tex] were dependent on one another, we would have the same expectation, but the variance of the linear combination would become a double sum of the covariances, weighted by the coefficients:
[tex]\displaystyle V\!\left[\sum_{i=1}^3\alpha_iX_i\right]=\sum_{i=1}^3\sum_{j=1}^3\alpha_i\alpha_j\mathrm{Cov}[X_i,X_j]=\sum_{i=1}^3{\alpha_i}^2V[X_i]+2\sum_{i<j}\alpha_i\alpha_j\mathrm{Cov}[X_i,X_j][/tex]
where
[tex]\mathrm{Cov}[X_i,X_j]=E[(X_i-E[X_i])(X_j-E[X_j])]=E[X_iX_j]-E[X_i]E[X_j][/tex]
When we assumed independence, we were granted [tex]E[X_iX_j]=E[X_i]E[X_j][/tex], so [tex]\mathrm{Cov}[X_i,X_j]=0[/tex] for [tex]i\neq j[/tex] and the cross terms vanish, leaving only [tex]\sum_i{\alpha_i}^2V[X_i][/tex]; this need not hold if the variables are dependent.
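To see the cross terms in action, here is a similar sketch with correlated normals (again, the covariance matrix is a hypothetical example). The empirical variance of the combination matches the full formula [tex]\sum_i\sum_j\alpha_i\alpha_j\mathrm{Cov}[X_i,X_j][/tex], not the independence-only term [tex]\sum_i{\alpha_i}^2V[X_i][/tex].
[code]
import numpy as np

# Hypothetical means and covariance matrix for illustration only.
means = np.array([10.0, 20.0, 5.0])
cov = np.array([[4.0, 1.5, 0.5],
                [1.5, 9.0, -1.0],
                [0.5, -1.0, 1.0]])
coeffs = np.array([27.0, 125.0, 512.0])

rng = np.random.default_rng(1)
n = 1_000_000

# Correlated samples; each row is one draw of (X1, X2, X3).
X = rng.multivariate_normal(means, cov, size=n)
Y = X @ coeffs

# Full formula: sum_i sum_j a_i a_j Cov[X_i, X_j], i.e. the quadratic
# form a' C a, which contains the 2*sum_{i<j} cross terms.
full_var = coeffs @ cov @ coeffs
indep_only = (coeffs ** 2) @ np.diag(cov)  # what part (a)'s formula alone gives

print(Y.var(), full_var)  # these agree closely
print(indep_only)         # misses the covariance cross terms
[/code]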