A jewelry store offers Chuck a job paying 40 thousand dollars per year plus a 2% commission on every sale he makes. The equation below models the money y (in dollars) that Chuck makes if he sells x dollars of jewelry.

y = 40000 + 0.02x

Alternatively, Chuck can take a different job that pays a flat 60 thousand dollars per year. How much jewelry does Chuck need to sell so that the first option is more profitable? Construct a viable argument to support your solution.
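
One viable argument, sketched as a worked inequality (assuming the commission model above holds for all sales): the first job is more profitable exactly when its payout exceeds the flat salary.

\[
40000 + 0.02x > 60000
\;\Longrightarrow\;
0.02x > 20000
\;\Longrightarrow\;
x > 1{,}000{,}000
\]

So, under this setup, Chuck would need to sell more than one million dollars of jewelry per year for the commission job to be the more profitable option; at exactly one million dollars in sales, the two jobs pay the same.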