Every night between 7pm and midnight, two computing jobs from two different sources are launched at random, each running for one hour. Unfortunately, whenever the two jobs run simultaneously, they cause a failure in some of the company's other nightly jobs, resulting in downtime that costs the company one thousand dollars. Assume that the start times of the two jobs are independent of one another, and that each start time has the property that for any two intervals of equal duration within the 7pm-midnight window, the job is equally likely to start in either one. The CEO, who only has enough time today to hear no more than one word, needs a single number representing the annual (365-day) cost of this problem. What is your succinct response?
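One way to sanity-check an answer is a quick Monte Carlo simulation. The sketch below is not part of the problem; it assumes each start time is uniform over the full five-hour 7pm-midnight window and that a failure occurs on any night the two one-hour runs overlap, i.e. the start times differ by less than one hour. If the intended reading instead requires each job to finish by midnight, adjust `WINDOW_HOURS` to 4. The function name and parameters are illustrative, not given in the problem.

```python
import random

# Model assumptions (not stated verbatim in the problem):
#   - each start time is uniform over the 5-hour 7pm-midnight window
#   - a failure occurs whenever the one-hour runs overlap,
#     i.e. |start1 - start2| < 1 hour
WINDOW_HOURS = 5.0
JOB_LENGTH_HOURS = 1.0
COST_PER_FAILURE = 1_000
DAYS_PER_YEAR = 365


def simulate_annual_cost(trials: int = 1_000_000, seed: int = 0) -> float:
    """Estimate the expected annual cost of overlap failures by Monte Carlo."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        s1 = rng.uniform(0.0, WINDOW_HOURS)
        s2 = rng.uniform(0.0, WINDOW_HOURS)
        if abs(s1 - s2) < JOB_LENGTH_HOURS:
            failures += 1
    p_overlap = failures / trials
    return p_overlap * COST_PER_FAILURE * DAYS_PER_YEAR


if __name__ == "__main__":
    # Under these assumptions the exact overlap probability is
    # 1 - (4/5)^2 = 9/25 = 0.36, so the expected annual cost is
    # 0.36 * 1000 * 365 = 131,400 dollars; the estimate should be close.
    print(f"Estimated annual cost: ${simulate_annual_cost():,.0f}")
```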