I am working on the HackerRank Top Earners problem. The problem states the following:
We define an employee's total earnings to be their monthly salary × months worked,
and the maximum total earnings to be the maximum total earnings for
any employee in the Employee table. Write a query to find the maximum
total earnings for all employees as well as the total number of
employees who have maximum total earnings. Then print these values as 2
space-separated integers.
The Employee table containing employee data for a company is described as follows:
| Column | Type |
|---|---|
| employee_id | Integer |
| name | String |
| months | Integer |
| salary | Integer |
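For reference, here is a minimal DDL sketch of that table so the queries below can be run locally (the column types are assumed from the description above, and the name length is a guess; HackerRank does not publish exact column sizes):

```sql
-- Minimal sketch of the Employee table; types assumed from the
-- problem description, VARCHAR length chosen arbitrarily.
CREATE TABLE employee (
    employee_id INT,
    name        VARCHAR(100),
    months      INT,
    salary      INT
);
```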
The MS SQL Server solution I found that works is:
```sql
SELECT MAX(months * salary), COUNT(salary * months)
FROM employee
WHERE salary * months IN (
    SELECT MAX(salary * months)
    FROM employee
);
```

What I am struggling to understand is how to break this solution down piece by piece. I know what is going on from `SELECT` to `FROM employee`, but after that I feel completely lost as to what this query is doing.
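The `WHERE` clause is easiest to follow if you evaluate the subquery first. The inner `SELECT MAX(salary * months) FROM employee` runs on its own and produces a single value: the highest total earnings of any employee. The outer query then keeps only the rows whose `salary * months` matches that value, so the outer `MAX()` and `COUNT()` only ever see the top earners. Here is a sketch of each piece run in isolation against the table described above:

```sql
-- Piece 1: the subquery alone returns one value, the maximum
-- total earnings across all employees.
SELECT MAX(salary * months)
FROM employee;

-- Piece 2: the WHERE filter alone keeps only the rows tied for
-- that maximum (there may be several such employees).
SELECT employee_id, salary * months AS earnings
FROM employee
WHERE salary * months IN (
    SELECT MAX(salary * months)
    FROM employee
);
```

Since the subquery returns exactly one value, `IN` behaves the same as `=` here.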
`COUNT(salary*months)` might as well be `COUNT(*)`: the `WHERE` clause has already filtered the table down to the top earners, so the count is simply the number of surviving rows. An alternative solution is:

```sql
SELECT TOP (1) months * salary AS maxSalary, COUNT(*) AS count
FROM employee
GROUP BY months, salary
ORDER BY maxSalary DESC;
```
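One caveat with grouping by `months, salary`: two different pairs can yield the same product (e.g. 2 × 6 and 3 × 4 both give 12), and those land in separate groups, undercounting a tie at the top. Grouping by the product itself avoids that. For engines without `TOP`, such as MySQL (which HackerRank also offers for this problem), a sketch of the same idea using `LIMIT`, untested against HackerRank's data:

```sql
-- Portable variant of the TOP (1) approach (MySQL / PostgreSQL syntax).
-- Grouping by the product salary * months merges employees whose
-- different salary/months combinations yield the same total earnings.
SELECT salary * months AS earnings, COUNT(*) AS cnt
FROM employee
GROUP BY salary * months
ORDER BY earnings DESC
LIMIT 1;
```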