There are essentially two types of KPIs: the ones you can control and the ones you can't. The latter essentially suck, and no matter how aspirational they are, they should not be used to measure the performance of engineering teams.
Engineers typically enjoy operating in a tangible, almost binary universe: something either works or it doesn't, a deployment either passes or fails a set of automated tests, an HTTP request either succeeds or it doesn't.
This means that engineers expect to be measured in a very concrete and objective way, and this is only possible if whatever metrics they need to work towards are realistic, achievable and within their control.
A few examples of KPIs that suck:
- Audience or conversion metrics that they cannot directly influence,
- Overall financial / P&L metrics that they can only impact indirectly,
- Anything and everything they cannot directly change, influence, or fix.
KPIs that work for engineers:
- Software-specific KPIs: code quality (peer-reviewed against specific pre-set quality standards), roll-backs caused by a specific piece of code, estimation accuracy (initial estimate versus actual delivery time), time to market of new features within a specific part of the code or component, and tech-debt reduction over time (assuming it's being tracked).
- Business-centric KPIs: business metrics of features the developer has worked on that have a direct impact on key business metrics. For example, an engineer working on a search results page can influence the conversion rate from search to the details page, or all the way to the end of the funnel.
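Some of the software-specific KPIs above boil down to simple arithmetic. A minimal sketch in Python, using hypothetical metric names and sprint numbers purely for illustration:

```python
def estimation_accuracy(estimated_days: float, actual_days: float) -> float:
    """Ratio of estimated to actual delivery time; 1.0 means a perfect estimate."""
    return estimated_days / actual_days

def rollback_rate(rollbacks: int, deployments: int) -> float:
    """Share of deployments that had to be rolled back."""
    return rollbacks / deployments

# Hypothetical data for a single sprint.
accuracy = estimation_accuracy(estimated_days=8, actual_days=10)  # 0.8
rate = rollback_rate(rollbacks=2, deployments=40)                 # 0.05
```

The point is not the specific formulas, which any team can adapt, but that each input is something the engineer directly produces and can directly improve.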
Whatever you do, make sure that engineers understand the WHY before the WHAT, and that any key performance metric put in front of them is realistic, achievable, measurable and, most importantly, something they can directly influence.
Above all else, make sure that engineers have KPIs to work toward, even if they suck.