Engineering Enablement

The Engineering Enablement Substack focuses on the latest research and perspectives regarding developer productivity. It explores strategies for managing technical debt, effective engineering metrics, software quality, and factors affecting developer productivity. Themes include the importance of non-technical factors, team and individual performance metrics, and the impact of workplace culture and processes.

Developer Productivity · Technical Debt Management · Engineering Metrics · Software Quality · Team Performance · Workplace Culture · Code Review Processes · Workplace Tools and Technologies

The hottest Substack posts of Engineering Enablement

And their main takeaways
7 implied HN points 19 Feb 25
  1. Communicate openly with leaders about new productivity metrics to avoid surprises. It's important to have conversations, not just send emails, to build trust.
  2. Be clear about what the metrics cover to reduce fear. Focus on process-related data, and explain how it helps teams improve.
  3. Invest time in change management, as it’s crucial for success. Engage key players early, ask for their input, and keep everyone informed through various channels.
21 implied HN points 12 Feb 25
  1. Software quality has four main types: process quality, code quality, system quality, and product quality. Each type affects the others, so improving one can help improve the rest.
  2. Process quality is crucial because a good development process leads to better code quality. This means having proper testing and code reviews can help avoid defects later on.
  3. Product quality is what customers experience and it includes a product's usability and reliability. Engineers need to team up with product managers to ensure that products meet customer needs.
21 implied HN points 05 Feb 25
  1. Metrics for developers should help improve their work experience, not just measure their output. Goodhart's Law reminds us that once metrics are tied to rewards, they can become misleading.
  2. Developer experience is more about effectiveness than happiness. Measuring how developers feel needs to focus on the frustrations they face, and not just on making them comfortable.
  3. Using benchmarks is important but context is key. Just like medical tests, numbers need interpretation to make sense; comparing different teams requires understanding their unique challenges.
11 implied HN points 29 Jan 25
  1. Using Core 4 metrics helps link developer productivity projects to important business outcomes. This way, everyone can understand the impact of these projects.
  2. Investing in improving developer processes can save a lot of time and money. For example, fixing slow review times can free up hours that can be used for more productive work (a rough back-of-the-envelope sketch follows this list).
  3. Regularly measuring progress helps teams keep improving. It's important to revisit these metrics to find new areas to enhance and continue moving forward.
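A rough back-of-the-envelope version of that reasoning, using entirely hypothetical numbers (the developer count, hours saved, and hourly cost below are assumptions for illustration, not figures from the post):

```python
# Hypothetical figures for illustration only; not taken from the post.
developers = 200
hours_saved_per_dev_per_week = 1.5   # e.g. from faster code review turnaround
loaded_cost_per_hour = 100           # assumed fully loaded hourly cost, in dollars
weeks_per_year = 48

annual_hours_recovered = developers * hours_saved_per_dev_per_week * weeks_per_year
annual_value = annual_hours_recovered * loaded_cost_per_hour

print(f"Hours recovered per year: {annual_hours_recovered:,.0f}")
print(f"Approximate annual value: ${annual_value:,.0f}")
```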
12 implied HN points 19 Jan 25
  1. Use a survey to gather Core 4 metrics easily. It's designed for simplicity, so anyone can set it up.
  2. Calculate your metrics by averaging survey responses for Speed, Quality, and Impact; for Effectiveness, use the share of positive responses overall (see the sketch after this list).
  3. Once you have your results, compare them with industry benchmarks to see how you're doing. This helps you understand your team's performance better.
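A minimal sketch of that calculation, assuming hypothetical survey responses on a 1-5 scale; the field names and scale are placeholders, not the official DX Core 4 survey format:

```python
# Hypothetical responses on a 1-5 agreement scale; field names are illustrative.
responses = [
    {"speed": 4, "quality": 5, "impact": 3, "effective": True},
    {"speed": 3, "quality": 4, "impact": 4, "effective": False},
    {"speed": 5, "quality": 4, "impact": 4, "effective": True},
]

def average(dimension):
    """Average score across respondents for Speed, Quality, or Impact."""
    return sum(r[dimension] for r in responses) / len(responses)

speed, quality, impact = average("speed"), average("quality"), average("impact")

# Effectiveness: share of respondents answering favorably.
effectiveness = sum(r["effective"] for r in responses) / len(responses)

print(f"Speed={speed:.2f}  Quality={quality:.2f}  Impact={impact:.2f}  "
      f"Effectiveness={effectiveness:.0%}")
```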
14 implied HN points 12 Jan 25
  1. Focus on the specific needs of leaders. Show how your ideas can solve their biggest issues to get their support.
  2. Talk about money. Explain how improving developer productivity can directly impact profits and save costs.
  3. Create a sense of competition. Use benchmarks to show how the organization compares to others, making leaders want to improve.
12 implied HN points 23 Dec 24
  1. Companies are using AI tools to help engineers work faster, with data showing that these tools can significantly improve productivity. For example, tasks were completed 40% faster in some studies.
  2. Understanding the differences between platform engineering and developer experience teams is important for improving how developers work. Companies are investing in dedicated developer productivity teams to ensure their developers have what they need.
  3. New frameworks are being introduced to measure developer productivity more effectively. These frameworks help identify inefficiencies and understand how developers feel about their working conditions.
13 implied HN points 17 Dec 24
  1. Smaller companies are quicker at delivering work than larger ones. Tech companies with fewer than 500 developers are particularly fast, completing more tasks per week.
  2. Tech companies spend more time creating new features and have a better experience for developers compared to traditional businesses. This helps them innovate more effectively.
  3. Large traditional companies may work slower, but they often have fewer errors in their work. This makes them safer, even if they don't deliver as quickly as tech firms.
14 implied HN points 10 Dec 24
  1. The DX Core 4 is a new framework that combines existing models like DORA, SPACE, and DevEx to measure developer productivity more effectively. It aims to give clear guidance on what companies should measure.
  2. This framework focuses on four main areas: speed, effectiveness, quality, and impact, each with specific metrics to help organizations understand and improve their developer processes.
  3. The DX Core 4 is intended to be transparent and helpful for developers, promoting conversations around their challenges rather than using metrics against them.
14 implied HN points 05 Nov 24
  1. Platform teams handle a broader range of responsibilities compared to Developer Experience teams. This means they are involved in more of the underlying tech operations.
  2. Local development, source code management, and incident management are key tasks for both types of teams. These areas help developers write and deploy their code more smoothly.
  3. The name of the team can reflect its focus. Some teams prioritize overall developer support while others are more infrastructure-focused, suggesting that their approach can change based on company needs.
15 implied HN points 30 Oct 24
  1. Using AI tools can actually make software delivery worse, as they lead to larger code changes that are riskier. This is surprising because many people think AI would improve coding efficiency.
  2. Software delivery performance indicators are becoming more independent from each other. This year's report shows some unexpected trends, like medium performance groups having fewer failures than high performance groups.
  3. To boost productivity, companies should focus on creating user-friendly internal platforms for developers. It's important for leaders to understand their team's needs and provide clear support to improve overall performance.
8 implied HN points 03 Dec 24
  1. PR throughput is a useful metric for understanding the health of a software system. It can highlight issues that developers face while coding, helping teams identify where improvements can be made (a minimal calculation sketch follows this list).
  2. It's important to use PR throughput as part of a larger set of metrics. This approach helps ensure that you get a balanced view of productivity, developer satisfaction, and overall system efficiency.
  3. When measuring PR throughput, context matters. A rise in this metric can mean different things, like increased workloads or improved processes, so it's essential to look deeper into the reasons behind the changes.
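As a minimal illustration, PR throughput is commonly expressed as merged pull requests per developer per week; the sketch below uses made-up PR records rather than data from any particular Git host:

```python
from collections import Counter
from datetime import date

# Hypothetical merged PRs as (author, merge_date) pairs; illustrative data only.
merged_prs = [
    ("alice", date(2024, 12, 2)),
    ("alice", date(2024, 12, 4)),
    ("bob",   date(2024, 12, 3)),
    ("bob",   date(2024, 12, 9)),
]

def pr_throughput(prs, developers, weeks):
    """Merged PRs per developer per week over the observation window."""
    return len(prs) / (developers * weeks)

# Two developers observed over two weeks.
print(f"Throughput: {pr_throughput(merged_prs, developers=2, weeks=2):.2f} PRs/dev/week")

# Per-author counts can hint at uneven workloads, but as the post notes,
# the numbers need context before drawing any conclusions.
print(Counter(author for author, _ in merged_prs))
```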
9 implied HN points 25 Nov 24
  1. Engineers often have bad days due to issues with their tools and systems. Problems like unreliable tools or slow processes can make it tough to work efficiently.
  2. Having a bad day can lower a developer's productivity and increase their stress. Both senior and junior developers feel these effects, but in different ways; seniors may get frustrated, while juniors often doubt their abilities.
  3. Research confirmed that issues causing bad days also slow down work processes. Measuring things like how long it takes to complete tasks showed that these problems really affect productivity.
6 implied HN points 19 Nov 24
  1. A structured rollout of tools like Copilot can significantly improve user satisfaction and adoption, with increases of up to 20%.
  2. Training and support during the rollout process lead to better tool usage, helping teams realize the full benefits of their tools.
  3. Creating community spaces for users to share experiences and asking for feedback can enhance overall satisfaction and engagement with the tool.
37 implied HN points 05 Jan 24
  1. Software quality encompasses four types: process, code, system, and product quality.
  2. Process quality sets the foundation for overall software quality by having a strong development process.
  3. Code quality is crucial for system quality and product quality, focusing on maintainability and reducing defects.
23 implied HN points 12 Jan 24
  1. The SPACE framework for developer productivity comprises five dimensions: satisfaction and well-being, performance, activity, communication and collaboration, and efficiency and flow.
  2. SPACE is useful for software organization leaders defining productivity, teams seeking comprehensive measurements, and leaders involving teams in productivity improvement.
  3. Implementing SPACE involves understanding various metrics, balancing workflow and perception measurements, and considering the holistic approach to developer productivity.
19 implied HN points 09 Feb 24
  1. Code reviews at Meta were taking too long, so they experimented with NudgeBot to speed up the process.
  2. The team identified a correlation between slow code reviews and dissatisfaction, leading to the implementation of NudgeBot.
  3. By using NudgeBot to nudge reviewers to act on 'stale' diffs, Meta successfully reduced the time taken for code reviews.
14 implied HN points 15 Mar 24
  1. On average, developers report 22% of their time being wasted, resulting in significant potential productivity loss.
  2. Efficiency in engineering organizations varies widely within teams, showing room for improvement and optimization.
  3. The amount of developer time wasted correlates with organization size, measured by employee count and revenue, suggesting that larger organizations may struggle with efficiency more than smaller ones.
31 implied HN points 01 Sep 23
  1. Developer productivity can be conceptualized through three dimensions: Velocity, Quality, and Satisfaction.
  2. Leaders should clarify their goals for measuring productivity by considering stakeholders, level of measurement, and time period.
  3. Transitioning from dimensions to selecting metrics can be done using the Goals, Signals, Metrics approach.
14 implied HN points 01 Mar 24
  1. The DevEx framework focuses on the lived experiences of developers by measuring feedback loops, cognitive load, and flow state to enhance developer productivity.
  2. Teams interested in using metrics to improve developer productivity, such as platform engineering teams, engineering managers, and engineering executives, can benefit from implementing the DevEx framework.
  3. To successfully implement the DevEx framework, organizations should focus on getting feedback from developers, setting targets, driving impact through projects, running experiments, and then measuring progress to improve developer experience and productivity.
28 implied HN points 11 Aug 23
  1. Time pressure in software development is influenced by poor effort estimates, project management issues, and company culture.
  2. Three theories explain the effects of time pressure: Yerkes-Dodson Law, Job Demands-Resources Model, and Dimensional Model of Emotions.
  3. Time pressure affects individuals by lowering confidence, affects process by weakening quality assurance, and affects efficiency and quality, with efficiency increasing only up to a certain point.
23 implied HN points 22 Sep 23
  1. Factors like job enthusiasm, peer support for new ideas, and useful feedback strongly correlate with developer productivity.
  2. Non-technical factors like job satisfaction are crucial for productivity, while technical factors can vary among companies.
  3. Improving job enthusiasm, supporting new ideas, and providing feedback can enhance developer productivity.
21 implied HN points 25 Aug 23
  1. Team norm clarity is a stronger predictor of performance and satisfaction than psychological safety.
  2. Psychological safety and team norm clarity are both important for team performance and job satisfaction.
  3. Focusing on team norm clarity has a more significant impact on performance and satisfaction than psychological safety.
22 implied HN points 28 Jul 23
  1. Reflective goal-setting can increase productivity at work.
  2. Developers set goals to improve time management, avoid deviation from planned work, improve impact on the team, maintain work-life balance, and continuously learn.
  3. Reflective goal-setting helps developers identify concrete goals, increase perceived productivity, and sustain positive behavior changes.
9 implied HN points 16 Feb 24
  1. The Thoughtworks Technology Radar categorizes technologies into four rings: Hold, Assess, Trial, and Adopt based on their readiness and suitability for adoption.
  2. The Radar provides a snapshot of technologies seen in the previous six months and aims to showcase what's happening globally in the tech industry.
  3. The Radar is produced through a process of collecting technology proposals from Thoughtworks employees, voting on their inclusion, and finalizing around 100 blips for publication.
4 implied HN points 08 Mar 24
  1. Telemetry metrics like pull requests per developer and code review time can give a high-level view of how GenAI tools are impacting developer output, but they may not provide a complete picture of tool utilization and benefits (a sketch of one such metric follows this list).
  2. Experience sampling, where developers are surveyed in real-time as they use GenAI tools, can offer valuable insights into specific time savings and tool usage, helping organizations understand the effectiveness of GenAI.
  3. Surveys are useful for measuring developer adoption, satisfaction, and self-reported productivity related to GenAI tools, providing a different perspective to complement telemetry metrics and experience sampling.
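A minimal sketch of one such telemetry metric, code review time, computed as the gap between a review request and its completion; the event records here are hypothetical and not tied to any specific tool's API:

```python
from datetime import datetime
from statistics import median

# Hypothetical review events as (review_requested_at, review_completed_at) pairs.
reviews = [
    (datetime(2024, 3, 1, 9, 0),  datetime(2024, 3, 1, 15, 30)),
    (datetime(2024, 3, 2, 10, 0), datetime(2024, 3, 4, 11, 0)),
    (datetime(2024, 3, 3, 14, 0), datetime(2024, 3, 3, 16, 45)),
]

# Review time in hours for each pull request.
review_hours = [(done - requested).total_seconds() / 3600
                for requested, done in reviews]

# The median is often preferred over the mean, since a few long-idle
# reviews can skew an average badly.
print(f"Median review time: {median(review_hours):.1f} h")
print(f"Mean review time:   {sum(review_hours) / len(review_hours):.1f} h")
```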
4 implied HN points 23 Feb 24
  1. Change description is crucial for code review, including explaining the motivation behind a change and what is being altered.
  2. Smaller code changes are easier to review and have a higher chance of acceptance.
  3. Commit history matters: a concise, self-explanatory message is preferred, and fewer commits increase the likelihood of acceptance.