Sep 30, 2024
At Meta, engineers are our biggest asset, which is why we have an entire org tasked with making them as productive as possible. But how do you know whether your projects for improving developer experience are actually successful? For any other product you would run an A/B test, but that requires metrics, and how do you measure developer productivity? Sarita and Moritz have been working on exactly that with Diff Authoring Time (DAT), which measures how long it takes to submit a change to our codebase. Host Pascal talks to them about how DAT is implemented, the challenges involved, and the possibilities it unlocks.
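To make the idea concrete, here is a minimal sketch in Python of how an authoring-time metric could be derived from timestamped developer events. The event names and schema are hypothetical illustrations, not Meta's actual implementation, which the episode covers in detail (see "Events for calculating DAT" below).

from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical event record: the names and fields here are illustrative only.
@dataclass
class DevEvent:
    diff_id: str
    kind: str  # e.g. "first_edit" or "submit"
    timestamp: datetime

def diff_authoring_time(events: list[DevEvent]) -> dict[str, timedelta]:
    """Per diff: time from the earliest authoring event to submission."""
    starts: dict[str, datetime] = {}
    submits: dict[str, datetime] = {}
    for e in events:
        if e.kind == "first_edit":
            # Keep the earliest authoring timestamp seen for this diff.
            starts[e.diff_id] = min(e.timestamp, starts.get(e.diff_id, e.timestamp))
        elif e.kind == "submit":
            submits[e.diff_id] = e.timestamp
    # Only diffs with both a start and a submit event get a DAT value.
    return {d: submits[d] - starts[d] for d in starts.keys() & submits.keys()}

events = [
    DevEvent("D1", "first_edit", datetime(2024, 9, 30, 9, 0)),
    DevEvent("D1", "submit", datetime(2024, 9, 30, 11, 30)),
]
print(diff_authoring_time(events))  # {'D1': datetime.timedelta(seconds=9000)}

A real implementation has to deal with far messier event streams; the edge cases that make this hard in practice are a topic of the episode (13:15).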
Got feedback? Send it to us on Threads (https://threads.net/@metatechpod), Twitter (https://twitter.com/metatechpod), or Instagram (https://instagram.com/metatechpod), and don’t forget to follow our host @passy (https://twitter.com/passy, https://mastodon.social/@passy, and https://threads.net/@passy_). Fancy working with us? Check out https://www.metacareers.com/.
You can follow our guest Moritz on X (https://x.com/Inventitech) or check out his website at inventitech.com.
Links
Meta Connect 2024: https://www.meta.com/en-gb/connect/
Timestamps
Episode intro 0:05
Sarita Intro 2:33
Moritz Intro 3:44
DevInfra as an Engineer 4:25
DevInfra as a Data Scientist 5:12
Why DevEx Metrics? 6:04
Average Diff Authoring Time at Meta 9:55
Events for calculating DAT 10:55
Edge cases 13:15
DAT for Performance Evaluation? 20:29
Analyses on DAT data 22:29
Onboarding to DAT 23:23
Stat-sig data 25:06
Validating the metric 26:34
Versioning metrics 28:09
Detecting and handling biases 29:19
Diff coverage 30:30
Do we need DevEx metrics in an AI software engineering world? 31:23
Measuring the impact of AI tools 32:23
What's next for DAT? 33:40
Outtakes 36:22