Problem: Sometimes we merge PRs that have unintended downstream consequences. For example, a change to an upstream join can unexpectedly shift the value of a metric or prevent records from being added to a table. Currently we have no way to catch these problems until someone finds them through manual data inspection.
This issue is about researching and prototyping a possible solution, probably using either Great Expectations or dbt-expectations. With a tool like these we should be able to define expectations against a static dataset such as the Tuva synthetic data. For example:
- The readmission rate should be 10.7%
- core.condition should have 1537 total records
I made both of these statistics up, but you get the idea. This would be built into our pipeline so the tests run when we go to merge a PR, speeding up our dev cycle while lowering the probability of errors.
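If we go the dbt-expectations route, expectations like the two above could be expressed as schema tests, roughly like this sketch (model names, column names, and the expected values are all illustrative, not actual Tuva project names):

```yaml
# models/schema.yml -- hypothetical test config run against the
# static Tuva synthetic dataset as part of CI on each PR
version: 2

models:
  - name: condition
    tests:
      # fail the build if the synthetic dataset's row count drifts
      - dbt_expectations.expect_table_row_count_to_equal:
          value: 1537

  - name: readmission_summary
    columns:
      - name: readmission_rate
        tests:
          # pin the metric with a small tolerance around 10.7%
          - dbt_expectations.expect_column_values_to_be_between:
              min_value: 0.106
              max_value: 0.108
```

Because the input data is static, any test failure points directly at the PR under review rather than at data drift.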