Fairness EVAluation of Voice Technologies
Fair EVA is an open-source project gathering resources and building tools
that help researchers, developers, technology activists, and voice technology users
evaluate and audit bias and discrimination in voice technologies.
Voice technologies have become part of modern life: they are built into smartphones, drive smart speakers, automate call centers, inform forensic investigations, and enable hands-free interaction with products and services.
Big data and AI are key enablers of voice technologies.
We believe that voice technologies should work reliably for all users, and are concerned about the potential for bias and discrimination presented by the unchecked use of data and AI in their development. This is why we have launched the Fair EVA project.
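To make the idea of auditing concrete: a basic fairness check on, say, a speech recognizer can compare error rates across speaker groups. The sketch below illustrates this with a minimal word-error-rate (WER) helper and hypothetical transcripts; the `wer` function, the group labels, and the sample data are our own illustration, not part of Fair EVA's tooling.

```python
# Illustrative sketch: auditing an ASR system for group-level
# performance gaps. The transcripts and group labels are hypothetical;
# a real audit would use a demographically annotated test set and the
# system's actual outputs.

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words, computed with a single rolling row.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(d[j] + 1,      # deletion
                       d[j - 1] + 1,  # insertion
                       prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return d[-1] / len(ref)

# Hypothetical audit data: (speaker group, reference, ASR hypothesis).
samples = [
    ("group_a", "turn on the kitchen lights", "turn on the kitchen lights"),
    ("group_a", "play some jazz music", "play some jazz music"),
    ("group_b", "turn on the kitchen lights", "turn on a kitchen light"),
    ("group_b", "play some jazz music", "play sum jazz"),
]

by_group = {}
for group, ref, hyp in samples:
    by_group.setdefault(group, []).append(wer(ref, hyp))

# A large per-group WER gap is a signal of potential bias.
for group, scores in sorted(by_group.items()):
    print(group, round(sum(scores) / len(scores), 3))
```

Real evaluations would of course use far larger test sets and statistical tests on the gap, but the group-wise comparison shown here is the core of the audit.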
We welcome new voices who want to support our mission.