Saving Face: Investigating the Ethical Concerns of Facial Recognition Auditing
This paper explores design considerations and ethical tensions that arise in auditing commercial facial processing technology.
