BERT

Definition : Bidirectional Encoder Representations from Transformers
Category : Computing » Programming & Development
Country/Region : Worldwide
Type : Acronym

What does BERT mean?

Bidirectional Encoder Representations from Transformers (BERT) is a Machine Learning (ML) model for Natural Language Processing (NLP) developed by Google. NLP is the field of Artificial Intelligence (AI) that aims to enable computers to read, analyze, interpret and derive meaning from text and spoken words.
BERT is based on the Transformer, a deep learning architecture in which every output element is connected to every input element and the weightings between them are calculated dynamically through an attention mechanism. The Transformer architecture was first introduced by Google researchers in 2017.

Frequently Asked Questions (FAQ)

What is the full form of BERT?

The full form of BERT is Bidirectional Encoder Representations from Transformers.

What is the full form of BERT in Computing?

Bidirectional Encoder Representations from Transformers

What is the full form of BERT worldwide?

Bidirectional Encoder Representations from Transformers
