

Fine-tuned version of BERT: a Transformer-based bidirectional encoder architecture trained on the MLM (Masked Language Modeling) objective


Hugging Face model card

https://huggingface.co/nickwong64/bert-base-uncased-poems-sentiment

 

nickwong64/bert-base-uncased-poems-sentiment · Hugging Face

BERT is a Transformer-based bidirectional encoder architecture trained on the MLM (Masked Language Modeling) objective. This checkpoint is bert-base-uncased fine-tuned on the poem_sentiment dataset using the Hugging Face Trainer.

huggingface.co
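Since the model card describes a text-classification fine-tune, loading it could look like the sketch below. This is an assumption-laden example, not the author's code: the model id is taken from the URL above, and the four label names are those defined by the poem_sentiment dataset (the checkpoint's own config may name them differently).

```python
# Sketch: querying the fine-tuned checkpoint through the Hugging Face
# pipeline API. Assumes `transformers` is installed; the weights are
# downloaded from the Hub on first use (network required).
MODEL_ID = "nickwong64/bert-base-uncased-poems-sentiment"

# poem_sentiment defines four classes (assumption: the checkpoint
# keeps these label names in its config).
LABELS = {"negative", "positive", "no_impact", "mixed"}

def classify(texts):
    # Imported lazily because transformers is a heavy dependency.
    from transformers import pipeline
    clf = pipeline("text-classification", model=MODEL_ID)
    return clf(texts)  # list of {"label": ..., "score": ...} dicts
```

Calling `classify(["O Captain! my Captain! our fearful trip is done."])` would return one label/score dict per input line; the exact labels depend on the checkpoint's `config.json`.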

 

BERT 101: State of the Art NLP Model Explained

https://huggingface.co/blog/bert-101

 

BERT 101 - State of the Art NLP Model Explained

BERT, short for Bidirectional Encoder Representations from Transformers, is a machine learning (ML) model for natural language processing. It was developed in 2018 by researchers at Google AI Language.

huggingface.co
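The MLM objective named in the title corrupts a fraction of the input tokens and trains the encoder to recover them, using BERT's standard 80/10/10 rule: of the selected positions, 80% become [MASK], 10% become a random token, and 10% are left unchanged. A minimal sketch of that corruption step (the token list and tiny vocabulary here are illustrative, not BERT's real WordPiece vocabulary):

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "sat", "mat", "dog"]  # toy vocabulary for illustration

def mask_tokens(tokens, p=0.15, rng=None):
    """BERT-style MLM corruption: select ~p of positions as prediction
    targets; of those, 80% -> [MASK], 10% -> random token, 10% unchanged.
    Returns (corrupted_tokens, target_positions)."""
    rng = rng or random.Random(0)
    out, targets = list(tokens), []
    for i in range(len(tokens)):
        if rng.random() < p:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                out[i] = MASK            # 80%: replace with the mask token
            elif r < 0.9:
                out[i] = rng.choice(VOCAB)  # 10%: replace with a random token
            # else: 10% keep the original token in place
    return out, targets
```

The model is trained to predict the original token at every target position, including the ones left unchanged, which keeps the encoder from relying on [MASK] being present at inference time.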

 

BERT (Bidirectional Encoder Representations from Transformers) Wikipedia

https://en.wikipedia.org/wiki/BERT_(language_model)

 

BERT (language model) - Wikipedia

Bidirectional Encoder Representations from Transformers (BERT) is a language model developed by Google, based on the transformer architecture and notable for its dramatic improvement over previous state-of-the-art models.

en.wikipedia.org

 

 
