FINE-TUNING THE BERT MODEL FOR NAMED ENTITY RECOGNITION
A sentence can contain various named entities with important meanings, such as names of people, locations, organizations, and time expressions. However, extracting this information manually requires significant time and resources. Named Entity Recognition (NER) offers an automated solution that improves the efficiency of this task. One method for developing an NER system is to use BERT, a transformer-based model that has proven effective across a wide range of natural language processing tasks. In this work, the BERT-base-cased model was fine-tuned using several hyperparameter configurations to identify the best combination based on the F1-score. The evaluation results showed that the best configuration was achieved with a learning rate of 5e-5, a batch size of 32, and 2 epochs, yielding an F1-score of 0.836765.
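As a companion to the abstract above, the sketch below shows how an entity-level F1-score (the selection metric reported) is typically computed from gold and predicted entity spans. This is an illustrative sketch only: the span tuples and the `f1_score` helper are hypothetical, not taken from the thesis itself.

```python
# Minimal sketch of entity-level F1 computation, as commonly used to
# evaluate NER systems. The spans and helper below are illustrative
# assumptions, not the thesis's actual evaluation code.

def f1_score(gold, predicted):
    """Entity-level F1: harmonic mean of precision and recall,
    where an entity counts as correct only on an exact span+label match."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)  # exact (start, end, label) matches
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

# Hypothetical (start, end, label) spans for one sentence
gold = [(0, 2, "PER"), (5, 6, "LOC"), (8, 10, "ORG")]
pred = [(0, 2, "PER"), (5, 6, "LOC"), (8, 9, "ORG")]
print(round(f1_score(gold, pred), 4))  # 2 of 3 spans match exactly
```

Note that under this exact-match convention the ORG prediction above counts as an error because its end boundary differs, which is why entity-level F1 is stricter than token-level accuracy.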
| Inventory Code | Barcode | Call Number | Location | Status |
|---|---|---|---|---|
| 2507003334 | T175150 | T1751502025 | Central Library (Reference) | Available (Not for Loan) |
No other version available