Run "src/pretrain_iot_BERT.py" to pretrain BERT on Masked Language Modelling (MLM) and Next Sentence Prediction (NSP). This follows the approach described in this video: https://www.youtube.com/watch?v=IC9FaVPKlYc&t=1079s.
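A minimal sketch of joint MLM + NSP pretraining with Hugging Face's `BertForPreTraining`, which carries both heads. This is not the repository's actual script: the tiny config sizes, the random inputs, and the omission of real token masking are illustrative assumptions only.

```python
import torch
from transformers import BertConfig, BertForPreTraining

# Tiny config for illustration; the real script presumably trains a
# full-size BERT on the IoT corpus.
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertForPreTraining(config)

input_ids = torch.randint(0, 100, (2, 10))   # two toy sentence pairs
mlm_labels = input_ids.clone()               # MLM targets (real masking omitted here)
nsp_labels = torch.tensor([0, 1])            # 0 = IsNext, 1 = NotNext

out = model(input_ids=input_ids, labels=mlm_labels,
            next_sentence_label=nsp_labels)
print(out.loss)  # combined MLM + NSP loss to backpropagate
```

In training, `out.loss.backward()` plus an optimizer step would update the encoder and both heads jointly.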
Run "src/entity_extractor.py" to fine-tune IoT BERT on the entity extraction task using the CoAP protocol only. We load IoT BERT and stack a token classification layer on top of it. Run "src/extract_MQTT_entity.py" to test the generalisation ability on MQTT.
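"Stacking a token classification layer" corresponds to `BertForTokenClassification`, which adds a per-token linear head over the encoder. The sketch below uses a tiny randomly initialised config so it runs standalone; the checkpoint path and the label count of 5 are hypothetical stand-ins, not values from the repo.

```python
import torch
from transformers import BertConfig, BertForTokenClassification

config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64,
                    num_labels=5)  # e.g. BIO entity tags; count is illustrative
model = BertForTokenClassification(config)
# In the actual script the encoder weights would come from the pretrained
# IoT BERT, e.g. via from_pretrained() on its saved checkpoint directory.

input_ids = torch.randint(0, 100, (2, 10))
logits = model(input_ids=input_ids).logits
print(logits.shape)  # one label distribution per token: (2, 10, 5)
```

Because the head predicts a label for every token, the same model can be run unchanged on MQTT sentences to probe generalisation.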
Run "src/relation_extractor.py" to fine-tune IoT BERT on the relation extraction task. We load IoT BERT and stack a sequence classification layer on top of it. This is still experimental.
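The sequence classification head maps to `BertForSequenceClassification`, which classifies the whole input (here, a sentence with a candidate entity pair) rather than each token. Again a toy config is used so the sketch runs on its own; the 4 relation types are an assumed placeholder.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64,
                    num_labels=4)  # number of relation types is illustrative
model = BertForSequenceClassification(config)
# As with entity extraction, the real script would load the pretrained
# IoT BERT weights instead of random initialisation.

input_ids = torch.randint(0, 100, (2, 10))
logits = model(input_ids=input_ids).logits
print(logits.shape)  # one relation distribution per sequence: (2, 4)
```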