Reconstruct damaged inscriptions by predicting missing characters with BERT
Use [MASK] to mark each position where you want BERT to predict a character. Example: 今天天气很[MASK] ("The weather today is very [MASK]")
This tool uses a BERT (Bidirectional Encoder Representations from Transformers) model trained on Chinese text. Because BERT is bidirectional, it predicts each masked position from the characters on both sides of it, which makes it well suited to reconstructing damaged or incomplete ancient texts. Place [MASK] wherever characters are missing or illegible.
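If you want to reproduce this behaviour outside the tool, the sketch below uses the Hugging Face transformers fill-mask pipeline. The checkpoint name `bert-base-chinese` is an assumption for illustration; the exact model behind this tool is not specified above.

```python
# Minimal sketch of masked-character prediction with a Chinese BERT model.
# Assumption: "bert-base-chinese" stands in for the tool's actual checkpoint.
from transformers import pipeline

# Load a fill-mask pipeline backed by a Chinese BERT model.
fill_mask = pipeline("fill-mask", model="bert-base-chinese")

# Mark the missing character with the model's mask token.
text = "今天天气很[MASK]"

# Print the top candidate characters with their scores.
for candidate in fill_mask(text, top_k=5):
    print(candidate["token_str"], round(candidate["score"], 4))
```

Each candidate is a dictionary containing the predicted character (`token_str`), its probability (`score`), and the full reconstructed sentence (`sequence`), so you can rank plausible readings for a damaged position.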