Inactive evaluations
method: CLOVA OCR DEER (2022-07-20)
Authors: Taeho Kil, Seonghyeon Kim, Sukmin Seo
Affiliation: Clova AI OCR Team, NAVER/LINE Corp.
Description: An end-to-end scene text spotter composed of a CNN backbone, a deformable transformer encoder, a location decoder, and a text decoder. The location decoder, based on the segmentation method Differentiable Binarization, detects text regions, and the text decoder, based on a deformable transformer decoder, recognizes each instance from the image features and the detected location information. We use a single model rather than an ensemble of multiple models, and all sub-modules are end-to-end trainable. We train on the real datasets provided by this challenge (train + val splits) and on a synthetic dataset. Since the COCO-Text dataset contains a lot of label noise (with regard to letter capitalization), we refined its annotations using a teacher model trained without COCO-Text.
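As a rough illustration of the pipeline described above, the sketch below wires a CNN backbone, an encoder, a DB-style location head, and a text decoder together. It is not the submitted model: standard PyTorch attention stands in for the deformable transformer modules, the conditioning of the text decoder on detected locations is omitted, and all names and sizes (SpotterSketch, d_model, db_k, ...) are illustrative assumptions. The `sigmoid(k * (prob - thresh))` step is the approximate binarization used by Differentiable Binarization.

```python
# Minimal, assumption-laden sketch of the described pipeline (not the submitted model).
import torch
import torch.nn as nn
import torchvision

class SpotterSketch(nn.Module):
    def __init__(self, d_model=256, vocab_size=97, db_k=50.0):
        super().__init__()
        resnet = torchvision.models.resnet50(weights=None)
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])   # CNN feature maps
        self.proj = nn.Conv2d(2048, d_model, kernel_size=1)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=6)  # stand-in for the deformable encoder
        # Location decoder: DB-style probability / threshold maps.
        self.prob_head = nn.Conv2d(d_model, 1, kernel_size=1)
        self.thresh_head = nn.Conv2d(d_model, 1, kernel_size=1)
        self.db_k = db_k
        # Text decoder: stand-in for the deformable transformer decoder.
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=6)
        self.char_embed = nn.Embedding(vocab_size, d_model)
        self.char_head = nn.Linear(d_model, vocab_size)

    def forward(self, images, tgt_tokens):
        feats = self.proj(self.backbone(images))                 # (B, d, H', W')
        B, d, H, W = feats.shape
        memory = self.encoder(feats.flatten(2).transpose(1, 2))  # (B, H'*W', d)
        enc_map = memory.transpose(1, 2).reshape(B, d, H, W)
        prob = torch.sigmoid(self.prob_head(enc_map))            # text probability map
        thresh = torch.sigmoid(self.thresh_head(enc_map))        # threshold map
        binary = torch.sigmoid(self.db_k * (prob - thresh))      # differentiable binarization
        dec = self.decoder(self.char_embed(tgt_tokens), memory)  # character decoding
        return binary, self.char_head(dec)
```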
method: e2e text spotter - final version (2022-07-21)
Authors: Taeho Kil, Seonghyeon Kim, Sukmin Seo
Affiliation: Clova AI OCR Team, NAVER/LINE Corp.
Description: An end-to-end scene text spotter composed of a CNN backbone, a deformable transformer encoder, a location decoder, and a text decoder. The location decoder, based on the segmentation method Differentiable Binarization, detects text regions, and the text decoder, based on a deformable transformer decoder, recognizes each instance from the image features and the detected location information. We use a single model rather than an ensemble of multiple models, and all sub-modules are end-to-end trainable. We train on the real datasets provided by this challenge (train + val splits) and on a synthetic dataset. Since the COCO-Text dataset contains a lot of label noise (with regard to letter capitalization), we refined its annotations using a teacher model trained without COCO-Text.
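The COCO-Text clean-up mentioned in both descriptions can be pictured as follows. This is a hedged sketch rather than the authors' script: the annotation layout and the `teacher_predict` callable are assumptions, and only the casing of a label is ever changed.

```python
# Hedged sketch of the teacher-based capitalization refinement (not the authors' script).
# `annotations` is assumed to be a list of dicts with "image", "box" and "text" keys,
# and `teacher_predict(image, box)` is an assumed callable returning the teacher's word.
def refine_capitalization(annotations, teacher_predict):
    refined = []
    for ann in annotations:
        pred = teacher_predict(ann["image"], ann["box"])
        text = ann["text"]
        # Adopt the teacher's output only when it agrees with the label up to letter case,
        # so that solely the (noisy) capitalization of the annotation changes.
        if pred.lower() == text.lower() and pred != text:
            text = pred
        refined.append({**ann, "text": text})
    return refined
```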
method: Detector Free E2E Method (2022-07-21)
Authors: Kim Seonghyeon
Affiliation: NAVER
Description: A detection-free end-to-end text recognizer. It uses a CNN backbone with a deformable transformer encoder and decoder. Trained with the training + validation data from the RRC competitions plus additional SynthText data synthesized with the MJSynth 90k dictionary, and with a longer training schedule.
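For contrast with the DB-based spotter above, a detection-free recognizer can be sketched with learnable queries that decode transcriptions directly from the encoded image features, with no explicit box or segmentation head. Again this is an assumption-laden sketch (standard attention in place of deformable attention; names and sizes such as num_queries and max_len are illustrative), not the submitted model.

```python
# Sketch of a detection-free recognizer: learnable (instance, character) queries attend
# to encoded image features and are read out as characters; there is no detection head.
import torch
import torch.nn as nn

class DetectorFreeSketch(nn.Module):
    def __init__(self, d_model=256, vocab_size=97, num_queries=100, max_len=25):
        super().__init__()
        self.cnn = nn.Sequential(                       # small CNN stand-in for the backbone
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, d_model, 3, stride=2, padding=1), nn.ReLU(),
        )
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=4)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=4)
        self.queries = nn.Parameter(torch.randn(num_queries * max_len, d_model))
        self.char_head = nn.Linear(d_model, vocab_size)
        self.num_queries, self.max_len = num_queries, max_len

    def forward(self, images):
        feats = self.cnn(images)                                  # (B, d, H', W')
        memory = self.encoder(feats.flatten(2).transpose(1, 2))   # (B, H'*W', d)
        q = self.queries.unsqueeze(0).expand(images.size(0), -1, -1)
        dec = self.decoder(q, memory)                             # (B, Q*L, d)
        logits = self.char_head(dec)
        # One character distribution per (text instance, character position).
        return logits.view(images.size(0), self.num_queries, self.max_len, -1)
```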
| Date | Method | Hmean | All Precision | All Recall | All Hmean | OOV Precision | OOV Recall | OOV Hmean | IV Precision | IV Recall | IV Hmean |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2022-07-20 | CLOVA OCR DEER | 0.4242 | 0.6716 | 0.5213 | 0.5870 | 0.1856 | 0.4876 | 0.2689 | 0.6450 | 0.5259 | 0.5794 |
| 2022-07-21 | e2e text spotter - final version | 0.4239 | 0.6717 | 0.5204 | 0.5864 | 0.1858 | 0.4872 | 0.2690 | 0.6451 | 0.5249 | 0.5788 |
| 2022-07-21 | Detector Free E2E Method | 0.4201 | 0.6615 | 0.5244 | 0.5850 | 0.1797 | 0.4935 | 0.2635 | 0.6344 | 0.5286 | 0.5767 |
| 2022-07-20 | oCLIP_v2 | 0.4133 | 0.6737 | 0.4682 | 0.5524 | 0.2028 | 0.4842 | 0.2859 | 0.6441 | 0.4660 | 0.5408 |
| 2022-07-19 | CLOVA OCR DEER | 0.4057 | 0.6399 | 0.5243 | 0.5764 | 0.1624 | 0.4800 | 0.2427 | 0.6129 | 0.5303 | 0.5686 |
| 2022-07-20 | large_param3 | 0.4032 | 0.6344 | 0.5345 | 0.5802 | 0.1544 | 0.4721 | 0.2327 | 0.6082 | 0.5429 | 0.5737 |
| 2022-07-28 | fnnrcv3 | 0.3930 | 0.7186 | 0.3919 | 0.5072 | 0.2270 | 0.3781 | 0.2837 | 0.6933 | 0.3937 | 0.5022 |
| 2022-07-20 | DB_threshold2_TRBA_CocoValid | 0.3910 | 0.6408 | 0.4993 | 0.5613 | 0.1526 | 0.4229 | 0.2243 | 0.6160 | 0.5096 | 0.5578 |
| 2022-07-28 | tbd | 0.3794 | 0.6944 | 0.3881 | 0.4979 | 0.2069 | 0.3740 | 0.2664 | 0.6679 | 0.3900 | 0.4924 |
| 2022-07-21 | zyk | 0.3560 | 0.5463 | 0.5367 | 0.5415 | 0.1114 | 0.4687 | 0.1800 | 0.5190 | 0.5459 | 0.5321 |
| 2022-08-01 | cnnrcv4 | 0.3520 | 0.6327 | 0.3856 | 0.4791 | 0.1680 | 0.3794 | 0.2329 | 0.6033 | 0.3864 | 0.4711 |
| 2022-07-20 | BIT | 0.3489 | 0.5487 | 0.5065 | 0.5268 | 0.1127 | 0.4440 | 0.1798 | 0.5213 | 0.5149 | 0.5181 |
| 2022-08-11 | Baseline - GLASS | 0.3487 | 0.7580 | 0.3063 | 0.4363 | 0.2491 | 0.2723 | 0.2602 | 0.7368 | 0.3109 | 0.4373 |
| 2022-07-21 | E2E_MASK | 0.3213 | 0.4790 | 0.5414 | 0.5083 | 0.0864 | 0.4673 | 0.1458 | 0.4520 | 0.5514 | 0.4968 |
| 2022-07-21 | yyds | 0.2868 | 0.5153 | 0.3554 | 0.4207 | 0.1063 | 0.3336 | 0.1612 | 0.4857 | 0.3583 | 0.4124 |
| 2022-07-21 | yyvis | 0.2848 | 0.5120 | 0.3531 | 0.4180 | 0.1054 | 0.3326 | 0.1600 | 0.4823 | 0.3559 | 0.4095 |
| 2022-07-21 | sudokill-9 | 0.2834 | 0.5162 | 0.3408 | 0.4106 | 0.1103 | 0.3322 | 0.1656 | 0.4854 | 0.3420 | 0.4012 |
| 2022-07-21 | rickyyds | 0.2833 | 0.5159 | 0.3406 | 0.4103 | 0.1104 | 0.3327 | 0.1657 | 0.4850 | 0.3417 | 0.4009 |
| 2022-07-21 | PAN | 0.2813 | 0.5050 | 0.3481 | 0.4121 | 0.1050 | 0.3358 | 0.1600 | 0.4745 | 0.3498 | 0.4027 |
| 2022-07-21 | transformer | 0.2813 | 0.5047 | 0.3478 | 0.4118 | 0.1052 | 0.3368 | 0.1603 | 0.4740 | 0.3493 | 0.4022 |
| 2022-07-20 | CVO detection and recognition model | 0.2657 | 0.5227 | 0.2928 | 0.3753 | 0.1145 | 0.2902 | 0.1643 | 0.4912 | 0.2931 | 0.3671 |
| 2022-07-18 | Double-U | 0.2500 | 0.4895 | 0.3051 | 0.3759 | 0.0847 | 0.2472 | 0.1262 | 0.4642 | 0.3129 | 0.3738 |
| 2022-07-20 | oCLIP | 0.2404 | 0.4772 | 0.0751 | 0.1298 | 0.4121 | 0.4842 | 0.4452 | 0.1746 | 0.0198 | 0.0355 |
| 2022-07-18 | DBNetpp | 0.2034 | 0.3942 | 0.2700 | 0.3205 | 0.0562 | 0.2075 | 0.0885 | 0.3715 | 0.2784 | 0.3183 |
| 2022-08-19 | Baseline - TextTranSpotter (Poly) | 0.1855 | 0.3737 | 0.2497 | 0.2994 | 0.0445 | 0.1637 | 0.0700 | 0.3548 | 0.2614 | 0.3010 |
| 2022-07-21 | BIT_OCR | 0.1590 | 0.2631 | 0.2585 | 0.2608 | 0.0398 | 0.2517 | 0.0687 | 0.2399 | 0.2594 | 0.2493 |
| 2022-08-11 | Baseline - POLYGON | 0.1588 | 0.3144 | 0.2093 | 0.2513 | 0.0444 | 0.1778 | 0.0710 | 0.2918 | 0.2136 | 0.2467 |
| 2022-08-11 | Baseline - BEZIER | 0.1383 | 0.2815 | 0.1812 | 0.2205 | 0.0374 | 0.1509 | 0.0600 | 0.2609 | 0.1853 | 0.2167 |
| 2022-07-28 | End-to-end OCR with transformer | 0.1257 | 0.2556 | 0.1527 | 0.1912 | 0.0438 | 0.1711 | 0.0698 | 0.2293 | 0.1502 | 0.1815 |
| 2022-07-20 | TH-DL | 0.0932 | 0.1839 | 0.1323 | 0.1539 | 0.0216 | 0.1087 | 0.0360 | 0.1689 | 0.1355 | 0.1504 |
| 2022-07-21 | End-to-end OCR with transformer | 0.0014 | 0.0026 | 0.0017 | 0.0020 | 0.0006 | 0.0032 | 0.0010 | 0.0020 | 0.0015 | 0.0017 |
| 2022-07-19 | NNRC | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
| 2022-07-19 | NNRC_OCR | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 |
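For reference, the per-split Hmean columns are the usual harmonic mean of precision and recall, and the leading Hmean column appears consistent (up to rounding) with the average of the OOV and IV Hmeans. The snippet below is an illustrative check against the top row, not the official evaluation code.

```python
# Illustrative check of the table's metric columns (not the official evaluation script).
def hmean(p, r):
    """Harmonic mean of precision and recall."""
    return 0.0 if p + r == 0 else 2 * p * r / (p + r)

# "All" columns of the top row (CLOVA OCR DEER, 2022-07-20):
print(f"{hmean(0.6716, 0.5213):.4f}")   # 0.5870, matching the reported All Hmean
# The leading Hmean column is close to the mean of the OOV and IV Hmeans:
print(f"{(0.2689 + 0.5794) / 2:.4f}")   # ~0.4242 (reported value), up to rounding
```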