---
license: cc-by-4.0
pipeline_tag: image-to-image
tags:
- pytorch
- super-resolution
---
[Link to Github Release](https://github.com/Phhofm/models/releases/tag/4xNomos2_realplksr_dysample)

# 4xNomos2_realplksr_dysample

Scale: 4
Architecture: [RealPLKSR with Dysample](https://github.com/muslll/neosr/?tab=readme-ov-file#supported-archs)
Architecture Option: [realplksr](https://github.com/muslll/neosr/blob/master/neosr/archs/realplksr_arch.py)
Author: Philip Hofmann
License: CC-BY-4.0
Purpose: Pretrained
Subject: Photography
Input Type: Images
Release Date: 30.06.2024
Dataset: [nomosv2](https://github.com/muslll/neosr/?tab=readme-ov-file#-datasets)
Dataset Size: 6000
OTF (on the fly augmentations): No
Pretrained Model: [4xmssim_realplksr_dysample_pretrain](https://github.com/Phhofm/models/releases/tag/4xmssim_realplksr_dysample_pretrain)
Iterations: 185'000
Batch Size: 8
GT Size: 256, 512

Description: A Dysample RealPLKSR 4x upscaling model trained on the Nomosv2 dataset. It handles JPG compression down to quality 70 and preserves depth of field (DoF). Based on the [4xmssim_realplksr_dysample_pretrain](https://github.com/Phhofm/models/releases/tag/4xmssim_realplksr_dysample_pretrain) I released 3 days ago. This model tends to oversaturate colors, which can be counteracted somewhat with wavelet color fix, as used in these examples. Also included is a static (3, 256, 256) ONNX conversion, fp32 and fully optimized. It can be used with chaiNNer, since the Dysample .pth file would be unsupported there. (Other conversions, such as static 128, were removed because they produced different results; the static 256 conversion gives the same output as using the .pth file with the neosr test script.)
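Because the ONNX export has a static input shape of (1, 3, 256, 256), inputs must be fed as exact 256x256 RGB tiles. A minimal pre/post-processing sketch is below; the onnxruntime usage and the model filename are assumptions for illustration, not part of this release:

```python
import numpy as np

def preprocess(img: np.ndarray) -> np.ndarray:
    """HWC uint8 RGB (256x256) -> NCHW float32 in [0, 1]."""
    assert img.shape == (256, 256, 3), "static ONNX input must be a 256x256 RGB tile"
    x = img.astype(np.float32) / 255.0
    return x.transpose(2, 0, 1)[None, ...]  # (1, 3, 256, 256)

def postprocess(y: np.ndarray) -> np.ndarray:
    """NCHW float32 in [0, 1] -> HWC uint8 RGB (1024x1024 for a 4x model)."""
    y = np.clip(y[0].transpose(1, 2, 0), 0.0, 1.0)
    return (y * 255.0 + 0.5).astype(np.uint8)

if __name__ == "__main__":
    # Assumed dependency and hypothetical filename; adjust to the actual release asset.
    import onnxruntime as ort
    sess = ort.InferenceSession("4xNomos2_realplksr_dysample_fp32.onnx")
    tile = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
    out = sess.run(None, {sess.get_inputs()[0].name: preprocess(tile)})[0]
    print(postprocess(out).shape)  # a 4x model yields a 1024x1024 tile
```

Larger images would need to be split into 256x256 tiles (ideally with overlap to hide seams) and stitched back together, which chaiNNer handles automatically.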
Showcase: [Slowpics](https://slow.pics/s/p3sMnd5l) (click on an image for a better view)

![Example1](https://github.com/Phhofm/models/assets/14755670/2259feb8-816e-4bbf-8a78-bb8494835204)
![Example2](https://github.com/Phhofm/models/assets/14755670/a041d4ef-2b72-4325-8eba-796052083619)
![Example3](https://github.com/Phhofm/models/assets/14755670/74dbed5e-950b-415d-8281-1cf20ef7a2b3)
![Example4](https://github.com/Phhofm/models/assets/14755670/6ebfe31e-fff4-4239-a679-3a90a491ef1e)
![Example5](https://github.com/Phhofm/models/assets/14755670/da1aaba4-8253-4048-a46b-a100f00ce3a6)
![Example6](https://github.com/Phhofm/models/assets/14755670/9f678af8-f02d-42c0-981c-b704a0847936)
![Example7](https://github.com/Phhofm/models/assets/14755670/e165af6b-c546-4903-bdb8-7002de93aedc)
![Example8](https://github.com/Phhofm/models/assets/14755670/e94c1c03-bf18-4c11-806a-0a4b9646bea4)
![Example9](https://github.com/Phhofm/models/assets/14755670/51e78e8d-4aa0-405a-87b5-ee43f0577c6e)
![Example10](https://github.com/Phhofm/models/assets/14755670/ba38a1e1-0957-471d-a8fa-df9f81d10fc7)
![Example11](https://github.com/Phhofm/models/assets/14755670/fc7bac98-53cf-4bfd-9633-80d48c36f1ec)
![Example12](https://github.com/Phhofm/models/assets/14755670/47fb1f87-6fc3-417e-aa09-2a4e588e35d2)
![Example13](https://github.com/Phhofm/models/assets/14755670/cb49b862-5e8a-477d-8350-da30f38880cf)