LIAE-UD_128_WF, 690k iterations, 40k Asian faces, usable on low VRAM
jaysonteng
Posted on 2021-12-11 17:41:41
A pretrained model built on a 48k Asian face dataset (512 resolution), trained for about 1,000,000 iterations (the summary below reports 688,068 iterations).
The face data was crawled from the web by me and is already extracted/aligned; see my other post: https://www.dfldata.xyz/forum.php?mod=viewthread&tid=4526&extra=
I trained with batch_size 4, which uses 4.64 GB of VRAM; with a smaller batch size it will also run on 4 GB cards.
The parameters are as follows:
=============== Model Summary ===============
Model name:        pretrained_SAEHD
Current iteration: 688068

------------- Model Options -------------
resolution:             128
face_type:              wf
models_opt_on_gpu:      True
archi:                  liae-ud
ae_dims:                256
e_dims:                 64
d_dims:                 64
d_mask_dims:            22
masked_training:        True
eyes_mouth_prio:        True
uniform_yaw:            True
blur_out_mask:          False
adabelief:              True
lr_dropout:             y
random_warp:            True
true_face_power:        0.0
face_style_power:       0.0
bg_style_power:         0.0
ct_mode:                rct
clipgrad:               True
pretrain:               False
autobackup_hour:        5
write_preview_history:  False
target_iter:            0
random_src_flip:        False
random_dst_flip:        False
batch_size:             4
gan_power:              0.0
gan_patch_size:         16
gan_dims:               16

-------------- Running On ---------------
Device index: 0
Name:         GeForce RTX 2060
VRAM:         4.64GB
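If you want to compare these options against one of your own models, the pasted summary is easy to parse programmatically. Below is a minimal sketch (not part of the shared model or of DeepFaceLab itself), assuming you have copied the summary text into a file named model_summary.txt (a hypothetical name); it tolerates both the clean listing above and DFL's raw console output with its "==" borders.

def parse_model_summary(path):
    """Parse a DeepFaceLab 'Model Summary' dump into a dict of option -> value."""
    options = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            # Drop any decorative '=' border characters, then trim whitespace.
            line = line.strip().strip("=").strip()
            if ":" not in line:
                continue  # skip headers, blank lines, and border-only lines
            key, _, value = line.partition(":")
            options[key.strip()] = value.strip()
    return options

if __name__ == "__main__":
    opts = parse_model_summary("model_summary.txt")  # hypothetical file name
    print(opts["archi"], opts["resolution"], opts["batch_size"])  # liae-ud 128 4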
The current batch size is 4, which uses 4.64 GB of VRAM; if you reduce the batch size, the model can be used on 4 GB or even 2 GB cards.
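When tuning the batch size for a smaller card, it helps to watch actual VRAM usage while the trainer runs. A minimal sketch for doing that, assuming an NVIDIA driver with nvidia-smi on the PATH (this is a generic helper, not part of DeepFaceLab):

import subprocess
import time

def vram_usage_mb(device_index=0):
    """Return (used, total) VRAM in MB for one GPU, as reported by nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            f"--id={device_index}",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(","))
    return used, total

if __name__ == "__main__":
    # Poll every 10 seconds while experimenting with batch_size in the trainer.
    while True:
        used, total = vram_usage_mb(0)
        print(f"VRAM: {used} / {total} MB")
        time.sleep(10)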
The training data is the 40k+ Asian face dataset, link:
https://www.dfldata.xyz/forum.php?mod=viewthread&tid=4526&extra=
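After downloading the face set, it is worth a quick sanity check that the images are all there and really 512x512 before pretraining on them. A minimal sketch, assuming Pillow is installed and the extracted faces sit in a folder named aligned (a hypothetical path):

from pathlib import Path
from PIL import Image

def check_faceset(folder, expected=(512, 512)):
    """Count face images and report any whose resolution differs from `expected`."""
    paths = sorted(p for p in Path(folder).iterdir()
                   if p.suffix.lower() in {".jpg", ".jpeg", ".png"})
    bad = []
    for p in paths:
        with Image.open(p) as img:
            if img.size != expected:
                bad.append((p.name, img.size))
    print(f"{len(paths)} images, {len(bad)} not {expected[0]}x{expected[1]}")
    return bad

if __name__ == "__main__":
    check_faceset("aligned")  # hypothetical folder containing the downloaded faces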
An Indian user said he couldn't use Baidu Netdisk and asked me to upload the model to MEGA; while I was at it, I also shared a copy on Baidu Netdisk.
Baidu Netdisk link:
https://pan.baidu.com/s/1g_KMbIjV2nuFEdPZMZ9R8Q
Paid thread: viewing the download requires paying the author 10 灵石 (forum credits); 6 purchases so far.