Using LoRA for idefics2-8b-chatty fine-tuning with two RTX 4080 32G GPUs, gather_map error

#78
by shuminzhou26803586

Here is the traceback:
Traceback (most recent call last):
File "D:\vsdev\pythontest\main.py", line 201, in
trainer.train()
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\transformers\trainer.py", line 1859, in train
return inner_training_loop(
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\transformers\trainer.py", line 2203, in _inner_training_loop
tr_loss_step = self.training_step(model, inputs)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\transformers\trainer.py", line 3138, in training_step
loss = self.compute_loss(model, inputs)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\transformers\trainer.py", line 3161, in compute_loss
outputs = model(**inputs)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\parallel\data_parallel.py", line 187, in forward
return self.gather(outputs, self.output_device)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\parallel\data_parallel.py", line 204, in gather
return gather(outputs, output_device, dim=self.dim)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\parallel\scatter_gather.py", line 113, in gather
res = gather_map(outputs)
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\parallel\scatter_gather.py", line 102, in gather_map
return type(out)((k, gather_map([d[k] for d in outputs]))
File "", line 9, in init
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\transformers\utils\generic.py", line 393, in post_init
for idx, element in enumerate(iterator):
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\parallel\scatter_gather.py", line 102, in
return type(out)((k, gather_map([d[k] for d in outputs]))
File "c:\ProgramData\anaconda3\envs\pytorchcp310cu124\lib\site-packages\torch\nn\parallel\scatter_gather.py", line 108, in gather_map
return type(out)(map(gather_map, zip(*outputs)))
TypeError: DynamicCache.__init__() takes 1 positional argument but 2 were given

I traced the error step by step and found the cause: the Idefics2 model returns past_key_values as a DynamicCache, and DataParallel's gather_map cannot rebuild that type via type(out)(...) with positional arguments. Is there any solution for this?
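One workaround that seems to sidestep this is to stop the model from returning past_key_values during training, so gather_map never has to reconstruct a DynamicCache at all. Below is a minimal sketch of that idea, not a confirmed fix; the model/LoRA setup is a hypothetical stand-in for my actual script, and the target_modules list is a guess:

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForVision2Seq

# Hypothetical setup standing in for the original training script.
model = AutoModelForVision2Seq.from_pretrained(
    "HuggingFaceM4/idefics2-8b-chatty",
    torch_dtype=torch.float16,
)
model = get_peft_model(
    model,
    LoraConfig(r=8, target_modules=["q_proj", "k_proj", "v_proj", "o_proj"]),
)

# past_key_values is only needed at generation time. During training it
# is just another output field that DataParallel tries to gather, and
# gather_map rebuilds containers via type(out)(...), which
# DynamicCache.__init__ does not accept. With the cache disabled, no
# DynamicCache is ever returned, so the gather has nothing to rebuild.
model.config.use_cache = False
if hasattr(model.config, "text_config"):  # Idefics2 nests its LM config
    model.config.text_config.use_cache = False
```

Alternatively, launching the same script with `torchrun --nproc_per_node=2` would make Trainer use DistributedDataParallel instead of DataParallel; DDP computes the loss inside each process and never gathers model outputs across devices, so the cache object never passes through gather_map.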
