workaround so training doesn't hang when packed dataloader batches aren't even (#461)
c69faee
winglian
committed on
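The commit message above refers to a common failure mode in distributed training: when sample packing produces a different number of batches on each rank, the rank with fewer batches finishes early while the others block in a collective op (such as the gradient all-reduce) and the job hangs. The snippet below is a minimal, hypothetical sketch of one typical workaround, truncating every rank's batch list to the global minimum count; the function name `even_out_batches` and the simulated rank data are assumptions for illustration, not the actual change in #461.

```python
# Hypothetical sketch: if sample packing yields an uneven number of
# batches per rank, ranks that run extra steps wait forever on peers
# inside collective ops. One workaround is to drop trailing batches
# so every rank performs the same number of steps.

def even_out_batches(per_rank_batches):
    """Truncate each rank's batch list to the shortest rank's count."""
    min_len = min(len(batches) for batches in per_rank_batches)
    return [batches[:min_len] for batches in per_rank_batches]

# Simulated: rank 0 packed 4 batches, rank 1 only 3.
ranks = [list(range(4)), list(range(3))]
evened = even_out_batches(ranks)
```

In a real multi-process run the minimum would be computed with a collective (e.g. `torch.distributed.all_reduce` with `ReduceOp.MIN`) rather than locally, since no single process sees every rank's batch count.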