Post
Emmanuel Ameisen @mlpowered · Dec 26, 2022
  • From Twitter

In a great YT video @karpathy challenges viewers to find a vectorized way to backprop through a complex embedding lookup. Tada!

mapping = [link] num_classes=27).float()
dC = torch.tensordot(mapping, demb, dims=([0, 1], [0, 1]))
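A runnable sketch of the trick the tweet describes, under assumptions the post does not state: the names and shapes follow Karpathy's makemore "backprop ninja" exercise (Xb holds integer character indices, C is the embedding table, demb is the upstream gradient of the looked-up embeddings), and the call hidden behind the link is assumed to be F.one_hot.

import torch
import torch.nn.functional as F

# Assumed shapes (not in the tweet): vocab of 27 characters, a batch of 32
# examples with a context of 3 characters, and 10-dimensional embeddings.
vocab_size, batch, block, emb_dim = 27, 32, 3, 10
C = torch.randn(vocab_size, emb_dim)                # embedding table
Xb = torch.randint(0, vocab_size, (batch, block))   # integer indices
emb = C[Xb]                                         # forward lookup: (batch, block, emb_dim)
demb = torch.randn_like(emb)                        # upstream gradient w.r.t. emb

# Loop version: scatter each position's gradient back onto the row of C it came from.
dC_loop = torch.zeros_like(C)
for k in range(batch):
    for j in range(block):
        dC_loop[Xb[k, j]] += demb[k, j]

# Vectorized version from the tweet: one-hot the indices, then contract the
# batch and context dimensions against demb with tensordot.
mapping = F.one_hot(Xb, num_classes=vocab_size).float()     # (batch, block, vocab_size)
dC = torch.tensordot(mapping, demb, dims=([0, 1], [0, 1]))  # (vocab_size, emb_dim)

print(torch.allclose(dC, dC_loop))  # True

The contraction sums demb over every position where a given index appears, which is exactly what the explicit accumulation loop does.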
